Giskard is a French startup building an open-source testing framework for large language models. It can alert developers to risks of bias, security holes and a model's propensity to generate harmful or toxic content.
Image Credits: Giskard
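In practice, developers wrap their model with Giskard's open-source Python library and run a scan that probes for these issues. The snippet below is an illustrative sketch, assuming the `giskard` package's `Model` and `scan` entry points; the prediction function and names are hypothetical.

```python
# Minimal sketch of running a Giskard vulnerability scan on an LLM app.
import giskard
import pandas as pd


def answer_questions(df: pd.DataFrame) -> list:
    # Call your own LLM pipeline here; return one answer per input row.
    return [f"(model answer to: {q})" for q in df["question"]]


# Wrap the prediction function so Giskard can probe it.
model = giskard.Model(
    model=answer_questions,
    model_type="text_generation",
    name="Support assistant",  # hypothetical app name
    description="Answers customer questions about the product.",
    feature_names=["question"],
)

# The scan checks for issues such as harmful output, bias and
# prompt-injection weaknesses, then produces a shareable report.
report = giskard.scan(model)
report.to_html("scan_report.html")
```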