You asked a generative AI tool to recommend new restaurants to explore in Boston, Massachusetts that serve a specialty Italian dish made in the traditional fashion, without spinach or wine. The generative AI tool recommended five restaurants for you to visit. After looking up the restaurants, you discovered that one restaurant did not exist and two others did not serve the dish.
The information provided by the generative AI tool is an example of what is commonly called:
- A. Prompt injection.
- B. Model collapse.
- C. Hallucination.
- D. Overfitting.
Answer(s): C
Explanation:
In the context of AI, particularly generative models, "hallucination" refers to output that is fabricated or factually incorrect yet presented as if it were accurate, such as references to people, places, or sources that do not exist. The scenario describes the generative AI tool recommending one restaurant that does not exist and two others that do not serve the requested dish, which fits the definition of hallucination.
Reference:
AIGP Body of Knowledge and various AI literature discussing the limitations and challenges of generative AI models.