| Question | Hint |
| --- | --- |
| Why do you think the prompt failed? | Typically a prompt fails for one of five reasons:<br>- Cutoff: the training data was cut off before a point where it could know the answer<br>- Obscurity: the context necessary to answer is too niche to appear in the training data<br>- Hallucination: the AI made something up instead of saying it didn't know<br>- Intelligence: the task was too complicated for current state-of-the-art models<br>- Direction: the prompt didn't provide sufficient guidance on how to answer |
| What does this failure tell you about what AI is good at / bad at? | The default answer from AI often isn't anywhere near the best it can do. When given the right context and guidance, AI can do a lot more than you expect (see the sketch below this table). |
| Can you think of any ways to fix this prompt? | We'll be learning about the five principles of prompting in the next section: compare your ideas here to what you come up with after watching it. |
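To make the "context and guidance" hint concrete, here is a minimal sketch (not part of the course materials) of rewriting a failing prompt. It assumes the `openai` Python SDK with an `OPENAI_API_KEY` in the environment; the model name and the scenario in the prompts are placeholders, not examples from the lesson.

```python
# Minimal sketch: the same question asked naively, then with added context
# and direction. Assumes the openai SDK; model name is a placeholder.
from openai import OpenAI

client = OpenAI()

# Naive prompt: likely fails from Obscurity (no context the model could know)
# and Direction (no guidance on how to answer).
naive = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Why did our signup rate drop?"}],
)

# Improved prompt: supplies the missing context, gives explicit direction on
# the output, and tells the model to admit uncertainty instead of hallucinating.
improved = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": (
                "You are a product analyst. If the information provided is "
                "not enough to answer, say so instead of guessing."
            ),
        },
        {
            "role": "user",
            "content": (
                # Hypothetical context invented for this sketch
                "Context: signups fell the week we moved the signup button "
                "below the fold and added a required phone number field.\n"
                "Question: list the two most likely causes of the drop, "
                "one sentence each."
            ),
        },
    ],
)

print("Naive answer:", naive.choices[0].message.content)
print("Improved answer:", improved.choices[0].message.content)
```

The difference between the two calls is purely in the prompt: the second one removes the Obscurity and Direction failure modes and guards against Hallucination, which is the kind of fix the next section's five principles formalize.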