5 Hallucinated Sources

Student Learning Objectives

  • Verify the existence of referenced sources.
  • Assess the quality of the sources.
  • Recognize that Generative AI programs differ in their ability to provide citations.

Many Generative AI programs built on large language models (LLMs) are prone to “hallucinating” (inventing) information, including sources, citations, and direct quotations. When assessing the credibility and accuracy of Generative AI outputs, students should pay particular attention to evaluating the sources provided.

Activity Steps

  • Provide students with a Generative AI prompt that produces a bibliography or other output containing citations.
  • Ask students to use tools such as library databases and search engines (such as Google) to verify that the cited sources exist; an optional programmatic check is sketched after this list.
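
For classes comfortable with a little code, instructors may also demonstrate an automated first pass at verification. The sketch below is illustrative only, assuming Python with the requests library and the public Crossref REST API; the function name crossref_title_matches and the placeholder citation title are hypothetical, not part of the activity. A missing match does not prove a source is hallucinated (Crossref mainly indexes DOI-registered works), so students should still confirm results in library databases.

    # Minimal sketch: query the public Crossref API for works whose bibliographic
    # data resembles a citation title taken from an AI-generated bibliography.
    import requests

    def crossref_title_matches(title: str, rows: int = 5) -> list[str]:
        """Return up to `rows` Crossref titles that best match the queried title."""
        resp = requests.get(
            "https://api.crossref.org/works",
            params={"query.bibliographic": title, "rows": rows},
            timeout=10,
        )
        resp.raise_for_status()
        items = resp.json()["message"]["items"]
        # Each Crossref record stores its title(s) as a list; keep the first when present.
        return [item["title"][0] for item in items if item.get("title")]

    if __name__ == "__main__":
        # Hypothetical placeholder: paste in a title from the AI-generated bibliography.
        citation_title = "Title of a citation from the generated bibliography"
        for match in crossref_title_matches(citation_title):
            print(match)

Students can compare the returned titles against the AI-generated citation; close matches suggest the work exists, while no plausible match is a prompt to dig further in the library databases rather than a definitive verdict.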

Discussion Questions

  • How did you verify the existence of the sources provided in the bibliography?
  • Were any of the sources in the bibliography “hallucinated”? Why do you think this happens?
  • If you were completing an assignment on this topic, do you think these sources would be your “best” option? Consider factors such as relevance, currency, and accuracy.

Considerations

  • Please refer to cautions that apply to all activities using Generative AI.

Example Prompts

  • Create a bibliography with five recent sources on [a topic relevant to your course]
  • Create a bibliography of the five most important publications in [a topic relevant to your course]

License


Using Generative AI With Your Students Copyright © 2024 by Nova Scotia Community College is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
