The following example from ChatGPT illustrates both the potential and the dangerous pitfalls of relying on artificial intelligence (AI) chatbots for drafting essays and other writing assignments. When faced with a prompt to produce a 500-word essay with footnotes explaining why Dickinson College should rename itself because John Dickinson was a slaveholder, the program produced this reasonable-sounding answer within a few seconds. That speed and fluency represent a major advance in natural language chatbots. However, while AI may sound reasonable, it is a terrible historian. This "critical" essay contains some factual mistakes, plenty of vague platitudes, and, far worse, it relies on fake sources. An actual human student in History 204 trying to pass off phony source material would face charges of an academic integrity violation leading to possible course failure or even expulsion. The technology may seem mesmerizing, but at this stage the reality is more fake intelligence than artificial intelligence.
In 2024, we tested the latest free version of ChatGPT with the following prompt:
Summarize the 2019 Dickinson & Slavery report using snippets of quotation and Chicago-style footnotes
Here’s what it produced (after about two seconds):