By Sam Illingworth
Publication Date: 2026-02-18 17:01:00
Most AI training courses teach you how to get results. Write a better prompt. Refine your query. Generate content faster. This approach views AI as a productivity tool and measures success by speed. It completely misses the point.
Critical AI knowledge asks different questions. Not “How do I use this?” but “Should I even use this?” Not “How do I make this faster?” but “What do I lose if I do?”
AI systems carry biases that most users never notice. Researchers analysing the British newspaper archive in 2025 found that digitised Victorian newspapers accounted for less than 20% of what was actually printed. The sample leans toward overtly political publications and away from independent voices.
Anyone who draws conclusions about Victorian society from this data risks reproducing distortions baked into the archive. The same principle applies to the data sets that underlie today’s AI tools. We cannot question what we do not see.
Literary scholars have long understood that…