I’ve been banging this drum for a while, and here it is again. The average person’s ability (and that includes you and me) to tell AI-generated content from the ordinary kind will go away very quickly. That’s partly because AI tools, processes, and outputs are genuinely getting better, but also because there’s no such thing as “AI”: it’s just a generic label we slap on a bucket of related techniques, many of which have been around for a while and are routinely applied in other contexts. So: (1) AI-generated stuff will look increasingly normal, and (2) normality will look increasingly AI-generated. Here’s a concrete example I just ran across: the opening credits to the Mary Tyler Moore Show, upscaled to 4K, look like they were AI-generated. Does it matter whether the upscaling software “really” used AI or not? No, and it’ll matter less with every month that passes. This has huge implications for discussions of AI; my conclusion is that any discussion that doesn’t clearly factor this in is probably a waste of time.
And this blurb is genius: