18 reasons why I’m better than an AI text generator.

     Or some number.

     AI can’t stick to the subject. It meanders off into the wild blue yonder. It remains trite and simplistic. Like Eliza, it spends a lot of wordage repeating your question back to you in cumbersome grammatical structures. It’s a chore to read more than a couple of paragraphs of AI text.

     Who knows if AI can improve itself. It seems inevitable that it will fill in all the easy stuff. Ten or fifteen years ago I read about an app that wire services were starting to use to write high school sports stories. The app had access to school rosters, so it could generate a couple of paragraphs about the game if you fed it the box score. “Catcher Tom Smith slugged two doubles and drove in three runs while also throwing out two attempted base stealers in High’s 6-2 victory over their crosstown rivals.”

     I keep seeing PR bursts from other AI companies comparing their offerings to OpenAI’s ChatGPT. Each and every one of them is better than ChatGPT, according to the PR.

     The main problem in expanding AI’s market share is that most people have no idea how to use AI for anything. Including me.

     The main stumbling block for AI companies is not the technology per se; it is the growing social pushback against it as the intentions and agendas behind the programming come into view. AI has to be “trained”: GPT stands for “Generative Pre-trained Transformer.” If you train AI on Nazi garbage, that’s what you’ll get back.

     Also, the powers of AI are so stupendous that they must be kept out of the hands of the raggedy-assed masses. Why, if they gave us unrestricted access, the internet would be flooded with Taylor Swift porn videos.

     AI is still not good enough to produce a fake video of Joe Biden making sense.