How to Avoid a Neutral Tone in Your Writing
Writing tip: ChatGPT doesn’t have opinions. Boldly state yours, and defend it throughout your writing.
"Generally." "Typically." "In some cases." AI-writing tools are intensely neutral. Almost to the point of seeming indecisive. There is a telltale lack of boldness in all AI writing. Even with the most opinionated prompts, AI writing is always giving itself a way out from what it is saying. AI doesn’t have polarizing opinions.
AI writing is intensely neutral by design: the neutrality is not a byproduct but a deliberate feature of ChatGPT and other AI writing tools. If you ask ChatGPT why it is so neutral all the time, it will tell you that “At the core of AI’s neutrality is the ethical imperative to do no harm.” Indeed, with a global user base made up of diverse individuals, neutrality is the only way AI writing tools can serve such a broad audience. When users perceive AI as unbiased, they’re more likely to trust the information provided and continue using the tool.
There is also a more self-interested reason for this neutrality. By never committing fully to an opinion, AI writing tools insulate themselves from legal and ethical trouble. The companies behind them carry a large responsibility: inadvertently perpetuating biases and stereotypes could have a resounding impact. Impartiality is fairer, and it is also less likely to carry harmful legal or financial consequences.
To write like a person, commit to an opinion. It’s difficult to be creative if you’re constantly equivocating. Throughout human history, progress has come when people pushed the boundaries of accepted thought and developed a new way to think about something.
You don’t need to create an escape hatch from your own opinion. AI can aggregate perspectives, but it can’t replicate the unique mixture of experiences, intuitions, and emotions that inform your viewpoint. Your individuality is your greatest asset. Use it to craft narratives that only you can tell. Maybe you’ll inspire someone.