Are We All Starting to Sound Like Robots? How AI Is Changing Our Language

[Image: a human face blending into a robotic head, illustrating how AI is changing our language]




It’s getting harder to tell what’s real online, isn’t it? We might think we can spot a fake image, and we know algorithms have their quirks, but evidence suggests we’re not great at noticing how they subtly change us. This is especially true when it comes to our language and what we think is normal.


Key Takeaways

  • Algorithms and AI aren't neutral; they shape our language, trends, and sense of identity to benefit platforms.
  • We often overestimate how extreme others' beliefs are, partly because social media algorithms show us more extreme content.
  • AI chatbots like ChatGPT can influence our word choices, with studies showing people using certain words more often after its release.
  • Platforms like Spotify can create and amplify trends (like hyperpop) by identifying user clusters and pushing content, blurring the line between real and artificially inflated trends.
  • This influence goes beyond language and consumption, potentially shaping our worldview and political leanings.
  • We need to constantly question why we see, say, and think certain things, and why platforms promote them, to maintain our own sense of reality.


The Perception Gap And Social Media


Think about the growing gap in how we see things, like political beliefs. We tend to think other people are way more extreme than they actually are. Social media makes this worse because the algorithms tend to show us the most extreme versions of reality. As someone who studies language and creates content, I’ve seen that controversial stuff gets shared more because it gets more reactions. This means we all end up seeing a more extreme picture of the world, and we start mistaking it for the real thing.



ChatGPT's Influence On Our Words


Something similar is happening with AI chatbots. You probably think ChatGPT is just speaking English to you, right? Well, it’s not speaking English in quite the same way we do. Just as the algorithm doesn’t show us reality perfectly, there are always distortions based on what the AI is fed and how it’s trained. For example, studies show that ChatGPT uses the word “delve” far more often than people typically do. One likely reason: the annotators paid to train it use that word more often themselves. Over time, this small difference gets reinforced in the model, becoming even more pronounced than in the annotators’ own language. And now it’s affecting everyone’s language. Multiple studies have found that since ChatGPT came out, people everywhere have been saying “delve” more in everyday conversation.


Essentially, we’re subconsciously mixing up the AI’s version of language with real language. This means the real thing is ironically becoming more like the machine version. It’s a feedback loop: AI shows us a version of reality, we think it’s real, and then we give it more data, which it uses to create more AI versions.
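This feedback loop is easy to picture as a toy simulation. The sketch below is purely illustrative: all the numbers (the starting rate, the model’s 1.5× over-representation, the 20% human drift) are made-up assumptions, not measurements from any real study.

```python
# Toy model of the feedback loop: the AI over-represents a word,
# people drift toward the AI's usage, and the next model trains
# on that shifted text. All parameters are illustrative guesses.

def train_model(human_rate: float, boost: float = 1.5) -> float:
    """The model learns the word's frequency from human text but
    over-represents it (e.g. due to annotator habits)."""
    return min(1.0, human_rate * boost)

def update_humans(human_rate: float, model_rate: float,
                  influence: float = 0.2) -> float:
    """Humans drift part of the way toward the model's usage."""
    return human_rate + influence * (model_rate - human_rate)

human_rate = 0.01  # say, 1% of texts use "delve" at first
for generation in range(5):
    model_rate = train_model(human_rate)
    human_rate = update_humans(human_rate, model_rate)
    print(f"gen {generation}: humans {human_rate:.4f}, model {model_rate:.4f}")
```

Even with a small per-generation nudge, the human rate only ever moves upward, toward the machine’s version: the loop has no corrective force pulling it back.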



How Trends Get Made (And Amplified)


You can see this happening with algorithms and trends too. Take “hyperpop” music. It wasn’t really a thing until Spotify noticed a group of similar listeners in its algorithm. Once they created a “hyperpop” playlist, the style got a direction. People started arguing about what counted as hyperpop. The label and the playlist made the trend feel more real by giving people something to connect with or push against. As more people identified with hyperpop, more musicians started making it. All the while, the group of listeners and the algorithm got bigger, and Spotify kept pushing it because these platforms want to keep you engaged.


This blurs the line between a real trend and one that’s been artificially boosted. And this is how many fads enter the mainstream now. We start with a quiet interest in something, like maybe matcha or a specific chocolate bar. The algorithm spots this interest and pushes it to similar users, making the thing bigger. But just like how ChatGPT might misrepresent a word, the algorithm might misrepresent what people actually want. Businesses start making more of that product because they think it’s what people desire. Influencers create trends around it to go viral. But the algorithm often only shows you the most visually striking things that work well in videos.
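The amplification dynamic above can also be sketched in a few lines. This is a hypothetical toy, not how Spotify or TikTok actually work: the push ratio and conversion rate are invented for illustration.

```python
import random

# Toy sketch of algorithmic trend amplification (illustrative only):
# a platform spots a small cluster of interested users, pushes the
# item to a pool of "similar" users, and some of them convert,
# growing the cluster that the algorithm then pushes from next time.

random.seed(42)

def amplify(interested: int, population: int,
            push_ratio: float = 2.0, conversion: float = 0.3) -> int:
    """Show the item to a pool proportional to the current cluster;
    a fraction of those shown it become interested."""
    shown = min(population - interested, int(interested * push_ratio))
    converted = sum(1 for _ in range(shown) if random.random() < conversion)
    return interested + converted

interested = 50  # a quiet niche within a population of 100,000
for week in range(8):
    interested = amplify(interested, 100_000)
    print(f"week {week}: {interested} interested users")
```

The point of the sketch: the niche grows because the platform keeps pushing, not because the underlying interest changed, which is exactly why a boosted trend is hard to tell apart from an organic one.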



The Danger Of Distorted Reality


TikTok, for instance, has a limited idea of who you are as a user, and it can’t possibly match your complex desires as a human. So, we have biased information going in, and that’s assuming social media is even trying to show us reality accurately, which it isn’t. Instead, it’s focused on what makes money. It’s in Spotify’s interest for you to listen to hyperpop, and it’s in TikTok’s interest for you to watch certain videos because that’s what can be sold. So, we have this gap between reality and its representation, and they constantly influence each other.


This is dangerous because it goes beyond just our language and what we buy. It affects the world we see as possible. For example, studies suggest that ChatGPT is more conservative when speaking Farsi, likely because the limited texts it was trained on in Iran reflect a more conservative political climate. Could this mean that an Iranian ChatGPT user starts thinking more conservative thoughts? We know that people in power sometimes change AI responses they don’t like, and use their platforms to boost their own messages. Does this mean millions of users are subconsciously being trained to agree with that ideology?



Staying Real In An AI World


We need to constantly remember that these aren’t neutral tools. Everything you see on social media or in chatbot responses has been filtered through what’s good for the platform, what makes money, and what fits the platform’s idea of who you are. When we ignore this, we see reality through a constant filter, which shapes how we understand the world. If you’re talking more like ChatGPT, you’re probably thinking more like it too. The same goes for TikTok and Spotify.


But you can fight this. Constantly ask yourself: Why am I seeing this? Why am I saying this? Why am I thinking this? And why is the platform pushing this? If you don’t ask these questions, their version of reality will become yours. So, stay real.


