Sep 25, 2014 · 3 minutes

Human communication has always been a struggle when smartphone keyboards are involved. There are entire sites documenting the hilarious mistakes wrought by the iPhone's imperfect autocorrect feature, which more often than not makes text messages far less clear than a simple human spelling error would.

Now Apple is taking its text predictions to a whole other level with QuickType, a new feature included in its iOS 8 update. The tool attempts to predict each word of a text message based on cues it's taken from every text or email a user has sent on an iPhone. Apple's been spying on your every digital gesture, just not in the way you might expect.
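Apple hasn't published QuickType's internals, but next-word prediction of this kind is classically built on n-gram language models trained over a user's past messages. A minimal sketch of the idea (the message history and function names here are hypothetical, purely for illustration):

```python
from collections import Counter, defaultdict

def train_bigram_model(messages):
    """Count which word tends to follow which across past messages."""
    model = defaultdict(Counter)
    for message in messages:
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def suggest(model, prev_word, k=3):
    """Return the k most likely next words -- a crude suggestion bar."""
    return [word for word, _ in model[prev_word.lower()].most_common(k)]

# Toy stand-in for a user's sent texts (entirely made up).
history = [
    "I don't know how to make it",
    "how to make dinner tonight",
    "I don't know yet",
]
model = train_bigram_model(history)
print(suggest(model, "know"))  # suggestions learned from this user's history
```

Real keyboard models are far more sophisticated, but the unsettling part is the same: the suggestions are a statistical mirror of everything you've already typed.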

When I tried out QuickType, Apple thought I should tell a close friend that "I don't know how to make it a lot of fun." What are you trying to say, Apple? Am I depressed?

Apple's obsessive monitoring is creepy enough, but the fact that it now believes it can speak on your behalf is even more disconcerting. QuickType is an extreme example, but predictive behavior modification finds its way into every corner of our digital lives. Auto-complete tools, used by Google, Facebook, and Twitter with varying degrees of subtlety, are among the oldest and most familiar tactics tech companies use to influence what and whom we search for and therefore what we find on the other side of the click. Powerful recommendation engines from Amazon and Netflix aim to guide your purchasing and consumption habits, respectively. With every link we click, word we type, and post we like, the Internet's algorithms are listening -- and waiting to strike.

Much of this is done, ostensibly, to save people time and make our digital lives easier: "Why yes, Google, I did mean to add an attachment to this email." "A new season of 'Mad Men' is now available? Thanks for the tip, Netflix!"

But while predicting our behavior is one thing, seeking to replicate it is an entirely different level of weirdness. This is where the line between who we are and who robots want us to be begins to blur. Are these algorithmic predictions a product of my behavior? Or is my behavior a product of the algorithms? QuickType is a big step toward the dystopian future envisioned by the television show "Black Mirror," wherein a technology company recreates a robotic version of a deceased loved one based on every email, tweet, text, or Facebook post that person ever sent. The robot's speech patterns, vocabulary, and even its ideas are replicated by a sophisticated algorithm to create an alluring yet fundamentally inauthentic illusion of a human.

To take things even further, why should robots only ape our behavior after we die? Say I don't feel like meeting a friend for dinner. I'll just send my cyborg double while I stay at home watching Netflix -- thus feeding the algorithm even more data to create a more perfect me.

We're obviously not there yet, but these tools are already shaping and possibly stunting our sense of identity in significant ways -- and I fear the effect will only intensify with each younger generation. Thirty-year-olds like me had the luxury of finding ourselves on a pre-digital stage. There were no apps like Google Now to predict what we want before we want it. But it's worth considering the consequences of these tools when they're introduced into the lives of young people who haven't yet had the time to develop their own unique personalities. The result may be a sea of crippling conformity smuggled in under the guise of "personalization." Everything on the web can be personalized, but you have to start with the person first.

Philosophers have debated the nature of "free will" for centuries, with some arguing that it's the highest expression of a human's selfhood and others arguing that free will is an illusion -- our choices merely the product of millennia of genetic variation and the circumstances we find ourselves in. Now, as algorithms push and pull on our digital actions and shape our behavior into their own data-corrupted images of us, these debates over free will and identity may zoom off on an entirely new vector in the digital age.

First humans claimed God pulled the strings. Then we said it was genetics. Now the agents of our fates are slowly becoming corporate-controlled robots.