For a few years now, a little conspiracy theory has been going around the Internet that Facebook is listening in on your private conversations. I have friends and family who wholeheartedly believe it. It makes sense, right? At dinner, you were discussing vacation plans with your friend. “Should we go to Tokyo or Sydney?” you ask. The conversation carries on like this throughout dinner. Unbeknownst to you, those seemingly innocuous phones of yours have been sitting quietly on the table, listening in on every word you say.
After dinner, you go home, fire up Facebook, and in your feed, there’s an advertisement for discount flights to Tokyo. Your friend also goes home, fires up Facebook, and in her feed, there’s a similar, if not identical advertisement.
Maybe you’re aware of it, and you call up your friend, and say, “Hey, you know how we were talking about our vacation at dinner? Well, I was just pushed an ad for cheap flights to Tokyo.”
Perhaps you’re unaware of it, and you call up your friend, and say, “Hey, it’s settled, we’re going to Tokyo. I just got us discounted flights.”
Either way, the result is likely the same, and the intended one at that. Somebody paid top dollar to ensure that you’re going to Tokyo, instead of Sydney.
Do actions speak louder than words?
Is the aforementioned scenario a coincidence? Hardly. Are they listening? Well, kind of, but not in the way that you probably think. The thing is, they don’t need to listen. Listening isn’t efficient, as explained by Facebook’s first ads-targeting product manager, Antonio García Martínez, in his op-ed for Wired. In short, it’s nearly impossible to handle that amount of audio data, but more importantly, it’s the wrong kind of data to begin with.
As a software engineer, this makes perfect sense to me, but conspiracy theories are much more fun, and make you, the user, much less culpable. Ignorance is bliss, but ignorance makes me sad, so I’d like to educate you, if you’ll let me.
The reason Facebook doesn’t need to listen in on everything you say in order to know everything about you is because they have far more sophisticated ways of doing this. Perhaps you’re familiar with the idiom Actions speak louder than words? In other words, what you say is much less important than what you do. In the non-digital world, you can be as dishonest as you like. In the digital world, everything you do is a statement of absolute truth about who you are.
For example, did Ted Cruz, the married Texas senator, who once defended a ban on sex toys, like a pornographic post on Twitter? He says, “no,” and we can believe whatever we like, but Twitter knows the truth, and if he was on Google beforehand, or Amazon, or, or, or, well, they know too. Again, that data doesn’t matter. What matters, at least to the people selling sex toys, is that even though Ted Cruz says he’s against sex toys, he’s likely to buy them, so let’s send him an ad for a pocket pussy because the guy likes to masturbate.
The irony of social media is that we’re under the impression that we’re curating our lives. While we might be fooling that former high school crush, or even a future employer (for now), we’re certainly not fooling AI, because our data never lies. You’re free to live two lives, three, or as many as you like, at least until you end up in an institution. So why do we care so much about a company that says one thing but does another?
Is Facebook two faced?
More recently, Facebook’s “10 Year Challenge” popped up. It was everywhere. You couldn’t escape it. You can’t escape it still. I cringed every time someone in my network posted their then-and-now profile pictures. For the most part, it wasn’t my inner network, but rather the interlopers I’ve picked up over the years and basically forgotten about, until I get a birthday reminder, or in this case, an unwanted post.
I’ve started deleting said interlopers, and on their birthdays no less, because it’s right there in your face. My therapist says, “One less meaningful relationship per day does the body good.”
Whenever I get unwanted posts such as these, alarm bells go off in my head, and they should for you too, because Facebook doesn’t make mistakes when it comes to the content in your feed. This is what they do. If you see something you don’t like, strange games are afoot, and almost certainly, someone paid to put it there.
For my part, the cringe was threefold. The first, and less important, was my simple embarrassment for these people. The second, was a deeper understanding of network effects and how they’re exploited. And the last, which overlaps with the second, is how easily large populations can be reached and manipulated these days.
Take, for example, the Nazi propaganda machine. They had to seize complete editorial control from the free press through Gleichschaltung in order to spread their messages of hate to the masses. It took years to sway a single country to do unimaginable things, perhaps the most unimaginable thing of all: giving up its own free will.
Today, if you’d like to achieve a similar effect, you just have to purchase an ad on Facebook or pay an influencer. If you’re more sophisticated and would like to do something similar to the Nazis, like influence an election, you could just go the way of Cambridge Analytica. What I’d like you to take away from this is that getting someone to purchase or vote in the way that you’d like amounts to the same thing.
People can be easily manipulated to act on your behalf, especially when you know everything about them. They’ll even give up their free will, if you like.
Of course, the 10 Year Challenge isn’t nearly as nefarious. Well, at least as far as we know, and that’s kind of the point: most people don’t know. Sure enough, days after the challenge began, Wired’s Kate O’Neill speculated that maybe, just maybe, this fire meme that had taken over the Internet was a data-mining tactic.
Should you unfriend Facebook?
Does Facebook want to friend you? Yes, dummy. Facebook, however, is the worst kind of friend to keep. They’re the kind of friend that doesn’t return your calls but rings you up every time they need a favor. Or the kind of friend that doesn’t split their portion of the bill but orders Veuve Clicquot for the table. So forth and so on.
Facebook, like you or me, is a person, at least in the legal sense. Unlike you or me, its moral responsibilities are much narrower than ours. In fact, it has exactly one: a fiduciary responsibility to make as much money as possible for its shareholders. So, it’s easy to understand why Facebook is such a shitty friend.
Still, with billions of friends, and a whole lot of influence, it’s a hard relationship to say goodbye to. It’s like being friends with the most popular kid in high school who keeps you in her circle, but mostly makes fun of you. Do you continue to get picked on just to stay in the popular crew, or do you stand up for yourself only to find yourself with no friends at all? Is there a lesser of two evils? Always. Which suits you best? Well, that’s up to you, isn’t it?
All of life is a contract. I’ll repeat, all of life is a contract.
At the highest level, we all belong to the social contract. Thomas Hobbes is perhaps the most brutal in his interpretation of it and is my personal favorite. This makes sense, he was an empiricist. That being said, I believe the rationalist Jean-Jacques Rousseau’s interpretation is much more agreeable:
“[The social contract] can be reduced to the following terms: Each of us puts his person and all his power in common under the supreme direction of the general will; and in a body we receive each member as an indivisible part of the whole.” (Jean-Jacques Rousseau)
Everything underneath this is still contractual, but to a lesser degree.
For instance, I have a marriage contract with my wife from now until death. The contract is meant to be mutually beneficial. There is no prenup, but there are terms to keep us protected should one party betray our contract.
When you get out of bed in the morning, that’s a contract with yourself, to follow through with all of the other contracts you’ve made for the rest of the day. Go to work. Go to the gym. Call your mother. Meet your friends for dinner. So forth and so on.
You and Facebook have a contract. No amount of ridiculous posts declaring that you revoke Facebook’s right to your content, blah, blah, blah, can change this. It’s insane. You should know this. Stop doing this. The good news, however, if you have the time to read them, is that like my marriage contract with my wife, there are terms. If you don’t like them, you can remove yourself from the contract. Tricky, though: like my marriage, whatever you did or shared lives on with the other party. Your photos. Your comments. Your rants. They also belong to Facebook.
Can you ever unfriend Facebook? I don’t know. Can you ever forget a longtime friend? Can you ever forget a longtime lover? There are distances to which each of us is capable of going, but none of us is capable of going the distance. Computers, though, they can go the distance.
Ghost in the machine
Traditionally, the only way to get a computer to do something was to write down an algorithm explaining how. But we’ve come so much further. Today machine-learning algorithms are different: they figure it out on their own, by inferences from data. And the more data they have, the better they get. Now we don’t have to program computers; they program themselves.
Facebook uses machine learning for things we’ve already discussed, like deciding which updates to show you based on preferences drawn from over 80,000 indicators. Facebook has billions of users across its products. Most users post almost blow-by-blow accounts of their lives, too; it’s like having a live feed of social life on planet Earth. And Facebook is getting better every day at figuring out what to do with all that data.
For instance, if Facebook did, in fact, use the 10 Year Challenge as a data-mining tactic to refine its facial recognition technology for aging, it might use nearest-neighbor, one of the simplest learning algorithms ever invented: it learns instantly, because all it does is memorize examples. It has already learned to recognize faces through the vast database of images we’ve given it, labeled face/not face. Perhaps we’ve now given it another vast database of images, labeled face then/face today.
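To make that concrete, here’s a minimal sketch of nearest-neighbor matching. The three-number “embeddings” below are invented toy values; a real system would use high-dimensional vectors produced by a face-recognition model.

```python
def nearest_neighbor(query, gallery):
    """Return the index of the gallery vector closest to the query.

    A nearest-neighbor "classifier" does no training at all: it simply
    memorizes labeled examples and, at prediction time, returns the
    stored example most similar to the new one.
    """
    def dist(a, b):
        # Squared Euclidean distance between two vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(gallery)), key=lambda i: dist(gallery[i], query))

# Toy stand-ins for face embeddings from "then" photos.
faces_then = [
    [0.9, 0.1, 0.3],  # person A, ten years ago
    [0.2, 0.8, 0.5],  # person B, ten years ago
    [0.4, 0.4, 0.9],  # person C, ten years ago
]

face_today = [0.85, 0.15, 0.35]  # a "today" photo to match

print(nearest_neighbor(face_today, faces_then))  # 0: closest to person A
```

A then-and-now photo pair is exactly the kind of labeled example such a matcher feeds on: the “then” face goes in the gallery, and the “today” face confirms the match.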
These days, Facebook can predict nearly everything. Well, everything social, but a lot can be drawn from these connections. Take, for instance, friend recommendations. It might use the rule friends of friends are likely to be friends, where each instance involves three people: if Rebecca and Naomi are friends, and Naomi and Blake are also friends, then Rebecca and Blake are potential friends. Each such rule can be turned into a feature template in a relational model, and a weight for it can be learned based on how often the feature occurs in the data. In fact, the features themselves can also be learned from the data.
All this power comes at a cost, however. In an ordinary classifier, inferring an entity’s class from its attributes is a matter of a few lookups and a bit of arithmetic. In a network, each node’s class depends indirectly on all the others’, and we can’t infer it in isolation. A typical social network has millions of nodes or more. Luckily, because the model of the network consists of repetitions of the same features with the same weights, we can often condense the network into “supernodes,” each consisting of many nodes that we know will have the same probabilities, and solve a much smaller problem with the same result. A problem like, Tokyo or Sydney?
So, going all the way back to the beginning, here’s why it seems like Facebook is listening to you.
In the days leading up to your dinner with your friend, you both knew that you would be discussing your upcoming vacation. You didn’t do any research, but your friend did.
As she clicked around the Internet looking at all sorts of travel related websites about Tokyo and Sydney, a little bit of code followed her around called a Facebook Pixel. But she’s not on Facebook, you say. Tricky right? It doesn’t matter, it follows her anyway.
Meanwhile, on Instagram, you’ve spent a lot of time, liking, commenting, and just generally admiring mutual friends posts from Tokyo. Maybe you’ve gone even deeper and searched hashtags pertaining to Tokyo, or locations, or restaurants, etc.
Elsewhere, your mutual friend Sara, just got back from Tokyo. Both of you spent time looking at, and liking, her photos.
Facebook pulls all this noise together, this data, and develops a clear picture for an advertiser. These two girls, who are close friends, have been spending a lot of time looking at vacations, specifically, Tokyo.
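As a rough illustration of how scattered signals become a single targeting decision, here is a toy aggregation. The signal names and weights are invented for illustration; they are not Facebook’s actual model.

```python
# Each tuple: (topic, signal name, weight). All values are hypothetical.
signals = [
    ("Tokyo",  "pixel_visit_travel_site", 3.0),  # friend's browsing, via the pixel
    ("Tokyo",  "instagram_like",          1.0),  # liking mutual friends' Tokyo posts
    ("Tokyo",  "hashtag_search",          2.0),  # searching Tokyo hashtags/locations
    ("Tokyo",  "album_views",             1.5),  # browsing Sara's trip photos
    ("Sydney", "pixel_visit_travel_site", 3.0),
]

def interest_scores(signals):
    """Sum weighted signals per topic into a single interest score."""
    scores = {}
    for topic, _signal, weight in signals:
        scores[topic] = scores.get(topic, 0.0) + weight
    return scores

scores = interest_scores(signals)
print(max(scores, key=scores.get))  # Tokyo
```

Many weak signals, none of which is a recording of your dinner conversation, add up to a confident answer to “Tokyo or Sydney?”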
And so, you get pushed ads the same night you were talking about traveling to Tokyo. Magic? Yes, kind of. Black magic, and it’s about to change the world. Were they listening? They didn’t need to. That’s old news, Cold War spy shit; we’ve evolved.
Don’t ghost me, influence me.
One of the things I’m most interested in is what to do with this technology. One phenomenon of relational learning that’s particularly intriguing is word of mouth. How does information propagate in a social network? Can we measure each member’s influence and target just enough of the most influential members to set off a wave of word of mouth? The answer is yes. Statistically, marketing a product to the single most influential member, one trusted by many followers who are in turn trusted by many others, and so on, is as good as marketing to a third of all the members in isolation.
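A deliberately simplified sketch of that cascade, with a made-up follower graph: here information passed to a member spreads to all of their followers with certainty, so a seed’s reach is just the set of members reachable from it. Real influence models (independent cascades, for example) make each hop probabilistic.

```python
from collections import deque

def reach(followers, seed):
    """Return the set of members eventually reached from `seed`.

    Breadth-first traversal of the follower graph: everyone who hears
    the message repeats it to all of their own followers.
    """
    seen = {seed}
    queue = deque([seed])
    while queue:
        member = queue.popleft()
        for follower in followers.get(member, []):
            if follower not in seen:
                seen.add(follower)
                queue.append(follower)
    return seen

# Hypothetical graph: edges point from a member to those who trust them.
followers = {
    "influencer": ["fan1", "fan2", "fan3"],
    "fan1": ["friend_of_fan1"],
}

print(len(reach(followers, "influencer")))  # 5: the seed plus four others
```

Seeding the one well-connected member reaches the whole toy network; seeding a leaf reaches almost no one, which is the whole argument for targeting the influential few.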
In Isaac Asimov’s Foundation, the scientist Hari Seldon manages to mathematically predict the future of humanity and thereby save it from decadence. According to Seldon, people are like molecules in gas, and the law of large numbers ensures that even if individuals are unpredictable, whole societies aren’t. Relational learning reveals why this is not the case. If people were independent, each making decisions in isolation, societies would indeed be predictable, because all those random decisions would add up to a fairly constant average. But when people interact, larger assemblies can be less predictable than smaller ones, not more. If confidence and fear are contagious, each will dominate for a while, but every now and then an entire society will swing from one to the other.
It’s not all bad news, though. If we can measure how strongly people influence each other, we can estimate how long it will be before a swing occurs, even if it’s the first one.
Let’s be like Hari Seldon, but not get it wrong this time around.
The most important question of the 21st century.
One of the most important questions of the twenty-first century, is who you share your data with. The most important question of the twenty-first century, is what we do with all that data.
Today your data can be of four kinds: data you share with everyone, data you share with friends or coworkers, data you share with various companies (wittingly or not), and data you don’t share.
The first type includes things like Yelp, Amazon, and TripAdvisor Reviews, LinkedIn resumes, blogs, tweets, and so on. This data is very valuable and is the least problematic of the four. You make it available to everyone because you want to, and everyone benefits.
The second kind of data should also be unproblematic, but it isn’t because it overlaps with the third. You share updates and pictures with your friends on Facebook. Facebook has billions of friends. Day by day, it learns a lot more about the world than any one person does. It would learn even more if it had even better algorithms than the ones we discussed, and they’re getting better every day.
Facebook’s main use for all this knowledge is to target ads to you. In return, it provides the infrastructure for your sharing. That’s the contract you make when you use Facebook. The biggest problem is that Facebook is also free to do things with the data and the models that are not in your interest, and you have no way to stop it.
What did we learn?
Well, if you’ve made it this far, we’ve learned that we know very little, and we ought to learn more.
We’re concerned with silly little things, like privacy, because of our own egos and fears, but it’s these base characteristics of humanity that are the problem, and it’s what AI exploits best.
For quite some time, well before you met your friend for dinner, Facebook already knew that you were unhappy with your life, that you longed for connection, that you needed to escape, and that you would go as far as Tokyo, and even blow up your credit rating to do just that.
Some of these things are conveniences, but conveniences for what? We don’t live important enough lives to have servants. I don’t know about you, but at the turn of the next century, I’d like for humanity to lead meaningful enough lives to deserve not a servant, but an assistant of this nature. Nevertheless, if we stay on this path, and remain ignorant, we will most certainly become the servants of computers.
Machine learners have no ego, no fear, and this is their greatest strength. Conversely, they don’t care about you, or me, and this is their greatest weakness.
It’s all sad but it’s not bad,
Stephen