The "Hackable" Human may become more "Hackable" and hence more vulnerable.
Affective Computing is the study of systems designed to make use of human emotions. What happens when it's applied at scale?
I was scrolling through Twitter when an “interesting” article[1] popped up on my feed:
Aggression detection you say? Facial Recognition? Hmmm. Interesting.
I had time on my hands, so I opened the link.
This Russian firm had plans to sell its “Aggression Detection” software to countries in Latin America, the Middle East, and maybe… just “maybe” Asia (as if the company hadn’t considered China in its prospects).
This didn’t sit well with me.
Facial Recognition technology may be the one thing almost every techie agrees is bad. Almost… perhaps not the growing crowd of Neo-fascist Dark Enlightenment[2] figures out there, but certainly those of us who are more “liberty attuned”. We (techies and the futurists in our midst) are a culture that has endured and thrived in privacy. Many of us came to the internet to pursue the sort of privacy we felt was missing in the real world.[3] We saw ourselves as susceptible to coercion by governments, and so we sought refuge.
Today, we find ourselves lacking privacy everywhere. The internet that once felt private has ceded ground to advertisers.[4] As if things weren’t bad enough in meatspace already!
Humanity creates technology to surpass its limitations. But as I’ve mentioned before[5], we have ways of fucking it all up and somehow limiting ourselves (through laws or technology).
Sometimes this is necessary. We would jump to our oblivion if it were not for the laws of gravity. Likewise, imagine if we had no laws to regulate bio-weapons or nuclear weapons. Though certain parties may try their best to violate these agreements, we find that the majority comply.
This latest limitation, on the other hand, that of surveillance technology… that shit genuinely scares me.
It’s not just the countless ways in which it has been shown to misread People of Color. That part is bad enough[6]. But couple that with the growth of authoritarianism almost everywhere… and one sees a ghastly image being painted.
This particular article depicted a company that seeks to merge surveillance with something different. Its selling point was not just watching people as an ever-present observer. It included the “supposed” ability to detect and interpret the emotions of the person being watched. In this case, its technology claimed to interpret “Aggression”.
This is called Affective Computing[7].
And no, I do not mean the surveillance aspect of things. Just the “interpreting the emotions” part of it all.
The term Affective Computing was coined by the computer scientist Rosalind Picard and refers to the study and development of systems that make use of human affects/emotions.
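To make that definition concrete, here’s a toy sketch of the affective-computing loop: sense a signal, infer an emotional state, adapt behavior. Everything in it (the features, the thresholds, the labels) is invented for illustration; real systems feed camera, microphone, or physiological data into trained models.

```python
from dataclasses import dataclass

@dataclass
class FacialFeatures:
    """Crude stand-ins for what a vision model might extract from a face."""
    brow_furrow: float  # 0.0 (relaxed) to 1.0 (furrowed); hypothetical scale
    smile: float        # 0.0 (neutral) to 1.0 (broad smile)

def infer_affect(f: FacialFeatures) -> str:
    """Map crude facial features to a coarse emotional label.
    The thresholds are made up; this guesswork is what vendors sell."""
    if f.brow_furrow > 0.7 and f.smile < 0.2:
        return "aggression?"
    if f.smile > 0.6:
        return "positive"
    return "neutral"

def respond(affect: str) -> str:
    """The 'computing' half: the system adapts to the inferred affect."""
    return {
        "aggression?": "alert a human operator",
        "positive": "carry on",
        "neutral": "carry on",
    }[affect]

features = FacialFeatures(brow_furrow=0.8, smile=0.1)
affect = infer_affect(features)
print(affect, "->", respond(affect))  # aggression? -> alert a human operator
```

Notice how much rides on those invented thresholds. That question mark in “aggression?” is doing a lot of work, and it’s the part the sales brochures leave out.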
Human emotions are a fickle thing, and our ability to read the emotions of others shows up in the status of a lot of people who succeed. When we talk about success and emotion in the same sentence, we invoke the acronym EQ (Emotional Intelligence Quotient) for a reason. We like to think that our ability to interpret (more like guess) the emotions of others leads us to more favorable outcomes[8].
I’m halfway done with Seth Godin’s book This is Marketing (I’ll review it and post my notes in due time). I started reading it primarily due to my skepticism of marketing.
I sat there (admittedly scrolling through Twitter) asking myself why the fuck companies put so much time and effort into convincing people to part ways with their cash. The answer was obvious: because it makes them more money! But perhaps what I was really trying to get to the root of was why I felt marketing to be so vile.
We all hate ads. We all hate how we are targeted with ads. I mean, that shit mostly involves selling our data, something we’ve become so desensitized to because it is the price we have to pay for participation in today’s virtual economy. Marketing and ads are as old as the rise of modern-day America[9].
But Seth makes a great case for the empathy involved in marketing. He tries to argue that we are all marketers because we all have a personal narrative we want to convey to the world. That was when it hit me.
That was why marketing was so effective. At the core of the greatest marketing campaigns in the world was the ability to understand our emotions and provide a product that fits our desires. In short, all great marketers were doing was empathizing really fucking well. And so, marketing dominates every aspect of our lives.
Why did you buy that MacBook or that iPhone? Those questions can be easily answered if there are material conditions involved. Many times there are. But other times, there aren’t. Take, for example, that one person who tries to dress “à la mode”. Or the early adopter who buys technologies even when they suck at the time of release. THIS is where and why marketing excels. Because it can grapple with those emotions you aren’t conscious of. It understands your desire for status even if it’s subconscious. It’s just really good Social Engineering[10].
Now back to our discussion on Affective Computing and surveillance technology. So far we know that this technology falls short of its promises. Affective Computing technology that predicts emotions correctly only 30% of the time is bound to negatively affect a society that depends so heavily on human emotions. Our institutions are predicated on the way we feel about certain topics (hence the existence of ideologies). Facial Recognition can’t even get identity right.
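And even if the raw numbers improve, the math of rare events works against something like an “aggression detector”. Here’s a minimal back-of-the-envelope sketch (all numbers hypothetical, and far more generous than that 30% figure) of how a system that is right 90% of the time still produces almost nothing but false alarms when genuine aggression is rare:

```python
def flagged_breakdown(population, base_rate, sensitivity, specificity):
    """Return (true_positives, false_positives) among the people flagged."""
    aggressive = population * base_rate
    peaceful = population - aggressive
    true_positives = aggressive * sensitivity        # correctly flagged
    false_positives = peaceful * (1 - specificity)   # wrongly flagged
    return true_positives, false_positives

# Hypothetical: 1 in 1,000 people on camera is actually aggressive,
# and the detector catches 90% of them with 90% specificity.
tp, fp = flagged_breakdown(population=1_000_000, base_rate=0.001,
                           sensitivity=0.90, specificity=0.90)
print(f"True positives:  {tp:,.0f}")                  # 900
print(f"False positives: {fp:,.0f}")                  # 99,900
print(f"Flags that are wrong: {fp / (tp + fp):.1%}")  # ~99.1%
```

Ninety-nine out of every hundred alerts point at an innocent person, and that’s under assumptions the current technology doesn’t come close to meeting.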
That said, we also know that it won’t be long until there is a breakthrough. The story of computing has been one of desire. Get enough people to care about an issue and it will be solved.
Knowing this, we should do everything in our power to prevent this sort of technology from becoming the “norm”. If any technology needs heavy regulation, it is this one.
Why? Well, to answer this we must ask ourselves why authoritarian governments seek this sort of technology in the first place. We must also ask ourselves why companies would enter such a market.
The answer to the latter can be described in terms of growth in demand. If there’s a want, there will be people who think they can satisfy that want (at a profit).
According to Research and Markets, the Affective Computing market is bound to explode at a Compound Annual Growth Rate of 37.4% during the 2020-2025 period[11]. I have no clue how the pandemic will impact this projection, but given how much surveillance technology we’re already seeing integrated into our daily lives[12], it is bound to bolster those numbers.
They describe this growth in this way:
The increasing need to capture customer behavior & personality, the demand for software platforms for the provision of efficient learning tools in educational institutes, and the rapidly growing use of supportive tools for medical emergencies are the major drivers that have bolstered the software providers to offer effective computing software platforms.
That’s for Affective Computing alone.
Facial Recognition technology registered a CAGR of 17.6%. That’s less growth than Affective Computing technology, but dramatic growth regardless.[13]
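If those CAGR figures feel abstract, the underlying arithmetic is simple compounding. The press release behind [11] puts the 2025 Affective Computing market at roughly $139.9 billion; the 2020 starting point below is my own back-calculation from that figure and the 37.4% rate:

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years
cagr = 0.374
end_value_2025 = 139.9  # $B, per the Research and Markets release [11]
years = 5

start_value_2020 = end_value_2025 / (1 + cagr) ** years
print(f"Implied 2020 market size: ${start_value_2020:.1f}B")  # ~$28.6B

value = start_value_2020
for year in range(2020, 2026):
    print(f"{year}: ${value:.1f}B")
    value *= 1 + cagr

# The market nearly quintuples in five years. At facial recognition's
# 17.6% CAGR, the same period yields "only" about a 2.25x increase.
```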
I haven’t come across any stats depicting the market for a niche that fuses both. But would I be wrong in assuming that there may be a large unfulfilled demand for that sort of niche?
I mean, business reports always fail to mention the entities that would most benefit from such technologies.
In all those reports whose stats I cited, there’s this desire to stay neutral about autocratic uses. Companies are cited as the main customers, but governments can be customers too, can’t they?
Let’s take Business Insider’s summary of the report as an example. When talking about Chinese demand, they say:
- For instance, China is witnessing a high mobile transactional volume, which is expected to create a demand for biometrics. According to the China Internet Network Information Center (CNNIC), in 2018, around 583.4 million people used mobile payment transactions in the country, up from 527 million users in 2017.
- The ongoing COVID-19 pandemic, affecting more than 140 countries in the present day, has increased the importance of facial recognition payments. Telpo, in China, is pitching this recognition technology, as a method for a sanitary alternative to fingerprint scanners, and for identifying individuals without the risk of close contact.
Notice the one entity missing from the discussion? If you said the Chinese government, you deserve a round of applause. If you said anything but that… well… you may have been living under a rock for the past few years. I’ll leave you some educational material down below.
The Russian company I mentioned before (NTechLab) received massive funding from the Russian government as well as from an “anonymous” Middle Eastern partner[14].
So we’ve seen that there’s steady demand for this sort of technology for marketing purposes, but evidently for autocratic surveillance as well. I haven’t yet talked about why these regimes seek this technology, and I probably should (although it is a bit self-evident).
The one thing authoritarian regimes value most (besides the maintenance of power) is social order.
China has become the sort of dystopia we’ve all secretly craved. And yet upon reflection, we don’t want it that much anymore, do we?
In an article for The Atlantic titled “The Panopticon Is Already Here”[14], Ross Andersen writes:
Xi’s pronouncements on AI have a sinister edge. Artificial intelligence has applications in nearly every human domain, from the instant translation of spoken language to early viral-outbreak detection. But Xi also wants to use AI’s awesome analytical powers to push China to the cutting edge of surveillance. He wants to build an all-seeing digital system of social control, patrolled by precog algorithms that identify potential dissenters in real time.
He goes on to cite ways in which this sort of surveillance is being used to undermine the rights of the Muslim Uyghur minority:
Meanwhile, AI-powered sensors lurk everywhere, including in Uighurs’ purses and pants pockets. According to the anthropologist Darren Byler, some Uighurs buried their mobile phones containing Islamic materials, or even froze their data cards into dumplings for safekeeping, when Xi’s campaign of cultural erasure reached full tilt. But police have since forced them to install nanny apps on their new phones. The apps use algorithms to hunt for “ideological viruses” day and night. They can scan chat logs for Quran verses, and look for Arabic script in memes and other image files.
I fear that global institutions may not be ready for the sort of devastation that awaits them if fully functional Affective Computing is released into the wild, especially if it is coupled with the sort of surveillance tech we see today. We’ve seen the issues with bias. But let’s even consider a world in which such bias does not exist. Let’s consider a world in which the technology runs well, and by well I mean with 100% accuracy.
There are bound to be some great use cases. The neurodivergent kid who has a difficult time deciphering the feelings of his friends no longer has to feel distress at having “said the wrong thing”. We’re seeing genuinely ambitious use cases already[15].
Feeling down? What if Spotify knew your mood and could recommend the perfect playlist, or Netflix the perfect movie? This sort of technology is bound to extend our ability to empathize with each other, and it could solve a lot of real problems.
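To be concrete about what that might look like, here’s a purely hypothetical sketch (no real Spotify or Netflix API works this way, and every name in it is invented) of conditioning a recommendation on a detected mood, including the kind of fallback you’d want when the affect model isn’t sure:

```python
# Hypothetical mapping from a detected mood to candidate playlists.
MOOD_PLAYLISTS = {
    "sad":      ["Comfort Acoustic", "Rainy Day Piano"],
    "happy":    ["Upbeat Pop", "Feel-Good Classics"],
    "stressed": ["Ambient Calm", "Lo-fi Focus"],
}

def recommend(detected_mood: str, confidence: float) -> list[str]:
    """Recommend playlists, falling back to neutral picks when the
    affect model isn't confident enough to act on its guess."""
    if confidence < 0.8 or detected_mood not in MOOD_PLAYLISTS:
        return ["Daily Mix"]  # don't over-trust a shaky mood reading
    return MOOD_PLAYLISTS[detected_mood]

print(recommend("sad", confidence=0.92))    # ['Comfort Acoustic', 'Rainy Day Piano']
print(recommend("angry", confidence=0.95))  # ['Daily Mix'] (unknown mood)
```

The same plumbing, pointed at dissent instead of playlists, is the dystopia I turn to next.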
On the other hand, I’ve already talked about authoritarian regimes clamping down on dissent. But this time imagine being surveilled 24/7. The slightest wince in your eyes or the briefest glimmer of skepticism at an authoritarian leader’s words could get you killed. Bear in mind that certain regimes have no problem disposing of people they do not like. China has an excess of organ donors for a reason…[16]
We haven’t even talked about the horrors of the new type of ads that may spring up…
Humanity’s strongest trait, the one that allowed us to build the laws that guide our civilization, the one that allowed us to both cooperate and compete, this thing we call emotion, is also the point at which we are most vulnerable. And if we are vulnerable, so are our institutions, which are essentially predicated on the way we feel.
In the words of E.O. Wilson:
The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions, and god-like technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.
We haven’t upgraded those paleolithic emotions, and we have yet to reform the medieval institutions predicated on them. As such, if we introduce these tools into our societies without modifying those institutions, we are truly fucked.
The Hackable human has been around for a long while. They’ve been there when you were fooled by a phishing email. They’ve been there when you spent most of your day playing that addictive freemium game. They’ve been there when you went to the mall, or to Amazon, and bought more than you needed.
You are already hackable, but hacking you takes quite a bit of skill. Imagining a world in which that is made effortless and automatable is scary indeed, and we should not let it become the norm.
Thanks for reading. If you haven’t subscribed yet, what are you waiting for? Get more essays like this one sent directly to your inbox!
Feel this essay is worthy of discussion? Why not share it? Hopefully, it sparks an interesting conversation!
I did promise some of you some educational material so here lie some notes and references:
Notes and References.
[1] https://onezero.medium.com/aggression-detection-is-coming-to-facial-recognition-cameras-around-the-world-90f73ff65c7f
[2] https://en.wikipedia.org/wiki/Dark_Enlightenment
[3] https://en.wikipedia.org/wiki/Internet_culture
[4] https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59
[5] https://perceptions.substack.com/p/life-extension-immortality-and-societypart
[6] https://theconversation.com/emotion-reading-tech-fails-the-racial-bias-test-108404
[7] https://en.wikipedia.org/wiki/Affective_computing
[8] https://www.fastcompany.com/3047455/why-emotionally-intelligent-people-are-more-successful
[9]
[10] https://us.norton.com/internetsecurity-emerging-threats-what-is-social-engineering.html
[11] https://www.researchandmarkets.com/reports/5129183/affective-computing-market-by-technology-touch?utm_source=BW&utm_medium=PressRelease&utm_code=dmb4tp&utm_campaign=1428517+-+Global+Affective+Computing+Market+to+Witness+Explosive+Growth+-+37.4%25+CAGR+Projected+During+2020-2025+with+the+Market+Reaching+%24139.9+Billion+by+2025&utm_exec=joca220prd
[12] https://thenextweb.com/syndication/2020/09/12/schools-are-buying-up-surveillance-technology-to-fight-covid-19/
[13] https://markets.businessinsider.com/news/stocks/facial-recognition-market-growth-trends-and-forecast-2020-2025-1029180372?op=1#
[14] https://www.theatlantic.com/magazine/archive/2020/09/china-ai-surveillance/614197/
[15] https://www.affectiva.com/
[16] https://www.nbcnews.com/news/world/china-forcefully-harvests-organs-detainees-tribunal-concludes-n1018646
Lex Fridman’s podcast with Rosalind Picard was truly amazing and something I believe is worth watching (or listening to).
If you want to show some appreciation of my work you can donate either Bitcoin or Ethereum to these addresses:
Bitcoin: bc1qutv60pg0zfuftgvswhm4mjcvtw8jy8amk4nqhm
Ethereum: 0x6bf7Ee266Cf11C8879646ab5c619aCc960c5C821
Till next time!