“This is the kind of surveillance that people can actually like. There’s satellites up there not only to spy on us, but to help us lead better lives.”
— Marcel Salathé, head of the Digital Epidemiology Lab at Switzerland’s École Polytechnique Fédérale de Lausanne on using satellite photographs to correlate obesity and environment.
Oh, fuck off Marcel.
As C.S. Lewis said: “Of all tyrannies, a tyranny exercised for the good of its victims may be the most oppressive…”
I am no technophobe.
The newspaper I work for was the first paper in Oregon to have a website. I’m (obviously) an active blogger and have developed many relationships that I value greatly that started and exist almost entirely in cyberspace. As a historian, having the world’s archives literally at my fingertips is inestimably valuable, and I love being able to read old, obscure journals and memoirs on my Kindle for 99 cents when I’d have to pay hundreds of dollars to access them in rare used copies or limited run reprints.
But technology is, like fire or government, a dangerous servant and a fearful master. And what worries me is that we are becoming increasingly indifferent to any distinction between the two. And the insidious notion that we’re surrendering more and more space in our lives to intrusive technology for our own benefit galls me on a cellular level.
Of course government surveillance is for our own good — to protect us from criminals, or terrorists, or (somehow) from making ourselves fat. And giant corporations only collect our data so that they can sell us things we really, really want. Of course. What’s to worry about?
The new feature, announced by Amazon last week alongside new devices including a microwave and a wall clock at an event in Seattle, is one of several upgrades that will expand the virtual assistant’s ability to listen to and understand the world around it. Alexa will be able to confer with you in whispers before the end of the year, making Amazon’s voice-operated assistant less awkward to use when someone is, say, sleeping nearby. Amazon will also make its assistant capable of listening for trouble such as breaking glass or a smoke alarm when you’re away from home, a feature called Alexa Guard.
Meanwhile, inside Amazon’s labs, the company is experimenting with giving Alexa a rudimentary form of emotional awareness, enabling it to listen for the sound of frustration in a person’s voice.
“We’re going beyond recognizing words,” says Rohit Prasad, the vice president who heads work on the artificial intelligence inside Alexa’s guts.
Talk sexy to me, Alexa!
There’s also evidence that some consumers are wary of advances in the ability of devices like the Echo to listen to them. “Privacy concerns have already been a barrier to adoption,” says Werner Goertz, a research director at analyst Gartner. “The industry’s efforts have not been sufficient to remove this misapprehension.” Goertz spoke to WIRED after disabling the Alexa installed in his hotel room to stop it from hearing its name and butting into the conversation.
The frustration‐detection feature Amazon is testing in the lab illustrates the tension between using AI to improve functionality, and privacy. For some consumers, Amazon knowing about their feelings in addition to their purchases and music choices might seem a step too far.
In the wake of the September 11, 2001, attacks, some of my conservative friends dismissed my criticisms of the so-called Patriot Act. Understandably shaken by the scale and ferocity of the attacks, they believed that the government had to have the power and authority to do whatever was necessary to protect the citizens of the United States. And, they argued, they didn’t have anything to hide, so they didn’t have anything to worry about. That outlook didn’t sit too well with their Don’t Tread on Me flags.
In 2003 there was an uproar — not a big enough one if you ask me — about the U.S. government’s Total Information Awareness program, which was a massive undertaking to capture and collate just about everything about everybody (as the name implies) to sift through it in the hunt for terrorists. In order to tamp down the furor, the program was renamed Terrorism Information Awareness, which totally fixed it.
There was a little bitty scandal when it came out that TIA’s honcho Admiral John Poindexter (of Iran‐Contra notoriety) thought we should let the market’s mass predictive ability work the problem, rewarding “investors” who correctly predicted terrorist acts. That seemed gauche, apparently.
TIA got reined in a bit, though its capabilities mainly just got folded into the National Security Agency. Thanks to Edward Snowden (whatever one thinks of his actions) we got a very detailed glimpse of the extent of surveillance the NSA engages in, including surveillance of Americans. Shockingly, government officials were less than forthcoming about this.
In 2013, Senator Ron Wyden of Oregon asked Director of National Intelligence James Clapper:
“Does the NSA collect any type of data at all on millions, or hundreds of millions, of Americans?”
Clapper said no. That was a lie. Of course Clapper has, for the past five years, lied about lying, but that’s just how they roll in his line of work.
The opportunity to glean massive amounts of information from the data our day-to-day activities create is getting weirder fast. China is instituting a system of “social credit” that sounds like something out of a dystopian science fiction novel:
It’s been in the pipeline for years: a sprawling, technological mass surveillance network the likes of which the world has never seen. And it’s already been switched on.
China’s “Social Credit System”– which is expected to be fully operational by 2020 – doesn’t just monitor the nation’s almost 1.4 billion citizens. It’s also designed to control and coerce them, in a gigantic social engineering experiment that some have called the “gamification of trust”.
That’s because the massive project, which has been slowly coming together for over a decade, is about assigning an individual trust score to each and every citizen, and to businesses too.
According to China’s Communist Party, the system will “allow the trustworthy to roam freely under heaven while making it hard for the discredited to take a single step.”
Science Alert reports that:
For positive personal and social acts – such as paying bills on time, engaging in charity, and properly sorting your recycling – citizens get their score bumped up, which gives them access to perks, like better credit facilities, cheaper public transport, and even shorter wait times for hospital services.
But if you break the rules, beware. People who are late with payments, or caught jaywalking or smoking in non‐smoking areas, will be punished.
In what’s being described as a “digital dictatorship,” their score takes a hit for each infraction, meaning they incur things like financial penalties and even travel restrictions.
That’s what happened to investigative journalist Liu Hu, who says the social credit system destroyed his career after he was blacklisted for making accusations of government corruption.
Branded “dishonest,” he had access to rail travel suspended, and his social media accounts – comprising some 2 million followers – were reportedly shut down, effectively making his job impossible.
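To see just how blunt an instrument this kind of scoring is, here’s a toy sketch in Python. Every point value, threshold, and category here is hypothetical — invented for illustration, not drawn from any actual documentation of the Chinese system — but the reward-and-punish logic mirrors what the article describes:

```python
# Toy model of a "social credit" score. All point values and
# thresholds are made up; the logic mirrors the reporting above.
PERKS_THRESHOLD = 100      # above this: cheaper transport, shorter hospital waits
BLACKLIST_THRESHOLD = 0    # below this: travel restrictions kick in

ADJUSTMENTS = {
    "paid_bill_on_time": +5,
    "charity_donation": +10,
    "sorted_recycling": +2,
    "late_payment": -15,
    "jaywalking": -10,
    "accused_government_of_corruption": -1000,  # see: Liu Hu
}

def apply(score, act):
    """Bump the score up or down for a recorded act."""
    return score + ADJUSTMENTS.get(act, 0)

def status(score):
    """Translate a raw score into the perks or penalties it unlocks."""
    if score < BLACKLIST_THRESHOLD:
        return "blacklisted: no rail travel, accounts suspended"
    if score > PERKS_THRESHOLD:
        return "trusted: perks unlocked"
    return "ordinary citizen"

score = 50
for act in ["paid_bill_on_time", "charity_donation",
            "accused_government_of_corruption"]:
    score = apply(score, act)
print(score, status(score))
```

Note what the sketch makes obvious: a single disfavored act can wipe out a lifetime of bill-paying and recycling, because whoever writes the adjustment table decides what “trustworthy” means.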
Welcome to the future.
I know for sure that there are control freaks in the U.S. — hell, in my own hometown — who get aroused by the power to do good, all that pro-social behavior, that such a system could bring. The only thing that stands between them and achieving a utopia of surveillance-and-control-to-bring-us-better-lives is the legal constraints that inhere in the Constitution. We should be demanding of every candidate for office their position — in detail — on surveillance programs and their understanding of the Fourth Amendment (and the First and Second, too — hell, all of ’em), and judge them accordingly.
And we should each of us think long and hard about the ways we enable the surveillance culture, the ways in which we grant permission and entry to the most intrusive technologies in history.
One thing’s for sure: I don’t care if Alexa can answer back in the voice of Michelle Pfeiffer, I ain’t talking to her. Ever.