Reset: The World of Cell Phones and the Internet
                        March 2021




Over the last decade many of us have become rather attached to our cell phones. A lot of us check mail before breakfast. Some of us are constantly responding to the beeps and pings of chats from others. However, there are sinister sides to the friendly phone. And there are sinister sides to the internet.

A 2020 paperback put together concerns about the web’s sinister side and suggested what to do about social media and internet problems. Reset: Reclaiming the Internet for Civil Society by Ronald J. Deibert, Anansi Press, 2020, is the book version of the 2020 CBC Massey Lectures. Deibert is Director of the Citizen Lab, a research unit in the Munk School at the University of Toronto designed to “watch the watchers” – the government agencies watching people’s communications – and to serve as a “counter-intelligence for civil society.” On the book’s cover is a quote by self-exiled whistleblower Edward Snowden: “No one has done more than Ron Deibert and his lab to expose the enemies of the internet.”

The book is informative. An economy is built on getting and using our data. The phone is designed to get and hold our attention; it is intentionally addictive. There are tales of the dark web, of spies, and of Citizen Lab staff who become counter-spies. There are foreign efforts to create disunity and confusion in democracies. The biggest surprise is the pollution. It comes from mineral extraction for electronic parts, from waste, and from the use of huge amounts of energy, much of it from burning fossil fuels, so contributing to global warming. The vast mixed communications system that supports cell phones and the web is largely hidden, and we don’t think of the computers and cell phones with their planned obsolescence.

After a short introduction to tell the reader what is coming, there are five substantial chapters. The first four set out major problem areas of social media and the internet. The fifth discusses possible actions – the Reset of the title.

Nothing is free. Chapter 1, The Market for Our Minds, introduces surveillance capitalism. The cute phone and its friendly apps aim “to monitor, archive, analyze and market as much personal information as they can” from users. Our sense that we are being given something useful and fun hides the fact that we have become “something akin to unwitting livestock for their massive data farms.”

Thanks to Snowden, the extent to which those who access this data can see into people’s lives became known. Snowden responsibly released his data to trusted journalists. They and their editors made sure that what emerged into public view was in the public interest, because publishing has regulations and established practices. That is unlike the vast data collection we are all part of, which is run commercially with few of the regulations that govern other forms of communication.

The book gives an example of Canadian security agencies tracking travellers in airports. Spies need not go to airports: data was obtained from companies like Boingo and Quovo providing airport Wi-Fi and from communications hubs like Bell. To get the kind of data Snowden got, direct tapping into Facebook, Apple, Skype or Microsoft isn’t needed. In one case data was passed to the US FBI by lawful access, then shared with the US National Security Agency (NSA), and from there with partners, the international Five Eyes. So our likes, contacts, calls, messages and other social media details can all be shared. Financial transactions can be followed from messages. And location data leaked by mobile apps can pinpoint targets of interest for follow-up. Beyond this, there is hacking into system backbone routers, undersea cable providers and the computers of technicians working at Internet Exchange Points.

In the web’s beginning there was a problem of how to fund it. That changed when Google decided to show ads related to a user’s search queries and successfully raised money by doing so. Facebook, Amazon and Twitter followed. Surveillance capitalism was born. In exchange for the services given, companies monitor a user’s behaviour so as to tailor ads to their interests. The aim is to predict and modify human behaviour so as to produce revenue and gain market control.

Our particular data are not the end product. Data goes beyond the companies we know to a range of others, which receive it to develop applications or to analyze it and sell business intelligence to advertisers. Some companies get the data to develop algorithms or other software. A small app can be transferring data to a related bigger company like Facebook. A surprising source of data is the browser itself: browser plug-ins and cookies facilitate access to whatever our browser does.

The scale of the data surveillance economy is hard to overestimate. Commercial airlines have become data-gathering and marketing operations; so have hoteliers, taxi operators and vacation companies. Company apps are not just for our convenience. They take information and then make recommendations.

The many serious data breaches and privacy scandals are difficult to accept as just bugs or mistakes. Near the end of 2019 a technician found that Zoom could be used to turn on the camera of a laptop on which the program was installed. And Facebook announced that millions of users’ phone numbers had been “exposed” just months after announcing that passwords had been stored in a way that let its employees see them. Such easy data access is a bonanza for government security agencies.

The second big issue is the quality of the content of discourse on social media, a theme taken up in chapter 2 – another informative chapter. When COVID struck, as when other crises strike, social media became hyperactive – and not with accurate information and safety precautions. No, there was a flood of conspiracy theories, racist memes, deliberately propagated false information, dangerous and deadly information. Official information and health advice compete with tweets about injecting bleach. Foreign governments like Russia’s promote disinformation and conspiracies. China censored internet messages mentioning “Wuhan Pneumonia” or “Wuhan Seafood Market.” Climate change is another area ripe for misinformation – from climate change deniers. Disinformation claimed that the terrible fires in Australia were caused by arsonists when there was no evidence of that.

Social media aim to capture and hold a user’s interest while they mine for data. They use behavioural psychology and commercial advertising skills, making our phones Toxic Addiction Machines, as the title of chapter 2 puts it. Extreme and sensational content helps get and hold our attention. This drives down the quality of the discourse online. It invites “... malicious actors to deliberately pollute social media and use them... to sow division, spread disinformation and undermine cohesion.” Dark PR companies sell disinformation services to clients.

Public discourse has moved from coffee shops into a dramatically different world driven by surveillance capitalism and its need for ever more users providing ever more data. That requires discourse that will attract and hold continuous attention – addiction to the devices. This means encouraging the extreme, the absurd and the violent. And it calls for maintaining clusters of like-minded users supporting these extremes. This is far from facilitating a public consensus around the common principles from which democracy benefits.

These media are pervasive, so it is hard to avoid using them without becoming a social outcast. There is a question of consent – the consent that allows the app to use the data. The user cannot really give informed consent, yet we all check the box, and that establishes a kind of “digital serfdom,” as Deibert puts it. Research has shown that students deprived of social media show symptoms similar to those of deprived addicts. There is research on compulsive overuse and the harms that can result.

Human reasoning involves two distinct systems. System 1 is short-term, visceral and emotionally influenced; system 2 is more deliberate, analytical and patient. Evidently, social media engage system 1, and social media companies use a variety of emotional influences, like music, to reinforce it. The speed lets things circulate that second thoughts would have held back. Deibert does not develop the possibility that system 2 reasoning atrophies in societies using social media, but he does say the impact on public deliberation is profound and reports some concerns about the functioning of legislative decision-making.

Deibert claims the social media “eco-system” has become a breeding ground for parasites and other invasive species. “We are seeing an explosion of social media-enabled PR [public relations] disinformation... some open and commercial and seemingly above board, but many others inhabiting a subterranean underworld of illicit acts, ‘dark money’ and state subterfuge.” There are Russian troll factories and shady private intelligence companies, hidden by the secrecy of black budgets or national security agencies. There are occasional glimpses of companies like Cambridge Analytica, which played a role in Brexit and was exposed in a documentary. Authoritarians are taking advantage, shamelessly pushing blatant falsehoods. In part on account of this “toxic communications environment,” the Bulletin of the Atomic Scientists set their Doomsday Clock 20 seconds closer to midnight in January 2020.

The third chapter, A Great Leap Forward ... For the Abuse of Power, is about Citizen Lab and the spy agencies of foreign governments and private companies. Autocratic governments use the hacking of devices and the data gained to watch dissidents. While the Arab Spring benefitted from communications networks, governments are now enjoying a bonanza from their in-house or private-company spy agencies. They exploit system weaknesses or send malware in emails. Citizen Lab has watched governments like Saudi Arabia’s and companies associated with Israel whose employees are former Israeli spies. Such spying can be done at a distance, or, as in the case of Saudi dissident and Washington Post columnist Jamal Khashoggi, it can combine data from dissidents’ phone exchanges with a physical hit squad, flown in individually to assemble for his murder in the Saudi consulate in Turkey. Dissidents living in foreign countries, or family members and friends still in the home country, are made vulnerable by communications data stolen through hacking or other means.

Much intelligence and surveillance work is contracted out by governments to a large number of corporate contractors operating on a dark market, shielded by security classification and featuring secret contracts, shell companies, and closed military and intelligence trade fairs in places like Dubai or Panama City. Surveillance and intelligence equipment is treated as a national secret, but details can emerge from public interest research, leaks or data breaches at these dark surveillance companies.

In 2017 it emerged that the UK’s BAE Systems had marketed “Evident” to Saudi Arabia, the UAE and other Gulf states – a system claimed to be able to intercept any internet traffic, a whole country’s if desired, tell people’s locations, and follow people around. It claimed to be ahead in voice recognition and capable of decryption. This is not unique! Commercial surveillance has created a revolving door for those with security clearances, like government spies. One such firm, DarkMatter, has had a project aimed at hacking the devices of human rights defenders. Citizen Lab has done reports on this kind of targeted espionage against journalists. Equipment marketed to help law enforcement and anti-terrorism is used by a number of states to target a range of figures such as journalists, human rights defenders, humanitarians or politicians. Controls on this market are almost non-existent.

Thanks to a human rights defender who sent a suspected hacking link to Citizen Lab, the Lab was able to capture and analyze the notorious Israel-based NSO Group’s Pegasus spyware. In 2017 and 2018 the Lab partnered with Mexican human rights investigators to identify abusive targeting in Mexico. The targets they unearthed included scientists, lawyers and international investigators into Mexico’s disappearances. Most disturbing was the link between spyware infections and targeted killings like Khashoggi’s and those of exiled Rwandan political opposition figures.

This chapter also refers to a new state capability for blanket surveillance. China has developed a fixed, constant surveillance system with facial recognition that it sells to countries like Brazil. Searches of web pictures can be done with the commercial Clearview AI facial recognition system with little control or regulation. Finally, there is reference to aircraft blanket surveillance of a town by police, and a note about the cheaper option of drone surveillance. There is a tendency for police or security agencies to pull all these surveillance methods together – they call it “fusion.” And there is a market for data about people’s locations obtained by simulating cell phone relay towers and using high power to pull in all nearby cell phone data. This is poorly regulated.

A thinner chapter 4, Burning Data, gives startling insights into the sustainability, energy use and pollution dimensions of the internet, our devices and the huge hidden support systems running continuously in the background. Delhi, India, has burgeoning social media whose disinformation and misinformation fuel flare-ups of Hindu-Muslim sectarian violence and are difficult to regulate given encrypted WhatsApp groups. There is surveillance in India. There are wires and cables visible everywhere. Citizen Lab has projects with exiled Tibetans and with Indian civil society targets of police using NSO Group’s WhatsApp spyware, which allows takeover of a mobile phone simply by calling it. But this chapter is about pollution and energy – the air pollution and the wires, cables and satellite dishes – the massive environmental degradation by social media and the communications ecosystem, in plain sight.

There is a problem with clean-looking devices. Deibert sets out the range of specialty metals and components needed in a cell phone, including rare earth metals. He describes the dirty mining processes for each and the location of mines in conflict zones or in China and other authoritarian countries. For example, there is mining for the lithium used in batteries for phones and cars. Each of the many phone ingredients has to be mined, extracted and shipped. And each step is polluting, consuming large amounts of water and producing much CO2 – the container ships along the electronics supply chain are amongst the worst polluters.

The assembly of iPhones usually takes place “in immense assembly plants where repetitive stress injuries, labour violations and voluminous toxic effluents are common.” In the European Union in 2014, five times more waste was produced by electronics manufacturing than by e-waste from households. No amount of post-consumer recycling can recoup the waste generated before a device is purchased. It is estimated that the world’s communications ecosystem consumes 7% of global electricity, and the CO2 pollution from the internet and phone infrastructure and from manufacturing electronic devices is large and growing. The large server farms needed consume vast amounts of electricity for power and of water for cooling. Disposal of iPhones produces pollution – India recycles only 2% of the 2 million tons of e-waste it produces, though it is trying to recycle more. Companies try to use recycled material, like tin. But the recycling uses energy and produces pollution.

Companies like Apple forbid repairs or even opening their devices. Yet extending the life of computers saves 5 to 20 times more energy than recycling them outright. And climate change threatens the internet with flooding or burning up, or both. Unfortunately, the culture of social media obscures the problem, so the internet makes its own serious contribution to the climate crisis.

The final chapter 5, Restraint, Retreat, Reform, begins by telling the story of how two Citizen Lab employees were enticed to meetings with people carrying bogus credentials. A resulting AP story blew the cover of a spy company and a former Israeli spy. Physical security concerns now arise from the study of digital data security as well as from being hacked or tracked. Citizen Lab now needs physical security measures, and staff must take special precautions when travelling. Will social media enhance human rights and safety, or will they enable international autocrats to murder critics and journalists with impunity?

There are concerns with our phones and our addiction to them, but Deibert calls for restraint, and he wants measures that fall into a principled framework. There can be efforts to retreat and manage our use of the devices, and that could help some. But Deibert asks whether this idea can realistically scale, because our devices have positive roles too.

“Reform” proposals range from corporate social responsibility norms to major government intervention. The “Supreme Court” of experts that Facebook uses for binding adjudication of a selection of controversial content removals is helpful. But greater privacy protection and network policing run against the business model. Fact checking occurs, but its usefulness is mixed – it can be drowned out by a flood of misinformation with attention-getting extremes and sensations. Some call for improved media literacy. But Trump’s steady stream of lies, insults and encouragement of hatred illustrates a high degree of media literacy!

I felt that some clear steps could be taken for the extremes. On 6 January 2021 Trump incited an insurrection against the final steps, taking place in the US Congress, of the 3 November 2020 US election that he lost. His account was closed. When “free speech” incites violence, death or insurrection, it must be turned off. The radio broadcasting hatred should have been turned off in the run-up to the 1994 massacre in Rwanda. Few rights are absolute, and free speech is not one of them. The individual’s right ceases if it threatens the rights to life and freedom of others. In the 16 February 2021 Globe & Mail the Canadian Heritage Minister announced he was contemplating legislation to prevent the promotion of hatred on the internet, as are the EU and Australia.

Deibert gives ideas for regulating social media like publishers and for legislating requirements for the protection of data, with independent regulators able to punish social media corporations. The breakup of large social media corporations by anti-trust measures falls into a different category: the problems arising from any very large corporation are similar, and Deibert looks at this issue later in the chapter.

Deibert thinks of reset as providing a pause allowing for a restart along his principled way forward. He suggests starting with a principle of restraint (or checks and balances) to hold back those who have authority, in a liberal tradition supportive of civil rights, individual freedom, democracy and social equality. Powers of governments like declaring martial law or suspending civil rights need safeguards. So does surveillance.

In the US there is the Foreign Intelligence Surveillance Court – an 11-member judicial panel overseeing government requests for surveillance warrants. Canada has the National Security and Intelligence Review Agency. But US safeguards have been undermined by growing security budgets and by Trump’s firing of many inspectors general, ending of oversight bodies and removal of numerous regulations. Of course such laws and regulations depend ultimately on support from us – the people.

Deibert suggests a review of the provisions inherited from various jurisdictions like the US and the EU. And he suggests that restraint of corporate or government powers should increase proportionately with the intrusiveness of the technology. Digital location data needs strong restraints – on retention, use and access with mandatory transparency including how data was obtained. Similar constraints should apply to commercial spyware and to government hacking tools. These should be able to attract criminal liability and incur human rights obligations.

Insecurities introduced into the wider communications system by software with known flaws have given governments an access window for collecting data. They even promoted some flawed software to others! Snowden revealed the results. Deibert wants mandatory transparency for governments, with reporting and independent oversight bodies to ensure that such vulnerabilities are reported to software vendors so they can be fixed. Without safeguards, governments will exploit “holes” for data in the name of national security. Reinforcing old restraints and applying new ones is essential to preserving our rights and freedoms at this point.

There are examples of new privacy laws in California and in the EU that limit the transfer of data to third parties, establish duties for those gathering or processing data, and create supervisory bodies to monitor companies, follow up on complaints and set out remedies for breaches, including fines. But are they effective? Is all that users see a plethora of new consent requests to swat out of the way? That further trivializes “informed consent.” And fines just become a cost of doing business for a big rich company.

Deibert prefers Tim Wu’s “codified anti-surveillance regime” to limit the gratuitous mining and accumulation of personal data to defined purposes. Any data collected by companies would be deleted or, if kept, anonymized or encrypted. Sharing data with governments would be controlled and subject to oversight bodies. There would be ways to slow down transfers of data – like the limits on group size and on the number of times a user can forward a message in WhatsApp. That could be mandated by law.

Present immunity provisions that shield social media from endless lawsuits over users’ speech also facilitate new entrants and are useful in allowing competition. Social media companies should not be held responsible for the information they carry, but they should be required to apply moderate controls so that free speech – including some disagreeable, misinformed and malicious speech – continues.

Deibert turns to some citizens’ rights: the right to move all of one’s data to another computer platform, and the right to repair – one that would force companies to allow independent repairs and to make their manuals and diagnostic codes available for this.

Deibert favours education in “civic virtue” at all levels so that, for example, students understand communication and understand that their data is their data. Universities have good aims but are under pressure. And international bodies must deal not only with the Russias and Chinas of the world but also with Western governments’ security departments and their aims. Whistleblower laws would help. It’s time for the US to get over Snowden.

There is clearly a lot to be done. In the end, there is a once-in-a-lifetime moment to try to “reset,” as Deibert tells us.
