In the communicative climate outlined in our previous article, the Center for Humane Technology (CHT) offers some more reassuring perspectives, or at least proposals worth reflecting on, for changing the impact of technology on people’s lives.
The CHT was founded by three figures with notable professional backgrounds: Tristan Harris, who worked at Google as a Design Ethicist until 2015; Aza Raskin, a mathematician who previously worked at Mozilla Corporation; and Randima Fernando, a technologist and sociologist.
The initial idea for the center came from Tristan Harris, who has become the main spokesperson for the CHT’s work and for applying ethics to the tech world. In the many interviews he has given over the years, Harris often describes his training at the Stanford Persuasive Technology Lab as a crucial period of reflection on the fallibility and limits of human nature. There he studied the techniques by which technology can influence and persuade the human mind.
During his time at Google, he developed serious doubts about the impact of technology on society. It seemed clear to him that the business model of companies like Google, Twitter and Facebook was aimed at keeping users connected to their platforms for as long as possible. Strategies such as notifications, content refreshing, likes, and recommendations (to mention only the most well-known and visible tools) were designed with meticulous attention to the principles of UX and UI (user experience and user interface).
In 2013, Harris decided to share his concerns with his colleagues at Google in a presentation titled “A Call to Minimize Distraction and Respect Users’ Attention,” which also marks the first step towards the CHT’s mission. One of the first slides reads: “Never before in history have the decisions of a handful of designers (mostly men, white, living in SF, aged 25-35) working at three companies (Facebook, Google and Apple) had so much impact on how millions of people around the world spend their attention. We should feel an enormous responsibility to get this right!” It was a cry of alarm from within a small and powerful system that was developing the ability to change the lifestyle of billions of people.
The negative effects, not always intended by the designers of these algorithms, included the spread of fake news, mental health problems, loneliness, addiction and a reduced ability to concentrate. In 2013 alone, according to the data, Google engineered more than 11 billion disruptions into people’s lives every day (Harris, 2013). The solution Harris proposed was a design revolution at Google with the quality and health of people’s time as its objective. “We can design to reduce the volume and frequency of interruptions. We can design to be respectful about when to notify users – let it wait unless it’s important. We can design to keep users focused, by putting temptations further away when they’re trying to accomplish goals. […] We could have a team to standardize our design ethics and define best practices to minimize distraction.”
According to Harris, such a change is possible only if three fundamental points are first understood:
- human beings have certain vulnerabilities
- those vulnerabilities can be amplified and exploited
- product design, as it currently stands, exploits these vulnerabilities and pushes people to act impulsively.
The Center for Humane Technology is currently engaged in raising awareness of the issue in order to build a discourse and guidelines for technological change. It disseminates its research and studies through documents and meetings.
The meetings, held mainly in the United States, are aimed not only at ordinary citizens but also at students, educators and teachers, and ultimately seek to raise awareness among technology experts, politicians and investors. Each takes part in the work of change according to their abilities and availability: if the goal is to change the mentality with which social media is built and used, then everyone’s contribution will be needed. The Center’s site offers a wealth of resources and advice organized by profession or social role. For example, a guide was recently released to help parents manage their children’s relationship with digital devices during the pandemic.
The CHT’s work is centered around three main areas:
- Educating the public: shedding light on the damage that social media platforms cause to individuals and society, with the aim of creating consumer pressure for products better aligned with real human needs. The resources made available include the latest news, presentations and interviews featuring the CHT, key documents for understanding the impact of technology and, finally, a podcast, Your Undivided Attention, which explores these issues with outside guests.
- Informational support for public policy: public and private meetings with policy makers to support the creation of a legal framework and a political direction aimed at greater protection of citizens in the information sphere.
- Support for technology experts: practical tools to help companies improve their cultural awareness and change the design of technology. On the website, one can find information and tools for design and prototyping, such as a worksheet for developing an ethical project.
New Agenda for Technology, 2019
In April 2019, after six years of CHT activity, Harris gave a lecture on the current state of technology development. Although his analysis of the relationship between society and technology is quite in-depth, the proposals could perhaps have been developed in more detail and more concretely. The end goal remains the same: to build a technological infrastructure that takes human needs into account without upsetting people’s lives, habits and political positions.
In an interview with the New York Times, Harris stated that to create “humane technology, we need to think deeply about human nature, and that means more than just talking about privacy.” Reflecting on human nature means bringing together different disciplines that can account for the fragility of the human mind and the strategies to respect it: in addition to computer science and design, sociology, psychology and philosophy will all be useful.
To begin a reflection that moves in this direction, Harris often quotes another phrase that highlights the disparity between our intellectual abilities and the power of technology: “we have Paleolithic emotions, medieval institutions, and god-like technology” (Edward O. Wilson). He understands that there cannot be a technological revolution that does not take the state of the human being seriously.
There are three causes that have led to what Harris calls “human downgrading”:
- Artificial social systems: changes in the perception of social relations and of one’s own image.
- Artificial intelligence: algorithms that exploit human weaknesses not to improve the online experience but to increase profit.
- Extractive attention economy: the goal is to keep users connected to the platform, but the quality of the time spent online has no value, with serious repercussions on the ability to focus; we self-interrupt every 40 seconds (Harris, 2019).
The CHT agenda, then, proposes a structural change. Starting from a better understanding of how human beings function, both in groups and alone, a new infrastructure can be designed that incentivizes companies to compete on important problems, rewarding strategies that ensure positive interactions. The ethical transition will also change the business model of these companies: from user profiling for advertising purposes to platform membership. We must ask ourselves how many users would be willing to give up free access to social networks such as Facebook and Twitter in order to preserve their psychological health, mindful of the warning: “If you are not paying, you are the product.”
Harris’s proposal is excellent in its intentions and purposes, but its concrete applications remain to be evaluated. One must also wonder how profoundly these huge multinationals, decidedly profit-oriented and accountable to their investors, can really change their approach and business model.
cover image credit: The Met (Object: 37556 // Accession: 1991.456.12)
Original text: Francesca Balestro / Translation: Peter Briggs