The Glass Room has five thematic areas that bring to life the hidden aspects of digital technologies: Deeply Personal, which looks into the dark side of personal data; Invisible Labor, which examines the human and environmental processes behind technologies; Trust in Us, which features works on Big Tech; Big Mother, which looks at the risks and rewards of technology; and Open the Box, which examines the journeys our personal data takes.
Just as we are unpredictable and imperfect, so is our personal data. The information you give when you sign up for an app or browse the web might confirm that you are ‘you’, but that personal data doesn’t say everything about you. Nevertheless, it can still be used to suggest a date for you, deny you entry to a bar or access to a loan, target you with a political ad, or determine whether you’re fit for a job.
You probably open your laptop or swipe your phone several times a day, if not every hour. How often do you think about where the device was made, who assembled it, and how long it took? Technologies are fundamentally built on material resources, human labor, and large amounts of our data. We don’t normally see how those resources are extracted, how they are put to use, or the impact they have on the environment. Alongside the low-cost or outsourced labor behind the manufacturing of technologies, there is another invisible workforce supporting Big Tech: as users and customers, we are also workers in the digital economy. The valuable data we provide to tech companies helps them build, maintain, and improve their products – and their business models. Behind our screens, there are millions of invisible actions required to assemble, maintain, train, and optimize the technology we use. When it comes to the data economy, who is working for whom?
Trust in Us
The exhibits at this table explore the true cost of ‘free’ technologies. They examine Big Tech’s business models and the ways their innovations and working cultures affect users, consumers, employees, and society at large. As these companies have come under increased scrutiny, public trust has shifted and the conversation has turned to what kind of social contract we can demand from them. How can the powerful tech giants be held accountable when they “move fast and break things”?
We often hear the ominous phrase “Big Brother is watching you.” But what about when you are watched over by a more nurturing figure – a “Big Mother” looking after your wellbeing? When public institutions and private companies use biometrics, tracking, scoring, and profiling technologies to make our lives safer and more efficient, or to provide aid and care to vulnerable populations, how do we weigh the risks against the rewards? There is a growing market for technologies that promise increased control, security, and protection from harm. At the same time, they can normalize surveillance at both a macro and a micro level – from the shape of a child’s ear to satellite images of acres of farmland. Often, those who need the most support have the least control over how or when their data is used. Who gains control and who loses it as we strive to reduce risks and provide more care for individuals, communities, and the environment?
Open the Box
The artists’ projects and visual stories in this section reveal the variety of invisible ‘data journeys’ your information can take. Hands-on tools and short videos give you ways to see what your data looks like and how it is used on the ‘other side’ of the screen. Animations and interactive tools demystify digital processes, showing you what social media platforms and websites can infer about you. Why do you see one kind of ad in your feed while your friend sees another? Can you tell the difference between a fake and a real tech product?