Sheryl Cababa

It’s not every day that you get to meet an icon. I had the opportunity to speak with a personal hero of mine last week, indomitable tech journalist Kara Swisher, who was in town giving a talk that Artefact organized in partnership with Seattle Arts and Lectures. As witty and wry as ever, her conversation revolved around the pertinent themes of technology usage, industry regulation, and some pointed commentary on Jack Dorsey’s beard.

In reflecting on Kara’s lecture and recent high-profile criticism of the tech industry, however, I got to thinking about the current all-or-nothing approach to technology in our culture. The general response to Big Tech’s many missteps has been to run away from it – be it by limiting our screen time or scrambling to #DeleteFacebook. This abstinence-only reaction is dangerous because it doesn’t help us understand how to relate to the ubiquitous technology in our lives. Rather than retreat from technology, we need to figure out how to coexist with it. I’ve been thinking about our evolving human relationship with technology in three ways:

1. Governance and technology need each other.

From airbags to the Internet, technological innovations often wouldn’t exist or thrive without government investment, subsidy, or governance. When we look across the Pacific to China, we see a state that is investing heavily in tech and is extremely innovative – though more often than not in unethical ways.

When it comes to tech oversight, Silicon Valley has long pushed the narrative that they are the good guys in a struggle between American tech interests and authoritarian foreign governments – “It’s Xi or me,” as Kara put it, in reference to Chinese President Xi Jinping. This fearmongering has scared politicians into thinking that we should not regulate the U.S. tech industry for fear of becoming the kind of Big Brother state we see elsewhere in the world.

Yet we can’t accept that logic at face value. We don’t want authoritarian states running the next information age, but we also must question ourselves and Silicon Valley on the products we create and how they impact society. The conversation won’t be easy for the tech industry or government. Throughout history, industry in the U.S. has been bad at regulating itself (we only need to look at how dangerous food was before regulation for an example). However, one thing we often forget is that government support and intervention often spur innovation. Elon Musk’s companies – SpaceX, Tesla, and SolarCity – have benefited from an estimated $5 billion in U.S. government support. The tension between Silicon Valley and regulatory bodies arises only when companies fear regulation will keep them from amassing huge concentrations of wealth.

2. Beware of “benign” organizational culture.

There’s an important connection between the perception of tech companies as having a “harmless” organizational culture and the lack of regulation in the tech industry. We don’t hear about this relationship as often as we should. Kara touched on the infantilization of Silicon Valley leadership and the misguided notion that they’re just a bunch of kids tinkering in garages. In truth, they are some of the most powerful individuals in the world, making decisions that affect billions of people across the globe. Just because they wear hoodies and flip-flops and don’t look like Wall Street executives doesn’t mean they aren’t just as powerful – or savvy.

In Artefact’s social media systems map, we identified how “organizational CULTure,” as we call it, affects the design of social networks. The idea that technology is neutral – and the lack of priority around fixing the problems facing social media – has to do with an organizational culture devoid of diversity at the leadership level. So many of tech’s decision-makers have not been personally touched by the negative impact of their products or suffered as much as other people have at the hands of their creations. Sri Lanka and Myanmar, where platform-fueled disinformation contributed to real-world violence, come to mind.

Of course, we need the skills and talents of tech leaders to work toward solutions, but they currently do not have an incentive to improve a harmful experience that they have not had – and will never go through – on their platform. In fact, they profit from this lack of intervention. We need to expand our definition of stakeholders. We need to bridge the gap between Silicon Valley and those who don’t have a seat at the table yet bear the brunt of its negative consequences. We also need to treat leaders in tech as the formidable industry titans that they are, and hold them responsible for the outcomes of their products.

3. Social media is not equivalent to climate change.

I’ll be the first to criticize social media for its negative consequences in the world, but I’m starting to feel fatigued by the vilification of social networks as the root cause of all of our problems. Particularly when this criticism comes from those who have a vested interest in cooperating with the tech industry.

The argument that all of the tech industry’s problems stem from the attention economy is starting to become a platitude. Cable TV brought us ideological news programming. Traditional media were gatekeepers to information. We can’t forget that the world was imperfect before social media. These platforms are exacerbating existing problems, and it is dangerous to ignore the exogenous or underlying factors that are driving issues like bullying and disinformation campaigns online.

We start to discount our own arguments when we fail to acknowledge what might be positive about having social networks. Online communities have helped marginalized people find and support each other. Black Twitter, for example, is an important outlet that gives many people a voice they had not had before. I get value out of social networks that help me learn from people I don’t normally interact with in real life. I can’t invalidate the entirety of these social media experiences. It’s starting to feel like we’re throwing out the baby with the bathwater.

Rather than discount Big Tech altogether, let’s work to design products responsibly, embrace thoughtful regulation, and shape our individual usage in healthy and productive ways. Technology doesn’t have to downgrade humanity – unless we let it.