Via Knowledge@Wharton,
Author Franklin Foer reflects on the dangers of losing ourselves in a society dependent on a handful of tech firms.
French philosopher René Descartes famously said “I think, therefore I am.” But in the digital age, what we think and how we live are being influenced in a big way by just a handful of tech firms: We are informed by Google and entertained by Apple; we socialize on Facebook and shop on Amazon. It’s time to reclaim our identities and reassert our intellectual independence, argues Franklin Foer, a national correspondent for The Atlantic and former editor of The New Republic, in his book World Without Mind: The Existential Threat of Big Tech.
He recently joined the Knowledge@Wharton show, which airs on SiriusXM channel 111, to explain why these firms’ hold on society is a cautionary tale for the future.
An edited transcript of the conversation follows.
Knowledge@Wharton: Tech companies such as Amazon have truly transformed themselves over the last couple of decades [and become a big part of our lives].
Franklin Foer: Amazon is really one of the most impressive specimens in the entire history of American business. It started off as a bookstore, then it morphed into becoming the ‘everything’ store. And it’s morphed beyond that. We know about Amazon Web Services and how it powers the cloud. We’ve seen how it just keeps expanding, culminating most recently in its decision to purchase Whole Foods. The same could be said for Google, which set out to organize knowledge but then became Alphabet, which has this massive portfolio, including a life-sciences company that aims to make us immortal.
Where do these companies end? Do we have a problem with their size? These are questions that go to the fundamental nature of our economy and whether we can really have a competitive, capitalistic system. There are more fundamental questions [we need to ask] about the future of our culture and our democracy because these companies amass tremendous troves of data about us. Those troves of data are portraits of our psyche. They use this incredibly powerful information about us in order to alter our behavior. There’s a huge amount of convenience that comes with that, but there are also real, important questions that need to be asked of these companies.
In the last couple of months, we’ve started to ask some of these questions. The outcome of the last election, with the proliferation of fake news and the debate over Facebook’s culpability for it, has triggered a real backlash against that company. There are a number of flashpoints that have shifted the debate [about society and technology] considerably.
Knowledge@Wharton: Three or four companies, giants in the tech world, have an unbelievable amount of control over so many things in our society.
Foer: Absolutely. The Europeans call them GAFA: Google, Amazon, Facebook and Apple. There are a couple of reasons why these guys have triggered so much anxiety and why I found myself drawn to asking hard questions of them. The first is the accumulation of data. The second is the way in which that data ends up getting leveraged. We’re in the realm of algorithms and machine learning and artificial intelligence, where the advantages that accrue to the companies that have mastery over those things end up compounding over time.
So the gap grows between those big four and everybody else. We may already have reached the point where people have stopped trying to chase them. In Silicon Valley, the greatest ambition now is not to displace Google or Facebook. It’s to get bought by Google and Facebook. There’s a real question about [the future of] entrepreneurship here. Where are the opportunities? If we no longer have an economy where you can displace those big players, the incentive to aim for the stars and to try to create those kinds of unicorn companies diminishes.
Knowledge@Wharton: You write in the book, “As these companies have expanded, marketing themselves as champions of individuality and pluralism, their algorithms have pressed us into conformity and laid waste to privacy. They’ve produced an unstable and narrow culture of misinformation and put us on a path to a world without private contemplation, autonomous thought or solitary introspection. A world without mind.”
Foer: There’s so much about technology that’s so wonderful. I have a daughter who’s 12 years old. When she was born, there was no iPhone, there was no Kindle, there was hardly any social media. Over the course of this decade, incredible things have happened. They’re real monuments to human creativity, and it’s hard not to bow down before these creations.
But the magical qualities of these creations shouldn’t distract us, shouldn’t preclude us from asking skeptical questions because the stakes here are supremely high. Over the course of the long history of humanity, we’ve always had tools that have been extensions of us. You could argue that technology is one of the things that defines us as a species.
But what’s getting automated right now isn’t upper-body strength. We’re not automating our ability to plough the fields or make widgets. We’re talking about the automation of mental exercises. The technologies these companies build are intellectual technologies. [They come] between us and reality. They are the filter we use to get news and information. They intend to create virtual realities that we’re going to be inhabiting, and they’re trying to complete this long merger between man and machine.
Soon, these technologies are going to be not just worn on our wrists or worn as glasses. They’re going to be implanted within us. We need to ask the biggest questions about what makes us human, what are the things that we want to preserve in this transition? You can’t fight the flow of technology. But we should also assume that, as human beings, we have agency. We have the ability to shape our own destiny, and we should be active in doing that, not just passively accepting whatever comes next.
Knowledge@Wharton: How impactful have some of these changes been on retail? Malls have declined significantly in the last few years, and manufacturing has become more automated.
Foer: Let’s just take that one question of the future of retail, for instance. My dad was a small-business owner. He had a small chain of stores. He taught me a real appreciation for the value of small business and capitalism. He was also — in a weird combination — an antitrust lawyer, which really affected my thinking about capitalism, the virtues of having a competitive, diverse marketplace and what that means for us as consumers. But we also need to think about what it means for us as citizens.
While prices may be low [with automation], we need to start asking questions about the future of work. As stores disappear, a big source of jobs is evaporating. I think about it in terms of what makes life meaningful. If we live in a world where we’re planted in our own houses and we’re able to summon every movie, every book to our fingertips, that takes away a great opportunity to go out and experience culture in a collective sort of way. I think about commerce as being a fundamental social experience. When I go to the store, I get out of my house. I interact with other people. It may seem trivial, it might seem incredibly superficial, but those interactions are really important to us in the way that we think about our fellow human beings and about the quality of our own lives.
What comes next when commerce is entirely virtual? How will human interaction change? How will our society change? Are we happy with those changes? At what price [comes] convenience and efficiency? I don’t pretend that these questions have easy answers, and I don’t pretend that we’re not accruing incredible benefits from all of these changes. But we should also spend a little time thinking about what we’re losing in the process.
Knowledge@Wharton: We’ve also transformed into a society where income inequality is a staggering issue.
Foer: Absolutely. We need to look at the ways in which these companies exacerbate the divide, the ways in which they sit on these piles of cash. If you work for one of these companies, your life is amazing, right? We all know about their famous corporate campuses and the incredible benefits that come with working for one of these monopolistic firms. But what we see a lot of in the economy is not just a gap between the rich and poor in the aggregate sense. There’s almost a gap between the rich and poor within each of these sectors.
If you’re the second or third player in one of these fields, you don’t get paid the same because these companies collect the monopolistic rent. They are able to because they have such dominance in their fields that they don’t actually have to worry about competition. They can sit on piles of cash and distribute it in whatever way they want. They can hoard it, as Apple does, or they can distribute it to their workers in the form of benefits that keep those workers tethered to the company. But everybody else in the economy doesn’t have the pleasure of benefiting from monopolistic rents, so the gap grows.
Knowledge@Wharton: How do you see Facebook’s role in how we consume media changing in the future?
Foer: I want to talk about this from a very narrow perspective, which is that I’m a journalist. Over the course of my career as a journalist, the profession has become extremely dependent on Google and Facebook. As advertising markets collapsed, there arose this need to scale up in a quick sort of way, and the only way to get revenue was through growing traffic. The only way to grow traffic was by relying on these platforms. That meant that journalism needed to master these platforms. It’s a very unhealthy state of dependence. The values of those platforms end up becoming the values of everybody who depends on those platforms.
As an editor, the type of work that we did changed because we needed to succeed on Facebook. It’s kind of a debasing thing where the headlines we wrote had to be sensationalistic in a way that could travel on Facebook. The subjects that we had to write about had to tap into the hive mind that existed on Facebook. Instead of shaping the news, instead of making choices that were ennobling for our readers, trying to expand the minds of our readers, we ended up doing a whole lot of pandering. It can’t be healthy in the long run.
I edited a magazine that was left of center. The mood that exists in the world right now is not left of center; it’s kind of left. I found that, just to get traffic, there was this constant temptation to pander to what politicians call “the base.” I see this all the time. To dissent, to be somebody who disagrees with whatever the consensus is, is to be cast out. Ultimately, it’s just not healthy for our politics to have these two tribes.
We think about our politics as extremely polarized, and it is. But it’s also extremely conformist right now. If you live in one of these two tribes, your informational ecosystem is extremely restricted. Facebook is a feedback loop where you get what you want to hear. We just get driven further and further into our corners through this technology that’s giving us what we want.
Knowledge@Wharton: What about Apple’s role?
Foer: Of the four big companies, Apple is the one that troubles me the least. I dislike the way in which it collects data. But at the end of the day, Apple is a hardware company and less involved in the sorts of intellectual technologies that I’ve described. Apple has done things to remake the music industry, for instance, that probably on balance I don’t like. But if I were to rank the four companies in terms of their perniciousness, I would put Apple at the bottom of the list.
Knowledge@Wharton: What about Google?
Foer: To me, the problem with Google is its ever-expanding goals. The thing that bothers me about Google is that there’s almost a religious intensity to what they do. [Co-founders] Sergey Brin and Larry Page come from the world of artificial intelligence. Artificial intelligence is this incredible thing, but there are different ways to practice artificial intelligence. There are all sorts of ways in which it’s an incredible convenience. But there are other people who want to achieve what’s called ‘AI complete,’ which is to create an artificial intelligence that is truly akin to a human intelligence, that has an understanding of language. There’s a whole, almost messianic vision that comes with it.
I’m sure you’ve heard of Ray Kurzweil, an amazing engineer who has this idea of the singularity, this moment where we completely merge with the machine, and the machines become smarter than the humans. We end up downloading our brains into this virtual world where we live forever. It’s really a religious vision. Ray Kurzweil is the director of engineering at Google now, and I think that Larry Page has a version of this sort of fantasy that he entertains. That’s his ambition for the company.
It’s a bit of sci-fi fantasy, so I’m not really concerned about the singularity. What I’m concerned about is that when you believe you’re on this kind of messianic mission, and when you treat your job with that kind of religious fervor, all of the temporal concerns, all of the concerns about law and ethics and the present and what you might be destroying, end up getting thrown out the window. This is a problem that I have more generally with these companies.
You might think that I wrote a left-wing book, but I think I wrote a pretty deeply conservative book where I’m really worried about the fate of important institutions. There’s a lot of wisdom built into the things that we’ve developed over time. I worry that some of these companies are just so fervent, so hubristic and self-confident about what they’re doing that they don’t really pause to consider what’s being destroyed in the course of rushing to a glorious future.