Full Interview: Cal Newport on Surviving Screens and Social Media in Isolation

Today I present to you, in a full interview, one of my favorite authors, Cal Newport, and why the books he writes are books you NEED to read and share with others.

Quite a bold statement, I know, but you'll be convinced after reading this article.

A computer scientist on why the quality of your quarantine may come down to how you use your technology. 

Right now, for so many people self-isolating in the face of the escalating coronavirus pandemic, technology is the main link to the outside world. It’s allowing us to maintain crucial contact with friends, family, and coworkers, and providing information and much-needed outlets for joy, amusement, and creativity in a rather bleak time. However, it can also be the source of deep anxiety and distraction: never has it been easier to stress-refresh your Twitter timeline looking for the latest Covid-19 numbers, or pick up your phone to text a friend only to fall into a mindless internet black hole.

Of course, this has always been the dual nature of our networked world: the potential of limitless access to information and connection curbed by rising rates of anxiety-related disorders, and an increasing allergy to focus and attention. But as Cal Newport, a computer scientist who teaches at Georgetown, says about our self-distanced moment, “what the current crisis does is take those elements that make technologies tricky to deal with in the normal case, and basically turns it up to 11.”

In recent years, Newport has become a go-to resource on all things digital, one of the rare voices that resists the lightning-fast speed at which technology develops, calling for slow and deliberate consideration about how the tech we use is affecting culture and human behavior. In his books Deep Work and Digital Minimalism, he offers practices that can help readers reclaim autonomy over devices that can all too easily hijack attention (in the first, he explains the merits of mono-tasking and deep concentration; the second helps you think about developing a personal philosophy of tech use that prunes back all platforms that don’t bring value or purpose into your life).

With screen time on the rise, we reached out to Newport for advice on how to best navigate the current moment. His advice won’t just help you avoid the “distraction spiral” that can eat your day in a whirlpool of anxiety and stress—it’ll help you rethink the importance of intention throughout this unprecedented period of isolation, and why determining how you spend your quarantine could help determine how your life spins forward once this is all said and done.

In Digital Minimalism, you put forward an idea that is meant to help people reclaim autonomy over the way they use their devices. This strikes me as a moment—when we’re all locked at home—when that would be particularly important.

The underlying ideas [in Digital Minimalism] are being thrown into stark relief. There’s this core idea that technology is this dualistic thing. Now we’re all thrown into a world where we have to be inside on technology all day. It’s absolutely a lifeline—and yet that negative side of tech has never been more intense.

The analogy I like is from Plato’s Phaedrus dialogue: the soul can be thought of as a chariot driver trying to control two horses. The chariot driver represents the rational thinking, planning part of the human mind. One horse is the noble impulses, and the other horse is the base or ignoble impulses.

You can take a tool, like a social network, and if you use it real carefully, you can empower the noble horse in that allegory. It can elevate what you’re able to do with your day, and the quality of your life. If you use it casually, like a psychological pacifier, it just supercharges the ignoble horse. You find yourself completely spiraling, lost in anxiety and distraction.

Have you found yourself struggling, or being more tempted? Or maybe you’re so well-practiced that you’re immune to it?

At the beginning, when things were really starting to shift [with coronavirus], I wrote a post for my audience about digital minimalism and what I recommend you do. My thing is: look at the news in the morning, see what's going on, then don't look at it again until the next day. That was my suggestion. But I found at first, even with that simple task of, "Well, let me just check the news this morning," you look up and it's 90 minutes later! So I realized I had to rein in what I meant by "check the news every day." So even I was caught off guard by just how sticky the distraction spiral is right now.

Unless you are organizing a public health response, there’s really no benefit to be gained by doing a daily exhaustive news search about what’s happening. I get the Washington Post. Looking at the front page tells me everything I need to know.

This is a little bit flippant, but in terms of closing things down for public health, one of the big boosts they could make would probably be shutting down Twitter. That technology, what it brings out of people, and the type of discussion there, is really not well suited for this type of moment.

Yeah, I have a friend who stopped looking at it at night because it was just wigging him out. But then I look at something like Instagram, and it feels to me like people have actually been connecting there. On there, it seems like the idealistic, and probably naïve, version of what people had hoped the internet might be is maybe proving to actually be true?

I completely could believe that. Because everything is so intensified, it's forcing people to go through this minimalist ritual. You're getting real careful: These tools have all of this potential and danger inherent in them, and I've got to start to get pretty careful about what I am trying to accomplish here. Before, it was so easy to be casual: I use Twitter, Instagram, and TikTok. None of it's bad. Now, suddenly, it's like, "Wait a second. Twitter is giving me a panic attack. Instagram, if I look at these accounts, calms me down." People have built these affinity-based communities on Instagram, where Twitter seems to have evolved more into just really stressed out people yelling at public officials.

When we spoke last year, I made an allusion to cigarettes. Now—and I mean this in a careful way—I’m thinking synthetic opioids, narcotics, have relevance here. Opioids in a medical context are incredibly important for pain management after surgery. You would not want a world that did not have opioid-based pain killers. They can help your grandmother with her hip replacement. They can also destroy your life in another context.

These technologies are life-giving and powerful, and we wouldn’t want to not have them. At the same time, if you’re spending your day on Twitter right now, it’s shredding your psychological health. It’s the physical equivalent of sitting here with drain cleaner, taking shots every hour. But if you’re on a Zoom call with your parents or cousins or something, it could be giving you the exact opposite effect!

One of the things I work a lot on is really trying to understand the philosophy of technology, and how technology affects culture is complicated. We were treating it a lot more simplistically, and forcing everyone to be stuck with technology all day long, with these really intense positives and negatives living side by side, is going to necessarily complicate the way we think about these tools.

This is completely speculative, but if I were to ask which of those two sides you think is going to win out in this war—the simplified good tech or bad tech—how would you respond to that?

First I would change it to good tech use, bad tech use, not good tech, bad tech. Even that is a little bit too simplistic. The dominant philosophical framework with which people think about technology in culture is called "technological instrumentalism." It basically posits that tech is inherently neutral. What matters is how people use it.

You can also use technology to understand other social forces—a different philosophy called technological determinism: technologies can have attributes and properties that change human behavior in ways that are unplanned. In the book I'm working on now, I get into how the introduction of internal email into the office radically changed, almost overnight, how people work. It was completely unintentional.

It was similar when Facebook added the like button. At first there was a set reason: people are writing redundant comments and that’s a waste of space. It’s just engineer stuff: How inefficient is this that everyone says good or congrats? Let’s have a like button for that, so the comments that are there can be more substantive. But it introduced this unpredictable stream of social approval indicators and suddenly people started using it way, way more. They took 100% advantage of it once they realized that was the effect, but no one planned that.

Tech can have huge effects on us. It can change our behavior in ways that we don’t realize and that no one really intended for. So to me it’s not good or bad—but more about intentional or casual. If you’re intentional with technology, you know what you’re trying to do, you know what you care about, and you’re putting tools to use to help that, but when you see, “This isn’t helping, I’m just on this too much,” or “It’s making me feel bad,” you’re observing it. Where casual is just: Why not? This could be interesting.

The danger of casualness is that it unlocks the technological determinism latent in these technologies, and you look up and your life is completely different. So for me the question is: is intentionality or casualness going to win out? I'm optimistic. I think this is exposing us to that dialectic in incredibly sharp relief. Once you see it, you say, "I really like the side of this that you get when you're very intentional, versus the side when you're not. On this side lies heaven—on that side lies hell."

Not to spoil the book, but how did email change the way we communicate and work?

A bunch of different things happened when we brought low-friction communication into the workplace. Basically, communication that had zero cost, both in terms of time and energy but also social capital. There's social capital involved if I'm going to go to your office and knock on your door and negotiate that interaction. All that goes away if it's just email.

What was universally observed is that the amount of communication in the office just skyrocketed. We began communicating way more than we ever had before, because this tool was there. Work became way more enmeshed in this ongoing ad hoc, unstructured conversation all day. We just think that’s “work”—but that’s completely different. That’s a very new thing and it’s a very specific way to work.

I can’t find any evidence that anyone ever sat down and said, “Here’s how we’re going to get a competitive edge. We’re going to communicate a lot more. We’re going to hook everyone up into these channels so we can just all be in a conversation all the time.” It just happened because the tool was there. It turns out that that way of working basically just doesn’t work once you’re above like four people.

It just conflicts with the way the human brain works. It’s an incredibly inefficient deployment of all this cognitive capital. It’s a terrible way to try to get value out of brains. And no one planned it. You put those two things together and it’s like well maybe we should think of some alternative, which is what the book is.

The casual use is resonating because I’ve started watching more TV in quarantine to turn my brain off the end of the day, and I can barely make it through an episode of a show without checking my phone. You had that great quote from Laurence Scott in your book about “a moment can feel strangely flat if it exists only in itself”—that is now how these moments feel to me.

This is what I discovered during that experiment where I had 1,600 people spend a month away from all these optional technologies—especially on the younger end of the spectrum. People had a very hard time being alone with their thoughts. They just weren't used to it. Solitude, by the way, used to be a skill like speaking: yeah, it's very complicated if you didn't learn to do it, but most people just learned to do it naturally because they were exposed to it.

For people today, you've lost that. But solitude is absolutely crucial. Time alone with your thoughts is how you structure your experience, and on those structures you can understand where you are in your life and where you want to go. Without that you're just adrift. You're basically just being pushed around by winds and attention economy contraptions. Where you are, what you are, what you aren't, and what you want to be—that just takes thought. Now we're forced to do a lot of thinking, because there's only so long we can look at the same screen in our apartment before our eyes bleed.

To the point of technological determinism—and maybe that doesn't quite apply here—the other night a friend texted me that she got laid off. I didn't quite know how to respond, but I knew I needed to respond, so I gave it a thumbs-down to buy some time. Then I sent a longer, more thoughtful response, but it made me do the thought experiment: if we can distill all the discomfort and complexity of a difficult emotional encounter with another person down to a heart, a thumbs-up, or a thumbs-down, what is that going to do to our emotional intelligence?

I think it’s a great example because if we follow through that case, almost certainly that feature comes from an engineer, or a product group lead in Apple somewhere. There is probably some typical very rational geek reason for it, which is usually about some sort of efficiency thing. You bring this in and it can completely change, let’s say, the emotional dynamics and interactions of hundreds of millions of people. Then it changes the mental schema with which people understand emotions, and it could have huge ramifications for our culture and the way our culture understands and thinks about things like sadness or support or something like this. Maybe it’s for the better. Maybe it’s for the worse. But it could be this massive change at the cultural scale that no one planned. What was the thing that sparked this change? An engineer somewhere saying it’s a waste to put an emoticon in a separate text box. That’s techno determinism: you can have these huge unintentional consequences that no one planned. They’re unpredictable.

There was this moment where the fashionable response to techno concerns was like, “Look, it’s just evolution. Kids these days are different. We thought our parents were old-fashioned. Technology just evolves us in our culture in this continual positive way and don’t be reactionary.”

And it’s not like that. It has these random, unintentional swings. We really should care about it and think about it and be like, “Do we like the way this is changing the way we think about emotions?” And if not, then let’s not use that technology. Email is great—much easier than voicemail—but if you see that, wait a second, we’re sending ten times more messages, that side effect might not be for the best for the office, maybe we want to change it in some way.

It seems to me like some of the polestars or guiding forces when it comes to computer programming are efficiency, speed, and optimization—and those seem somewhat opposed to the slow, deliberate, reflective attitude needed to stop and ask, how is this affecting us? How do you think about balancing those two things?

One question is: what are you trying to efficiently do? What is it you're trying to speed up? What's the benefit function here that makes your life better? Efficiency devoid of a particular objective is a metric adrift. A computer scientist cares about the efficiency of an algorithm because you have a lot of things that want to use the processor, and you don't want to spend more time than you need on it. There's something you're trying to gain there. It allows more to happen on the machine.

In our life, for some things, faster is fine. If you give me a way to clean the dishes faster, that’s probably better. But when it comes to information and information flow, it’s not necessarily true that having more information or being exposed to as much as possible per unit time is better for your life.

Optimization [is different]. Not to get too technical, but there’s a form of optimization called combinatorial optimization where basically you have different possible solutions and each solution is just a combination of things. If you’re in a particularly geeky mood, you can think about life as a combinatorial optimization process. You have your day and you’re trying to figure out what combinations of activities and targets of focus do I want to lay out that’s going to give me the highest benefit and that will make me feel the best about this day?

Once you’re asking that question, then you really get a lot more picky about things. If I took the two hours I spent incredibly efficiently yelling at public figures and I spent them doing something else that was maybe a little bit slower but felt more rewarding, I might be building towards a solution that’s going to be more beneficial. I like the optimization mindset because once you’re thinking about and trying to make this the best day possible, trying to optimize this goal, it just makes you much more critical about your individual behaviors.
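Newport's "day as combinatorial optimization" framing can be sketched in a few lines of Python. This is only an illustration: the activities, hours, and benefit scores below are invented, not anything he prescribes, and the brute-force search over subsets is just the simplest way to show what "trying combinations of activities against a benefit function" means.

```python
from itertools import combinations

# Hypothetical activities: (name, hours, benefit score).
# The scores are made-up illustrations of "how good this day feels."
activities = [
    ("scroll Twitter", 2, 1),
    ("video call with family", 1, 8),
    ("deep work on a project", 3, 9),
    ("read a novel", 2, 6),
    ("watch TV", 2, 3),
]

def best_day(activities, budget_hours):
    """Brute-force combinatorial optimization: try every subset of
    activities that fits in the time budget, keep the highest benefit."""
    best_combo, best_benefit = (), 0
    for r in range(1, len(activities) + 1):
        for combo in combinations(activities, r):
            hours = sum(a[1] for a in combo)
            benefit = sum(a[2] for a in combo)
            if hours <= budget_hours and benefit > best_benefit:
                best_combo, best_benefit = combo, benefit
    return [a[0] for a in best_combo], best_benefit

plan, score = best_day(activities, budget_hours=6)
```

With these invented numbers, the two hours of Twitter never make the cut: the same time spent reading or calling family scores higher, which is exactly the point of putting a benefit function behind your day.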

Efficiency I don’t care much about. Optimization I do.

There is a belief that the more quickly and efficiently you can get through emails, the better worker you are, right? That seems to be an example of where we have mistakenly valued efficiency. I feel like people are performing "work" by going, "How quickly can I respond to this email?" And that's actually antithetical to what they've been hired to do, which is harness their brain power to come up with smart solutions to things.

Well you’ve hit the nail on the head. The tricky thing about optimization is you have to have something to optimize. If you’re writing a combinatorial optimization algorithm for computer science, you’re given a function called a cost function that just tells you this is how good this is, and this is how good this other solution is.

One place where we get into trouble—and what basically explains what you're talking about—is that in knowledge work, we don't have that function. It's very difficult to say, "Here is how valuable knowledge worker X was to us today." It's hard to measure in the way that, if I'm a baseball player, I can measure my on-base percentage. In the absence of "here's what we're trying to optimize" and "oh, when we stopped doing this it got better," we end up in crazy places.

That same thing is so true for technology in your personal life. Once you know what you’re trying to optimize for, it’s so much easier to make decisions about your technology because now you have something to push it up against. Is this helping this thing I care about or not? It makes all the difference.

If you don’t know what you’re all about, what you’re trying to do, what you care about, then you end up doing things like a digital detox: “I’ll just take a break for a few weeks because I’m overloaded, and then go back to it again.” That's a nonsensical way to deal with the problem—taking a break from something that’s causing trouble. How about changing the thing? You don’t tell alcoholics, “I’ve got a great detox plan for you. You’re not going to drink for two weeks and then I’ll meet you at the bar to celebrate when the two weeks are over.”

Now is the time to do this work, figuring out what matters in my life. I think a lot of people are confronting that. The flip side of pain is that it reveals what matters. And once you know what you care about, you have something to optimize against. Now when you're looking at Facebook, you are able to say, "Actually the only thing in here that's really supporting something I care about is this group of people, very important to me, that I meet in a Facebook group." So, once a week, I go into that group and participate, and it takes 30 minutes of my life and gives me huge benefits.

That type of really intentional sharp-eyed tech use requires first knowing what you’re trying to optimize. The whole game changes once you know what it is that you’re trying to do, what matters to you. Maybe that’s the big advice for this moment when people are thinking about nonwork tech, in particular. Like a laser beam, identify what matters to you. Almost everything you do should be serving that in the best way you can find to serve it. And the stuff that’s not, get it out of your life. You don’t have time for it right now.

This interview has been edited and condensed.
