I still remember exactly where I was when I realized I’d lost control of my Twitter account. I woke up one morning in March of last year to an email, delivered sometime in the middle of the night, stating that the password on my Twitter account had changed. I had just returned from my bachelor party, which involved bookending a long weekend of heavy alcohol consumption with cross-country flights. No longer as spry as I was in my 20s, I was coming up on my second (third?) day of recovery at home in Los Angeles. I recall being partly delirious while futilely trying to log back into my account. I had two-factor authentication enabled; had I accidentally accepted a login request in my post-bacchanal stupor? Did I even recall getting a notification before the password change? More importantly: Was this really happening to me?

I started my Twitter account just over 15 years ago, on July 4th, 2009. I had just graduated from college into a post-2008 financial crisis economy and was spending the summer in Washington, D.C., desperately hunting for a job in journalism. These were among the earliest days of social media’s capture of America’s news-industrial complex, back when Twitter was a place for real-time news like Sully Sullenberger’s “Miracle on the Hudson” commercial jet landing off of Manhattan. I had no idea that Twitter would soon become the dominant engine at the core of journalists’ media diets; what I did know was that jobs like “social media manager” kept cropping up on job boards and newsletters (and in my friends’ LinkedIn bios, for some reason!) and most of those jobs seemed to involve posting to Twitter.
I quickly developed a knack for packaging content for social media, enough to land me a job as The Atlantic’s first social media editor, which formed a strong foundation on my résumé at a time when newsrooms were starting to see major returns in the form of digital audience growth on burgeoning social media sites. During the 2010s, before Facebook’s retrenchment from hard news and related pivot to video, and well before Elon Musk’s regime rebranded Twitter as X, breaking news ruled supreme, and most of my subsequent jobs in newsrooms focused on aggressively shotgunning new content into the social media ecosystem in an effort to achieve some sort of high-traffic virality. With the collapse of traffic to media outlets from social media in the early part of this decade, the role of social media editor has transformed from what it once was, but at the time it mostly involved constantly monitoring Twitter. Even today, thinking about ancillary dashboard tools like Tweetdeck gives me a migraine.
For me, the first decade on Twitter was a mixed bag. By the time I lost control of my account, I had more than 28,000 followers. Through that audience, I gained plenty of jobs and professional opportunities, but also lost them just as quickly, usually over ill-advised tweets. I made plenty of friends and even a few romantic partners through the network (viral tweets about dogs wearing pants carry some cachet in the dating world, believe it or not), but also just as many detractors. Eventually, my activity on Twitter after that first decade slowed to the speed of self-promotion: posting my new work, sourcing story ideas, and occasionally engaging in social media’s favorite pastime of dunking on morons. Mostly, though, Twitter filled the minutes of idleness between other activities, a default state of constant information consumption that for years collapsed the boundaries between my professional and personal lives. And despite every problem that cropped up for me on the network, I just kept using it.
The morning I realized I was locked out of my account came as a sweet relief. I wasn’t really worried about my direct messages, and whoever took over my account hasn’t tweeted from it since (the only action taken: unfollowing everyone I had followed in the last decade, which, fine). But that feeling of sweet relief was itself slightly disconcerting. I had spent nearly 15 years building a (relatively) loyal audience, that sacred source of power in the media, and found myself cut off from it in a day. Why wasn’t I more upset?

The answer is simple: Social media is designed to get you to post—and being a chronic poster sucks.
That social media has always been a vessel we poured ourselves into is no secret: after all, it’s the content created by users that fuels the engine of social networks in the first place. As I wrote more than a decade ago, the design logic of social networks can best be described by Leonardo DiCaprio’s most memorable one-liner from Christopher Nolan’s 2010 film Inception: “You create the world of the dream. We bring the subject into that dream and fill it with their subconscious.” Facebook and Twitter were always just “the world of the dream,” the architecture of digital expression—and that architecture is designed to get you to share as much of yourself as possible. That information, alongside your attention, is the end product that makes social networks viable businesses, but it also has the psychological effect of turning every user into not just a “content creator” but, fundamentally, an oversharer. The maw of Twitter calls out to us to fill it with our thoughts, our feelings, and our inner workings, turning introverts into loudmouths; the “audience” is there, so serve it.

This is not an inherently bad development, but living life Extremely Online is exhausting. Constant engagement with social networks to monitor trends and joust with internet morons is a Sisyphean indulgence, while racking up “shares” and “likes” is, at its core, an experience that reduces even the most thoughtful expression to a cycle of escalating dopamine hits. The result is a historic level of connectivity between individuals, but with it comes a variety of psychic harms, ranging from anxiety and depression to a toxic sense of self-worth to political radicalization, according to research. The fear that the internet is making us stupid has been a throughline in the development of the social web since its inception; I’d argue, at least from my personal experience, that my brain had become addled by constant engagement with the social web, engagement I maintained because I thought keeping the audience I’d fought to build was in my best professional interest. I felt I had to feed the beast or that audience would disappear; unfortunately, I found out that it was just as easy to ruin my own life on social media simply by not shutting up when appropriate.
Once my account was gone, I was freed from all that. The sudden rupture also provided the perfect opportunity to start reevaluating my relationship with the network. Why was I sharing so much, opening myself up to criticism or conflict, when I could just as easily keep my head down and go about living my life? Was anything I had to say really that important? Was it really worth building an audience just to market my journalistic output to it? I already had a longstanding audience: my family, my friends, now my wife. Why did I even care about all those other people in the first place? What benefit was it to me to fill someone else’s dream world with my subconscious?
Losing control of my Twitter account had freed me from the shackles of chronic posting and, in a way, sent a shock to a brain atrophied by time spent extremely online. I started reading more, and longer, rather than mindlessly scrolling through a feed; information consumption became an active, considered act rather than just a passive one. I stopped seeing things through “The Facebook Eye,” that mental lens that reduces the world into potential content for posting. I wrote tweets and deleted them, over and over and over again, the thrill of keeping my thoughts to myself a surprisingly novel experience. Twitter—sorry, X—still has some value as a news source, but I’ve transitioned from active participant to sideline observer, a shift that has done wonders for my mind, body, and soul.
The Boston political boss Martin Lomasney was well known for his advice on discretion: “Never write if you can speak; never speak if you can nod; never nod if you can wink.” If he were alive today, he would almost certainly add to his adage a well-worn piece of advice that social media veterans are especially fond of: “Never tweet.”