I decided near the end of last year that my word for this year would be “attention.” I would pay attention to what I was paying attention to, and to what I was getting in return for these payments of the scarcest asset I possess.
I pretty quickly realized that I couldn’t stand watching sitcoms, or TV generally, or listening to commercial radio, or watching popular media. I could see how all of these media used the science of attention to capture my attention (and how they arbitraged it to their advertisers).
And I got to read a lot more.
I’ve had a couple of occasions lately to apply my attention studies directly to real human beings, as opposed to electronic media. Younger lawyers with substance-abuse or other mental-health problems would email me late at night, demanding to know why I thought this or that of them, or why I had behaved coldly toward them at a social function years ago that I had long since forgotten. I then invited both not to communicate with me anymore; I filtered their emails to junk on the off chance that they would not accept my invitations.
But still, despite my conscious attention to attention and my sometimes heavy-handed preservation of my attention in the face of those who thought they deserved it, I remained on Twitter.
Twitter was a special case because, first, while I could see how Twitter was trying to arbitrage my attention, it was doing it so badly that it didn’t bother me much. And second, while people who thought they deserved my attention could reach me through Twitter, they could do so only once before I muted or blocked them; and they were limited to 140 characters to do so.
The Internet has allowed businesses to arbitrage our attention by giving people three things they didn’t have twenty years ago:
- A place to yammer and spew;
- The feeling that someone is listening; and
- Occasionally an actual audience.
In so doing it has lowered the barriers to publication. (The Internet has no interest in such barriers—the more content there is, the more attention social media companies can harvest from the readers and sell to the advertisers.)
So those of us who care about our attention have to set our own barriers, to minimize the dross, the chaff, and the noise. An obvious way of doing that is to eliminate social media. And I had, early on in this adventure, eliminated Facebook from my repertoire. Twitter remained because that 140-character limit acted as a sort of barrier to entry: You get 140 characters to demonstrate that you can afford my attention. If not, bye!
That 140-character limit made people, unsurprisingly, better writers. People will write to fill the available space; less space requires more thought.
Then Twitter went to 280 characters (twice as much content to harvest! Woohoo! Twice as much attention to arbitrage!) and the site went to shit. Instead of having 140 characters to prove themselves worthy of attention, twitterers suddenly had 280. And they used those 280 to say THINGS THEY COULD HAVE SAID IN 140 WITH MORE WORDS.
I thought the attentional situation might be salvaged. I started by muting or blocking everyone who used substantially more than 140 characters. Then people I had enjoyed following for years stopped giving a damn about the terseness and elegance that 140 had forced upon them. And I realized that the only way I was going to protect my attention from an onslaught of twice-as-long-as-necessary tweets was to bail from Twitter entirely. And so I did. @markwbennett is @nolongerbennett.
One unfortunate aspect of my years of involvement with Twitter is that instead of setting my thoughts down in a methodical and coherent way here, I tended to release them in dribs and drabs to the twitterverse. They are still around somewhere, but nowhere that I could point to and say, “For my thoughts on this topic, go here.”
So I have a lot of catching up to do.