- By Andrew Woolfolk
- Published: April 9th, 2013
On the anniversary of the website’s formation, the change it has brought to our world has been both good and bad
At the end of March, social media superpower Twitter celebrated its seventh birthday. Throughout the day on Thursday, March 21, many users posted congratulatory remarks on their Twitter pages — in the allotted number of characters, mind you.
Since its inception, the site has grown to host more than 500 million registered users and carries an estimated worth of more than $140 billion. Its servers comfortably handle the 340 million tweets sent out each day.
Not bad for a site that limits you to 140 characters or fewer. My second paragraph nearly doubled that amount. Oh, how garrulous I am — or am I?
Twitter has been a major factor in changing the American mindset regarding the written word. In the past few decades, the shift from consuming longer, more elaborate articles to craving condensed, sporadic headlines has become evident to even the most innocent of bystanders.
“I think, first, it’s a challenge. It’s a challenge to say something meaningful in 140 characters,” communications professor Amy Bonebright said, referring to the allure of the site. “And then, the whole idea is that you want to speak to people and have them retweet it, it’s a whole challenge combined. I think people like the fact that they can jump on there and scroll through really quick and get headlines. Headlines of what people are thinking, headlines of news stories and headlines of what’s going on out there. I think that has been one of the main attractions.”
You constantly hear the older generation remark on how today’s youth just “isn’t what it used to be.”
While the overall veracity of that argument may be up for debate, in terms of our attention span, our elders may be right on that one.
In 2008, two German professors from the University of Hamburg and one from the University of Hanover conducted a study and found that our attention span has fallen from 12 seconds to eight seconds since 2000. The same study also concluded that goldfish have an average attention span of nine seconds.
Those darn goldfish.
That same study found that on an average Web page of nearly 600 words, users read only about 28 percent of the text. For every additional 100 words, users spend only 4.4 more seconds on the page.
I can only imagine how many of my audience members have already quit on me, but I digress.
To simplify the data: we have become a generation of skimmers. Think about your time spent on Twitter or Facebook. How much of it is spent with your finger on the scroll button, mindlessly skimming statuses about life-altering occurrences such as the dress your friend wore or how your neighbor’s meal at his favorite fast-food restaurant is going?
This is the mindset we have, where these monumental incidents trump an escape into a novel or a lengthy news article. It is not necessarily wrong, but it is not necessarily something to be proud of, either.
It is also not classified information that diagnoses of Attention Deficit Hyperactivity Disorder (ADHD) seem to be more prevalent now.
A 2011 study by the Centers for Disease Control and Prevention found that the number of children diagnosed with ADHD increased by 5.5 percent each year from 2003 to 2007. The same study found that nearly 10 percent of children ages 4 to 17 have been diagnosed with some form of the disorder.
I will not even get into the popular debate over whether the disorder is often misdiagnosed, but I will say that, yet again, statistics serve as the true voice of what is happening in our world.
A 2004 study published in Pediatrics, the official journal of the American Academy of Pediatrics, concluded that children exposed to television or Internet activity between the ages of 1 and 3 have an increased risk of being diagnosed with ADHD as early as age 7.
If we fully dedicate our time to sites such as Twitter and Facebook, we must be ready to accept the dire realities that result from social media saturation.
Even people such as Bonebright, who said she loves Twitter, see the site’s drawbacks.
“I think the area it’s been most hurtful in is just how people write grammatically,” Bonebright said. “People think, ‘Oh yeah, in my articles and in my papers, I actually have to write that word out.’ I think that has been a little more hurtful. Sometimes, I can’t even read the tweets.”
Regardless of whether one agrees with the effect Twitter has on us, it appears obvious that the site is here to stay. Recent changes in our news cycle — such as the long-running investigative journalism program “Nightline” being moved to the undesirable 12:35 a.m. time slot to make room for the social media-saturated “Jimmy Kimmel Live!” — make clear the direction in which we are headed.
“Twitter definitely has staying power,” Bonebright said. “I think for Twitter, the simplicity of it is the driving force, and therefore, it doesn’t need to change a lot.”
But if, by some miracle, you managed to endure reading this article to the very end, then congratulations on being one of the few exceptions to the ever-increasing normality of neurotic thought.
Take a bow for outlasting that stupid goldfish.