The Internet, Social Media, and Technology: Is it making us dumberer?
It was one of those mornings where you get your coffee, sit down to dig through your morning flurry of email, and happen across something that catches your eye. For me, it was the monthly newsletter from Brain Pickings – in this case an article entitled “‘Tip-of-the-Tongue Syndrome,’ Transactive Memory, and How the Internet Is Making Us Smarter” by Maria Popova. If you haven’t subscribed to Brain Pickings, I highly recommend it – especially for the creative types who love to be pushed and inspired. I can’t quite say why this caught my eye – possibly because I had not seen anything related to the Internet from Brain Pickings lately, or maybe I wasn’t in a mad dash to get my inbox back to zero… either way it was a good find and held my attention far longer than most things that hit my queue of communication. In a (big) nutshell, this well-written and well-referenced article touches on a very common question – are technology and the Internet making us smarter?
As with many of us who have disjointed work days, crazy family schedules, community responsibilities, and other never-ending distractions via social media and Candy Crush (the most evil game ever made), it is easy to start to lose focus, feel like your memory is shot, and believe that the technology of today has created a work and social environment that constantly requires us to plug in and stay connected. We start to feel like our brains have turned into a big pile of mush. I hear more and more about this, so part of what I found interesting about this article was its counterpoint to the current belief in “technology-driven brain mush”. In her article, Popova references what looks like a very interesting book by Clive Thompson called Smarter Than You Think: How Technology Is Changing Our Minds for the Better (public library).
Outsourced Memory + “Tip of the Tongue Syndrome”
As described by both Popova and Thompson, we now live in a world where we have quickly come to rely on technology for information reference rather than retaining that knowledge ourselves. What used to be a never-ending challenge to be book smart has been replaced by “Googling it” on the fly from anywhere, at any time. This has led to a world of “outsourced memory”. Historically, we have had similar outlets. This was the purpose of scholars and libraries – having go-to people and places to provide the information we need, when we need it, without the requirement of personal, independent retention. Now that service is provided by the Internet, search engines, and tech that have miraculously harnessed the collective knowledge of humanity to be queried in fractions of a second. Cool, right? Well, maybe. The Internet and technology have without a doubt decreased our need to remember things. We have tools to remember our grocery lists, anniversaries, and birthdays – heck, I don’t even know my Mom’s cell phone number. Why would I? My phone LOVES to do all of these little things for me. I am so lucky – unless my phone is dead, and then I am in a big world of hurt. LinkedIn keeps track of how I know the CEO at Company XYZ. Facebook reminds me of who my friends are from high school and when they were born. Pinterest keeps track of cool recipes and ideas so I don’t have to commit them to memory. The list goes on and on. The volume of knowledge (normally stored in our semantic memory) that we dump into technology has never been higher. Is this a good thing? What it has led to, as described by Thompson, is “tip-of-the-tongue syndrome”. Thompson explains,
“Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.”
So herein lies the question – does this on-demand storage of and access to information make us smarter, or does it perpetuate the mush and make us collectively dumberer?
Community Information Sharing
We’ve all heard the old saying “two heads are better than one,” and I think many of us would agree. If that is true, billions of heads would be better than two, right? Maybe. Personally, I do feel this is true, but only if there are some levels of formality, checks and balances, and filtering used to increase the accuracy of the information. Thompson references the work of Daniel Wegner, a Harvard psychologist who in the early 1980s first explored this notion of collective rather than individual knowledge. He did so by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:
"Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed."
This point touched close to home, literally. For example, my beautiful wife Tracy is a marriage and family therapist at Austin Mental Health here in Austin. She is truly an expert in relationships, adolescent development, and interpersonal communication. I rely on her more than the Internet when I have questions about dealing with our kids, family members, and friends. That being said, an important takeaway from this article is that with any trusted resource – whether a friend, professor, encyclopedia, or other reference – it is critical that we understand the strengths and weaknesses of what we rely on so heavily for information. While Tracy is my go-to resource for information about relationships, I would be crazy (no offense, Sweety!) to ask her how to change out the hard drive on my laptop. She is an expert in her field, but not one to be trusted with fixing a $1000+ piece of hardware – as such, my understanding of her strengths and weaknesses as a reference is a critical part of knowing when to use her as an access point to knowledge. The same is true for the Internet, social media, and search engines. Though it would be nice to think that all search engines are created equal and serve up an unbiased plethora of information, we all (hopefully) know that each has a super secret algorithm used to access and serve its content. This is not necessarily a bad thing, but it is important to understand that, though easily accessible, each – as with my wife – has its strengths and weaknesses, and understanding the underlying design of each tool and resource will determine the reliability and credibility of the information we receive. As such, an increasingly important part of our unconscious learning habits today is cataloguing the strengths and weaknesses of the hardware and software we use as references. This is the “metamemory” that Wegner describes.
The technology and online references we use daily only make us smarter if we understand the limitations of each and use our own brains to judge the quality and accuracy of the content they provide. An example would be comparing Wikipedia to a silly blog post like this one (which you are hopefully still reading). Most Wikipedia users understand that the site has checks and balances to keep its information accurate, constantly contributed to and vetted by the global public. The antithesis of this would be this silly blog post, where I write whatever I want, unfiltered, for the myriads of Monkee-Boy fan boys (ok, so we don’t have myriads, maybe like 10 or 11). Assuming you agree with me that Wikipedia is more reliable than my blog post, check out this 2011 article about “The Top 10 Reasons Students Cannot Cite or Rely on Wikipedia”. One of the most trusted, community-driven resources online also has its problems and skeptics, even though it has become more frequently used by students as a reference for truth. Well, that’s a little scary.
Opinion vs. Fact. Fact vs. Opinion.
The Internet has increasingly blurred the lines between fact and fiction, and as a result we may collectively be getting “smarter” with the wrong information. Social media doesn’t help this at all. Now that we can share ideas in seconds without the need for references, footnotes, and factual data, falsehoods can spread like wildfire. An example would be this article on TechCrunch called “Social Media Is ‘Worst Menace To Society’ Says Turkey PM, 25 Twitter Users Arrested”. Right or wrong, the world is using technology in new ways, and the technology of today has the power to change the world for good or evil. I truly believe our collective intelligence is indeed raised by the Internet due to the immediate access and quantity of data available, even if we do rely on outsourced memory more than ever before. It is certainly a slippery slope, however, that can easily be taken advantage of, since there truly is no site-by-site, or tweet-by-tweet, qualifier of content accuracy. That is scary, and if users simply turn off their brains, don’t question the validity of the information they are consuming and sharing, and don’t consider the motives of the people creating the content and tools, we will all collectively be getting dumberer. Thompson supports this by saying:
“These tools make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive.”
With the truly unquantified amount of data at our fingertips, our propensity is to take what we read as truth. How do we solve this? No idea. Maybe that’s a subject for a different blog post. Maybe there is an app for it that I don’t know about. Nonetheless, it is something we should all consider – we will only collectively keep from getting dumberer if we use our brains in new ways as we surf, consume, and share. Now go forth and create something valuable. Create something positive. Create something true. I’m going back to my Cheerios…