These are popular questions: Does technology bring us closer together, or merely put more devices between our faces? Does technology enhance knowledge, or does it outsource memory and weaken the brain's ability to recall? Jaron Lanier, Kevin Kelly, and many others have offered their own answers through their publications.
Right now I like Clive Thompson's Smarter Than You Think and Maria Popova's accompanying analysis. Without actually having finished the book (yes, I plan on doing so), I appreciate the story Thompson is trying to develop. The most common counterpoint to the technology-resistant is that technology can be a powerful tool if we choose to use it consciously and with a purpose. You control the mobile phone, not vice versa. Thompson takes this argument and blows up the examination tenfold.

The first opponent of technology Thompson cites must have been Socrates, back in the days of yore. As Socrates' story goes, the Egyptian god Theuth invented writing and offered it as a gift to Thamus, the king of Egypt at the time. The technology of writing, Socrates' parable argues, would "devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because knowledge stored was not really knowledge at all."
And we all know how Gutenberg took that notion and crushed it by developing the printing press, essentially revolutionizing how humans share and democratize information. The rapid growth in information production eventually yields the conundrum of storage. Thompson presents this problem as tip-of-the-tongue syndrome - that moment when you can't quite remember the full detail. The syndrome is really about employing two different types of memory, and one, Thompson argues, is more valuable than the other:
Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage – that’s semantic memory. … What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?
I usually laugh every time I have to use the word "meta" in a non-ironic fashion, but without getting too Star Wars-y or Star Trek-y: the outsourcing of memory described above has led the brain to develop a meta-memory - knowing not the specific details of what I'm trying to remember, but the details of how to use my smartphone and Evernote, the application that stores all the specific details for me.
This ability to "google" one another's memory stores, which psychologist Daniel Wegner termed "transactive memory," is, Thompson argues, the defining feature of our evolving relationship with information – and it's profoundly shaping our experience of knowledge:
Transactive memory helps explain how we’re evolving in a world of on-tap information.
He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner's, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we're far less likely to commit it to memory. On the surface, this may look like an evident and worrisome shrinkage of our mental capacity. But there's a subtler yet enormously important layer that such techno-dystopian simplifications miss: this very outsourcing of memory requires that we learn what the machine knows – a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow's findings, Thompson points out that this is neither new nor negative:
We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did – with search engines boasting that they return results in tenths of a second – our transactive habits adapted.
Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.
In short, technology presents an opportunity for us to turn information into knowledge: an opportunity to expend neurons on developing insight while we leave the more trivial bits of information at the tip of the tongue.