Taking A Gigabyte Out Of Your Brain
Ever since Plato declared that writing would make people “cease to exercise memory because they rely on that which is written,” humans have had a tendency to criticize the newest methods of disseminating, recording, and representing ideas. The accusations are always similar to Plato’s: precious internal memory is supplanted by external memory, making the retention of knowledge obsolete. However valid these criticisms may be, it’s foolish to argue that writing is inherently detrimental; in the centuries after the invention of the printing press in the 15th century, it spurred massive intellectual movements and democratized knowledge. But in the past century, new methods of disseminating ideas have sprung up, giving us an unprecedented number of ways to store and communicate information. The digital age we currently inhabit took the human mind by storm. As it continues to swirl around us at blinding speed, we should take a moment to zen out and hark back to Plato. I mean, they do say he was a pretty smart dude.
Plato’s beef started with two soul-like entities: internal and external memory. He claimed that true knowledge can come only from the former, so as we store more and more information in the latter, we weaken our minds by focusing less on the knowledge itself and more on how to access it. Offloading what we know onto others in this way is a phenomenon called transactive memory. It’s common among couples, families, teams, and any other mutually cooperative group of people, and it extends to other group psychology phenomena in evolutionarily beneficial ways, except when it manifests as groupthink in large, threatening groups of people.
However, nowadays we share memory not just with other human beings, but with devices, exchanging information with them at ever-increasing rates. In addition, the ubiquity of research tools allows us to access pretty much anything we would want to know at the tap of a finger. This ease of access means we don’t really need to remember things to get by, so our brains pay less attention to details we would otherwise put effort into retaining. This retention isn’t only for fun facts like how far our blood vessels would stretch laid end to end, but also for significant memories, such as those intimate, personal moments we may post on social media. A 2018 study led by Princeton psychologist Diana Tamir concluded that media use during an event, such as taking photographs and posting on social media, impairs one’s subsequent memory of that event. This is not only because of distraction, but also because the event is externalized: it is placed somewhere else to be stored for future recollection, and the quality of that recollection is impaired in the process.
But how is this different from scrapbooks or any other photo album? The key difference is the ease with which social media and technology can be used compared to these more tactile items. Nowadays, we are so accustomed to the order in which we tap buttons on our phones that we put much less care into how we document our memories. Sure, we may tailor our photos or posts to perfection, but even this process is less about the memory than about how the final product appears. All of these little routines surrounding the memory end up dominating our attention. Betsy Sparrow and two other leading memory psychologists confirmed this back in 2011 in a landmark study, suggesting that “we are becoming symbiotic with our computer tools,” detaching from our present experiences and becoming wired into a technological network that stores memories in a brand-new dystopian, impersonal way.
Metaphors comparing human interaction to computer networks have been around as long as computers have. To say humans are “hardwired” or “programmed” to do something seems perfectly valid. But it may be that these metaphors aren’t as abstract as one might think. Maybe we are truly becoming more like computers and less like human beings. Our relationships with our phones and computers have become just as complex as our relationships with other human beings. Storing memories on our phones, confiding in Google about personal questions, and posting about our lives online have replaced the normal interpersonal ways of sharing these things. Somebody from ten years ago would most likely be appalled at how we interact with each other during special life events. If there is one thing we must not forget, it is that nothing can replace those transient, special moments in life that fill us with pleasure as they happen. Luckily, we know that even without Google.
Christian Fogerty ‘19 studies in the College of Arts & Sciences. He can be reached at c.fogerty@wustl.edu.