How many mobile numbers can you recite without referring to an external source? If that number is more than five, congratulations: you belong to the top 25 percent of people who can. Pretty cool, if not for the fact that it’s a completely made-up stat. Though my fabrication probably doesn’t stray too far from the truth.

You don’t need an article to tell you how storing information on devices is commonplace. Even Instagram does a great job reminding you what happened this day four years ago; something you’d only be able to conjure if you consciously, manually tracked your memories, and who’s psychotic enough to do that on a regular basis?

And of course, there’s the infamous ‘Google Effect’: the phenomenon where individuals are more likely to forget information they know can be easily accessed online. This heavy reliance may lead to passive consumption of data as opposed to active processing and engagement, dulling our critical thinking in the long run.

Experiments have shown how even taking photographs can alter your memory of an event: participants with cameras retain less of an experience than those without. Worse still if said photos are intended for sharing, because the added factors of presentation and liability to judgement pose yet another distraction from the encounter. It would be unfair, though, to call it purely digital amnesia.

We’ve taken notes and marked calendars since long before they were little squares on our screens. We seem prone to bypassing internal memory whenever it’s convenient. Which, to no one’s surprise, potentially weakens recollection skills by reducing the brain’s ability to effectively encode and consolidate fresh information.

Nathaniel Barr, Gordon Pennycook, Jennifer A. Stolz, and Jonathan A. Fugelsang cover this tendency to offload thinking in favour of low-effort intuitive processing in ‘The brain in your pocket: Evidence that Smartphones are used to supplant thinking’. Building upon prior concepts of “cognitive miserliness”, a simple Cognitive Reflection Test best illustrates the point.

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost? ___ cents.

Chances are your initial response was wrong, and basic math would spell out why: the intuitive answer is 10 cents, but the correct one is 5. Typical scores of college students and online-recruited participants registered only 33 percent correct (Campitelli & Gerrans, 2014; Frederick, 2005).
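For the sceptical, the puzzle’s arithmetic can be sanity-checked in a few lines (the `check` function here is just an illustration, not anything from the original study):

```python
def check(ball_cents):
    """Does this ball price satisfy both conditions of the puzzle?"""
    bat_cents = ball_cents + 100          # the bat costs $1.00 more than the ball
    return bat_cents + ball_cents == 110  # together they must cost $1.10

print(check(10))  # False: 10 + 110 = 120 cents, not 110
print(check(5))   # True:  5 + 105 = 110 cents
```

The intuitive answer of 10 cents fails because it forgets that raising the ball’s price also raises the bat’s, doubling the error.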

Thankfully, apart from adequate sleep and physical exercise, memory can be boosted via simple brain training. No need for memory palaces like Sherlock’s (though they’re proven useful). Simple sudoku or word puzzles help. Otherwise, try journaling or a daily mental recap. If that’s somehow still asking too much, start with memorising a couple of significant birthdays rather than letting social media do it for you.

Not only will concentration and analysis be enhanced; problem-solving and overall mental agility will also be sharpened. And like what mahjong does for the elderly, the risk of dementia will also be reduced.

Perhaps the most interesting of the benefits is the lesser-known effect it has on creativity. The better trained your memory, the better you are at association, which is great for drawing connections between seemingly unrelated concepts and ideas, sparking eureka moments and driving innovation. So what are you waiting for? Try Remembering Things™ today!

Don’t get me wrong. Freedom is great. Power to the people. Without it, I wouldn’t be able to write a scathing op-ed about the parts of it that make me wary (thanks, Ancient Greece, birthplace of modern democracy). Trust me, as both a consumer and producer of content, I fully acknowledge the irony.

Dispersing legislative and judicial authority prevents a single entity or individual from abusing their position—which is generally the direction you’d want to head as a civilisation. It’s this civic participation that promotes accountability, but what happens when the opportunity to participate is too freely available, and the active participants are composed only of select personalities that are naturally inclined to, well, participate?

Let me steer away from the notion of government and focus on culture. The Internet was obviously the great usher of an equitable albeit virtual society. With effectively no one owning or governing it, in the words of Berkeley astronomer Clifford Stoll, “It’s the closest thing to true anarchy that ever existed.”

Anyone with access could contribute as much as they could partake. Where making and marketing an album or a movie once required expensive equipment, experience and connections, being able to do so with little more than a smartphone dramatically levelled the playing field (resulting in stats like the lifetimes’ worth of hours needed to watch all existing videos on YouTube alone).

It would be beyond ungrateful to lament the extremely wide spectrum of choice. We will never run out of things to watch when we train the machine to automatically feed us at least four more ‘you might like this’ suggestions.

But thus, we stay stuck in a loop of limited exposure.

Talk to anyone from the days of yore (specifically, before digital TV), and you’ll find most of them are able to bond over what was on screen at prime time. Chuck Klosterman, author of The Nineties, called it “the last era that held to the idea of an objective, hegemonic mainstream before everything began to fracture” in his exposition on the defining decade.

It may not be direct causation, but is it possible that one factor pushing this divisive climate of opinions and temperaments to an all-time high is that we keep chowing down on only what appeals to us, made by people who already share our perspectives?

Our last major shared experience was probably COVID. And maybe Tiger King. Now, at the seeming height of streaming, enter Sora. OpenAI’s next big thing since ChatGPT constructs realistic videos from text prompts at a standard that is frightening. It’s great that tools to create are available for anyone to express their ideas (maybe not so much for graduates who spent years earning qualifications to use earlier versions of said tools, but c’est la vie).

It means more diversity, representation, and recognition. However, at this zenith of infotainment free-for-all, opening ourselves to alternative viewpoints is definitely going to take a little more conscious diligence than sitting back to let an algorithm decide what to watch.

I often wonder what Andy Warhol would think about current celebrity culture, given his most attributed quote about a universal 15-minute notoriety. Which is not even verbatim, apparently. Prophecy aside, what would the visual artist make of the 21st century sea of trashy reality TV and viral reels?

Putting people on a pedestal traces back to royalty and religious figures throughout history. This, apart from making Jesus the OG influencer and another pun about God-shaped holes, demonstrates how an innate aspirational desire existed even before the advent of mass media. It’s almost like preparation met opportunity with the rise of Hollywood, tabloid culture and the successive Internet-accelerated commodification of fame.

There’s plenty of literature exploring celebrity impact on societal dynamics, but would it be fair to say the root of the obsession is a little more complex? Quite literally anyone can cultivate a fan base, without even being human. First, it was pets; now it’s AI thirst traps.

You have to admit the metrics are inconsistent too. Widespread circulation and exponential interconnectivity of diverse platforms today allow individuals of various fields to gain recognition, even going on to become an international phenomenon. Yet, we don’t necessarily regard their achievements with the same weight as the ones within the entertainment industry. Say, a semi-decent actor versus an exceptional... accountant. The extent of our interest can be equated with how much time these personalities spend in the spotlight; their relevance a parallel to how prominent they remain after we notice them, whether for their careers or their antics, à la Musk, Trump, etc.

So what fundamental aspects of human psychology does this enduring allure reflect? Why do we confer this status to entertainers, specifically? What makes fame increasingly enticing to each subsequent generation since? To loosely quote a TikTokker, “Think about it—medieval peasants didn’t ask the jester for a photo after his courtroom fart.”

I’m not against celebrities; I’m just not for inflating a performance beyond what it is. Being influenced is one thing, idolising is another. It’s that eternal debate of whether we should divorce a person’s work from their conduct, no doubt prompted by the characters we’ve dubbed "tortured geniuses".

If anything, these may be the last people whose behaviour we’d want to emulate. The very nature of the profession demands a certain spoonful of egocentric attributes. Worse still if said personas were thrown into a star-making machine from an impressionable age (it doesn’t help that K-pop trainees eventually graduate to become ‘idols’).

Imagine spending your formative years ingrained with the need to be validated because your worth is directly proportionate to public opinion. Imagine being constantly engulfed by people who relate to you like a product because they have a job to do. What sort of worldview would that shape?

I’d argue that present-day fame transcends escapism. It has gone deeper than connection, into identification, and thus emotional attachment. We surely know better than to consider everyone with a voice a role model, but in a time when fame is powered by the very attention and admiration we give, let’s perhaps not freely relinquish our respect to a fallible sense of the extraordinary.

I can’t recall the last time I did, off the top of my head. My mind goes straight to basic survival like the consumption and expulsion of energy; eating and defecating (excuse the unsavoury start to this article). Yet, even these exercises are hardly ever carried out on their own anymore. You spend your lunch with mobile Netflix and play <insert top App Store game> on the crapper.
It certainly doesn’t help that AI is continuously advancing in proficiency. If the Industrial Revolution reduced back-breaking labour 300 years ago, bestowing folks time to pursue interests outside the daily grind, AI is now doing the same with mental labour. Which means more time on our hands, while the constant need to do something only intensifies.

We’re wired for stimulation, as exemplified by doomscrolling. Even without TikTok-induced dopamine highs, we’re too steeped in a state of overstimulation to acknowledge it. On numerous occasions, I’ve caught myself thumbing my phone not only during commercials (thanks, YouTube) but also during the shows that I’m watching.
It blows my mind to recall that listening to music used to be a pastime in itself. Ever since they made gramophones fit in our pockets, songs have become musical white noise for the commute. Even then, Spotify isn’t the app you’re primarily engaging with. You’re sifting through emails, answering texts, replying to comments (I promise this is not a smartphone-hating piece).

A lot came with modern convenience, but a lot left as well. With everything instantaneously available, value is lost and gratitude diminishes. It’s an extreme analogy, but we once had to physically get out there and forage for sustenance, not just click “Check Out” (and for certain parts of the world, optional or not, that’s still the case).
This displacement is so poetically encapsulated in Triangle of Sadness, after the motley crew gets marooned on the island. The dynamic shift based on life-and-death priorities effectively spells out how challenging, and therefore valuable, a simple task like keeping yourself fed can be.

We are evidently geared for different times. Consider a washer-dryer versus manually doing a load of laundry. With this luxury of time, we should allow ourselves to simmer in one activity at a time.

Say it with me: “Meaningful engagement” isn’t a hippie phrase.

A study by the Institute of Psychiatry at the University of London found that multitasking makes you dumber. Think poor sleeping habits are bad? Multitasking was found to dent your IQ more severely than losing a night’s sleep or watching hours of trash TV. Done chronically, it can even decrease grey matter density in parts of the brain.
There’s no self-help angle here. I could advise scheduling “deep work” at “peak performance time”, away from “distractions”, but I’d instead proffer the sinful cliché of a perspective change.

Is it plausible to retrain ourselves to concentrate even amid internal and external interference? To plan for recreation, in its true sense, the way we do healthy work practices: in blocks of pure, present recognition.
In the past, fasting was one religious way to connect to a higher power. Perhaps not only because the devout abstain from sensory indulgence, but because the absence of needing to hunt/kill/gather/flay/cook/clean likely returned hours; hours used for quiet meditation. A contemporary equivalent wouldn’t be abstinence from bodily grub but from mental fodder.

You'd be amazed how long a weekend can be without the Internet. Remove media consumption from leisure and all that’s left is either existential panic at newfound boredom or production. To create. Write, sketch, heck, dance. Explore what the body and mind are capable of. Appreciate the endeavour and how we can afford to partake in it.

Dial down the ambition; we don’t have to do them all. Only one at a time. And for once in a long time, focus.