Marshall and I had been going back and forth for over three hours. What started as a casual conversation about innovation had spiralled into an intense debate over a so-called revolutionary invention: a device that supposedly produced free energy. Someone had made the claim online, even going so far as to meet the president and present a car that allegedly ran on free energy. Marshall, ever the optimist, was willing to entertain the possibility that it could be real.

I, on the other hand, was having none of it. “It’s a scam,” I said, for what must have been the tenth time. “You can’t just create energy out of nothing. That violates the laws of physics.”

“But what if,” Marshall countered, “we just don’t understand everything yet?”

“Science evolves,” I said, “but laws like thermodynamics don’t just stop being true because someone on the internet claims they found a loophole.”

I believe Marshall sighed, shaking his head. “You’re so sure about this. But what if, just this once, you’re wrong?”

That question lingered in my mind long after the debate ended. Not because I doubted my stance (science is clear on this), but because it forced me to reflect on my own approach to knowledge. Dostoevsky once said, “Man is a creature that can get accustomed to anything.” Had I become too accustomed to my role as the rational one, the debunker, the one who knows? Had I stopped being open to learning, not about free energy, but about the psychology of belief, the power of persuasion, and the broader human tendency to seek out hope, even in the face of impossibility?

During our debate, Marshall had challenged me further, asking, “Who decides what is true? The majority? The same majority that once believed the earth was flat and killed those who disagreed?”

I responded, “You conflate truth with belief and consensus. Truth, in its objective form, is independent of human perception. The earth was never flat, even when the majority believed so. Their belief was simply incorrect. Truth does not change; only our understanding of it does.”

Marshall wasn’t convinced. The nature of truth is often misunderstood. I think people cling to ideas because they feel true, not necessarily because they are. And yet, even as I defended objective reality, I recognised that my approach to such discussions needed refining.

I have always prided myself on being grounded in reason. I am not easily swayed by pseudoscience, conspiracy theories, or the latest “breakthrough” that claims to defy fundamental laws. Over the years, I think I have sharpened my ability to spot a scam: vague technical jargon, appeals to emotion over evidence, claims that sidestep peer review and rigorous testing.

But something about that conversation with Marshall hit me, and hit me hard, maybe because we have been friends since first grade. It was about the nature of belief and how people become invested in ideas that, to them, feel real. Marshall wasn’t an idiot. He wasn’t gullible. In fact, he remains one of the best minds I know. He was just someone who wanted to believe that the world still had grand, undiscovered possibilities. And perhaps, in his own way, he was challenging me to consider whether my scepticism had hardened into dogma.

That night, I dug out an old journal. Flipping through pages, I found a quote I’d scribbled after my first heartbreak: “The most dangerous lies are the ones we need to believe.” Back then, I’d applied it to love. Now, it felt like a key to understanding my friend’s defiance.

We adapt to survive not just physically but emotionally. We grow calluses around tender hopes and learn to navigate a world where miracles are rare and disillusionment is cheap. Marshall’s faith was a refusal to accept that some problems might not have solutions. It was a classic rebellion against despair.

And what of my own adaptations? I traced the arc of my scepticism: the years spent dissecting scams, the pride in being “the rational one.” But reason, I realised, can calcify into rigidity. I’d built an identity around debunking, so much so that I’d forgotten how to engage with the why behind the myths, the human ache that turns whispers into dogma.

I do not intend to speak for him, but I think he wasn’t arguing that it was real. He was arguing that it was possible.

Could it be that the device wasn’t really a machine at all, but a metaphor? Maybe it represented the tantalising idea that humanity could course-correct, that our mistakes weren’t irrevocable. For me, however, debunking it was a talisman against chaos, a way to assert order in a world that feels increasingly unpredictable.

I believe that was what Marshall was clinging to, even if he couldn’t quite articulate it. He wanted to believe that someone, somewhere, had found a way to crack the code, to upend everything we thought we knew. And while I was, and still am, convinced the claim is false, I have to admit that my immediate instinct wasn’t to explore why people believe these things, but to shut the idea down outright.

Had I been so intent on being a teacher that I had lost out on the benefits of being a student?

A few days after our debate, I found myself researching historical scientific hoaxes: perpetual motion machines, cold fusion claims, miracle health cures. The promise of something too good to be true is incredibly powerful. People aren’t just convinced by flawed science; they are convinced by the hope that something revolutionary can change the world.

So, I started reading more, not about physics (I already knew the answer there), but about human psychology, about persuasion, about the fine line between scepticism and cynicism. I realised that I didn’t have to abandon reason to have more meaningful conversations. I just had to shift my approach from arguing to understanding.

As I sank deeper into this research, I came to see that the best way to challenge the commodification of misinformation isn’t just to refute it. It’s to understand why people believe it in the first place. Telling someone they’re wrong doesn’t change their mind. Asking them why they believe something, engaging with their reasoning, and guiding them to question their own conclusions is far more effective.

I wasn’t wrong about free energy; it still is a scam. But I realised that my certainty had made me dismissive rather than curious. Instead of asking, “Why do people want to believe this?” I had only focused on, “How do I prove this wrong?” There is a difference between being right and being effective in a discussion.

So what is the point of this? The world is full of scams. But it’s also full of Wright Brothers moments, ideas that once seemed impossible until they weren’t. The task isn’t to dismiss the impossible, but to discern, with patience and humility, the line between miracle and mirage.

And sometimes, to do that, you need someone who’ll argue with you until 2 a.m., not to change your mind, but to keep it agile. To remind you that inquiry, at its best, is a dialogue not just with others, but with the parts of ourselves we’ve too quickly deemed certain.