Bioreality
- Nikola Njegovan

- Oct 7
Updated: Oct 31

Throughout my day, I have a cup of tea. It sits next to me while I work, a quiet companion to me, my computer, and the two screens I stare at for most of my waking hours. The cup itself is a piece of engineering, designed to keep the tea at precisely 135 degrees Fahrenheit for as long as the battery lasts. I use this cup because of the way I drink: slowly, in sips, pausing between tasks. If it weren't constantly heated, the tea would go cold before I finished half the cup; yet by the same token, I rarely stop my other tasks long enough to truly sit down and simply enjoy it.
And it’s in that small ritual that I’m reminded that something about our world has shifted.
We design and use tools that serve our fragmented lives, tools that extend our attention across too many surfaces at once. My tea stays warm not because I’ve made space to enjoy it, but because I need technology to preserve it while I remain absorbed in technology.
Most of us can feel this shift, even if we struggle to name it. We sense it at the edges of our conversations, but we rarely put words to it because we are too close to it, too involved in its making, too invested in the tools that are reshaping everything around us.
The pandemic only accelerated what was already underway. We’ve slipped further into a fabricated world. The more we lean into digital spaces, the more our lives are mediated by symbols: words, images, text, and feeds that point toward the real world but are not the real thing.
And in this space, something is emerging, if not already here: a subtle distrust of reality itself.
Generative AI makes this even sharper. Machines are not just calculating; they are producing words, images, and voices that look and sound like ours. They can pass for us. They can blur the line between what is authentic and what is synthetic.
The result is not just a question of “fake news” or misinformation. It’s deeper. It’s about whether we can still tell the difference between the world itself and the representations of it we’ve created.
Maya and the World of Symbols
Most of our days are now spent with symbols, not things. We scroll, type, text, and post. Logical sequences of words and icons structure our experience. But they are only markers. They point at the real without ever touching it.
I was reminded of this recently while rewatching a lecture by Alan Watts, one of the great Zen teachers of the West. In it, he spoke about the Hindu concept of Maya, the idea that the world of forms, names, and symbols is not the ultimate reality, but rather a necessary veil or structure through which we make sense of life. Symbols are essential to society and culture; they allow us to organize, communicate, and act together.
But Watts also warned about a distinctly human tendency: we begin to confuse the symbols for the reality itself. We become so focused on maintaining the structure of society that we cast out anyone who might disrupt it. This pattern has played out across history. Sometimes it’s as small as someone being dismissed from a company for not following bureaucratic principles. Other times it’s entire religious systems being challenged by rebels, leading to schisms, reformations, or the rise of new orders altogether.
This is part of the human condition: we cling to the structures we build, and we defend them fiercely, even at the cost of silencing those who speak inconvenient truths. But the courage of individuals to speak their perspective even when society tries to suppress it is also what drives us forward. They remind us that the symbols are not reality itself, and that no structure, however useful, can contain the whole of life.
When Technology Shapes Our Shared Language
Another layer to this story is how deeply technology shapes not just what we do, but how we think and speak. Our communication, our symbols, even our metaphors are increasingly influenced by the tools we use to organize information.
Take the spreadsheet. In the business and corporate world, spreadsheets have been such a dominant way of structuring data that they have quietly reshaped how people communicate. You will often hear someone explain an idea in terms of “columns” and “rows,” or frame decisions as if every variable must fit neatly into a cell. The tool has become so ingrained in the way we work that it does not just store information, it becomes the mental model for how information is supposed to be.
The same thing happens with jargon and acronyms. “Corporate speak” is essentially a shared symbolic language, a shorthand built to move information quickly across hierarchies and teams. Over time that shorthand becomes the language itself. It influences how people perceive problems, how they discuss solutions, and even how they measure success. What began as a convenient tool for clarity often evolves into a rigid structure that narrows thought.
This is exactly what Alan Watts was warning about when he spoke of Maya. We create symbols and structures to help us navigate reality, but then we begin to mistake those symbols for the reality itself. The spreadsheet, the acronym, the corporate framework, the AI-generated phrase or image: all of these are tools of representation. They are useful, but they are not the world itself. The risk is that by immersing ourselves too fully in these symbolic systems, we begin to forget what they are meant to represent. We risk placing the symbols above our real connection with each other, clinging to the illusion because it feels safe, dividing ourselves and forgetting that we are the same living beings, bound by the same needs and fragility.
The Question of Direction
When I was younger I studied engineering and was deeply involved in both the academic and social sides of undergraduate life. One afternoon I attended a lecture delivered by a former president of the American Cancer Society. He began his talk to the room of young bioengineering candidates with a simple but powerful idea. He said that bioengineering could be defined in two ways: as “engineering for the sake of biology” or as “using biology as the tool for engineering.”
That distinction has stayed with me ever since. On the one hand, we can look at something like generative AI. It is, in a sense, engineering for the sake of biology, but it also relies heavily on biomimicry. Its architectures are built on observing and reproducing patterns found in biological systems. In that way biology itself (or at least the observation of biology) becomes the tool for engineering.
During my last year at university, all students in engineering-related degrees were required to take an ethics class. The reasoning was clear. Creating and using technology, no matter the field or purpose, is always a choice. Do we serve life, or do we bend life into something that serves us?
That same question now applies to artificial intelligence, to progress itself, and to the tools we are unleashing into the world. But here is where things become even more pressing. The technology we are building today is amplified. Its direction and trajectory carry consequences far greater than the simple question of whether someone can present information in a spreadsheet.
AI in its current form is already pushing progress at a speed we have never experienced before. The problem is not acceleration itself. The problem is that we may be accelerating without steering. Driving fast is not inherently dangerous because of the speed. It is dangerous because of what happens when you hit the wall.
Remember What Is Real
Most of the time, my cup of tea is just sitting there, warm and waiting, preserved at the intended temperature by the engineered technology that surrounds my life. I take sips while I write, while I answer emails, while I stare into the glow of the screens. Rarely do I stop to sit with it, to notice the stillness of the liquid or the way a ripple forms and dissolves across its surface.
But that is exactly what we must remember. To pause, to notice, to hold on to the simple reality that is right in front of us. The warmth of the tea is real. The surface tension is real. The moment of stillness is real. And when we take the time to recognize it, we remember something even greater: our humanity. Not only our own, but the humanity in others.
It is this recognition that makes it possible to use the tools we are building wisely. Because in the end, we absolutely need each other. We always have. Our survival, our progress, our creativity, our breakthroughs have all come from the same source: our ability to trust, to rely on, and to help one another.
We will choose to meet this new age with that same humanity. We will, because we have done it before. We have faced moments where the ground beneath us seemed to shift, where new tools reshaped the fabric of society, and we found our footing together. We have adapted, we have rebuilt, we have reimagined. Each time we have returned to what is most real and most human.
The velocity of progress is not going to slow, but we are not powerless passengers. We are its guides. And the foundation of that guidance will not come from algorithms or abstractions, but from the memory of what is real. The ripple on the surface of the tea. The presence of another human being. The shared trust that binds us together.
This is the ground we will stand on as we build. This is how we will guide progress instead of being consumed by it.