
Beyond the Smartphone

16th July 2025

My recent concept work for the Kiroshi HUD is, as I mentioned on its page, based on the Kiroshi eye implants from Cyberpunk 2077 (will this guy ever stop waffling on about that game?), and it got me thinking about how, eventually, we won't need a smartphone.


Whilst the HUD from the game is based mostly on the eye implants, it works in tandem with the cyberdeck, which most people today would consider the equivalent of a portable custom desktop-class PC. Not quite as big and official as a laptop, but not quite as touch-based and baby-interfaced as a smartphone. Somewhere in between the two. The point of it is having a fully self-controlled device on the go with actual compute power, rather than merely a toy.


By 2077, cyberdecks are no longer confined to a cool little handheld device; they're embedded into you as implants, I guess as the internal deck without the deck. It becomes an extension of you, a chip neurally linked to your brain. Your brain's operating system, so to speak. And the core functionality of this deck integrates with your Kiroshi eye implants as your "screen".


Therefore, it could be said that your "deck"/smartphone equivalent basically vanishes and you "absorb" it, becoming it. You think it, it shows in front of you. It becomes a "sixth sense".


In my HUD concept I still have a little touch panel attached to the hand, as naturally you can't think to do things on a Quest 3, which is a limitation of BCIs not quite being there yet. We have some interesting things being developed by Neuralink, and Meta themselves have things in the works, both as research papers and as prototypes of external wrist-based neural interfaces that read muscle signals at the wrist.


That last one is what is pictured in the header image, on the right. The rest of the setup is a scroll ring and a prototype called "Orion", a pair of smart glasses with a camera and screen. They showed the prototype a little while back now, and it should become an official product sometime over the coming year(s?). The three parts all work in tandem to let you "think to your wrist" and scroll on the ring, interacting with the content on the glasses without swinging your arms around like a weirdo.


Nonetheless, this prototype is not quite at the level of the Kiroshi/cyberdeck pairing. I would think that a thinner version of the glasses, combined with a Neuralink-style BCI, will likely eventually become the equivalent.


But it got me thinking: what will this world look like? No one will need a smartphone anymore. You probably won't see people staring down at a screen. You'll probably just see them with their eyes glazed over like they're daydreaming or asleep. Maybe even as though they're hallucinating.


And when you are looking at something, do you just ignore what's in front of you? How do you signal to others that you're not present? In Cyberpunk 2077, if you look closely when any other character takes a call or controls something else, their eyes glow orange or blue to communicate that they aren't present, akin to the EyeSight feature on the Vision Pro communicating similar presence to those around you.


Then there are other things: you won't need to use your hands for scrolling or interacting. This will be a godsend, as it means you won't get carpal tunnel syndrome from extensive computer usage, which many designers will tell you gets them eventually. That's a good thing.


Though I guess maybe your brain will suffer the strain instead. It also opens up a Pandora's box of privacy issues. What if people see through your eyes? What if people hack your thoughts? What if companies sell your thoughts? What if you lose yourself in the process?


Maybe a smartphone is not that bad, haha.


But I guess the fact of the matter is, eventually we'll decide to move beyond it once the alternatives are on par. The world of Cyberpunk 2077 doesn't have the AI developments we have (most AIs there are rogue, or banned and shunned), so imagine having an adaptive AR eye implant that reprograms itself as you think, with an AI model running alongside your brain.


People have spoken about this before, but what if eventually you could decide, just by thinking, to view the world around you through an anime filter? Or any style, for that matter.


What if you upgrade to view infrared, micro/radio waves, UV? New colours?


More cynically, like that one Black Mirror episode, could you block people from your vision? Or deepfake other faces onto everyone else? You could make everyone you think negatively about look like Agent Smith from The Matrix, haha.


Yeah.


Many crazy and weird things could happen. Will happen, most likely. I'm sure nothing terrible will come of this technology (sarcasm).


Either way, I'm sold on the promise of no CTS or eye strain. BCIs offer much better ergonomics than a mouse or a touchscreen.

——————