
The Future of Computing

We're less than two months into 2024, and we've already seen two devices that, in my eyes, give us a glimpse into the future of our phones and laptops. The ultra-hyped Apple Vision Pro pushes the boundaries of mixed reality interfaces, and the Rabbit R1 teases the potential of natural-language-based, app-free interactions with our devices. In isolation, these promising gen-1 devices have their share of limitations. What excites me more than the devices themselves is a not-so-distant future where their paths converge, blending mixed reality with natural language interaction to redefine our relationship with technology.

 

Let's delve deeper into what this inevitable convergence means for the future of computing.




Augmenting our realities


Apple's 'spatial computing' device, the Vision Pro, isn't the first mixed reality headset, but its low-latency, high-fidelity passthrough, coupled with incredibly accurate eye and gesture tracking, gives its users a taste of what it feels like to have their virtual and real worlds integrate almost seamlessly. I highly recommend watching Casey Neistat's video to truly appreciate how awe-inspiring future iterations of this technology will be. Although I wouldn't recommend that most people buy this version of the Vision Pro, the device literally opens our eyes to a future where our reality will truly be augmented by technology in an unobtrusive manner.


Talking to our devices

 

The Rabbit R1 was all the rage in the media coverage of CES 2024. The Verge aptly describes the device as a 'universal controller for apps'. It uses a 'large action model' to understand complex, multi-layered instructions in natural language and take actions in apps on your behalf. I don't see this technology succeeding as a stand-alone device paired with our phones. Instead, I'm sure this is what the Google Assistants, Siris, and Alexas of the world will evolve into, fundamentally changing how we interact with our mobile devices.

 

The convergence


OK, now dream with me and envision your laptop or phone of the future. It will bring together what the future versions of the Apple Vision Pro and Rabbit R1 can do. Your traditional screens will be replaced by virtual screens pinned to your real-world surroundings, with the device precisely tracking your eye and body movements so you can interact with what you see. You won't have to open and navigate through specific apps to perform tasks. You'll simply talk to your device like you would with a human and expect it to be context-aware enough to take the exact series of actions you desire. For the Iron Man fans out there, yes, I am telling you that you'll soon have your own version of Jarvis!


It could get scary!

 

As excited as I am about this eventuality, I'd be remiss if I didn't mention the risks. Issues like privacy and device addiction will only get amplified. Then there's the Pandora's box of not being able to distinguish the virtual world from the real one, or not being able to tell a human apart from an AI model. As with any transformational technology, our laws, regulations, and even lifestyles will have to evolve to navigate the potential pitfalls. If we manage to do that, we're on the precipice of the next big revolution in personal computing.

 

Thank you for tuning in. If you liked this, make sure to subscribe, and I'll see you in the next episode!

