Make Your VR Controllers Handle Like Two-Handed Weapons

Wielding things like two-handed swords in VR can be awkward. There’s no sense of grasping a solid object. The controllers (and therefore one’s hands) feel floaty and disconnected from one another, because they are. [Astro VR Gaming] aims to fix this with a DIY attachment they are calling the ARC VR Sword Attachment.

The ARC is a 3D-printed attachment that allows a player to connect two controllers together. They can just as easily be popped apart, which is good because two separate controllers are what one wants most of the time. But for those moments when hefting a spear or swinging a two-handed sword is called for? Stick them together and go wild.

The original design (the first link up above) uses magnets, but an alternate version uses tapered inserts instead, and provides a storage stand. Want to know if the ARC is something you’d like to make for yourself? Watch it in action in the video embedded just under the page break.

VR is an emerging technology with loads of space for experimentation and DIY problem solving. We wish more companies would follow Valve’s example of hacker-friendly hardware design, but even just providing CAD models of your hardware to make attachments easier to design would be a big step forward, and something every hacker would welcome.

FPV Flying In Mixed Reality Is Easier Than You’d Think

Flying a first-person view (FPV) remote controlled aircraft with goggles is an immersive experience that makes you feel as if you’re really sitting in the cockpit of the plane or quadcopter. Unfortunately, while you’re wearing the goggles, you’re also completely blind to the world around you. That’s why you’re supposed to have a spotter nearby to keep watch on the local meatspace while you’re looping through the air.

But what if you could have the best of both worlds? What if your goggles not only allowed you to see the video stream from your craft’s FPV camera, but also let you see the world around you? That’s precisely the idea behind mixed reality goggles such as the Apple Vision Pro and Meta’s Quest; you just need to put all the pieces together. In a recent video [Hoarder Sam] shows you exactly how to pull it off, and we have to say, the results look quite compelling.

VR Headset With Custom Face Fitting Gets Even More Custom

The Bigscreen Beyond is a small and lightweight VR headset that in part achieves its small size and weight by requiring custom fitting based on a facial scan. [Val’s Virtuals] managed to improve fitment even more by redesigning a facial interface and using a 3D scan of one’s own head to fine-tune the result even further. The new designs distribute weight more evenly while also providing an optional flip-up connection.

It may be true that only a minority of people own a Bigscreen Beyond headset, and even fewer of them are willing to DIY their own custom facial interface. But [Val]’s workflow and directions for using Blender to combine a 3D scan of one’s face with his redesigned parts to create a custom-fitted, foam-lined facial interface are good reading, and worth keeping in mind for anyone who designs wearables that could benefit from custom fitting. It’s all spelled out in the project’s documentation — look for the .txt file among the 3D models.

We’ve seen a variety of DIY approaches to VR hardware, from nearly scratch-built headsets to lens experiments, and one thing that’s clear is that better comfort is always an improvement. With newer iPhones able to do 3D scanning and 1:1 scale scanning in general becoming more accessible, we have a feeling we’re going to see more of this DIY approach to ultra-customization.

VR Headset With HDMI Input Invites A New Kind Of Cyberdeck

Meta’s Quest VR headset recently got the ability to accept and display video over USB-C, and it’s started some gears turning in folks’ heads. [Ian Hamilton] put together a quick concept machine consisting of a Raspberry Pi 400 that uses a VR headset as its monitor, which sure seems like the bones of a new breed of cyberdeck.

With passthrough on, one still sees the outside world.

The computer-in-a-keyboard nature of the Pi 400 means that little more than a mouse and the VR headset are needed to get a functional computing environment. Well, that and some cables and adapters.

What’s compelling about this is that the VR headset is much more than just a glorified monitor. In the VR environment, the external video source (in this case, the Raspberry Pi) is displayed in a window just like any other application. Pass-through can also be turned on, so that the headset’s external cameras display one’s surroundings as background. This means there’s no loss of environmental awareness while using the rig.

[Note: the following has been updated for clarity and after some hands-on testing.]

Here’s how it works: the Quest has a single USB-C port on the side, and an app (somewhat oddly named “Meta Quest HDMI link”) running on the headset takes care of accepting video over USB and displaying it in a window within the headset. The video signal expected is UVC (or USB Video Class), which is what most USB webcams and other video devices output. (There is another way to do video over USB-C: DisplayPort alt mode, which both the video source and the USB-C cable have to support. That is not what’s being used here; the Quest does not support it, nor is it accepting HDMI directly.) In [Ian]’s case, the Raspberry Pi 400 outputs HDMI, and he uses a Shadowcast 2 capture card to accept HDMI on one end and output UVC video on the other, which is then fed into the Quest over a USB-C cable.
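Since UVC is just the standard webcam class, a capture card’s output can be sanity-checked on any PC before involving the headset at all. Here’s a minimal sketch (an illustration, not part of [Ian]’s build; it assumes the opencv-python package is installed, and the device index of 0 is a guess that may differ on your machine):

```python
def grab_uvc_frame(device_index: int = 0):
    """Grab a single frame from a UVC device such as a Shadowcast 2 capture card.

    To the operating system a UVC device is just a webcam, so any webcam API
    can read it. Sketch only; the device index is an assumption.
    """
    import cv2  # opencv-python, imported lazily so the sketch parses without it

    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError(f"no UVC device found at index {device_index}")
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("device opened but returned no frame")
    return frame  # height x width x 3 BGR pixel array
```

If a frame comes back, the HDMI source and capture card are working end to end, and that same stream is what the Quest’s HDMI Link app would render in-headset.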

As a concept it’s an interesting one for sure. Perhaps we’ll see decks of this nature in our next cyberdeck contest?

Meta Cancels Augmented Reality Headset After Apple Vision Pro Falls Flat

The history of consumer technology is littered with things that came and went. For whatever reason, consumers never really adopted the tech, and it eventually dies. Some of those concepts seem to persistently hang on, however, such as augmented reality (AR). Most recently, Apple launched its Vision Pro ‘mixed reality’ headset at an absolutely astounding price to a largely negative response and disappointing sales numbers. This impending market flop now seems to have made Meta (née Facebook) reconsider bringing a similar AR device to market.

To most, this news will come as little surprise, considering that Microsoft’s AR product (HoloLens) explicitly targets (government) niches with substantial budgets, and Google’s smart glasses have crashed and burned despite multiple market attempts. In a consumer market where virtual reality products are already desperately trying not to become another 3D display debacle, it seems clear that amidst all this sci-fi-adjacent ‘cool technology,’ plenty of executives and marketing critters forgo the basic question: ‘why would anyone use this?’

Meta Doesn’t Allow Camera Access On VR Headsets, So Here’s A Workaround

The cameras at the front of Meta’s Quest VR headsets are off-limits to developers, but developer [Michael Gschwandtner] created a workaround (Linkedin post) and shared implementation details with a VR news site.

The view isn’t a pure camera feed (it includes virtual and UI elements), but it’s a clever workaround.

The demo shows object detection via MobileNet V2, which we’ve seen used for machine vision on embedded systems like the Raspberry Pi. In this case it is running locally on the VR headset, automatically identifying objects even though the app cannot directly access the front-facing cameras to see what’s in front of it.
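For a sense of how little code the inference side involves, here’s a hedged desktop sketch of MobileNet V2 classification using the pretrained Keras weights. This is an illustration of the model, not [Gschwandtner]’s headset code, and it assumes tensorflow is installed (the ImageNet weights download on first run):

```python
def classify_frame(image_path: str, top: int = 3):
    """Classify an image with pretrained MobileNet V2 (ImageNet weights).

    Sketch only: assumes tensorflow is installed; this is not the
    on-headset implementation described above.
    """
    import numpy as np
    from tensorflow.keras.applications.mobilenet_v2 import (
        MobileNetV2, preprocess_input, decode_predictions)
    from tensorflow.keras.preprocessing import image

    model = MobileNetV2(weights="imagenet")  # downloads weights on first run
    img = image.load_img(image_path, target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    preds = model.predict(x)
    # returns [(class_id, label, score), ...] for the `top` best guesses
    return decode_predictions(preds, top=top)[0]
```

The on-headset version does essentially this on every captured frame, just fed by the roundabout video path described below.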

The workaround is conceptually simple, and leverages the headset’s ability to cast its video feed over Wi-Fi to other devices. This feature is normally used for people to share and spectate VR gameplay.

First, [Gschwandtner]’s app sets up passthrough video, which means that the camera feed from the front of the headset is used as background in VR, creating a mixed-reality environment. Then the app essentially spawns itself a Chromium browser, and casts its video feed to itself. It is this video that is used to — in a roundabout way — access what the cameras see.

The resulting view isn’t taken directly from the cameras; it’s akin to snapshotting a through-the-headset view, which means it contains virtual elements like the UI. Still, with passthrough turned on it is a pretty clever workaround that is contained entirely on-device.

Meta is hesitant to give developers direct access to camera views on its VR headsets, and while John Carmack (former Meta consulting CTO) thinks it’s worth opening up and can be done safely, that access hasn’t arrived yet.

Robust Speech-to-Text, Running Locally On Quest VR Headset

[saurabhchalke] recently released whisper.unity, a Unity package that implements Whisper locally on the Meta Quest 3 VR headset, bringing nearly real-time transcription of natural speech to the device in an easy-to-use way.

Whisper is a robust, free, open-source neural network capable of quickly recognizing and transcribing multilingual natural speech with nearly human-level accuracy, and this package implements it entirely on-device, meaning it runs locally and doesn’t interact with any remote service.

It used to be that voice input for projects was a tricky business with iffy results and a strong reliance on speaker training and wake-words, but that’s no longer the case. Reliable and nearly real-time speech recognition is something that’s easily within the average hacker’s reach nowadays.
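As a point of comparison for how low the barrier has gotten on the desktop, the reference openai-whisper Python package (a different wrapper than whisper.unity, which builds on whisper.cpp) can transcribe a file in a few lines. A sketch, assuming the package is installed:

```python
def transcribe(audio_path: str, model_size: str = "base"):
    """Transcribe speech from an audio file with OpenAI's Whisper.

    Sketch assuming the `openai-whisper` package is installed; the model
    runs entirely locally, like the Quest 3 package described above.
    """
    import whisper  # pip install openai-whisper; imported lazily

    model = whisper.load_model(model_size)  # downloads weights on first use
    result = model.transcribe(audio_path)
    return result["text"]
```

No speaker training, no wake word, no cloud service — which is exactly the shift the paragraph above describes.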

We covered Whisper getting a plain C/C++ implementation which opened the door to running on a variety of platforms and devices. [Macoron] turned whisper.cpp into a Unity binding which served as inspiration for this project, in which [saurabhchalke] turned it into a Quest 3 package. So if you are doing any VR projects in Unity and want reliable speech input with a side order of easy translation, it’s never been simpler.