A while back I had this idea to display multiple ESP32-Cam feeds in VR, positioned relative to where the cameras sit on a robot, even moving a feed in VR if its camera is on a pan/tilt base or arm. Other data could be put in VR too, like range data from ToF sensors (e.g., the SparkFun Qwiic Mini ToF Imager, a VL53L5CX multizone ranging sensor). Maybe the picture and range data could even be combined.
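To make the "feed follows the pan/tilt" part concrete, here's a minimal sketch of the math, not tied to any particular VR runtime: given the camera's mount position on the robot and its current pan/tilt angles, compute where the quad that the feed gets textured onto should sit in VR space. The axis convention (Y up, camera looking down -Z, as in OpenXR-style coordinates) and all the names and numbers are my assumptions, not from any real project.

```python
# Hypothetical sketch: pose a camera-feed quad in VR from pan/tilt angles.
# Assumed convention: Y up, camera boresight along -Z. All values made up.
import numpy as np

def pan_tilt_rotation(pan_deg: float, tilt_deg: float) -> np.ndarray:
    """Rotation for a pan (yaw about Y) followed by a tilt (pitch about X)."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    yaw = np.array([[ np.cos(p), 0.0, np.sin(p)],
                    [ 0.0,       1.0, 0.0      ],
                    [-np.sin(p), 0.0, np.cos(p)]])
    pitch = np.array([[1.0, 0.0,        0.0       ],
                      [0.0, np.cos(t), -np.sin(t)],
                      [0.0, np.sin(t),  np.cos(t)]])
    return yaw @ pitch

def feed_quad_pose(mount_pos: np.ndarray, pan_deg: float, tilt_deg: float,
                   quad_distance: float = 1.0):
    """Place the feed quad quad_distance metres out along the camera's
    boresight; the returned rotation also orients the quad."""
    rot = pan_tilt_rotation(pan_deg, tilt_deg)
    boresight = rot @ np.array([0.0, 0.0, -1.0])  # camera looks down -Z
    return mount_pos + quad_distance * boresight, rot

# Example: camera mounted 0.3 m up on the robot, panned 30 deg left,
# tilted 10 deg down; the quad ends up 1 m out along that direction.
center, orientation = feed_quad_pose(np.array([0.0, 0.3, 0.0]), 30.0, -10.0)
print(center)
```

Feeding live pan/tilt angles from the robot into something like this each frame would keep the quad tracking the physical camera.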
Recently I ran across a project with a similar idea, but using a panoramic 360 camera in VR space. In an update, they say “I like the VR view, but the resolution is just plain crap…”, referring to the 360 camera feed, I think. They go on to say they will try multiple cameras projected onto multiple cylinder segments in the future. I’ll have to keep an eye out for updates.
At a KitsapCREATE meeting last Friday, I was talking to a member about their robot and they mentioned another VR-related idea: stereo cameras on a pan/tilt controlled by a VR headset, with a foam dart launcher on a separate pan/tilt controlled by a joystick. They talked about switching one of the headset’s eye views to a camera on the launcher. I showed them the other project on my phone and mentioned my idea too. Maybe a line drawn in VR showing the launcher’s orientation would be cool as well.
I was also wondering whether stereo camera feeds could each be fed to the corresponding eye of a VR headset to help with depth perception, while still showing the feeds in their relative positions in the VR space. I don’t think that would work out too well, but maybe, if something can process the feeds fast enough, 3D data could be fed into the VR space instead.
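As a rough sketch of the “process the feeds into 3D data” half of that, OpenCV’s classic block-matching stereo algorithm can turn a left/right image pair into a disparity map, and disparity into depth. The file names, focal length, and baseline below are placeholders, and the cameras would need to be calibrated and the frames rectified first for the disparities to mean anything; whether ESP32-Cams could deliver frames fast enough is the open question.

```python
# Hypothetical sketch: depth from a stereo pair using OpenCV block matching.
# "left.png"/"right.png" stand in for rectified frames from the two cameras.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
assert left is not None and right is not None, "missing test frames"

# numDisparities must be a multiple of 16; blockSize must be odd.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# compute() returns 16.4 fixed-point disparities; divide by 16 for pixels.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: depth = f * B / d, with focal length f in pixels
# and baseline B in metres (both values below are made up).
f_px, baseline_m = 700.0, 0.06
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f_px * baseline_m / disparity[valid]
print("median depth of valid pixels (m):", np.median(depth[valid]))
```

A depth map like that could then be turned into a point cloud or mesh and placed in the VR scene, rather than trying to route raw stereo frames straight to the eyes.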