I bought a robot arm kit that came with a shield carrying two PS2 joysticks. Both Arduinos I tried with it always read 1023 on every analog input. With the shield removed, other potentiometers and another PS2 joystick read properly on the same analog inputs. After some troubleshooting I cut the Aref pin on the shield, and the analog pins started reading correctly.
Notes about the board that I had to dig around to find:
Left joystick X axis connected to A0
Left joystick Y axis connected to A1
Left joystick Z axis (button) connected to D2
Right joystick X axis connected to A2
Right joystick Y axis connected to A3
Right joystick Z axis (button) connected to D4
Servo 1 signal pin connected to D11
Servo 2 signal pin connected to D10
Servo 3 signal pin connected to D9
Servo 4 signal pin connected to D5
Servo 5 signal pin connected to D7
Servo 6 signal pin connected to D6
Shield LED connected to D3
Bluetooth TX connected to D13
Bluetooth RX connected to D12
The shield is labelled "mearm.wdt v2.18.8.22.s". It is also sold as "Dual PS2 Game Joystick Button Module JoyStick Shield Compatible With UNO R3".
The store listing links to a schematic PNG. That schematic doesn’t show the Aref pin tied to ground, but on the board I have it is.
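To sanity-check the pin mapping above, here is a minimal test sketch. It assumes an Uno-class board, active-low joystick buttons, and the stock Servo library; the pin roles come from the list above, so treat it as a sketch to adapt rather than the kit's firmware. It prints both joysticks to the serial monitor, lights the shield LED on a button press, and mirrors the left stick onto servos 1 and 2.

```cpp
// Minimal pin-mapping test for the mearm.wdt joystick shield (assumed Uno-class board).
#include <Servo.h>

const int LEFT_X    = A0;  // left joystick X axis
const int LEFT_Y    = A1;  // left joystick Y axis
const int LEFT_BTN  = 2;   // left joystick button (Z), assumed active low
const int RIGHT_X   = A2;  // right joystick X axis
const int RIGHT_Y   = A3;  // right joystick Y axis
const int RIGHT_BTN = 4;   // right joystick button (Z), assumed active low
const int LED_PIN   = 3;   // shield LED

Servo servo1;  // signal on D11
Servo servo2;  // signal on D10

void setup() {
  Serial.begin(115200);
  pinMode(LEFT_BTN, INPUT_PULLUP);
  pinMode(RIGHT_BTN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
  servo1.attach(11);
  servo2.attach(10);
}

void loop() {
  int lx = analogRead(LEFT_X);
  int ly = analogRead(LEFT_Y);
  int rx = analogRead(RIGHT_X);
  int ry = analogRead(RIGHT_Y);

  // Map the 0-1023 joystick readings to 0-180 degree servo angles.
  servo1.write(map(lx, 0, 1023, 0, 180));
  servo2.write(map(ly, 0, 1023, 0, 180));

  // Light the LED while either joystick button is pressed.
  bool leftPressed  = (digitalRead(LEFT_BTN) == LOW);
  bool rightPressed = (digitalRead(RIGHT_BTN) == LOW);
  digitalWrite(LED_PIN, (leftPressed || rightPressed) ? HIGH : LOW);

  Serial.print("L("); Serial.print(lx); Serial.print(","); Serial.print(ly); Serial.print(") ");
  Serial.print("R("); Serial.print(rx); Serial.print(","); Serial.print(ry); Serial.print(") ");
  Serial.print(leftPressed ? "LB " : "   ");
  Serial.println(rightPressed ? "RB" : "");

  delay(50);
}
```

With the Aref pin on the shield cut, the default analog reference behaves normally, so the axes should sweep roughly 0-1023 instead of sitting at 1023.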
A while back I had the idea of displaying multiple ESP32-Cam feeds in VR, positioned relative to where the cameras sit on a robot, even moving a feed in VR space if its camera is on a pan/tilt base or an arm. Other data could go into VR too, like range data from ToF sensors (e.g. the SparkFun Qwiic Mini ToF Imager – VL53L5CX – a multizone ranging sensor). Maybe the picture and range data could be combined as well.
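For a sense of what that range data looks like, here is a rough sketch based on SparkFun's VL53L5CX Arduino library examples (the class, struct, and method names are assumed from those examples, so check them against the current library). It polls the sensor over I2C and prints the 8x8 grid of distances in millimetres, which is the kind of data that could be pushed into a VR scene.

```cpp
// Rough sketch: read the 8x8 distance grid from a SparkFun VL53L5CX over Qwiic/I2C.
// Names are taken from SparkFun's library examples; verify against the installed library.
#include <Wire.h>
#include <SparkFun_VL53L5CX_Library.h>

SparkFun_VL53L5CX imager;
VL53L5CX_ResultsData results;   // holds one frame of ranging results

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.setClock(400000);        // sensor supports 400 kHz I2C

  if (!imager.begin()) {
    Serial.println("VL53L5CX not found - check wiring.");
    while (true) {}
  }
  imager.setResolution(8 * 8);  // enable all 64 zones
  imager.startRanging();
}

void loop() {
  if (imager.isDataReady() && imager.getRangingData(&results)) {
    // Print the 8x8 grid of distances (mm), one row per line.
    for (int y = 0; y < 8; y++) {
      for (int x = 0; x < 8; x++) {
        Serial.print(results.distance_mm[y * 8 + x]);
        Serial.print('\t');
      }
      Serial.println();
    }
    Serial.println();
  }
  delay(5);
}
```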
Recently I ran across a project with a similar idea, but using a panoramic 360 camera in VR space. In an update they say, “I like the VR view, but the resolution is just plain crap…”, referring, I think, to the 360 camera feed. They go on to say they will try multiple cameras projected onto multiple cylinder segments in the future. I’ll have to keep an eye out for updates.
At a KitsapCREATE meeting last Friday, I was talking to a member about their robot and they mentioned another VR-related idea: stereo cameras on a pan/tilt controlled by a VR headset, with a foam dart launcher on a separate pan/tilt controlled by a joystick. They talked about switching one of the headset’s eye views to a camera on the launcher. I showed them the other project on my phone and mentioned my idea too. I thought maybe a line in VR showing the launcher’s orientation would be cool as well.
I was also wondering whether stereo camera feeds could each be sent to the matching eye in a VR headset to help with depth perception, while still placing the feeds at their relative positions in the VR space. I don’t think that would work out too well, but maybe, if something could process the feeds fast enough, 3D data could be fed into the VR space instead.
During the final quarter of the composites class, a fellow student was working on fiberglass cornhole boards with an acrylic pour and then bar-top epoxy over that.
A Printrbot CrawlBot, but with a plastic sheet that runs over the carriage from one end of the sheet being cut to the other. A vacuum on one side of the carriage would clean up debris and pull the carriage down onto the sheet being cut; maybe the vacuum’s exhaust could blow from the other side of the carriage to help with cleaning.