Wheelchair reCamera virtual joystick

I am designing a prototype for assisted driving on my wheelchair with reCamera.

My goal is to use reCamera as a virtual joystick. The reCamera is facing me and looks at my face and head.

It must recognise the degree of head tilt.

For example, when I turn my head to the right or left, it outputs a variable proportional to the degree of head movement.

Similarly, raising and lowering my head should produce another proportional variable.

Is this the right product?


Hi,

That’s a solid plan! Your idea is definitely feasible: by using the YOLO Pose model on reCamera to extract the 5 facial keypoints (eyes, ears, and nose), you can then apply a Perspective-n-Point (PnP) calculation to estimate head orientation. From there, simply mapping the Yaw and Pitch angles into proportional variables will give you the precise virtual joystick control you need. It’s a very efficient workflow for edge-based assistive driving!

However, keep in mind that this PnP transformation isn’t an “out-of-the-box” feature of the YOLO model itself. You’ll need to manually implement the bridge between the 2D keypoints and the 3D world.
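For reference, here is a minimal sketch of that 2D-to-3D bridge using OpenCV's `solvePnP` on a generic 3D face model, with yaw and pitch mapped to proportional joystick values. The 3D model points, the placeholder intrinsics, the keypoint order, and the angle limits are illustrative assumptions, not reCamera-specific values:

```python
# Minimal sketch: head pose from 5 facial keypoints via PnP (OpenCV),
# then yaw/pitch mapped to proportional joystick values in [-1, 1].
import numpy as np
import cv2

# Approximate 3D reference points (mm) for a generic face; order must match
# the 2D keypoints: nose, left eye, right eye, left ear, right ear.
MODEL_3D = np.array([
    [  0.0,  0.0,    0.0],   # nose tip
    [-30.0, 35.0,  -30.0],   # left eye
    [ 30.0, 35.0,  -30.0],   # right eye
    [-70.0, 30.0, -100.0],   # left ear
    [ 70.0, 30.0, -100.0],   # right ear
], dtype=np.float64)

# Placeholder intrinsics; replace with values from a real calibration.
W, H = 1920, 1080
FX = FY = float(W)  # rough focal-length guess until calibration is done
CAMERA_MATRIX = np.array([[FX, 0, W / 2],
                          [0, FY, H / 2],
                          [0,  0,     1]], dtype=np.float64)

def head_pose(keypoints_2d, camera_matrix=CAMERA_MATRIX, dist_coeffs=None):
    """Return (pitch, yaw, roll) in degrees from 5 (x, y) pixel keypoints."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros((4, 1))
    ok, rvec, tvec = cv2.solvePnP(
        MODEL_3D, np.asarray(keypoints_2d, dtype=np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Euler angles (degrees) from the 3x4 projection matrix; the sign
    # convention may need adjusting for your mounting orientation.
    euler = cv2.decomposeProjectionMatrix(np.hstack((R, tvec)))[6]
    pitch, yaw, roll = euler.flatten()
    return pitch, yaw, roll

def to_joystick(angle_deg, max_deg=30.0, deadzone_deg=5.0):
    """Map an angle to a proportional value in [-1, 1] with a small deadzone."""
    if abs(angle_deg) < deadzone_deg:
        return 0.0
    value = (abs(angle_deg) - deadzone_deg) / (max_deg - deadzone_deg)
    return float(np.clip(value, 0.0, 1.0)) * float(np.sign(angle_deg))

# Example keypoints (pixels) in the order nose, eyes, ears.
kpts = [(960, 540), (900, 500), (1020, 500), (860, 520), (1060, 520)]
pose = head_pose(kpts)
if pose is not None:
    pitch, yaw, _ = pose
    steer = to_joystick(yaw)       # left/right head turn
    throttle = to_joystick(pitch)  # head raised/lowered
```

The deadzone and maximum angle are just tuning knobs so small involuntary head motion does not move the chair.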

John


Thank you, John.

Does the reCamera Node-RED environment support the chessboard method for camera calibration? I need the intrinsics for the PnP calculation.
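For context, this is roughly the standard OpenCV chessboard workflow I have in mind (normally run offline on saved frames; the board size, square size, and paths below are only example values):

```python
# Minimal sketch of standard OpenCV chessboard calibration on saved frames.
# Board dimensions, square size, and the frame directory are example values.
import glob
import numpy as np
import cv2

BOARD = (9, 6)      # inner corners per row and column of the printed board
SQUARE_MM = 25.0    # side length of one printed square

# 3D grid of corner positions on the board plane (z = 0).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("calib_frames/*.jpg"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)

# camera_matrix and dist_coeffs are exactly what the PnP step needs.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)
print("Distortion coefficients:", dist_coeffs.ravel())
```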