Universal HID Updates

Introduction 

I've been working on this project, and I've run into some issues that I haven't been able to overcome. In this post, I'll go over what I've done since my last update, when this project was titled ALS HID. Please read that post if you wish to understand the essence of the project, because this post will only cover the newer problems I've encountered.

If I've sent you this link, chances are it's because you are:

  • A professor, industry specialist, TA, or other mentor whom I've consulted about this problem.

  • A potential employer or internship provider to whom I'd like to show this project and how far I've gotten

  • A fellow student

Updates - Summer 2016

Over the summer, I did the IDEA Summer Internship with my former design & technology teacher, which is documented here. I decided to apply my Universal HID to this specific use case for two reasons.

1. Although this mechanical solution was comfortable, solid, and functional, there was one facet that could have been better. The natural movement for a trigger input involves pressing something; however, we substituted that movement input with the extension of the entire arm. Griffin seemed to be fine playing Space Pirate Trainer, but games involving holding a shield, for example, are not compatible with the device we built.

2. The point of my Universal HID project is to be universal; it should be able to capture any movement-based input (within the domain of human motion) and output it to another device. The VR accessibility issue we solved for the internship is an excellent example of such a scenario, and solving it using the Universal HID could prove the concept.

So, I set out to accomplish this. One thing I realized is that the games I found didn't have an option to use other input devices; they only had settings compatible with the default HTC Vive controllers. In order to get input from my device into a game, I had to create my own game that would let me accept input from other devices. Towards the end of the internship, I created a virtual reality test game by following a tutorial from VR Dev School. It can be seen below.

Pressing the trigger picks up an object, and I have a cube and a sphere to play around with (with physics, collision detection, etc.). All I have to do is change the event handler from TriggerPress to MouseClick (or whatever the exact syntax is, I can't remember, but you get the idea), then configure the PSoC to act like a HID mouse with click functions. Or I could use it to mimic a keyboard. Either way, this is how I plan to implement the project. I've already gotten the wrist straps that Griffin prefers, so all that's left to do is get my HID to act like a trigger.
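As a rough illustration of that bridge, here's a minimal host-side sketch in Python. To be clear, it's only a stand-in of my own: it uses the pyautogui library to fire a left click from the host machine, whereas in the actual design the PSoC itself would enumerate as a USB HID mouse and send the click. The wrist_is_bent() function is a hypothetical placeholder for whatever ends up reading the sensors.

```python
# Host-side stand-in for the planned PSoC HID-mouse path.
# Assumption: pyautogui is installed; the real device would send the click itself.
import time
import pyautogui

def wrist_is_bent():
    """Hypothetical placeholder: returns True once the wrist passes the bend threshold."""
    return False  # replace with the real sensor logic

was_bent = False
while True:
    bent = wrist_is_bent()
    if bent and not was_bent:   # rising edge: the wrist just crossed the threshold
        pyautogui.click()       # fire the "trigger" as a left mouse click
    was_bent = bent
    time.sleep(0.01)            # ~100 Hz polling
```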

A question that could be brought up is: why don't I just use a button or switch? My answer is that a strong reason for applying the HID is as a proof of concept, even if I'm simplifying the capability of two inertial measurement units (IMUs) down to the functionality of a button. It's possibly the easiest application of the HID to a real-world problem, so it's a good place to start. Once I get this working, I plan to expand its functionality beyond a simple switch, perhaps adding more thresholds for different degrees of trigger pressing.
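For that eventual multi-level version, the idea would be something along these lines: bucket the measured bend angle into discrete trigger levels. This is just a sketch of my own; the threshold values are made-up placeholders, not tuned numbers.

```python
# Map a measured bend angle (degrees) to a discrete trigger level.
import bisect

TRIGGER_THRESHOLDS_DEG = [15, 30, 45]   # hypothetical: light, half, and full press

def trigger_level(bend_angle_deg):
    """0 = not pressed, 1..3 = increasing degrees of trigger press."""
    return bisect.bisect_right(TRIGGER_THRESHOLDS_DEG, bend_angle_deg)

print(trigger_level(10))   # 0 (below every threshold)
print(trigger_level(35))   # 2 (past the first two thresholds)
```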

Here's the problem that I've been running into. 

Problem Description

FAQs

What have I done already?

I have two inertial measurement units (IMUs), each of which gives me its absolute orientation in various units, along with other information. An overview of the Adafruit BNO055 Absolute Orientation Sensor, with links to further documentation, can be found here. I am able to read absolute orientation in both Euler angles and quaternions from each of the sensors. I know that I have to use quaternions instead of Euler angles for two main reasons:

  1. In Euler angles, 0 degrees and 360 degrees represent the same orientation, and this discontinuity can make them difficult to work with (see the quick demonstration after this list). Quaternions, on the other hand, are continuous.

  2. Euler angles are susceptible to gimbal lock.
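To make reason #1 concrete, here is a quick numerical sketch of my own (not sensor code): a naive difference of two Euler yaw readings blows up across the 0°/360° seam, while the angle computed from the corresponding quaternions stays small and continuous.

```python
import math

def naive_yaw_diff(a_deg, b_deg):
    """Straight subtraction of two yaw readings: breaks across the 0/360 seam."""
    return abs(a_deg - b_deg)

def yaw_to_quaternion(yaw_deg):
    """Unit quaternion (w, x, y, z) for a rotation of yaw_deg about the z-axis."""
    half = math.radians(yaw_deg) / 2.0
    return (math.cos(half), 0.0, 0.0, math.sin(half))

def quaternion_angle_between(q1, q2):
    """Smallest rotation angle (degrees) taking orientation q1 to q2."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))   # |q1 . q2| handles the q / -q ambiguity
    return math.degrees(2.0 * math.acos(min(1.0, dot)))

# Two orientations that are physically 2 degrees apart, straddling the seam:
print(naive_yaw_diff(359.0, 1.0))                                   # 358.0 (misleading)
print(quaternion_angle_between(yaw_to_quaternion(359.0),
                               yaw_to_quaternion(1.0)))             # ~2.0 (correct)
```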

What is gimbal lock?  

I can barely understand it myself. I understand it mechanically, from playing around with a 3D-printed gimbal, but I do not understand why it occurs mathematically, i.e., how some measurement axes come to rotate with, or depend on, other axes. Here are some links to help one understand this issue.
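That said, here is the closest thing I have to a numerical intuition, a sketch of my own using numpy and assuming Z-Y-X Euler angles: once the middle (pitch) angle hits 90°, yaw and roll stop being independent. Two different (yaw, roll) pairs that share the same difference produce the exact same rotation matrix, so one degree of freedom has effectively vanished.

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def euler_zyx(yaw, pitch, roll):
    """Rotation matrix from Z-Y-X Euler angles (radians)."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

pitch = np.radians(90.0)  # the singular configuration

# Two *different* (yaw, roll) pairs that share the same roll - yaw difference:
R1 = euler_zyx(np.radians(10), pitch, np.radians(40))
R2 = euler_zyx(np.radians(25), pitch, np.radians(55))

print(np.allclose(R1, R2))  # True: yaw and roll have collapsed onto one axis
```

As far as I can tell, that collapse of two nominally separate axes into one is exactly what the 3D-printed gimbal shows mechanically.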

 

What do I want to do?

I want to acquire a value that will determine the orientation/bend angle of my wrist (or any other joint for that matter, but let's keep the problem confined). Currently, the problem is purely mathematical. A sketch and a video of the problem will be coming soon. Here's how I'll confine the problem:

  • We're only looking at the wrist here

  • I'm only interested in the bend in one dimension, which will be the angle between the "planes" of the two sensors

  • I only care about bending past a certain threshold, so my desired value will be binary: either I've bent past the threshold or I haven't. (A rough sketch of this computation follows the list.)
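Here's a minimal sketch of how I currently picture that computation (my own attempt, not verified against the hardware): take the absolute orientation quaternion from each BNO055, form the relative rotation from the forearm sensor's frame to the hand sensor's frame, extract its rotation angle, and compare that to a threshold. It assumes both sensors report orientation in a common world frame as normalized (w, x, y, z) quaternions; the threshold value is a placeholder.

```python
import math

def quat_conjugate(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def bend_angle_deg(q_forearm, q_hand):
    """Angle (degrees) of the rotation taking the forearm sensor's frame to the hand sensor's frame."""
    q_rel = quat_multiply(quat_conjugate(q_forearm), q_hand)
    w = min(1.0, abs(q_rel[0]))        # clamp for numerical safety; abs handles q vs -q
    return math.degrees(2.0 * math.acos(w))

BEND_THRESHOLD_DEG = 30.0              # placeholder value, to be tuned

def trigger_pressed(q_forearm, q_hand):
    """Binary output: have I bent past the threshold or not?"""
    return bend_angle_deg(q_forearm, q_hand) > BEND_THRESHOLD_DEG
```

With identical readings from both sensors the angle comes out to 0° and grows toward 180° as the frames rotate apart; whether this full relative angle or only its component about the wrist's flexion axis is the right quantity is part of what I still need to work out.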

By the way, this video is Take #1; Take #2 will be coming out soon.

Update: 12/17/19

I never came out with a Take #2, but it doesn't matter, because I've found technology that is precisely what I've been trying to re-create for affordability and versatility: a wearable electrogoniometer. I had the chance to tinker with one through the lab I currently do research for, the Restorative Tech Lab. This was at the AMP (Amplified Movement Lab), where I was analyzing data from pendulum tests. Anyway, here's a little video below demonstrating precisely the phenomenon I was describing in the video above, years ago: