
Saturday, 11 October 2014

Integration of Subsystems on the Software level

In this post we will be talking about integrating all the subsystems on the software level. As a whole, we are using 3 Arduino boards.
                                Board 1 - Robot Arm and Hand
                                Board 2 - IR Sensors on the Robot Hand
                                Board 3 - Glove (Vibrational Motors)

Shown below is an overview of how the user controls the robot remotely. Setting up the server is discussed in detail here.
Figure 1 - Overview

Bending angle calculation - FINAL

In the previous post we mentioned the limitations of calculating the bending angle. As shown in the figure below, bending the finger from B to C gives the same angle as bending from B to A.

Figure 1 - Bending angle
As discussed in the new Leap SDK post, skeletal tracking enables us to receive data for every finger for as long as the LM can detect our palm. In the previous version we could only identify fingers by their IDs while the fingers were visible. We had to identify the fingers manually earlier because whenever a finger disappears its ID is randomised.
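A minimal sketch of what this enables is shown below, assuming the leapjs v2 SDK, where each finger carries a stable 'type' (0 = thumb through 4 = pinky) and hand/finger directions are unit [x, y, z] arrays. The bend measure here (angle between the hand and finger directions) is only illustrative, not our final calculation.

// Finger identification with the v2 skeletal tracking model.
var Leap = require('leapjs');

var fingerNames = ['thumb', 'index', 'middle', 'ring', 'pinky'];

// Angle between two unit vectors, in degrees.
function angleBetween(a, b) {
  var dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  return Math.acos(Math.min(Math.max(dot, -1), 1)) * 180 / Math.PI;
}

Leap.loop(function (frame) {
  frame.hands.forEach(function (hand) {
    hand.fingers.forEach(function (finger) {
      // With skeletal tracking the finger type stays fixed even when the
      // finger is briefly occluded, so no manual identification is needed.
      var bend = angleBetween(hand.direction, finger.direction);
      console.log(fingerNames[finger.type] + ': ' + bend.toFixed(0) + ' deg');
    });
  });
});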

Monday, 8 September 2014

Frames and Data

Frame Extraction 

 
On the client side, we have a set of code that sends a frame of data over the socket connection to the server. The Leap Motion SDK uses 'Leap.loop' to handle the frames that come in from the Leap Motion, but since we have to send them over a socket connection, we can't use the functions that come with the SDK to process the data on the server side (that would have many advantages, as the Frame object created by the SDK gives us more information than the raw data). Instead, we have to dump the raw JSON data from the websocket that the Leap Motion creates itself, and process it ourselves.
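Roughly, the client-side sender looks like the sketch below. It is a minimal version with illustrative names: the server address is a placeholder, and it assumes the 'ws' and leapjs packages are installed.

var Leap = require('leapjs');
var WebSocket = require('ws');

// Placeholder address for the remote server.
var socket = new WebSocket('ws://your-server-address:8080');

socket.on('open', function () {
  // Leap.loop fires once per tracking frame from the Leap Motion service.
  Leap.loop(function (frame) {
    // frame.dump() gives a readable summary followed by the raw JSON,
    // which the server has to parse for itself.
    socket.send(frame.dump());
  });
});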

But what's actually in the 'frame.dump()' function?


Here's an example of the mass of data that comes from the frame dump. Yes, it's as crazy as it looks, but we can actually extract what we want from this dump of data.

The start of the dump gives a summary of certain data, namely the data from the Fingers, Hands and Gestures. However, there is also a portion of the data that starts with 'Raw JSON', which is what we are more interested in.
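The extraction boils down to something like the following sketch (variable names are illustrative, and it assumes the raw portion is introduced by a 'Raw JSON:' marker):

// 'message' is the string produced by frame.dump() on the client.
function extractRawJson(message) {
  // Reference the data relative to the 'Raw JSON' marker rather than a
  // fixed position, so the code survives changes to the summary above it.
  var marker = 'Raw JSON:';
  var start = message.indexOf(marker);
  if (start === -1) return null;        // no raw JSON found in this frame

  var jsonString = message.slice(start + marker.length).trim();
  return JSON.parse(jsonString);        // 'JSONify' the string into an object
}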

Using the above snippet of code, we can extract the raw JSON data from the Leap Motion, which is what we need to control the arm and hand. JavaScript has powerful string manipulation functions to search for and extract certain bits of data. As this data is in JSON format but has been sent as a string, we 'JSONify' it using 'JSON.parse(frame)'. This gives us a JSON object that we can then reference when we need certain parts of the data.

I think it's important to note that since the start of this project, the Leap Motion SDK and API have been updated twice. It is currently at V3, which incorporates the Virtual Reality side of technology, including the Oculus Rift. Our project has been based on V1, and we are in the process of rewriting for V2. The reason this is important is that each update incorporates more data into each dump of data. Furthermore, the API is updated with newer and better functions.

Thus, instead of having a static reference to where the JSON data sits in the dump, we reference it relative to the words 'Raw JSON', which is more adaptable to changes in the API.

Processing Frame Data

Here is an example of how we have adapted the data to be able to process what we want from it. The frame data is processed to return the pitch, roll and yaw of the hand.
However, the raw JSON data does not contain these values directly, as they depend on functions that are called in the API.

Thus, we simply extracted the functions that we required from the API:
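A minimal sketch of those helpers is shown below, adapted from the SDK's vector maths and applied to the raw JSON hand data, where hand.direction and hand.palmNormal are [x, y, z] arrays (the commented usage assumes at least one tracked hand):

function pitch(direction) {
  return Math.atan2(direction[1], -direction[2]);
}

function yaw(direction) {
  return Math.atan2(direction[0], -direction[2]);
}

function roll(palmNormal) {
  return Math.atan2(palmNormal[0], -palmNormal[1]);
}

// Example use on a parsed raw frame:
// var hand = frame.hands[0];
// var handPitch = pitch(hand.direction);
// var handRoll  = roll(hand.palmNormal);
// var handYaw   = yaw(hand.direction);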

As I am still learning the intricacies of JavaScript and interfacing with the Leap Motion, I was not certain how to send a custom 'Frame' object over the socket. I will keep searching to see whether this can be done in a better way!

Next up - vibration motor problems and IR sensor data!

Friday, 16 May 2014

Identifying all the fingers separately

In order to control each servo we have to identify the fingers separately (thumb, index, pinky, ring, middle). The SDK assigns an ID to each finger, but this ID is always randomised. I did some research on this, and most suggestions were to identify each finger by its length and the angle between the palm and the tip position.

However, I managed to identify each finger by sorting the x coordinates: the left-most finger is the thumb, then the index, and so on, as sketched below.
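A minimal sketch of the idea, assuming a parsed frame whose fingers each have a tipPosition of the form [x, y, z] (the left-to-right ordering assumes one hand held palm-down over the sensor):

function identifyFingers(fingers) {
  var names = ['thumb', 'index', 'middle', 'ring', 'pinky'];

  // Sort left-to-right by the x coordinate of each fingertip.
  var sorted = fingers.slice().sort(function (a, b) {
    return a.tipPosition[0] - b.tipPosition[0];
  });

  // Map the sorted fingers to names: the left-most one is the thumb, and so on.
  var labelled = {};
  sorted.forEach(function (finger, i) {
    labelled[names[i]] = finger;
  });
  return labelled;
}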

Tuesday, 11 March 2014

Interfacing Leap Motion with Arduino


Using the WebSocket server in the Leap Motion SDK we can analyse the tracking data.
It's super easy! Just connect to ws://127.0.0.1:6437

I have used the 'ws' library for Node.js. To install the 'ws' library, the following command is used:
npm install ws --save
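A minimal sketch of reading tracking data this way (it assumes the Leap service is running locally; the first message over the socket is version info, so we only log frames that actually contain hands):

var WebSocket = require('ws');

var leapSocket = new WebSocket('ws://127.0.0.1:6437');

leapSocket.on('message', function (data) {
  var frame = JSON.parse(data);

  // Only frames with at least one tracked hand are interesting here.
  if (frame.hands && frame.hands.length > 0) {
    console.log('Palm position:', frame.hands[0].palmPosition);
  }
});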

We communicate with the Arduino using the 'johnny-five' library. It's installed using this:
npm install johnny-five --save
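A minimal sketch of talking to the board with johnny-five (the pin number is just an assumption for illustration; the actual wiring is in the repo):

var five = require('johnny-five');

var board = new five.Board();

board.on('ready', function () {
  var servo = new five.Servo(9);   // servo signal wire assumed on pin 9

  // Sweep as a quick hardware check, then hold the midpoint.
  servo.sweep();
  setTimeout(function () {
    servo.stop();
    servo.to(90);
  }, 3000);
});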

Yeah! Thanks to node.js :)




You can find the code on github: arduinoleapcar

Have FUN! :)