
Saturday, 11 October 2014

Integration of Subsystems on the Software level

In this post we will be talking about integrating all the subsystems on the software level. As a whole, we are using three Arduino boards:
                                Board 1 - Robot Arm and Hand
                                Board 2 - IR Sensors on the Robot Hand
                                Board 3 - Glove (Vibrational Motors)

Shown below is an overview of how the user controls the robot remotely. Setting up the server is discussed in detail here.
Figure 1 - Overview

Calibrating the Arm

In previous posts we talked about how we applied inverse kinematics and kinematic analysis to the MENTOR arm. As mentioned, the arm consists of potentiometers (POTs) and DC motors; the POTs give us feedback on each joint's angle and direction of movement. In this post we will talk about calibrating the arm.

The readings from the POTs are not consistent, and we don't have a control circuit to obtain accurate readings. We therefore pass each POT reading through a digital low-pass filter so it filters out the high-frequency noise. There is also a threshold on the POT readings (+/- 10) to avoid overshoot.
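As a rough illustration, here is a sketch of that filter in Johnny-Five-style JavaScript; the pin, smoothing factor and log output are assumptions, not our exact setup:

var five = require('johnny-five');
var board = new five.Board();

board.on('ready', function() {
  var pot = new five.Sensor('A0'); // POT wired to analog pin A0 (assumed)
  var filtered = 0;
  var alpha = 0.1;   // smoothing factor: smaller = heavier filtering
  var lastSent = 0;

  pot.on('data', function() {
    // digital low-pass: blend each raw reading into the running value
    filtered = alpha * this.value + (1 - alpha) * filtered;
    // threshold: ignore changes within +/- 10 to avoid overshoot jitter
    if (Math.abs(filtered - lastSent) > 10) {
      lastSent = filtered;
      console.log('angle reading:', Math.round(filtered));
    }
  });
});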

The MENTOR arm has physical stops to avoid the gears getting damaged. In the code we will limit the movement of each joint to respect both the range of the Leap Motion and these physical limits. Also, the angles calculated with the IK formulae become undefined if the distance from the palm to the Leap origin is greater than the total length of the top arm and bottom arm, which is 31.5 cm. So if the function "jointAngles" outputs NaN (Not a Number) we ignore that frame.
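In code that guard is tiny; a sketch of the idea (the clamp limits below are placeholders - the real ones depend on each joint):

var angles = jointAngles(v); // [theta2, angle2, angle3]
if (angles.some(isNaN)) {
  // palm is out of reach (> 31.5 cm) - keep the previous pose
} else {
  // clamp each angle to the arm's physical stops before sending it out
  var base = Math.min(135, Math.max(-135, angles[0])); // placeholder limits
}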


Bending angle calculation - FINAL

In the previous post we mentioned the limitations of calculating the bending angle. As shown in the figure below, bending the finger from B to C gives the same angle as bending from B to A.

Figure 1 - Bending angle
As discussed in the new Leap SDK post, skeletal tracking lets us receive finger data continuously for as long as the Leap Motion can detect the palm. With the previous SDK we could identify fingers by their IDs only while the fingers were visible, and we had to identify fingers manually because whenever a finger disappeared its ID was randomised.
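For example, each pointable in the V2 data carries a 'type' field (0 for the thumb through 4 for the pinky) - the field names here are our reading of the beta's JSON:

frame.pointables.forEach(function(finger) {
  // type is fixed per finger (0 = thumb ... 4 = pinky), unlike finger.id,
  // which used to be re-randomised whenever a finger reappeared
  console.log('finger type', finger.type, 'direction', finger.direction);
});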

Inverse Kinematics on MENTOR Arm - Part 2

As mentioned in the previous post, we are using IK only for two degrees of freedom (in the x-y plane). With that we calculated the joint angles of the top and bottom arm for the given x, y, z coordinates.

The joint angle of the base arm (theta2) was calculated using the conversion from the Cartesian coordinate system (x, y, z) to cylindrical coordinates (ρ, theta2, r).

Angle - Base arm


To calculate the joint angles the function "jointAngles" is used. The x, y, z coordinates of the palm that we get from the Leap Motion data are passed into the function as an array "v", and it returns theta2 (base angle), angle2 (bottom angle) and angle3 (top angle).
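For reference, here is a sketch of what "jointAngles" might look like for a standard two-link arm. The link lengths are assumptions (chosen to sum to the 31.5 cm reach mentioned above), and the sign conventions will depend on how each motor is mounted:

var L1 = 16.0; // bottom arm length in cm (assumed split of the 31.5 cm reach)
var L2 = 15.5; // top arm length in cm

function jointAngles(v) {
  var x = v[0], y = v[1], z = v[2];

  // base angle from the cylindrical conversion: rotation in the x-z plane
  var theta2 = Math.atan2(z, x);

  // planar two-link IK in the vertical plane through the arm
  var r = Math.sqrt(x * x + z * z);
  var D = (r * r + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2);
  var angle3 = Math.acos(D); // top (elbow) angle
  var angle2 = Math.atan2(y, r) -
               Math.atan2(L2 * Math.sin(angle3), L1 + L2 * Math.cos(angle3));

  return [theta2, angle2, angle3];
}

Note that Math.acos returns NaN whenever |D| > 1, which is exactly the out-of-reach case described above.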

Wednesday, 8 October 2014

Leap Motion V2 Tracking Beta

With the new beta version of the Leap Motion API, skeletal tracking enables us to receive additional data for each finger and for the hand as a whole, and the tracking is much improved compared to version 1.

Leap motion - skeletal tracking
https://community.leapmotion.com/t/v2-tracking-beta-new-sdk-and-tracking-release-v2-0-3-17004/1329

Monday, 8 September 2014

Frames and Data

Frame Extraction 

 
On the client side, we have a set of code that sends a frame of data over the socket connection to the server. The Leap Motion SDK normally uses 'Leap.loop' to process the frames that come in from the Leap Motion, but since we have to send each frame over a socket connection, we can't use the SDK's functions to process the data on the server side (that would have many advantages, as the Frame object the SDK creates gives us more information than the raw data). Instead, we have to dump the raw JSON data from the WebSocket that the Leap Motion itself creates, and process it ourselves.
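A sketch of what that loop boils down to (the server URL and event name here are assumptions):

var Leap = require('leapjs');
var socket = require('socket.io-client')('http://localhost:8001');

Leap.loop(function(frame) {
  // send the whole dump as a string; the server re-parses what it needs
  socket.emit('frame', frame.dump());
});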

But what's actually in the 'frame.dump()' function?


Here's an example of the mass of data that comes from the frame dump. Yes, it's as crazy as it looks, but we can actually extract what we want from this dump of data.

The start of the dump gives a summary of certain data, namely from the Fingers, Hands and Gestures. However, there is also a portion of the data that starts with 'Raw JSON', which is what we are more interested in.

Using the snippet of code above, we can extract the raw JSON data from the Leap Motion, which is what we need to control the arm and hand. JavaScript has powerful string manipulation functions for searching and extracting particular bits of data. As this data is in JSON format but has been sent as a string, we 'JSONify' it using 'JSON.parse(frame)'. This gives us a JSON object that we can reference whenever we need certain parts of the data.

I think it's important to note that since the start of this project, the Leap Motion SDK and API have been updated twice. It is currently at V3, which incorporates the Virtual Reality side of technology, including the Oculus Rift. Our project has been based on V1, and we are in the process of rewriting for V2. The reason this is important is that each update incorporates more data into each dump of data. Furthermore, the API is updated with newer and better functions.

Thus, instead of using a static reference to where the JSON data sits in the dump, we search for the words 'Raw JSON' and reference everything relative to them, which is more adaptable to changes in the API.
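Put together, the extraction looks something like this sketch (treat the marker text and brace-hunting as assumptions that may shift between SDK versions):

function extractRawJson(dump) {
  // find the marker relative to wherever it sits in this SDK version
  var start = dump.indexOf('Raw JSON');
  if (start === -1) return null; // marker missing - skip this frame
  // the JSON body begins at the first brace after the marker
  var jsonStart = dump.indexOf('{', start);
  return JSON.parse(dump.slice(jsonStart)); // 'JSONify' the string
}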

Processing Frame Data

Here is an example of how we have adapted the data to process what we want from it. The frame data is processed to return the pitch, roll and yaw of the hand.
However, the raw JSON data does not contain these values directly; in the SDK they are computed by functions called on the Frame object.

Thus, we simply extracted the functions that we required from the API:
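Assuming the hand's 'direction' and 'palmNormal' arrive in the raw JSON as [x, y, z] unit vectors, the extracted helpers are one-liners (these mirror the SDK's definitions as we understand them):

// pitch and yaw come from the hand's direction, roll from the palm normal
function pitch(direction) { return Math.atan2(direction[1], -direction[2]); }
function yaw(direction)   { return Math.atan2(direction[0], -direction[2]); }
function roll(palmNormal) { return Math.atan2(palmNormal[0], -palmNormal[1]); }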

As I am still learning the intricacies of JavaScript and of interfacing with the Leap Motion, I was not certain as to how to send a custom 'Frame' object over the socket. I will continue searching to see whether this can be done in a better way!

Next up - vibration motor problems and IR sensor data!

Monday, 18 August 2014

Airgrip Control...over the air?!

Hello hello! Another long-awaited (or not) post from Vince!

In this update we look at the Remote Control Server that we will be able to use to control the Airgrip system from anywhere (as long as there's an internet connection).

There is an absolute MULTITUDE of tutorials about nodejs online, but very few about Socket.io, and those that exist are sorely outdated. Furthermore, between the beginning of developing this server and now, the Socket.io package for nodejs has been updated to further increase its performance, and the tutorials have not kept up.

However, I would like to credit this tutorial (http://danielnill.com/nodejs-tutorial-with-socketio/) for being an excellent springboard into what we needed to do.

Server-side

NodeJS

Node is an excellent, lightweight runtime that builds on JavaScript to make it terribly easy to set up servers and paths. See the code snippet below (apologies for the raw, unfinished nature of it!):
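Something along these lines (the port number and file handling are assumptions):

var http = require('http');
var fs = require('fs');

var server = http.createServer(function(request, response) {
  switch (request.url) {
    case '/':
      response.writeHead(200, { 'Content-Type': 'text/html' });
      response.end('<html><body>hello world</body></html>');
      break;
    case '/socket.html':
      // serve the page that opens the Socket.IO connection
      fs.readFile(__dirname + '/socket.html', function(err, data) {
        response.writeHead(err ? 500 : 200, { 'Content-Type': 'text/html' });
        response.end(err ? 'server error' : data);
      });
      break;
    default:
      response.writeHead(404);
      response.end('404: page not found');
  }
});
server.listen(8001); // port is an assumption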


The line "var server = http.createServer(function(request, response) {..." in one step creates the server. The switch case determines which page is shown - in this case, the root directory "/" will show a bare html page that just says "hello world". If you access "socket.html", the socket.html page will be shown instead. The page that comes up if it's not a URL that is recognized, is the dreaded 404 page that we all have experienced at least once in our lifetimes (just like a BSOD - or if you're a MAC user, that spinny rainbow thing that says EVERYTHING IS BROKEN)

Anyway, that's the basic idea of NodeJS. The NodeJS Package Manager has an absolute multitude of packages that can extend the capability of the server. Popular packages include Express, AngularJS and, most importantly for our purposes, SocketIO.

Socket.IO

The Socket.IO package's main use is for real-time data transmission. Due to the asynchronous nature of NodeJS, being able to send data whenever it is ready is quite handy. Thus, we have Socket.IO!
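Here's a minimal sketch of the server side; the 'sensors' variable stands in for wherever the Arduino readings are kept:

var io = require('socket.io')(server); // attach to the http server above

io.sockets.on('connection', function(socket) {
  // push Arduino readings to the page under the name 'arduino data'
  socket.emit('arduino data', sensors);

  // react whenever the page sends a message named 'frame'
  socket.on('frame', function(data) {
    console.log('frame received:', data);
  });
});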


This snippet shows how we can use Socket.IO. The package builds on the WebSocket protocol, falling back to other transports when needed.

The most important parts of this snippet are the "socket.emit" and "socket.on" commands. This socket server interfaces with an HTML page that also uses Socket.IO to send and receive data. The "socket.emit" command sends data (called 'sensors') under the name 'arduino data' to the client/HTML page. The "socket.on" command waits for a message to come in with a certain name (in this case 'frame') before executing the code given in its callback function. Here, I was testing whether the data from the Leap Motion could be used to vary what was sent over the server.

This small snippet of code is from the HTML file that is shown in the browser. When the socket receives a message named 'arduino data', it modifies the HTML element with id="frame", changing it to whatever data has been sent.
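It amounts to something like this (assuming the page has loaded socket.io's client script and contains an element with id="frame"):

var socket = io.connect('http://localhost:8001'); // same port as the server

socket.on('arduino data', function(data) {
  // drop whatever the server sent into the element with id="frame"
  document.getElementById('frame').innerHTML = data;
});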

Next post I'll detail some more information about the data that's being sent!

Tuesday, 12 August 2014

Limitations of calculating the bending angle of the finger

In this article we will talk about some of the limitations in the Leap Motion SDK when calculating the bending angle of the finger.

As mentioned in the "controlling a servo using leap" post, we calculated the bending angle of the finger by taking the dot product of the finger direction and the palm normal.
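In code the calculation is just a dot product and an arc-cosine; a sketch, assuming both vectors come in as [x, y, z] unit vectors:

function bendAngle(fingerDirection, palmNormal) {
  var dot = fingerDirection[0] * palmNormal[0]
          + fingerDirection[1] * palmNormal[1]
          + fingerDirection[2] * palmNormal[2];
  // acos folds both bend directions onto the same value - the limitation
  // illustrated in Figure 1 below
  return Math.acos(dot) * 180 / Math.PI; // degrees
}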


Figure 1
[http://hand-illustration.com/82-pinky/240-pinky.html]

Sunday, 10 August 2014

Programming Arduino with JS

As we mentioned earlier, we are using JavaScript as the main programming language throughout the project. The Arduino microcontroller can be controlled through frameworks like Rick Waldron's Johnny-Five for Node.js, together with the "StandardFirmata" software package for the Arduino.

Yeah, we will most likely be utilising the Node.js framework, which includes packages that allow for easy communication between hardware and software. This lets the software handshake with the Arduino firmware via the Firmata protocol. The structure of the whole system is shown below.
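As a quick sanity check of that handshake, the classic Johnny-Five blink sketch does the trick (pin 13 is the Uno's onboard LED):

var five = require('johnny-five');
var board = new five.Board(); // auto-detects the board's serial port

board.on('ready', function() {
  // if this blinks, Firmata and Johnny-Five are talking to each other
  var led = new five.Led(13);
  led.blink(500); // toggle every 500 ms
});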

Friday, 4 April 2014

Short update on the IR sensing equipment

Hey all! It's been a while - mostly due to the fact that my subsystem is waiting on the most important parts - VIBRATIONAL MOTORS!

We've purchased them from a US company, but a couple of complications in the purchasing bureaucracy have delayed their shipping. In the meantime, though, I've been learning JavaScript for use in NodeJS, which is what we'll be using for the Leap Motion interface. The source I have been using is Eloquent JavaScript, quite a useful resource, and I'll be moving on to Node soon enough :)

Furthermore, I've been prototyping the infrared ("IR") sensors that Ray had left over from some design units from the last couple of years. These work quite well in terms of voltage - an Arduino can easily deal with them. However, there were problems we encountered in the process.

Firstly, the IR emitter itself is not strong; the receiver was not able to pick up its response very well. Instead, we substituted an IR LED, which gave a much stronger response that we could read from the Arduino. Next, the ambient light of a room can affect the response, and thus needs to be taken into account in the code. Lastly, the colour of the object put in front of the sensor matters as well - black is not a good colour for IR sensing, as it absorbs so much of the light!

Thus, we used a bit of a hack to work out whether the sensor was close to something or not. See the short code snippet below:

#define IRreceive A1

int IRinitial;

void setup() {
  Serial.begin(9600);
  pinMode(IRreceive, INPUT);
  delay(100);
  // one reading at startup captures the room's ambient light level
  IRinitial = analogRead(IRreceive);
}

void loop() {
  long IR = analogRead(IRreceive);
  Serial.print("NUMBER: ");
  // print the difference from ambient, so both brighter (most colours)
  // and darker (covered/black) objects register as a change
  Serial.println(abs(IRinitial - IR));
}

The initial read of the IR sensor is used to assess the ambient light of the room. This is then compared with the responses we receive from the sensor. For most colours the response is higher than ambient. For black, however, the sensor does not notice an object until it pretty much covers the receiver - in which case it gives an even lower response. Thus the need for the abs() in the code.

Once the motors have come in, I'll be working on making these IR sensors send information to them and hopefully emulate some sort of haptic feedback.

Tuesday, 11 March 2014

Interfacing Leap Motion with Arduino


Using the WebSocket server in the Leap Motion SDK we can analyse the tracking data.
It's super easy! Just connect to ws://127.0.0.1:6437

I have used the 'ws' library for Node.js. To install the 'ws' library, the following command is used:
npm install ws --save

We communicate with the Arduino using the 'johnny-five' library. It's installed using this command:
npm install johnny-five --save
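
Putting the two libraries together looks roughly like this; the servo pin and the palm-to-angle mapping are assumptions for illustration:

var WebSocket = require('ws');
var five = require('johnny-five');

var board = new five.Board();

board.on('ready', function() {
  var servo = new five.Servo(9); // hypothetical servo on pin 9
  var ws = new WebSocket('ws://127.0.0.1:6437'); // Leap's WebSocket server

  ws.on('message', function(data) {
    var frame = JSON.parse(data);
    // frames with no hands still arrive, so guard before indexing
    if (frame.hands && frame.hands.length > 0) {
      // map the palm's x position (roughly -200..200 mm) onto 0..180 degrees
      var x = frame.hands[0].palmPosition[0];
      var angle = Math.round((x + 200) * 180 / 400);
      servo.to(Math.min(180, Math.max(0, angle)));
    }
  });
});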

Yeah! Thanks to node.js :)




You can find the code on github: arduinoleapcar

Have FUN! :)