Friday 31 October 2014

The end of our long journey

With the final presentation DONE on the 31st of October 2014, our year-long journey with the AirGrip Project comes to a close. Here is a Prezi presentation of our project. We spent so much time on it that we thought we should share it with you guys!


Looking back, the project had a fantastic reception. We demonstrated it to high school students and their parents at the Monash Open Day.
There was also the industry poster night, where we won 1st prize for the Audience's Choice award and 2nd prize for the Academics' Choice award.


But there are a few people we would like to acknowledge who have been very supportive throughout the year. First there is Jonathan Li, our supervisor, who made the project possible (see image below).
Shen, Vincent, Ashan and Jonathan
Then there are the staff in ECSE, namely Raycar (a pun on Jaycar), Geoff and Ian.
Vincent, Raycar, Geoff, Shen, Ian and Ashan
There were also Andrew and Martin, who helped us fabricate the components our project required. Without them there would be no hand at all!
Andrew, Ashan, Martin, Shen and Vincent
Finally, the support from our families and friends!


That concludes our project!


Saturday 11 October 2014

Integration of Subsystems on the Software level

In this post we will be talking about integrating all the subsystems at the software level. In total, we are using three Arduino boards:
  • Board 1 - Robot Arm and Hand
  • Board 2 - IR Sensors on the Robot Hand
  • Board 3 - Glove (Vibration Motors)

Shown below is an overview of how the user controls the robot remotely. Setting up the server is discussed in detail here.
Figure 1 - Overview

Calibrating the Arm

In previous posts we talked about how we applied inverse kinematics and kinematic analysis to the MENTOR arm. As mentioned, the arm is driven by DC motors and fitted with potentiometers (POTs), which give us feedback on each joint's angle and direction of movement. In this post we will be talking about calibrating the arm.

The readings from the POTs are not consistent, and we don't have a control circuit to obtain accurate readings. Instead, we pass each POT reading through a digital low-pass filter, which removes the high-frequency noise. There is also a threshold of +/-10 on the POT readings to avoid overshoot.
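As a concrete illustration, here is a minimal Arduino-style sketch of that filtering; the filter constant, pin assignment and baud rate are illustrative, not our actual values:

```cpp
const int POT_PIN = A0;       // one joint's potentiometer (assumed wiring)
const float ALPHA = 0.2;      // filter constant: smaller = heavier smoothing
const int THRESHOLD = 10;     // ignore changes within +/-10 counts

float filtered = 0;           // low-pass filter state
int lastAccepted = 0;         // last reading that passed the threshold

void setup() {
  Serial.begin(9600);
  filtered = analogRead(POT_PIN);   // seed the filter with the first sample
  lastAccepted = (int)filtered;
}

void loop() {
  int raw = analogRead(POT_PIN);
  // First-order digital low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])
  filtered += ALPHA * (raw - filtered);

  // Threshold: only accept a value once it moves more than +/-10 counts,
  // so the arm does not chase noise and overshoot.
  if (abs((int)filtered - lastAccepted) > THRESHOLD) {
    lastAccepted = (int)filtered;
    Serial.println(lastAccepted);
  }
}
```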

The MENTOR arm has physical stops that prevent the gears from being damaged. In the code we limit the movement of each joint with respect to both the Leap Motion's tracking range and these physical limits. Also, the angles calculated from the IK formulae are undefined when the distance from the palm to the Leap origin is greater than the total length of the top and bottom arms, which is 31.5 cm. So if the function "jointAngles" outputs NaN (Not a Number), we ignore that reading.
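A small sketch of that guard (the joint limit values are placeholders, not the MENTOR arm's real limits):

```cpp
#include <cmath>

// Placeholder joint limits in degrees -- the real MENTOR limits differ.
const float LIM_MIN[3] = {-90.0f,   0.0f,   0.0f};  // base, bottom, top
const float LIM_MAX[3] = { 90.0f, 120.0f, 135.0f};

float clampf(float v, float lo, float hi) {
  return v < lo ? lo : (v > hi ? hi : v);
}

// Returns true if the IK result is usable, clamping it into the safe range.
// jointAngles returns NaN when the palm is further from the Leap origin
// than the 31.5 cm combined arm length, so that frame is simply skipped.
bool sanitise(float angles[3]) {
  for (int i = 0; i < 3; ++i)
    if (std::isnan(angles[i])) return false;        // out of reach: ignore
  for (int i = 0; i < 3; ++i)
    angles[i] = clampf(angles[i], LIM_MIN[i], LIM_MAX[i]);
  return true;
}
```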


Bending angle calculation - FINAL

In the previous post we mentioned the limitations of our bending angle calculation. As shown in the figure below, when the finger bends from B to C we get the same angle as when it bends from B to A.

Figure 1 - Bending angle
As discussed in the new Leap SDK post, skeletal tracking lets us keep receiving finger data for as long as the Leap Motion can detect our palm, even when individual fingers are hidden. With the old SDK we could only identify fingers by their IDs while they were visible, and we had to identify fingers manually because the finger IDs are randomised whenever a finger disappears and reappears.
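With skeletal tracking, the bending angle can be computed directly from the bone directions. Below is a minimal C++ sketch using the v2 SDK; comparing the metacarpal and proximal bones is an illustrative choice here, not necessarily the exact pair we used:

```cpp
#include "Leap.h"   // Leap Motion SDK v2 (skeletal tracking)
using namespace Leap;

// Bending angle of one finger: the angle between the metacarpal and
// proximal bone directions. Bone data stays valid as long as the palm is
// tracked, even while the finger itself is hidden from the camera.
float bendingAngleDeg(const Finger& finger) {
  Vector metacarpal = finger.bone(Bone::TYPE_METACARPAL).direction();
  Vector proximal   = finger.bone(Bone::TYPE_PROXIMAL).direction();
  return metacarpal.angleTo(proximal) * RAD_TO_DEG;
}
```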

Inverse Kinematics on MENTOR Arm - Part 2

As mentioned in the previous post, we are using IK for only 2 DOF (in the x-y plane). With that, we calculated the joint angles of the top and bottom arms for given x, y, z coordinates.

Using the conversion from the cartesian coordinate system (x, y, z) to cylindrical coordinates (ρ, theta2, z), the joint angle of the base arm (theta2) was calculated.

Angle - Base arm


To calculate the joint angles, the function "jointAngles" is used. The x, y, z coordinates of the palm that we get from the Leap Motion data are passed into the function as an array "v", and theta2 (base angle), angle2 (bottom-arm angle) and angle3 (top-arm angle) are returned as the output.
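Here is a rough C++ sketch of what such a function can look like, combining the base angle from the cylindrical conversion with standard two-link IK. The individual link lengths and the axis conventions are assumptions; only the 31.5 cm total comes from the posts:

```cpp
#include <cmath>

// Illustrative link lengths -- only their 31.5 cm total is quoted.
const float L1 = 16.0f;   // bottom arm, cm (assumed split)
const float L2 = 15.5f;   // top arm, cm (assumed split)

// v = {x, y, z} palm position from the Leap Motion (y taken as up);
// out = {theta2 (base), angle2 (bottom), angle3 (top)}, in radians.
void jointAngles(const float v[3], float out[3]) {
  // Cartesian -> cylindrical: the base angle comes from the horizontal
  // components, reducing the rest to a two-link problem in a vertical plane.
  out[0] = std::atan2(v[2], v[0]);                  // theta2 (base)

  float r  = std::sqrt(v[0] * v[0] + v[2] * v[2]);  // horizontal reach
  float h  = v[1];                                  // height
  float d2 = r * r + h * h;                         // squared palm distance

  // Law of cosines. If the palm is further away than L1 + L2 = 31.5 cm,
  // the acos argument leaves [-1, 1] and the result is NaN -- the exact
  // case the calibration code ignores.
  float elbow = std::acos((d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2));
  out[2] = elbow;                                   // angle3 (top arm)
  out[1] = std::atan2(h, r)                         // angle2 (bottom arm)
         - std::atan2(L2 * std::sin(elbow), L1 + L2 * std::cos(elbow));
}
```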

integrate with respect to x

Finally, we get to a stage where everything comes together! Isn't it exciting? Are you as excited as we are?

Well, not everything is exciting... this post will detail how the server and the rest of the code are integrated. In a different post, Ashan detailed the integration of the arm, hand and sensors. Here, we'll just give a brief description of what needs to change for everything to come together.

See the following two code snippets:

As you can see, the first code snippet relies on the Leap Motion controller SDK: you can read from the controller and use the hand.pitch() and hand.roll() methods from the SDK. However, as you can't do this over the server, you have to rely on the frame data that is passed to you.
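The snippet itself was an image, so here is only a minimal C++ sketch of that first approach. Note that in the C++ SDK the equivalents of hand.pitch() and hand.roll() are hand.direction().pitch() and hand.palmNormal().roll():

```cpp
#include "Leap.h"   // Leap Motion SDK
using namespace Leap;

// First approach: poll the controller directly through the SDK.
int main() {
  Controller controller;
  while (true) {
    Frame frame = controller.frame();      // latest frame from the device
    if (!frame.hands().isEmpty()) {
      Hand hand = frame.hands()[0];
      float pitch = hand.direction().pitch();  // hand.pitch() equivalent
      float roll  = hand.palmNormal().roll();  // hand.roll() equivalent
      // ...feed pitch and roll into the arm code here...
    }
  }
}
```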

Looking at the second code snippet, you can see that the bulk of the work actually sits inside the WebSocket handler. Every time a frame is passed in, the arm code is triggered and the rest of the code can run.
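Again only a sketch, since the real snippet was an image: with the server in between there is no Controller to poll, so everything hangs off the message handler. The WebSocket and JSON plumbing is omitted, and the FrameData fields below are assumptions, not our actual message format:

```cpp
#include <cstdio>

// Fields assumed to be parsed out of each frame's JSON by the client library.
struct FrameData {
  float palm[3];      // palm x, y, z
  float pitch, roll;
};

// Stand-in for the existing arm code.
void runArmCode(const FrameData& f) {
  std::printf("palm=(%.1f, %.1f, %.1f) pitch=%.2f roll=%.2f\n",
              f.palm[0], f.palm[1], f.palm[2], f.pitch, f.roll);
}

// Called once per WebSocket message: every incoming frame triggers the
// arm code, instead of anything polling the SDK.
void onFrameMessage(const FrameData& frame) {
  runArmCode(frame);
}

int main() {
  FrameData fake = {{10.0f, 150.0f, -20.0f}, 0.1f, -0.3f};  // simulated frame
  onFrameMessage(fake);
}
```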

More changelog:
  • Synchronize all the loops and calls to code to the same rate, namely one update every 250 ms (see the sketch after this list)
  • Make sure that the computer we attached to had the correct ports, and had set up the multi-board correctly
  • Test whether the data works correctly
  • ???
  • Profit when it does!
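A minimal Arduino-style sketch of that 250 ms synchronization (names and the serial setup are illustrative):

```cpp
// Run the shared update at the common 250 ms period.
const unsigned long PERIOD_MS = 250;
unsigned long lastRun = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();
  if (now - lastRun >= PERIOD_MS) {  // unsigned arithmetic survives overflow
    lastRun = now;
    // read serial input, update motors, send sensor data -- all in step
  }
}
```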
Next up - mysteries wrapped in riddles, hidden in enigmas

A note on vibration

From this previous post, you can see the IR data that is sent from the Arm Server to the User Server (previously named the Client). The data being received looked a little like this:


This data is essentially what the vibration motors use to determine their strength. I have used 3 levels of haptic feedback, but it is possible to use more. The vibration motors we purchased draw approximately 60 mA at 3 V (recommended operating range 2.5-3.5 V), which suits Arduino ports that output 3.3 V.

Experimentally, the sense of touch in your fingers is not particularly fine-grained; a change from 2.9 V to 3 V is negligible. As such, you don't need to map the sensor's proximity readings perfectly onto the vibration motors (an initial idea we investigated). To avoid sending masses of data along the WebSocket, we determined that three distinct vibration levels were enough. This may change with proximity sensors that sense at differing distances, but the levels we chose give a clearly distinguishable difference in tactility.

There is some research on tactility, but it is mainly concerned with a) the medical/orthopaedic benefits of vibration gloves, b) differences in sensing tactility at different frequencies, or c) the limits of hand tactility. For our purposes the frequency studies are of most interest, but in essence we determined the levels experimentally.

For completeness, the spec sheet for the motors states 12,000 RPM at 3 V, which translates to 200 Hz, as 1 Hz = 60 RPM. The IEEE article linked above shows maximum tactility at around 250 Hz, with the most sensitive region being 80-120 Hz.

The PWM level that you can write to an Arduino analog output pin ranges from 0 to 255. The levels I have used are 64, 128 and 255, i.e. quarter, half and full value. Assuming RPM changes linearly with the PWM level, this translates to approximately 50 Hz, 100 Hz and 200 Hz.

Experimentally, any slight buzzing can just barely be sensed. The first level does not necessarily need to be 64, but it is an easy fraction of the full value. The step up to the second level is a noticeable difference, and the highest level buzzes like no tomorrow!

We felt that at 15 cm or further, the buzz should be only just perceptible. Obviously, if the object is right next to the sensor, the motor should buzz quite strongly. There also has to be an intermediate level between these two, otherwise the step up would be too abrupt.
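Putting the levels together, a minimal Arduino-style sketch of the mapping might look like this; the 15 cm outer threshold is from above, while the 7 cm inner threshold and the pin number are assumptions:

```cpp
const int MOTOR_PIN = 9;   // PWM pin driving one vibration motor (assumed)

void setLevel(float distanceCm) {
  int level;
  if (distanceCm > 15.0)     level = 64;    // just perceptible buzz
  else if (distanceCm > 7.0) level = 128;   // noticeably stronger
  else                       level = 255;   // buzzing like no tomorrow
  analogWrite(MOTOR_PIN, level);            // 0-255 PWM duty cycle
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // distanceCm would come from the IR readings sent over the WebSocket;
  // beyond the IR sensor's range the motor would simply be switched off.
}
```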

In the future, there may be more research done on the number and strength of the levels of vibration. However, at this time, we do not think it is necessary.

Next up - integrations (not the calculus kind!)