
Saturday, 11 October 2014

integrate with respect to x

Finally, we get to a stage where everything comes together! Isn't it exciting? Are you as excited as we are?

Well, not everything is exciting...this post will detail how the server is integrated with the rest of the code. In a different post, Ashan detailed the integration of the arm, hand and sensors together. Here, we'll just give a brief description of what needs to be changed for everything to be integrated.

See the following two code snippets:
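Roughly, they look like this (a minimal sketch — moveArm() stands in for our actual arm code, and the pitch/roll fields on the frame data are illustrative):

    // Snippet 1: reading straight from the Leap Motion with the leapjs SDK
    var Leap = require('leapjs');

    var controller = new Leap.Controller();
    controller.on('frame', function(frame) {
      if (frame.hands.length > 0) {
        var hand = frame.hands[0];
        // pitch() and roll() come straight from the SDK
        moveArm(hand.pitch(), hand.roll());
      }
    });
    controller.connect();

    // Snippet 2: the same logic on the server, driven by frame data
    // arriving over the socket instead of from the SDK
    // (io is the Socket.IO server, set up as in the Remote Control Server post)
    io.sockets.on('connection', function(socket) {
      socket.on('frame', function(data) {
        // no SDK objects here - just the numbers the client sent us
        moveArm(data.pitch, data.roll);
      });
    });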

As you can see, the first code snippet relies on the Leap Motion controller SDK; you can read from the controller and use the hand.pitch() and hand.roll() methods from the SDK. However, as you can't do this over the server, you have to rely on the frame data that has been passed to you.

Looking at the second code snippet, you can see that the bulk of the data is actually slotted within the WebSocket method. Every time a frame is passed in, the Arm Code is triggered and the rest of the code can run.

More changelog:
  • Synchronize all the loops and calls to code to the same frequency, namely, 250 ms (see the sketch after this list)
  • Make sure that the computer we attached to had the correct ports, and had set up the multi-board correctly
  • Test whether the data works correctly
  • ???
  • Profit when it does!
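For the curious, syncing everything to 250 ms was nothing fancier than driving it all off the same timer (a minimal sketch — pollSensors() and sendFrame() are illustrative names for our sensor and socket calls):

    // run the sensor polling and the socket sends on the same 250ms clock
    setInterval(function() {
      pollSensors(); // read the IR sensors
      sendFrame();   // push the latest data over the socket
    }, 250);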
Next up - mysteries wrapped in riddles, hidden in enigmas

Thursday, 9 October 2014

So you think you can install the client?

Following the integration of all the subsystems, one very stark problem came to light.

Firstly, let me recap the server-client relationship:

Server: connected to the Robotic Arm, LBS hand and IR sensors

Client: connected to the Leap Motion and the vibrational motors

Previously, for usability purposes, the client was accessed through a web browser, and only required a web browser and the Leap Motion to be connected. However, it quickly became clear that there was no way to control the Arduino through the browser. All browsers are sandboxed by default, and for good reason! If a web page could affect the local computer through the browser, JavaScript would be a massive vulnerability.

(Pedantic note: there are ways to unlock the sandboxing in Chrome, but that is unsafe and if you forget to lock it up after yourself, it leaves you, your machine and your network vulnerable to attack)

Thus, we decided that it would be much easier, though less user-friendly, to replace the server-client relationship with two servers talking to each other. The browser-based client will probably not feature in our demo, but it is an option for others to improve on in the future if they wish to.

In this post, I'll be outlining the general way to get the server and client working. There are multitudes of tutorials on this set-up, but for documentation's sake it's good to have this recorded.

Step 1: Install NodeJS - the easiest step. The setup is quite simple, and will help install the Node Package Manager (npm) for you.

Step 2: Create your new project directory - for our sakes, let's call it C:\SecretPlans\WorldDomination.

Step 3: Open Command Prompt and navigate to that directory (use "cd [insert path here]"). In our example, it will be cd C:\SecretPlans\WorldDomination.

Step 4: Install package dependencies through npm. In our case here, we haven't done it the best way - you will have to use "npm install johnny-five socket.io socket.io-client leapjs" which installs all the latest versions of those packages.

The best way is instead to wrap it all up in a JSON file that lists the dependencies - usually called package.json. In that case, you can merely write "npm install" and it will read the package.json and download all its dependencies. The package.json file is used mainly when one is writing a script or package that may be published to the npm directories (i.e. downloadable by others). There's a lot of information you can put in the package.json so that it can be read by others. Read more about it here!
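For example, a minimal package.json for our setup might look something like this (the name and the loose "*" version ranges are illustrative - you'd normally pin versions):

    {
      "name": "airgrip",
      "version": "0.0.1",
      "dependencies": {
        "johnny-five": "*",
        "socket.io": "*",
        "socket.io-client": "*",
        "leapjs": "*"
      }
    }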

Step 5: Write the script in the editor of your choice (our choices have been Notepad++ (for Windows) and Sublime Text 2 (for Mac)). In our case for Airgrip, we have files called FINALclient.js and FINALserver.js. Each of these needs certain lines modified in the code to say which port the Arduino boards are attached to. In our fictional case, let's call the files NukeWorld.js and CommandCenter.js.

Step 6: Back to Command prompt - set the program running by using 'node filename.js'. So in our fictional case, we would run 'node CommandCenter.js' (assuming that's the server). On the other computer, we would run 'node NukeWorld.js' (assuming that's the client).

Step 7: ??? World Domination...?

Next up - more magical blogposts :)

Wednesday, 8 October 2014

The beauty of Infra-Red over a WebSocket

This post will outline the new IR sensors we have built, the problems we faced along the way, and their integration with the software.

Hardware

The initial IR sensors we built were cheap and dirty (figuratively, of course). Using an all-purpose phototransistor turned out not to be a good idea. It would pick up far too much ambient light, so wherever we went, the phototransistors would be affected by a non-trivial amount. This made it quite hard to decide what levels of contrast we would need in our software.

Furthermore, the IR sensors were affected by the type of material (more importantly, the colour) of the object and how far away it was. For example, this system does not work well with black objects. This is a limitation that also applies to the Leap Motion; it is a limitation of infra-red itself - black objects absorb it so well!

Thus, I examined some other types of sensors we could use. The first idea I researched was whether you could modulate the signal coming from the IR LED and have it received at a certain frequency, so that each finger's sensor wouldn't interfere with the others. To this end, I found a receiver called the TSOP4838, which receives the signal at around 37kHz. Modulating the light of the LED at 37kHz was not difficult - it was a very simple RC circuit (I also experimented using a function generator at 37kHz).


However, to my dismay, I found that the output of this TSOP was digital in nature. It would go HIGH or LOW, but would not output an analog value. This would not be useful for our purposes, as the sensors need to give a different analog value based on how far away an object is. Interestingly, I found out that these TSOP receivers are more generally used for remote control systems and thus have a long range. I was able to pick up the signal of this modulated LED from quite a fair distance - maybe 60-100cm. The LED I used has quite a narrow viewing angle, and I was surprised the receiver was able to pick it up. In the end, I probably could have worked out a solution using the TSOPs, but instead I hit on a much simpler idea.

Infrared phototransistors: these essentially work the same as the phototransistors we were using before, but have a built-in filter for infra-red light! This meant the software needed only minimal changes, and a much more accurate level could be read. A note on the IR phototransistors and the IR LEDs - they have quite a narrow viewing angle, around 15 degrees, so the IR sensors are highly directional. In the future, it may be valuable to install LEDs with a wider viewing angle, or install more of these sensors so more parts of the hand can have haptic feedback.

The circuitry sketches are shown below:


This sketch shows the basic idea of the phototransistor circuit


The top half of this drawing is the ideas behind how to actually build the VeroBoard for this sensor and how small I could make it.

The final product is below! Shen will detail more about the integration of the sensors and the fingers in another post - suffice to say that these turned out quite well!




The only problem with these sensors is that they were not identically made. Even the slightest difference in the heights of the phototransistor and the LED would cause a markedly different reading. The angle of each LED and phototransistor determines the optimal distance that it can measure. The next section will detail the software readings and the problems that came with the IR sensors.


Software

In an earlier post, I detailed the way that the Arduino would read the IR data. The polished version in JavaScript is below:
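In outline, it looks like this (a minimal sketch - the pin assignments and threshold numbers are illustrative; the real thresholds were experimentally determined, as described below):

    var five = require('johnny-five');
    var board = new five.Board();

    // one intensity level (0-3) per finger, sent over the socket
    var levelarray = [0, 0, 0, 0, 0];

    // map a raw analog reading to a vibration level from 0-3
    function checklevel(reading) {
      if (reading > 600) return 3;  // roughly 3cm-0cm away
      if (reading > 300) return 2;  // roughly 10cm-3cm away
      if (reading > 100) return 1;  // roughly 15cm-10cm away
      return 0;                     // nothing close enough to matter
    }

    board.on('ready', function() {
      var pins = ['A0', 'A1', 'A2', 'A3', 'A4']; // thumb through to pinky
      pins.forEach(function(pin, i) {
        var Finger = new five.Sensor({ pin: pin, freq: 250 });
        // as soon as a finger has data, update its slot in levelarray
        Finger.on('data', function() {
          levelarray[i] = checklevel(this.value);
        });
      });
    });

    // every 250ms, send the five levels over the socket
    // (socket set up as in the Remote Control Server post)
    setInterval(function() {
      socket.emit('arduino data', levelarray);
    }, 250);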

The levels that I have chosen have been experimentally determined. I decided that 15cm-10cm, 10cm-3cm and 3cm-0cm thresholds should be used. This was an arbitrary decision, but it gave what I felt to be the right amount of distance for a change to occur. If the outermost threshold were further than 15cm, it would start vibrating too early. This is what happens in the checklevel() function in the above code - it outputs a level from 0-3 indicating the intensity of the vibration (a later post will detail the software of the motors).

Using Arduino code, the following values were recorded when there was nothing in front of the IR sensor:
Thumb: 9
Index: 5
Middle: 8
Ring: 10
Pinky: 5

The thresholds you see in the above code come from holding a white sheet of paper at differing distances in front of the IR sensors. This gives the best response; other colours will give varying responses (i.e. the distance thresholds will be slightly different), but the system works well with a variety of objects, as will be shown in a later demo.

The reading of the sensors happens in each of the small sections that start with "Finger.on('data'...". As NodeJS is asynchronous, as soon as there is data to be read from a finger, it is taken and used to update the 'levelarray' that contains the intensity of response for each finger. The reason the payload is so small is that I felt it would be easier and quicker to send 5 numbers over the socket than a fluctuating array of raw readings. It is a small difference, but it turns out to be easier anyway.

Below is what you receive on the client-side:



You can probably see when we introduced an object in front of the fingers, and when we took it away! Another thing you will notice is that the fingers are not always consistent in their readings - this is due to the asynchronous nature of the calls, which are quite sensitive to change. It does not affect the use of the system, however, and thus does not cause a problem.


Coming soon: vibration motor software integration and fixing wiring problems

Monday, 18 August 2014

Airgrip Control...over the air?!

Hello hello! Another long-awaited (or not) post from Vince!

In this update we look at the Remote Control Server that we will be able to use to control the Airgrip system from anywhere (as long as there's internet connection).

There is an absolute MULTITUDE of tutorials about NodeJS online, but very few about Socket.io - and those that do exist are sorely outdated. Furthermore, between the beginning of developing this server and now, the Socket.io package has been updated to further increase its performance, and the tutorials have not kept up.

However, I would like to credit this tutorial (http://danielnill.com/nodejs-tutorial-with-socketio/) for being an excellent springboard into what we needed to do.

Server-side

NodeJS

Node is an excellent, lightweight JavaScript runtime that makes it terribly easy to set up servers and routes. See the code snippet below (apologies for the raw, unfinished nature of it!):
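(In skeleton form - the port number and file handling here are illustrative:)

    var http = require('http');
    var fs = require('fs');

    var server = http.createServer(function(request, response) {
      switch (request.url) {
        case '/':
          // the root directory: a bare page that just says hello world
          response.writeHead(200, { 'Content-Type': 'text/html' });
          response.end('<html><body>hello world</body></html>');
          break;
        case '/socket.html':
          // serve the page that talks to the socket server
          fs.readFile(__dirname + '/socket.html', function(err, data) {
            if (err) {
              response.writeHead(500);
              response.end();
              return;
            }
            response.writeHead(200, { 'Content-Type': 'text/html' });
            response.end(data);
          });
          break;
        default:
          // anything else gets the dreaded 404
          response.writeHead(404);
          response.end('404: page not found');
      }
    });
    server.listen(8080); // port is illustrative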


The line "var server = http.createServer(function(request, response) {..." in one step creates the server. The switch case determines which page is shown - in this case, the root directory "/" will show a bare html page that just says "hello world". If you access "socket.html", the socket.html page will be shown instead. The page that comes up if it's not a URL that is recognized, is the dreaded 404 page that we all have experienced at least once in our lifetimes (just like a BSOD - or if you're a MAC user, that spinny rainbow thing that says EVERYTHING IS BROKEN)

Anyway, that's the basic idea of NodeJS. The Node Package Manager (npm) has an absolute multitude of packages that can extend the capability of the server. Popular packages include Express, AngularJS and, most importantly for our purposes, Socket.IO.

Socket.IO

The Socket.IO package's main use is for real-time data transmission. Due to the asynchronous nature of NodeJS, being able to send data whenever it is ready is quite handy. Thus, we have Socket.IO!
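Skeleton code for the server side looks something like this ('sensors' stands in for the IR data we want to push out, and 'server' is the HTTP server from the previous snippet):

    var io = require('socket.io').listen(server);

    io.sockets.on('connection', function(socket) {
      // push the sensor data out to the client under the name 'arduino data'
      socket.emit('arduino data', sensors);

      // wait for Leap Motion frames coming back from the client
      socket.on('frame', function(data) {
        console.log('frame received: ' + data);
      });
    });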


This snippet shows how we can use Socket.IO. The package builds on top of the WebSocket protocol.

The most important parts of this snippet are the "socket.emit" and "socket.on" commands. This socket server is interfacing with an HTML page that also uses Socket.IO to send and receive data. The "socket.emit" command sends data (here, the variable 'sensors') under the name 'arduino data' to the client/HTML page. The "socket.on" command waits for a message to come in with a certain name (in this case 'frame') before executing the code given in its callback function. Here, I was trying to test whether the data given from the Leap Motion could be used to vary what was sent over the server.
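On the client, the relevant part of socket.html looks roughly like this (the address is illustrative):

    <!-- load the Socket.IO client library the server makes available -->
    <script src="/socket.io/socket.io.js"></script>

    <div id="frame"></div>

    <script>
      var socket = io.connect('http://localhost:8080'); // address is illustrative
      socket.on('arduino data', function(data) {
        // write whatever the server sent into the element with id="frame"
        document.getElementById('frame').innerHTML = data;
      });
    </script>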

This small snippet of code is from the HTML file that is shown in the browser. When the socket receives a message named 'arduino data', it modifies the HTML element with id="frame", changing it to whatever data has been sent.

Next post I'll detail some more information about the data that's being sent!