Hi all
I am working on a little project which requires reading the data stream from a NeuroSky MindWave Mobile. The current state of the project: the headset is connected to my Mac over Bluetooth through the installed ThinkGear Connector, and I have a Processing sketch which sets up a TCP connection to the ThinkGear Connector using the oscP5 library.
I know there is some level of communication through the system, because when I run the sketch the ThinkGear Connector shows that it is connected as a client and it wakes the headset pairing. My problem is that I am using oscEvent to receive any traffic but I am not receiving anything; currently I just print a period to the console as a test.
Nothing…
Anyone else having similar issues? I know there was a NeuroSky library for Processing, but it seems to be deprecated. I also know the headset is working, as it connects to the demo applications supplied with the device through the ThinkGear Connector.
Any help gratefully received
Caleb
Answers
Please post your communication code. As a side suggestion, I would use the network library directly. Info at this link:
https://processing.org/reference/libraries/net/index.html
Kf
Hey there! Here's the code so far - as you can see it's pretty stripped down, just to get at least some understandable data coming through:

```
import netP5.*;
import oscP5.*;

OscP5 myClient;
NetAddress myRemoteLocation;

void setup() {
  size(200, 200);
  // connect to the ThinkGear Connector socket on its default port
  myClient = new OscP5(this, "127.0.0.1", 13854, OscP5.TCP);
}

void draw() {
}

void oscEvent(OscMessage theMessage) {
  print(".");  // test: print a period for each message received
}
```
The reason I am using the P5 lib is due to the specification of a TCP socket for the thinkgearconnector socket server that receives the data from the headset and pushes it to processing
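For what it's worth, the ThinkGear Connector's TCP socket speaks the ThinkGear Socket Protocol (JSON by default), not OSC, so an oscP5 client will never see a well-formed OSC packet on it, which would explain oscEvent never firing. A minimal plain-Java sketch of the JSON route (the regex helper is a simplification for illustration, not a full JSON parser, and depending on configuration the connector may expect an initial JSON config message - see the ThinkGear Socket Protocol docs):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Socket;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ThinkGearJson {

    // Pull a single integer field (e.g. "attention") out of one JSON line.
    // A real client should use a proper JSON parser; a regex is enough for a sketch.
    static int extractField(String json, String field) {
        Matcher m = Pattern.compile("\"" + field + "\"\\s*:\\s*(\\d+)").matcher(json);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }

    public static void main(String[] args) throws Exception {
        // ThinkGear Connector listens on localhost:13854 by default
        try (Socket s = new Socket("127.0.0.1", 13854);
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {   // one JSON object per line
                System.out.println("attention = " + extractField(line, "attention"));
            }
        }
    }
}
```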
Check these references of the oscP5 library:
If you are receiving data, the following code should work for you:
Kf
Hey there
Apologies for the slow response - I finally got usable data from the headset. In the end I avoided going through the ThinkGear Connector socket and went straight to the source via Bluetooth.
Now, at a rate of roughly every 1.5 seconds, I receive values for:
Delta, Theta, Low Alpha, High Alpha, Low Beta, High Beta, Low Gamma, Mid Gamma, Attention and Meditation
Your original suggestion of using the network library was correct for this rebuild, and the full code is shown below in case anyone else is interested in using it:
Great to hear it is working. Thanks for sharing!
Kf
Thanks so much for sharing this. Very interesting.
I've read mixed reviews of the NeuroSky MindWave Mobile, saying that most of the channels (except perhaps Attention) are incredibly noisy, making it difficult to distinguish activity (or even presence). Still, I'm interested, and would like to hear whether you felt the sensor data ended up being usable.
Currently the data does look a little screwy, and that could be down to inaccuracies on the device. For example, I seem to be getting quite high readings in the delta range, which are characteristic of deep stage 3 NREM sleep. Unless I'm dreaming that I'm coding in Processing, I don't think that's necessarily right ;)
One thing I am doing in the byte reading is just checking fixed byte positions in the payload from the headset, which may not be the most accurate approach. The byte prior to each wave value is an ID byte, so I may switch to checking that instead for verification.
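The ID-byte approach described above can be sketched as follows: in a ThinkGear payload, each value row starts with a CODE byte (per NeuroSky's ThinkGear serial protocol, 0x04 = Attention, 0x05 = Meditation, 0x83 = the 8-band ASIC_EEG_POWER block; codes at 0x80 and above are followed by an explicit length byte). A hedged plain-Java sketch, assuming the payload bytes have already been extracted from a packet (class and method names are illustrative):

```java
public class PayloadWalker {

    // Walk a ThinkGear payload row by row, keyed on each row's CODE byte,
    // instead of assuming fixed byte positions. Returns {attention, meditation}.
    static int[] walk(int[] payload) {
        int attention = -1, meditation = -1;
        int i = 0;
        while (i < payload.length) {
            int code = payload[i++];
            if (code >= 0x80) {
                // extended codes carry a length byte, then that many value bytes
                int len = payload[i++];
                if (code == 0x83) {
                    // ASIC_EEG_POWER: 8 bands, each a 3-byte big-endian unsigned int
                    for (int band = 0; band < 8; band++) {
                        int v = (payload[i + band * 3] << 16)
                              | (payload[i + band * 3 + 1] << 8)
                              |  payload[i + band * 3 + 2];
                        System.out.println("band " + band + " = " + v);
                    }
                }
                i += len;
            } else {
                // codes below 0x80 carry a single value byte
                int value = payload[i++];
                if (code == 0x04) attention = value;
                if (code == 0x05) meditation = value;
            }
        }
        return new int[] { attention, meditation };
    }
}
```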
One other thing: I seem to be getting full payload packets only every 1.5 to 3 seconds, which isn't a very fast response. I think I may read in a byte stream instead and run through it, rather than reading individual bytes.
Will post my findings.
Ok, so here's my final code (until I push the responses out to the Arduino for toy purposes). Currently it reads in, and when two consecutive bytes of decimal 170 are received it checks the packet length. As I am only looking for the long packets which contain all the wave data, I know this length is 32 bytes, so I read 32 bytes into an array. Then I just identify the appropriate bytes per the protocol documentation and push them out to my graphing code.
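The framing described above matches NeuroSky's ThinkGear serial protocol: two 0xAA (decimal 170) sync bytes, a payload-length byte, the payload itself, then a checksum byte which is the bitwise inverse of the low 8 bits of the payload sum. A hedged plain-Java sketch of scanning a byte stream for one valid 32-byte payload (class and method names are illustrative, not the poster's code):

```java
public class PacketFramer {

    // Checksum per the ThinkGear serial protocol: sum the payload bytes,
    // keep the low 8 bits, and invert.
    static int checksum(int[] payload) {
        int sum = 0;
        for (int b : payload) sum += b;
        return (~sum) & 0xFF;
    }

    // Scan a raw byte stream for [0xAA][0xAA][0x20][32-byte payload][checksum]
    // and return the first payload whose checksum verifies, or null.
    static int[] findPayload(int[] stream) {
        for (int i = 0; i + 35 < stream.length; i++) {
            if (stream[i] != 0xAA || stream[i + 1] != 0xAA) continue;  // sync bytes
            int len = stream[i + 2];
            if (len != 32) continue;               // only want the full EEG packet
            int[] payload = new int[len];
            System.arraycopy(stream, i + 3, payload, 0, len);
            if (checksum(payload) == stream[i + 3 + len]) return payload;
        }
        return null;
    }
}
```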
Again, the data does look a little weird, so I certainly wouldn't use it for anything other than a toy (I'm waiting for my OpenBCI board to arrive, at which point I will look at this process again in more depth), but the last two signals on the graph are Attention and Meditation, which seem to roughly correspond to what I feel I am doing, so they may be useful.
Code below...
Thanks so much again for sharing this.