This version adds no new features, but it brings compatibility with the Galaxy Note 10.1 (Android 4.1.2) and probably a few other devices.
Some thoughts about the development
I have been trying to get the IR frames from the sensor and ran into a huge facepalm problem. So if anyone tries to do the same and finds this post, they might be saved a few hours of “bug” hunting.
Problem: Blank frames retrieved from the IR sensor
So what did I try to achieve?
I opened the sensor just as I would for the depth stream, but with SENSOR_IR instead of SENSOR_DEPTH. The sensor opened just fine and it did send data, but it was basically a black image with only very few non-zero values, and those were no higher than 0x04 or 0x05.
My assumption was that there was a problem with the initialization, which was partly true, but not in the way I expected.
At first I assumed that the sensor was simply not able to send IR frames, but then I discovered that NiViewer showed the IR stream just fine.
My second thought was that it might be sent as the RGB stream, since the Structure Sensor does not have an RGB camera and you never know what developers are thinking :). But that stream, obviously, could not even be opened.
So I put it aside for a rainy day.
After a few days of doing nothing with it, I tried to dive into NiViewer and find out what black magic they were applying that I had missed.
And what I found was … nothing … nada. They opened the sensor just the same as I did and got correct IR frames, but I still did not.
There is, however, one thing I should mention which finally helped me solve this “problem”. Between the first tries and finally finding the cause, I played around with the PS1080Console and the PSLinkConsole and discovered that there are commands to turn the infrared projector on and off.
And then, while debugging NiViewer, it finally hit me. NiViewer does in fact do something which I did not do in my modifications of the SimpleViewer sample code: it opens the depth stream AND the IR stream, not just the IR stream. By opening the depth stream, the infrared projector gets turned on automatically. This means that my test code did receive correct IR frames; there was simply nothing for the sensor to see. It seems the IR pass filter is so narrow that the camera really only sees the wavelength emitted by the infrared projector.
Bottom line: If you are trying to take IR pictures with the Structure Sensor, don’t forget to turn on the IR projector (just start the depth stream to do that), or use an infrared light source with the correct wavelength that passes through the IR filter.
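In OpenNI2 terms, the fix boils down to something like the following sketch (error handling stripped, and of course it only does something useful with a device attached):

```cpp
#include <OpenNI.h>

// Minimal sketch: to get usable IR frames from the Structure Sensor,
// start the depth stream as well -- that is what turns the projector on.
int main() {
    openni::OpenNI::initialize();

    openni::Device device;
    device.open(openni::ANY_DEVICE);

    // Starting the depth stream switches the IR projector on automatically.
    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);
    depth.start();

    // Now the IR camera actually has something to see.
    openni::VideoStream ir;
    ir.create(device, openni::SENSOR_IR);
    ir.start();

    openni::VideoFrameRef frame;
    ir.readFrame(&frame); // no longer an (almost) black image

    ir.stop();
    depth.stop();
    openni::OpenNI::shutdown();
    return 0;
}
```

Opening only the IR stream, as I did at first, is perfectly valid API-wise, which is exactly why the black frames looked like a bug rather than a lighting problem.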
We are still playing catch-up with just about every other Kinect-, PrimeSense-, … enabled software, since the app currently only does single-shot scans which, on top of that, are not even synchronized.
- The next goal is to add an IR capture feature, which will be used to calibrate the IR camera and to calibrate the two sensors (depth and Android camera) to each other.
- Once the calibration works, we will add a feature to transform the depth map into a 3D point cloud with the camera image mapped onto the vertex colors. The app will probably not implement a 3D viewer itself but only create an OBJ file and then call an external app to view the model (there seem to be a few viewer apps in the Play Store).
- Synchronizing the depth scan and camera images: I am not sure whether the Structure Sensor exposes some form of software trigger mechanism, but since the Android camera works asynchronously it is going to be tough anyway. I really have no good synchronization method in mind at the moment, apart from doing it heuristically.