Seeing through walls using RF


For someone who works with RF for Internet of Things communications, this is a very impressive use of it. The team from MIT have built a device that emits RF through a wall, which rebounds off objects on the other side, and uses sequential frames to reconstruct a captured object. For moving individuals there appears to be a substantial Doppler effect on the rebounding waves, which is interesting as I am planning to get to grips with a 10 GHz microwave sensor system that relies on the Doppler effect and is nowhere near what MIT are working on. Here is a video of what they have achieved, which I think is seriously impressive.

Keytree at SAP TechEd Las Vegas 2015

Photos from SAP TechEd Las Vegas 2015

Project SMILE Winner of SAP TechEd 2015

We won the Las Vegas Demojam with an application we built in just 32 hours. We presented it on stage to give you an idea of what it can do but we didn’t give you a complete picture of exactly what went into it.

The Brief

Our brief was to reduce the bad habits employees engage in at work that harm their health. We built an application comprising a native mobile app for the employee, a web interface for the company health therapist, a desk sensor to capture the data, and a backend with analysis capabilities to tie it all together. The purpose was to monitor the following employee attributes:

  • Temperature of Employee
  • Movement of Employee at desk
  • Hydration of Employee

The Hardware

The key to capturing the data was the hardware selection. We settled on a thermopile. Thermopile sensors measure the temperature of whatever they are looking at by analysing the infra-red being emitted. Ours performed 64 unique temperature measurements 15 times per second, which were read by an Arduino Uno before the raw data was pushed to SAP HANA. To give you an idea of the data scale, that is approximately 960 readings per second and just shy of 83 million unique temperatures per day. The electronic circuit itself required the addition of passives, regulators, pull-ups and a few other components, so when we built it on a breadboard it did look like a tangle of wires, but it provided clean data.
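
For a sense of how the ingestion side fits together, here is a minimal sketch of the kind of Node code that could take each 64-reading frame from the Arduino and push it into SAP HANA with the node-hdb client. The framing over serial (one comma-separated line per frame, piped into stdin here), the table name and the column layout are illustrative assumptions rather than the exact setup we used.

```javascript
// Illustrative sketch: frames arrive as comma-separated lines of 64 temperatures
// (e.g. piped in from the Arduino's serial port) and are bulk-inserted into HANA.
var hdb = require('hdb');
var readline = require('readline');

var client = hdb.createClient({
  host: 'hana.example.com', // placeholder connection details
  port: 30015,
  user: 'SMILE',
  password: 'secret'
});

client.connect(function (err) {
  if (err) { throw err; }

  // One row per individual temperature reading, keyed by timestamp and cell index.
  client.prepare(
    'INSERT INTO "SMILE"."READINGS" ("TS", "CELL", "TEMP") VALUES (?, ?, ?)',
    function (err, statement) {
      if (err) { throw err; }

      var rl = readline.createInterface({ input: process.stdin });
      rl.on('line', function (line) {
        var ts = Date.now();
        var rows = line.trim().split(',').map(function (value, cell) {
          return [ts, cell, parseFloat(value)];
        });
        // node-hdb bulk-inserts all 64 rows of the frame in one round trip.
        statement.exec(rows, function (err) {
          if (err) { console.error('insert failed', err); }
        });
      });
    }
  );
});
```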

For the purposes of the demo we mounted the sensor on top of an LG HD webcam and aligned the field of view of the webcam with the field of view of the sensor, so that we could demonstrate what the hardware was actually doing. For actual use the webcam would not be required.

We also added Bluetooth iBeacon technology to allow user validation at a particular desk. This means that a user with the SMILE app installed is detected when their device comes within range of the iBeacon, and the app automatically signs them in when they are close to the sensor.

The Logic

With all of this data being captured from the hardware, processing and analysis is the next stage of the application. On each frame we look for significant areas. A significant area is defined as a region of temperature readings that differs from the ambient temperature and is of a particular size. To do this we pulled in an image processing technique for blob detection and analysis. These blobs were then compared to known entities, or signatures of entities: we had a signature of a human face that was between 35 and 40 degrees and of a particular size, and similar signatures for cold and hot drinks. Each detected signature has a calculated weighted centre point, magnitude and maximum value. At this stage we have detection of a human being present and their temperature. With some research we discovered that the forehead gives a reasonably accurate body temperature, though on average it reads about two degrees below core body temperature, since we are monitoring skin surface temperature; the forehead works well because blood flows close to the surface there.
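
As a rough sketch of the idea (assuming an 8x8 layout of the 64 readings; the threshold and the face signature values are illustrative rather than the exact ones we tuned), the per-frame detection looked something like this:

```javascript
// Illustrative blob detection over one 8x8 thermopile frame.
// Grid layout, threshold and the "face" signature are assumptions for the sketch.
var WIDTH = 8, HEIGHT = 8;

function detectBlobs(frame, ambient, threshold) {
  var visited = new Array(frame.length).fill(false);
  var blobs = [];

  for (var start = 0; start < frame.length; start++) {
    if (visited[start] || frame[start] - ambient < threshold) { continue; }

    // Flood fill the connected region of "warm" cells.
    var stack = [start], cells = [];
    visited[start] = true;
    while (stack.length) {
      var i = stack.pop();
      cells.push(i);
      var x = i % WIDTH, y = Math.floor(i / WIDTH);
      [[x - 1, y], [x + 1, y], [x, y - 1], [x, y + 1]].forEach(function (n) {
        var nx = n[0], ny = n[1], j = ny * WIDTH + nx;
        if (nx >= 0 && nx < WIDTH && ny >= 0 && ny < HEIGHT &&
            !visited[j] && frame[j] - ambient >= threshold) {
          visited[j] = true;
          stack.push(j);
        }
      });
    }

    // Weighted centre point, magnitude and maximum value for the blob.
    var magnitude = 0, max = -Infinity, cx = 0, cy = 0;
    cells.forEach(function (i) {
      var w = frame[i] - ambient;
      magnitude += w;
      max = Math.max(max, frame[i]);
      cx += (i % WIDTH) * w;
      cy += Math.floor(i / WIDTH) * w;
    });
    blobs.push({
      size: cells.length,
      magnitude: magnitude,
      max: max,
      center: { x: cx / magnitude, y: cy / magnitude }
    });
  }
  return blobs;
}

// A blob roughly face-sized and peaking between 35 and 40 degrees reads as a person.
function isFace(blob) {
  return blob.size >= 4 && blob.max >= 35 && blob.max <= 40;
}
```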

Given the rate at which the sensor provides data, we also look at the differences between frames. Once the blobs were analysed in motion we were able to calculate their interactions, so we now have the movement of an individual's head at a desk, and we can tell when a drink is put to their mouth.
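
Building on the detectBlobs/isFace helpers from the sketch above, the frame-to-frame step could look roughly like this; the distance threshold used to count a "drink to mouth" interaction is an illustrative value, not the one we shipped.

```javascript
// Illustrative frame-to-frame tracking: how far has the face moved, and has a
// drink blob come close enough to the face to count as a sip?
function trackFrame(previousFace, blobs) {
  var face = blobs.filter(isFace)[0];
  var drinks = blobs.filter(function (b) { return !isFace(b); });

  var movement = 0;
  if (previousFace && face) {
    var dx = face.center.x - previousFace.center.x;
    var dy = face.center.y - previousFace.center.y;
    movement = Math.sqrt(dx * dx + dy * dy);
  }

  // Assumed interaction rule: a non-face blob within ~1.5 cells of the face.
  var drinking = face && drinks.some(function (d) {
    var dx = d.center.x - face.center.x;
    var dy = d.center.y - face.center.y;
    return Math.sqrt(dx * dx + dy * dy) < 1.5;
  });

  return { face: face, movement: movement, drinking: !!drinking };
}
```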

At the organisation level we are not interested in these individual values; we just want one metric to measure the health of an employee. These metrics are therefore combined into a single SMILE score, ranging from 0 to 100, of how healthy the individual is, which can then be averaged over the course of a day. So SAP HANA has taken 83 million data points and worked out one value for how healthy a person is.
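
The actual roll-up lived in the HANA model, but conceptually it is a weighted combination of the normalised temperature, movement and hydration metrics, something like the sketch below. The weights and target values are illustrative assumptions.

```javascript
// Illustrative SMILE score: combine temperature, movement and hydration into
// a single 0-100 value. Weights and targets are assumptions for the sketch.
function smileScore(metrics) {
  // Each sub-score is clamped to the range 0..1.
  var clamp = function (v) { return Math.max(0, Math.min(1, v)); };

  // Closer to a normal forehead surface reading (~35 degrees) scores higher.
  var temperature = clamp(1 - Math.abs(metrics.foreheadTemp - 35) / 5);
  // Movement relative to a daily target, e.g. number of significant moves.
  var movement = clamp(metrics.moves / metrics.movesTarget);
  // Hydration relative to a target, e.g. five drinks a day.
  var hydration = clamp(metrics.drinks / metrics.drinksTarget);

  return Math.round(100 * (0.4 * temperature + 0.3 * movement + 0.3 * hydration));
}

// e.g. smileScore({ foreheadTemp: 35.2, moves: 40, movesTarget: 60,
//                   drinks: 3, drinksTarget: 5 });  // => 76
```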

As an additional layer we took the temperature of an individual over time and compared it with the known temperature variation of certain illnesses such as influenza, so that a person can be warned if they should pre-emptively get themselves checked out by a doctor.

The Backend

We needed to tie the whole application together in a fast, reliable way. We set out with NodeJS using the SAP HDB connector, websockets (via Socket.IO) and a REST service (via HTTP). This gave us a platform that can serve data streams over the full-duplex connections or handle one-off requests, all connected to the data stores in SAP HANA.
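
Roughly, the wiring looked like the sketch below: an Express app for the REST side, Socket.IO sharing the same HTTP server for the streams, and a node-hdb client for HANA. Hostnames, credentials and the table names are placeholders.

```javascript
// Illustrative backend wiring: Express for REST, Socket.IO for streams,
// node-hdb for SAP HANA. Connection details and table names are placeholders.
var express = require('express');
var http = require('http');
var hdb = require('hdb');

var app = express();
var server = http.createServer(app);
var io = require('socket.io')(server); // used by the streaming side shown further down

var hana = hdb.createClient({
  host: 'hana.example.com',
  port: 30015,
  user: 'SMILE',
  password: 'secret'
});

hana.connect(function (err) {
  if (err) { throw err; }

  // One-off REST request: latest SMILE score for a user.
  app.get('/users/:id/score', function (req, res) {
    hana.prepare(
      'SELECT TOP 1 "SCORE" FROM "SMILE"."SCORES" WHERE "USER_ID" = ? ORDER BY "TS" DESC',
      function (err, statement) {
        if (err) { return res.status(500).send(err.message); }
        statement.exec([req.params.id], function (err, rows) {
          if (err) { return res.status(500).send(err.message); }
          res.json(rows[0]);
        });
      }
    );
  });

  server.listen(3000);
});
```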

One of the aspects we foresaw early on is that writing raw SQL statements to pull data from our HANA model, which was created in the SAP HANA Tools for Eclipse, wasn't going to be as fast as it needed to be. Therefore, we wrote a library that creates the statements, much like Squel does for MySQL, but for HANA. This library was more time-consuming to get working for the first few queries we needed in this application, but it then meant that we could perform retrievals and queries in HANA using a handful of lines of code in NodeJS.
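
We have not published the library, so the API below is purely a hypothetical illustration of the style of thing it enabled: build the statement with a fluent chain, then hand the generated SQL and parameters to the hdb client from the wiring sketch above.

```javascript
// Hypothetical usage of a Squel-style builder for HANA (the real library's API
// may differ); the point is that a query becomes a few readable lines of Node.
var startOfDay = new Date().setHours(0, 0, 0, 0);

var query = hanaQuery // hypothetical builder object
  .select('USER_ID', 'AVG("SCORE") AS AVG_SCORE')
  .from('"SMILE"."SCORES"')
  .where('"TS" >= ?', startOfDay)
  .groupBy('USER_ID')
  .toParam(); // => { text: 'SELECT ... WHERE "TS" >= ? GROUP BY ...', values: [startOfDay] }

// `hana` is the connected node-hdb client from the earlier sketch.
hana.prepare(query.text, function (err, statement) {
  if (err) { throw err; }
  statement.exec(query.values, function (err, rows) {
    if (err) { throw err; }
    console.log(rows); // one averaged SMILE score per user
  });
});
```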

The REST service used the Express library with our own middleware tier to process requests.
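
The middleware itself is not something we published, but as an example of the pattern, a request-processing tier in Express is just a chain of functions like this; the logging and check-in validation shown are illustrative stand-ins.

```javascript
// Illustrative Express middleware tier: log every request, then make sure the
// caller has checked in at a desk before the route handlers run.
var express = require('express');
var app = express();

function logRequests(req, res, next) {
  console.log(new Date().toISOString(), req.method, req.url);
  next();
}

function requireCheckIn(req, res, next) {
  var userId = req.get('X-Smile-User'); // assumed header name
  if (!userId) {
    return res.status(401).json({ error: 'not checked in at a desk' });
  }
  req.userId = userId;
  next();
}

app.use(logRequests);
app.use('/api', requireCheckIn);
```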

The websockets used Socket.IO so that clients can join particular data feeds and streams of data using a pub/sub and room methodology.
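
In Socket.IO terms each desk or feed is a room: a client asks to subscribe, the server joins its socket to the room, and new readings are published to everyone in it. The event and room names here are illustrative.

```javascript
// Illustrative pub/sub over Socket.IO rooms: one room per desk feed.
var server = require('http').createServer();
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  socket.on('subscribe', function (deskId) {
    socket.join('desk:' + deskId);
  });
  socket.on('unsubscribe', function (deskId) {
    socket.leave('desk:' + deskId);
  });
});

// Called whenever a new frame has been analysed for a desk.
function publishReading(deskId, reading) {
  io.to('desk:' + deskId).emit('reading', reading);
}
```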

The User Experience

This experience was a custom-designed, custom-UX native mobile app geared around giving users goals, for instance that you need to drink water five times a day or that you need to move more than a particular threshold. The iPhone app connects to the data feeds from the backend analysis directly over the websockets when the app is running, and via Apple Push Notifications (APNs) when it isn't. The mobile device receives a notification when you check in to the sensor (using the iBeacon); your temperature, movement and hydration are shown in real time within the app, as are actions such as taking a drink, so there is no need to touch the screen for them to be acknowledged.
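
When the app is not connected to the websocket, the backend falls back to a push. With the node-apn module that could look roughly like the sketch below (a v1.x-style API); the certificate paths, payload and the example message are placeholders, not the real ones.

```javascript
// Illustrative push via APNs using the node-apn module.
// Certificate paths and the payload are placeholders.
var apn = require('apn');

var apnConnection = new apn.Connection({
  cert: 'cert.pem',
  key: 'key.pem',
  production: false
});

function notify(deviceToken, message, data) {
  var note = new apn.Notification();
  note.alert = message;
  note.sound = 'default';
  note.payload = data;
  apnConnection.pushNotification(note, new apn.Device(deviceToken));
}

// e.g. when a sip is detected but the app is in the background:
// notify(user.deviceToken, 'Hydration logged', { type: 'drink' });
```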

The Organisation Experience

For this we went with a web UI built in ReactJS which consumed the websocket and REST services for data acquisition. This data was then presented in the UI as graphs, floor plans and alert lists so it was easy to see what was going on.
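
As a flavour of the dashboard code, a component can subscribe to a desk's feed over the websocket and re-render as readings arrive, along the lines of the sketch below; the component, event and field names are illustrative.

```javascript
// Illustrative dashboard component: joins a desk's feed over Socket.IO and
// shows the latest temperature and SMILE score as readings stream in.
var React = require('react');
var io = require('socket.io-client');

var DeskTile = React.createClass({
  getInitialState: function () {
    return { reading: null };
  },
  componentDidMount: function () {
    this.socket = io(); // connect back to the serving host
    this.socket.emit('subscribe', this.props.deskId);
    this.socket.on('reading', this.handleReading);
  },
  componentWillUnmount: function () {
    this.socket.disconnect();
  },
  handleReading: function (reading) {
    this.setState({ reading: reading });
  },
  render: function () {
    var r = this.state.reading;
    if (!r) { return React.createElement('div', null, 'Waiting for data…'); }
    return React.createElement('div', { className: 'desk-tile' },
      React.createElement('h3', null, 'Desk ' + this.props.deskId),
      React.createElement('p', null, 'Temperature: ' + r.temperature + ' °C'),
      React.createElement('p', null, 'SMILE score: ' + r.score)
    );
  }
});
```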

This means that an organisation's health therapist can easily see which desks people are at, their temperature and any notifications of concern, for instance that a user has drunk too much coffee or that someone has been stationary for too long.

The End Result

The complete solution provides a comprehensive health system for an organization with benefits and motivations for both the individual users and the organization.

In the 32 hours we built a native mobile app, a web interface, electronic hardware and an SAP HANA data model (with a custom-built library on top). We are looking at taking this hardware forward, making it into a product and giving everyone a chance to SMILE.

Assembly Festival – Edinburgh Fringe App and Website

We are down to the final week of the Edinburgh Fringe, and from the publicity, press, reviews and statistics it has been bigger than ever. At Keytree we helped Assembly Festival provide a website to promote and sell tickets, along with an app for delivering daily deals so you can make the most of your Edinburgh Festival. There are tickets available for comedy, musicals, cabaret and much more, covering performers such as Al Murray, Josh Widdicombe, Dan Clark, Andrew Lawrence and many others.


GoPro Session a new option for Quadcopters

The new GoPro Session camera weighs only 74 g, or 2.6 ounces. Initially this doesn't seem all that much lighter than the 88 g weight of the likes of the Hero 3 series. However, a crucial factor is the case required for mounting, which for the GoPro Hero 3 increases the overall weight to 152 g. Interestingly, on the GoPro site there is only one weight listed for the GoPro Session, rather than the separate camera and camera-plus-housing weights listed for the other cameras. If that is the case then it means a saving of up to 78 g, a healthy 51%.

So when would this make a difference to drones? I think that if you are building a drone specifically for filming, the Hero 4 will be better suited because of its image quality. However, for racing drones in the 250 class the GoPro Session is ideal: shedding weight makes a massive difference.


Myo and Chrome … changing, opening and closing tabs with muscle gesture detection

To get a better understanding of how to write scripts for the Myo, I wanted to use it to control Chrome. Here is what I set for the gestures:

  • Wave Out – Move to next tab
  • Wave In – Move to previous tab
  • Spread fingers – create new tab
  • Fist – close current tab

I also wanted to share this with everyone, so feel free to download and use it: chromecontrol.lua


Myo muscle sensing armband

I have been looking forward to this device ever since it popped up as an alternative method of hand tracking. It works by analysing the electrical impulses travelling through your arm to understand the gestures your hand and wrist are making, combining this with an IMU to track where your arm is. It connects via Bluetooth and there are Windows, Mac, Unity3D, iOS and Android SDKs to work with. It makes quite a few promises about what can be achieved and I cannot wait to get started with it.

Wearable technology the next frontier… brain reading with the InteraXon Muse headset


After backing Muse on Indiegogo I received my brain-sensing wearable technology headset. Brain sensing! It doesn't understand what you are thinking about; rather, it provides an understanding of the activity of your brain. In the box there is the headset, a micro-USB cable to charge it, a read-me booklet and a legal document.

Model Cluster Structure Sensor Capture, Storing, Sharing and Effects


In the pursuit of new technologies I backed the Structure Sensor from Occipital Inc on Kickstarter. Having received one of the first 200 devices, I got to grips with it and think it is a great device that can really allow your applications to reach out into the real world.

Download Now From iTunes for your iPad
