Sunday, July 03, 2016

Image classification with TensorFlow

Lately I have been spending time experimenting with scikit-learn and TensorFlow, Google's open-source library for machine learning.

Using the image recognition library from TensorFlow, I hacked together a web service with an endpoint where I can send an image and get back a prediction for its content.
To consume the API, I initially thought of making an Android app, but later it made more sense to follow up on my previous post and use a chat bot. My experimental project was simple: I should be able to send an image to my chat bot on WhatsApp and get back the prediction.

TensorFlow is easy to learn and use, and it has decent documentation. I trained the model using a sample image database downloaded from a Google example. To create the web endpoint, I used Flask, a Python web framework.
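The endpoint itself is only a thin wrapper around the model. A minimal sketch of what it looks like — the `classify_image` helper here is a hypothetical stand-in for the actual TensorFlow inference call, not my real code:

```python
# Minimal Flask endpoint sketch. classify_image is a hypothetical
# placeholder for the TensorFlow call that runs the trained graph.
from flask import Flask, request, jsonify

app = Flask(__name__)

def classify_image(image_bytes):
    # Placeholder: the real version feeds image_bytes to the trained
    # TensorFlow graph and returns (label, score) pairs, best first.
    return [("catamaran", 0.623313), ("fireboat", 0.0246246)]

@app.route("/predict", methods=["POST"])
def predict():
    # Read the uploaded image and return predictions as JSON.
    image_bytes = request.files["image"].read()
    predictions = classify_image(image_bytes)
    return jsonify([{"_string": label, "_score": score}
                    for label, score in predictions])

# On the server this is started with something like:
# app.run(host="0.0.0.0", port=5000)
```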
Now I can test my web service by uploading a test image.

Prediction:
    _score: "0.623313",  _string: "catamaran"
    _score: "0.099451",  _string: "dock, dockage, docking facility"
    _score: "0.0621891", _string: "liner, ocean liner"
    _score: "0.0246246", _string: "fireboat"
    _score: "0.0244034", _string: "speedboat"
The prediction engine is 62% sure that the image is of a catamaran. Close enough.

Now I could hook this up to my WhatsApp bot. I used the yowsup Python library to get this up and running; I just had to register my spare mobile number with the WhatsApp server. With a bit of fiddling around, the bot was ready to talk to the prediction API.
Now I can send a picture and get back the prediction in JSON format.
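The bot itself does very little: it downloads the incoming image, POSTs it to the prediction endpoint, and formats the top result as a text reply. A sketch of just the formatting step, assuming the response shape shown above (the function name is mine, not yowsup's):

```python
# Turn the prediction JSON into a short chat reply.
# Expects the list-of-dicts shape returned by the /predict endpoint.

def format_reply(predictions):
    # predictions: list of {"_string": label, "_score": score}
    best = max(predictions, key=lambda p: float(p["_score"]))
    pct = float(best["_score"]) * 100
    return "I am %.0f%% sure this is a %s" % (pct, best["_string"])

sample = [
    {"_score": "0.623313", "_string": "catamaran"},
    {"_score": "0.099451", "_string": "dock, dockage, docking facility"},
]
print(format_reply(sample))  # I am 62% sure this is a catamaran
```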


Thursday, February 19, 2015

Current state of Wearables and Clinical trials

I have been fascinated by sensor networks since my graduation days. I hadn't heard the term IoT back then; it was all "wireless sensor networks": many sleepless nights spent playing with microcontrollers and miniature robots talking to each other over Bluetooth and a closed WiFi network using XMPP. Good old days. Then in 2005 something terrible happened: I got a job, and my journey into the software development world began. But electronics and hardware have always been my main inspiration.

Looking at the current trend in wearable technology and the healthcare industry, it seems that the way forward for clinical trials is remote, uninterrupted data collection, or Device Driven Clinical Trials as I like to call it. This is nothing new, but my concern is with the wearable devices available in the market today: they limit the use case for collecting sensible data. For example, commercial fitness bands/activity trackers like Fitbit and Garmin are focused on capturing activity such as steps, runs and sometimes heart rate. These data are good for tracking one's activity but would fall short when it comes to collecting data for a clinical trial [it may sound exaggerated, but bear with me].

Consider a use case where a patient signs up for a clinical trial that uses cutting-edge technology to capture data from the patient. He would be given a traditional "fitness band" so that his activity can be electronically captured. What happens if the patient experiences a sudden increase in his heart rate, and all the EDC records is that at 7:00 PM the patient's heart rate was 110 bpm? We cannot make sense of a single kind of objective data in isolation.

To have meaningful metrics we need meaningful data. To capture meaningful data we need as much objective data as possible. To achieve this we need to look at the kind of wearable that is fit for the purpose.

IoT to the rescue:
I am using "IoT" loosely here. The Internet of Things has become a phenomenon much like selfies: people took selfies even before the invention of digital cameras, but only recently did "selfies" become an internet rage.

IoT in wearables can, in other words, be described as a connected sensor network. Below is a simple breakdown of what a wearable suitable for the field of clinical trials would consist of [not limited to the ones mentioned].
[Screenshots: breakdown of sensor components for a clinical-trial wearable]

A heart rate of 110 bpm alone does not give enough information: the patient may just be playing, having an argument with his wife, or suffering a severe adverse reaction to a drug. Other data such as respiration rate, perspiration and body temperature, to name a few, would be useful. All these data can be transmitted in real time to a cloud service and later imported into an EDC system.

These sensors can be embedded in smart clothing, so that we won't embarrass patients by making them wear a computer around their neck and hunt for a power outlet.

In the next post I will share the setup I have developed using Android/MQTT/RoR/MongoDB to capture the data into an intermediate datastore, from where it can be pushed to any electronic data capture system.
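As a teaser, here is a sketch of the kind of JSON payload a device-side bridge might publish over MQTT. The topic and field names are illustrative, made up for this sketch rather than taken from my actual setup:

```python
# Illustrative vitals payload a wearable/Android bridge might publish.
# All names (topic, fields, patient id) are made up for the sketch.
import json
import time

def build_payload(patient_id, heart_rate, resp_rate, temp_c):
    # Bundle several objective readings with a timestamp so the EDC
    # can correlate them later, instead of a lone heart-rate number.
    return json.dumps({
        "patient_id": patient_id,
        "timestamp": int(time.time()),
        "heart_rate_bpm": heart_rate,
        "respiration_rate": resp_rate,
        "body_temp_c": temp_c,
    })

payload = build_payload("P-001", 110, 22, 37.4)
# With paho-mqtt this would then be published roughly as:
# client.publish("trial/vitals/P-001", payload, qos=1)
print(payload)
```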

Saturday, November 17, 2012

OpenHome with BubbleUPnP

With UPnP devices we get the wireless freedom of streaming content across many devices, and with more and more of them being DLNA certified, all kinds of devices are hitting the market, from mobile phones to smart TVs. In the DIY world, one good thing about a non-proprietary software stack is more freedom. For many years, DIY home media servers have relied on open source software to stream content throughout the house. Setting this up yourself is a mighty task given the complexity of hardware supporting the software; for some there is always the non-cheap option [set up a small Apple shop at home: iPhone + iPad + Apple TV], and for the rest we have mediatomb and minidlna, to name a few.

In my case, I already had minidlna running on my pogoplug, seamlessly streaming video and audio to my devices. The missing piece of a good home media system is the media renderer. [Note: this is different from a client. A DLNA client can be used to browse media server content and play it, while a control point + renderer lets you play the media on a different device.] Consider a simple use case: you are watching a movie in the living room and decide to continue it on your bedroom TV; using a control point, you can transfer the movie to the bedroom TV.

There is no single open source package with such a feature set [LinuxMCE was an interesting project], but it can be achieved with a combination of tools. For my setup I chose BubbleUPnP.

The BubbleUPnP server has support for creating OpenHome playlists, which is exactly what I was looking for. This feature allows device-specific playlists: your living room TV can have a different playlist from your bedroom hi-fi system, and since the playlist is not saved in the control point, you can switch from one renderer to another without losing it.

Setting up BubbleUPnP is documented for Windows and Linux systems. For my setup I wanted to run the BubbleUPnP server on my pogoplug, the renderer on my HP TouchPad, and the control point on my Nexus One.

My setup:
- pogoplug: minidlna + BubbleUPnP server (media server)
- HP TouchPad: BubbleUPnP renderer
- Nexus One: BubbleUPnP renderer + control point

BubbleUPnP server setup:
Getting the BubbleUPnP server up on the pogoplug is a bit tricky: it needs Java to be installed first. Download the latest Java runtime for embedded Linux from the Oracle site and untar the file:

tar zvxf ejre-1_6_0_34-fcs-b04-linux-arm-sflt-eabi-headless-19_jul_2012.tar.gz
mkdir /usr/local/java
mv ejre-1_6_0_34 /usr/local/java

Next, download the server package from here and run the startup script.
If the Java installation was successful, you should now see that the server is running, and you can configure the server and renderer from the web interface at http://<server-ip>:5850.

Further configuration help can be found here.