Friday, November 25, 2016

Embed git commit hash in a scala web service

One of the Scala services I was working on needed an endpoint that reports which version of the code is deployed on the server.

There are multiple ways of doing this. Embedding the version in the Manifest file is one easy way, but I found the https://github.com/sbt/sbt-buildinfo plugin, which works by generating a Scala source file from the build definition. Full details can be found on the project page.

The project name, version and other build-time details are already defined in the build.sbt file of the sbt project. Now I need the git details to be added to the generated BuildInfo.scala source file.

The https://github.com/sbt/sbt-git plugin does exactly what I need: it offers git integration with sbt. Now I can use sbt-git to fetch the commit SHA and the sbt-buildinfo plugin to generate a Scala source file that stores this information at build time.

The build.sbt file looks like this:

import Dependencies._

name := "myService"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= testLibraryDependencies
val gitCommitString = SettingKey[String]("gitCommit")

gitCommitString := git.gitHeadCommit.value.getOrElse("Not Set")

lazy val root = (project in file(".")).
  enablePlugins(BuildInfoPlugin).
  settings(
    buildInfoKeys := Seq[BuildInfoKey](version, gitCommitString),    
    buildInfoPackage := "buildInfo",   
    buildInfoOptions += BuildInfoOption.ToMap,    
    buildInfoOptions += BuildInfoOption.ToJson
  )
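Both plugins also need to be declared in project/plugins.sbt. A minimal sketch; the version numbers below are illustrative (this was written in 2016), so check each plugin's page for the current release:

```scala
// project/plugins.sbt -- version numbers are illustrative, check the plugin pages
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.6.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-git" % "0.8.5")
```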

After compiling the project, the sbt-buildinfo plugin generates a BuildInfo.scala source file under the target/scala-{version}/src_managed directory. This source contains the app version and the git commit hash.
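The generated file looks roughly like the sketch below. The field values and the exact rendering are illustrative, not the plugin's verbatim output; the real file also carries a package buildInfo declaration, and the toMap/toJson members appear because of the two BuildInfoOption settings above:

```scala
// Sketch of the generated BuildInfo.scala (real file lives in package buildInfo
// under target/scala-{version}/src_managed; values here are illustrative).
case object BuildInfo {
  val version: String = "0.1"
  val gitCommit: String = "23962adddf6e55d3f43466283cd3418a9133ee6c"

  // Emitted because of BuildInfoOption.ToMap
  val toMap: Map[String, Any] = Map(
    "version" -> version,
    "gitCommit" -> gitCommit
  )

  // Emitted because of BuildInfoOption.ToJson (simplified rendering)
  val toJson: String =
    toMap.map { case (k, v) => s""""$k":"$v"""" }.mkString("{", ", ", "}")
}
```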




The BuildInfoOption.ToJson option above enables rendering the build information as JSON.

I added a new route, /version, to the service:
package com.brainchunk.routes

import akka.http.scaladsl.server.Directives._
import buildInfo.BuildInfo

trait VersionApi {
  private val resource = "version"
  val versionApi =
    path(resource) {
      get {
        complete(BuildInfo.toJson)
      }
    }
}

Now the endpoint returns JSON like this:

$ curl localhost:8081/version

{"version":"0.1", "gitCommit":"23962adddf6e55d3f43466283cd3418a9133ee6c"}


Note:
IntelliJ IDEA is not very happy with this BuildInfo living in the target/../src_managed location. A workaround is described here:

https://youtrack.jetbrains.com/issue/SCL-7182#u=1402953965717


Sunday, July 03, 2016

Image classification with TensorFlow

Lately I have been experimenting with scikit-learn and TensorFlow, Google's open source library for machine learning.

Using the image recognition library from TensorFlow, I hacked together a web service with an endpoint where I can send an image and get back a prediction for its content.
To consume the API, I initially thought of making an Android app, but later it made sense to follow up on my previous post and use a chat bot instead. The experiment was simple: I should be able to send an image to my chat bot on WhatsApp and get back the prediction.

TensorFlow is easy to learn and use, and it has decent documentation. I trained the model using the sample image database from Google's example. To create the web endpoint, I used Flask, a Python web framework.
Now I can test the web service by uploading a test image:


Prediction:
[
  { "_score": "0.623313",  "_string": "catamaran" },
  { "_score": "0.099451",  "_string": "dock, dockage, docking facility" },
  { "_score": "0.0621891", "_string": "liner, ocean liner" },
  { "_score": "0.0246246", "_string": "fireboat" },
  { "_score": "0.0244034", "_string": "speedboat" }
]
The prediction engine is 62% sure that the image is of a catamaran. Close enough.

Now I could hook this up to my WhatsApp bot. I used the yowsup Python library to get it up and running; I just had to register my spare mobile number with the WhatsApp server. With a bit of fiddling around, the bot was ready to talk to the prediction API.
Now I can send a picture, and get back the prediction in JSON format.




Thursday, February 19, 2015

Current state of Wearables and Clinical trials

I have been fascinated by sensor networks since my undergraduate days; I hadn't even heard the term IoT back then. It was all wireless sensor networks: many sleepless nights spent playing with microcontrollers and miniature robots talking to each other over Bluetooth and a closed WiFi network using XMPP. Then in 2005 something terrible happened: I got a job, and my journey into the software development world began. But electronics and hardware have always been my main inspiration. Good old days…

Looking at the current trend in wearable technology and the healthcare industry, it seems the way forward for clinical trials is remote, uninterrupted data collection, or Device Driven Clinical Trials as I like to call it. This is nothing new, but my concern is with the wearable devices available on the market today: they limit the use cases for collecting sensible data. Commercial fitness bands and activity trackers like Fitbit and Garmin, for example, focus on capturing activity such as steps, runs and sometimes heart rate. This data is good for tracking one's activity but falls short when it comes to collecting data for a clinical trial [it may sound exaggerated, but bear with me].

Consider a use case where a patient signs up for a clinical trial that uses cutting-edge technology to capture data from the patient. He would be given a traditional "fitness band" so that his activity can be captured electronically. What happens if the patient experiences a sudden increase in his heart rate, and all the EDC captures is that at 7:00 PM the patient's heart rate was 110 bpm? We cannot make sense of only one kind of objective data.

To have meaningful metrics we need meaningful data. To capture meaningful data we need as much objective data as possible. To achieve this we need to look at the kind of wearable fit for the purpose.

IoT to the rescue:
I am using the term IoT loosely here. The Internet of Things has become a phenomenon much like selfies: people took selfies even before digital cameras were invented, but only recently did 'selfies' become an internet rage.

IoT in wearables can, in other terms, be described as a connected sensor network. Below is a simple breakdown of what a wearable suitable for the field of clinical trials would consist of [not limited to the items mentioned].



A heart rate of 110 bpm alone does not give enough information: the patient may just be playing, having an argument with his wife, or it may even be a severe side reaction to a drug. Other data such as respiration rate, perspiration and body temperature, to name a few, would be useful. All this data can be transmitted in real time to a cloud service and later imported into an EDC system.

These sensors can be embedded in smart clothing, so that we won't embarrass patients by making them wear a computer around their neck and hunt for a power outlet.

In the next post I will share the setup I have developed using Android/MQTT/RoR/MongoDB to capture the data into an intermediate datastore, from where it can later be pushed to any Electronic Data Capture system.