The real-time struggles of the Internet of Things

The dreams and promises of IoT are many, and the predictions paint a shining future. There is an underlying expectation that IoT will change the way we use, and behave on, the internet, since it is predicted to connect 10 times as many “things” to the Internet by 2020.

Everything from bracelets to cars and smart homes will be connected, and the lines between hardware, software, and mobile will blur.


From a consumer’s point of view, this all sounds glorious, and we can’t wait to test, try, and consume this new revolution. But behind the scenes, from the developers’ point of view, the questions are many and the discussions are plenty. How should these devices connect? How should we handle all the real-time data they produce? Though IoT is the new buzz, not much has happened yet.

So let’s get practical here: to get real value out of IoT we need real-time data, and lots of it. User actions, behavior, habits, needs, preferences: the more data, the better the user experience and the more satisfied the customer. Based on that data, we need to perform smart analysis, and an action needs to be taken based on the analysis. The complexity is that all of this needs to happen within a fraction of a second.

And the reason we say “complexity” is that a new generation of technology can’t be handled by an old generation of software, just like evolution. So, what demands does this place on the new generation of software? The new generation of software is what we call intelligent software; it needs to:

  • handle large amounts of data
  • handle data in real time
  • do real-time analysis
  • act on the data on the spot
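The four demands above can be sketched as a single ingest-analyse-act pass over incoming readings. This is a minimal illustration, not any particular platform’s API; the names (`handle_reading`, the device IDs, the threshold) are hypothetical.

```python
# Hypothetical sketch of the four demands: keep data in memory,
# analyse each sample as it arrives, and act on it on the spot.

latest = {}  # in-memory state: the current sample per device

def handle_reading(device_id, value, threshold=30.0):
    """Store the sample, analyse it, and return an action in one pass."""
    latest[device_id] = value          # handle data in real time
    if value > threshold:              # real-time analysis
        return f"alert:{device_id}"    # act on the data on the spot
    return "ok"

print(handle_reading("thermostat-1", 21.5))  # ok
print(handle_reading("thermostat-1", 35.0))  # alert:thermostat-1
```

The key point is that storage, analysis, and action happen in the same pass, with no hand-off to a separate batch system.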

We are rapidly getting to the point where wireless networks have the capacity to handle very large volumes of devices. To handle large amounts of data in real time, we need extremely fast software platforms that store only the one sample of the data that is needed. To do real-time analysis, and to be able to act on the spot on that analysis, we need smart systems with very high performance.
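One way to read “store only one sample of the data needed” is a store that overwrites the current value per device instead of appending a history, so memory grows with the number of devices rather than the number of readings. A minimal sketch, with made-up device names:

```python
# Sketch of a keep-only-the-latest-sample store: memory is
# O(number of devices), not O(number of readings).

class LatestSampleStore:
    def __init__(self):
        self._samples = {}

    def ingest(self, device_id, value):
        # Overwrite in place instead of appending to a log.
        self._samples[device_id] = value

    def current(self, device_id):
        return self._samples.get(device_id)

store = LatestSampleStore()
for v in (18.0, 19.5, 22.1):        # three readings from one device
    store.ingest("sensor-a", v)

print(store.current("sensor-a"))    # 22.1 -- only the newest sample survives
print(len(store._samples))          # 1
```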

These demands call for a dramatic change in the way we build our systems. We need to shrink the distance between data, software, and hardware, and to accomplish this we need to build these systems using in-memory computing, keeping the database and the applications as close to each other as possible. We need to minimize the interfaces between system parts and keep our data in one copy, in one place.
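To illustrate the idea of having the database and the application as close as possible, here is a sketch using an in-memory SQLite database that lives inside the application process, so queries involve no network hop. The table and data are invented for the example; a real platform would use a purpose-built in-memory engine.

```python
import sqlite3

# In-memory database colocated with the application: one copy of
# the data, in one place, inside the same process.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device TEXT PRIMARY KEY, value REAL)")

def ingest(device, value):
    # The primary key keeps a single row per device: new readings
    # replace old ones instead of accumulating.
    db.execute("INSERT OR REPLACE INTO readings VALUES (?, ?)", (device, value))

ingest("car-42", 88.0)
ingest("car-42", 91.5)
row = db.execute("SELECT value FROM readings WHERE device = ?", ("car-42",)).fetchone()
print(row[0])  # 91.5
```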

