IoT on Google Cloud Platform

Google wants people to use its Cloud Platform to connect and manage IoT devices through IoT Core and to analyze the data those devices produce with other GCP components like BigQuery. While these products are fantastic, they also come with some real-world challenges.

IoT Core provides a managed service for connecting IoT devices. It speaks both HTTP and MQTT and offers one-click integration with Cloud Pub/Sub, which takes care of most of the infrastructure work. However, there are some limitations:

  1. You cannot publish or subscribe to arbitrary MQTT topics the way you would on a custom MQTT bridge. IoT Core mandates specific topic formats for sending telemetry and for receiving commands (see the sketch after this list).
  2. IoT Core uses public-private key cryptography to secure devices. Each device registers its public key with IoT Core and must authenticate by presenting a JWT signed with the corresponding private key before it can start sending and receiving messages.
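To make these two restrictions concrete, here is a minimal sketch in Python (using paho-mqtt and PyJWT) of connecting a device to the IoT Core MQTT bridge. The project, registry, and device names are placeholders; the topic paths and JWT-as-password scheme are the parts IoT Core enforces.

```python
import datetime
import jwt  # PyJWT; RS256 signing also needs the 'cryptography' package
import paho.mqtt.client as mqtt

PROJECT_ID = "my-gcp-project"   # placeholder
REGION     = "us-central1"
REGISTRY   = "my-registry"      # placeholder
DEVICE_ID  = "my-device"        # placeholder

def create_jwt(project_id, private_key_file, algorithm="RS256"):
    """The MQTT 'password' is a short-lived JWT signed with the device's
    private key; the matching public key is registered in IoT Core."""
    now = datetime.datetime.utcnow()
    claims = {
        "iat": now,
        "exp": now + datetime.timedelta(minutes=60),
        "aud": project_id,  # audience must be the GCP project ID
    }
    with open(private_key_file, "r") as f:
        private_key = f.read()
    return jwt.encode(claims, private_key, algorithm=algorithm)

# The client ID must follow IoT Core's full device path format.
client_id = (f"projects/{PROJECT_ID}/locations/{REGION}/"
             f"registries/{REGISTRY}/devices/{DEVICE_ID}")

client = mqtt.Client(client_id=client_id)
# The username is ignored by the bridge; the JWT goes in the password field.
client.username_pw_set(username="unused",
                       password=create_jwt(PROJECT_ID, "rsa_private.pem"))
client.tls_set()  # the bridge requires TLS
client.connect("mqtt.googleapis.com", 8883)

# Telemetry may only be published to /devices/{id}/events,
# and commands arrive only on /devices/{id}/commands/#.
client.publish(f"/devices/{DEVICE_ID}/events", '{"temperature": 21.5}', qos=1)
client.subscribe(f"/devices/{DEVICE_ID}/commands/#", qos=1)
client.loop_forever()
```

Compare this with a typical vendor bridge, where the device simply publishes to whatever topic its firmware was built with and authenticates with a plain username and password; that gap is exactly what makes retrofitting existing hardware painful.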
While these may seem like reasonable restrictions, one has to keep in mind that hardware vendors are still stuck in the 90s. Many of them use outdated languages and are reluctant to change any aspect of their device firmware to make it talk to IoT Core. Vendors also want to move to a subscription model in which they charge a monthly fee for access to an MQTT bridge they host and to which their devices send messages, so their reluctance to adapt to IoT Core is even higher. The only real option left is to build a component that bridges the gap between the vendor's MQTT bridge and IoT Core, which is a bad idea for many reasons. Google must publish a pre-approved list of vendors whose devices are compatible with IoT Core so that it can increase the adoption of GCP.

The other problem one faces while using GCP is building the UI. Most IoT customers want real-time graphs for their devices, and while Google does have Data Studio, it is not meant for real-time streaming data. To create a custom solution, one has to hook directly into Pub/Sub and plot the streaming data received from there (a sketch follows below). For historical graphs, Data Studio backed by BigQuery might be an option, but one has to be very wary of costs, which can escalate quickly with BigQuery. Also, the only way to embed a Data Studio chart in a web page is via an iframe, which makes for a poor user experience. Google must provide a managed service for building charts (real-time or historical) that is much better than what Data Studio currently offers.
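As an illustration of the custom route, here is a minimal sketch (Python, google-cloud-pubsub) of pulling device telemetry straight from a Pub/Sub subscription so it can be pushed to a real-time chart. The project and subscription names are placeholders, and the "push to the charting layer" step is left as a comment since it depends on your front end.

```python
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"             # placeholder
SUBSCRIPTION_ID = "device-telemetry-sub"  # placeholder: subscription on the IoT Core topic

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def callback(message):
    # Each Pub/Sub message carries one telemetry payload from a device.
    reading = json.loads(message.data.decode("utf-8"))
    # Forward the reading to the charting layer, e.g. over a WebSocket.
    print(reading)
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result()  # block and stream until interrupted
    except KeyboardInterrupt:
        streaming_pull_future.cancel()
```

This works, but it means owning a streaming backend, a WebSocket layer, and a charting front end yourself, which is exactly the kind of plumbing a managed charting service would remove.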

GCP is a fantastic platform for building IoT solutions, and if Google listens to its customers and provides a fully managed end-to-end solution for the most common scenarios, it will be a game changer and a real challenger to the Azure and AWS IoT offerings.
