
IoT on Google Cloud Platform

Google wants people to use its Cloud Platform for connecting and managing IoT devices through IoT Core, and to use other GCP components like BigQuery to analyze the data those devices produce. While these products are fantastic, they also come with some real-world challenges.

IoT Core provides a managed service for connecting IoT devices. It speaks both the HTTP and MQTT protocols and offers one-click integration with Cloud Pub/Sub, which takes care of most of the infrastructure work. However, there are some limitations:

  1. You cannot use arbitrary MQTT topics to send and receive messages as you would on a custom MQTT bridge. IoT Core mandates specific topic formats for publishing telemetry and for receiving commands.
  2. IoT Core uses public-private key cryptography to secure devices. Each device must first authenticate with a JWT signed by its private key (the matching public key is registered with IoT Core) before it can start sending and receiving messages.
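To make the two restrictions concrete, here is a minimal sketch of the identifiers IoT Core expects. The client ID, topic formats, and JWT claims below follow the documented IoT Core conventions; the actual signing step (RS256/ES256 via a library such as PyJWT) is only noted in a comment, and all project/device names are placeholders.

```python
import datetime

def mqtt_client_id(project_id, region, registry_id, device_id):
    # IoT Core requires this exact long-form client ID for MQTT connections.
    return (f"projects/{project_id}/locations/{region}"
            f"/registries/{registry_id}/devices/{device_id}")

def telemetry_topic(device_id):
    # Devices may only publish telemetry to this fixed topic format.
    return f"/devices/{device_id}/events"

def commands_topic(device_id):
    # Devices subscribe to this wildcard topic to receive commands.
    return f"/devices/{device_id}/commands/#"

def jwt_claims(project_id, ttl_minutes=20):
    # Claims for the connection JWT. The token must be signed with the
    # device's PRIVATE key (RS256 or ES256); the public key is what gets
    # registered with IoT Core. Actual signing would use e.g. PyJWT.
    now = datetime.datetime.utcnow()
    return {
        "iat": now,
        "exp": now + datetime.timedelta(minutes=ttl_minutes),
        "aud": project_id,  # audience is the GCP project ID
    }
```

A vendor device hard-coded to publish to, say, `factory/line1/temp` simply cannot speak this dialect without a firmware change, which is the crux of the adoption problem described above.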
While these may seem like reasonable restrictions, one has to keep in mind that many hardware vendors are still stuck in the 90's. They use outdated languages and are reluctant to change any aspect of their device firmware to make it talk to IoT Core. Vendors also want to move to a subscription model, charging a monthly fee for access to an MQTT bridge they host and to which their devices send messages, which makes them even more reluctant to adapt to IoT Core. The only real option left is to build a component that bridges the gap between the vendor's MQTT bridge and IoT Core, which is a bad idea for many reasons. Google should publish a pre-approved list of vendors whose devices are compatible with IoT Core; that would go a long way toward increasing GCP adoption.

The other problem one faces with GCP is building the UI. Most IoT customers want real-time graphs for their devices, and while Google does have Data Studio, it is not meant for real-time streaming data. To build a custom solution, one has to hook directly into Pub/Sub and plot the streaming data received from there. For historical graphs, Data Studio backed by BigQuery might be an option, but one has to be very wary of costs, which can escalate quickly with BigQuery. Also, the only way to embed a Data Studio graph in a web page is via an IFrame, which makes for a poor user experience. Google should provide a managed charting service (real-time and historical) that is substantially better than what Data Studio offers today.
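The "hook directly into Pub/Sub" approach usually boils down to a subscriber callback feeding an in-memory buffer that the chart reads from. Below is a minimal sketch of that buffer; the telemetry payload shape (`device_id`/`timestamp`/`value`) is a hypothetical example, and in a real deployment `on_message` would be invoked from a google-cloud-pubsub streaming-pull callback rather than called directly.

```python
import json
from collections import deque

class RollingSeries:
    """Keeps the most recent N telemetry points per device, the kind of
    in-memory buffer a custom real-time chart polls while a Pub/Sub
    subscriber pushes data in."""

    def __init__(self, max_points=100):
        self.max_points = max_points
        self.series = {}  # device_id -> deque of (timestamp, value)

    def on_message(self, data: bytes):
        # In a real subscriber this is called with message.data from the
        # streaming-pull callback (payload shape assumed here).
        point = json.loads(data)
        buf = self.series.setdefault(point["device_id"],
                                     deque(maxlen=self.max_points))
        buf.append((point["timestamp"], point["value"]))

    def latest(self, device_id):
        # Snapshot for the chart to render.
        return list(self.series.get(device_id, []))
```

The bounded deque keeps memory flat no matter how fast devices publish, which matters because a real-time chart only ever shows a sliding window anyway.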

GCP is a fantastic platform for building IoT solutions. If Google listens to its customers and provides a fully managed, end-to-end solution for the most common scenarios, it will be a game changer and a real challenger to the Azure and AWS IoT offerings.

