Book Review - The Einstein Prophecy

I have been fascinated by science fiction stories since my childhood. From Jules Verne's wonderful "Twenty Thousand Leagues Under the Sea" to Andy Weir's "The Martian", sci-fi books have never bored me. That is, until I read Robert Masello's "The Einstein Prophecy".

This book was recommended by the Kindle bookstore as the number-one popular book in the sci-fi category. At a slim 326 pages in print, I bought it immediately on my Kindle.

The story is set in the World War II era, with the Allied powers facing off against the Axis powers. It moves briskly at first, with good descriptions of the setting and the war. Our hero is Lucas, a US military officer. Yes, despite the book's title, Einstein is not a major player in the story. Nor does he make a prophecy. Lucas is trying to find an ancient object that will allow the Allied forces to defeat Germany. Apparently it is so important that even Hitler is looking for it. After setting up this intriguing plot, however, the story begins to meander. If you have seen The Mummy series of movies, you will know what I am talking about.

To be absolutely fair to the book, the story is built on a great premise that could have had a lot of potential if presented correctly. Instead, for nearly three-fourths of the book, the story does not reveal what the secret object is, and not much happens elsewhere in the plot either. In many places the author starts describing the trees, birds, and surroundings just when the story should be revealing the next big secret. It almost seemed that the author had run out of ideas and was simply filling pages to keep the publisher happy!

The ending is underwhelming and not worth the time you spend getting there. If you are going to read the book anyway, keep your expectations very low. And don't expect it to connect meaningfully to Einstein or to actual historical events.

Good luck!

Rating - 2/5
