The Cloud and the IoT. Which one is real?


The Cloud is hardly the topic these days. As a concept it is no longer trending or new. We’ve discussed it a lot, and so have most key players like VMware, Google, Amazon and Oracle. It turned out to be a very good transition into something bigger, more powerful and way cooler: the Internet of Things. They are all about the #IoT now, yet their solutions hardly aim at it. Still, the Cloud remains a viable business pitch. I should know 😉

Before moving on, here’s a very cool explanation by VMware of what the Cloud is, or at least what it was five years ago. I saw it in person, and I’d still say it’s one of the funniest and most accurate definitions of the Cloud as the “service mesh that binds the IT and user worlds together.”

Five years ago we all envisioned a world of services. It was mostly about continuing the legacy of the SOA / Web Services era. Developers would tell you it was all about transitioning into a more stable UI/UX world where user interaction would become king. And they were right. Platform architects would tell you it was all about finally dropping the three-layer model (Infrastructure-Middleware-Apps) and moving into a controlled and secure pooling of services. And they were right. IT pros would say it was about monetizing services, brokering them rather than just building them. And of course, they too were right.

The fun part of it all is that the Internet of Things is not, per se, the evolution of the Cloud, though conceptually it might look just like that. It was in 1999 that Kevin Ashton, a British technology entrepreneur and co-founder of the Auto-ID Center, coined the term while presenting a new concept at a Procter & Gamble site. This is what he meant:

Today computers—and, therefore, the Internet—are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet were first captured and created by human beings—by typing, pressing a record button, taking a digital picture or scanning a bar code. Conventional diagrams of the Internet include servers and routers and so on, but they leave out the most numerous and important routers of all: people. The problem is, people have limited time, attention and accuracy—all of which means they are not very good at capturing data about things in the real world.

You can find more on Kevin and the origin of the term in this article he penned himself for the RFID Journal. It is relevant here mostly because it shows that the concept did not grow out of the continuous development of the Cloud or the brokering of services. It did envision devices creating more and more information at speeds we would find hard to control, but back then the concept was closer to security and identification as key aspects of the evolution of technology.

All of this makes me think that maybe the Cloud was indeed a transition rather than a thing. Maybe that is why it became so hard to actually discuss it, name it and work around it. Only a few people I’ve met were really clear on the concept itself, and almost every discussion we had on the topic ended up with us talking about either computing in the twentieth century (the late ’80s to ’99) or the actual, cool and modern Internet of Things concept.

As a corollary to my ramblings, and to close this article, I will let the greatest of them all, Professor Isaac Asimov, discuss the future, only 20 years ago, as if he were his famed character Hari Seldon and this were being projected in his Vault on Terminus.

 
