Sometime in 1949, the bar code was invented, setting the ball rolling for machine-reading capabilities. In 1969, ARPANET was used to send the first electronic message. In 1973, the first patent for RFID was awarded. 1974 marked the beginnings of what we now know as the core backbone of internet communication: TCP/IP. Around the same time, Universal Product Codes (UPC) began to be used commercially in supermarkets. An early example of machine monitoring came when Carnegie Mellon researchers connected a campus vending machine to the network so that its stock and dispensing of products could be checked remotely. The Domain Name System (DNS) was introduced in 1984. Five years later, the World Wide Web (WWW) was introduced, although the first web page did not go live until 1991. In 1994, the first wearable webcam was created.

1995 saw the true beginnings of e-commerce, and wearable computing was being developed around the same time. LG announced plans for an internet-connected refrigerator. In 2002, Forbes published its first article on the Internet of Things, and by 2003 IoT had gone mainstream as a term. Walmart began using RFID extensively to track its inventory and supplies. IoT gained further impetus with the introduction of cheap, mass-produced controllers in 2005. The first IoT conference was held in 2006. Between 2008 and 2009, more things than people were connected to the internet; the age of connected things had officially arrived. In 2008, the US National Intelligence Council identified IoT as a disruptive technology. Finally, in 2011, the research firm Gartner added IoT to its famous Hype Cycle.

I could be wrong, but I’m fairly sure the phrase ‘Internet of Things’ started life as the title of a presentation I made at Procter & Gamble (P&G) in 1999. Linking the new idea of RFID in P&G’s supply chain to the then-red-hot topic of the Internet was more than just a good way to get executive attention. It summed up an important insight—one that 10 years later, after the Internet of Things has become the title of everything from an article in Scientific American to the name of a European Union conference, is still often misunderstood.

Kevin Ashton, RFID Journal 2009

Today, IoT is a term that is much used, but its implications are not well understood. Every company wants to get on the IoT bandwagon, yet few really know how to commercialize the phenomenon.

Commercial Potential

The internet (an ironic source, considering we are talking about IoT) is full of pundits and research firms talking up the commercial potential of this disruptive technology. Estimates of potential value range from 2 trillion to 20 trillion dollars by the year 2020. There is a lot of hype and speculation in the industry, resulting in wild estimates, most of which are not backed up by empirical evidence.

While the industry estimates look very promising, the reality is that most executives have not started any meaningful initiatives. Those that exist are still in pilot stages and have not yet realized the full potential laid out in the business case. The graphic below gives a general idea of how the industry views the potential in this space.

[Figure: From the Pundits… (industry estimates of IoT's commercial potential)]

Now that I have set some background and context, I will next discuss the shifting mindset around value and the commercialization models often used in IoT. Stay tuned…