Both M2M and IoT are connectivity solutions that provide remote access to machine data, and both can exchange information among machines without human intervention. As a result, the two terms are often mistakenly used interchangeably. However, M2M is a predecessor of IoT; it revolutionized enterprise operations by enabling organizations to monitor and manage their machines and hardware components remotely. M2M laid the foundation of machine connectivity upon which IoT was built. Nevertheless, IoT remains the broader and more complete manifestation of machine connectivity.
Figure 1.3 Characteristics of wearable technology.
The main objective of M2M is to connect a machine/device to another machine (typically in an industrial setting) via a cellular or wired network so that its status can be monitored and its data collected remotely. IoT, by contrast, is a universal market technology that aims to serve consumers, industries, and enterprises. Consumer IoT connects users to their devices and enables remote access, while enterprise and industrial IoT go further by allowing remote tracking, control, and management.
IoT and M2M diverge most clearly in the way they access devices remotely. M2M relies on point-to-point communications enabled by dedicated hardware components integrated within the machine; communication among these connected machines takes place over wired or conventional cellular networks using dedicated software. IoT, on the other hand, typically uses IP networks and integrates web applications to interface device/machine data with middleware and, in the majority of cases, with the cloud.
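As a rough illustration of this architectural difference, the short sketch below shows how an IoT-style device might report a sensor reading to a cloud or middleware endpoint over an ordinary IP network using a web (HTTP) interface. The endpoint URL, device identifier, and payload fields are purely hypothetical assumptions for illustration, not the API of any particular platform.

# Minimal sketch of an IoT-style device reporting over IP to a cloud/middleware
# endpoint via HTTP. The URL, device ID, and JSON fields are illustrative
# assumptions only; real platforms define their own APIs and authentication.
import json
import time
import urllib.request

CLOUD_ENDPOINT = "https://example-iot-platform.com/api/telemetry"  # hypothetical

def read_temperature() -> float:
    """Placeholder for reading a real sensor attached to the device."""
    return 22.5

def publish_reading(device_id: str) -> int:
    payload = {
        "deviceId": device_id,
        "timestamp": int(time.time()),
        "temperatureC": read_temperature(),
    }
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # HTTP status code returned by the platform

if __name__ == "__main__":
    print(publish_reading("thermostat-42"))

An equivalent M2M design would instead rely on a dedicated, pre-provisioned link (for example, a cellular modem with its own SIM) between the machine and one specific back-end system.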
It is worth noting that IoT is intrinsically more scalable than M2M, since cloud-based architectures do not need the additional hard-wired connections and subscriber identity modules (SIMs) that M2M deployments require.
1.1.2.6 IoT vs. Wearables
Despite the commonalities, there are clearly substantial differences between wearable technology in the context of fitness trackers and IoT in the context of manufacturing processes or smart cities. In fact, many experts in the field argue that wearables fall under the umbrella of IoT. One key difference worth highlighting is that most wearables rely on a gateway device, such as a smartphone, for configuration and connectivity, and in most cases to enable features and process data. It is this M2M aspect that makes wearables a separate class of devices, and that is why we prefer to treat them as two technologies with two sets of characteristics.
It is also worth noting that not all wearable devices require connectivity; a simple pedometer or an ultraviolet monitor, for example, can operate offline, while other wearables require only minimal connectivity.
Although IoT and wearable devices have a lot in common in terms of design, components, and the technologies and protocols used, there are still real differences that architects and designers need to be aware of. Figure 1.4 shows a table summarizing the main differences between M2M, IoT, and Wearable Technology.
Figure 1.4 A summary of the main differences between M2M, IoT, and Wearable Technology.
1.1.3 IoT: Historical Background
The term “IoT” has not been around for very long. However, the idea of machines communicating with one another has been brewing since the telegraph was developed in the early 1800s.
The first wireless transmission over a radio took place in 1900, bringing about endless innovations. This crucial ingredient of the future IoT was complemented by the inception of computers in the 1950s.
An essential component of the IoT is the Internet itself, which was initiated as part of the Defense Advanced Research Projects Agency (DARPA) in 1962 and then progressed into ARPANET in 1969. In the 1980s, service providers started promoting the commercial use of ARPANET, which matured into today's Internet.
The term IoT was not officially coined until 1999 when Kevin Ashton, the executive director of Auto‐ID Labs at MIT, was the first to describe the Internet of Things in a presentation for Procter & Gamble. During his speech, Ashton stated:
Today computers, and, therefore, the Internet, are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes of data available on the Internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code. The problem is, people have limited time, attention, and accuracy. All of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.
Kevin Ashton also pioneered the use of radio-frequency identification (RFID) in supply chain management and believed that it was essential for the deployment of the IoT. He concluded that if all devices were uniquely identified, computers could then manage, track, and inventory them.
A foundational element in realizing the IoT concept was the creation of IPv6. Steve Leibson of Intel Corporation once stated: “The address space expansion means that we could assign an IPv6 address to every atom on the surface of the earth, and still have enough addresses left to do another 100+ earths.” In other words, we have enough IP addresses to uniquely identify every object in the world for hundreds of years to come.
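A rough back-of-the-envelope check shows why this often-quoted claim is plausible (assuming, purely for illustration, an Earth surface area of about $5.1 \times 10^{14}\,\mathrm{m}^2$ and on the order of $10^{20}$ atoms per square metre in a single atomic layer):

\[
2^{128} \approx 3.4 \times 10^{38} \text{ addresses}, \qquad
5.1 \times 10^{14}\,\mathrm{m}^2 \times 10^{20}\,\mathrm{atoms/m^2} \approx 5 \times 10^{34} \text{ surface atoms},
\]
\[
\frac{3.4 \times 10^{38}}{5 \times 10^{34}} \approx 7 \times 10^{3} \text{ Earth surfaces},
\]

which is consistent with the “100+ earths” figure in the quote.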
One of the early examples of an Internet of Things dates back to 1982, when four students from the Computer Science department at Carnegie Mellon University installed switches in a Coca-Cola vending machine. The students could connect to the machine over ARPANET and remotely check whether drinks were available, and whether they were cold, before making the trip to the machine. This experiment inspired numerous inventors around the world to devise their own connected appliances.
After the invention of the World Wide Web by the British scientist Tim Berners-Lee in 1989 and the launch of the commercial Global Positioning System, inventors were able to develop interconnected devices far more efficiently. One of the first examples was an Internet-connected toaster introduced by John Romkey in 1990, which is considered by many to be the first “real” IoT device.
In 1991, two academics working at the Computer Laboratory at the University of Cambridge set up a camera to provide a live picture of a coffee pot (known as the Trojan Room coffee pot) to all desktop computers on the office network, saving people working in the building the time and disappointment of finding the pot empty after making the trip. This invention was a true inspiration for the world's first webcam. A few years later, the coffee pot was connected to the Internet and gained international fame until it was retired in 2001.
In the year 2000, LG announced the Internet Digital DIOS, the world's first Internet-enabled refrigerator. The refrigerator became a buzzword despite its commercial failure.
In 2004, Walmart Inc. required its top suppliers to assign RFID tags to cases and pallets in place of barcodes by 2005 to enhance their supply chain operations. The suppliers were unhappy with the new requirements, as Electronic Product Code (EPC) tags were pricey and seemed unnecessary. Walmart subsequently offered to disclose point-of-sale information to the suppliers, which led to decreased merchandise thefts and labor