realm of the tech industry.

      Both M2M and IoT are connectivity solutions that provide remote access to machine data, and both can exchange information among machines without human intervention. As a result, the two terms are often mistakenly used interchangeably. However, M2M is a predecessor of IoT; it revolutionized enterprise operations by enabling companies to monitor and manage their machines and hardware components remotely. M2M laid the foundation of machine connectivity upon which IoT was built. Nevertheless, IoT is the ultimate manifestation of machine connectivity.

[Figure: Schematic illustration of the characteristics of wearable technology: low power, sensing, connectivity, fashionability, intelligence, and comfort and ergonomics.]

      IoT and M2M diverge significantly in how they access devices remotely. M2M relies on point‐to‐point communications enabled by dedicated hardware components integrated within the machine; communication among these connected machines takes place over wired links or conventional cellular networks, using dedicated software. IoT, on the other hand, typically uses IP networks and integrates web applications to interface device/machine data to a middleware layer and, in the majority of cases, to the cloud.
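      To make this difference concrete, the following minimal sketch shows how an IoT device might push a sensor reading to a cloud middleware over an ordinary IP network. The endpoint URL, device identifier, and JSON field names are hypothetical placeholders rather than any particular platform's API; a real deployment would use a vendor's ingestion service with TLS and authentication.

```python
import json
import urllib.request

# Hypothetical cloud ingestion endpoint (placeholder, not a real service).
ENDPOINT = "https://cloud.example.com/api/v1/telemetry"

def publish_reading(device_id: str, temperature_c: float) -> int:
    """Send one sensor reading to the middleware as JSON over HTTP."""
    payload = json.dumps({
        "device_id": device_id,         # unique identity of the machine
        "temperature_c": temperature_c  # the sensed value
    }).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # The middleware answers with an HTTP status code (e.g. 200 on success).
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(publish_reading("machine-42", 21.5))
```

      Contrast this with M2M, where the same reading would travel over a dedicated point‐to‐point link to vendor‐specific software rather than through a generic web API.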

      It is worth noting that IoT is intrinsically more scalable than M2M, since cloud‐based architectures do not require the additional hard‐wired connections and subscriber identity modules (SIMs) that M2M depends on.

      1.1.2.6 IoT vs. Wearables

      It is also worth noting that not all wearable devices require connectivity; for example, a simple pedometer or an ultraviolet monitor can operate offline. Other wearables require only minimal connectivity.

      1.1.3 IoT: Historical Background

      The term “IoT” has not been around for very long. However, the idea of machines communicating with one another has been brewing since the telegraph was developed in the early 1800s.

      The first wireless transmission over radio took place in 1900, paving the way for endless innovations. This crucial ingredient of the future IoT was complemented by the advent of computers in the 1950s.

      An essential component of the IoT is the Internet itself, which began as a research effort at the Defense Advanced Research Projects Agency (DARPA) in 1962 and progressed into ARPANET in 1969. In the 1980s, service providers started promoting the commercial use of ARPANET, which matured into today's Internet.

      The term IoT was not officially coined until 1999, when Kevin Ashton, the executive director of the Auto‐ID Center at MIT, first described the Internet of Things in a presentation for Procter & Gamble. During his speech, Ashton stated:

      Today computers, and, therefore, the Internet, are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes of data available on the Internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code. The problem is, people have limited time, attention, and accuracy. All of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.

      Kevin Ashton also pioneered the use of radio‐frequency identification (RFID) in supply chain management and believed that it was essential for the deployment of the IoT. He concluded that if all devices were uniquely identified, computers could manage, track, and inventory them.

      A foundational element in realizing the IoT concept was the creation of IPv6. Steve Leibson of Intel Corporation once stated: “The address space expansion means that we could assign an IPv6 address to every atom on the surface of the earth, and still have enough addresses left to do another 100+ earths.” In other words, we have enough IP addresses to uniquely identify every object in the world for hundreds of years to come.
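      To put Leibson's claim in perspective, here is a quick back‐of‐the‐envelope calculation. Only the 128‐bit width of an IPv6 address comes from the standard; the Earth's surface area figure of roughly 5.1 × 10^14 square meters is an assumption supplied here for illustration.

```python
# An IPv6 address is 128 bits wide, so the space holds 2**128 addresses.
total_addresses = 2 ** 128
print(format(total_addresses, ".3e"))   # about 3.403e+38 addresses

# Assumed figure: Earth's surface area is roughly 5.1e14 square meters.
EARTH_SURFACE_M2 = 5.1e14
per_square_meter = total_addresses / EARTH_SURFACE_M2
print(format(per_square_meter, ".3e"))  # about 6.672e+23 addresses per m^2
```

      With on the order of 10^23 addresses available for every square meter of the planet's surface, exhaustion of the IPv6 address space is not a practical concern for the IoT.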

      After the invention of the World Wide Web by the British scientist Tim Berners‐Lee in 1989 and the launch of the commercial Global Positioning System, inventors were able to develop interconnected devices far more efficiently. One of the first examples was an Internet‐connected toaster introduced by John Romkey in 1990, which many consider the first “real” IoT device.

      In 1991, two academics working at the University of Cambridge's computer laboratory set up a camera that provided a live picture of a coffee pot (known as the Trojan Room coffee pot) to all desktop computers on the office network, saving people in the building the time and disappointment of finding the pot empty after making the trip. This setup was the direct inspiration for the world's first webcam. A few years later, the coffee pot was connected to the Internet and gained international fame until it was retired in 2001.

      In 2000, LG announced the Internet Digital DIOS, the world's first Internet‐enabled refrigerator. The refrigerator became a buzzword despite its commercial failure.

      In 2004, Walmart Inc. required its top suppliers to attach RFID tags to cases and pallets in place of barcodes by 2005 to enhance its supply chain operations. The suppliers were unhappy with the new requirements, as Electronic Product Code (EPC) tags were pricey and seemed unnecessary. Walmart subsequently offered to share point‐of‐sale information with the suppliers, which led to decreased merchandise theft and labor