For manufacturers, smart cities, trucking companies, public safety agencies, and other organizations, the Internet of Things (IoT) provides enormous amounts of new data. The drawback? IoT provides enormous amounts of new data.
Data is only as valuable as the organization’s ability to analyze it. That’s always been the case, but it’s worth remembering today when developing an IoT strategy because the data volumes are unprecedented. Here are three reasons why:
- Mobile networks keep getting faster. Many IoT applications work just fine at 3G or even 2G speeds, but many more, such as surveillance camera video, require 4G or 5G. Faster networks enable bandwidth- and data-intensive IoT applications that weren’t practical, or even possible, until recently.
- The end-user cost of IoT modules and other endpoint hardware keeps decreasing. LTE technology has spent over a decade riding down the cost curve, and 5G has begun its descent. The lower the cost, the more affordable it is to do mass-scale deployments — the kinds that can produce terabytes of data each month. Roughly 5.8 billion IoT endpoints were in service worldwide by the end of 2020, Gartner estimates. That’s a 21 percent increase from 2019.
- Public cloud storage options abound, with prices steadily declining. The cloud storage market will have a 21.9 percent compound annual growth rate between 2019 and 2027, when it will top $222 billion, Allied Market Research says. The combination of wide availability and declining cost makes it tempting to simply upload everything the IoT devices can produce.
These three trends create a host of pitfalls. For starters, “decreasing” and “declining” aren’t the same as free. There are costs associated with uploading and storing all that data, and that overhead can and often does undermine an IoT project’s ROI.
Another pitfall is amassing far more data than the organization can analyze to find the actionable insights that justify the investment. Data scientists are in short supply and thus command high salaries, so assembling a small army of them — plus equipping them with the necessary analytics tools — is difficult and expensive. As the data scales up, the human-centric analytics model quickly falls apart.
Taking IoT to the Edge
Savvy organizations are avoiding these pitfalls by making artificial intelligence (AI) and edge computing two key components of their IoT strategies. Computational resources, and the analytical capabilities they support, traditionally reside in central locations, such as a security operations center (SOC). Edge computing moves those resources and capabilities out to where the IoT devices are.
To understand the operational and bottom-line benefits, take the example of a retail chain’s video surveillance system, used not only for security but also for understanding shopper behavior, such as dwell times and the paths consumers take through a store. Instead of the cameras uploading all of their video in real time to a central location for humans to review, the AI at the edge is trained to look for specific conditions, such as people where there aren’t supposed to be any or a bag left unattended. When the AI detects those conditions, it turns on the live feed to the SOC and alerts a human to take a look. Or, if it’s trained to detect shoppers congregating around a display or taking a certain item off the shelf, the AI uploads that footage to cloud storage. Data scientists could then use analytics tools to understand, for example, whether certain demographics were particularly attracted to that display or item.
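The edge-triggering logic described above can be sketched as a simple decision step that runs on the device. This is a minimal illustration, not any vendor's implementation; the function name, event labels and confidence threshold are all hypothetical assumptions.

```python
# Sketch of an AI-powered edge trigger for a surveillance camera.
# Event labels, threshold and function names are illustrative only.

SECURITY_EVENTS = {"person_in_restricted_area", "unattended_bag"}
ANALYTICS_EVENTS = {"shoppers_at_display", "item_picked_up"}

def handle_frame(events, threshold=0.8):
    """Decide what to do with one frame's worth of detector output.

    `events` is a list of (label, confidence) pairs produced by an
    assumed on-device model. Returns the action the edge node takes.
    """
    for label, confidence in events:
        if confidence < threshold:
            continue  # ignore low-confidence detections
        if label in SECURITY_EVENTS:
            # Stream live video to the SOC and alert a human reviewer.
            return "stream_to_soc"
        if label in ANALYTICS_EVENTS:
            # Upload only this clip to cloud storage for later analysis.
            return "upload_clip"
    # Default: keep the video local; nothing is sent upstream.
    return "discard"
```

For example, `handle_frame([("unattended_bag", 0.93)])` returns `"stream_to_soc"`, while a low-confidence detection such as `handle_frame([("person_in_restricted_area", 0.4)])` returns `"discard"`, so nothing leaves the device.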
In these scenarios, the AI-powered edge saves bandwidth and storage — and thus money — because the video is uploaded only when it meets the right criteria. These benefits are helping drive a major enterprise trend. In 2018, Gartner estimated that roughly 10 percent of enterprise-generated data was created and processed outside a traditional centralized data center or cloud. By 2025, it expects that amount to hit 75 percent.
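A back-of-the-envelope calculation shows the scale of those bandwidth and storage savings. The numbers below (a 2 Mbit/s camera, twenty 30-second flagged clips per day) are assumed for illustration, not drawn from any deployment.

```python
# Compare continuous upload vs. edge-triggered upload for one camera.
# Bitrate and clip counts are assumed, illustrative figures.

BITRATE_MBPS = 2.0              # assumed camera bitrate in Mbit/s
SECONDS_PER_MONTH = 30 * 24 * 3600

def gb_per_month(streamed_seconds):
    """Monthly upload volume in GB for the given seconds of video sent."""
    return streamed_seconds * BITRATE_MBPS / 8 / 1000  # Mbit -> MB -> GB

continuous = gb_per_month(SECONDS_PER_MONTH)   # stream everything, 24/7
edge = gb_per_month(20 * 30 * 30)              # 20 x 30 s clips/day

print(f"continuous upload: {continuous:.0f} GB/month")  # 648 GB/month
print(f"edge-triggered:    {edge:.1f} GB/month")        # 4.5 GB/month
```

Under these assumptions the edge-triggered approach uploads roughly 4.5 GB per camera per month instead of about 648 GB, a reduction of more than two orders of magnitude before any storage costs are counted.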
“As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing,” said Santhosh Rao, senior research director at Gartner.
Building an AI-Powered Edge
What does an AI-powered edge look like in practice? For a video-based deployment such as the retail example above, key capabilities include:
- Stereo vision, including 3D measurement (size, position, velocity).
- Scene illumination control, dark- and flat-field correction, and region-based auto exposure, all of which are critical for overcoming challenging environments, such as heavy shadowing.
- Understanding what an image shows and then classifying it accordingly, such as a shopper from the target demographic.
- Action recognition, such as alerting security staff when people are seen at a loading dock when the store is closed.
- Human pose estimation, which could be used to analyze a person lying on the ground due to injury.
- Audio recognition, such as glass breaking, alarms blaring and other sounds that justify turning on the video feed and alerting a human.
By making AI and edge computing key parts of their large-scale IoT deployments, organizations can save money, maximize ROI and ensure that they don’t get swamped by IoT’s data deluge. In the process, they get the kinds of deep, actionable insights they need to be competitive and innovative.
At Taoglas, we specialise in designing AI-driven IoT solutions at the edge. We can demystify the process for our customers, from the initial strategy definition right through to the design, build, deployment and management of IoT projects.
We are hardware and software experts and will work with you from the beginning of your IoT project to ensure connectivity is properly integrated into your device, delivering a connected, easy-to-use, low-power, secure and market-ready solution. Taoglas can provide finished IoT devices for immediate deployment, as well as EDGE™ IoT Starter Kits for fast prototyping. With a flexible offering covering most connectivity and global positioning standards, vision AI and sensors, the Taoglas EDGE™ portfolio is a complete edge-to-cloud enablement platform comprising hardware, a cloud-based management platform and connectivity.