Internet of Things (IoT) devices are gaining a great deal of traction among enterprises, as valuable information lives on the edge -- embedded in products, on factory floors, at remote work sites, and within vehicles. To fully leverage the benefits of edge computing, careful planning is needed, as hidden or unexpected costs can arise, making your edge initiatives more expensive than anticipated.
What is edge computing?
Broadly defined, edge computing encompasses the generation, collection, and analysis of data where that data is created, in contrast to processing it in centralized locations such as data centers. Internet of Things devices typically involve some edge computing, since some data can be processed on-device; in other circumstances, data is transmitted in real time, or data aggregates are transmitted at programmable intervals, to a central system.
SEE: Tech budgets 2020: A CXO's guide (ZDNet/TechRepublic special feature) | Download the free PDF version (TechRepublic)
Some edge computing (or IoT) devices may be single-use, or have a shorter shelf life than standard computing devices, due to limitations such as non-replaceable batteries.
How much can edge computing cost?
Realistically, the costs of edge computing can vary wildly, depending on the size and scale of your deployment, the amount of data being collected and processed, and the geographic location of your edge computing deployment.
First, consider the number of sensors that are needed. In certain deployments, this can work as a hub-and-spoke model, with smart tags that are recorded when they pass through exits (or other physical thresholds in a building). For manufacturing settings -- among others -- this can also work with barcodes and scanners, minimizing the cost of single-use tags. Expenses associated with RFID-enabled smart tags can add up quickly, as standard tags run about 15 cents at wholesale prices.
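Those per-tag costs compound with volume. A minimal sketch, using the article's roughly 15-cent wholesale price and a hypothetical daily tagging volume:

```python
# Rough estimate of annual spend on single-use RFID smart tags.
# TAG_PRICE reflects the ~15-cent wholesale figure cited above;
# the daily volume below is a hypothetical example.
TAG_PRICE = 0.15  # USD per standard tag, wholesale

def annual_tag_cost(tags_per_day, price_per_tag=TAG_PRICE, days=365):
    """Total yearly tag spend at a steady daily tagging volume."""
    return tags_per_day * days * price_per_tag
```

At 1,000 tagged items per day, tags alone would run roughly $54,750 a year -- a recurring cost that is easy to overlook when budgeting only for readers and infrastructure.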
The amount of data being collected and processed can also play a role -- for IoT devices that trigger serverless events in Lambda, the frequency and length of those events may be a consideration. When properly optimized, serverless computing should be more cost-effective than fixed-rate virtual machines, though it comes with inherent variability, as charges are based on the number of events and their run time.
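That per-event billing model can be sketched with a simple estimator. The rates below are illustrative assumptions loosely modeled on published serverless pricing (a per-invocation charge plus a per-GB-second compute charge); check your provider's current price list before budgeting:

```python
# Back-of-envelope monthly serverless cost: invocations + compute time.
# Both rates are assumptions for illustration, not quoted prices.
REQUEST_PRICE = 0.20 / 1_000_000   # USD per invocation (assumed)
GB_SECOND_PRICE = 0.0000166667     # USD per GB-second (assumed)

def monthly_serverless_cost(invocations, avg_duration_s, memory_gb):
    """Estimate monthly cost from event count, duration, and memory size."""
    compute = invocations * avg_duration_s * memory_gb * GB_SECOND_PRICE
    requests = invocations * REQUEST_PRICE
    return compute + requests
```

The variability the article notes is visible here: double the event count or the event duration and the bill scales with it, unlike a fixed-rate VM.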
Likewise, location makes a significant difference for your deployment. For on-premises devices hardwired to a network or using Wi-Fi, connectivity is essentially free -- there is no additional transit cost beyond the power and internet service already available at your location. For remote devices, using mobile (cellular) networks typically incurs a cost, as charges are usually metered on a per-gigabyte basis. This being the case, streaming video over mobile networks can become expensive quickly, as it can consume multiple gigabytes per day.
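A quick sketch shows how metered transit adds up. The per-gigabyte rate here is a hypothetical figure; actual carrier and IoT data-plan pricing varies widely:

```python
# Back-of-envelope monthly cost for metered mobile data.
# price_per_gb is an assumed illustrative rate, not a quoted plan price.
def monthly_mobile_cost(gb_per_day, price_per_gb, days=30):
    """Monthly transit cost for a device with steady daily data usage."""
    return gb_per_day * days * price_per_gb
```

A remote camera streaming 2 GB a day at a hypothetical $8 per gigabyte would cost $480 a month in transit alone -- before any compute or storage charges.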
SEE: From Cloud to Edge: The Next IT Transformation (ZDNet Special Feature)
For agricultural settings, using sensors -- such as digital thermometers, hygrometers, and rain gauges -- that programmatically track and report changes in the environment is less expensive to operate on a yearly basis, but incurs a modestly higher upfront cost. Paying attention to operational costs -- not just material or hardware costs -- is vital to the success of your edge computing projects.
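The upfront-versus-operating tradeoff is easiest to judge as total cost over the device's service life. A minimal sketch, with all figures hypothetical:

```python
# Total cost of ownership over a device's service life (hypothetical figures).
def total_cost(upfront, annual_opex, years):
    """Upfront purchase price plus cumulative yearly operating cost."""
    return upfront + annual_opex * years

# A pricier sensor with low running costs can undercut a cheap one
# that is expensive to operate:
premium = total_cost(upfront=400, annual_opex=20, years=5)   # 500
basic = total_cost(upfront=150, annual_opex=120, years=5)    # 750
```

Over a five-year horizon, the sensor with the higher sticker price is the cheaper choice -- the pattern the article describes for agricultural deployments.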
How can an edge computing strategy save your business money?
Collecting data programmatically at the edge can eliminate the cost of manual data collection. Edge computing and IoT deployments typically include the use of serverless functions in circumstances when the actual compute portion cannot be completed on-device. In these instances, the IoT device acts as a data collector and offloads the computational heavy lifting to the cloud.
Serverless is a computing paradigm in which a compute instance is spun up when (and only when) an application runs. In contrast to running a VM to host an application 24/7, pairing serverless computing with IoT devices can save significant amounts in cloud costs.
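That comparison can be made concrete with a rough side-by-side estimate. Both rates below are assumptions for illustration (an always-on small VM versus per-GB-second serverless billing), not quoted cloud prices:

```python
# Compare an always-on VM with per-event serverless billing.
# Both rates are illustrative assumptions, not quoted prices.
VM_HOURLY = 0.0416          # USD/hour for a small always-on VM (assumed)
GB_SECOND = 0.0000166667    # USD per GB-second of serverless compute (assumed)
HOURS_PER_MONTH = 730

def vm_monthly_cost():
    """Cost of hosting an application 24/7 on a fixed-rate VM."""
    return VM_HOURLY * HOURS_PER_MONTH

def serverless_monthly_cost(events, seconds_per_event, memory_gb):
    """Cost of running only when events arrive."""
    return events * seconds_per_event * memory_gb * GB_SECOND
```

Under these assumed rates, even a million half-second events a month at 512 MB costs a few dollars, well under the roughly $30 for the always-on VM -- though at high enough sustained volume the comparison can flip, which is why the event profile matters.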
How should CFOs start thinking about an edge computing strategy?
For edge computing, as with any digital transformation initiative, the first step is ensuring that new devices can interoperate with existing equipment: while bridges and other types of digital glue can physically connect devices, it may also take custom programming to make the data generated by edge/IoT devices comprehensible to existing systems. Because of these potential incompatibilities, edge computing budgets and strategy should be given extra consideration during hardware and network refresh initiatives.
Likewise, ensure that you are budgeting for the longitudinal costs associated with IoT devices. Budgeting properly for mobile data usage for remote devices, single-use tags for RFID systems, and similar recurring expenses is vital to ensuring that the business use case that necessitates edge technology is properly met, and that the initial investment in the technology is fully utilized.
In a slightly different sense, CFOs looking to utilize edge computing for their own needs -- not just for budgetary planning -- should consider the use of RFID tag technology to track "what deliveries or shipments have made it," according to Holger Mueller, principal analyst and vice president at Constellation Research. Mueller notes that RFID tags can also be used to track whether (and when) employees have shown up for work at an office, but that attempts to do this can be limited by privacy regulations and works council requirements in Europe.