Ask the Expert: What’s the Difference Between Intelligent Automation (IA) and Artificial Intelligence (AI)?
Spoiler alert: no robot is “smart” from the start.
As a society we’re fascinated by the prospect of our greatest sci-fi visions coming to life right in front of our eyes. Robotics are being used in high school gaming competitions and surgeries alike, and we can call upon artificial intelligence (AI) anytime we want to remember what we have to do each day – and actually do those things. (“Alexa, order dinner…call dad…set the alarm.”)
Yet, there’s a lot of confusion about what AI can do to help organizations improve their operational efficiency and how it differs from intelligent automation (IA), an increasingly used term, especially here at Zebra. Hopefully our recent conversation with Shawn Harris, one of Zebra’s intelligent automation experts, will help you fully understand the difference – and correlation – between IA and AI as well as their mutual connections with computer vision, augmented reality (AR) and prescriptive analytics technologies.
Your Edge Blog Team: At Zebra, we talk every day about the potential of intelligent automation. We’ve even introduced several intelligent automation solutions in recent months. Yet, many people aren’t quite clear on what we mean when we say intelligent automation. Are we talking about robots or co-bots? Is it a term used more to describe automated industrial systems, such as an Industry 4.0 production line? Or is intelligent automation something entirely different?
Shawn: In a way, it’s all of the above. Intelligent automation is the way by which we pair “man with machine” to help organizations better sense, analyze and act on opportunities and issues. Though we often hear Zebra talk about intelligent automation in terms of robotics, it’s not just about the “machine” you may see roaming a warehouse or retail store. Intelligent automation can be derived from and delivered via a host of technologies, including computer vision, sensors, AR, machine learning and AI.
Your Edge Blog Team: Is IA technically the same thing as AI, then?
Shawn: While intelligent automation and AI are indeed distinct technologies, there is significant overlap between them. Here is how I think about it: AI is now the world’s greatest predictor. (Prediction is everywhere.) Intelligent automation in both physical form (e.g., robots, autonomous cars, drones) and software form (e.g., robotic process automation or “RPA,” recommendation engines, high-frequency trading) will often leverage AI to make a series of predictions that lead to an action, a series of actions or a combination of simultaneous actions. AI, in other words, is an embedded technology – a driving force in making other technologies “intelligent” and, in many cases, allowing decisions and actions to be automated.
If you think about this in terms of physical intelligent automation – say, in the form of a robot – then the series of actions resulting from AI predictions may be something like: “lift, put down, stop, go left, go right, slow down.” A software-based intelligent automation solution, on the other hand, may dictate actionable steps such as: “look up current work-in-progress orders, create a new order, predict a promise-to-ship date, locate the promise-to-ship field and populate it with the predicted date.” These are all things being done through a desktop interface that was previously used by an order entry clerk.
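To make the software side of that concrete, here is a minimal sketch of an RPA-style “predict, then act” step in the spirit of the promise-to-ship example. The order structure, the field names and the stand-in “model” are all hypothetical, invented purely for illustration – this is not Zebra code or a real RPA framework:

```python
# Hypothetical sketch: a software bot uses a model's prediction to fill in
# an order's promise-to-ship field, a step a clerk once performed by hand.
from datetime import date, timedelta

def predict_ship_days(order):
    """Stand-in for a trained model: predict fulfillment time from order size.
    (The relationship here is invented, not learned from real data.)"""
    return 2 + order["item_count"] // 10

def populate_promise_to_ship(order, today):
    """The 'act' step: write the predicted ship date into the order field."""
    days = predict_ship_days(order)
    order["promise_to_ship"] = (today + timedelta(days=days)).isoformat()
    return order

# Example run with a made-up order
order = {"order_id": "A-1001", "item_count": 25, "promise_to_ship": None}
populate_promise_to_ship(order, date(2020, 3, 2))
```

In a real deployment, `predict_ship_days` would be a trained model and the “populate” step would drive the same desktop interface the clerk used, but the predict-then-act shape is the same.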
If you remember algebra, AI is essentially y = mx + b on steroids, with a sprinkling of calculus – specifically, differentiable functions. It can fit a curve to the data it’s trained on. When you give it new data that falls within the training distribution, it can produce a prediction along that curve with accuracy determined to be sufficient.
Your Edge Blog Team: Last year, Yan Zhang was kind enough to help us understand the difference – and correlation – between machine learning and AI. But what we didn’t talk about was how machine learning, AI and other “intelligence” technologies are used to power intelligent automation solutions. Can you dig into that a bit?
Shawn: Vision-based technologies such as computer vision and augmented reality are driving the intelligent automation platforms used to optimize business operations. Zebra’s new SmartSight™ solution is a great example. At the heart of this offering is the EMA50.
Your Edge Blog Team: And, EMA stands for “enterprise mobile automation,” correct?
Shawn: Yes. EMA is actually one form of intelligent automation, uniquely designed to roam the aisles of grocery or retail stores to look for irregularities in item pricing, planogram compliance and out-of-stock or misplaced inventory. Though technically “not a robot,” the EMA50 looks like what many people picture when they hear the term robot. The difference is that the EMA50 is smart. We’ve integrated computer vision technology to make this automated vision-based solution intelligent. The EMA50 provides a direct line of sight into what’s happening, or what’s not happening, within the store. It sees things that store associates may not. When it does, it intelligently and automatically prompts human workers to take corrective actions with a great deal of specificity via a set of instructions sent to their handheld mobile computers.
Your Edge Blog Team: So, if there’s a jar of peanut butter sitting on an end cap shelf in the baby section, the EMA50, via the SmartSight system, will alert a human worker to the misplaced item?
Shawn: Exactly. It will tell a store associate that it spotted the peanut butter on the end cap and then task that person to return the jar to a very precise aisle and shelf location via an alert on the worker’s handheld mobile computer. Once that task is complete, the store associate can then confirm resolution on their mobile computer.
Your Edge Blog Team: Is intelligent automation being used in other ways today?
Shawn: Oh yes, there are many applications of intelligent automation today. Organizations are starting to realize how beneficial it can be to automate the smart tasking of workers. For some, labor pools are tight and they need to remove mundane tasks from employees’ workloads so those workers can be reassigned to the higher-value tasks that can only be accomplished with human interaction. For others, there is simply greater pressure to deliver a higher-quality customer experience.
Retail store associates need a way to ensure shelves are stocked, price tags are accurate and they remain accessible to help shoppers with inventory questions. And anyone involved in the e-fulfillment process must be able to process orders at lightning speed with complete accuracy. Manufacturers, warehouse and distribution center operators, transportation companies and delivery services are facing operational challenges that no one could have predicted a decade ago, so they are hyper-focused on technology solutions that improve process efficiency, worker productivity and overall supply chain synergy as demand continues to rise at record rates. No one wants to be faulted for a mispacked item or a missed delivery deadline.
Intelligent automation is helping with all of the above. You can go back and read about the many ways co-bots are being used in manufacturing and warehousing environments in some of my colleagues’ previous blog posts, but at the fundamental level, they are helping workers more efficiently pick items during order fulfillment.
Related Blog Posts
How Robots and Humans are Teaming Up to Take on the World (We Mean, the Warehouse)
“Co-Bots” are Working Alongside People in the Warehouse. But, Is It a Harmonious Union?
Heads Up! Heads-Up Displays are Coming to a Warehouse Near You
Meet the Head-Mounted Display Making Augmented Reality Accessible to All Workers
And, as I mentioned, computer vision solutions are being applied broadly – even beyond SmartSight – to help retailers identify operational inefficiencies in the store, such as out-of-stocks or missing price tags. Of course, machine learning and advanced analytics will also continue to be key tools for supply chain organizations seeking to optimize inventory efficiency, identify opportunities for process improvements and proactively identify issues that affect business outcomes. The goal with intelligent automation, and enterprise intelligence solutions in general, is to pre-empt issues – to reduce the risk of employee mistakes, missed opportunities or delays.
If you watch these #NextWave videos, you’ll see real-life examples of how intelligence-centric technologies are being used today in healthcare, retail stores and even restaurants:
Your Edge Blog Team: Given the demonstrated value of intelligent automation in early deployments, can we expect new applications to emerge in the next few years?
Shawn: As Zebra CTO Tom Bianculli mentioned in a recent blog, the focus in the short term remains heavily on “intelligent orchestration.” This, of course, includes the orchestration of human workers and co-bots and intelligent automation platforms such as SmartSight. But, even broader, the focus is on developing a common, central point of orchestration of disparate sensors, devices and automation platforms – which we call the instrumentation layer – in order to coordinate the combined value of these assets. From there, enterprises can use mobile computing technologies to push real-time actionable intelligence to workers at the edge.
Your Edge Blog Team: In other words, intelligent automation is going to rely on a strong synergy between mobile and automated technology platforms in order to facilitate the “sense, analyze and act” capabilities that enterprises need to gain an operational edge?
Shawn: Yes. Many optimization solutions today focus on either human tasks and workflows or robotic tasks and workflows, with little to no crossover (or synergy) between the two. Cooperative orchestration between automation systems and human workers will be critical to achieving the highest levels of productivity improvement in the not-too-distant future. I can tell you that all of the mobility, augmentation and automation solutions that Zebra is working on are designed to facilitate a collaborative “man and machine” type workflow. Our technology is being used to make humans smarter, more effective and more valuable by either taking the busy work off their plate – as is the case with co-bots – or feeding them the enterprise intelligence they need to accomplish exactly what they’ve set out to do with greater speed and success – which is what you’ll gain from prescriptive analytics, SmartSight, the HD4000 head-mounted display and more. Remember, the primary goal for companies is to reallocate their human labor force toward the greatest area of need and highest value, not replace those workers.
Your Edge Blog Team: How do you see these technologies evolving even further in the next five years?
Shawn: According to our 2019 Intelligent Enterprise Index, a record 62 percent of enterprises worldwide are on the path to becoming “intelligent,” compared to only 49 percent in 2018. Yet they clearly still have a long way to go.
As companies work to better connect the physical and digital, we’re going to see their Internet of Things (IoT) vision and enterprise intelligence strategy evolve quite quickly to include greater utilization of automation-related technologies, as these stats demonstrate:
- Co-bots/smart robotics (44 percent currently use, 43 percent plan to use)
- Artificial intelligence (44 percent currently use, 51 percent plan to use)
- Augmented reality (37 percent currently use, 49 percent plan to use)
- Machine vision systems (46 percent currently use, 46 percent plan to use)
In fact, they have quickly become core to Zebra’s own Enterprise Asset Intelligence vision. The convergence of multiple IoT, mobility and cloud-based technologies is the key to fostering dynamic “sense-analyze-act” outcomes that today’s businesses demand. So, we intend to harness the deep expertise and knowledge of the talented team from our recent acquisitions of Cortexica and Profitect as well as our Chief Technology Office and our Zebra Venture partners to accelerate innovation around our portfolio of intelligent automation solutions. Cortexica was actually recognized in Gartner’s “Cool Vendors for AI in Computer Vision” 2018 report and for good reason. That team is working on some very interesting things, which we look forward to sharing more about in the coming months.