A view of what a warehouse worker may see when picking parts using an augmented reality app on a Zebra ARCore-certified mobile computer
By Pat Narendra | July 26, 2019

Extending Augmented Reality (AR) Across the Enterprise

AR-Enabled Enterprise Mobile Computers Help to Increase Worker Efficiency and Accuracy

Augmented reality (AR) applications have taken the main stage on consumers’ mobile devices. Perhaps you have seen the AR stickers in photo apps; virtually tried out furniture in your living room; used one of the many “measure” apps; or played one of the interactive AR games making the rounds. (Beer Pong, anyone?)

Meanwhile in the enterprise space, there is palpable evidence of massive, pent-up demand to deploy AR for a real return on investment (ROI)! For example, HoloLens™ 2 was launched at Mobile World Congress 2019 with a focus on the enterprise. Exciting as the potential is for this particular technology, such wearable AR devices are neither widely available nor affordable – even in the enterprise space. (Barring a few corner cases where the cost is justified or is part of a larger project, such as with the US Army.)

However, a myriad of innovative enterprise AR applications can be unlocked right here and now using other widely accessible technology platforms, such as Zebra’s ARCore-certified TC52, TC57, TC72 and TC77 mobile computers. 

Two screenshots of what a retail worker may see when using an augmented reality application on Zebra's ARCore-certified mobile computers.

AR Unleashed: Zebra’s AR-enabled Mobile Computers Open a Vista of New Enterprise Applications

I’m not writing this post to pitch you on Zebra’s mobile computers. Rather, I simply want to dispel any myths that portray widespread enterprise AR utilization as a pipe dream. There are a host of AR use cases poised to make a significant impact on enterprise operational efficiency in nearly every vertical sector – from manufacturing to warehousing, transportation and logistics to retail and even healthcare. And these applications leverage many of the technologies you have already deployed in your facilities and in the field, such as the enterprise-grade touch mobile computers I mentioned above.

With AR, you gain the ability to present workflow directions and instructions to your workers within the field of view of their AR-enabled mobile devices – anchored to precise real-world locations. For example, retailers may find AR especially beneficial for buy online, pick up in store (BOPIS) scenarios, and warehouse operators may be able to improve picking accuracy and speed via AR guidance. Here’s how:

Retail Use Case: Buy Online, Pickup In Store (BOPIS)

In 2018, half of all U.S. retailers offered BOPIS. More than four out of every 10 consumers purchased online and picked up in store over the past year, a 43 percent increase over the year before.

As a retailer, you may find it challenging to provide this service in an efficient manner – shoppers do not pay a premium for it. It doesn’t help if you are constantly struggling with frequent associate turnover or with your associates’ general unfamiliarity with the variety and location of products in your store. However, employing an AR application on your associates’ mobile devices could address several of these BOPIS logistics and workforce management challenges.

For example, to meet your BOPIS fulfillment demands, you might divide your store into multiple zones – such as produce – and assign an associate to be the specialized picker in each zone. The picker will likely roll a special cart with multiple bins, each associated with a different shopper’s order, so they can shop for multiple orders at once. (These carts will eventually be consolidated from each zone to build the final shopper packages.) Today, the list of products to pick is likely generated from the store planogram and presented in sequential aisle order via a list-type app on a Zebra mobile computer. The associate then hunts for each product on the shelf, scans it and places it in the corresponding shopper bin.

Here’s the problem with this current workflow: while planograms may contain the precise location of the product (as expected), they only present this location to the picker as a “coordinate” – aisle/section/shelf/(maybe) sequence number. When dealing with dozens of very similar products and variants on the same shelf, this places a real cognitive load on a picker who is picking for multiple shoppers at once – which, in turn, slows picking and invites errors.

Fortunately, an AR-powered application running on that same Zebra mobile computer has the potential to dramatically improve the associate’s performance, leading to faster, error-free picking. By simply holding the Zebra device up and looking down the aisle, the associate can follow the AR stickers to identify the precise locations of ordered products and confirm order instructions, such as how many to pick for each shopper and the bin number in which picked items should be placed. This goes a long way toward alleviating the cognitive load on the associate, as the AR flag can be placed on the precise location of the desired item – to within an inch.
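To give developers a feel for what is involved, here is a minimal sketch of how such a flag could be anchored using Google’s ARCore and Sceneform SDKs, which run on the ARCore-certified devices mentioned above. It is illustrative rather than Zebra’s actual implementation: for brevity it places the flag where the associate taps a detected plane (a production app would derive the position from the planogram), and attachPickFlag, PickInstruction, R.layout.pick_flag and R.id.label are hypothetical names.

```kotlin
// Minimal sketch: anchor a pick "flag" at a tapped shelf location with ARCore + Sceneform.
// Illustrative only – a real BOPIS app would place flags from planogram coordinates.
import android.widget.TextView
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.rendering.ViewRenderable
import com.google.ar.sceneform.ux.ArFragment

// Hypothetical order data carried by the picking app.
data class PickInstruction(val sku: String, val quantity: Int, val bin: Int)

fun attachPickFlag(arFragment: ArFragment, order: PickInstruction) {
    arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
        // Create a world-locked anchor at the tapped shelf location.
        val anchorNode = AnchorNode(hitResult.createAnchor()).apply {
            setParent(arFragment.arSceneView.scene)
        }
        // Render a 2D card (the "AR sticker") with the pick details, pinned to the anchor.
        ViewRenderable.builder()
            .setView(arFragment.requireContext(), R.layout.pick_flag)  // hypothetical layout
            .build()
            .thenAccept { card ->
                card.view.findViewById<TextView>(R.id.label).text =
                    "Pick ${order.quantity} x ${order.sku} -> bin ${order.bin}"
                Node().apply {
                    renderable = card
                    setParent(anchorNode)
                }
            }
    }
}
```

Once the anchor is created, ARCore keeps the flag locked to that physical spot as the associate moves and looks around the aisle.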

An augmented reality application used to find items in a grocery store

To visualize this, consider the lotion aisle above. There are a number of very similar-looking products, which makes picking a trial-and-error process (especially for an inexperienced associate). A precisely placed AR location flag would make it much easier to pick the ordered items accurately.

Warehouse Use Case: The Picking Challenge

The best way to demonstrate the value of AR in a warehouse environment is with this picking example from a customer repair center…

Picture more than 6,000 bins across 60 trays in a vertical carousel. The picker with the pick list summons the tray (10 ft x 3 ft) and has to hunt for the bin by its grid code (Column E: Row 22). He or she then scans the corresponding barcode to confirm the pick. Errors can and do occur because there is no visual confirmation of the part against the pick list, and quickly picking the right bin among the hundreds in the field of view is a challenge.

However, a picker with an AR-enabled Zebra device could hold the device up to the tray and see the entire pick list for that tray in the field of view, with each pick “flag” firmly anchored to the corresponding bin location, as you can see in the image above. He or she can then pick the part, guided by the AR flag; check it against the image on the flag; and simply touch the flag on the handheld mobile computer screen to record the pick and dismiss the flag. No need to scan for confirmation! A long press on the AR flag brings up a user interface dialog to record low stock, wrong placement, etc. Mistakes are averted, fulfillment is faster and customers are happy. You can see this workflow in a live demo video recorded using an AR-enabled Zebra mobile computer in our Buffalo Grove, IL, facility.
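For developers curious how that touch-to-confirm interaction might be wired up, here is a minimal sketch using standard Android gesture handling on the Sceneform node that backs a flag. It is an assumed design for illustration only – wireFlagGestures, recordPick and showExceptionDialog are hypothetical names, and a production app would post pick confirmations and exceptions to the warehouse management system.

```kotlin
// Minimal sketch (assumed design): a short tap on the AR flag records the pick and
// dismisses the flag; a long press opens an exception dialog (low stock, wrong placement).
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import com.google.ar.sceneform.Node

fun wireFlagGestures(context: Context, flagNode: Node, pickId: String) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onSingleTapUp(e: MotionEvent): Boolean {
            recordPick(pickId)            // confirm the pick without a barcode scan
            flagNode.setParent(null)      // dismiss the flag from the AR scene
            return true
        }

        override fun onLongPress(e: MotionEvent) {
            showExceptionDialog(context, pickId)   // low stock, wrong placement, etc.
        }
    })
    // Forward touches that land on this flag's node into the gesture detector.
    flagNode.setOnTouchListener { _, motionEvent -> detector.onTouchEvent(motionEvent) }
}

// Hypothetical back-end hooks; a real app would post these to the order/WMS system.
fun recordPick(pickId: String) { /* record the pick against the order */ }
fun showExceptionDialog(context: Context, pickId: String) { /* show exception UI */ }
```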

Of course, these are just a few of the potential enterprise applications that can now move from proof of concept to “proven use case” in your operating environment. We’ll discuss more in the coming weeks. Stay tuned into Your Edge.

We also invite Zebra customers, developers and partners to join us at one of Zebra’s 2019 APPFORUM series events to learn more about Augmented Reality in the enterprise:

  • August 13-14: Sydney, Australia
  • August 20-21: Beijing, China
  • October 1-2: Las Vegas, Nevada, US

Visit our APPFORUM event page to learn more or register.

Plan to attend the APPFORUM Americas event? Register before August 16, 2019, to take advantage of the $150 early bird discount.

Topics
Retail, Warehouse and Distribution, Manufacturing, Transportation and Logistics, Field Operations, Inside Zebra Nation, Public Sector, Healthcare
Pat Narendra, PhD

Dr. Pat Narendra is a Principal Product Manager in the Zebra Enterprise Mobile Computer (EMC) Emerging Technology organization, where he passionately explores augmented reality, mobile locationing, deep learning and machine learning in the enterprise ecosystem.

Dr. Narendra has a PhD in Electrical Engineering/Computer Science from Purdue University, where his thesis on Pattern Recognition and Machine Learning resulted in over a dozen refereed journal articles with more than 4,000 citations.

He spent his early career in computer vision and signal processing research at Honeywell. He then obtained an MBA in Strategy from the University of Minnesota and launched his product development career at Motorola Mobility, Motorola Solutions and, now, Zebra, where he has created and launched over a dozen products and been granted nine patents.

Dr. Narendra is a hands-on business and technical architect, equally at home prospecting for customer return on investment (ROI) and coding up prototypes using the latest augmented reality frameworks on Zebra mobile computers and machine learning platforms such as TensorFlow Lite.