Where would you be without the graphical user interface (GUI)? Certainly not right here right now reading this. Think about it…to get to this very blog you had to click buttons, utilize a scroll bar, and possibly look at a menu of options. That means you just used a GUI!
It probably felt pretty natural, right? You didn’t even think about the user experience (UX), much less the user interface (UI). That’s because, for the past 60 years, UX designers like James Morley-Smith have been working hard to refine the way you interact with electronic devices such as computers and smartphones using graphical icons and audio indicators. You can thank him – and the GUI’s inventor Ivan Sutherland – for not having to read text instructions or, worse, type commands to navigate the internet, apps or anything else on your computer or mobile devices.
That’s right, it’s been 60 years since Ivan first had the inkling to draw digitally. He eventually developed what he called Sketchpad, which turned out to be the earliest program ever to utilize a complete GUI. Though Sutherland thought of Sketchpad as an insignificant application, it became a predecessor to the oN-Line System (NLS), which would go on to influence practically every graphical computing system we see today. It wasn’t until 20 years later – in January 1983, when Apple introduced the Lisa computer – that GUIs went mainstream, helping popularize personal computers.
Today, GUIs are everywhere in the consumer and business world, so much so that the ‘G’ or ‘graphical’ in ‘GUI’ is no longer necessary. Just like the word “automotive” seems redundant when we refer to cars today, the same can be said when we think about the word “graphical” when referring to user interfaces. This is because, today, we expect the graphical aspect to come along with it. We expect that each day when we log onto our devices, there will be a visual interface that allows us to interact with our devices, software and applications – proving how far Ivan’s once seemingly small innovation has come.
In the world of enterprise and industry, we now see retail assistants, warehouse workers, manufacturing plant floor engineers, nurses, and millions of other essential front-line workers interacting with well-refined UIs to track their tasks, receive and send time-sensitive updates, and communicate with colleagues simply by looking, tapping and talking via their device screens. The psychological aspect of modern UIs has also helped businesses like yours ensure workers can interpret information in a way that is as easy and frictionless as possible. For example, as James mentioned in a previous blog post, UIs are being adapted to account for people’s situational disabilities and meet accessibility standards by addressing possible impairments (such as compromised hearing or bulky clothing). This helps provide an all-around better UX for workers, increasing their confidence and inclination to stay with an employer who sees and values them as people first.
In other words, what once started as a few lines and buttons on a screen has since become an innovation that gives us all the agency to work seamlessly with our devices, other people and even autonomous mobile robots (AMRs) regardless of the situation or possible impairments.
What does the future of UIs look like? Or, should we say, “What will UIs look like in the future?”
That’s one of the things we asked James recently. As Zebra’s Global Director of User Experience, James helps Zebra employees, partners and customers understand the impact that technology has on business and how technology can and should be leveraged to gain a strategic advantage for the future. So, we wanted to know how he expects the UI to further evolve based on what he’s working on today at Zebra and what he sees others in the tech industry attempting to do:
Your Edge Blog Team: As someone who is focused specifically on improving the UX every day, how do you see UIs evolving to make for a more efficient and immersive UX?
James: The UX is becoming an increasingly important consideration when designing UIs, and it will only grow more vital in the coming years. This steady emphasis on UX design promises a future that prioritizes simplicity, functionality and ease of use, which will help generate cleaner and more intuitive interfaces with minimal clutter and distractions. This could look like an increased use of voice- and gesture-based interfaces through virtual assistants such as Siri and Alexa. Rather than having to physically touch a screen or a button, voice UIs will allow users to interact directly with devices, such as Zebra mobile computers, rugged tablets, RFID readers and more.
Another trend is the integration of artificial intelligence (AI) and machine learning (ML) into UIs to help predict user behavior and tailor interfaces accordingly, making them more personalized and efficient for the end user. For example, an AI-powered interface on a personal shopping solution (PSS) like Zebra’s PS20 could guide shoppers to promotions, alternative products and recipe items based on each customer’s known preferences, current shopping list, and dietary requirements or other loyalty card profile information. Or AI could be used to provide live, context-based information alongside a participant’s video based on what they are saying, which is especially helpful in remote consulting situations, like when a doctor is hundreds of miles away from a patient or an engineer is guiding a field technician through utility infrastructure installs or repairs.
Lastly, I imagine that there will only be an increase in the usage of augmented reality (AR) and virtual reality (VR) in UI design to similarly help better the UX by enabling users to interact with digital content in a more immersive and intuitive way.
Your Edge Blog Team: With new developments in UIs, such as ChatGPT, how do you predict users will interact with their devices in the future?
James: New interface developments, such as ChatGPT, have expanded the way we see our future relationship with computers and technology as a whole. With innovations like ChatGPT, I believe we can expect UIs in the near future not only to complete commands but also to offer opinions. For example, rather than asking Alexa to repeat your grocery list back to you, you may be able to ask her for an opinion on what you should get at the grocery store. Or rather than asking Siri, “what restaurants are there in the area?” you may be able to ask, “what restaurants should I go to in the area?”
While concepts like this may seem foreign to us now, the fact that we dropped the ‘G’ in GUI because of how second-nature it became makes me wonder what other aspects may become just as natural to us in the future as we begin turning to our computers for opinions as opposed to facts.
Your Edge Blog Team: What does the advancement in UIs, such as voice-based interfaces and VR, mean for enterprises and customers?
James: With new developments in AI, we can expect voice-based interfaces to soon be used across various environments, especially in healthcare, field services and warehousing, where hands-free access to information is crucial. For example, in healthcare facilities, voice-based interfaces may very well help streamline operations by assisting with patient monitoring or the retrieval of medical records via voice commands, giving nurses and doctors more time to focus on patient-facing tasks. Similar examples can be found in other industries where voice-based interfaces empower workers, giving them more time to focus on their customers or the freedom to work autonomously in environments where they may not have been able to work at all before. Leaders at Austin Lighthouse, for example, implemented voice-directed picking technology in conjunction with AMRs to give visually impaired people the opportunity to work. Not only that, but the workers actually doubled their picking productivity after the technology was fully in play. It’s quite an incredible story that I encourage everyone to check out.
As for AR, it could be particularly helpful for both businesses and customers, giving them the ability to experience products or services in a more realistic way. Overall, the possibilities for what new UI advancements can do for industries are endless, and they will likely be used to help make workers’ jobs easier and customers more satisfied.
A Cause for Celebration
Who knew that one person’s simple desire to draw on a screen 60 years ago would lead to a world full of AI, augmented reality and machine learning?! As we celebrate the 60th birthday of the GUI, let’s pause to commemorate the brilliant innovators who have helped it transform and become even more impactful on our lives over time. Let’s also commit, as designers and developers, to create more efficient, personalized and intuitive interfaces that improve the user experience and streamline business processes. We could all use a little more simplicity in our lives.
Curious how certain organizations in your industry are working with Zebra engineers and partners to help inform UI refinements, new technology developments, and the overall digital UX? Or are you wondering what your employees and customers really want technology to do for them? Check out the Zebra Vision Study Library for the latest feedback from the frontline.