Artificial Intelligence: The Future or the Present?
Before Alan Turing’s defining 1950 paper, “Computing Machinery and Intelligence,” Artificial Intelligence (AI) would have struck many as science fiction. More than 70 years later, we’re living in a future where AI has woven itself into everyday life. Yet even today, some question the validity of machine learning (ML) and wonder about its practical implications.
If you’re reading this on your smartphone, chances are your device uses AI software. Apple outfits the iPhone with Siri, and Amazon devices have Alexa. Since Siri’s introduction with the iPhone 4S in 2011, AI has functioned as a virtual personal assistant.
Need to send a quick text? Tell Siri. Want to know what the capital of Nebraska is? Ask Alexa. AI has become part of the home as Alexa-compatible devices populate living rooms and kitchens. Whether it’s to play a song or look up a recipe, these devices can be a useful tool.
Even with AI at our fingertips, the question remains about practical use cases outside of our personal devices. In the enterprise specifically, AI has the potential to become an invaluable asset. The possibilities are seemingly endless as to how AI and ML can improve the tools we rely on most, and they may even usher in a generation of technology we have yet to dream of.
BlueFletch’s founders Richard Makerson and Brett Cooper sat down on the Enterprise Mobility Roundup Podcast to discuss AI, ML, and their practical applications in the enterprise environment. In this article, we’ll break down their discussion and highlight some of their key points.
Artificial Intelligence (AI) vs. Machine Learning (ML)
The two phrases AI and ML are often grouped together. But are they the same thing? Richard Makerson describes them as apples and oranges: both are fruit, but they’re different kinds.
- Artificial Intelligence – AI is a branch of computer science centered around software that mimics aspects of human intelligence. This can include advanced problem solving, thought, and communication. Machine “thinking” is by far the most critical aspect of artificial intelligence and has been the center of AI research for decades.
The belief that a machine could learn to manifest thought is a polarizing subject but key to understanding the possibilities of the technology. Artificial intelligence is not limited to androids or virtual assistants. AI can also be used in software to run programs such as grammar checkers, text-to-speech applications and so much more.
Examples: Siri, Alexa, Cortana, Grammarly, Otter.ai, Jasper.ai
- Machine Learning – Machine learning uses data and algorithms to mimic the way humans learn. ML relies on the data it is fed in order to grow: a machine is given an input and taught to compute a certain output.
This “learning” gives the computer the ability to adapt to complex problem-solving, becoming more accurate in its predictions over time.
- At the heart of machine learning are neural networks. According to IBM, neural networks allow machines to “reflect the behavior of the human brain” by “allowing computer programs to recognize patterns and solve common problems in the fields of AI, machine learning, and deep learning.”
- Neural networks are the key to giving a machine pattern recognition. By building a vast collection of neural nets, each with an input and output layer, machines can begin to detect patterns. Computer scientists feed information into the input and guide the program until it can produce the correct output. Here is a basic example of how this process plays out:
How Machine Learning Works
Imagine you’re training a machine to learn a simple task. You want to begin with some basic inputs to test how well the computer learns. You give the computer a collection of apple images: a red apple, a green apple, a blue apple, and a purple apple. The latter two are unnatural, so the goal is to teach the computer which apples are real and which are not.
A programmer would develop a set of rules for the computer to follow. These rules build a structure of possibility for the machine to compute in. Once given the input, we’d ask the computer to identify which image features a natural apple.
As it makes its selection, we tell it whether the output is correct or incorrect. The feedback the program receives shapes its understanding of future inputs.
While this isn’t a very scientific example, it helps us understand at a basic level what the neural networks are mimicking. In the human mind, these pathways are already mapped out, and we continue to strengthen them every day. For a machine, we must initiate the learning process and help it understand how to accurately make decisions. Training is critical to the success of machine learning.
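The apple walkthrough above can be sketched in code. This is a toy, single-neuron version of the idea, not a real neural network library: the colors, labels, and training routine are all illustrative assumptions layered on the article’s example, with apple colors reduced to RGB triples and the “correct/incorrect” feedback expressed as a simple weight update.

```python
# Toy single-neuron "network": learn to label apple colors as
# natural (red, green) or unnatural (blue, purple) from RGB values.
# All data and names here are illustrative, not from a real system.

def predict(weights, bias, rgb):
    """Output layer: 1 = natural apple, 0 = unnatural apple."""
    total = bias + sum(w * x for w, x in zip(weights, rgb))
    return 1 if total > 0 else 0

def train(samples, labels, epochs=50, lr=0.1):
    """Perceptron-style training: nudge the weights whenever the output is wrong."""
    weights = [0.0, 0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for rgb, label in zip(samples, labels):
            error = label - predict(weights, bias, rgb)  # the "correct/incorrect" feedback
            weights = [w + lr * error * x for w, x in zip(weights, rgb)]
            bias += lr * error
    return weights, bias

# Inputs: normalized RGB colors for red, green, blue, and purple apples.
apples = [(0.9, 0.1, 0.1), (0.2, 0.8, 0.2), (0.1, 0.1, 0.9), (0.6, 0.1, 0.8)]
natural = [1, 1, 0, 0]  # the "validation" a trainer would provide

w, b = train(apples, natural)
print([predict(w, b, a) for a in apples])  # → [1, 1, 0, 0]
```

After a few passes over the data, the repeated feedback settles the weights so the program reproduces the trainer’s labels, which is the same strengthening-over-time process described above, in miniature.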
Using this understanding of AI, and ML, let’s shift gears and highlight some trends in the retail space.
AI-Enabled Devices in the Retail Space
Artificial intelligence is a versatile tool with the power to revitalize the workspace when used properly. Here at BlueFletch we believe AI can bring operational enhancements with computer vision, AI assistants, and AI on the edge.
Leveraging the functionalities that your company needs could make the difference for both your employees and customers. Let’s get into some specific examples of how AI can enhance the retail experience.
Aiding Employees and Customers
Products such as Amazon’s Alexa give us a glimpse of the possible advantages in the workplace. What if finding a customer the product they want was as easy as asking Alexa to play you a song? Imagine a warehouse device that locates inventory via voice command, reciting exact shelf numbers back to a user.
AI could alleviate the workload of floor employees by fulfilling requests and providing a quick way to access workplace information. Customer-facing tools could benefit as well, with kiosks and/or devices dedicated to customer use. Potential use cases include:
- Price matching
- Stock updates
- Product customization
- Store navigation
- Inventory location
- Fulfilling voice commands and questions
Custom Use-Case Development
Every workplace is different, and some workforces require different tools to meet workplace standards. Thanks to the boom in AI’s consumer presence over the last few years, developing AI tools has become more common.
Retailers could work with developers to build applications for specific use cases, paving the way for a future where AI and ML are core elements of the corporate workspace. For larger enterprises, AI development could even happen in-house. Either way, a curated collection of AI tools would be an asset.
Computer vision is the technology behind products like Ring and Nest doorbell cameras: when a package is dropped off or someone rings the doorbell, they capture footage and send alerts. The same capability can contribute significantly to the enterprise.
Home improvement stores typically offer a range of services to their customers. For departments that handle custom orders, customers often need an employee’s assistance. In the paint department, for example, a customer looking to have a custom paint mixed can’t just grab it off the shelf. Spaces outfitted with computer vision could detect lingering customers and send alerts, getting customers service faster while maintaining an efficient workflow.
Using the power of machine learning, artificial intelligence could be taught to monitor inventory. ML gives a retailer the power to train AI systems to recognize where inventory belongs. This would be especially useful for shelf resets and merchandising. If the computer knows what a picture of a good shelf looks like, it can alert the system when there is a discrepancy.
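The shelf-monitoring idea above boils down to comparing what a vision model sees against a known-good reference. Here is a minimal sketch of that comparison step, assuming the image-recognition side has already turned a camera frame into a grid of product labels; the planogram, the grid layout, and the `EMPTY` marker are hypothetical stand-ins, not a real retail system.

```python
# Hypothetical sketch: compare a detected shelf layout (imagined here as the
# output of an image-recognition model) against the reference planogram,
# and flag every slot that doesn't match. All product data is made up.

def shelf_discrepancies(reference, observed):
    """Return (row, col, expected, found) for every slot that doesn't match."""
    issues = []
    for r, (ref_row, obs_row) in enumerate(zip(reference, observed)):
        for c, (expected, found) in enumerate(zip(ref_row, obs_row)):
            if expected != found:
                issues.append((r, c, expected, found))
    return issues

planogram = [["paint", "paint", "brushes"],
             ["rollers", "tape", "tape"]]
camera_view = [["paint", "EMPTY", "brushes"],   # one facing is out of stock
               ["rollers", "tape", "paint"]]    # one item is misplaced

for row, col, expected, found in shelf_discrepancies(planogram, camera_view):
    print(f"Slot ({row},{col}): expected {expected!r}, found {found!r}")
```

In a real deployment, the hard part is the recognition model that produces `camera_view`; once the picture of a “good shelf” is encoded as data, flagging discrepancies is a straightforward comparison like this one.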
Facial recognition (facerec) is a form of biometric authentication that grants access to devices and security systems with the scan of a user’s face. The software creates a digital map of their face and uses that map for identification. This can take the place of a traditional passcode system, giving users access to devices and locations immediately.
Similar biometric authentication has existed in smartphones for a while now. Fingerprint scanning and Apple’s Face ID make accessing passwords, applications, and devices instantaneous. Compared to a traditional passcode, facerec is far faster and, in many implementations, more secure.
We’ve gone into further detail about the facerec versus the traditional password in our article How Biometric Authentication Enables a Secure and Efficient Workforce. And for more information on biometric authentication, you can check out our article Biometric Authentication: From Future Fantasy To Our Daily Routine.
Computer Vision Security
Every state and country has different regulations about what may be captured on video. There’s an ongoing debate about the legality of enterprises using facial recognition within retail. Creating and storing a digital map of someone’s unique biology is extremely sensitive.
In the case of an employee, any use or storage of their facial map would be outlined and agreed upon in a contract. Contracts usually emphasize that any sensitive data within a workplace system will be deleted upon their exit. But if customers have not opted in to having their faces scanned and stored, there could be a huge ethical and legal issue.
With that said, there is still room for AI and facerec software to provide security against crime. Using a motion sensor like the one found on a Ring device, systems could trigger alarms and cameras.
In the event of an emergency, a retailer can hand the footage over to law enforcement, leaving it up to the law’s discretion whether or not to use facerec to identify a suspect. These cameras can be outfitted to traditional security cameras as well as self-checkout kiosks. As mentioned in our article How Biometric Authentication Enables a Secure and Efficient Workforce, this is an ongoing conversation in cybersecurity.
Customers and Employees in Need
Let’s circle back to our home-improvement store example. Elle, who usually works in the gardening department, has been assigned to the paint section today. While she may be an expert at all things gardening, Elle knows next to nothing about paint.
When a customer walks over to place a custom paint order, Elle begins to worry. The customer is looking for pro tips on color selection. They show some photos of their house, asking if Elle has any professional recommendations. Even after Elle explains her situation, the customer persists.
Without color-matching experience, Elle has to resort to the store’s massive paint catalogue. She becomes overwhelmed by the selection and worries about suggesting the wrong paint. Without the help of an AI program or a fellow employee, Elle may be unable to help the customer to the extent they need.
Now let’s break down this scenario with the introduction of an AI assistant.
Elle’s store has recently deployed an AI application that can fulfill a variety of tasks. Similar to Amazon’s Alexa, this AI can answer questions, search the internet and reference store information. Now when a customer walks over to the paint section Elle feels confident that she can provide help by leveraging the AI’s toolset.
The customer shares some reference pictures, and once they’re fed to the application, the AI uses its ML training to suggest appropriate colors. The AI also recommends paintbrushes and a list of store resources that could help the customer make an informed decision. Elle doesn’t have to stress about being an expert, and can trust the AI assistant.
Artificial Intelligence on the Edge
As device CPUs grow more powerful every year, more becomes possible on the edge. “The edge” refers to processing that happens on the device itself, without a round trip to a server or the cloud. This is a model you can deploy that doesn’t require a large data set.
Voice-to-text is an excellent example of AI on the edge. A device could also check price labels and handle similar scanning tasks locally. Everything you push down to an edge device reduces traffic across your network. If you’re already utilizing AI and putting it into your employees’ hands, AI at the edge is extra credit.
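One common way to structure the edge pattern described above is to run a small model on the device and only reach out to the network for cases it can’t handle. The sketch below is purely illustrative: `tiny_model`, `cloud_api`, and the confidence threshold are hypothetical stand-ins, not part of any real product.

```python
# Hypothetical edge-first pattern: classify on the device, and only fall
# back to a remote service when the local model isn't confident enough.
# All models, labels, and thresholds here are made up for illustration.

def classify_on_edge(sample, local_model, cloud_fallback, threshold=0.8):
    """Prefer the on-device model; defer hard cases to the cloud."""
    label, confidence = local_model(sample)
    if confidence >= threshold:
        return label, "edge"           # handled locally: zero network traffic
    return cloud_fallback(sample), "cloud"

# Toy stand-ins for a real on-device model and a real API call.
def tiny_model(sample):
    return ("price_label", 0.95) if sample == "clear_photo" else ("unknown", 0.4)

def cloud_api(sample):
    return "price_label"               # imagine an HTTP round trip here

print(classify_on_edge("clear_photo", tiny_model, cloud_api))   # → ('price_label', 'edge')
print(classify_on_edge("blurry_photo", tiny_model, cloud_api))  # → ('price_label', 'cloud')
```

The appeal of this split is exactly the point made above: the common, easy cases never touch the network, so the server and the network only carry the traffic that genuinely needs them.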
AI and ML have become accessible in tools across the industry. Developers have poured years into models that you can download at the click of a button. If you’re a Chrome user, there’s a plethora of browser extensions on the market that use AI to meet a variety of use cases. And if you’re an IT department within a big enterprise using cloud services (Google, Microsoft, etc.), there are AI and ML tools already built into the products and services you’re using.
When evaluating whether AI or ML fits your workplace, begin with a checklist: ask where AI is needed and determine where your organization needs help. Even building your own model at a high level is possible, and the ability to tweak and train software before deployment would give your enterprise a valuable advantage.
For more information related to retail mobility, be sure to browse our resource library. If you’re interested in learning more, please consider subscribing to our podcast, The Enterprise Mobility Roundup.