<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=2006193252832260&amp;ev=PageView&amp;noscript=1">


How To Tell What Is And Isn’t Artificial Intelligence



At the Marketing AI Institute, we read dozens of articles on artificial intelligence every week to uncover the most valuable ones for our subscribers (become a subscriber today), and we curate them for you here. We call it 3 Links in 3 Minutes. Enjoy!

1. Becoming a Professional AI Tool Investigator

We’ve touched on the topic of cutting through the hype surrounding artificial intelligence before, but with more companies offering “AI-powered solutions,” it can be hard to tell which businesses are actually using AI and which aren’t. AdWeek offers some tips for marketers on the hunt for tools actually fueled by AI.

Jason Carmel, global chief data officer for Possible, said it best: “Any time marketing technology makes a decision that a human normally would, it’s being driven by some form of machine learning.” The key distinction is that the machine, not a human, is making the decision in real time.

While investigating an AI-driven technology, marketers should be willing to admit what they don’t know and ask as many questions as possible. Austin Miller, director of product marketing at Oracle Marketing Cloud, recommends, “When a vendor refers to ‘algorithms’ making ‘smart decisions’ for your business, ask them what those algorithms are, which specific decisions are made, and what the readout is for a human to validate them. Ask them for examples of businesses currently using that technology and, even better, ask to talk with them.”

Lastly, the worst thing a marketer can do is stay quiet for fear of sounding stupid. Understanding the product and the technology that powers it is essential to making informed decisions for your organization.

 
2. How To Use The Google Lens In Your Pocket

One of the most talked-about apps right now is Google Lens, and for good reason. By tapping into Google’s AI, the app can recognize items through your phone’s camera and take action on them. The Lens update has been rolling out to Android and iOS users over the last month.

To highlight what exactly having this type of machine learning in your pocket means, Gizmodo dove into all of its capabilities.

Using the Google Photos app, you can apply Lens to any photo you’ve already taken, and it can identify most common objects. Gizmodo’s writers had it successfully identify a cappuccino, a MacBook Pro, a daffodil, and a pint of beer. It also identified more specific subjects, such as “The Bean” sculpture in Chicago and L.S. Lowry’s painting Going to the Match.

Additionally, Lens can pick up text such as email addresses, phone numbers, and street addresses, and launch the relevant app with one more tap. A great example of how this is useful: snap a photo of an event poster, and Lens can show you where the event is in Google Maps and use the text on the flyer to add the event to your calendar.
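Lens itself doesn’t expose a public API, but if you’re curious what this kind of recognition looks like under the hood, here is a minimal Python sketch using Google’s Cloud Vision API, a related but separate service. The image file name is hypothetical, and the sketch assumes the google-cloud-vision package is installed and a Google Cloud project with credentials is configured; it only illustrates the general idea, not how Lens is actually built.

```python
# Minimal sketch of Lens-style recognition via the Cloud Vision API.
# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("event_poster.jpg", "rb") as f:  # hypothetical local photo
    image = vision.Image(content=f.read())

# Label detection: roughly what Lens does when it names a cappuccino
# or a MacBook Pro in a photo.
labels = client.label_detection(image=image).label_annotations
for label in labels[:5]:
    print(f"{label.description}: {label.score:.2f}")

# Text detection: roughly how Lens pulls addresses, phone numbers,
# and dates off a flyer so another app can act on them.
texts = client.text_detection(image=image).text_annotations
if texts:
    print("Detected text:", texts[0].description)
```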

Users are still waiting on some of the features Google demoed at last year’s I/O event, such as wiping away a chain-link fence to see a clear photo of what’s behind it, or snapping a photo of a Wi-Fi router’s password to connect to that network. While Lens still has a long way to go, the recent updates show strong progress. And, as with all AI, the more it is used (and trained), the faster it will improve.

3. Reducing Food Waste With An AI App

AgShift, a food inspection technology startup in California, has started using deep learning to fight food waste, according to AgFunder News.

Without deep learning, inspecting produce quality is time-consuming and subjective. Workers receive a pallet of fresh produce, unwrap it, remove a few packages, and inspect the produce by eye or with a ruler, evaluating size, color, and the amount of bruising. This leads to inconsistencies in the quality of produce arriving at stores and markets, which ultimately leads to food waste.

Thanks to a recent funding round led by Exfinity Ventures, AgShift now has $2 million in seed money to apply deep learning to this process. For each new crop, the AgShift team takes about six weeks to tune the algorithms that tell the app what to look for. From there, workers simply photograph the produce, and the app measures the color, size, and proportion of bruising and issues a USDA grade.
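AgShift hasn’t published how its model works, but to give a rough sense of the shape of such a system, here is a minimal, hypothetical PyTorch sketch of an image classifier that maps a produce photo to a USDA-style grade. The grade labels, the network, and the dummy input are all illustrative assumptions, not AgShift’s actual approach.

```python
# Hypothetical sketch: a small CNN that maps a produce photo to a
# USDA-style grade. Purely illustrative; not AgShift's real model.
import torch
import torch.nn as nn

GRADES = ["U.S. No. 1", "U.S. No. 2", "Cull"]  # assumed label set

class ProduceGrader(nn.Module):
    def __init__(self, num_grades: int = len(GRADES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.classifier = nn.Linear(32, num_grades)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    model = ProduceGrader()
    # Stand-in for one photographed sample: a 224x224 RGB image.
    dummy_photo = torch.rand(1, 3, 224, 224)
    logits = model(dummy_photo)
    print("Predicted grade:", GRADES[logits.argmax(dim=1).item()])
```

In practice a system like this would be trained on labeled photos for each crop, which lines up with the six weeks of per-crop tuning described above.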

Founder Miku Jha says the app has cut inspection time nearly in half for each team. It also hasn’t replaced any workers; it has simply increased their capacity. Because the industry has never done inspections this way, Jha hopes the photographic record will also reduce or eliminate claims, which are major contributors to food waste.
