Friday, November 15, 2024

What Businesses Should Know About Speech Technologies

One of the top workshops at ODSC West 2018 was a talk by Omar Tawakol, founder of Voicea. His company created a voice assistant that transforms meetings by handling lower-order tasks like note-taking. Cisco acquired Voicea, most likely to integrate it into Webex as part of its Cognitive Collaboration agenda. For you, speech technologies could mean a transformation of how you run meetings.

[Related Article: The Mechanics and Business Value of Speech Technologies]

Most of our business use case examples center around using chatbots for customer service or Alexa-like assistants. However, speech technologies could open a few more doors than that. Speech tech, like anything else related to AI, could upend your business operations in a positive way, but that’s only if you’re ready. Here’s what’s coming and why you should care.

How Does Speech Technology Work?

Machines learn language in a similar way to children. The algorithms take training data and learn through the same trial and error children use. Machines are at a disadvantage, however, because they still don't understand the content of what they're learning: they rely on statistical markers to delineate key speech functions rather than genuine comprehension, which sometimes produces hilarious results.

As tech gets better at training machines to understand not only the words but also the context involved (a field called Natural Language Understanding), our speech technologies are getting more effective. Years ago, you could spot a chatbot from a mile away. Now? You've probably spoken to one without ever knowing it.
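To make the idea concrete, here is a toy sketch (not any vendor's actual pipeline) of the basic pattern: a system maps an utterance to an intent by comparing it against labeled examples. The intents, training phrases, and word-overlap scoring here are all illustrative assumptions; real NLU systems use learned statistical models rather than raw word overlap.

```python
# Toy intent classifier: learn from labeled utterances, then pick the
# intent whose examples share the most words with the new utterance.

def tokenize(text):
    return set(text.lower().split())

# Hypothetical training data: transcribed utterances labeled by intent.
TRAINING_DATA = {
    "schedule_meeting": ["set up a meeting tomorrow", "book a call with the team"],
    "take_notes": ["take notes for this meeting", "write down the action items"],
}

def classify(utterance):
    """Return the intent whose training examples best overlap the input."""
    words = tokenize(utterance)
    best_intent, best_score = None, 0
    for intent, examples in TRAINING_DATA.items():
        score = max(len(words & tokenize(ex)) for ex in examples)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(classify("please book a meeting with the team"))  # -> schedule_meeting
```

A system like this fails exactly the way the article describes: it matches surface markers, not meaning, so a paraphrase with no shared words falls through. Context-aware models close that gap.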

Who is Using Speech Technologies?

Google, Facebook, Apple, you know the lineup. They're all pouring money into speech tech's next big thing, and for a good reason. Projections for this year put roughly 67 million people using some form of speech tech in the household, and some of those customers could be yours.

AI is creeping towards human-like understanding. If you remember the first Siri rollout, it was difficult to maneuver if you didn’t have a standard accent or a quiet environment, prompting hilarious spoofs. Now, those days could be behind us.


What Are Companies Exploring?

Machines are overcoming challenges and difficulties to deliver more accurate results in a variety of applications, including:

  • Realistic text-to-speech: English, for example, is a stress-timed language, which gives robot voices a strange cadence. Companies like IBM are trying to overcome that hurdle for natural speech, which could allow your business to make better use of text-to-speech for day-to-day tasks.
  • Speech-based assistants: It’s not just chat. Talking to a robot could be the future, with speaking to a human becoming a rare luxury. As bots take on more human qualities, it could both free up your human team and make it easier for your customers to interact with your company (Hello, introverts!)
  • Audiovisual analysis: In a rare area of AI where the investment wasn’t just magical hype, the drive to build better audiovisual analysis has paid off. Just as you’d communicate with a coworker or employee through images and speech, machines are rapidly approaching human-like abilities for the same understanding. This could mean a transformation in how you train your human employees. It could also change your customer-facing business applications and a host of other use cases.
  • Sentiment analysis: Understanding spoken tone could be a wildly successful expansion of the already popular text-based sentiment analysis. You want to know what your customers think? Using recordings and videos as part of your data repository could open up new avenues of analysis that text-based systems can’t.
  • Thoughts to speech: Probably the wildest achievement has been Columbia’s success in using brain waves to reconstruct what users heard. It was frighteningly accurate, though not yet perfect. That capability could potentially change UI altogether in the future.
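The sentiment analysis bullet above is the easiest of these to prototype. Here is a minimal lexicon-based sketch of scoring a call transcript; the word lists and scoring rule are illustrative assumptions, not a real sentiment lexicon, and production systems would use trained models that also account for tone of voice.

```python
# Toy lexicon-based sentiment scorer for a transcribed customer call.
# Word lists are illustrative only.

POSITIVE = {"great", "love", "helpful", "thanks", "happy"}
NEGATIVE = {"frustrated", "broken", "slow", "cancel", "unhappy"}

def sentiment_score(transcript):
    """Return a score in [-1, 1]: positive minus negative word share."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(sentiment_score("thanks, the new feature is great"))    # positive score
print(sentiment_score("i am frustrated, the app is broken"))  # negative score
```

Running the same scorer over batches of transcripts is one simple way recordings could feed the kind of analysis that text-only systems miss.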

What are the Challenges?

Gartner predicts that robots will manage 85% of customer interactions by 2020, so organizations are scrambling to expand what AI can do. We don’t communicate solely through writing. Speech technologies could help make those interactions more efficient and more intuitive.

Speech technologies have often been built from the code up rather than from the domain down. Instead of working with linguists and psychologists to figure out how humans actually communicate, developers wrote code from a purely technical perspective. It was the “All your base are belong to us” heyday.

[Related Article: Deep Learning for Speech Recognition]

Now, developers have gotten smarter and more informed. As our understanding of the intersections of language and thought gets better, speech technologies improve as well. Your business could soon be using speech tech to transform meetings, customer service, and even UI.
