Google recently announced updates to its ML Kit, first introduced at I/O 2018, that allow third-party developers to add Smart Reply and other natural language processing features to Android and iOS apps. The update, which brings contextual, automated responses (as in Gmail and Messenger) to other applications, is one of several advancements putting natural language processing into action in ways that will affect everyday enterprise operations, from sending emails to dealing with data.
Language understanding is hard for computers; the subtleties of language are not easy to program for. But it is also game-changing: growing capabilities in natural language processing open up possibilities for a variety of technologies that could transform organizations, from chatbots to search to voice-activated computing.
“While not new technologies, speech recognition and NLP are at levels where they can now solve practical problems—and adoption is becoming mainstream,” said Dan O’Connell, chief strategy officer of Dialpad.
Many developments in natural language processing are still to come, but 2018 brought some important research achievements. That work is ongoing, and the enterprise value of specific findings may not be immediately evident.
But these advancements in NLP play a role in the increased importance of technologies like chatbots, voice search and virtual assistants. A report from Tableau on business intelligence trends to watch in 2019 pointed to the increasing importance of natural language processing in data interactions. Another report predicted significant growth, bringing the NLP market to $28.6 billion by 2026.
Research Highlights: Natural Language Processing in Action
Important NLP research in 2018 covered a lot of ground, including NLP models and neural networks.
A Google AI team presented a new NLP model called BERT (Bidirectional Encoder Representations from Transformers), designed to consider context from both the left and right sides of a word. That bidirectional approach brings benefits in tasks like question answering, in ways that could have future implications for chatbots, to give one example.
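To make the bidirectional idea concrete, the minimal sketch below uses the open-source Hugging Face transformers library (an illustrative assumption; the article does not name any specific tooling) to run a pretrained BERT model on a masked-word prediction task, the training objective that forces the model to draw on context from both sides of a word.

```python
# Minimal sketch: masked-word prediction with a pretrained BERT model.
# Assumes the Hugging Face "transformers" library is installed (pip install transformers);
# the library and model name are illustrative choices, not part of the article.
from transformers import pipeline

# The "fill-mask" task asks the model to predict the hidden token using context
# from both the left and the right of the [MASK] position.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill_mask("The bank raised interest [MASK] sharply this quarter."):
    # Each candidate includes the predicted token and a confidence score.
    print(candidate["token_str"], round(candidate["score"], 3))
```

Because the model sees the words on both sides of the gap at once, it can resolve ambiguities that a strictly left-to-right model would miss.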
Another development came from a different tech giant: Facebook's AI team proposed leveraging monolingual data for machine translation using both neural and phrase-based model variants. The approach could improve machine translation results and training.
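A core trick in that line of work is back-translation: using a model in one direction to turn monolingual text into synthetic parallel data for training the opposite direction. The sketch below illustrates the general idea only, not Facebook's exact system, using an off-the-shelf translation model from the Hugging Face transformers library; the model name is an assumption chosen for illustration.

```python
# Illustrative sketch of back-translation, the general idea behind leveraging
# monolingual data for machine translation. Not Facebook's exact method;
# the model name below is an assumption chosen for illustration.
from transformers import pipeline

# A French-to-English model turns monolingual French text into synthetic
# (English, French) sentence pairs for training an English-to-French model.
fr_to_en = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")

monolingual_french = [
    "Le marché de la traduction automatique continue de croître.",
    "Les données monolingues sont beaucoup plus faciles à collecter.",
]

synthetic_pairs = []
for sentence in monolingual_french:
    english = fr_to_en(sentence)[0]["translation_text"]
    # The (synthetic English, real French) pair becomes training data
    # for a model translating in the opposite direction.
    synthetic_pairs.append((english, sentence))

for source, target in synthetic_pairs:
    print(source, "->", target)
```

The appeal for enterprises is practical: monolingual text is far cheaper to collect than human-translated sentence pairs.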
While much of this research is happening within tech giants such as Google and Facebook, smaller firms are making their own inroads, and becoming potential acquisition targets in the process, O’Connell said.
“While we're still in the early innings of software development with these technologies,” he said, “we’re going to see more enterprises acquiring smaller startups with expertise in these areas--namely for tech, talent, and data.”
Those big firms already have an advantage in NLP because of the sheer amount of data at their fingertips. Data is going to become increasingly important as natural language processing continues to advance, O’Connell said.
“If you have data that can be used to train models, the larger players will want that, too--that's a competitive advantage,” he said, “whether it’s integrating new team members with AI and data science backgrounds, finding new ways to collect and store data for NLP and AI models (or even marketing it to the large players), or incorporating new technologies to the IT stack like voice AI for phone systems.”
Terri Coles, Khareem Sudlow