Google’s New Artificial Intelligence Technology: Multitask Unified Model (MUM)
Google’s Multitask Unified Model (MUM) is an advancement in Artificial Intelligence technology. It was first announced at Google’s annual developer conference, Google I/O, in May 2021, and it can process information far more efficiently than other AI technologies currently on the market.
MUM can be applied to any number of tasks, making it a valuable tool for businesses looking to improve their efficiency through automation. In this blog post, we will explore Multitask Unified Model (MUM)’s capabilities.
What is Multitask Unified Model (MUM)?
In simple words, Multitask Unified Model (MUM) is an advancement in Artificial Intelligence technology that gives direct answers to complex queries that previously required multiple searches.
How Does Google’s New Artificial Intelligence Technology, MUM, Work?
MUM works by understanding the search query and preparing results by analyzing multiple sources across the internet, whether text, images, or video, within a fraction of a second.
It can even answer conversational questions that Google previously could not decipher.
Why Was the Multitask Unified Model (MUM) Made When BERT Is Already Working?
MUM was created because BERT cannot answer complex queries. Google also wants to reduce the burden on its servers: right now, users run a large number of searches to cover a single complex topic, whereas MUM delivers richer results with fewer searches. This eases the load on the servers and resolves users’ queries much faster.
Explaining Google’s New AI, Multitask Unified Model (MUM), Through an Example
Let’s take a scenario where we need to find a restaurant. Multitask Unified Model (MUM) would use the power of machine learning and AI to process vast amounts of data at once, which means it could make sense of massive collections of information, such as Yelp reviews, Google Maps listings, or weather forecasts, in mere seconds.
Google’s MUM framework
MUM uses transfer learning with the T5 text-to-text framework.
Transfer learning is a machine learning technique that has rapidly gained popularity in recent years. Instead of training a new model from scratch for every task, a single neural network is first pre-trained on a large, general task and then fine-tuned on a new, more specific task, so the knowledge it has already acquired transfers over.
Transfer learning with T5, the Text-To-Text Transfer Transformer, takes this concept even further: every NLP task, whether translation, summarization, question answering, or classification, is cast as the same problem of feeding text in and getting text out.
The T5 text-to-text framework, released with TensorFlow code, trains a single model that can generate a text response for any question or task it has learned from text data.
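The text-to-text idea can be sketched in a few lines of Python. The `to_text_to_text` helper below is purely illustrative (not part of any T5 API), but the task prefixes it attaches are the ones the T5 paper uses: every task is turned into a plain string, and the same model maps that string to an output string.

```python
# A minimal sketch of T5's text-to-text framing.
# The helper and its task names are illustrative; the string prefixes
# ("translate English to German: ", "summarize: ") are the ones T5 uses.

def to_text_to_text(task: str, text: str) -> str:
    """Cast an NLP task into T5's single input format: prefix + text."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
    }
    return prefixes[task] + text

# Every task becomes the same problem: text in, text out.
print(to_text_to_text("translate_en_de", "The house is wonderful."))
print(to_text_to_text("summarize", "MUM was announced at Google I/O in May 2021 ..."))
```

With a library such as Hugging Face Transformers, strings framed this way would be tokenized and passed to a pre-trained T5 model, which generates the answer text; the framing itself is what lets one model handle many tasks.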
Imagine you are in the kitchen cooking dinner, and your child comes up to ask what is going on. A text understanding module would pull out all the relevant facts, like “I’m cooking spaghetti for dinner” or “I am making a sauce.” It would identify the type of question and then use a text generation module to produce an appropriate response.
The system is designed for human-machine interactions, but it demonstrates how Google’s deep learning techniques can solve many problems that require understanding text data with rich semantics.
How Many Languages Can MUM Understand and Answer In?
Google MUM was trained across 75 languages and answers in the searcher’s own language as well. MUM is also multimodal: it understands information across text and images, with video and audio to come, and Google says it is about 1,000 times more powerful than BERT.
When will MUM be released?
Google hasn’t confirmed MUM’s release date yet, but the MUM project started back in April 2018. MUM will likely be released later this year or early next year.