Researchers from the Allen Institute for Artificial Intelligence and the University of Washington are aiming to take machine learning to the next level with a system called LEVAN. Rather than learning a specific concept, LEVAN aims to Learn EVerything about ANything (hence the name). Unlike most machine learning projects, which learn in either an unsupervised or a human-supervised manner, LEVAN is ‘webly supervised’, teaching itself about concepts using only the internet.
The question that sparked LEVAN, according to the creators’ research paper, was “How can we learn a model for any concept that exhaustively covers all its appearance variations, while requiring minimal or no supervision for compiling the vocabulary of visual variance, gathering the training images and annotations, and learning the models?” What they ended up constructing was LEVAN, “a fully-automated approach for learning extensive models for a wide range of variations (e.g. actions, interactions, attributes and beyond) within any concept”.
LEVAN works by using Google Books Ngrams to find terms associated with a concept, pruning them (grouping similar terms together and omitting ‘non-salient’ ones), and then searching for the remaining terms in image aggregators such as Google Images, Flickr and Bing. For example, on the project website, LEVAN has found that ‘boiled food’, ‘cantonese food’ and ‘food courts’ are all sub-concepts of ‘food’, and has grouped together the similar categories ‘boiled food’ and ‘soft food’.
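The discover-prune-group pipeline described above can be sketched in plain Python. Everything here is illustrative: the candidate terms and their frequencies are invented stand-ins for Google Books Ngrams output, the frequency cutoff is a toy proxy for LEVAN’s ‘non-salient’ filtering, and the lexical-similarity grouping only approximates LEVAN’s actual grouping, which is based on the visual similarity of retrieved images.

```python
# Toy sketch of a webly-supervised vocabulary pipeline in the spirit of
# LEVAN. All data and thresholds are invented for illustration.
from difflib import SequenceMatcher

# Hypothetical n-gram candidates for the concept "food"
# (stand-ins for Google Books Ngrams output; frequencies are made up).
NGRAM_CANDIDATES = {
    "boiled food": 120,
    "soft food": 95,
    "cantonese food": 80,
    "food courts": 60,
    "the food": 5,  # non-salient: a generic modifier, not a visual variation
}

def prune(candidates, min_freq=10):
    """Drop 'non-salient' terms (approximated here by a frequency cutoff)."""
    return [term for term, freq in candidates.items() if freq >= min_freq]

def group_similar(terms, threshold=0.58):
    """Greedily merge similar sub-concepts.

    LEVAN groups by visual similarity of retrieved images; this toy uses
    lexical similarity instead, with a threshold tuned to the toy data.
    """
    groups = []
    for term in terms:
        for group in groups:
            if SequenceMatcher(None, term, group[0]).ratio() >= threshold:
                group.append(term)
                break
        else:
            groups.append([term])
    return groups

def image_queries(groups):
    """One representative query per group, to send to an image aggregator."""
    return [group[0] for group in groups]

terms = prune(NGRAM_CANDIDATES)
groups = group_similar(terms)
print(groups)          # e.g. groups 'boiled food' with 'soft food'
print(image_queries(groups))
```

In the real system, the final step would issue each representative query to Google Images, Flickr and Bing to gather training images for that sub-concept; here it just returns the query strings.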
LEVAN’s creators have suggested potential future applications, such as “co-reference resolution” (finding out which words refer to exactly the same thing, such as ‘Mahatma Gandhi’ and ‘Mohandas Gandhi’) and “temporal evolution of concepts” (distinguishing ‘1900 car’ from ‘1950 car’). So far, LEVAN has identified 50 different concepts and more than 50,000 sub-concepts, and tagged over 10 million images. You can try the system out for yourself on the project website.