What you always want to know about Word Embeddings

January 15, 2021

Thomas Thurner

Research and Marketing Manager

Have you recently tried to dig into the topic of Text Mining and stumbled over terms like BERT, ELMo, or word2vec? Semantic Web Company’s researcher Anna Breit has recently started ML ED (Machine Learning, Easily Digestible), a series of blog posts on medium.com that explains the key principles behind word embeddings.

How can words be represented as vectors? How can meaning then be expressed as the distance between vectors? What does it mean that so-called embeddings are representations of things in another world? And why do these vectors only approximate semantic relationships, with their quality depending heavily on the quality of the underlying data? Find easy answers in Anna’s first chapter of ML ED: Word Embeddings – An introduction.
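To make the idea of "meaning as distance" a bit more concrete, here is a minimal Python sketch with made-up toy vectors (the numbers are purely illustrative, not taken from any trained model): words that appear in similar contexts end up with similar vectors, and cosine similarity measures how close two vectors point.

```python
import numpy as np

# Toy word vectors (illustrative values only; real embeddings from
# word2vec or BERT have hundreds of dimensions and are learned from text)
vectors = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.78, 0.20]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones
print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much lower
```

In a real setting the vectors would come from a trained embedding model rather than being written by hand, but the comparison works the same way: similarity in meaning shows up as closeness in the vector space.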
