Word representation is a crucial topic in natural language processing (NLP). It efficiently converts textual data into numerical vectors, which is essential for understanding complex text data that is rich in information and has a wide range of uses. Moreover, different machine learning (ML) methods can exploit these representations for a variety of NLP applications. This study offers an overview of the fundamental word representation models, presenting the advantages and limitations of each model and how they affect the final result. It also examines the key advancements and milestones achieved in the domain of NLP, enabling readers to gain a deeper understanding of the field and inspiring them to explore it further.
Keywords: NLP, Word Representation, Classical Approaches, Static Word Embeddings, Contextual Embedding Models.