
Review Article



Word Representation Techniques in Natural Language Processing

Laaroussi Houria, Guerouate Fatima, Sbihi Mohamed.




Abstract

Word representation is a crucial topic in natural language processing (NLP). It converts textual data into numerical vectors efficiently, and such representations are essential for comprehending complex text data, which is rich in information and has a wide range of uses. Moreover, different machine learning (ML) methods can exploit these representations for a variety of NLP applications. This study offers an understanding of the basic word representation models, presents the advantages and limitations of each model, and discusses how they affect the final result. It also examines the key advancements and milestones achieved in the domain of NLP, enabling readers to gain a deeper comprehension of the field and inspiring them to explore it further.

Keywords: NLP, Word Representation, Classical Approaches, Static Word Embeddings, Contextual Embedding Models.
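
Illustrative example (not part of the original abstract): to make concrete what "converting textual data into numerical vectors" means, the short Python sketch below builds a classical bag-of-words representation of a toy corpus. The corpus, function names, and vocabulary are illustrative assumptions, not material taken from the reviewed paper.

    # Minimal bag-of-words sketch: each sentence becomes a vector of token
    # counts over a shared vocabulary. Corpus and helper names are illustrative.
    from collections import Counter

    def build_vocabulary(corpus):
        """Assign each distinct lowercase token an integer index."""
        vocab = {}
        for sentence in corpus:
            for token in sentence.lower().split():
                vocab.setdefault(token, len(vocab))
        return vocab

    def bag_of_words(sentence, vocab):
        """Represent a sentence as a count vector over the vocabulary."""
        counts = Counter(sentence.lower().split())
        return [counts.get(token, 0) for token in vocab]

    corpus = ["NLP converts text into vectors",
              "vectors feed machine learning models"]
    vocab = build_vocabulary(corpus)
    for sentence in corpus:
        print(sentence, "->", bag_of_words(sentence, vocab))

Static word embeddings (e.g., Word2Vec, GloVe) replace such sparse count vectors with dense learned vectors, and contextual embedding models (e.g., BERT) further make each word's vector depend on its surrounding sentence.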





