Abstract
This study introduces a method for improving word vectors that addresses the limitations of traditional approaches such as Word2Vec and GloVe by enriching embeddings with additional semantic properties. Our approach uses supervised learning to shift vectors in the representation space, enhancing the quality of word embeddings and ensuring better alignment with semantic reference resources such as WordNet. The effectiveness of the method is demonstrated by applying the modified embeddings to text classification and clustering. We also show how the method influences document class distributions, visualized through PCA projections. By comparing our results with state-of-the-art approaches and achieving better accuracy, we confirm the effectiveness of the proposed method. The results underscore the potential of adaptive embeddings to improve both the accuracy and efficiency of semantic analysis across a range of NLP tasks.
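The abstract does not specify the exact update rule, so the following is only a minimal sketch of one retrofitting-style way that word vectors can be shifted toward neighbours drawn from a lexical resource such as WordNet. The function name align_with_lexicon, the parameters alpha and iterations, and the input formats are illustrative assumptions, not the paper's published procedure.

```python
# Hypothetical sketch: nudge each word vector toward the centroid of its
# lexically related neighbours (e.g., WordNet synonyms), while keeping it
# anchored to its original distributional vector. Not the authors' method.
import numpy as np

def align_with_lexicon(embeddings, lexicon, alpha=0.1, iterations=10):
    """embeddings: dict word -> np.ndarray; lexicon: dict word -> set of related words."""
    adjusted = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, neighbours in lexicon.items():
            present = [n for n in neighbours if n in adjusted]
            if word not in adjusted or not present:
                continue
            # Centroid of the current vectors of the word's semantic neighbours.
            centroid = np.mean([adjusted[n] for n in present], axis=0)
            # Interpolate between the original vector and the neighbour centroid.
            adjusted[word] = (1 - alpha) * embeddings[word] + alpha * centroid
    return adjusted
```

With alpha close to 0 the vectors stay near their original positions; larger values pull them more strongly toward their WordNet neighbours, which is the kind of trade-off a supervised alignment step would tune.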
Details
- Category: Articles
- Type: journal articles
- Published in: Applied Sciences-Basel, no. 14, ISSN: 2076-3417
- Language: English
- Publication year: 2024
- Bibliographic description: Szymański J., Operlejn M., Weichbroth P.: Enhancing Word Embeddings for Improved Semantic Alignment // Applied Sciences-Basel, iss. 14/24 (2024), p. 11519
- DOI: 10.3390/app142411519
- Verified by: Gdańsk University of Technology