Search results for: WORD EMBEDDINGS - Bridge of Knowledge

  • Towards semantic-rich word embeddings

    Publication

    - Annals of Computer Science and Information Systems - Year 2019

    In recent years, word embeddings have been shown to improve performance on NLP tasks such as syntactic parsing or sentiment analysis. While useful, they are problematic for representing ambiguous words with multiple meanings, since they keep a single representation for each word in the vocabulary. Constructing separate embeddings for the meanings of ambiguous words could be useful for solving the Word Sense Disambiguation (WSD)...

    Full text available to download
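
    The sense-embedding idea mentioned in this abstract can be illustrated with a toy example: if a single vector per word conflates the senses of an ambiguous word, one vector per sense can be built instead, e.g. by averaging the embeddings of words from sense-annotated contexts. The sketch below is only illustrative, not the paper's method; the vectors and context words are made up.

    ```python
    import numpy as np

    # Toy pretrained word vectors; in practice these would come from a model
    # such as word2vec or GloVe (the values here are made up).
    word_vectors = {
        "bank":    np.array([0.8, 0.1, 0.3]),
        "river":   np.array([0.9, 0.0, 0.1]),
        "money":   np.array([0.1, 0.9, 0.2]),
        "deposit": np.array([0.2, 0.8, 0.3]),
    }

    def sense_embedding(context_words):
        """Approximate a sense vector by averaging the embeddings of words
        that occur in (hypothetical) sense-annotated contexts."""
        vecs = [word_vectors[w] for w in context_words if w in word_vectors]
        return np.mean(vecs, axis=0)

    # One vector per sense of the ambiguous word "bank".
    bank_finance = sense_embedding(["money", "deposit"])
    bank_river   = sense_embedding(["river"])

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Disambiguate a new occurrence of "bank" by comparing its context
    # vector against each sense vector.
    context = sense_embedding(["deposit", "money"])
    print("finance sense:", cosine(context, bank_finance))
    print("river sense:  ", cosine(context, bank_river))
    ```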

  • An Analysis of Neural Word Representations for Wikipedia Articles Classification

    Publication

    - CYBERNETICS AND SYSTEMS - Year 2019

    One of the currently popular methods of generating word representations is an approach based on the analysis of large document collections with neural networks. It creates so-called word embeddings, which attempt to learn relationships between words and encode this information in the form of low-dimensional vectors. The goal of this paper is to examine the differences between the most popular embedding models and the typical bag-of-words...

    Full text to download in external service
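
    The contrast described above, between sparse bag-of-words counts and dense neural word embeddings, can be shown in a few lines. The sketch below is a rough illustration under assumed settings (a tiny stand-in corpus and gensim's Word2Vec with arbitrary parameters), not the experimental setup of the paper.

    ```python
    import numpy as np
    from gensim.models import Word2Vec
    from sklearn.feature_extraction.text import CountVectorizer

    # Tiny stand-in corpus; a real experiment would use Wikipedia articles.
    docs = [
        "neural networks learn word representations",
        "bag of words counts word occurrences",
        "dense vectors encode relationships between words",
    ]
    tokenized = [d.split() for d in docs]

    # Bag-of-words: one sparse dimension per vocabulary word.
    bow = CountVectorizer().fit_transform(docs)
    print("bag-of-words shape:", bow.shape)

    # Word embeddings: low-dimensional dense vectors learned by a neural
    # model; a document can be represented by averaging its word vectors.
    w2v = Word2Vec(tokenized, vector_size=25, min_count=1, epochs=50, seed=0)
    doc_vectors = np.array([
        np.mean([w2v.wv[w] for w in doc], axis=0) for doc in tokenized
    ])
    print("embedding matrix shape:", doc_vectors.shape)
    ```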

  • LSA Is not Dead: Improving Results of Domain-Specific Information Retrieval System Using Stack Overflow Questions Tags

    Publication

    - Year 2024

    The paper presents an approach to using tags from Stack Overflow questions as a data source in the process of building domain-specific unsupervised term embeddings. Using a huge dataset of Stack Overflow posts, our solution employs the LSA algorithm to learn latent representations of information technology terms. The paper also presents the Teamy.ai system, currently developed by the Scalac company, which serves as a platform that...

    Full text available to download
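
    The abstract above describes building term embeddings by applying LSA to Stack Overflow tags. A minimal sketch of that general recipe follows, assuming a tiny made-up tag sample and scikit-learn's TruncatedSVD as the LSA step; the actual data and implementation used by the authors may differ.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Each "document" is the tag list of one Stack Overflow question
    # (a tiny illustrative sample, not the dataset used in the paper).
    questions_tags = [
        "python pandas dataframe",
        "python numpy array",
        "scala akka actor",
        "scala sbt build",
        "javascript react frontend",
    ]

    # Build a term-document matrix: questions x tags, then transpose
    # so that each row corresponds to one tag/term.
    vectorizer = TfidfVectorizer(token_pattern=r"\S+")
    doc_term = vectorizer.fit_transform(questions_tags)
    term_doc = doc_term.T

    # LSA: truncated SVD projects each term into a low-dimensional latent space.
    svd = TruncatedSVD(n_components=3, random_state=0)
    term_embeddings = svd.fit_transform(term_doc)

    # Tags that co-occur in similar questions (e.g. "pandas" and "numpy")
    # should end up close to each other in the latent space.
    terms = list(vectorizer.get_feature_names_out())
    sims = cosine_similarity(term_embeddings)
    python_idx = terms.index("python")
    for term, s in sorted(zip(terms, sims[python_idx]), key=lambda x: -x[1])[:4]:
        print(term, round(float(s), 2))
    ```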