  1. How to fetch vectors for a word list with Word2Vec?

    I want to create a text file that is essentially a dictionary, with each word being paired with its vector representation through word2vec. I'm assuming the process would be to first train …

  2. How to use word2vec to calculate the similarity distance by giving …

    Word2vec is an open-source tool from Google for computing distances between words. Given an input word, it outputs a list of words ranked by similarity.

  3. What is the concept of negative-sampling in word2vec? [closed]

    The terminology is borrowed from classification, a common application of neural networks. There you have a bunch of positive and negative examples. With word2vec, for any given word you …

  4. Classic king - man + woman = queen example with pretrained …

    Dec 12, 2022 · I am really desperate: I just cannot reproduce the allegedly classic example of king - man + woman = queen with the word2vec package in R and any (!) pre-trained embedding …

  5. SpaCy: how to load Google news word2vec vectors?


  6. How to load a pre-trained Word2vec MODEL File and reuse it?

    Nov 29, 2017 ·

        import gensim
        # Load pre-trained Word2Vec model.
        model = gensim.models.Word2Vec.load("modelName.model")

    Now you can train the model as usual. …

  7. How to get vector for a sentence from the word2vec of tokens in ...

    Apr 21, 2015 · It is possible, but not from word2vec. The composition of word vectors in order to obtain higher-level representations for sentences (and further for paragraphs and documents) …

  8. What's the major difference between glove and word2vec?

    May 10, 2019 · What is the difference between word2vec and GloVe? Are both ways to train a word embedding? If yes, how can we use both?

  9. What are the differences between contextual embedding and …

    Jun 8, 2020 · Word embeddings and contextual embeddings are slightly different. While both word embeddings and contextual embeddings are obtained from the models using …

  10. What is the ideal "size" of the vector for each word in Word2Vec?

    Jun 21, 2022 ·

        model = gensim.models.Word2Vec.load("w2model.trained")
        vec = []
        finalvecs = []
        # tokens is a list of over 1 million rows
        for token in tokens:
            for word in token: …