Create Word Embeddings | Word Embeddings

Task

Now, it's time for you to train a Word2Vec model to generate word embeddings for the given corpus (a sketch of one possible solution follows the steps):

  1. Import the class for creating a Word2Vec model.
  2. Tokenize each sentence in the 'Document' column of the corpus by splitting each sentence into words separated by whitespace. Store the result in the sentences variable.
  3. Initialize the Word2Vec model by passing sentences as the first argument and setting the following values as keyword arguments, in this order:
    • embedding size: 50;
    • context window size: 2;
    • minimum frequency of words to include in the model: 1;
    • model: skip-gram.
  4. Print the top-3 most similar words to the word 'bowl'.
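
A minimal sketch of how these steps might look, assuming the corpus is a pandas DataFrame named corpus with a 'Document' column (the sample sentences below are placeholders, not the course data) and gensim 4.x keyword names (older gensim 3.x releases used size instead of vector_size):

import pandas as pd
from gensim.models import Word2Vec

# Placeholder corpus; the course supplies its own DataFrame with a 'Document' column.
corpus = pd.DataFrame({'Document': [
    'the cat ate from the bowl',
    'soup was served in a large bowl',
    'the dog knocked the bowl off the table',
]})

# Step 2: split each document on whitespace to get lists of tokens.
sentences = corpus['Document'].str.split()

# Step 3: train the Word2Vec model with the requested settings.
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding size
    window=2,        # context window size
    min_count=1,     # minimum word frequency to keep
    sg=1,            # 1 selects the skip-gram architecture
)

# Step 4: print the top-3 most similar words to 'bowl'.
print(model.wv.most_similar('bowl', topn=3))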

Was everything clear?

Section 4. Chapter 4