
Term frequency–inverse document frequency (TF-IDF):

TF-IDF = Term Frequency (TF) × Inverse Document Frequency (IDF)
       = (frequency of word `i` in document `n`) × log(N / df_i)

where N is the total number of documents and df_i is the number of documents containing word `i`. For example, with two documents and a three-word vocabulary, each document is represented by a three-dimensional TF-IDF embedding vector (one weight per vocabulary word). Based on the above information, TF-IDF gives (1) _________ (a. less b. more) weight to frequent words in the given document, while it gives (2) ____________ (a. less b. more) weight to words that are common across documents.
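As a sketch of how these weights come out of the formula, the snippet below computes TF-IDF vectors for a hypothetical two-document, three-word corpus; the documents, vocabulary, and the `df` / `tfidf_vector` names are illustrative assumptions, not the ones from the original example.

```python
import math

# Hypothetical corpus: two documents over a three-word vocabulary
# (illustrative stand-ins for the documents in the original example).
docs = [
    ["apple", "apple", "banana"],    # document 1
    ["banana", "cheese", "cheese"],  # document 2
]
vocab = ["apple", "banana", "cheese"]

N = len(docs)  # total number of documents

# df[w]: number of documents containing word w
df = {w: sum(1 for d in docs if w in d) for w in vocab}

def tfidf_vector(doc):
    """TF-IDF weight for each vocabulary word: tf(w, doc) * log(N / df[w])."""
    return [doc.count(w) * math.log(N / df[w]) for w in vocab]

for n, doc in enumerate(docs, start=1):
    print(f"document {n}:", [round(x, 3) for x in tfidf_vector(doc)])
```

Note that in this toy corpus the word `banana` appears in both documents, so its IDF is log(2/2) = 0: this is how the IDF factor pushes down the weight of words that are common across documents.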


An economist predicts consumer choice between banana (1), apple (2), and cheese (3) using two input features through the above artificial neural network. Based on the above figure, if the outputs of the softmax function for each class in the output layer are 0.1 for banana, 0.2 for apple, and 0.7 for cheese, the consumer will purchase (1) ____________ (a. banana b. apple c. cheese; 2 points). This example is a case of (2) ___________ (a. regression b. binary classification c. multi-class classification; 2 points). Here, the output of the softmax function in the output layer is the probability of y, where y denotes the class label (banana, apple, or cheese) given the input features.
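A minimal sketch of the prediction step described here, using the softmax probabilities stated in the question; the logits and variable names are hypothetical and were chosen only so that the softmax computation roughly reproduces 0.1, 0.2, and 0.7.

```python
import math

# Softmax probabilities from the output layer, as stated in the question.
probs = {"banana": 0.1, "apple": 0.2, "cheese": 0.7}

# The predicted purchase is the class with the highest probability.
prediction = max(probs, key=probs.get)
print("predicted purchase:", prediction)  # -> cheese

# How softmax turns raw output-layer scores (logits) into probabilities
# that sum to 1. These logits are hypothetical values picked so that the
# result is approximately 0.1 / 0.2 / 0.7.
logits = [1.0, 1.7, 2.95]          # banana, apple, cheese
exps = [math.exp(z) for z in logits]
softmax = [e / sum(exps) for e in exps]
print("softmax:", [round(p, 2) for p in softmax])  # [0.1, 0.2, 0.7]
```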