{"id":288,"date":"2020-04-12T22:11:03","date_gmt":"2020-04-12T13:11:03","guid":{"rendered":"http:\/\/localhost:8000\/?p=288"},"modified":"2021-01-17T10:28:19","modified_gmt":"2021-01-17T01:28:19","slug":"tf-idf","status":"publish","type":"post","link":"http:\/\/localhost:8000\/2020\/04\/tf-idf.html","title":{"rendered":"TF-IDF\u306e\u304a\u52c9\u5f37"},"content":{"rendered":"

This is a transcription of material I presented at a company study session a little while ago. TF-IDF turned out to be approachable and easy to understand even for natural language processing beginners.<\/p>\n

Hand calculation based on the Wikipedia description<\/h2>\n

Overview<\/h3>\n

Wikipedia<\/a> gives the following explanation.<\/p>\n

\n

TF-IDF is a technique for evaluating the importance of the words contained in a document, and it is used mainly in fields such as information retrieval and topic analysis.
\nIt is computed from two measures:
\nTF (Term Frequency, how often a word occurs) and
\nIDF (Inverse Document Frequency).<\/p>\n<\/blockquote>\n

So TF-IDF is a technique that evaluates the importance of each word in a document based on two measures: TF (term frequency) and IDF (inverse document frequency).<\/p>\n

The formula from Wikipedia<\/h3>\n

Looking at the formula, we can see that the TF-IDF value is simply the product of a TF value and an IDF value.<\/p>\n

The TF value is a word's frequency of occurrence within a document<\/u>. Under this formula, for example, in I have a pen. I have an apple.<\/code> the word have<\/code> has its 2 occurrences divided by the 8 words in total, giving 2\/8 = 0.25. Words that appear frequently in a document get a large TF value and are judged important.<\/p>\n
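As a quick sanity check of the TF arithmetic above (my own snippet, not from the original slides), the same 2\/8 = 0.25 can be reproduced in a few lines of Python:

```python
# Tokenize the example sentence: strip periods, lowercase, split on whitespace
words = "I have a pen. I have an apple.".replace(".", "").lower().split()

# TF of "have": occurrences of the word divided by the total word count
tf_have = words.count("have") / len(words)
print(tf_have)  # 0.25 (2 occurrences out of 8 words)
```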

The IDF value is the inverse<\/u> of a word's document frequency, i.e. of the number of documents in the whole collection in which the word appears (the formula takes the logarithm of this ratio). Words that occur everywhere, such as a<\/code> or I<\/code>, have a high document frequency, so their inverse document frequency is naturally small. In other words, common words get a small value and are judged unimportant. IDF therefore acts as a filter that screens out the common words of the corpus.<\/p>\n
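To see the common-word filter effect concretely, here is a small sketch on a toy three-document corpus of my own (not from the article), using the base-10 logarithm that also appears in the hand calculation later:

```python
import math

# Toy corpus (my own example): "a" appears in every document,
# while "blue" appears in only one.
docs = [
    "i have a pen",
    "a blue pen",
    "a red pen",
]

def idf(term: str, docs: list) -> float:
    # Document frequency: how many documents contain the term
    df = sum(1 for d in docs if term in d.split())
    # Inverse document frequency, log base 10
    return math.log10(len(docs) / df)

print(idf("a", docs))     # 0.0 -> ubiquitous word, effectively filtered out
print(idf("blue", docs))  # ~0.477 -> rare word, kept
```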

tfidf_{i,j} = tf_{i,j} * idf_{i} = \\frac{n_{i,j}}{\\sum_{k}n_{k,j}} * \\log\\frac{\\vert D \\vert}{\\vert \\{ d: d \\ni t_{i} \\}\\vert}\\\\\n\u3000\u3000n_{i,j} : number of occurrences of word i in document j\\\\\n\u3000\u3000\\vert D \\vert : total number of documents\\\\\n\u3000\u3000\\vert \\{ d: d \\ni t_{i} \\}\\vert : number of documents that contain word i<\/code><\/pre>\n

Calculating by hand<\/h3>\n
Document 1: I have a red pen and a blue pen\nDocument 2: I like red\nDocument 3: You have a pen\nDocument 4: I have a red mechanical pencil<\/code><\/pre>\n

Now let's actually compute TF-IDF values for the four documents above. Computing every word would be a lot of work, so I only compute three of them (the logarithms below are base 10).<\/p>\n

blue in document 1: tfidf_{blue,1} = \\frac{1}{9} * \\log\\frac{4}{1} = 0.066\\\\\n\u3000red in document 1: tfidf_{red,1} = \\frac{1}{9} * \\log\\frac{4}{3} = 0.013\\\\\n\u3000red in document 2: tfidf_{red,2} = \\frac{1}{3} * \\log\\frac{4}{3} = 0.041<\/code><\/pre>\n
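The hand-calculated values can also be checked mechanically. The following snippet (my own, using base-10 logarithms and no normalization, matching the formula above) reproduces them:

```python
import math

docs = [
    "I have a red pen and a blue pen",  # document 1
    "I like red",                       # document 2
    "You have a pen",                   # document 3
    "I have a red mechanical pencil",   # document 4
]
token_lists = [d.lower().split() for d in docs]

def tfidf(term: str, doc_idx: int) -> float:
    tokens = token_lists[doc_idx]
    # TF: occurrences in the document / total words in the document
    tf = tokens.count(term) / len(tokens)
    # Document frequency: number of documents containing the term
    df = sum(1 for toks in token_lists if term in toks)
    # IDF with base-10 logarithm, as in the hand calculation
    return tf * math.log10(len(token_lists) / df)

print(tfidf("blue", 0))  # ~0.0669
print(tfidf("red", 0))   # ~0.0139
print(tfidf("red", 1))   # ~0.0416
```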

Implementation with gensim's TfidfModel<\/h2>\n

Python implementation<\/h3>\n

I implemented it using gensim's TfidfModel<\/code>. The horizontal scrolling makes it a bit hard to read, but the implementation itself is very simple.<\/p>\n

from typing import List, Tuple\nimport string\nimport decimal\nfrom decimal import Decimal\n\nimport nltk\nfrom nltk import tokenize\nfrom nltk.stem.porter import PorterStemmer\nfrom nltk.corpus import stopwords\nfrom gensim import corpora\nfrom gensim import models\nfrom gensim.interfaces import TransformedCorpus\n\nnltk.download('punkt')\nnltk.download('stopwords')\n\ndef tfidf(sentences: List[str]) -> TransformedCorpus:\n    # Tokenize into words\n    words_list: List[List[str]] = list(tokenize.word_tokenize(sentence) for sentence in sentences)\n    # Lowercase\n    words_list = list(list(word.lower() for word in words) for words in words_list)\n    # Build the dictionary (word string -> word ID)\n    dictionary: corpora.Dictionary = corpora.Dictionary(words_list)\n    # Convert to a bag-of-words corpus\n    corpus: List[List[Tuple[int, int]]] = list(map(lambda words: dictionary.doc2bow(words), words_list))\n    # Build the TF-IDF model\n    tfidf_model: models.TfidfModel = models.TfidfModel(corpus)\n    # Apply the model\n    tfidf_corpus: TransformedCorpus = tfidf_model[corpus]\n    return tfidf_corpus\n\nsentences: List[str] = [\n    'I have a red pen and a blue pen',\n    'I like red',\n    'You have a pen',\n    'I have a red mechanical pencil',\n]\ntfidf(sentences)<\/code><\/pre>\n
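As a side note, the dictionary and doc2bow steps can be imitated in plain Python to see the data structures they produce. This is my own illustration of the idea, not gensim's actual implementation (gensim may assign word IDs differently):

```python
from typing import Dict, List, Tuple

def build_dictionary(words_list: List[List[str]]) -> Dict[str, int]:
    # Assign an integer ID to each unique token, in order of first appearance
    # (corpora.Dictionary plays this role in the gensim version)
    token2id: Dict[str, int] = {}
    for words in words_list:
        for word in words:
            if word not in token2id:
                token2id[word] = len(token2id)
    return token2id

def doc2bow(words: List[str], token2id: Dict[str, int]) -> List[Tuple[int, int]]:
    # Count occurrences per token ID, like Dictionary.doc2bow
    counts: Dict[int, int] = {}
    for word in words:
        word_id = token2id[word]
        counts[word_id] = counts.get(word_id, 0) + 1
    return sorted(counts.items())

words_list = [s.lower().split() for s in ['I like red', 'You have a pen']]
token2id = build_dictionary(words_list)
print(doc2bow(words_list[0], token2id))  # [(0, 1), (1, 1), (2, 1)]
```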

Let me walk through the implementation step by step.<\/p>\n