What are the side effects of using a large max_gram for an nGram tokenizer in elasticsearch?


I want to understand the implications of using a large max_gram setting with the nGram tokenizer. I know it explodes the size of the index, but what else? Does it make searches slower? Can it cause things to error out? Etc.

It will certainly make searches slower, because many more tokens are generated and have to be compared at query time.
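To get a feel for where those tokens come from, you can experiment with the _analyze API and an inline ngram tokenizer definition. The sketch below is purely illustrative (the sample text is made up), and it deliberately keeps the gram range narrow: note that Elasticsearch 7+ also caps max_gram minus min_gram at the index setting index.max_ngram_diff (default 1), so a very wide range defined on an index is simply rejected unless that setting is raised.

```
POST /_analyze
{
  "tokenizer": {
    "type": "ngram",
    "min_gram": 3,
    "max_gram": 4
  },
  "text": "AB-12345"
}
```

Even this small range turns an 8-character value into 11 tokens (six 3-grams plus five 4-grams). Each extra unit of max_gram adds roughly one more token per character position of the input, so pushing max_gram up multiplies the number of terms that have to be stored in the index and matched at search time.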

In general, you should analyze the business requirements and find an ngram size that suits the field. For example, for a product id you might support ngram search up to 20 characters (max_gram=20): people typically only remember 5 or 6 characters of a product id, so 20 is more than enough.
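As a concrete sketch of that advice (the index name, field name, and analyzer names below are hypothetical, not anything from the question), an index that ngram-tokenizes a product id up to 20 characters might be set up like this. index.max_ngram_diff is raised so a min_gram/max_gram spread of 3 to 20 is allowed at all, and the search side uses a plain keyword tokenizer so the query text itself is not ngrammed:

```
PUT /products
{
  "settings": {
    "index": { "max_ngram_diff": 17 },
    "analysis": {
      "tokenizer": {
        "product_id_ngram": {
          "type": "ngram",
          "min_gram": 3,
          "max_gram": 20
        }
      },
      "analyzer": {
        "product_id_index": {
          "type": "custom",
          "tokenizer": "product_id_ngram",
          "filter": ["lowercase"]
        },
        "product_id_search": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "product_id": {
        "type": "text",
        "analyzer": "product_id_index",
        "search_analyzer": "product_id_search"
      }
    }
  }
}
```

Once the index exists, running GET /products/_analyze with "analyzer": "product_id_index" and a sample id shows exactly how many terms each value produces, which is a quick way to estimate the index-size cost before committing to a large max_gram.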

