Link Details

A term is the unit of search in Lucene. A Lucene document consists of a set of terms. Tokenization is the process of splitting a string into tokens, or terms. A Lucene Tokenizer is what Lucene (and, correspondingly, Solr) uses to tokenize text.
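
As a minimal sketch of what this looks like in practice (assuming a recent Lucene release; older versions require a Version argument to the analyzer constructor, and the field name "body" plus the sample text are just placeholders), here is how a StandardAnalyzer turns a string into a stream of terms:

[code lang="java"]
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

import java.io.StringReader;

public class TokenizeDemo {
    public static void main(String[] args) throws Exception {
        // StandardAnalyzer wraps a StandardTokenizer plus basic filters (e.g. lowercasing).
        StandardAnalyzer analyzer = new StandardAnalyzer();

        // Produce a stream of terms for an illustrative field named "body".
        TokenStream stream = analyzer.tokenStream("body",
                new StringReader("Lucene splits text into searchable terms."));
        CharTermAttribute termAttr = stream.addAttribute(CharTermAttribute.class);

        stream.reset();                     // must be called before consuming the stream
        while (stream.incrementToken()) {   // advance to the next term
            System.out.println(termAttr.toString());
        }
        stream.end();
        stream.close();
        analyzer.close();
    }
}
[/code]

Running this prints one term per line (lucene, splits, text, into, searchable, terms); each of those terms is what Lucene actually indexes and matches against at search time.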

Posted by egenesky  |   Nov 25 2012 / 18:11
