This section describes how Solr adds data to its index. It covers the following topics:
- Introduction to Solr Indexing: An overview of Solr's indexing process.
- Post Tool: Information about using `post.jar` to quickly upload some content to your system.
- Uploading Data with Index Handlers: Information about using Solr's Index Handlers to upload XML/XSLT, JSON, and CSV data.
- Transforming and Indexing Custom JSON: Information about indexing any JSON of your choice.
- Uploading Data with Solr Cell using Apache Tika: Information about using the Solr Cell framework to upload data for indexing.
- Uploading Structured Data Store Data with the Data Import Handler: Information about uploading and indexing data from a structured data store.
- Updating Parts of Documents: Information about using atomic updates and optimistic concurrency with Solr.
- Detecting Languages During Indexing: Information about using language identification during the indexing process.
- De-Duplication: Information about configuring Solr to mark duplicate documents as they are indexed.
- Content Streams: Information about streaming content to Solr Request Handlers.
- UIMA Integration: Information about integrating Solr with Apache's Unstructured Information Management Architecture (UIMA). UIMA lets you define custom pipelines of Analysis Engines that incrementally add metadata to your documents as annotations.
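Several of the topics above ultimately come down to sending documents to a collection's `/update` endpoint. As a quick illustration, here is a minimal Python sketch that builds such a JSON update request; the `techproducts` collection name and local URL are assumptions, so adjust them for your own installation:

```python
import json
from urllib import request

# Hypothetical local Solr location and collection name, for illustration only.
SOLR_UPDATE_URL = "http://localhost:8983/solr/techproducts/update"

def build_update_request(docs, commit=True):
    """Build the URL and JSON body for a Solr update request.

    Solr's JSON update handler accepts a list of documents POSTed to
    /update with Content-Type application/json; commit=true makes the
    new documents visible to searches immediately.
    """
    url = SOLR_UPDATE_URL + ("?commit=true" if commit else "")
    body = json.dumps(docs).encode("utf-8")
    return url, body

def index_docs(docs):
    """Send the documents to Solr (requires a running Solr instance)."""
    url, body = build_update_request(docs)
    req = request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `index_docs([{"id": "1", "name": "Example doc"}])` would add one document and commit it, assuming Solr is running on the URL above.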
Indexing Using Client APIs
Using client APIs, such as SolrJ, from your applications is an important option for updating Solr indexes. See the Client APIs section for more information.
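SolrJ itself is Java, but as a rough illustration of what any client API wraps, here is a hypothetical minimal client class in Python; the class name and URL are illustrative inventions, not a real library, and the HTTP shape (POST JSON documents to `/update`) matches Solr's standard update handler:

```python
import json
from urllib import request

class MiniSolrClient:
    """Hypothetical minimal client illustrating what libraries like
    SolrJ wrap: adding documents becomes an HTTP POST to /update."""

    def __init__(self, base_url):
        # e.g. "http://localhost:8983/solr/mycollection" (illustrative)
        self.base_url = base_url.rstrip("/")

    def _build_add(self, docs, commit=False):
        # Build the update URL and JSON body; commit=true makes the
        # documents searchable right away.
        params = "?commit=true" if commit else ""
        url = self.base_url + "/update" + params
        return url, json.dumps(docs).encode("utf-8")

    def add(self, docs, commit=False):
        """Send documents to Solr (requires a running instance)."""
        url, body = self._build_add(docs, commit)
        req = request.Request(
            url, data=body, headers={"Content-Type": "application/json"}
        )
        with request.urlopen(req) as resp:
            return json.loads(resp.read())
```

A real client library adds connection pooling, load balancing across a cluster, and typed document binding on top of this basic request shape.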