# Search API Solr
This module provides an implementation of the Search API which uses an Apache
Solr search server for indexing and searching. Before enabling or using this
module, you have to set up a Solr server as described in the configuration
sections below.

In general, it is highly recommended to run Solr in Cloud mode (Solr Cloud)!
For a full description of the module, visit the
[project page](https://www.drupal.org/project/search_api_solr).
Submit bug reports and feature suggestions, or track changes in the
[issue queue](https://www.drupal.org/project/issues/search_api_solr). Support is
also provided on https://drupalchat.me/channel/search.
## Table of contents
- Requirements
- Installation
- Configuration: Solr Cloud - the modern way
- Configuration: Solr (single core) - the classic way
- Configuration: Solr Cloud - the classic way
- Using Jump-Start config-sets and docker images
- Updating Solr
- Search API Solr features
- Customizing your Solr server
- Troubleshooting Views
- Troubleshooting Facets
- Development
- Maintainers
## Requirements
This module requires the [Search API](https://www.drupal.org/project/search_api)
module as well as several external libraries, which are installed automatically
when you add this module via Composer.
## Installation
Install as you would normally install a contributed Drupal module. For further
information, see
[Installing Drupal Modules](https://www.drupal.org/docs/extending-drupal/installing-drupal-modules).

Note that search_api_solr manages its dependencies and class loader via
Composer. So if you simply downloaded this module from drupal.org, you have to
delete it and install it again via Composer.
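For example, from the Drupal project root:

```shell
cd $DRUPAL
composer require drupal/search_api_solr
```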
## Configuration: Solr Cloud - the modern way
To set up Solr in Cloud mode locally you can either follow the instructions of
the [Apache Solr Reference Guide](https://solr.apache.org/guide/) or use
docker-compose with
https://github.com/docker-solr/docker-solr-examples/blob/master/docker-compose/docker-compose.yml
The preferred way for local development is to use DDEV, where you can easily
add [ddev-solr](https://github.com/ddev/ddev-solr) using these commands:

```shell
ddev get ddev/ddev-solr
ddev restart
```
For Drupal and Search API Solr you need to configure a Search API server using
Solr as backend and `Solr Cloud with Basic Auth` as its connector. The
credentials for Basic Authentication are stored in `.ddev/solr/security.json`.
Solr requires a Drupal-specific configset for any collection that should be
used to index Drupal's content. (In Solr Cloud, "collections" are the
equivalent of "cores" in classic Solr installations. Actually, in a big Solr
Cloud installation a collection might consist of multiple cores across all
Solr Cloud nodes.)
Starting from Search API Solr module version 4.2.1 you don't need to deal with
configsets manually anymore. Just enable the `search_api_solr_admin` sub-module,
which is part of Search API Solr. Now you can create or update your
"collections" at any time by clicking the "Upload Configset" button on the
Search API server details page (see the installation steps below), or use
`drush` to do this with:
```shell
ddev drush --numShards=1 search-api-solr:upload-configset SEARCH_API_SERVER_ID
```
Note: Replace `SEARCH_API_SERVER_ID` with your Search API server machine name.
The number of "shards" should always be "1" as this local installation only
runs a single Solr node.
### Installation steps
1. Enable the `search_api_solr_admin` module. (This sub-module is included in
   Search API Solr >= 4.2.1.)
2. Create a search server using the Solr backend and select
   `Solr Cloud with Basic Auth` as connector:
   - HTTP protocol: `http`
   - Solr node: `solr`
   - Solr port: `8983`
   - Solr path: `/`
   - Default Solr collection: `techproducts` (You can define any name here. The
     collection will be created automatically.)
   - Username: `solr`
   - Password: `SolrRocks`
3. On the server's "view" page click the `Upload Configset` button and check
   the "Upload (and overwrite) configset" checkbox.
4. Set the number of shards to `1`.
5. Press `Upload`.
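Once the configset has been uploaded and an index has been attached to the
server, you can index content and check the status from the command line. The
commands below are provided by the Search API module (run `ddev drush list` to
confirm the exact command names available in your setup):

```shell
ddev drush search-api:status
ddev drush search-api:index
```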
## Configuration: Solr (single core) - the classic way
In order for this module to work, you need to set up a Solr server. For this,
you can either purchase a server from a web Solr host or set up your own Solr
server. If you want to use a hosted solution, check the options listed on the
module's [project page](https://drupal.org/project/search_api_solr). Otherwise,
please follow the instructions in this section.
Note: A more detailed set of instructions is available at:
- https://lucene.apache.org/solr/guide/8_4/installing-solr.html
- https://lucene.apache.org/solr/guide/8_4/taking-solr-to-production.html
- https://lucene.apache.org/solr/guide/ - list of other version specific guides
As a pre-requisite for running your own Solr server, you'll need a Java JRE.
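To check whether a suitable Java runtime is already available, you can run:

```shell
java -version
```

The required Java version depends on the Solr release you want to run.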
Before creating a core you need a Drupal-specific config-set. You can download
it from your Search API server's details page (or via `drush solr-gsc`, see
below) and extract it on the Solr host. The Search API Solr module will create
the correct configs for you!

**_Now_** you can create a Solr core using this config-set on a running Solr
server. There are different ways to do so. For most Linux distributions you can
run:
```shell
sudo -u solr $SOLR/bin/solr create_core -c $CORE -d $CONF -n $CORE
```
You will see something like:

```
$ sudo -u solr /opt/solr/bin/solr create_core -c test-core -d /tmp/solr-conf -n test-core
Copying configuration to new core instance directory:
/var/solr/data/test-core
```
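To verify the result, Solr's CoreAdmin API reports the status of a core. Host,
port and core name below match the example above and have to be adjusted to
your setup:

```shell
curl "http://localhost:8983/solr/admin/cores?action=STATUS&core=test-core"
```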
If you're forced to create the core before you can run Drupal to generate the
config-set you could also use the appropriate jump-start config-set you'll
find in the _jump-start_ directory of this module.
**You must not create a core without a proper Drupal config-set!**
If you do so - even by accident - you won't recognize it immediately, but you
will run into problems later.
For configuring indexes and searches you have to follow the documentation of
search_api.
## Configuration: Solr Cloud - the classic way
Instead of a single core you have to create a collection in your Solr Cloud
instance. To do so you have to read the Solr handbook.
1. Create a Search API Server according to the search_api documentation using
   "Solr" or "Multilingual Solr" as Backend and the "Solr Cloud" or
   "Solr Cloud with Basic Auth" Connector.
2. Download the config.zip from the server's details page or by using
   `drush solr-gsc`.
3. Deploy the config.zip via ZooKeeper (see the sketch below).
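As a sketch of such a deployment, Solr's own `bin/solr zk upconfig` tool can
upload an extracted configset to ZooKeeper. The ZooKeeper address, configset
name and paths below are placeholders that depend on your installation:

```shell
# Extract the config.zip generated by Search API Solr.
unzip config.zip -d /tmp/drupal-solr-conf

# Upload the extracted configset to ZooKeeper under the name "drupal".
bin/solr zk upconfig -z localhost:9983 -n drupal -d /tmp/drupal-solr-conf
```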
### Using Linux specific Solr Packages
Note: The paths where the config.zip needs to be extracted to might differ from
the instructions above as well. For some distributions a directory like
_/var/solr_ or _/usr/local/solr_ exists.
## Using Jump-Start config-sets and docker images
This module contains a _jump-start_ directory where you'll find
docker-compose.yml files for various Solr versions. These use default
config-sets that will work for most Drupal use-cases.
This variant is suitable for evaluation and development purposes.
For production use you should switch to a config-set generated for your Search
API server as soon as you have the need for advanced features or
customizations.
![Jump Start Config-Sets](https://github.com/mkalkbrenner/search_api_solr/workflows/Jump%20Start%20Config-Sets/badge.svg?branch=4.x)
## Updating Solr
Whenever you update your Solr installation it is recommended that you generate
a new config-set and deploy it. The deployment depends on the installation
method you have chosen (see the sections above).

When performing a major version update like from Solr 6 to Solr 8 it is
recommended to delete the core or collection and recreate it like described in
the installation instructions above.
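For example, a fresh config-set can be generated with the `drush solr-gsc`
command mentioned above. The server machine name is a placeholder here, and
`drush help solr-gsc` lists the exact arguments and options:

```shell
drush solr-gsc SEARCH_API_SERVER_ID
```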
## Search API Solr features
All Search API datatypes are supported by using appropriate Solr datatypes for
indexing them.
Regarding third-party features, the following are supported:
- autocomplete
  - Introduced by module: search_api_solr_autocomplete
  - Lets you add autocompletion capabilities to search forms on the site.
- facets
  - Introduced by module: facets
  - Allows you to create faceted searches for dynamically filtering search
    results.
- more like this
  - Introduced by module: search_api
  - Lets you display items that are similar to a given one. Use, e.g., to
    create a "More like this" block for node pages built with Views.
- multisite
  - Introduced by module: search_api_solr
- spellcheck
  - Introduced by module: search_api_solr
  - Views integration provided by search_api_spellcheck
- attachments
  - Introduced by module: search_api_attachments
- location
  - Introduced by module: search_api_location
- NLP
  - Introduced by module: search_api_solr_nlp
  - Adds more fulltext field types based on natural language processing, for
    example field types that filter all words which aren't nouns. This is
    great for autocompletion.
If you feel some service option is missing, or have other ideas for improving
this implementation, please file a feature request in the project's issue queue,
at https://drupal.org/project/issues/search_api_solr.
### Processors
Please consider that, since Solr handles tokenizing, stemming and other
preprocessing tasks, activating any preprocessors in a search index' settings
is usually not needed and might even be counterproductive.

If you enable `Highlight retrieved data` on the index edit page, the
Highlighting processor will use this data directly and bypass its own logic. To
do the highlighting, Solr will use the configuration of the Highlighting
processor.
### Connectors
The communication details between Drupal and Solr are implemented by
connectors. This module already includes connectors for single Solr nodes
(`Standard`, `Basic Auth`) and for Solr Cloud (`Solr Cloud`,
`Solr Cloud with Basic Auth`).

There are service provider specific connectors available, for example from
Acquia, Pantheon, hosted solr, platform.sh, and others. Please contact your
provider for details if you don't run your own Solr server.
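Such provider-specific connectors are regular plugins that extend the
connectors shipped with this module. The sketch below only illustrates the
idea; the base class, annotation and configuration keys are assumptions based
on the module's plugin API, so compare it with the connectors in
`src/Plugin/SolrConnector` before using it:

```php
<?php

namespace Drupal\my_module\Plugin\SolrConnector;

use Drupal\search_api_solr\Plugin\SolrConnector\StandardSolrConnector;

/**
 * Connector with pre-configured defaults for a hypothetical Solr provider.
 *
 * @SolrConnector(
 *   id = "my_hosted_solr",
 *   label = @Translation("My hosted Solr"),
 *   description = @Translation("Standard connector with provider defaults.")
 * )
 */
class MyHostedSolrConnector extends StandardSolrConnector {

  /**
   * {@inheritdoc}
   */
  public function defaultConfiguration() {
    // Override only the connection defaults; everything else is inherited
    // from the standard connector.
    return [
      'scheme' => 'https',
      'port' => 443,
    ] + parent::defaultConfiguration();
  }

}
```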
## Customizing your Solr server
It's highly recommended that you don't modify the schema.xml and solrconfig.xml
files manually because this module dynamically generates them for you.

Instead, customizations like your own Solr field types can be provided as
config YAML files. Have a look at this module's config folder to see examples.
Such field types can target a specific Solr version and a "domain". For example
"Apple" means two different things in a "fruits" domain or a "computer" domain.
## Troubleshooting Views
When displaying search results from Solr in Views using the Search API Views
integration, you have the choice to fetch the displayed values from Solr by
enabling "Retrieve result data from Solr" instead of loading the original
entities.

Fields that only exist in the Solr documents and are not known to Drupal can
only be displayed if you enabled "Retrieve result data from Solr". In this case
you have to enable the "Solr dummy fields" processor and add as many dummy
fields to the index as you require. Afterwards you should manipulate these
fields via API.
## Troubleshooting Facets
Faceting on fulltext fields is not yet supported. We recommend the use of
string fields for that purpose.

If updating from Search API Solr 8.x-1.x or from Solr versions before 7 to Solr
7 or 8, check your Search API index' field configurations to avoid errors that
will lead to exceptions and zero results.
## Development
Whenever you need to enhance the functionality you should do it using the API
instead of extending the SearchApiSolrBackend class!
We leverage the [solarium library](http://www.solarium-project.org/). You can
also interact with solarium's API using our hooks and callbacks or via event
listeners. This way you can for example add any Solr-specific parameter to a
query you need.
But if you create Search API Queries by yourself in code there's an easier way.
You can simply set the required parameter as option prefixed by 'solr_param_'.
So these two lines are "similar":
```php
$search_api_query->setOption('solr_param_mm', '75%');
$solarium_query->setParam('mm', '75%');
```
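For illustration, here is a minimal sketch of setting such a parameter on a
Search API query built in code. The index ID `my_index` is a placeholder for
your own index:

```php
<?php

use Drupal\search_api\Entity\Index;

// Load the Search API index and build a query against it.
$index = Index::load('my_index');
$query = $index->query();
$query->keys('drupal solr');

// Any option prefixed with 'solr_param_' is passed to Solr as a raw query
// parameter, here the eDisMax "minimum should match" parameter.
$query->setOption('solr_param_mm', '75%');

$results = $query->execute();
```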
### Patches and Issues Workflow
Our test suite includes integration tests that require a real Solr server, a
requirement that can't be met by the drupal.org test infrastructure. Therefore
we leverage GitHub workflows for our tests and had to establish a more complex
workflow:
1. Open an issue on drupal.org as usual.
2. Upload the patch for review to that issue on drupal.org as usual.
3. Fork https://github.com/mkalkbrenner/search_api_solr.
4. Apply your patch and file a PR on GitHub.
5. Add a link to the GitHub PR to the drupal.org issue.
The PR will automatically be tested on GitHub and the test results will be
reflected in the PR conversation.
### Running the test suite locally
This module comes with a suite of automated tests. To execute those, you just
need to have a (correctly configured) Solr instance running at the following
address:
```
http://localhost:8983/solr/drupal
```
This represents a core named "drupal" in a default installation of Solr.
As long as your changes don't modify the config-set generation you could
leverage docker, too. You'll find ready-to-use docker-compose files in the
_jump-start_ directory.
The tests themselves can be started by running something like this in your
Drupal folder:

```shell
phpunit -c core --group search_api_solr
```

(The exact command varies depending on your setup and paths.)
## Maintainers
- Markus Kalkbrenner - [mkalkbrenner](https://www.drupal.org/u/mkalkbrenner)
- Thomas Seidl - [drunken monkey](https://www.drupal.org/u/drunken-monkey)