
Commit

feature #3331 [SearchBundle, NodeSearchBundle]: change `nGram` to `ngram` (delboy1978uk)

This PR was squashed before being merged into the 6.x branch.

Discussion
----------

| Q             | A
| ------------- | ---
| Bug fix?      | yes
| New feature?  | no
| BC breaks?    | no
| Deprecations? | no
| Fixed tickets | NA

When using Elasticsearch 8, Elastica throws the following exception:

> The [nGram] tokenizer name was deprecated in 7.6. Please use the tokenizer name to [ngram] for indices created in versions 8 or higher instead.

Version 7 accepts `ngram` as well as `nGram`, as documented here: https://www.elastic.co/guide/en/elasticsearch/reference/7.14/analysis-ngram-tokenizer.html#_example_output_13 . This pull request simply lowercases the G's, which allows the search population command to run successfully again.
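The change boils down to a case-insensitive comparison of the tokenizer type. A minimal stand-alone sketch (the helper name `isNgramTokenizer` is hypothetical, not part of the bundle):

```php
<?php
// Hypothetical helper illustrating the fix: compare the tokenizer type
// case-insensitively, so the legacy 'nGram' spelling and the 'ngram'
// spelling required by Elasticsearch 8 both match.
function isNgramTokenizer(array $tokenizer): bool
{
    return \strtolower($tokenizer['type']) === 'ngram';
}
```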


Commits
-------

1f7f51e [SearchBundle, NodeSearchBundle] : change `nGram` to `ngram`
211616a strtolower $tokenizer['type'] #3331
delboy1978uk authored Feb 18, 2024
1 parent d5619fd commit c4810d7
Showing 2 changed files with 2 additions and 2 deletions.
```diff
@@ -327,7 +327,7 @@ public function setAnalysis(Index $index, AnalysisFactoryInterface $analysis)
         $ngramDiff = 1;
         if (isset($analysers['tokenizer']) && count($analysers['tokenizer']) > 0) {
             foreach ($analysers['tokenizer'] as $tokenizer) {
-                if ($tokenizer['type'] === 'nGram') {
+                if (\strtolower($tokenizer['type']) === 'ngram') {
                     $diff = $tokenizer['max_gram'] - $tokenizer['min_gram'];

                     $ngramDiff = $diff > $ngramDiff ? $diff : $ngramDiff;
```
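The loop in the hunk above tracks the largest `max_gram` − `min_gram` spread among the configured tokenizers. A minimal sketch of that logic, with a hypothetical `largestNgramDiff` helper standing in for the surrounding method (the result is presumably used for the index-level `max_ngram_diff` setting, though the diff does not show that part):

```php
<?php
// Hypothetical helper mirroring the loop above: find the largest
// max_gram - min_gram difference among the configured ngram tokenizers.
function largestNgramDiff(array $tokenizers): int
{
    $ngramDiff = 1;
    foreach ($tokenizers as $tokenizer) {
        // Case-insensitive match accepts both 'nGram' and 'ngram'.
        if (\strtolower($tokenizer['type']) === 'ngram') {
            $diff = $tokenizer['max_gram'] - $tokenizer['min_gram'];
            $ngramDiff = $diff > $ngramDiff ? $diff : $ngramDiff;
        }
    }
    return $ngramDiff;
}
```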
```diff
@@ -98,7 +98,7 @@ public function addStripSpecialCharsFilter()
     public function addNGramTokenizer()
     {
         $this->tokenizers['kuma_ngram'] = [
-            'type' => 'nGram',
+            'type' => 'ngram',
             'min_gram' => 4,
             'max_gram' => 30,
             'token_chars' => ['letter', 'digit', 'punctuation'],
```
