Multi-lingual search with Lucene and Elasticsearch

Last night I gave a talk at SkillsMatter London on multi-lingual search with Lucene and Elasticsearch. The talk covered the main challenges in indexing texts in different languages: tokenization, term normalization, and stemming. I started by demonstrating those challenges on individual languages, and ended by discussing mixing texts in multiple languages in one index: whether it is possible at all, and how to approach it.
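
To give a taste of one common approach to the mixed-language problem, here is a minimal Lucene sketch of my own (not code from the talk): keep one field per language and give each field its own analysis chain, so every language gets the tokenization, normalization, and stemming rules it needs. The field names body_en and body_de are illustrative, and the no-argument analyzer constructors assume a recent Lucene version.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.de.GermanAnalyzer;
import org.apache.lucene.analysis.en.EnglishAnalyzer;
import org.apache.lucene.analysis.miscellaneous.PerFieldAnalyzerWrapper;
import org.apache.lucene.analysis.standard.StandardAnalyzer;

public class MultiLingualAnalyzer {

    // Builds an analyzer that routes each language field to its own
    // analysis chain, so English and German text are tokenized and
    // stemmed by the rules of their respective languages.
    public static Analyzer build() {
        Map<String, Analyzer> perField = new HashMap<>();
        perField.put("body_en", new EnglishAnalyzer()); // English stemming + stopwords
        perField.put("body_de", new GermanAnalyzer());  // German stemming + stopwords

        // Any field not listed above falls back to the language-neutral
        // StandardAnalyzer (Unicode tokenization, no stemming).
        return new PerFieldAnalyzerWrapper(new StandardAnalyzer(), perField);
    }
}
```

At index time you then write each document's text into the field matching its known (or detected) language, and at query time you analyze the query against that same field, so indexing and search agree on the analysis rules.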

We had some issues with the recording, so I had to repeat the first few slides (which is why I go very quickly in the opening minutes), and the audio quality could be better. Nevertheless, the talk presents real-world issues and offers what I believe are good approaches to solving them. Since this is quite a lot of material to cover in blog posts, I think I will just leave it as a video for now.

The video is available here: https://skillsmatter.com/skillscasts/4968-approaches-to-multi-lingual-text-search-with-elasticsearch-and-lucen.

