[[analysis-keyword-tokenizer]]
=== Keyword Tokenizer
A tokenizer of type `keyword` that emits the entire input as a single
token.
The following are settings that can be set for a `keyword` tokenizer
type:
[cols="<,<",options="header",]
|=======================================================
|Setting |Description
|`buffer_size` |The term buffer size. Defaults to `256`.
|=======================================================
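For illustration, a `keyword` tokenizer with an explicit `buffer_size` could be
wired into a custom analyzer via the index settings. The index, tokenizer, and
analyzer names below are placeholders, not part of the API:

[source,js]
--------------------------------------------------
PUT /my_index
{
    "settings": {
        "analysis": {
            "tokenizer": {
                "my_keyword_tokenizer": {
                    "type": "keyword",
                    "buffer_size": 256
                }
            },
            "analyzer": {
                "my_keyword_analyzer": {
                    "type": "custom",
                    "tokenizer": "my_keyword_tokenizer"
                }
            }
        }
    }
}
--------------------------------------------------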