Automatic mapping not supported for Elasticsearch
I'm using a connection to my Elasticsearch cluster to write some of my datasets into an index.
If I use the default mapping, the schema types are mapped correctly (date to date, string to text, integer to long, boolean to boolean), but text fields have no "keyword" subfield.
I know I can generate the "keyword" subfield for text fields by specifying the columns in "Columns with raw copy", but that is not very convenient, because the resulting field name is not standard ("fieldname.fieldname_facet" instead of "fieldname.keyword").
I tried to force automatic mapping by setting an "empty" custom mapping:
{
  "properties": {}
}
Doing this, it works, but with a specific behaviour:
- dates are mapped to dates (OK),
- strings are mapped to text with a "keyword" subfield of type keyword (OK).
But integers, floats and booleans are also mapped to text with a "keyword" subfield, whereas I expected them to be mapped to the corresponding numeric and boolean types.
It is as if, when the data is written to the index, the string form (with quotes) is used, ignoring the types defined in the dataset schema. (I get the same results when experimenting with the Elastic Console and the REST API.)
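If I understand correctly, this would match Elasticsearch's dynamic mapping rules: the inferred type depends on the JSON type of the value actually indexed, not on any external schema. A minimal illustration in the Console (index and field names are made up, two indices so each gets its own dynamic mapping):

```
PUT demo_a/_doc/1
{ "count": "42" }

PUT demo_b/_doc/1
{ "count": 42 }
```

In demo_a, "count" is dynamically mapped as text with a "keyword" subfield (the value is a JSON string); in demo_b it is mapped as long (the value is a JSON number). So if the values are serialized with quotes, everything ends up as text + keyword, which is exactly the behaviour I observe.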
So to obtain what automatic mapping should produce, I have to write a partial custom mapping just for the numeric and boolean fields to correct the mapping, which is not really satisfying.
Example:
{
  "properties": {
    "region": { "type": "long" },
    "serviceaffecting": { "type": "boolean" }
  }
}
So my question is: is there a way to get a mapping in Elasticsearch that is aligned with the dataset schema,
- either using the default mapping with standard support for text fields and their "keyword" subfields,
- or with better support for automatic mapping?
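To make the goal concrete, the string mapping I would like to end up with is the one Elasticsearch's dynamic mapping produces by default (field name is illustrative):

```json
"fieldname": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}
```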
Many thanks,
Marc