
libbeat: increase total_fields.limit to 12500 #41640

Merged
merged 3 commits on Nov 18, 2024
3 changes: 2 additions & 1 deletion CHANGELOG.asciidoc
@@ -18,6 +18,7 @@ https://github.com/elastic/beats/compare/v8.15.3\...v8.15.4[View commits]
*Affecting all Beats*

- Fix issue where old data could be saved in the memory queue after acknowledgment, increasing memory use. {pull}41356[41356]
- Fix metrics not being ingested, due to "Limit of total fields [10000] has been exceeded while adding new fields [...]". The total fields limit has been increased to 12500. No significant performance impact on Elasticsearch is anticipated. {pull}41640[41640]

*Filebeat*

@@ -132,7 +133,7 @@ https://github.com/elastic/beats/compare/v8.15.0\...v8.15.1[View commits]

*Affecting all Beats*

- Beats Docker images do not log to stderr by default. The workaround is to pass the CLI flag `-e` or to set `logging.to_stderr: true` in the configuration file.
- Beats stop publishing data after a network error unless restarted. Avoid upgrading to 8.15.1. Affected Beats log `Get \"https://${ELASTICSEARCH_HOST}:443\": context canceled` repeatedly. {issue}40705[40705]
- Memory usage is not correctly limited by the number of events actively in the memory queue, but rather the maximum size of the memory queue regardless of usage. {issue}41355[41355]

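The error quoted in the changelog entry comes from Elasticsearch's `index.mapping.total_fields.limit` setting, which libbeat writes into the index template it installs. As a rough illustration only (not the actual libbeat code), the fragment of template settings that carries the raised default could be built like this in plain Go:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Sketch of the index-template settings fragment that carries the
	// total-fields limit. The key path mirrors Elasticsearch's
	// index.mapping.total_fields.limit setting; 12500 is the new default.
	settings := map[string]interface{}{
		"index": map[string]interface{}{
			"mapping": map[string]interface{}{
				"total_fields": map[string]interface{}{
					"limit": 12500, // previously 10000
				},
			},
		},
	}

	body, _ := json.MarshalIndent(settings, "", "  ")
	fmt.Println(string(body))
	// Mappings that would exceed this many fields are rejected by
	// Elasticsearch with "Limit of total fields [N] has been exceeded ...".
}
```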
2 changes: 1 addition & 1 deletion libbeat/template/load_test.go
@@ -170,7 +170,7 @@ func TestFileLoader_Load(t *testing.T) {
"refresh_interval": "5s",
"mapping": mapstr.M{
"total_fields": mapstr.M{
"limit": 10000,
"limit": defaultTotalFieldsLimit,
},
},
"query": mapstr.M{
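The test now references the shared `defaultTotalFieldsLimit` constant instead of a hard-coded 10000, so the expectation tracks whatever value the package default holds. A stripped-down sketch of that pattern, with hypothetical names rather than the actual `load_test.go` fixtures:

```go
package template

import "testing"

// defaultLimit stands in for defaultTotalFieldsLimit in this sketch.
const defaultLimit = 12500

// buildSettings stands in for the code under test that embeds the default.
func buildSettings() map[string]int {
	return map[string]int{"total_fields_limit": defaultLimit}
}

func TestDefaultLimitIsApplied(t *testing.T) {
	got := buildSettings()["total_fields_limit"]
	// Compare against the constant, not a literal, so the assertion keeps
	// passing if the default is bumped again.
	if got != defaultLimit {
		t.Fatalf("expected %d, got %d", defaultLimit, got)
	}
}
```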
2 changes: 1 addition & 1 deletion libbeat/template/template.go
@@ -35,7 +35,7 @@ import (
var (
// Defaults used in the template
defaultDateDetection = false
defaultTotalFieldsLimit = 10000
defaultTotalFieldsLimit = 12500
defaultMaxDocvalueFieldsSearch = 200

defaultFields []string
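With the default raised to 12500, indices created from a freshly loaded template should report the new limit. A minimal sketch for checking the effective value via the Elasticsearch index settings API, assuming a local cluster at `localhost:9200` and a hypothetical `metricbeat-*` index pattern:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Hypothetical cluster address and index pattern; adjust to your deployment.
	url := "http://localhost:9200/metricbeat-*/_settings/index.mapping.total_fields.limit"

	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	// Prints the per-index value of the limit; indices created from the
	// updated template should report 12500.
	fmt.Println(string(body))
}
```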