I've been looking at various aspects of this, including writing a Logstash connector to push things into Kafka, or just having rsyslog push everything directly into Kafka with no intermediate store.
In general, the idea of taking log data and pushing it, either individually or grouped, into Kafka MQ items seems reasonably trivial.
Except for Kafka's dependency on ZooKeeper.
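For what it's worth, here's a minimal sketch of the "push each log line into Kafka" step using kafka-python; the broker address, topic name (`sensor-logs`), and log path are placeholders for illustration, not anything decided in this issue:

```python
# Sketch only: follow a syslog file and push each new line into a local Kafka
# topic. Broker, topic, and file path are assumptions, not agreed values.
import time
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")  # local broker

def follow(path):
    """Yield new lines appended to a file, like `tail -f`."""
    with open(path, "r") as f:
        f.seek(0, 2)  # start at end of file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line.rstrip("\n")

for line in follow("/var/log/syslog"):
    # one log line -> one Kafka message; batching is left to the producer
    producer.send("sensor-logs", value=line.encode("utf-8"))
```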
If you build the Elasticsearch environment right, with an Nginx proxy and certs, it can be made simpler with just Logstash and the HTTP send module.
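As a rough illustration of that HTTP-send alternative (in Python rather than the Logstash output itself), this posts one log event over HTTPS to Elasticsearch behind an Nginx proxy; the URL, index name, and cert paths are made up:

```python
# Sketch of pushing a single log event to Elasticsearch through an Nginx
# TLS proxy. Hostname, index, and certificate paths are hypothetical.
from datetime import datetime, timezone

import requests

event = {
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "host": "cpe-sensor-01",
    "message": "example syslog line",
}

resp = requests.post(
    "https://logs.example.internal/sensor-logs/_doc",      # Nginx proxy in front of Elasticsearch
    json=event,
    verify="/etc/ssl/certs/internal-ca.pem",                # CA that signed the proxy cert
    cert=("/etc/ssl/sensor.crt", "/etc/ssl/sensor.key"),    # client cert, if the proxy requires one
    timeout=5,
)
resp.raise_for_status()
```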
FYI: we want Kafka for a whole bunch of stuff, but early on the primary benefit is removing IP address config from individual components (i.e. just send everything to the local Kafka instance and let it worry about off-host connectivity).
Logs collected on the CPE sensor should be pushed into a message queue stream for delivery to upstream analytics.
Logs will need to be collected by a syslog/eventlog collector on the host, then pushed into the message queue system.
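A hedged sketch of that collector-to-queue step, as an alternative to tailing files: listen for syslog datagrams on a local UDP port and forward each one to Kafka. The port, topic, and broker address below are assumptions for illustration only.

```python
# Sketch: receive syslog messages over UDP and forward each one to Kafka.
import socket

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 5514))  # point rsyslog/syslog-ng at this port

while True:
    datagram, _addr = sock.recvfrom(8192)
    producer.send("sensor-logs", value=datagram)  # raw syslog message -> one queue item
```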