In addition to putting all of the data into a topic so that it can be dumped into a data store, we can stream the data into an alerting system. To do this, we create a Kafka consumer, as shown in the following code. Here, we stream the messages down to a local system and pass each one to a msg_process function, which we can use to write to an alerting service such as Twilio:
from confluent_kafka import Consumer

# Note: confluent_kafka (librdkafka) uses sasl.username/sasl.password
# rather than the Java-style kafka.sasl.jaas.config property.
conf = {'bootstrap.servers': "host1:9092,host2:9092",
        'group.id': "foo",
        'security.protocol': 'SASL_SSL',
        'sasl.mechanism': 'PLAIN',
        'sasl.username': 'Kafka UserName',
        'sasl.password': 'Kafka Password',
        'auto.offset.reset': 'smallest'}

running = True
consumer = Consumer(conf)
consumer.subscribe(['topic_name'])  # substitute the name of your topic

while running:
    msg = consumer.poll(timeout=1.0)
    if msg is None:
        continue
    if msg.error():
        print(msg.error())
    else:
        msg_process(msg)
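The msg_process function itself is not defined above. A minimal sketch might decode each message and return an alert string when a reading crosses a limit; the JSON payload shape, the cpu_percent and host fields, and the threshold value are all assumptions for illustration:

```python
import json

def msg_process(msg, threshold=90.0):
    """Decode a Kafka message (assumed to carry a JSON payload) and
    return alert text if the hypothetical 'cpu_percent' field exceeds
    the threshold; otherwise return None."""
    record = json.loads(msg.value().decode('utf-8'))
    if record.get('cpu_percent', 0.0) > threshold:
        return (f"ALERT: CPU at {record['cpu_percent']}% "
                f"on {record.get('host', 'unknown')}")
    return None
```

The returned string could then be handed to Twilio's Python client, for example as the body argument of client.messages.create, so that only messages that actually breach the threshold trigger an SMS.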