So, you’ve got Redpanda running locally (or in the cloud)? Great! Now it’s time to do something with it. In this tutorial, we’ll walk through how to create a topic in Redpanda and publish events using Python. This is an essential first step toward building real-time applications.
Let’s keep it practical and developer-friendly.
Prerequisites
- Redpanda running locally on port 9092
- Python 3.8+
- kafka-python installed: pip install kafka-python
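Not sure whether the broker is actually reachable? A quick throwaway check from Python (assuming the default localhost:9092 address) is to open a consumer and list the topics it can see:

from kafka import KafkaConsumer

# Raises NoBrokersAvailable if nothing is listening on localhost:9092
consumer = KafkaConsumer(bootstrap_servers='localhost:9092')
print("Connected. Existing topics:", consumer.topics())
consumer.close()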
Step 1: Create a Kafka Topic in Redpanda
Redpanda is Kafka-compatible, so you can use Kafka clients to interact with it. Here’s how to create a topic using Python:
from kafka.admin import KafkaAdminClient, NewTopic

# Connect to the local Redpanda broker through the Kafka admin API
admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

# One partition and replication factor 1 are fine for a single local broker
topic = NewTopic(name="demo-topic", num_partitions=1, replication_factor=1)
admin.create_topics([topic])
admin.close()

print("Topic 'demo-topic' created successfully!")
ℹ️ Tip: You can also use rpk (the Redpanda CLI) to create topics if you prefer command-line tools: rpk topic create demo-topic
Step 2: Publish (Produce) Events to the Topic
Let’s now publish some sample events (messages) to the demo-topic:
from kafka import KafkaProducer
import json

# Serialize each event as UTF-8 encoded JSON on the way out
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)

# Simulate sending 5 events
for i in range(5):
    event = {"event_id": i, "message": f"Hello Redpanda! #{i}"}
    producer.send('demo-topic', value=event)
    print(f"Produced: {event}")

# Block until every buffered message has been delivered to the broker
producer.flush()
✅ This sends JSON-formatted events to Redpanda in real time.
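producer.send() is asynchronous: it returns a future rather than waiting for the broker. If you want confirmation of where each event landed, you can block on that future. Here's a minimal sketch against the same topic (the 'user-42' key is just an example value; keys also control which partition a message lands on):

from kafka import KafkaProducer
import json

producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    key_serializer=lambda k: k.encode('utf-8'),
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)

# send() returns a future; get() blocks until the broker acknowledges the write
future = producer.send('demo-topic',
                       key='user-42',
                       value={"event_id": 99, "message": "confirmed write"})
metadata = future.get(timeout=10)
print(f"Stored in partition {metadata.partition} at offset {metadata.offset}")
producer.flush()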
Step 3: Verify by Consuming the Events
You can consume the events using another simple Python script:
from kafka import KafkaConsumer
import json

consumer = KafkaConsumer(
    'demo-topic',
    bootstrap_servers='localhost:9092',
    auto_offset_reset='earliest',  # start from the beginning of the topic
    group_id='my-consumer-group',
    value_deserializer=lambda v: json.loads(v.decode('utf-8'))
)

print("Listening for events...")
for message in consumer:
    print("Received:", message.value)
Summary
You’ve just:
- Created your first Kafka-compatible topic in Redpanda
- Produced custom JSON events
- Verified streaming works by consuming those events
This setup is perfect for building:
- Live dashboards
- Real-time alerts
- Activity streams
- Data pipelines
Now that you can publish events:
- Build a backend service that streams order updates or logs
- Connect Redpanda to a frontend via WebSockets or SSE
- Add Apache Flink or Redpanda Console for processing and insights
Final Thoughts
Redpanda makes it incredibly easy to get started with real-time streaming. In just a few lines of Python code, you’re producing and consuming events at Kafka speed without Kafka complexity.
Stay tuned for more hands-on guides as we explore advanced use cases like:
- Real-time fraud detection
- Sensor data pipelines
- Customer activity monitoring
Happy streaming!