author    Dana Powers <dana.powers@rd.io>  2016-01-23 22:50:26 -0800
committer Dana Powers <dana.powers@rd.io>  2016-01-24 17:33:09 -0800
commit    85c0dd2579eb6aa0b9492d9082d0f4cf4d8ea39d (patch)
tree      bd14706a8dfc429f6bf211bac02ad21af967c6ce /README.rst
parent    f51623142dfc089aeb46e986b1d0382f3fab3025 (diff)
download  kafka-python-85c0dd2579eb6aa0b9492d9082d0f4cf4d8ea39d.tar.gz

Add KafkaProducer to autodocs and README (branch: kafka_producer)
Diffstat (limited to 'README.rst')
 README.rst | 29 ++++++++++++++++++++++++++++-
 1 file changed, 28 insertions(+), 1 deletion(-)
diff --git a/README.rst b/README.rst
index 2f716ef..1d04e0b 100644
--- a/README.rst
+++ b/README.rst
@@ -50,7 +50,34 @@ for examples.
KafkaProducer
*************
-<`in progress - see SimpleProducer for legacy producer implementation`>
+KafkaProducer is a high-level, asynchronous message producer. The class is
+intended to operate as similarly as possible to the official Java client.
+See `ReadTheDocs <http://kafka-python.readthedocs.org/en/master/apidoc/KafkaProducer.html>`_
+for more details.
+
+>>> from kafka import KafkaProducer
+>>> producer = KafkaProducer(bootstrap_servers='localhost:1234')
+>>> producer.send('foobar', b'some_message_bytes')
+
+>>> # Blocking send
+>>> producer.send('foobar', b'another_message').get(timeout=60)
+
+>>> # Use a key for hashed-partitioning
+>>> producer.send('foobar', key=b'foo', value=b'bar')
+
+>>> # Serialize json messages
+>>> import json
+>>> producer = KafkaProducer(value_serializer=lambda v: json.dumps(v).encode('utf-8'))
+>>> producer.send('fizzbuzz', {'foo': 'bar'})
+
+>>> # Serialize string keys
+>>> producer = KafkaProducer(key_serializer=str.encode)
+>>> producer.send('flipflap', key='ping', value=b'1234')
+
+>>> # Compress messages
+>>> producer = KafkaProducer(compression_type='gzip')
+>>> for i in range(1000):
+...     producer.send('foobar', ('msg %d' % i).encode('utf-8'))
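A note on the serializer examples above: `key_serializer` and `value_serializer` must return bytes (and `json.loads` deserializes; the serializing direction is `json.dumps`). A minimal, broker-free sketch of serializer functions equivalent to the ones used above; the function names are illustrative, not part of the kafka-python API:

```python
import json

# value_serializer: dict -> JSON-encoded bytes, as KafkaProducer expects
def json_value_serializer(value):
    return json.dumps(value).encode('utf-8')

# key_serializer: str -> UTF-8 bytes (equivalent to str.encode)
def str_key_serializer(key):
    return key.encode('utf-8')

print(json_value_serializer({'foo': 'bar'}))  # b'{"foo": "bar"}'
print(str_key_serializer('ping'))             # b'ping'
```

These plain functions can be passed directly as the `value_serializer`/`key_serializer` constructor arguments in place of the lambdas above.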
Protocol