authorDavid Arthur <mumrah@gmail.com>2013-04-01 21:10:51 -0400
committerDavid Arthur <mumrah@gmail.com>2013-04-02 20:20:32 -0400
commitfa13fa88d8a32943535c63e69ed111007ac1a3d3 (patch)
treed94489f0700f60661efdd45c92daebd669cd772c
parent864430c8c8ef0c1d05f7e62451320919b3be8eb7 (diff)
downloadkafka-python-fa13fa88d8a32943535c63e69ed111007ac1a3d3.tar.gz
Update README with new APIs
-rw-r--r--  README.md | 113
1 file changed, 19 insertions(+), 94 deletions(-)
diff --git a/README.md b/README.md
index 1e22054..9fdfbaf 100644
--- a/README.md
+++ b/README.md
@@ -4,17 +4,17 @@ This module provides low-level protocol support for Apache Kafka. It implements the
(and their responses): Produce, Fetch, MultiFetch, MultiProduce, and Offsets. Gzip and Snappy compression
is also supported.
-Compatible with Apache Kafka 0.7x. Tested against 0.7.0, 0.7.1, and 0.7.2
+Compatible with Apache Kafka 0.8.x. Tested against 0.8
http://incubator.apache.org/kafka/
# License
-Copyright 2012, David Arthur under Apache License, v2.0. See `LICENSE`
+Copyright 2013, David Arthur under Apache License, v2.0. See `LICENSE`
# Status
-Current version is 0.1-alpha. The current API should be pretty stable.
+Current version is 0.2-alpha. This version is under development; APIs are subject to change
# Install
@@ -89,109 +89,34 @@ python -m test.integration
# Usage
-## Send a message to a topic
+## High level
```python
from kafka.client import KafkaClient
-kafka = KafkaClient("localhost", 9092)
-kafka.send_messages_simple("my-topic", "some message")
-kafka.close()
-```
-
-## Send several messages to a topic
-
-Same as before, just add more arguments to `send_simple`
-
-```python
-kafka = KafkaClient("localhost", 9092)
-kafka.send_messages_simple("my-topic", "some message", "another message", "and another")
-kafka.close()
-```
-
-## Recieve some messages from a topic
-
-Supply `get_message_set` with a `FetchRequest`, get back the messages and new `FetchRequest`
-
-```python
-kafka = KafkaClient("localhost", 9092)
-req = FetchRequest("my-topic", 0, 0, 1024*1024)
-(messages, req1) = kafka.get_message_set(req)
-kafka.close()
-```
-
-The returned `FetchRequest` includes the offset of the next message. This makes
-paging through the queue very simple.
-
-## Send multiple messages to multiple topics
-
-For this we use the `send_multi_message_set` method along with `ProduceRequest` objects.
-
-```python
-kafka = KafkaClient("localhost", 9092)
-req1 = ProduceRequest("my-topic-1", 0, [
- create_message_from_string("message one"),
- create_message_from_string("message two")
-])
-req2 = ProduceRequest("my-topic-2", 0, [
- create_message_from_string("nachricht ein"),
- create_message_from_string("nachricht zwei")
-])
-kafka.send_multi_message_set([req1, req1])
-kafka.close()
-```
-
-## Iterate through all messages from an offset
-
-The `iter_messages` method will make the underlying calls to `get_message_set`
-to provide a generator that returns every message available.
-
-```python
-kafka = KafkaClient("localhost", 9092)
-for msg in kafka.iter_messages("my-topic", 0, 0, 1024*1024):
- print(msg.payload)
-kafka.close()
-```
-
-An optional `auto` argument will control auto-paging through results
-```python
-kafka = KafkaClient("localhost", 9092)
-for msg in kafka.iter_messages("my-topic", 0, 0, 1024*1024, False):
- print(msg.payload)
-kafka.close()
-```
-This will only iterate through messages in the byte range of (0, 1024\*1024)
+kafka = KafkaClient("localhost", 9092)
+producer = SimpleProducer(kafka, "my-topic")
+producer.send_messages("some message")
+producer.send_messages("this method", "is variadic")
-## Create some compressed messages
+consumer = SimpleConsumer(kafka, "my-group", "my-topic")
+for message in consumer:
+ print(message)
-```python
-kafka = KafkaClient("localhost", 9092)
-messages = [kafka.create_snappy_message("testing 1"),
- kafka.create_snappy_message("testing 2")]
-req = ProduceRequest(topic, 1, messages)
-kafka.send_message_set(req)
kafka.close()
```
-## Use Kafka like a FIFO queue
-
-Simple API: `get`, `put`, `close`.
+## Low level
```python
+from kafka.client import KafkaClient
kafka = KafkaClient("localhost", 9092)
-q = KafkaQueue(kafka, "my-topic", [0,1])
-q.put("first")
-q.put("second")
-q.get() # first
-q.get() # second
-q.close()
+req = ProduceRequest(topic="my-topic", partition=1,
+    messages=[KafkaProtocol.encode_message("some message")])
+resps = kafka.send_produce_request(payloads=[req], fail_on_error=True)
kafka.close()
-```
-Since the producer and consumers are backed by actual `multiprocessing.Queue`, you can
-do blocking or non-blocking puts and gets.
-
-```python
-q.put("first", block=False)
-q.get(block=True, timeout=10)
+resps[0].topic # "my-topic"
+resps[0].partition # 1
+resps[0].error # 0 (hopefully)
+resps[0].offset # offset of the first message sent in this request
```
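The low-level example in this diff builds a `ProduceRequest` from messages produced by the protocol's `encode_message` helper. As a rough illustration only (not this library's code), here is a minimal standalone sketch of the Kafka 0.8 message layout that such a helper targets, assuming the field order from the Kafka wire protocol: a big-endian CRC32 of the remainder, a magic byte, an attributes byte, then length-prefixed key and value, with length -1 marking a null field.

```python
import struct
import zlib

def encode_kafka_message(value, key=None):
    """Sketch of a Kafka 0.8 message: CRC32, magic, attributes, key, value.

    Illustrative only; field layout assumed from the Kafka protocol
    documentation, not taken from this commit.
    """
    magic = 0       # message format version
    attributes = 0  # low bits select compression; 0 = none

    def _bytes_field(b):
        if b is None:
            return struct.pack(">i", -1)  # -1 length encodes a null field
        return struct.pack(">i", len(b)) + b

    body = (struct.pack(">bb", magic, attributes)
            + _bytes_field(key)
            + _bytes_field(value))
    crc = zlib.crc32(body) & 0xFFFFFFFF  # checksum covers everything after the CRC
    return struct.pack(">I", crc) + body

msg = encode_kafka_message(b"some message")
# bytes 0-3 hold the CRC; byte 4 is the magic byte (0)
```

The leading CRC lets a consumer detect corruption before attempting to decode the rest of the message, which is why it covers every byte that follows it.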