gRPC Streaming Patterns Guide

The four gRPC communication patterns — unary, server streaming, client streaming, bidirectional — with flow control and when to use each.

Four gRPC Communication Patterns

gRPC supports four fundamental communication patterns defined in the .proto service definition. Each has a distinct use case:

service StreamingExamples {
  // Unary: one request, one response
  rpc GetUser (UserRequest) returns (UserResponse);

  // Server streaming: one request, stream of responses
  rpc ListEvents (EventFilter) returns (stream Event);

  // Client streaming: stream of requests, one response
  rpc UploadChunks (stream Chunk) returns (UploadResult);

  // Bidirectional streaming: stream of requests, stream of responses
  rpc Chat (stream Message) returns (stream Message);
}

Unary RPC

The simplest pattern: one request, one response. Equivalent to a traditional HTTP API call:

# Client
response = stub.GetUser(UserRequest(id='42'))
print(response.name)

# Server
def GetUser(self, request, context):
    user = db.get_user(request.id)
    if not user:
        # abort() raises an exception, ending the RPC with this status
        context.abort(grpc.StatusCode.NOT_FOUND, 'User not found')
    return UserResponse(name=user.name, email=user.email)

Use unary for most CRUD operations, queries, and any request where the complete response fits comfortably in a single message (gRPC's default message-size limit is 4 MB).

Server Streaming

The client sends one request; the server streams back multiple responses. Ideal for large result sets, live feeds, or progressive data delivery:

# Client — iterate over the stream
for event in stub.ListEvents(EventFilter(after='2024-01-01')):
    process_event(event)

# Server — yield responses
def ListEvents(self, request, context):
    for event in db.stream_events(after=request.after):
        yield Event(id=event.id, type=event.type, data=event.data)

Use server streaming for: database result export, live log tailing, real-time metric feeds, and large file downloads.

Client Streaming

The client streams multiple requests; the server returns a single response. Ideal for aggregation, chunked uploads, or batch inserts:

# Client — send a stream of requests
def generate_chunks(filepath: str):
    with open(filepath, 'rb') as f:
        while chunk := f.read(65536):
            yield Chunk(data=chunk)

result = stub.UploadChunks(generate_chunks('/tmp/large-file.bin'))
print(f'Uploaded {result.bytes_received} bytes')

# Server — consume the stream, return summary
def UploadChunks(self, request_iterator, context):
    total = 0
    for chunk in request_iterator:
        storage.write(chunk.data)
        total += len(chunk.data)
    return UploadResult(bytes_received=total)

Bidirectional Streaming

Both client and server send messages over their own streams, which operate independently — either side can send at any time, in any order:

# Client
def send_messages():
    for text in user_input_stream():
        yield Message(text=text, user='alice')

for response in stub.Chat(send_messages()):
    display(response.text)

# Server
def Chat(self, request_iterator, context):
    for message in request_iterator:
        broadcast(message)
        yield Message(text=f'Delivered: {message.text}', user='server')

Use bidirectional streaming for: chat, collaborative editing, game state sync, and any scenario requiring simultaneous send/receive.
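Note that the Chat server above is lock-stepped: it yields exactly one reply per received message. When the two directions should be truly decoupled — say, the server pushes updates unprompted — a common pattern is to drain the request iterator on a separate thread and yield from a queue. A minimal sketch, with plain strings standing in for protobuf messages (the handler name and sentinel value are illustrative, not a gRPC API):

```python
import queue
import threading

def chat_handler(request_iterator):
    """Sketch: decouple reading from writing in a bidi handler.

    A reader thread drains incoming messages into a queue; the
    generator yields replies as they become ready, so sends are not
    lock-stepped with receives. A real servicer method would also
    take `context` and yield protobuf Message objects.
    """
    outbox = queue.Queue()

    def reader():
        for message in request_iterator:
            outbox.put(f'Delivered: {message}')
        outbox.put(None)  # sentinel: client finished sending

    threading.Thread(target=reader, daemon=True).start()
    while (reply := outbox.get()) is not None:
        yield reply
```

With a real gRPC servicer, other threads could also put server-initiated messages on the same queue, interleaving them with the acknowledgements.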

Flow Control

gRPC uses HTTP/2 flow control to prevent fast producers from overwhelming slow consumers. Each stream has a flow control window (default 65,535 bytes). The receiver issues WINDOW_UPDATE frames to grant more capacity.
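The window caps per-stream throughput: a sender can have at most one window of unacknowledged data in flight per round trip. A back-of-envelope calculation (assuming a 50 ms round trip; the numbers are illustrative) shows why the default window can become the bottleneck:

```python
# Rough throughput ceiling imposed by per-stream flow control:
#   throughput <= window_size / round_trip_time
window_bytes = 65_535         # default HTTP/2 stream window
rtt_seconds = 0.050           # assumed 50 ms round trip
ceiling = window_bytes / rtt_seconds
print(f'{ceiling / 1e6:.2f} MB/s')  # ~1.31 MB/s per stream
```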

If you see streaming RPCs stalling, you may need to increase the initial window size:

# Increase initial window size for large streaming RPCs
channel = grpc.insecure_channel(
    'localhost:50051',
    options=[('grpc.http2.initial_window_size', 1024 * 1024)]  # 1MB
)

Error Handling in Streams

In streaming RPCs, an error terminates the entire stream: if the server calls context.abort() or raises an exception, the client's response iterator raises grpc.RpcError carrying the final status. The server, in turn, should watch for client cancellation:

def ListEvents(self, request, context):
    for event in db.stream_events():
        if not context.is_active():  # client cancelled or disconnected
            return  # client gone, stop generating
        yield Event(...)

Always check context.is_active() in long-running server streams to avoid wasting resources after the client disconnects.
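On the client side, a stream that terminates with a non-OK status surfaces as grpc.RpcError raised from the response iterator. A hedged sketch of defensive consumption (collect_events is an illustrative helper, not part of gRPC):

```python
import grpc

def collect_events(stub, request):
    """Drain a server stream; return (events, final status code)."""
    events = []
    try:
        for event in stub.ListEvents(request):
            events.append(event)
    except grpc.RpcError as err:
        # The stream ended mid-flight (e.g. UNAVAILABLE); events received
        # before the failure are still usable.
        return events, err.code()
    return events, grpc.StatusCode.OK
```

Partially received events are returned alongside the status so the caller can decide whether to retry from a checkpoint or discard the batch.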

When to Use Which Pattern

Pattern             Best For
Unary               CRUD, queries, actions
Server streaming    Large result sets, live feeds, log streaming
Client streaming    Chunked uploads, sensor data, batch inserts
Bidirectional       Chat, real-time collaboration, game state
