What is ConnectRPC?
ConnectRPC enables Platform Engineering teams to distribute GraphQL-backed APIs as governed, versioned API products using protobuf. API Providers define a fixed set of GraphQL queries and mutations (Trusted Documents) that represent the supported API surface, which are then converted into protobuf service definitions and distributed as OpenAPI and proto files.

How It Works
ConnectRPC acts as a protocol translation layer:

- Define Operations: Create named GraphQL operations (Trusted Documents) as .graphql files
- Generate Proto: Use wgc to convert the operations into protobuf service definitions
- Start Router: Configure the router to load your proto files and operations
- Consume: Clients call RPC methods; the router translates them to GraphQL, executes, and returns typed responses
Quickstart
Prerequisites: You need a working Cosmo environment with a federated graph. See the Cosmo Cloud Onboarding guide to set up the demo environment.
Complete Tutorial
For a comprehensive, step-by-step tutorial with detailed explanations, see the ConnectRPC Demo Repository. The quickstart below provides a condensed overview.
1. Create Named Operations
Create a directory for your operations (e.g., services/). Each .graphql file should contain one named operation:
services/GetEmployee.graphql
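A minimal operation might look like the following (the root field and its selections assume the Cosmo demo graph schema; adapt them to your own federated graph):

```graphql
# services/GetEmployee.graphql
# One named, PascalCase operation per file.
query GetEmployee($id: Int!) {
  employee(id: $id) {
    id
    details {
      forename
      surname
    }
  }
}
```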
Operations must use PascalCase naming (e.g., GetEmployee), one operation per file, and no root-level aliases.

2. Generate Proto Service
Use wgc to generate a protobuf service from your operations. This produces service.proto and service.proto.lock.json in the ./services directory.
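The generation step uses the grpc-service generate subcommand documented in the CLI Reference; the invocation below shows only the bare subcommand, since the input and output options vary by wgc version:

```shell
# Generate protobuf service definitions from the named operations.
# Run `npx wgc grpc-service generate --help` for the exact options
# (operations directory, output directory, package name).
npx wgc grpc-service generate
```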
3. Start the Router
Configure the router to load your proto services:

config.yaml
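A minimal sketch of the router configuration, assuming illustrative key names (the key names here are assumptions, not the router's actual schema; consult the router configuration reference for the authoritative options):

```yaml
# Illustrative only: key names are assumptions.
# Point the router at the generated proto file and the
# directory containing the .graphql operations.
connect_rpc:
  enabled: true
  services:
    - proto: ./services/service.proto
      operations: ./services
```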
Next Steps
- Generate SDKs: Use buf to generate TypeScript, Go, Swift, Kotlin, or Python clients
- Generate OpenAPI: Create OpenAPI specs for documentation and tooling
- Learn More: Follow the complete tutorial for detailed examples
- TypeScript SDK
- Go SDK
- OpenAPI
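SDK generation with buf is driven by a buf.gen.yaml file; the sketch below targets TypeScript via the public Connect-ES remote plugin (the output path is an assumption):

```yaml
# buf.gen.yaml — generate a TypeScript client from the
# generated service.proto using buf's remote plugin.
version: v2
plugins:
  - remote: buf.build/bufbuild/es
    out: src/gen
```

Run `buf generate` against the generated proto file to emit the client code.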
Reference
Protocol Support
The router supports multiple protocols with automatic transcoding:

- Connect Protocol - HTTP/1.1 or HTTP/2 with JSON or binary protobuf
- gRPC - Binary protobuf over HTTP/2
- gRPC-Web - Browser-compatible gRPC
- Query operations support HTTP GET (enables CDN caching)
- Mutation operations require HTTP POST
- All protocols support JSON encoding
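As an illustration of the Connect protocol's JSON encoding, a unary call is a plain HTTP POST (the host, port, service path, and payload below are assumptions based on the GetEmployee example):

```shell
# Connect protocol over HTTP/1.1 with JSON: no gRPC tooling needed.
curl -X POST http://localhost:3002/service.v1.EmployeeService/GetEmployee \
  -H "Content-Type: application/json" \
  -d '{"id": 1}'
```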
Operation Requirements
Naming: Use PascalCase (e.g., GetEmployee, UpdateEmployee)
File Structure: One operation per .graphql file
Aliases: No root-level aliases (nested aliases are allowed)
Directory Structure
The router recursively discovers proto files and operations. The combination of proto package name and service name must be unique, and proto files nested in subdirectories are not discovered if a parent directory already contains a proto file.
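A layout like the following keeps a service's proto files and operations together (file names follow the quickstart; the exact layout is up to you):

```
services/
├── service.proto
├── service.proto.lock.json
├── GetEmployee.graphql
└── UpdateEmployee.graphql
```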
Forward Compatibility
The service.proto.lock.json file maintains field number stability across regenerations. Always commit this file to version control.
When you modify operations:
- Existing fields retain their protobuf field numbers
- New fields get new numbers
- Binary compatibility is maintained for deployed clients
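For illustration (message and field names are assumptions), suppose GetEmployee's response originally carried two fields and the operation is later extended:

```proto
// Before regeneration:
message GetEmployeeResponse {
  int32 id = 1;
  string name = 2;
}

// After adding an email field to the operation: the lock file
// preserves numbers 1 and 2, and the new field takes the next
// free number, so deployed binary clients keep working.
message GetEmployeeResponse {
  int32 id = 1;
  string name = 2;
  string email = 3;
}
```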
CLI Reference
For complete command options and advanced configuration:

- wgc grpc-service generate - Generate proto from GraphQL operations
- Storage Providers - Configure file system and remote storage
Roadmap
Planned features for future releases:

- Enhanced OpenAPI Generation - Descriptions, summaries, deprecated fields, and tags
- Subscription Support - GraphQL subscriptions as gRPC streams
- Multiple Root Fields - Operations with multiple root selection set fields
- Field Aliases - GraphQL aliases to customize API surface