# Async Clients
The SDK provides full async/await support for high-performance applications. All synchronous clients have async equivalents.
## Async ChatClient
### Initialization
```python
from fastgpt_client import AsyncChatClient

client = AsyncChatClient(
    api_key="fastgpt-xxxxx",
    base_url="http://localhost:3000"
)
```
### Basic Usage
```python
import asyncio
from fastgpt_client import AsyncChatClient

async def main():
    async with AsyncChatClient(api_key="fastgpt-xxxxx") as client:
        response = await client.create_chat_completion(
            messages=[{"role": "user", "content": "Hello!"}],
            stream=False
        )
        response.raise_for_status()
        result = response.json()
        print(result['choices'][0]['message']['content'])

asyncio.run(main())
```
### Streaming with Async
```python
import asyncio
import json
from fastgpt_client import AsyncChatClient

async def stream_chat():
    async with AsyncChatClient(api_key="fastgpt-xxxxx") as client:
        response = await client.create_chat_completion(
            messages=[{"role": "user", "content": "Tell me a story"}],
            stream=True
        )
        async for line in response.aiter_lines():
            if line.startswith("data:"):
                data = line[5:].strip()
                if data and data != "[DONE]":
                    chunk = json.loads(data)
                    if "choices" in chunk and chunk["choices"]:
                        delta = chunk["choices"][0].get("delta", {})
                        content = delta.get("content", "")
                        if content:
                            print(content, end="", flush=True)

asyncio.run(stream_chat())
```
### Multiple Concurrent Requests
One of the main benefits of async is handling multiple requests concurrently:
```python
import asyncio
from fastgpt_client import AsyncChatClient

async def fetch_multiple():
    concepts = ["AI", "Machine Learning", "Deep Learning"]
    async with AsyncChatClient(api_key="fastgpt-xxxxx") as client:
        # Create multiple chat completions concurrently
        tasks = [
            client.create_chat_completion(
                messages=[{"role": "user", "content": f"What is {concept}?"}],
                stream=False
            )
            for concept in concepts
        ]
        responses = await asyncio.gather(*tasks)
        for concept, response in zip(concepts, responses):
            response.raise_for_status()
            result = response.json()
            print(f"\n{concept}:")
            print(result['choices'][0]['message']['content'])

asyncio.run(fetch_multiple())
```
## Async AppClient
### Basic Usage
```python
import asyncio
from fastgpt_client import AsyncAppClient

async def get_analytics():
    async with AsyncAppClient(api_key="fastgpt-xxxxx") as client:
        response = await client.get_app_logs_chart(
            appId="your-app-id",
            dateStart="2024-01-01",
            dateEnd="2024-12-31",
            source=["api"]
        )
        response.raise_for_status()
        data = response.json()
        print(data)

asyncio.run(get_analytics())
```
## Complete Example: Async Chat Application
```python
import asyncio
from fastgpt_client import AsyncChatClient

class AsyncChatApp:
    def __init__(self, api_key: str, base_url: str):
        self.client = AsyncChatClient(api_key=api_key, base_url=base_url)
        self.chat_id = None

    async def start(self):
        await self.client.__aenter__()

    async def stop(self):
        await self.client.__aexit__(None, None, None)

    async def send_message(self, content: str) -> str:
        response = await self.client.create_chat_completion(
            messages=[{"role": "user", "content": content}],
            chatId=self.chat_id,
            stream=False
        )
        response.raise_for_status()
        result = response.json()
        # Update chat_id after first message
        if not self.chat_id:
            self.chat_id = result.get('chatId')
        return result['choices'][0]['message']['content']

    async def chat(self):
        await self.start()
        try:
            while True:
                user_input = input("\nYou: ")
                if user_input.lower() in ['quit', 'exit']:
                    break
                print("AI: ", end="", flush=True)
                response = await self.send_message(user_input)
                print(response)
        finally:
            await self.stop()

async def main():
    app = AsyncChatApp(
        api_key="fastgpt-xxxxx",
        base_url="http://localhost:3000"
    )
    await app.chat()

asyncio.run(main())
```
## Key Differences from Sync Clients
| Aspect | Sync | Async |
|--------|------|-------|
| Context Manager | `with` | `async with` |
| Method Call | `client.method()` | `await client.method()` |
| Streaming | `for line in response.iter_lines()` | `async for line in response.aiter_lines()` |
| Close | `client.close()` | `await client.close()` |
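If you manage an async client without a context manager, close it explicitly when you are done. A minimal sketch, assuming `AsyncChatClient.close()` behaves as listed in the table above:
```python
import asyncio
from fastgpt_client import AsyncChatClient

async def manual_lifecycle():
    # Equivalent to the `async with` form, but with an explicit close.
    client = AsyncChatClient(api_key="fastgpt-xxxxx")
    try:
        response = await client.create_chat_completion(
            messages=[{"role": "user", "content": "Hello!"}],
            stream=False
        )
        response.raise_for_status()
        print(response.json()['choices'][0]['message']['content'])
    finally:
        # Close the client even if the request fails.
        await client.close()

asyncio.run(manual_lifecycle())
```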
## Best Practices
1. **Always use `async with`** for automatic resource cleanup
2. **Use `asyncio.gather()`** for concurrent requests
3. **Handle exceptions properly** with try/except blocks (see the error-handling sketch after this list)
4. **Close clients** when done (or use context managers)
5. **Avoid mixing sync and async** code in the same application
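The sketch below shows error handling around concurrent requests. It assumes the SDK is built on httpx, so `raise_for_status()` raises `httpx.HTTPStatusError`; adjust the exception types to whatever your client version actually raises:
```python
import asyncio
import httpx
from fastgpt_client import AsyncChatClient

async def safe_fetch(client: AsyncChatClient, question: str) -> str:
    # Wrap a single request so one failure does not break the others.
    try:
        response = await client.create_chat_completion(
            messages=[{"role": "user", "content": question}],
            stream=False
        )
        response.raise_for_status()
        return response.json()['choices'][0]['message']['content']
    except httpx.HTTPStatusError as exc:  # assumed exception type
        return f"HTTP error: {exc.response.status_code}"
    except httpx.RequestError as exc:  # network or timeout errors
        return f"Request failed: {exc}"

async def main():
    async with AsyncChatClient(api_key="fastgpt-xxxxx") as client:
        answers = await asyncio.gather(
            safe_fetch(client, "What is AI?"),
            safe_fetch(client, "What is RAG?"),
        )
        for answer in answers:
            print(answer)

asyncio.run(main())
```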
## When to Use Async
Use async clients when you need to:
- Handle many concurrent requests
- Integrate with other async libraries such as FastAPI or aiohttp (see the FastAPI sketch after this list)
- Build real-time applications with streaming
- Maximize throughput in I/O-bound applications
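For example, the async client drops straight into a FastAPI app. A minimal sketch; the endpoint path, request model, and lifespan wiring are illustrative and not part of the SDK:
```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from pydantic import BaseModel
from fastgpt_client import AsyncChatClient

# One shared client for the whole application (illustrative setup).
client = AsyncChatClient(api_key="fastgpt-xxxxx", base_url="http://localhost:3000")

@asynccontextmanager
async def lifespan(app: FastAPI):
    yield
    # Close the shared client when the server shuts down.
    await client.close()

app = FastAPI(lifespan=lifespan)

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
async def chat(req: ChatRequest):
    response = await client.create_chat_completion(
        messages=[{"role": "user", "content": req.message}],
        stream=False
    )
    response.raise_for_status()
    result = response.json()
    return {"answer": result['choices'][0]['message']['content']}
```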
For simple scripts or applications with low concurrency, sync clients are often simpler and equally effective.