Client-Server Connectivity
Understanding HTTPS, WebSocket, and real-time communication for AI chatbots
How Client-Server Communication Works
Modern GenAI applications like chatbots require secure, real-time communication between the client (browser/app) and server. This involves establishing trusted connections via HTTPS/TLS and using streaming protocols like WebSocket or Server-Sent Events (SSE) for real-time responses.
HTTPS/TLS
Encryption
WebSocket
Bidirectional
SSE
Server Push
REST API
Request/Response
How HTTPS Works (Step-by-Step)
HTTPS = HTTP + TLS/SSL Security. It ensures secure communication between Client (Browser) and Server using encryption.
🔑 Asymmetric Encryption
Used for key exchange. Public key encrypts, Private key decrypts. Slower but secure for initial handshake.
⚡ Symmetric Encryption
Used for data transfer. Same session key on both sides. Much faster for ongoing communication.
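The two schemes above can be sketched with a toy RSA key exchange using only the Python standard library. The numbers are textbook-tiny for illustration; real TLS uses 2048-bit-plus keys or elliptic-curve key agreement:

```python
# Toy RSA key transport (illustration only -- never use tiny keys in practice).
p, q = 61, 53
n = p * q                # public modulus (3233)
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent via modular inverse (Python 3.8+)

# Asymmetric step: client encrypts a symmetric session key with the
# server's PUBLIC key (e, n)...
session_key = 42
ciphertext = pow(session_key, e, n)

# ...and only the server's PRIVATE key (d, n) can recover it.
recovered = pow(ciphertext, d, n)
assert recovered == session_key

# Symmetric step: both sides now share `session_key` and switch to
# fast symmetric encryption (e.g. AES) for the actual traffic.
```

This is the handshake pattern in miniature: the slow asymmetric operation happens once to agree on a key, then all bulk data flows under the fast symmetric cipher.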
How a Chatbot Connects (End-to-End)
Real-time Chat: Chatbots typically use Server-Sent Events (SSE) or WebSocket for streaming LLM responses token-by-token, providing a responsive user experience.
Connection Methods Comparison
| Method | Direction | Use Case | Chatbot Use |
|---|---|---|---|
| REST API | Request → Response | Standard CRUD operations | Non-streaming queries |
| SSE (Server-Sent Events) | Server → Client (one-way) | Server push, streaming | ✅ LLM token streaming |
| WebSocket | Bidirectional | Real-time chat, gaming | ✅ Full-duplex chat |
| Long Polling | Request → Wait → Response | Fallback for old browsers | Legacy support only |
SSE Implementation for Chatbots
Server (FastAPI)
```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import AsyncOpenAI
from pydantic import BaseModel

app = FastAPI()
client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

class ChatRequest(BaseModel):
    message: str

async def stream_chat(message: str):
    """Stream the LLM response as SSE events."""
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": message}],
        stream=True,
    )
    async for chunk in response:
        token = chunk.choices[0].delta.content
        if token:
            yield f"data: {token}\n\n"  # SSE format: "data: <payload>\n\n"
    yield "data: [DONE]\n\n"

@app.post("/api/chat")
async def chat(request: ChatRequest):
    # A Pydantic model reads the JSON body ({"message": ...}); a bare
    # `message: str` parameter would be parsed as a query parameter instead.
    return StreamingResponse(
        stream_chat(request.message),
        media_type="text/event-stream",
        headers={"Cache-Control": "no-cache"},
    )
```
Client (JavaScript)
```javascript
async function sendMessage(message) {
  const response = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message })
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Network chunks can split an SSE event mid-line, so only process
    // complete lines and carry the remainder into the next read.
    const lines = buffer.split('\n');
    buffer = lines.pop();

    for (const line of lines) {
      if (line.startsWith('data: ')) {
        const token = line.slice(6);
        if (token !== '[DONE]') {
          appendToChat(token); // Update UI in real time
        }
      }
    }
  }
}
```
Why HTTPS Matters
🔒 Data Confidentiality
All messages between user and chatbot are encrypted. No eavesdropping on sensitive conversations.
🛡️ Data Integrity
Messages cannot be modified in transit. Prevents injection of malicious responses.
✅ User Trust
Browser shows secure padlock. Users trust the chatbot with sensitive queries.
🚀 SEO & Performance
Browsers support HTTP/2 only over HTTPS, so TLS also unlocks faster multiplexed connections.
⚠️ MITM Protection: HTTPS prevents Man-in-the-Middle attacks where attackers could intercept API keys, modify LLM responses, or steal user data.
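On the client side, Python's standard library enforces exactly this protection: a default SSL context verifies the server's certificate chain and hostname, so a substituted attacker certificate is rejected before any data is sent. A minimal sketch:

```python
import ssl

# A default context loads the system CA bundle, enables hostname
# checking, and rejects invalid or self-signed certificates.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy TLS versions

assert ctx.check_hostname is True
assert ctx.verify_mode == ssl.CERT_REQUIRED

# Connections made with this context (e.g. urllib.request.urlopen(url,
# context=ctx)) raise SSLCertVerificationError if a MITM presents a
# certificate that does not chain to a trusted CA for that hostname.
```

This is the default behavior of modern HTTP clients; the danger is code that disables verification (e.g. `verify=False`) to silence certificate errors, which reopens the MITM hole.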
Key Takeaways
- HTTPS = HTTP + TLS: Encryption for all client-server communication
- Asymmetric → Symmetric: Use asymmetric for key exchange, symmetric for data transfer
- Port 443: Standard HTTPS port, always use in production
- SSE for streaming: Perfect for LLM token-by-token responses
- WebSocket for bidirectional: When you need real-time two-way chat
- TLS 1.3: Latest protocol, faster handshake, better security
- API Gateway: Handle TLS termination, auth, and rate limiting at the edge