API testing
API tests validate the contracts between the Next.js app, the Secure AI Gateway, and the Python services. They are the fastest way to catch breaking changes before UI automation runs.
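As a minimal sketch of this kind of contract check, the probe below verifies liveness and readiness endpoints through an injected fetcher so it can run without a live gateway. The endpoint paths (/healthz, /readyz), the base URL, and the {"status": "ok"} body are assumptions for illustration, not the gateway's documented contract.

```python
from typing import Callable, Tuple

def check_health(
    fetch: Callable[[str], Tuple[int, dict]],
    base_url: str = "http://localhost:8080",  # assumed local gateway address
) -> dict:
    """Probe liveness/readiness-style endpoints; True means healthy."""
    results = {}
    for path in ("/healthz", "/readyz"):  # assumed endpoint paths
        status, body = fetch(base_url + path)
        # Healthy only if the status code and the body both agree.
        results[path] = status == 200 and body.get("status") == "ok"
    return results
```

Injecting `fetch` keeps the check unit-testable with a stub; in a real suite it would wrap an HTTP client call against the Docker Compose stack.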
Validation focus
Health
Liveness and readiness-style checks on the Secure AI Gateway and downstream services (RAG, Eval, Incident, DevOps).
Schema shape
Response bodies match expected keys and types; this is critical when Pydantic models or OpenAI payloads evolve.
Error handling
4xx/5xx paths, validation errors, and gateway rejections (rate limit, injection) return stable, parseable errors.
Latency expectations
Upper-bound assertions on cold vs warm paths; separate budgets for embedding-heavy vs chat routes.
Response consistency
Deterministic fields (request_id, metadata) are present; AI-generated text may vary, but the envelope stays stable.
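The schema-shape and response-consistency checks above can be sketched as a small envelope validator. The required field names and types here are illustrative assumptions about the gateway schema, not the real Pydantic models.

```python
# Assumed envelope fields; the real gateway schema may differ.
REQUIRED_FIELDS = {
    "request_id": str,  # deterministic: must always be present
    "metadata": dict,   # gateway metadata block
    "data": dict,       # AI payload: content may vary, shape should not
}

def validate_envelope(payload: dict) -> list:
    """Return a list of contract violations; an empty list means pass."""
    errors = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append("missing field: %s" % field)
        elif not isinstance(payload[field], expected):
            errors.append(
                "%s: expected %s, got %s"
                % (field, expected.__name__, type(payload[field]).__name__)
            )
    return errors
```

Collecting violations instead of failing on the first one gives a full diff of the broken contract in a single test run.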
Connection to portfolio AI systems
These are the same services described on the Architecture and Projects pages.
The gateway fronts RAG retrieval, LLM evaluation, incident investigation, DevOps risk analysis, and architecture review. API tests assert that each route behind the gateway still accepts valid input and returns structured JSON, including citation blocks, eval scores, and gateway metadata.
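A citation-block assertion for the RAG route might look like the sketch below. The field names (citations, source, snippet) are assumptions about the response shape, not the documented schema.

```python
def check_citations(response: dict) -> bool:
    """Every citation must carry a string source and a non-empty snippet."""
    citations = response.get("citations")  # assumed field name
    if not isinstance(citations, list) or not citations:
        return False  # a RAG answer without citations fails the contract
    return all(
        isinstance(c, dict)
        and isinstance(c.get("source"), str)
        and isinstance(c.get("snippet"), str)
        and c["snippet"].strip() != ""  # reject whitespace-only snippets
        for c in citations
    )
```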
Planned coverage includes contract tests per service and negative cases (missing API keys, overloaded models) where the UI must degrade gracefully. Local execution targets the Docker Compose stack; production smoke tests stay minimal and read-only where possible.
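The negative-path contract can be sketched as a classifier that maps gateway errors to a UI degradation state. The status codes, error-body shape, and state names here are illustrative assumptions.

```python
# Assumed mapping from gateway status codes to UI degradation states.
DEGRADED_STATES = {
    401: "auth_error",        # missing or invalid API key
    429: "rate_limited",      # gateway rate limit hit
    503: "model_overloaded",  # downstream model unavailable
}

def ui_state_for_error(status_code: int, body: dict) -> str:
    """Classify an error response so the UI can degrade gracefully."""
    if status_code in DEGRADED_STATES:
        return DEGRADED_STATES[status_code]
    # Unmapped errors must still arrive in a stable, parseable envelope.
    if isinstance(body.get("error"), dict):
        return "generic_error"
    return "contract_violation"  # unparseable error: the test should fail
```

The last branch is the one API tests care about most: any error that cannot be parsed into a known state is itself a contract violation.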