Project Detail
A TypeScript Next.js 16 (App Router) frontend skeleton for an AI-powered learning workspace. It provides a 3-pane workspace (left folder tree, center file viewer, right AI chat panel), file/folder CRUD UI, PDF viewer shell, profile and set...
mock_response_latency_ms: 300
max_upload_size_mb: 200
Clear separation between UI and transport, single place to add error mapping, and easy mock fallback injection for dev.
Trade-off: Slight indirection and duplication of DTO shapes; requires maintenance as the backend API evolves.
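The adapter idea above can be sketched as follows. All names here (`GatewayApi`, `FileMeta`, `createGatewayApi`) are illustrative, not the skeleton's actual identifiers:

```typescript
// Minimal sketch: one typed surface for the UI, with a mock implementation
// injectable for backend-less development. Hypothetical names throughout.

interface FileMeta {
  id: string;
  name: string;
}

interface GatewayApi {
  listFiles(folderId: string): Promise<FileMeta[]>;
}

class HttpGatewayApi implements GatewayApi {
  constructor(private baseUrl: string) {}
  async listFiles(folderId: string): Promise<FileMeta[]> {
    // Single place to add error mapping for every backend call.
    const res = await fetch(`${this.baseUrl}/folders/${folderId}/files`);
    if (!res.ok) throw new Error(`gateway error ${res.status}`);
    return res.json();
  }
}

class MockGatewayApi implements GatewayApi {
  async listFiles(_folderId: string): Promise<FileMeta[]> {
    // Approximates the documented mock_response_latency_ms (300).
    await new Promise((r) => setTimeout(r, 300));
    return [{ id: "f1", name: "notes.pdf" }];
  }
}

// Factory: the UI only ever sees GatewayApi, so the mock fallback is one line.
function createGatewayApi(useMock: boolean, baseUrl = "/api"): GatewayApi {
  return useMock ? new MockGatewayApi() : new HttpGatewayApi(baseUrl);
}
```

Because both implementations satisfy the same interface, components never branch on "mock vs real"; only the factory does.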
Enables server-side route guard in Next.js middleware and server components (via getServerAccessToken) while allowing client to refresh tokens transparently.
Trade-off: In the skeleton, cookies are set from client JS (not HttpOnly) for demo convenience; this is acceptable for local prototyping but not production-safe. Server-set HttpOnly cookies are recommended for production.
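The server-side detection step reduces to reading the session cookie from the request. A minimal sketch of that parsing, with `SESSION_COOKIE` and `hasSessionCookie` as hypothetical names (a Next.js middleware would feed in `request.headers.get("cookie") ?? ""` and redirect to the login route on `false`):

```typescript
// Illustrative cookie name; the skeleton's real name may differ.
const SESSION_COOKIE = "access_token";

// Parse a raw Cookie header into a name -> value map.
function parseCookies(header: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const pair of header.split(";")) {
    const eq = pair.indexOf("=");
    if (eq === -1) continue;
    out[pair.slice(0, eq).trim()] = decodeURIComponent(pair.slice(eq + 1).trim());
  }
  return out;
}

// True when the request carries a session cookie; middleware uses this
// to decide between letting the request through and redirecting.
function hasSessionCookie(cookieHeader: string): boolean {
  return Boolean(parseCookies(cookieHeader)[SESSION_COOKIE]);
}
```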
SSE makes ordered streaming of textual chunks simple; WebSocket supports bidirectional events for presence/typing and room-level messages.
Trade-off: SSE is unidirectional and unsuited to bidirectional commands; both transports add operational complexity at scale (WebSocket requires sticky sessions or a managed pub/sub layer).
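The "ordered streaming of textual chunks" claim rests on SSE's simple wire format: `data:` lines terminated by a blank line. A sketch of the client-side frame parsing, assuming that plain-text framing (the exact event shape the gateway emits is not specified here):

```typescript
// Split an accumulated SSE buffer into complete events plus a leftover
// partial frame. Each complete event ends with an empty line ("\n\n").
function parseSseEvents(buffer: string): { events: string[]; rest: string } {
  const events: string[] = [];
  let rest = buffer;
  let sep: number;
  while ((sep = rest.indexOf("\n\n")) !== -1) {
    const frame = rest.slice(0, sep);
    rest = rest.slice(sep + 2);
    for (const line of frame.split("\n")) {
      // Only "data:" lines carry chunk payload in this sketch.
      if (line.startsWith("data:")) events.push(line.slice(5).trim());
    }
  }
  // rest holds an incomplete frame to prepend to the next network read.
  return { events, rest };
}
```

In the browser, `EventSource` does this parsing for you; a manual parser like this is only needed when streaming over `fetch` (e.g. to send a POST body or custom headers).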
Provides a production-like, strongly-typed UI surface for prototyping user flows and integration patterns for document-centric AI interactions (file management, AI-assisted chat & streaming) without requiring a backend to be present.
Monolithic single-repository web application built with Next.js App Router; layered client/server components with a clear adapter/facade for backend interactions (gatewayApi). Realtime and streaming are handled by WebSocket and Server-Sent Events (SSE).
Key measurable signals: mock_response_latency_ms (300), max_upload_size_mb (200).
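How these two signals might surface in client code, sketched with hypothetical helper names (`canUpload`, `mockDelay`):

```typescript
const MOCK_LATENCY_MS = 300;   // mock_response_latency_ms
const MAX_UPLOAD_MB = 200;     // max_upload_size_mb

// Reject oversized files before any network round trip.
function canUpload(sizeBytes: number): boolean {
  return sizeBytes <= MAX_UPLOAD_MB * 1024 * 1024;
}

// Delay helper used by mock responses to approximate real latency.
function mockDelay(): Promise<void> {
  return new Promise((r) => setTimeout(r, MOCK_LATENCY_MS));
}
```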
| Dimension | Selected Option | Impact | Compromise |
|---|---|---|---|
| Session storage approach | Cookie + localStorage (client-set cookies) for demo convenience | Easy server-side detection in middleware and fast local development without a backend | Not secure (cookies not HttpOnly); higher XSS/CSRF risk compared to server-set HttpOnly cookies |
| Streaming transport | SSE for AI streams, WebSocket for realtime chat events | Simple ordered text streaming (SSE) and full-duplex realtime (WebSocket) | Operational complexity when scaling; need managed infra or sticky sessions and pub/sub glue |
Move token management to server-issued HttpOnly cookies (refresh tokens) and limit client-side token persistence to in-memory or short-lived access tokens.
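What the server-issued cookie would look like, sketched as a `Set-Cookie` header builder (`buildRefreshCookie` is a hypothetical helper; a Next.js route handler would attach its output via `response.headers.append("Set-Cookie", ...)`):

```typescript
// Build a production-safe refresh-token cookie. HttpOnly makes it invisible
// to client JS, closing the gap the demo's client-set cookies leave open.
function buildRefreshCookie(token: string, maxAgeSeconds: number): string {
  return [
    `refresh_token=${encodeURIComponent(token)}`,
    `Max-Age=${maxAgeSeconds}`,
    "Path=/",
    "HttpOnly",      // not readable via document.cookie
    "Secure",        // HTTPS only
    "SameSite=Lax",  // limits cross-site sending, reducing CSRF exposure
  ].join("; ");
}
```

The short-lived access token can then live in memory only, with the browser transparently sending the refresh cookie to a token-refresh endpoint.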
Add structured observability: integrate a centralized error/telemetry system (Sentry/Datadog), log correlation with X-Correlation-ID, and add telemetry for SSE/WebSocket errors and reconnection events.
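The correlation piece can be sketched as a small header helper applied to every outgoing request; `withCorrelationId` is an illustrative name, and the header key follows the X-Correlation-ID convention named above:

```typescript
import { randomUUID } from "crypto";

// Merge an X-Correlation-ID into outgoing request headers so client-side
// logs and telemetry can be joined with backend logs for the same request.
function withCorrelationId(
  headers: Record<string, string> = {},
  id: string = randomUUID(),
): Record<string, string> {
  return { ...headers, "X-Correlation-ID": id };
}

// Usage sketch:
//   fetch(url, { headers: withCorrelationId({ Accept: "application/json" }) })
```

The same ID can be attached to SSE reconnect attempts and WebSocket error events before forwarding them to Sentry/Datadog, so a single stream's lifecycle is traceable end to end.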