
AI-Learning-Platform-UI

A TypeScript Next.js 16 (App Router) frontend skeleton for an AI-powered learning workspace. It provides a 3-pane workspace (left folder tree, center file viewer, right AI chat panel), file/folder CRUD UI, PDF viewer shell, profile and set...

Role: Frontend Engineer
Duration: 2 months
Type: AI

Key Achievement Metrics

mock_response_latency_ms: 300
max_upload_size_mb: 200

Architecture View


Decision Log

Use a gateway facade + typed domain modules

Clear separation between UI and transport, single place to add error mapping, and easy mock fallback injection for dev.

Trade-off: Slight indirection and duplication of DTO shapes; requires maintenance when backend API evolves.
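The gateway facade decision above can be sketched as follows. This is a minimal illustration, not the project's actual code: the names (createGatewayApi, FilesApi, mockFilesApi) and the files domain are hypothetical stand-ins for the real gatewayApi modules.

```typescript
// Hypothetical sketch of a gateway facade with typed domain modules
// and a mock fallback for dev. All names here are illustrative.

interface FileNode {
  id: string;
  name: string;
  isFolder: boolean;
}

// Typed domain module: one interface per backend domain.
interface FilesApi {
  list(folderId: string): Promise<FileNode[]>;
}

// Mock fallback used when no backend is available (local dev).
const mockFilesApi: FilesApi = {
  async list(folderId: string): Promise<FileNode[]> {
    return [{ id: `${folderId}/readme`, name: "README.md", isFolder: false }];
  },
};

// Real transport module; errors are mapped in this single place.
function createHttpFilesApi(baseUrl: string): FilesApi {
  return {
    async list(folderId: string): Promise<FileNode[]> {
      const res = await fetch(`${baseUrl}/folders/${folderId}/files`);
      if (!res.ok) throw new Error(`Files API failed: ${res.status}`);
      return (await res.json()) as FileNode[];
    },
  };
}

// Facade: the UI imports this and never touches the transport directly,
// which is what makes mock injection a one-line swap.
function createGatewayApi(opts: { baseUrl?: string }) {
  return {
    files: opts.baseUrl ? createHttpFilesApi(opts.baseUrl) : mockFilesApi,
  };
}
```

The duplication trade-off shows up here too: FileNode mirrors a backend DTO and must be updated whenever the API shape changes.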

Cookie-based access + refresh token strategy with client-side refresh

Enables server-side route guard in Next.js middleware and server components (via getServerAccessToken) while allowing client to refresh tokens transparently.

Trade-off: Cookies are set from client JS (not HttpOnly) in the skeleton for demo convenience — acceptable for local prototyping but not production-safe; server-set HttpOnly cookies are recommended for production.
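The client-side refresh decision can be reduced to a small pure check, sketched below under the assumption that the access token and its expiry timestamp live in cookies. The cookie names ("access_token", "access_expires_at") and the 30-second skew are illustrative, not the skeleton's actual values.

```typescript
// Minimal sketch: decide whether the client should refresh before the
// next request. Cookie names and the skew window are assumptions.

function parseCookies(header: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const pair of header.split(";")) {
    const idx = pair.indexOf("=");
    if (idx === -1) continue;
    out[pair.slice(0, idx).trim()] = decodeURIComponent(pair.slice(idx + 1).trim());
  }
  return out;
}

// True when the access token is missing or expires within `skewMs`,
// i.e. the client should hit the refresh endpoint transparently now
// rather than let the next API call fail with a 401.
function needsRefresh(cookieHeader: string, nowMs: number, skewMs = 30_000): boolean {
  const cookies = parseCookies(cookieHeader);
  const token = cookies["access_token"];
  const expiresAt = Number(cookies["access_expires_at"]);
  if (!token || Number.isNaN(expiresAt)) return true;
  return expiresAt - nowMs <= skewMs;
}
```

The same parse-and-check logic works server-side in Next.js middleware, which is what makes the cookie strategy usable for route guards.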

Combine SSE and WebSocket for streaming & realtime

SSE makes ordered streaming of textual chunks simple; WebSocket supports bidirectional events for presence/typing and room-level messages.

Trade-off: SSE is unidirectional and not suited for bidirectional commands; operational complexity when scaling (WebSocket requires sticky sessions or a managed pub/sub layer).
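The claim that SSE keeps ordered text streaming simple comes down to its line-based wire format ("data:" lines, events separated by blank lines). A sketch of the chunk extraction, with illustrative payloads:

```typescript
// Sketch of incremental SSE parsing. The "data:" field and blank-line
// event delimiter follow the SSE wire format; payload contents are made up.

function parseSseChunk(buffer: string): { events: string[]; rest: string } {
  const events: string[] = [];
  const parts = buffer.split("\n\n");
  // The final part may be an incomplete event; carry it into the next read.
  const rest = parts.pop() ?? "";
  for (const part of parts) {
    const data = part
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice(5).trimStart())
      .join("\n");
    if (data) events.push(data);
  }
  return { events, rest };
}
```

Because events arrive in order on a single HTTP response, no sequence numbers or reordering logic are needed, which is exactly the property WebSocket does not give you for free.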

Architecture Narrative

Challenge

Provides a production-like, strongly-typed UI surface for prototyping user flows and integration patterns for document-centric AI interactions (file management, AI-assisted chat & streaming) without requiring a backend to be present.

Solution

Monolithic single-repository web application built with Next.js App Router; layered client/server components with a clear adapter/facade for backend interactions (gatewayApi). Realtime and streaming are handled by WebSocket and Server-Sent Events (SSE).

Result

Key measurable signals: mock_response_latency_ms (300), max_upload_size_mb (200).

Trade-off Matrix

Dimension: Session storage approach
Selected Option: Cookie + localStorage (client-set cookies) for demo convenience
Impact: Easy server-side detection in middleware and fast local development without a backend
Compromise: Not secure (cookies are not HttpOnly); higher XSS/CSRF risk compared to server-set HttpOnly cookies

Dimension: Streaming transport
Selected Option: SSE for AI streams, WebSocket for realtime chat events
Impact: Simple ordered text streaming (SSE) and full-duplex realtime (WebSocket)
Compromise: Operational complexity when scaling; needs managed infrastructure or sticky sessions and pub/sub glue

What I'd Do Differently


Move token management to server-issued HttpOnly cookies (refresh tokens) and limit client-side token persistence to in-memory or short-lived access tokens.
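The server-issued HttpOnly cookie improvement can be sketched as a Set-Cookie builder. The cookie name, path, and max-age below are illustrative assumptions, not a prescribed configuration.

```typescript
// Hedged sketch: building a hardened Set-Cookie value for a refresh token.
// Name, path, and lifetime are illustrative.

function buildRefreshCookie(token: string, maxAgeSeconds: number): string {
  return [
    `refresh_token=${encodeURIComponent(token)}`,
    "HttpOnly",           // not readable from client JS, mitigating XSS token theft
    "Secure",             // only transmitted over HTTPS
    "SameSite=Strict",    // reduces CSRF exposure
    "Path=/auth/refresh", // only sent to the refresh endpoint
    `Max-Age=${maxAgeSeconds}`,
  ].join("; ");
}
```

With this in place, the client keeps only a short-lived access token in memory, and the refresh token never crosses into JavaScript at all.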


Add structured observability: integrate a centralized error/telemetry system (Sentry/Datadog), log correlation with X-Correlation-ID, and add telemetry for SSE/WebSocket errors and reconnection events.
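The X-Correlation-ID piece of that observability plan can be sketched as a small header decorator. The helper names (newCorrelationId, withCorrelationId) are hypothetical; only the header name comes from the text above.

```typescript
// Sketch of correlation-ID propagation for outgoing requests, assuming the
// backend logs and echoes X-Correlation-ID. Helper names are illustrative.

function newCorrelationId(): string {
  // crypto.randomUUID exists in modern Node and browsers; fall back otherwise.
  return typeof crypto !== "undefined" && "randomUUID" in crypto
    ? crypto.randomUUID()
    : `cid-${Date.now()}-${Math.random().toString(36).slice(2)}`;
}

function withCorrelationId(
  headers: Record<string, string>,
  id: string = newCorrelationId(),
): Record<string, string> {
  // Reuse an ID set by an upstream caller instead of minting a new one,
  // so one user action correlates across fetch, SSE, and WebSocket logs.
  return { ...headers, "X-Correlation-ID": headers["X-Correlation-ID"] ?? id };
}
```

Attaching the same ID to SSE reconnection and WebSocket error telemetry is what turns scattered client logs into a traceable request story.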

Estifanos Kebede

System Engineer & Full Stack Developer


SYSTEM: ESTIFANOS.PORTFOLIO

STATUS: OPERATIONAL

LAST_UPDATED: 2026

© 2026 Estifanos Kebede. Built with precision and intent.