# LEM Chat — Web Components Design

**Date**: 2026-02-17
**Status**: Approved

## Summary

Standalone chat UI built with vanilla Web Components (Custom Elements + Shadow DOM). Connects to the MLX inference server's OpenAI-compatible SSE streaming endpoint. Zero framework dependencies. Single JS file output, embeddable anywhere.

## Components

| Element | Purpose |
|---------|---------|
| `<lem-chat>` | Container. Conversation state, SSE connection, config via attributes |
| `<lem-messages>` | Scrollable message list with auto-scroll anchoring |
| `<lem-message>` | Single message bubble. Streams tokens for assistant messages |
| `<lem-input>` | Text input, Enter to send, Shift+Enter for newline |

## Data Flow

```
User types in <lem-input>
  → dispatches 'lem-send' CustomEvent
  → <lem-chat> catches it
  → adds user message to <lem-messages>
  → POST /v1/chat/completions {stream: true, messages: [...history]}
  → reads SSE chunks via fetch + ReadableStream
  → appends tokens to streaming <lem-message>
  → on [DONE], finalises message
```

## Configuration

```html
<lem-chat
  endpoint="http://localhost:8080/v1/chat/completions"
  model="default"
  system-prompt="You are a helpful assistant."
  max-tokens="1024"
  temperature="0.7"
></lem-chat>
```

Attributes: `endpoint`, `model`, `system-prompt`, `max-tokens`, `temperature`

## Theming

Shadow DOM with CSS custom properties:

```css
--lem-bg: #1a1a1e;
--lem-msg-user: #2a2a3e;
--lem-msg-assistant: #1e1e2a;
--lem-accent: #5865f2;
--lem-text: #e0e0e0;
--lem-font: system-ui;
```

## Markdown

Minimal inline parsing: fenced code blocks, inline code, bold, italic. No library.

## File Structure

```
lem-chat/
├── index.html          # Demo page
├── src/
│   ├── lem-chat.ts     # Main container + SSE client
│   ├── lem-messages.ts # Message list with scroll anchoring
│   ├── lem-message.ts  # Single message with streaming
│   ├── lem-input.ts    # Text input
│   ├── markdown.ts     # Minimal markdown → HTML
│   └── styles.ts       # CSS template literals
├── package.json        # typescript + esbuild
└── tsconfig.json
```

Build: `esbuild src/lem-chat.ts --bundle --outfile=dist/lem-chat.js`

## Not in v1

- Model selection UI
- Conversation persistence
- File/image upload
- Syntax highlighting
- Typing indicators
- User avatars
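The POST described in Data Flow can be sketched as a pure request builder that `<lem-chat>` might call with its attribute values and the conversation history. The helper name `buildRequest` and its option shape are illustrative, not part of the design; only `model` and `stream` are always sent, with the optional attributes included when set:

```typescript
// Build the JSON body for POST /v1/chat/completions (stream: true),
// from the <lem-chat> attributes and the accumulated message history.
interface ChatMessage { role: "system" | "user" | "assistant"; content: string }

interface RequestOpts {
  model: string;
  systemPrompt?: string;   // from the system-prompt attribute, if present
  maxTokens?: number;      // from max-tokens
  temperature?: number;    // from temperature
  history: ChatMessage[];  // prior user/assistant turns
}

function buildRequest(opts: RequestOpts): string {
  // The system prompt, when configured, is prepended as the first message.
  const messages: ChatMessage[] = opts.systemPrompt
    ? [{ role: "system", content: opts.systemPrompt }, ...opts.history]
    : [...opts.history];

  const body: Record<string, unknown> = {
    model: opts.model,
    stream: true, // required for the SSE flow above
    messages,
  };
  if (opts.maxTokens !== undefined) body.max_tokens = opts.maxTokens;
  if (opts.temperature !== undefined) body.temperature = opts.temperature;
  return JSON.stringify(body);
}
```

Keeping this as a pure function keeps the attribute-to-request mapping testable without a live endpoint.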
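The "reads SSE chunks via fetch + ReadableStream" step in Data Flow can be sketched as a pure parser over a decoded text buffer; the fetch/reader loop would feed it and re-buffer any incomplete trailing line. The name `extractTokens` and the result shape are illustrative, not from the design:

```typescript
// Parse decoded SSE text from /v1/chat/completions (stream: true).
// Each event line looks like:
//   data: {"choices":[{"delta":{"content":"Hi"}}]}
// and the stream ends with the sentinel line:
//   data: [DONE]
interface SSEResult {
  tokens: string[]; // content deltas to append to the streaming <lem-message>
  done: boolean;    // true once the [DONE] sentinel is seen
}

function extractTokens(buffer: string): SSEResult {
  const tokens: string[] = [];
  let done = false;
  for (const line of buffer.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blank/comment lines
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") { done = true; break; }
    try {
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (typeof delta === "string") tokens.push(delta);
    } catch {
      // Incomplete JSON: the caller should keep this line in its buffer
      // and retry after the next chunk arrives.
    }
  }
  return { tokens, done };
}
```

Separating parsing from the network loop is what makes the streaming path unit-testable.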
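The Markdown section's "minimal inline parsing, no library" could look like the sketch below for the inline cases (inline code, bold, italic); fenced code blocks are omitted here, and a fuller version would protect code spans from the later substitutions. Function names are illustrative:

```typescript
// Escape HTML first so message content cannot inject markup into the
// shadow DOM, then apply the three inline substitutions.
function escapeHtml(s: string): string {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function renderInline(src: string): string {
  return escapeHtml(src)
    .replace(/`([^`]+)`/g, "<code>$1</code>")           // inline code
    .replace(/\*\*([^*]+)\*\*/g, "<strong>$1</strong>") // bold
    .replace(/\*([^*]+)\*/g, "<em>$1</em>");            // italic
}
```

Escape-then-substitute keeps the output safe to assign to `innerHTML` inside the component's shadow root.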