steven-tey/novel
Notion-style WYSIWYG editor with AI-powered autocompletion.
Rich text editor with AI-powered autocompletion and Notion-style slash commands
User input flows through the Tiptap editor which applies extensions for formatting and command detection. When slash commands are triggered, the system captures the query and range, generates suggestions (including AI completions), and replaces text at the specified range. Images are handled through drag-and-drop events that trigger upload functions and insert image nodes.
Under the hood, the system uses 2 feedback loops, 2 data pools, and 3 control points to manage its runtime behavior.
A 6-component library. 61 files analyzed. Data flows through 6 distinct pipeline stages.
How Data Flows Through the System
- Editor Initialization — EditorRoot creates Tiptap editor instance with configured extensions (StarterKit, AIHighlight, SlashCommand, etc.) and initializes Jotai atoms for state management [Extension configurations → Editor instance]
- Content Input — User types content which flows through Tiptap's document model, applying extensions like MarkdownExtension for real-time formatting and CustomKeymap for keyboard shortcuts [Keyboard events → JSONContent]
- Command Detection — SlashCommand extension detects '/' character input, captures text position as Range, and triggers EditorCommandOut to show suggestion dropdown [Text input → Range]
- Suggestion Generation — Command system generates suggestion list including static items (headings, lists) and AI completion options, filtering by query text and calling OpenAI API for completions [Query string → SuggestionItem array]
- Content Replacement — Selected suggestion replaces text at stored Range position, with AI completions applying AIHighlight extension during generation and removing it when complete [SuggestionItem selection → JSONContent]
- Image Processing — UploadImagesPlugin handles drag-and-drop or paste events, validates image files, calls configured uploadFn (Vercel Blob in demo app), and inserts TiptapImage nodes with resulting URLs [File objects → Image nodes]
Data Models
The data structures that flow between stages — the contracts that hold the system together.
packages/headless/src/components/index.ts — Tiptap's JSONContent type, representing the editor document structure as nested objects with type, attrs, and content arrays
Created from user input, transformed by extensions, serialized for persistence or AI completion
packages/headless/src/utils/atoms.ts — Tiptap Range object with from: number and to: number, indicating text selection boundaries
Set when slash commands are triggered, used to replace text ranges with generated content
apps/web/components/tailwind/selectors/node-selector.tsx — Object with name: string, icon: LucideIcon, command: function, and isActive: function for editor toolbar actions
Static configuration objects that define available formatting options in bubble menus
packages/headless/src/plugins/upload-images.ts — Function type (file: File) => Promise&lt;string&gt; that handles image upload and returns a URL
Configured by consuming application, called when images are dropped or pasted into editor
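These contracts can be sketched as plain TypeScript shapes. The interfaces below are simplified local mirrors written for illustration, not imports from novel or Tiptap; SelectionRange stands in for Tiptap's Range, and FileLike for the DOM File type.

```typescript
// Simplified mirror of Tiptap's JSONContent: nested nodes with
// type, attrs, and content arrays, as described above.
interface JSONContent {
  type?: string;
  attrs?: Record<string, unknown>;
  content?: JSONContent[];
  text?: string;
}

// Stand-in for Tiptap's Range: selection boundaries in the document.
interface SelectionRange {
  from: number;
  to: number;
}

// Stand-in for the DOM File type, so this sketch is self-contained.
interface FileLike {
  name: string;
  type: string;
  size: number;
}

// The upload contract supplied by the consuming application.
type UploadFn = (file: FileLike) => Promise<string>;

// A document node and the selection a slash command would replace:
const doc: JSONContent = {
  type: "doc",
  content: [
    { type: "paragraph", content: [{ type: "text", text: "/head" }] },
  ],
};
const commandRange: SelectionRange = { from: 1, to: 6 };
```

The UploadFn alias is the piece the consuming application must provide; everything else is produced and consumed inside the editor.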
Hidden Assumptions
Things this code relies on but never validates. These are the things that cause silent failures when the system changes.
The /api/upload endpoint exists and accepts POST requests with file body and specific headers (content-type, x-vercel-filename), returning JSON with 'url' field on success
If this fails: when the API route is missing, renamed, or returns a different response format, image uploads fail silently or with cryptic errors; the promise resolves with the File object instead of a URL, breaking image display
apps/web/components/tailwind/image-upload.ts:onUpload
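Given that contract, a defensive caller can be sketched as a pure function. The parseUploadResponse name and shape below are hypothetical; the sketch only encodes the expectation stated above that a 200 response carries a JSON body with a string url field.

```typescript
type UploadResult =
  | { ok: true; url: string }
  | { ok: false; reason: string };

// Classify an /api/upload response instead of assuming its shape.
function parseUploadResponse(status: number, body: unknown): UploadResult {
  if (status !== 200) {
    return { ok: false, reason: `Upload failed with HTTP ${status}` };
  }
  const url = (body as { url?: unknown } | null)?.url;
  if (typeof url !== "string") {
    return { ok: false, reason: "Response JSON has no 'url' string field" };
  }
  return { ok: true, url };
}
```

Centralizing the check means a renamed route or changed payload surfaces as an explicit reason string instead of a File object leaking through as if it were a URL.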
The editor.storage.markdown.serializer exists and has a serialize() method that returns a string
If this fails: when MarkdownExtension isn't configured or the storage structure changes, getPrevText() crashes with 'Cannot read property serialize of undefined', breaking AI completion context generation
packages/headless/src/utils/index.ts:getPrevText
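A guarded version of that access is easy to sketch. The EditorLike shape and safeSerialize helper below are hypothetical stand-ins for the Tiptap editor object, showing only the optional-chaining fallback, not novel's actual implementation.

```typescript
// Minimal stand-in for the editor storage shape assumed above.
interface EditorLike {
  storage?: {
    markdown?: { serializer?: { serialize(doc: unknown): string } };
  };
}

// Fall back to an empty string when MarkdownExtension is absent,
// instead of crashing on 'serialize of undefined'.
function safeSerialize(editor: EditorLike, doc: unknown): string {
  return editor.storage?.markdown?.serializer?.serialize(doc) ?? "";
}
```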
File objects always have .type and .size properties with expected formats - file.type contains 'image/' substring for images and file.size is in bytes
If this fails: Malformed File objects or files without proper MIME types bypass validation, potentially uploading non-images or oversized files that crash the upload endpoint
apps/web/components/tailwind/image-upload.ts:validateFn
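A defensive validator along those lines might look like the sketch below. The FileLike interface and the error strings are illustrative; only the 'image/' MIME check and the 20 MB cap come from this analysis.

```typescript
const MAX_UPLOAD_BYTES = 20 * 1024 * 1024; // the 20 MB client-side cap

// Structural stand-in for the DOM File properties the check relies on.
interface FileLike {
  type: string;
  size: number;
}

// Returns an error message, or null when the file looks like a valid image.
function validateImageFile(file: FileLike): string | null {
  if (typeof file.type !== "string" || !file.type.includes("image/")) {
    return "File type not supported (expected an image MIME type).";
  }
  if (typeof file.size !== "number" || file.size > MAX_UPLOAD_BYTES) {
    return "File too large (max 20 MB).";
  }
  return null;
}
```

The typeof checks are what close the gap described above: a malformed File-like object fails loudly here instead of reaching the upload endpoint.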
A DOM element with id 'slash-command' exists in the document when navigation keys are pressed during command mode
If this fails: when the command component unmounts or the ID changes, keyboard navigation (ArrowUp, ArrowDown, Enter) fails silently; users can't navigate the command palette with the keyboard
packages/headless/src/components/editor-command.tsx:onKeyDown
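One way to make that dependency explicit is to resolve the target element before forwarding keys. The resolveCommandTarget helper below is a hypothetical sketch; only the 'slash-command' id and the three navigation keys come from this analysis.

```typescript
const navigationKeys = new Set(["ArrowUp", "ArrowDown", "Enter"]);

// Structural stand-in for document.getElementById.
interface DocumentLike {
  getElementById(id: string): object | null;
}

// Returns the palette element to forward the key event to, or null
// when the key is not a navigation key or the palette is unmounted.
function resolveCommandTarget(doc: DocumentLike, key: string): object | null {
  if (!navigationKeys.has(key)) return null;
  return doc.getElementById("slash-command");
}
```

A caller that gets null back can let the event propagate normally, turning the silent failure into a defined fallback.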
Image preloading completes before the promise resolves, ensuring the image is ready for display when URL is returned
If this fails: when image.onload never fires (network issues, invalid URLs, CORS problems), the upload promise never resolves, leaving the UI in a loading state indefinitely
apps/web/components/tailwind/image-upload.ts:image.onload
20MB is a reasonable file size limit that won't exceed server memory or request size limits
If this fails: when the server enforces smaller limits (the Next.js API route default is 1 MB), uploads that pass client validation fail server-side with unclear error messages
apps/web/components/tailwind/image-upload.ts:validateFn
HTTP status codes have specific meanings: 200 = success with URL, 401 = missing blob token, anything else = unknown error
If this fails: when the API returns other status codes (403, 413, 500) or uses 200 for error states, the error-handling logic misclassifies failures and shows the wrong error messages
apps/web/components/tailwind/image-upload.ts:res.status
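That branching can be kept honest by routing every status through one classifier. The uploadErrorMessage function below is a sketch; only the 200-success and 401-missing-token meanings come from this analysis, and every other code falls through to a generic message rather than being misclassified.

```typescript
// Map an /api/upload HTTP status to a user-facing error, or null on success.
function uploadErrorMessage(status: number): string | null {
  if (status === 200) return null;
  if (status === 401) {
    return "BLOB_READ_WRITE_TOKEN is missing; image uploads are disabled.";
  }
  return `Error uploading image (HTTP ${status}). Please try again.`;
}
```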
The tunnel instance from EditorCommandTunnelContext always has an Out component that can render without props
If this fails: when the tunnel-rat library changes its API or the context is not properly initialized, rendering fails with 'Cannot read property Out of undefined'
packages/headless/src/components/editor-command.tsx:EditorCommandOut
The document object and addEventListener are available (browser environment) and keyboard events can be prevented and re-dispatched
If this fails: in server-side rendering or other non-browser environments, document is undefined, causing crashes during component mount
packages/headless/src/components/editor-command.tsx:navigationKeys
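A minimal environment guard avoids the crash. The helper below reads document off globalThis so it type-checks and runs outside the browser too; it is an illustrative sketch, not novel's actual check.

```typescript
// True only in environments where keyboard listeners can be attached.
function canBindKeyboardNavigation(): boolean {
  const doc = (globalThis as { document?: { addEventListener?: unknown } })
    .document;
  return typeof doc?.addEventListener === "function";
}
```

Under plain Node.js (no DOM) this returns false, so a component can skip binding listeners instead of crashing during mount.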
Strings containing dots but no spaces are likely URLs that can be prefixed with 'https://' to become valid URLs
If this fails: False positives like 'file.txt' or 'user.name' get converted to invalid HTTPS URLs, while valid URLs with spaces get rejected
packages/headless/src/utils/index.ts:getUrlFromString
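The heuristic can be written out directly, which also makes its failure modes visible. The function name matches the util this section points at, but the body below is a reconstruction for illustration, not the library's code.

```typescript
// Accept strings that already parse as URLs; otherwise treat
// "contains a dot, no spaces" as a bare domain and prefix https://.
function getUrlFromString(str: string): string | null {
  try {
    return new URL(str).toString();
  } catch {
    // fall through to the dot-and-no-space heuristic
  }
  if (str.includes(".") && !str.includes(" ")) {
    try {
      return new URL(`https://${str}`).toString();
    } catch {
      return null;
    }
  }
  return null;
}
```

Note the false positive described above: 'file.txt' comes back as https://file.txt/, a syntactically valid but almost certainly unintended URL.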
System Behavior
How the system operates at runtime — where data accumulates, what loops, what waits, and what controls what.
Data Pools
- Editor State — Tiptap editor document state containing current content, selection, and transaction history
- Command Atoms — Jotai atoms storing the current slash command query and selection range, coordinating between command detection and the suggestion UI
Feedback Loops
- AI Completion Loop (recursive, reinforcing) — Trigger: User selects AI completion option from slash command. Action: System highlights text, calls OpenAI API, streams response, and updates editor content. Exit: Completion finishes and highlight is removed.
- Command Navigation (polling, balancing) — Trigger: Arrow key presses while command menu is open. Action: EditorCommand intercepts keyboard events and dispatches them to command palette for navigation. Exit: Command is selected or menu is dismissed.
Delays
- AI Response Streaming (async-processing, ~2-5 seconds) — Text remains highlighted while AI generates completion, providing visual feedback of processing state
- Image Upload (async-processing, ~1-3 seconds) — Placeholder image shown with loading state until upload completes and real URL is available
Control Points
- OPENAI_API_KEY (env-var) — Controls: Whether AI completion features are enabled in the demo application
- BLOB_READ_WRITE_TOKEN (env-var) — Controls: Whether image uploads work in demo app, falls back to local file display if missing
- Extensions Configuration (runtime-toggle) — Controls: Which editor features are enabled (AI, image upload, markdown, etc.). Default: Full feature set enabled
Technology Stack
- Tiptap — Core rich text editing engine providing the ProseMirror-based document model and extension system
- Next.js — React framework for the demo application, with API routes for AI completion and image upload
- Jotai — Atomic state management for command queries and selection ranges
- OpenAI — AI completion service called from Next.js API routes to generate text suggestions
- Vercel Blob — Image storage service for uploaded images in the demo application
- Tailwind CSS — Styling system for the demo application's UI components
- Unstyled UI primitives for popover, dialog, and command components
Key Components
- EditorRoot (orchestrator) — Main editor orchestrator that initializes the Tiptap editor with extensions, manages state atoms, and provides context for child components (packages/headless/src/components/editor.tsx)
- EditorCommand (processor) — Command palette interface that captures slash command queries, manages keyboard navigation, and coordinates with the suggestion system (packages/headless/src/components/editor-command.tsx)
- SlashCommand (processor) — Tiptap extension that detects slash character input, creates the suggestion dropdown with AI completion options, and handles text replacement (packages/headless/src/extensions/slash-command.tsx)
- AIHighlight (transformer) — Extension that highlights text being processed by AI, providing visual feedback during completion generation (packages/headless/src/extensions/ai-highlight.ts)
- UploadImagesPlugin (processor) — ProseMirror plugin that handles image drag-and-drop and paste events, validates files, and coordinates with upload functions (packages/headless/src/plugins/upload-images.ts)
- uploadFn (adapter) — Connects editor image uploads to Vercel Blob storage via an API route, handling validation and error states (apps/web/components/tailwind/image-upload.ts)
Package Structure
packages/headless — Core headless editor library built on Tiptap, providing AI autocompletion, slash commands, and rich text editing components
apps/web — Next.js demonstration app showing Novel editor integration with image uploads and OpenAI completion
Frequently Asked Questions
What is novel used for?
novel is a rich text editor with AI-powered autocompletion and Notion-style slash commands. steven-tey/novel is a 6-component library written in TypeScript; data flows through 6 distinct pipeline stages, and the codebase contains 61 files.
How is novel architected?
novel is organized into 4 architecture layers: Core Editor, Extensions, Plugins, Demo Application. Data flows through 6 distinct pipeline stages. This layered structure keeps concerns separated and modules independent.
How does data flow through novel?
Data moves through 6 stages: Editor Initialization → Content Input → Command Detection → Suggestion Generation → Content Replacement → Image Processing. User input flows through the Tiptap editor, which applies extensions for formatting and command detection. When slash commands are triggered, the system captures the query and range, generates suggestions (including AI completions), and replaces text at the specified range. Images are handled through drag-and-drop events that trigger upload functions and insert image nodes. This pipeline design reflects a complex multi-stage processing system.
What technologies does novel use?
The core stack includes Tiptap (Core rich text editing engine providing ProseMirror-based document model and extension system), Next.js (React framework for the demo application with API routes for AI completion and image upload), Jotai (Atomic state management for command queries and selection ranges), OpenAI (AI completion service called from Next.js API routes to generate text suggestions), Vercel Blob (Image storage service for uploaded images in the demo application), Tailwind CSS (Styling system for the demo application UI components), and 1 more. A focused set of dependencies that keeps the build manageable.
What system dynamics does novel have?
novel exhibits 2 data pools (Editor State, Command Atoms), 2 feedback loops, 3 control points, and 2 delays. One feedback loop is recursive (the AI completion loop) and one is polling-based (command navigation). These runtime behaviors shape how the system responds to load, failures, and configuration changes.
What design patterns does novel use?
4 design patterns detected: Extension Pattern, Atomic State Management, Plugin Architecture, Suggestion System.
Analyzed on April 20, 2026 by CodeSea. Written by Karolina Sarna.