179 changes: 179 additions & 0 deletions apps/web/client/src/app/llms-full.txt/route.ts
@@ -0,0 +1,179 @@
async function getFullDocumentation(docsUrl: string): Promise<string> {
    const baseContent = `# Onlook - Complete Documentation

> Open-source visual editor for React apps. Design directly in your live React app and generate clean code.

## Project Overview

Onlook is a "Cursor for Designers" that lets designers make live edits to React and TailwindCSS projects directly in the browser DOM, providing seamless integration between design and development.

### Key Features

- **Visual Editing**: Edit React components directly in the browser
- **Code Generation**: Automatically generates clean, production-ready code
- **TailwindCSS Integration**: Full support for Tailwind styling
- **AI Assistance**: Built-in AI chat for design and development help
- **Real-time Preview**: See changes instantly as you design
- **Component Library**: Reusable components and design systems

### Architecture

Onlook is structured as a monorepo with several interconnected apps and packages:

- **Web App**: Next.js application with visual editor interface
- **Documentation**: Comprehensive guides and API references
- **Packages**: Shared utilities, UI components, and core functionality
- **Backend**: Supabase integration for user management and data storage

### Technology Stack

- **Frontend**: Next.js, React, TailwindCSS
- **Backend**: Supabase, tRPC, Drizzle ORM
- **AI Integration**: Anthropic Claude, OpenRouter
- **Development**: TypeScript, Bun, Docker
- **Deployment**: Vercel, CodeSandbox containers

## Getting Started

### Installation

1. Clone the repository:
\`\`\`bash
git clone https://github.com/onlook-dev/onlook.git
cd onlook
\`\`\`

2. Install dependencies:
\`\`\`bash
bun install
\`\`\`

3. Set up environment variables:
\`\`\`bash
cp .env.example .env.local
\`\`\`

4. Start the development server:
\`\`\`bash
bun dev
\`\`\`

### First Project

1. **Create a New Project**: Use the project creation wizard
2. **Import Existing Project**: Connect your React + TailwindCSS project
3. **Start Designing**: Use the visual editor to modify components
4. **Generate Code**: Export clean code changes to your project

### Core Concepts

- **Visual Editor**: The main interface for designing components
- **Style Editor**: Modify TailwindCSS classes through a visual interface
- **Component Tree**: Navigate and select elements in your React app
- **AI Chat**: Get help with design decisions and code generation
- **Code Export**: Generate and apply code changes to your project

## API Reference

### Core APIs

- **Project Management**: Create, update, and manage projects
- **Component Editing**: Modify React components and their properties
- **Style Management**: Apply and manage TailwindCSS classes
- **AI Integration**: Chat with AI for design assistance
- **Code Generation**: Generate and export code changes

### Authentication

Onlook uses Supabase for authentication and user management:

- **Sign Up/Sign In**: Email-based authentication
- **User Profiles**: Manage user settings and preferences
- **Project Access**: Control access to projects and collaboration

### Data Models

- **Projects**: Container for your React applications
- **Components**: Individual React components within projects
- **Styles**: TailwindCSS classes and custom styles
- **Conversations**: AI chat history and context

## Contributing

### Development Setup

1. **Prerequisites**: Node.js 18+, Bun, Docker (optional)
2. **Environment**: Set up Supabase, AI providers, and other services
3. **Local Development**: Run the development server and containers
4. **Testing**: Run tests and ensure code quality

### Code Standards

- **TypeScript**: Strict type checking enabled
- **ESLint**: Code linting and formatting
- **Prettier**: Code formatting
- **Husky**: Pre-commit hooks for quality assurance

### Pull Request Process

1. Fork the repository and create a feature branch
2. Make your changes with appropriate tests
3. Ensure all tests pass and code is properly formatted
4. Submit a pull request with detailed description
5. Address review feedback and get approval

## Deployment

### Production Deployment

- **Web App**: Deployed on Vercel with automatic CI/CD
- **Documentation**: Static site generation and deployment
- **Backend**: Supabase managed services
- **Containers**: CodeSandbox for development environments

### Environment Configuration

- **Production**: Optimized builds with caching
- **Staging**: Testing environment for new features
- **Development**: Local development with hot reloading

## Community and Support

### Getting Help

- **Documentation**: Comprehensive guides and tutorials
- **Discord**: Active community for questions and discussions
- **GitHub Issues**: Bug reports and feature requests
- **Email**: Direct contact for business inquiries

### Contributing

- **Code Contributions**: Bug fixes, features, and improvements
- **Documentation**: Help improve guides and examples
- **Community**: Answer questions and help other users
- **Testing**: Report bugs and test new features

---

For the most up-to-date information, visit our documentation at ${docsUrl} or join our Discord community at https://discord.gg/hERDfFZCsH.
`;

    return baseContent;
}

export async function GET() {
    try {
        const docsUrl = process.env.DOCS_URL ?? 'https://docs.onlook.com';
        const content = await getFullDocumentation(docsUrl);

        return new Response(content, {
            headers: {
                'Content-Type': 'text/plain; charset=utf-8',
                'X-Robots-Tag': 'llms-txt',
            },
        });
    } catch (error) {
        console.error('Error generating llms-full.txt:', error);
        return new Response('Error generating documentation', { status: 500 });
    }
}
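Both llms routes return plain text with an `X-Robots-Tag` header. The response shape can be checked standalone with the Fetch `Response` class available in Node 18+ and edge runtimes; the body string here is a stand-in, not the route's real output:

```typescript
// Build a response the same way the route does; the content is illustrative.
const content = '# Onlook - Complete Documentation';

const res = new Response(content, {
    headers: {
        'Content-Type': 'text/plain; charset=utf-8',
        'X-Robots-Tag': 'llms-txt',
    },
});

// Header names are case-insensitive per the Fetch spec, so either
// 'Content-Type' or 'content-type' retrieves the same value.
const contentType = res.headers.get('content-type');
```

A `Response` constructed without an explicit `status` defaults to 200, which is why the route only sets a status on the error path.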
77 changes: 77 additions & 0 deletions apps/web/client/src/app/llms.txt/route.ts
@@ -0,0 +1,77 @@
interface LLMSSection {
    title: string;
    links: Array<[string, string]>;
}

interface LLMSData {
    title: string;
    description: string;
    sections: LLMSSection[];
}

function renderMarkdown(data: LLMSData): string {
    let output = `# ${data.title}\n\n> ${data.description}\n\n`;

    for (const section of data.sections) {
        output += `## ${section.title}\n\n`;
        for (const [text, url] of section.links) {
            output += `- [${text}](${url})\n`;
        }
        output += `\n`;
    }

    return output;
}

export function GET() {
    const docsUrl = process.env.DOCS_URL ?? 'https://docs.onlook.com';

    const llmsData: LLMSData = {
        title: 'Onlook',
        description:
            'Open-source visual editor for React apps. Design directly in your live React app and generate clean code.',
        sections: [
            {
                title: 'Getting Started',
                links: [
                    ['Documentation', docsUrl],
                    ['First Project', `${docsUrl}/getting-started/first-project`],
                    ['UI Overview', `${docsUrl}/getting-started/ui-overview`],
                    ['Core Features', `${docsUrl}/getting-started/core-features`],
                ],
            },
            {
                title: 'Tutorials',
                links: [
                    ['Importing Templates', `${docsUrl}/tutorials/importing-templates`],
                    ['Figma to Onlook', `${docsUrl}/tutorials/figma-to-onlook`],
                ],
            },
            {
                title: 'Contributing',
                links: [
                    ['Developer Guide', `${docsUrl}/contributing/developers`],
                    ['Running Locally', `${docsUrl}/contributing/developers/running-locally`],
                    ['Architecture', `${docsUrl}/contributing/developers/architecture`],
                ],
            },
            {
                title: 'Resources',
                links: [
                    ['GitHub Repository', 'https://github.com/onlook-dev/onlook'],
                    ['FAQ', `${docsUrl}/faq`],
                    ['Discord Community', 'https://discord.gg/hERDfFZCsH'],
                ],
            },
        ],
    };

    const content = renderMarkdown(llmsData);

    return new Response(content, {
        headers: {
            'Content-Type': 'text/plain; charset=utf-8',
            'X-Robots-Tag': 'llms-txt',
        },
    });
}
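The `renderMarkdown` helper is pure and easy to exercise in isolation. A minimal sketch, repeating the route's types and rendering logic with illustrative sample data (the data below is not from the PR):

```typescript
interface LLMSSection {
    title: string;
    links: Array<[string, string]>;
}

interface LLMSData {
    title: string;
    description: string;
    sections: LLMSSection[];
}

// Same rendering shape as the route: H1 title, blockquote description,
// then one H2 per section followed by a bullet list of markdown links.
function renderMarkdown(data: LLMSData): string {
    let output = `# ${data.title}\n\n> ${data.description}\n\n`;
    for (const section of data.sections) {
        output += `## ${section.title}\n\n`;
        for (const [text, url] of section.links) {
            output += `- [${text}](${url})\n`;
        }
        output += `\n`;
    }
    return output;
}

const sample: LLMSData = {
    title: 'Example',
    description: 'A sample project.',
    sections: [{ title: 'Docs', links: [['Home', 'https://example.com']] }],
};

const rendered = renderMarkdown(sample);
```

Keeping the renderer separate from the handler like this is what makes the llms.txt output testable without spinning up the Next.js runtime.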
27 changes: 27 additions & 0 deletions apps/web/client/src/app/robots.ts
@@ -0,0 +1,27 @@
import type { MetadataRoute } from 'next';

const BASE_URL = process.env.APP_URL ?? 'https://onlook.com';

export default function robots(): MetadataRoute.Robots {
    return {
        rules: {
            userAgent: '*',
            allow: '/',
            disallow: [
                '/api/',
                '/auth/',
                '/callback/',
                '/webhook/',
                '/projects/',
                '/project/',
                '/invitation/',
                '/_next/',
                '/_vercel/',
                '/private/',
            ],
            crawlDelay: 1,
        },
Comment on lines +7 to +23
💡 Verification agent

🧩 Analysis chain

Validate crawlDelay support and sync disallow list with sitemap exclusions

  • Confirm Next.js MetadataRoute.Robots supports crawlDelay; some versions don’t emit it. If unsupported, it will be ignored.
  • Ensure this disallow list exactly matches paths excluded from sitemap generation (apps/web/client/src/lib/sitemap-utils.ts) to keep robots and sitemap consistent per PR objectives.


Run this to inspect sitemap-utils for exclusion patterns and compare:

If desired, extract disallow into a constant to make intent explicit:

+const DISALLOW = [
+  '/api/',
+  '/auth/',
+  '/callback/',
+  '/webhook/',
+  '/projects/',
+  '/project/',
+  '/invitation/',
+  '/_next/',
+  '/_vercel/',
+  '/private/',
+];
 ...
-            disallow: [
-                '/api/',
-                '/auth/',
-                '/callback/',
-                '/webhook/',
-                '/projects/',
-                '/project/',
-                '/invitation/',
-                '/_next/',
-                '/_vercel/',
-                '/private/',
-            ],
+            disallow: DISALLOW,

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "---- Inspecting sitemap exclusions ----"
fd -a "sitemap-utils.ts" | while read -r f; do
  echo "File: $f"
  rg -n -C2 -e 'exclude|excluded|EXCLUDE|DISALLOW|private|auth|api|webhook|project' "$f" || true
done

echo
echo "---- Current robots disallow list ----"
rg -n -A20 -e 'disallow:\s*\[' apps/web/client/src/app/robots.ts



Validate crawlDelay support and align robots/disallow with sitemap exclusions

  • Confirm your Next.js version’s MetadataRoute.Robots supports crawlDelay (older versions will ignore it). If unsupported, consider upgrading or removing the field.
  • The disallow list in apps/web/client/src/app/robots.ts and the EXCLUDED_PATTERNS in sitemap-utils.ts must match exactly. Right now /private/ is present in robots but missing from sitemap exclusions. Add it:
--- a/apps/web/client/src/lib/sitemap-utils.ts
+++ b/apps/web/client/src/lib/sitemap-utils.ts
@@ -6,7 +6,8 @@ const BASE_URL = process.env.APP_URL ?? 'https://onlook.com';
 const EXCLUDED_PATTERNS = [
   '/api/',
   '/auth/',
  '/callback/',
   '/webhook/',
   '/projects/',
   '/project/',
   '/invitation/',
+  '/private/',       // ← add this to mirror robots.ts
   '/_',              // covers both /_next/ and /_vercel/
 ];
  • Optional refactor: extract the shared exclusion array into a single constant (e.g. EXCLUDED_PATHS) and import it in both robots.ts and sitemap-utils.ts to avoid future drift.
📝 Committable suggestion


Suggested change
        rules: {
            userAgent: '*',
            allow: '/',
            disallow: [
                '/api/',
                '/auth/',
                '/callback/',
                '/webhook/',
                '/projects/',
                '/project/',
                '/invitation/',
                '/_next/',
                '/_vercel/',
                '/private/',
            ],
            crawlDelay: 1,
        },

// File: apps/web/client/src/lib/sitemap-utils.ts
const BASE_URL = process.env.APP_URL ?? 'https://onlook.com';
const EXCLUDED_PATTERNS = [
    '/api/',
    '/auth/',
    '/callback/',
    '/webhook/',
    '/projects/',
    '/project/',
    '/invitation/',
    '/private/', // ← add this to mirror robots.ts
    '/_', // covers both /_next/ and /_vercel/
];
🤖 Prompt for AI Agents
In apps/web/client/src/app/robots.ts around lines 7 to 23, validate whether your Next.js MetadataRoute.Robots implementation supports the crawlDelay field; if not, either remove the crawlDelay entry or upgrade Next.js to a version that supports it. Also ensure the robots disallow list exactly matches the EXCLUDED_PATTERNS in sitemap-utils.ts by adding '/private/' to the sitemap exclusions (or better, extract a shared EXCLUDED_PATHS constant and import it into both robots.ts and sitemap-utils.ts so the two lists remain identical going forward).

        sitemap: `${BASE_URL}/sitemap.xml`,
        host: BASE_URL,
    };
}
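The reviewer's optional refactor — one shared exclusion list imported by both robots.ts and sitemap-utils.ts — could be sketched as below. The module path and constant name are hypothetical; only the path strings come from the PR:

```typescript
// Hypothetical shared module, e.g. src/lib/excluded-paths.ts.
// One source of truth prevents robots.txt and the sitemap from drifting apart.
const EXCLUDED_PATHS: readonly string[] = [
    '/api/',
    '/auth/',
    '/callback/',
    '/webhook/',
    '/projects/',
    '/project/',
    '/invitation/',
    '/private/',
    '/_next/',
    '/_vercel/',
];

// robots.ts would spread the shared list into the disallow rule…
const disallow = [...EXCLUDED_PATHS];

// …and sitemap-utils.ts would reuse it as the prefix filter it already applies.
const isExcluded = (route: string) =>
    EXCLUDED_PATHS.some((pattern) => route.startsWith(pattern));
```

Note one wrinkle: the PR's sitemap list uses a single `'/_'` prefix where robots.ts spells out `'/_next/'` and `'/_vercel/'`; since the sitemap filter is prefix-based, the spelled-out entries above satisfy both callers.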
7 changes: 7 additions & 0 deletions apps/web/client/src/app/sitemap.ts
@@ -0,0 +1,7 @@
import type { MetadataRoute } from 'next';
import { getWebRoutes } from '@/lib/sitemap-utils';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
    const routes = await getWebRoutes();
    return routes;
}
100 changes: 100 additions & 0 deletions apps/web/client/src/lib/sitemap-utils.ts
@@ -0,0 +1,100 @@
import { readdir } from 'fs/promises';
import { join } from 'path';
import type { MetadataRoute } from 'next';

const BASE_URL = process.env.APP_URL ?? 'https://onlook.com';
const EXCLUDED_PATTERNS = [
    '/api/',
    '/auth/',
    '/callback/',
    '/webhook/',
    '/projects/',
    '/project/',
    '/invitation/',
    '/_',
];

async function scanAppDirectory(
    dir: string,
    basePath = '',
    excludedPatterns: string[],
): Promise<string[]> {
    const routes: string[] = [];

    try {
        const entries = await readdir(dir, { withFileTypes: true });

        for (const entry of entries) {
            const fullPath = join(dir, entry.name);
            const routePath = join(basePath, entry.name);

            if (entry.isDirectory()) {
                // Skip private folders (_), route groups ((...)), and dynamic segments ([...])
                if (
                    entry.name.startsWith('_') ||
                    entry.name.startsWith('(') ||
                    entry.name.startsWith('[')
                ) {
                    continue;
                }

                const subRoutes = await scanAppDirectory(fullPath, routePath, excludedPatterns);
                routes.push(...subRoutes);
            } else if (entry.name === 'page.tsx' || entry.name === 'page.ts') {
                // Normalize Windows separators and ensure a leading slash
                let route = basePath === '' ? '/' : basePath.replace(/\\/g, '/');

                if (!route.startsWith('/')) {
                    route = '/' + route;
                }

                const shouldExclude = excludedPatterns.some((pattern) => route.startsWith(pattern));

                if (!shouldExclude) {
                    routes.push(route);
                }
            }
        }
    } catch (error) {
        console.warn(`Failed to scan directory ${dir}:`, error);
    }

    return routes;
}

function getRouteMetadata(route: string) {
    const routeConfig = {
        '/': { priority: 1.0, changeFrequency: 'daily' as const },
        '/pricing': { priority: 0.9, changeFrequency: 'weekly' as const },
        '/about': { priority: 0.9, changeFrequency: 'weekly' as const },
        '/faq': { priority: 0.7, changeFrequency: 'weekly' as const },
        '/login': { priority: 0.6, changeFrequency: 'monthly' as const },
        '/terms-of-service': { priority: 0.5, changeFrequency: 'monthly' as const },
        '/sitemap': { priority: 0.3, changeFrequency: 'monthly' as const },
    } as const;

    return (
        routeConfig[route as keyof typeof routeConfig] ?? {
            priority: 0.5,
            changeFrequency: 'monthly' as const,
        }
    );
}

export async function getWebRoutes(): Promise<MetadataRoute.Sitemap> {
    const now = new Date();

    const appDir = join(process.cwd(), 'src', 'app');
    const discoveredRoutes = await scanAppDirectory(appDir, '', EXCLUDED_PATTERNS);

    const sitemapRoutes = discoveredRoutes.map((route) => {
        const { priority, changeFrequency } = getRouteMetadata(route);

        return {
            url: `${BASE_URL}${route}`,
            lastModified: now,
            changeFrequency,
            priority,
        };
    });

    return sitemapRoutes;
}
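The lookup-with-fallback pattern in `getRouteMetadata` can be exercised in isolation. A minimal sketch, trimmed to two known routes for brevity:

```typescript
// Known routes get hand-tuned sitemap values; anything else falls back
// to priority 0.5 / monthly, so newly added pages are never dropped.
function getRouteMetadata(route: string) {
    const routeConfig = {
        '/': { priority: 1.0, changeFrequency: 'daily' as const },
        '/pricing': { priority: 0.9, changeFrequency: 'weekly' as const },
    } as const;

    return (
        routeConfig[route as keyof typeof routeConfig] ?? {
            priority: 0.5,
            changeFrequency: 'monthly' as const,
        }
    );
}

const home = getRouteMetadata('/');
const unknown = getRouteMetadata('/blog/some-post'); // hypothetical route
```

The `route as keyof typeof routeConfig` cast is what lets an arbitrary string index the `as const` object; the `??` then catches the `undefined` that an unknown key produces.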