docetl2/website
Shreya Shankar 57a284bcb1
Fast Decomposition for Map Operations in DocWrangler (#472)
* refactor: docwrangler to use a faster decomposition flow, only if the last operation in a pipeline is a map operation.

* refactor: update MOAR documentation
2025-12-29 18:22:02 -06:00

README.md

This is DocWrangler, the frontend for DocETL. It is a Next.js project bootstrapped with create-next-app.

Getting Started

Setting up environment variables

The .env.local file controls the TypeScript frontend features in DocWrangler, such as:

  • The improve prompt feature
  • The chatbot assistant
  • Other UI-based LLM interactions

Note: This is separate from the root .env file, which is used by the backend Python server for actual pipeline execution.

Copy the .env.local.example file to .env.local and modify the environment variables:

# API key for UI assistant features (chatbot, improve prompt, etc.)
OPENAI_API_KEY=sk-xxx
OPENAI_API_BASE=https://api.openai.com/v1
MODEL_NAME=gpt-4o-mini  # Model for UI features, not pipeline execution

NEXT_PUBLIC_BACKEND_HOST=localhost
NEXT_PUBLIC_BACKEND_PORT=8000
NEXT_PUBLIC_HOSTED_DOCWRANGLER=false
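As a sketch of how these variables are consumed: Next.js only exposes variables prefixed with NEXT_PUBLIC_ to browser code, so OPENAI_API_KEY stays server-side (available only in API routes and server components), while the backend host and port can be read anywhere. The helper below is hypothetical and not part of the codebase; it just illustrates the convention.

```typescript
// Hypothetical helper illustrating the NEXT_PUBLIC_ convention:
// only NEXT_PUBLIC_-prefixed variables are inlined into browser bundles,
// so a secret like OPENAI_API_KEY must only be read in server-side code.
export function buildBackendUrl(
  host: string = process.env.NEXT_PUBLIC_BACKEND_HOST ?? "localhost",
  port: string = process.env.NEXT_PUBLIC_BACKEND_PORT ?? "8000"
): string {
  return `http://${host}:${port}`;
}
```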

With the environment variables in place, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.

This project uses next/font to automatically optimize and load Geist, a new font family for Vercel.

Learn More

To learn more about Next.js, take a look at the Next.js documentation and the interactive Next.js tutorial. You can also check out the Next.js GitHub repository - your feedback and contributions are welcome!

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.