Unable to run LlamaParseReader from llamaindex in Next.js 14.2.14 #1308

Open
heinergiehl opened this issue Oct 8, 2024 · 8 comments
Labels: bug (Something isn't working)

Comments

@heinergiehl

I am currently facing the error below when running LlamaParseReader in an API route in Next.js 14.2.14:
```
Error: Missing tiktoken_bg.wasm
    at Object.<anonymous> (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:1966:26)
    at [project]/node_modules/tiktoken/tiktoken.cjs [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:1974:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:172:143
    at [project]/node_modules/@llamaindex/env/dist/tokenizers/node.js [app-route] (ecmascript) <module evaluation> (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:176:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:297:199
    at [project]/node_modules/@llamaindex/env/dist/index.js [app-route] (ecmascript) <module evaluation> (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:302:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_@llamaindex_core_8a7dfd._.js:2413:186
    at [project]/node_modules/@llamaindex/core/agent/dist/index.js [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_@llamaindex_core_8a7dfd._.js:3279:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:387:168
    at [project]/node_modules/@llamaindex/openai/dist/index.js [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_44d7d9._.js:1389:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_llamaindex_dist_873e0c._.js:14379:161
    at [project]/node_modules/llamaindex/dist/index.edge.js [app-route] (ecmascript) <module evaluation> (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_llamaindex_dist_873e0c._.js:14417:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_llamaindex_dist_873e0c._.js:14429:185
    at [project]/node_modules/llamaindex/dist/index.react-server.js [app-route] (ecmascript) <module evaluation> (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_llamaindex_dist_873e0c._.js:14433:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\_2f5573._.js:9:196
    at [project]/lib/llama-parse.ts [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\_2f5573._.js:23:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\_2f5573._.js:30:127
    at [project]/app/api/llama-parse/route.ts [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\_2f5573._.js:41:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at getOrInstantiateModuleFromParent (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:572:12)
    at esmImport (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:122:20)
    at C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_next_ca9708._.js:6908:143
    at [project]/node_modules/next/dist/esm/build/templates/app-route.js { INNER_APP_ROUTE => "[project]/app/api/llama-parse/route.ts [app-route] (ecmascript)" } [app-route] (ecmascript) (C:\Users\Heiner\botfiles\.next\server\chunks\node_modules_next_ca9708._.js:6943:3)
    at instantiateModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:520:23)
    at instantiateRuntimeModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:580:12)
    at Object.getOrInstantiateRuntimeModule (C:\Users\Heiner\botfiles\.next\server\chunks\[turbopack]_runtime.js:595:12) {
  page: '/api/llama-parse'
}
```

in app/api/llama-parse/route.ts:

```ts
import { llamaParse } from "@/lib/llama-parse"
import { NextResponse } from "next/server"

export async function POST(req: Request) {
  const { url } = await req.json()
  const result = await llamaParse(url)
  return NextResponse.json(result)
}
```
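For reference, a client-side call to this route might look like the sketch below; the file path is a made-up example:

```ts
// Hypothetical caller of the route above; "./uploads/sample.pdf" is an assumed path.
const res = await fetch("/api/llama-parse", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ url: "./uploads/sample.pdf" }),
})
const parsed = await res.json()
console.log(parsed)
```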

and here is the llamaParse function:

```ts
// lib/llama-parse.ts
// Note: importing from the top-level "llamaindex" entry point pulls in the
// tokenizer chain (@llamaindex/env -> tiktoken) that fails in the trace above.
import { OpenAI, SimpleChatEngine, LlamaParseReader } from "llamaindex"

const llamaParse = async (pdfUrl: string) => {
  const reader = new LlamaParseReader({ resultType: "markdown" })
  const pdf = await reader.loadData(pdfUrl)
  console.log(pdf)
  return pdf
}

export { llamaParse }
```

I would be super happy if someone knew what the problem is. Thanks <3

@himself65
Member

himself65 commented Oct 8, 2024

Can you just import @llamaindex/cloud/reader?
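For illustration, that suggestion might look like the sketch below — assuming the @llamaindex/cloud package is installed (e.g. npm install @llamaindex/cloud) and that it exposes LlamaParseReader from its reader entry point:

```ts
// lib/llama-parse.ts — hedged sketch of the suggestion above: importing the
// reader from @llamaindex/cloud/reader instead of the top-level "llamaindex"
// entry point avoids evaluating the tokenizer (tiktoken) dependency chain
// that fails in the stack trace.
import { LlamaParseReader } from "@llamaindex/cloud/reader"

const llamaParse = async (pdfPath: string) => {
  const reader = new LlamaParseReader({ resultType: "markdown" })
  return reader.loadData(pdfPath)
}

export { llamaParse }
```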

@heinergiehl
Author

> Can you just import @llamaindex/cloud/reader?

Thanks for your reply @himself65, but how do I import that?
What do I need to install? I can't simply import it, since I only have llamaindex installed.

@himself65
Member

Yes, install that package.

@himself65
Member

I thought I fixed that in the latest llamaindex

himself65 added the bug (Something isn't working) label Oct 8, 2024

@heinergiehl
Author

> I thought I fixed that in the latest llamaindex

Thanks for your work <3 Do you already know what the problem is?
The issue was partly fixed by copying the .wasm file over to the .next chunks directory, but that only fixes it in production, when building and starting the app :(
See run-llama/create-llama#164 (comment).
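For reference, the copy workaround described above might be scripted roughly as below — the destination under .next is an assumption, since Turbopack's chunk layout can vary between builds:

```js
// scripts/copy-tiktoken-wasm.mjs — hedged sketch of the workaround above.
// The destination directory is an assumption; point it at wherever your
// build actually resolves tiktoken_bg.wasm from.
import { copyFileSync, mkdirSync } from "node:fs"

const src = "node_modules/tiktoken/tiktoken_bg.wasm"
const destDir = ".next/server/chunks"

mkdirSync(destDir, { recursive: true })
copyFileSync(src, `${destDir}/tiktoken_bg.wasm`)
console.log(`copied ${src} -> ${destDir}`)
```

Wired up as a postbuild step (e.g. "postbuild": "node scripts/copy-tiktoken-wasm.mjs"), this would match the observation that it only helps the built app, not next dev.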

@himself65
Member

Oh, this seems like a tiktoken issue again. Can you try `import withLlama from 'llamaindex/next'` in next.config.js?
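A minimal config applying that wrapper might look like the sketch below; the default-import name follows this thread, so check the llamaindex docs for the exact export:

```js
// next.config.mjs — minimal sketch of the suggestion above
import withLlama from "llamaindex/next"

/** @type {import('next').NextConfig} */
const nextConfig = {}

// the wrapper is meant to patch the build so llamaindex's tokenizer/WASM
// assets are traced and resolved correctly
export default withLlama(nextConfig)
```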

@heinergiehl
Author

heinergiehl commented Oct 9, 2024

> withLlama

Thanks, MrSir @himself65, but I already have this:

```js
import withLlama from "llamaindex/next"

/** @type {import('next').NextConfig} */
const nextConfig = {
  // runtime: 'edge',
  // unstable_allowDynamic: ['/node_modules/@pinecone-database/pinecone/**'],
  experimental: {
    serverComponentsExternalPackages: [
      '@aws-sdk/client-s3',
      '@aws-sdk/s3-request-presigner',
      'tiktoken',
    ],
    outputFileTracingIncludes: {
      "/api/**/*": ["./node_modules/**/*.wasm", "./node_modules/tiktoken/**/*.wasm"],
      "/*": ["./cache/**/*"],
    },
  },
  experiments: {
    asyncWebAssembly: true,
    layers: true,
  },
  transpilePackages: ['pdfjs-dist'],
  eslint: {
    // Warning: This allows production builds to successfully complete even if
    // your project has ESLint errors.
    ignoreDuringBuilds: true,
  },
  typescript: {
    // !! WARN !!
    // Dangerously allow production builds to successfully complete even if
    // your project has type errors.
    // !! WARN !!
    ignoreBuildErrors: true,
  },
}

const withLamaIndexConfig = withLlama(nextConfig)
export default withLamaIndexConfig
```

This should be correct, right? However, the `Error: Missing tiktoken_bg.wasm` error still persists :(
