
Starter Tutorial

Make sure you have installed LlamaIndex.TS and have an OpenAI key. If you haven't, check out the installation guide.
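LlamaIndex.TS reads your OpenAI key from the OPENAI_API_KEY environment variable. One way to provide it (the key value below is a placeholder — substitute your own):

```shell
# Make the OpenAI key available to the process that runs the example.
# "sk-your-key" is a placeholder, not a real key.
export OPENAI_API_KEY="sk-your-key"
```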

From scratch (Node.js + TypeScript):

In a new folder:

npm init
npm install llamaindex
npm install -D typescript @types/node

Create the file example.ts. This code loads some example data, creates a Document, indexes it (which creates embeddings using OpenAI), and then creates a query engine to answer questions about the data.

import fs from "node:fs/promises";

import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // Load essay from abramov.txt in Node
  const path = "node_modules/llamaindex/examples/abramov.txt";
  const essay = await fs.readFile(path, "utf-8");

  // Create Document object with essay
  const document = new Document({ text: essay, id_: path });

  // Split text and create embeddings. Store them in a VectorStoreIndex
  const index = await VectorStoreIndex.fromDocuments([document]);

  // Query the index
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({
    query: "What did the author do in college?",
  });

  // Output response
  console.log(response.toString());
}

main().catch(console.error);
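Under the hood, fromDocuments splits each document's text into chunks before embedding them. As a rough, self-contained illustration of that idea (this is not LlamaIndex's actual splitter, which is sentence-aware; chunkText, chunkSize, and overlap are made-up names for this sketch):

```typescript
// Simplified fixed-size chunking with overlap, for illustration only.
// LlamaIndex's real splitter is smarter (it respects sentence boundaries).
function chunkText(text: string, chunkSize = 1024, overlap = 128): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}

// A 2500-character input yields 3 chunks at this size/overlap.
const sample = "a".repeat(2500);
const chunks = chunkText(sample);
console.log(chunks.length); // 3
```

Each chunk is then embedded separately, so a query can retrieve only the most relevant pieces of the essay rather than the whole document.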

Create a tsconfig.json file in the same folder:

{
  "compilerOptions": {
    "target": "es2017",
    "module": "esnext",
    "moduleResolution": "bundler",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true,
    "outDir": "./lib",
    "tsBuildInfoFile": "./lib/.tsbuildinfo",
    "incremental": true,
    "composite": true
  },
  "ts-node": {
    "files": true,
    "compilerOptions": {
      "module": "commonjs"
    }
  },
  "include": ["./**/*.ts"]
}

Now you can run the code with:

npx tsx example.ts

Alternatively, you can clone our examples and try them out:

npx degit run-llama/LlamaIndexTS/examples my-new-project
cd my-new-project
npm install
npx tsx ./vectorIndex.ts

From scratch (Next.js + TypeScript):

You just need one command to create a new Next.js project:

npx create-llama@latest