replace prompts with inquirer #489
base: main
Conversation
Walkthrough: This pull request represents a comprehensive migration of the CLI's interactive questions from the prompts library to the inquirer library.
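For orientation, here is a minimal before/after sketch of the kind of change involved, using a plain text prompt (illustrative only, not a diff from this PR; the message and default value are borrowed from the project path prompt discussed in the review below):

```ts
import prompts from "prompts";   // before: the library being removed
import inquirer from "inquirer"; // after: the library being introduced

// Before: "prompts" uses type "text" and an `initial` default,
// and accepts a single question object.
const { path: oldPath } = await prompts({
  type: "text",
  name: "path",
  message: "What is your project named?",
  initial: "my-app",
});

// After: "inquirer" uses type "input" and `default`,
// and takes an array of question objects.
const { path: newPath } = await inquirer.prompt([
  {
    type: "input",
    name: "path",
    message: "What is your project named?",
    default: "my-app",
  },
]);
```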
Actionable comments posted: 3
🔭 Outside diff range comments (2)
helpers/providers/groq.ts (1)

Line range hint 11-45: Improve error handling in getAvailableModelChoicesGroq.

The current error handling can be improved:
- Remove the redundant console.log(error), as it's followed by a formatted message
- Consider throwing an error instead of calling process.exit(1) to allow the caller to handle the error

```diff
 } catch (error: unknown) {
   spinner.stop();
-  console.log(error);
   if ((error as any).response?.statusCode === 401) {
-    console.log(
+    throw new Error(
       red(
         "Invalid Groq API key provided! Please provide a valid key and try again!",
       ),
     );
   } else {
-    console.log(red("Request failed: " + error));
+    throw new Error(red("Request failed: " + error));
   }
-  process.exit(1);
 }
```

helpers/providers/llmhub.ts (1)
Dimensions mapping needs to be updated with current model specifications.

The current implementation in getDimensions is outdated and incorrect:
- It references text-embedding-004, which is not a current model
- The hardcoded dimensions don't match the actual dimensions of current models
- Other providers in the codebase maintain proper dimension mappings

Consider updating the implementation to match current model specifications, similar to how other providers handle it:

```ts
const EMBEDDING_MODELS: Record<string, number> = {
  'text-embedding-3-small': 1536,
  'text-embedding-3-large': 1024,
  // Add other supported models
};

function getDimensions(modelName: string) {
  return EMBEDDING_MODELS[modelName] ?? 1536; // fallback to 1536 if unknown
}
```

🔗 Analysis chain
Line range hint 164-167: Add validation for unknown embedding models.

The getDimensions function makes assumptions about model dimensions without validating unknown models. Consider adding validation and maintaining a mapping of known dimensions:

```diff
+const KNOWN_DIMENSIONS: Record<string, number> = {
+  'text-embedding-004': 768,
+  'default': 1536
+};
+
 function getDimensions(modelName: string) {
-  // Assuming dimensions similar to OpenAI for simplicity. Update if different.
-  return modelName === "text-embedding-004" ? 768 : 1536;
+  return KNOWN_DIMENSIONS[modelName] ?? KNOWN_DIMENSIONS.default;
 }
```

Let's verify the embedding model dimensions across the codebase:
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Search for embedding model dimension definitions
rg "dimensions.*=.*\d{3,4}" --type ts
```

Length of output: 39

Script:

```shell
#!/bin/bash
# Search for embedding-related code and types
ast-grep --pattern 'embedding'
rg -i 'embedding' --type ts -A 2 -B 2

# Search for potential dimension numbers
rg '(768|1536)' --type ts

# Look for type definitions related to embeddings
ast-grep --pattern 'type $_ = { $$$ dimension$_ $$$ }'
```

Length of output: 41770
🧹 Nitpick comments (6)
helpers/providers/huggingface.ts (1)
40-50: Add validation for the API key prompt.

Consider adding validation similar to mistral.ts to ensure the API key is either provided or available in the environment.

```diff
 const { key } = await inquirer.prompt([
   {
     type: "input",
     name: "key",
     message:
       "Please provide your Huggingface API key (or leave blank to use HF_API_KEY env variable):",
+    validate: (value: string) => {
+      if (!value && !process.env.HF_API_KEY) {
+        return "HF_API_KEY env variable is not set - key is required";
+      }
+      return true;
+    },
   },
 ]);
```

helpers/providers/ollama.ts (1)
Line range hint 13-13: Track the TODO comment about embedding vector dimensions.

The TODO indicates a missing SDK feature for retrieving embedding vector dimensions.
Would you like me to create an issue to track this enhancement request for the Ollama SDK integration?
helpers/providers/anthropic.ts (1)
59-77: Enhance API key validation.

Consider adding format validation for the Anthropic API key (should start with 'sk-'). This would help catch invalid keys early.

```diff
 validate: (value: string) => {
   if (askModels && !value) {
     if (process.env.ANTHROPIC_API_KEY) {
       return true;
     }
     return "ANTHROPIC_API_KEY env variable is not set - key is required";
   }
+  if (value && !value.startsWith('sk-')) {
+    return "Invalid API key format. Anthropic API keys should start with 'sk-'";
+  }
   return true;
 },
```

helpers/providers/llmhub.ts (1)
86-92: Simplify the model choices transformation.

The nested promise chain with multiple arrow functions can be simplified for better readability.

```diff
-  choices: await getAvailableModelChoices(false, config.apiKey).then(
-    (choices) =>
-      choices.map((choice: { title: string; value: string }) => ({
-        name: choice.title,
-        value: choice.value,
-      })),
-  ),
+  choices: (await getAvailableModelChoices(false, config.apiKey))
+    .map(({ title, value }) => ({
+      name: title,
+      value,
+    })),
```

helpers/providers/gemini.ts (1)
51-59: Improve validation message clarity.

The validation message could be more informative by explicitly mentioning both options available to the user.

```diff
-  return "GOOGLE_API_KEY env variable is not set - key is required";
+  return "Please either provide an API key or set the GOOGLE_API_KEY environment variable";
```

helpers/providers/azure.ts (1)
136-138: Consider moving toChoice to a shared utility file.

The toChoice helper function appears in multiple provider files. Consider moving it to questions/utils.ts and importing it here instead to avoid duplication:

```diff
-function toChoice(item: { title: string; value: string }) {
-  return { name: item.title, value: item.value };
-}
+import { toChoice } from "../../questions/utils";
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (15)
- helpers/providers/anthropic.ts (2 hunks)
- helpers/providers/azure.ts (3 hunks)
- helpers/providers/gemini.ts (2 hunks)
- helpers/providers/groq.ts (2 hunks)
- helpers/providers/huggingface.ts (2 hunks)
- helpers/providers/index.ts (3 hunks)
- helpers/providers/llmhub.ts (3 hunks)
- helpers/providers/mistral.ts (2 hunks)
- helpers/providers/ollama.ts (2 hunks)
- helpers/providers/openai.ts (4 hunks)
- index.ts (3 hunks)
- package.json (2 hunks)
- questions/questions.ts (16 hunks)
- questions/simple.ts (3 hunks)
- questions/utils.ts (4 hunks)
🔇 Additional comments (23)
helpers/providers/huggingface.ts (1)
53-60: LGTM! Clean migration to inquirer's list type.

The model selection prompts have been correctly migrated to use inquirer's list type, maintaining consistency with other provider files.
Also applies to: 63-70
helpers/providers/mistral.ts (2)
43-60: LGTM! Excellent API key validation implementation.

The API key prompt includes:
- Clear conditional messaging based on askModels
- Comprehensive validation checking both input and environment variables
- Appropriate error messages
65-72: LGTM! Clean migration to inquirer's list type.

Model selection prompts have been correctly migrated with consistent implementation across providers.
Also applies to: 75-82
helpers/providers/ollama.ts (1)
37-44: LGTM! Clean migration with robust error handling.

The model selection prompts have been correctly migrated to inquirer with:
- Proper error handling for unavailable models
- Clear error messages with actionable instructions
Also applies to: 48-55
helpers/providers/index.ts (3)
21-23: LGTM! Good type extension for endpoint support.

The ModelConfigParams type has been correctly extended to support endpoint configuration.
33-44: LGTM! Well-structured provider choices with framework-specific options.

The provider choices are:
- Correctly formatted for inquirer
- Properly handle framework-specific providers
- Maintain a clean, readable structure
46-53: LGTM! Clean migration to inquirer's list type.

The provider selection prompt has been correctly migrated to use inquirer's list type.
helpers/providers/anthropic.ts (1)
82-89: LGTM! The model selection implementation is clean and consistent with other providers.
questions/utils.ts (2)
118-118: LGTM! The property name change from title to name aligns with the inquirer library's requirements.
134-142: LGTM! The changes correctly migrate the prompt to use the inquirer library's format (see the sketch after this comment):

- Prompt type changed from select to list
- Choice property changed from title to name
Also applies to: 161-173
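A minimal before/after sketch of the select → list and title → name mapping (illustrative only; the question text is borrowed from the language prompt referenced in this review, not copied verbatim from the diff):

```ts
import prompts from "prompts";
import inquirer from "inquirer";

// Before: "prompts" select prompt with `title` on each choice.
const { language: before } = await prompts({
  type: "select",
  name: "language",
  message: "What language do you want to use?",
  choices: [
    { title: "Python (FastAPI)", value: "fastapi" },
    { title: "Typescript (NextJS)", value: "nextjs" },
  ],
});

// After: "inquirer" list prompt with `name` on each choice.
const { language: after } = await inquirer.prompt([
  {
    type: "list",
    name: "language",
    message: "What language do you want to use?",
    choices: [
      { name: "Python (FastAPI)", value: "fastapi" },
      { name: "Typescript (NextJS)", value: "nextjs" },
    ],
  },
]);
```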
questions/simple.ts (4)
32-56: LGTM! The app type prompt has been correctly migrated to use inquirer:

- Prompt type changed from select to list
- Choice property changed from title to name
63-73: LGTM! The language prompt has been correctly migrated to use inquirer:

- Prompt type changed from select to list
- Choice property changed from title to name
77-84: LGTM! The LlamaCloud prompt has been correctly migrated to use inquirer (see the sketch after this list):

- Prompt type changed from toggle to confirm
- initial property changed to default
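A minimal sketch of the toggle → confirm and initial → default mapping (illustrative only; the message is taken from the LlamaCloud question discussed later in this review, and the default value shown is an assumption):

```ts
import inquirer from "inquirer";

// prompts' "toggle" with `initial` becomes inquirer's "confirm" with `default`;
// the answer is a plain boolean either way.
const { useLlamaCloud } = await inquirer.prompt([
  {
    type: "confirm",
    name: "useLlamaCloud",
    message: "Do you want to use LlamaCloud services?",
    default: false, // assumed default, not taken from the PR
  },
]);
```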
89-96: LGTM! The LlamaCloud API key prompt has been correctly migrated to use inquirer:

- Prompt type changed from text to input
index.ts (2)
265-273: LGTM! The project path prompt has been correctly migrated to use inquirer:

- Prompt type changed from text to input
- Default value updated to "my-llama-app"
Line range hint 153-165: Verify the impact of removing the <tools> parameter.

The --tools option signature has been updated to remove the expected parameter. This change might affect how tools are specified when invoking the command line interface. Run the following script to verify the usage of the --tools option:

✅ Verification successful

The removal of the <tools> parameter type hint is safe. The change only removes a type hint from the CLI option's signature while preserving all existing functionality:

- The --tools option still accepts and processes tool names as comma-separated values
- All code paths, including e2e tests and tool initialization, remain unchanged
- The parameter processing logic (splitting by comma and tool lookup) continues to work as before
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Verify the usage of the `--tools` option in the codebase.

# Test: Search for examples of using the `--tools` option.
rg -A 5 $'--tools'

# Test: Search for documentation or help text about the `--tools` option.
rg -A 5 $'--tools.*<tools>'
```

Length of output: 875

Script:

```shell
#!/bin/bash
# Search for --tools usage patterns
rg -l "\"--tools\""

# Search in markdown files
fd -e md -x rg -l "tools" {}

# Search for command option definitions
ast-grep --pattern 'option($$$"--tools"$$$)'

# Search for test cases
fd -e test -e spec -x rg -l "tools" {}
```

Length of output: 233

Script:

```shell
#!/bin/bash
# Check e2e test usage
rg -A 5 -B 5 '"--tools"' e2e/utils.ts

# Check documentation
rg -A 5 -B 5 'tools' README.md

# Check changelog entries
rg -A 5 -B 5 'tools' CHANGELOG.md

# Look for any command line parsing related to tools
rg -A 5 -B 5 'getTools'
```

Length of output: 7900
questions/questions.ts (5)
1-1: LGTM! The import statement has been correctly updated to include the QuestionCollection type from inquirer.
20-42: LGTM! The list-based prompts have been correctly migrated to use inquirer:

- Prompt type changed from select to list
- Choice property changed from title to name
Also applies to: 51-61, 69-79, 93-106, 117-124, 156-169, 227-234, 254-263
135-143: LGTM! The boolean prompts have been correctly migrated to use inquirer:

- Prompt type changed from toggle to confirm
- initial property changed to default
Also applies to: 371-379
Line range hint 288-308: LGTM! The text input prompts have been correctly migrated to use inquirer:

- Prompt type changed from text to input
- Validation logic preserved
Also applies to: 321-352, 390-397
413-421: LGTM! The multi-select prompt has been correctly migrated to use inquirer (see the sketch below):

- Prompt type changed from multiselect to checkbox
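For reference, a minimal sketch of the multiselect → checkbox mapping (illustrative only; the message and choices are hypothetical placeholders, not the actual list from questions.ts):

```ts
import inquirer from "inquirer";

// inquirer's "checkbox" type replaces prompts' "multiselect"; choices use
// `name`/`value`, and pre-selected entries are flagged with `checked: true`.
const { tools } = await inquirer.prompt([
  {
    type: "checkbox",
    name: "tools",
    message: "Which tools would you like to use?", // hypothetical wording
    choices: [
      { name: "Weather", value: "weather", checked: true }, // hypothetical choice
      { name: "Wikipedia", value: "wikipedia" },            // hypothetical choice
    ],
  },
]);
```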
helpers/providers/gemini.ts (1)
66-73: LGTM! Clean transition to inquirer's list type.

The migration from prompts' select type to inquirer's list type is well implemented. The code maintains the same functionality while leveraging inquirer's more intuitive interface.
Also applies to: 76-83
helpers/providers/azure.ts (1)
93-100: LGTM! Consistent implementation of model selection.

The model and embedding model selection prompts are well-implemented and consistent with other provider files.
Also applies to: 103-110
f890e87 to 0a1a36c
Actionable comments posted: 1
🧹 Nitpick comments (1)
index.ts (1)
158-162: Add validation for tool names.

While the simplified logic is good, consider adding validation to ensure that only supported tool names are provided. This would prevent runtime errors from invalid tool specifications.

```diff
 (tools) => {
   if (!tools || tools === "none") {
     return [];
   }
-  return getTools(tools.split(","));
+  const toolNames = tools.split(",");
+  const invalidTools = toolNames.filter(tool => !isValidTool(tool));
+  if (invalidTools.length > 0) {
+    console.warn(yellow(`Warning: Invalid tools specified: ${invalidTools.join(", ")}`));
+  }
+  return getTools(toolNames.filter(tool => isValidTool(tool)));
 }
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- index.ts (3 hunks)
🔇 Additional comments (1)
index.ts (1)
5-5: Verify complete migration from prompts to inquirer.

The migration to inquirer is a significant change. Let's verify that all prompts usage has been properly migrated.

✅ Verification successful
Migration from prompts to inquirer is complete ✅
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Check for any remaining references to the 'prompts' library

# Search for any remaining imports or usage of 'prompts'
echo "Checking for remaining 'prompts' references..."
rg -i "prompts" --type ts

# Search for any potential prompt-related patterns that might need migration
echo "Checking for potential prompt patterns..."
rg "type:\s*['\"]text['\"]|type:\s*['\"]select['\"]" --type ts
```

Length of output: 1784
Actionable comments posted: 0
🧹 Nitpick comments (3)
helpers/providers/llmhub.ts (3)
Line range hint 59-76: Consider enhancing the error message for clarity.

The API key prompt implementation is well-structured with proper validation logic. However, the error message could be more user-friendly. Consider this improvement for the error message:

```diff
-  return "T_SYSTEMS_LLMHUB_API_KEY env variable is not set - key is required";
+  return "Please provide either an API key or set the T_SYSTEMS_LLMHUB_API_KEY environment variable";
```
81-98: Add error handling for model choices fetching.

While the model selection prompts are well-implemented, they lack error handling if getAvailableModelChoices fails. Consider wrapping these prompts in a try-catch block to provide a better user experience when model fetching fails.

```diff
 if (askModels) {
+  try {
     const { model } = await inquirer.prompt([
       {
         type: "list",
         name: "model",
         message: "Which LLM model would you like to use?",
         choices: await getAvailableModelChoices(false, config.apiKey),
       },
     ]);
     config.model = model;

     const { embeddingModel } = await inquirer.prompt([
       {
         type: "list",
         name: "embeddingModel",
         message: "Which embedding model would you like to use?",
         choices: await getAvailableModelChoices(true, config.apiKey),
       },
     ]);
     config.embeddingModel = embeddingModel;
     config.dimensions = getDimensions(embeddingModel);
+  } catch (error) {
+    console.error("Failed to fetch model choices. Using default models.");
+    // Defaults are already set in the config object
+  }
 }
```
138-140: Consider enhancing model choice display with descriptions.

While the mapping is correct, consider adding model descriptions to help users make informed choices.

```diff
 .map((el: any) => {
   return {
-    name: el.id,
+    name: `${el.id}${el.description ? ` - ${el.description}` : ''}`,
     value: el.id,
   };
 });
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- helpers/providers/azure.ts (3 hunks)
- helpers/providers/llmhub.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- helpers/providers/azure.ts
⏰ Context from checks skipped due to timeout of 90000ms (26)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, express, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, express, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, express, --llamacloud)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, express, --example-file)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (18, 3.11, windows-latest, express, --example-file)
- GitHub Check: typescript (18, 3.11, windows-latest, express, --no-files)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: typescript (18, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
🔇 Additional comments (1)
helpers/providers/llmhub.ts (1)
2-2: LGTM! Import changes align with PR objectives.

The addition of the inquirer import is consistent with the PR's goal of replacing the prompts library.
```diff
@@ -43,7 +43,7 @@
     "@types/cross-spawn": "6.0.0",
     "@types/fs-extra": "11.0.4",
     "@types/node": "^20.11.7",
-    "@types/prompts": "2.0.1",
+    "@types/inquirer": "^9.0.7",
```
version mismatch with the non-dev package
I can't find a similar version; the other type packages are the same. Is that an issue?
```ts
  },
  questionHandlers,
```
do we have the functionality from questionHandlers?
afaik, it doesn't have that callback event.
Actionable comments posted: 0
🧹 Nitpick comments (2)
questions/simple.ts (2)
34-54: Consider grouping choices for better organization.

The choices could be grouped using inquirer's separator feature for better readability. For example:

```diff
 choices: [
+  new inquirer.Separator('--- Basic Apps ---'),
   { name: "Agentic RAG", value: "rag" },
   { name: "Data Scientist", value: "data_scientist" },
   { name: "Code Artifact Agent", value: "code_artifact" },
   { name: "Information Extractor", value: "extractor" },
+  new inquirer.Separator('--- Workflow Apps ---'),
   {
     name: "Financial Report Generator (using Workflows)",
     value: "financial_report_agent",
   },
   {
     name: "Form Filler (using Workflows)",
     value: "form_filling",
   },
   {
     name: "Contract Review (using Workflows)",
     value: "contract_review",
   },
 ],
```
77-85
: Consider shortening the message for better readability.The message is quite long. Consider shortening it and providing the URL separately:
- message: - "Do you want to use LlamaCloud services? (see https://www.llamaindex.ai/enterprise for more info)", + message: "Do you want to use LlamaCloud services?", + suffix: "\nSee https://www.llamaindex.ai/enterprise for more info",
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (9)
- helpers/providers/anthropic.ts (2 hunks)
- helpers/providers/azure.ts (3 hunks)
- helpers/providers/gemini.ts (2 hunks)
- helpers/providers/groq.ts (2 hunks)
- helpers/providers/huggingface.ts (2 hunks)
- helpers/providers/index.ts (2 hunks)
- helpers/providers/mistral.ts (2 hunks)
- index.ts (3 hunks)
- questions/simple.ts (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
- helpers/providers/gemini.ts
- helpers/providers/groq.ts
- helpers/providers/mistral.ts
- helpers/providers/azure.ts
- helpers/providers/huggingface.ts
- helpers/providers/anthropic.ts
🔇 Additional comments (4)
helpers/providers/index.ts (2)
31-37: LGTM! Well-structured choices array.

The choices array is well-organized and follows inquirer's naming convention correctly.

44-51: LGTM! Clean prompt configuration.

The prompt configuration is clear and follows inquirer's best practices.
index.ts (2)
158-162: LGTM! Clean tools option handler.

The handler correctly handles edge cases (undefined and "none") and properly parses the tools list.

264-272: ⚠️ Potential issue — Add input validation and trimming to the project path prompt.

The prompt should validate the input and trim whitespace:

```diff
 {
   type: "input",
   name: "path",
   message: "What is your project named?",
   default: "my-app",
+  validate: (input) => {
+    input = input.trim();
+    const { valid, problems } = validateNpmName(input);
+    if (!valid) {
+      return `Invalid project name: ${problems!.join(", ")}`;
+    }
+    if (fs.existsSync(path.resolve(input))) {
+      return "Directory already exists";
+    }
+    return true;
+  },
+  filter: (input) => input.trim(),
 },
```

Likely invalid or redundant comment.
Actionable comments posted: 2
🧹 Nitpick comments (5)
index.ts (1)
158-162: Add validation for tool names.

While the logic is cleaner, consider validating the tool names before splitting to prevent invalid inputs.

```diff
 (tools) => {
   if (!tools || tools === "none") {
     return [];
   }
+  const validTools = tools.split(",").filter(tool => {
+    const [namespace, spec] = tool.trim().split(".");
+    return namespace && spec && spec.endsWith("ToolSpec");
+  });
+  if (validTools.length === 0) {
+    console.warn("No valid tools found in input");
+    return [];
+  }
-  return getTools(tools.split(","));
+  return getTools(validTools);
 },
```

questions/simple.ts (2)
32-56: Consider grouping and reordering choices for better UX.

The app type choices could be organized better:
- Group related choices (e.g., workflow-based apps together)
- Add descriptions to clarify features (e.g., what "Agentic RAG" offers)
- Maintain consistent suffixes (some have "using Workflows", others don't indicate special features)
Example grouping structure:
```diff
 choices: [
+  new inquirer.Separator('=== Basic Apps ==='),
   { name: "Agentic RAG", value: "rag" },
   { name: "Data Scientist", value: "data_scientist" },
   { name: "Code Artifact Agent", value: "code_artifact" },
   { name: "Information Extractor", value: "extractor" },
+  new inquirer.Separator('=== Workflow-based Apps ==='),
   { name: "Financial Report Generator (using Workflows)", value: "financial_report_agent" },
   { name: "Form Filler (using Workflows)", value: "form_filling" },
   { name: "Contract Review (using Workflows)", value: "contract_review" },
 ]
```
62-75: Consider improving language selection maintainability and clarity.

Two suggestions for improvement:
- Use an array of app types that don't require language selection
- Add descriptions to clarify the implications of each choice
```diff
+const LANGUAGE_RESTRICTED_APPS = ['extractor', 'contract_review'] as const;
+
-if (appType !== "extractor" && appType !== "contract_review") {
+if (!LANGUAGE_RESTRICTED_APPS.includes(appType)) {
   const { language: newLanguage } = await inquirer.prompt([{
     type: "list",
     name: "language",
     message: "What language do you want to use?",
     choices: [
-      { name: "Python (FastAPI)", value: "fastapi" },
-      { name: "Typescript (NextJS)", value: "nextjs" },
+      {
+        name: "Python (FastAPI)",
+        value: "fastapi",
+        description: "Best for API-first applications with Python ecosystem integration"
+      },
+      {
+        name: "Typescript (NextJS)",
+        value: "nextjs",
+        description: "Ideal for full-stack web applications with React"
+      },
     ],
   }]);
```
77-85: Make the URL clickable in modern terminals.

The URL in the suffix can be made clickable using terminal hyperlinks.

```diff
-  suffix: " (see https://www.llamaindex.ai/enterprise for more info)",
+  suffix: ` (see \u001B]8;;https://www.llamaindex.ai/enterprise\u0007https://www.llamaindex.ai/enterprise\u001B]8;;\u0007 for more info)`,
```
Line range hint 108-199: Consider architectural improvements for better maintainability.

Several architectural improvements could enhance maintainability:
- Extract repeated model configurations
- Decouple frontend decisions from framework choice
- Make vectorDb choice explicit
```diff
+const DEFAULT_MODEL_CONFIG: ModelConfig = {
+  provider: "openai",
+  apiKey: args.openAiKey,
+  model: "gpt-4o",
+  embeddingModel: "text-embedding-3-large",
+  dimensions: 1536,
+  isConfigured(): boolean {
+    return !!args.openAiKey;
+  },
+};

 const lookup: Record<
   AppType,
   Pick<
     QuestionResults,
     "template" | "tools" | "frontend" | "dataSources" | "useCase"
   > & {
     modelConfig?: ModelConfig;
+    vectorDbOverride?: string;
   }
 > = {
   rag: {
     template: "streaming",
     tools: getTools(["weather"]),
     frontend: true,
     dataSources: [EXAMPLE_FILE],
+    modelConfig: DEFAULT_MODEL_CONFIG,
   },
   // ... other configurations
 };

+const shouldEnableFrontend = (
+  framework: TemplateFramework,
+  frontendConfig: boolean
+): boolean => {
+  if (!frontendConfig) return false;
+  if (framework === "nextjs") {
+    console.warn('Frontend disabled for NextJS framework as it includes its own UI');
+    return false;
+  }
+  return true;
+};

 return {
   framework: answers.language,
   ui: "shadcn",
   llamaCloudKey: answers.llamaCloudKey,
   useLlamaParse: answers.useLlamaCloud,
   llamapack: "",
-  vectorDb: answers.useLlamaCloud ? "llamacloud" : "none",
+  vectorDb: results.vectorDbOverride ?? (answers.useLlamaCloud ? "llamacloud" : "none"),
   observability: "none",
   ...results,
   modelConfig:
     results.modelConfig ??
     (await askModelConfig({
       openAiKey: args.openAiKey,
       askModels: args.askModels ?? false,
       framework: answers.language,
     })),
-  frontend: answers.language === "nextjs" ? false : results.frontend,
+  frontend: shouldEnableFrontend(answers.language, results.frontend),
 };
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- helpers/providers/anthropic.ts (2 hunks)
- helpers/providers/huggingface.ts (2 hunks)
- index.ts (3 hunks)
- questions/simple.ts (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- helpers/providers/anthropic.ts
- helpers/providers/huggingface.ts
🔇 Additional comments (2)
index.ts (1)
5-5: LGTM: Import change aligns with PR objective.

The addition of the inquirer import and removal of the prompts import is consistent with the PR's goal.
questions/simple.ts (1)
Line range hint 1-11: LGTM! Import changes align with the migration to inquirer.
```ts
{
  title: "🤖 Agentic RAG",
  value: "rag",
  description:
```
afaik, Inquirer doesn't have a way to display a description/hint. So I must remove this feature.
hmm, I think that is more helpful than the grouping - I wouldn't merge the PR then
Actionable comments posted: 0
♻️ Duplicate comments (1)
questions/simple.ts (1)
103-111: ⚠️ Potential issue — Enhance security and UX of API key input.

The API key input should:
- Mask the input for security
- Validate the key format
- Clarify the environment variable fallback
```diff
 const { llamaCloudKey: newLlamaCloudKey } = await inquirer.prompt([
   {
-    type: "input",
+    type: "password",
     name: "llamaCloudKey",
     message:
-      "Please provide your LlamaCloud API key (leave blank to skip):",
+      "Please provide your LlamaCloud API key (leave blank to use LLAMA_CLOUD_API_KEY env var):",
+    validate: (input: string) => {
+      if (!input) return true;
+      return input.startsWith('llama-') ? true : 'API key should start with "llama-"';
+    }
   },
 ]);
-llamaCloudKey = newLlamaCloudKey || process.env.LLAMA_CLOUD_API_KEY;
+const envKey = process.env.LLAMA_CLOUD_API_KEY;
+if (!newLlamaCloudKey && !envKey) {
+  console.warn('No LlamaCloud API key provided. Some features may be limited.');
+}
+llamaCloudKey = newLlamaCloudKey || envKey;
```
🧹 Nitpick comments (1)
questions/simple.ts (1)
34-65: Consider terminal size constraints for pageSize.

While the separators nicely organize the choices into logical groups, setting pageSize: Infinity might lead to poor display on smaller terminals. Consider using a reasonable fixed value or calculating based on terminal size:

```diff
-  pageSize: Infinity,
+  pageSize: process.stdout.rows ? Math.floor(process.stdout.rows * 0.75) : 10,
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- package.json (2 hunks)
- questions/simple.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- package.json
⏰ Context from checks skipped due to timeout of 90000ms (45)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, express, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, express, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, express, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, express, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, express, --llamacloud)
- GitHub Check: typescript (20, 3.11, macos-latest, express, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, express, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, express, --llamacloud)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, express, --example-file)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, express, --no-files)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (18, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (18, 3.11, windows-latest, express, --llamacloud)
- GitHub Check: typescript (18, 3.11, windows-latest, express, --example-file)
- GitHub Check: typescript (18, 3.11, windows-latest, express, --no-files)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (18, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (18, 3.11, macos-latest, express, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: typescript (18, 3.11, macos-latest, express, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: typescript (18, 3.11, macos-latest, express, --no-files)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files)
- GitHub Check: typescript (18, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files)
- GitHub Check: typescript (18, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud)
- GitHub Check: typescript (18, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files)
🔇 Additional comments (3)
questions/simple.ts (3)
1-1: LGTM! Import changes align with migration objectives.

The imports are correctly updated to use the inquirer library and necessary utilities.
Also applies to: 12-12
76-86: LGTM! Clear and well-structured language selection.

The choices are well-labeled with both language and framework information.
90-98: LGTM! User-friendly LlamaCloud services prompt.

Good addition of the informational URL and safe default value.
Summary by CodeRabbit
Release Notes: Library and Input Handling Update
New Features
- Migrated from the prompts library to the inquirer library for more robust interactions

Improvements

Dependency Changes
- Replaced prompts with inquirer

Compatibility