YorkieDev/CanIRunThisLLM

Can I Run This LLM?

Visit me here: https://yorkiedev.github.io/CanIRunThisLLM/

A web-based tool to check whether your system can run a specific LLM from Hugging Face. Simply enter a GGUF model URL along with your system's VRAM and RAM, and the tool will tell you whether you can run the model.

Features

  • Check compatibility with GGUF models from Hugging Face
  • Supports both direct file URLs and repository URLs
  • Provides detailed system requirements
  • Includes guides for finding system specifications on Windows, macOS, and Linux
  • Real-time model size checking
  • User-friendly interface
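To illustrate the "direct file URLs and repository URLs" support, here is a minimal sketch of how both URL forms could be parsed into a repo id and an optional GGUF filename. This is an assumed approach, not the tool's actual code, and the example URLs are only illustrative:

```javascript
// Sketch (assumed logic): normalize a Hugging Face URL.
// Direct file URL:  https://huggingface.co/<owner>/<repo>/blob/main/<file>.gguf
// Repository URL:   https://huggingface.co/<owner>/<repo>
function parseHuggingFaceUrl(url) {
  const u = new URL(url);
  if (u.hostname !== "huggingface.co") return null;
  const parts = u.pathname.split("/").filter(Boolean);
  if (parts.length < 2) return null;
  const repo = `${parts[0]}/${parts[1]}`;
  // A direct file link ends in a .gguf filename; a repo link has none.
  const file = parts.find((p) => p.toLowerCase().endsWith(".gguf")) || null;
  return { repo, file };
}
```

With a repo id in hand, the file size needed for the compatibility check can be fetched from the Hugging Face Hub (for example via a HEAD request to the file's `resolve` URL).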

Usage

  1. Visit the website
  2. Enter a Hugging Face GGUF model URL
  3. Enter your system's VRAM and RAM specifications
  4. Click "Check Compatibility" to see if you can run the model
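Conceptually, the check in step 4 amounts to comparing the GGUF file size against your available memory. The sketch below uses a common rule of thumb (file size plus some overhead for the KV cache and runtime buffers); the 1.2× factor is an assumption for illustration, not the tool's actual formula:

```javascript
// Sketch (assumed heuristic): can a model of the given size run on this system?
// Sizes are in GB; `overhead` pads for KV cache and runtime buffers.
function checkCompatibility(modelSizeGB, vramGB, ramGB, overhead = 1.2) {
  const required = modelSizeGB * overhead;
  if (required <= vramGB) return "full GPU offload";
  if (required <= vramGB + ramGB) return "partial offload (GPU + CPU)";
  return "not enough memory";
}
```

For example, a 4 GB quantized model fits fully on an 8 GB GPU, while a 13 GB model on the same GPU would need to split layers between VRAM and system RAM.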

Development

This is a simple HTML/JavaScript application with no dependencies. To run locally:

  1. Clone the repository
  2. Open index.html in your browser

Deployment

You can deploy this project in several ways:

  1. GitHub Pages: Enable GitHub Pages in your repository settings
  2. Static Hosting: Deploy to any static hosting service (Netlify, Vercel, etc.)
  3. Local Server: Serve using any web server (Apache, Nginx, etc.)

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - feel free to use this project however you'd like.
