infernet-container-starter

Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate the capabilities of Infernet and the wide range of applications that can be built with it:

Examples

  1. Hello World: Infernet's version of a hello-world program. Here, we deploy a container that simply echoes the input back to us (a minimal sketch follows this list).
  2. Running a Torch Model on Infernet: This example shows how to deploy a pre-trained PyTorch model to Infernet, making it easier to deploy your own models later.
  3. Running an ONNX Model on Infernet: Same as the previous example, but this time we deploy an ONNX model to Infernet.
  4. Prompt to NFT: In this example, we use Stable Diffusion to mint NFTs on-chain from a text prompt.
  5. TGI Inference with Mistral-7b: This example shows how to deploy an arbitrary LLM using Hugging Face's Text Generation Inference (TGI) and use it with an Infernet node.
  6. Running OpenAI's GPT-4 on Infernet: This example shows how to use OpenAI's GPT-4 API from an Infernet container.
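To make the hello-world idea concrete, here is a minimal sketch of what an "echo" container service might look like: a small HTTP service that returns whatever input it receives. This is not the repository's actual implementation; the endpoint path, port, and payload shape are assumptions chosen purely for illustration.

```python
# Minimal sketch of an echo container service, assuming the Infernet node
# delivers job input as JSON over HTTP. The endpoint path (/service_output),
# port, and payload shape are illustrative assumptions, not the repo's API.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/service_output", methods=["POST"])
def service_output():
    # Read whatever JSON the node sent and echo it straight back.
    payload = request.get_json(silent=True) or {}
    return jsonify({"output": payload})

if __name__ == "__main__":
    # Listen on all interfaces so the node can reach the container.
    app.run(host="0.0.0.0", port=3000)
```

In practice, the hello-world example in this repo packages a service like this into a Docker image so an Infernet node can launch it and forward job requests to it.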
