Changes for llama.native.js
per publication on the npm registry for Node packages.
# updating your current version:
npm i --save llama.native.js@latest

# to start the server, I recommend downloading the source code from GitHub and running:
npm run setup && npm start
/*
This is supposed to work as a single module:
a minified module with all non-client related packages removed.
*/
/*
These are the different types you can import.
By themselves they are the ioClientController and the interfaces / structures used when communication is successful.
*/
import { ioClient, ioHandshake, ioInference, ioMetadata } from "llama.native.js/bin/client.io";
import { ioClientController } from "llama.native.js/bin/network/client";
- a server that does handshakes for inference and makes sure the connection is correctly validated and secure. For now there is only one pre-processed prompt that can be executed on this server. It uses the OS-native binaries for Python and llama.cpp.
- the server puts clients in rooms, secure or not secure, and thereby the server has authority over them (see the sketch after this list).
- a client that requests handshakes for inference and identifies whether its connection is secure or not.
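The room model could look roughly like the following. This sketch assumes a socket.io transport (the io prefix suggests one, but that is an assumption about this package) and uses hypothetical room names and a hypothetical token check in place of the package's real handshake validation.

import { Server } from "socket.io";

const io = new Server(3000);

io.on("connection", (socket) => {
  // hypothetical check standing in for the package's real handshake validation
  const validated = socket.handshake.auth?.token === "expected-token";

  // the server, not the client, decides which room a connection lands in;
  // that is what gives it authority over who may run inference
  socket.join(validated ? "secure" : "insecure");
});

// only sockets in the secure room are told inference is available
io.to("secure").emit("inference:ready");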
modified: src/jarvis/codex-x64.ts until I was happy with it and can duplicate it into new ones.