Issues: mistralai/mistral-inference
#233 [BUG] Do we have a paper or a doc describing the Mixtral-8x7B and Mixtral-8x22B model architectures? (bug, opened Nov 7, 2024 by vipannalla)
#229 [BUG] Using the fine-tuned Mistral-7B-v0.1 for inference, when it encounters the backslash escape character '\', generation stalls and becomes very slow, but after a few minutes it continues. (bug, opened Oct 12, 2024 by Essence9999)
#228 [BUG] pip install mistral_inference: ModuleNotFoundError: No module named 'torch' (bug, opened Oct 4, 2024 by chrisstankevitz)
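Reports like #228 typically arise when pip tries to build mistral_inference before torch is present in the environment; the usual workaround is to install torch first. A minimal stdlib sketch for checking whether a module is importable before attempting the install (the install order shown in the comments is an assumption, not documented behavior of this package):

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True when `name` is importable, without actually importing it."""
    return importlib.util.find_spec(name) is not None


# If torch is missing, install it before mistral_inference, e.g.:
#   pip install torch
#   pip install mistral_inference
print(has_module("torch"))
```

`find_spec` only consults the import machinery, so the check is cheap and has no side effects.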
#227 Pixtral-12B tokenizer error: special_token_policy=IGNORE does not ignore special tokens during decoding (bug, opened Sep 27, 2024 by OmriKaduri)
#225 [BUG] RuntimeError: Boolean value of Tensor with more than one value is ambiguous (bug, opened Sep 26, 2024 by siwer)
#224 [BUG] Cannot build on Mac M1 Silicon (bug, opened Sep 24, 2024 by timspannzilliz)
#222 [BUG] AttributeError: module 'torch.library' has no attribute 'custom_op' (bug, opened Sep 20, 2024 by mruhlmannGit)
#221 [BUG] RuntimeError: Couldn't instantiate class <class 'mistral_inference.args.TransformerArgs'> using init args dict_keys(['dim', 'n_layers', 'vocab_size', 'model_type']) (bug, opened Sep 20, 2024 by NM5035)
#220 [BUG] TypeError: generate_mamba() takes 2 positional arguments but 3 were given (bug, opened Sep 20, 2024 by NM5035)
#215 [BUG] Device error when running on a CUDA device other than cuda:0 (bug, opened Aug 28, 2024 by cornzz)
#213 [BUG] Mamba-Codestral-7B-v0.1: internal Triton PTX codegen error: PTX assembly aborted due to errors (bug, opened Aug 21, 2024 by andretisch)
#212 [Feat] Add streaming support to Codestral Mamba (bug, opened Aug 15, 2024 by xNul)
#210 [BUG] Rate limit exceeded on basic examples (bug, opened Aug 11, 2024 by AlbertoMQ)
#206 [BUG] ImportError: cannot import name 'Transformer' from 'mistral_inference.model' (/usr/local/lib/python3.10/dist-packages/mistral_inference/model.py) (bug, opened Jul 26, 2024 by rabeeqasem)
#205 [BUG] Could not find consolidated.00.pth or consolidated.safetensors in the Mistral model path, but mistralai/Mistral-Large-Instruct-2407 does not contain them (bug, opened Jul 26, 2024 by ShadowTeamCN)
#202 [BUG] ModuleNotFoundError: No module named 'mistral_inference.transformer' (bug, opened Jul 23, 2024 by yafangwang9)
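Taken together, #202 and #206 suggest the Transformer class moved between releases (assumption: older versions exposed it from mistral_inference.model, newer ones from mistral_inference.transformer). A version-tolerant import helper, sketched with the stdlib only; the module paths in the usage comment are the assumption just described:

```python
import importlib


def import_first(*candidates: str):
    """Return the first module in `candidates` that imports cleanly."""
    errors = []
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ModuleNotFoundError as exc:
            errors.append(f"{name}: {exc}")
    raise ModuleNotFoundError("; ".join(errors))


# Hypothetical usage, assuming the layout change described above:
#   mod = import_first("mistral_inference.transformer", "mistral_inference.model")
#   Transformer = mod.Transformer
```

Pinning a known-good package version is the more robust fix; the fallback import only papers over the mismatch between installed version and example code.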
#197 [BUG] Transformer.from_folder() does not load the model across multiple GPUs (bug, opened Jul 19, 2024 by Cerrix)
#196 [BUG] mistralai/mamba-codestral-7B-v0.1: AttributeError: 'Mamba2' object has no attribute 'dconv' (bug, opened Jul 19, 2024 by s-natsubori)
#192 [BUG] AssertionError: Mamba is not installed. Please install it using pip install mamba-ssm. (bug, opened Jul 17, 2024 by matbee-eth)
#182 [BUG] Mistral 7B Instruct models from Huggingface limited to 4096 tokens? (bug, opened Jun 18, 2024 by MaxS3552284)