
main exe with deepseek-coder-1.3b-instruct.Q8_0.gguf not stopping correctly #6912

Closed
hyperbolic-c opened this issue Apr 25, 2024 · 7 comments

Comments

@hyperbolic-c

When I use main to load the deepseek-coder model from here, the model cannot stop correctly.

Bob: Of course, I'd be happy to help. Could you please provide me with the prompt?

In the case of generating manim code for the animation, the prompt should be something like this:

1. How would you like to animate the following scene in manim?
2. What is the main idea of the scene?
3. What are the main transitions between scenes?
4. What should be the initial and final setup of the scene?
5. How would you like to style the scene? (colors, shapes, etc.)
6. What should be the animation speed and direction?
7. What is the main focus of the scene?

The User will provide the prompts and I would generate the manim code accordingly.

Note: I will not be able to run the code as I am a text-based bot. The generation of code will be done manually by the User.


A: In Python, you can use the `manim` library to create animations. Here's a basic example of how you could generate code for a manim animation given a prompt:

```python
from manim import *

class MyScene(Scene):
    def construct(self):
        # Your animation here
        pass

# Generate code from prompt
def prompt_to_manim(prompt):
    # Implement logic to map prompt to manim commands
    pass

prompt = "1. How would you like to animate the following scene in manim?\n2. What is the main idea of the scene?\n3. What are the main transitions between scenes?\n4. What should be the initial and final setup of the scene?\n
5. How would you like to style?\n6\pass\m\a\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot\text\bot\bot
```

The chat template for server is:

```
Usage: ./server -m ... --chat-template deepseek
deepseek-ai/deepseek-coder-33b-instruct
You are a helpful assistant### Instruction:
Hello
### Response:
Hi there
<|EOT|>
### Instruction:
Who are you
### Response:
   I am an assistant   
<|EOT|>
### Instruction:
Another question
### Response:
```
How can I fix it with prefix, suffix, etc. under main mode? Thanks for your reply!
Maybe main could support using --chat-template like server does.

@RhinoDevel
Contributor

Could this be related to #6903?

@Jeximo
Contributor

Jeximo commented Apr 25, 2024

When I use main to load the deepseek-coder model from here, the model cannot stop correctly.

Deepseek is not yet implemented, here's the PR: #5981

Maybe main could support using --chat-template like server does.

Yes, it's being discussed: #6822

How can I fix it with prefix, suffix, etc. under main mode?

./main -m ~/model.gguf --temp 0 -ins --penalize-nl -r "<|EOT|>" --in-suffix "### Response:" -p "You are a helpful assistant."

@hyperbolic-c
Author

Could this be related to #6903?

Yep, this problem is widespread. But I did not try a newer quant model, because the deepseek model is not yet supported.

@hyperbolic-c
Author

@Jeximo Thanks! I've been following the progress of supporting the deepseek model. So I'm asking for help on how to solve the problem by using parameters with main like this, which is a little complicated for me.

@Jeximo
Contributor

Jeximo commented Apr 26, 2024

I'm asking for help on how to solve the problem by using parameters with main like this, which is a little complicated for me.

@hyperbolic-c
I gave an example of how to use deepseek-coder-1.3b-instruct.Q8_0.gguf in main. Here it is again,

./main -m ~/deepseek-coder-1.3b-instruct.Q8_0.gguf --temp 0 -ins --penalize-nl -r "<|EOT|>" --in-suffix "### Response:" -p "You are a helpful assistant."
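
For readers who land on this thread later, here is the same command with each flag annotated. The descriptions reflect my understanding of llama.cpp's main options at the time; option names and defaults change, so check ./main --help on your build.

```bash
# Commented sketch of the invocation above (flag behaviour as I understand it;
# verify against ./main --help for your version of llama.cpp):
#
#   --temp 0        temperature 0, i.e. effectively greedy/deterministic sampling
#   -ins            instruct/interactive mode: main waits for your input each turn
#   --penalize-nl   apply the repetition penalty to newline tokens as well
#   -r "<|EOT|>"    reverse prompt: stop generating and hand control back whenever
#                   this string appears (deepseek-coder's end-of-turn token)
#   --in-suffix     text appended after each user input; "### Response:" matches
#                   the deepseek chat template shown earlier in this issue
#   -p              initial prompt, used here as a simple system message
./main -m ~/deepseek-coder-1.3b-instruct.Q8_0.gguf --temp 0 -ins --penalize-nl \
  -r "<|EOT|>" --in-suffix "### Response:" -p "You are a helpful assistant."
```

The part that addresses the "not stopping" symptom is -r "<|EOT|>": it tells main to treat deepseek-coder's end-of-turn token as a stop string instead of letting generation run on.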

@hyperbolic-c
Author

hyperbolic-c commented Apr 26, 2024

@Jeximo Oh, I've seen it. Thanks again!

I'm asking for help on how to solve the problem by using parameters with main like this, which is a little complicated for me.

I mean, thank you for your help.

@github-actions github-actions bot added the stale label May 27, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.
