
Ollama working but formatting issues on the templates for tailored resumes. #1090

Open
ludiusvox opened this issue Feb 9, 2025 · 3 comments
Labels
bug Something isn't working

Comments

@ludiusvox

Describe the bug

Ollama is now working for both the Resume and Tailored Resume flows, but the generated output has formatting issues.

Steps to reproduce

The tailored-resume templates have formatting issues that the main resume does not. Sample header output:

Based on your requirements, here is the polished header for the ATS-friendly resume:
Aaron Linder
Autryville, USA
+(1) 356-580-7403 [email protected] LinkedIn GitHub
81 Hunter's View Dr.
 28318 09/19/1981
Contact information will be displayed as follows:
Contact Information: {Name, Surname}, {City, Country}
Phone Number: +{Phone number}
Email Address: {Email}
LinkedIn Profile: {LinkedIn profile (if provided)}
GitHub Profile: {GitHub profile (if provided)}
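For comparison, here is a minimal shell sketch of what that placeholder spec should render, one field per line. The sample values are hypothetical stand-ins for the {Name, Surname}, {City, Country}, {Phone number}, and {Email} placeholders:

```shell
# Hypothetical sample values standing in for the template placeholders above.
name_surname="Jane Doe"
city_country="Springfield, USA"
phone="1 555-000-0000"
email="jane@example.com"

# Render the header with one field per line, as the placeholder spec describes.
header=$(printf 'Contact Information: %s, %s\nPhone Number: +%s\nEmail Address: %s\n' \
  "$name_surname" "$city_country" "$phone" "$email")
echo "$header"
```

The buggy output above instead runs the address, ZIP code, and date of birth together on adjacent lines, which is what this issue is about.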

Expected behavior

No response

Actual behavior

No response

Branch

None

Branch name

No response

Python version

No response

LLM Used

No response

Model used

No response

Additional context

I got Ollama working. I can't open a pull request myself, but please fix the formatting issue.

@ludiusvox ludiusvox added the bug Something isn't working label Feb 9, 2025
@ludiusvox ludiusvox changed the title [BUG]: <Provide a clear, descriptive title> Ollama working but formatting issues on the templates for tailored resumes. Feb 9, 2025
@haris-peter

Can I get that fixed Ollama code, @ludiusvox?

@ludiusvox

@haris-peter Yeah, GitHub will do the comparison for you; let me get on GitHub real quick.

I have tested Ollama on Fedora 41 with an AMD Radeon 6800 using AMD drivers. It required a custom installation script for Ollama (install.sh had to be rewritten), and I used the following model in Ollama to keep the FLOPS requirements low:

QuantFactory/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO-GGUF
(Q8_0, 8-bit quantization)
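For anyone reproducing this setup, a minimal sketch of loading a locally downloaded GGUF file into Ollama via a Modelfile. The file name and model tag below are illustrative, not the exact download:

```shell
# Point Ollama at a locally downloaded GGUF (illustrative file name).
cat > Modelfile <<'EOF'
FROM ./Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO.Q8_0.gguf
EOF

# Register and run the model only if the ollama CLI is installed.
if command -v ollama >/dev/null 2>&1; then
  ollama create fireball-llama -f Modelfile   # hypothetical local tag
  ollama run fireball-llama "Say hello"
fi
```

`ollama create` reads the `FROM` path out of the Modelfile and imports the quantized weights under the given tag, so no network pull is needed once the GGUF is on disk.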

Let me know how you want the changes submitted. I would just create an additional branch, but I am not a developer on this repository.

@ludiusvox

Bump. I don't want to fork this, but let me know how active you are; I could make another branch.

I am working on something else more actively right now, but I would like to see an automation program.
