Issue applying for jobs in Windows #1

Open
victordp86 opened this issue Sep 5, 2024 · 1 comment
Comments

@victordp86

I have several issues when I use the app.

Classify Result: None
Type of Classify Result: <class 'NoneType'>
Traceback (most recent call last):
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\main.py", line 17, in <module>
    bot.run(job_title)
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\src\graph.py", line 206, in run
    state = self.graph.invoke({"job_title": job_title})
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\pregel\__init__.py", line 1617, in invoke
    for chunk in self.stream(
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\pregel\__init__.py", line 1303, in stream
    _panic_or_proceed(all_futures, loop.step)
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\pregel\__init__.py", line 1733, in _panic_or_proceed
    raise exc
  File "C:\Users\user\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\pregel\executor.py", line 59, in done
    task.result()
  File "C:\Python312\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
  File "C:\Python312\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Python312\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\Mb3-vdonet\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\pregel\retry.py", line 26, in run_with_retry
    task.proc.invoke(task.input, task.config)
  File "C:\Users\Mb3-vdonet\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langchain_core\runnables\base.py", line 2876, in invoke
    input = context.run(step.invoke, input, config, **kwargs)
  File "C:\Users\Mb3-vdonet\proyectos\upwork-auto-jobs-applier-using-AI\venv\Lib\site-packages\langgraph\utils.py", line 102, in invoke
    ret = context.run(self.func, input, **kwargs)
  File "C:\Users\Mb3-vdonet\proyectos\upwork-auto-jobs-applier-using-AI\src\graph.py", line 76, in classify_scraped_jobs
    matches = json.loads(classify_result, strict=False)["matches"]
  File "C:\Python312\Lib\json\__init__.py", line 339, in loads
    raise TypeError(f'the JSON object must be str, bytes or bytearray, '
TypeError: the JSON object must be str, bytes or bytearray, not NoneType

@kaymen99
Owner

It looks like the issue is that the model is returning a None value instead of a valid JSON object, which is causing the graph to break. Could you please let me know which LLM model you're using?

I am using gemini-1.5-pro for the classification task, and the code works fine. Sometimes the Groq llama3 models fail to produce valid JSON output.
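Until the underlying model issue is resolved, a defensive wrapper around the `json.loads` call in `classify_scraped_jobs` would turn this crash into an empty result. This is only a sketch, not the project's actual code: the `parse_matches` helper name is illustrative, and it assumes the raw classifier output is available as a string (or `None`) before parsing.

```python
import json


def parse_matches(classify_result):
    """Safely extract the "matches" list from an LLM classification response.

    Returns an empty list instead of raising when the model returns None,
    malformed JSON, or JSON that is not an object.
    """
    if classify_result is None:
        # The LLM returned nothing (e.g. a failed or filtered completion).
        return []
    try:
        payload = json.loads(classify_result, strict=False)
    except json.JSONDecodeError:
        # The model produced text that is not valid JSON.
        return []
    if not isinstance(payload, dict):
        # Valid JSON, but not the expected {"matches": [...]} object.
        return []
    return payload.get("matches", [])
```

With this guard, a `None` or garbage response simply yields no matches for that batch instead of aborting the whole graph run; logging a warning in each early-return branch would make the failure visible without crashing.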
