# IndexError: list index out of range #7
Comments
I am encountering the same error on Ubuntu. I followed each and every step described in the blog but still got the error.

My output for this is: (screenshot omitted)

Please provide a solution.
Same error here. Elasticsearch returns no hits for every question about the Elastic docs.
It's not working for me on an 8.10.3 deployment, but it works fine on 8.9 with the fix from the closed issue #4. Thanks.
Hey, I found the issue:

```python
query = {
    "bool": {
        "must": [{
            "match": {
                "title": {
                    "query": query_text,
                    "boost": 1
                }
            }
        }],
        # Only match documents that actually have the ML inference
        # field populated.
        "filter": [{
            "exists": {
                "field": "ml.inference.title.predicted_value"
            }
        }]
    }
}
```

The above solved it for me.
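If the filter alone doesn't help, a quick sanity check is to count how many documents actually carry the inference field; a count of 0 would explain why every search comes back empty. A minimal sketch, assuming the same `es` client and `index` variable used in the blog's script:

```python
# Sketch: count documents that have the ML inference field the query
# filters on. Assumes the `es` client and `index` variable from the script.
count = es.count(
    index=index,
    query={"exists": {"field": "ml.inference.title.predicted_value"}},
)
print(count["count"])  # 0 here would explain why every search returns no hits
```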
Getting a list index out of range error, as below:
File "C:\PyProjects\elastic-ChatGPT\rsllc-elastic-env\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 541, in _run_script exec(code, module.__dict__) File "C:\PyProjects\elastic-ChatGPT\elastic-gpt.py", line 116, in <module> resp, url = search(query) File "C:\PyProjects\elastic-ChatGPT\elastic-gpt.py", line 83, in search body = resp['hits']['hits'][0]['fields']['body_content'][0]
The error is happening in the Streamlit script on the line

```python
body = resp['hits']['hits'][0]['fields']['body_content'][0]
```
The output of

```python
resp = es.search(index=index, query=query, knn=knn, fields=fields, size=1, source=False)
```

is

```python
{'took': 44, 'timed_out': False,
 '_shards': {'total': 2, 'successful': 2, 'skipped': 0, 'failed': 0},
 'hits': {'total': {'value': 0, 'relation': 'eq'}, 'max_score': None, 'hits': []}}
```
Since `hits.hits` is empty, indexing it with `[0]` raises the IndexError. I've followed the instructions in the blog but am unsure why the search returns no documents.
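In the meantime, guarding the indexing makes the failure easier to diagnose than a bare IndexError. A minimal sketch, not the blog's code, assuming the `resp` returned by the `es.search()` call above:

```python
# Guarded version (sketch) of the failing line in search(): fail with a
# readable message instead of an IndexError when the query matches nothing.
hits = resp["hits"]["hits"]
if not hits:
    raise RuntimeError(
        "Elasticsearch returned no hits; check that the ingest pipeline "
        "populated ml.inference.title.predicted_value"
    )
body = hits[0]["fields"]["body_content"][0]
```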