
How to load a local DocLayout-YOLO-DocStructBench-onnx model instead of fetching it from Hugging Face #300

Closed
15692280859 opened this issue Dec 20, 2024 · 7 comments

Comments

@15692280859

I am testing on an intranet, so I downloaded the DocLayout-YOLO-DocStructBench-onnx model from Hugging Face myself. Which path should it be placed in so that the program reads the local model directly instead of downloading it from Hugging Face over the network?

@hellofinch
Contributor

How are you running it? Docker or a pip install?

@15692280859
Author

How are you running it? Docker or a pip install?

Installed via pip.

@hellofinch
Contributor

On Linux the cache lives under ~/.cache/huggingface.
Download the model on a machine with network access and place it under that path.
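
A quick way to confirm which directory huggingface_hub will actually use on a given machine is a small diagnostic sketch like the one below; HF_HOME and HF_HUB_CACHE are standard huggingface_hub settings, and note that older huggingface_hub releases expose the constant as HUGGINGFACE_HUB_CACHE instead.

    # Sketch: print the hub cache directory huggingface_hub resolves.
    # The default is ~/.cache/huggingface/hub on every platform unless
    # HF_HOME or HF_HUB_CACHE is set in the environment.
    import os
    from huggingface_hub import constants

    print("HF_HOME override:", os.environ.get("HF_HOME"))
    print("Hub cache dir:", constants.HF_HUB_CACHE)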

@15692280859
Author

Is Windows supported? I tried putting doclayout-yolo-docStructbench_imgsz1024.onnx into ~/.cache/huggingface and into ~/.cache/huggingface/hub, but in both cases it still requests the download from Hugging Face.

@hellofinch
Contributor

Oh, the file structure is wrong.
[screenshot of the expected cache file structure]
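
For reference, huggingface_hub does not pick up bare files dropped into ~/.cache/huggingface; it expects the nested hub layout models--<org>--<repo>/snapshots/<revision>/<filename>. Below is a sketch for checking whether the cache is laid out correctly without touching the network; the repo_id and filename are assumptions, so take the real values from the hf_hub_download call in doclayout.py.

    # Sketch: resolve the model from the local cache only.
    # Raises LocalEntryNotFoundError if the expected structure is missing.
    from huggingface_hub import hf_hub_download

    pth = hf_hub_download(
        repo_id="wybxc/DocLayout-YOLO-DocStructBench-onnx",  # assumption
        filename="doclayout_yolo_docstructbench_imgsz1024.onnx",  # assumption
        local_files_only=True,  # never hit the network
    )
    print("Found cached model at:", pth)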

@awwaawwa
Contributor

awwaawwa commented Dec 20, 2024

You can try loading the model from HF on a machine with internet access, then archive the ~/.cache/huggingface directory and copy it to the intranet machine.
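
A sketch of the online-machine step, assuming huggingface_hub is installed there; the repo_id is an assumption and should be taken from doclayout.py.

    # Sketch: on an internet-connected machine, download the repo into the
    # local Hugging Face cache, then archive ~/.cache/huggingface and copy
    # it onto the intranet machine.
    from huggingface_hub import snapshot_download

    local_dir = snapshot_download("wybxc/DocLayout-YOLO-DocStructBench-onnx")  # assumption: repo_id
    print("Snapshot cached under:", local_dir)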

@awwaawwa
Contributor

awwaawwa commented Dec 20, 2024

I notice that doclayout.py contains:

            pth = hf_hub_download(repo_id=repo_id, filename=filename, etag_timeout=1)
        return OnnxModel(pth)

Here, pth is the path to the onnx file. Another option is to modify this code locally and return OnnxModel(path) with a local file path directly, as sketched below.
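
A minimal sketch of that local workaround, placed inside the same function that calls hf_hub_download; the DOCLAYOUT_ONNX_PATH variable name is made up for illustration and is not something the project currently reads.

    # Sketch: allow a local override before falling back to the hub.
    import os

    local_pth = os.environ.get("DOCLAYOUT_ONNX_PATH")  # hypothetical override
    if local_pth:
        return OnnxModel(local_pth)  # skip hf_hub_download entirely
    pth = hf_hub_download(repo_id=repo_id, filename=filename, etag_timeout=1)
    return OnnxModel(pth)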

It would be good to later add a CLI option that lets users specify the onnx model path manually. (If no volunteer picks it up, I can write it when I have time.)
