Request: Walkthrough Installation on a Raspberry Pi 4 #9
The idea is to use a precompiled bin from:
I had always tried to install TF through these binaries and got this error message:
As @kahst pointed out in another issue:
I also got another error:
I am also stuck with a librosa error...
Although numpy 1.2 is correctly installed, I don't understand...
I tried to build TensorFlow on the RPi4, but the problem was the Bazel installation, which also gets stuck.
There is a detailed installation guide for BirdNET (the original, not -Lite); see the issues there. Have you tried it? I wonder what the differences to BirdNET-Lite are...
All this runs on macOS M1, but not on the RasPi. BirdNET (without Lite) doesn't use TensorFlow, which is needed here and which is (my) main problem with the installation on the RasPi4. Has anyone succeeded in installing and running TF on a Raspberry Pi? I would be highly interested in shared installation instructions for TF Lite.
No, it is for the RasPi4. See here:
Good news! mcguirepr89 is working on the BirdNET-Lite installer!
Yes, wish me luck -- the tflite pre-built binaries are new to me, and I don't have a laptop with enough storage to cross-compile the libraries (I only have a little Chromebook with 5 GB of storage 😞, and the Bazel build process is not ready for aarch64 😞). I have been corresponding with a fellow who has gotten this up and working before, so I will likely use him as a resource and will certainly share any helpful information I learn.
@Christoph-Lauer: try out the latest installation instructions by @mcguirepr89 for setting up a BirdNET stand-alone system on the RPi4. I can confirm that it installs neatly and runs perfectly! Nice job, Patrick, @mcguirepr89! @mcguirepr89: Really looking forward to a BirdNET-Lite version! Just a dumb question: would it make sense to prepare the Lite version for a "lighter" 32-bit OS? Would that enable us to run the Lite version on less power-hungry RPis (RPi3) and especially a less power-hungry OS? But I may be on the totally wrong path with that question... I noticed that the RPi with AArch64 running the BirdNET system gets really, really hot...
Hello, all -- I have BirdNET-Lite up and running now and will be setting up a "system" to accompany it. In the meantime, if you want to just get this project installed on a Raspberry Pi 4B running an AArch64 OS, this script does it:
You should be able to make that executable or copy and paste each line in the terminal. I'll be sure to let you know how the "system" is coming. Best regards,
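As a quick post-install sanity check (my own addition, not part of the script above; package versions are whatever your wheels provide), a snippet like this confirms the pieces BirdNET-Lite depends on import cleanly on an AArch64 Pi:

```python
# Post-install sanity check (not from the install script itself):
# confirms the Python pieces BirdNET-Lite relies on are importable on the Pi.
import platform

import numpy
import librosa
from tflite_runtime.interpreter import Interpreter  # full-TF installs use tensorflow.lite.Interpreter

print("machine :", platform.machine())   # expect 'aarch64' on a 64-bit Pi OS
print("numpy   :", numpy.__version__)
print("librosa :", librosa.__version__)
print("tflite  : interpreter import OK")
```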
Excellent Patrick! Awesome you got this!
Hello again -- I wanted to update you that I have put together an all-in-one recording and detection extraction system for the Raspberry Pi 4B, built on a fork of this repo. Please check it out here. I hope folks interested in using the BirdNET-Lite platform on the Raspberry Pi will find this project helpful and easy to use.
It may have already been resolved, so if it's not useful, please ignore it. I have optimized FlexRFFT by replacing it with the standard TFLite operations. I only optimized the model, so the license needs to follow this repository.
I haven't checked the operation yet.
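For anyone who wants to verify that the optimized model really sticks to standard TFLite ops, a minimal sketch (the file name is my assumption) is to load it with the plain tflite_runtime interpreter, which does not bundle the TF Select (Flex) delegate and therefore rejects models that still contain FlexRFFT:

```python
# Minimal check: a model that still contains Flex ops (e.g. FlexRFFT) will fail
# here, because plain tflite_runtime does not ship the TF Select (Flex) delegate.
from tflite_runtime.interpreter import Interpreter

MODEL_PATH = "birdnet_lite_optimized.tflite"  # assumed file name

try:
    interpreter = Interpreter(model_path=MODEL_PATH)
    interpreter.allocate_tensors()
    print("Loads with standard TFLite ops only.")
    for detail in interpreter.get_input_details():
        print("input:", detail["name"], detail["shape"], detail["dtype"])
except (ValueError, RuntimeError) as err:
    print("Model still requires ops outside the standard TFLite set:", err)
```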
@PINTO0309: Thanks for this! We have been using your bins for this project -- thanks so much for providing them!!!!
@nilspupils I did see this, but it is way above my head, too ;) I didn't know what to do with it, so I just made sure to star the repo and started following @PINTO0309 (a SUPER active and very prolific person -- a veritable genius from my vantage point).
Proper legend in the bin field 😉
@PINTO0309: First of all, many thanks for your contribution and for putting so much effort into BirdNET to include it in your projects!
I'm a geek, so I enjoy making various models from around the world usable across frameworks. Therefore, for models that contain special layers, such as this one, I use my own tool to automatically replace the special layers with harmless layers. Models consisting of only standard layers can be run on most deep learning frameworks. By doing so, it is possible to generate models that are independent of specific hardware or frameworks: Android, iPhone, macOS, Raspberry Pi, NVIDIA GPUs, Jetson Nano, Google Chrome, Safari, OAK (OpenCV AI Kit), WASM, etc. For reference, here are the conversion patterns I can handle:
My tools: these home-grown tools understand the characteristics of the deep learning framework they are converting to and are able to optimize the model to the limit. They produce output that is more optimized than what the official model conversion tools from Microsoft, Facebook, Intel, and Google produce, and they also avoid some bugs in those official converters. However, my tools are always a WIP because they are always improving. PINTO_model_zoo, which I maintain, commits a large number of models whose training code or datasets are not publicly available and converts them for various frameworks. This is possible because only the binary file of the model is needed for conversion.
@PINTO0309 I regret not reaching out sooner -- thanks for the nudge, @nilspupils, my buddy :) -- thank you, @PINTO0309, so much for laying that out, for your prolific contributions, AND for modeling (pun intended) such a great ethos toward shared information.
Thanks for your great work, @PINTO0309! Would it be possible to use the Edge TPU Compiler on your BirdNET-Lite version? I run a Coral USB Accelerator on my Raspberry Pi, and this might increase the speed quite a lot.
The conditions for fast inference using the CPU are as follows:
INT8 quantized: this quantizes Float32 to 8-bit integers, which degrades the accuracy of the inference. There is another way to infer faster while maintaining accuracy.
Float32 model: There is too little information about the environment you are using, so I have presented this assuming that you are using Python. Supplementary information: I am constantly trying to convert all models to EdgeTPU. However, even if I can convert a model to EdgeTPU, I do not commit it to PINTO_model_zoo if the result does not perform well enough to be meaningful; successfully converting a model is not the same as committing it. In fact, the conversion of BirdNET-Lite to an EdgeTPU model succeeds, but as mentioned above, it produces a model that is completely unusable. Although it is not BirdNET-Lite, I have committed the results of benchmarking inference speed with Float32 + TFLite + XNNPACK here.
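To illustrate the Float32 + TFLite + XNNPACK route on the Pi's CPU, here is a rough benchmark sketch (the file name, thread count, and fixed input shape are my assumptions, not from this thread; recent tflite_runtime wheels apply the XNNPACK delegate to float models by default):

```python
# Rough benchmark sketch for Float32 + TFLite + XNNPACK on the Pi 4's CPU.
# num_threads=4 lets the delegate use all four Cortex-A72 cores.
import time
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="birdnet_lite_float32.tflite", num_threads=4)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
dummy = np.random.rand(*inp["shape"]).astype(np.float32)  # assumes a fixed input shape

start = time.perf_counter()
for _ in range(10):
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()
avg = (time.perf_counter() - start) / 10
print(f"average inference: {avg * 1000:.1f} ms, output shape: {out['shape']}")
```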
Working on converting a model which has an FFT layer from TensorFlow 2.x to OpenVINO IR.
My tool. OpenVINO does not have an RFFT implementation, so a custom layer would have to be coded in C++. I am not really interested in implementing custom layers.
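A possible workaround in that situation (my own suggestion, not something proposed in this thread) is to drop the RFFT layer from the converted graph and compute the FFT step outside the model, feeding the spectrogram in as a plain tensor input:

```python
# Sketch: compute the RFFT step outside the converted model with NumPy and
# pass the magnitude spectrogram to the runtime as a regular tensor input.
import numpy as np

def rfft_magnitude(frames: np.ndarray) -> np.ndarray:
    """frames: (num_frames, frame_length) float32 audio frames."""
    spectrum = np.fft.rfft(frames, axis=-1)       # complex half-spectrum
    return np.abs(spectrum).astype(np.float32)    # frame_length // 2 + 1 bins

frames = np.random.rand(16, 512).astype(np.float32)  # dummy framed audio
print(rfft_magnitude(frames).shape)                   # (16, 257)
```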
First of all, BirdNET is a wonderful project and I am really thankful that Stefan Kahl @kahst made it open source. As a project open to citizen science, it should be made approachable for the "average citizen scientist" who happens to come from a field other than computer science. The complexity of setting up BirdNET and TensorFlow is simply too high for most of us.
So could anyone link to or post a detailed walkthrough of the installation? If successful, a system image could be produced for even easier access to BirdNET. I would also ask Stefan Kahl to kindly support this idea.
Thanks! /Nils