
Documentation: Add an example scenario that shows intra-delegation #118

Merged: 1 commit merged on Oct 8, 2024
16 changes: 11 additions & 5 deletions documentation/getting_started.md
```diff
@@ -685,10 +685,16 @@ the screen as shown below.
 
 The devices used to capture the screenshots are a Galaxy S24 and a Raspberry Pi 4.
 
-### Run Android Inference
+### ML Computation Delegation Scenario
 
-`NNStreamer-SSD` is a simple object detection example with TF-Lite model.
-You can see build and install
-process [here](https://github.com/nnstreamer/nnstreamer-example/tree/main/android/example_app#build-example)
+#### Handling ML Computation Delegation Requests from Android Components in the Same Package
 
-![Android example](img/A-24-arm64_ssd.jpg)
+The MLAgent service can also handle ML computation requests from other components in the same package. To demonstrate this scenario, we added a simple Activity example to the `ml_inference_offloading` package. The example uses the Android CameraX API to take a picture and sends it to the MLAgent service for inference. The MLAgent service then handles the request using the bundled models and returns the result to the calling component.
+
+To switch the screen to this scenario, 1) open the modal navigation drawer and 2) select **Vision Examples**. Note that to open the navigation drawer, you need to press [the hamburger menu button](https://fonts.google.com/icons?selected=Material+Symbols+Outlined:menu:FILL@0;wght@400;GRAD@0;opsz@24&icon.size=24&icon.color=%235f6368&icon.platform=android) in the top-left corner of the screen. The detailed procedure is shown in the following screenshots.
+
+![Android Device (Running the App): Open the modal navigation drawer](img/A-34-arm64_Drawer.png)
+
+After switching the screen, you can see the **Camera Preview** and the real-time object classification result above the preview box, as shown below.
+
+![Android example](img/A-34-arm64_VisionEx0.png)
```
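The intra-package delegation flow this change documents (an Activity captures an image with CameraX and delegates inference to the MLAgent service in the same package) could be sketched roughly as below. This is a hypothetical illustration only, not the actual project code: the class, message code, and bundle-key names (`MLAgentService`, `MSG_REQUEST_INFERENCE`, `"input"`, `"label"`) are assumptions, and the real `ml_inference_offloading` service defines its own protocol.

```kotlin
import android.content.ComponentName
import android.content.Context
import android.content.Intent
import android.content.ServiceConnection
import android.os.Handler
import android.os.IBinder
import android.os.Looper
import android.os.Message
import android.os.Messenger
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.ImageCapture
import androidx.camera.core.ImageCaptureException
import androidx.camera.core.ImageProxy
import androidx.core.content.ContextCompat
import androidx.core.os.bundleOf

// Hypothetical message code; the real service defines its own protocol.
const val MSG_REQUEST_INFERENCE = 1

class VisionExampleActivity : AppCompatActivity() {

    private var agentMessenger: Messenger? = null

    // Receives the classification result produced by the bundled model.
    private val resultHandler = Handler(Looper.getMainLooper()) { reply ->
        val label = reply.data.getString("label") // assumed result key
        // ... update the label shown above the camera preview ...
        true
    }

    private val connection = object : ServiceConnection {
        override fun onServiceConnected(name: ComponentName, binder: IBinder) {
            agentMessenger = Messenger(binder) // talk to MLAgent through its Messenger
        }
        override fun onServiceDisconnected(name: ComponentName) {
            agentMessenger = null
        }
    }

    override fun onStart() {
        super.onStart()
        // Bind to the MLAgent service that lives in the same package;
        // no cross-app IPC permissions are needed for intra-package delegation.
        bindService(Intent(this, MLAgentService::class.java), connection, Context.BIND_AUTO_CREATE)
    }

    // Take a picture with CameraX and forward the bytes to the service.
    fun captureAndDelegate(imageCapture: ImageCapture) {
        imageCapture.takePicture(
            ContextCompat.getMainExecutor(this),
            object : ImageCapture.OnImageCapturedCallback() {
                override fun onCaptureSuccess(image: ImageProxy) {
                    val bytes = ByteArray(image.planes[0].buffer.remaining())
                    image.planes[0].buffer.get(bytes)
                    image.close()
                    val msg = Message.obtain(null, MSG_REQUEST_INFERENCE).apply {
                        data = bundleOf("input" to bytes)   // assumed request key
                        replyTo = Messenger(resultHandler)  // service replies here
                    }
                    agentMessenger?.send(msg)
                }

                override fun onError(exception: ImageCaptureException) {
                    // ... handle capture failure ...
                }
            }
        )
    }
}
```

A `Messenger`-based bound service is only one plausible wiring for same-package delegation; a direct `Binder` subclass or AIDL interface would work equally well, and which mechanism MLAgent actually uses is not stated in this PR.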
Binary file removed documentation/img/A-24-arm64_ssd.jpg
Binary file not shown.
Binary file added documentation/img/A-34-arm64_Drawer.png
Binary file not shown.
Binary file added documentation/img/A-34-arm64_VisionEx0.png
Binary file not shown.