feat(ios): support Core ML (#41)
* feat(ios): initial CoreML support

* feat(ios): add WHISPER_COREML_ALLOW_FALLBACK

* docs(readme): coreml section

* feat(ios): download ggml model on bootstrap for testing

* feat(example): set var bool to choose download or load local model

* feat(android): cp model to assets/

* docs(readme): add platform requirements of coreml

* chore(ci): skip download ggml model

* fix(example): update create dir & comments

* fix(example): default to USE_DOWNLOAD_MODEL

* feat(ios): use patch to fix build issue

* docs(example): update contributing guide

* docs(readme): update coreml section
jhen0409 authored May 13, 2023
1 parent bc1f43f commit 736f9e0
Showing 12 changed files with 415 additions and 286 deletions.
12 changes: 12 additions & 0 deletions CONTRIBUTING.md
@@ -35,6 +35,18 @@ To run the example app on iOS:
yarn example ios
```

To enable Core ML in the example app, you need to convert the model first:

- Remove `whisper.cpp/models/ggml-base.en-encoder.mlmodelc` if it is empty (it's generated by `yarn bootstrap`)
- Please read the [Core ML support](../README.md#core-ml-support) section of the README and the [Core ML support](https://github.com/ggerganov/whisper.cpp#core-ml-support) section of whisper.cpp

Then modify `App.js` to use local bundle assets:

```patch
- const USE_DOWNLOAD_MODEL = true
+ const USE_DOWNLOAD_MODEL = false
```
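For reference, a minimal sketch of what the flag switches between (everything except `USE_DOWNLOAD_MODEL` is an assumption here; the asset path in the real `App.js` may differ):

```ts
import RNFS from 'react-native-fs'

const USE_DOWNLOAD_MODEL = false

// When true, the model is downloaded into the app's documents directory at
// runtime; when false, the model bundled under assets/ is used instead.
const modelFilePath = USE_DOWNLOAD_MODEL
  ? `${RNFS.DocumentDirectoryPath}/ggml-base.en.bin` // downloaded at runtime
  : require('./assets/ggml-base.en.bin') // bundled local asset (assumed path)
```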

Make sure your code passes TypeScript and ESLint. Run the following to verify:

```sh
yarn typecheck
yarn lint
```
24 changes: 24 additions & 0 deletions README.md
@@ -81,6 +81,30 @@ In Android, you may need to request the microphone permission by [`PermissionsAndroid`](https://reactnative.dev/docs/permissionsandroid)

The documentation is not ready yet; please see the comments in the [index](./src/index.ts) file for more details in the meantime.

## Core ML support

__*Platform: iOS 15.0+, tvOS 15.0+*__

To use Core ML on iOS, you will need to have the Core ML model files.

The `.mlmodelc` model file is loaded based on the ggml model file path. For example, if your ggml model path is `ggml-base.en.bin`, the Core ML model path will be `ggml-base.en-encoder.mlmodelc`. Please note that the ggml model is still required: it provides the decoder and serves as a fallback for the encoder.
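The naming rule can be expressed as a small helper (illustrative only; `coreMLModelPath` is a hypothetical name, not part of the whisper.rn API):

```ts
// 'ggml-base.en.bin' -> 'ggml-base.en-encoder.mlmodelc'
const coreMLModelPath = (ggmlModelPath: string): string =>
  ggmlModelPath.replace(/\.bin$/, '-encoder.mlmodelc')

coreMLModelPath('/models/ggml-base.en.bin')
// => '/models/ggml-base.en-encoder.mlmodelc'
```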

Currently there is no official way to get the Core ML models by URL; you will need to convert the ggml model to a Core ML model folder yourself. Please see the [Core ML support](https://github.com/ggerganov/whisper.cpp#core-ml-support) section of whisper.cpp for more details.

Because `.mlmodelc` is a directory, you will need to download 5 files:

```json5
[
'model.mil',
'metadata.json',
'coremldata.bin',
'weights/weights.bin',
'analysis/coremldata.bin',
]
```

Or just add them to your app's bundle resources, like the example app does, but this would increase the app size significantly.
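If you take the download route and host a converted model yourself, a minimal sketch with react-native-fs might look like this (the base URL is hypothetical; only the five file names above are given by the format):

```ts
import RNFS from 'react-native-fs'

// Hypothetical host for your converted Core ML model files
const BASE_URL = 'https://example.com/ggml-base.en-encoder.mlmodelc'
const DEST = `${RNFS.DocumentDirectoryPath}/ggml-base.en-encoder.mlmodelc`

const FILES = [
  'model.mil',
  'metadata.json',
  'coremldata.bin',
  'weights/weights.bin',
  'analysis/coremldata.bin',
]

async function downloadCoreMLModel(): Promise<void> {
  // mkdir creates intermediate directories, covering the subfolders
  await RNFS.mkdir(`${DEST}/weights`)
  await RNFS.mkdir(`${DEST}/analysis`)
  for (const file of FILES) {
    await RNFS.downloadFile({
      fromUrl: `${BASE_URL}/${file}`,
      toFile: `${DEST}/${file}`,
    }).promise
  }
}
```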

## Run with example

The example app uses [react-native-fs](https://github.com/itinance/react-native-fs) to download the model file and audio file.
2 changes: 2 additions & 0 deletions example/.gitignore
@@ -0,0 +1,2 @@
*.mlmodelc
**/assets/*.bin
3 changes: 3 additions & 0 deletions example/ios/Podfile
@@ -22,6 +22,9 @@ if linkage != nil
end

target 'RNWhisperExample' do
# Tip: You can use RNWHISPER_DISABLE_COREML = '1' to disable CoreML support.
ENV['RNWHISPER_DISABLE_COREML'] = '0'

config = use_native_modules!

# Flags change depending on the env values.