feat(ml): export clip models to ONNX and host models on Hugging Face (#4700)

* export CLIP models (see the ONNX export sketch after this list)

* export to Hugging Face

refactored export code

* export M-CLIP, general refactoring

cleanup

* updated conda deps

* do transforms with Pillow and NumPy (see the preprocessing sketch after this list), add tokenization config to export, general refactoring

* moved conda Dockerfile, re-added Poetry

* minor fixes

* updated link

* updated tests

* removed `requirements.txt` from workflow

* fixed mimalloc path

* removed torchvision

* cleaner np typing

* review suggestions

* update default model name

* update test
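
As a rough illustration of the export step, here is a minimal sketch of turning a CLIP image tower into an ONNX graph with `torch.onnx.export`. It assumes an `open_clip`-style loader; the model name, wrapper class, and file path are illustrative, not the PR's actual code.

```python
# Hypothetical sketch: export a CLIP vision encoder to ONNX.
# Assumes open_clip; names and paths are illustrative only.
import torch
import open_clip

model, _, _ = open_clip.create_model_and_transforms("ViT-B-32", pretrained="openai")
model.eval()


class VisionWrapper(torch.nn.Module):
    """Expose only the image tower so the exported graph has one task."""

    def __init__(self, clip_model: torch.nn.Module) -> None:
        super().__init__()
        self.clip_model = clip_model

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.clip_model.encode_image(image)


# Dummy NCHW input; the batch axis is marked dynamic below.
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    VisionWrapper(model),
    dummy,
    "visual.onnx",
    input_names=["image"],
    output_names=["embedding"],
    dynamic_axes={"image": {0: "batch"}, "embedding": {0: "batch"}},
    opset_version=17,
)
```

Exporting the image and text towers as separate graphs is a common choice here, since search indexing only needs the image side and query-time only needs the text side.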
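And for the "transforms with Pillow and NumPy" bullet, a sketch of CLIP-style preprocessing (bicubic resize of the shorter side, center crop, rescale, channel-wise normalize) without torchvision. The mean/std constants are the standard CLIP values; the helper name is hypothetical.

```python
# Hypothetical sketch: CLIP image preprocessing with Pillow + NumPy only.
import numpy as np
from PIL import Image

# Standard CLIP normalization constants.
MEAN = np.array([0.48145466, 0.4578275, 0.40821073], dtype=np.float32)
STD = np.array([0.26862954, 0.26130258, 0.27577711], dtype=np.float32)


def preprocess(image: Image.Image, size: int = 224) -> np.ndarray:
    # Resize so the shorter side equals `size`, then center-crop a square.
    scale = size / min(image.size)
    image = image.resize(
        (round(image.width * scale), round(image.height * scale)),
        Image.Resampling.BICUBIC,
    )
    left = (image.width - size) // 2
    top = (image.height - size) // 2
    image = image.crop((left, top, left + size, top + size))

    # HWC uint8 -> float32 in [0, 1], normalize, then CHW with a batch axis.
    array = np.asarray(image.convert("RGB"), dtype=np.float32) / 255.0
    array = (array - MEAN) / STD
    return np.expand_dims(array.transpose(2, 0, 1), 0)
```

Dropping torchvision for this keeps the runtime image slimmer, since Pillow and NumPy are already dependencies of the ML service.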
Mert 2023-10-31 06:02:04 -04:00 committed by GitHub
parent 3212a47720
commit 87a0ba3db3
29 changed files with 6192 additions and 2043 deletions


@@ -140,9 +140,8 @@
 isEdited={machineLearningConfig.clip.modelName !== savedConfig.clip.modelName}
 >
 <p slot="desc" class="immich-form-label pb-2 text-sm">
-The name of a CLIP model listed <a
-href="https://clip-as-service.jina.ai/user-guides/benchmark/#size-and-efficiency"><u>here</u></a
->. Note that you must re-run the 'Encode CLIP' job for all images upon changing a model.
+The name of a CLIP model listed <a href="https://huggingface.co/immich-app"><u>here</u></a>. Note that
+you must re-run the 'Encode CLIP' job for all images upon changing a model.
 </p>
 </SettingInputField>
 </div>