Salvador Belenguer
bffa3757b5
Merge branch 'IDEA-Research:main' into batched_float16_inference
2024-03-03 11:01:59 +00:00
ASHWIN UNNIKRISHNAN
d13643262e
Update inference.py (#298)
2024-02-23 15:10:00 +08:00
Salvador Belenguer
ace383e299
Batched inference API and support for float16 inference
2024-01-27 14:08:29 +01:00
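The commit above adds a batched inference API and float16 support. As a rough, framework-agnostic sketch of the batching-plus-half-precision pattern (the `run_batched_fp16` helper and the toy model below are illustrative stand-ins, not the repository's actual API, which works on torch tensors):

```python
import numpy as np

def run_batched_fp16(inputs, model_fn, batch_size=8):
    """Run model_fn over inputs in fixed-size batches, cast to float16.

    `model_fn` is a hypothetical stand-in for a model forward pass; the
    real change wires batching and half precision into the predict path.
    """
    outputs = []
    for start in range(0, len(inputs), batch_size):
        # stack a slice of inputs into one batch and halve its precision
        batch = np.stack(inputs[start:start + batch_size]).astype(np.float16)
        outputs.extend(model_fn(batch))
    return outputs

# toy "model": reduce each item in the batch to a single score
toy_model = lambda batch: [float(x.sum()) for x in batch]
scores = run_batched_fp16([np.ones((2, 2))] * 5, toy_model, batch_size=2)
```

Casting to float16 halves activation memory, which is what makes larger batch sizes practical on the same hardware.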
Mohamad Al Mdfaa
9389fa492b
fix: improve phrases2classes implementation (#143)
This commit improves the phrases2classes implementation by using a regular expression to match sub-phrases in the phrases list, making it both more accurate and more efficient.
2023-06-17 02:36:16 -07:00
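The regex-based matching described in #143 can be sketched as follows. This `phrases2classes` is a simplified stand-in for the repository's implementation, assuming each detected phrase maps to the first class name that appears in it as a whole word:

```python
import re
from typing import List, Optional

def phrases2classes(phrases: List[str], classes: List[str]) -> List[Optional[int]]:
    """Map each detected phrase to a class index, or None if no class matches.

    A word-boundary regex matches class names appearing as sub-phrases,
    so "dog" matches inside "black dog" without naive substring pitfalls.
    """
    class_ids: List[Optional[int]] = []
    for phrase in phrases:
        matched = None
        for idx, cls in enumerate(classes):
            if re.search(rf"\b{re.escape(cls)}\b", phrase):
                matched = idx
                break
        class_ids.append(matched)
    return class_ids
```

`re.escape` keeps class names containing regex metacharacters (e.g. "c++") from breaking the pattern.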
HaoRan-hash
9a96ef055c
Solve combined categories (#125)
* Update inference.py
2023-06-07 11:48:08 -07:00
Darshat Shah
168d65d5c4
create "." separated caption
2023-05-03 23:23:32 +05:30
rentainhe
a4dcf5d411
fix bug
2023-05-02 19:41:34 +08:00
Darshat Shah
ff94310921
use model.device when calling legacy predict
2023-04-27 12:15:11 +05:30
Piotr Skalski
e45c11c4c3
⚙️ more compact inference API - single class to load, process and infer (#16)
* ⚙️ more compact inference API - single class to load, process and infer
* 👊 bump Supervision version to `0.4.0`
2023-04-06 15:02:17 +08:00
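The "single class to load, process and infer" design mentioned in #16 can be illustrated with a minimal, self-contained sketch. The `Model` class and its methods below are a hypothetical toy showing the pattern, not GroundingDINO's actual API:

```python
class Model:
    """Toy illustration of a compact inference API: loading happens at
    construction, and preprocessing plus inference sit behind one call."""

    def __init__(self, config):
        # stand-in for loading weights from a checkpoint path
        self.threshold = config.get("threshold", 0.5)

    def _preprocess(self, raw):
        # stand-in for image transforms (resize, normalize, ...)
        return [x / 100.0 for x in raw]

    def predict(self, raw):
        # preprocess and infer in a single user-facing call
        scores = self._preprocess(raw)
        return [s for s in scores if s >= self.threshold]

model = Model({"threshold": 0.5})
kept = model.predict([10, 60, 90])
```

Collapsing load/process/infer into one class means callers no longer juggle separate transform and forward functions.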
SlongLiu
3023d1a26f
fix bugs for CPU mode
2023-03-28 16:30:45 +08:00
Piotr Skalski
c974f60d73
Test fix for #11 (#12)
2023-03-28 09:39:44 +08:00
Piotr Skalski
2309f9f468
feature/first_batch_of_model_usability_upgrades (#9)
* initial commit
* test updated requirements.txt
* move more code to inference utils
* PIL import fix
* add annotations utilities
* README.md updates
2023-03-24 10:07:02 +08:00