Hacker News

deivid · yesterday at 10:27 PM

How big are the models that you use/built? Can't you run them in the browser?

Asking because I built a translator app[0] for Android using marian-nmt (via bergamot) with Mozilla's models, and the on-device inference performance is very good.

[0]: https://github.com/DavidVentura/firefox-translator
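
To give a sense of what running one of these locally looks like, here's a minimal sketch in Python. It uses a public Helsinki-NLP OPUS-MT checkpoint (same Marian architecture) through the transformers library as a stand-in; the app itself goes through the bergamot runtime instead, and the checkpoint name below is just an example:

    # Minimal local inference with a Marian-architecture NMT model.
    # Stand-in for the quantized Mozilla/bergamot models the app ships;
    # the checkpoint name is only an example.
    from transformers import MarianMTModel, MarianTokenizer

    model_name = "Helsinki-NLP/opus-mt-en-fr"
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)

    batch = tokenizer(["On-device inference performance is very good."],
                      return_tensors="pt")
    generated = model.generate(**batch)
    print(tokenizer.batch_decode(generated, skip_special_tokens=True))

These per-pair models are small encoder-decoders, which is why quantized builds of them run comfortably on a phone.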


Replies

omneity · yesterday at 10:38 PM

Thanks for the tip and cool project! The model I trained is relatively large, as it's a single model that supports all language pairs (to leverage transfer learning).
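
For illustration, a single multilingual model works roughly like the sketch below; it uses the openly available facebook/m2m100_418M checkpoint as a stand-in, since my own model isn't openly distributed and this is not its actual API:

    # Sketch: one multilingual model serving many language pairs.
    # Uses the public facebook/m2m100_418M checkpoint as a stand-in;
    # the language pair is chosen per request, not per model.
    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
    model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

    tokenizer.src_lang = "en"
    encoded = tokenizer("Hello, how are you?", return_tensors="pt")
    # Force the decoder to start with the French language token.
    generated = model.generate(**encoded,
                               forced_bos_token_id=tokenizer.get_lang_id("fr"))
    print(tokenizer.batch_decode(generated, skip_special_tokens=True))

One set of weights handles every direction, which is also where the transfer learning between related languages comes from.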

With that said, while running it client-side is indeed an option, openly distributing the model is not something I would like to do, at least at this stage. Unlike the bigger projects in the NMT space, including Marian and Bergamot, I don't have any funding, and my monetization plan is to offer inference via an API[0].

[0]: https://api.sawalni.com/docs
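
For the curious, a client call would look something like this; the endpoint path, parameters, and response shape below are made up for illustration, so check the docs linked above for the real interface:

    # Hypothetical client call; the endpoint path, parameters, and
    # response shape are illustrative only, see [0] for the actual API.
    import requests

    resp = requests.post(
        "https://api.sawalni.com/v1/translate",  # hypothetical endpoint
        headers={"Authorization": "Bearer <API_KEY>"},
        json={"text": "Hello", "source": "en", "target": "fr"},  # hypothetical payload
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())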
