Hacker News

janalsncm | last Wednesday at 5:41 PM | 10 replies

I noticed that this model is multilingual and understands 14 languages. For many use cases, we probably only need a single language, and the other 13 simply add latency. I believe there will be a trend in the coming years of trimming the fat off of these jack-of-all-trades models.

https://aclanthology.org/2025.findings-acl.87/
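For a concrete sense of what trimming could look like, below is a minimal sketch of one common approach, vocabulary pruning, where embedding rows for tokens a single target language never uses are dropped. The vocabulary sizes and token counts are made up for illustration, and this is not necessarily the method in the linked paper.

    # Sketch: shrink a multilingual embedding table to one language's tokens.
    # All sizes here are hypothetical; the point is the parameter reduction.
    import torch
    import torch.nn as nn

    multilingual_vocab = 250_000   # assumed vocab size for a 14-language tokenizer
    hidden_dim = 1024

    full_embedding = nn.Embedding(multilingual_vocab, hidden_dim)

    # Pretend we've counted which token ids actually occur in an English-only
    # corpus; the 50k figure is hypothetical.
    kept_token_ids = torch.arange(50_000)

    trimmed_embedding = nn.Embedding(kept_token_ids.numel(), hidden_dim)
    trimmed_embedding.weight.data.copy_(full_embedding.weight.data[kept_token_ids])

    count = lambda m: sum(p.numel() for p in m.parameters())
    print(f"full embedding:    {count(full_embedding):,} parameters")
    print(f"trimmed embedding: {count(trimmed_embedding):,} parameters")
    # The tokenizer would also need its ids remapped to 0..49_999 (omitted here),
    # and only the embedding/output layers shrink; the transformer body is untouched.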


Replies

m463 | last Wednesday at 9:15 PM

I don't know. What about words inherited from other languages? I think a cross-language model could improve lots of things.

For example: "here it is, voila!" or "turn left on El Camino Real".

decide1000 | last Wednesday at 5:51 PM

I think this model proves a multilingual model can still be very efficient and accurate.

black_puppydog | yesterday at 12:42 AM

Honestly, the inability to correctly transcribe the four-language mix I use in my everyday life is one of the major blockers for adopting ASR tech in my own tooling. This is coming from someone who literally works in that field.

Turns out, outside the US, many people speak more than one language. :)

Edit: I should say it *was* a major blocker, because the latest iterations of open-weight models actually work better and better. It's often the UX that isn't designed with these use cases in mind.

popalchemist | last Wednesday at 6:27 PM

It doesn't make sense to have a language-restricted transcription model, because of code switching. People aren't machines; we don't stick to our native language without fail. Even monolingual people move in and out of their native language when using "borrowed" words/phrases. A single-language model will often fail to deal with that.

depr | last Wednesday at 8:11 PM

STT services that have been around longer, like Azure, Google, and Amazon, generally require you to request a specific language, and their quality is a lot higher than that of models that advertise themselves as LLMs (even though I believe the cloud providers are also using the same types of models now).
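As a rough sketch of what "requesting a specific language" looks like with Google's Cloud Speech-to-Text Python client (the bucket URI below is a placeholder):

    # Minimal recognition request: the service expects a primary language up front.
    from google.cloud import speech

    client = speech.SpeechClient()

    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",   # required primary language for this request
    )
    audio = speech.RecognitionAudio(uri="gs://my-bucket/sample.wav")  # placeholder

    response = client.recognize(config=config, audio=audio)
    for result in response.results:
        print(result.alternatives[0].transcript)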

idiotsecant | last Wednesday at 9:14 PM

The hilarious part of this comment is all the comments around it complaining that the model doesn't support enough languages.

raincole | last Wednesday at 8:11 PM

Imagine if ChatGPT had started like this and they had decided to trim coding abilities from their language model because most people don't code.

littlestymaar | yesterday at 6:10 AM

A single-language model wouldn't make any sense except for English: there's simply too much English intertwined with every other language nowadays (corporate jargon, brands, tech, etc.).

ryan_lane | yesterday at 3:42 AM

"I only speak one language, so models I use should only understand one".

keeganpoppen | last Wednesday at 6:51 PM

Uhhh, I cast doubt on multi-language support affecting latency. Model size, maybe, but what is the mechanism for making latency worse? I think of model latency as O(log(model size))… but I'm open to being wrong / that being a not-good mental model / educated guess.
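One way to probe that question empirically is a toy benchmark like the sketch below, which times forward passes through dense stacks of different widths. The layer sizes are arbitrary and the stack is only a stand-in for a real ASR model, not a claim about how any particular model behaves.

    # Toy check of how latency scales with model width; purely illustrative.
    import time
    import torch
    import torch.nn as nn

    def make_stack(width, depth=12):
        # stand-in for "a model of a given size": a stack of dense layers
        return nn.Sequential(*[nn.Linear(width, width) for _ in range(depth)]).eval()

    def avg_latency(model, x, iters=50):
        with torch.no_grad():
            model(x)  # warm-up
            start = time.perf_counter()
            for _ in range(iters):
                model(x)
            return (time.perf_counter() - start) / iters

    for width in (512, 1024, 2048):
        model = make_stack(width)
        x = torch.randn(1, width)
        print(f"width {width}: {avg_latency(model, x) * 1e3:.2f} ms per forward pass")
    # On a CPU this usually looks much closer to linear-in-FLOPs than logarithmic,
    # though real ASR latency also depends on memory bandwidth, batching, and I/O.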
