
How can the backoff weights of a language model be positive?

A. Here's how positive numbers can appear in place of backoff weights in the LM: the numbers you see in an ARPA-format LM (used in the CMU tools) are not probabilities. They are base-10 logarithms, so the file stores log10(probability) and log10(backoff weight). Backoff weights are NOT probabilities; they are normalization constants, and since such a constant can be greater than 1, its log10 can be positive. Consider a 4-word vocabulary A B C D.
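
To make that concrete, here is a minimal Python sketch using made-up counts for a hypothetical 4-word vocabulary A B C D (the numbers are illustrative, not from any real model). It computes a Katz-style bigram backoff weight and shows it can come out greater than 1, so the log10 value written to the ARPA file is positive.

```python
import math

# Hypothetical toy example: unigram probabilities for vocabulary A B C D
# (they sum to 1). All numbers here are made up for illustration.
unigram = {"A": 0.4, "B": 0.4, "C": 0.1, "D": 0.1}

# Suppose the only bigram observed after history "A" is "A B", and after
# discounting its conditional probability is 0.3.
seen_after_A = {"B": 0.3}

# Katz backoff weight for history h:
#   alpha(h) = (1 - sum of discounted P*(w|h) over seen w)
#              / (1 - sum of unigram P(w) over the same seen w)
left_over_bigram_mass = 1.0 - sum(seen_after_A.values())              # 0.7
left_over_unigram_mass = 1.0 - sum(unigram[w] for w in seen_after_A)  # 0.6
alpha_A = left_over_bigram_mass / left_over_unigram_mass              # ~1.167

print(f"backoff weight alpha(A) = {alpha_A:.4f}")              # greater than 1
print(f"log10 alpha(A)          = {math.log10(alpha_A):.4f}")  # positive

# The ARPA file stores log10(alpha_A), which is positive here even though
# every probability in the model is still <= 1.
```

So a positive backoff entry simply means the backoff weight itself is greater than 1; none of the probabilities in the model exceed 1.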
