Hi all,
I am working on a bot project where we use language detection to determine which language the user wants to converse in (French or English). The user can switch at any moment, so we need to be confident about when a switch should occur.
I am noticing that alternative languages are always detected with a score of 1. This is really throwing off our conversations, because we can't accurately determine how confident we should be that the user wants to switch languages.
See the following Postman screenshot.
"Complete a masterline" - this is very clearly English, but it comes back with alternates for French and 'ro' (I'm not even sure what language that is). But at a score of 1? That seems unrealistic to me. Am I doing something wrong? Can I train this to be more accurate?