Back in January, a research scientist at Google discovered the problem while drafting an email with the Smart Compose AI. After he typed a sentence mentioning an investor, Smart Compose suggested a follow-up question that assumed the investor was male. “Not all ‘screw ups’ are equal,” said Gmail product manager Paul Lambert. Engineers and designers tried a number of ways to eliminate the bias from the program, but the flaw is inherent to how the system learns to generate language: machines rely on humans (for now) to teach them, so problematic patterns in human writing are simply passed on. Ultimately, Lambert and the team chose to effectively ban gendered pronouns from the suggestions. “The only reliable technique we have is to be conservative,” said Prabhakar Raghavan, who oversaw Gmail engineering. Users can, of course, still type whichever pronouns they want, but under the new policy, Smart Compose (and Google’s Smart Reply) no longer offers its two cents.
Reuters reports that Smart Compose will soon be available to French, Italian, Spanish, and Portuguese speakers. Google and other companies that build predictive-text programs say they are always working to improve them, but that takes time. Mistakes often provoke public backlash, which erodes consumer trust and hits corporations where it really hurts: their wallets. Avoiding those mistakes keeps users and investors happy, so while it means more work for engineers, it’s well worth it in the long run.