Google says its AI will tag men and women in photos as "person" to avoid unfair bias

Google has announced that its AI will no longer label the people it recognizes in pictures and logos as male or female. Instead, the tool will apply the tag "person" whenever a human being appears in an image.

According to Business Insider, Google emailed Cloud Vision API customers about the change. In the email, the company gave two reasons for tagging all genders as "person": an AI system cannot judge someone's gender from their appearance or clothing, and attempting to do so would conflict with Google's AI principles, which prohibit creating or reinforcing unfair bias.
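To illustrate what such a policy change means for developers consuming label results, here is a minimal sketch in Python. This is a hypothetical post-processing function, not Google's actual implementation or the real Vision API response format; the label names and the `neutralize_labels` helper are assumptions for illustration only.

```python
# Hypothetical illustration of collapsing gendered tags into "Person".
# This is NOT Google's implementation; label names are assumed for the example.
GENDERED_LABELS = {"man", "woman", "male", "female"}

def neutralize_labels(labels):
    """Replace gendered tags with the neutral 'Person' tag,
    dropping duplicates while preserving the order of other labels."""
    out = []
    for label in labels:
        tag = "Person" if label.lower() in GENDERED_LABELS else label
        if tag not in out:
            out.append(tag)
    return out

print(neutralize_labels(["Woman", "Smile", "Man", "Outdoor"]))
# → ['Person', 'Smile', 'Outdoor']
```

Under this policy, an image that previously returned both "Woman" and "Man" labels would yield a single "Person" tag alongside the non-gendered labels.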

We checked the change ourselves and confirmed that the male and female tags are indeed gone (see the attached image below for reference).

The company asked developers to comment on the change. The majority of them called it a "positive development," but one complained, asking why the AI should not label gender when he himself could identify a person's gender correctly 99 percent of the time.

Bias created by AI is currently a major topic of discussion. Developers have complained that AI systems misidentify people based on their skin color, and that Google's Vision API has misidentified trans people and made assumptions about a person's sexuality. There have also been discussions about some image sets tagging far more women than men, or vice versa. In one well-known case, a biased dataset of cooking images led algorithms to predict that the person cooking was a woman, even when the picture showed a man, in roughly 68 percent of cases.

The change was welcomed by Frederike Kaltheuner, a tech policy fellow at Mozilla, who called it very positive, while others saw it as an exemplary step for the rest of the industry to follow.




Read next: Artificial Intelligence: Good Versus Evil (infographic)