Assigning gender to these AI personalities may say something about the roles we expect them to play.
Virtual assistants like Siri, Cortana, and Alexa perform functions historically given to women.
"And it becomes a loop where, if you're not conscious, you just think this is inevitable and this is the way it is. It creates this illusion that this is the way it is, how it has been, and how it shall be." Flight attendants and travel agents are also roles that traditionally skew female, so perhaps it's unsurprising that Alaska Airlines and United Airlines chose lady bots "Jenn" and "Alex" to assist their passengers. The trend may seem harmless, but we should be careful about the message it sends if we want to prevent AI from becoming the latest chapter in a history of objectifying women. Women are already subject to volumes of damaging, implicit messaging.

Lawbot leads users through a series of questions to arrive at an assessment of their legal situation. Its developers say it draws on the insights of Cambridge-educated lawyers to formulate accurate questions derived from the relevant statutory law. The chatbot currently covers 26 major criminal offenses in England and Wales, and the developers plan to add more. Among the topics it covers are sex offenses; property offenses such as theft, burglary, and break-ins; and offenses against the person such as injuries, assaults, and psychological harm. The developers say that Lawbot's answers are randomized, so no two conversations are the same.