Even when websites don't explicitly ask users for their gender, they
often use predictive models to gender users based on data the sites collect.
Twitter, for example, doesn't ask for a user's gender at registration
but nevertheless assigns each user a binary gender. Users can view this
inferred gender by navigating to Settings and privacy
and clicking 'Your Twitter data'. While it is unclear what kinds of
models or features Twitter uses to determine a user's gender, it is
apparent that Twitter's gendering algorithm is biased.
The aim of this web app is to call out the implicit bias in gendering algorithms by gendering users based on information collected from their Twitter accounts. Along the way, it collects data on how often users are misgendered and how misgendering correlates with race and gender identity.
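To make the problem concrete, here is a minimal, purely hypothetical sketch of the kind of heuristic such an algorithm might rely on: looking up a user's first name in a gendered-name table. The name list, labels, and `guess_gender` function are invented for illustration; Twitter's actual model and features are not public. The sketch shows why this style of inference is brittle, since any name outside the table, or any name that doesn't match the user's identity, produces a wrong or missing label.

```python
# Toy illustration of a name-based gendering heuristic.
# The name table and labels below are invented for demonstration only;
# they do not reflect any real platform's model.
GENDERED_NAMES = {
    "alice": "female",
    "maria": "female",
    "james": "male",
    "david": "male",
}

def guess_gender(display_name: str) -> str:
    """Assign a gender label from a first name -- the sort of brittle
    inference that misgenders anyone whose name falls outside the
    table or doesn't match their identity."""
    first = display_name.strip().split()[0].lower()
    return GENDERED_NAMES.get(first, "unknown")

print(guess_gender("Alice Chen"))   # table hit: "female"
print(guess_gender("Jordan Lee"))   # name not in table: "unknown"
```

Even this toy version makes the bias visible: names common in the table's source population get confident labels, while everyone else is either left unlabeled or, in a real system that forces a binary choice, misgendered outright.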
What would a gendering algorithm assign to you?