Facets: Issue 44

October 14, 2017

Dear AIGA, Goodbye.

Timothy Bardlavens
For example, the Gender Equity Toolkit — beautifully designed, well thought out and highly ineffective. Who does [it] help outside of the AIGA member who read it once?

In a letter to AIGA, Bardlavens describes why he’s leaving the organisation: it can no longer represent him as a UX designer, as a Black man, or as a person who identifies with its message. AIGA has betrayed its mission, he argues, favouring appearances over hard work, and has lost him as a result.

Writing Well about Terrible People

Erin Kissane
their intentions don’t matter here, any more than they matter when the Times...offers similarly context-free coverage of Donald Trump

When controversial articles are published, the Internet’s backlash is withering, impulsive, and usually under 140 characters. After encountering such an article, Erin Kissane paused before responding. Her piece is an exercise in stepping back to understand the writers’ approach, identifying the exact decisions and choices that led to the piece and its fundamental failure to explain.

Not Tryna Kill Your Buzz(words), But What Does Your Culture Code Really Mean?

Rasika Rajagopalan
if your company openly communicates that it values work/ life balance, but [promotes those working] 70-hour weeks, your employees will quickly catch wind of the hypocrisy.

We can write as many words as we like to describe what we want our companies to be, but how do those words manifest? A company’s culture is ultimately the people it enables and the behaviours it promotes. Culture codes should be declared, Rajagopalan reminds us, but we cannot forget that they need to be performed, too.

How Meetup Counters Algorithmic Sexism

Jessica McKenzie
For example, if you join a women-only tech group, we have chosen a model structure which won’t recommend you topics that other women like, but rather ones that are related to the technical content.

Meetup learnt that when they let their recommendation algorithm auto-optimise, it suggested tech groups more often to men than to women. This piece details the difficulty of designing and refining an algorithm that recommends groups a user is interested in, without relying on a historically key recommendation signal: gender.
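To make the idea concrete, here is a minimal, entirely hypothetical sketch (not Meetup’s actual code; the group names, topic tags, and function names are invented) of recommending by topic overlap alone, so that member demographics never enter the ranking:

```python
# Hypothetical sketch: rank candidate groups purely by topic similarity
# to groups the user already joined -- no gender or demographic features.

def jaccard(a, b):
    """Overlap between two topic-tag sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(joined, candidates, top_n=2):
    """Return up to top_n candidate group names, ranked by how much
    their topics overlap with the topics of the user's joined groups."""
    user_topics = set().union(*(g["topics"] for g in joined))
    scored = [(jaccard(user_topics, g["topics"]), g["name"]) for g in candidates]
    return [name for score, name in sorted(scored, reverse=True)[:top_n] if score > 0]

joined = [{"name": "Women Who Code NYC", "topics": {"python", "web-dev"}}]
candidates = [
    {"name": "NYC Python", "topics": {"python", "data"}},
    {"name": "Knitting Circle", "topics": {"crafts"}},
    {"name": "Frontend Devs", "topics": {"web-dev", "javascript"}},
]
print(recommend(joined, candidates))  # topic-related groups only
```

Because the score is computed from topic tags alone, a member of a women-only tech group is recommended other technical groups, not whatever groups other women happen to join.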

And last but not least...

Unsurprisingly, the words we use to describe what we value matter, and they affect representation in a given field.

This Twitter thread links a study which found that male professors were more often described as "brilliant" or "genius" than their female counterparts. Furthermore, the more these words were used in a given field, the greater its gender and race gaps.

Enjoyed this issue? Tweet about it!