Friday, January 25, 2013

The Problem with "Mansplaining"

The "mansplaining" meme, which captures the phenomenon of men being patronizing to women in conversation, is trending on the internet. There is a Tumblr here; a Hillary Clinton meme there. While it is good that women are speaking out about sexism, there are some issues with using the term "mansplaining." Not only does it distract from the real problems, it also perpetuates behaviors that prevent progress. Allow me to mansplain.

The term "mansplaining" conflates two issues: that men tend to explain things in a manner that may be interpreted as patronizing and that people explain things to women with the assumption that they are ignorant. The first can largely be explained by conversational differences between men and women: in Talking from 9 to 5, Georgetown sociolinguist Deborah Tannen discusses how in conversations, (American) men tend to seek a dominant position, while women tend to prevent the other person from taking a subordinate position. Men also often use knowledge (for instance, statistics about sports teams) to establish this dominance. It is no wonder, then, that conversations between men and women often end up having the man explaining things in a condescending manner to women. Rather than mocking men, it is more productive to create awareness of these conversational differences. Men can use this awareness to be more careful when talking to women (or anyone else from a culture where establishing dominance is not the norm) and women can use this awareness to perceive the situation differently (for instance, be less offended or intimidated) when talking to men.

Because it brings in stereotypes about men vs. women, the term "mansplaining" distracts from the real issue: that people are often condescending to women. In Why So Slow?, psychologist Virginia Valian explains that this is often because of statistical bias. Using heuristics to make snap judgments is how we are able to deal effectively with overwhelming amounts of input. Using these same heuristics is often why people (men and women) assume women are less capable. Especially in technical fields, there are fewer women with deep expertise, so for any specific woman, the base rates suggest she is more likely to be a non-expert; the "reasonable" snap judgment is therefore to assume she knows less than she does. Fortunately, we can second-guess our snap judgments. Awareness can again mitigate this problem: if we know we are likely to make a snap judgment that is wrong, we can be conscious of these situations and adjust our judgments accordingly. In overcoming snap judgments, it also helps to focus on specific attributes of a person instead of whether they are a man or a woman.

The term "mansplaining" perpetuates this gender-based statistical bias by focusing attention on the male gender rather than the behavior. The fact that this term exists makes it more likely that people will assume someone is "mansplaining" simply because he is a man. This is not productive for men who may be trying to overcome "mansplaining" tendencies. It increases the chances that the other person will dismiss what the "mansplainer" is saying outright, giving him less opportunity to practice having conversations with people who may not be accustomed to his dominance-seeking style of speaking. Also, the term "mansplaining" distracts from the fact that women can be sexist and condescending as well.

The "mansplaining" meme has been useful for raising awareness both about the tendency of men to come off as patronizing and the tendency of people to be condescending to women. But it distracts from the real problems and perpetuates gender-based bias. To work towards a solution for productive cross-gender conversation, we should focus on specific behaviors rather than gender stereotypes and on listening rather than on pointing fingers.

Saturday, January 19, 2013

Cooking for One

You there. Eating takeout at your kitchen counter, pots and pans clean and neatly stowed. Third time this week. Your taste buds are getting a little bit bored and your stomach a little bit skeptical of that recycled deep-fry oil. Why don't you cook?

What's that? You are waiting for your roommate or partner. Or you are sad because you have no such roommate or partner. Gastronomic pleasure need not be shared, you know. You can cook for one.

I know what it is like. Losing a cooking partner--especially one to whom I played a supporting role in the kitchen--was one of the more devastating aspects of past breakups. With whom was I going to invent dishes like ancho chile risotto? How was I going to have more than one home-cooked dish per meal? Who was going to test the food to make sure we were not going to die?

And I have suffered for years. I have spent more time than I would like to admit calling Tofurkey sausage, noodles, and a sauteed vegetable "dinner." I have spent more money than I would like to admit eating out because I got bored of this "dinner." "I don't cook," I would say. "Who has time for that?" Then, one day while aimlessly clicking through dating profiles of men in Israel*, I had a revelation. I could start cooking for myself. Not just stir-fries, but soups and casseroles and other things involving more than four ingredients.

Brilliant, I thought. There are so many obvious advantages to cooking for one. You can cook whatever you want. You can cook whenever you want. You can listen to whatever music you want while cooking. You don't have to worry about getting stabbed if you turn around too quickly. You don't need two people to chop vegetables! Recipes still work when you are on your own! Also, Google is surprisingly helpful for mitigating concerns about imminent death. (Just today, I searched "Can you combine raw tomatoes with raw honey?" I am that paranoid.)

There are some basic lessons in cooking for one. A key insight, my roommate says, is to pretend you are cooking for two and have leftovers. As cooking for two is the same amount of effort, this is an easy way to increase variety across meals. Cooking larger portions also helps you avoid the awkward situation where you are only using one quarter of an onion at a time. There are also other, smaller tricks: microwaves and ovens are great for keeping things warm for when you have to cook in sequence. Having small tupperware containers is good for both smaller leftovers and small left-over ingredients.

There are also some fun challenges: how to get variety; how to use ingredients before they expire. With no one to watch, you can get creative. Have basil instead of mint? Basil tabbouleh! Don't want to buy cream just for a soup? Use yogurt instead. Want to cook pears with chiles? Go for it. Sometimes you fail--as I did today with a tomato/basil/honey dessert--but hey, no one is around to see. And who knows, you might create something amazing. My favorite creation is a cold spinach dish based on the Japanese ohitashi appetizer that uses soy and chile sauces instead of tahini. I was once told that if I invent four other dishes this good, I could start a campus restaurant.

The leftover problem also yields a nice puzzle. Most recipes seem to be for at least four people, as are the default grocery store sizes of ingredients like celery. Left unsolved, the problem is that you will have leftovers for lunch not just the next day, but the day after, and often the day after that. Choosing dishes that freeze well (for instance, soups) can help with spacing this out. You can also pretend your weeks are extended Top Chef episodes by reducing recipe portion sizes and finding different recipes for the same ingredient. Even still, you should not be surprised if there are weeks when you eat celery every day**.

But of course, cooking for one is not for every dish. It does not make the most sense, for instance, for dishes like risotto that are labor-intensive and do not keep well. These are what restaurants are for! Eating out for one is also fun--far more fun than romantic comedies would have you believe.

Go on, explore the world of solo cooking. But don't forget to come back out for a meal with a friend every now and then. Otherwise we would miss you.

* I have never been to Israel. I do not have plans to go to Israel. In denial of the true depths of my time-wasting problem, I classify this activity as "anthropological research."
** It is not clear why they sell celery in such large bunches when such small amounts are needed for soup. I am certain this is why ants on a log exists as a snack.

Sunday, January 13, 2013

Upgrading to Scala 2.10.0

Scala 2.10.0 came out recently. It has many nice features I haven't used yet, for instance type tags, and production versions of features I was already using, for instance type Dynamic, implicit conversions, and reflective calls. I upgraded my main project to it and found the upgrade to be relatively painless.

My upgrade involved the following updates:
  • Updating my SBT version. I had been using an older version of the SBT build tool (0.7.7), so I upgraded that to 0.12.1. (Any version higher than 0.11.0 should work.) This was the most painful part of all since SBT has changed the way its build files work. The new SBT build system is simple, though, so it was mostly a matter of figuring out which variables to set and how (see the build.sbt sketch after this list).
  • Updating library versions. The only dependency that affected me was that Scala 2.10.0 uses ScalaTest 1.9 instead of 1.8.
  • Updating compiler options. Type Dynamic, implicit conversions, and reflective calls are no longer experimental and now correspond to the flags "-language:dynamics", "-language:implicitConversions", and "-language:reflectiveCalls". I also added the "-feature" flag for a reason I can't remember. One other small change is that I had to define a selectDynamic function in addition to applyDynamic for types extending Dynamic (see the minimal Dynamic sketch after this list).
  • Cross-version packaging. I still compile a Scala 2.9.0-1 binary for my other project because it uses Scalatra 2.0.0--although 2.0.5-SNAPSHOT is supposedly compatible with Scala 2.10.0. SBT provides nice support for cross-version compilation: define your versions, make sure you declare the right dependencies based on the Scala version, and then just use "+ package" to produce binaries for all Scala versions. Here are my build.sbt lines corresponding to that:
     // Build for both Scala versions; "+ package" produces a binary for each.
     crossScalaVersions := Seq("2.9.0-1", "2.10.0")

     // Pick the ScalaTest version that matches the Scala version being built.
     libraryDependencies <+= scalaVersion(v => v match {
       case "2.9.0-1" => "org.scalatest" %% "scalatest" % "1.8" % "test"
       case "2.10.0"  => "org.scalatest" %% "scalatest" % "1.9" % "test"
     })

     // The 2.10.0 flags replace -Xexperimental with per-feature -language options.
     scalacOptions <++= scalaVersion map (v => v match {
       case "2.9.0-1" => Seq("-deprecation", "-unchecked", "-Xexperimental")
       case "2.10.0"  => Seq("-deprecation", "-unchecked", "-language:dynamics", "-language:implicitConversions", "-language:reflectiveCalls", "-feature")
     })
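
For reference, the rest of my build.sbt in the new settings-based format looks roughly like this (the name and organization below are placeholders, not my actual project):

    // In sbt 0.12, build.sbt is a list of setting expressions separated by blank lines.
    name := "my-project"  // placeholder

    organization := "com.example"  // placeholder

    version := "0.1.0-SNAPSHOT"

    scalaVersion := "2.10.0"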
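
And since I mentioned the applyDynamic/selectDynamic change: here is a minimal sketch of a type extending Dynamic (a made-up example, not code from my project) that defines both methods, which is what 2.10.0 asked of me:

    import scala.language.dynamics  // or compile with -language:dynamics

    // record.title compiles into record.selectDynamic("title");
    // record.title("x") would compile into record.applyDynamic("title")("x").
    class Record(fields: Map[String, Any]) extends Dynamic {
      def selectDynamic(name: String): Any = fields(name)
      def applyDynamic(name: String)(args: Any*): Any = fields(name)
    }

    object RecordExample extends App {
      val record = new Record(Map("title" -> "Upgrading to Scala 2.10.0"))
      println(record.title)  // prints: Upgrading to Scala 2.10.0
    }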
    

For your versioning needs, you may want to investigate this version investigator that Paul Phillips wrote.

And finally, a question for my Scala-using friends: I have been hesitant to upgrade to Scalatra 2.2.0. Anybody have a pointer to a summary of the concrete changes I need to make to port my code?

Saturday, January 12, 2013

How Science Really Works

The theme of the week is research, theory vs. practice.

This past week, the #overlyhonestmethods Twitter hashtag went viral, my friend Phil Guo published an MIT Technology Review piece on the surprising importance of "grunt work," and my undergraduate researchers learned that "research" involves a lot of hunting down compatible versions of software packages.

Indeed, life in research is not what I fantasized about as a six-year-old playing "professor" with my stuffed bear and raccoon, or even as an undergraduate student deciding what to do with the next years of my life.

How I thought graduate school would be. I arrive at MIT full of energy and wonder. Professors, noticing my unique capacity for innovative thinking, pull me aside and say, "Hey you, come work with me." I spend two weeks reading papers. After that, I produce ground-breaking ideas night and day. I discover productivity shortcuts that nobody knows about: I am able to eat less and sleep less, all while producing more. The brilliance of my ideas amazes the world. I spend most of my days explaining them in a cafe with a whiteboard. After I publish several seminal papers, universities call me offering me jobs, publishers call me offering me book deals, and journalists call me asking my opinion on all technological phenomena. I tell them phones are so passe: everyone uses SMS now. During all this, I acquire a brilliant and handsome collaborator. After we realize we are twice as productive together, we fall in love. Together, we eradicate the world's software bugs. Along the way, I also eliminate sexism, racism, homophobia, xenophobia, and dust allergies. Since there is no Nobel Prize for Computer Science, the Nobel Committee awards me the prize in Literature*. All before my sixth year of graduate school.

How it is. I arrive at MIT full of energy and wonder. Professors, betting on my capacity for hard work, pull me aside and say, "Hey you, come work with me." I spend two weeks reading papers. After that, I work night and day trying to produce a paper for a conference deadline. I discover some "productivity techniques": I eat less and sleep less and produce more work for two weeks. Then I burn out for two months. Despite this sacrifice, the world does not seem to care how brilliant my advisor thinks our ideas are. We have another paper deadline. My advisor gives me all feedback between midnight and the 7am deadline. Learning from this, I become nocturnal in preparation for future deadlines. I spend most of my days (well, nights) installing obscure software packages and decompiling byte code into a more complex representation of the byte code. Nobody calls or texts anymore. During all this, I learn that the two-body problem** is one of many reasons I should avoid making eye contact with male academics, lest we accidentally fall in love and into a mutually destructive future. Reality crushes my soul.

And then. I stop idealizing science. I realize that even though science is conducted by humans according to subjective standards, we can still make progress. I accept that doing research means doing hard and often unglamorous work. I learn to play the game and to sell my ideas. I also learn to combat sexism, racism, homophobia, xenophobia, and dust allergies in my own life. Most importantly, I learn to enjoy myself.

To be continued: I am still in my fifth year of graduate school.

Addendum: Readers have noted that this paints a somewhat dismal picture of Ph.D. life. I describe some more positive aspects of this experience in this other post, Reasons to Pursue a Ph.D.

* Philosopher Bertrand Russell has also received the Nobel Prize in Literature.
** The difficulty of negotiating geographically colocated academic positions.

Saturday, January 05, 2013

What is Computer Science?

Every now and then, a non-technical friend--or stranger at a party--will ask me, "What is computer science?" As your typical antisocial computer scientist, I am often too tired to do justice to this question. From now on, I will carry around QR codes pointing to this post.

Computer science is the study of what machines can do for us. Among computer scientists there are theorists, the applied logicians and mathematicians who study what it is even possible for a machine to compute. There are systems scientists who study how to get the power we want out of our machines. Then there are artificial intelligence researchers who help computers make unsupervised decisions--they are the ones responsible for the robots and the cool demos. Computer science researchers take the risk out of building systems; software engineers build the things we use.

At the core of computer science is a pile of logical foundations. People like to study how hard problems are: whether you can compute solutions in a number of steps ("time") that is a polynomial function of the size of the input, an exponential function, or something even worse. People talk about different classes of problems: for instance, P, the class of problems with polynomial-time algorithms, and NP, a class of problems whose solutions can be checked in polynomial time but for which no polynomial-time algorithms are known. It is not known whether P and NP are the same, but it would change everything if we discovered they were: the encryption behind our banking software assumes, in effect, that P != NP (P not equal to NP). (A couple of years ago there was big news when somebody claimed to have proven P = NP. Fortunately, it turned out to be a false alarm.) People across subfields talk about what is decidable: whether it is possible to write a computational routine that is guaranteed to terminate with an answer. This allows us to guarantee (haha!) that the software powering your space shuttle will not spontaneously freeze from getting stuck in an infinite loop.
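
If you happen to program, here is a tiny Scala sketch (a made-up illustration, not a formal definition) of why that gap matters: checking a proposed answer to a "subset sum" puzzle is quick, but finding one by brute force means trying every subset.

    object SubsetSum {
      // Checking a proposed answer is fast: just add up the chosen numbers.
      def verify(chosen: Seq[Int], target: Int): Boolean =
        chosen.sum == target

      // Finding an answer by brute force tries every subset: with n numbers,
      // that is 2^n possibilities, which blows up very quickly.
      def search(numbers: Seq[Int], target: Int): Option[Seq[Int]] =
        (0 to numbers.length).iterator
          .flatMap(k => numbers.combinations(k))
          .find(subset => verify(subset, target))
    }

Verifying stays cheap no matter how many numbers there are; it is the searching that gets out of hand.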

Then there are the machines themselves. Incredible things. The invention of the transistor made high (1) and low (0) signals, called bits, stable enough to build everything else on top of. Computer hardware runs on bits combined into instructions, plus a set of logic gates (for instance: "and," "or," "not") for decoding those instructions. An instruction encodes a unit of computation, for instance "add" or "store in memory," along with operands, either as constants or as locations in memory. Memory just contains more of these bits (just 1's and 0's!). All a computer is doing, really, is reading instructions from code memory and reading from and writing to data memory. There are also details like interacting with a keyboard and a screen. One of the key ideas in systems is abstraction: building higher-level, simpler systems on top of existing ones to simplify reasoning. Programming languages shield the programmer from having to reason about bits, and operating systems shield the programmer from details like interacting with the keyboard and deciding in what order application processes should run. On top of all this we have the mobile app that lets you read this post. Underneath, it's just bits.
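
For the programmer friends still reading, here is a toy sketch in Scala (entirely made up, and vastly simpler than real hardware) of that loop of reading instructions and reading and writing memory:

    // Memory is an array of numbers; a program is a list of instructions
    // that read and write that memory. Real hardware encodes all of this as bits.
    sealed trait Instruction
    case class Load(value: Int, dest: Int) extends Instruction            // put a constant into a memory cell
    case class Add(src1: Int, src2: Int, dest: Int) extends Instruction   // add two cells, store the result
    case class Print(src: Int) extends Instruction                        // "interact with the screen"

    object ToyMachine {
      def run(program: Seq[Instruction]): Unit = {
        val memory = Array.fill(8)(0)
        // All the machine does: read the next instruction, then read or write memory.
        for (instruction <- program) instruction match {
          case Load(value, dest)     => memory(dest) = value
          case Add(src1, src2, dest) => memory(dest) = memory(src1) + memory(src2)
          case Print(src)            => println(memory(src))
        }
      }

      def main(args: Array[String]): Unit =
        run(Seq(Load(2, 0), Load(3, 1), Add(0, 1, 2), Print(2)))  // prints 5
    }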

And there is also the sexy artificial intelligence stuff. Back in the day, people thought they might create machine intelligence by modeling how the mind works. Now most of the routines powering robots (and spam filters and everything else) are based on statistical heuristics: Bayesian inference and fancier stuff. People who work on these kinds of things spend a lot of time thinking about how to get more accuracy out of their algorithms (for detecting faces, for predicting whether a message is spam, for planning the route a robot will take) and/or how to make these algorithms run faster. It is pretty amazing that all of the obscure stuff you might have learned in probability theory is the reason that your Facebook friend feed works.
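
To make the spam filter example concrete (again for the programmers in the audience), here is a deliberately tiny Scala sketch; the word counts are made up, and a real filter would learn them from thousands of messages:

    object ToySpamFilter {
      // Made-up training data: how often each word appeared in spam and in ham (non-spam).
      val spamCounts = Map("winner" -> 50, "free" -> 80, "meeting" -> 5)
      val hamCounts  = Map("winner" -> 2,  "free" -> 20, "meeting" -> 60)
      val totalSpam  = spamCounts.values.sum
      val totalHam   = hamCounts.values.sum

      // Score a message by summing log-probabilities of its words under each class,
      // with add-one smoothing so unseen words do not zero everything out.
      // Positive score: more spam-like; negative: more ham-like.
      def spamScore(message: String): Double = {
        val words = message.toLowerCase.split("\\s+").toSeq
        def logProb(counts: Map[String, Int], total: Int) =
          words.map(w => math.log((counts.getOrElse(w, 0) + 1.0) / (total + 1.0))).sum
        logProb(spamCounts, totalSpam) - logProb(hamCounts, totalHam)
      }

      def main(args: Array[String]): Unit = {
        println(spamScore("free winner"))       // positive: looks like spam
        println(spamScore("meeting tomorrow"))  // negative: looks like ham
      }
    }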

As for me, I focus on programming languages. Programming languages protect people from having to reason about bits, or machine-level code, or sometimes even complex details of execution. Programming languages researchers range from logicians who try to make programs more correct to hackers who try to make people's lives easier. The specifics of this will be the subject of another blog post, as we seem to be reaching the limits of your attention span.

Oh, you were just trying to make polite conversation? Sorry. Maybe you can pass this along to a friend.