Planning another Online Research, Offline lunch

I’m planning another Online Research, Offline lunch for researchers in the Washington DC area later this month. The specific date and location are TBA, but it will be toward the end of February near Metro Center.

These lunches are designed to welcome professionals and students involved in online research across a variety of disciplines, fields and sectors. Past attendees have had a wide array of interests and specialties, including usability and interface design, data science, natural language processing, social network analysis, social media monitoring, discourse analysis, netnography, digital humanities and library science.

The goal of this series is to provide an informal venue where a diverse set of researchers can talk with each other and gain a wider context for understanding their work. The lunches are a flexible way for researchers to meet, talk and learn. Although Washington DC is a great meeting place for specific areas of online research, there are few informal opportunities for interdisciplinary gatherings of professionals and academics.

Here is a form that can be used to add new people to the list. If you’re already on the list you do not need to sign up again. Please feel free to share the form with anyone else who may be interested:


Storytelling about the Past and Predicting the Future: On People, Computers and Research in 2014 and Beyond

My Grandma was a force to be reckoned with. My grandfather was a writer, and he described her driving down the street amidst symphonies. She was beautiful and stubborn, strong-willed and sharp. Once a young woman with the good looks of a model, she wore high heels and took daily trips to the gym well into her 90s. At the age of 94 she managed to run across her house, turn off the water, and stand with her hand on her hip in front of the shower (lest I waste water) before I returned from the next room with the shampoo I had forgotten.

My Grandma, looking amazing

A few years ago I visited her in Florida. She collected work for all of her visitors to do, and we were busy from the moment I arrived. To my surprise, many of the tasks she had gathered involved dealing with customer service and discovering the truth in advertisements. At one point she led me into the local pharmacy with a stack of papers and asked to see the manager. Once she found the manager she began to go through the papers one by one and ask about them. The first paper on the stack was about the Magic Jack. He showed her the package, and she questioned him in depth about how it worked. I was shocked. I’d never thought of a store manager in this role before.

After that trip I began to pay closer attention to the ways the people around me dealt with customer service, and I became a kind of customer service liaison for my family. My older family members expected any customer service agent to be both extensively knowledgeable and dependably respectful, but customer service has grown beyond that small, personable level to a point where a large network of people with structurally different areas of knowledge act together as a question-answering system. The amount and structure of the knowledge required has become the heart of the customer service problem, and people everywhere complain about the lack of knowledge, ability and pleasant attitude of the agents they encounter.

This is a problem with many layers and levels, and it reflects the developing data science industry well. In order to deliver good customer service, a great deal of information has to be organized and structured in a way that allows for optimal extraction. But this layer cannot be everything. The customer service interaction itself needs to be set up so that customers come away satisfied. People expect personalized, accurate interactions structured in a way that is intuitive to them. The customer service experience cannot be the domain of data scientists alone. If it is automated, it requires usability experts to develop and test systems that are intuitive and easy to use. If it is done by people, those people need access to the expertise their job requires and training in successful interpersonal interaction. I believe the whole system could be integrated under a single goal: to provide timely and direct answers to customer inquiries in three steps or fewer.

The past few years have brought a rapid increase in customization. We have learned to expect the information around us to be customized, curated and preprocessed. We expect customer service to know intuitively what our problems are and answer them with ease. We expect Facebook to know what we want to see and customize our streams appropriately. We expect news sites to be structured to reflect the way we use them. This increase in demand and expectations is the drive behind our hunger for data science, and it will fuel a boom in data and information science positions until we have a ubiquitous underlayer of organized information across all necessary domains.

But data and information science are new fields and not well understood. Our expectations as users exceed the abilities of this fast-evolving field. We attract pioneers who are willing to step into a field that is changing shape beneath their feet as they work. But we ask and expect too much of the result, because these pioneers can’t be everything across all fields. They are an important structural layer of our newly unfolding economy, but in each case another layer of people is needed in order to achieve the end result.

Usability is an important step above the data and information science layer. Through usability studies, Facebook will eventually learn that people and goals are not constant across all visits. Sometimes I look at Facebook simply to see if I’ve missed any big developments in the lives of my friends and loved ones. Sometimes I want to catch news. Sometimes I’m bored and looking for ridiculous stuff to entertain me. Sometimes I have my daughter next to me and want to show her funny pet pictures that I normally wouldn’t look twice at. Through usability studies, Facebook will eventually learn that users need some control over the information presented to them when they visit.

Through usability studies, newspapers will better understand the important practice of headline scanning and develop pay models that work with people’s reading habits. Through qualitative research, newspapers will understand their importance as the originators of news about big events with few witnesses, like peace treaties and celebrity births and deaths, and the real value of social media for events with large numbers of witnesses and points of view. News media sources are deep in a period of transition in which they are learning to better understand dissemination, virality, clicks, page views, reader behavior and reader expectations, and the strengths and weaknesses of social media news sources.

There have been many blog posts (like this one) about Isaac Asimov’s predictions for the future, because he was so right about so many things. At this point we’re at a unique vantage point where his notions of machine programmers and machine tenders are taking deeper shape. This year we will continue to see these changes form and reform around us.

An Analytical person at the Nutcracker (or Research Methodology, Nutcracker Style)

Last night we attended a Russian Ballet performance of the Nutcracker. It was a great performance, and fun was had by all.

Early in the performance I realized that although I have developed some understanding of the ballet, I hadn’t shared any of that knowledge with my kids. So I started whispering quick, helpful comments, such as “those are toys dancing” and “the kids have gone to sleep now, so this is just the adults dancing.” It wasn’t long before this dynamic began to change. I realized that their insights were much funnier than mine (“Wow, that guy should go on ‘So You Think You Can Dance!’ or ‘The Voice’ or something!”) and that my comments were starting to be pretty off-base. My comments evolved into a mash-up of “The kids have gone to sleep now,” “No, I guess the kids haven’t gone to sleep yet,” “I really can’t tell if the kids are still up or not!” and “Those are the sugarplum fairies,” “Wait, no, maybe these are the sugarplum fairies?” and “I don’t know, sweetie, just watch them dance!” By the end of the show I had no idea what was going on or why the Chuck E. Cheese king was dancing around on stage (although one of the girls suspected this particular king was actually a bear). The mom next to me told me she didn’t know what was going on either. “And,” she added, “I go to the Nutcracker every year! Maybe that was what made it a Russian Nutcracker?” And here I thought the Russian influences were the Matryoshka dolls and the Chinese dancers clothed in yellow (despite the awkward English conversation that the costumes prompted).

At the beginning of the show I was nervous about whispering with my kids, but I soon realized that there was a low hum throughout the concert hall of people whispering with their kids. This, I think, is what remix research methods should be all about: recording and interviewing many audience members to build a picture of the many perspectives in their interpretations of the show. Here is a challenge question for my readers who are hipper to qualitative research methods: what research strategy could best capture many different interpretations of the same event?

Earlier this week I spoke with a qualitative researcher about the value of an outsider perspective when approaching a qualitative research project. Here is a good example of this dynamic at play: people clapped at various parts of the performance. I recognized that people were clapping at the end of solo or duo performances (like jazz). If I were to describe these dances, I would use the claps as a natural demarcation, but I probably would not think to make any note of the clapping itself. However, the kids in my crew hadn’t encountered clapping during a show before and assumed that clapping marked “something awesome or special.” Being preteens, the kids wanted to prove that they could clap before everyone else, and then revel in the wave of clapping that they seemingly started. At one point this went awry, and the preteens were the only audience members clapping. This awkward moment may have annoyed some of the people around us, but it really made the little sister’s day! From a research perspective, these kids would be more likely to thoroughly document and describe the clapping than I would, which would make for a much more thorough report. Similarly, from a kids-going-to-a-show perspective, this was the first story they told their dad when they got home, and one that kicked off the rest of our report with uncontrollable laughter and tears.

As the show went on and appeared not to follow any of the plot points I had expected (I expected a progressive journey through worlds seen from the vantage of a sleigh, but instead saw all of the worlds dancing together, with some unrecognizable kids variously appearing on a sleigh and the main characters sometimes dancing in the mix or on their own), I began to search for other ways to make sense of the spectacle. I thought of a gymnast friend of mine and our dramatically different interpretations of gymnastics events (me: “Wow! Look what she did!” her: “Eh, she scratched the landing. There will be points off for that.”). Which parts of the dancing should I be focusing on? I told my little one, “Pay attention, so we can try these moves at home.” Barring any understanding of the technical competencies involved (though sure that holding your body at some of these amazing angles, spinning on one foot, or lifting another person into the air requires tons of training, skill and knowledge) or of the plot as it unfolded in front of me, I was left simply to marvel at it all. This is why research is an iterative process. In research, we may begin by marveling, but then we observe, note, and observe again. And who knows what insights we will have developed once the process has run its course enough times for events to start making sense!

To be a researcher is not to understand, but rather to have the potential to understand- if you do the research.

Great readings that might shake you to your academic core? I’m compiling a list

In the spirit of research readings that might shake you to your academic core, I’m compiling a list. Please reply to this thread with any suggestions you have to add. They can be anything from short blog posts (microblog?) to research articles to books. What’s on your ‘must read’ list?

Here are a couple of mine to kick us off:


Charles Goodwin’s Professional Vision paper

I don’t think I’ve referred to any paper as much as this one. It’s about the way our professional training shapes the way we see the things around us. Shortly after reading this paper I was in the gym, thinking about commonalities between weight stacks and survey scales: I expect myself to have a certain relative strength, and when that doesn’t correspond to where I need to set the pin, I’m a little thrown off.

It also has a deep analysis of the Rodney King verdict.


Revitalizing Chinatown Into a Heterotopia by Jia Lou

This article is based on a geosemiotic analysis of DC’s Chinatown. It is one of the articles that helped me to see that data really can come in all forms.


After method: Mess in Social Science Research by John Law

This is the book that inspired this list. It also inspired this blog post.


On Postapocalyptic Research Methods and Failures, Honesty and Progress in Research

I’m reading a book that I like to call “post-apocalyptic research methodology.” It’s ‘After Method: Mess in Social Science Research’ by John Law. At this point the book reads like a novel. I can’t quite imagine where he’ll take his premise, but I’m searching for clues and turning pages. In the meantime, I’ve been thinking quite a bit about failure, honesty, uncertainty and humility in research.

How is the current research environment like a utopian society?

The research process is often idealized in public spaces. Whether the goal is to publish a paper, present to an audience of colleagues or stakeholders, or market the product of the research, all researchers have a vested interest in the smoothness of the research process. We expect to approach a topic, perform a series of time-tested methods or develop innovative new methods with strong historical traditions, apply those methods as neatly as possible, and end up with a series of strong themes that describe the majority of our data. However, in Law’s words, “Parts of the world are caught in our ethnographies, our histories and our statistics. But other parts are not, and if they are then this is because they have been distorted into clarity.” (p. 2) We think of methods as a neutral middle step rather than a political process, and this way of thinking allows us to treat reliability and validity as surface measures rather than inherent questions. “Method, as we usually imagine it, is a system for offering more or less bankable guarantees.” (p. 9)

Law points out that research methods are, in practice, very limited in the social sciences: “talk of method still tends to summon up a relatively limited repertoire of responses.” (p. 3) He also points out that every research method is inherently political. Every method involves a way of seeing or looking at the data, and that perspective maps onto the findings it yields. Different perspectives yield different findings, whether subtly or dramatically different. Law’s central assertion is that methods don’t just describe social realities but also help to create them. Recognizing the footprint of our own methods is a step toward better understanding our data and results.

In practice, the results that we focus on are largely true. They describe a large portion of the data, ascribing the rest of the data to noise or natural variation. When more of our data is described in our results, we feel more confident about our data and our analysis.

Law argues that this smoothed version of reality is far enough from the natural world that it should prick up our ears. Research works to create a world that is simple, falls into place neatly, and resembles nothing we know: “‘research methods’ passed down to us after a century of social science tend to work on the assumption that the world is properly to be understood as a set of fairly specific, determinate, and more or less identifiable processes.” (p. 5) He suggests instead that we recognize the parts that don’t fit, the areas of uncertainty or chaos, and the areas where our methods fail. “While standard methods are often extremely good at what they do, they are badly adapted to the study of the ephemeral, the indefinite and the irregular.” (p. 4) “Regularities and standardizations are incredibly powerful tools, but they set limits.” (p. 6)

Is the Utopia starting to fall apart?

The current research environment is a bit different from that of the past. More people are able to publish research at any stage, without peer review, using media like blogs. Researchers can discuss their research while it is in progress on social media like Twitter. There is more room to fail publicly than there has ever been before, and this allows for public acknowledgment of some of the difficulties and challenges that researchers face.

Building from ashes

Law briefly introduces his vision on p. 11: “My hope is that we can learn to live in a way that is less dependent on the automatic. To live more in and through slow method, or vulnerable method, or quiet method. Multiple method. Modest method. Uncertain method. Diverse method.”

Many modern discussions about management treat failure as an innovative tool. Some of the newer quality control measures in aviation and medicine hinge on recognizing failure and on the retooling necessary to prevent or limit the recurrence of specific types of events. The theory behind these measures is that failure is normal and natural, and that we could never predict the many ways in which failure can happen. So, instead of exclusively trying to predict or prohibit failure, failures should be embraced as opportunities to learn.

Here we can ask: what can researchers learn from the failures of their methods?

The first lesson to accompany any failure is humility. Recognizing our mistakes entails recognizing areas where we fell short, where our efforts were not enough. Acknowledging that our research training cannot be universal, that applying research methods isn’t always straightforward and simple, and that we cannot be everything to everyone could be an important stage of professional development.

How could research methodology develop differently if it were to embrace the uncertain, the chaotic and the places where we fall short?

Another question: what opportunities do researchers have to be publicly humble? How can those spaces become places to learn and to innovate?

Note: This blog post is dedicated to Dr. Jeffrey Keefer @ NYU, who introduced me to this very cool book and has done some great work to bring researchers together.

Methodology will only get you so far

I’ve been working on a post about humility as an organizational strategy. This is not that post, but it is also about humility.

I like to think of myself as a research methodologist, because I’m more interested in research methods than any specific area of study. The versatility of methodology as a concentration is actually one of the biggest draws for me. I love that I’ve been able to study everything from fMRI subjects and brain surgery patients to physics majors and teachers, taxi drivers and internet activists. I’ve written a paper on Persepolis as an object of intercultural communication and a paper on natural language processing of survey responses, and I’m currently studying migration patterns and communication strategies.

But a little dose of humility is always a good thing.

Yesterday I hosted the second in a series of online research, offline lunches that I’ve been coordinating. The lunches are intended as a way to get people from different sectors and fields who are conducting research on the internet together to talk about their work across the artificial boundaries of field and sector. These lunches change character as the field and attendees change.

I’ve been following the field of online research for many years now, and it has changed dramatically and continually before my eyes. Just a year ago Seth Grimes’s Sentiment Analysis Symposia were at the forefront of the field, and now I wonder if he is thinking of changing the title and focus of his events. Two years ago tagging text corpora with grammatical units was a standard mid-step in text analysis; now machine-learned algorithms are far more common and often much more effective, demonstrating that grammar in use is far enough afield from grammar in theory to generate a good deal of error. Ten years ago qualitative research was often more focused on describing platforms than on the behaviors specific to them; now the inner workings of platforms are much more of an aside to a behavioral focus.
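The gap between grammar in theory and grammar in use can be sketched with a toy example. This is purely illustrative: the suffix rules, tags, and tiny “corpus” below are invented for the sketch and bear no relation to any real tagger or dataset.

```python
from collections import Counter, defaultdict

# "Grammar in theory": a few hand-written suffix rules.
RULES = [("ing", "VERB"), ("ly", "ADV"), ("s", "NOUN")]

def rule_tag(word):
    """Tag a word using the theoretical suffix rules alone."""
    for suffix, tag in RULES:
        if word.endswith(suffix):
            return tag
    return "NOUN"  # default guess

# "Grammar in use": a tiny tagged corpus of observed usage.
# Note that "building" appears here as a NOUN, which the -ing rule mis-tags.
corpus = [("the", "DET"), ("building", "NOUN"), ("is", "VERB"),
          ("quickly", "ADV"), ("emptying", "VERB")]

counts = defaultdict(Counter)
for word, tag in corpus:
    counts[word][tag] += 1

def learned_tag(word):
    """Tag by the most frequent observed tag, backing off to the rules."""
    if word in counts:
        return counts[word].most_common(1)[0][0]
    return rule_tag(word)

print(rule_tag("building"))     # VERB -- what theory predicts
print(learned_tag("building"))  # NOUN -- what usage shows
```

Even at this toy scale, the frequency-based tagger recovers a usage pattern the rule misses, which is the same dynamic, writ small, behind the error gap mentioned above.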

The Association of Internet Researchers is currently holding its conference in Denver (#ir14), generating more than 1,000 posts per day under the conference hashtag and probably moving the field far ahead of where it was earlier this week.

My interest and focus has been on the methodology of internet research. I’ve been learning everything from qualitative methods to natural language processing, and from social network analysis to Bayesian methods. I’ve been advocating for a world where different kinds of methodologists work together, where qualitative research informs algorithms and linguists learn from the differences between theoretical grammar and machine-learned grammar, a world where computer scientists work iteratively with qualitative researchers. But all of these methods fall short, because there is an elephant in the methodological room. This elephant, ladies and gentlemen, is made of content. Is it enough to be a methodological specialist, swinging from project to project, grazing on the top layer of content knowledge without ever taking anything down to its root?

As a methodologist, I am free to travel from topic area to topic area, but I can’t reach the root of anything without digging deeper.

At yesterday’s lunch we spoke a lot about data. We spoke about how the notion of data means such different things to different researchers. We spoke about the form and type of data that different researchers expect to work with, how they groom data into the forms they are most comfortable with, how the analyses are shaped by the data type, how data science is an amazing term because just about anything could be data. And I was struck by the wide-openness of what I was trying to do. It is one thing to talk about methodology within the context of survey research or any other specific strategy, but what happens when you go wider? What happens when you bring a bunch of methodologists of all stripes together to discuss methodology? You lack the depth that content brings. You introduce a vast tundra of topical space to cover. But can you achieve anything that way? What holds together this wide realm of “research?”

We speak a lot about the lack of generalizable theories in internet research. Part of the hope for qualitative research is that it will create generalizable findings that can drive better theories and improve algorithmic efforts. But that partnership has been slow, and the theories have been sparse and lightweight. Is it possible that the internet is a space where theory alone just doesn’t cut it? Could it be that methodologists need to embrace content knowledge to a greater degree in order to make any of the headway we so desperately want to make?

Maybe the missing piece of the puzzle is actually the picture painted on the pieces?


The data Rorschach test, or what does your research say about you?

Sure, there is an abundance of personality tests we could take: inkblot tests, standardized cognitive tests, magazine quizzes, and so on. But we researchers participate in Rorschach tests of our own every day. There is a series of questions we ask as part of the research process, like:

What data do we want to collect or use? (What information is valuable to us? What do we call data?)

What format are we most comfortable with it in? (How clean does it have to be? How much error are we comfortable with? Does it have to resemble a spreadsheet? How will we reflect sources and transformations? What can we equate?)

What kind of analyses do we want to conduct? (This is usually a great time for our preexisting assumptions about our data to rear their heads. How often do we start by wondering if we can confirm our biases with data?!)

What results do we choose to report? To whom? How will we frame them?

If nothing else, our choices regarding our data reflect many of our values as well as our professional and academic experiences. If you’ve ever sat in on a research meeting, you know that “you want to do WHAT with which data?!” feeling that comes when someone suggests something that you had never considered.

Our choices also speak to the research methods we are most comfortable with. Last night I attended a meetup event about natural language processing, and it quickly became clear that the mathematician felt most comfortable when the data was transformed into numbers, the linguist felt most comfortable when the data was transformed into words and lexical units, and the programmer was most comfortable focusing on the program used to analyze the data. These three researchers confronted similar tasks, but their three different methods will yield very different results.
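As a purely illustrative sketch (the sentence and the representations below are invented, not from the meetup), the very same text can be handed to each researcher in the form they prefer:

```python
from collections import Counter

# One sentence, three comfortable representations.
sentence = "the data speaks but the data also listens"

# The linguist: lexical units.
tokens = sentence.split()

# The mathematician: numbers -- a bag-of-words count vector
# over a sorted vocabulary.
counts = Counter(tokens)
vocab = sorted(counts)
vector = [counts[w] for w in vocab]

# The programmer: the pipeline itself is the object of interest.
print(tokens)
print(dict(zip(vocab, vector)))
```

Each representation quietly discards something the others keep: the vector loses word order, the token list loses frequency structure at a glance, and the program foregrounds process over either. That is the Rorschach test in miniature.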

As humans, we have a tendency to make assumptions about the people around us, either by assuming that they are very different or very much the same. Those of you who have seen or experienced a marriage or serious long-term partnership up close are probably familiar with the surprised feeling we get when we realize that one partner thinks differently about something that we had always assumed they would not differ on. I remember, for example, that small feeling that my world was upside down just a little bit when I opened a drawer in the kitchen and saw spoons and forks together in the utensil organizer. It had simply never occurred to me that anyone would mix the two, especially not my own husband!

My main point here is not about my husband’s organizational philosophy. It’s about the different perspectives inherently tied up in the research process. It can be hard to step outside our own perspective enough to see what pieces of ourselves we’ve imposed on our research. But that awareness is an important element in the quality control process. Once we can see what we’ve done, we can think much more carefully about the strengths and weaknesses of our process. If you believe there is only one way, it may be time to take a step back and gain a wider perspective.