Getting Started with Little Data

The last time I was invited to post here on the Involvio blog, I tried to convince you that you should be incorporating the “little data” you collect every day into your practice as a student affairs professional. What I didn’t say much about was how you’re supposed to do that. Little data is often compiled semi-informally, with small numbers of data points drawn from targeted populations, has a limited research scope, and is tied to specific applications.

The goal of little data is to provide guidance for immediate action, not to prove unequivocally that you’re right.

If you want to create formal data around the effectiveness of your initiative to support expansion or continued use, that’s a deeper process. However, when you’re collecting little data, all you’re asking for is enough information to get started on a productive path.

I (hopefully) have you convinced that little data has a place in the micro-initiatives that characterize student affairs interventions, so now I’d like to share some basic considerations on how to make that happen. One thing I want to stress is that while a research background of some sort helps, you shouldn’t discount your ability to generate and analyze data just because you haven’t had a stats class in a few years. An educational background that includes heavy doses of social science, educational psychology, and adult learning theory would be great, but student affairs is a field that a lot of people kind of stumble into. Your master’s degree in Greek philosophy is not a get-out-of-research-free card. Even people who don’t have a higher education or research focus in their past can learn to generate little data. You know your field, and research basics can be learned.

I need you to be fearless about collecting little data. It’s not necessarily going to be easy, but it benefits your students, and learning to do it well is worth your time. All student affairs professionals can collect, analyze, and utilize data. Using little data in problem solving breaks down into four general steps:

1. Form a plan based on your general foundational knowledge.
2. Find out what information you have versus what you need to support or reject that plan.
3. Adapt the plan to that information and execute it.
4. Assess the effectiveness of your plan in addressing the problem.

Foundations of Little Data

I firmly believe that one of the cornerstones of effective student affairs practice is theory. You must have a basic understanding of who students are, how they learn, and why they do what they do if you want to impact them. There are tons of well-supported, relevant theories about learning. Make yourself a student of them. Spend an afternoon puttering around Google Scholar or EBSCOhost reading about issues relevant to your students, and follow those articles to others. Rummage through your campus library for intro course textbooks; Dale Schunk’s Learning Theories: An Educational Perspective (Pearson Publishing, 2012) is a good one. Take notes. Then talk to your colleagues. Find out what theories they think are interesting, and read about those, too. Your favorite professional organization is another great resource, both for learning about theory and discussing it. NACADA is my personal go-to resource for exploring theory (and just about everything else, I’m a NACADA nerd for life), but if you’re not into advising, there’s NASPA, ACPA, ASHE…pick your poison.

As you learn about different foundational theories and research in education, you’ll start to see how different theories relate to you and your students. You may not agree with everything a particular theory espouses, but you’ll start to pick out the pieces that are meaningful to you, and your sense of what ideas enhance your practice will become more focused. That translates into a working knowledge of relevant concepts when the need for little data arises. I know that sounds like a lot, but it’s an evolving process that largely happens as you go.

The nuts and bolts of generating and using little data can happen while you’re still forming your theoretical foundations.

Identifying Your Question

Being methodical about setting up your data organization strategy is one thing you can do to make your future self really, really happy. That starts with figuring out what exactly you want to know. Deciding what you plan to do with the data should not be an afterthought.

Articulating the purpose is the most important part of collecting little data, and failing to give it adequate weight sets you up for worlds of frustration later.

The question comes before you compile the data, not vice versa. The process of articulating a specific question or set of questions that you want to answer is, in research-speak, identifying your research question. You will repeat it in some form every time you create a marketing item, request funding, ask someone to assist, or recruit a participant to your initiative. Writing a good research question has caused many sleepless nights for students in research methods courses, so we’ll keep it simple here: take the problem you have, add what you think might be causing it, and drop in your idea for how you might fix it. For example, say our problem is decreased usage of a campus tutoring center. The process of identifying our research question might look like this:

Problem: The tutoring center log shows that we averaged about 14 students a day last semester. Typical traffic is closer to 25 students a day.

Possible cause: Outreach for the tutoring center has been reduced to focus on electronic message boards and e-mails rather than paper communications to cut costs and reduce waste.

Idea: Institute in-person outreach by having tutors visit traditionally difficult gateway courses.

Research question: Are tutor visits to gateway courses during the first week of classes an effective way to increase usage of the tutoring center without returning to paper marketing tools?
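To put a number on the problem before going further, the comparison from the example above can be sketched in a few lines. This is a minimal illustration with made-up daily counts; your real numbers would come from your own sign-in log.

```python
# Hypothetical daily sign-in counts pulled from the tutoring center log.
# Replace these with your own records.
typical_avg = 25  # typical daily traffic in prior semesters
this_semester_counts = [15, 12, 14, 16, 13, 14]  # sample of recent days

this_semester_avg = sum(this_semester_counts) / len(this_semester_counts)
drop = (typical_avg - this_semester_avg) / typical_avg

print(f"Current average: {this_semester_avg:.1f} students/day")
print(f"Usage is down {drop:.0%} from typical traffic")
```

With these sample figures, usage works out to about 14 students a day, a 44% drop from typical traffic: the kind of concrete gap that makes a research question easy to justify to stakeholders.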

You may notice that we already have some little data to work from. We have the tutoring center logs, giving us a comparison of previous semesters versus now, and we know that the method of outreach has changed. Now we can incorporate some big data. At many institutions, someone with a professional interest in this question would be able to use big data resources and reports to determine whether this semester’s grades in your target courses are comparable to previous semesters. If not, you may not actually have a problem: why would you expect students to continue utilizing tutoring at the same level if their grades show they don’t have the same academic need as previous students? If big data shows that tutoring needs have changed, that could mean you need to change the direction of your micro-initiative and dump the work you’ve done so far…but isn’t it better to know that before you sink more time and resources into activities that ignore the real problem? Now, if you discover that your expected correlation exists, you can move to step 2.


Getting Your Data

Once you think you’re probably on to something, you can collect the data to show whether your initiative helps. The best way to collect that data is up to you to figure out. Your collection strategy depends heavily on what kind of question you’re trying to answer. I can’t help you figure that out here, but one thing I can suggest is that you look at what you already have before you collect your little data. Pick apart your research question and ask yourself what information you would need to answer it. Be specific.

Then ask yourself, who’s already collecting pieces of that data? How do they get it? What are they doing with it?

Ask other people who are collecting data if you can get access to their tools. Don’t accept research tools that don’t fit your needs just because you didn’t bother to ask if there was something better out there. Silos are the enemy when you’re collecting little data. You need to ask other people what they know and how they know it, and be willing to return the favor. Allies inside and outside of your department, especially allies who have different strengths and areas of expertise than you, are important here. You’re probably not the only person asking research questions to solve relevant problems. Who else is learning new things that could be useful to you?

Once you have some institutional data, big or little, to guide your research, take the simple but often overlooked step of saving that information. You might save the report in a folder, start writing a research timeline with the relevant numbers listed, even just take a screenshot-whatever makes that information accessible later if you need it. A common frustration with big data is that it’s almost impossible to find the exact same report later because things change so often and the parameters it takes to get specific information can be complex. Little data generated by other people isn’t immune, either. Because little data is often informal, when you go back to get a report you borrowed from someone else, they may or may not still have it in the form you originally used. You’ll thank yourself if you capture your baseline, even if you expect it to be available to you later.
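One low-tech way to capture that baseline, sketched here in Python with the hypothetical tutoring figures, is simply to write the numbers you borrowed to a dated file you control. The field names and values are illustrative; substitute whatever your research question depends on.

```python
import json
from datetime import date

# Hypothetical baseline figures borrowed from someone else's report;
# record whatever numbers your research question depends on.
baseline = {
    "captured_on": date.today().isoformat(),
    "source": "tutoring center sign-in log",
    "avg_daily_visits_typical": 25,
    "avg_daily_visits_this_semester": 14,
}

# Save a copy you control, so later changes to the original report
# can't erase your starting point.
with open("tutoring_baseline.json", "w") as f:
    json.dump(baseline, f, indent=2)
```

A spreadsheet or screenshot accomplishes the same thing; the point is that the snapshot exists independently of the system that generated it.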

I would be remiss at this point if I didn’t give a shout out to the IRB. The Institutional Review Board is an entity at just about any higher education institution that monitors the use of human subjects for research. Anyone who plans to conduct any type of research with students should get familiar with their institution’s IRB guidelines. Most little data collection is used internally for departmental micro-initiatives and is likely to be exempt from IRB oversight, but you don’t want to come down on the wrong side of that line. The IRB is one place where it’s not better to ask forgiveness than permission. Plus, it’s nice to be ready if you’re an optimistic type who immediately starts wondering how to leverage this data into publication and/or presentations if it turns out to be useful. Being IRB compliant with your data collection methods takes out a possible roadblock to sharing your data if it turns out to be significant beyond the original scope.

Analysis of Little Data: What Do I Do With This?

The most important consideration in how you use little data is answering one simple question: what’s your desired outcome? If you collect data, it supports your idea, you institute your idea, and it works exactly as you hoped, what will that end result look like?

Once you’ve gotten as much information as you can from existing sources, you’ll use your own research to evaluate whether that end result is what you expected. Your methods can be complex or simple, as long as they’re as accurate as possible and they answer the research question. You also have to be crystal clear about the scope of little data when sharing your findings with stakeholders. Little data isn’t proof; it’s an information point to justify why you chose one possible solution over others. Your institution may or may not love your trailblazing spirit, so you’ll need to be transparent about what your little data tells you and what it can’t tell you, and be ready for reactions consistent with the institutional culture.

In addition to asking yourself what it would look like if you’re right, it’s a useful exercise to set your boundary for how far you’re willing to go before accepting that you’re probably wrong. A difficult component of collecting little data, at least for me, is the looming question of what it means if your original assumptions are off base.

I haaaate to be wrong. It’s the worst. But it’s an inherent risk of trying something new, so you’ve got to get over it.

That data still tells you something useful, and you have to be willing to say that if the data supports looking at something else, you’ll do that.
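A toy decision rule along those lines, using hypothetical follow-up numbers from the tutoring example, might look like this. The thresholds and counts are made up; the point is deciding in advance what counts as working, partially working, or wrong.

```python
# Hypothetical figures; replace with your own baseline and follow-up data.
baseline_avg = 14      # avg daily visits before the tutor class visits
typical_avg = 25       # historical traffic we'd like to return to
followup_counts = [19, 22, 18, 21, 20]  # daily visits after the initiative

followup_avg = sum(followup_counts) / len(followup_counts)

if followup_avg >= typical_avg:
    verdict = "usage is back to typical levels"
elif followup_avg > baseline_avg:
    verdict = "partial improvement; worth continuing and re-checking"
else:
    verdict = "no improvement; revisit the assumed cause"

print(f"Follow-up average: {followup_avg:.1f} students/day ({verdict})")
```

Writing the "no improvement" branch before you collect the data is exactly the boundary-setting exercise described above: it commits you, ahead of time, to what being wrong would look like.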

I’m not going to pretend that this whole little data thing is easy. I’ve spent a lot of time gathering and analyzing data only to find out that things were a lot muddier than I’d signed up for. It can be frustrating to spend hours digging through student records looking for evidence to support an idea you love, only to find out that your idea isn’t likely to work. The time you spend on it may feel like time you don’t have, but all that planning goes a long way toward making your initiative a meaningful action for your students. Data gives your initiatives teeth with stakeholders, allows you to prioritize your time and resources more effectively, and is a powerful tool for student success. With some extra effort and a little bit of adventurous curiosity, you’ll find that little data becomes an intuitive part of your creative process.