Dec 15, 2017

Behind the Scenes with EdChoice’s Research Team

Meet the EdChoice research team as they discuss some of our frequently asked questions and criticisms in our latest podcast.

A few members of the EdChoice research team jumped in the studio to answer some of the most frequently asked questions about (you guessed it) our research. Listen now to learn more—from the services we provide the states to how we respond to critics of our work.

LISTEN ON ITUNES

LISTEN ON STITCHER

Our Interview Transcribed

Drew Catt: Hello, and welcome to another episode of EdChoice Chats. I’m Drew Catt, director of State Research and Policy Analysis and I’m joined today by two other members of our research team.

Martin Lueken: I’m Marty Lueken. I’m the director of Fiscal Policy and Analysis.

Paul DiPerna: And I am Paul DiPerna, the Vice President of Research.

Drew Catt: And we’re here to give you a crash course on EdChoice’s research department.

Paul DiPerna: Who we are, what we do and how we approach our work.

Drew Catt: Our communications team has compiled some of the most common questions they get online about our research, and we’re here to answer them. For anyone new, give the 30,000-foot view of what EdChoice’s research team does.

Paul DiPerna: We have a pretty clear mission for our research program, and it's basically to inform folks with different perspectives about school choice programs and to help drive the national conversation about educational choice and access in K–12 education. So we have four lines of work. First, we analyze data and synthesize school choice research on public request or to support our organization's communications and outreach, whether on the website or through our trainings, events, social media and other aspects of our outreach. Second, we conduct empirical research on questions with national or very broad implications. Third, we conduct empirical research within individual states that has more of a local or state-level set of implications. Then finally, we occasionally commission thought leadership projects where research and data should inform, advance or even test positions staked out by the authors or the project leaders.

Drew Catt: So, Paul, what other resources or services do you provide at the state level, and to whom?

Paul DiPerna: So we do a lot of state-driven work on public request and for our state partners. Actually, I don't do the state-level work; you guys do the state-level work. So I'm gonna turn it over to you in a second, but I'd say that we have three core competency areas where we have some unique strengths.

One is looking at school-choice program data collection and tracking of any program changes over time. We also do a lot of fiscal research and analysis and then we do lots of survey-based research, doing polls and surveys of voters, school leaders and school parents. But I’ll turn it over to you guys.

Drew Catt: I'd definitely say that I do a lot of the latter: the surveys of school parents and the surveys of school leaders. So far I've also been doing a lot of the data collection, the crunching of eligibility numbers for programs, but that has been shifting a lot to Mike Shaw, who, as the two of you know, is our new research assistant and has been a wonderful addition to the research team.

But, Marty, what specifically do you provide, other than or including what Paul's already mentioned?

Martin Lueken: So I do a lot of fiscal analysis, where I cost out the education choice programs that currently exist. I'll also conduct fiscal analyses, or produce fiscal notes, on school choice bills for legislators, or for our partners at the state level. Essentially, for the network, I'm estimating how much these programs are going to cost state and local taxpayers and public schools.

Drew Catt: I would say that’s pretty valuable information.

Paul DiPerna: Yeah, it's often a hot topic, especially in the fall and the winter as sessions are getting ready to go. It seems like you get a lot of requests.

Martin Lueken: It’s a busy time. It’s a really busy time for me for sure. But, the stuff that we do, I think, informs the bill design, and I get some pretty positive feedback from what we’re providing.

Drew Catt: So, if there’s one publication from EdChoice’s research library that newbies should read, and you can only pick one each, which would it be and why?

Paul DiPerna: So I would definitely say The ABCs of School Choice. That's our flagship publication that we release every January, and it details a lot of information about every private school choice program in the country. Over the last five or six years, it's doubled in size because of all the recent enactments of programs and expansions. We get a lot of feedback that it's a useful publication just to learn more about programs and the different types of programs: school vouchers, education savings accounts, tax-credit scholarships and so forth.

Martin Lueken: Yeah, I think that's a really good publication. I would also add A Win-Win Solution by Greg Forster. He compiles and does a systematic review of research on school choice, looking at all the different strands of research that have been done and at the various student outcomes: participant effects, competitive effects, segregation effects, fiscal effects, civics outcomes.

It also includes, I think, some really useful discussion about the methodology. This could be particularly useful for the lay reader who may not have research chops and wants to get an understanding of what the research says. He does a pretty good job, I think, of explaining the strengths and weaknesses, as well as fully laying out his inclusion criteria: why he includes certain studies and not others, for methodological reasons.

We also, by the way, have been talking about a longer-term project where we expand on Win-Win. We're planning to create a database that would be available for researchers, where we extract information from all these studies, as well as from other strands of research that have not been covered in Win-Win, such as parental satisfaction. So we are planning to extract the data and put it onto a web dashboard. Hopefully it will provide another useful resource for researchers.

Paul DiPerna: I think that’s something that we hope to have live and running by the end of next year. It’s going to be a considerable process, and a pretty big load to lift.

Martin Lueken: Yeah.

Paul DiPerna: But we’re up for the challenge, right?

Martin Lueken: I’m pretty excited about it.

Drew Catt: I think the two of you each stole the publications that I had in mind. Marty, your talking about another resource that we're gonna have available on our website reminded me that the frequently asked questions on our website would be a wonderful resource for someone new to school choice to check out. Those each have a lot of the data, and we try to update them at least on an annual basis.

Paul DiPerna: I think that is a great point, Drew. That is something that we set out to regularly update on our website, and it is very top-level, from the 30,000-foot view, looking at different types of questions about school choice outcomes: whether they're about participating students, the competitive effects of school choice, public opinion on school choice or the fiscal effects of school choice.

So there's a little bit of overlap, putting in maybe more plain-English terms what we publish in Win-Win and what we're hoping to make available through this study database next year.

Martin Lueken: Sure. Speaking of our website, there is another useful page, not necessarily for newbies, but for researchers. We have our bibliography page that lists all the citations for studies that have been conducted on educational choice, and we've gotten some feedback from other researchers and graduate students who found it to be useful. They weren't aware of it before and wish they had heard about it sooner for their work.

Paul DiPerna: Yeah, we definitely benefit from hearing from other researchers and grad students about a study we may have missed that had just recently been released. So people now are proactively letting us know, "Hey, you should add this to your bibliography page," just to keep it current. We always appreciate that type of feedback from our friends and the public.

Martin Lueken: Yeah.

Drew Catt: Now time for the gloves to come off a little bit. There's a perception among many opponents that EdChoice's advocacy for school choice biases our research. How do we respond to that criticism?

Paul DiPerna: I think this is something that is inevitable when we're a mission-driven organization with a clear point of view. We are honest about our mission and very forthcoming about it, so people know where we stand as an organization. So what we do is try to be as transparent as possible, whether it's any one of us doing our in-house research or making sure that our external authors, as well, are disclosing as much information as possible about the methods and the data sources they used for conducting their research.

For one example, we joined the American Association for Public Opinion Research (AAPOR) Transparency Initiative a couple of years ago as a charter member, and there are very specific guidelines and standards for being part of the Transparency Initiative.

It's focused on survey-based and qualitative research, and so when you look at any of our survey reports, you'll see we try to make it not only transparent but as clear as possible. There is also a principle of clarity here: we provide a survey profile that gives all the information that should be made publicly available for people to make judgments about the quality of the research or any potential conflicting factors, if there are any.

It really gives a good outline of how these surveys are conducted, and so the AAPOR Transparency Initiative is just one example of how we try to adhere to this first principle of transparency for the kind of research that we do.

But the advocacy side of our organization really helps to inform the questions and priorities that guide our research. Learning from our colleagues who have these really rich, in-depth experiences on the ground in states and in capitols around the country is hugely beneficial for how we think about research questions. Then there's a certain point where we as researchers put our priors and our values on the shelf and to the side a little bit, and we let the questions really guide the projects that we are pursuing. And we follow the evidence.

Martin Lueken: I’d say for me, personally, being a member of AAPOR-TI, or the Transparency Initiative, has been wonderful for the survey reports that I’ve been working on. It really helps get all of those necessary data points down and on to a single page, such as, we get questions about, “Well who funds your research?” I’m like, “The name of the funder for this specific survey is listed on this page.”

Drew Catt: So another common question about our research, “Is it peer reviewed?”

Paul DiPerna: So this is a common question that we get, especially as I know from our friends on the communications team, who let us know what people say on Twitter and Facebook. It seems like this is an easy, cast-away kind of criticism. And so, yes, we do have external reviewers for all of our publicly released research publications.

We usually have a minimum of two external reviewers, and sometimes as many as four or five. That's a process that can take about a month or two for us to collect all the reviews, and then we provide those to the authors. Oftentimes my job will be to give some guidance in terms of the reviewers' suggestions, comments and notes and what may be helpful.

There’s a lot of back and forth between the external reviewers and us, and we really value their input. Then we also have an internal review process where, I think, for any publication that we do, at least two of us do our own internal review at the same time as the external reviews.

Then we also have a very thorough copyediting process, and that's always helpful, because we are getting the perspective of a non-researcher. So a lot of times we'll be saying this might make sense to about eight people out there, and we need to dial back the technical jargon or language a little bit and make it a little more accessible to the public.

And so, that’s always really useful to hear those kinds of suggestions from our communications team.

Drew Catt: I'll never forget spending months and months on my first report once I joined the research team and being so excited to send it out to reviewers, thinking that it was looking great. Then one of the reviewers sent back multiple pages of typed comments, notes and suggested edits, and I was almost crestfallen, having thought that I had it in the perfect position and then realizing how much work I still had yet to do.

Martin Lueken: You have to have thick skin.

Paul DiPerna: Yeah, I think it just goes to show that when we send these out to reviewers, they don't really take it easy on us. They do a very good job providing multiple critical comments that definitely help guide the report.

For all of my reports, if you're ever interested in seeing who has reviewed them, you can usually find the reviewers' names mentioned in the acknowledgements section at the back.

Drew Catt: Right, that is an area of the reports where people can learn who did the external reviews for any of our publications.

Martin Lueken: These reviews are really detailed and high-quality, too. They're similar to the blind reviews that occur at academic scholarly journals. A lot of my reviews have been similar, I think, in quality and detail.

Paul DiPerna: And to follow up on that, Marty, I would say there's another kind of peer-review process that we really try to pursue. We really like to present at conferences, briefings and other types of meetings. Both of you have been able to do that recently, and we get a lot of awesome feedback from experts in those fields who are attending those sessions. That's another way to get really good critical feedback on the kind of work that we do.

Drew Catt: I would say it's kind of funny to me to have university professors reviewing my research and submitting comments that are so much more substantial than what I would have received from my professors in undergrad or grad school.

So, as researchers yourselves, how do each of you personally determine the quality of a piece of research or a data source?

Martin Lueken: I think that the gold standard of research is random assignment. This is what medical research is based on. Education policy is fortunate to have the ability to do this kind of research in some areas, including private school choice and charter schools as well.

The idea is that when you have a program where schools are oversubscribed, a lottery is conducted to determine which group of students receives a voucher and which group gets lotteried out and receives business as usual. That affords us an apples-to-apples comparison, where the only difference between the two groups being studied, the only difference behind the outcome you are observing, is that one gets the pill and one does not.

I think I focus a lot on that strand of research; it's top quality. Then you have the next tier, silver-standard studies, which are based on panel data. This is longitudinal data, where you're observing the same group of students over time. There are econometric research methods that, I think, are good quality for studying these program interventions, though not as good as random assignment.

Then it goes down from there. For example, probably the bronze standard, or … I'm not even sure if it would get a medal, is looking at snapshots and comparing groups that don't really afford apples-to-apples comparisons. That's kind of how I view the research and evaluate the quality of the data.

Paul DiPerna: So, everything you describe about the gold standard, with randomization, and the silver standard, with these panel-based studies that make appropriate statistical adjustments or weight the data to make sure a population is represented: I mean, all of that applies to survey-based research, too.

Drew Catt: I would say, for me personally, it's a lot of conference papers. If a paper's good enough to make it into an academic conference like AEFP, the Association for Education Finance and Policy, or APPAM, the Association for Public Policy Analysis and Management, the fact that the committees for those conferences would select that research for inclusion speaks a lot to me, in addition to whether or not it's included in a peer-reviewed journal.

All that to say, there are plenty of very high-quality working papers out there through organizations such as the University of Arkansas Department of Education Reform that are in our realm of work.

Paul DiPerna: PEPG has their working paper series that they’ve had for a long time, which is really great.

Drew Catt: Yeah, those are highly valuable, and even along the lines of New Orleans and Louisiana research.

Paul DiPerna: Yep, yep, and NBER, the National Bureau of Economic Research, has a really famous line of working papers.

Drew Catt: Yeah, that's a great way to view the research without having to wait the potentially three years it takes for work to be accepted and published in a journal.

Paul DiPerna: Right, yep.

Drew Catt: So finally, where can people get EdChoice research, and how can they get in touch with our research team members?

Paul DiPerna: There are several ways to reach out and find us. One is to just go to our website, where folks can explore our research library. We have over, I think, 120 publications now that we've released and published over the last 15+ years. You can find our research library online at edchoice.org/research, and you can also sign up for EdChoice emails, which give you regular updates about new research releases.

We also have really useful overviews for navigating our School Choice in America Dashboard, which details a lot of the statistics on program participation and eligibility for the different private school choice programs around the country. The dashboard and the research library are both available on our website, and you can learn how to navigate them and take the tutorials on our YouTube channel at youtube.com/educationalchoice.

Drew Catt: In terms of the sign up on our website for the EdChoice emails, once you sign up you can watch your inbox and flesh out your profile with your mailing address if you want print copies mailed straight to your doorstep.

You can also follow our blog or subscribe to our podcast, where we dive deeper with the authors of our latest research. You can also tweet us @edchoice. And finally, our research team's email addresses are on the EdChoice website, so feel free to email us.

Paul, Marty, thank you both for joining today.

Martin Lueken: Thank you.

Paul DiPerna: Yeah, thanks a lot Drew for moderating and setting this up. This has been fun.

Drew Catt: Yeah, and thank you to all of you out there that are listening. I hope you are having wonderful days, evenings, what have you, and wish you nothing but the best. For all of us on the EdChoice research team, have a wonderful day.

 
