
I recently served on a review panel for an NSF program. We were asked not to say which one. This was my first time doing it, & it was quite an education. When I first accepted, I had some moments of doubt & guilt over the seriousness of what I was being asked to do: tell, or at least suggest to, the federal government how to spend taxpayer money. And then Yvonne told me to shut up, because the budget of the program is about equal to half an hour of the war in Iraq. So there you go. That pretty much puts it in perspective: science and education at home, or killing people abroad? Which would you rather?

Anyway, I have a few observations about being on the panel:

Some of the proposals were quite poor: they failed to answer, or glossed over, fairly major questions that seemed obvious to me, & to others on the panel as well. Some of the more egregious faults:

  • Proposing 37 years' worth of work in a 3-year grant: We had one panelist who was an ex-project manager from corporate, & several CS people, & they had a number of good laughs about this one.
  • Having a weak sustainability plan: What will happen once the funding runs out? If the answer looks like it would be, “apply for more funding,” forget it.
  • Failing to clearly explain the impact of the project on the area of the CFP (which I can’t tell you): If the reviewers didn’t see that the proposal would have a significant impact, or thought the impact would be merely incremental, forget it.
  • “Not doing their homework”: If the proposal didn’t cite, or didn’t even seem to know about, work in an area where it claims it’s going to make an impact, forget it! Several proposals were savaged for this. I suppose this is a hazard of writing a proposal at the intersection of 2 fields, one of which you know well & one of which not so much, but still.
  • Not budgeting appropriately for evaluation: The NSF’s User-Friendly Handbook for Project Evaluation says quite clearly to “allocate 5 to 10 percent of project cost for the evaluation.” How freaking difficult is that? I mean, they give you a percentage, for heaven’s sake! Also, if the metrics aren’t specific enough, or the reviewers think they won’t capture the right phenomena, forget it.

Some observations about the panel process itself:

  • Some proposals were really easy to review: everyone agreed they were good or bad. Sometimes one reviewer had given a much higher rating than the rest & got talked down, but the opposite never happened. And then some got really bimodal reviews, half good & half bad, & we had to figure out why. On that note…
  • Emotions can run high: Some panelists were strongly for or strongly against some proposals, & some discussions got pretty heated. The panel moderators from NSF were very tolerant of this: they asked politely that the panel reviews be written to include “the range of opinion.” Yeah.
  • Everyone talks a lot, even though we all know that we have a limited amount of time: we pulled a 10-hour day the first day, & left only because we were literally being kicked out of the building by the janitorial staff.
  • By the end of the day, the proposals were getting pretty short shrift: the panel instructions say 15 minutes per proposal, but at the beginning of day 1 some got half an hour or more, while at the end of the day some were lucky to get 10.

Conclusion: When you get a proposal back with comments, pay attention to them! The reviewers really make an effort to be thoughtful, & to explain what the panel saw as the important contributions & the shortcomings of the proposal. The NSF moderators said that their boss wants journal-quality reviews (of course that depends a lot on which journal, now doesn’t it?). But anyway, everything that comes out in discussion makes it into the reviews one way or another, even if you may have to read between the lines a bit. One knows, of course, that science is a social process. But in this situation you can really see it in action.