Due or Overdue?

The blogosphere has done a rather thorough job of thrashing the recent *scathing* report by Senator Coburn (R-Oklahoma) on supposed mismanagement and waste by the NSF. I will therefore quell the urge to point out how poorly researched, written, and organized the report is. It's a good thing the report wasn't funded by NSF or it would indeed be an excellent example of wasted effort and money by that agency.

I was, however, bothered by a particular section that has not received as much attention: the part about the supposedly shocking number of late and never-turned-in project reports:

A 2005 audit found that “[a]pproximately 47 percent of the 151,000 final and annual project reports required in the past 5 years were submitted late or not at all.”

There is a handy chart, organized by Directorate, showing the percentage of final reports submitted on time, late, or not at all. I have some points to make about this:

1 - It is not surprising that so many reports are late. Reports are due 90 days before the end of the budget period, and at that time some of us are still scrambling to submit additional grant-related publications to include in the report. I would rather submit a late but more impressive report than an on-time report with less substance. One of the main motivators for me to file my reports (annual or final) sooner rather than later is that being up-to-date with reports is required before the paperwork for new grants can be processed. If there is no pressure from that, I will wait until I can write a more complete and informative report.

2 - According to the chart in Coburn's document, although there are many late reports, the number of missing (never submitted) reports is low. It is misleading to combine late and missing reports and then focus on that combined number as if it indicates a major problem.

3 - The reports are administratively important for NSF and are one way that the programs/directorates can see what has been produced from grants (publications, education, outreach etc.), but the reports are not the actual "products" of the research and therefore are an imperfect measure of project success. It would be more relevant (but perhaps more difficult) to look at the actual outcomes of a grant -- as opposed to whether a project report has been filed on time -- and to look at contributions and impact beyond the expiration date of the grant.

I also wondered about the numbers related to unspent grant money apparently lying around, uncollected by NSF. Does this include money that PIs retain as part of no-cost extensions that give more time to do the research, or is that apparently leftover money accounted for because the grant's expiration date changes when a no-cost extension is approved? If you only consider the original expiration date of the grant, it would look like there is unspent grant money. I couldn't tell from the report whether the unspent money was real or not.

And did anyone else think it was strange, given Coburn's concern for duplication and overlap among programs that fund STEM education for underrepresented populations, that the subtitle of the report, "Under the Microscope", is also the name of an NSF-funded website created by The Feminist Press to encourage girls and women to be involved in STEM fields? How do we all feel about this duplication of titles? I don't have any real data, but I am sure I am against it.