I also emailed "triple-professor" Lyon for his response to Atlas's critique. It's at the end of this post.
By coincidence, I read this passage around the same time I found the report:
If rigorous empirical testing were required of economic models, many fewer papers would be written, the subject would become boring to many current practitioners, and the scientific reputation of the field would rise. There would be fewer fads and more lasting contributions. The criterion of success would be to understand phenomena rather than to maximize citations indices. -- James J. Heckman, writing in March 2003 for Lives of the Laureates (Fourth ed. 2004)Looking over these articles and emails, I am starting to think that the material in these two posts needs a broad circulation among
- Why did you write it?
When I first began researching the report, one of my motivations was that researchers, developers, and implementers of environmental policy in other countries might be especially led astray by the studies examined in my report. This is because, at that time, studies of United States environmental policies dominated what was published in English-language academic journals, and people in other countries were less likely to be sufficiently familiar with United States environmental laws, policies, and databases to determine what was wrong with those studies. This could lead to serious mistakes in other countries' environmental policies if they used these flawed studies as a guide. I had planned to make this point in the introduction of my report, but in the last few years there has been an enormous increase in the publication in English-language academic journals of studies from other countries, so I decided it might now sound presumptuous to claim that other countries were so dependent on United States studies. Of course, this leads to another concern: if English-language academic journals, typically with United States peer reviewers, have done so badly in understanding United States environmental laws, policies, and databases, how well do they do in understanding those from other countries?
- I reckon that represents about a year of work, but how long did it take you?
...it was definitely longer than a year, so long that I hate to admit it. Let’s just say that I worked on it for a few years. I originally got the idea when I was in academia about 10 years ago and realized the poor quality of social science empirical environmental policy studies. I tried to interest some other people in academia in collaborating on it, but with no luck. So I decided to do it myself, but when I started I naively assumed that it would only be a small fraction of its ultimate length.
- Why did you do it (I read the "repeat of 2002" comment, but was that the reason?) [This question was too vague; I meant "as a response to the 2002 critique of law review articles by economists;" see Question 9, below]
...I would first repeat that I did not initially assume that it would be anywhere near as long as it is. Thus, at the start this did not seem to be an imposing task, so it turned out to be more impressive looking than I initially imagined. In academic journal articles that I previously published I made it a point to discuss concerns about other academic journal articles on the same subject, so I always have viewed such criticism as an important part of contributing to readers’ knowledge. One of my articles (“Rush to Judgment” in Law & Society Review) that had an especially lengthy criticism of others’ articles probably is what was referred to in the “repeat of 2002” comment that you saw. Ultimately I realized, however, that only by doing a report like this one would it be possible to comprehensively point out these concerns at one time in one document that people could thereafter use as a sort of reference book to guide their understanding of past environmental policy studies and their review and planning of future studies.
- Have any authors replied to your report? Positive? negative? corrections or clarifications?
No reactions thus far -- unless it was one of those disgruntled authors who disabled the brakes on my car ;).
- Any interest from the consumers (EPA, et al.) of research material?
A good question. Just as I was preparing to post my report on the Social Science Research Network ("SSRN") in late December, I received an e-mail, first through RESECON and then through a U.S. Environmental Protection Agency ("EPA") e-mail list, originating from two EPA economists involved in deciding grants to social science researchers. They announced an EPA-sponsored workshop on January 18 at which the results from some recent social science environmental policy research funded by EPA would be presented and discussed. Some of the academic researchers involved had previously published articles that are discussed in my report (including my newest buddy, Thomas Lyon) -- and we know what that means. Thus, on January 3, after my report was posted on SSRN, I e-mailed my report to those two EPA economists, stating that "[y]ou might find it useful to have my attached report before your workshop, because some issues that the report raises probably are relevant to the workshop presentations." Unfortunately, a severe ice storm on the morning of the workshop prevented me from attending, but hopefully the good news was that it also prevented many other people from attending the dissemination of more EPA-funded misinformation from the presenters. I have never received even an acknowledgement from anyone at EPA that my e-mail was received, much less any contact thereafter from anyone at EPA regarding my report. Thus, that was not encouraging.
Although disappointing, I was not surprised because, as my report mentions, EPA has funded a substantial portion of the incompetent studies described in my report, and some current EPA economists co-authored other studies described in my report or flawed studies published after the 2005 endpoint of my report. Thus, the normal human and bureaucratic desire to avoid admitting mistakes would apply. Obviously, it puts EPA in an awkward position when many of the academics that presented or commented on research at that workshop are responsible for some of the dreadful studies described in my report. I have read transcripts of these workshops from prior years where substantively competent EPA staff had reviewed and commented on the research presented there, and it was disconcerting that they completely missed the serious problems that I later identified when that research was published in academic journals.
I will also note that several years ago I worked at a firm that also employed the person who was the EPA Assistant Administrator in charge of EPA economic analyses and policy planning during the Clinton Administration, during which time some of the EPA-funded studies described in my report were done. During a conversation with him one day I mentioned that I had concerns about the social science research grants to academics that EPA had made (but I was not as specific and blunt as in my report, since he was the one ultimately responsible for those decisions). He immediately responded by saying the process was “corrupt.” I am confident that he did not mean “corrupt” literally, such as bribes or kickbacks. Instead, I assume that he meant that the grants were awarded in a biased manner rather than solely on merit, such as to academic researchers friendly with or admired by EPA staff. Of course, I also vividly remember that when I interviewed with this guy for the job, he started what turned out to be a very brief conversation by saying that he knew nothing about environmental policy and that he was placed in charge of EPA economic analyses and policy planning simply because he was supposed to be a good manager. Effective American environmental policy in action!
- Have you seen any legislation/policies that were flawed (by design or implementation) as a result of mistakes in research that you have identified?
In my experience it is rare for social science empirical environmental policy articles to be expressly cited in support of any federal environmental regulations, unlike the frequent citing of articles by natural scientists or environmental engineers. The major exception is articles on the sulfur dioxide cap and trade program, which fortunately is one of the niches of social science empirical environmental policy articles that I concluded was largely lacking in substantive problems. Unfortunately, we do not know the extent to which other articles might have influenced the design of EPA regulations, but were not cited. As I mentioned in my report, from a legal perspective it would be preferable for EPA not to admit that it was influenced by an article, and instead claim that it was using its best judgment to make decisions. If EPA claimed to have been influenced by an article, then groups opposed to the regulation could try to use any deficiencies in the article to argue that EPA’s decision was legally improper (i.e., arbitrary or capricious). If EPA instead claimed that there was no persuasive guidance one way or the other from the available articles, then the legal burden would be on groups opposing the regulation to prove that the available articles provided compelling evidence that EPA made the wrong decision. This would be very difficult because of the typically flawed nature of the articles. This is the irony that I mentioned in my report – EPA funds some of this research, but then is better off officially ignoring the results. Unfortunately, however, we do not know the extent to which EPA might have been influenced by these articles, but never admitted it. The situation is even worse for state regulations because the public record of the process leading up to them is even less accessible.
As far as these articles influencing flawed legislation, I am unaware of how frequently such articles have been cited because, unlike for federal regulations, there is no easy, or perhaps even feasible, way of searching documents or statements included in the legislative process. Unless articles were cited in transcribed legislative hearings or debates, we would not know. For example, an obvious use of such articles would be in reports or oral communications by interest groups lobbying individual members of the legislature, which are unlikely to be in any public record.
I believe, however, that your question would be very interesting to research. I know, though, from over 25 years of reading federal regulations that I have rarely seen any mention of social science empirical environmental policy articles. This seems rather strange: hundreds of such articles published over decades that supposedly offer guidance on important government policies, and the government does not cite them in proposing policies. Perhaps the only exception is articles about placing economic values on human life or natural resources, but those are not the type of articles included in the scope of my report because they are typically survey-based.
- Is there a climate change conspiracy angle in here (from authors or people who may read your report)? (I ask this one after seeing the fallout from the Climategate emails...)
As I was moving towards completion of my report over the last year, I realized that my report’s findings would probably make some people think of the climate change controversy. I am not a climate change skeptic, although I believe there should be concern about possible unintentional and intentional errors in any study, including those from climate change scientists. The frequency of obvious and simple computational errors in the social science empirical environmental policy articles that I reviewed was frightening.
That said, however, when I was in academia I was atypical among social scientists in that I actively sought the involvement of environmental science and engineering faculty in my research. As an environmental attorney, I routinely interacted with environmental science and engineering staff and consultants, so this was standard operating procedure for me. I found environmental science and engineering faculty to be routinely competent, conscientious, and professional, in stark contrast to social science academics. I believe that environmental science and engineering faculty actually think of themselves as scientists, with a career-long commitment to research (and incentive, or requirement, to bring in research grants), whereas most environmental policy social scientists, at best, just try to publish enough elegant, or at least convoluted, pushing around of numbers to get tenure, after which they hibernate.
The other obvious major difference between environmental studies natural and social scientists is that natural scientists actually are educated in what they are researching. It is difficult to imagine that many natural scientists would even attempt to do or publish research in an area in which they had essentially no education, or that natural science peer reviewers would not notice obvious errors in the resulting research. In contrast, I am confident that most environmental policy social scientists have had few, if any, courses in environmental science or environmental law. From my own experience of over 25 years in environmental law, I know that environmental law and science are very challenging to master. The notion that people lacking any meaningful education, training, or experience in these areas could churn out reliable studies is ridiculous. As I stated in my report, only in social science academia are people with no relevant backgrounds allowed to do such research, and the results are exactly as expected.
My report’s findings also are inconsistent with the climate change skeptics’ allegations in that I did not conclude that there was a bias in any particular direction in the social science empirical environmental policy articles that I reviewed. With the possible exception of environmental justice studies that supposedly found discrimination against minority or low income people in the distribution of environmental hazards or burdens, I believe that the articles’ conclusions were often inconsistent. Thus, I believe that the “conspiracy” is not one of promoting certain conclusions, but rather to allow anyone to use any pseudo-facts or inappropriate data to get their research published.
- What result would you hope to see from your report?
The recommendations that I made in my report’s “Conclusions” chapter describe in detail the specific steps that should address the problems that I identified, so I need not repeat them here. In summary, however, academic journals need to correct or retract already published articles that are significantly flawed, establish a review system for future manuscripts that will minimize the likelihood of their publishing articles with significant misinformation, and encourage readers to submit concerns about journal articles. Government entities that fund social science empirical environmental policy research need to establish a grant approval system that will minimize the likelihood of their funding misguided research. EPA should create an Internet site where everyone, including its staff, can contribute and review concerns expressed about social science empirical environmental policy research.
What is important to recognize is that this is not a problem that can be solved by educating and motivating the consumers involved (in this case, those who read and use the research). This situation is not like educating consumers that smoking is bad for them or that they should read the nutritional labels on food products. It is unrealistic and horrifically inefficient for consumers of this research to obtain the knowledge needed to independently evaluate the substance of social science empirical environmental policy studies and to recreate the datasets used in those studies. Even with my over 25 years of environmental policy experience and extensive knowledge of environmental databases, this was a challenging and time-consuming process. That is why the focus must be on preventing flawed studies from being published in the first place, rather than counteracting them afterwards. For example, in a 2001 article that I published (the one you probably are referring to as my “2002 critique”), I believe that I clearly and laboriously demolished two previously published studies on the same subject, yet those studies have still been favorably cited by other articles since then. Having worked outside of academia in environmental policy research organizations, I know that implementing a competent quality control system is not difficult -- it is standard operating procedure everywhere but in academia. This defective system exists in academia because of the lack of accountability and because the lack of quality control benefits the people involved in the academic process.
- How did lawyers respond to the 2002 critique?
As my report cursorily mentioned (see footnote 10), there were three short articles by very prominent law professors responding to the 2002 critique, published in the same issue as that critique (if you would like any of them or the 2002 critique, let me know, as I have PDF copies [see footnote 1, below]). Not surprisingly, they were unhappy. In general, the rebuttal articles acknowledged that there were some problems in law review empirical articles, but argued that the 2002 critique exaggerated their extent or failed to recognize that it is appropriate for law review articles, supposedly unlike social science journal articles, to advocate certain positions; thus, like attorneys' arguments in legal proceedings, they should not be assumed to offer balanced, unbiased, and fully informative substantiation for their positions. One of the rebuttal articles was by a law professor who authored an article that the 2002 critique specifically criticized, and he attempted to defend his work. The authors of the 2002 critique replied to his defense with a revelation that reminds me of my exchange with Thomas Lyon. I have not seen any further responses to the 2002 critique and am unaware if law reviews did anything to address these concerns. In the years since the 2002 critique there has been a large increase in quantitative research published in law reviews, so it would be interesting for someone (not me) to examine whether the situation has improved in the last 10 years.
Unlike most social science and law professors, I have read a large number of both law review and social science journal articles. The good news for readers of law review articles is that they typically provide copious substantiation for their assumptions. The bad news is that we cannot be confident that the offered substantiation is complete and balanced. The bad news about social science empirical environmental policy articles is that they typically provide little, if any, substantiation for their assumptions. The good news is supposed to be that the peer reviewers of these articles ensure that the articles' assumptions are correct, so that readers do not have to be concerned about the lack of substantiation. Obviously, the peer reviewers have failed with respect to social science empirical environmental policy articles, and this produces the worst of all worlds: no assurance that incorrect articles were prevented from being published, and no substantiation provided in the articles to let readers check for themselves whether there were inaccuracies or biases.
I sent this email to Lyon:
Dear Professor Lyon,

As you are surely aware, Mr. Atlas accused you of professional negligence (and perhaps fraud).
Do you have any response?
People are quite interested to see if academics actually debate, or avoid the topic, hiding behind tenure/title/past events/etc...
So, I'd prefer to hear what you say (or get a link to where you've replied elsewhere).

His response:

Thanks, but I really don't think he is worth my time.

Independent of that exchange, Mark offered the following prediction on whether Lyon would reply:
No news from Lyon, nor do I expect any. My consistent experience with social science academics is that if they are not certain that they will win an argument, they run away (and try to get revenge through indirect means).
Epstein and King published their 2002 paper [pdf] in the University of Chicago Law Review, arguing that legal scholars were not using rigorous empirical methods.
Responses by Revesz [pdf], Cross et al. [pdf], and Goldsmith and Vermeule [pdf].
Epstein and King respond to them [pdf].