Comments on Questions

About John R. Lott’s Claims Regarding a 1997 Survey

 

Personal Note, January 17, 2003: 

 

In September 2002 I offered to look into a question raised about John Lott’s work.  I thought that offering such help to Lott and to the profession was the responsible thing to do when serious questions were raised, and I thought it would be exceedingly simple to establish that a survey of 2,424 people had been done.   While I recognized that it is extremely easy to lose data in a computer crash, I had not anticipated that Lott would claim to have done a large national survey without discussing the sampling design with anyone, without leaving any financial or other records of the study, and without remembering anyone who had worked on it.  I had never heard of a professor doing anything of that size with no funding, paid support, paid staff, phone reimbursements, or records (though there are probably precedents for such an unusual method).  As it stands now, unless someone comes forward to verify working on the study--as I still hope occurs--we may never know with any certainty whether the 1997 study was done.  Although I strongly favor emailing 1997-98 University of Chicago college graduates to see if any remember any classmates working on the study, John Lott now raises serious questions about how complete the University’s alumni records are, rendering that approach a less reliable route to an answer than I had anticipated.

 

I recently contacted Otis Dudley Duncan, who first raised questions about Lott’s claim that 98% of defensive gun uses involved the mere brandishing of a weapon.  He agrees (as I do) with this statement of Tim Lambert on Lambert’s website: 

 

“I should comment on the overall significance of this question. Lott's 98% claim takes up just one sentence of his book. Whether or not it's true, it doesn't affect his main argument, which is about alleged benefits of concealed carry laws.” 

 

So there is agreement even among those who have raised questions about Lott’s work that his 98% claim is not central to his book, More Guns, Less Crime.  Both Duncan and Lambert, however, emphasize their belief that whether the study was done does go to John Lott’s credibility.

 

I find recent developments in this affair personally troubling.  I carefully recorded what John Lott told me and now Lott has changed the story he told me in several specific ways--most of them minor.  They are discussed in my revisions to this report.   I have no research interests in this subfield and no ideas for further efforts to get to the bottom of this inquiry beyond surveying graduates and Lott’s looking at picture books of former students. This project detracts from my other scholarly efforts.  Accordingly, my part in this affair is essentially done, at least if John Lott will stop changing his stories about our conversation.  If not, then I suspect that I will have to stay in it a little longer, at least to respond to comments on this report.

 

For those who have been following the dispute over Lott’s 1997 study: other than a few fairly small changes, the portions of this report that are new on January 17, 2003 are this Personal Note, Sections 4 and 5 (“Comments on John Lott’s Response to this Report” and “Conclusion”), and Appendix 3 (John Lott’s email responding to this report).

 

James Lindgren

Professor of Law

Northwestern University

 

 

 


Comments on Questions

About John R. Lott’s Claims Regarding a 1997 Survey

 

December 24, 2002;

Revised January 12, 2003;

Sections 4-5 and Appendix 3 added on January 17, 2003
Chicago, IL

1.  Background and John R. Lott’s Claims

 

a.  Different Sources Cited for the 98% of Defensive Gun Uses Being Mere Brandishing

On September 15, 2002 on the discussion list FirearmsRegProf, Tim Lambert raised the possibility that John R. Lott fabricated a study claiming that 98% of defensive gun uses involved brandishing with no shots being fired.  Lambert wrote: "it seems increasingly likely that the survey is fictional."  He was following up on problems first identified by Otis Dudley Duncan and discussed publicly in the Jan./Feb. 2000 issue of The Criminologist.  These extremely serious charges concerned a study referred to in the second edition of Lott’s More Guns, Less Crime, but never actually published.  Over the years, however, Lott has referred to this 98% number over four dozen times in publicly available sources.

 

Here is Lott’s statement in the (first) May 1998 edition of More Guns, Less Crime, which attributes the number to “national surveys.”

 

"If national surveys are correct, 98 percent of the time that people use guns defensively, they merely have to brandish a weapon to break off an attack." More Guns, Less Crime (University of Chicago Press, 1998), p. 3. 

 

The year before, in the July 16, 1997 Wall Street Journal, Lott appeared to attribute the 98% figure to one or more of three specific survey organizations:

 

“Other research shows that guns clearly deter criminals. Polls by the Los Angeles Times, Gallup and Peter Hart Research Associates show that there are at least 760,000, and possibly as many as 3.6 million, defensive uses of guns per year. In 98% of the cases, such polls show, people simply brandish the weapon to stop an attack.” John R. Lott Jr., Childproof Gun Locks: Bound to Misfire, Wall Street Journal, 7/16/97 Wall St. J. A22

 

The same language (other than typesetting conventions) appears the following year in two articles by Lott on the same topic for the Chicago Tribune and the Washington Times. John R. Lott Jr., Prime Suspect: Gun-Lock Proposal Bound to Misfire, 8/6/98 Chi. Trib. 23; John Lott, Commentary: Gun Locks That are Bound to Misfire, 8/14/98 Wash. Times (D.C.) A17. 

 

            Starting in January 1999, Otis Dudley Duncan began writing a series of letters to Lott questioning aspects of his work, including the 98% figure.  In May 1999, Duncan informed Lott that he was writing an article calling the 98% figure a “rogue number” and then sent him a draft of an article containing these words: “The '98 percent' is either a figment of Lott's imagination or an artifact of careless computation or proofreading."

 

            Lott then called Duncan on May 21, 1999 and, for the first time, told Duncan that he had conducted a hitherto unrevealed study in 1997.  Not long after that phone call, Duncan received a letter dated May 13, 1999, which also mentioned a 1997 study.  In that letter, Lott responded to Duncan’s suggestion that the 98% figure might have resulted from a misreading of a study by Gary Kleck:

 

“I am a great admirer of Gary Kleck's work, and I think that he has done a great deal to advance the study of crime.  Few academics have his integrity and courage.  His numbers are a little higher in terms of the total number of defensive uses that I have found and the frequency of brandishing is lower than I have found.  The information of over 2 million defensive uses and 98 percent is based upon survey evidence that I have put together involving a large nationwide telephone survey conducted over a three month period during 1997.  Follow up telephone calls were made to ensure that the questions were answered by those who we attempted to contact.  The survey was not as detailed as several other surveys, but it did try to include a couple initial questions to ensure accuracy and screen out any problems and then focus exclusively on defensive gun uses.  I plan on repeating the survey again during the next year to year and a half.  I will be happy to inform you what the results of that survey are after I have conducted it.”  Letter from John Lott to Otis Dudley Duncan, dated May 13, 1999.

 

        Note that Lott makes no mention of having lost the data for the 1997 study.  According to Duncan, Lott’s claim that he had lost his data surfaced only much later.

 

        In the second (2000) edition of More Guns, Less Crime, Lott gives the same 98% figure, but (as in his letter to Duncan) attributes the number to his own study:

 

"If a national survey that I conducted is correct, 98 percent of the time that people use guns defensively, they merely have to brandish a weapon to break off an attack." More Guns, Less Crime, second edition (University of Chicago Press, 2000), p. 3.

 

In an email to me on December 26, 2002, Lott writes that he submitted the manuscript at least 9 months before publication, which would place Lott’s submission of this language in late 1999.

 

Then, in February and March 2000, Lott gives the same 98% figure, but this time attributes it to Gary Kleck, an authority that he had disavowed as the source for the 98% figure in 1999.  In his March 2000 piece on gun locks, Lott wrote: 

 

"Guns clearly deter criminals. Americans use guns defensively over 2 million times every year--five times more frequently than the 430,000 times guns were used to commit crimes in 1997, according to research by Florida State University criminologist Gary Kleck. Kleck's study of defensive gun uses found that 98 percent of the time, simply brandishing the weapon is sufficient to stop an attack." John Lott, Gun Locks: Bound to Misfire, online publication of the Independence Institute, March 1, 2000

 

Lott used almost identical language in a version of the same article on February 9, 2000, also published by the Independence Institute.  Kleck’s work does not support this claim, though (as I understand it) some others have mistakenly read it as supporting this claim.  As Duncan points out based on a discussion with the Independence Institute, there is a chance that Lott might have written the article earlier than 2000, though when he covers the same ground in several articles in 1997-98, he does not list Kleck as the source. 

 

            Then in his comment in the September/October 2000 Criminologist, Lott returned to claiming that he got his 98% number, not from Kleck, but from his own 1997 study.  Otis Dudley Duncan raised questions about the 98% figure in The Criminologist (Jan./Feb. 2000), after exchanges between Lott and Duncan that occurred in 1999.  In response to Duncan’s comments in The Criminologist, Lott describes his 1997 survey, also in The Criminologist (Sept./Oct. 2000):

 

“The survey that I oversaw interviewed 2,424 people from across the United States. It was done in large part to see for myself whether the estimates put together by other researchers (such as Gary Kleck) were accurate. The estimates that I obtained implied about 2.1 million defensive gun uses, a number somewhat lower than Kleck's. However, I also found a significantly higher percentage of them (98 percent) involved simply brandishing a gun. My survey was conducted over 3 months during 1997. I had planned on including a discussion of it in my book, but did not do so because an unfortunate computer crash lost my hard disk right before the final draft of the book had to be turned in. Duncan raises a related issue that "Lott may well have read Will, in as much as Will's article is in the bibliography of More Guns, Less Crime. ... Did Lott borrow the '98 percent' from Kleck . . . from Snyder, via Will? Even if that account explains part of the puzzle, the question remains. Where did the 2 million come from?" (Page 6) The course that Duncan tries to follow - from a 1988 article by Kleck to a 1993 piece by Snyder to George Will to my book because it cites Will - is fascinating. Yet, I am not sure why this entire discussion was necessary since I told Duncan on the telephone last year that the "98 percent" number came from the survey that I had done and I had also mentioned the source for the 2 million number.

 

Thus the source that Lott gave for the 98% figure has shifted over time:

 

1.  In the 1998 edition of More Guns, Less Crime, John Lott attributes the 98% figure to “national surveys.”

 

2.  Elsewhere in 1997 and 1998, Lott appears to attribute the 98% figure to “such polls” as the “Los Angeles Times, Gallup and Peter Hart Research Associates.”

 

3.  In a May 13, 1999 letter and in 1999 revisions to the 2000 edition of his book, Lott attributes the 98% number instead to his own 1997 study, saying in the letter that the 98% figure is not based on Kleck.

 

4.  In almost identical February 9, 2000 and March 1, 2000 articles for the Independence Institute, Lott switches to attributing the 98% figure, not to his own study, but to Gary Kleck’s (which does not support this figure).

 

5.  In the Criminologist (Sept./Oct. 2000), Lott switches back to claiming that the 98% figure came from Lott’s own 1997 study, not from Kleck, which is where things stand as of this report. 

 

Note that by using the plural “national surveys” and “such polls” Lott is stating that there is more than one study showing the 98% figure, yet he now insists that the 98% figure came from his own study, not Kleck’s (and Kleck’s study does not support the 98% figure).  Indeed, as discussed in the next section, because the 98% figure is supposed to be based on his own study, not those done by others, Lott says that his critics will have “nailed” him if they find that he began talking about the 98% figure before he says he did his study in 1997.  Yet, if Lott really based the 98% figure on his own study alone, it seems strange that he would attribute the 98% figure to such plural entities as “national surveys” or “such polls”--as he did until Duncan challenged him in 1999 and Lott revealed for the first time that the 98% figure was based on a study he did himself. 

 

 

 

c.  A Study Done “over 3 months during 1997”

 

In the Criminologist, Lott also wrote:  “My survey was conducted over 3 months during 1997.”  Lott called me on the telephone and repeated that he had conducted the study over several months during 1997.  If he spent 3 months doing it in 1997, as he claims, the earliest that he could have completed it would be early April.  Further, in an email to me Lott wrote, “I am willing to bet that I don't start mentioning this [98%] figure until the spring of 1997.  If I use it before I said that I did the survey, I will say that they nailed me.  But if I only started using it about the time that I said that I did the survey, I think that it would be strong evidence the other way.” 

 

The earliest mention of the 98% figure that Dudley Duncan has located is from February 6, 1997, nearly two months before Lott could have completed the survey, according to his prior claims:

 

"There are surveys that have been done by the Los Angeles Times, Gallup, Roper, Peter Hart, about 15 national survey organizations in total that range from anything from 760,000 times a year to 3.6 million times a year people use guns defensively. About 98 percent of those simply involve people brandishing a gun and not using them."

Page 41, State of Nebraska, Committee on Judiciary LB465, February 6, 1997, statement of John Lott, Transcript prepared by the Clerk of the Legislature, Transcriber's Office [Otis Dudley Duncan & Tim Lambert, http://www.cse.unsw.edu.au/~lambert/guns/lottbrandish.html]

 

I have not verified this transcript, but if accurate, Lott’s February 6, 1997 testimony is inconsistent with Lott’s claims--published, by phone, and by email--that he did the study in “three months” (and “several months”) in 1997.  Lott himself set up this timing issue as the test of whether he “will say that they nailed me.”  It is possible that Lott's memory was simply off on the year the study was done, that he instead did it in 1996, but if Lott did indeed conduct the study over 3 months in 1997 (as Lott wrote in the Criminologist), then he could not have reported the results on February 6, 1997, just weeks after beginning data collection.

 

After this discrepancy was noted in the first draft of this report, on December 26, 2002, Lott wrote me by email:

 

“The overwhelming majority of the survey work was done at the beginning of the period over which the survey was done.  It has obviously been a while, but my recollection is that the small number of people surveyed after the first four or five weeks (mainly January 1997) did not include any more defensive gun uses.”

 

While again this story is certainly possible, Lott himself gave spring 1997 as the time before which he should not have been discussing the 98% figure.  Additional matters bear mentioning.

It hardly matters whether all of the defensive gun uses were found in the first 4-5 weeks of the study, since Lott could not have known that at the time he spoke about the results unless data collection were complete.  If data collection were partial, the percentage of respondents reporting defensive gun uses would have been higher than in the full sample.  Collecting so much data in 4-5 weeks would have been unusual for unpaid volunteers who were full-time undergraduate students at the University of Chicago at the time, unless there were a very large number of volunteers.

As I discuss below in the section on technical problems with the study, Lott’s numbers suggest that only ½ or 1/8th or 5/8ths of a respondent reported certain kinds of defensive uses.  The partial respondents necessary to support Lott’s percentages would most likely result from some extreme demographic weights being applied after the data collection were complete and the results were compared with the characteristics of the adult population.  If the study were not complete, it would be very unlikely that someone would have weighted the results against the general population before knowing how skewed his sample was.  Such weighting is not easy and would have been a colossal waste of time before data collection were complete, since the weights would need to be redone at the end of data collection.  Last, of course, Lott does not mention that he is reporting partial data in his February 6, 1997 testimony.

 

There are other, more ambiguous contextual clues that Lott had contrasted his main work, which was done on county-wide data, with surveys done by others, which involved household surveys.  For example, in the Washington Times in 1999, Lott wrote:

 

“Indeed, about 450,000 crimes, including 10,744 murders, were committed with guns in 1996. But Americans also use guns defensively over 2 million times a year and 98 percent of the time merely brandishing the weapon is sufficient to stop an attack.

In my own recent research on gun ownership across states and over time, I found that states with the largest increases in gun ownership rates had the largest drops in crime rates.”  John R. Lott, Lethal Handgun Fears, 2/24/99 Wash. Times (D.C.) A17. 

 

In a long 1999 Chicago Tribune Magazine story, after speaking to John Lott, Linnet Myers twice contrasted Lott’s county-wide work with Kellermann’s household study: 

 

Bolstering the other side is Dr. Arthur Kellermann, of Emory University's Center for Injury Control in Atlanta. His research indicates that owning a gun is far more dangerous to a homeowner than it is to potential intruders.

. . .

Lott didn't examine home protection, but he did study the impact of armed self-defense. In his book, "More Guns, Less Crime" (University of Chicago Press), he wrote that violent crime dropped noticeably in the 31 states that now give permits to qualified citizens who want to carry handguns. Twelve states allow permits in certain cases. Seven, including Illinois, prohibit carrying.

[Tom] Smith points out that while the two researchers clearly support opposing sides in America's gun debate, their findings aren't exactly opposite. Kellermann addresses the risks of keeping a gun at home and he measured only self-defense shootings--not occasions when guns were used simply to threaten.

Lott didn't study gun use at home, but looked at the impact of laws that allow guns to be carried outdoors. Even so, Lott said that in most cases of self-defense, "people merely need to brandish a gun . . . less than 2 percent are fired." He said guns particularly help women, who become more "equal" to men when they're armed. "Women who behave passively are 2.5 times more likely to end up being seriously injured than women who are able to brandish a gun when confronted by a criminal," said Lott. Linnet Myers, Go Ahead Make Her Day With Her Direct Approach And Quiet Confidence, Chicago Lawyer Anne Kimball Gives Gunmakers A Powerful Weapon, Chicago Tribune, 5/2/99 Chi. Trib. 12.

 

If this newspaper account is accurate (and newspapers often aren’t), it is odd that Lott would try to answer the reporter’s claims about the Kellermann household study without pointing out that he had done a big household study himself.  Although this contextual evidence is less telling, it does tend to fit the pattern that, until Lott replied to Duncan in mid-May 1999, Lott had consistently attributed the 98% figure to several specific survey organizations or to no one, never to his own 1997 study.
 

2.  Technical Problems

Another problem (mentioned by Tim Lambert and Dudley Duncan) is the small and almost impossible numbers of respondents on which Lott would have based his claim about the rate of firing v. merely brandishing.  According to Lott, the study found that 98% of the defensive uses of a gun involve mere brandishing; of the remaining 2%, 3/4ths involves firing warning shots and the other 1/4th involves firing at a person threatening the shooter.  With Lott’s estimate of 2.1 million defensive gun uses a year and about 200 million adults, that would mean that about 25-26 respondents reported defensive gun uses out of his 2,424 people surveyed.  Thus, only ½ of a person (2% of 25 people) reported firing a gun--and that ½ of a person breaks down further into 3/8ths of a person firing warning shots and 1/8th of a person firing at someone. 

While one can get fractions of a person if one weights respondents by their numbers in the general population, getting 8 times more people in a sample group than ideal would be rare (the number needed to justify a weight of 1/8th).  Further, how can one generalize three different rates (firing, firing a warning shot, and firing at a person) from at most only one (or two) people, people who are so unrepresentative that they are weighted as 1/8th or 3/8ths of a person?

It is possible that a multi-year window was inquired about, though even with a five-year rate, one still has only 5/8ths of a person and 15/8ths of a person to support his reported rates.  Further, Lott never gives the rate as being for a multi-year window such as 1993-97, sometimes implying that the rate is for 1997.  Even if a five-year window were used, the numbers of respondents would still be so low (e.g., 5/8ths of a person) as to be unreliable for reporting a rate. 
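The fractional-respondent arithmetic above can be checked directly.  Here is a minimal sketch in Python, using only the figures cited in this report (Lott’s estimate of 2.1 million defensive uses per year, roughly 200 million adults, a 2,424-person sample, and the reported 2%/75%/25% breakdown); the variable names and the calculation itself are mine, not Lott’s actual tabulations, which are unavailable:

```python
# Implied respondent counts behind the reported percentages,
# computed from the figures cited in this report.

annual_dgu = 2_100_000      # Lott's estimate of defensive gun uses per year
adults = 200_000_000        # rough U.S. adult population
sample = 2424               # reported sample size

dgu_rate = annual_dgu / adults          # ~1.05% of adults per year
dgu_respondents = dgu_rate * sample     # ~25.5 respondents reporting a use

firers = 0.02 * dgu_respondents         # 2% fired: ~0.51 of a person
warning_shots = 0.75 * firers           # ~0.38 (about 3/8ths of a person)
fired_at_person = 0.25 * firers         # ~0.13 (about 1/8th of a person)

print(round(dgu_respondents, 1), round(firers, 2),
      round(warning_shots, 2), round(fired_at_person, 2))
```

The sketch simply makes explicit why, at the rates Lott reports, the firing and warning-shot categories rest on fractions of a single respondent.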

Unless John Lott can come up with a sensible explanation for why his rates could possibly be justified with only a 2,424-person sample, it is my opinion that Lott should withdraw the 98% figure as probably erroneous and, in any event, too unreliable to form the basis of an estimated rate.  Perhaps he has an explanation that has not yet come to light.  If not, withdrawing the 98% figure in some appropriate way would be a simple matter of good social science. 

 

After this report was written, Lott disclosed that he had done a new survey of about 1,000 respondents in the fall of 2002.  Daniel Polsby has spoken to enough people involved and seen enough records to determine conclusively that this new study was done.  Even more than with the earlier study, however, I don’t see how one can get an estimate of something that Lott says happened to about 1 out of every 4,800 people each year (2% of 1.05%) with a sample size of just over 1,000 people, asking about their experiences over the last year.
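The same back-of-the-envelope check applies to the fall 2002 survey.  Assuming the rates Lott reports (2% of the roughly 1.05% annual defensive-use rate) and a sample of about 1,000 asked about the past year, the expected number of respondents who fired a gun works out to a fraction of one person; this is my illustrative calculation, not Lott’s:

```python
# Expected number of gun-firing respondents in a ~1,000-person sample,
# at the rates reported in this dispute (illustrative calculation).
fire_rate = 0.02 * (2_100_000 / 200_000_000)  # 2% of the ~1.05% annual rate
sample_2002 = 1000
expected_firers = fire_rate * sample_2002     # about 0.21 of one respondent
print(expected_firers)
```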

 


3.  The More Serious Issue: Was the 1997 Study Ever Done?

 

a. Circumstantial Evidence of a 1997 Computer Crash

The more serious question, which Tim Lambert has raised, is whether Lott ever did the 1997 survey giving rise to the 98% figure.  As I posted in September, all evidence of a study with 2,400 respondents does not just disappear when a computer crashes. Having done one large survey (about half the size of John Lott’s) and several smaller surveys, I can attest that it is an enormous undertaking. Typically, there is funding, employees who did the survey, financial records on the employees, financial records on the mailing or telephoning, the survey instrument, completed surveys or tally sheets, a list of everyone in the sample, records on who responded and who declined to participate, and so on. While all of these things might not be preserved in every study, some of them would almost always be retained or recoverable.  Just to get a representative list of the US public would take consultation with an experienced sampler and probably the purchase of an expensive sample.  As far as I know, there was no cheap commercial list of almost every person or household in the United States from which to draw a good sample.

According to Lott, he lost all of his data on his hard drive when it crashed in June of 1997.  I talked with one of Lott’s co-authors on another paper, Bill Landes, and received emails from David Mustard, another co-author, and Gregory Huck, Lott’s editor at the time at the University of Chicago Press.  With varying degrees of certainty, all give circumstantial support to Lott’s story of a sudden loss of data and text on projects, requiring delays and regeneration of work.  Further, Mustard recalls hearing about the 1997 study, though when Lott told him about it is a little unclear from Mustard’s email to me:

 

John told me that he had conducted a survey in 1997. I did not participate in the survey--it was after our concealed carry paper had been published (Jan 1997) and was after I was on the job market and while I was finishing my dissertation and then moving to Georgia (Aug 1997).

 

John had some major computer problems in 1997 or 1998--I am not sure of the exact timing, but I think I was already here in Georgia.  John lost a lot of data and material and I and others tried to replace what he lost. Lott indicated that the survey results were not backed up and that he lost all of that, none of which could be replaced. Nobody else had the survey data. That is about all I know about the survey.

 

John has always impressed me with his willingness to give out his data; to anyone who requests them. To my knowledge John has always released his data to anyone who asks of it. In fact, we gave out our data about 4 months before the article even came out in print. We have now given our data out to about 75 people from around the world; perhaps more[.] As I understand the survey situation, John does not release the survey data because he no longer has it, not because he is unwilling to do so.

 

 

Mustard also is not sure when the data loss occurred.  At first he uncertainly placed it after he moved from Chicago in August 1997, later than Lott’s time of June 1997.  But in a follow-up December 26, 2002 email to me after the first draft of this report, Mustard quite plausibly explained:

 

“As to the _date_ of John's computer crash, it could have happened in June 1997. My previous response would more accurately be that I _sent him the data_ after I was at Georgia. I do not really remember _when_ it crashed, only that it did and I sent him the data shortly after I arrived in Georgia. Given that I was gone for large parts of July and that our possessions were being shipped, it is feasible that John's crash could have occurred in June and I sent him the data after I got set up in Georgia.”

 

David Mustard, like Bill Landes, Russell Roberts, and Daniel Polsby, also went out of his way to note Lott’s scrupulousness in sharing data on other projects.



            b.  No Direct Evidence of a 1997 Survey

 

As for obtaining direct, rather than circumstantial, evidence that the study was done, I did not fare as well as might be expected.  Lott called me and told me the following:

1.  Lott had no funding for the project; he paid for expenses himself.


2.  The survey was done by phone by several University of Chicago undergraduate volunteers in their junior or senior years in 1997, so there are no financial employee records.


3.  The calling was done by the undergraduates from their own phones.  Periodically, they would bring over their phone bills and Lott would reimburse them out of his own pocket--either in cash or by check.  Asked whether he retained his checks, Lott said that he destroyed them after 3 years. I did not think to mention it, but the phone research expenses should have been deductible if they were not reimbursed by his employers.


4.  Lott does not remember the names of any of the undergraduates who did the calling for him.


5.  Lott had no discussions with any samplers about his sampling design.


6.  Lott did not weight his sample for household size and did not ask how many adults were in the household, as is standardly done; this renders his results too heavily influenced by small households. [In response to Lott’s recent question to me about why weighting for household size is necessary: if the unit of interest is households rather than people, then you do not have to weight responses by the number of adults in the household.  If, however, the unit of analysis is people rather than households, then you should weight for household size; otherwise you will overrepresent people who live alone and underrepresent those who live in large households. For example, if the average number of adults per household were 2, then you would normally weight the result for a person living in a household with 4 adults by 2, and weight the result for a person who lives alone by ½.  I asked this question of Lott for another reason; it might have explained his small apparent weights for some people if he had adjusted for household size, but he didn’t.  Sampling is a lot more complicated than this simple example implies, which is why very few people would attempt a national sample without consulting a sampling expert.] 


7.  For his list from which to draw the sample, Lott used a commercially available CD-ROM with names on it.  He does not remember where he got it, nor does he still have the CD.

 
8.  Lott does not remember how he drew his sample from the CD-ROM.


9.  Lott does not have a copy of the survey instrument and doesn’t remember the wording of the questions, though he was probing defensive uses in more detail than other studies.  He ended with a very few demographic questions.


10.  Lott weighted his respondents by demographic information taken from his main national study in More Guns, Less Crime.


11.  In his book More Guns, Less Crime, Lott had planned to include a chapter on the 1997 study, a chapter that he had not yet written, but decided not to do so after the data loss.  He did not end up publishing the 1997 study itself, just referring to it many times, including a sentence about it in the second edition of More Guns, Less Crime.


12.  Lott thinks that he did not retain any of the tally sheets, though he is not certain.  He reported that he might have tossed out tally sheets or other evidence of the 1997 study during one of his several moves over the years.
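The household-size weighting described in point 6 can be illustrated with a short Python sketch.  The respondents here are hypothetical, and the average of 2 adults per household is the illustrative assumption used in the example above:

```python
# Illustrative person-level household-size weighting (hypothetical data).
# In a household phone survey, a person in a large household had a smaller
# chance of being the one reached, so is weighted up; a person living
# alone is weighted down.

respondents = [
    {"id": "A", "adults_in_household": 1},   # lives alone
    {"id": "B", "adults_in_household": 2},
    {"id": "C", "adults_in_household": 4},
]

avg_adults = 2  # assumed population average, as in the example above

for r in respondents:
    r["weight"] = r["adults_in_household"] / avg_adults

print([r["weight"] for r in respondents])   # [0.5, 1.0, 2.0]
```

Without this step, a person-level analysis overrepresents people who live alone, which is the concern raised in point 6.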

 

The discussion of the tally sheets is possibly in conflict with what Lott wrote to Dudley Duncan on AEI letterhead dated June 4, 2002: “I used students to conduct the survey. Most (though not all) of the information was originally entered into the students' own computers and then they were combined onto my machine. You know what happened to my computer.”  This version of the story suggested original computer entry of data, which would have required central programming of a data form and then sharing of that program with several callers for their home computers.  From Lott’s discussions with me, I was left with the impression that the callers instead used sheets to record answers, which he thought that he no longer retained.  Besides the possible discrepancy between the two versions of the ways that the students collected and conveyed the data, if the data were instead sitting on the students’ computers, it is possible that Lott could have replaced most or all of his lost data by asking the students to give it to him again. 

With the surprising lack of any of the normal indicia of having done a large national study of 2,424 respondents, the key remains locating the undergraduates whom Lott says did the calling.  The 1997 study was large, extremely time-consuming, and very expensive in phone charges.  Getting 2,424 respondents with refusals and callbacks would have required thousands and thousands of phone calls.  Students would have had to spend many hours calling, which they and their friends would well remember.  With John Lott’s permission, I therefore contacted Saul Levmore, the Dean of the University of Chicago Law School, asking him to request from the alumni office the email addresses of the 1997 and 1998 college graduating classes at the University of Chicago, so that I could contact them to ask whether any of them or any of their friends had done research for John Lott during his fellowship at Chicago.  Levmore declined to make such a request of the University of Chicago Alumni Office at this time, but he did not rule out cooperating with a request coming from other quarters or in a different situation.  Of course, my request was in the form of fact-finding, rather than a complaint, which might have triggered a different process.

Thus, I have reached a temporary dead end.  If someone were to email the 1997-98 Chicago college alumni twice, it is likely that at least one of those who did the study for Lott would come forward, if the study were actually done.

 


4.  Comments on John Lott’s Response to this Report

 

a.  Several Changes in Lott’s Story

 

On January 14, 2003, John Lott sent an emailed response commenting on this Report to me, three bloggers, and several of his friends and colleagues.  Most of it was posted on the internet by Marie Gryphon.  Lott makes several good points, which you can read below in his own words in the third appendix to this report.  There are, however, several changes in his story from what he told me when he called me in September that bear mentioning. I will restate what Lott told me on the phone in September and then print what he is saying now:

 

THIS REPORT (above): “2. The survey was done by phone by several University of Chicago undergraduate volunteers in their junior or senior years in 1997, so there are no financial employee records.”

 

LOTT’S RESPONSE (Appendix 3 below): “Lindgren does not accurately report my conversation with him about how I paid people (in that I said that I possibly paid by check) . . . . Incidentally, I told Jim that there were “two” Chicago students. Those students had also gotten others that they knew from other campuses from places such as I think the University of Illinois at Chicago circle (but I am not sure that I remember this accurately).”

 

THIS REPORT: 3. “The calling was done by the undergraduates from their own phones. Periodically, they would bring over their phone bills and Lott would reimburse them out of his own pocket--either in cash or by check.”

 

LOTT’S RESPONSE:  “most of this next statement [“calling was done by the undergraduates from their own phones “] is correct except the point about the “possible” use of checks.”

 

The facts of my conversation with John are different from what he now remembers.  John Lott called me shortly after I posted a notice to the FireArmsRegProf discussion list on September 15, 2002 offering to help Lott sort out Tim Lambert’s extremely serious charges.  I was looking for evidence to support Lott’s claim that he did the 1997 study.  At the time I knew very little about the affair and had no other scenarios running around in my head beyond what Lott told me.  I did not even know when Lott called whether he had supposedly done a mail or a telephone survey.  I listened very closely to his answers, especially on questions that went to credibility.  I was pleased when he told me that he had sometimes paid the students by check and sometimes in cash.  There are a lot of reasons to pay by check, including the size of the phone bills that would have been involved and the costs being tax-deductible (though I wasn’t thinking specifically about that latter point at the time).  Even though I thought that Lott’s contacting his bank for check records would be unsuccessful, I thought that by claiming that he sometimes paid by check, Lott was leaving himself open to the possibility of being checked, which was a good sign for assessing his credibility.  Now, sadly, Lott has changed his story, claiming that he told me instead that he only “possibly” paid some by check.  That is not what he said to me in September.  I noticed then both what he said and his demeanor in saying it; he was not defensive or hesitant to say that he sometimes paid them by check.

 

What is most surprising about this claim is that Lott emailed me on January 13, 2003, the day before he wrote his long response quoted above.  On January 13 Lott wrote to me:

 

As to the check issue, I believe that the word "possible" was included, but you have to go on what you remember.  In any case, I am sure that I can get many people who will say that I paid them with cash over the years.

 

As you can see, on Monday afternoon, Lott is “sure” on one issue, but he only “believe[s]” that he included the word “possible” and says that I have to go on what I remember.  Just a day later, on Tuesday, Lott is suddenly certain that I am wrong, writing: “Lindgren does not accurately report my conversation with him about how I paid people (in that I said that I possibly paid by check)” and “most of [Lindgren’s] . . .  next statement is correct except the point about the ‘possible’ use of checks.”

 

            In September, I carefully listened to both Lott’s words and his demeanor, taking notes on the more important parts of the long conversation.  I wrote up those notes (which formed the basis for items 1-12 above, listing what Lott told me), later revising them into my report.  I retain a vivid recollection of what Lott told me on many points, including this one.  Four months after the conversation, Lott sent me an email saying that he “believe[s]” he qualified his having paid students by check with the word “possible,” but he appears not to be sure (as he is about having paid cash on other projects).  He tells me that I have to go on what I remember.  Then the very next day, Lott is certain enough to accuse me of “inaccuracy.”

 

Lott’s new assertion about there being only two University of Chicago students involved and the rest of the callers coming from other universities is perhaps the most disturbing of his changes of story.  John and I had a long discussion about how best to reach the University of Chicago students who did the calling.  When he first mentioned University of Chicago students, I thought he meant law students, but he corrected me, saying that they were all University of Chicago undergraduates, both those he dealt with and the callers.   We discussed perhaps looking at a picture book of Chicago students (which he was willing to do, though he didn’t sound optimistic about his being able to recognize the students).  He said that he "mostly" dealt with one or two guys who were seniors in 1997, but that some of the callers, who were also Chicago students, might have been juniors.  Indeed, when I suggested emailing just the 1997 senior class, Lott said that although he was pretty sure the ones he dealt with were seniors, some of the callers might have been University of Chicago juniors at the time, so I said I would try to email that class as well. We talked about how to contact the people who did the calling and at all times he talked about contacting University of Chicago students.  He never even hinted that there might be callers from any other school.  If he had mentioned University of Illinois-Chicago students, we would have discussed how to contact them as well.  Lott could not have been clearer that the callers were University of Chicago undergraduates.   Now his story has shifted.

 

Lott goes on to make less serious changes in his story.  For example:

 

THIS REPORT: “5. Lott had no discussions with any samplers about his sampling design.”

 

LOTT’S RESPONSE: “I had lunch Tom Smith during the fall of 1996. However, while I asked him many questions about surveys, I did not tell him what I was planning on doing because Tom works very closely with gun control organizations.”

 

A related LOTT RESPONSE: “Russell Roberts is someone that I bounced the survey questions off of and he can possibly talk to you about it, though he hasn't kept e-mails from 1997.”

 

One of the online commentators is reassured that Lott now says that he discussed the survey questions at the time with Russell Roberts of Wash. U.  But Lott did not say this when I asked him in September.  He said that he regularly asked Roberts for advice and that he might or might not have discussed his 1997 survey with Roberts.  He didn’t remember whether he did or not.  I spoke with Russell Roberts yesterday and Roberts said exactly what Lott told me in September--that Roberts regularly discussed matters with Lott, but couldn’t remember hearing about the 1997 study and couldn’t remember one way or the other whether Lott had discussed it with him.  Lott could well have discussed the study, but Roberts didn’t remember him doing so.

 

Also, on January 13, 2003, just the day before Lott first claimed in his response to this report that he discussed the survey questions with Russell Roberts, Lott wrote me the following:  “Russell is a friend who I talk to about lots of things.  Whether you classify him as an expert is up to you.  Hopefully, he will be able to recover an e-mail.”  This statement is much closer to what Lott said to me in September--then Lott had said that he discusses things with Roberts, but didn’t remember whether he had discussed the survey with him.

 

Lott’s recent change of story--that he specifically discussed the survey questions with Roberts--might well be true (Lott’s memory might have been jogged), but it is not what Lott said to me in September, nor did he mention it the day before he changed his story, when he reasserted his contacts with Roberts in an email.  If Lott had told me in September that he had discussed the survey questions with Roberts, I certainly would have called Roberts right away, as I did yesterday.  After all, I was looking for ways to verify the study. 

 

By the way, in Lott’s defense I must point out that some online commentators have falsely claimed that Lott never mentioned the study to anyone.  As my report above makes clear, David Mustard quite strongly confirmed Lott’s claim to have discussed the 1997 study (and his data loss) with him, though Mustard does not remember when he first heard about it.  

 

As to my statement that “Lott had no discussions with any samplers about his sampling design,” I said this because I asked Lott this question point blank and he flatly said “No.”  He did mention that he talked with Tom Smith, but he said that he did not discuss his sampling design with him.  One doesn’t just pull a national sample out of one’s head.  One usually either uses a random digit dialing program or a national sample provided or designed by an expert in survey sampling.  A CD-ROM with names on it designed for telemarketing is not the sort of thing academics usually use if they want a representative sample, which is the reason I asked whom he consulted on his sampling design. 

 

Another change in story involved the CD-ROM and the survey questions:

 

THIS REPORT: 8. “Lott does not remember how he drew his sample from the CD-ROM.”

 

LOTT’S RESPONSE:  “Not true. I told Jim that one of the students had a program to randomly sample the telephone numbers by state. My guess is that it was part of the CD, but on that point I can’t be sure.”

 

THIS REPORT: “9. Lott does not have a copy of the survey instrument and doesn’t remember the wording of the questions, though he was probing defensive uses in more detail than other studies. He ended with a very few demographic questions.”

 

LOTT’S RESPONSE: “It is also not quite correct to say that “doesn’t remember the wording of the questions.” I told Jim that I don’t remember the “exact wording” of the questions, but I gave him the general outline of the questions.”

 

Once again, Lott’s memory fails him.  I was listening closely to Lott’s answers to these questions to see what details he could pull off the top of his head.  I asked him how he drew the sample from the CD.  He said that he didn’t remember, but assured me it was drawn randomly.  I remember being disappointed in his answer because I thought that a social scientist would probably remember how he solved this problem of getting a random sample (there are several solutions).   Also, I am absolutely positive that he did not mention pre-stratifying the sample by state, which is a form of proportional sampling, not random sampling.  His current claim is inconsistent with his September claim made to me that the sample was drawn randomly from the list of names on the CD (though he didn't remember how he drew that random sample); he seems to be claiming now that he drew the sample proportionally by state then randomly within states.  I was paying close attention to any details he mentioned about sampling design and he never mentioned breaking down proportionally by state first.  I am not disturbed by the content of his claim so much as that he would think he could just tell me one thing in September (he did not remember how he drew the sample from the CD) and tell people something else in January (“I told Jim that one of the students had a program to randomly sample the telephone numbers by state”). 
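The distinction matters because the two designs are built differently.  As an illustration only (the report does not establish which, if either, method was actually used, and the data here are hypothetical), a minimal sketch contrasting a simple random draw from a phone list with a sample pre-stratified proportionally by state:

```python
import random

random.seed(0)

# Hypothetical phone list: (state, number) pairs, as might come from a CD-ROM.
phone_list = [("IL", f"312-555-{i:04d}") for i in range(6000)] + \
             [("WY", f"307-555-{i:04d}") for i in range(120)]

# Simple random sampling: every listing has an equal chance of selection,
# so each state's share of the sample varies by chance.
srs = random.sample(phone_list, 100)

# Stratified ("by state") sampling: fix each state's share of the sample
# in advance, then draw randomly only within each state.
by_state = {}
for state, number in phone_list:
    by_state.setdefault(state, []).append(number)

stratified = []
for state, numbers in by_state.items():
    share = round(100 * len(numbers) / len(phone_list))  # proportional quota
    stratified += [(state, n) for n in random.sample(numbers, share)]
```

Under the second design the state proportions are fixed before any dialing begins; only the draws within each state are random.  That is the difference between drawing the sample randomly from the whole list and drawing it proportionally by state and then randomly within states.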

 

            A similar problem obtains for question wording.  I asked him directly whether he remembered the wording for any questions.  He said “No,” not their wording, but he assured me that he was trying to probe defensive uses in more detail than prior studies had and that he ended with a few demographic questions.  He gave no details of his questions other than this.  The notion that he gave me “the general outline of the questions” beyond what I faithfully reported is just plain false.  Drafting questions to probe something you care about might (or might not) be just the kind of thing that would stick in someone’s mind, so I was hoping that he could come up with plausible approximate wording off the top of his head.  If he had, I certainly would have noted it.  He couldn’t.  I was looking for exculpatory evidence and was disappointed to find not much more than good evidence of his exemplary pattern of sharing data and very good circumstantial evidence that Lott had a major computer crash in 1997.

 

 

b.  Lott’s Attempt to Put Things in Context

 

            In his response to this report, John Lott makes several odd statements other than about our conversation.  For example, Lott claims: “I have told people directly (including Otis Duncan) from the beginning that the data were lost.”  Otis Dudley Duncan, who first raised questions about Lott’s 98% figure in early 1999, however, says that he first learned of any data loss when Lott published his comment in the Sept./Oct. 2000 Criminologist.  Duncan retained Lott's May 13, 1999 letter, the first documented instance of Lott disclosing that he did the 1997 study.  I have not seen a photocopy of it, though Dudley Duncan sent me a full transcript.  It is quoted earlier in this report, though I repeat here the relevant portion of that May 13, 1999 letter:

 

“The information of over 2 million defensive uses and 98 percent is based upon survey evidence that I have put together involving a large nationwide telephone survey conducted over a three month period during 1997. Follow up telephone calls were made to ensure that the questions were answered by those who we attempted to contact. The survey was not as detailed as several other surveys, but it did try to include a couple initial questions to ensure accuracy and screen out any problems and then focus exclusively on defensive gun uses. I plan on repeating the survey again during the next year to year and a half. I will be happy to inform you what the results of that survey are after I have conducted it.” Letter from John Lott to Otis Dudley Duncan, dated May 13, 1999.

 

As you can see, while far from conclusive on the point, Lott’s letter is consistent with Duncan's contention that Lott did not disclose that he had lost his data when he first notified Duncan in May 1999 about the 1997 study. Certainly, Lott said nothing about losing the data in this letter, the first documented time that Lott claimed to have done the 1997 study. 

 

Lott also claims that he has “always acknowledged” that the 98% figure is based on small samples. 

 

LOTT’S RESPONSE (Appendix 3 below): “As to so-called technical problems, I am [sic] have always acknowledged that these are small samples, especially when one breaks down the composition of those who use guns defensively.  Even the largest of the surveys have few observations in this category.”

 

As pointed out above in the section of my report called “Technical Problems,” in a sample of 2,424 respondents and a one-year window, Lott would find only about 25 respondents reporting defensive gun uses, 2% of whom (half a person) would have answered that they had fired their gun.  On the more than four dozen occasions collected by Dudley Duncan and Tim Lambert in which Lott mentioned the 98% figure, I do not see a single instance in which Lott “acknowledged that these are small samples.”  When Lott says that he has “always acknowledged” this fact, he seems to be in error.
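To make concrete how little statistical support roughly 25 observations can give a 98% figure, here is a back-of-the-envelope sketch (my own illustration, not a calculation appearing in the report or in Lott’s work) using the standard Wilson score interval for a proportion:

```python
import math

def wilson_interval(p_hat, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Roughly 25 respondents reporting a defensive gun use, 98% of them
# said (per Lott) to have merely brandished the weapon.
lo, hi = wilson_interval(0.98, 25)
print(f"95% CI for the 98% figure with n=25: ({lo:.2f}, {hi:.2f})")
# prints an interval running from roughly 0.83 up to about 1.00
```

On numbers this small, the interval spans from roughly five-sixths of defensive uses to essentially all of them, which is the sense in which a 98% point estimate from such a subsample rests on very little.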

 

 Lott uses another style of argumentation that I find troubling.  Lott writes:

 

As to the attribution of sources, look at the complete context of the quote Lindgren mentions:

 

Polls by the Los Angeles Times, Gallup and Peter Hart Research Associates show that there are at least 760,000, and possibly as many as 3.6 million, defensive uses of guns per year. In 98 percent of the cases, such polls show, people simply brandish the weapon to stop an attack. -- August 6, 1998, Chicago Tribune and August 14, 1998, Washington Times

 

References by Lindgren to things like the Linnet Myers piece in the Chicago Tribune to provide evidence that I didn’t do a survey or that I have changed my statements over time are simply bizarre.  Attached below is an edited down version of the letter that was published by me in the Tribune.  Myers used her article to refloat claims such as my Olin Funding, inaccurately reported exactly what the concealed handgun research covered, and claimed that "others haven't confirmed (my) findings."  I no longer have the original letter to the editor, but as I recall this is just a partial listing of her inaccurate statements.  The Tribune was not willing to run a longer letter, though the letter that they ran was quite long.

 

Lott says that he is going to give “the complete context” for a statement that I “mention” from the Chicago Tribune and Washington Times.  A fair-minded reader would conclude that Lott is actually quoting me, but he isn’t.  I didn’t quote the Chicago Tribune version of the statement, but rather the original version in the Wall Street Journal:

 

The year before, in the July 16, 1997 Wall Street Journal, Lott appeared to attribute the 98% figure to one or more of three specific survey organizations:

  

 “Other research shows that guns clearly deter criminals. Polls by the Los Angeles Times, Gallup and Peter Hart Research Associates show that there are at least 760,000, and possibly as many as 3.6 million, defensive uses of guns per year. In 98% of the cases, such polls show, people simply brandish the weapon to stop an attack.” John R. Lott Jr., Childproof Gun Locks: Bound to Misfire, Wall Street Journal, 7/16/97 Wall St. J. A22

 

The same language (other than typesetting conventions) appears the following year in two articles by Lott on the same topic for the Chicago Tribune and the Washington Times. John R. Lott Jr., Prime Suspect: Gun-Lock Proposal Bound to Misfire, 8/6/98 Chi. Trib. 23; John Lott, Commentary: Gun Locks That are Bound to Misfire, 8/14/98 Wash. Times (D.C.) A17.

 

Instead of responding to the quotation I actually use from his 1997 op-ed in the Wall Street Journal, Lott quotes an almost identical version of the same statement that he published in a later August 1998 op-ed in the Chicago Tribune, fails to mention that the words in the Tribune are under his byline, and then appears to provide "the complete context" for his own statement by questioning an unrelated May 1999 Chicago Tribune story by a reporter.  In the guise of providing “context,” Lott omits crucial information that is in my presentation above.  Lott omits that he first used the language in the Wall Street Journal in 1997 and presents the quotation as if these were the words of the Chicago Tribune or the Washington Times.  Lott never mentions that these are his own words, published in an op-ed under his byline--a failure that tends to undercut his claim that he is providing “the complete context.” 

 

Lott never says why he chose to quote a Chicago Tribune version of his statement rather than the earlier Wall Street Journal version that I actually quote, nor does he say what his Tribune op-ed has to do with a later Tribune story by a reporter.  Yet (whatever Lott’s intentions) I think a fair-minded reader might presume that one somehow gives context for the other--perhaps concluding that I am wrong to mention a 1998 statement in the Chicago Tribune because you can't trust the paper’s 1999 reporting on him.  That might appear to be a sound implication if one fails to realize that the 1998 Tribune statement he quotes is not the version I quoted (I quoted the 1997 Wall Street Journal), that the words he quotes are from his own 1998 op-ed in the Tribune under his own byline (a fact that he neglects to mention), and that he made a close version of the same statement in his own op-ed in the Wall Street Journal the year before (presumably untainted by any supposed Tribune bias).  Of course, Lott never tells us what part of the context he is providing for his 1998 Tribune op-ed, which I mentioned but did not actually quote.

 

In the course of this, Lott also criticizes my use of a 1999 Linnet Myers story in the Chicago Tribune.  I did qualify my use of that story with these words:

 

“If this newspaper account is accurate (and newspapers often aren’t), it is odd that Lott would try to answer the reporter’s claims about the Kellermann household study without pointing out that he had done a big household study himself.  Although this contextual evidence is less telling, it does tend to fit the pattern that, until Lott replied to Duncan in mid-May 1999, Lott had consistently attributed the 98% figure to several specific survey organizations or to no one, never to his own 1997 study.” (This Report, above)


In the course of his criticism of my use of that story, Lott makes a potentially damaging disclosure: he reveals that he published a 1999 letter in the Tribune complaining about errors in Myers’s story.  The text of his Tribune letter is included in Lott’s email response in Appendix 3 below.  Myers had written: “Lott didn't examine home protection, but he did study the impact of armed self-defense. . . . Lott didn't study gun use at home, but looked at the impact of laws that allow guns to be carried outdoors.” Linnet Myers, Go Ahead Make Her Day With Her Direct Approach And Quiet Confidence, Chicago Lawyer Anne Kimball Gives Gunmakers A Powerful Weapon, Chicago Tribune, 5/2/99 Chi. Trib. 12.

 

In his June 1999 letter to the editor published in the Tribune, Lott responded particularly to Myers’s sentence in which she claims that Lott didn’t study gun use at home, but rather the impact of laws that allow carrying guns outdoors:

 

 “My book analyzed FBI crime statistics for all 3,054 American counties from 1977 to 1994 as well as extensive cross-county information on accidental gun deaths and suicides. This is by far the largest study ever conducted on crime, accidental gun deaths or suicide. I examined not only concealed-handgun laws, but also other gun-control laws such as state waiting periods, the length of waiting periods, the Brady law, criminal background checks, penalties for using guns in commission of crime and the impact of increasing gun ownership. The only gun laws that produced benefits were those allowing concealed handguns. The evidence also strongly indicates that increased gun ownership on net saves lives.”  John Lott, Letter, Chicago Tribune, June 20, 1999.

 

Note that in his letter to the Tribune, Lott makes no mention of the 1997 household study he claimed to have done, even while responding to a sentence asserting that he didn’t study gun use at home.  Lott lists seven things he looked at: concealed-carry laws, waiting periods, waiting period length, the Brady law, background checks, extra penalties, and gun ownership rates.  In his published letter, he omits his 1997 telephone household study, even though Myers twice says he didn’t do a household study.  In Lott’s defense, Lott says that the letter was cut; the original was longer.  But the published letter as it stands includes a long sentence listing the sorts of inquiries Lott did, yet fails to mention his 1997 study, the one inquiry conducted at the household rather than the county level. 

 

 

5.  Conclusion

 

I think it prudent to withhold judgment on the question whether the 1997 study was done until an email inquiry of University of Chicago students has been done and its results are known.  I hope that John Lott and the University of Chicago Press will join in encouraging the administration of the University of Chicago to conduct or coordinate the appropriate email inquiry.  Further, I think it advisable that Lott examine the University of Chicago undergraduate picture book for the classes of 1997 and 1998, if such a book exists.  Perhaps a few names or faces might seem familiar and be worth contacting.

 

            I remain hopeful that University of Chicago undergraduates will come forward with a credible story about hours of phone calling in January 1997.  Everyone would be enormously relieved were that to occur.  If no one does come forward, Lott has done his career a great disservice this January by changing his story in so many ways.  Although most of these changes are small ones, the fact that he would make them at the worst possible time is profoundly disappointing to those of us who would like to think the best of him.  As it stands now, unless someone comes forward to verify working on the study--as I still hope occurs--we may never know with any certainty whether the 1997 study was done.   

 

 

James Lindgren
Professor of Law
Director, Demography of Diversity Project
Northwestern University School of Law
357 East Chicago Avenue
Chicago, IL 60611
312-503-8374

 

 

Note:  In some cases, I relied on secondary sources for quotations, in particular a compilation by Tim Lambert and Dudley Duncan <http://www.cse.unsw.edu.au/~lambert/guns/lottbrandish.html>--this is, after all, only an informal preliminary inquiry.  If I have misquoted anything, I would appreciate any corrections.

 

 

 

 

 

 

APPENDICES:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

1.  Email of 12/26/02 from John Lott to James Lindgren, commenting on the first draft of the report.

 

Dear James:

 

My survey never had the ambition of yielding precise numbers.  Rather the main intention was to check the general accuracy of the claim that there were 2.5 million defensive gun uses each year.

 

1) You are of course right that the sample of people using guns defensively is very small and has an obviously large 95 percent confidence interval associated with it. The estimate provided was a point estimate, nothing more and I have never made any pretense to it being more than that.  I am sure that you can provide the confidence intervals for the subgroups estimates.

 

2) The overwhelming majority of the survey work was done at the beginning of the period over which the survey was done.  It has obviously been a while, but my recollection is that the small number of people surveyed after the first four or five weeks (mainly January 1997) did not include any more defensive gun uses.

 

3) Unfortunately, the documentation and results for the survey was lost.  However, concerns about the accuracy of the survey can be addressed through replication. I did another survey over 10 days this past fall and it will be discussed in a book coming out in a couple of months.  The results of the survey are very similar to those previously reported.  All the documentation and the results will be made available to anyone interested in examining it after the book is released.

 

4) Just a note on your discussion of the timing of things.  The University of Chicago Press is not particularly fast.  It took them over nine months to publish the second edition after receiving a finalized manuscript.  For them, that was an incredibly fast turn around.  I am sure that the Press is happy to confirm these types of time lags for you if you are interested.  The time lags are much longer than for other types of publications, but you should consider it when putting together your time line.

 

5) I am not sure that I understand why things should be weighted by household size since I was asking questions about individual experiences.

 

Sincerely,

 

John Lott

 

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

2.  Email of 12/26/02 from David Mustard to James Lindgren, commenting on the first draft of the report:

 

Jim,

 

        I emailed quickly when I responded the first time and that is probably some reason for the ambiguity. I did not realize that you were trying to use it for such a complete accounting of the details and timing of the events. Below I try to be more clear about the timing of events.

 

        John and I started working on our paper in the fall of 1995 or so. We worked on it intensively from about Feb. 1996-Sep. 1996. We presented it at the Am Law and Econ meetings in May and then at a couple of Chicago workshops in either May or June of 1996. We finished the JLS proofs in Sep or so and were then essentially done with the article.

 

        As we finished the concealed carry paper John talked about working on other projects related to guns. So the first sentence of my previous response should be more accurately "... after our concealed carry paper had been finished (about Sep 1996)...". Once it was finished he started to work on a number of extensions, including the book. This is about the extent of my knowledge about John's activities and the timing of those activities from the fall of 1996 when we finished our JLS paper through the summer of 1997, when I left Chicago. I did not work with John on any of these other projects because I had to finish my job market paper and send my applications out (most due by 1 Dec 1996). In the rest of Dec 1996 I worked on my job market paper and practiced interviews. In Jan. 1997 I went to the AEA annual meetings and interviewed. I had campus visits through early Feb and signed my contract with Georgia around the second or third week of Feb in 1997. From March to June I finished my dissertation and taught. In July I defended and traveled a lot, and my wife moved down to Georgia. I moved to Georgia in early August.

 

        As to the _date_ of John's computer crash, it could have happened in June 1997. My previous response would more accurately be that I _sent him the data_ after I was at Georgia. I do not really remember _when_ it crashed, only that it did and I sent him the data shortly after I arrived in Georgia. Given that I was gone for large parts of July and that our possessions were being shipped, it is feasible that John's crash could have occurred in June and I sent him the data after I got set up in Georgia.

 

        I hope this more complete documentation is helpful. If you have other questions let me know.

 

        David

 

David B. Mustard

Terry College of Business

528 Brooks Hall

University of Georgia

Athens, GA 30602

 

++++++++++++++++++++++++++++++++++++++++++++++++++++++++

3.  Email of 1/14/03 from John Lott to various bloggers, colleagues, and James Lindgren, responding to  the second draft of this report:

 

Tue, 14 Jan 2003 15:51:35 -0500

Subject: Responses

 

Dear Everyone:

 

Here is a response to some of what has been going on over the web.  I have already sent much of this information to people who have already contacted me in person.  If Eugene would like to post this on his web site, I must ask that all the e-mail addresses and telephone numbers be removed.  If you all don't trust the leg work done by Dan Polsby on this issue, you can nominate someone else to go and do it, but I don't think that it is appropriate for everyone from Lambert on to go and harass these people.  I suppose that such an edited copy could be sent to Mark A. R. Kleiman.

 

Regnery (the publisher of my new book due the middle or end of March) wants me not to release the results from the poll last year.  They want me to keep quiet about the book until it comes out.  As has been reported previously, the survey was done with similar questions in a very similar way to what was done earlier, and the results were essentially the same.  I am sure that I could arrange it so that interested parties could question the person who kept the survey results as they came in, to confirm that we only got one person who said that they had actually fired a gun.

 

Here are some of the things that I have done to try to establish a record of events.  1) My wife contacted the bank that we had in Chicago and tried to get copies of bank statements and checks from that period of time.  Unfortunately, the bank does not keep copies of statements or checks longer than five years.  (If you would like to verify, we talked to Yvonne Macias in the bookkeeping department at University National Bank, [phone number omitted].)  Lindgren does not accurately report my conversation with him about how I paid people (in that I said that I possibly paid by check), but this information makes that point irrelevant.  2) I asked Sam Peltzman last year whether the Alumni Association has the e-mail addresses of past students.  Sam, who seems to know virtually everything that is going on at the University, told me that they have the e-mail addresses for at most 10 percent of the former students.  3) I had a former student and several-time co-author, John Whitley, place an ad in the Alumni magazine in the December issue to track down the students.  I don't know if the ad has appeared, but thus far I have gotten no response.

 

I have given out massive amounts of data to people on guns and other issues, and I will be happy to do so on the new survey.  Data has been given to critics as well as to people who have been unwilling to share their own data on other projects.  I have given out county, state, and city level crime data to academics at dozens of universities, with data sets ranging from 36MB to over 300MB.  I have given out data on multiple victim public shootings as well as on safe storage laws.  These different data have often been given out before the research is published, and sometimes even before it has been accepted for publication.  We are not talking about recent events or conversations, and there is a question about what is a reasonable time period for people to keep records.  There is also a question as to why people have waited so long to ask for this additional information when people have known about the lost data for years.

 

As to the claims about "apparently changing positions," I disagree.  I have told people directly (including Otis Duncan) from the beginning that the data were lost.  Op-ed pieces and other public statements where I mention these numbers briefly usually do not lend themselves to discussions of the sources of numbers.  The fact that David Mustard does not remember exactly when we discussed the survey 6+ years ago does not surprise me given how long ago this was.

 

Unfortunately, there are many problems with Lindgren's write-up.  He gives essentially uncritical acceptance to Otis Duncan's discussion of events in 1999.  Yet, while Lindgren writes that "Otis Dudley Duncan raised questions about the 98% figure . . . after exchanges between Lott and Duncan," Duncan's write-up in the Criminologist newsletter failed to mention any such possible discussions.  In fact, his newsletter piece leaves the opposite impression, as he endlessly speculates about what I may have meant by certain statements.  My response in the Criminologist also discussed other incorrect claims by Duncan.

 

As to the attribution of sources, look at the complete context of the quote Lindgren mentions:

 

Polls by the Los Angeles Times, Gallup and Peter Hart Research Associates show that there are at least 760,000, and possibly as many as 3.6 million, defensive uses of guns per year. In 98 percent of the cases, such polls show, people simply brandish the weapon to stop an attack. -- August 6, 1998, Chicago Tribune and August 14, 1998, Washington Times

 

 

References by Lindgren to things like the Linnet Myers piece in the Chicago Tribune to provide evidence that I didn't do a survey or that I have changed my statements over time are simply bizarre.  Attached below is an edited-down version of the letter that I published in the Tribune.  Myers used her article to refloat claims about my Olin funding, inaccurately reported exactly what the concealed handgun research covered, and claimed that "others haven't confirmed (my) findings."  I no longer have the original letter to the editor, but as I recall this is just a partial listing of her inaccurate statements.  The Tribune was not willing to run a longer letter, though the letter that they ran was quite long.

 

As to so-called technical problems, I have always acknowledged that these are small samples, especially when one breaks down the composition of those who use guns defensively.  Even the largest of the surveys have few observations in this category.  The attached e-mail that I sent to Glenn Reynolds goes into this in more depth.

 

"No direct evidence of survey" -- discussing Lindgren's point-by-point discussion of our conversation

 

1) "No funding for the project"

     I regularly have paid for research myself.  Sometimes large amounts of money have been spent, but it is not uncommon for me to spend several thousand dollars.  On the paper on multiple victim public shootings, I know that one payment that I made to Kevin, a research assistant to Landes and Posner, was $750.  I paid for the special issue of the JLE in 1999 on sentencing myself, and the special issue and part of the conference cost me around $30,000.  I have not applied for funds from outside sources over the years.

 

2) "No financial employee records"

    This is not unrelated to the first point.  Incidentally, I told Jim that there were "two" Chicago students.  Those students had also gotten others that they knew from other campuses, from places such as, I think, the University of Illinois at Chicago Circle (but I am not sure that I remember this accurately).

 

3) "calling was done by the undergraduates from their own phones."

Most of this next statement is correct except the point about the "possible" use of checks.  But as noted earlier, this point is irrelevant in terms of evidence.

 

4) "does not remember names"

I have had 12 interns and RAs just since I arrived at AEI.  This excludes people whose only work was on the survey.  I am horrible at names and I couldn't even give you the names for all of these folks, let alone people who did something six years ago.  All my names and addresses for everything were on my computer when the hard disk crashed.

 

5) "no discussions with any samplers"

I had lunch with Tom Smith during the fall of 1996.  However, while I asked him many questions about surveys, I did not tell him what I was planning on doing because Tom works very closely with gun control organizations.

 

6) weighting the sample

I did not weight the sample by household size but used the state-level age, race, and sex data that I had used in the rest of my book.  There were 36 categories by state.  Lindgren hypothesizes about why you can get such small weights for some people, and I think that this fine a breakdown easily explains it.  I don't remember who answered what after all these years, but suppose someone who fired a gun was an elderly black in Utah or Vermont.
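
[Editor's note: the following sketch may help picture the mechanics described in this point. It is a hypothetical illustration of post-stratification weighting, not Lott's actual procedure; the cell definition, population share, and all numbers are invented for illustration.]

```python
# Hypothetical sketch: post-stratification weight for one survey cell.
# weight = (cell's share of the population) / (cell's share of the sample)
# All figures below are invented purely for illustration.

def cell_weight(pop_share: float, n_cell: int, n_total: int) -> float:
    """Return the post-stratification weight for respondents in one cell."""
    sample_share = n_cell / n_total
    return pop_share / sample_share

# A single respondent out of 2,424 who falls in a demographic cell that
# makes up only 0.005% of the population gets a very small weight:
w = cell_weight(pop_share=0.00005, n_cell=1, n_total=2424)
print(round(w, 3))  # 0.121
```

With 36 cells per state (over 1,800 cells nationwide) and roughly 2,400 respondents, many cells would hold zero or one respondent, so individual weights this extreme are unsurprising under this kind of scheme.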

 

7) "commercially available CD-ROM with names on it.  He does not remember where he got it from."

It is true that I don't have the original CD-ROM.  I have a telephone number CD from the end of 1997, but it is not the one that we used.  I only picked up the other one on the off chance that I was going to have the time and resources to redo the lost data.  The CD did not have the features that the earlier one had and was not very usable.  I was so wrapped up in trying to replace my lost data on so many other projects that I had no thought of going back to what I regarded as a minor project.  I had revise-and-resubmits at the JPE and other journals that had much greater importance, and the data for the book had to be replaced.

 

8) "Lott does not remember how he drew his sample from the CD-ROM"

Not true.  I told Jim that one of the students had a program to randomly sample the telephone numbers by state.  My guess is that it was part of the CD, but on that point I can't be sure.

 

9) "doesn't remember the wording of the questions."

It is also not quite correct to say that I "don't remember the wording of the questions."  I told Jim that I don't remember the "exact wording" of the questions, but I gave him the general outline of the questions.

 

10)  more on weighting

 

See point 6 above.

 

11) "A chapter he had not yet written"

This is not correct.  What I had done is write up the section, but I only had a computer file of it.  When the hard disk crashed, I only had a hard copy of the book, and I had to spend considerable time scanning in the book and correcting the new file.  I was unable to replace the lost polling section that I had recently added.  I didn't think that it was worthwhile relying solely on memory for different things, and I had too much else to do to concern myself with something that wasn't central to the book.

 

12) "did not retain any of the tally sheets"

I have looked through some things but I haven't found anything.  As Lindgren correctly notes, I have moved three times in the last six years.

 

13) Sheets versus entry of data into computers

 

Lindgren has the "impression" that the students entered the data on sheets.  I do not directly recall this part of our conversation, but I would have said that both were done.

 

I sent Lindgren two e-mails on December 26th.  Just so no one accuses me of adding new things in now, one of my e-mails to Lindgren noted: "I did not take the time to correct or respond to all the issues raised, but I wanted to mention a few points."  Recent e-mails to Lindgren have also already responded to some of these points beyond the e-mail that he apparently posted.

 

I have not participated in the firearms discussion group nor in the apparent online newsgroup discussions, but what I have done is respond to e-mails.  (The one exception is e-mails from Lambert, whose e-mail address was placed on my blocked list.)  If you all have questions, I will be happy to discuss them, but I am not going to be involved in these online groups.  My response to Glenn below goes through some of the history of what I heard on this and when I heard it.  The bottom line is that you all should not assume that everyone participates in these discussions.

 

Appendix

 

Chicago Tribune

June 20, 1999 Sunday, CHICAGOLAND FINAL EDITION

 

SECTION: MAGAZINE; Pg. 4; ZONE: C; LETTERS TO THE EDITOR.

 

LENGTH: 684 words

 

HEADLINE: GUNS AND CRIME

 

BODY:

The article accompanying "Anne, Get Your Gun" (May 2), discussing my book "More Guns, Less Crime" (University of Chicago Press, 1998), made several inaccurate claims.

 

Despite the claims in the article, my research looked at much more than just the "impact of laws that allow guns to be carried outdoors." My book analyzed FBI crime statistics for all 3,054 American counties from 1977 to 1994 as well as extensive cross-county information on accidental gun deaths and suicides. This is by far the largest study ever conducted on crime, accidental gun deaths or suicide. I examined not only concealed-handgun laws, but also other gun-control laws such as state waiting periods, the length of waiting periods, the Brady law, criminal background checks, penalties for using guns in commission of crime and the impact of increasing gun ownership. The only gun laws that produced benefits were those allowing concealed handguns. The evidence also strongly indicates that increased gun ownership on net saves lives.

 

More disappointing were inaccurate references to the funding of my research. The claims previously floated by gun-control groups like Handgun Control were found by the Tribune's own Steve Chapman to be false (Aug. 15, 1996). Chapman pointed out that not only was the Olin Foundation "independent" of the ties the Sunday Magazine article discussed, but also that the "foundation didn't (1) choose Lott as a fellow, (2) give him money or (3) approve his topic."

 

The article's claim that "others haven't confirmed (my) findings" is bizarre. To date, I have made the data available to academics at 37 universities, from Harvard to Berkeley. Everyone who has tried has been able to replicate my findings, and only three have written pieces critical of my general approach. Although the vast majority of researchers concur that concealed weapons deter crime, not even those three critics have argued that more guns cost lives or increase crime.

-- John R. Lott Jr., University of Chicago

Editor's note: Reporter Linnet Myers responds:

 

Various researchers have praised John Lott's thorough research, although some disagree with his results, which indicate that crime drops when laws allow citizens to carry concealed guns. Whether his findings have been "confirmed" may depend on exactly what that means.

 

Three professors interviewed at separate universities said Lott's data and computations were mathematically correct. But because each professor's analysis differed, one didn't find significant drops in crime while another found more dramatic decreases than Lott did. The third said Lott's results have been "confirmed in the sense that they've been replicated."

 

Yet the findings remain hotly debated. Some researchers, as well as many gun-control advocates, flat-out reject them. Others say only time will tell.  In the midst of this controversy, my statement that Lott's results haven't been "confirmed" was one of caution. And the article did not suggest that he hasn't studied anything beyond those laws.

 

Most researchers interviewed did agree on one point: Despite the fears of gun-control groups, there is currently little evidence that the laws have caused any rise in crime.

 

Lastly, though I don't think the reference to Lott's funding was "inaccurate," it may have been unclear. The original version of my article quoted a researcher who said that while Lott's fellowship had a link to an ammunition company, "Lott's findings weren't swayed by the somewhat remote connection." The researcher said that though gun-control advocates have focused on it, the funding foundation "isn't reputed to be an arm of the gun industry any more than the Rockefeller Foundation is a tool of the oil companies."

 

Because of limited space, however, the story was cut and that quote never made it into print. I apologize to Mr. Lott for that trim.

 

Most of an e-mail that I sent recently to Glenn Reynolds (cutting some personal comments at the end)

 

Dear Glenn:

 

First, I have responded to people.  I responded to the e-mail from you that had been forwarded via Clayton Cramer last year (you did not send it directly to me for some reason).  I responded extensively to Polsby when he wrote me after Christmas, and I responded again to Lindgren (twice) when he e-mailed me on December 24th.  During the last week, I have also corresponded with Dave Kopel.  The data on the original survey were lost, and I will go into that later.  First, here is a similar survey that I did, as well as some comments on it.  This survey is NOT for public dissemination, as it is for a book of mine that will shortly be coming out.  My publisher would be very upset if the results of the survey or the survey itself were released.

 

Survey questions:

 

[questions in email omitted here] . . .

 

Write up by James Knowles of the discussion of the survey:

 

We had a small army of interns and AEI staff making phone calls. The callers for any given night varied according to who was available/willing to make phone calls. I was here every night supervising from my office at AEI. The survey was conducted over eight nights. Calls were made between 7pm and 9pm local time. Here is the list of callers and their email addresses; I can try to track down phone numbers if need be.

 

[names and emails in Lott’s email omitted here] . . .

 

. . . [details in email of random digit dialing procedure used omitted here]

 

 

___________________

 

End discussion on survey.  . . .   [notes about Lott’s contact info omitted here] . . .

1) I have lots of people who can say that I lost my hard disk in July of 1997: for example, David Mustard ([phone omitted]), Geoff Huck (the editor at the University of Chicago Press, [phone omitted]), John Whitley, and William Landes.  If you want to, you can feel free to contact especially David and John.  Russell Roberts is someone that I bounced the survey questions off of, and he can possibly talk to you about it, though he hasn't kept e-mails from 1997. . . . [Personal comment about Lott omitted for privacy reasons] I have told people for years about the data being lost.

 

2) I lost ALL the data for my paper with David in the JLS and for my book.  David and I reconstructed the county and state level data from our paper, and I got the rest together that was in my book.  I have consistently provided the county, state, and city level data, as well as the multiple victim shooting data and the safe storage data, to people when they have asked.  That constitutes almost all the numbers mentioned in the book.  Months were spent redoing this data so that it could be given out to people.  (Just a note: the critics to whom we had given out the data up to that point were unwilling to return the data to us, so we had to put the JLS data together all over again.)  If I had the other data, I would have been happy to give it to whoever wanted it.

 

The survey data could not be replaced by going to things like the UCR or the census data.  It could only be replaced by doing another survey.  I ended up only briefly mentioning the survey in the past and worked on replacing all the other data that I lost, not just for the book but for all the other projects that I was working on.  I spent a good portion of the next two years trying to replace data for other projects that I had been working on.  I had important papers for the Journal of Political Economy and other journals for which replacing the data was my first priority after replacing the data for the JLS and the book.  Thinking about the survey was well down on my list.  Its importance was not particularly high given that I had only one sentence on the issue in my book and have never written the survey into a research paper, because the data were lost.

 

3) This is something that was done six years ago.  During the intervening time I have moved three times.  Usually I pay students in cash.  When I am at universities I don't apply for grants, and the money is mine, so there is no record of universities paying for the students.  My records of the students' names and contact information were lost.  You can get an idea of how much total time was involved from the survey discussed above.  An ad was taken out in the fall in the University of Chicago Alumni magazine to try to contact the two University of Chicago students who organized some other people from different places to work on it.  I have gone through well over 12 RAs and interns, independent of those used on the survey, since I got to AEI, and I can't say that I remember most of their names (I am really horrible at names).

 

4) Issues about the significance of differences in results.  Given the very small sample sizes, the differences in results are not statistically significant and are really trivial.  About one percent of people in a survey note that they have used a gun defensively.  Whether one is talking about 2 percent or 20 percent simply brandishing a gun, you are talking about the difference between 2 percent of 1 percent and 20 percent of 1 percent.  Depending upon who answers the questions and what weighted group they are in, the difference between these two results is literally the answers of a few people out of 1,000.
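
[Editor's note: the arithmetic in this point can be checked with a back-of-the-envelope sketch. The sample size and rates below are illustrative round numbers, not actual survey figures.]

```python
# Back-of-the-envelope sketch of "2 percent of 1 percent" vs.
# "20 percent of 1 percent"; all numbers are illustrative.

def respondents_affected(n: int, dgu_rate: float, share: float) -> float:
    """Expected number of respondents falling in the disputed category."""
    return n * dgu_rate * share

n = 1000          # illustrative sample size
dgu_rate = 0.01   # ~1% of respondents report a defensive gun use

low = respondents_affected(n, dgu_rate, 0.02)   # 2% of 1%
high = respondents_affected(n, dgu_rate, 0.20)  # 20% of 1%
print(low, high, high - low)  # 0.2 2.0 1.8
```

With expected counts this small (a couple of respondents either way), a standard two-proportion test would have essentially no power to distinguish the two figures, which is the point being made here.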

 

5) If you have doubts about anything specific, you should ask me about it.  Last year I worked extremely hard.  I was the only affirmative numbers expert for Senator Mitch McConnell's side in the campaign finance case.  Once that was done, I had to deal with something that Ian Ayres and John Donohue wrote attacking my work, and with finishing things for my book.  Only in the last couple of weeks have I gotten a breather on things, but I have responded when people e-mailed me.  I am not a member of the firearms discussion groups and I have not been following them.  I read your site once in a while just to keep up with the news (and because of that I have sent you some money from time to time, e.g., just on Saturday), but otherwise I have been too busy to follow a lot of things.  (When I recently accidentally sent you an e-mail, it was at the end of one of many all-nighters.)

 

So as to state things clearly, the bottom line is that I have provided the county, state, and city level crime data when I have been asked, as well as the data on the multiple victim public shootings and the data on safe storage laws, even before the papers have been published.  We are talking about one number in one sentence in the book, a claim that I have also used in some op-eds and in some talks.  I know of no one who has given out his data as quickly and consistently over the years, even before papers are published, as I have.  Finally, let me note the most important bottom line: the survey that was done last fall produced very similar results.  The earlier results were replicated.  This survey was done more recently, and I will release the data when the book is released in March.  To keep the publisher happy, I will not release it beforehand unless you can give me a very good reason.

 

[cut [by Lott]]  There are errors in Lindgren's write-up (at least the one that he sent me), and if you have specific questions about it, I will respond.  But instead of claiming that I haven't responded to people, you should talk to people like Dan Polsby, who raised claims voiced to him by others that I had fabricated this second survey.  He spent a good deal of time verifying that the survey did indeed take place.  Polsby can be reached at [phone omitted].  I am sure that he would be happy to talk to you.

 

Best.

 

John