UNITED STATES DISTRICT COURT
FOR THE DISTRICT OF COLUMBIA
UNITED STATES OF AMERICA, :
: Criminal Action No.: 19-358 (RC)
v. :
: Re Document No.: 22
DEMONTRA HARRIS, :
:
Defendant. :
MEMORANDUM OPINION
DENYING DEFENDANT’S MOTION IN LIMINE TO EXCLUDE EXPERT TESTIMONY AS TO
FIREARM EXAMINATION TESTING
I. INTRODUCTION
Defendant Demontra Harris is charged with unlawful possession of a firearm as a person
previously convicted of a felony, assault with a dangerous weapon, and possession of a firearm
during a crime of violence. Superseding Indictment at 1–2, ECF No. 39. On July 24, 2019, the
D.C. Metropolitan Police Department (“MPD”) responded to a report of gunshots and recovered
four 9mm shell casings from the incident scene, which were then entered into the National
Integrated Ballistic Information Network (“NIBIN”). A witness later provided MPD with a
video filmed that night that allegedly shows Mr. Harris holding and then discharging a firearm in
the location where the shell casings were later discovered. No firearm was recovered at the time.
Roughly six weeks later, on September 8, 2019, during a response to a call for service for a
person with a weapon, MPD recovered a Glock 17 Gen4 9x19 pistol (“Glock 17”). This
recovered firearm was test-fired and the resulting casings were entered into the NIBIN, where a
match was identified with the casings recovered on the night of July 24, 2019. The Government
then submitted the relevant evidence to an independent firearms examiner for forensic
examination. Chris Monturo, a tool mark examiner who operates the Ohio-based forensic
services firm Precision Forensic Testing, examined the evidence and concluded in a report that
he believed the four recovered casings from the July 24, 2019 incident scene were fired by the
recovered Glock 17. See March 14, 2020 Report of Chris Monturo (“Monturo Report”), ECF
No. 22-2. The Government intends to call Mr. Monturo to testify regarding these findings at the
upcoming trial in this matter.
This opinion addresses Mr. Harris’s motion in limine to Exclude Expert Testimony as to
Firearm Examination Testing (“Def.’s Mot.”), ECF No. 22, pursuant to Daubert v. Merrell Dow
Pharm. Inc., 509 U.S. 579 (1993), Federal Rule of Evidence 702, and Federal Rule of Evidence
403. Def.’s Mot. at 1–2. The motion has been fully briefed, with both parties also filing
supplemental motions. See generally Def.’s Mot.; Govt.’s Opp’n to Def.’s Mot. to Excl. Firearm
and Toolmark Testimony (“Govt. Opp’n”), ECF No. 28; Def.’s Supp. Mot. to Excl. Expert
Testimony as to Firearm Exam. Testing (“Def.’s Supp. Mot.”), ECF No. 32; Govt.’s Opp’n to
Def.’s Supp. to Excl. Firearm and Toolmark Testimony (“Govt. Supp. Opp’n”), ECF No. 33. In
addition, the Court conducted a Daubert hearing on October 15, 2020 to consider this issue,
taking the testimony of Todd Weller, an expert in the field. A jury trial in this matter is currently
scheduled to begin on November 12, 2020.
Mr. Harris argues that the field of firearm and toolmark identification lacks a reliable
scientific basis: it is not premised on sufficient facts or data, is not the product of reliable
principles and methods, and was not properly applied by Mr. Monturo to the facts of the case.
Def.’s Mot. at 1–2. The Court disagrees, and will admit Mr. Monturo’s testimony to the extent it
falls within the Department of Justice’s Uniform Language for Testimony and Reports for the
Forensic Firearms/Toolmarks Discipline – Pattern Matching Examination (“DOJ ULTR”).
While Mr. Harris raises important issues as to the reliability of firearm and toolmark
identification, memorialized most notably by the 2016 President’s Council of Advisors on
Science and Technology Report (“PCAST Report”), these issues are for cross-examination, not
exclusion, as recent advancements in the field in the four years since the PCAST Report address
many of Mr. Harris’s concerns. Mr. Harris also remains free to have his own expert examine the
firearm and ballistics evidence and contradict the Government’s case.
II. ANALYSIS
A. Legal Standard
“Motions in limine are designed to narrow the evidentiary issues at trial.” Williams v.
Johnson, 747 F. Supp. 2d 10, 14 (D.D.C. 2010). “While neither the Federal Rules of Civil
Procedure nor the Federal Rules of Evidence expressly provide for motions in limine, the Court
may allow such motions ‘pursuant to the district court’s inherent authority to manage the course
of trials.’” Barnes v. District of Columbia, 924 F. Supp. 2d 74, 78 (D.D.C. 2013) (quoting Luce
v. United States, 469 U.S. 38, 41 n.4 (1984)).
Federal Rule of Evidence 702 provides that qualified expert testimony is admissible if
“(a) the expert’s scientific, technical, or other specialized knowledge will help the trier of fact to
understand the evidence or to determine a fact in issue; (b) the testimony is based on sufficient
facts or data; (c) the testimony is the product of reliable principles and methods; and (d) the
expert has reliably applied the principles and methods to the facts of the case.” Fed. R. Evid.
702. “In general, Rule 702 has been interpreted to favor admissibility.” Khairkhwa v. Obama,
793 F. Supp. 2d 1, 10 (D.D.C. 2011) (citing Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579,
587 (1993); Fed. R. Evid. 702 advisory committee’s note to 2000 amendment (“A review of the
caselaw after Daubert shows that the rejection of expert testimony is the exception rather than
the rule.”)). Indeed, the Supreme Court has clarified that it is not exclusion, but rather “vigorous
cross-examination, presentation of contrary evidence, and careful instruction on the burden of
proof” that “are the traditional and appropriate means of attacking shaky but admissible
evidence.” Daubert, 509 U.S. at 596.
When considering the admissibility of expert evidence under Federal Rule of Evidence
702, district courts are required to “assume a ‘gatekeeping role,’ ensuring that the methodology
underlying an expert’s testimony is valid and the expert’s conclusions are based on ‘good
grounds.’” Chesapeake Climate Action Network v. Export-Import Bank of the U.S., 78 F. Supp.
3d 208, 219 (D.D.C. 2015) (quoting Daubert, 509 U.S. at 590–97). This gatekeeping analysis is
“flexible,” and “the law grants a district court the same broad latitude when it decides how to
determine reliability as it enjoys in respect to its ultimate reliability determination.” Kumho Tire
Co. v. Carmichael, 526 U.S. 137, 141–42 (1999) (emphasis omitted). While district courts may
apply a variety of different factors to assess reliability, in Daubert the Supreme Court provided a
non-exhaustive list of five factors to guide the determination, including: (1) whether the
technique has been or can be tested; (2) whether the technique has a known or potential rate of
error; (3) whether the technique has been subject to peer review and publication; (4) the existence of
controls that govern the technique’s operation; and (5) whether the technique has been generally
accepted within the relevant scientific community. See Daubert, 509 U.S. at 593–94. In
contrast, expert testimony “that rests solely on ‘subjective belief or unsupported speculation’ is
not reliable.” Groobert v. President & Directors of Georgetown Coll., 219 F. Supp. 2d 1, 6
(D.D.C. 2002) (citing Daubert, 509 U.S. at 590).
“The burden is on the proponent of [expert] testimony to show by a preponderance of the
evidence that . . . the testimony is reliable.” Sykes v. Napolitano, 634 F. Supp. 2d 1, 6 (D.D.C.
2009) (citing Meister v. Med. Eng'g Corp., 267 F.3d 1123, 1127 n.9 (D.C. Cir. 2001)). Even if
the proposed expert testimony is reliable, the Court may nonetheless exclude it “if its probative
value is substantially outweighed by a danger of one or more of the following: unfair prejudice,
confusing the issues, misleading the jury, undue delay, wasting time, or needlessly presenting
cumulative evidence.” Fed. R. Evid. 403; see Bazarian Int'l Fin. Assocs., LLC v. Desarrolos
Aerohotelco, C.A., 315 F. Supp. 3d 101, 128 (D.D.C. 2018) (analyzing expert testimony under
Rule 403).
B. Firearm and Toolmark Identification
1. Firearm and Toolmark Identification Science
Mr. Harris’s motion challenges the reliability of the Government’s proposed use of
firearm toolmark identification as a discipline for expert testimony. Firearm identification began
as a forensic discipline in the 1920s, see James E. Hamby, The History of Firearm and Toolmark
Identification, 31 Ass’n of Firearm and Tool Mark Examiners J. 266, 266–284 (1999), and “for
decades” has been routinely admitted as appropriate expert testimony in district courts. United
States v. Taylor, 663 F. Supp. 2d 1170, 1175 (D.N.M. 2009); see also United States v. Brown,
973 F.3d 667, 704 (7th Cir. 2020) (noting firearm and toolmark identification has been “almost
uniformly accepted by federal courts”) (citations omitted).
Firearm and toolmark identification “is used to determine whether a bullet or casing was
fired from a particular firearm.” Brown, 973 F.3d at 704. A firearm and toolmark examiner will
make this determination “by looking through a microscope to see markings that are imprinted on
the bullet or casing by the firearm during the firing process,” which will include marks left on the
bullet by the firing pin as well as scratches that occur when the bullet travels down the barrel. Id.
A firearm examiner is trained to observe and classify these marks into three types of
characteristics during a firearm toolmark examination, which include:
(1) Class characteristics: i.e., the weight or caliber of the bullet, the number of
lands and grooves, the twist of the lands and grooves, and the width of the lands
and grooves, that appear on all bullet casings fired from the same type of weapon
and are predetermined by the gun manufacturer;
(2) Individual characteristics: unique, microscopic, random imperfections in the
barrel or firing mechanism created by the manufacturing process and/or damage
to the gun post-manufacture, such as striated and/or impressed marks, unique to a
single gun; and
(3) Subclass characteristics: characteristics that exist, for example, within a
particular batch of firearms due to imperfections in the manufacturing tool that
persist during the manufacture of multiple firearm components mass-produced at
the same time.
Ricks v. Pauch, No. 17-12784, 2020 WL 1491750, at *8–9 (E.D. Mich. Mar. 23, 2020).
A qualified examiner can conclude that casings were fired by the particular firearm by
“comparatively examining bullets and determining whether ‘sufficient agreement’ of toolmarks
exist,” which occurs when the class and individual characteristics match. Id. at *9; see also
Brown, 973 F.3d at 704. The methodology of determining when sufficient agreement is present
is detailed by the Association of Firearm and Tool Mark Examiners (“AFTE method”), and is “the
field’s established standard.” United States v. Ashburn, 88 F. Supp. 3d 239, 246 (E.D.N.Y.
2015). Under the governing AFTE theory, no two firearms will bear the same microscopically
identical toolmarks due to differences in individual characteristics. United States v. Otero, 849
F. Supp. 2d 425, 427 (D.N.J. 2012).
In recent years three scientific reports have examined the underlying scientific validity of
firearm and toolmark identification. They include the 2008 Ballistic Imaging Report, Def.’s
Supp. Mot. Ex. 1, ECF No. 32-1, the 2009 National Academy of Science Report, Def.’s Supp.
Mot. Ex. 2, ECF No. 32-2, and the 2016 President’s Council of Advisors on Science and
Technology Report (“PCAST Report”), Def.’s Supp. Mot. Ex. 3, ECF No. 32-3. Mr. Harris
argues that these reports “reject the claim that firearms identification is a valid and reliable
science.” Def.’s Supp. Mot. at 2–3. The Court is generally convinced by the Government’s
arguments and ample citations to case law that the 2008 Ballistic Imaging Report and the 2009
National Academy of Science Report are both “outdated by over a decade” due to intervening
scientific studies and as a result have been repeatedly rejected by courts as a proper basis to
exclude firearm and toolmark identification testimony. Govt. Supp. Opp’n at 2–4 (collecting
cases holding firearms identification evidence admissible after considering these reports). The
PCAST Report provides better support for Mr. Harris’s arguments, given its more recent origin
and use in recent opinions that have interrogated the danger of subjectivity in this discipline.
See, e.g., United States v. Tibbs, No. 2016-CF1-19431, 2019 WL 4359486 (D.C. Super. Ct. Sept.
5, 2019).
The PCAST Report ultimately concluded that firearm and toolmark identification fell
“short of the criteria for foundational validity,” after raising a number of critiques of the science.
PCAST Report at 11. Chief among these critiques was the report’s conclusion that “foundational
validity can only be established through multiple independent black-box studies” 1 and that, at
the time the report was published in 2016, only one black-box study of the discipline had been
conducted. Def.’s Supp. Mot. at 4 (citing PCAST Report at 106, 111). In response, the
Government has put forth sworn affidavits from researchers that speak to post-PCAST Report
scientific studies that it argues contradict the PCAST Report’s conclusions. The
Government’s Daubert hearing expert, Todd Weller, devoted much of his testimony to
discussing the scientific advances that have occurred since the PCAST Report was published in
2016, all of which he posited affirm the discipline’s validity. See generally Evid. Hr’g Tr.

1 The PCAST report defined a black-box study as “an empirical study that assesses a
subjective method by having examiners analyze samples and render opinions about the origin or
similarity of samples.” PCAST Report at 48. Mr. Weller added at the Evidentiary Hearing that a
black-box study is one in which there are “question samples [given to examiners] that have a
matching known, and question samples that do not have a matching known, and also that each of
those comparisons is independent from each other.” October 15, 2020 Evidentiary Hearing Tr.
(“Evid. Hr’g Tr.”) 49:6-12.
2. Mr. Monturo’s Report Methodology
Mr. Harris’s motion in limine specifically challenges the proposed testimony of the
Government’s firearm and ballistics expert Chris Monturo, who examined the firearms evidence
at issue in this case. In creating his report for the Government, Mr. Monturo first test fired the
Glock 17 and found it to be operable. Monturo Report at 2. He then used the Glock 17 to create
test-fired cartridge cases. Id. Mr. Monturo then microscopically compared his test-fired cartridge
cases to the cartridge cases recovered from the crime scene on July 24, 2019, and found the two
sets of cartridges “to have corresponding individual characteristics.” Id. These results were then
verified that same day by Calissa Chapin, another qualified firearm and ballistics expert from Mr.
Monturo’s lab. March 14, 2020 Report of Chris Monturo Notes (“Monturo Report Notes”) at 3,
ECF No. 22-3. As a result, Mr. Monturo is expected to testify that “[b]ased upon these
corresponding individual characteristics. . . namely aperture shear marks,” 2 “along with Mr.
Monturo’s training and experience, [he] is of the opinion that the Glock firearm fired” the cartridge
casings recovered from the July 24, 2019 crime scene. Govt. Opp’n at 11–12.
C. The Subject Matter of Mr. Monturo’s Testimony Meets Rule 702’s Standards
Mr. Harris argues that the Government’s proposed expert must be excluded under Rule
702 and Daubert because the underlying firearm and toolmark identification discipline “is based
not upon science but rather ‘subjectivity.’” 3 Def.’s Supp. Mot. at 2.

2 As defined in the AFTE Glossary, 6th Edition, a firing pin aperture shear is “[s]triated
marks caused by the rough edges of the firing pin aperture scraping the primer metal during
unlocking of the breech.” Govt. Supp. Opp’n, Ex. 15, ECF No. 33-15. It is these individual
characteristics that Mr. Monturo used to classify the cartridge cases at issue.

To address Mr. Harris’s
concerns about the admission of Mr. Monturo’s expert testimony, the Court will undertake a
factor-by-factor analysis of the discipline’s reliability, using Daubert as a guide. Complicating
this process is the fact that Mr. Harris did not specifically address the Daubert criteria in his
briefing on this topic, so the Court will instead rely on the implications raised by the PCAST
Report and other scientific reports he has brought to the Court’s attention.
1. Whether the methodology has been tested
As previously noted, the first Daubert factor asks whether the technique in question has
been or can be tested. See Daubert, 509 U.S. at 593–94. This “testability” inquiry, as
articulated in the Advisory Committee Notes to Rule 702, concerns “whether the expert’s theory
can be challenged in some objective sense, or whether it is instead simply a subjective,
conclusory approach that cannot be reasonably assessed for reliability.” Fed. R. Evid. 702
advisory committee’s note to 2000 amendment. Mr. Harris argues that firearm and toolmark
identification is “unavoidably subjective,” and also cites to the 2008 Ballistic Imaging Report,
which expressed concerns about “the fundamental assumptions of uniqueness and reproducibility
of firearms-related toolmarks.” Def.’s Supp. Mot. at 2–3. In response, the Government has put
forth evidence to show “[f]irearms and toolmark identification has been thoroughly tested with
ground-truth experiments designed to mimic casework.” Govt. Opp’n at 1. The Court agrees
with the Government that this factor supports admissibility.

3 Based on remarks such as these and his citation to United States v. Glynn, Mr. Harris
appears to be peripherally raising the point that firearm and toolmark identification cannot “fairly
be called ‘science,’” United States v. Glynn, 578 F. Supp. 2d 567, 570 (S.D.N.Y. 2008), a
preliminary inquiry some courts have investigated before proceeding to the Daubert analysis.
The Court does not believe such an inquiry is required here, given that, as other courts have also
found, firearm and toolmark identification “clearly is technical or specialized, and therefore
within the scope of Rule 702.” United States v. Hunt, No. CR-19-073-R, 2020 WL 2842844, at
*3 n.2 (W.D. Okla. June 1, 2020) (citing United States v. Willock, 696 F. Supp. 2d 536, 571 (D.
Md. 2010), aff'd sub nom. United States v. Mouzone, 687 F.3d 207 (4th Cir. 2012)).
A number of courts have examined this factor in depth to conclude that firearm toolmark
identification can be tested and reproduced. See, e.g., Otero, 849 F. Supp. 2d at 432 (“The
literature shows that the many studies demonstrating the uniqueness and reproducibility of
firearms toolmarks have been conducted.”); Taylor, 663 F. Supp. 2d at 1175–76 (noting studies
“demonstrating that the methods underlying firearms identification can, at least to some degree,
be tested and reproduced.”); United States v. Diaz, No. CR 05-00167, 2007 WL 485967, at *6
(N.D. Cal. Feb. 12, 2007) (holding that “the theory of firearms identification, though based on
examiners’ subjective assessment of individual characteristics, has been and can be tested.”).
Indeed, even Judge Edelman in the Tibbs opinion relied on by Mr. Harris concluded that
“virtually every court that has evaluated the admissibility of firearms and toolmark identification
has found the AFTE method to be testable and that the method has been repeatedly tested.”
Tibbs, 2019 WL 4359486, at *7 (collecting cases).
The fact that there are subjective elements to the firearm and toolmark identification
methodology is not enough to show that the theory is not “testable.” Indeed, studies have shown
that “the AFTE theory is testable on the basis of achieving consistent and accurate results.”
Otero, 849 F. Supp. 2d at 433; see also July 7, 2017 Decl. of Todd Weller (“Weller I”) at 2–6,
ECF No. 28-5 (describing various studies that support the reproducibility of the AFTE
identification theory). This conclusion has only been further strengthened in recent years due to
advances in three-dimensional imaging technology, which has allowed the field to interrogate the
process and sources of “subjectivity” behind firearm and toolmark examiners' conclusions. For
example, Mr. Weller testified regarding a study which used 3D image technology to assess the
process used by trained firearm examiners when identifying casings to a particular firearm. See
Sept. 19, 2019 Decl. of Todd Weller (“Weller II”) at 15–16 (citing Pierre Duez et al.,
Development and Validation of a Virtual Examination Tool for Firearm Forensics, 63 J.
Forensic Sci. 1069–84 (2018) (“Heat Map Study”)), ECF No. 28-6. The Heat Map Study
indicated that firearm examiners from fifteen different laboratories, all conducting an
independent assessment, were “mostly using the same amount and same location of microscopic
marks when concluding identification.” Weller II at 16. Critically, the trained examiners also
correctly reported 100% of known matches while reporting no false positives or false negatives.
Id.
It is also important to note that the testability criticism leveled at the firearm and
toolmark field in the PCAST Report—that at the time of publishing “there [was] only a single
appropriately designed study to measure validity and estimate reliability”—appears to now be
out of date. PCAST Report at 112. As previously discussed, the PCAST Report only considered
studies that were a “black-box” or “open-set” design, disregarding hundreds of validation studies
in the process. See Evid. Hr’g Tr. 48:9-17 (noting that PCAST only evaluated nine of the
hundreds of studies that were submitted for review). Setting aside for the moment the utility of
this “black-box” requirement—which goes beyond what is required by Rule 702—the
Government has provided to the Court three recent scientific studies that meet the PCAST’s
black-box model requirements and demonstrate the reliability of the firearm and toolmark
identification method. These include one of the tests administered during the Heat Map Study
detailed above, see Weller II at 16 n.84, along with another recent black-box study testing the
identification of fired casings, which resulted in a 0.433% false positive error rate from three
errors among 693 total comparisons. See Lilien et al., Results of the 3D Virtual Comparison
Microscopy Error Rate (VCMER) Study for Firearm Forensics, J. of Forensic Sci. Oct. 1, 2020
(“Lilien Study”) at 1, ECF No. 41. A third post-PCAST Report study also followed the PCAST
recommended black-box model and found that of 1512 possible identifications tested, firearms
examiners correctly identified 1508 casings to the firearm from which the casing was fired.
Keisler et al., Isolated Pairs Research Study, Ass’n of Firearm and Tool Mark Examiners J. 56,
58 (2018) (“Keisler Study”), ECF No. 33-9; see also Evid. Hr’g Tr. 65:3-11. This evidence
indicates that even under the PCAST’s stringent black-box only criteria, firearm and toolmark
identification can be tested and reasonably assessed for reliability.
A final factor demonstrating the strength of the testability prong is that firearm and
toolmark examiners are required, as Mr. Monturo has done here, to document their results and
findings through written reports and photo documentation, and have these results validated by
another qualified examiner. These elements “ensure sufficient testability and reproducibility to
ensure that the results of the technique are reliable.” Diaz, 2007 WL 485967 at *5 (citing United
States v. Monteiro, 407 F. Supp. 2d 351, 369 (D. Mass. 2006)). 4 For all of these reasons, the
Court concludes that the testability factor supports admissibility of Mr. Monturo’s testimony.

4 Mr. Harris’s only explicit acknowledgement of this Daubert factor is an assertion in a
parenthetical that the court in United States v. Green found that “ballistic evidence fails to meet
Daubert criteria regarding . . . testability.” Def.’s Mot. at 7 (citing United States v. Green, 405 F.
Supp. 2d 104, 120–22 (D. Mass. 2005)). But the facts at issue in Green were quite different from
those of the instant case. Green’s holding that the methods at issue could not be tested rested on
an absence of notes and photographs from the initial examination that “made it difficult, if not
impossible” for another expert to verify the examination. Green, 405 F. Supp. 2d at 120. In
contrast, Mr. Monturo documented his work in addition to having it verified that same day by
another certified firearms analyst. Accordingly, reproducibility is not at issue here.

2. The known or potential error rate

The second Daubert factor inquires as to whether the technique has a known or potential
rate of error. See Daubert, 509 U.S. at 594. The PCAST Report concluded that non-black-box
studies had “inconclusive and false-positive rates that are dramatically lower (by more than 100-
fold)” compared to partly black-box or fully black-box designed studies. PCAST Report at 109.
The Government counters that “collectively, th[e] body of scientific data demonstrate[s] a low
rate of error” for firearm and toolmark identification, and provides several recently published
studies to refute the PCAST Report’s finding of differences in rate of error tied to study design.
Govt. Opp’n at 2; Govt. Supp. Opp’n at 13–14.
First, as the Government argues and this Court agrees, the critical inquiry under this
factor is the rate of error in which an examiner makes a false positive identification, as this is the
type of error that could lead to a conviction premised on faulty evidence. See Otero, 849 F.
Supp. 2d at 434 (noting, “the critical validation analysis has to be the extent to which false
positives occur”). 5 Mr. Weller testified that “over the past couple of decades in research” he had
seen a rate of false positives in research studies ranging from 0-1.6 percent. Evid. Hr’g. Tr.
84:19–22. To support this assertion, the Government provided the false positive error rates for
nineteen firearm and toolmark validation studies conducted between 1998 and 2019, of which
eleven studies had a false positive error rate of zero percent, and the highest false positive error
rate calculated was 1.6%. Govt. Opp’n at 27–29. Other federal courts have also recognized that
validation studies as a whole show a low rate of error for firearm and toolmark identification.
See, e.g., United States v. Romero-Lobato, 379 F. Supp. 3d 1111, 1119 (D. Nev. 2019) (“[T]he
studies cited by [the firearms examiner] in his testimony and by other federal courts examining
the issue universally report a low error rate for the AFTE method.”); Taylor, 663 F. Supp. 2d at
1177 (“[T]his number [less than 1%] suggests that the error rate is quite low”).
5 Perhaps the false negative rate could be important in a case where a defendant asserts
his co-defendant (or a third party) was the culprit and examination of that person’s firearm tested
negative. But that situation does not apply here.
As was the case under the testability prong of the Daubert analysis, here too recent
studies have resolved some of the concerns raised by the PCAST Report. Mr. Weller described
for the Court how three black box studies that post-date the PCAST Report all have extremely
low rates of error. Govt. Supp. Opp’n at 14; Evid. Hr’g Tr. 65:2-77:8. The Heat Map and Keisler
studies both had an overall error rate of zero percent, and the Lilien study produced a false
positive rate of only 0.433%. Govt. Supp. Opp’n at 14. Because the evidence shows that error
rates for false identifications made by trained examiners are low—even under the PCAST’s
black-box study requirements—this factor also weighs in favor of admitting Mr. Monturo’s expert
testimony.
3. Whether the methodology has been subject to peer review and publication
The third Daubert factor concerns whether the methodology has been subject to peer review and
published in scientific journals, a component the Supreme Court emphasized as critical to “good
science” since “it increases the likelihood that substantive flaws in methodology will be
detected.” See Daubert, 509 U.S. at 593–94. The Government contends that scientific data
concerning firearms and toolmark identification “have been published in a multitude of scientific
peer-reviewed journals,” Govt. Opp’n at 1, and Mr. Weller presented evidence to this effect at
the evidentiary hearing, describing the variety of scientists from different disciplines who have
published on the topic in several different peer-reviewed journals. See Weller I at 9–10. The
Court agrees with the Government that this factor weighs in favor of admissibility.
Much of the literature in this discipline has been published in the AFTE Journal, a peer-
reviewed journal that “publishes articles, studies and reports concerning firearm and toolmark
evidence.” United States v. McCluskey, No. CR 10-2734 JCH, 2013 WL 12335325, at *6
(D.N.M. Feb. 7, 2013). The AFTE Journal uses a formal process for article submissions,
including “specific instructions for writing and submitting manuscripts, assignment of
manuscripts to other experts within the scientific community for a technical review, returning of
manuscripts to other experts within the scientific community for clarification or re-write, and a
final review by the Editorial Committee.” Id. (quoting Richard Grzybowski, et al.,
Firearm/Toolmark Identification: Passing the Reliability Test Under Federal and State
Evidentiary Standards, 35 AFTE J. 209, 220 (2003)).
Other courts have examined the scientific credibility of the AFTE Journal. Notably, the
court in Tibbs concluded that the AFTE Journal’s lack of a double-blind peer review process
along with the fact that it is published by the group of practicing firearms and toolmark
examiners could create an “issue in terms of quality of peer review.” Tibbs, 2019 WL 4359486,
at *10. In response, the Government asserts, citing to testimony from Dr. Bruce Budowle, “the
most published forensic DNA scientist in the world,” that there is far from a consensus in the
scientific community that double-blind peer review is the only meaningful kind of peer review.
Govt. Supp. Opp’n at 23; see also Affidavit of Bruce Budowle at 2, ECF No. 33–17. To this
point, Mr. Weller described the various advantages and disadvantages of each type of peer
review. Weller II at 22–24. Compellingly, the Government also refuted the allegation by Judge
Edelman in Tibbs that the AFTE Journal does not provide “meaningful” review by bringing to
the Court’s attention a study that was initially published in the AFTE Journal, and then was
subsequently published in the Journal of Forensic Science with no further alterations. Govt.
Supp. Opp’n at 27. Because the Journal of Forensic Science employs a double-blind peer review
process, this indicates that at least in this instance, the open peer review process of the AFTE
Journal led to the same outcome as a double-blind peer review. Id. In addition, numerous courts
have concluded that publication in the AFTE Journal satisfies this prong of the Daubert
admissibility analysis. See, e.g., Romero-Lobato, 379 F. Supp. 3d at 1119; United States v.
Johnson, No. 16 Cr. 281, 2019 WL 1130258, at *16 (S.D.N.Y. Mar. 11, 2019); Ashburn, 88 F.
Supp. 3d at 245–46; Otero, 849 F. Supp. 2d at 433; Taylor, 663 F. Supp. 2d at 1176; Monteiro,
407 F. Supp. 2d at 366–67. The Court queries whether excluding certain journals from
consideration based on the type of peer review the journal employs goes beyond a court’s
appropriate gatekeeping function under Daubert.
And even if the Court were to discount the numerous peer-reviewed studies published in
the AFTE Journal, Mr. Weller’s affidavit also cites to forty-seven other scientific studies in the
field of firearm and toolmark identification that have been published in eleven other peer-
reviewed scientific journals. Weller II at Ex. A. This alone would satisfy the publication and
peer review requirement.
Because the toolmark identification methodology used by Mr. Monturo has been subject
to peer review and publication, the Court finds this Daubert factor to also weigh in favor of
admission.
4. The existence and maintenance of standards to control the methodology’s operation
The fourth Daubert factor inquires as to whether there are proper standards and controls
to govern the operation of the technique in question. See Daubert, 509 U.S. at 594. Mr. Harris
argues that there are insufficient objective standards in place, citing to the PCAST Report to
claim that the AFTE’s “sufficient agreement” analysis that is used by examiners to reach their
conclusions is subjective and impermissibly based on the “personal judgment” of each examiner.
Def.’s Supp. Mot. at 4 (citing PCAST Report at 47, 60, 104, 113). In opposition, the
Government argues that “the firearms community has implemented standards,” citing to a
number of industry guidebooks and regulations. Govt. Opp’n at 2. Although the question is a
close one, the Court finds that the lack of objective standards ultimately means this factor
cannot be met. 6
The Government identifies a number of what they refer to as “standards for professional
guidance” for the firearm and toolmark profession, Govt. Opp’n at 32–33, but the primary
standard that governs the discipline is the AFTE Theory of Identification, which describes the
methodology examiners should undertake when “pattern matching” between firearms and
cartridges. See, e.g., Govt. Opp’n at 8 (explaining that Theory of Identification was created “to
explain the basis of opinion of common origin in toolmark comparisons”). According to the
AFTE Theory of Identification, examiners can conclude that a firearm and cartridges have a
common origin when a comparison of toolmarks shows there is “sufficient agreement” between
“the unique surface contours of two toolmarks.” The Association of Firearm and Tool Mark
Examiners, AFTE Theory of Identification as It Relates to Toolmarks, https://afte.org/about-
us/what-is-afte/afte-theory-of-identification (last visited November 4, 2020). This theory of
identification dictates that “sufficient agreement” between two toolmarks exists only when “the
agreement of individual characteristics is of a quantity and quality that the likelihood another
tool could have made the mark is so remote as to be considered a practical impossibility.” Id.
The Court finds this standard to be generally vague, and indeed, the AFTE Theory acknowledges
that “the interpretation of individualization/identification is subjective in nature, founded on
scientific principles and based on the examiner’s training and experience.” Id. As other courts
have found, under this method “matching two tool marks essentially comes down to the
examiner’s subjective judgment based on his training, experience, and knowledge of firearms.”
Romero-Lobato, 379 F. Supp. 3d at 1121; Glynn, 578 F. Supp. 2d at 572 (“[T]he standard
defining when an examiner should declare a match – namely ‘sufficient agreement’ – is
inherently vague.”).
6 This Daubert factor is, as the Government concedes, “the only Daubert factor that some
courts have found lacking” in firearm toolmark identification. Govt. Opp’n at 33. This makes it
all the more puzzling that the Government fails entirely to address this factor in its reply.
Accordingly, it is evident and hardly disputed that the “AFTE theory lacks objective
standards.” Ricks, 2020 WL 1491750, at *10. The entire process of reaching a conclusion
regarding the “sufficient agreement in individual characteristics” is one that relies wholly on the
examiner’s judgment, without any underlying numerical standards or guideposts to direct an
examiner’s conclusion. See Evid. Hr’g Tr. 37:16–38:25 (noting the absence at this time of
objective standards to guide an examiner’s findings). And as Mr. Weller testified, even in
contrast to other subjective disciplines such as fingerprint analysis, firearm toolmark
identification does not provide objective standards even as a quality control measure, such as a
baseline to trigger further verification. See Evid. Hr’g Tr. 112:18-113:17 (explaining that while
fingerprint testing does not have an agreed-upon standard for the number of matching points
required for an identification, it does use matching points as a quality control measure that
triggers further verification if below a certain threshold). While Mr. Monturo’s additional use of
“basic scientific standards,” such as taking contemporaneous notes, documenting his comparison
with photographs, and employing a second reviewer for verification, surely assists in
maintaining reliable results, without more the Court cannot conclude this Daubert factor is met.
It should be noted, however, that even if this factor cannot be met, a partially subjective
methodology is not inherently unreliable or an immediate bar to admissibility. Rule 702 “does
not impose a requirement that the expert must reach a conclusion via an objective set of criteria
or that he be able to quantify his opinion with a statistical probability.” Romero-Lobato, 379 F.
Supp. 3d at 1120. And indeed, “all technical fields which require the testimony of expert
witnesses engender some degree of subjectivity requiring the expert to employ his or her
individual judgment, which is based on specialized training, education, and relevant work
experience.” Johnson, 2019 WL 1130258, at *18 (citations omitted); see also Evid. Hr’g Tr. at
30:14–31:6 (Mr. Weller testified that “all science involves some level of interpretation,” and
went on to describe subjective components to both drug testing and DNA interpretation).
Accordingly, this factor weighs against the admission of Mr. Monturo’s testimony, but does not
disqualify it.
5. Whether the methodology has achieved general acceptance in the relevant community
Finally, the fifth Daubert factor asks whether the technique has been generally
accepted within the relevant scientific community, reasoning that “a known technique which has
been able to attract only minimal support within the community, may properly be viewed with
skepticism.” See Daubert, 509 U.S. at 594. The Court finds that the Government has put forth
more than sufficient evidence to show that the AFTE theory as used by Mr. Monturo enjoys
widespread scientific acceptance. See Govt. Opp’n at 2; Govt. Supp. Opp’n at 28.
Mr. Weller testified that firearm and toolmark identification is practiced by accredited
laboratories in the United States and throughout the world, including England (Scotland Yard),
New Zealand, Canada, South Africa, Australia, Germany, Sweden, Greece, Turkey, China,
Mexico, Singapore, Malaysia, Belgium, the Netherlands, and Denmark. See Weller II at 30. In the
United States alone, there are 233 accredited firearm and toolmark laboratories, which often
operate within a larger forensic laboratory providing chemistry, DNA, and fingerprint
identification, and scientists from a variety of disciplines author studies within the area of
firearms and toolmark identification. Id.
The criticism contained in the PCAST Report does not undermine this factor, as
“techniques do not need to have universal acceptance before they are allowed to be presented
before a court.” Romero-Lobato, 379 F. Supp. 3d at 1122. Even courts that have been critical of
the validity of the discipline have conceded that it does enjoy general acceptance as a reliable
methodology in the relevant scientific community of examiners. See Otero, 849 F. Supp. 2d at
435 (collecting cases). Furthermore, as Mr. Weller noted at the evidentiary hearing, the
committee responsible for the PCAST Report did not include any firearm and toolmark
examiners or researchers in the field, see Evid. Hr’g Tr. 47:18-23, thus raising the question of
whether the PCAST Report’s criticism would even constitute a lack of acceptance from the
“relevant scientific community.” For all of these reasons, this factor weighs in favor of
admitting Mr. Monturo’s testimony.
6. The Daubert Analysis Urges Admission of Mr. Monturo’s Testimony
Balancing all five Daubert factors, the Court finds that the Government’s proposed
expert testimony of Mr. Monturo is reliable and admissible, though subject to what the Court
considers prudent limitations, discussed in detail below. The only factor that does not favor
admissibility is the lack of objective criteria under the fourth Daubert factor, but as discussed,
“the subjectivity of a methodology is not fatal under Rule 702 and Daubert.” Ashburn, 88 F.
Supp. 3d at 246. And as other courts have also found, this deficiency “is countered by the
method’s relatively low rate of error, widespread acceptance in the scientific community,
testability, and frequent publication in scientific journals.” Romero-Lobato, 379 F. Supp. 3d at
1122. Accordingly, the Court will allow the admission of Mr. Monturo’s expert testimony as to
his firearm and toolmark identification analysis, subject to certain limitations.
D. Federal Rule of Evidence 702(d)
Federal Rule of Evidence 702(d) provides that qualified expert testimony is admissible
only when “the expert has reliably applied the principles and methods to the facts of the case.”
Fed. R. Evid. 702. Mr. Harris challenges the admission of Mr. Monturo’s testimony, asserting
that he “has not applied the principles and methods reliably to the facts of the case.” Def.’s Mot.
at 1. However, he provides no evidence or further analysis to flesh out this conclusory claim.
Accordingly, the Court finds this argument to be without merit.
As previously described, Mr. Monturo detailed the firearm and toolmark examination he
conducted in his report, providing both a description of his process and photo documentation.
See generally Monturo Report. Mr. Monturo’s findings were then verified by another qualified
examiner the same day. Monturo Report Notes at 2. In contrast, Mr. Harris has not put forth any
evidence to suggest that Mr. Monturo applied the firearm and toolmarking methodology in an
unreliable manner. Mr. Monturo also appears to be well-qualified, with the Government noting
that he “has significant training and experience, has not failed any proficiency exams, and has
designed consecutively manufactured firearms test kits for training other firearms examiners,”
information that they plan to elicit at trial during qualification of his testimony and also set out in
his curriculum vitae. Govt. Opp’n at 35. In light of Mr. Harris’s failure to identify any
unreliability on Mr. Monturo’s part, and because Mr. Harris will have the ability to question
Mr. Monturo regarding his analysis during cross-examination, the Court is convinced exclusion
on this ground
is not warranted. See Daubert, 509 U.S. at 596 (“Vigorous cross-examination, presentation of
contrary evidence, and careful instruction on the burden of proof are the traditional and
appropriate means of attacking shaky but admissible evidence.”). If Mr. Harris has lingering
concerns about Mr. Monturo’s application of the firearm and toolmark methodology in this case,
he is welcome to retain an independent expert to review Mr. Monturo’s work, or have an
independent examination of his own performed.
E. Federal Rule of Evidence 403
Next, Mr. Harris argues that even if the proposed testimony of Mr. Monturo is admissible
pursuant to Daubert and Federal Rule of Evidence 702, it is inadmissible under Federal Rule of
Evidence 403. Def. Mot. at 2. In support of this claim, Mr. Harris argues that Mr. Monturo’s
“conclusions appear to extend beyond his claimed expertise and are not reliable since they are
not based on objective standards but rather his subjective observations and conclusions.” Id.
“The prejudice to Mr. Harris is simple, a connection to a firearm, a connection to a shell casing,
all premised on analysis that at its best can only conclude that it ‘may’ be correct.” Def. Supp.
Mot. at 2.
Under Rule 403, a court may exclude otherwise probative testimony if its probative value is
substantially outweighed by a danger of unfair prejudice, confusing the issues, misleading the
jury, undue delay, wasting time, or needlessly presenting cumulative evidence. Fed. R. Evid.
403. Mr. Harris’s concern under
Rule 403 appears to be that the value of Mr. Monturo’s testimony will be substantially
outweighed by the risk that he will mislead the jury through his reliance on a
methodology Mr. Harris does not believe is sufficiently reliable. First, Mr. Harris’s concerns
about the reliability of the firearm and toolmarking methodology have already been analyzed,
and the Court has found the underlying analysis sufficiently reliable such that Mr. Harris’s
concerns do not “substantially outweigh” the value of Mr. Monturo’s testimony. Additionally,
the Court believes that the risk of prejudice raised here can be alleviated through alternatives to
exclusion. Cross-examination of Mr. Monturo, in conjunction with an appropriate limiting
instruction governing the degree of certainty he can express about his conclusions, will
sufficiently mitigate the risks of harm Mr. Harris has raised.
F. Limiting Instruction
In his final request, Mr. Harris asks that if the testimony of Mr. Monturo is not excluded,
then the Court put in place limitations on his testimony. Def. Supp. Mot. at 6–7. Specifically, he
requests that Mr. Monturo not “use the term ‘match’” but he “may be allowed to tell the jury that
he could not exclude the gun as the weapon that produced a casing.” Id.
Limitations restricting the degree of certainty that may be expressed on firearm and
toolmark expert testimony are not uncommon. See, e.g., Romero-Lobato, 379 F. Supp. 3d at
1117 (noting the “general consensus” of the courts “is that firearm examiners should not testify
that their conclusions are infallible or not subject to any rate of error, nor should they arbitrarily
give a statistical probability for the accuracy of their conclusions”); Ashburn, 88 F. Supp. 3d at
249 (limiting expressions of an expert’s conclusions to that of a “reasonable degree of ballistics
certainty” or a “reasonable degree of certainty in the ballistics field.”); Diaz, 2007 WL 485967, at
*1 (same).
With respect to Mr. Harris’s stated concerns, the Government has already agreed to a
number of limitations on Mr. Monturo’s testimony, chief among them that he will not use terms
such as “match,” he will “not state his expert opinion with any level of statistical certainty,” and
he will not use the phrases “to the exclusion of all other firearms” or “to a reasonable degree of
scientific certainty” when giving his opinion. Govt. Opp’n at 12. These limitations are in
accord with the Department of Justice Uniform Language for Testimony and Reports for the
Forensic Firearms/Toolmarks Discipline—Pattern Matching Examination. See Govt. Opp’n, Ex.
4 (“DOJ ULTR”), ECF No. 28-4. The DOJ ULTR permits firearms examiners to conclude that
casings were fired from the same firearm when all class characteristics are in agreement, and
“the quality and quantity of corresponding individual characteristics is such that the examiner
would not expect to find that same combination of individual characteristics repeated in another
source and has found insufficient disagreement of individual characteristics to conclude they
originated from different sources.” Id. at 2–3. This Court believes, as other courts have also
concluded, see Hunt, 2020 WL 2842844, at *8, that the testimony limitations as codified in the
DOJ ULTR are reasonable and should govern the testimony at issue here. Accordingly, the
Court instructs Mr. Monturo to abide by the expert testimony limitations detailed in the DOJ
ULTR.
III. CONCLUSION
For the foregoing reasons, Defendant’s Motion to Exclude Expert Testimony as to
Firearm Examination Testing, ECF No. 22, is DENIED. An order consistent with this
Memorandum Opinion is separately and contemporaneously issued.
Dated: November 4, 2020 RUDOLPH CONTRERAS
United States District Judge