What’s Your Plan?
Examining Mental Health Experts in Family Law
January/February 2023
This article discusses the challenges associated with examining mental health experts in Colorado family law cases and provides practical tips for evaluating their credentials and testimony.
How can you hold experts accountable to explain their work clearly and satisfy evidentiary requirements for reliable, trustworthy testimony? A key caselaw-based question offers a mindset for your approach: How do you know what you say you know? Although the question seems straightforward, lawyers struggle to apply it when examining the admissibility or quality of mental health expert testimony. This article addresses this key question by first showing how the Shreck line of cases, which fleshes out CRE 702, provides the legal basis for critiquing mental health testimony. It then discusses the PLAN Model1—a four-step, caselaw-based framework to critique experts’ work and testimony—and includes sample lines of questions illustrating how to apply the model to probe and develop compelling legal arguments related to experts’ work and testimony.
The Problem
Critiquing the work and testimony of mental health experts who have conducted parental responsibility evaluations (PREs) presents significant challenges for family law attorneys. An expert’s report and testimony, often jargon-laden, may seem to explain much about the litigant yet convey little meaning. The methodology, including psychological testing, may fail to address the court’s concerns. The reasoning, sometimes bias-infused, may not fit the evaluation data and case facts. And the recommendations, resting on these shaky foundations, are nonetheless asserted confidently. Further, because child and family investigators (CFIs) and PRE evaluators are court-appointed, some judges view these experts as neutral investigative arms of the court and may too readily dismiss challenges to the reliability of their testimony.
CRE 702 and Shreck Principles: More Than Admissibility
For many Colorado lawyers, applying People v. Shreck’s CRE 702 principles to test the reliability of mental health testimony further confuses the problem. Totality of the circumstances? General acceptance? Peer review? Error rates? Lawyers and courts often find these factors difficult to understand and apply.2 But merely understanding reliability factors and principles doesn’t provide tools to effectively challenge or support mental health testimony. Colorado’s CRE 702 caselaw emphasizes that the reliability of expert testimony is to be judged by the totality of the circumstances—a wide range of factors—of each specific case.3
Admissible expert testimony must be grounded in “the methods and procedures of science rather than subjective belief or unsupported speculation.”4 And a court may reject expert testimony that is connected to existing data only by the expert’s bare assertion5—the reliability of an expert’s methods and reasoning does not rest on the expert’s degrees or reputation. Critiquing mental health testimony in Colorado courts requires more than mechanically applying legal principles. Harnessing the flexibility and breadth of a Shreck CRE 702 analysis to challenge mental health testimony requires a plan.
To determine the admissibility of expert evidence in Colorado courts, the trial judge must gauge the scientific principles’ reliability, the expert’s qualifications to opine on the testimony’s subject matter, and the testimony’s usefulness to the fact finder.6 Some lawyers view these requisites only as a test for the admissibility of scientific expert evidence. However, litigators should anticipate that because this scheme sharply tilts toward admitting evidence, courtroom arguments over the testimony will likely shift from admissibility challenges to disputes over the testimony’s meaning and weight,7 particularly when the admitted testimony is less than sterling. Three points discussed below show why.
Liberal Admissibility Standards
Colorado courts will likely give psychological testimony considerable leeway when considering its admissibility. Masters v. People, a case involving a psychologist’s testimony, states that “because social science attempts to highlight complex behavior patterns, it is necessarily inexact”8—implying a less-than-rigorous admissibility standard. Also, Masters notes that “syndrome and framework evidence” brings social science insight to trials if that evidence is “reasonably reliable” and helpful to the jury,9 even though psychology’s literature points to the unreliability of some syndrome-based testimony. Finally, because court-appointed PRE evaluators will have met the threshold qualifications listed in CRS § 14-10-127(4), an admissibility challenge on qualifications grounds will likely fail.
Wide Range of Factors Considered
The trial court in a Shreck admissibility hearing should consider the totality of the circumstances—a broad inquiry that accounts for a wide range of factors.10 Shreck details several factors, including those from Daubert v. Merrell Dow Pharmaceuticals, Inc., that judges may use to inform their decisions about the quality of the testimony. Not all factors always apply, and judges may consider others besides those listed.11 The Shreck factors include:12
- whether the technique can and has been tested
- whether the theory or technique has been peer reviewed and published
- the scientific technique’s known or potential rate of error
- whether the technique has been generally accepted
- the relationship of the proffered technique to more established modes of scientific analysis
- the existence of specialized literature discussing the technique
- the nonjudicial uses of the technique
- the frequency and type of error generated by the technique
- whether such evidence has been offered in previous cases to support or dispute the merits of a particular scientific procedure.
These factors are not defined legal principles. Rather, researchers in academia use most of the factors when they critique the quality of each other’s writings, methodology, and reasoning. Courts expect expert testimony to reflect “the same level of intellectual rigor that characterizes the practice of an expert in the relevant field.”13 As a result, besides admissibility considerations for the trial judge, the Shreck factors can also be viewed as practical, science- and logic-based tools to test the quality of admitted mental health expert testimony and sharpen legal arguments regarding the quality of that testimony. Shreck’s factors can be viewed much like Daubert’s, which “provide a rich framework by which expert evidence can be judged.”14
Reasonably Reliable Standard
The quality and admissibility of expert testimony is held to a “reasonably reliable” standard under the flexible, liberal CRE 702 framework.15 And the standard of review regarding the admissibility of expert testimony is “highly deferential.”16 As a result, trial courts will likely admit most challenged expert testimony. Further, the court has discretion whether to conduct a Shreck admissibility hearing.17 To the criticism that the “reasonably reliable” admissibility test is too liberal, CRE 702 caselaw stresses that such concerns are mitigated by traditional trial means, such as “[v]igorous cross-examination, presentation of contrary evidence, and careful instruction on the burden of proof.”18 Thus, lawyers should prepare for vigorous challenges to the quality of expert testimony, even after the testimony is admitted—using a focused approach driven by our primary question of experts: How do you know what you say you know?
The PLAN Model to Organize and Critique Mental Health Testimony
The PLAN Model is a practical, caselaw-based framework to critique and examine mental health expert testimony, whether for admissibility or for assessing the quality of already admitted expert work and testimony.
The PLAN Model and Shreck
The PLAN Model highlights two features of Colorado and federal caselaw on expert witness testimony: that the court should judge an expert’s qualifications separately from the quality of the expert’s methods and reasoning, and that the expert’s testimony should have a reliable, trustworthy basis in the knowledge and experience of professional psychology. Practically, the PLAN Model helps lawyers organize and critique mental health work and testimony, marshal or defend admissibility challenges, focus direct- and cross-examinations of experts, and sharpen written and oral legal arguments.19
The PLAN Model corresponds to Shreck’s requirements for the admissibility of expert testimony and to Colorado statutes that detail prohibited activities for mental health professionals20 and that outline requirements for conducting PREs.21 Recall Shreck’s holding that a CRE 702 admissibility hearing requires the trial judge to address three elements: (1) the reliability of the scientific principles, (2) the qualifications of the witness, and (3) the usefulness of the testimony to the fact finder.22 When determining the reliability of scientific evidence, the court’s inquiry should consider the totality of the circumstances, weighing a wide range of factors.23 Then the court should apply its discretionary authority under CRE 403 to ensure that the probative value of the evidence is not substantially outweighed by unfair prejudice.24
The four steps of the PLAN Model are:
1. Determine the expert’s qualifications to testify.25
2. Determine whether the expert’s methods conform to relevant professional standards.26
3. Evaluate the empirical and logical connections between the data from the expert’s methods and the expert’s social science conclusions.27
4. Gauge the connection between the expert’s conclusions and opinions.28
Using the PLAN Model
The best way to use the PLAN Model is to consider each step sequentially. Problems at any step of the model allow you to focus your critiques and direct- or cross-examinations of experts on specific legal and professional psychology issues associated with that respective step. Pointing to problems at any step also allows you to target legal arguments that reflect the testimony’s strengths and weaknesses.
Step 1: Determine the Expert’s Qualifications to Testify
Step 1 focuses on whether the expert has the expertise to testify on the topics about which they are offering opinions—an evidentiary matter that goes beyond meeting statutory qualifications to conduct a PRE. An expert must be qualified “by knowledge, skill, experience, training, or education.”29 This demand raises two concerns: How strictly should these elements apply to admissibility of the testimony? How qualified is qualified?30 Exploring these questions during direct examination can demonstrate the expert’s credibility; on cross, the questions may sharply critique the testimony.
But lawyers often slide by experts’ qualifications, assuming that an expert must be qualified if the expert has a Ph.D., has a good reputation, or is court-appointed. Caselaw highlights the problem. The US Supreme Court in Kumho Tire Co. v. Carmichael notes that “there are many different kinds of experts, and many different kinds of expertise.”31 A Texas Supreme Court case more sharply defines the principle: Trial courts must “ensur[e] that those who purport to be experts truly have expertise concerning the actual subject about which they are offering an opinion.”32 Shreck, echoing this demand, asserts that when determining whether evidence is reliable, the court should consider whether the witness is qualified to opine on such matters.33 Finally, a Colorado mental health professional is prohibited from providing services outside their area of training, experience, or competence.34
Step 2: Determine Whether the Expert’s Methods Conform to Relevant Professional Standards
Examining the methods upon which experts base their conclusions and opinions is critical. Inadequate methods produce faulty data. Such data cannot support reliable expert conclusions, opinions, and recommendations.
How should courts determine the reliability of methods experts use to produce the data that support their conclusions and opinions? As referenced earlier, experts’ testimony must employ “the same level of intellectual rigor that characterizes the practice of an expert in the relevant field.”35 In People v. Ramirez, the Colorado Supreme Court noted that admissible expert testimony must be grounded in “the methods and procedures of science rather than subjective belief or unsupported speculation.”36 Further, federal caselaw directs courts to “ensure that the opinion comports with applicable professional standards outside the courtroom . . . .”37
For PRE evaluators, applicable professional standards may be identified from three sources: (1) evaluators’ ethics and licensing codes, (2) protocols or generally accepted procedures in the professional literature, and (3) professional practice guidelines of relevant mental health organizations. Examples of the professional standards found in each source are discussed below.
Evaluators’ ethics and licensing codes. A key Colorado statute prohibits Colorado mental health professionals from acting “in a manner that does not meet the generally accepted standards of the professional discipline under which the person practices.”38 Generally accepted standards may include standards of practice recognized by national organizations of practitioners in the evaluator’s professional discipline.39 For psychologists, these standards include the APA’s Ethical Principles of Psychologists and Code of Conduct (APA Ethics Code).40 The APA Ethics Code, the most comprehensive and detailed of the mental health disciplines’ ethics codes, sets out key demands of psychologists with which lawyers should be familiar when they critique experts’ methods.
For example, APA Ethics Code standard 2.04 states that “[p]sychologists’ work is based upon established scientific and professional knowledge of the discipline,” and standard 9.01(a) provides that “[p]sychologists base the opinions contained in their recommendations, reports, and diagnostic or evaluative statements, including forensic testimony, on information and techniques sufficient to substantiate their findings.”
Protocols or generally accepted methods in the professional literature. For example, empirically based protocols have been developed for questioning young children in sexual abuse evaluations.41 Strictly following these protocols is essential to minimize the effects of suggestibility and leading questions during the interview with the child.
In addition, competent PRE evaluators should follow generally accepted methods.42 These methods can be organized into three categories: (1) interviews of the parents and children, (2) psychological testing (when the evaluator is a psychologist) and completion of questionnaires, and (3) collateral source information—interviews with persons with relevant information about the case and reviews of relevant records.43 Think of these three categories as three legs of a footstool—a good metaphor for your PRE methods arguments. If one of these methods categories falls short—inadequate interviews, poor test choices or interpretations, or insufficient information from collateral sources—the footstool will become unstable, if not unusable. As noted below, the PRE billing statement or invoice provides clues about the methods and the data used in the evaluation.
Professional practice guidelines of the major mental health organizations. Recall that Colorado statutes for mental health professionals define generally accepted standards as including “standards of the professional discipline under which the person practices.”44 The primary PRE-related practice guidelines include publications for conducting child custody and forensic evaluations from the APA45 and the AFCC,46 a well-regarded multidisciplinary organization of mental health and legal professionals. Colorado has an active state AFCC chapter.
Lawyers should understand how each source applies to an expert’s testimony—a consulting expert can help with this task. Of course, merely following professional standards does not ensure the reliability or trustworthiness of the resulting opinions. But not following these standards is “powerful evidence that the opinion’s reasoning and supporting methodology may be invalid.”47
Step 3: Evaluate the Empirical and Logical Connections Between Data and Conclusions
Conclusions differ from opinions. Conclusions are social science-based inferences (evidence plus reasoning) that experts choose to explain their evaluation data and case facts; opinions apply those conclusions to legal standards addressed in the case.48 For example, a father is depressed (conclusion), based on social science-based inferences informed by psychological testing, interviews with father and other relevant sources, and review of father’s counseling records. The seriousness of father’s depression may affect the expert’s opinion about what parental responsibility arrangements serve the child’s best interest (legal standard).
The strength of an expert’s inferences determines how much the conclusions can be trusted. Nassim Taleb’s assertion that “science lies in the rigor of the inference” captures the idea.49 General Electric Co. v. Joiner, the second case in the US Supreme Court’s Daubert “trilogy,” echoes this point: a court may conclude that there is simply too great an analytical gap between the data (e.g., interpretations of a child’s drawings) and the expert’s conclusion or opinion (e.g., the opinion that the child has been abused).50 The greater the gap, the weaker or less rigorous the inferences and the more likely the expert is offering unsupported, speculative opinions. People v. Ramirez notes that unreliable speculative testimony under CRE 702 is opinion testimony that has no analytically sound basis.51 And the Colorado statute outlining PRE methods notes that a conclusion in a PRE report should explain how the resulting recommendations were reached from the data collected.52
Another important step 3 task is to determine whether experts are hiding—purposely or unwittingly—wide gaps in the inferences or reasoning with which they connect their data to their conclusions, making their testimony appear stronger than it is. Mental health experts hide reasoning deficiencies in several ways. For example, they may misapply or misrepresent research to support poorly based conclusions. Or they may rely on overly abstract but commonly accepted psychology-related terms (e.g., self-esteem or emotional trauma) to gild their conclusions.
In addition, evaluators may allow judgment biases to slant their conclusions, including:
- Confirmatory bias—seeking or interpreting evidence consistent with one’s views. For example, an evaluator, believing that equal parenting time is always the best arrangement for children, dismisses evidence that would better fit a different parenting plan.
- Hindsight bias—believing that the litigant should have predicted a past event’s outcome even though the evaluator does not account for all the variables—the “messiness of life”—with which the litigant had to deal. For example, an evaluator faults a mother for leaving her toddler unsupervised when he burned his hand on the kitchen stove, without crediting that she stepped away quickly because she heard her infant’s shriek in the bedroom down the hall.
- Halo effect—favoring positive first or limited impressions, good or bad, of one or more family members based on limited interviews or information.
- Overconfidence bias—“I’m certain I’m right no matter what the evidence shows.”
The best indicator that a PRE evaluator sought to manage biases is whether the evaluator considered reasonable alternative explanations of the data while collecting the data.53 Further, the APA stresses that examining issues from all reasonable perspectives and seeking information that will differentially test plausible rival hypotheses reflects the expert’s professional integrity.54
Step 4: Gauge the Connection Between the Expert’s Conclusions and the Proffered Expert Opinion
Opinions apply social science-based conclusions to the legal standard being addressed (e.g., best interest of the child or termination of parental rights). For example, if the father’s parenting is compromised by his depression (conclusion), what living and access arrangements, now and in the future, are in the child’s best interests (opinion)? As with conclusions, the strength of an opinion is measured by Joiner’s analytical gap test—“A court may conclude that there is simply too great an analytical gap between the data and the [conclusion or] opinion proffered.”55
Step 4 addresses two opinion-related concerns. First, is the evaluator coloring the legal standard with their beliefs? For example, the best interest of the child standard is susceptible to beliefs about children, parenting, and families. While Colorado statutes enumerate factors that the court shall consider, among all relevant factors, when determining the best interest of the child,56 those factors are freighted with psychological, emotion-laden terms that may draw on an expert’s personal and professional experiences with families. It’s easy to see how an expert’s values may mix, even unwittingly, with best interest standards, which include the wishes of the child, parent-child interactions and relationships, and encouraging the sharing of love, affection, and contact between the child and the other party.57
The second opinion-related concern relates to an expert’s recommendations. In PRE reports, the recommendations, reflecting the evaluator’s opinions, should flow from reliable social science-based conclusions (step 3) derived from data produced by reliable methodology (step 2) and sound reasoning (step 3). In step 4, consider the following two issues related to recommendations.
First, do the recommendations follow from the report’s conclusions? Or does a recommendation seem like a conventional cookie-cutter suggestion that might appear in any PRE? Too often, evaluators’ recommendations don’t follow from the narrative and data discussed in their reports. To help address this problem, the APA suggests that the PRE should follow a three-pronged model, focusing on parenting attributes, the child’s psychological needs, and the resulting fit.58 The child’s psychological best interests recommendations (the resulting fit prong) should derive from reliable assessments of the parent’s attributes and of the child’s psychological and developmental needs. If the evaluator doesn’t adequately assess the parent and child prongs or if the information from those two prongs doesn’t reasonably lead to the “resulting fit,” you may question the reliability of the expert’s conclusions and recommendations.
Second, the APA defines what psychologists’ opinions and recommendations should entail, tying together professional psychology’s demands and the law’s requirements for reliable expert testimony. In its guidelines for custody evaluations in family law proceedings, the APA notes that “psychologists strive to employ a systematic approach that is designed to avoid biased and inadequately supported decision making . . . .”59
The following APA summary of the basis for recommendations captures the PLAN Model’s goal: “Psychologists attempt to convey their recommendations in a respectful and logical fashion, reflecting articulated assumptions, detailed interpretations, and acknowledged inferences that are consistent with established professional and scientific standards.”60 It reflects language from Daubert, Shreck, Ramirez, and the APA Ethics Code. Experts’ recommendations should adhere to this statement.
Applying the PLAN Model
As noted earlier, Colorado courts will likely admit social science testimony, even if the testimony’s reliability, though challenged in a Shreck hearing, is not deemed sterling. Thus, distinguishing good from questionable admitted testimony is critical. If the PRE is generally favorable to your client, highlight the strengths of the evaluation on direct examination. On cross-examination, show how one or more legs of a PRE’s three-legged stool are wobbly or broken.
With CRE 702 and principles from Shreck’s line of cases in mind, the PLAN Model helps structure depositions, develop chapters for direct- and cross-examinations, and sharpen legal arguments to the court. The sample lines of questions in the appendix, each slotted into a PLAN Model step, help organize inquiries of experts and develop arguments to present to the court. You may also include other lines of questions that flesh out the expert’s methods, reasoning, and recommendations and address important case facts.
Conclusion
Use the key caselaw-based question—How do you know what you say you know?—to orient your critiques of the work and testimony of mental health experts and sharpen your legal arguments about their testimony. To flesh out the question, understand how the Shreck line of cases provides legal principles for critiquing mental health testimony, even if the court admits the testimony. Next, use the PLAN Model to critique the work and testimony of experts (PRE evaluators, CFIs, or therapists), addressing each step—each supported by caselaw and professional psychology’s literature—in sequence. Then, apply the PLAN Model by slotting lines of deposition or trial exam questions into their corresponding PLAN Model step. Finally, organize and sharpen your legal arguments, step-by-step.
Notes
1. The PLAN (PsychologyLaw ANalysis) Model was developed by John A. Zervopoulos, coauthor of this article, who serves as a consultant on using the PLAN Model.
2. See, e.g., People v. Shreck, 22 P.3d 68, 70 (Colo. 2001); Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579, 600 (1993) (Rehnquist, C.J., concurring in part and dissenting in part); Dahir et al., “Judicial Application of Daubert to Psychological Syndrome and Profile Evidence,” 11 Psychol. Pub. Pol’y & L. 62, 75 (2005).
3. Shreck, 22 P.3d at 70.
4. People v. Ramirez, 155 P.3d 371, 378 (Colo. 2007).
5. Id. at 379 (citing Gen. Elec. Co. v. Joiner, 522 U.S. 136, 146 (1997)).
6. Shreck, 22 P.3d at 77.
7. de La Torre and Martin, “Is C.R.E. 702 the Full-employment Act for Experts? Practitioner Advice based on an Examination of the Shreck and Ramirez Holdings” (paper presented at Rocky Mountain Academy of Legal Studies in Business, Annual Conference, Vail, Colo., 2007).
8. Masters v. People, 58 P.3d 979, 989 (Colo. 2002).
9. Id.
10. Shreck, 22 P.3d at 70.
11. Id. at 77–78.
12. Id.
13. Kumho Tire Co. v. Carmichael, 526 U.S. 137, 152 (1999).
14. Faigman et al., Modern Scientific Evidence 21 (2d ed. Thomson West 2005).
15. Shreck, 22 P.3d at 77.
16. Ramirez, 155 P.3d at 380.
17. People v. Rector, 248 P.3d 1192, 1201 (Colo. 2011).
18. Shreck, 22 P.3d at 78 (quoting Daubert, 509 U.S. at 596).
19. See Zervopoulos, How to Examine Mental Health Experts: A Family Lawyer’s Handbook of Issues and Strategies (2d ed. ABA 2020).
20. CRS § 12-245-224.
21. CRS § 14-10-127.
22. Shreck, 22 P.3d at 70.
23. Id.
24. Id.
25. Shreck elements 2 and 3; CRE 401 and 402; CRS § 12-245-224(1)(h); CRS § 14-10-127(4) and (5).
26. Shreck element 1; CRS § 12-245-224(1)(g)(I).
27. Shreck element 1 and consideration of the totality of the circumstances in the case; CRS § 14-10-127(6)(a), (b).
28. Shreck element 1 and consideration of the totality of the circumstances in the case; CRE 403 to determine whether the probative value of the evidence is substantially outweighed by unfair prejudice; CRS § 14-10-127(7)(b).
29. CRE 702; FRE 702.
30. See, e.g., Broders v. Heise, 924 S.W.2d 148, 152 (Tex. 1996).
31. Kumho Tire, 526 U.S. at 150.
32. Broders, 924 S.W.2d at 152 (emphasis added).
33. Shreck, 22 P.3d at 77. See also People v. Martinez, 74 P.3d 316, 321 (Colo. 2003).
34. CRS § 12-245-224(1)(h).
35. Kumho Tire, 526 U.S. at 152.
36. Ramirez, 155 P.3d at 378.
37. Watkins v. Telsmith, Inc., 121 F.3d 984, 991 (5th Cir. 1997).
38. CRS § 12-245-224(1)(g)(I).
39. Id. See also Davis v. Bd. of Psych. Exam’rs, 791 P.2d 1198, 1204 (Colo.App. 1989).
40. Ethical Principles of Psychologists and Code of Conduct (APA 2017), http://www.apa.org/ethics/code/index.aspx.
41. See Poole, Interviewing Children (APA 2016); Poole and Lamb, Investigative Interviews of Children (APA 1998).
42. See CRS § 14-10-127(6)(a), (b).
43. Zervopoulos, supra note 19 at 83.
44. CRS § 12-245-224(1)(g)(I).
45. See APA, APA Professional Practice Guidelines, http://www.apa.org/practice/guidelines/index.aspx, for a current list of APA practice guidelines.
46. See AFCC, Practice Guidelines, https://www.afccnet.org/Resource-Center/Practice-Guidelines, for a current list of AFCC standards and guidelines.
47. Shuman and Greenberg, “The Role of Ethical Norms in the Admissibility of Expert Testimony,” 37 A.B.A. Judges J. 4 (1998).
48. Zervopoulos, supra note 19 at 139–44.
49. Taleb, Fooled by Randomness 72 (2d ed. Random House Trade Paperbacks 2005).
50. Joiner, 522 U.S. at 146.
51. Ramirez, 155 P.3d at 378.
52. CRS § 14-10-127(7)(b)(III).
53. See Zervopoulos, Confronting Mental Health Evidence: A Practical PLAN to Examine Reliability and Experts in Family Law 151–58 (2d ed. ABA 2015).
54. Specialty Guidelines for Forensic Psychology, guideline 9.01 (APA 2013), https://www.apa.org/practice/guidelines/forensic-psychology; 68 Am. Psychol. 7, 14–15 (2013).
55. Joiner, 522 U.S. at 146.
56. CRS § 14-10-124(1.5)(a).
57. Id.
58. Guidelines for Child Custody Evaluations in Family Law Proceedings, guideline 2 at 5 (APA 2022), https://www.apa.org/about/policy/child-custody-evaluations.pdf.
59. Id. guideline 22 at 21.
60. Id.