LAJAYVIAN D. DANIELS v. STATE OF FLORIDA (2021)


DISTRICT COURT OF APPEAL OF THE STATE OF FLORIDA
FOURTH DISTRICT
    LAJAYVIAN D. DANIELS,
    Appellant,
    v.
    STATE OF FLORIDA,
    Appellee.
    No. 4D19-822
    [February 24, 2021]
    Appeal from the Circuit Court for the Fifteenth Judicial Circuit, Palm
    Beach County; Joseph George Marx, Judge; L.T. Case No. 50-2015-CF-
    009320-AXXX-MB.
    Carey Haughwout, Public Defender, and Christine C. Geraghty,
    Assistant Public Defender, West Palm Beach, for appellant.
    Ashley Moody, Attorney General, Tallahassee, and Jessenia J.
    Concepcion, Assistant Attorney General, West Palm Beach, for appellee.
    CONNER, J.
    Appellant, Lajayvian D. Daniels, appeals his convictions and sentences
    for first degree murder with a firearm and robbery with a firearm, raising
    five issues on appeal. We affirm without discussion the trial court’s rulings
    on four issues, but we write to explain our affirmance on the fifth issue
    because it concerns a matter of first impression in Florida. The issue
    involves the admissibility of expert evidence of a probabilistic genotype
    software program used to analyze DNA samples collected while
    investigating a crime that contain mixtures of genetic material from
    multiple people. We determine the trial court properly admitted the
    evidence.
    Background
    Appellant was indicted for first degree murder with a firearm and
    robbery with a firearm in connection with a robbery and shooting resulting
    in the death of a gas station employee. For purposes of the issue we
    address on appeal, we focus on the pertinent facts and procedural
    background regarding the DNA evidence Appellant argues was erroneously
    admitted.
    The forensic quality assurance manager for the Palm Beach County
    Sheriff’s Office (“PBSO”) crime lab testified at trial about DNA samples
    collected from the articles discovered during the investigation.
    Additionally, she produced DNA reference profiles for four people:
    Appellant, the victim, a second suspect, and Appellant’s girlfriend. She
    compared those profiles to five DNA samples obtained from five clothing
    items believed to be related to the crime. The five samples contained a
    mixture of DNA from different persons. Because PBSO did not have
    sufficient statistical calculation tools to analyze samples with DNA
    mixtures, the forensic quality assurance manager sent the PBSO data files
    for those five samples to Cybergenetics, a private lab, for further analysis.
Cybergenetics specializes in DNA mixtures. Cybergenetics developed TrueAllele, computer software designed to analyze complex data to determine the individual profiles of genetic material in DNA mixtures.
    A DNA analyst at Cybergenetics testified at trial that the TrueAllele
    software can separate the different genetic types present in samples in
    order to calculate match statistics or the level of association between crime
    evidence and references. TrueAllele is a probabilistic genotyping system
    that relies on Bayesian probability modeling and “Markov Chain” and
    “Monte Carlo” statistical sampling. She explained that a Cybergenetics
    analyst entered the data compiled by the PBSO DNA analyst into the
    TrueAllele program and entered how many people were suspected of
    contributing to the mixture. TrueAllele then separated the genetic types
    present in the five clothing samples into individual profiles and compared
    those profiles to a reference sample to calculate a match statistic. The
    Cybergenetics DNA analyst that testified at trial was not the same
    Cybergenetics analyst that entered the PBSO lab data into the TrueAllele
    software, but she was one of the three reviewers of the TrueAllele analysis
    required as a standard protocol by Cybergenetics. She was the witness
    who authenticated and discussed at trial the TrueAllele analysis report
    admitted into evidence.
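The following minimal sketch illustrates the general idea of a likelihood-ratio "match statistic" in the simplest single-source setting; it is not TrueAllele's method (which models mixtures with Bayesian inference and Markov chain Monte Carlo sampling and is proprietary), and the loci, genotypes, and allele frequencies shown are hypothetical.

```python
# Illustrative sketch only: a grossly simplified likelihood-ratio ("match
# statistic") for a single-source, three-locus comparison. Probabilistic
# genotyping systems such as TrueAllele instead infer genotype probability
# distributions for each contributor to a mixture using Markov chain Monte
# Carlo sampling; the loci and allele frequencies below are hypothetical.

# Hypothetical population frequencies for the alleles observed at each locus.
allele_freqs = {
    "D3S1358": {"15": 0.25, "16": 0.23},
    "vWA":     {"17": 0.28, "18": 0.20},
    "FGA":     {"22": 0.19, "24": 0.14},
}

# Genotype observed in the evidence sample and in the reference (suspect) sample.
evidence = {"D3S1358": ("15", "16"), "vWA": ("17", "18"), "FGA": ("22", "24")}
reference = dict(evidence)  # in this toy example the reference matches at every locus


def random_match_probability(genotype, freqs):
    """Probability an unrelated person has this genotype (Hardy-Weinberg assumption)."""
    a, b = genotype
    return freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]


likelihood_ratio = 1.0
for locus, observed in evidence.items():
    # Numerator: probability of the evidence if the reference person contributed
    # (1 when the genotypes match, 0 otherwise, in this simplified model).
    numerator = 1.0 if reference[locus] == observed else 0.0
    # Denominator: probability of the evidence if an unrelated person contributed.
    denominator = random_match_probability(observed, allele_freqs[locus])
    likelihood_ratio *= numerator / denominator

print(f"The match is about {likelihood_ratio:,.0f} times more probable than coincidence")
```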
    The TrueAllele analysis found a statistical match between one clothing
    item and Appellant that was 872 trillion times more probable than a
    coincidental match to an unrelated person. As to a second item, the match
    was 77.1 million times more probable than a coincidental match to an
    unrelated person. As to a third item, the match was 194 quadrillion times
    more probable than a coincidental match to an unrelated person. As to a
    fourth item, the match was 789 billion times more probable than a
    coincidental match from an unrelated person.
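Expressed as a formula, each reported figure is a likelihood ratio comparing two hypotheses; as a simplified illustration (the hypothesis wording below is a paraphrase, not a quotation from the testimony), the first figure corresponds to:

```latex
\[
LR \;=\; \frac{P(\text{DNA evidence} \mid \text{Appellant contributed to the sample})}
              {P(\text{DNA evidence} \mid \text{an unrelated person contributed})}
\;\approx\; 8.72 \times 10^{14}\ \text{(i.e., 872 trillion times more probable)}.
\]
```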
    The jury returned a verdict of guilty as charged on both counts. The
    trial court sentenced Appellant to concurrent life sentences.
    Appellant’s Motion to Exclude TrueAllele Evidence
    Prior to trial, the defense filed a “Motion to Exclude the Interpretation
    of DNA Mixtures by the TrueAllele Software Due to the Failure to Perform
    the Required Internal Validation.” The motion argued that this evidence
did not meet the requirements for admissibility under Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993) or Frye v. United States, 293 F. 1013 (D.C. Cir. 1923), because the TrueAllele program was not “internally validated” prior to being used on the data generated at PBSO.
    After acknowledging that TrueAllele evidence has never been litigated in
    Florida and the issue of the lack of internal validation has not been ruled
    upon in any other jurisdictions regarding the TrueAllele software, the
    motion focused on an unpublished New York county court decision issued
    in People v. Hillary, 1 which addressed an issue of a lack of internal
    validation with regards to STRmix, a different probabilistic genotype
    software.
    The matter proceeded to a lengthy hearing. The State’s witness at the
    hearing was the Cybergenetics DNA analyst who testified about the
    development of TrueAllele as a validated probabilistic genotyping system.
    She explained the science behind the software and how it works to
    determine probabilities. She testified that, in this case, PBSO sent data
    on five different samples collected from clothing for interpretation.
    The Cybergenetics DNA analyst also testified that she co-authored two
    peer-reviewed publications regarding the TrueAllele method of analyzing
    DNA and had participated in twelve additional studies involving validating
    computer programs or reviewing different aspects of the computer
    interpretation. She had testified in nine other cases prior to the instant
    case and each time her testimony concerned DNA interpretation and
    TrueAllele. She also stated there have been thirty-five validation studies
    regarding TrueAllele, and TrueAllele analysts from Cybergenetics have
    testified seventy-six times in court. She said that seven validation peer-
    1 A copy of People v. Hillary, Decision & Order on DNA Analysis Admissibility,
    Indictment No. 2015-15 (N.Y. St. Lawrence Cty. Ct. Aug. 26, 2016) was attached
    to Appellant’s motion.
    3
    reviewed papers on the TrueAllele case work system established that “the
    error rates, the sensitivities, specificity and reproducibility of the system
    as well as its accuracy,” reflected that the results it was getting made
    sense. She testified that there have been sixteen times in multiple states
    that TrueAllele was challenged under Daubert or Frye, and in each case,
    the results were ruled admissible.
    The Cybergenetics DNA analyst further testified that in 2014, she co-
    authored a “validation study,” which she explained refers to a test done on
    any new method or system or computer method to determine that it is
    working as expected, so that it can be tested on a wide variety of data. She
    explained there are two types of validation studies: developmental and
    internal, which are done to ensure that the system is reliable and establish
    any limits or error rates. Developmental validation refers to tests done by
    the manufacturer of any method or system to ensure scientific accuracy.
    Internal validation, as it relates to forensic science, refers to validation
    performed by labs to follow the FBI quality assurance standards for access
    to the Combined DNA Index System (“CODIS”). Notably, the Cybergenetics
    DNA analyst testified that in eight of the admissibility challenges against
    TrueAllele in prior cases where the TrueAllele evidence was ruled
    admissible, there was never any internal validation done on the lab from
    which the data came nor was the lack of internal validation on a specific
    lab’s data an issue for the reliability of the evidence. She noted that a
    challenge similar to that raised by the defense in this case to the internal
    validation was never raised in any of the prior legal cases of which she was
    aware. She moreover testified she was not aware of any studies or papers
    reflecting that, without an internal validation, the TrueAllele results are
    not scientific.
    Concerning People v. Hillary, the Cybergenetics DNA analyst testified
    that when using STRmix, internal validation might be necessary, but the
    same is not true for TrueAllele:
    A: Right, we do not use STRmix, we use TrueAllele.
    Q: All right, and so you don’t – do you have to have internal
    validation like this Dr. Buckleton was doing where he was
    picking and choosing himself the parameters?
    A. No, with TrueAllele you just put in – we put in all of the
    data. The computer can learn from that data, figure out the
    different parameters that he had to input. There is no
    calibration that needs to be done with TrueAllele. It can learn
    it all from the data. The mathematical model is sophisticated
    enough that you can put in all the data above the baseline, it
    can solve for all those different parameters. You do tell it a few
    things, like how many contributors or how long to run for but
    not too much more than that.
    (emphasis added). In contrast to STRmix, the Cybergenetics DNA analyst
    explained:
    [W]ith Hillary, the reason why the internal validation would be
    important is because they do need those calibration settings
    to be able to properly run the [STRmix] program on data from
    a specific lab because it’s dependent on knowing information
    about that data ahead of time, whereas TrueAllele is not.
    The Cybergenetics DNA analyst reiterated that TrueAllele’s sophisticated
    scientific model does not require calibration in order to be scientifically
    reliable and that this concept had been scientifically tested many times
    and is accepted within the community as scientifically reliable.
    She further testified that the Scientific Working Group on DNA Analysis
    Methods (“SWGDAM”) issues guidelines and recommendations, rather than
    mandates, which are then passed on to the FBI to update their quality
    assurance guidelines for public crime labs. The Cybergenetics DNA
    analyst testified that the latest SWGDAM recommendation, issued in
    2015, is applicable to validation of probabilistic genotyping systems, like
    TrueAllele. However, she explained that public crime labs utilizing such
    systems themselves are required to follow those quality assurance
    standards, part of which is the internal validation, in order to access
    CODIS, but those standards do not apply to private laboratories, like
    Cybergenetics, which do not have access to CODIS. In other words, the
    SWGDAM guidance for internal validation is for public labs running their
    own tests to be able to maintain CODIS access and verify their handling
    of raw specimens. The Cybergenetics DNA analyst further testified:
    [V]alidation is important for any scientific method. There are
    thirty-five TrueAllele validation studies. Seven of them have
    been published in peer reviewed papers. And they test all
    kinds of data, including the types of data that were produced
    in this case, the same kits, the same sequencers.
    (emphasis added).     In this regard, the Cybergenetics DNA analyst
    explained that internal validation of the in-house data from PBSO was not
    required before being used on case work tested at the PBSO lab where
    such has been done on the same types of data and where there is no
    special setting that Cybergenetics would have to input for any kind of data.
    The Cybergenetics DNA analyst also added that the SWGDAM guidelines
    concerning probabilistic genotyping systems were issued in June 2015
    after all of Cybergenetics’s analysis in this case, and the SWGDAM
    guidelines state that they are not to be applied retroactively.
    On cross examination, the Cybergenetics DNA analyst clarified and
    reiterated that Cybergenetics does not produce actual DNA data profiles,
    but rather, only interprets them. She further reiterated that Cybergenetics
    does comply with the SWGDAM as to all of the applicable guidelines,
    noting that it has done developmental validations and that she herself has
    done internal validations on TrueAllele on different kinds of data. The
    Cybergenetics DNA analyst confirmed that the TrueAllele operator reviews
    the data to determine how many contributors there might be, and if there
    is a question, for example, as to whether there are three or four
    contributors, the operator can have TrueAllele solve for both scenarios and
    provide the information for both scenarios. She also confirmed that the
    operator can adjust the degradation option, but stated that changing the
    input for either factor does not materially affect the result.
    The defense called the forensic quality assurance manager for the PBSO
    crime lab as a witness at the motion hearing. She testified that because
    the PBSO lab is accredited by the FBI, it must follow the FBI’s standards,
    which require internal validation of all DNA methods used. She testified
    that PBSO now uses STRmix in their lab, for which PBSO conducted an
    internal validation. However, she testified that PBSO does not follow every
    SWGDAM guideline if it determines it does not apply to the system it is
    using or mixtures it is interpreting. The PBSO forensic manager testified
    that the data in this case was generated at PBSO using equipment and
    kits PBSO routinely uses and had been internally validated prior to use.
    She testified that labs using the same testing kits can use different settings
    such as the number of analysis cycles they run which can have differences
    in the data generated.
    On cross examination, the PBSO forensic manager testified she has not
    been trained on either the STRmix or the TrueAllele systems. She testified
    that: “One of the things that the STRmix does in our system is it uses our
    thresholds, it uses our validation, everything that we have goes into the
    system so that it takes all of that into account.” However, she contrasted
    this with TrueAllele, stating that her understanding of
    how TrueAllele works is that you don’t need any of those types
    of data, that it’s self – the program self-teaches when it looks
    at the data. It doesn’t use stochastic thresholds and things of
    that nature, that it takes longer to run and it analyzes and
    self-teaches with the data.
    The defense also called an adjunct professor who teaches forensic
    science and who had previously been the quality assurance manager for a
    small unrelated private lab. She testified that SWGDAM guidelines are
    regarded as the best practices in the field. She explained that validation
    is a scientific principle and that for probabilistic genotyping software,
    internal validation is “ground truth testing,” where the operator creates a
    mixed sample from known individuals and tests the software to see if it
    makes the proper inclusions and exclusions. Using in-house data (in this
    case, data derived at PBSO) to create the known mixture is important
    because the data fluctuations can affect the analysis. She testified that
    part of internal validation at the lab generating the data is determining
    what is good data and when results are reliable, so as to learn the system
    and know its limitations with the individual crime lab’s data. She testified
    that things like baseline and instrumentation can impact the data and
    that the limitations of the software program need to be defined for the data
    at the laboratory that is developing it. She opined that a Cybergenetics
    analyst making subjective adjustments in the number of cycle times the
    program performs an analysis and other parameters can impact what is
    reported and that a Cybergenetics analyst is not in the best position to
    know what is the best representation of that data since he or she has not
    studied the generating laboratory or the data that is being produced at
    that lab. She testified that based on documents she reviewed, there was
    no internal validation done using known test sample data created by PBSO
    in connection with the TrueAllele system. Finally, she opined that this
    internal validation requirement should be imposed as to TrueAllele just as
    it is on STRmix, citing to the Hillary case.
    On cross examination, the professor admitted she had not prepared
    any type of report of her analysis or opinions in this case. Significantly,
    she also admitted she had never been associated with TrueAllele or trained
    on how it works. She further admitted she had not published any scientific
    journals or conducted research on whether TrueAllele was reliable without
    internal validation. Moreover, she explained the basis for her opinion on
    TrueAllele was her own “self-study” from what was provided by the public
    defender and in videos. She also testified she did not perform a validation
    to say whether there was a problem with the data.
    At the conclusion of the hearing, defense counsel agreed that the
    SWGDAM guidelines were not mandated, but argued they were generally
    accepted guidelines and best practices which include internal validation,
    which was not done in this case by PBSO.
    The State maintained that TrueAllele was not a lab running forensic
    DNA analysis, but was rather a machine calculating mathematical
    problems. It pointed to the published TrueAllele validation papers,
    asserting that the scientific community has validated TrueAllele and has
    embraced its procedures. The State argued that both the Daubert and
    Frye standards were met by the evidence it submitted.
    After the hearing, the trial court ruled that the TrueAllele expert
    testimony was admissible, finding that the TrueAllele analysis results in
    this case met the requirements of Frye.
    Appellate Analysis
    A trial court’s ruling regarding the admissibility of expert testimony is
reviewed on appeal for abuse of discretion. Kemp v. State, 280 So. 3d 81, 88 (Fla. 4th DCA 2019). “However, ‘that discretion is limited by the rules of evidence.’” Vitiello v. State, 281 So. 3d 554, 559 (Fla. 5th DCA 2019) (quoting Michael v. State, 884 So. 2d 83, 84 (Fla. 2d DCA 2004)).
    When Appellant filed his motion to exclude the DNA interpretations by
    the TrueAllele software program, the law was unclear as to whether the
    Florida Legislature’s adoption of the Daubert standard for the admission
    of expert evidence was constitutional. Hence, Appellant’s motion sought
    exclusion of the evidence under both Frye and Daubert. Approximately a
    month before the trial court ruled on the motion, our supreme court issued
its opinion in DeLisle v. Crane Co., 258 So. 3d 1219 (Fla. 2018) declaring
    the legislative amendment to the Evidence Code unconstitutional. Id. at
    1229. Hence, the trial court issued its ruling applying the Frye standard.
    Two months after this appeal was filed, however, the supreme court
    adopted the legislature’s amendments to the Evidence Code as rules of
    procedure, thus changing the evidentiary standard in Florida from Frye to
Daubert. See In re Amendments to Fla. Evidence Code, 278 So. 3d 551, 551–52 (Fla. 2019). Similar to the situation we confronted in Larocca v. State, 289 So. 3d 492 (Fla. 4th DCA 2020), we apply Daubert to the
    resolution of this case because the amendment to the Evidence Code
    implementing Daubert is procedural, making its application binding for
    our decision. See In re Amendments to Fla. Evidence Code, 278 So. 3d at
    552; Larocca, 289 So. 3d at 493 (explaining that “[u]nder Florida’s ‘pipeline
    rule,’ the ‘disposition of a case on appeal should be made in accord with
    the law in effect at the time of the appellate court’s decision rather than
    the law in effect at the time the judgment appealed was
rendered’” (alteration in original) (quoting Kemp, 280 So. 3d at 88)).
    The Daubert standard is codified under section 90.702, Florida
    Statutes, which provides:
    If scientific, technical, or other specialized knowledge will
    assist the trier of fact in understanding the evidence or in
    determining a fact in issue, a witness qualified as an expert
    by knowledge, skill, experience, training, or education may
    testify about it in the form of an opinion or otherwise, if:
    (1) The testimony is based upon sufficient facts or data;
    (2) The testimony is the product of reliable principles and
    methods; and
    (3) The witness has applied the principles and methods
    reliably to the facts of the case.
    § 90.702, Fla. Stat. (2018).
    Under Daubert, a trial judge is to function as a gatekeeper to “ensure
    that any and all scientific testimony or evidence admitted is not only
relevant, but reliable.” 509 U.S. at 589 (emphasis added). The gatekeeping
    function is “‘to ensure that speculative, unreliable expert testimony does
    not reach the jury’ under the mantle of reliability that accompanies the
    appellation ‘expert testimony.’” Kemp, 280 So. 3d at 88 (quoting Rink v.
Cheminova, Inc., 400 F.3d 1286, 1291 (11th Cir. 2005)). Although there
    is no definitive list of factors for the court to consider in making this
    determination, the Daubert court laid out several observations it deemed
    appropriate for consideration of the reliability inquiry, including: (1)
    “whether [the] theory or technique . . . can be (and has been) tested”; (2)
    “whether the theory or technique has been subjected to peer review and
    publication”; (3) “in the case of a particular scientific technique, the court
    ordinarily should consider the known or potential rate of error”; and (4)
“general acceptance.” Daubert, 509 U.S. at 593–94.
    The crux of Appellant’s argument on appeal is that the failure to
    internally validate the TrueAllele software using a test sample of PBSO-
    generated DNA data prior to using the program for case work rendered the
    TrueAllele analysis results unreliable under Daubert, and therefore, the
    trial court abused its discretion in permitting such evidence. Appellant
    supports his position by contending that there was no dispute that the
    SWGDAM guidelines (which require internal validation) are the best
    practices, and that failure to use the acknowledged best practices means
    the results obtained from the TrueAllele software would not be generally
    accepted in the DNA scientific community and, therefore, the TrueAllele
    analysis in this case was unreliable. Additionally, Appellant asserts that
    the internal validation study done with regard to one lab does not relieve
    the best practice of performing internal validations of other labs. Thus,
    Appellant asserts that Cybergenetics’s internal validation studies on DNA
    samples for other labs does not alleviate the failure to perform an internal
    validation study as to the PBSO lab.
    Appellant acknowledges there are no appellate cases from this state or
    any other jurisdictions determining that the failure of a lab to perform an
    internal validation study renders the results of a TrueAllele analysis
    inadmissible. Instead, Appellant relies on the unpublished New York
    county court ruling regarding the inadmissibility of an analysis using the
    STRmix software because the law enforcement agency did not internally
    validate the software. In seeking reversal, Appellant argues that while the
    Cybergenetics DNA analyst testified TrueAllele did not require calibration
    like STRmix, the operator still has to decide how many different people
    contributed to the DNA mixture and whether to turn off the degradation
    feature, such that these subjective factors inputted by the operator are the
    reason internal validation is necessary. However, we conclude that
    Appellant’s argument fails to establish the trial court abused its discretion
    in permitting the evidence.
    Upon review of the transcript of the hearing on the motion to exclude
    and, in particular, the pertinent background facts identified above, we are
    satisfied that the trial court properly assessed and concluded that the DNA
    statistical interpretation performed by the TrueAllele software program
    was reliable after considering: (1) the theory or technique has been tested;
    (2) the theory or technique has been subjected to peer review and
    publication; (3) the known or potential rate of error for the program; and
(4) the general acceptance of the program. See Daubert, 509 U.S. at 593–94. We are also satisfied that the trial court gave specific consideration to
    Appellant’s argument regarding the lack of internal validation but
    concluded the argument and evidence did not merit excluding the
    TrueAllele evidence. Although the trial court applied the Frye standard,
    we conclude the trial court’s analysis and conclusions would have been
the same under the Daubert standard. 2

2 The Frye standard is that “expert testimony should be deduced from generally accepted scientific principles.” DeLisle, 258 So. 3d at 1225. As discussed above, one of the indicia of reliability under Daubert is “general acceptance” in the scientific community. We note that in denying the motion to exclude, the trial court references and discusses evidence regarding all of the Daubert reliability factors.

In terms of assessing reliability under either standard, the focal point of Appellant’s argument in the trial
    court and on appeal has been the reliability factor of “general acceptance.”
    It is particularly significant that Appellant has cited no appellate decision
    in Florida or elsewhere to support his argument. It is also particularly
    significant that the defense expert in this case was not sufficiently familiar
    with the TrueAllele software to effectively opine as to how the failure to
    internally validate the software using PBSO-generated test data
    compromised the reliability of the analysis of the DNA samples collected
    from clothing during the criminal investigation of this case. The trial court
    did not abuse its discretion or violate any evidentiary rule by determining
    it was not convinced that the lack of internal validation in this case made
    the results unreliable. We therefore affirm on this issue, as well as the
    other issues raised on appeal.
    Affirmed.
    WARNER and FORST, JJ., concur.
    *         *         *
    Not final until disposition of timely filed motion for rehearing.
    
