Forensic Science Graduate Research Projects

Recent Submissions

  • Publication
    Comparison of Commercial DNA Extraction Kits for Extraction of DNA From Bones Prepared Using the Novel Theralin Method
    (2023-11-06) Bowers, Madalyn; Wilson, Mark
    The extraction of DNA from bones to produce high quality samples is extremely important in cases involving unidentified human remains and victims of mass disasters, and many times in these cases, only human bone is available as a source of DNA that can be used to identify the individual. A novel method of preparing bones for extraction using TheralinTM, a precipitating fixative that demineralizes bone, has been developed by researchers at George Mason University. This novel method essentially transforms the bone into a gum-like state that allows for the bone material to be cut into smaller pieces and then digested using lysis buffers. Research has not yet been performed to determine an ideal method of DNA extraction from Theralin-prepared bone, and this study seeks to answer that question. In this study, four different methods of DNA extraction using commercial extraction kits were carried out on bone samples that were prepared using the Theralin method. These extraction methods were the Qiagen QIAamp® DNA Investigator kit protocol with MinElute columns, the PrepFilerTM BTA Forensic DNA Extraction kit protocol with PrepFiler magnetic beads, a combination method using the Qiagen lysis buffers with the PrepFiler magnetic beads, and a combination method using the PrepFiler BTA lysis buffers with the Qiagen MinElute columns. Following extraction with these four methods, the DNA extracts were quantified using the QuantifilerTM Trio DNA Quant kit, and the quantification values of the extracts were compared to determine which method provided the highest amount of DNA. The extracts were amplified using the GlobalFilerTM PCR Amplification kit and genotyped using the Applied Biosystems 3500 Genetic Analyzer with GeneMapperTM ID-X v1.6 software. The STR profiles were compared to the known profiles of the bone samples and analyzed for allelic dropout and signal strength metrics to determine the quality of each DNA extract. The results of this study provided insight into the usefulness and efficiency of the Theralin method of bone preparation, combined with a commercially available DNA extraction technique, to give quality DNA extracts that can be used for downstream applications.
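    As a rough illustration of the quantification comparison described above, the short Python sketch below ranks the four extraction methods by mean Quantifiler Trio concentration. Every concentration value and replicate count is a hypothetical placeholder, not a result from the study.

```python
# Hypothetical sketch: ranking four extraction methods by mean DNA yield.
# Every concentration value below is an illustrative placeholder, not a
# result from the study.
from statistics import mean, stdev

# Quantifiler Trio concentrations (ng/uL) for replicate Theralin-prepared
# bone extracts, grouped by extraction method (names follow the abstract).
yields = {
    "QIAamp Investigator (MinElute columns)":  [0.021, 0.034, 0.028],
    "PrepFiler BTA (magnetic beads)":          [0.045, 0.052, 0.039],
    "Qiagen lysis + PrepFiler beads":          [0.030, 0.027, 0.041],
    "PrepFiler BTA lysis + MinElute columns":  [0.036, 0.044, 0.040],
}

for method, values in yields.items():
    print(f"{method}: mean {mean(values):.3f} ng/uL, sd {stdev(values):.3f}")

# The method with the highest mean recovery (and acceptable STR quality,
# e.g. lowest allelic dropout) would be favoured for downstream typing.
best = max(yields, key=lambda m: mean(yields[m]))
print("Highest mean yield:", best)
```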
  • Item
    Comparison of Forensically Significant Carrion Insects at Buried vs. Surface Remains in Northern Virginia
    (2023-11-20) Kline, Sonja K.; DiZinno, Joseph
    Several factors complicate using necrophagous and other forensically important insects to determine a post-mortem interval in death investigations. First, insects demonstrate wide temporal ranges and spatial diversity, making it difficult to predict which species will be present. Secondly, carrion insects have consistent patterns in carcass size preference, and such patterns may indicate that dismembered remains, which represent smaller food sources, have different patterns of succession than those documented for whole cadavers. Although carrion insect succession has been well documented on whole human and porcine remains, there is limited documentation of insect succession for sectioned, or dismembered, remains. This study compares data collected from dismembered porcine remains in Central Virginia to published data for whole cadavers. Additionally, data was collected from surface-deposited dismembered and from buried remains to address differences between whole and dismembered remains under both types of conditions. Results of my study suggest that while fauna on surface dismembered remains followed a consistent successional pattern, the insects present were typical of later stages of decomposition on whole bodies. Surface dismembered remains also showed higher abundances of different Diptera taxa, such as Heleomyzidae, Sepsidae, and Phoridae, compared to the most common families – Calliphoridae, Muscidae, and Sarcophagidae – traditionally reported for whole remains. Additionally, the final amounts of tissue remaining were similar for both surface-deposited and buried dismembered remains, suggesting that smaller body parts may decompose equally rapidly regardless of the level of activity of insects or the diversity of arthropod fauna. Finally, fauna on buried remains consisted of understudied taxa that recent studies show have species-specific food preferences, and therefore, that might be of forensic significance, such as Collembola, Mycetophilidae, and Acariformes mites.
  • Item
    Frozen Justice: Detecting and Recovering Firearms Evidence at Snow Scenes
    (2023-05) Kruk, Kyle J.
    This research project identifies and explores the complexities associated with the detection and recovery of firearms evidence, namely casings, from areas in which snow is prevalent. By reviewing established crime scene processing methodologies, this research sought to determine what, if any, effect snow has on an investigator’s ability to find and collect casings when they are concealed underneath a layer of solid precipitation. Casings of eight different sizes were used to establish a mock scene at Fort Drum, New York, where the average annual snowfall regularly exceeds 100 inches. Two different brands of metal detectors were used to recover the casings, a Garrett CSI Pro and a Bounty Hunter Tracker IV. Bench tests performed with both detectors in controlled indoor conditions showed consistent responses to a distance of approximately eight inches across the eight calibers and gauges examined. During the recovery experiment approximately one month after initial deposition, the less expensive detector performed poorly when pitted against the more expensive model. The experiment resulted in the recovery of 48 of 80 casings, a 60% recovery rate. Neither detector was able to locate .22 Long Rifle casings at a depth of approximately 7”, indicating a relationship between size of the casing, depth of snow, and strength of the detectors.
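    The recovery-rate bookkeeping behind the figures above can be sketched as follows; only the 48-of-80 (60%) overall figure comes from the abstract, while the per-caliber breakdown, the caliber labels other than .22 Long Rifle, and the assumption of ten casings per caliber are illustrative guesses.

```python
# Recovery-rate bookkeeping for the snow-scene experiment. Only the overall
# 48-of-80 (60%) figure comes from the abstract; the per-caliber counts,
# most caliber labels, and the ten-casings-per-caliber assumption are
# hypothetical.
recovered = {
    ".22 LR": 0,   # abstract: no .22 LR casings located under ~7" of snow
    "9mm": 8, ".40 S&W": 7, ".45 ACP": 8,
    "5.56mm": 6, "7.62mm": 7, "12 gauge": 6, "20 gauge": 6,
}
deposited_per_caliber = 10

total_recovered = sum(recovered.values())
total_deposited = deposited_per_caliber * len(recovered)
print(f"Overall recovery: {total_recovered}/{total_deposited} "
      f"({100 * total_recovered / total_deposited:.0f}%)")

for caliber, n in sorted(recovered.items(), key=lambda kv: kv[1]):
    print(f"{caliber}: {n}/{deposited_per_caliber} recovered")
```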
  • Item
    Enhancing Forensic Science Education for Law Enforcement Personnel
    (2022-05) Anderson, Sydney
    As crime escalates throughout the United States, it is important that law enforcement personnel are aware of, and educated in, the vast array of forensic science disciplines that can, in many cases, aid in the identification of suspects. This research project aims to identify and examine the needs of the law enforcement community regarding forensic science education, with the goal of constructing easily accessible training programs and information repositories. Questionnaires to assess current knowledge and training, and to identify areas of need and improvement, were formulated and disseminated to law enforcement agencies across the United States. Forty-eight responses were received from officers/agents in 20 states. Thirty-eight respondents were invited to participate in focus groups, with 8 opting to participate in a final survey instead. An additional 23 individuals were excluded due to missing response fields or incorrect self-identification as members of the law enforcement community. The survey focused on assessing the needs for future trainings as identified and articulated by law enforcement personnel. Focus groups were conducted following the initial survey to learn in greater detail which training topics are the most valuable. Five focus group sessions were held with 2-4 attendees per session; each session lasted approximately one hour. Exploratory methodology was employed in the design and conduct of the focus groups to gather information regarding the favored method of training, desired tools, and information dissemination techniques. Following the focus groups, a final survey was distributed to the law enforcement community. A total of 161 responses were received from officers/agents nationwide; 85 respondents answered all survey questions, 17 answered some, and 59 answered none. Training programs are planned with the goal of providing accessible educational training, online and/or in person, to all individuals involved in law enforcement.
  • Item
    Fentanyl Degradation In Syringes Obtained From IV Drug Users in Washington, D.C.
    (2022-04-18) Montalvo Vargo, Shayla
    Considering the continued opioid epidemic, it is important to understand local drug trends to support public health initiatives. In 2020, the Public Health Laboratory (PHL) within the D.C. Department of Forensic Sciences (DFS) established a needle-exchange program to monitor local intravenous (IV) drug trends. The syringes are collected anonymously from various programs throughout the District and are analyzed in the lab for the presence of controlled dangerous substances (CDS). In addition to identifying the CDS qualitatively, there is also forensic interest in determining the degradation of specific drugs in syringes over time. Presented in this study is a timeline of the stability of fentanyl and of various adulterants, such as heroin, etizolam, and xylazine, that are commonly found in combination with fentanyl in syringes from the D.C. needle-exchange program. Polypropylene syringes were conditioned to mimic syringes used by intravenous drug users in D.C. and subsequently analyzed for drug residue after 28 days. The method followed in this study consisted of detecting unknown quantities of the target drugs via gas chromatography-mass spectrometry. This approach proved successful in the identification of degraded compounds as well as the quantification of the specific drugs over a tracked period of time. Data gathered from this study supported the efforts of the D.C. PHL Forensic Chemistry Unit (FCU) in the needle-exchange program by providing an accurate timeline for storage protocols and the optimal timeframe for drug analysis.
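    A minimal sketch of the degradation-timeline analysis described above: instrument responses for each target drug are normalized to their day-0 values to express percent remaining across the 28-day window. The sampling days and peak-area values are hypothetical placeholders, not results from the study.

```python
# Hypothetical degradation timeline: GC-MS responses for each target drug
# are normalized to their day-0 values to give percent remaining over the
# 28-day window. Sampling days and peak-area values are placeholders.
days = [0, 7, 14, 21, 28]
peak_area = {                      # arbitrary GC-MS peak-area units
    "fentanyl": [1000, 930, 870, 810, 760],
    "heroin":   [1000, 720, 510, 360, 250],
    "xylazine": [1000, 960, 930, 900, 880],
}

for drug, areas in peak_area.items():
    remaining = [round(100 * a / areas[0]) for a in areas]
    print(f"{drug:9s} % remaining on days {days}: {remaining}")
```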
  • Item
    Examination Of The Duration Of Immersion In Water And Extent Of Pruning Observed On Fingertips
    (2022-05) Ross, Sheena
    In the realm of forensic science, fingers are normally thought of as evidence for identification; however, it is imperative that the forensic science community understands the other valuable evidence that fingers can provide in aquatic medicolegal death investigations and criminal nonfatal investigations. This research project examined the development of fingertip pruning during 120 minutes of immersion in warm tap water that was allowed to cool rather than being held at a constant temperature. Additionally, this research examined the dissipation of fingertip pruning for 60 minutes after removal from the water. This research utilized ImageJ, an image analysis software package, to provide two quantitative measures: the amount of swelling of individual friction ridges at each time interval and the percentage of the overall fingertip surface area with visible pruning. The findings of this study indicated that friction ridges increased in width as the duration of immersion increased, with some variation. Additionally, the percentage of surface area covered by pruning was strongly correlated with both the duration of immersion in water and the duration of time since removal from the water. Lastly, the changes in fingertip condition occurred quickly, within 20 minutes, and even after two hours of water immersion, the most obvious pruning dissipated within 30 minutes; this supports treating fingertip pruning as transient evidence in aquatic criminal investigations.
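    The correlation reported above can be illustrated with a small sketch that computes a Pearson correlation between immersion time and the percentage of fingertip surface area showing pruning; the measurement values are invented for illustration and are not data from the study.

```python
# Hypothetical correlation check: immersion time (minutes) versus the
# percentage of fingertip surface area showing visible pruning. The values
# are invented for illustration only.
import math

time_min   = [0, 20, 40, 60, 80, 100, 120]
pruned_pct = [0, 18, 35, 52, 63, 71, 78]

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r = {pearson_r(time_min, pruned_pct):.3f}")  # strong positive correlation
```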
  • Item
    A Quantitative Study On Extraction Versus Direct Amplification Of Touch DNA Samples
    (2022-05) Basrawi, Jude
    Touch DNA, also known as trace DNA, is an important aspect of criminal investigations, as the perpetrator is unaware of the DNA they have left behind. While there are multiple ways to process touch DNA, it has been established that direct amplification provides exemplary results. Direct amplification is the process of analyzing samples by amplifying them without extraction or quantitation. However, there is no measure of how much information direct amplification provides as opposed to other widely used methods, including standard extraction procedures. This study aimed to examine the differences in information obtained when processing a touch DNA sample using standard extraction procedures versus a direct amplification approach. Understanding the scope of information collected by direct amplification versus standard extraction procedures can be used to facilitate new protocols for processing touch DNA. This study focused on collecting touch DNA samples from 20 volunteers and processing the DNA using two methods. In the first half of the study, half of the samples were extracted, quantified, and amplified using the InnoXtract™, InnoQuant™, and InnoTyper™ kits, respectively, from InnoGenomics, LLC. The next part of the study included adding swabs to be directly amplified using the InnoTyper™ 21 kit. Rather than following the standard extraction protocol, this method placed the swab heads directly into the PCR tubes for amplification. Samples undergoing direct amplification are expected to show more information because standard extraction protocols often result in loss of DNA. Data generated from capillary electrophoresis were compared by reviewing the allele peak heights. Samples exhibiting apparent allele peaks were further reviewed to determine whether the peaks were real or attributable to artifacts such as noise. The research findings may lead to revised protocols that can be applied to difficult sample types, such as touch DNA, which often yield partial DNA profiles containing very little information.
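    A rough sketch of the peak-height comparison described above: mean allele peak heights (in RFU) from the two workflows are compared per sample. The sample identifiers and RFU values are hypothetical stand-ins for genotyping-software exports, not study data.

```python
# Hypothetical peak-height comparison between the two workflows. The sample
# identifiers and RFU values stand in for genotyping-software exports and
# are not study data.
from statistics import mean

peak_heights_rfu = {
    # sample_id: (standard extraction workflow, direct amplification)
    "T01": ([210, 185, 240, 160], [480, 390, 520, 450]),
    "T02": ([95, 120, 80],        [260, 300, 240, 210]),
}

for sample, (extracted, direct) in peak_heights_rfu.items():
    print(f"{sample}: extraction mean {mean(extracted):.0f} RFU, "
          f"direct amplification mean {mean(direct):.0f} RFU")
```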
  • Item
    Crime Scene Documentation: iPhone vs. the Nikon D5600
    (2021-05) Mansell, Andrew
    A picture is worth a thousand words because of its power and influence. Photographs are treated as an unbiased representation of fact, which is why crime scenes are documented with powerful cameras such as Nikon digital single-lens reflex (DSLR) models. Modern smartphone technology provides an alternative. This study compares the ability of an iPhone XS Max and a Nikon D5600 to accurately depict a crime scene, examining each camera's mechanisms and judging the quality of their exposures through sharpness, resolution, and acutance.
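    One plausible way to quantify the sharpness comparison is sketched below: acutance is approximated as the mean gradient magnitude of a grayscale version of each exposure. The file names are placeholders, and this particular metric is an assumption rather than the study's documented procedure.

```python
# Acutance approximated as the mean gradient magnitude of a grayscale
# version of each exposure; file names are placeholders and this metric is
# only one possible way to score sharpness.
import numpy as np
from PIL import Image

def acutance(path):
    gray = np.asarray(Image.open(path).convert("L"), dtype=float)
    gy, gx = np.gradient(gray)               # per-pixel intensity gradients
    return float(np.mean(np.hypot(gx, gy)))  # mean edge strength

for label, path in [("iPhone XS Max", "iphone_scene.jpg"),
                    ("Nikon D5600",   "nikon_scene.jpg")]:
    print(label, "acutance:", round(acutance(path), 2))
```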
  • Item
    Evaluation Of Virtual Standard Curve Functionality Of The HID Real-Time PCR Analysis Software By Comparison To Assay Specific Standard Curves And An External Standard Curve Generated In-House
    (2021-05) Mauriello, Angelina
    Quantitative PCR (qPCR) plays a critical role in the field of forensic biology in determining the amount of “amplifiable” human-specific DNA. If too much or too little DNA is present, the result can be profiles that are difficult to interpret. Therefore, qPCR is beneficial in determining the quality and quantity of DNA needed to generate an interpretable profile from a forensic sample. To determine the DNA quantity, a set of standards with known DNA concentrations is used to build a standard curve by ordinary least squares regression, against which samples with unknown quantities of DNA are compared to determine their concentrations. The goal of this research project was to examine two alternative methods of determining the quantity of DNA that do not require a standard curve for each run, in order to minimize run-to-run variation and reduce costs and analyst time. These two methods are an external standard curve and a virtual standard curve, evaluated across different kit lot numbers, curve preparations by different analysts, and instrument calibrations. Samples were quantified in duplicate, and a linear regression was performed using the average of all runs to calculate the slope and y-intercept per variable and target to generate a virtual standard curve in the new HID Real-Time PCR Analysis Software v1.3. It was determined that the external standard curve method and the virtual standard curve method were identical. Results showed no significant difference between instrument calibrations and no difference between kit lots when comparing the assay-specific curve to the virtual/external curve methods. For the virtual standard curve, there was no significant difference from the assay-specific method. There were significant differences between analysts' pipetting when comparing the different standards prepared. A recommendation from this research regarding the use of these techniques is to have as many analysts as possible contribute to the pipetting. If more than one variable is introduced throughout the process, a new virtual standard curve needs to be generated. This study demonstrates the feasibility of implementing the virtual standard curve function into a casework laboratory workflow and shows that a laboratory can benefit from using these methods.
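    To make the standard-curve mechanics concrete, the sketch below fits Cq values against log10(concentration) by ordinary least squares and back-calculates an unknown, which is the core of both the assay-specific and the virtual/external curve approaches; the Cq values shown are hypothetical.

```python
# Standard curve by ordinary least squares: Cq values are regressed against
# log10(concentration) and an unknown is back-calculated from the fitted
# line. The Cq values are hypothetical.
import numpy as np

std_conc = np.array([50, 5, 0.5, 0.05, 0.005])        # standards, ng/uL
std_cq   = np.array([23.1, 26.5, 29.9, 33.2, 36.6])   # illustrative Cq values

slope, intercept = np.polyfit(np.log10(std_conc), std_cq, 1)
print(f"slope = {slope:.3f}, y-intercept = {intercept:.2f}")

def quantify(cq, m=slope, b=intercept):
    """Back-calculate concentration (ng/uL) from a measured Cq."""
    return 10 ** ((cq - b) / m)

# A "virtual" curve simply reuses a slope/intercept averaged over many prior
# runs instead of running a fresh dilution series on every plate.
print(f"Unknown sample at Cq 31.0 ~ {quantify(31.0):.3f} ng/uL")
```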
  • Item
    Age-at-death estimations using cementochronology in thermally altered teeth
    (2021-04) Weaver, Ryen L.
    Macroscopic age-at-death estimations provide age ranges that give broad and often insufficient insight into an individual's chronological age. Accurate age estimations can become more complex for a forensic anthropologist if the unknown individual has been subjected to extreme heat in any of an assortment of scenarios that include, but are not limited to, structure fires, airplane crashes, automobile accidents, and attempts to conceal evidence of homicide. Due to their high mineral content and sequestered placement in the jaw, teeth can be among the best-preserved human tissues in extreme heat situations (Beach, Passalacqua, & Chapman, 2015). Cementochronology utilizes the cementum, the mineralized covering of a tooth root, as an aid in estimating an individual's age (Colard et al., 2015). Cementochronology is one of the most accurate ways to estimate age-at-death because the countable cementum annulations found in cross-sections correlate directly with the individual's age (Wittwer-Backofen, 2004). This study aims to provide an in-depth analysis of thermal alteration to human teeth by various accelerants when utilizing the cementochronology method to build a biological profile. The sample in this study consists of 36 teeth from both male and female donors, from all odontological positions, ranging in age from 9 to 87 years. Three accelerants with varied volumes were used to determine whether the readability of cementum annulations remains accurate after alteration. Results indicated that the type of tooth had a significant impact on the ability to count annulations. Annulations could still be read and estimated after alteration with all three accelerants used in this study; however, acetone yielded the lowest results with the most severe alterations. A novel formula was developed to help approximate the number of cementum annulations found within each sample. This formula was found to yield estimated tooth cementum annulations that were highly correlated with the actual ages of the individuals from whom the samples were obtained. After thermal alteration of teeth with accelerants, it was found that cementochronology is an accurate and helpful tool for estimating age-at-death in unknown individuals.
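    For context, the conventional cementochronology estimate (not the novel formula developed in this study, which is not reproduced here) adds the countable cementum annulations to the mean eruption age of the tooth type, as in the short sketch below; the eruption ages used are approximate textbook values.

```python
# Conventional cementochronology estimate (not the novel formula from this
# study): estimated age = countable cementum annulations + mean eruption age
# of the tooth type. Eruption ages below are approximate textbook values.
eruption_age_years = {"first molar": 6, "canine": 11, "third molar": 20}

def estimate_age(annulation_count, tooth_type):
    return annulation_count + eruption_age_years[tooth_type]

print(estimate_age(32, "canine"), "years (estimated age at death)")
```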
  • Item
    Reproducibility of Quantifiler® Trio Assay-Specific Standard Curves Over a Range of Variables to Generate Guidelines for Crime Laboratory Workflows
    (2021-12-08) Betts, Stephanie
    Quantitative PCR (qPCR) is the preferred method of quantitation in forensic DNA analysis, used to determine the amount of amplifiable human DNA present in evidence or reference samples. A standard curve is created via quantitation of a serial dilution containing known DNA concentrations. Quantifiler® Trio manufacturer guidelines were followed for the creation of five concentrations ranging from 50 ng/µL to 0.005 ng/µL. The primary goal of this research was to generate laboratory guidelines and recommendations for quantitation standards for forensic laboratories hoping to streamline their workflows, and to determine how long standards remain valid in order to decrease the amount of time and money spent on assay-specific standard curves. Most often, a standard curve is generated every time an assay is performed. The research data were generated over a two-month period. Through multiple runs, including two analysts involved in plating the samples, standard curves were analyzed for variation in curve parameters (e.g., whether the slope or quantitation range changed over time). Mock case samples were prepared and analyzed to check the efficacy of the assay-specific standards versus a virtual curve and to validate their suitability in DNA crime laboratories. Two questions illustrate the relevance of laboratory-generated internal standards (assay-specific standard curves): first, if the standard curve is slightly different every time a new curve is generated, how could that affect the laboratory's DNA results? Second, how variable are these standard curves over time when prepared by the same or multiple individuals? This was evaluated by comparing the linear regression values calculated by the software for each standard curve between analysts. Results indicated little to no difference in the values for the Y, small autosomal, and large autosomal targets. Linearity remained consistent beyond the recommended 14-day discard point. Whether a laboratory prefers an assay-specific curve (ASC) or a virtual standard curve (VSC), the data show little to no difference between the two curve methods prior to amplification. The use of a Quantifiler® Trio kit to generate laboratory standard curves remained effective for up to 21 days, and further experimentation is expected to show that laboratory-generated standard curves last well beyond the manufacturer's recommendation.
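    The run-to-run reproducibility check described above can be sketched as pooling slope and y-intercept values from repeated standard-curve runs and examining their spread; every parameter value and analyst label below is hypothetical.

```python
# Run-to-run reproducibility sketch: slopes and y-intercepts from repeated
# standard-curve runs are pooled and their spread examined. Every parameter
# value and analyst label is hypothetical.
from statistics import mean, pstdev

runs = [  # (analyst, slope, y-intercept) for one quantification target
    ("A", -3.32, 27.1), ("A", -3.35, 27.0), ("B", -3.29, 27.3),
    ("B", -3.31, 27.2), ("A", -3.34, 27.1), ("B", -3.30, 27.2),
]

slopes     = [s for _, s, _ in runs]
intercepts = [b for _, _, b in runs]
print(f"slope: mean {mean(slopes):.3f}, sd {pstdev(slopes):.3f}")
print(f"y-intercept: mean {mean(intercepts):.2f}, sd {pstdev(intercepts):.2f}")

# A pooled ("virtual") curve would reuse these mean parameters rather than
# running a fresh dilution series with every assay.
```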
  • Item
    Comparison of two different photo protocols and increasing the accuracy of 3D modeling of snow shoeprints by Photogrammetry
    (2021-03-29) Sheen Vento, Karla
    Sometimes a shoeprint can help narrow down the number of suspects at a crime scene, so having an efficient recovery method for shoeprints can be helpful (Andalo et al., 2012). Photogrammetry has been proposed as a simple and reliable method for shoeprint analysis in previous studies; however, its use on certain surfaces such as snow can be challenging, and it requires following a strict picture-taking protocol (Larsen et al., 2020). The objectives of this study are to test the equivalency of an alternative picture-taking protocol proposed by Larsen et al. against the standard protocol proposed by the developers of Digtrace, a software package that allows the 3D modeling of shoeprints, and to test the effectiveness of different techniques for improving the quality of shoeprint photographs taken in snow. In the first experiment, two shoeprints were created in sand and mud and photographed using Larsen's and Digtrace's photo-taking protocols. A series of 3D models were created in Digtrace and randomized, and point clouds extracted from them were compared using the CloudCompare software to assess differences in variability. In the second experiment, five shoeprints were created in snow and several enhancement techniques (oblique light, red filter/black-and-white photography, and red and blue dyes) were used to increase the contrast of the photographs. The same comparison process from the first experiment was used to determine whether there was a reduction in the variability of point cloud distances relative to a control group. The results showed higher accuracy for Larsen's protocol (mean distance 0.1025 mm) than for Digtrace's protocol on the mud surface; however, on the sand surface, Digtrace's protocol showed a smaller error distance (0.0968 mm) than Larsen's protocol. The results from the second experiment showed that the use of blue and red dyes produced a noticeable improvement in the reliability values (mean error distances of 0.0648 mm and 0.0734 mm, respectively). In contrast, oblique lighting and red filters/black-and-white photographs did not produce a significant improvement. This study shows that both Larsen's and Digtrace's protocols can be used to build reliable shoeprint 3D models and that the accuracy of 3D snow shoeprints can be improved with a simple method such as spraying red or blue dyes.
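    The cloud-to-cloud comparison used to score the 3D models can be sketched as a nearest-neighbour mean distance between two point clouds, a simplified stand-in for what CloudCompare reports; the tiny example clouds below are invented for illustration.

```python
# Cloud-to-cloud comparison: for each point in one 3D model, find the
# nearest point in the other and average the distances (a simplified
# stand-in for the CloudCompare output). The tiny clouds are invented.
import numpy as np

def mean_cloud_distance(cloud_a, cloud_b):
    # Brute-force nearest neighbour; adequate for small illustrative clouds.
    dists = np.linalg.norm(cloud_a[:, None, :] - cloud_b[None, :, :], axis=2)
    return float(dists.min(axis=1).mean())

model_1 = np.array([[0.0, 0.0, 0.00], [1.0, 0.0, 0.10], [0.0, 1.0, 0.20]])
model_2 = np.array([[0.0, 0.05, 0.0], [1.0, 0.0, 0.00], [0.1, 1.0, 0.20]])
print(f"mean error distance: {mean_cloud_distance(model_1, model_2):.4f} model units")
```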
  • Item
    Forensic Taphonomy: Copper and Aluminum Staining on Skeletal Material
    (2021-05-13) Cheney, Eleanor
    In forensic investigations, when an unknown decedent is found, the postmortem interval is a critical data point in establishing identification as well as reconstructing circumstances surrounding the death event. When the body reaches the fourth stage of decomposition, advanced decay, the soft tissues have been completely broken down. The decomposition of soft tissues leads to skeletonization, where only the hard skeletal tissues remain. These materials are then subject to diagenetic processes, including discoloration. The most commonly encountered stains on bones are from soil, organic materials, or metals. Staining on bone from metal compounds can be caused by numerous circumstances wherein various types of metal from clothing, projectiles, or other personal artifacts comes in contact with the remains. Because several factors influence the rate of decomposition, the postmortem interval between death and skeletonization can vary widely. Furthermore, the methods for determining the time between skeletonization and discovery are limited. The following study explored the potential for estimating the postmortem interval via copper and aluminum staining patterns on skeletal remains. The specific goal of this research was to determine whether the discolorations can assist in estimating time since skeletonization and reconstructing the depositional environment. Additionally, the two types of metals were compared to establish any distinct staining pattern or discoloration on bone that is unique and can be presumed as belonging to either copper or aluminum. In the experiment, seven deer tibias were buried in a temperature-controlled environment. Pieces of copper and aluminum were affixed to each tibia. Once a week, for 20 weeks, each bone was examined for signs of discoloration from the metals. Munsell soil color charts were used to quantify the observed skeletal color changes, and a qualitative scoring system was used to measure the degree of staining each week. The staining on bone caused by copper was predominantly green with some yellow and grey variation and became more pronounced over time. Aluminum staining was largely white and exhibited a lesser extent of color change. The data analysis suggested both types of staining possess a rate of color change whose variability is correlated with time. The results of this study will contribute to the identification and assessment of discolorations on skeletal remains, which can potentially help reconstruct the depositional scene and estimate the time since skeletonization.
  • Item
    Direct Transfer: Obtaining Latent Prints from the Skin of a Living Person
    (2021-05-10) Hinze, Dustin
    Fingerprint identification has been at the core of forensic science for more than 100 years. It remains one of the most valuable tools to assist law enforcement in identifying suspects and solving crimes. Over time, techniques have been developed that make it possible to recover latent prints from the skin of human remains and, in some cases, from a living person's skin. Identifying latent prints from human skin could directly corroborate or refute statements or provide investigative leads. One technique is called direct transfer, in which paper is pressed against the skin to transfer latent prints present on the skin. The paper is then processed with various techniques to develop the potential latent prints. This study examined the direct transfer technique for obtaining latent prints deposited on the skin of a living person, utilizing Kromekote, thermal, and ink/laser jet paper. Magnetic powder and indanedione were utilized to process each type of paper to develop the potentially transferred latent prints. This research consisted of 1,035 trials conducted at several time intervals: immediately after print deposit, and 5, 10, 15, and 30 minutes after deposit. The purpose of this research was to identify the most effective transfer paper substrate, fingerprint development technique, and timeframe to recover latent prints from the skin. The substrate and development technique did not have a significant impact on the results; however, time of recovery after deposition had a significant impact. After five minutes, there was a drop in the level of identification, which grew more significant over time.
  • Item
    Deep Brain Stimulation: Treatment for Clinical Depression
    (2020) Admassu, Azaria
    Depression is the most common mental disorder in the United States. A person with this disorder is generally described as feeling sad, discouraged, and, in general, uninterested in life. Deep brain stimulation (DBS) is a method of electrically stimulating a specific part of the brain using implanted electrodes. Since the symptoms of major depressive disorder have been linked to dysfunction of the reward circuitry, in which the nucleus accumbens (NAc) is a major player, DBS of these neurons has been suggested as one method to improve patients' symptoms. In 2013, DBS of the NAc gave promising results, with patients treated by DBS showing improvement with little to no side effects. This paper will discuss the advantages of DBS as an antidepressant through an assessment of available studies and will further discuss the current as well as future challenges facing DBS.
  • Item
    Peeling Away Uncertainty: A Probabilistic Approach to DNA Mixture Deconvolution
    (2020) Chaudhry, Hajara
    Mixture deconvolution involves the ability to reliably decipher and separate the component genotypes of individual contributors at each tested genetic marker. The ultimate objective of this study is to develop an understanding of the integrated framework for demonstrating the value of using known samples, when appropriate, to decrease uncertainty in mixture deconvolution by leveraging more of the available genotyping data, and to observe the impact genotype conditioning has on multiple-contributor mixtures and the resulting LRs. In this study, known mixtures containing two, three, four, and five contributors were separated in iterative analyses through the assumption of contributors using provided known reference samples, a process referred to as genotype peeling or genotype conditioning. To direct the order of genotype conditioning, contributor mixture weights were estimated, and all contributors to the mixture were assumed in order of mixture weight. Conditioning by match statistic was directed without genotype assumptions, with all contributor genotypes inferred solely from STR peak height data. Subsequent analyses of each mixture item were conducted in which the order of contributors was assumed from highest to lowest based on mixture weight as well as on match statistics, utilizing TrueAllele®, a probabilistic program developed by Cybergenetics. The study demonstrates how genotype conditioning affects mixture deconvolution and the resulting match statistics, also considering mixture weight and the number of contributors to a mixture. The results of this study demonstrate that it is possible to generate more informative statistics by refocusing probability distributions for each contributor to the original mixture, leading to refined LRs and reduced uncertainty.
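    As a toy illustration of the likelihood ratios discussed above (not TrueAllele's actual computation), the sketch below forms an LR from two hypothetical probabilities of the evidence and notes how conditioning on known contributors is intended to sharpen such statistics.

```python
# Toy likelihood ratio: probability of the evidence if the person of
# interest contributed, divided by its probability if an unrelated person
# did. The probabilities are invented for illustration and are not how
# TrueAllele internally computes its statistics.
import math

p_evidence_if_contributor = 0.80   # H1: POI is a contributor
p_evidence_if_coincidence = 0.02   # H2: an unrelated person is instead

lr = p_evidence_if_contributor / p_evidence_if_coincidence
print(f"LR = {lr:.0f} (log10 LR = {math.log10(lr):.2f})")

# Conditioning on ("peeling off") a known contributor's genotype narrows the
# genotype distributions inferred for the remaining contributors, which is
# why the resulting LRs can become more informative.
```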
  • Item
    Deconvolution of DNA Mixtures Using Replicate Sampling and TrueAllele® Mixture Interpretation
    (2020) Antillon, Sara
    Analysis of DNA mixture evidence does not always yield distinct profiles. This process is further complicated with low template DNA (LT-DNA) samples often seen in forensic casework. Traditional qualitative methods use thresholds to distinguish allele peaks from stutter peaks, noise, etc., resulting in data being omitted during analysis. In cases where LT-DNA is present, low peaks that could potentially be attributed to low-level contributor profiles may not be called due to these instituted thresholds. The probabilistic genotyping computer software program created by Cybergenetics (Pittsburgh, PA), TrueAllele® Casework, considers all data and performs quantitative analysis using probability to represent uncertainty. It objectively forms likelihood ratios (LR) that compare the probabilities of an evidentiary genotype with a suspect genotype relative to a reference population. A joint likelihood function (JLF) takes two or more independent sets of data and compares them jointly as opposed to as single events. The JLF can elicit more identification information, proving useful in DNA mixture analysis. This project used TrueAllele® Casework to perform DNA mixture analysis on two sets of previously published mixture data provided by Cybergenetics. The first set comprised 40 two-contributor mixture samples, and the second set included four sets of 10 randomized mixtures with two, three, four, and five contributors, respectively. The selected samples were interpreted singly and jointly in three variable groups: mixture weight, template concentration, and complex mixtures. The differences between the match logLRs of the single and joint analyses were calculated, and an information gain was seen in all three groups when the samples were analyzed jointly. Changing DNA collection and amplification procedures for touch and DNA mixture evidence samples will increase the amount of data available for DNA mixture analysis using probabilistic genotyping. These procedures can be modified so that multiple swabs and replicate amplifications produce more data that TrueAllele can analyze using the JLF. Jointly analyzing each independent set of evidence data can lead to higher match statistics, which will ultimately help in the identification of those who commit crimes.
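    The joint likelihood idea can be illustrated with a short sketch: for independent replicate amplifications, per-replicate likelihoods multiply, so log10 likelihood ratios add, which is one way to see where the reported information gain comes from. All probabilities below are invented for illustration.

```python
# Joint likelihood sketch: for independent replicate amplifications of the
# same evidence, per-replicate likelihoods multiply, so log10 LRs add.
# All probabilities are invented for illustration.
import math

def log10_lr(p_h1, p_h2):
    return math.log10(p_h1 / p_h2)

# (P(data | contributor), P(data | unrelated person)) for each replicate.
replicate_1 = (0.60, 0.010)
replicate_2 = (0.55, 0.020)

single = [log10_lr(*replicate_1), log10_lr(*replicate_2)]
joint  = log10_lr(replicate_1[0] * replicate_2[0],
                  replicate_1[1] * replicate_2[1])

print("single-replicate log10 LRs:", [round(v, 2) for v in single])
print("joint log10 LR:", round(joint, 2))  # equals the sum of the singles
```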
  • Item
    Latent Fingerprint Development on Adhesive Surfaces After Application to Fabric
    (2020-04) Boarts, Jason
    One of the many items of evidence found at a crime scene that can yield breakthrough clues if handled and processed appropriately is adhesive tape. Through fracture matching and DNA analysis, a person can be linked to the tape and, therefore, to the scene of the crime. Another way adhesive tape can link a person to the crime is through the development of latent fingerprints. Thorough research and real-life casework have proven that latent fingerprints can be developed from adhesive surfaces through a variety of processing techniques. A search of the extant literature shows this to be true, but there is little information on the ability to develop comparative latent fingerprints from adhesive surfaces after they have been applied to fabrics. This study adds to the literature through the deposition of simulant-laden latent fingerprints on tan packing tape, clear packing tape, and grey duct tape, which were then applied to denim, polyester, and cotton fabric samples. The tape was processed utilizing crystal violet, black wet powder, and small particle reagent. Careful processing and analysis of 135 adhesive samples and 405 latent fingerprints determined that adhesive and fabric types, coupled with processing methods, play a role in the ability to develop latent fingerprints from adhesive surfaces that have been applied to fabrics. The results of the study fill an apparent gap in the literature and provide investigators and lab analysts with another means of potentially identifying persons of interest in criminal investigations.
  • Item
    Factors Affecting Friction Ridge Transfer Through Gloves Used During Crime Scene Processing
    (2020) Wasserman, Aliah
    A pillar of the crime scene processing curriculum, Locard's Exchange Principle states that every place one goes, they take something with them and also leave a trace behind. The ultimate goal of crime scene processing is to collect valuable forensic evidence while minimizing the effects of Locard's Exchange Principle. The use of personal protective equipment (PPE) is one of the most common ways the forensic community mitigates the risk of cross-contamination during crime scene processing. Through trial, error, and research, it is known that friction ridge detail can transfer through gloves. The objective of this research is to help the field of crime scene investigation develop best practices for minimizing scene contamination by way of this transfer. Though the community agrees it is possible to deposit this detail, it is less well known which practices can increase that chance and, more importantly, which can prevent it. This research tests two sets of circumstances in an attempt to determine best practices for mitigating friction ridge transfer during crime scene processing. This study determines the effect of hand condition, glove size, and utilization of the double-gloving method on the transfer of friction ridge detail to a glass surface through nitrile examination gloves. The findings were that the condition of the hands prior to donning the glove(s) did not impact this transfer, but that wearing two pairs of gloves significantly reduced the occurrence of friction ridge detail on the deposition surface as compared to one pair of gloves.
  • Item
    Bullet Hole Characteristics, Limiting Factors, and Reconstructing Shooter Location within a Crime Scene
    (2020) Jennings, Jonatthan
    Shooting scene reconstruction and the identification of where the shooter and weapon most likely were located can be critical pieces of information for law enforcement and crime scene investigators. During an extensive literature review, the gap that appeared was in identifying the most likely position of the shooter when accounting for limiting factors, such as room size and furniture. There are several ways to conduct shooting trajectory analysis, with authors such as Haag and Haag (2011), Hueske (2009), Gardner and Bevel (2009), and Gardner and Krouskup (2019) writing at length about the process of determining shooting trajectory and overall crime scene reconstruction. The trajectory analysis for this project was adapted from Gardner and Bevel (2009) and developed through coordination with the Virginia State Police, and it included using trajectory rods, angle finders, protractors, and lasers to determine the trajectory of the bullet. Limiting factors along the path of the bullet, gunshot residue (GSR), and overall room size were then accounted for to identify the most likely position of the shooter. According to Gardner and Bevel (2009), and through coordination with certified crime scene experts, it was determined that shooter positions are broadly assigned to zones one through three. The results expected from this project are to refine zone 1, described as the most probable shooting location; to identify the overall accuracy rate of shooting trajectory analysis; and to develop a predictive statistical model to determine the impact of the limiting factors on predicting the shooter's distance. The conclusion anticipated from this research is that, when all factors are taken into account, a most likely shooter location can be identified to within approximately ± three to five feet.
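    A simplified sketch of the straight-line trajectory geometry that underlies shooter-distance estimates is shown below; it ignores bullet drop, and the defect height, assumed muzzle height, and measured vertical angle are all hypothetical inputs rather than values from this project.

```python
# Straight-line trajectory geometry (ignores bullet drop): the horizontal
# distance from the bullet defect back to an assumed muzzle height follows
# from the measured vertical impact angle. All input values are hypothetical.
import math

defect_height_ft = 4.5    # height of the bullet defect on the wall
muzzle_height_ft = 5.0    # assumed firing height (e.g., shoulder height)
vertical_angle_deg = 3.0  # vertical angle measured from the trajectory rod

distance_ft = abs(muzzle_height_ft - defect_height_ft) / math.tan(math.radians(vertical_angle_deg))
print(f"estimated horizontal distance to shooter: {distance_ft:.1f} ft")

# Reporting a band (e.g., +/- three to five feet) acknowledges uncertainty in
# the measured angle and assumed muzzle height; limiting factors such as
# walls and furniture then truncate the feasible shooter zone.
```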