Forensic Science Graduate Research Projects
Browsing Forensic Science Graduate Research Projects by Issue Date
Now showing 1 - 20 of 31
Item: Computed Tomography as a Supplement for Analyzing Antemortem and Perimortem Blunt Force Cranial Trauma (2019) - Marks, Felicia

Within the field of forensic anthropology, skeletal trauma analysis plays a critical role in reconstructing the events surrounding the life and death of an individual. For example, analyzing cranial blunt force trauma (BFT) could provide insight into a history of abuse based on the amount of healing present. Blunt force trauma to the skull has been researched extensively, but past work has not focused on interpreting the timing of fractures to differentiate early antemortem from perimortem stages of healing. The goal of this research was to examine fracture characteristics by traditional macroscopic assessment, then enhance the detail of cranial injuries through computed tomography (CT) to improve visualization and aid in determining the timing of fractures. A total of 23 antemortem and 20 perimortem BFT injuries were initially observed within a sample of 30 skulls from skeletal collections at the Smithsonian's Museum Support Center and National Museum of Natural History. The skeletal collections consisted of the Robert J. Terry Anatomical Collection and the Peruvian skeletal collection created by Aleš Hrdlička in the early 1900s.
This study revealed that using computed tomography to analyze cranial blunt force trauma produced no significant difference in the further classification of fracture timing for early antemortem and perimortem defects.

Item: Estimation of Ancestry Using Mesiodistal, Buccolingual and Diagonal Tooth Measurements (2019) - Navale, Shraddha

In forensic science, when an individual must be identified, teeth are in most cases considered a precious and reliable tool for observation and analysis: they are the hardest tissue in the body and the last to decompose. The multi-rooted teeth, such as the premolars and molars, and the single-rooted canine, whose long root allows for better anchorage, are particularly useful. With this in mind, this project aims to estimate ancestry from collected dental data, namely the mesiodistal, buccolingual, diagonal, and vertical crown measurements such as crown height. This is achieved by physically measuring dental diagnostic casts using digital calipers and then analyzing the resulting data using Discriminant Function Analysis. The expected outcome of this analysis is the ability to classify the dimensions of the teeth according to their ancestral line.

Item: Forensic Facial Reconstruction: Soft Tissue Thickness from Medical Record Computed Tomography Images (2019) - Simpson, Ashley

Unidentified human remains are examined using a multitude of forensic techniques (e.g., DNA and fingerprinting) to identify the individual. The last resort is facial reconstruction, which aims to produce a likeness of the individual that can be recognized by someone familiar with the person (Wilkinson, Forensic Facial Reconstruction, 2004). Facial soft tissue thickness (FSTT) datasets allow forensic artists to transform the unidentified skull into a representation of the individual's face.
Over the years, several methodologies have been studied and recommended to create these datasets (needle puncture, ultrasound, CT, MRI). This paper assesses the use of medical record computed tomography (CT) images to improve the accuracy of forensic facial reconstructions. The vast amount of data available in medical records can improve the specificity of each forensic reconstruction. Additionally, medical records can provide geographically region-specific FSTT values, which will grow in value as cultures and ethnicities migrate and connect. This study randomly selected 30 medical records of individuals aged 21-38 from three ancestral groups: Caucasian, African American, and Hispanic. Subjects were grouped by sex and race for statistical comparison. FSTT was collected at eight facial landmarks using a DICOM viewer called PACS. The measured tissue depths were analyzed and compared with previous CT datasets. ANOVA results showed two statistically significant (p<0.05) landmarks, at the infraorbital and zygomatic arch. Due to time restrictions, the sample size used for this study was very small, resulting in a large percent error. Understandably, the data from this study alone is not helpful to the field of forensic facial reconstruction, but the future potential described in this paper is promising. One way to increase the accuracy of facial reconstructions is to identify groups of similar morphology. The more the dataset is expanded across individuals of various ages and races, the more narrowly the groups can be defined. This paper shows that FSTT collected from medical records is a valuable source of data for facial reconstruction datasets, but the risk of error is significant.
Therefore, an automated method of data extraction that minimizes the risk of error would be a viable future project.

Item: Testing the Matching Capabilities of Megvii's Face++ Using Age-progressed and Real-life Images (2020) - Brown, Emily

The National Center for Missing & Exploited Children (NCMEC) assisted law enforcement with over 29,000 missing children cases in 2019 and has completed more than 6,800 age-progressed images in its history of working on long-term missing children cases. There is currently little research on how age progressions affect facial recognition algorithms, specifically when comparing real-life images and digitally produced age-progressed images of the same individuals. The goal of this study was to determine whether a facial recognition algorithm could accurately match a missing child's age-progressed image, returning it in a list of top 5 candidates when using the child's real-life image as the probe image for the search. Another goal of this research was to determine whether the likelihood of matching differed based on the age of the missing child and the age variation between the child's real-life image and his or her respective age-progressed images. The age-progressed and real-life images of 347 children who went missing between the ages of 1 and 20 were included in the study. A gallery of images (called a FaceSet) was created from the age-progressed images of all 347 missing children. The missing children's real-life images were searched against the FaceSet using Face++'s Search API, and the top 5 matches for each person were generated. Every child was categorized as being in the 'older' group (13-20 years old) or 'younger' group (under 13 years old) based on the age at which he or she went missing. The results of the study showed that the confidence scores of matches are higher for older children and that there is a greater likelihood of matching for older children.
The results of the study also demonstrated that the age-progressed images closest in age to the age of the missing child have a greater chance of being matched than the age-progressed images with more age variation.

Item: Degrees of contrast: Detection of latent bloodstains on fabric using ALS and the effects of washing (2020) - James, Matthew E.

Bloodstains are a useful piece of evidence for solving many crimes. The DNA analysis of bloodstains deposited on a piece of clothing can identify whose blood is on the clothing and may place a subject at the scene. In some cases, the stain's shape and overall pattern can provide much more information. However, it is particularly difficult to identify bloodstains on dark clothing and clothing with patterns. Current methods to detect these stains include advanced photography techniques with Alternate Light Sources (ALS) or the use of chemicals that react with the hemoglobin and fluoresce. Photography methods are non-invasive, but there is little research on which wavelengths are the most effective. Chemicals such as Luminol, Bluestar, and Fluorescein are effective, but ultimately ruin the pattern and prevent morphological interpretation of the stain. This study explores the use of ALS to photograph bloodstains in order to provide a non-invasive alternative tool before the use of chemical detection techniques. This study examined whether blood always absorbed light in the 300nm to 900nm range and which wavelength was best for observing blood on dark and/or patterned fabrics. It also explored whether fabric type, fabric color, or pattern affected the ability to view blood on fabrics, whether washing the fabric affected the use of ALS, and, if so, to what extent. Sixty-nine fabrics were photographed in monochrome under ambient light, and then with and without filters under 350nm-380nm (UV), 400nm-430nm (Violet), 430nm-480nm (Blue), 480nm-560nm (Green), and 800nm-900nm (Infrared) light.
Each photograph was bracketed to ensure the best exposure and contrast between the stain and fabric. In total, 33 photographs were taken for each fabric after each wash cycle. Contrast between the bloodstain and the fabric was measured using ImageJ software to assess the effectiveness of each wavelength. Results indicated that photography with ALS is a viable method for blood detection on fabrics and should be used prior to chemical means. Further, infrared light, followed by violet light with no filter, was the most effective light source for viewing bloodstains on dark fabrics without the use of chemicals.

Item: Peeling Away Uncertainty: A Probabilistic Approach to DNA Mixture Deconvolution (2020) - Chaudhry, Hajara

Mixture deconvolution involves the ability to reliably decipher and separate the component genotypes of individual contributors at each tested genetic marker. The ultimate objective of this study is to demonstrate the value of using known samples, when appropriate, to decrease uncertainty in mixture deconvolution by leveraging more of the available genotyping data and observing the impact genotype conditioning has on multiple-contributor mixtures and the resulting LRs. In this study, known mixtures containing two, three, four, and five contributors were separated in iterative analyses by assuming contributors using provided known reference samples, a process referred to as genotype peeling or genotype conditioning. To direct the order of genotype conditioning, contributor mixture weights were estimated, and all contributors to the mixture were assumed in order of mixture weight. Conditioning by match statistic was directed without genotype assumptions, where all contributor genotypes were inferred solely from STR peak height data.
Subsequent analyses of each mixture item were conducted, in which the order of contributors was assumed from highest to lowest based on mixture weight as well as match statistics, utilizing a probabilistic program, TrueAllele®, developed by Cybergenetics. The study demonstrates how genotype conditioning affects mixture deconvolution and the resulting match statistics, while also considering mixture weight and the number of contributors to a mixture. The results of this study demonstrate that it is possible to generate more informative statistics by refocusing the probability distributions for each contributor to the original mixture, leading to refined LRs and reduced uncertainty.

Item: Vanishing Blood Stains: Determining the optimum apparel fabric and residential lighting conditions for a bloodstain to disappear (2020) - Sullivan, Brittni

Bloodstain Pattern Analysis (BPA) is one of the most important forensic crime scene techniques to date. Fluid dynamics with blood remain relatively consistent across the board, which is why many researchers remain focused on specific patterns and what they can reveal about the movements of contributors in a scene. As a bloodstain pattern analyst becomes more proficient in the identification of certain stains, the advanced analyst will begin to study the effect of blood on fabrics. It is well known within the field of BPA that Alternate Light Sources (ALS) are required in most scenes containing blood evidence. This researcher sought to identify which specific fabrics (color and composition) best mask bloodstains under residential lighting conditions. Determining the fabric that best disguises a given bloodstain will prove useful to the Crime Scene Technician (CST) as well as the subsequent litigation team. If a person alleged that they observed "bloody clothing" on a subject/victim, this research addresses whether that observation was possible to the untrained eye.
Additionally, the results sought to determine the optimum fabric type, pattern, and color to best disguise blood on apparel fabric at a scene. When the "optimum fabric" is observed at a given crime scene with potential blood evidence, this research would prompt the crime scene technician to utilize an ALS and to submit the item for in-depth analysis at the laboratory. A sample of fabrics was selected using recent statistics on U.S. fabric consumption. This ensured a sampling representative of the general American population and included various solid-color and printed fabrics. Additionally, the sample fabrics included various military uniforms (Army Multicam, Marine Combat Utility Uniform, Navy Working Uniform, and Airman Battle Uniform) to make the study relevant to the Department of Defense. The various fabric types and patterns produced different qualities in the resulting bloodstains, which were subsequently measured to obtain quantitative results. The fabric samples were placed in four separate indoor lighting conditions, and objective measurement of stain contrast determined there was no significant difference between the light sources. This research ultimately determined there was a significant difference between the construction and color of the fabric.

Item: A Comparative Analysis of Casting Materials on Developed Latent Prints (2020) - Brock, Jenny F.

The drive to preserve and collect latent prints has led to the creation of innovative products and technologies. However, because of the increasing variety of these implements, supplementary research was necessary to ensure appropriate application of these products depending on circumstances, environments, and availability. Further, crime scene technicians and departments, who rely on experience and the accessibility of materials, also need current information on product capabilities and limitations.
This research compared AccuTrans, First Contact polymer solution, and Elmer's glue as casting materials applied to latent prints developed with dual-use black powder. The latent print samples (n=60) were deposited on either flat or curved substrates and then preserved with one of the three casting materials. Evaluation of the collected latent prints included whether friction ridge detail was present, how many individual characteristics could be observed (up to a maximum of 10), and how many levels of friction ridge detail were represented. The results of this study indicated that no matter which casting material was selected, there was no significant variation in the ability of the materials to perform (p=.794) and successfully collect a developed latent print for examination on certain substrates (p=.462). Individually, the casting materials had observable strengths and weaknesses such as short or extended dry times, deformation after collection, and ease of collection.

Item: The Efficacy of Acidified Hydrogen Peroxide on Cartridge Cases as a Latent Print Development Technique (2020) - Flores, Camille

Friction ridge impressions, commonly known as fingerprints, are a type of evidence investigators search for when examining crime scenes and evidence. They are left behind by friction ridge skin after physical contact with a surface. Impressions not readily visible to the naked eye are latent prints. Latent prints are composed of oils, perspiration, and other constituents (Dutelle, 2014). In order to visualize these impressions, agencies follow protocols proven to develop prints on various substrate surfaces. A cartridge case is a nonporous surface, which means the fingerprint residue will not be absorbed by the substrate. Due to the small surface area and the influence of the firing process on any impression left behind, recovering latent prints from cartridge cases is a challenge.
This study aims to analyze the effectiveness of various formulations of Acidified Hydrogen Peroxide (AHP) as a fingerprint processing technique on various metal types of unfired and fired cartridge cases. A subsequent study analyzes how the processing technique can be implemented in conjunction with three commonly used chemical processing techniques for cartridge cases: cyanoacrylate ester fuming, fluorescent dye staining, and gun bluing. This study revealed that a manufactured formula of AHP developed the greatest number of samples compared to the other formulae. This study also showed that the sequence of AHP, cyanoacrylate ester fuming, and then R.A.Y. developed the greatest number of samples with measurable ridges by the end of the sequence.

Item: Bullet Hole Characteristics, Limiting Factors, and Reconstructing Shooter Location within a Crime Scene (2020) - Jennings, Jonatthan

Shooting scene reconstruction and the identification of where the shooter and weapon were most likely located can be critical pieces of information for law enforcement and crime scene investigators. An extensive literature review revealed a gap: identifying the most likely position of the shooter when accounting for limiting factors, such as room size and furniture. There are several ways to conduct shooting trajectory analysis, with authors such as Haag, L., and Haag, M. (2011), Hueske (2009), Gardner and Bevel (2009), and Gardner and Krouskup (2019) writing at length about the process of determining shooting trajectory and overall crime scene reconstruction. The trajectory analysis for this project was adapted from Gardner and Bevel (2009), developed through coordination with the Virginia State Police, and included using trajectory rods, angle finders, protractors, and lasers to determine the trajectory of the bullet.
Then limiting factors along the path of the bullet, gunshot residue (GSR), and overall room size were accounted for to identify the most likely position of the shooter. Following Gardner and Bevel (2009) and through coordination with certified crime scene experts, it was determined that shooter positions are broadly assigned to zones one through three. The results expected from this project are to refine zone one, described as the most probable shooting location; to identify the overall accuracy rate of shooting trajectory analysis; and to develop a predictive statistical model to determine the impact of the limiting factors on predicting the shooter's distance. The conclusion anticipated from this research is that, when all factors are taken into account, a most likely shooter location can be identified within plus or minus three to five feet.

Item: Comparison of Vacuum Metal Deposition and Gun Bluing for Developing Latent Fingerprints on Fired Nickel-Plate Brass Ammunition (2020) - Osborne, Amy

Vacuum metal deposition (VMD) is a highly sensitive method for developing latent fingerprints on semiporous and nonporous surfaces, with extensive research focusing on various classes of polymers. The relatively high cost of a VMD unit and the need for an experienced operator have prevented the technique from replacing traditional methods, such as gun bluing, for developing latent fingerprints on spent cartridge casings. A literature review revealed that while VMD has the potential to be a powerful technique for development on expended cartridge casings, there is a lack of comparison studies between VMD and other latent print development techniques on shell casings. The purpose of this study was to determine whether there is a significant difference in the quality of latent fingerprints developed by vacuum metal deposition and gun bluing (GB) on fired cartridge casings.
To accomplish this, a single latent print spiked using a sebaceous oil reference pad was deposited on 250 9mm nickel-plated brass ammunition samples and developed by either VMD or GB. Of these 250 casings, 50 were treated in a brief preliminary study that subsequently guided the gun bluing process. As a result, a modified gun bluing protocol using a 20% Brass Black (Birchwood Casey, Texas, USA) solution was followed to process 100 casings, while the remaining 100 were subjected to silver/zinc deposition in the VMD560 (West Technology Forensics, England, UK). All casings were photographed, examined, and graded on a 0-4 scale by a Certified Latent Print Examiner. Of the 200 casings developed, 115 samples failed to yield any ridge detail (Grade 0), with 54.78% having been developed by VMD. There was limited ridge detail (Grade 1) present on 62 samples, with comparable results from both techniques. Low-quality detail (Grade 2) was visualized in 23 samples, with 86.96% having been produced by GB. Using the Kruskal-Wallis H test with a 90% confidence level to determine statistical significance, the primary hypothesis that VMD would be the superior method was not supported. Therefore, it is recommended that nickel-plated brass casings be processed using the modified GB protocol.

Item: The Interpretation of Various Skin Conditions on the Transfer of Secondary Touch DNA (2020) - Kirkland, Jade

Secondary DNA transfer has been an increasing topic of study throughout the forensic science community in recent years. Little is known about how the condition of the surface of an individual's hand may play a role in the transfer of secondary DNA. This study evaluated whether individuals' hand conditions, whether oily or dry, played a role in the deposition of DNA onto another individual's hands and then onto a glass slab. Participants were assigned as the primary DNA contributor, meaning they would be the person touching the glass.
Participants were then told their hand condition, and the researcher and each participant had direct contact with each other for 30 seconds via handshakes, after which the primary contributor placed their hand on a glass substrate. Swabbings of the glass were taken, then extracted, quantified, amplified, and injected into the AB 3500 Genetic Analyzer. DNA typing results showed that under oily conditions the secondary contributor's profile was likely to appear in greater abundance than if that contributor had dry skin, with 35.4% of alleles detected when both contributors had oily hands and 20% of alleles detected when only the secondary contributor had oily hands.

Item: Deconvolution of DNA Mixtures Using Replicate Sampling and TrueAllele® Mixture Interpretation (2020) - Antillon, Sara

Analysis of DNA mixture evidence does not always yield distinct profiles. This process is further complicated with the low template DNA (LT-DNA) samples often seen in forensic casework. Traditional qualitative methods use thresholds to distinguish allele peaks from stutter peaks, noise, etc., resulting in data being omitted during analysis. In cases where LT-DNA is present, low peaks that could potentially be attributed to low-level contributor profiles may not be called due to these instituted thresholds. The probabilistic genotyping computer software program created by Cybergenetics (Pittsburgh, PA), TrueAllele® Casework, considers all data and performs quantitative analysis using probability to represent uncertainty. It objectively forms likelihood ratios (LR) that compare the probabilities of an evidentiary genotype and a suspect genotype relative to a reference population. A joint likelihood function (JLF) takes two or more independent sets of data and compares them jointly, as opposed to as single events. The JLF can elicit more identification information, proving useful in DNA mixture analysis.
This project used TrueAllele® Casework to perform DNA mixture analysis on two sets of previously published mixture data provided by Cybergenetics. The first set comprised 40 two-contributor mixture samples, and the second set included four sets of 10 randomized mixtures with two, three, four, and five contributors, respectively. The selected samples were interpreted singly and jointly in three variable groups: mixture weight, template concentration, and complex mixtures. The differences between the match logLRs of the single and joint analyses were calculated, and an information gain was seen in all three groups when the samples were analyzed jointly. Changing DNA collection and amplification procedures for touch and DNA mixture evidence samples will increase the amount of data available for DNA mixture analysis using probabilistic genotyping. These procedures can be modified so that multiple swabs and replicate amplifications produce more data that TrueAllele can analyze using the JLF. Jointly analyzing each set of independent evidence data can lead to higher match statistics, which will ultimately help in the identification of those who commit crimes.

Item: Deep Brain Stimulation: Treatment for Clinical Depression (2020) - Admassu, Azaria

Depression is the most common mental disorder in the United States. A person with this disorder is generally described as feeling sad, discouraged, and generally disinterested in life. Deep brain stimulation (DBS) is a method of electrically stimulating a specific part of the brain using implanted electrodes. Since the symptoms of major depressive disorder have been linked to dysfunction of the reward circuitry system, in which the NAcs are major players, DBS of these neurons has been one suggested method of improving patients' symptoms. In 2013, DBS of the NAcs produced a promising result when patients treated with DBS showed improvement with little to no side effects.
This paper will discuss the advantages of DBS as an antidepressant through an assessment of available studies and will further discuss the current as well as future challenges facing DBS.

Item: Factors Effecting Friction Ridge Transfer Through Gloves Used During Crime Scene Processing (2020) - Wasserman, Aliah

A pillar of the crime scene processing curriculum, Locard's Exchange Principle states that everywhere one goes, they take something with them and also leave a trace behind. The ultimate goal of crime scene processing is to collect valuable forensic evidence while minimizing the effects of Locard's Exchange Principle. The use of personal protective equipment (PPE) is one of the most common ways the forensic community mitigates the risk of cross-contamination during crime scene processing. Through trial, error, and research, it is known that friction ridge detail can transfer through gloves. The objective of this research is to help the field of crime scene investigation develop best practices for minimizing scene contamination by way of this transfer. Though the community agrees it is possible to deposit this detail, it is less well known what practices can intensify that chance and, more importantly, what can prevent it. This research tests two sets of circumstances in an attempt to determine best practices for mitigating friction ridge transfer during crime scene processing. This study determines the effect of hand condition, glove size, and utilization of the double-gloving method on the transfer of friction ridge detail to a glass surface through nitrile examination gloves.
The findings were that the condition of the hands prior to donning the glove(s) did not impact this transfer, but that wearing two pairs of gloves significantly reduced the occurrence of friction ridge detail on the deposition surface compared to one pair of gloves.

Item: Latent Fingerprint Development on Adhesive Surfaces After Application to Fabric (2020-04) - Boarts, Jason

One of the many items of evidence found at a crime scene that can yield breakthrough clues if handled and processed appropriately is adhesive tape. Through fracture matching and DNA analysis, a person can be linked to the tape and, therefore, the scene of the crime. Another way adhesive tape can link a person to the crime is through the development of latent fingerprints. Thorough research and real-life casework have proven that latent fingerprints can be developed from adhesive surfaces through a variety of processing techniques. A search of the extant literature shows this to be true, but little information exists on the ability to develop comparative latent fingerprints from adhesive surfaces after they have been applied to fabrics. This study adds to the literature through the deposition of simulant-laden latent fingerprints on tan packing tape, clear packing tape, and grey duct tape, then applying the tape to denim, polyester, and cotton fabric samples. The tape was processed utilizing crystal violet, black wet powder, and small particle reagent. Careful processing and analysis of 135 adhesive samples and 405 latent fingerprints determined that adhesive and fabric types, coupled with processing methods, play a role in the ability to develop latent fingerprints from adhesive surfaces that have been applied to fabrics.
The results of the study fill an apparent gap in the literature and provide investigators and lab analysts another means of potentially identifying persons of interest in criminal investigations.

Item: Comparison of two different photo protocols and increasing the accuracy of 3D modeling of snow shoeprints by Photogrammetry (2021-03-29) - Sheen Vento, Karla

Sometimes a shoeprint can help narrow down the number of suspects at a crime scene, so having an efficient recovery method for shoeprints can be helpful (Andalo et al., 2012). Photogrammetry has been proposed as a simple and reliable method for shoeprint analysis in previous studies; however, its use on certain surfaces such as snow can be challenging, and it also requires following a strict protocol for picture taking (Larsen et al., 2020). The objectives of this study are to test the equivalency of an alternative picture-taking protocol proposed by Larsen et al. against the standard protocol proposed by the developers of Digtrace, a software package that allows the 3D modeling of shoeprints, and to test the effectiveness of different techniques for improving the quality of shoeprint photos taken in snow. In the first experiment, two shoeprints were created in sand and mud and photographed using both Larsen's and Digtrace's photo-taking protocols. A series of 3D models was created in Digtrace and randomized, and point clouds extracted from the models were compared using the CloudCompare software to assess differences in variability. In the second experiment, five shoeprints were created in snow, and several enhancing techniques (oblique light, red filter/black-and-white photo, and red and blue dyes) were used to increase the contrast of the photographs. The same comparison process from the first experiment was used to determine any reduction in the variability of point cloud distances relative to a control group.
The results showed higher accuracy for Larsen's protocol (mean distance 0.1025 mm) than Digtrace's protocol on the mud surface; however, on the sand surface Digtrace's protocol revealed a smaller error distance (0.0968 mm) than Larsen's protocol. The results from the second experiment showed that the use of blue and red dyes produced a noticeable improvement in the reliability values (mean error distances 0.0648 mm and 0.0734 mm). In contrast, oblique lights and red filters/black-and-white photos did not produce a significant improvement. This study shows that both Larsen's and Digtrace's protocols can be used to build reliable shoeprint 3D models and that the accuracy of 3D snow shoeprints can be improved with a simple method such as the spraying of red or blue dyes.

Item: Age-at-death estimations using cementochronology in thermally altered teeth (2021-04) - Weaver, Ryen L.

Macroscopic age-at-death estimations provide age ranges that give broad and often insufficient insight into an individual's chronological age. Accurate age estimations can become more complex for a forensic anthropologist if the unknown individual has been subjected to extreme heat in an assortment of scenarios that include, but are not limited to, structure fires, airplane crashes, automobile accidents, and attempts to conceal evidence of homicide. Due to their high mineral content and sequestered placement in the jaw, teeth can be among the best-preserved human tissues in extreme heat situations (Beach, Passalacqua, & Chapman, 2015). Cementochronology utilizes the cementum, the mineralized covering of a tooth root, as an aid in estimating an individual's age (Colard et al., 2015). Cementochronology is one of the most accurate ways to estimate age-at-death because the countable cementum annulations found in cross-sections correlate directly with the individual's age (Wittwer-Backofen, 2004).
This study aims to provide an in-depth analysis of the thermal alteration of human teeth by various accelerants when utilizing the cementochronology method to build a biological profile. The sample in this study consists of 36 teeth from male and female donors, representing all odontological positions, ranging in age from 9 to 87 years. Three accelerants at varied volumes were used to determine whether cementum annulations remain readable after alteration. Results indicated that tooth type had a significant impact on the ability to count annulations. Annulations could still be read and counted after alteration with all three accelerants used in this study; however, acetone yielded the lowest readability and the most severe alterations. A novel formula was developed to help approximate the number of cementum annulations present in each sample. The tooth cementum annulation counts estimated with this formula were highly correlated with the actual ages of the individuals from whom the samples were taken. Even after thermal alteration of teeth with accelerants, cementochronology was found to be an accurate and helpful tool for estimating age-at-death in unknown individuals.Item Crime Scene Documentation- iPhone vs the Nikon D5600(2021-05) Mansell, AndrewA picture is worth a thousand words because of its power and influence. Photographs are an unbiased representation of fact, which is why crime scenes are documented with powerful cameras such as Nikon digital single-lens reflex models. Modern smartphone technology provides an alternative. This study compares the ability of an iPhone XS Max and a Nikon D5600 to accurately depict a crime scene. 
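Camera comparisons of the kind just described are often scored with a simple no-reference sharpness metric such as the variance of the Laplacian, where higher variance indicates more fine edge detail. A hedged sketch of that idea, as one generic proxy for acutance and not necessarily this study's procedure:

```python
import numpy as np

def laplacian_variance(img):
    """Variance of the 4-neighbor Laplacian over a 2-D grayscale array.

    Higher values indicate sharper edges; a common no-reference proxy
    for image sharpness/acutance.
    """
    # Discrete Laplacian via shifted differences, interior pixels only
    lap = (img[1:-1, :-2] + img[1:-1, 2:] +
           img[:-2, 1:-1] + img[2:, 1:-1] - 4 * img[1:-1, 1:-1])
    return float(lap.var())

# A crisp step edge scores higher than the same edge after a 3-pixel blur
sharp = np.zeros((16, 16))
sharp[:, 8:] = 1.0
blurred = (sharp + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1)) / 3
print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True
```

Applied to matched exposures from each camera, such a score would let the sharpness comparison be stated numerically rather than by visual inspection alone.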
This study compares each camera's mechanisms and judges the quality of their exposures through sharpness, resolution, and acutance.Item Evaluation Of Virtual Standard Curve Functionality Of The HID Real-Time PCR Analysis Software By Comparison To Assay-Specific Standard Curves And An External Standard Curve Generated In-House(2021-05) Mauriello, AngelinaQuantitative PCR (qPCR) plays a critical role in the field of forensic biology by determining the amount of "amplifiable" human-specific DNA. If too much or too little DNA is present, the resulting profiles can be difficult to interpret. Therefore, qPCR is beneficial in determining the quality and quantity of DNA needed to generate an interpretable profile from a forensic sample. To determine DNA quantity, a set of standards with known DNA concentrations is used to build a standard curve by ordinary least squares regression, against which samples with unknown quantities of DNA are compared to determine their concentrations. The goal of this research project was to examine two alternative methods of determining DNA quantity that do not require a standard curve for each run, in order to minimize run-to-run variation and reduce costs and analyst time. These two methods are an external standard curve and a virtual standard curve, evaluated across different kit lot numbers, curve preparations by different analysts, and instrument calibrations. Samples were quantified in duplicate, and a linear regression utilizing the average of all runs was used to calculate the slope and y-intercept per variable and target to generate a virtual standard curve in the new HID Real-Time PCR Analysis Software v1.3. It was determined that the external standard curve method and the virtual standard curve method yielded identical results. Results showed no significant effect of instrument calibration and no difference between kit lots when comparing the assay-specific curve to the virtual/external curve methods. 
For the virtual standard curve, there was no significant difference from the assay-specific method. There were, however, significant differences between analysts' pipetting when the standards they prepared were compared. A recommendation from this research regarding the use of these techniques is to have as many analysts as possible pipetting when the curve is generated. If more than one variable is introduced throughout the process, a new virtual standard curve needs to be generated. This study demonstrates the feasibility of implementing the virtual standard curve function into a casework laboratory workflow and shows that a laboratory can benefit from using these methods.
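The standard-curve quantification underlying all of these methods can be sketched in a few lines: fit Ct against log10(concentration) by ordinary least squares, then invert the fit for unknown samples. The dilution series and Ct values below are hypothetical illustrations, not the study's data:

```python
import numpy as np

# Hypothetical standard dilution series: known concentrations (ng/uL)
# and their measured Ct values (illustrative numbers only).
conc = np.array([50.0, 5.0, 0.5, 0.05, 0.005])
ct = np.array([22.0, 25.3, 28.7, 32.1, 35.4])

# Ordinary least squares fit: Ct = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(conc), ct, 1)

def quantify(sample_ct):
    """Back-calculate an unknown sample's concentration from its Ct value."""
    return 10 ** ((sample_ct - intercept) / slope)

# A virtual or external standard curve simply reuses slope and intercept
# across runs instead of refitting standards on every plate.
print(round(quantify(27.0), 3))  # ~1.6 ng/uL
```

Freezing the fitted slope and intercept is what removes the per-run standards, which is where the savings in reagents and analyst time come from.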