5 results for Crime scene searches
in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany
Abstract:
The analysis of tandem-repetitive DNA sequences has an established place as a genetic typing method in the fields of phylogenetic research, kinship analysis and, above all, forensic trace analysis, where the multiplex PCR analysis of short tandem repeat (STR) systems brought a breakthrough in the resolution and reliable attribution of biological crime scene traces. In the sequencing of the human genome, particular attention is paid to the genetically polymorphic sequence variations in the genome, the SNPs (single nucleotide polymorphisms). Two of their properties, their frequent occurrence within the human genome and their comparatively low mutation rate, make them particularly well-suited tools for forensics as well as for population genetics.
The EU project "SNPforID", from which the present work emerged, set as its goal the establishment of new methods for the valid typing of SNPs in multiplex assays. The focus was on sensitivity in the analysis of trace samples and on statistical power in forensic analysis. For this purpose, 52 autosomal SNPs were selected and examined for their maximum individualization power. The investigation of the first 23 selected markers constitutes the first part of the present work; it comprises the establishment of the multiplex assay and the SNaPshot™ typing method as well as their statistical evaluation. The results of this investigation form part of the subsequent study of the 52-SNP multiplex method, carried out in close cooperation with the partner laboratories.
Also within the project, and as the main goal of the dissertation, a single-base extension assay on glass slides based on microarray technology was established and evaluated. Starting from a limited amount of DNA, the possibility of simultaneously hybridizing as large a number of SNP systems as possible was investigated. The SNP markers employed were selected on the basis of the preparatory work that had been successfully carried out for the establishment of the 52-SNP multiplex.
Among the many methods for genotyping biallelic markers, the assay stands out through its parallelism and the simplicity of the experimental approach, offering considerable savings in time and cost. In the present work, the "array of arrays" principle was used to type twelve DNA samples on one glass slide at the same time under uniform experimental conditions. Based on a total of 1419 typed alleles from 33 markers, the validation was completed with a typing success rate of 86.75%. In addition, a number of boundary conditions relating to probe and primer design, hybridization conditions, and physical parameters of the laser-induced fluorescence measurement of the signals were tested and optimized.
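The individualization power of such a SNP panel can be illustrated with a random match probability calculation. The following is a minimal sketch, assuming Hardy-Weinberg equilibrium and independent (unlinked) biallelic loci; the function name and the uniform allele frequencies are illustrative, not taken from the thesis:

```python
import numpy as np

def random_match_probability(allele_freqs):
    """Probability that two unrelated individuals share the same genotype
    at every locus, assuming Hardy-Weinberg equilibrium and independent
    (unlinked) biallelic SNPs with the given allele frequencies."""
    rmp = 1.0
    for p in allele_freqs:
        q = 1.0 - p
        genotype_probs = np.array([p * p, 2.0 * p * q, q * q])
        # both individuals must draw the same genotype at this locus
        rmp *= float(np.sum(genotype_probs ** 2))
    return rmp
```

For 52 loci with allele frequencies of 0.5 each, this yields 0.375^52, on the order of 10^-23, which indicates why a panel of this size can approach STR-like discrimination power.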
Abstract:
This thesis presents an analysis for the search for Supersymmetry with the ATLAS detector at the LHC. The final state with one lepton, several coloured particles and large missing transverse energy was chosen. Particular emphasis was placed on the optimization of the requirements for lepton identification; this optimization proved to be particularly useful in combination with multi-lepton selections. The systematic error associated with higher-order QCD diagrams in Monte Carlo production is given particular focus. Methods to verify and correct the energy measurement of hadronic showers are developed. Methods for the identification and removal of mismeasurements caused by the detector are developed and applied in the single-muon and four-jet environment. A new detector simulation system is shown to provide good prospects for future fast Monte Carlo production. The analysis was performed for $35\,pb^{-1}$ and no significant deviation from the Standard Model is seen. Exclusion limits are set in this subchannel for minimal Supergravity, extending previous limits set by the Tevatron and LEP.
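Correcting the energy measurement of hadronic showers typically amounts to inverting a detector response function. The snippet below is a minimal sketch of such a numerical inversion, assuming a hypothetical logarithmic response parametrization; the function names and constants are illustrative, not ATLAS calibration values:

```python
import numpy as np

def response(e_true):
    """Hypothetical calorimeter response R = E_reco / E_true for hadronic
    showers: below unity and slowly rising with energy, as is typical for
    non-compensating calorimeters. The constants are illustrative only."""
    return 0.7 + 0.05 * np.log(e_true)

def calibrate_jet_energy(e_reco, tol=1e-9):
    """Recover the true jet energy by fixed-point iteration of
    E_true = E_reco / R(E_true), starting from the uncorrected energy."""
    e_true = e_reco
    for _ in range(200):
        e_next = e_reco / response(e_true)
        if abs(e_next - e_true) < tol:
            break
        e_true = e_next
    return e_true
```

For example, a shower with a true energy of 100 (in GeV, say) is reconstructed at about 93 with this response, and the iteration recovers the original value.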
Abstract:
One of the main goals of the ATLAS experiment at the Large Hadron Collider (LHC) at CERN in Geneva is the search for new physics beyond the Standard Model. In 2011, proton-proton collisions were performed at the LHC at a center-of-mass energy of 7 TeV and an integrated luminosity of 4.7 fb^{-1} was recorded. With this dataset, one of the most promising theories beyond the Standard Model, supersymmetry, can be probed beyond the limits achieved thus far. Final states in supersymmetry events at the LHC contain highly energetic jets and sizeable missing transverse energy. The additional requirement of highly energetic leptons simplifies the control of the backgrounds. This work presents results of a search for supersymmetry in the inclusive dilepton channel. Special emphasis is put on the search within the Gauge-Mediated Symmetry Breaking (GMSB) scenario, in which the supersymmetry breaking is mediated via gauge fields. Statistically independent Control Regions for the dominant Standard Model backgrounds as well as Signal Regions for a discovery of a possible supersymmetry signal are defined and optimized. A simultaneous fit of the background normalizations in the Control Regions via the profile likelihood method allows for a precise prediction of the backgrounds in the Signal Regions and thus increases the sensitivity to several supersymmetry models. Systematic uncertainties on the background prediction are constrained via the data-driven jet multiplicity distribution in the Control Regions. The observed data are consistent with the Standard Model expectation. New limits within the GMSB and the minimal Supergravity (mSUGRA) scenarios as well as for several simplified supersymmetry models are set or extended.
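The idea of normalizing backgrounds in control regions and extrapolating into the signal region can be sketched with a toy Poisson fit. This is a minimal illustration, not the profile likelihood machinery used in the thesis; all yields and region definitions below are invented:

```python
import numpy as np
from scipy.optimize import minimize

# Invented yields: two control regions (rows), two backgrounds (columns),
# e.g. one region enriched in each dominant background.
observed_cr = np.array([520.0, 310.0])   # data counts in the control regions
mc_cr = np.array([[450.0, 60.0],
                  [40.0, 280.0]])        # nominal MC yields in the control regions
mc_sr = np.array([12.0, 8.0])            # nominal MC yields in the signal region

def neg_log_likelihood(mu):
    """Poisson log-likelihood (up to a constant) of the CR counts,
    with one free normalization factor mu per background."""
    expected = mc_cr @ mu
    if np.any(expected <= 0.0):
        return np.inf
    return -np.sum(observed_cr * np.log(expected) - expected)

# Simultaneous fit of both normalization factors to the control regions.
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
mu_hat = result.x

# Extrapolate the fitted normalizations into the signal region.
sr_prediction = mc_sr @ mu_hat
```

In a full profile likelihood fit, systematic uncertainties would enter as additional constrained nuisance parameters; here only the two normalization factors are floated.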
Abstract:
Although the Standard Model of particle physics (SM) provides an extremely successful description of ordinary matter, astronomical observations show that it accounts for only around 5% of the total energy density of the Universe, whereas around 30% is contributed by dark matter. Motivated by anomalies in cosmic ray observations and by attempts to resolve open questions of the SM such as the (g-2)_mu discrepancy, proposed U(1) extensions of the SM gauge group have attracted attention in recent years. In the considered U(1) extensions, a new, light messenger particle, the hidden photon, couples to the hidden sector as well as to the electromagnetic current of the SM via kinetic mixing. This allows for a search for this particle in laboratory experiments exploring the electromagnetic interaction. Various experimental programs have been started to search for hidden photons, for example in electron-scattering experiments, which are a versatile tool to explore various physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies, as performed at MAMI or at JLAB. In these experiments the scattering of an electron beam off a hadronic target, e+(A,Z)->e+(A,Z)+l^+l^-, is investigated, and a search for a very narrow resonance in the invariant mass distribution of the lepton pair is performed. This requires an accurate understanding of the theoretical basis of the underlying processes. To this end, the first part of this work demonstrates how the hidden photon can be motivated from existing puzzles encountered at the precision frontier of the SM. The main part of this thesis deals with the analysis of the theoretical framework for electron-scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for the bremsstrahlung emission of hidden photons in such experiments is studied.
Based on these results, the applicability of the Weizsäcker-Williams approximation to calculate the signal cross section of the process, which is widely used in the design of such experimental setups, is investigated. In a next step, the reaction e+(A,Z)->e+(A,Z)+l^+l^- is analyzed as signal and background process in order to describe existing data obtained by the A1 experiment at MAMI, with the aim of giving accurate predictions of exclusion limits for the hidden photon parameter space. Finally, the derived methods are used to make predictions for future experiments, e.g., at MESA or at JLAB, allowing for a comprehensive study of the discovery potential of these complementary experiments. In the last part, a feasibility study for probing the hidden photon model with rare kaon decays is performed. For this purpose, invisible as well as visible decays of the hidden photon are considered within different classes of models. This allows one to derive bounds on the parameter space from existing data and to estimate the reach of future experiments.
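The narrow-resonance search described above can be illustrated with a toy bump hunt: compute the dilepton invariant mass event by event, then scan for a localized excess over the smooth QED background. The sliding-window estimator below is a deliberately simplified sketch (the real analyses fit the background shape); all function names and numbers are illustrative:

```python
import numpy as np

def invariant_mass(p1, p2):
    """Invariant mass of a lepton pair from (E, px, py, pz) four-vectors
    in natural units: m^2 = E^2 - |p|^2."""
    E, px, py, pz = p1 + p2
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

def sliding_window_significance(masses, centers, half_width):
    """Naive bump hunt: for each test mass, compare the count in a narrow
    window with the expectation interpolated from two equal-width sidebands."""
    z = []
    for m0 in centers:
        in_window = np.sum((masses > m0 - half_width) & (masses < m0 + half_width))
        low_side = np.sum((masses > m0 - 3 * half_width) & (masses < m0 - half_width))
        high_side = np.sum((masses > m0 + half_width) & (masses < m0 + 3 * half_width))
        expected = (low_side + high_side) / 2.0   # sidebands cover twice the window
        z.append((in_window - expected) / np.sqrt(max(expected, 1.0)))
    return np.array(z)
```

With a smooth background, the significance hovers around zero everywhere except near a true resonance mass.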
Abstract:
In 2013, the detection of a diffuse astrophysical neutrino flux with the IceCube neutrino telescope, constructed at the geographic South Pole, was announced by the IceCube collaboration. However, the origin of these neutrinos is still unknown, as no sources have been identified to this day. Promising neutrino source candidates are blazars, a subclass of active galactic nuclei with radio jets pointing towards the Earth. In this thesis, the neutrino flux from blazars is tested with a maximum likelihood stacking approach, analyzing the combined emission from uniform groups of objects. The stacking enhances the sensitivity with respect to the still unsuccessful single-source searches. The analysis utilizes four years of IceCube data, including one year from the completed detector. As all results presented in this work are compatible with background, upper limits on the neutrino flux are given. It is shown that, under certain conditions, some hadronic blazar models can be challenged or even rejected. Moreover, the sensitivity of this analysis, and of any other future IceCube point source search, was enhanced by the development of a new angular reconstruction method based on a detailed simulation of photon propagation in the Antarctic ice. The median resolution for muon tracks induced by high-energy neutrinos is improved for all neutrino energies above IceCube's lower threshold of 0.1 TeV. By reprocessing the detector data and simulation from the year 2010, it is shown that the new method improves IceCube's discovery potential by 20% to 30%, depending on the declination.
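The stacked likelihood-ratio test at the heart of such a search can be sketched in a few lines. In this toy version, each event carries a precomputed signal density (already summed, with weights, over the stacked sources) and a background density; the only free parameter is the number of signal events ns. The function name and the event model are illustrative, not the IceCube implementation:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def stacked_test_statistic(signal_pdf, background_pdf):
    """Unbinned likelihood-ratio test for a stacked source search.
    signal_pdf[i]: per-event signal density summed over all stacked sources;
    background_pdf[i]: background density. Returns the best-fit number of
    signal events ns_hat and the test statistic TS = 2 ln[L(ns_hat)/L(0)]."""
    n = len(signal_pdf)
    ratio = signal_pdf / background_pdf - 1.0

    def neg_log_lr(ns):
        # -[ln L(ns) - ln L(0)] = -sum_i ln(1 + ns/N * (S_i/B_i - 1))
        return -np.sum(np.log1p(ns / n * ratio))

    res = minimize_scalar(neg_log_lr, bounds=(0.0, n - 1.0), method="bounded")
    ts = 2.0 * max(-res.fun, 0.0)
    return res.x, ts
```

The test statistic stays at zero for background-like data and grows when a group of sources contributes events that are far more signal-like than background-like.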