884 results for "Fully automated"
Abstract:
An efficient two-level model identification method aiming at maximising a model's generalisation capability is proposed for a large class of linear-in-the-parameters models built from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularisation parameters of the elastic net are optimised at the upper level using a particle swarm optimisation (PSO) algorithm that minimises the leave-one-out (LOO) mean square error (LOOMSE). There are two original contributions. Firstly, an elastic net cost function based on an orthogonal decomposition is defined and applied; this facilitates automatic model structure selection without requiring a predetermined error tolerance to terminate the forward selection process. Secondly, it is shown that the LOOMSE of the resultant ENOFR models can be computed analytically without actually splitting the data set, and the associated computational cost is small thanks to the ENOFR procedure. Consequently, a fully automated procedure is achieved without resorting to a separate validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approaches.
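As a rough illustration of the two-level structure described above, the following sketch pairs an elastic net fit (lower level) with a search over the two regularisation parameters that minimises the LOO error (upper level). It is only a minimal stand-in: scikit-learn's coordinate-descent ElasticNet replaces the ENOFR algorithm, a random search replaces PSO, and the LOOMSE is computed by explicit data splitting rather than the paper's analytic formula; the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                      # toy observational data
y = X @ np.array([1.5, 0, 0, -2.0, 0, 0, 0.5, 0]) + 0.1 * rng.normal(size=60)

def loomse(alpha, l1_ratio):
    """Lower level: fit an elastic net and score it by leave-one-out MSE."""
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=10_000)
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()

# Upper level: search the two regularisation parameters by minimising LOOMSE
# (random search here; the paper uses particle swarm optimisation).
candidates = [(10 ** rng.uniform(-4, 0), rng.uniform(0.05, 1.0))
              for _ in range(50)]
best = min(candidates, key=lambda p: loomse(*p))
print("selected (alpha, l1_ratio):", best, "LOOMSE:", loomse(*best))
```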
Abstract:
Our study investigated the effects of condensed tannins (CT) on rumen in vitro methane (CH4) production and fermentation characteristics by incubating lucerne in buffered rumen fluid in combination with different CT extracts at 0 (control), 40, 80 and 120 g CT/kg of substrate DM. Condensed tannins were extracted from four sainfoin accessions: Rees ‘A’, CPI63763, Cotswold Common and CPI63767. Gas production (GP) was measured using a fully automated GP apparatus, with CH4 measured at distinct time points. The condensed tannins differed substantially in polymer size, with mean degrees of polymerization ranging from 13 (Rees ‘A’) to 73 (CPI63767), but had relatively similar CT contents, procyanidin:prodelphinidin (PC:PD) ratios and cis:trans ratios. Compared to the control, addition of CT from CPI63767 and CPI63763 at 80 and 120 g CT/kg of substrate DM reduced CH4 by 43% and 65%, and by 23% and 57%, respectively, after 24-h incubation. Similarly, CT from Rees ‘A’ and Cotswold Common reduced CH4 by 26% and 46%, and by 28% and 46%, respectively. Addition of increasing levels of CT linearly reduced the maximum rates of GP and CH4 production, as well as the estimated in vitro organic matter digestibility. There was a negative linear and quadratic (p < 0.01) relationship between CT concentration and total volatile fatty acid (VFA) production. Inclusion of 80 and 120 g CT/kg of substrate DM reduced (p < 0.001) branched-chain VFA production and the acetate:propionate ratio, with the lowest values for CPI63767. The decrease in proteolytic activity, shown indirectly by a shift in VFA composition towards propionate and a reduction in branched-chain VFA production, varied with the type of CT and was greatest for CPI63767. In conclusion, these results suggest that tannin polymer size is an important factor affecting in vitro CH4 production, which may be linked to the interaction of CT with the dietary substrate or microbial cells.
Abstract:
In practically all vertical markets and in every region of the planet, loyalty marketers have adopted the tactic of recognition and reward to identify, maintain and increase the yield of their customers. Several strategies have been adopted by companies, the most popular being the loyalty program, which runs a loyalty club to manage these rewards. The problem with loyalty programs, however, is that customer identification and the transfer of loyalty points are performed only semi-automatically. To address this, this paper presents an embedded commercial automation solution, developed as a master's project, called e-Points. The goal of e-Points is to equip loyalty clubs with fully automated technology to identify customers directly at the point of sale, ensuring greater control over the loyalty of club members. To this end, we developed a hardware platform with an embedded system and RFID technology for use with retailers' PCs, a smart card that accumulates points with every purchase, and a web server providing services of interest to retailers and to the club's member customers.
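The abstract does not give e-Points' actual data model or interfaces, so the sketch below is a purely hypothetical reconstruction of the flow it describes: an RFID read at the point of sale identifies the customer, and loyalty points are credited automatically. All names (Member, accrue_points, POINTS_PER_UNIT) are invented for illustration.

```python
from dataclasses import dataclass

POINTS_PER_UNIT = 1  # assumed accrual rule: 1 point per currency unit spent

@dataclass
class Member:
    card_uid: str   # UID read from the RFID smart card
    points: int = 0

members = {"04A2B9C1": Member("04A2B9C1")}  # stand-in for the web server's DB

def accrue_points(card_uid: str, purchase_amount: float) -> int:
    """Credit points for a purchase identified by an RFID card read."""
    member = members[card_uid]                       # identify the customer
    earned = int(purchase_amount * POINTS_PER_UNIT)  # apply the accrual rule
    member.points += earned                          # automated point transfer
    return member.points

print(accrue_points("04A2B9C1", 59.90))  # -> 59
```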
Abstract:
The widespread growth in the use of smart cards (by banks, transport services, cell phone operators, etc.) has brought up an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It has been inspired by methodologies from Larch and Eiffel, and has been widely adopted as the de facto language for the specification of Java-related programs. Various tools that make use of JML have already been developed, covering a wide range of functionalities such as runtime and static checking. But the static-checking tools existing so far are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a series of techniques that can be used to accomplish fully automated and confident verification of JavaCard applets. In this work we present the first steps towards this goal. Using a software platform comprising Krakatoa, Why and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved to be valuable not only here but in most real-world problems related to automatic verification.
Abstract:
Sugar cane burning in Brazil causes remarkable amounts of organic compounds to be emitted, amongst which the polycyclic aromatic hydrocarbons (PAHs) represent serious health hazards. Therefore, 24-h aerosol samples (< 10 µm aerodynamic diameter) were collected in Araraquara city (São Paulo state) during the harvest season using a Hi-Vol sampler. PAHs were recovered using an Accelerated Solvent Extractor and analyzed by low-pressure gas chromatography-ion trap mass spectrometry (LP-GC-IT-MS). The fully automated extraction process was performed in less than 25 min with a solvent consumption of approximately 20 ml. The use of a deactivated 0.6 m x 0.10 mm i.d. restrictor coupled to a 10 m wide-bore analytical column allowed most of the 16 PAHs on EPA's priority list to be identified and quantified in only 13 min. Concentrations of PAHs in Araraquara aerosols ranged between 0.5 and 8.6 ng m(-3).
Abstract:
This paper presents a novel approach to the computer-aided assessment of a mammographic phantom device. The approach shown here is fully automated and is based on the automatic selection of the region of interest and on the use of the discrete wavelet transform (DWT) and morphological operators to assess the quality of American College of Radiology (ACR) mammographic phantom images. The algorithms developed here successfully scored 30 images obtained with different combinations of tube voltage and exposure, and detected the differences in the radiographs due to the different levels of radiation exposure.
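The abstract does not detail the authors' algorithm, so the following is only an illustrative sketch of the kind of pipeline it describes: decompose a phantom image with the DWT, use detail-coefficient energy as a crude visibility score, and apply a morphological opening to isolate compact test objects. It assumes PyWavelets, NumPy and SciPy, and operates on a synthetic image.

```python
import numpy as np
import pywt
from scipy import ndimage

image = np.zeros((128, 128))
image[40:48, 40:48] = 1.0     # synthetic stand-in for a phantom speck group
image += 0.1 * np.random.default_rng(1).normal(size=image.shape)

# One-level 2D DWT: approximation plus horizontal/vertical/diagonal details.
cA, (cH, cV, cD) = pywt.dwt2(image, "db2")

# Detail energy as a crude visibility score for small phantom structures.
detail_energy = float(np.sum(cH**2 + cV**2 + cD**2))

# Morphological opening on a thresholded image removes isolated noise pixels
# while keeping compact objects such as the embedded test specks.
binary = image > 0.5
opened = ndimage.binary_opening(binary, structure=np.ones((3, 3)))
n_objects = ndimage.label(opened)[1]

print(f"detail energy: {detail_energy:.2f}, objects found: {n_objects}")
```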
Abstract:
The Amazon river turtle (Podocnemis expansa) is a very important faunal resource for the riverside populations of the Amazon region, besides being one of the main species indicated for captive production. The consumption of this species as food in the region has generated a demand for studies on its sanitary status and its impact on public health. The main objective of this work was to evaluate the intestinal microbiota of free-living and captive Amazon river turtles, checking for the occurrence of bacteria of the family Enterobacteriaceae in the intestinal tract of these animals. For this purpose, 116 adult turtles of both sexes were used: 51 were captured on Ilha de São Miguel, in the municipality of Santarém (PA), 50 belonged to a commercial breeding facility, and 15 came from a conservation breeding facility, both located in the metropolitan region of Belém, Pará. From each animal, a sample of cloacal biological material was collected using sterile swabs, which were then placed in tubes with transport media and sent to the laboratory for bacteriological analysis. All samples were immersed in Selenite and BHI broths for 24 hours and subsequently plated on Salmonella-Shigella agar and MacConkey agar at 37 °C for 24 hours. The CFUs (colony-forming units) were plated on Mueller-Hinton agar for a further 24 hours in an incubator at 37 °C and identified with the fully automated Vitek® system. From the total of 116 samples, 245 bacterial growths were obtained, of which 83 (33.87%) came from the free-living animals, with 20 bacterial species identified. From the captive animals, 162 (65.72%) isolates were obtained, with 10 bacterial species identified. Eight species were found in both environments and 14 species in only one of them. Klebsiella pneumoniae was the most frequent species, with 52 isolates, accounting for 21.22% of the bacterial growths, followed by Enterobacter cloacae (35/14.29%), Serratia marcescens (29/11.84%) and Salmonella species (24/9.80%). In the free-living chelonians, the most frequently isolated microorganisms belonged to the genera Enterobacter, Klebsiella, Citrobacter and Aeromonas. Klebsiella pneumoniae, Serratia marcescens, Enterobacter cloacae and Salmonella spp. showed high frequencies in the captive animals. These results show a greater diversity of microorganisms among the free-living animals and a higher contamination per sample in the captive animals. The species Salmonella sp., E. coli and Acinetobacter spp. had their frequencies increased, probably owing to the influence of captivity, and are therefore suggested as indicators of the sanitary quality of Amazon river turtle populations.
Abstract:
Fogging of ReJeX-iT TP-40 offers a very efficient method for the control and dispersal of nuisance birds from many diverse areas. The amount of repellent required is greatly reduced compared with any other control method. The method is direct and is independent of the activity of the birds. Application with any fogger, thermal or mechanical, that can deliver droplets of less than 20 microns can be manual or fully automated and poses only minimal risks to operators or animals. All birds that had become a nuisance and safety problem in the hangars of TWA and AA at LaGuardia, and in the TWA warehouse at Newark Airport, were successfully driven out by fogging ReJeX-iT TP-40 with a Curtis Dyna-Fog "Golden Eagle" thermal fogger.
Abstract:
A thin-layer electrochemical flow cell coupled to capillary electrophoresis with contactless conductivity detection (EC-CE-C4D) was applied for the first time to the derivatization and quantification of neutral species, using aliphatic alcohols as model compounds. The simultaneous electrooxidation of four alcohols (ethanol, 1-propanol, 1-butanol, and 1-pentanol) to the corresponding carboxylates was carried out on a platinum working electrode in acid medium. The derivatization step required 1 min at 1.6 V vs. Ag/AgCl under stopped-flow conditions, preceded by a 10 s activation at 0 V. The solution close to the electrode surface was then hydrodynamically injected into the capillary, and a 2.5 min electrophoretic separation was carried out. The fully automated flow system operated at a frequency of 12 analyses per hour. Simultaneous determination of the four alcohols presented detection limits of about 5 x 10(-5) mol L(-1). As a practical application with a complex matrix, ethanol concentrations were determined in diluted pale lager beer and in nonalcoholic beer. No statistically significant difference was observed between the EC-CE-C4D and gas chromatography with flame ionization detection (GC-FID) results for these samples. The derivatization efficiency remained constant over several hours of continuous operation with lager beer samples (n = 40).
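A quick consistency check of the throughput figures quoted above: the itemized steps (10 s activation, 1 min derivatization, 2.5 min separation) account for 220 s, while 12 analyses per hour implies a 300 s cycle, leaving roughly 80 s per cycle for the steps the abstract does not itemize (flushing and hydrodynamic injection); that remainder is inferred, not stated.

```python
activation_s = 10
derivatization_s = 60
separation_s = 2.5 * 60

itemized_s = activation_s + derivatization_s + separation_s  # 220 s
cycle_s = 3600 / 12                                          # 12 analyses/hour
print(f"itemized steps: {itemized_s:.0f} s, full cycle: {cycle_s:.0f} s, "
      f"unitemized overhead: {cycle_s - itemized_s:.0f} s")  # -> 80 s
```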
Abstract:
For the first time, multiwavelength polarization Raman lidar observations of optical and microphysical particle properties over the Amazon Basin are presented. The fully automated advanced Raman lidar was deployed 60 km north of Manaus, Brazil (2.5 degrees S, 60 degrees W) in the Amazon rain forest from January to November 2008. The measurements thus cover both the wet season (December-June) and the dry or burning season (July-November). Two case studies of young and aged smoke plumes are discussed in terms of spectrally resolved optical properties (355, 532, and 1064 nm) and further lidar products such as particle effective radius and single-scattering albedo. These measurement examples confirm that biomass burning aerosols show a broad spectrum of optical, microphysical, and chemical properties. The statistical analysis of the entire measurement period revealed strong differences between the pristine wet season and the polluted dry season. African smoke and dust advection frequently interrupted the pristine phases during the wet season. Compared to pristine wet season conditions, the particle scattering coefficients in the lowermost 2 km of the atmosphere were enhanced, on average, by a factor of 4 during periods of African aerosol intrusion and by a factor of 6 during the dry (burning) season. Under pristine conditions, the particle extinction coefficients and optical depth at 532 nm wavelength were frequently as low as 10-30 Mm(-1) and <0.05, respectively. During the dry season, biomass burning smoke plumes reached 3-5 km height and caused a mean optical depth at 532 nm of 0.26. On average during that season, particle extinction coefficients (532 nm) were of the order of 100 Mm(-1) in the main pollution layer (up to 2 km height). Angstrom exponents were mainly between 1.0 and 1.5, and the majority of the observed lidar ratios were between 50 and 80 sr.
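For reference, the Angstrom exponent å quoted above is the standard two-wavelength parameterisation of the spectral dependence of aerosol optical depth τ:

\[
\mathring{a} = -\frac{\ln\left(\tau_{\lambda_1}/\tau_{\lambda_2}\right)}{\ln\left(\lambda_1/\lambda_2\right)}
\]

As a worked example using the stated dry-season values (the abstract does not give the wavelength pair used, so 355/532 nm is assumed here): with τ(532 nm) = 0.26 and å = 1.25 (mid-range of the quoted 1.0-1.5), τ(355 nm) ≈ 0.26 · (532/355)^1.25 ≈ 0.43.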
Abstract:
During the previous 10 years, global R&D expenditure in the pharmaceuticals and biotechnology sector has steadily increased, without a corresponding increase in the output of new medicines. To address this situation, the biopharmaceutical industry's greatest need is to predict failures at the earliest possible stage of the drug development process. A major key to reducing failures in drug screening is the development and use of preclinical models that are more predictive of efficacy and safety in clinical trials. Further, relevant animal models are needed to allow wider testing of novel hypotheses. Key to this is developing, refining, and validating complex animal models that directly link therapeutic targets to the phenotype of disease, allowing earlier prediction of human response to medicines and identification of safety biomarkers. Moreover, well-designed animal studies are essential to bridge the gap between tests in cell cultures and studies in people. The zebrafish is emerging, complementary to other models, as a powerful system for cancer studies and drug discovery. We aim to investigate this research area by designing a new preclinical cancer model based on the in vivo imaging of zebrafish embryogenesis. Technological advances in imaging have made it feasible to acquire nondestructive in vivo images of fluorescently labeled structures, such as cell nuclei and membranes, throughout early zebrafish embryogenesis. This in vivo image-based investigation provides measurements for a large number of features and events at the cellular level, including nucleus movement, cell counting, and mitosis detection, thereby enabling the estimation of more significant parameters such as the proliferation rate, which is highly relevant for investigating anticancer drug effects. In this work, we designed a standardized procedure for assessing drug activity at the cellular level in live zebrafish embryos. The procedure includes methodologies and tools that combine imaging with fully automated measurements of the embryonic cell proliferation rate. We achieved proliferation rate estimation through the automatic classification and density measurement of epithelial enveloping layer and deep layer cells. Automatic embryonic cell classification provides the basis for measuring the variability of relevant parameters, such as cell density, in different classes of cells, and is aimed at estimating the efficacy and selectivity of anticancer drugs. Through these methodologies we were able to evaluate and measure in vivo the therapeutic potential and overall toxicity of the anticancer molecules Dbait and Irinotecan. Results achieved with these anticancer molecules are presented and discussed; furthermore, extensive accuracy measurements are provided to investigate the robustness of the proposed procedure. Altogether, these observations indicate that the zebrafish embryo can be a useful and cost-effective alternative to some mammalian models for the preclinical testing of anticancer drugs, and it may also provide, in the near future, opportunities to accelerate the process of drug discovery.
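The proliferation-rate idea above can be illustrated with a minimal sketch: given automated cell counts at two time points and assuming exponential growth, the per-hour rate is log(n1/n0)/Δt. The counts below are invented placeholders; the actual pipeline classifies enveloping-layer and deep-layer cells in 3D fluorescence images before counting.

```python
import math

def proliferation_rate(n0: int, n1: int, dt_hours: float) -> float:
    """Per-hour exponential growth rate implied by two automated cell counts."""
    return math.log(n1 / n0) / dt_hours

control = proliferation_rate(n0=500, n1=900, dt_hours=6.0)  # untreated embryo
treated = proliferation_rate(n0=500, n1=650, dt_hours=6.0)  # drug-exposed
print(f"control: {control:.3f}/h, treated: {treated:.3f}/h, "
      f"inhibition: {100 * (1 - treated / control):.0f}%")  # -> ~55%
```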
Abstract:
Construction of a continuous multidimensional high-performance liquid chromatography system for the separation of proteins and peptides with integrated size-selective sample fractionation.
A multidimensional HPLC separation method was developed for proteins and peptides with molecular weights of <15 kDa. In the first step, the target analytes are separated from higher-molecular-weight and non-ionic components using restricted access materials (RAM) with ion-exchange functionality. The proteins are then separated on an analytical ion-exchange column and on reversed-phase (RP) columns. To avoid sample losses, a continuously operating, fully automated system was built, based on different separation speeds and four parallel RP columns. Two RP columns are eluted at a time, but with staggered start times, so that flat gradients still provide sufficient separation performance. While the third column is being regenerated, the fourth column is loaded by enriching the proteins and peptides at the column head. During the total analysis time of 96 minutes, fractions from the first dimension are transferred to the RP columns at 4-minute intervals and separated within 8 minutes, yielding 24 RP chromatograms. Test substances included standard proteins as well as proteins and peptides from human hemofiltrate and from lung fibroblast cell culture supernatants. In addition, fractions were collected and analyzed by MALDI-TOF mass spectrometry. From a single injection, more than 1000 peaks were resolved in the 24 RP chromatograms; the theoretical peak capacity is approximately 3000.
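The timing figures in the abstract imply a staggered round-robin over the four RP columns: a fraction is transferred every 4 min and each separation takes 8 min, so at any moment two columns elute (offset by 4 min) while the other two regenerate and load. The sketch below simply replays that schedule; the exact per-column role timings are inferred, not stated.

```python
FRACTION_INTERVAL = 4   # min between first-dimension fraction transfers
SEPARATION_TIME = 8     # min per RP gradient
TOTAL_TIME = 96         # min total analysis time
N_COLUMNS = 4

fractions = TOTAL_TIME // FRACTION_INTERVAL     # -> 24 RP chromatograms
for i in range(6):                              # show the first few transfers
    column = i % N_COLUMNS                      # round-robin column assignment
    start = i * FRACTION_INTERVAL
    print(f"fraction {i + 1:2d} -> column {column + 1}: "
          f"separates {start}-{start + SEPARATION_TIME} min")
print(f"total chromatograms: {fractions}")
```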
Abstract:
A newly constructed condensation particle counter, COPAS (COndensation PArticle counting System), for in-situ measurements of the concentration of Aitken particles and ultrafine aerosol particles was successfully deployed for the first time in aircraft measurements as part of this work. COPAS is a fully automated system adapted for aircraft-based measurements aboard the research aircraft "Geophysica" in the upper troposphere and lower stratosphere. The approach of growing aerosol particles in the size range with diameters d < 100 nm so that they can be detected optically is realized in COPAS by the principle of thermal diffusion, which guarantees continuous measurement of the aerosol concentration with a lower detection limit of d = 6 nm particle diameter. The use of an aerosol heater allows COPAS to distinguish between the volatile and non-volatile fractions of the aerosol. In extensive laboratory experiments, the COPAS system was characterized with respect to its lower detection limit as a function of operating temperature and under different pressure conditions, and the efficiency of the aerosol heater was determined. Aircraft-based measurements took place at middle and polar latitudes within the EUPLEX/ENVISAT validation project and in the tropics during the TROCCINOX/ENVISAT campaign. Measurements of the vertical concentration distribution of the aerosol at polar latitudes showed an increase in concentration above 17 km inside the polar vortex, with a high fraction of non-volatile particles of up to 70%. This is attributed to the entry of meteoritic smoke particles from the mesosphere into the upper and middle stratosphere of the vortex. Furthermore, in the lower stratosphere of the polar vortex, the influence of tropospheric air from low latitudes was detected, manifested in a high variability of the aerosol particle concentration. At tropical latitudes, the tropopause region was investigated. Concentrations of up to 10^4 ultrafine aerosol particles with 6 nm < d < 14 nm per cm³ of air were measured; their high volatile fraction is a clear indication that the particles were formed by the process of homogeneous nucleation. This provided, for the first time, further direct-measurement evidence for the conclusion of Brock et al. (1995) that new aerosol particles form by homogeneous nucleation in the tropical tropopause region. The vertical distributions of the stratospheric aerosol particle concentration at middle latitudes show the establishment of a background concentration of stratospheric aerosol that remained nearly constant over six years under volcanically quiescent conditions. Furthermore, the comparative analysis of stratospheric aerosol particle concentrations from polar, middle, and tropical latitudes provides insight into the transport and processing of stratospheric aerosol and, in particular, into the exchange of air masses between the stratosphere and the troposphere.
Abstract:
The last decades have witnessed significant and rapid progress in polymer chemistry and molecular biology. The invention of PCR and advances in automated solid-phase synthesis of DNA have made this biological entity broadly available to researchers across the biological and chemical sciences. Thanks to the development of a variety of polymerization techniques, macromolecules can be synthesized with predetermined molecular weights and excellent structural control. In recent years these two exciting areas of research have converged to generate a new type of nucleic acid hybrid material, consisting of oligodeoxynucleotides and organic polymers. By conjugating these two classes of materials, DNA block copolymers are generated that exhibit engineered material properties that cannot be realized with polymers or nucleic acids alone. Different synthetic strategies based on grafting-onto routes, in solution or on solid support, were developed, affording DNA block copolymers with hydrophilic, hydrophobic and thermoresponsive organic polymers in good yields. Besides the preparation of DNA block copolymers with a relatively short DNA segment, it was also demonstrated how these bioorganic polymers can be synthesized with large DNA blocks (>1000 bases) by applying the polymerase chain reaction. Amphiphilic DNA block copolymers, which were synthesized in a fully automated fashion in a DNA synthesizer, self-assemble into well-defined nanoparticles. Hybridization of spherical micelles with long DNA templates that encode the sequence of the micelle corona several times induced a transformation into rod-like micelles. The Watson-Crick motif aligned the hydrophobic polymer segments along the DNA double helix, which resulted in selective dimer formation. Even the length of the resulting nanostructures could be precisely adjusted by the number of nucleotides of the templates. In addition to changing the structural properties of DNA-b-PPO micelles, these materials were applied as 3D nanoscopic scaffolds for organic reactions. The DNA strands of the corona were organized by hydrophobic interactions of the organic polymer segments in such a fashion that several DNA-templated organic reactions proceeded in a sequence-specific manner, either at the surface of the micelles or at the interface between the biological and the organic polymer blocks. The yields of reactions employing the micellar template were equivalent to or better than those of existing template architectures. Aside from its physical properties and the morphologies achieved, an important requirement for a new biomaterial is its biocompatibility and interaction with living systems, i.e. human cells. The toxicity of the nanoparticles was analyzed by a cell proliferation assay. Motivated by the non-toxic nature of the amphiphilic DNA block copolymers, these nanoobjects were employed as drug delivery vehicles to target anticancer drugs to tumor tissue. The micelles obtained from DNA block copolymers were easily functionalized with targeting units by hybridization. This facile route made it possible to study the effect of the amount of targeting units on the targeting efficacy. By varying the site of functionalization, i.e. 5' or 3', the outcome of placing the targeting unit at the periphery of the micelle or in its core was studied. Additionally, these micelles were loaded with an anticancer drug, doxorubicin, and then applied to tumor cells. The viability of the cells was determined in the presence and absence of the targeting unit. It was demonstrated that tumor cells bearing folate receptors showed a high mortality when the targeting unit was attached to the nanocarrier.
Abstract:
This dissertation was carried out within a multicentre EU-funded project dealing with the possible applications of single nucleotide polymorphisms (SNPs) for the individualisation of persons, in the context of assigning biological crime-scene traces or identifying unknown bodies. The overarching aim of the project was to establish and validate high-resolution genotyping methods that can simultaneously analyse SNPs in multiplex format with high accuracy and little effort. First, 29 Y-chromosomal and 52 autosomal SNPs were selected under the requirement that, as a multiplex, they offer the highest possible chance of individualisation. This was followed by the validation of both multiplex systems and of the SNaPshot™ minisequencing method in systematic studies involving all working groups of the project. The validated minisequencing-based reference method served, on the one hand, for the controlled collaboration of the different laboratories and, on the other, as the basis for the development in this work of an assay for SNP genotyping using electronic microarray technology. The independent main part of this dissertation describes, using the previously validated autosomal SNPs, the development and validation of a new hybridisation assay for the electronic microarray platform from Nanogen. To this end, three different assays, differing in their operating principle on the microarray, were first established. Of these, the capture-down assay was selected for further development on the basis of its performance. After numerous optimisation steps concerning PCR product treatment, instrument-specific procedures and analysis-specific oligonucleotide designs, the capture-down assay was ready for the simultaneous typing of three individuals with 32 SNPs each on one microarray. This method was then validated using 40 DNA samples with known genotypes for the 32 SNPs, and its accuracy was determined by parallel SNaPshot™ typing. The result demonstrates not only the suitability of the validated assay and of electronic microarray technology for certain applications, but also their advantages in terms of speed, flexibility and efficiency. The automation, which allows the spatial arrangement of the fragments under investigation immediately before analysis, reduces unnecessary working steps and thus the error rate and risk of contamination, while improving time efficiency. However, with a maximum accuracy of 94%, the reliability of the STR systems currently used in forensic genetics cannot yet be matched. The role of the new method will therefore not be to replace the established methods, but to complement them for solving special problems such as the analysis of highly degraded DNA traces.