943 results for automated full waveform logging system
Abstract:
In the subject of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, among other things, a fingermark to be compared against a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar yet come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be used in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation which may be more detailed than that obtained by applying rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis which is known to be true. Moreover, they should support this hypothesis more and more strongly as information is added in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors which influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of minutiae. The results show that likelihood ratios derived from the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger when they in fact came from different fingers is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained when the two impressions come from the same finger are on average of the order of 100, 1,000, and 10,000 for 6, 7, and 8 minutiae. These likelihood ratios can therefore be an important aid for decision making. Both positive effects of adding minutiae (a drop in the rate of likelihood ratios which could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
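The score-based likelihood ratio described in this abstract can be sketched in a few lines. This is a minimal illustration, not the thesis's actual model: the within-finger and between-finger AFIS score distributions are stood in for by Gaussian densities with invented parameters.

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian density; a stand-in for the fitted score distributions."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(score, within_params, between_params):
    """LR = P(score | same finger) / P(score | different fingers)."""
    return normal_pdf(score, *within_params) / normal_pdf(score, *between_params)

# Invented parameters: within-finger comparison scores are high and tight,
# between-finger scores are lower and broader.
within = (5000.0, 400.0)    # (mean, sd) of same-finger AFIS scores
between = (1200.0, 600.0)   # (mean, sd) of different-finger AFIS scores

lr = likelihood_ratio(3500.0, within, between)
```

An LR above 1 supports the same-finger hypothesis; adding minutiae shifts the within-finger score distribution upward, which is what drives the growing LRs (100, 1,000, 10,000) reported above.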
Abstract:
Hepatitis B virus (HBV) and hepatitis C virus (HCV) infections pose major public health problems because of their worldwide prevalence. Consequently, screening for these infections is an important part of routine laboratory activity. Serological and molecular markers are key elements in diagnosis, prognosis, and treatment monitoring for HBV and HCV infections. Today, automated chemiluminescence immunoassay (CLIA) analyzers are widely used for virological diagnosis, particularly in high-volume clinical laboratories. Molecular biology techniques are routinely used to detect and quantify viral genomes, as well as to analyze their sequence in order to determine their genotype and detect resistance to antiviral drugs. Real-time PCR, which provides high sensitivity and a broad dynamic range, has gradually replaced other signal and target amplification technologies for the quantification and detection of nucleic acids. Next-generation DNA sequencing techniques are still restricted to research laboratories. The serological and molecular marker methods available for HBV and HCV are discussed in this article, along with their utility and limitations for use in chronic hepatitis B (CHB) diagnosis and monitoring.
Abstract:
Final report produced by the DOT on the development of a manual crack quantification and automatic crack measurement system.
Abstract:
Land evaluation is the process of estimating the potential use of land on the basis of its attributes. A wide variety of analytical models can be used in this process. In Brazil, the two most widely used land evaluation systems are the Land Use Capability Classification System and the FAO/Brazilian System of Agricultural Aptitude of Lands. Although they differ in several respects, both require the cross-referencing of numerous environmental variables. ALES (Automated Land Evaluation System) is a computer program for building expert systems for land evaluation. The entities evaluated by ALES are mapping units, which may be generalized or detailed in character. The area covered by this evaluation comprises the Chapecó and Xanxerê microregions in western Santa Catarina State and encompasses 54 municipalities. Data on soils and landscape characteristics were obtained from the reconnaissance soil survey of the State, at a scale of 1:250,000. The present study developed the expert system ATOSC (Avaliação das Terras do Oeste de Santa Catarina, Land Evaluation of Western Santa Catarina); its construction included the definition of the requirements of the land utilization types and the subsequent comparison of these requirements with the attributes of each mapping unit. The land utilization types considered were common bean, maize, soybean, and wheat, grown as single crops under the rainfed and management conditions characteristic of these crops in the State. The information on natural resources comprises the climate, soil, and landscape attributes that affect the production of these crops. For each land utilization type, the code, the name, and the respective land use requirements were specified in ATOSC. The requirements of each crop were defined by a specific combination of the selected land characteristics, which determines the severity level of each requirement with respect to the crop.
Four severity levels were established, indicating an increasing degree of limitation or a decreasing potential for a given land use type: no or slight limitation (favorable); moderate limitation (moderately favorable); strong limitation (marginally favorable); and very strong limitation (unfavorable). The decision tree, the basic component of the expert system, implements the rules that assign the land to defined suitability classes based on the quality of the requirements for each use type. ATOSC facilitated the comparison between the land characteristics of the Chapecó and Xanxerê microregions and the use requirements considered, since it performs the land evaluation automatically, thus reducing the time spent in this process. The lands of the Chapecó and Xanxerê microregions fell, for the most part, into the marginally favorable (3) and unfavorable (4) suitability classes for the crops considered. The main limiting factors identified in these microregions were natural fertility and erosion risk for common bean and maize, and mechanization conditions and erosion risk for soybean and wheat.
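A decision-tree rule base of this kind can be sketched as below. The severity levels and class labels come from the abstract; the maximum-limitation rule (the worst severity sets the class) is a common convention in land evaluation and is an assumption here, not necessarily ATOSC's exact decision tree, and the mapping-unit attributes are invented.

```python
# Severity levels from the abstract: 1 = null/slight, 2 = moderate,
# 3 = strong, 4 = very strong limitation.
SUITABILITY = {1: "favorable", 2: "moderately favorable",
               3: "marginally favorable", 4: "unfavorable"}

def evaluate(severities):
    """Maximum-limitation rule (an assumption): the worst severity among
    the crop's land use requirements sets the suitability class."""
    worst = max(severities.values())
    return worst, SUITABILITY[worst]

# Hypothetical mapping unit evaluated for maize:
unit = {"natural fertility": 3, "erosion risk": 3, "mechanization": 2}
cls, label = evaluate(unit)  # -> (3, "marginally favorable")
```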
Abstract:
The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects FWD data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system for rapid processing of the FWD data, along with a user manual. The software system automatically reads the raw FWD data collected by the JILS-20 type FWD machine that the Iowa DOT owns, then processes and analyzes the collected data with the rapid prediction algorithms developed during the phase I study. This system smoothly integrates the FWD data analysis algorithms with the computer program used to collect the pavement deflection data. It can be used to assess pavement condition, estimate remaining pavement life, and eventually help the Iowa DOT pavement management team assess pavement rehabilitation strategies. This report describes the developed software in detail and can also be used as a user manual for conducting simulation studies and detailed analyses.
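The report's phase I prediction algorithms are not reproduced in this abstract. As a flavor of FWD deflection-basin processing, the sketch below computes the widely used AREA basin parameter, assuming the common 0, 12, 24, and 36 inch sensor offsets; this is a standard index, not necessarily the one used in the Iowa DOT system.

```python
def area_parameter(d0, d12, d24, d36):
    """AREA basin parameter (inches) from deflections measured at
    0, 12, 24, and 36 in. offsets: the trapezoidal area under the
    deflection basin normalized by the center deflection d0."""
    return 6.0 * (1.0 + 2.0 * d12 / d0 + 2.0 * d24 / d0 + d36 / d0)

# Hypothetical deflections (mils) from a single FWD drop:
area = area_parameter(10.0, 8.0, 6.0, 4.0)
# A perfectly uniform basin (rigid behavior) gives the maximum value, 36.
```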
Abstract:
Kansas State University, with funding from the Kansas Department of Transportation (KDOT), has developed a computerized reduction system for profilograms produced by mechanical profilographs. The commercial version of the system (ProScan (trademark)) is marketed by Devore Systems, Inc. The system consists of an IBM-compatible PC (486SX33 or better), an Epson LQ-570 printer, a Logitech ScanMan 32 hand scanner, a paper transport unit, and the ProScan software. The scanner is not adaptable to IBM computers with the Micro Channel architecture. The Iowa DOT Transportation Centers could realize the following advantages by using ProScan: (1) save about 5 to 8 staff hours of reduction and reporting time per Transportation Center per week for a Materials Technician 3 or 4 (the time savings would come during the busiest part of the season); (2) reduce errors in the reduction, transfer, and typing of profile values; (3) increase the accuracy of the monitor results; and (4) allow rapid evaluation of contractor traces when tolerance limits between monitor and certified results are exceeded.
Abstract:
An assay for the simultaneous analysis of pharmaceutical compounds and their metabolites from micro whole-blood samples (i.e., 5 μL) was developed using an on-line dried blood spot (on-line DBS) device coupled with hydrophilic interaction/reversed-phase (HILIC/RP) LC/MS/MS. Filter paper is directly integrated into the LC device using a homemade stainless-steel desorption cell. Without any sample pretreatment, analytes are desorbed from the paper towards an automated system of valves linking a zwitterionic HILIC column to an RP C18 column. In the same run, the polar fraction is separated on the zwitterionic HILIC column while the non-polar fraction is eluted on the RP C18 column. Both fractions are detected by IT-MS operating in full-scan mode for the survey scan and in product-ion mode for the dependent scan, using an ESI source. The procedure was evaluated by the simultaneous qualitative analysis of four probe compounds and their phase I and II metabolites spiked in whole blood. In addition, the method was successfully applied to the in vivo monitoring of buprenorphine metabolism after an intraperitoneal injection of 30 mg/kg in adult female Wistar rats.
Abstract:
A geophysical and geochemical study was conducted in a fractured carbonate aquifer located at Combioula in the southwestern Swiss Alps, with the objective of detecting and characterizing hydraulically active fractures along a 260-m-deep borehole. Hydrochemical analyses, borehole diameter, temperature, and fluid electrical conductivity logging data were integrated in order to relate electrokinetic self-potential signals to groundwater flow inside the fracture network. The results show a generally good, albeit locally variable, correlation between variations of the self-potential signals and variations in temperature, fluid electrical conductivity, and borehole diameter. Together with the hydrochemical evidence, which proved critical for the interpretation of the self-potential data, these measurements made it possible not only to detect the hydraulically active fractures but also to characterize them as zones of fluid gain or fluid loss. The results complement the available information from the corresponding litholog and illustrate the potential of electrokinetic self-potential signals, in conjunction with temperature, fluid electrical conductivity, and hydrochemical analyses, for the characterization of fractured aquifers, and thus may offer a perspective for an effective quantitative characterization of this increasingly important class of aquifers and geothermal reservoirs.
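The cross-log correlation described above can be sketched numerically. This is an illustration only: the depth samples below are invented, and a plain Pearson coefficient is just one simple way to summarize how well two depth-aligned logs co-vary.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two depth-aligned borehole logs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented depth-ordered samples; coincident anomalies in both logs
# would point to a hydraulically active fracture zone.
sp = [1.2, 1.1, 0.4, 0.3, 1.0, 1.1]          # self-potential, mV
temp = [18.0, 18.1, 17.2, 17.1, 18.0, 18.2]  # temperature, deg C
r = pearson(sp, temp)
```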
Abstract:
PURPOSE: EOS (EOS imaging S.A., Paris, France) is an x-ray imaging system that uses slot-scanning technology to optimize the trade-off between image quality and dose. The goal of this study was to characterize the EOS system in terms of occupational exposure, organ doses to patients, and image quality for full-spine examinations. METHODS: Occupational exposure was determined by measuring the ambient dose equivalents in the radiological room during a standard full-spine examination. Patient dosimetry was performed using anthropomorphic phantoms representing an adolescent and a five-year-old child. The organ doses were measured with thermoluminescent detectors and then used to calculate effective doses. Patient exposure with EOS was then compared to dose levels reported for conventional radiological systems. Image quality was assessed in terms of spatial resolution and the different noise contributions, to evaluate the detector performance of the system. The spatial-frequency signal transfer efficiency of the imaging system was quantified by the detective quantum efficiency (DQE). RESULTS: The use of a protective apron is recommended when the medical staff or parents have to stand near the cubicle in the radiological room. The estimated effective dose to patients undergoing a full-spine examination with the EOS system was 290 μSv for an adult and 200 μSv for a child. The MTF and NPS are non-isotropic, with higher values in the scanning direction; they are also energy-dependent, but independent of scanning speed. The system was shown to be quantum-limited, with a maximum DQE of 13%. The relevance of the DQE for slot-scanning systems is also addressed. CONCLUSIONS: In summary, the estimated effective dose was 290 μSv for an adult, and the image quality remains comparable to that of conventional systems.
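The DQE combines the quantities named above. A minimal sketch of the standard expression DQE(f) = MTF(f)^2 / (q · NNPS(f)) follows; the frequency samples, NNPS values, and fluence are invented (not the study's data), chosen only so that the zero-frequency DQE lands near the 13% reported.

```python
def dqe(mtf, nnps, fluence):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)): the standard expression with
    normalized noise power spectrum NNPS and input photon fluence q."""
    return [m ** 2 / (fluence * n) for m, n in zip(mtf, nnps)]

# Invented values at three spatial frequencies (low to high):
mtf = [1.00, 0.60, 0.30]
nnps = [2.5e-5, 2.0e-5, 1.8e-5]  # mm^2
q = 3.0e5                        # photons per mm^2

curve = dqe(mtf, nnps, q)  # curve[0] is about 0.13, i.e. ~13%
```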
Abstract:
This master's thesis investigates automated testing and ways of making user-interface testing easier on the Symbian operating system. The thesis introduces Symbian and the challenges encountered in Symbian application development. Testing strategies and methods, as well as automated testing, are also discussed. Finally, a tool is presented that makes it easier to create test cases for functional and system testing. Graphical user interfaces pose unique challenges for software testing: they are often built from complex components and are continually redesigned during software development. Capture-and-replay tools are commonly used for testing graphical user interfaces, but designing and implementing UI test cases requires considerable effort. Since graphical user interfaces make up a large share of the code, substantial resources could be saved by making test-case creation easier. The project implemented in the practical part pursues this goal by making the creation of test scripts visual. As a result, the scripting language of the tests does not need to be understood, and the tests are also easier to comprehend.
Abstract:
This master's thesis presents general principles of software testing and verification, and discusses the verification of smartphone software in more detail. The Symbian operating system used in smartphones is also introduced. In the practical part of the work, a server running on the Symbian operating system was designed and implemented to monitor and record the use of system resources. Verification is an important and costly task in the smartphone software development cycle, and costs can be reduced by automating part of the verification process. The implemented server automates the monitoring of system resources by recording data about them to a file while tests are run. When the tests are run again, the new results are compared against the baseline recording. If the results are not within the error margins set by the user, the user is notified. Determining the error margins and the baseline recording may prove difficult; however, if they are defined appropriately, the server provides testers with useful information about deviations in system resource consumption.
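The baseline-comparison step described above can be sketched as follows. This is an illustration, not the thesis's Symbian implementation: the metric names, units, and the relative-deviation check are invented for the example.

```python
def check_against_baseline(baseline, current, tolerance=0.10):
    """Flag metrics whose relative deviation from the baseline recording
    exceeds the tolerance (the user-set error margin)."""
    failures = {}
    for metric, ref in baseline.items():
        deviation = abs(current[metric] - ref) / ref
        if deviation > tolerance:
            failures[metric] = deviation
    return failures

# Hypothetical recordings from two test runs:
baseline = {"heap_kb": 2048, "cpu_ms": 350}
current = {"heap_kb": 2100, "cpu_ms": 520}

failed = check_against_baseline(baseline, current)  # flags only "cpu_ms"
```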
Abstract:
The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for platform-based design do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction.
An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
Abstract:
The aim of this study was to identify the factors underlying the successful deployment of automated ordering systems in the retail sector and to find a solution for deploying such systems successfully in this environment. The study analyzed the system deployment, and its results, in more than one hundred stores. Experts on the procurement system and its rollout, both inside and outside the company, were interviewed for the study. In addition, questionnaires were sent to the stores that had deployed the system, and the responses were analyzed in groups on the basis of data from the automated ordering system. As a result of the work, a specific set of underlying factors to be taken into account in a deployment was identified, and, based on the research results, a model was developed for deploying similar systems in the retail sector.
Abstract:
The development of new procedures for quickly obtaining accurate information on the physiological potential of seed lots is essential for developing quality control programs for the seed industry. In this study, the effectiveness of an automated system of seedling image analysis (Seed Vigor Imaging System, SVIS) in determining the physiological potential of sun hemp seeds, and its relationship with electrical conductivity tests, was evaluated. SVIS evaluations were performed three and four days after sowing, and data on the vigor index and on the length and uniformity of seedling growth were collected. The electrical conductivity test was performed on replicates of 50 seeds placed in containers with 75 mL of deionized water at 25 °C, with readings taken after 1, 2, 4, 8, and 16 hours of imbibition. Electrical conductivity measurements at 4 or 8 hours, and the use of SVIS on 3-day-old seedlings, can effectively detect differences in vigor between sun hemp seed lots.