877 results for Analysis tools


Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Although equines have taken part in the formation and development of several civilizations around the world since their domestication 6,000 years ago, few studies in animal breeding have addressed the species in comparison with other species of zootechnical interest, especially in Brazil. Some reasons for this are difficulties associated with the species as well as operational aspects. However, developments in genetics in recent decades have contributed to a better understanding of the traits related to reproduction, health, behavior and performance of domestic animals, including equines. Recent technologies such as next-generation sequencing methods and high-density SNP genotyping chips have allowed advances on the research already done. These studies have relied mainly on the candidate gene strategy and have identified genomic regions related to diseases and syndromes and, more recently, to performance in sport competitions and specific abilities. Using these genomic analysis tools, some regions related to racing performance have been identified and, based on this information, genetic tests to select superior animals for racing performance have started to become available on the market.

Relevance:

60.00%

Publisher:

Abstract:

The increasing expansion of agricultural activities without considering the potential and limitations of soils is a potential source of environmental degradation. Thus, the present study assessed the variation in land use and occupation over 49 years, comparing the 1962 and 2011 scenarios of the São Caetano watershed in Botucatu (SP). Geoprocessing techniques were used in this study. In a Geographic Information System (GIS), IDRISI, information from IBGE digital maps at 1:50,000 scale was integrated with aerial photographs (1962) and LANDSAT-5 satellite images (2011). In the study area we can observe the advance of the urban area, which in 1962 was not present in the watershed and in 2011 occupied 21.37% of the total area. Even with this advance over the 49-year period, there was an increase in the area of natural vegetation, which occupied only 12.33% of the area in 1962 and in 2011 represented 25% of the total area of the watershed, showing an increase in awareness of the importance of preserving nature. Thus, we can conclude that GIS-based analysis tools enabled us to analyze variations in space and time and to propose alternatives for the correct use and occupation of the land.
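
The quantitative core of the comparison above is a per-class area tabulation for each date. The short Python sketch below illustrates that step on toy arrays; the class codes and random scenes are assumptions for illustration only (the study itself used the IDRISI GIS, not Python).

import numpy as np

# Hypothetical class codes (assumed for illustration; not from the study).
CLASSES = {1: "urban", 2: "natural vegetation", 3: "agriculture/pasture"}

def class_percentages(raster):
    """Share of each land-use class as a percentage of the classified area."""
    total = raster.size
    return {name: 100.0 * np.count_nonzero(raster == code) / total
            for code, name in CLASSES.items()}

# Toy stand-ins for the classified 1962 aerial photographs and 2011 LANDSAT-5 scene.
rng = np.random.default_rng(seed=0)
scene_1962 = rng.choice([2, 3], size=(200, 200), p=[0.12, 0.88])
scene_2011 = rng.choice([1, 2, 3], size=(200, 200), p=[0.21, 0.25, 0.54])

for year, scene in (("1962", scene_1962), ("2011", scene_2011)):
    shares = class_percentages(scene)
    print(year, {name: round(pct, 2) for name, pct in shares.items()})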

Relevance:

60.00%

Publisher:

Abstract:

Background: A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. The most common way to decide whether a given probability is sufficient is Bayesian binary classification, in which the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model, typically a null model. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final classification result. In particular, we are interested in minimizing the number of false predictions in a classification, a crucial issue for reducing the cost of biological validation. Results: In all tests with random sequences, the target null model presented the lowest number of false positives. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study was performed on randomly generated sequences. Previous studies were performed on amino acid sequences, used only one probabilistic model (HMM) and a specific benchmark, and therefore lack more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions: Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
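
As an illustration of the scoring scheme discussed above, the following Python sketch compares a hypothetical family-model log-probability against two of the position-independent null models mentioned (uniform and target-sequence composition); the sequence and the family-model score are invented for the example.

import math
from collections import Counter

def log_odds(seq, family_log_prob, null_freqs):
    """Log-odds score: log P(seq | family model) - log P(seq | i.i.d. null model)."""
    null_log_prob = sum(math.log(null_freqs[base]) for base in seq)
    return family_log_prob - null_log_prob

seq = "GCGCGCGCATGCGCGG"          # toy candidate with strong GC bias

# Position-independent null models discussed in the abstract.
uniform_null = {b: 0.25 for b in "ACGT"}
counts = Counter(seq)
target_null = {b: counts[b] / len(seq) for b in "ACGT"}   # target-sequence composition

family_log_prob = -18.0          # hypothetical log P(seq | family model), e.g. from an HMM

print("uniform null:", log_odds(seq, family_log_prob, uniform_null))
print("target  null:", log_odds(seq, family_log_prob, target_null))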

Relevance:

60.00%

Publisher:

Abstract:

The recent introduction of free-form NC machining in the ophthalmic field has involved a full review of the former product development process, from both the design and the manufacturing viewpoint. The aim of the present work is to investigate and set up innovative methods and tools supporting product development, particularly for lenses characterized by free-form geometry, e.g. progressive lenses. In the design stage, the research addressed the geometric modeling of complex lens shapes and the relevant analysis tools for the optical-geometrical characterization of the produced models. In the manufacturing stage, the main interest was focused on the set-up of the fabrication process, particularly the NC machining process, for which an integrated CAD/CAM software tool was developed for the generation and simulation of the machining cycle. The methodologies and tools made available by the present work are currently used in the development of new complex-geometry product typologies such as progressive lenses.

Relevance:

60.00%

Publisher:

Abstract:

This thesis comes after a strong contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the road from Monte Carlo production and analysis tools realization to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of tt events in the fully hadronic decay channel in the CMS experiment. A multi-jet trigger has been provided to fix a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection has been developed to improve the S/B ratio, and b-tagging is applied to provide a further S/B improvement. The selection is applied to the background sample and to samples generated at different top quark masses. The top quark mass candidate is reconstructed for all those samples using a kinematic fitter. The resulting distributions are used to build p.d.f.s, interpolating them with a continuous arbitrary curve. These curves are used to perform the top mass measurement through a likelihood comparison.
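
The measurement strategy described above can be summarized in a few lines: templates of the reconstructed mass built at several generated top masses are compared with the observed distribution, and the template with the maximum likelihood (minimum -log L) gives the measured mass. The sketch below illustrates the idea with Gaussian stand-ins for the real p.d.f.s; all numbers are illustrative, not results from the thesis.

import numpy as np

rng = np.random.default_rng(seed=1)

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical templates: reconstructed-mass p.d.f.s for samples generated at
# different top-quark masses (in the real analysis these come from the kinematic
# fit on simulated samples, interpolated with a continuous curve).
template_masses = np.array([165.0, 170.0, 172.5, 175.0, 180.0])   # GeV
resolution = 12.0                                                  # GeV, illustrative

# Pseudo-data: reconstructed masses of selected candidate events.
data = rng.normal(loc=172.5, scale=resolution, size=500)

def neg_log_likelihood(mtop):
    return -np.sum(np.log(gaussian_pdf(data, mtop, resolution)))

nll = np.array([neg_log_likelihood(m) for m in template_masses])
best = template_masses[np.argmin(nll)]
print("measured top mass (template with minimum -log L):", best, "GeV")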

Relevance:

60.00%

Publisher:

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, in order to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage becomes more uncertain as transistor size scales down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage limits the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions reduce transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. Those effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and system techniques able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the thermal behavior of the system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with the future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on tunable parameters such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated in a cycle-accurate NoC simulator and in an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the integration of thermal analysis in the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor, confirming the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variation. In this case we found that low-swing links are notably robust to systematic process variation and respond well to compensation techniques such as ASV and ABB. Hence low-swing signaling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work demonstrates the advantage of integrating a statistical process variation analysis tool in the first stages of the design flow.
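
To illustrate what a statistical process variation analysis of the kind described above computes, the following Monte Carlo sketch samples a systematic (die-to-die) and a random (per-stage) threshold-voltage component and propagates them to a path delay and a timing-yield estimate; the delay model and all parameter values are assumptions for illustration, not the tool developed in the thesis.

import numpy as np

rng = np.random.default_rng(seed=7)

# Illustrative alpha-power delay model: delay ~ Vdd / (Vdd - Vth)^alpha.
# Parameter values are hypothetical, not from the thesis.
VDD, VTH_NOM, ALPHA = 1.0, 0.30, 1.3
N_DIES, N_STAGES = 2000, 16

def stage_delay(vth, vdd=VDD):
    return vdd / (vdd - vth) ** ALPHA

# Systematic variation: one correlated shift per die (e.g. lithography, die-to-die).
systematic = rng.normal(0.0, 0.015, size=(N_DIES, 1))
# Random variation: independent per stage (e.g. dopant fluctuation).
random_var = rng.normal(0.0, 0.020, size=(N_DIES, N_STAGES))

vth = VTH_NOM + systematic + random_var
path_delay = stage_delay(vth).sum(axis=1)        # critical path = sum of stage delays

nominal = N_STAGES * stage_delay(VTH_NOM)
print("nominal path delay (a.u.):", round(nominal, 3))
print("mean / 3-sigma delay     :", round(path_delay.mean(), 3),
      "/", round(path_delay.mean() + 3 * path_delay.std(), 3))
print("yield at +10% timing budget:",
      round(100 * np.mean(path_delay <= 1.10 * nominal), 1), "%")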

Relevance:

60.00%

Publisher:

Abstract:

Adhesive bonding provides solutions for realizing cost-effective, low-weight aircraft fuselage structures, in particular where Damage Tolerance (DT) is the design criterion. Bonded structures that combine Metal Laminates (MLs) and possibly selective reinforcements can guarantee slow crack propagation, crack arrest and large damage capability. Optimizing a design to exploit the benefit of bonded structures incorporating selective reinforcement requires reliable analysis tools. The effect of bonded doublers / selective reinforcements is very difficult to predict numerically or analytically, due to the complexity of the underlying mechanisms and failure modes involved. Reliable predictions of crack growth and residual strength can only be based on sound empirical and phenomenological considerations strictly related to the specific structural concept. Large flat stiffened panels that combine MLs and selective reinforcements have been tested with the purpose of investigating solutions applicable to pressurized fuselages. The large test campaign (35 stiffened panels in total) quantitatively investigated the role of the different metallic skin concepts (monolithic vs. MLs), of the aluminum, titanium and glass-fiber reinforcements, of the stringer materials and cross sections, and of the geometry and location of the doublers / selective reinforcements. Bonded doublers and selective reinforcements were confirmed to be outstanding tools for improving the DT properties of structural elements with a minor weight increase. However, the choice of proper materials for the skin and the stringers must not be underestimated, since they play an important role as well. A fuselage structural concept has been developed to exploit the benefit of a metal laminate design in terms of high Fatigue and Damage Tolerance (F&DT) performance. The structure uses a laminated skin (0.8 mm thick), bonded stringers, two different splicing solutions and selective reinforcements (glass prepreg embedded in the laminate) under the circumferential frames. To validate the design concept, a curved panel was manufactured and tested under loading conditions representative of a single-aisle fuselage: cyclic internal pressurization plus longitudinal loads. The geometry of the panel, the design and the loading conditions were tailored to the requirements of the upper front fuselage. The curved panel was fatigue tested for 60,000 cycles before the introduction of artificial damages (cracks in the longitudinal and circumferential directions). The crack growth of the artificial damages was investigated for about 85,000 cycles. At the end, a residual strength test was performed with a “2 bay over broken frame” longitudinal crack. The reparability of this innovative concept was taken into account during design and demonstrated with the use of an external riveted repair. The F&DT curved panel test confirmed that a long fatigue life and high damage tolerance can be achieved with a hybrid metal laminate low-weight configuration. The superior fatigue life of metal laminates and the high damage tolerance provided by integrated selective reinforcements are the key concepts behind the excellent performance. The weight comparison between the innovative bonded concept and a conventional monolithic riveted design solution showed a significant potential weight saving, but the weight advantages must be traded off against the additional costs.

Relevance:

60.00%

Publisher:

Abstract:

This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins with an explanation of the reasons why none of the available tools appears to satisfy the requirements of the user community and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of the features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison with another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then, an analysis of how users interacted with the corpora to complete the task and what kind of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.

Relevance:

60.00%

Publisher:

Abstract:

The aim of the present work was the comparative sequencing and subsequent analysis of the syntenic chromosomal segment on the short arm of human chromosome 11, in region 11p15.3, containing the genes LMO1 and TUB, and of the orthologous genomic segment of the mouse on chromosome 7 F2. The mapping of these two chromosomal regions carried out in this work allowed the completion of a genomic map totalling more than one megabase, produced within the collaborative sequencing project of the University Children's Hospital and the Institute of Molecular Genetics in Mainz. Using 28 PAC and cosmid clones, 383 kb of human genomic DNA were covered in this work, and 412 kb of mouse genomic DNA were covered with six BAC and PAC clones. This made it possible for the first time to establish the exact order of the genes contained in this chromosomal segment and to map precisely eight human STS markers and four mouse STS probes. It was shown that the telomeric/centromeric orientation of the orthologous region in the mouse is inverted compared to the human. The sequencing of three human clones yielded 319,119 bp of contiguous genomic DNA. This allowed the exact localization and structural elucidation of the genes LMO1, a putative tumor suppressor gene associated with the development of leukemias, and TUB, a transcription modulator involved in the regulation of fat metabolism. For the murine genome, 412,827 bp of new DNA sequence were generated, likewise by sequencing three clones. This genomic region, about 100 kb larger than its human counterpart, also contains the new genes Stk33 and Eif3, two genes that were only discovered and characterized in the course of this work. The parallel processing of both genomic regions enabled a comprehensive comparative analysis of coding, functional and structural sequence segments in both species. The exon-intron structures of the genes LMO1/Lmo1 and TUB/Tub were resolved for both organisms. In addition, four new exons and two new species-specific splice variants of TUB/Tub were described. The identification of these new splice variants reveals new possibilities for alternative regulation and function, or for an altered protein structure, offering further explanatory approaches for the development of the diseases associated with these genes. In the larger sequenced mouse genomic sequence, the new gene Eif3, with its exon-intron structure, and the last two exons 11 and 12 of the gene Stk33 were mapped and characterized in the flanking region that does not overlap with the sequenced human sequence. The extensive sequence analysis of both sequenced genomic regions yielded a total of 229 potential exon sequences for the human segment and 527 possible exon regions for the mouse segment. Of these, 21 exons in the human and 31 exons in the mouse were explicitly identified as expressed regions and experimentally verified by RT-PCR or cDNA sequencing. These segments covered not only the exon regions of the four genes mentioned above, but could also be assigned to new, not further defined EST sequences. The interspecies comparison also made it possible to analyze the non-coding intergenic regions.
For example, seven sequence regions with about 90% conservation were identified in the first intron of LMO1/Lmo1. The characterization of promoter and putatively regulatory sequence segments was also carried out with the help of different bioinformatic analysis tools. The conserved DNA sequence regions show an average homology of more than 65%. The genome organization also showed similarities, which usually differed only in degree: a region of almost 80 kb proximal to the human TUB gene shows a markedly elevated AT content, which also appears in the murine genome, but only in a shortened and weaker form. The additional comparative analysis with a further species, the orthologous genomic segments of Fugu, showed that the investigated genes LMO1 and TUB are highly conserved, evolutionarily old genes whose pattern of genomic organization is also found in the paralogous gene family members within the same species. Overall, the mapping, sequencing and analysis generated a comprehensive data basis for the genomic region considered and the genes described, which provides valuable information for future investigations and questions.
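
A sliding-window identity scan is one simple way to locate conserved blocks like the ~90% conserved regions reported in the first intron of LMO1/Lmo1. The Python sketch below shows the principle on a toy human/mouse alignment; the sequences and thresholds are invented and are not data from this work.

# Illustrative sliding-window conservation scan over a pairwise alignment.
def window_identity(seq_a, seq_b, window=20, step=5):
    """Yield (start, percent identity) for each window of the aligned pair."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    for start in range(0, len(seq_a) - window + 1, step):
        a, b = seq_a[start:start + window], seq_b[start:start + window]
        matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
        yield start, 100.0 * matches / window

# Toy aligned fragments standing in for the human and mouse intron sequences.
human = "ATGGCGT-ACCTGAAGGTCCGTATAGCCGATTTGCAAGTACCGATCGGA"
mouse = "ATGGCGTTACCTGAAGGGCCGTATAGCAGATTTACAAGTTCCGATCGGA"

conserved = [(s, pid) for s, pid in window_identity(human, mouse) if pid >= 90.0]
print("windows with >= 90% identity:", conserved)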

Relevance:

60.00%

Publisher:

Abstract:

Since the beginning of human history, people have influenced their environment. Anthropogenic emissions change the composition of the atmosphere, with a growing impact on, among other things, atmospheric chemistry, the health of humans, flora and fauna, and the climate. The increasing number of huge, growing metropolises goes along with a spatial concentration of air pollutant emissions, which above all affects the air quality of the rural regions downwind. In this doctoral thesis, carried out within the MEGAPOLI project, the emission plume of the megacity Paris was investigated using the mobile aerosol research laboratory MoLa. MoLa is equipped with modern instruments of high temporal resolution for measuring the chemical composition and size distribution of aerosol particles as well as several trace gases. Mobile measurement strategies particularly suited to characterizing urban emissions were developed and applied. Cross-section drives through the emission plume and through atmospheric background air masses allowed both the determination of the structure and homogeneity of the plume and the calculation of the contribution of the urban emissions to the total atmospheric burden. Quasi-Lagrangian radial drives served to explore the spatial extent of the plume and the transformation processes of the advected pollutants. In combination with modeling, the structure of the plume could be investigated in more depth. Flexible stationary measurements complemented the data set and also allowed comparison measurements with other measurement stations. Data from a fixed measurement station were additionally used to describe the aging of the organic particle fraction. The analysis of the mobile measurement data required the development of a new method for cleaning the data set of local interference. Furthermore, the possibilities, limits and errors in the application of complex analysis programs for calculating the O/C ratio of the particles and for classifying the organic aerosol were examined. A validation of different methods for determining the origin of the air masses was also necessary for the analysis. The detailed investigation of the Paris emission plume showed that it can be identified by the increase in the concentrations of indicators of unprocessed air pollution compared to background values. Its rather homogeneous structure can mostly be described by a Gaussian shape in the cross section, with an exponential decrease of the unprocessed pollutant concentrations with increasing distance from the city, mainly caused by turbulent mixing with ambient air masses. It could be shown that a clear oxidation of the organic aerosol takes place in the advected plume in summer, whereas in winter this process could not be observed during the measurements carried out. In both seasons the plume consists mainly of soot and organic particle components in the PM1 size range, with traffic and cooking, plus heating in the cold season, as the dominant sources. Due to the urban emissions, the PM1 particle mass in the plume increased compared to the background value by, on average, 30% in summer and 10% in winter.
Particularly strong increases were observed for polyaromatics, with an average increase of 194% in summer and 131% in winter. Seasonal differences were also found in the size distribution of the plume particles: in winter, in contrast to summer, no additional newly nucleated small particles appeared, only particles grown by condensation and coagulation to sizes between about 10 nm and 200 nm. The trace gas concentrations also differed, since chemical reactions depend on temperature and, in some cases, on radiation. Further application possibilities of MoLa were demonstrated during a transfer drive from Germany to the Spanish Atlantic coast, which resulted in a mapping of air quality along the route. It was found that mainly urban agglomerations are affected by unprocessed air pollutants, whereas advected aged substances can influence every region. The investigation of air quality at sites with different exposure to anthropogenic sources extended this statement by an insight into the variation of air quality depending, among other things, on the weather situation and the proximity to emission sources. This showed that the measurement strategies and analysis methods developed can be applied not only to the investigation of the emission plume of a large city but also to various other scientific and environmental monitoring questions.
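
The plume description above (a Gaussian enhancement above background along a crosswind transect) can be turned into numbers by a simple curve fit. The sketch below does this on an invented transect using scipy; the functional form follows the qualitative description in the abstract, while all values are illustrative, not MoLa data.

import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(seed=3)

def plume_cross_section(y_km, background, peak, center, width):
    """Gaussian enhancement above background along a transect through the plume."""
    return background + peak * np.exp(-0.5 * ((y_km - center) / width) ** 2)

# Toy transect: crosswind position (km) vs. an unprocessed-pollutant proxy,
# standing in for one cross-section drive (values are invented).
y = np.linspace(-30, 30, 61)
observed = plume_cross_section(y, 8.0, 12.0, 2.0, 6.0) + rng.normal(0, 0.8, y.size)

popt, _ = curve_fit(plume_cross_section, y, observed, p0=[5.0, 5.0, 0.0, 5.0])
background, peak, center, width = popt
print(f"background ~ {background:.1f}, plume enhancement ~ {peak:.1f} "
      f"(~{100 * peak / background:.0f}% above background), width ~ {width:.1f} km")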

Relevance:

60.00%

Publisher:

Abstract:

In any terminological study, candidate term extraction is a very time-consuming task. Corpus analysis tools have automated some of these processes, allowing the detection of relevant data within texts and thus facilitating term candidate selection. Nevertheless, these tools are normally not specific to terminology research; therefore, the units which are automatically extracted need manual evaluation. Over the last few years some software products have been developed specifically for automatic term extraction. They are based on corpus analysis, but use linguistic and statistical information to filter data more precisely, so the time needed for manual evaluation is reduced. In this framework, we tried to understand if and how these new tools can really be an advantage. To develop our project, we simulated a terminology study: we chose a domain (i.e. the legal framework for medicinal products for human use) and compiled a corpus from which we extracted terms and phraseologisms using AntConc, a corpus analysis tool. Afterwards, we compared our list with the lists extracted automatically by three different tools (TermoStat Web, TaaS and Sketch Engine) in order to evaluate their performance. In the first chapter we describe some principles relating to terminology and phraseology in languages for special purposes and show the advantages offered by corpus linguistics. In the second chapter we illustrate some of the main concepts of the selected domain, as well as some of the main features of legal texts. In the third chapter we describe automatic term extraction and the main criteria for evaluating it; moreover, we introduce the term-extraction tools used in this project. In the fourth chapter we describe our research method and, in the fifth chapter, we present our results and draw some preliminary conclusions on the performance and usefulness of term-extraction tools.
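
The comparison described above ultimately reduces to scoring each automatically extracted list against the manually validated reference list. The sketch below illustrates such a precision/recall comparison with placeholder term sets; the actual lists and scores from the thesis are not reproduced here.

# Illustrative evaluation of automatic term extractors against a manually
# validated reference list (the sets below are placeholders, not thesis data).
reference = {"marketing authorisation", "active substance", "clinical trial",
             "package leaflet", "pharmacovigilance"}

extracted = {
    "TermoStat Web": {"marketing authorisation", "active substance",
                      "member state", "clinical trial"},
    "TaaS":          {"active substance", "package leaflet", "human use"},
    "Sketch Engine": {"marketing authorisation", "clinical trial",
                      "pharmacovigilance", "medicinal product"},
}

for tool, terms in extracted.items():
    hits = terms & reference
    precision = len(hits) / len(terms)
    recall = len(hits) / len(reference)
    print(f"{tool:14s} precision={precision:.2f} recall={recall:.2f}")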

Relevance:

60.00%

Publisher:

Abstract:

Routine bridge inspections require labor-intensive and highly subjective visual interpretation to determine bridge deck surface condition. Light Detection and Ranging (LiDAR), a relatively new class of survey instrument, has become a popular and increasingly used technology for providing as-built and inventory data in civil applications. While an increasing number of private and governmental agencies possess terrestrial and mobile LiDAR systems, understanding of the technology's capabilities and potential applications continues to evolve. LiDAR is a line-of-sight instrument and, as such, care must be taken when establishing scan locations and resolution to allow the capture of data at a resolution adequate for defining the features that contribute to the analysis of bridge deck surface condition. Information such as the location, area, and volume of spalling on deck surfaces, undersides, and support columns can be derived from properly collected LiDAR point clouds. These point clouds contain information that can provide quantitative surface condition data, resulting in more accurate structural health monitoring. LiDAR scans were collected at three study bridges, each of which displayed a varying degree of degradation. A variety of commercially available analysis tools and an independently developed algorithm written in ArcGIS Python (ArcPy) were used to locate and quantify surface defects such as the location, volume, and area of spalls. The results were displayed visually and numerically in a user-friendly web-based decision support tool integrating prior bridge condition metrics for comparison. LiDAR data processing procedures, along with the strengths and limitations of point clouds for defining features useful for assessing bridge deck condition, are discussed. Point cloud density and incidence angle are two attributes that must be managed carefully to ensure the data collected are of high quality and useful for bridge condition evaluation. When collected properly to ensure effective evaluation of bridge surface condition, LiDAR data can be analyzed to provide a useful data set from which to derive bridge deck condition information.
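
One common way to quantify spalls from a deck point cloud, as described above, is to fit a reference plane to the deck and flag returns that fall well below it. The following Python/NumPy sketch illustrates that idea on a synthetic cloud; the thresholds and geometry are assumptions, and this is not the ArcPy algorithm developed in the study.

import numpy as np

rng = np.random.default_rng(seed=4)

# Synthetic deck: a gently sloping plane with noise and one 0.5 m x 0.5 m spall.
x, y = rng.uniform(0, 10, 5000), rng.uniform(0, 3, 5000)
z = 0.01 * x + 0.005 * y + rng.normal(0, 0.002, 5000)
spall = (x > 4) & (x < 4.5) & (y > 1) & (y < 1.5)
z[spall] -= 0.03                       # 3 cm deep spall

# Least-squares plane fit z = a*x + b*y + c.
A = np.column_stack([x, y, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
residual = z - (a * x + b * y + c)

flagged = residual < -0.01             # more than 1 cm below the fitted plane
cell_area = (10 * 3) / x.size          # rough per-point footprint (m^2)
print("flagged points:", int(flagged.sum()))
print("approx. spall area (m^2):", round(flagged.sum() * cell_area, 3))
print("approx. spall volume (m^3):", round(float(-residual[flagged].sum() * cell_area), 4))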

Relevance:

60.00%

Publisher:

Abstract:

Moose is a powerful reverse engineering platform, but its facilities and means to analyze software are separate from the tools developers typically use to develop and maintain their software systems: development environments such as Eclipse, VisualWorks, or Squeak. In practice, this requires developers to work with two distinct environments, one to actually develop the software and another (e.g., Moose) to analyze it. We worked on several different techniques, using both dynamic and static analyses, to provide software analysis capabilities to developers directly in the IDE. The immediate availability of analysis tools in an IDE significantly increases the likelihood that developers integrate software analysis into their daily work, as we discovered by conducting user studies with developers. Finally, we identified several important aspects of integrating software analysis in IDEs that need to be addressed in the future to increase the adoption of these techniques by developers.