935 results for Analysis Tools

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Objective: to identify patterns in the spatial and temporal distribution of dengue fever cases occurring in the city of Cruzeiro, state of São Paulo (SP). Methods: an ecological, exploratory study was undertaken using spatial analysis tools and data on dengue cases obtained from SINAN Net. The analysis was carried out by area, using the IBGE census sector as the unit, for the months of March to June of 2006 and 2011, showing the progress of the disease. TerraView 3.3.1 was used to calculate the Global Moran's I, month by month, and the kernel density estimator. Results: in 2006, 691 dengue fever cases (a rate of 864.2 cases/100,000 inhabitants) were georeferenced; Moran's I was significant in April and May (I = 0.28, p = 0.01 and I = 0.20, p = 0.01, respectively), with higher densities in the central, north, northeast and south regions. In 2011, 654 dengue fever cases (a rate of 886.8 cases/100,000 inhabitants) were georeferenced; Moran's I was again significant in April and May (I = 0.28, p = 0.01 and I = 0.16, p = 0.05, respectively), with densities in the same regions as in 2006. The Global Moran's I is a global measure of spatial autocorrelation that indicates the degree of spatial association in the data set relative to its mean; it varies between -1 and +1 and can be assigned a significance level (p-value), with a positive value indicating positive (direct) spatial autocorrelation. Conclusion: we were able to identify patterns in the spatial and temporal distribution of dengue cases occurring in the city of Cruzeiro, SP, and to locate the census sectors where the outbreak began and how it evolved.
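
Since the abstract pauses to explain the Global Moran's I, its standard formula may help; the notation below (areal units i and j, attribute values x_i such as sector-level incidence, spatial weights w_ij) is the usual textbook definition and is not taken from the paper itself.

```latex
% Global Moran's I over n areal units (e.g., census sectors), where x_i is the
% attribute of unit i (e.g., dengue incidence), \bar{x} its mean, and w_{ij}
% the spatial weight between units i and j (e.g., 1 for neighbours, 0 otherwise):
I = \frac{n}{\sum_{i}\sum_{j} w_{ij}} \cdot
    \frac{\sum_{i}\sum_{j} w_{ij}\,(x_i - \bar{x})(x_j - \bar{x})}
         {\sum_{i} (x_i - \bar{x})^{2}}
```

Values near +1 indicate clustering of similar incidence values in neighbouring sectors, values near -1 indicate dispersion, and the p-value is usually obtained by random permutation of the values over the map.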

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Social Sciences (Pós-graduação em Ciências Sociais) - FCLAR

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Although equines have taken part in the formation and development of several civilizations around the world since their domestication about 6,000 years ago, little research has been done on their breeding compared to other species of zootechnical interest, especially in Brazil. This is partly due to difficulties inherent to the species and partly to operational constraints. However, developments in genetics over the last decades have contributed to a better understanding of traits related to reproduction, health, behavior and performance in domestic animals, including equines. Recent technologies, such as next-generation sequencing and high-density SNP genotyping chips, have enabled advances in this research. Most of these studies used the candidate-gene strategy and identified genomic regions related to diseases and syndromes and, more recently, to performance in sport competition and specific abilities. Using these genomic analysis tools, regions related to racing performance have been identified and, based on this information, genetic tests to select superior animals for racing have started to become available on the market.

Relevance:

60.00%

Publisher:

Abstract:

The ongoing expansion of agricultural activities without considering the potential and limitations of soils is a source of environmental degradation. The present study therefore assessed the change in land use and occupation over 49 years, between the 1962 and 2011 scenarios, in the São Caetano watershed, Botucatu (SP), using geoprocessing techniques. In a Geographic Information System (GIS), IDRISI, information from IBGE digital maps at 1:50,000 scale was integrated with aerial photographs (1962) and LANDSAT-5 satellite images (2011). The results show the advance of the urban area, which was absent from the watershed in 1962 and occupied 21.37% of the total area in 2011. Despite this advance over the 49-year period, natural vegetation also increased, from only 12.33% of the area in 1962 to 25% of the total watershed area in 2011, suggesting growing awareness of the importance of preserving nature. We conclude that GIS-based analysis tools made it possible to analyze variations in space and time and to propose alternatives for the proper use and occupation of the land.
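
A tabulation like the percentages reported above can be reproduced with simple raster arithmetic; the sketch below assumes a classified land-cover array with hypothetical class codes, not the study's actual IDRISI layers.

```python
import numpy as np

# Hypothetical class codes for a classified land-cover raster; the study's
# actual IDRISI legend is not reproduced here.
CLASSES = {0: "other", 1: "urban", 2: "natural vegetation", 3: "agriculture/pasture"}

def class_percentages(landcover: np.ndarray) -> dict:
    """Share of the watershed covered by each land-cover class, in percent."""
    total = landcover.size
    return {name: 100.0 * np.count_nonzero(landcover == code) / total
            for code, name in CLASSES.items()}

if __name__ == "__main__":
    # Toy 2011-like raster just to exercise the function.
    rng = np.random.default_rng(0)
    scene_2011 = rng.choice([0, 1, 2, 3], size=(200, 200), p=[0.44, 0.21, 0.25, 0.10])
    for name, pct in class_percentages(scene_2011).items():
        print(f"{name}: {pct:.2f}%")
```

Comparing the 1962 and 2011 outputs of such a tabulation is what yields change figures like the growth of natural vegetation from 12.33% to 25%.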

Relevance:

60.00%

Publisher:

Abstract:

Background: A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. The most common way to decide whether a given probability is sufficient is Bayesian binary classification, in which the probability under the model characterizing the sequence family of interest is compared with that under an alternative probability model, typically a null model. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final classification result. In particular, we are interested in minimizing the number of false predictions, a crucial issue for reducing the costs of biological validation. Results: In all tests with random sequences, the target null model produced the lowest number of false positives. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study was performed using randomly generated sequences; previous studies were performed on amino acid sequences, used only one probabilistic model (HMM) and a specific benchmark, and lacked more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions: Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
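
A minimal sketch of the log-odds scoring the abstract refers to is given below; the "family" model and the null-model compositions are made-up position-independent distributions standing in for a real HMMER/SAM/INFERNAL profile, and only illustrate how the choice of null model shifts the score.

```python
import math

# Illustrative position-independent null models for DNA (values not from the paper).
NULL_MODELS = {
    "uniform": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},
    "genomic (GC-rich)": {"A": 0.15, "C": 0.35, "G": 0.35, "T": 0.15},
}

def target_null(seq: str) -> dict:
    """Null model built from the residue composition of the target sequence itself."""
    return {b: (seq.count(b) + 1) / (len(seq) + 4) for b in "ACGT"}  # add-one smoothing

def log_odds(seq: str, family: dict, null: dict) -> float:
    """Sum of per-residue log-odds scores (family vs. null), in bits."""
    return sum(math.log2(family[b] / null[b]) for b in seq)

if __name__ == "__main__":
    family = {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}   # toy GC-favouring family model
    seq = "GCGCGGCCTAGCGG"
    for name, null in list(NULL_MODELS.items()) + [("target", target_null(seq))]:
        print(f"{name:>18}: {log_odds(seq, family, null):+.2f} bits")
```

For a GC-rich candidate sequence the uniform null inflates the score (more false positives at a fixed threshold), while the target null discounts the sequence's own compositional bias, which is the effect the study quantifies.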

Relevance:

60.00%

Publisher:

Abstract:

The recent introduction of free-form NC machining in the ophthalmic field has required a full review of the previous product development process, from both the design and the manufacturing viewpoints. The aim of the present work is to investigate and set up innovative methods and tools supporting product development, particularly for lenses with free-form geometry, such as progressive lenses. In the design stage, the research addressed the geometric modeling of complex lens shapes and the analysis tools needed for the optical-geometrical characterization of the resulting models. In the manufacturing stage, the focus was on setting up the fabrication process, particularly the NC machining process, for which integrated CAD/CAM software was developed for the generation and simulation of the machining cycle. The methodologies and tools made available by this work are currently used in the development of new complex-geometry product types, such as progressive lenses.
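
One common ingredient of the optical-geometrical characterization mentioned above is a surface power map derived from the sag of the free-form surface; the sketch below applies the standard relation P = (n - 1) * H (mean curvature H) on a sampled sag grid, with an illustrative refractive index of 1.53, and is not the toolchain developed in the work itself.

```python
import numpy as np

def surface_power_map(z: np.ndarray, dx: float, n_index: float = 1.53) -> np.ndarray:
    """Approximate surface power map (diopters) from a sag grid z(x, y) sampled
    with square spacing dx (meters), using P = (n - 1) * H, H = mean curvature."""
    zy, zx = np.gradient(z, dx)             # first derivatives (axis 0 = y, axis 1 = x)
    zxy, zxx = np.gradient(zx, dx)          # second derivatives of zx
    zyy, _ = np.gradient(zy, dx)            # second derivative of zy along y
    h = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) \
        / (2 * (1 + zx**2 + zy**2) ** 1.5)  # mean curvature of the surface
    return (n_index - 1.0) * h

if __name__ == "__main__":
    # Sanity check on a spherical cap of radius R = 100 mm: expected power
    # (n - 1) / R = 0.53 / 0.1 = 5.3 D near the centre of the patch.
    R, dx = 0.1, 0.0005
    x = np.arange(-0.02, 0.02, dx)
    X, Y = np.meshgrid(x, x)
    z = R - np.sqrt(R**2 - X**2 - Y**2)
    power = surface_power_map(z, dx)
    c = len(x) // 2
    print(f"central surface power ~ {power[c, c]:.2f} D")
```

For a progressive lens, the same map computed over the whole surface visualizes the far-vision, near-vision and corridor zones.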

Relevance:

60.00%

Publisher:

Abstract:

This thesis follows a substantial contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the road from Monte Carlo production and the development of analysis tools to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of fully hadronic decays of top quark pair (ttbar) events in the CMS experiment. A multi-jet trigger was defined to provide a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection was then applied to improve the S/B ratio, and b-tagging was applied to provide a further S/B improvement. The selection was applied to the background sample and to the samples generated at different top quark masses, and the top quark mass candidate was reconstructed for all of these samples using a kinematic fitter. The resulting distributions were used to build p.d.f.s, interpolated with continuous arbitrary curves, and these curves were used to measure the top quark mass through a likelihood comparison.
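
The final step described above is a template-likelihood comparison; the sketch below is a schematic stand-in, with Gaussian templates and toy pseudo-data in place of the thesis's kinematic-fit distributions, showing how a binned negative log-likelihood is scanned over the generated top-mass hypotheses.

```python
import numpy as np

def neg_log_likelihood(data, bin_edges, template_pdf, m_top):
    """Binned Poisson negative log-likelihood of the data histogram against a
    normalized template p.d.f. evaluated at the mass hypothesis m_top."""
    counts, _ = np.histogram(data, bins=bin_edges)
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    expected = len(data) * template_pdf(centers, m_top) * np.diff(bin_edges)
    expected = np.clip(expected, 1e-12, None)
    return float(np.sum(expected - counts * np.log(expected)))

if __name__ == "__main__":
    # Toy template: Gaussian of width 15 GeV centred on the mass hypothesis,
    # standing in for the interpolated p.d.f.s built from the kinematic fit.
    def template_pdf(m, m_top, sigma=15.0):
        return np.exp(-0.5 * ((m - m_top) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(1)
    data = rng.normal(172.5, 15.0, size=2000)      # pseudo-data mass candidates
    edges = np.linspace(100.0, 250.0, 61)
    scan = np.arange(165.0, 180.5, 0.5)            # generated-mass hypotheses
    nll = [neg_log_likelihood(data, edges, template_pdf, m) for m in scan]
    print(f"best-fit top mass ~ {scan[int(np.argmin(nll))]:.1f} GeV")
```

In the actual analysis the templates come from the samples generated at different top-quark masses; a background component would typically enter the expected yields as well.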

Relevance:

60.00%

Publisher:

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, in order to reduce the area, power and cost of integrated circuits and to increase their computational performance. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage becomes more uncertain as transistor size scales down, resulting in larger parameter variations in future technology generations. Furthermore, the exponential relationship between the leakage current and the threshold voltage limits the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions reduce transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design for Manufacturability (DFM) and Design for Reliability (DFR). The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and system techniques able to cope with yield and reliability loss.

The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the thermal behavior of the system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on tunable parameters such as the supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis.

In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results show that an accurate layout distribution can avoid the onset of hot spots in a NoC chip, and that the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor; the results confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we found that low-swing links are more robust to systematic process variation and respond well to compensation techniques such as ASV and ABB. Hence low-swing signaling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work demonstrates the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
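
As an illustration of the statistical performance analysis discussed above, the sketch below runs a toy Monte Carlo of critical-path delay under combined systematic (per-die) and random (per-gate) threshold-voltage variation, using the alpha-power delay model; all parameter values are illustrative and are not taken from the thesis.

```python
import numpy as np

def critical_path_delay(vdd: float, vth: np.ndarray, k: float = 1.0, alpha: float = 1.3) -> float:
    """Alpha-power-law delay of a gate chain: sum of k * Vdd / (Vdd - Vth_i)**alpha."""
    return float(np.sum(k * vdd / (vdd - vth) ** alpha))

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n_gates, n_dies = 20, 10_000
    vdd, vth_nom = 1.0, 0.35                # volts (illustrative)
    sys_sigma, rnd_sigma = 0.02, 0.015      # systematic vs. random Vth spread (V)

    delays = np.empty(n_dies)
    for d in range(n_dies):
        vth_die = vth_nom + rng.normal(0.0, sys_sigma)             # shared across the die
        vth = vth_die + rng.normal(0.0, rnd_sigma, size=n_gates)   # independent per gate
        delays[d] = critical_path_delay(vdd, vth)

    nominal = critical_path_delay(vdd, np.full(n_gates, vth_nom))
    timing_yield = np.mean(delays <= 1.1 * nominal)  # dies meeting a +10% timing margin
    print(f"mean/nominal delay ratio: {delays.mean() / nominal:.3f}")
    print(f"estimated timing yield at +10% margin: {timing_yield:.1%}")
```

Separating the systematic and random components in this way is what lets such a tool evaluate, for instance, how much of the spread a compensation technique like ASV or ABB can recover.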