977 results for Testing Framework


Relevance:

30.00%

Publisher:

Abstract:

The description of the short-range part of the nucleon-nucleon forces in terms of quark degrees of freedom is tested against experimental observables. For this purpose, we consider a model in which the short-range part of the forces is given by the quark cluster model and the long- and medium-range forces by well-established meson exchanges. The investigation is performed using different quark cluster models derived from different sets of quark-quark interactions. The predictions of this model are compared not only with the phase shifts but also directly with the experimental observables. Agreement with the existing world set of pp and np data is poor. This suggests that the description of the nucleon-nucleon interaction at short distances, in the framework of nonrelativistic quark models, is at present only qualitative.

Relevance:

30.00%

Publisher:

Abstract:

There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is Gaussian. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distribution of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies for these instruments. (C) 2004 Elsevier B.V. All rights reserved.
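
As a point of reference for the pricing formula described above, a minimal sketch of the standard (Gaussian) Black-Scholes call price is given below; the paper's Edgeworth-corrected formula adds skewness- and kurtosis-dependent terms on top of this baseline. The function name and parameter values are illustrative, not taken from the paper.

# Minimal sketch: standard (Gaussian) Black-Scholes call price, the baseline that the
# Edgeworth expansion corrects with skewness and kurtosis terms.
from math import exp, log, sqrt
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """European call price under a Gaussian (lognormal) model for the underlying."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

# Illustrative values: spot 100, strike 105, 6 months to maturity, 5% rate, 20% volatility
print(bs_call(100.0, 105.0, 0.5, 0.05, 0.20))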

Relevance:

30.00%

Publisher:

Abstract:

We review the basic hypotheses that motivate the statistical framework used to analyze the cosmic microwave background, and how that framework can be enlarged as we relax those hypotheses. In particular, we try to separate as much as possible the questions of gaussianity, homogeneity, and isotropy from each other. We focus both on isotropic estimators of nongaussianity and on statistically anisotropic estimators of gaussianity, placing particular emphasis on their signatures and on the enhanced cosmic variances that become increasingly important as our putative Universe becomes less symmetric. After reviewing the formalism behind some simple model-independent tests, we discuss how these tests can be applied to CMB data when searching for large-scale anomalies. Copyright © 2010 L. Raul Abramo and Thiago S. Pereira.
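
For reference, the cosmic variance alluded to above takes its simplest form for a Gaussian, statistically isotropic sky, where the estimated angular power spectrum carries an irreducible variance (a standard result, not a formula quoted from this review):

\hat{C}_\ell = \frac{1}{2\ell+1}\sum_{m=-\ell}^{\ell}\lvert a_{\ell m}\rvert^{2},
\qquad
\mathrm{Var}\big(\hat{C}_\ell\big) = \frac{2}{2\ell+1}\,C_\ell^{2},

and relaxing gaussianity or isotropy generically enhances this variance, which is why the estimators discussed above become harder to constrain as the Universe is assumed to be less symmetric.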

Relevance:

30.00%

Publisher:

Abstract:

Aim: There is little information on framework associations between cast clasps and attachments. The aim of this study was to evaluate the retention strength of frameworks combining circumferential clasps and an extra-resilient attachment cast in three different alloys (cobalt-chromium, nickel-chromium-titanium and commercially pure titanium), using two undercut depths (0.25 and 0.75 mm) and considering different periods of time (0, 1/2, 1, 2, 3, 4 and 5 years). Methods: Using two metallic matrices representing a partially edentulous mandibular right hemiarch with the first molar crown, the canine root and no premolars, 60 frameworks were fabricated. Three groups (n = 20) were cast in each metal, and each group was divided into two subgroups (n = 10) corresponding to molar undercuts of 0.25 mm and 0.75 mm. The nylon male was positioned in the matrix and attached to the acrylic resin of the prosthetic base. The samples were subjected to an insertion and removal test in an artificial saliva environment. Results: The data were analyzed and compared with ANOVA and Tukey's test at a 95% confidence level. The groups cast in cobalt-chromium and nickel-chromium-titanium had the highest mean retention strengths (5.58 N and 6.36 N, respectively), without significant difference between them but statistically different from the group cast in commercially pure titanium, which had the lowest mean retention strength in all periods (3.46 N). The association frameworks cast in nickel-chromium-titanium and cobalt-chromium could be used with 0.25 mm and 0.75 mm of undercut, but the titanium samples seem to lose retention strength, mainly in the 0.75 mm undercut. Circumferential clasps cast in commercially pure titanium and used in 0.75 mm undercuts have a potential risk of fracture, especially after the 2nd year of use. Conclusion: This in vitro study showed that the framework association between a cast clasp and an extra-resilient attachment is suitable for the three metals evaluated, but extra care is strongly suggested with commercially pure titanium in 0.75 mm undercuts. Clinical significance: Frameworks fabricated in Cp Ti tend to decrease in retention strength over time and have a potential risk of fracture in less than 0.75 mm of undercut.
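
The statistical comparison described in the Results (one-way ANOVA followed by Tukey's test across the three alloy groups) can be sketched as follows; the sample values and group sizes are placeholders, not the study's measurements.

# Sketch: one-way ANOVA + Tukey HSD across three alloy groups (placeholder data).
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical retention-strength samples (in N) for each alloy group
cocr   = np.array([5.4, 5.7, 5.6, 5.5, 5.8])   # cobalt-chromium
nicrti = np.array([6.2, 6.5, 6.3, 6.4, 6.4])   # nickel-chromium-titanium
cpti   = np.array([3.3, 3.5, 3.4, 3.6, 3.5])   # commercially pure titanium

f_stat, p_value = f_oneway(cocr, nicrti, cpti)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = np.concatenate([cocr, nicrti, cpti])
groups = ["CoCr"] * len(cocr) + ["NiCrTi"] * len(nicrti) + ["CpTi"] * len(cpti)
print(pairwise_tukeyhsd(values, groups, alpha=0.05))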

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Rapidly accumulating Holocene sediments in estuaries are commonly difficult to sample and date. In Chesapeake Bay, we obtained sediment cores as much as 20 m in length and used numerous radiocarbon ages measured by accelerator mass spectrometry to provide the first detailed chronologies of Holocene sediment accumulation in the bay. Carbon in these sediments is a complex mixture of materials from a variety of sources. Analyses of different components of the sediments show that total organic carbon ages are largely unreliable, because much of the carbon (including coal) has been transported to the bay from upstream sources and is older than the sediments in which it was deposited. Mollusk shells (clams, oysters) and foraminifera appear to give reliable results, although reworking and burrowing are potential problems. Analyses of museum specimens collected alive before atmospheric nuclear testing suggest that the standard reservoir correction for marine samples is appropriate for the middle to lower Chesapeake Bay. The biogenic carbonate radiocarbon ages are compatible with 210Pb and 137Cs data and pollen stratigraphy from the same sites. Post-settlement changes in sediment transport and accumulation are an important environmental issue in many estuaries, including the Chesapeake. Our data show that large variations in sediment mass accumulation rates occur among sites. At shallow-water sites, local factors seem to control changes in accumulation rates with time. Our two relatively deep-water sites in the axial channel of the bay have different long-term average accumulation rates, but the history of sediment accumulation at these sites appears to reflect overall conditions in the bay. Mass accumulation rates at the two deep-water sites increased rapidly, by about fourfold, coincident with widespread land clearance for agriculture in the Chesapeake watershed.
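
A sediment mass accumulation rate of the kind discussed above is typically derived from an age-depth model combined with dry bulk density; the sketch below shows the arithmetic with hypothetical values rather than the study's data.

# Sketch: mass accumulation rate (MAR) between two dated horizons (hypothetical values).
depth_top_cm, age_top_yr = 100.0, 1200.0   # calibrated 14C age at 1.00 m core depth
depth_bot_cm, age_bot_yr = 300.0, 3400.0   # calibrated 14C age at 3.00 m core depth
dry_bulk_density = 0.8                     # g/cm^3, assumed constant over the interval

sedimentation_rate = (depth_bot_cm - depth_top_cm) / (age_bot_yr - age_top_yr)  # cm/yr
mar = sedimentation_rate * dry_bulk_density                                     # g/cm^2/yr
print(f"sedimentation rate = {sedimentation_rate:.3f} cm/yr, MAR = {mar:.3f} g/cm^2/yr")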

Relevance:

30.00%

Publisher:

Abstract:

Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem with the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected, but the allelic hypothesis is. As we argue, this is logically impossible. Some recently proposed methods implicitly rely on the assumption that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test to test both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
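
The two traditional chi-squared tests criticized above compare case and control counts at the genotype level (AA/Aa/aa) and at the allele level (A/a); a minimal sketch, with made-up counts, is shown below.

# Sketch: traditional chi-squared tests for genotypic and allelic homogeneity in a
# case-control study (made-up counts; the allelic test implicitly assumes HWE).
import numpy as np
from scipy.stats import chi2_contingency

# Genotype counts: rows = cases, controls; columns = AA, Aa, aa
genotypes = np.array([[30, 50, 20],
                      [40, 45, 15]])
chi2_g, p_g, _, _ = chi2_contingency(genotypes)

# Allele counts derived from the genotypes: A = 2*AA + Aa, a = 2*aa + Aa
alleles = np.array([[2 * row[0] + row[1], 2 * row[2] + row[1]] for row in genotypes])
chi2_a, p_a, _, _ = chi2_contingency(alleles)

print(f"genotypic test: chi2 = {chi2_g:.2f}, p = {p_g:.4f}")
print(f"allelic test:   chi2 = {chi2_a:.2f}, p = {p_a:.4f}")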

Relevance:

30.00%

Publisher:

Abstract:

Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge and varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured by the process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.
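
As an illustration of the kind of rigorous control-flow specification mentioned above, the toy sketch below defines a workflow as a process term built from sequential and alternative composition and checks an execution trace against it; the action names and the workflow itself are hypothetical, not CEGH's actual processes or notation.

# Toy sketch: process terms with sequential and alternative composition, plus a trace
# check. Hypothetical workflow, not the CEGH system's implementation.
def act(name):            # atomic action
    return ("act", name)

def seq(*procs):          # sequential composition
    return ("seq", procs)

def alt(*procs):          # alternative composition (choice)
    return ("alt", procs)

def accepts(proc, trace):
    """Return the unconsumed remainder of trace if proc can run on its prefix, else None."""
    kind, body = proc
    if kind == "act":
        return trace[1:] if trace and trace[0] == body else None
    if kind == "seq":
        for p in body:
            trace = accepts(p, trace)
            if trace is None:
                return None
        return trace
    for p in body:                      # "alt": first branch that matches wins
        rest = accepts(p, trace)
        if rest is not None:
            return rest
    return None

# Hypothetical genetic-testing workflow: extract DNA, then sequence or genotype, then report.
workflow = seq(act("extract_dna"), alt(act("sequence"), act("genotype")), act("report"))
print(accepts(workflow, ["extract_dna", "genotype", "report"]) == [])  # True: valid trace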

Relevance:

30.00%

Publisher:

Abstract:

In the framework of an international collaboration with the South African CSIR, the structural design, manufacturing and testing of the new composite-material wing for the Modular UAS have been performed.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the global forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used method is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
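
The rate-to-probability conversion mentioned above is, under the Poisson assumption, the standard PSHA relation (a textbook result rather than a formula specific to this thesis):

P(\text{at least one exceedance of level } a \text{ in time } t) = 1 - e^{-\lambda(a)\,t},

where \lambda(a) is the annual rate of exceedance of the ground-motion level a; the declustering question is precisely whether the exceedances can be treated as a Poisson process so that this relation applies.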

Relevance:

30.00%

Publisher:

Abstract:

Data Distribution Management (DDM) is a component of the High Level Architecture standard. Its task is to detect overlaps between update and subscription extents efficiently. This thesis discusses why a framework is needed and why it was implemented. Testing algorithms under fair conditions, libraries that make implementing algorithms easier, and automation of the compilation phase were the main reasons for starting to build the framework. The driving motivation was that, while surveying the scientific literature on DDM and on the various algorithms, we noticed that each article created its own ad hoc data for testing. A further goal of this framework is therefore to allow algorithms to be compared on a consistent set of data. We decided to test the framework on the Cloud to obtain a more reliable comparison between runs by different users. Two of the most widely used services were considered: Amazon AWS EC2 and Google App Engine. The advantages and disadvantages of each are presented, together with the reasons for choosing Google App Engine. Four algorithms were developed: Brute Force, Binary Partition, Improved Sort, and Interval Tree Matching. Tests were carried out on execution time and peak memory usage. The results show that Interval Tree Matching and Improved Sort are the most efficient. All tests were performed on the sequential versions of the algorithms, so a further reduction in execution time is possible for the Interval Tree Matching algorithm.
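
As a point of reference for the algorithms compared above, the Brute Force approach to DDM matching reduces, in one dimension, to a pairwise overlap check between update and subscription extents; a minimal sketch with illustrative extents is given below.

# Sketch: brute-force DDM matching in one dimension — an O(n*m) pairwise overlap
# test between update and subscription extents (closed intervals, illustrative values).
def overlaps(a, b):
    """Intervals [a0, a1] and [b0, b1] overlap iff neither lies entirely before the other."""
    return a[0] <= b[1] and b[0] <= a[1]

def brute_force_matching(updates, subscriptions):
    return [(i, j)
            for i, u in enumerate(updates)
            for j, s in enumerate(subscriptions)
            if overlaps(u, s)]

updates = [(0.0, 2.0), (5.0, 7.0)]
subscriptions = [(1.5, 6.0), (8.0, 9.0)]
print(brute_force_matching(updates, subscriptions))  # [(0, 0), (1, 0)]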

Relevance:

30.00%

Publisher:

Abstract:

An analysis of the VDE framework with the aim of implementing automated testing tools to improve and speed up its development process.

Relevance:

30.00%

Publisher:

Abstract:

The last few years have seen the advent of high-throughput technologies for analyzing various properties of the transcriptome and proteome of several organisms. The congruency of these different data sources, or lack thereof, can shed light on the mechanisms that govern cellular function. A central challenge for bioinformatics research is to develop a unified framework for combining the multiple sources of functional genomics information and testing associations between them, thus obtaining a robust and integrated view of the underlying biology. We present a graph-theoretic approach to test the significance of the association between multiple disparate sources of functional genomics data by proposing two statistical tests, namely edge permutation and node label permutation tests. We demonstrate the use of the proposed tests by finding a significant association between a Gene Ontology-derived "predictome" and data obtained from mRNA expression and phenotypic experiments for Saccharomyces cerevisiae. Moreover, we employ the graph-theoretic framework to recast a surprising discrepancy, presented in Giaever et al. (2002), between gene expression and knockout phenotype, using expression data from a different set of experiments.
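
A node-label permutation test of the kind proposed above can be sketched as follows: keep the graph fixed, compute an association statistic between node labels and edge structure, then recompute it under random relabelings to obtain an empirical p-value. The graph, labels, and statistic below are toy examples, not the paper's data or exact test statistic.

# Sketch: node-label permutation test on a toy graph.
# Statistic: number of edges whose endpoints share the same label.
import random

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy graph
labels = {0: "A", 1: "A", 2: "B", 3: "B"}          # toy functional labels

def same_label_edges(lab):
    return sum(lab[u] == lab[v] for u, v in edges)

observed = same_label_edges(labels)

nodes, values = list(labels), list(labels.values())
n_perm, count = 10000, 0
for _ in range(n_perm):
    random.shuffle(values)
    if same_label_edges(dict(zip(nodes, values))) >= observed:
        count += 1

p_value = (count + 1) / (n_perm + 1)   # add-one correction for an empirical p-value
print(f"observed = {observed}, permutation p-value = {p_value:.3f}")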

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to outline a Performance-Based Engineering (PBE) framework to address the multiple hazards of Earthquake (EQ) and subsequent Fire Following Earthquake (FFE). Currently, fire codes in the United States are largely empirical and prescriptive in nature. The reliance on prescriptive requirements makes it difficult to quantify the damage sustained due to fire. Additionally, the empirical standards have resulted from furnace testing of individual members or individual assemblies, which has been shown to differ greatly from full structural system behavior. The very nature of fire behavior (ignition, growth, suppression, and spread) is fundamentally difficult to quantify due to the inherent randomness present in each stage of fire development. The study of interactions between earthquake damage and fire behavior is also in its infancy, with essentially no empirical testing results available. This thesis presents a literature review, a discussion and critique of the state of the art, and a summary of software currently being used to estimate loss due to EQ and FFE. A generalized PBE framework for EQ and subsequent FFE is presented, along with a combined hazard probability to performance objective matrix and a table of the variables necessary to fully implement the proposed framework. Future research requirements and a summary are also provided, with a discussion of the difficulties inherent in adequately describing the multiple hazards of EQ and FFE.
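
In its simplest form, the combined hazard probability behind the matrix mentioned above can be written as a chained conditional probability (a generic decomposition, not the thesis's specific model):

P(\mathrm{FFE}) = P(\mathrm{FFE}\mid \mathrm{EQ})\,P(\mathrm{EQ}),

so that a performance objective can be assigned to each combination of earthquake intensity and subsequent fire scenario.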

Relevance:

30.00%

Publisher:

Abstract:

Wireless Mesh Networks (WMNs) have proven to be a key technology for increasing the network coverage of Internet infrastructures. The development process for new protocols and architectures in the area of WMNs is typically split into evaluation by network simulation and testing of a prototype in a test-bed. Testing a prototype in a real test-bed is time-consuming and expensive. Uncontrollable external interference can occur, which makes debugging difficult. Moreover, the test-bed usually supports only a limited number of test topologies. Finally, mobility tests are impractical. Therefore, we propose VirtualMesh as a new testing architecture which can be used before going to a real test-bed. It provides instruments to test the real communication software, including the network stack, inside a controlled environment. VirtualMesh is implemented by capturing real traffic through a virtual interface at the mesh nodes. The traffic is then redirected to the network simulator OMNeT++. In our experiments, VirtualMesh has proven to be scalable and introduces moderate delays. Therefore, it is suitable for pre-deployment testing of communication software for WMNs.
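
The capture-and-redirect step described above (a virtual interface whose traffic is handed to the simulator) can be sketched on Linux with a TAP device as below; the interface name, simulator address, and UDP transport are illustrative assumptions, not VirtualMesh's actual code.

# Sketch: read Ethernet frames from a Linux TAP device and forward them over UDP to a
# simulator-side gateway (illustrative only; not the VirtualMesh implementation).
import fcntl, os, socket, struct

TUNSETIFF = 0x400454ca
IFF_TAP, IFF_NO_PI = 0x0002, 0x1000

tap = os.open("/dev/net/tun", os.O_RDWR)
ifr = struct.pack("16sH", b"vmesh0", IFF_TAP | IFF_NO_PI)   # hypothetical interface name
fcntl.ioctl(tap, TUNSETIFF, ifr)                            # attach to (or create) the TAP device

sim = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
SIM_ADDR = ("127.0.0.1", 4242)                              # simulator endpoint (assumed)

while True:
    frame = os.read(tap, 2048)       # one Ethernet frame from the virtual interface
    sim.sendto(frame, SIM_ADDR)      # hand it to the OMNeT++-side gateway process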