932 results for statistical equivalence
Abstract:
To enable a mathematically and physically sound execution of the fatigue test and a correct interpretation of its results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for the statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases that answer the various questions raised in the treatment of test data, applying the methods and formulae of the document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). Generally, the questions in the data sheets involve three aspects: estimation of the necessary sample size, verification of the statistical equivalence of the collated data sets, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis demonstrates the various statistical methods and supports a sound procedure for creating reliable calculation rules for fatigue analysis.
Abstract:
Non-equilibrium statistical mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, with systems which are in a steady non-equilibrium state, or with more general situations. Such systems are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks fully analogous to electrical networks. We define suitable observables and derive their linear-regime relationships; we discuss a duality between external and internal observables that reverses the roles of the system and of the environment; and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects of linear response determinants, which are related to spanning-tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains, give a physical interpretation of observables in terms of locally detailed balanced rates, prove many variants of the fluctuation theorem, and show that a well-known expression for the entropy production, due to Schnakenberg, descends from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution.
As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of non-equilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of non-equilibrium phenomena.
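Schnakenberg's entropy production for continuous-time Markov chains, mentioned in the abstract above, can be illustrated numerically. The sketch below uses invented rates, not data from the thesis: for a rate matrix `W` with `W[i, j]` the jump rate from state `j` to state `i`, the steady-state entropy production rate vanishes for a detailed-balanced chain and is positive for a driven cycle.

```python
import numpy as np

def entropy_production(W):
    """Steady-state entropy production rate (Schnakenberg's expression)
    for a continuous-time Markov chain with rates W[i, j]: j -> i."""
    n = W.shape[0]
    L = W - np.diag(W.sum(axis=0))          # generator; columns sum to zero
    # Stationary distribution: eigenvector of the zero eigenvalue.
    w, v = np.linalg.eig(L)
    p = np.real(v[:, np.argmin(np.abs(w))])
    p = p / p.sum()
    sigma = 0.0
    for i in range(n):
        for j in range(n):
            if i != j and W[i, j] > 0 and W[j, i] > 0:
                flux = W[i, j] * p[j] - W[j, i] * p[i]      # net current
                force = np.log((W[i, j] * p[j]) / (W[j, i] * p[i]))
                sigma += 0.5 * flux * force                 # always >= 0
    return sigma

# A symmetric 3-state ring satisfies detailed balance: zero production.
W_eq = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
# Biasing the clockwise rates drives a steady current: positive production.
W_neq = np.array([[0., 1., 3.], [3., 0., 1.], [1., 3., 0.]])
```

For the biased ring the stationary distribution is uniform by cyclic symmetry, and the production rate works out analytically to 2 ln 3.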
Abstract:
This research aimed to test particleboard made with leucena (Leucaena leucocephala) wood particles and castor-oil-based polyurethane resin. The response variables were modulus of rupture (MOR), internal adhesion (IA), apparent density, and wood moisture content. The experiments were developed following the methodological procedures of the ABNT NBR 14810:2002 standard. The particleboards were manufactured by hot-pressing at 4 MPa and 90 °C, using timber particles with 5% moisture content and 10% mono-component or bi-component polyurethane resin. The highest moisture content was obtained when the mono-component polyurethane resin was used. The bi-component polyurethane resin provided increases of 43.7% and 22.7% in modulus of rupture and apparent density, respectively, compared with the standard limit. The internal adhesion of the panels manufactured with mono-component resin was 2.45 times higher than the standard limit. The confidence intervals for the means revealed that the internal adhesion and apparent density exhibited statistical equivalence. A good correlation between internal adhesion and apparent density was found, so it was possible to estimate the internal adhesion of the panels from the apparent density data.
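The correlation-based estimation mentioned at the end of the abstract can be sketched as a simple linear fit. The density-adhesion pairs below are hypothetical, for illustration only; the study's actual data are not reproduced here.

```python
import numpy as np

# Hypothetical (apparent density [kg/m^3], internal adhesion [MPa]) pairs,
# mimicking the reported good correlation between the two properties.
density = np.array([820., 840., 860., 880., 900., 920.])
adhesion = np.array([0.90, 0.98, 1.02, 1.10, 1.15, 1.21])

slope, intercept = np.polyfit(density, adhesion, 1)   # least-squares line
r = np.corrcoef(density, adhesion)[0, 1]              # correlation strength

def estimate_adhesion(d):
    """Estimate internal adhesion (MPa) from apparent density (kg/m^3)."""
    return slope * d + intercept
```

Given a strong correlation (here r is close to 1), the fitted line can stand in for the destructive internal-adhesion test.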
Abstract:
This work analyzes the effect of three physical treatments applied to sawmill residues on the compressive behavior of wood-cement composites. Residue of varied composition (dicotyledons) collected from sawmills in the Metropolitan Zone of Belém was used, and the effect of the following treatments was studied: kiln drying, thermal bath, and mineralization with aluminum sulfate. The research began with a survey of the theoretical background to support the experimental program. The constituent materials of the composite were then characterized according to the current Brazilian standards and, after homogenization, tests were carried out in the fresh state. The hardened-state tests were conducted such that the mechanical property observed for evaluating the effect of the treatments was the compressive strength. The compressive stress results indicated that the residues used inhibit cement hydration and negatively affect compressive strength; these effects are related to water absorption by the wood residues and its subsequent release into the matrix. Residues treated by kiln drying yielded the lowest compressive strengths among the composites produced; the thermal-bath and mineralization treatments performed better than the first but proved statistically equivalent from 3 days of age onward, so the decision to use one or the other rests on variables other than compressive performance.
Abstract:
The tests used to obtain the stiffness properties of wood are performed with two loading cycles, as defined by the Brazilian standard ABNT NBR 7190 (Design of Timber Structures). However, reducing the number of cycles would decrease the operating time of the testing machine, reducing the electricity consumed during the tests. This research aimed to investigate, with the aid of analysis of variance (ANOVA), the influence of the use of three load cycles in obtaining the modulus of elasticity in compression parallel to the grain (Ec0), in tension parallel to the grain (Et0), in bending (Em), and in compression perpendicular to the grain (Ec90) of the wood species Angico Preto (Anadenanthera macrocarpa). Twelve samples were manufactured for each combination of cycle number and stiffness property, totaling 144 specimens. The ANOVA results revealed statistical equivalence between the stiffness properties for the load-cycle numbers evaluated, indicating that it is possible to carry out the tests with a single load cycle, saving time and energy in the operation of the equipment.
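The equivalence check described above can be sketched with a one-way ANOVA. The stiffness values below are hypothetical (the study's data are not reproduced), and `scipy.stats.f_oneway` stands in for the ANOVA machinery: a large p-value means the cycle counts are statistically equivalent, so a single cycle would suffice.

```python
import numpy as np
from scipy import stats

# Hypothetical Ec0 values (MPa) for 12 specimens per load-cycle count.
ec0_one_cycle = np.array([13210., 13850., 13480., 12990., 13620., 13340.,
                          13770., 13150., 13560., 13420., 13690., 13280.])
# Offset of 40 MPa, well inside the within-group scatter (~260 MPa sd).
ec0_three_cycles = ec0_one_cycle + 40.0

# One-way ANOVA: failing to reject H0 indicates statistical equivalence.
f_stat, p_value = stats.f_oneway(ec0_one_cycle, ec0_three_cycles)
equivalent = p_value > 0.05
```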
Abstract:
Using the functional-integral formalism for the statistical generating functional in statistical (finite-temperature) quantum field theory, we prove the equivalence of many-photon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. As an illustration, we calculate the one-loop polarization operators in both theories and demonstrate their coincidence.
Abstract:
We prove the equivalence of many-gluon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. The proof is based on the functional-integral formulation for the statistical generating functional in a finite-temperature quantum field theory. As an illustration, we calculate one-loop polarization operators in both theories and show that their expressions indeed coincide.
Abstract:
We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations on the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obeys K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic-generation susceptibility to the short-time response of the perturbed system. These results place previous findings for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general importance of the principle of causality as a test of self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental or model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and in principle has a similar range of applicability and similar limitations. In order to connect the equilibrium and the non-equilibrium steady-state cases, we show how to rewrite the classical response theory of Kubo so as to obtain response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase-space integration. These results, taking into account the chaotic hypothesis of Gallavotti and Cohen, might be relevant in several fields, including climate research.
In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
Abstract:
We show that the parametrized Wave-Packet Phase Space Representation, which has been studied earlier by one of the authors, is equivalent to a Squeezed-States Phase Space Representation of quantum mechanics. © 1988.
Abstract:
We explore the meaning of information about quantities of interest. Our approach is divided into two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality, and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios by presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.
Abstract:
Equivalence testing is growing in use in scientific research outside of its traditional role in the drug-approval process. Largely owing to its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) under TOST is given. This condition leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are evaluated for equivalence, then simply scaling the nominal Type I error rate down by (k - 1) is sufficient to maintain the family-wise error rate at or below the desired value. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
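The (k - 1) scaling rule can be sketched as follows; `tost` and `pairwise_tost` are illustrative helpers written for this listing (assuming approximately normal data and a symmetric equivalence margin), not a reference implementation.

```python
import numpy as np
from scipy import stats

def tost(a, b, low, high, alpha):
    """Two one-sided tests (TOST): declare the means of a and b equivalent
    if both one-sided t-tests reject at level alpha, i.e. the mean
    difference is shown to lie inside the margin (low, high)."""
    diff = np.mean(a) - np.mean(b)
    se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
    df = len(a) + len(b) - 2             # simple pooled df (Welch would refine)
    p_lower = 1.0 - stats.t.cdf((diff - low) / se, df)   # H0: diff <= low
    p_upper = stats.t.cdf((diff - high) / se, df)        # H0: diff >= high
    return max(p_lower, p_upper) < alpha

def pairwise_tost(groups, low, high, alpha=0.05):
    """All k*(k-1)/2 pairwise equivalence tests, with the nominal level
    scaled down by (k - 1) to bound the family-wise error rate."""
    k = len(groups)
    alpha_adj = alpha / (k - 1)          # the manuscript's (k - 1) scaling
    return {(i, j): tost(groups[i], groups[j], low, high, alpha_adj)
            for i in range(k) for j in range(i + 1, k)}
```

Note the adjusted level divides by (k - 1) rather than by the number of comparisons k(k - 1)/2, which is what makes the rule less conservative than Bonferroni.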
Abstract:
The country-product-dummy (CPD) method, originally proposed in Summers (1973), has recently been revisited in its weighted formulation to handle a variety of data-related situations (Rao and Timmer, 2000, 2003; Heravi et al., 2001; Rao, 2001; Aten and Menezes, 2002; Heston and Aten, 2002; Deaton et al., 2004). The CPD method is also increasingly used in the context of hedonic modelling, rather than for its original purpose in Summers (1973) of filling holes in the data. However, the CPD method is seen among practitioners as a black box, owing to its regression formulation. The main objective of this paper is to establish the equivalence of the purchasing power parities and international prices derived from the weighted-CPD method with those arising from the Rao system for multilateral comparisons. A major implication of this result is that the weighted-CPD method would then be a natural method of aggregation at all levels of aggregation within the context of international comparisons.
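A toy version of the (unweighted) CPD regression may clarify the setup. The prices below are invented so that they factor exactly into country and product effects, in which case the dummy regression recovers the purchasing power parities and international prices exactly; real data would fit only approximately.

```python
import numpy as np

# Invented data: price of product j in country i is PPP_i * pi_j, so
# ln p[i, j] = alpha_i + beta_j holds exactly.
ppp_true = np.array([1.0, 2.0, 4.0])    # country price levels (PPPs)
pi_true = np.array([10.0, 20.0, 5.0])   # international product prices
prices = np.outer(ppp_true, pi_true)

n_c, n_p = prices.shape
y = np.log(prices).ravel()
# Design matrix: country dummies (country 0 as base, alpha_0 = 0 for
# identification) followed by product dummies.
X = np.zeros((n_c * n_p, (n_c - 1) + n_p))
for i in range(n_c):
    for j in range(n_p):
        r = i * n_p + j
        if i > 0:
            X[r, i - 1] = 1.0           # alpha_i for i >= 1
        X[r, (n_c - 1) + j] = 1.0       # beta_j

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
ppp_hat = np.exp(np.concatenate([[0.0], coef[:n_c - 1]]))  # PPPs vs country 0
intl_prices = np.exp(coef[n_c - 1:])                       # product prices
```

The weighted-CPD variant discussed in the paper would simply replace the ordinary least-squares step with a weighted one.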
Abstract:
In physics, one attempts to infer the rules governing a system given only the results of imperfect measurements. Hence, microscopic theories may be effectively indistinguishable experimentally. We develop an operationally motivated procedure to identify the corresponding equivalence classes of states, and argue that the renormalization group (RG) arises from the inherent ambiguities associated with these classes: one encounters flow parameters as, e.g., a regulator, a scale, or a measure of precision, which specify representatives within a given equivalence class. This provides a unifying framework and reveals the role played by information in renormalization. We validate this idea by showing that it justifies the use of low-momenta n-point functions as statistically relevant observables around a Gaussian hypothesis. These results enable the calculation of distinguishability in quantum field theory. Our methods also provide a way to extend renormalization techniques to effective models which are not based on the usual quantum-field formalism, and elucidate the relationships between various types of RG.