965 results for joint hypothesis tests


Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

Long-term survival models have historically been considered for analyzing time-to-event data with a fraction of long-term survivors. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes, while the remaining proportion p is cured or does not present the event of interest during the study period, have not been fully considered in the literature. To accommodate such situations, we present in this paper a new long-term survival model. The maximum likelihood estimation procedure is discussed, as well as interval estimation and hypothesis tests. A real dataset illustrates the methodology.
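The abstract does not spell out the model, but the flavour of such analyses can be sketched with the classical mixture cure model, S(t) = p + (1 - p) * exp(-lambda * t). The exponential latency and the simulated data below are assumptions for illustration, not the authors' specification; interval estimates and Wald-type tests can then be based on the numerically inverted Hessian of the negative log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate: fraction p is cured (never fails); the rest fail at rate lam.
p_true, lam_true, n = 0.3, 0.5, 2000
cured = rng.random(n) < p_true
t_event = np.where(cured, np.inf, rng.exponential(1 / lam_true, n))
t_cens = rng.uniform(0, 10, n)                 # administrative censoring
time = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)      # 1 = failure observed

def negloglik(theta):
    p, lam = theta
    surv = p + (1 - p) * np.exp(-lam * time)   # population survival
    dens = (1 - p) * lam * np.exp(-lam * time) # density of the uncured
    return -np.sum(delta * np.log(dens) + (1 - delta) * np.log(surv))

res = minimize(negloglik, x0=[0.5, 1.0],
               bounds=[(1e-4, 1 - 1e-4), (1e-4, None)])
p_hat, lam_hat = res.x
print(f"p_hat={p_hat:.3f}  lam_hat={lam_hat:.3f}")  # ~0.30, ~0.50
```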

Relevance: 80.00%

Abstract:

INTRODUCTION: The accurate evaluation of the error of measurement (EM) is extremely important both in growth studies and in clinical research, since the changes involved are usually quantitatively small. In any study it is important to evaluate the EM in order to validate the results and, consequently, the conclusions. Because of its extreme simplicity, the Dahlberg formula is widely used worldwide, mainly in cephalometric studies. OBJECTIVES: (I) To elucidate the formula proposed by Dahlberg in 1940, evaluating it by comparison with linear regression analysis; (II) to propose a simple methodology for analyzing the results, providing statistical elements that help researchers obtain a consistent evaluation of the EM. METHODS: We applied linear regression analysis, hypothesis tests on its parameters, and a formula involving the standard deviation of the error of measurement and the measured values. RESULTS AND CONCLUSION: We introduce an error coefficient, a proportion relative to the scale of the observed values. This provides new parameters that facilitate the evaluation of the impact of random errors on the final results of a study.
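Dahlberg's 1940 formula computes the error of measurement from paired repeat measurements as EM = sqrt(sum(d_i^2) / (2n)), where d_i is the difference between the two measurements of case i. A minimal sketch follows; the data, and the reading of the "error coefficient" as EM divided by the mean observed value, are illustrative assumptions based on the abstract's description.

```python
import numpy as np

def dahlberg(x1, x2):
    """Dahlberg's error of measurement from paired repeated measurements."""
    d = np.asarray(x1) - np.asarray(x2)
    return np.sqrt(np.sum(d ** 2) / (2 * len(d)))

# First and second measurements of the same quantity (hypothetical values).
m1 = np.array([23.1, 25.4, 22.8, 24.9, 26.0])
m2 = np.array([23.4, 25.1, 22.9, 25.2, 25.7])
em = dahlberg(m1, m2)
print(f"EM = {em:.3f}; error coefficient = {em / m1.mean():.3%}")
```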

Relevance: 80.00%

Abstract:

Groundwater protection is a priority of EU environmental policy. The EU has therefore established a framework for the prevention and control of pollution, which includes provisions for assessing the chemical status of groundwater and reducing the presence of contaminants in it. The measures include:
• criteria for assessing the chemical status of groundwater bodies;
• criteria for identifying significant and sustained upward trends in contaminant concentrations, and for defining starting points for reversing such trends;
• prevention and limitation of indirect discharges of pollutants resulting from percolation through the soil or subsoil.
The basic tools for the development of these policies are the Water Framework Directive and the Groundwater Daughter Directive. Under these directives, groundwater bodies are considered to be in good chemical status if:
• the measured or predicted concentration of nitrates does not exceed 50 mg/l, and concentrations of active ingredients of pesticides, their metabolites and reaction products do not exceed 0.1 µg/l (0.5 µg/l for the total of the pesticides measured);
• the concentration of certain hazardous substances is below the threshold values set by the Member States; at a minimum these include ammonium, arsenic, cadmium, chloride, lead, mercury, sulphates, trichloroethylene and tetrachloroethylene;
• the concentration of any other contaminant meets the definition of good chemical status set out in Annex V of the Water Framework Directive;
• where the value corresponding to a quality standard or a threshold value is exceeded, an investigation confirms, among other things, the absence of significant risk to the environment.
Analyzing the statistical behaviour of data from the monitoring networks can be considerably complex, owing to the positive skew such data often exhibit and to their asymmetric distribution, caused by outliers and by the mixture of different soil types and contaminants. Furthermore, certain components in groundwater may have concentrations below the detection limit, or may be non-stationary owing to linear or seasonal trends. In the first case, the unknown values must be estimated through procedures that vary according to the percentage of values below the detection limit and the number of applicable detection limits. In the second case, trends must be removed before hypothesis tests are conducted on the residuals. This PhD thesis seeks to establish the statistical basis for the rigorous analysis of data from quality-monitoring networks, in order to assess the chemical status of groundwater bodies, identify significant and sustained upward trends in contaminant concentrations, and detect significant deterioration, both where an environmental quality standard has been set by the competent environmental agency and where it has not. To design a methodology covering the whole range of existing cases, data from the Official Network for Monitoring and Control of the Chemical Status of Groundwater of the Spanish Ministry of Agriculture, Food and Environment (Magrama) were analysed. Then, since River Basin Management Plans are the basic tool of the Directives, the Júcar river basin was selected, given its designation as a pilot basin in the Common Implementation Strategy (CIS) of the European Commission. The main objective of the working groups created for that purpose was to implement the Groundwater Daughter Directive and the related elements of the Water Framework Directive, especially data collection at monitoring stations and the preparation of the first River Basin Management Plan. Given the size of the area, and in order to analyse a single groundwater body (the management unit defined in the Directives), the Plana de Vinaroz-Peñíscola was selected as the pilot area, and the procedures developed were applied there to determine the chemical status of that body. The data examined generally contain no contaminant concentrations associated with point sources, so concentration values of the most common determinands, nitrates and chlorides, were selected for the study. The strategy designed combines trend analysis with confidence intervals where a quality standard exists and prediction intervals where no standard exists or the standard has been exceeded. Values below the detection limit were handled analogously: the available values in the Plana de Sagunto pilot area were taken and different degrees of censoring were simulated, so that the resulting intervals could be compared with those obtained from the real data, thereby verifying the effectiveness of the method. The end result is a general methodology that integrates the existing cases and makes it possible to define the chemical status of a groundwater body, verify the existence of significant impacts on groundwater quality, and evaluate the effectiveness of the programmes of measures adopted within the framework of the River Basin Management Plan.
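The combination of trend testing, detrending, and confidence versus prediction intervals described above can be sketched as follows. The nitrate series and the statsmodels workflow are illustrative assumptions, not the thesis's actual data or code.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical quarterly nitrate record (mg/l) at one monitoring station;
# the workflow, not the data, is the point here.
t = np.arange(40.0)
nitrate = 30 + 0.4 * t + rng.normal(0, 3, t.size)   # upward trend + noise

fit = sm.OLS(nitrate, sm.add_constant(t)).fit()
print(fit.t_test("x1 = 0"))           # test for a linear trend in time

resid = fit.resid                     # detrended series for further tests

# Confidence interval for the mean vs. prediction interval for a new
# observation at the next sampling date, to compare with the 50 mg/l standard.
X_new = np.column_stack([np.ones(1), [40.0]])
frame = fit.get_prediction(X_new).summary_frame(alpha=0.05)
print(frame[["mean_ci_lower", "mean_ci_upper",
             "obs_ci_lower", "obs_ci_upper"]])
```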

Relevance: 80.00%

Abstract:

Spatial structure of genetic variation within populations, an important interacting influence on evolutionary and ecological processes, can be analyzed in detail by using spatial autocorrelation statistics. This paper characterizes the statistical properties of spatial autocorrelation statistics in this context and develops estimators of gene dispersal based on data on standing patterns of genetic variation. Large numbers of Monte Carlo simulations and a wide variety of sampling strategies are utilized. The results show that spatial autocorrelation statistics are highly predictable and informative. Thus, strong hypothesis tests for neutral theory can be formulated. Most strikingly, robust estimators of gene dispersal can be obtained with practical sample sizes. Details about optimal sampling strategies are also described.
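The paper works with spatial autocorrelation statistics for gene frequencies; as a generic illustration of the quantity involved (not the authors' code or data), a minimal Moran's I computation on a hypothetical transect looks like this.

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I for a vector of values and a spatial weight matrix."""
    z = values - values.mean()
    num = z @ weights @ z
    den = (z ** 2).sum()
    return len(values) / weights.sum() * num / den

rng = np.random.default_rng(2)
# Hypothetical allele frequencies at n sampling stations on a line,
# with a smooth spatial gradient plus noise.
n = 50
x = np.linspace(0, 1, n)
freq = 0.3 + 0.4 * x + rng.normal(0, 0.05, n)

# Binary adjacency: stations closer than 0.05 are neighbours.
d = np.abs(x[:, None] - x[None, :])
W = ((d > 0) & (d < 0.05)).astype(float)

print(f"Moran's I = {morans_i(freq, W):.3f}")   # > 0 under isolation by distance
# Under the null of no spatial structure, E[I] = -1/(n-1); a permutation
# test (shuffling freq) gives a simple hypothesis test.
```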

Relevance: 80.00%

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapter Three and Chapter Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate negative cross-correlations between real exchange rates and relative consumption across two countries as observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in an opposite direction with respect to relative consumption.
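As a rough sketch of the model class described, here is a fractionally integrated GARCH fit with a skewed Student-t error distribution using the arch package, compared against a random-walk benchmark. The simulated returns and the exact specification (FIGARCH(1,1), constant mean) are assumptions, not the dissertation's.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(3)

# Hypothetical daily percentage log returns; the study used actual daily
# exchange rate data, so this series is only a stand-in.
r = rng.standard_t(df=6, size=2000) * 0.6

split = 1500
am = arch_model(r[:split], mean="Constant", vol="FIGARCH", dist="skewt")
res = am.fit(disp="off")
print(res.summary())

# One-step-ahead mean forecasts vs. the random-walk benchmark
# (a random walk in the log exchange rate predicts a zero return).
mu_hat = res.params["mu"]
mse_model = np.mean((r[split:] - mu_hat) ** 2)
mse_rw = np.mean(r[split:] ** 2)
print(f"MSE model={mse_model:.4f}  MSE random walk={mse_rw:.4f}")
```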


Relevance: 80.00%

Abstract:

Adhesive bonding has been used in many application areas. The use of adhesive joints in industrial applications has been increasing in recent years because of the significant advantages they offer over traditional joining methods such as welding, bolting and riveting. Weight reduction, reduced stress concentrations and ease of manufacture are some of the main advantages of adhesive joints. Given the growing use of adhesive bonding, tools that can predict joint strength with high accuracy are needed. Thus, the Finite Element Method is increasingly used for the analysis of adhesive joints. In this context, the eXtended Finite Element Method (XFEM) emerges as a method capable of predicting joint behaviour, although its application to adhesive joints has not yet been properly studied. This work presents an experimental and numerical (XFEM) study of double-lap joints bonded with adhesives ranging from brittle and stiff, such as Araldite® AV138, to more ductile adhesives, such as Araldite® 2015 and Sikaforce® 7888. Aluminium substrates (AW6082-T651) were considered in joints with different overlap lengths, subjected to tensile loading to evaluate their performance. The numerical work comprised an analysis of the stress distributions in the adhesive layer, the prediction of joint strength by XFEM using stress- and strain-based damage-initiation criteria, and a study of the energy-based damage-propagation criterion. The XFEM analysis showed that the method is quite accurate when the MAXS and QUADS damage-initiation criteria are used, with a value of 1 for the parameter of the energy-based damage-propagation criterion. Although the method has received little attention in the literature compared with others, XFEM produced very satisfactory results.
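The MAXS and QUADS damage-initiation criteria mentioned above have standard closed forms (maximum and quadratic nominal stress ratios, with compressive normal stress excluded via the Macaulay bracket). A minimal sketch follows, with hypothetical strength and stress values, not the measured properties of the adhesives studied.

```python
import numpy as np

def macaulay(x):
    """Macaulay bracket: compressive normal stress does not initiate damage."""
    return np.maximum(x, 0.0)

def maxs(tn, ts, N, S):
    """MAXS criterion: damage initiates when the largest ratio reaches 1."""
    return max(macaulay(tn) / N, abs(ts) / S)

def quads(tn, ts, N, S):
    """QUADS (quadratic nominal stress) criterion."""
    return (macaulay(tn) / N) ** 2 + (ts / S) ** 2

# Hypothetical strengths (MPa) and a trial stress state in the adhesive layer.
N, S = 40.0, 30.0           # illustrative tensile and shear strengths
tn, ts = 20.0, 22.0
print(f"MAXS index  = {maxs(tn, ts, N, S):.2f}")
print(f"QUADS index = {quads(tn, ts, N, S):.2f}  (>= 1 means damage initiates)")
```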

Relevance: 40.00%

Abstract:

This paper examines the hysteresis hypothesis in Brazilian industrialized exports using time series analysis. Empirically, this hypothesis is represented by nonlinear adjustments of the exported quantity to relative price changes. Thus, the threshold cointegration analysis proposed by Balke and Fomby [Balke, N.S. and Fomby, T.B. Threshold cointegration. International Economic Review 1997; 38: 627-645] was used to estimate models with asymmetric adjustment of the error-correction term. Among the sixteen industrial sectors selected, nine showed evidence of nonlinearities in the residuals of the long-run export supply or demand relationships. These nonlinearities represent asymmetric and/or discontinuous responses of exports to different measures of the real exchange rate, in addition to other components of the long-run demand or supply equations.
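A minimal sketch of the two-step threshold cointegration idea: Engle-Granger residuals followed by a threshold autoregression, here with a zero threshold for simplicity (Balke and Fomby also treat estimated thresholds). The data are simulated stand-ins, not the Brazilian export series.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Hypothetical long-run relation: log exports vs. log real exchange rate,
# with threshold (asymmetric) adjustment of the equilibrium error.
n = 400
rer = np.cumsum(rng.normal(0, 0.02, n))          # I(1) relative price
u = np.zeros(n)
for t in range(1, n):                            # TAR(1) error: adjusts fast
    rho = 0.6 if u[t - 1] > 0 else 0.95          # above, slowly below
    u[t] = rho * u[t - 1] + rng.normal(0, 0.05)
exports = 2.0 + 1.5 * rer + u

# Step 1 (Engle-Granger): estimate the long-run relation by OLS.
z = sm.OLS(exports, sm.add_constant(rer)).fit().resid

# Step 2 (Balke-Fomby TAR): regress dz_t on z_{t-1}, with the coefficient
# allowed to differ on each side of the threshold.
dz, z1 = np.diff(z), z[:-1]
above = (z1 > 0).astype(float)
X = np.column_stack([above * z1, (1 - above) * z1])
tar = sm.OLS(dz, X).fit()
print(tar.params)
# Asymmetry test: equal adjustment speeds on both sides of the threshold.
print(tar.f_test("x1 = x2"))
```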

Relevance: 40.00%

Abstract:

We prove the algebraic equality between Jennrich's (1970) asymptotic $\chi^2$ test for the equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression for the asymptotic variance matrix of the sample correlation matrix.
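For reference, here is a sketch of Jennrich's statistic as it is usually stated; the implementation follows that standard form from memory and is not taken from the paper.

```python
import numpy as np
from scipy.stats import chi2

def jennrich(R1, R2, n1, n2):
    """Jennrich's (1970) asymptotic chi-square test of H0: R1 = R2."""
    p = R1.shape[0]
    c = n1 * n2 / (n1 + n2)
    Rbar = (n1 * R1 + n2 * R2) / (n1 + n2)     # pooled correlation matrix
    Rinv = np.linalg.inv(Rbar)
    Z = np.sqrt(c) * Rinv @ (R1 - R2)
    S = np.eye(p) + Rbar * Rinv                # elementwise (Hadamard) product
    dgZ = np.diag(Z)
    stat = 0.5 * np.trace(Z @ Z) - dgZ @ np.linalg.solve(S, dgZ)
    df = p * (p - 1) // 2
    return stat, df, chi2.sf(stat, df)

rng = np.random.default_rng(5)
X1 = rng.normal(size=(200, 4))
X2 = rng.normal(size=(250, 4))
R1 = np.corrcoef(X1, rowvar=False)
R2 = np.corrcoef(X2, rowvar=False)
print(jennrich(R1, R2, 200, 250))   # large p-value: same population here
```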

Relevance: 40.00%

Abstract:

With the trend in molecular epidemiology towards both genome-wide association studies and complex modelling, the need for large sample sizes, both to detect small effects and to allow the estimation of many parameters within a model, continues to increase. Unfortunately, most methods of association analysis have been restricted to either a family-based or a case-control design, preventing the synthesis of data from multiple studies. Transmission-disequilibrium-type methods for detecting linkage disequilibrium from family data were developed as an effective way of preventing the spurious detection of association due to population stratification. Because these methods condition on parental genotype, however, they have precluded the joint analysis of family and case-control data, whereas methods for case-control data may not protect against population stratification and do not allow for familial correlations. We present here an extension of a family-based association analysis method for continuous traits that simultaneously tests for, and if necessary controls for, population stratification. We further extend this method to analyse binary traits (and therefore family and case-control data together) and to estimate genetic effects in the population accurately, even when using an ascertained family sample. Finally, we present the power of this binary extension for both family-only and joint family and case-control data, and demonstrate the accuracy of the association parameter and variance components in an ascertained family sample.
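One common way to test and control for stratification in family data is a between/within-family decomposition of the genotype (in the spirit of Fulker and Abecasis), in which the within-family coefficient is protected from stratification and equality of the two coefficients can itself be tested. This sketch illustrates that general idea with simulated data and a plain OLS fit; it is not the authors' actual model, and in practice a mixed model would account for the familial correlation that OLS ignores here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

n_fam, sibs = 500, 2
fam_allele_freq = rng.uniform(0.2, 0.8, n_fam)           # stratification
g = rng.binomial(2, np.repeat(fam_allele_freq, sibs))    # sib genotypes
fam = np.repeat(np.arange(n_fam), sibs)
fam_effect = rng.normal(0, 1, n_fam)                     # confounder tied to strata
y = 0.3 * g + 2.0 * fam_allele_freq[fam] + fam_effect[fam] \
    + rng.normal(0, 1, n_fam * sibs)

b = np.bincount(fam, weights=g) / sibs                   # between: sibship mean
w = g - b[fam]                                           # within-family deviation
X = sm.add_constant(np.column_stack([b[fam], w]))
fit = sm.OLS(y, X).fit()
print(fit.params)                 # between coeff inflated; within coeff ~ 0.3
print(fit.f_test("x1 = x2"))      # rejection indicates stratification
```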

Relevance: 40.00%

Abstract:

The aim of this thesis is to examine the weak-form efficiency of the Russian, Slovak, Czech, Romanian, Bulgarian, Hungarian and Polish stock markets. This is a quantitative study; daily index closing values were collected from the Datastream database, from each exchange's first trading day up to the end of August 2006. To sharpen the analysis, the data were examined both over the full sample and over two subperiods. Market efficiency was tested with four statistical methods, including an autocorrelation test and the non-parametric runs test. A further aim is to determine whether a day-of-the-week anomaly exists in these markets; its presence is examined using ordinary least squares (OLS) regression. A day-of-the-week anomaly is found in all of the above markets except the Czech market. Significant positive or negative autocorrelation is found in all markets, and the Ljung-Box test also indicates inefficiency in all markets over the full period. Based on the runs test, the random-walk hypothesis is rejected for all markets except Slovakia, at least over the full sample or the first subperiod. Moreover, the data are not normally distributed for any index or period. These findings indicate that the markets in question are not weak-form efficient.
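A sketch of the day-of-the-week test described: OLS of daily returns on weekday dummies, with a joint F-test that all dummy coefficients (relative to the Friday baseline) are zero. The data are simulated; the thesis used Datastream index series.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Hypothetical daily index returns (in percent) with a weak Monday effect.
dates = pd.bdate_range("2000-01-03", periods=1500)
r = rng.normal(0.04, 1.0, dates.size)
r[dates.weekday == 0] -= 0.10                  # illustrative Monday anomaly

# OLS on day-of-week dummies (Friday as baseline); the joint F-test that
# all dummies are zero is the standard day-of-the-week anomaly test.
dummies = pd.get_dummies(dates.day_name())[
    ["Monday", "Tuesday", "Wednesday", "Thursday"]].astype(float)
X = sm.add_constant(dummies.to_numpy())
fit = sm.OLS(r, X).fit()
print(fit.params)
print(fit.f_test("x1 = 0, x2 = 0, x3 = 0, x4 = 0"))
```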

Relevance: 40.00%

Abstract:

This paper studies tests of joint hypotheses in time series regression with a unit root, in which weakly dependent and heterogeneously distributed innovations are allowed. We consider two types of regression: one with a constant and a lagged dependent variable, and the other with a trend added. The statistics studied are the regression "F-tests" originally analysed by Dickey and Fuller (1981) in a less general framework. The limiting distributions are found using functional central limit theory. New test statistics are proposed which require only already tabulated critical values but which are valid in a quite general framework (including finite-order ARMA models generated by Gaussian errors). This study extends the results on single coefficients derived in Phillips (1986a) and Phillips and Perron (1986).
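A minimal illustration of the joint "F-test" in the first regression type (constant plus lagged level); the simulated random walk and sample size are arbitrary.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
y = np.cumsum(rng.normal(size=300))        # a driftless random walk

# First regression type: dy_t = a + b * y_{t-1} + e_t.
dy, y1 = np.diff(y), y[:-1]
fit = sm.OLS(dy, sm.add_constant(y1)).fit()

# Joint "F-test" of (a, b) = (0, 0), Dickey and Fuller's Phi_1 statistic.
print(fit.f_test("const = 0, x1 = 0"))
# Under the unit-root null this statistic does not follow the standard F
# distribution; it must be compared with Dickey and Fuller's (1981)
# tabulated critical values (roughly 4.6 at the 5% level for this n).
```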

Relevance: 40.00%

Abstract:

Most panel unit root tests are designed to test the joint null hypothesis of a unit root for each individual series in a panel. After a rejection, it will often be of interest to identify which series can be deemed stationary and which can be deemed nonstationary. Researchers will sometimes carry out this classification on the basis of n individual (univariate) unit root tests using some ad hoc significance level. In this paper, we demonstrate how to use the false discovery rate (FDR) in evaluating I(1)/I(0) classifications based on individual unit root tests when the size of the cross-section (n) and time series (T) dimensions are large. We report results from a simulation experiment and illustrate the methods on two data sets.
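A sketch of the classification step using the Benjamini-Hochberg procedure, one standard FDR method (the paper's exact procedure may differ), with ADF tests as the individual unit root tests.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(9)

def ar1(T, phi, rng):
    """Stationary AR(1) sample path."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Hypothetical panel: the first 10 series are random walks (I(1)),
# the last 10 are stationary AR(1) (I(0)).
n, T = 20, 200
series = [np.cumsum(rng.normal(size=T)) for _ in range(n // 2)] + \
         [ar1(T, 0.5, rng) for _ in range(n // 2)]

# One ADF test per series; classify as I(0) where the unit-root null is
# rejected, controlling the false discovery rate with Benjamini-Hochberg.
pvals = np.array([adfuller(x, autolag="AIC")[1] for x in series])
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("classified I(0):", np.where(reject)[0])   # ideally indices 10-19
```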

Relevance: 40.00%

Abstract:

This study tested a dynamic field theory (DFT) of spatial working memory and an associated spatial precision hypothesis (SPH). Between 3 and 6 years of age, there is a qualitative shift in how children use reference axes to remember locations: 3-year-olds’ spatial recall responses are biased toward reference axes after short memory delays, whereas 6-year-olds’ responses are biased away from reference axes. According to the DFT and the SPH, quantitative improvements over development in the precision of excitatory and inhibitory working memory processes lead to this qualitative shift. Simulations of the DFT in Experiment 1 predict that improvements in precision should cause the spatial range of targets attracted toward a reference axis to narrow gradually over development, with repulsion emerging and gradually increasing until responses to most targets show biases away from the axis. Results from Experiment 2 with 3- to 5-year-olds support these predictions. Simulations of the DFT in Experiment 3 quantitatively fit the empirical results and offer insights into the neural processes underlying this developmental change.
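The DFT simulations referred to are dynamic neural field models. Purely as an illustration of the model class, with arbitrary parameters rather than the paper's fitted ones, a one-dimensional Amari-style field with a self-sustained memory peak can be simulated as follows.

```python
import numpy as np

# A minimal one-dimensional dynamic neural field (Amari equation), the
# model class behind DFT:  du/dt = -u + h + input + sum_j w(i-j) f(u_j).
n, dt, steps = 101, 0.1, 600
x = np.arange(n)
h = -2.0                                   # resting level
u = np.full(n, h, dtype=float)

d = np.abs(x[:, None] - x[None, :])
# Local excitation minus global inhibition.
w = 6.0 * np.exp(-0.5 * (d / 3.0) ** 2) - 1.0

stim = 6.0 * np.exp(-0.5 * ((x - 60) / 3.0) ** 2)  # target at position 60

for t in range(steps):
    f = 1.0 / (1.0 + np.exp(-5.0 * u))     # sigmoidal output
    inp = stim if t < 200 else 0.0         # target removed: memory delay
    u += dt * (-u + h + inp + w @ f)

# A self-sustained activation peak encodes the remembered location; in the
# SPH, sharpening the interaction kernel over development is what changes
# drift toward vs. away from reference inputs.
print("remembered location:", x[np.argmax(u)])
```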