977 results for CHANGE-POINT


Relevance: 60.00%

Abstract:

Regime shifts have been reported in many marine ecosystems and are often expressed as an abrupt change occurring in multiple physical and biological components of the system. In the Gulf of Alaska, a regime shift was observed in the late 1970s, indicated by an abrupt increase in sea surface temperature and major shifts in the catch of many fish species. It was followed by another shift in the late 1980s that was less pervasive than the 1977 shift but after which the system nevertheless did not return to its prior state. A thorough understanding of the extent and mechanisms of such regime shifts is challenged by data paucity in time and space. We investigate the ability of a suite of ocean biogeochemistry models of varying complexity to simulate regime shifts in the Gulf of Alaska by examining the presence of abrupt changes in time series of physical variables (sea surface temperature and mixed layer depth), nutrients and biological variables (chlorophyll, primary productivity and plankton biomass) using change-point analysis. Our study demonstrates that ocean biogeochemical models are capable of simulating the late 1970s shift, indicating an abrupt increase in sea surface temperature forcing followed by an abrupt decrease in nutrients and biological productivity. This predicted shift is consistent among all the models, although some exhibit an abrupt transition (i.e. a significant shift from one year to the next) whereas others simulate a smoother transition. Some models further suggest that the late 1980s shift was constrained by changes in mixed layer depth. Ocean biogeochemical models can thus successfully simulate regime shifts in the Gulf of Alaska region, providing a better understanding of how changes in physical conditions are propagated from lower to upper trophic levels through bottom-up controls.
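As a simple illustration of the change-point analysis referred to above (not the specific method applied in the study), the sketch below scans a synthetic annual sea-surface-temperature series for the single most likely mean shift by maximising a two-sample t-statistic; the series, noise level and 1977 step year are illustrative assumptions.

    # Minimal sketch (not the study's models): locate the single most likely
    # mean shift in an annual SST-like series by scanning candidate break years
    # and keeping the one with the largest two-sample t-statistic.
    # The synthetic series and the 1977 shift year are illustrative assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    years = np.arange(1960, 2000)
    sst = np.where(years < 1977, 10.0, 10.8) + rng.normal(0, 0.3, years.size)

    best_year, best_t = None, 0.0
    for i in range(5, years.size - 5):          # keep a few points on each side
        t, _ = stats.ttest_ind(sst[:i], sst[i:], equal_var=False)
        if abs(t) > best_t:
            best_year, best_t = years[i], abs(t)

    print(f"most likely shift at {best_year} (|t| = {best_t:.1f})")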

Relevance: 60.00%

Abstract:

This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
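For readers unfamiliar with the measure, the sketch below shows the standard quantile-regression estimator of ΔCoVaR from Adrian and Brunnermeier, not the testing procedures developed in this paper; the bank and system return series and the 5% quantile level are illustrative assumptions.

    # Minimal sketch of the quantile-regression ΔCoVaR estimator of
    # Adrian and Brunnermeier (not the testing procedures of this paper).
    # Bank and system returns are synthetic; q = 0.05 is an assumption.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    bank = rng.normal(0, 0.02, 1000)
    system = 0.6 * bank + rng.normal(0, 0.01, 1000)
    df = pd.DataFrame({"bank": bank, "system": system})

    q = 0.05
    fit = smf.quantreg("system ~ bank", df).fit(q=q)   # 5% conditional quantile
    beta = fit.params["bank"]

    var_q = np.quantile(bank, q)        # institution in distress (VaR at q)
    var_med = np.quantile(bank, 0.5)    # institution at its median state
    delta_covar = beta * (var_q - var_med)
    print(f"ΔCoVaR estimate: {delta_covar:.4f}")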

Relevance: 60.00%

Abstract:

A small group of phytoplankton species that produce toxic or allelopathic chemicals has a significant effect on plankton dynamics in marine ecosystems. The non-toxic phytoplankton species, which are far more numerous, are affected by the toxin-allelopathy of those species. By analysing abundance data of marine phytoplankton collected from the north-west coast of the Bay of Bengal, an empirical relationship between the abundance of the potential toxin-producing species and the species diversity of the non-toxic phytoplankton is formulated. A change-point analysis demonstrates that the diversity of non-toxic phytoplankton increases with increasing abundance of toxic species up to a certain level; beyond that level, a massive increase of the toxin-producing species gradually reduces species-level diversity. Following these results, a deterministic relationship between the abundance of toxic phytoplankton and the diversity of non-toxic phytoplankton is developed. The abundance–diversity relationship traces a unimodal pathway through which the abundance of toxic species regulates the diversity of phytoplankton. These results contribute to the current understanding of the coexistence and biodiversity of phytoplankton, of the top-down vs. bottom-up debate, and of the abundance–diversity relationship in marine ecosystems.

Relevance: 60.00%

Abstract:

The purpose of this work is to verify the stability of the relationship between real activity and the interest rate spread. The test is based on Chen (1988) and Osorio and Galea (2006). The analysis is applied to Chile and the United States over the period 1980 to 1999. In both cases the relationship was statistically significant in the early 1980s, but a break point is found in both countries during that decade, suggesting that the relationship depends on the monetary rule followed by the Central Bank.
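A minimal sketch of a Chow-type stability test in the spirit of this analysis (not the Chen (1988) or Osorio and Galea (2006) procedures) is given below; the activity and spread series and the candidate break date are synthetic assumptions.

    # Minimal sketch of a Chow-type break test for the activity-spread
    # relationship (not the procedures used in the paper). The data and the
    # candidate break date are illustrative assumptions.
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(2)
    n, break_at = 80, 40
    spread = rng.normal(1.0, 0.5, n)
    beta = np.where(np.arange(n) < break_at, 0.8, 0.1)   # relationship weakens
    activity = 0.5 + beta * spread + rng.normal(0, 0.2, n)

    def rss(y, x):
        X = sm.add_constant(x)
        return sm.OLS(y, X).fit().ssr

    k = 2                                                # intercept + slope
    rss_pooled = rss(activity, spread)
    rss_1 = rss(activity[:break_at], spread[:break_at])
    rss_2 = rss(activity[break_at:], spread[break_at:])

    F = ((rss_pooled - (rss_1 + rss_2)) / k) / ((rss_1 + rss_2) / (n - 2 * k))
    p = stats.f.sf(F, k, n - 2 * k)
    print(f"Chow F = {F:.2f}, p = {p:.4f}")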

Relevance: 60.00%

Abstract:

Driving is a repetitive process that permits sequential learning, provided the relevant change periods are identified. Sequential filtering is widely used for tracking and prediction of state dynamics, but it suffers at abrupt changes, which cause sudden increases in prediction error. We provide a sequential filtering approach that uses online Bayesian detection of change points to decrease prediction error in general, and specifically at abrupt changes. The approach learns from optimally detected segments to identify driving behaviour. Change-point detection is performed with the Pruned Exact Linear Time (PELT) algorithm. The computational cost of our approach is bounded by that of the implemented sequential filter, which makes it suitable for the online task of reducing motion-simulator delay. The approach was tested on a driving scenario simulated with Vortex by CM Labs. The state dimensions are simulated 2D space coordinates and velocity. A particle filter was used for online sequential filtering. Prediction results show that change-point detection improves the quality of state estimation compared with traditional sequential filters and is better suited to predicting behavioural activities.
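For reference, a minimal sketch of change-point detection with PELT, using the ruptures package rather than the authors' own implementation, is shown below; the velocity-like signal and penalty value are illustrative assumptions, and the particle filter stage is not reproduced.

    # Minimal sketch of change-point detection with the PELT algorithm, as
    # implemented in the ruptures package (the paper's filtering pipeline and
    # particle filter are not reproduced here; the trajectory is synthetic).
    import numpy as np
    import ruptures as rpt

    rng = np.random.default_rng(3)
    # 1D velocity-like signal with two abrupt regime changes
    velocity = np.concatenate([
        rng.normal(10, 0.5, 100),
        rng.normal(16, 0.5, 80),
        rng.normal(6, 0.5, 120),
    ])

    algo = rpt.Pelt(model="l2", min_size=10).fit(velocity)
    breakpoints = algo.predict(pen=20)       # penalty value is an assumption
    print("detected segment ends:", breakpoints)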

Relevance: 60.00%

Abstract:

Anaerobic threshold (AT) is usually estimated as a change-point problem by visual analysis of the cardiorespiratory response to incremental dynamic exercise. In this study, two-phase linear (TPL) models of the linear-linear and linear-quadratic type were used to estimate AT. The correlation coefficient between the classical and statistical approaches was 0.88, and 0.89 after outlier exclusion. The TPL models provide a simple method for estimating AT that can easily be implemented on a digital computer for automatic pattern recognition of AT.
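A minimal sketch of the linear-linear variant, assuming a simple grid search over candidate break points rather than the authors' exact estimation procedure, is shown below; the simulated workload-response data are illustrative.

    # Minimal sketch of a linear-linear two-phase fit: the change point is chosen
    # by grid search over candidate break indices, fitting a separate line on
    # each side and minimising the total residual sum of squares. The simulated
    # exercise-response data are an illustrative assumption, not trial data.
    import numpy as np

    rng = np.random.default_rng(4)
    workload = np.linspace(50, 250, 60)
    true_break = 150.0
    response = np.where(workload < true_break,
                        0.01 * workload,
                        0.01 * true_break + 0.03 * (workload - true_break))
    response += rng.normal(0, 0.05, workload.size)

    def sse_line(x, y):
        coeffs = np.polyfit(x, y, 1)
        return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

    best = min(range(5, workload.size - 5),
               key=lambda i: sse_line(workload[:i], response[:i])
                           + sse_line(workload[i:], response[i:]))
    print(f"estimated anaerobic threshold ≈ {workload[best]:.0f} W")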

Relevance: 60.00%

Abstract:

Changepoint regression models were originally developed for applications in quality control, where a change from the in-control to the out-of-control state has to be detected from the available random observations. Various changepoint models have since been suggested for different applications, such as reliability, econometrics and medicine. In many practical situations the covariate cannot be measured precisely, and an alternative is the errors-in-variables regression model. In this paper we study the regression model with errors in variables and a changepoint from a Bayesian approach. A simulation study shows that the proposed procedure produces suitable estimates of the changepoint and of all other model parameters.
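A minimal sketch of a Bayesian changepoint regression with a noisily observed covariate is given below, written with PyMC under assumed priors and noise levels; it is not the paper's exact model specification.

    # Minimal sketch (under assumed priors, not the paper's specification) of a
    # Bayesian changepoint regression with a noisily observed covariate, in PyMC.
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(5)
    n = 100
    x_true = rng.uniform(0, 10, n)
    x_obs = x_true + rng.normal(0, 0.3, n)                 # measurement error
    y = np.where(x_true < 6, 1.0 + 0.5 * x_true,
                             4.0 - 0.2 * x_true) + rng.normal(0, 0.3, n)

    with pm.Model():
        tau = pm.Uniform("tau", 0, 10)          # changepoint on the covariate scale
        x_lat = pm.Normal("x_lat", mu=5, sigma=5, shape=n)  # latent true covariate
        pm.Normal("x_meas", mu=x_lat, sigma=0.3, observed=x_obs)

        a1, b1 = pm.Normal("a1", 0, 5), pm.Normal("b1", 0, 5)
        a2, b2 = pm.Normal("a2", 0, 5), pm.Normal("b2", 0, 5)
        sigma = pm.HalfNormal("sigma", 1)

        mu = pm.math.switch(x_lat < tau, a1 + b1 * x_lat, a2 + b2 * x_lat)
        pm.Normal("y", mu=mu, sigma=sigma, observed=y)

        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    print("posterior mean changepoint:", idata.posterior["tau"].mean().item())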

Relevance: 60.00%

Abstract:

In many clinical trials to evaluate treatment efficacy, it is believed that there may exist a latent treatment-effectiveness lag time after which the medical procedure or chemical compound takes full effect. In this article, semiparametric regression models are proposed and studied to estimate the treatment effect while accounting for such latent lag times. The new models take advantage of the invariance property of the additive hazards model in marginalizing over random effects, so the model parameters are easy to estimate and interpret, while the flexibility of leaving the baseline hazard function unspecified is retained. Monte Carlo simulation studies demonstrate the appropriateness of the proposed semiparametric estimation procedure. Data collected in an actual randomized clinical trial, which evaluated the effectiveness of biodegradable carmustine polymers for the treatment of recurrent brain tumors, are analyzed.

Relevance: 60.00%

Abstract:

My dissertation focuses mainly on Bayesian adaptive designs for phase I and phase II clinical trials. It includes three specific topics: (1) proposing a novel two-dimensional dose-finding algorithm for biological agents, (2) developing Bayesian adaptive screening designs to provide more efficient and ethical clinical trials, and (3) incorporating missing late-onset responses to make an early stopping decision.

Treating patients with novel biological agents is becoming a leading trend in oncology. Unlike cytotoxic agents, for which toxicity and efficacy monotonically increase with dose, biological agents may exhibit non-monotonic patterns in their dose-response relationships. Using a trial with two biological agents as an example, we propose a phase I/II trial design to identify the biologically optimal dose combination (BODC), defined as the dose combination of the two agents with the highest efficacy and tolerable toxicity. A change-point model is used to reflect the fact that the dose-toxicity surface of the combined agents may plateau at higher dose levels, and a flexible logistic model is proposed to accommodate the possibly non-monotonic dose-efficacy relationship. During the trial, we continuously update the posterior estimates of toxicity and efficacy and assign patients to the most appropriate dose combination. We propose a novel dose-finding algorithm to encourage sufficient exploration of untried dose combinations in the two-dimensional space. Extensive simulation studies show that the proposed design has desirable operating characteristics in identifying the BODC under various patterns of dose-toxicity and dose-efficacy relationships.

Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. To handle more efficiently the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to select simultaneously among possible treatment combinations involving multiple agents. Our design is based on formulating the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During trial conduct, we use the current values of the posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multi-arm balanced factorial trial design: it yields a significantly higher probability of selecting the best treatment, allocates substantially more patients to efficacious treatments, and provides higher power to identify the best treatment at the end of the trial. The proposed design is most appropriate for trials that combine multiple agents and screen for the efficacious combination to be investigated further.

Phase II studies are usually single-arm trials conducted to test the efficacy of experimental agents and to decide whether an agent is promising enough to be sent to a phase III trial. Interim monitoring is employed to stop the trial early for futility and avoid assigning an unacceptable number of patients to inferior treatments. We propose a Bayesian single-arm phase II design with continuous monitoring for estimating the response rate of the experimental drug. To address the issue of late-onset responses, we use a piecewise exponential model to estimate the hazard function of the time-to-response data and handle the missing responses using a multiple imputation approach. We evaluate the operating characteristics of the proposed method through extensive simulation studies and show that it reduces the total trial duration and yields desirable operating characteristics for different physician-specified lower bounds of the response rate and different true response rates.
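As a simplified illustration of the continuous-monitoring idea in the third topic, the sketch below applies a generic beta-binomial futility rule to a single-arm trial; it omits the piecewise exponential model and multiple imputation for late-onset responses, and the prior, null response rate and cutoff are illustrative assumptions.

    # Minimal sketch of Bayesian futility monitoring for a single-arm phase II
    # trial (a generic beta-binomial rule, not the dissertation's design with
    # late-onset responses and multiple imputation). The prior, the null
    # response rate p0 and the futility cutoff are illustrative assumptions.
    from scipy import stats

    a_prior, b_prior = 0.5, 0.5      # Beta(0.5, 0.5) prior on the response rate
    p0 = 0.20                        # physician-specified lower bound
    futility_cutoff = 0.05           # stop if P(p > p0 | data) falls below this

    def monitor(responses, enrolled):
        """Return (stop_for_futility, posterior probability of being promising)."""
        post = stats.beta(a_prior + responses, b_prior + enrolled - responses)
        prob_promising = post.sf(p0)            # P(p > p0 | data)
        return prob_promising < futility_cutoff, prob_promising

    for n, r in [(10, 1), (20, 2), (30, 3)]:    # interim looks (patients, responders)
        stop, prob = monitor(r, n)
        print(f"n={n:2d}, responses={r}: P(p>p0)={prob:.3f}, stop={stop}")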

Relevance: 60.00%

Abstract:

Objective. In 2009, the International Expert Committee recommended the use of the HbA1c test for the diagnosis of diabetes. Although it has been recommended for the diagnosis of diabetes, its precise test performance among Mexican Americans is uncertain. A strong "gold standard" would rely on repeated blood glucose measurements on different days, which is the recommended method for diagnosing diabetes in clinical practice. Our objective was to assess the test performance of HbA1c in detecting diabetes and pre-diabetes against repeated fasting blood glucose measurement in the Mexican American population living on the United States-Mexico border. Moreover, we wanted to find specific and precise HbA1c threshold values for diabetes mellitus (DM) and pre-diabetes in this high-risk population, which might assist in better diagnosis and better management of patient diabetes.

Research design and methods. We used the CCHC dataset for our study. In 2004, the Cameron County Hispanic Cohort (CCHC), now numbering 2,574, was established, drawn from randomly selected households on the basis of 2000 Census tract data. The CCHC study randomly selected a subset of people (aged 18-64 years) in CCHC cohort households to determine the influence of SES on diabetes and obesity. Among the participants in Cohort-2000, 67.15% are female; all are Hispanic. Individuals were defined as having diabetes mellitus (fasting plasma glucose [FPG] ≥ 126 mg/dL) or pre-diabetes (100 ≤ FPG < 126 mg/dL). HbA1c test performance was evaluated using receiver operating characteristic (ROC) curves. Moreover, change-point models were used to determine HbA1c thresholds compatible with the FPG thresholds for diabetes and pre-diabetes.

Results. When FPG was used to detect diabetes, the sensitivity and specificity of HbA1c ≥ 6.5% were 75% and 87%, respectively (area under the curve 0.895). When FPG was used to detect pre-diabetes, the sensitivity and specificity of HbA1c ≥ 6.0% (ADA-recommended threshold) were 18% and 90%, respectively, and the sensitivity and specificity of HbA1c ≥ 5.7% (International Expert Committee-recommended threshold) were 31% and 78%, respectively. ROC analyses suggest that HbA1c is a sound predictor of diabetes mellitus (area under the curve 0.895) but a poorer predictor of pre-diabetes (area under the curve 0.632).

Conclusions. Our data support the current recommendations for the use of HbA1c in the diagnosis of diabetes in the Mexican American population, as it has shown reasonable sensitivity, specificity and accuracy against repeated FPG measures. However, the use of HbA1c may be premature for detecting pre-diabetes in this specific population because of its poor sensitivity relative to FPG. It may be that HbA1c more effectively identifies those at risk of developing diabetes; following these pre-diabetic individuals over a longer term for the detection of incident diabetes may lead to a more confirmatory result.
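A minimal sketch of the kind of ROC evaluation reported above, using synthetic data rather than the CCHC cohort, is shown below; the HbA1c distributions, prevalence and 6.5% cut-off are illustrative assumptions.

    # Minimal sketch of the kind of ROC evaluation described above: sensitivity,
    # specificity and AUC of HbA1c against an FPG-defined diabetes label, plus
    # the Youden-optimal threshold. The data are synthetic, not the CCHC cohort.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(6)
    n = 500
    diabetes = rng.random(n) < 0.15                       # FPG-defined label
    hba1c = np.where(diabetes, rng.normal(7.0, 0.9, n), rng.normal(5.5, 0.5, n))

    auc = roc_auc_score(diabetes, hba1c)
    fpr, tpr, thresholds = roc_curve(diabetes, hba1c)
    youden = thresholds[np.argmax(tpr - fpr)]             # maximises sens + spec - 1

    cut = 6.5                                             # recommended threshold
    sens = np.mean(hba1c[diabetes] >= cut)
    spec = np.mean(hba1c[~diabetes] < cut)
    print(f"AUC={auc:.3f}, Youden threshold={youden:.2f}, "
          f"sens@6.5%={sens:.2f}, spec@6.5%={spec:.2f}")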

Relevance: 60.00%

Abstract:

The goal of this thesis is to develop new methods for detecting change points and/or trends. After a brief theoretical introduction to splines, several change-point detection methods already available in the literature are presented. New change-point detection methods that use splines and Bayesian statistics are then introduced. In addition, to explain where the Bayesian method comes from, an introduction to Bayesian theory is given. Using simulations, we compare the power of all of these methods; still using simulations, a more in-depth analysis of the most effective new method is carried out, and this method is then applied to real data. A brief conclusion summarizes the thesis.

Relevance: 60.00%

Abstract:

A recent development of the Markov chain Monte Carlo (MCMC) technique is the emergence of MCMC samplers that allow transitions between different models. Such samplers make possible a range of computational tasks involving models, including model selection, model evaluation, model averaging and hypothesis testing. An example of this type of sampler is the reversible jump MCMC sampler, which is a generalization of the Metropolis-Hastings algorithm. Here, we present a new MCMC sampler of this type. The new sampler is a generalization of the Gibbs sampler, but somewhat surprisingly, it also turns out to encompass as particular cases all of the well-known MCMC samplers, including those of Metropolis, Barker, and Hastings. Moreover, the new sampler generalizes the reversible jump MCMC. It therefore appears to be a very general framework for MCMC sampling. This paper describes the new sampler and illustrates its use in three applications in Computational Biology, specifically determination of consensus sequences, phylogenetic inference and delineation of isochores via multiple change-point analysis.
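As a self-contained illustration of MCMC change-point inference (a textbook Gibbs sampler for a single change point in Poisson counts, not the generalized sampler described here), consider the sketch below; the data and Gamma(2, 1) priors are assumptions.

    # Textbook Gibbs sampler for a single change point in Poisson counts
    # (an illustration of MCMC change-point inference, not the generalized
    # sampler described in the paper). Data and Gamma(2, 1) priors are assumptions.
    import numpy as np

    rng = np.random.default_rng(7)
    counts = np.concatenate([rng.poisson(3.0, 40), rng.poisson(7.0, 60)])
    n = counts.size
    a, b = 2.0, 1.0                                  # Gamma prior hyperparameters
    ks = np.arange(1, n)                             # candidate change points
    cum = np.concatenate([[0.0], np.cumsum(counts)]) # cum[k] = sum of first k counts

    tau, tau_samples = n // 2, []
    for it in range(3000):
        # conjugate Gamma updates for the two Poisson rates
        lam1 = rng.gamma(a + cum[tau], 1.0 / (b + tau))
        lam2 = rng.gamma(a + cum[n] - cum[tau], 1.0 / (b + n - tau))
        # discrete full conditional for the change point (log scale for stability)
        logp = (cum[ks] * np.log(lam1) - ks * lam1
                + (cum[n] - cum[ks]) * np.log(lam2) - (n - ks) * lam2)
        probs = np.exp(logp - logp.max())
        tau = rng.choice(ks, p=probs / probs.sum())
        if it >= 500:                                # discard burn-in
            tau_samples.append(tau)

    print("posterior mean change point:", np.mean(tau_samples))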

Relevance: 60.00%

Abstract:

We hypothesized that fishes in short-hydroperiod wetlands display pulses in activity tied to seasonal flooding and drying, with relatively low activity during intervening periods. To evaluate this hypothesis, sampling devices that funnel fish into traps (drift fences) were used to investigate fish movement across the Everglades, U.S.A. Samples were collected at six sites in the Rocky Glades, a seasonally flooded karstic habitat located on the southeastern edge of the Everglades. Four species that display distinct recovery patterns following drought in long-hydroperiod wetlands were studied: eastern mosquitofish (Gambusia holbrooki) and flagfish (Jordanella floridae) (rapid recovery); and bluefin killifish (Lucania goodei) and least killifish (Heterandria formosa) (slow recovery). Consistent with our hypothesized conceptual model, fishes increased movement soon after flooding (immigration period) and just before drying (emigration period), but decreased activity in the intervening foraging period. We also found that eastern mosquitofish and flagfish arrived earlier and showed stronger responses to hydrological variation than either least killifish or bluefin killifish. We concluded that these fishes actively colonize and escape ephemeral wetlands in response to flooding and drying, and display species-specific differences related to flooding and drying that reflect differences in dispersal ability. These results have important implications for Everglades fish metacommunity dynamics.

Relevance: 60.00%

Abstract:

Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and that it occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized to be the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; the consequences we observed imply that if unavoidable natural events, such as major volcanic eruptions, interact with anthropogenic warming, unforeseen multiplier effects may occur.
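A minimal sketch of the kind of multivariate analysis described above, combining PCA with change-point detection via binary segmentation from the ruptures package on synthetic stand-in series (not the 72 observational records or the sequential t-test method), is given below.

    # Minimal sketch: compress several biophysical time series with PCA and scan
    # the leading component for an abrupt shift with binary segmentation
    # (ruptures). The series are synthetic stand-ins with an assumed 1987 step,
    # not the observational records used in the study.
    import numpy as np
    from sklearn.decomposition import PCA
    import ruptures as rpt

    rng = np.random.default_rng(8)
    years = np.arange(1950, 2011)
    shift = (years >= 1987).astype(float)                 # assumed step year
    series = np.column_stack([
        s * shift + rng.normal(0, 0.5, years.size) for s in (1.0, 0.8, -0.6, 1.2)
    ])

    pc1 = PCA(n_components=1).fit_transform(series).ravel()
    bkps = rpt.Binseg(model="l2").fit(pc1).predict(n_bkps=1)
    print("first year of the new regime:", years[bkps[0]])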