984 results for "Change points"


Relevance:

60.00%

Publisher:

Abstract:

Nonlinear time series analysis has demonstrated its fundamental usefulness in laboratory experiments in recent years. As a rule, however, these involved selected or specially constructed nonlinear systems. Apart from the monitoring of processes and products, hardly any applications to concrete, given dynamical problems in industry have become known. The aim of this work was to investigate, on the basis of two problems from engineering practice, whether applying the canonical scheme of nonlinear time series analysis also leads to useful results there, or whether modifications (simplifications or extensions) become necessary. Using the example of manufacturing optical surfaces by high-precision turning, it was shown that active disturbance compensation in real time is possible with a specially developed nonlinear prediction algorithm. Standard methods of nonlinear time series analysis take the general, but very costly, route of a phase-space reconstruction that is as complete as possible. The new method dispenses with many of the canonical intermediate steps. This leads to considerable savings in computation time and, in addition, to substantially higher robustness against additive measurement noise. Using the computed predictions of the undesired machine vibrations, a disturbance compensation scheme was realized that improved the surface quality of the machined workpiece by 20-30%. The second example concerns the classification of structure-borne sound signals measured for the monitoring of machining processes. Like many other processes in production, these signals exhibit highly nonstationary behavior. Here the standard methods of nonlinear data analysis, which use FT or AAFT surrogates, fail.
Therefore, a new class of surrogate data for testing the null hypothesis of a nonstationary linear stochastic process was developed, which is able to distinguish between deterministic nonlinear chaotic time series and stochastic linear nonstationary time series with change points. With it, it could be shown that the structure-borne sound signals under investigation can be attributed, with statistical significance, to a nonstationary stochastic sequence of simple linear processes, and that an interpretation as a nonlinear chaotic time series is not required.
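
The FT surrogates mentioned in this abstract test against the null hypothesis of a stationary linear Gaussian process by randomizing Fourier phases while preserving the power spectrum. A minimal sketch of this standard construction (not the new nonstationary surrogate class the work develops, whose details the abstract does not give):

```python
import numpy as np

def ft_surrogate(x, rng=None):
    """Phase-randomized (FT) surrogate: preserves the power spectrum of x,
    destroys any nonlinear/deterministic structure."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0          # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0     # keep the Nyquist component real for even n
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * rng.standard_normal(512)
s = ft_surrogate(x, rng)
# the surrogate has the same power spectrum as x but a different sample path
```

A discriminating statistic (e.g. a nonlinear prediction error) computed on the data and on an ensemble of such surrogates then yields the significance test.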

Relevance:

60.00%

Publisher:

Abstract:

Changepoint analysis is a well established area of statistical research, but in the context of spatio-temporal point processes it is as yet relatively unexplored. Some substantial differences from standard changepoint analysis have to be taken into account: firstly, at every time point the datum is an irregular pattern of points; secondly, in real situations issues of spatial dependence between points and temporal dependence within time segments arise. Our motivating example concerns the monitoring and recovery of radioactive particles from Sandside beach, in the north of Scotland; there have been two major changes in the equipment used to detect the particles, representing known potential changepoints in the number of retrieved particles. In addition, offshore particle retrieval campaigns are believed to reduce the particle intensity onshore with an unknown temporal lag; in this latter case, the problem concerns multiple unknown changepoints. We therefore propose a Bayesian approach for detecting multiple changepoints in the intensity function of a spatio-temporal point process, allowing for spatial and temporal dependence within segments. We use log-Gaussian Cox processes, a very flexible class of models suitable for environmental applications, which can be implemented using the integrated nested Laplace approximation (INLA), a computationally efficient alternative to Markov chain Monte Carlo methods for approximating the posterior distribution of the parameters. Once the posterior curve is obtained, we propose a few methods for detecting significant changepoints. We present a simulation study, which consists of generating spatio-temporal point pattern series under several scenarios; the performance of the methods is assessed in terms of type I and type II errors, detected changepoint locations, and accuracy of the segment intensity estimates.
Finally, we apply the above methods to the motivating dataset and find sensible results about the presence and nature of changes in the process.
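
The full model here is a log-Gaussian Cox process fitted with INLA, which is beyond a short sketch. As a much simpler, hypothetical one-dimensional analogue, a single changepoint in a Poisson intensity can be located by profile likelihood over candidate segment boundaries:

```python
import numpy as np

def poisson_changepoint(counts):
    """Profile-likelihood search for one changepoint in a Poisson count
    series (a 1-D analogue of a change in point-process intensity)."""
    counts = np.asarray(counts, dtype=float)
    n = len(counts)
    best_tau, best_ll = None, -np.inf
    for tau in range(1, n):                 # boundary after index tau-1
        lam1 = counts[:tau].mean()
        lam2 = counts[tau:].mean()
        ll = 0.0                            # log-likelihood up to constants
        for lam, seg in ((lam1, counts[:tau]), (lam2, counts[tau:])):
            if lam > 0:
                ll += seg.sum() * np.log(lam) - len(seg) * lam
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

rng = np.random.default_rng(1)
counts = np.concatenate([rng.poisson(3, 60), rng.poisson(9, 60)])
tau_hat = poisson_changepoint(counts)      # should land near index 60
```

The Bayesian approach of the paper additionally places priors on the segment intensities and accounts for spatial and temporal dependence, which this sketch ignores.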

Relevance:

60.00%

Publisher:

Abstract:

Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen et al., 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers, and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy, and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the "DNAcopy" package of the Bioconductor project (Gentleman et al., 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
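
The core of the test can be sketched as follows. This simplified, non-circular version scans all split points with a two-sample $t$-statistic and uses a permutation reference distribution for the $p$-value, as described in the abstract; it is an illustration, not the DNAcopy implementation or the linear-time hybrid:

```python
import numpy as np

def max_t_stat(x):
    """Maximal two-sample t-statistic over all split points
    (a simplified, non-circular version of the CBS statistic)."""
    n = len(x)
    best = 0.0
    for i in range(2, n - 1):              # at least 2 points per segment
        a, b = x[:i], x[i:]
        s2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
        if s2 > 0:
            best = max(best, abs(a.mean() - b.mean()) / np.sqrt(s2))
    return best

def permutation_pvalue(x, n_perm=200, rng=None):
    """Permutation reference distribution for the maximal t-statistic."""
    rng = np.random.default_rng() if rng is None else rng
    observed = max_t_stat(x)
    exceed = sum(max_t_stat(rng.permutation(x)) >= observed
                 for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 40), rng.normal(2, 1, 40)])
p = permutation_pvalue(x, n_perm=99, rng=rng)   # small p: a change is present
```

Each call to `max_t_stat` is the $O(N^2)$ step (splits times segment statistics) that the paper's hybrid approach avoids recomputing for every permutation.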

Relevance:

60.00%

Publisher:

Abstract:

Standard procedures for forecasting flood risk (Bulletin 17B) assume that annual maximum flood (AMF) series are stationary, meaning that the distribution of flood flows is not significantly affected by climatic trends or cycles, or by anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on the planning and management of water resources and relevant infrastructure. The research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology that yields forecasts of flood risk reflecting climatic influences on flood magnitude. To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns, including the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO), are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, the results do indicate that temperature is highly correlated with the timing of flood peaks.
Despite the consideration of watersheds classified as unimpaired, the analyses also suggest that identified change-points in AMF series are due to dam construction and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time. The results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g., AMO, ENSO, NAO, and PDO) with lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently of flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern as well as possible impacts of coupling between the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and the improved understanding of the physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal statistical model, incorporating both climatic variations and human impacts, for flood risk over the longer planning horizons (e.g., 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
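
The proposed modification, modeling the moments of the log-transformed flows as functions of lagged climate indices, might be sketched as follows on synthetic data. The index series, the coefficients, and the lognormal flood-quantile step are illustrative assumptions, not the thesis's fitted models:

```python
import numpy as np

# Hypothetical data: one synthetic lagged "climate index" stands in for
# AMO/ENSO/NAO/PDO, and log annual maximum flows depend linearly on it.
rng = np.random.default_rng(3)
years = 50
index = rng.standard_normal(years)                     # lagged climate index
log_q = 5.0 + 0.4 * index + 0.3 * rng.standard_normal(years)  # log AMF

# Least-squares fit of the log-flow mean as a function of the index.
X = np.column_stack([np.ones(years), index])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
resid_sd = np.std(log_q - X @ beta, ddof=2)            # conditional std dev

# One-year-ahead forecast of the 100-year flood given next year's index,
# reading the quantile off the fitted (conditional) lognormal distribution.
z_99 = 2.326                                           # N(0,1) 99th percentile
next_index = 1.0
mu_hat = beta[0] + beta[1] * next_index
q100 = np.exp(mu_hat + z_99 * resid_sd)
```

In the stationary Bulletin 17B setting, `mu_hat` and `resid_sd` would be unconditional constants; conditioning them on the climate indices is what makes the forecast risk vary year to year.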

Relevance:

60.00%

Publisher:

Abstract:

The electroencephalogram (EEG) is a physiological time series that measures electrical activity at different locations in the brain, and plays an important role in epilepsy research. Exploring the variance and/or volatility may yield insights for seizure prediction, seizure detection and seizure propagation/dynamics. Maximal overlap discrete wavelet transforms (MODWTs) and ARMA-GARCH models were used to determine variance and volatility characteristics of 66 channels for different states of an epileptic EEG: sleep, awake, sleep-to-awake and seizure. The wavelet variances, changes in wavelet variances and volatility half-lives for the four states were compared for possible differences between seizure and non-seizure channels. The half-lives of two of the three seizure channels were found to be shorter than those of all of the non-seizure channels, based on 95% CIs for the pre-seizure and awake signals. No discernible patterns were found in the wavelet variances at the change points for the different signals.
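
The volatility half-life compared above follows directly from the GARCH(1,1) persistence α + β: a shock's effect on the conditional variance decays geometrically, halving after ln(0.5)/ln(α + β) periods. A minimal sketch with hypothetical parameter values (the actual per-channel estimates are not given in the abstract):

```python
import math

def garch_half_life(alpha, beta):
    """Half-life of a volatility shock in a GARCH(1,1) model: the number
    of periods until the shock's effect on conditional variance halves.
    Requires 0 < alpha + beta < 1 (covariance stationarity)."""
    persistence = alpha + beta
    if not 0 < persistence < 1:
        raise ValueError("alpha + beta must lie in (0, 1)")
    return math.log(0.5) / math.log(persistence)

# Hypothetical fitted parameters for a seizure vs. a non-seizure channel:
hl_seizure = garch_half_life(0.20, 0.60)       # persistence 0.80 -> ~3.1
hl_nonseizure = garch_half_life(0.10, 0.85)    # persistence 0.95 -> ~13.5
```

Under this picture, a shorter half-life on a seizure channel means volatility shocks there die out faster than on the other channels, which is the pattern reported for two of the three seizure channels.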

Relevance:

60.00%

Publisher:

Abstract:

Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments, and may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions, and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
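
A toy version of Bayesian segmentation by GC content can be sketched by placing a Beta prior on the GC proportion of each segment and scoring a single candidate changepoint against the no-change model via marginal likelihoods. The paper's algorithm handles multiple changepoints over whole chromosomes; this sketch only scores one split:

```python
import math

def log_marginal(gc, at, a=1.0, b=1.0):
    """Log marginal likelihood of a Bernoulli segment (GC vs. AT counts)
    under a Beta(a, b) prior on the GC proportion (Beta-binomial)."""
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + math.lgamma(a + gc) + math.lgamma(b + at)
            - math.lgamma(a + b + gc + at))

def best_split(seq):
    """Single-changepoint segmentation of a DNA string by GC content:
    return the split maximizing the product of segment marginal
    likelihoods, or None if no split beats the one-segment model."""
    is_gc = [1 if c in "GC" else 0 for c in seq]
    n = len(is_gc)
    best_k, best_ll = None, log_marginal(sum(is_gc), n - sum(is_gc))
    for k in range(1, n):
        left, right = is_gc[:k], is_gc[k:]
        ll = (log_marginal(sum(left), k - sum(left))
              + log_marginal(sum(right), n - k - sum(right)))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

seq = "AT" * 50 + "GC" * 50       # GC content jumps from 0% to 100% at 100
```

Conjugacy makes each segment's marginal likelihood available in closed form, which is what keeps fully Bayesian segmentation tractable at genome scale.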

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this thesis is to shed more light on FX market microstructure by examining the determinants of the bid-ask spread for three currency pairs, US dollar/Japanese yen, British pound/US dollar and Euro/US dollar, in different time zones. I examine commonality in liquidity using FX market microstructure variables in financial centres across the world (New York, London, Tokyo), based on the quotes of the three exchange rate pairs over a ten-year period. I use GARCH(1,1) specifications, the ICSS algorithm, and vector autoregression analysis to examine the effect of trading activity, exchange rate volatility and inventory holding costs on both quoted and relative spreads. The ICSS algorithm results show that the intraday spread series are much less volatile than the intraday exchange rate series, as the number of change points obtained from the ICSS algorithm is considerably lower. GARCH(1,1) estimation results for daily and intraday bid-ask spreads show that the explanatory variables work better when I use higher-frequency (intraday) data; however, their explanatory power is significantly lower compared to the results based on the daily sample. This suggests that although daily spreads and intraday spreads have some common determinants, there are other factors that determine the behaviour of spreads at high frequencies. The VAR results show that there are some differences in the behaviour of the variables at high frequencies compared to the results from the daily sample. A shock in the number of quote revisions has more effect on the spread when short-term (intraday) trading intervals are considered than the spread's own shocks do. When longer (daily) trading intervals are considered, shocks in the spread have more effect on the future spread.
In other words, trading activity is more informative about the future spread when intraday trading is considered, while the past spread is more informative about the future spread when daily trading is considered.
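
The ICSS algorithm used above is the Inclán-Tiao iterated cumulative-sums-of-squares procedure for variance changepoints. A single step of it, locating one changepoint, can be sketched as follows; the 1.358 threshold is the standard asymptotic 5% critical value, and the full algorithm iterates this test over sub-segments:

```python
import numpy as np

def icss_single(x, crit=1.358):
    """One step of the Inclan-Tiao ICSS procedure: locate a variance
    changepoint via the centered cumulative sum of squares D_k.
    Returns the changepoint index (1-based), or None if not significant."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    c = np.cumsum(x ** 2)                  # C_k, cumulative sum of squares
    k = np.arange(1, n + 1)
    d = c / c[-1] - k / n                  # D_k = C_k / C_n - k / n
    stat = np.sqrt(n / 2.0) * np.abs(d)
    k_star = int(np.argmax(stat))
    return k_star + 1 if stat[k_star] > crit else None

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(0, 3, 300)])
cp = icss_single(x)                        # should land near observation 300
```

On a homoscedastic series D_k hovers near zero and no point exceeds the threshold, which is why fewer ICSS changepoints in the spread series indicate a more stable variance than in the exchange rate series.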

Relevance:

60.00%

Publisher:

Abstract:

This research studies the transformation from a single-sided offering to a multi-sided platform. The study aims to define platforms and their benefits, to create a theoretical framework by combining change management models with platform theory, and to identify the critical change points of the transformation. The empirical research was carried out using action research: the researcher worked as project manager in the case company and studied the transformation project by working actively and leading the project team. The result of the project was a study of how the company would be able to manage the transformation. The results clearly showed that the company did not have the capabilities to finish the transformation. In conclusion, the study showed that the critical change points that led to the project's failure were that the project was managed with insufficient change-management effort, which later resulted in a lack of commitment to re-allocating the resources needed to complete the transformation. Many of the critical change points were the result of combined change-management and platform-related issues.

Relevance:

60.00%

Publisher:

Abstract:

Detecting change points in epidemic models has been studied by many scholars. Yao (1993) summarized five existing test statistics in the literature, among which the likelihood ratio statistic was observed to stand out in power. However, all of the existing test statistics rely on the assumption that the population variance is known, which is unrealistic in practice. To avoid assuming a known population variance, a new test statistic for detecting epidemic change points is studied in this thesis. The new test statistic is parameter-free and more powerful than the existing test statistics. Different sample sizes and lengths of epidemic duration are used for the power comparison. Monte Carlo simulation is used to find the critical values of the new test statistic and to perform the power comparison. Based on the Monte Carlo simulation results, it can be concluded that the sample size and the length of the duration have some effect on the power of the tests. It can also be observed that the new test statistic studied in this thesis has higher power than the existing test statistics in all cases.
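
The Monte Carlo step described above, simulating the statistic under the null hypothesis and taking an upper quantile as the critical value, can be illustrated with a generic max-type CUSUM statistic. This is not the thesis's new parameter-free statistic (which the abstract does not specify), only the critical-value machinery:

```python
import numpy as np

def max_cusum(x):
    """Max-type CUSUM statistic for a mean change, with in-sample
    standardization so no population variance needs to be assumed."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    s = np.cumsum(z)[:-1]                       # partial sums S_1..S_{n-1}
    k = np.arange(1, n)
    return np.max(np.abs(s) / np.sqrt(k * (n - k) / n))

def mc_critical_value(n, alpha=0.05, reps=2000, rng=None):
    """Monte Carlo critical value: simulate the statistic under the null
    (i.i.d. normal, no change) and take the (1 - alpha) quantile."""
    rng = np.random.default_rng() if rng is None else rng
    null = [max_cusum(rng.standard_normal(n)) for _ in range(reps)]
    return float(np.quantile(null, 1 - alpha))

rng = np.random.default_rng(5)
cv = mc_critical_value(100, rng=rng)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
rejected = max_cusum(x) > cv                    # a clear mean shift
```

Power comparisons then repeat the rejection step over many simulated alternatives (varying sample size and epidemic duration) and report the rejection frequency for each competing statistic.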

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the effect of simulated microwave disinfection (SMD) on the linear dimensional changes, hardness and impact strength of acrylic resins under different polymerization cycles. Metal dies with reference points were embedded in flasks with dental stone. Samples of Classico and Vipi acrylic resins were made following the manufacturers' recommendations. The assessed polymerization cycles were: A) water bath at 74 ºC for 9 h; B) water bath at 74 ºC for 8 h and temperature increased to 100 ºC for 1 h; C) water bath at 74 ºC for 2 h and temperature increased to 100 ºC for 1 h; and D) water bath at 120 ºC and pressure of 60 pounds. Linear dimensional distances in length and width were measured after SMD and water storage at 37 ºC for 7 and 30 days using an optical microscope. SMD was carried out with the samples immersed in 150 mL of water in an oven (650 W for 3 min). A load of 25 gf for 10 s was used in the hardness test. The Charpy impact test was performed with 40 kpcm. Data were submitted to ANOVA and Tukey's test (5%). The Classico resin was dimensionally stable in length in the A and D cycles for all periods, while the Vipi resin was stable in the A, B and C cycles for all periods. The Classico resin was dimensionally stable in width in the C and D cycles for all periods, and the Vipi resin was stable in all cycles and periods. The hardness values for Classico resin were stable in all cycles and periods, while the Vipi resin was stable only in the C cycle for all periods. Impact strength values for Classico resin were stable in the A, C and D cycles for all periods, while Vipi resin was stable in all cycles and periods. SMD had different effects on the linear dimensional changes, hardness and impact strength of acrylic resins submitted to different polymerization cycles when SMD and subsequent water storage were considered.

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the effect of simulated microwave disinfection (SMD) on the linear dimensional changes, hardness and impact strength of acrylic resins under different polymerization cycles. Metal dies with reference points were embedded in flasks with dental stone. Samples of Classico and Vipi acrylic resins were made following the manufacturers' recommendations. The assessed polymerization cycles were: A) water bath at 74 ºC for 9 h; B) water bath at 74 ºC for 8 h and temperature increased to 100 ºC for 1 h; C) water bath at 74 ºC for 2 h and temperature increased to 100 ºC for 1 h; and D) water bath at 120 ºC and pressure of 60 pounds. Linear dimensional distances in length and width were measured after SMD and water storage at 37 ºC for 7 and 30 days using an optical microscope. SMD was carried out with the samples immersed in 150 mL of water in an oven (650 W for 3 min). A load of 25 gf for 10 s was used in the hardness test. The Charpy impact test was performed with 40 kpcm. Data were submitted to ANOVA and Tukey's test (5%). The Classico resin was dimensionally stable in length in the A and D cycles for all periods, while the Vipi resin was stable in the A, B and C cycles for all periods. The Classico resin was dimensionally stable in width in the C and D cycles for all periods, and the Vipi resin was stable in all cycles and periods. The hardness values for Classico resin were stable in all cycles and periods, while the Vipi resin was stable only in the C cycle for all periods. Impact strength values for Classico resin were stable in the A, C and D cycles for all periods, while Vipi resin was stable in all cycles and periods. SMD had different effects on the linear dimensional changes, hardness and impact strength of acrylic resins submitted to different polymerization cycles when SMD and subsequent water storage were considered.

Relevance:

30.00%

Publisher:

Abstract:

The HACCP system is being increasingly used to ensure food safety. This study investigated the validation of control measures in order to establish performance indicators for the HACCP system in the manufacturing process of Lasagna Bolognese (meat lasagna). Samples were collected along the manufacturing process as a whole, before and after the CCPs. The following microorganism indicators (MIs) were assessed: total mesophile and faecal coliform counts. The same MIs were analyzed in the final product, along with the microbiological standards required by current legislation. A significant reduction in the total mesophile count was observed after cooking (p < 0.001). After storage, there was a numerical, although non-significant, change in the MI count. Faecal coliform counts were also significantly reduced (p < 0.001) after cooking. We were able to demonstrate that the HACCP system allowed the standards set by both the company and the Brazilian regulations to be met, as shown by the reduction in the established indicators.

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the syntax of indirect objects (IO) in Brazilian Portuguese (BP). Adopting a comparative perspective we propose that BP differs from European Portuguese (EP) in the grammatical encoding of IO. In EP ditransitive contexts, IO is found in two configurations - one projected by a (low) applicative head and another one involving a lexical/true preposition. We propose that the former property is contingent upon the presence of dative Case marking: namely, the morpheme `a` that introduces IO (a-DP), whose corresponding clitic pronoun is `lhe/lhes`. In contrast, important changes in the pronominal system, coupled with the increase in the use of the preposition `para` are taken as evidence for the loss of the low applicative construction in BP. Thus only the configuration with the lexical/true preposition is found in (Standard) BP. We argue that the innovative properties of IO in BP are due to the loss of the (3rd person) dative clitic and the preposition `a` as dative Case markers. Under this view, we further account for the realization of IO as a DP/weak pronoun, found in dialects of the central region of Brazil, which points to a similarity with the English Double Object Construction. Finally we show that the connection between the morphological expression of the dative Case and the expression of parameters supports a view of syntactic change according to which parametric variation is determined in the lexicon, in terms of the formal features of functional heads.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

30.00%

Publisher:

Abstract:

This paper shows the numerous problems of conventional economic analysis in the evaluation of climate change mitigation policies. The article points out the many limitations, omissions, and instances of arbitrariness that have characterized most evaluation models applied up to now. These shortcomings have, almost overwhelmingly, biased the results towards recommending less aggressive emission mitigation policies. Consequently, this paper questions whether these results provide an appropriate answer to the problem. Finally, various points that an analysis consistent with sustainable development should take into account are presented.