833 results for Random time change


Relevance:

100.00%

Publisher:

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment. Performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
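To make the capture-latency notion above concrete, the following is a minimal, hypothetical measurement harness (it is not the thesis's TPC-C extension): a change is timestamped when it is applied on the OLTP side and again when the CDC mechanism delivers it, and capture latency is the difference. The names `apply_change` and `on_change_captured` are placeholders for the commit path and the CDC delivery callback.

```python
import time
import statistics
from typing import Dict, List

# Hypothetical latency harness: commit timestamps are recorded when a change
# is applied to the OLTP database, and again when the CDC mechanism delivers
# the same change; capture latency is the difference between the two.

commit_times: Dict[str, float] = {}   # change_id -> time the change was applied
latencies: List[float] = []           # observed capture latencies, in seconds

def apply_change(change_id: str) -> None:
    """Stand-in for applying/committing a transaction on the OLTP side."""
    commit_times[change_id] = time.monotonic()

def on_change_captured(change_id: str) -> None:
    """Stand-in for the CDC mechanism delivering the captured change."""
    captured_at = time.monotonic()
    latencies.append(captured_at - commit_times.pop(change_id))

# Example: simulate a change that the CDC pipeline picks up 5 ms later.
apply_change("txn-0001")
time.sleep(0.005)
on_change_captured("txn-0001")

print(f"mean capture latency: {statistics.mean(latencies) * 1000:.1f} ms")
```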

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60J60, 62M99.

Relevance:

100.00%

Publisher:

Abstract:

This research is investigating the claim that Change Data Capture (CDC) technologies capture data changes in real time. Based on theory, our hypothesis states that real-time CDC is not achievable with traditional approaches (log scanning, triggers and timestamps). Traditional approaches to CDC require a resource to be polled, which prevents true real-time CDC. We propose an approach to CDC that encapsulates the data source with a set of web services. These web services will propagate the changes to the targets and eliminate the need for polling. Additionally, we propose a framework for CDC technologies that allows changes to flow from source to target. This paper discusses current CDC technologies and presents the theory of why they are unable to deliver changes in real time. We then discuss our web service approach to CDC and the accompanying framework, explaining how they can produce real-time CDC. The paper concludes with a discussion of the research required to investigate the real-time capabilities of CDC technologies. © 2010 IEEE.
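As a rough sketch of the push idea described above (encapsulating the data source behind web services so that changes are propagated to targets rather than polled for), the snippet below wraps writes in a small Flask service and forwards each change to its targets with an HTTP POST. The endpoint, payload shape and in-memory data store are hypothetical illustrations, not the paper's actual services.

```python
# Minimal push-CDC sketch: the data source is reached only through this
# service, so every write can be propagated to the targets immediately,
# with no polling on the targets' side.
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)

DATABASE = {}                                   # stand-in for the OLTP data source
TARGETS = ["http://warehouse.example/changes"]  # hypothetical change consumers

@app.route("/orders/<order_id>", methods=["PUT"])
def upsert_order(order_id):
    payload = request.get_json(force=True)
    DATABASE[order_id] = payload                # apply the change to the source

    change_event = {"table": "orders", "key": order_id, "data": payload}
    for target in TARGETS:                      # push the change to every target
        try:
            requests.post(target, json=change_event, timeout=2)
        except requests.RequestException:
            pass                                # real code would queue and retry

    return jsonify({"status": "applied and propagated"}), 200

if __name__ == "__main__":
    app.run(port=8080)
```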

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60J80, 62P05.

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the psychosocial construction of the conceptions of time and social change, and their relation with the notion of progress established by modernity, as well as their contradictory aspects, considering their historicity as psychosocial and social phenomena. The social construction of the notion of time is discussed; it responds to forms of social organization, thus turning social changes into historical and temporal landmarks. The association between social changes and social movements is discussed and the concept of progress is deconstructed. Finally, the importance of the idea of movement in the construction of the concepts of time, change and progress is pointed out; it is also explained how modernity imbued such notions with direction and goals, like an arrow, according to the prevailing interests, while modeling them in its own image and likeness.

Relevance:

90.00%

Publisher:

Abstract:

We introduce and study a class of infinite-horizon nonzero-sum non-cooperative stochastic games with infinitely many interacting agents, using ideas of statistical mechanics. First we show, in the general case of asymmetric interactions, the existence of a strategy that allows any player to eliminate losses after a finite random time. In the special case of symmetric interactions, we also prove that, as time goes to infinity, the game converges to a Nash equilibrium. Moreover, assuming that all agents adopt the same strategy, and using arguments related to those leading to perfect simulation algorithms, spatial mixing and ergodicity are proved. In turn, ergodicity allows us to prove “fixation”, i.e. that players will adopt a constant strategy after a finite time. The resulting dynamics is related to zero-temperature Glauber dynamics on random graphs of possibly infinite volume.
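The zero-temperature Glauber dynamics mentioned at the end of the abstract can be illustrated on a finite random graph: a randomly chosen node repeatedly aligns its spin (strategy) with the majority of its neighbours, and the configuration tends to freeze, mirroring the fixation result. This is only a finite-volume sketch under assumed parameters, not the paper's infinite-volume construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite Erdos-Renyi graph G(n, p) as a boolean adjacency matrix (assumed parameters).
n, p = 200, 0.05
adj = rng.random((n, n)) < p
adj = np.triu(adj, 1)
adj = adj | adj.T                       # symmetric, no self-loops

spins = rng.choice([-1, 1], size=n)     # initial strategies/spins

# Zero-temperature Glauber dynamics: a uniformly chosen node adopts the
# majority spin of its neighbours; ties are broken uniformly at random.
for _ in range(50_000):
    i = rng.integers(n)
    field = spins[adj[i]].sum()
    if field > 0:
        spins[i] = 1
    elif field < 0:
        spins[i] = -1
    else:
        spins[i] = rng.choice([-1, 1])

# In a (near-)frozen configuration, few if any nodes still disagree with a
# strict majority of their neighbours.
misaligned = sum(spins[i] * spins[adj[i]].sum() < 0 for i in range(n))
print(f"nodes still misaligned with their local majority: {misaligned}")
```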

Relevance:

90.00%

Publisher:

Abstract:

Radar refractivity retrievals can capture near-surface humidity changes, but noisy phase changes of the ground clutter returns limit the accuracy for both klystron- and magnetron-based systems. Observations with a C-band (5.6 cm) magnetron weather radar indicate that the correction for phase changes introduced by local oscillator frequency changes leads to refractivity errors no larger than 0.25 N units: equivalent to a relative humidity change of only 0.25% at 20°C. Requested stable local oscillator (STALO) frequency changes were accurate to 0.002 ppm based on laboratory measurements. More serious are the random phase change errors introduced when targets are not at the range-gate center and there are changes in the transmitter frequency (ΔfTx) or the refractivity (ΔN). Observations at C band with a 2-μs pulse show an additional 66° of phase change noise for a ΔfTx of 190 kHz (34 ppm); this allows the effect due to ΔN to be predicted. Even at S band with klystron transmitters, significant phase change noise should occur when a large ΔN develops relative to the reference period [e.g., ~55° when ΔN = 60 for the Next Generation Weather Radar (NEXRAD) radars]. At shorter wavelengths (e.g., C and X band) and with magnetron transmitters in particular, refractivity retrievals relative to an earlier reference period are even more difficult, and operational retrievals may be restricted to changes over shorter (e.g., hourly) periods of time. Target location errors can be reduced by using a shorter pulse or identified by a new technique making alternate measurements at two closely spaced frequencies, which could even be achieved with a dual–pulse repetition frequency (PRF) operation of a magnetron transmitter.
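The two error terms discussed above follow, to first order, from the two-way propagation phase φ ≈ 4πfr/c of a clutter target at range r: a target displaced by δr from the range-gate centre picks up a spurious phase change of roughly 4π·ΔfTx·δr/c when the transmitter frequency changes, and roughly 4π·f·δr·ΔN·10⁻⁶/c when the refractivity changes. The sketch below evaluates these simple expressions for assumed gate sizes and frequencies; it is an order-of-magnitude illustration, not the paper's error model.

```python
import numpy as np

C = 3.0e8  # speed of light, m/s

def phase_error_from_tx_shift(delta_f_tx_hz, target_offset_m):
    """Spurious phase change (degrees) from a transmitter frequency change,
    for a clutter target displaced from the range-gate centre."""
    return np.degrees(4 * np.pi * delta_f_tx_hz * target_offset_m / C)

def phase_error_from_refractivity(freq_hz, target_offset_m, delta_n):
    """Spurious phase change (degrees) from a refractivity change delta_n
    (in N units), for a target displaced from the range-gate centre."""
    return np.degrees(4 * np.pi * freq_hz * target_offset_m * delta_n * 1e-6 / C)

# Assumed values: 2-us pulse (300 m gate, so offsets up to ~150 m) and a
# 190 kHz transmitter frequency change.
print(phase_error_from_tx_shift(190e3, 150.0))            # ~68 deg at the gate edge

# Assumed values: S band (~2.8 GHz), ~125 m offset, delta N = 60.
print(phase_error_from_refractivity(2.8e9, 125.0, 60.0))  # ~50 deg
```

The actual noise figures quoted in the abstract depend on how target offsets are distributed within the gate, so these single-offset numbers should only be read as being of the right order.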

Relevance:

90.00%

Publisher:

Abstract:

In this paper we study the pricing problem of derivatives written in terms of a two-dimensional time-changed Lévy process. Then, we examine an existing relation between prices of put and call options, of both the European and the American type. This relation is called put-call duality. It includes, as a particular case, the relation known as put-call symmetry. Necessary and sufficient conditions for put-call symmetry to hold are shown in terms of the triplet of local characteristics of the time-changed Lévy process. In this way we extend the results obtained in Fajardo and Mordecki (2004) to the case of time-changed Lévy processes.
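For orientation, the put-call symmetry that arises as a particular case of this duality can be written down explicitly in the classical Black-Scholes setting with interest rate r and dividend yield δ (this is the textbook special case, not the paper's general time-changed Lévy statement):

```latex
% Put-call symmetry, Black-Scholes special case: a call on spot S_0 with
% strike K under rates (r, \delta) has the same value as a put on spot K
% with strike S_0 under the swapped rates (\delta, r).
\[
  C\bigl(S_0, K, \tau;\, r, \delta, \sigma\bigr)
  \;=\;
  P\bigl(K, S_0, \tau;\, \delta, r, \sigma\bigr)
\]
```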

Relevance:

90.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

90.00%

Publisher:

Abstract:

Throughout this article, it is assumed that the non-central chi-square chart with two-stage sampling (TSS chi-square chart) is employed to monitor a process where the observations of the quality characteristic of interest X are independent and identically normally distributed with mean μ and variance σ². The process is considered to start with the mean and the variance on target (μ = μ0; σ² = σ0²), but at some random time in the future an assignable cause shifts the mean from μ0 to μ1 = μ0 ± δσ0, δ > 0, and/or increases the variance from σ0² to σ1² = γ²σ0², γ > 1. Before the assignable cause occurs, the process is considered to be in a state of statistical control (the in-control state). As with the Shewhart charts, samples of size n0 + 1 are taken from the process at regular time intervals. The sampling is performed in two stages. At the first stage, the first item of the i-th sample is inspected. If its X value, say Xi1, is close to the target value (|Xi1 − μ0| < w0σ0, w0 > 0), then the sampling is interrupted. Otherwise, at the second stage, the remaining n0 items are inspected and the following statistic is computed: Wi = Σj=2…n0+1 (Xij − μ0 + ξiσ0)², i = 1, 2, …. Let d be a positive constant; then ξi = d if Xi1 > μ0, otherwise ξi = −d. A signal is given at sample i if |Xi1 − μ0| > w0σ0 and Wi > kChi σ0², where kChi is the factor used to determine the upper control limit of the non-central chi-square chart. If devices such as go/no-go gauges can be considered, then measurements are not required except when the sampling goes to the second stage. Let P be the probability of deciding that the process is in control and Pi, i = 1, 2, be the probability of deciding that the process is in control at stage i of the sampling procedure. Thus P = P1 + P2 − P1P2, where P1 = Pr[μ0 − w0σ0 ≤ X ≤ μ0 + w0σ0] and P2 = Pr[W ≤ kChi σ0²]. During the in-control period, W/σ0² follows a non-central chi-square distribution with n0 degrees of freedom and non-centrality parameter λ0 = n0d², i.e. W/σ0² ~ χ²(n0, λ0). During the out-of-control period, W/σ1² follows a non-central chi-square distribution with n0 degrees of freedom and non-centrality parameter λ1 = n0(δ + ξ)²/γ². The effectiveness of a control chart in detecting a process change can be measured by the average run length (ARL), that is, the speed with which the chart detects process shifts. The ARL for the proposed chart is easily determined because, in this case, the number of samples before a signal is a geometrically distributed random variable with parameter 1 − P, that is, ARL = 1/(1 − P). It is shown that the performance of the proposed chart is better than that of the joint X̄ and R charts. Furthermore, if the TSS chi-square chart is used for monitoring diameters, volumes, weights, etc., then appropriate devices, such as go/no-go gauges, can be used to decide whether the sampling should go to the second stage or not. When the process is stable and the joint X̄ and R charts are in use, the monitoring becomes monotonous because an X̄ or R value rarely falls outside the control limits. The natural consequence is that the user pays less and less attention to the steps required to obtain the X̄ and R values. In some cases, this lack of attention can result in serious mistakes. The TSS chi-square chart has the advantage that most of the samplings are interrupted; consequently, most of the time the user will be working with attributes. Our experience shows that the inspection of one item by attribute is much less monotonous than measuring four or five items at each sampling.
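Given the definitions above, the in-control ARL of the TSS chi-square chart follows directly from P1, P2 and P = P1 + P2 − P1P2. The sketch below computes it with SciPy for illustrative (assumed) values of w0, d, n0 and kChi; it covers only the in-control case, where the non-centrality parameter is λ0 = n0d² irrespective of the sign of ξi.

```python
from scipy.stats import norm, ncx2

# Assumed (illustrative) chart parameters.
n0 = 4        # second-stage sample size (total sample size is n0 + 1)
w0 = 1.0      # first-stage warning limit, in units of sigma_0
d = 1.0       # offset used in the statistic W
k_chi = 20.0  # upper control-limit factor for W / sigma_0^2

# In-control probabilities, following the abstract's definitions:
# P1 = Pr[|X - mu_0| <= w0 * sigma_0] with X ~ N(mu_0, sigma_0^2),
# P2 = Pr[W <= k_chi * sigma_0^2] with W / sigma_0^2 ~ chi2_{n0}(lambda_0).
P1 = norm.cdf(w0) - norm.cdf(-w0)
lambda0 = n0 * d**2
P2 = ncx2.cdf(k_chi, df=n0, nc=lambda0)

P = P1 + P2 - P1 * P2          # probability of no signal at a given sample
ARL0 = 1.0 / (1.0 - P)         # in-control average run length

print(f"P1={P1:.4f}  P2={P2:.4f}  in-control ARL={ARL0:.1f}")
```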

Relevance:

90.00%

Publisher:

Abstract:

Polynomial Chaos Expansion (PCE) is widely recognized as a flexible tool to represent different types of random variables and processes. However, applications to real, experimental data are still limited. In this article, PCE is used to represent the random time-evolution of metal corrosion growth in marine environments. The PCE coefficients are determined so as to represent the data of 45 corrosion coupons tested by Jeffrey and Melchers (2001) at Taylors Beach, Australia. Accuracy of the representation and possibilities for model extrapolation are considered in the study. Results show that reasonably accurate smooth representations of the corrosion process can be obtained; the accuracy of the representation is limited by the fact that a smooth model is used to represent non-smooth corrosion data. Random corrosion leads to time-variant reliability problems, due to resistance degradation over time. Time-variant reliability problems are not trivial to solve, especially under random process loading. Two example problems are solved herein, showing how the developed PCE representations can be employed in reliability analysis of structures subject to marine corrosion. Monte Carlo simulation is used to solve the resulting time-variant reliability problems; an accurate and more computationally efficient solution is also presented.
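As a toy illustration of how a PCE representation of corrosion can feed a time-variant reliability analysis by Monte Carlo, the sketch below uses a hypothetical second-order Hermite expansion in a single standard normal variable and a hypothetical resistance/load model; the coefficients are not those fitted to the Taylors Beach coupon data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2nd-order Hermite PCE of corrosion loss d(t, xi), xi ~ N(0, 1):
# d(t, xi) = c0(t) + c1(t) * xi + c2(t) * (xi**2 - 1)
# (He_0 = 1, He_1 = xi, He_2 = xi^2 - 1 are probabilists' Hermite polynomials).
def corrosion_loss(t, xi):
    c0 = 0.10 * t**0.8          # mm, hypothetical mean growth
    c1 = 0.03 * t**0.8          # mm, hypothetical scatter growth
    c2 = 0.01 * t**0.8
    return c0 + c1 * xi + c2 * (xi**2 - 1.0)

# Crude time-variant reliability by Monte Carlo: failure occurs if, at any
# yearly load application, the degraded resistance falls below the load.
n_samples, years = 100_000, 20
R0 = 10.0                        # hypothetical initial resistance
alpha = 2.0                      # resistance lost per mm of corrosion

xi = rng.standard_normal(n_samples)
failed = np.zeros(n_samples, dtype=bool)
for t in range(1, years + 1):
    R_t = R0 - alpha * corrosion_loss(t, xi)
    S_t = rng.normal(loc=5.0, scale=1.0, size=n_samples)   # annual peak load
    failed |= S_t > R_t

print(f"estimated failure probability over {years} years: {failed.mean():.4f}")
```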

Relevance:

90.00%

Publisher:

Abstract:

This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we brought the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We studied LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric methods of estimation. Then, we introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After having introduced the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems that deviates from the classical exponential Debye pattern. Then, we focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations were obtained by using fractional integrals and derivatives of distributed order. In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), a parametric class of H-sssi processes whose marginal probability density functions evolve in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work, we remarked several times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t); all these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space; however, we were able to provide a characterization that does not depend on the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process and, in particular, that it generalizes Brownian motion and fractional Brownian motion as well. Finally, we introduced and analyzed a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which was made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation was interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then considered the subordinated process Y(t) = X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation which involves the same memory kernel K(t). We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
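The subordination Y(t) = X(l(t)) described above is a random time change of a Markovian diffusion by the process l(t). As a simple, standard example of such a time change (a gamma subordinator applied to Brownian motion, giving a variance-gamma-type process, not the ggBm or the inverse-time processes studied in the thesis), one can simulate:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random time change: l(t) is a gamma subordinator with unit mean rate and
# variance rate nu, i.e. increments over dt are Gamma(shape=dt/nu, scale=nu).
T, n_steps, n_paths, nu = 1.0, 500, 10_000, 0.2
dt = T / n_steps

dl = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_steps))  # random-time increments
dY = rng.normal(loc=0.0, scale=np.sqrt(dl))                       # Brownian increments over dl
Y_T = dY.sum(axis=1)                                              # Y(T) = B(l(T))

# Sanity checks: Var[B(l(T))] = E[l(T)] = T for a unit-mean subordinator, and
# the positive excess kurtosis (about 3*nu/T for this construction) is the
# fingerprint of the random time change relative to plain Brownian motion.
print(f"sample Var[Y(T)] = {Y_T.var():.3f}  (theory: {T:.3f})")
excess_kurtosis = ((Y_T - Y_T.mean())**4).mean() / Y_T.var()**2 - 3.0
print(f"sample excess kurtosis = {excess_kurtosis:.3f}")
```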