887 results for Non-Wear Time


Relevance:

90.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60J60, 62M99.

Relevance:

90.00%

Publisher:

Abstract:

The scarcity of literature on parameter estimation for dynamic systems has been identified as the main reason that parametric bounds are lacking for chaotic time series. The literature shows that a chaotic system displays sensitive dependence on initial conditions, and our study reveals that the behavior of a chaotic system is also sensitive to changes in parameter values. A parameter estimation technique could therefore make it possible to establish parametric bounds on the nonlinear dynamic system underlying a given time series, which in turn can improve predictability. By extracting the relationship between parametric bounds and predictability, we implemented chaos-based models for improving prediction in time series.

This study describes work done to establish bounds on a set of unknown parameters. Our results reveal that by establishing parametric bounds, it is possible to improve the predictability of any time series, even when the dynamics or the mathematical model of that series is not known a priori. In our attempt to improve the predictability of various time series, we have established bounds for a set of unknown parameters: (i) the embedding dimension used to unfold a set of observations in phase space, (ii) the time delay to use for a series, (iii) the number of neighborhood points to use so that false neighbors are avoided, and (iv) the degree of the local polynomial used to build numerical interpolation functions from one region to another. Using these bounds, we obtain better predictability in chaotic time series than previously reported. In addition, the developments of this dissertation establish a theoretical framework for investigating predictability in time series from a system-dynamics point of view.

In closing, our procedure significantly reduces computer resource usage, as the search method is refined and efficient. Finally, the uniqueness of our method lies in its ability to extract the chaotic dynamics inherent in a non-linear time series by observing its values.
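
A minimal sketch of parameter (i), the time-delay embedding that unfolds a scalar series in phase space, assuming NumPy; the embedding dimension m and delay tau below are illustrative values, not the bounds the dissertation derives:

```python
import numpy as np

def delay_embed(x, m=3, tau=2):
    """Unfold a scalar series x into m-dimensional phase-space vectors."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Example: embed a series from the logistic map in its chaotic regime.
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
vectors = delay_embed(x, m=3, tau=2)
print(vectors.shape)  # (496, 3)
```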

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this research is to develop an optimal kernel for use in a real-time engineering and communications system. Since the application is a real-time system, relevant real-time issues are studied in conjunction with kernel-related issues. The emphasis of the research is the development of a kernel that not only adheres to the criteria of a real-time environment, namely determinism and performance, but also provides the flexibility and portability associated with non-real-time environments. The essence of the research is to study how features found in non-real-time systems could be applied to real-time systems in order to produce an optimal kernel that provides flexibility and architecture independence while maintaining the performance needed by most engineering applications. Traditionally, real-time kernels have been developed in assembly language. By utilizing the powerful constructs of the C language, a real-time kernel was developed that meets the goals of flexibility and portability while still satisfying the real-time criteria. The kernel is implemented on Motorola 68010/20/30/40 microprocessor-based systems.
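
The kernel described here is written in C for 68k hardware; purely as a language-neutral illustration of the determinism criterion it mentions, the hypothetical sketch below shows a fixed-priority ready queue whose dispatch decision is always reproducible:

```python
import heapq
import itertools

class ReadyQueue:
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # FIFO tie-break within a priority

    def make_ready(self, priority, task):
        # Lower number = higher priority; heapq pops the smallest tuple.
        heapq.heappush(self._heap, (priority, next(self._seq), task))

    def dispatch(self):
        # Deterministic: always the highest-priority, oldest-ready task.
        return heapq.heappop(self._heap)[2] if self._heap else None

rq = ReadyQueue()
rq.make_ready(2, "logger")
rq.make_ready(0, "sensor_isr_handler")
rq.make_ready(1, "control_loop")
print(rq.dispatch())  # sensor_isr_handler
```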

Relevance:

90.00%

Publisher:

Abstract:

Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox implementing this methodology is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
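
A minimal sketch of the two TS steps under stated assumptions (NumPy/SciPy, synthetic daily data; the authors' actual toolbox, tsEva, is written in MATLAB): a running mean and standard deviation normalize the series, a stationary GEV is fitted to the annual maxima of the normalized signal, and the transformation is reversed to recover a time-varying return level:

```python
import numpy as np
from scipy import stats

def running_stat(x, window):
    """Centered running mean and standard deviation (a simple low-pass filter)."""
    kernel = np.ones(window) / window
    mu = np.convolve(x, kernel, mode="same")
    sigma = np.sqrt(np.convolve((x - mu) ** 2, kernel, mode="same"))
    return mu, sigma

rng = np.random.default_rng(0)
t = np.arange(50 * 365)                            # 50 years of synthetic daily data
signal = 0.002 * t + rng.gumbel(0.0, 1.0, t.size)  # slow trend + heavy-tailed noise

mu, sigma = running_stat(signal, window=5 * 365)
y = (signal - mu) / sigma                          # step (i): transform to stationary

annual_max = y.reshape(50, 365).max(axis=1)
shape, loc, scale = stats.genextreme.fit(annual_max)  # stationary GEV fit

# Step (ii): reverse the transformation to get a time-varying 100-year level.
y100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc, scale)
level100 = mu + sigma * y100
print(level100[::3650])                            # return level every ~10 years
```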

Relevance:

90.00%

Publisher:

Abstract:

United States federal agencies assess flood risk using Bulletin 17B procedures, which assume that annual maximum flood series are stationary. This is a significant limitation of current flood frequency models, as the flood distribution is thereby assumed to be unaffected by trends or periodicity in atmospheric/climatic variables and/or by anthropogenic activities. The validity of this assumption is at the core of this thesis, which aims to improve understanding of the forms and potential causes of non-stationarity in flood series for moderately impaired watersheds in the Upper Midwest and Northeastern US. Prior studies investigated non-stationarity in flood series for unimpaired watersheds; however, because the majority of streams lie in areas of increasing human activity, the relative and coupled impacts of natural and anthropogenic factors must be considered so that non-stationary flood frequency models can be developed for flood risk forecasting over planning horizons relevant to large-scale water resources planning and management.
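
As a hedged illustration of detecting the non-stationarity at issue (not the thesis's method), the sketch below applies a Kendall's-tau trend test to a synthetic annual maximum flood series using SciPy:

```python
import numpy as np
from scipy import stats

years = np.arange(1950, 2020)
rng = np.random.default_rng(1)
# Synthetic annual peaks with a mild upward trend in the location parameter.
peaks = rng.gumbel(loc=100 + 0.5 * (years - 1950), scale=30, size=years.size)

tau, p_value = stats.kendalltau(years, peaks)
print(f"Kendall tau = {tau:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject stationarity: monotonic trend detected in annual maxima.")
```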

Relevance:

80.00%

Publisher:

Abstract:

An algorithm for explicit integration of structural dynamics problems with multiple time steps is proposed that averages accelerations to obtain subcycle states at a nodal interface between regions integrated with different time steps. With integer time step ratios, the resulting subcycle updates at the interface sum to give the same effect as a central difference update over a major cycle. The algorithm is shown to have good accuracy, and its stability properties in linear elastic analysis are similar to those of constant-velocity subcycling algorithms. The implementation of a generalised form of the algorithm with non-integer time step ratios is presented. (C) 1997 by John Wiley & Sons, Ltd.
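
For orientation, the sketch below shows the plain central difference update that such subcycling schemes build on, applied to a single-degree-of-freedom oscillator at a major time step and at an integer ratio of subcycles; it is an illustration, not the paper's multi-time-step code:

```python
import numpy as np

def central_difference(m_mass, k, u0, v0, dt, steps):
    """Explicit central difference for m*u'' + k*u = 0."""
    u = np.empty(steps + 1)
    u[0] = u0
    a0 = -k * u0 / m_mass
    u[1] = u0 + dt * v0 + 0.5 * dt**2 * a0   # standard startup step
    for n in range(1, steps):
        a = -k * u[n] / m_mass
        u[n + 1] = 2 * u[n] - u[n - 1] + dt**2 * a
    return u

dt = 0.01                     # major cycle
ratio = 4                     # integer time step ratio m
coarse = central_difference(1.0, 100.0, 1.0, 0.0, dt, 100)
fine = central_difference(1.0, 100.0, 1.0, 0.0, dt / ratio, 100 * ratio)
print(coarse[-1], fine[-1])   # both approximate u(t=1) = cos(10*1)
```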

Relevance:

80.00%

Publisher:

Abstract:

Preliminary version

Relevance:

80.00%

Publisher:

Abstract:

The IEEE 802.15.4 standard provides appealing features for simultaneously supporting real-time and non-real-time traffic, but it can support real-time communications from at most seven devices, and it cannot guarantee delay bounds lower than the superframe duration. Motivated by this problem, in this paper we propose an Explicit Guaranteed time slot Sharing and Allocation scheme (EGSA) for beacon-enabled IEEE 802.15.4 networks. This scheme provides tighter delay bounds for real-time communications by splitting the Contention-Free Period (CFP) into smaller mini time slots and by means of a new guaranteed bandwidth allocation scheme for a set of devices with periodic messages. At the same time, the novel bandwidth allocation scheme maximizes the duration of the CFP available for non-real-time communications. Performance analysis results show that the EGSA scheme works efficiently and outperforms competing schemes in terms of both guaranteed delay and bandwidth utilization.
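
A hypothetical sketch of the kind of bandwidth calculation a GTS-splitting scheme implies: for devices with periodic messages, count the mini time slots each needs per superframe. The mini-slot size and device parameters are assumptions, not values from the paper; the 250 kb/s rate and 15.36 ms base superframe are the 802.15.4 defaults in the 2.4 GHz band:

```python
from math import ceil

SUPERFRAME_MS = 15.36        # 802.15.4 base superframe duration (BO = 0)
MINI_SLOT_MS = 0.24          # CFP split into mini slots (assumed size)
DATA_RATE_KBPS = 250         # 802.15.4 rate in the 2.4 GHz band

def mini_slots_needed(payload_bytes, period_ms):
    """Mini slots per superframe for one periodic message stream."""
    tx_time_ms = payload_bytes * 8 / DATA_RATE_KBPS
    messages_per_superframe = SUPERFRAME_MS / period_ms
    return ceil(messages_per_superframe * tx_time_ms / MINI_SLOT_MS)

devices = [(32, 15.36), (64, 30.72), (16, 7.68)]  # (payload bytes, period ms)
total = sum(mini_slots_needed(b, p) for b, p in devices)
print(f"{total} mini slots reserved; the rest of the CFP stays free "
      "for non-real-time traffic.")
```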

Relevance:

80.00%

Publisher:

Abstract:

We discuss the development of a simple, globally prioritized, multi-channel medium access control (MAC) protocol for wireless networks. The protocol provides "hard" pre-run-time real-time guarantees to sporadic message streams, exploits a very large fraction of the capacity of all channels for "hard" real-time traffic, and also makes it possible to fully utilize the channels with non-real-time traffic when no hard real-time messages request transmission. The potential of such protocols for real-time applications is discussed, and a schedulability analysis is presented.
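
As a generic stand-in for the paper's schedulability analysis (the paper's actual test is its own), the sketch below runs a simple utilization-based admission check over sporadic message streams characterized by transmission time and minimum inter-arrival time; all values are illustrative:

```python
# Each stream: (transmission_time_ms, minimum_inter_arrival_ms).
streams = [(2.0, 20.0), (1.5, 10.0), (3.0, 50.0)]

utilization = sum(c / t for c, t in streams)
channels = 2
per_channel_capacity = 0.9   # fraction usable for hard real-time (assumption)

if utilization <= channels * per_channel_capacity:
    print(f"U = {utilization:.2f}: admissible; leftover capacity "
          "carries non-real-time traffic.")
else:
    print(f"U = {utilization:.2f}: reject or reassign streams.")
```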

Relevance:

80.00%

Publisher:

Abstract:

Consumer-electronics systems are becoming increasingly complex as the number of integrated applications is growing. Some of these applications have real-time requirements, while other non-real-time applications only require good average performance. For cost-efficient design, contemporary platforms feature an increasing number of cores that share resources, such as memories and interconnects. However, resource sharing causes contention that must be resolved by a resource arbiter, such as Time-Division Multiplexing. A key challenge is to configure this arbiter to satisfy the bandwidth and latency requirements of the real-time applications, while maximizing the slack capacity to improve performance of their non-real-time counterparts. As this configuration problem is NP-hard, a sophisticated automated configuration method is required to avoid negatively impacting design time. The main contributions of this article are: 1) An optimal approach that takes an existing integer linear programming (ILP) model addressing the problem and wraps it in a branch-and-price framework to improve scalability. 2) A faster heuristic algorithm that typically provides near-optimal solutions. 3) An experimental evaluation that quantitatively compares the branch-and-price approach to the previously formulated ILP model and the proposed heuristic. 4) A case study of an HD video and graphics processing system that demonstrates the practical applicability of the approach.
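
A naive greedy assignment, sketched below, illustrates the configuration problem: give each real-time client enough evenly spaced TDM slots for its bandwidth share and keep the rest as slack. It is not the article's branch-and-price approach or its heuristic; the client names and shares are invented:

```python
from math import ceil

TABLE_SLOTS = 32
clients = {"dsp": 0.25, "video_in": 0.20, "cpu_rt": 0.10}  # bandwidth shares

table = [None] * TABLE_SLOTS
for name, share in sorted(clients.items(), key=lambda kv: -kv[1]):
    need = ceil(share * TABLE_SLOTS)
    # Spread the client's slots evenly to keep its worst-case latency low.
    stride = TABLE_SLOTS / need
    for i in range(need):
        slot = int(i * stride)
        while table[slot] is not None:          # linear probe on collision
            slot = (slot + 1) % TABLE_SLOTS
        table[slot] = name

slack = table.count(None)
print(f"{slack}/{TABLE_SLOTS} slots left as slack for non-real-time use.")
```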

Relevance:

80.00%

Publisher:

Abstract:

During drilling operations, cuttings are produced downhole and must be removed to avoid problems that lead to Non-Productive Time (NPT). Most stuck-pipe events, and the Bottom-Hole Assembly (BHA) losses that follow, are related to poor hole cleaning. Many parameters help determine hole cleaning conditions, but a proper selection of the key parameters facilitates monitoring those conditions and deciding on interventions. The aim of hole cleaning monitoring is to keep track of borehole conditions, including hole cleaning efficiency and wellbore stability issues, during drilling operations. Adequate hole cleaning is one of the main concerns in underbalanced drilling operations, especially for directional and horizontal wells. This dissertation addresses hole cleaning fundamentals that serve as the basis for recommended practice during drilling operations: how parameters such as flow rate, Revolutions per Minute (RPM), Rate of Penetration (ROP), and mud weight improve hole cleaning performance, and how Equivalent Circulating Density (ECD), Torque and Drag (T&D), and the cuttings volume returned from downhole indicate how clean and stable the well is. In the case study, hole cleaning performance is monitored through real-time measurements of the cuttings volume removed from downhole at a given time, taking into account flow rate, RPM, ROP, and drilling fluid (mud) properties; this is plotted and compared with the volume expected from the drilled interval. ECD monitoring indicates hole stability conditions, while T&D and the returned cuttings volume indicate how clean the well is. T&D modeling software provides theoretical T&D trends, which are plotted and compared with the real-time measurements, and the measured hookloads are used to back-calculate friction factors along the wellbore.
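
A minimal sketch of the expected-versus-measured comparison described above, assuming a cylindrical borehole: the expected cuttings rate is the hole cross-sectional area times the ROP, converted to barrels per hour. All numbers are illustrative:

```python
import math

def expected_cuttings_rate(hole_diameter_in, rop_ft_per_hr):
    """Expected cuttings generation rate in barrels per hour."""
    area_ft2 = math.pi / 4 * (hole_diameter_in / 12) ** 2
    volume_ft3_per_hr = area_ft2 * rop_ft_per_hr
    return volume_ft3_per_hr / 5.615          # 1 bbl = 5.615 ft^3

expected = expected_cuttings_rate(hole_diameter_in=8.5, rop_ft_per_hr=60)
measured = 3.2                                # bbl/hr from surface equipment
efficiency = measured / expected
print(f"expected {expected:.2f} bbl/hr, cleaning efficiency {efficiency:.0%}")
```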

Relevance:

80.00%

Publisher:

Abstract:

In this work, we address different perspectives on the poetics, compositional strategies, and perceptual impact of time in the music of Gérard Grisey. In the first chapter, we examine the conception of time as durational unity and proportionality and its relation to other musical parameters. We then present three approaches to time that emerge from Grisey's poetics and from the analysis of his works: the break with durational proportionality and the relation between time and sound, the concept of change of temporal scale, and the analogy between time and cosmos. In the second chapter, we propose three temporal categories based chiefly on the concept of predictability: non-linear time, linear time, and processual time. In the third and final chapter, we set out the fundamentals of Information Theory, its relation to Grisey's discourse, and its method of application.

Relevance:

80.00%

Publisher:

Abstract:

STUDY QUESTION: What are the long-term trends in the total (live births, fetal deaths, and terminations of pregnancy for fetal anomaly) and live birth prevalence of neural tube defects (NTD) in Europe, where many countries have issued recommendations for folic acid supplementation but no policy for mandatory folic acid fortification of food exists? METHODS: This was a population-based observational study using data on 11 353 cases of NTD not associated with chromosomal anomalies, including 4162 cases of anencephaly and 5776 cases of spina bifida, from 28 EUROCAT (European Surveillance of Congenital Anomalies) registries covering approximately 12.5 million births in 19 countries between 1991 and 2011. The main outcome measures were the total and live birth prevalence of NTD, as well as of anencephaly and spina bifida, with time trends analysed using random-effects Poisson regression models to account for heterogeneities across registries and splines to model non-linear time trends. SUMMARY ANSWER AND LIMITATIONS: Overall, the pooled total prevalence of NTD during the study period was 9.1 per 10 000 births. Prevalence of NTD fluctuated slightly but without an obvious downward trend, with the final estimate of the pooled total prevalence of NTD in 2011 similar to that in 1991. Estimates from Poisson models that took registry heterogeneities into account showed an annual increase of 4% (prevalence ratio 1.04, 95% confidence interval 1.01 to 1.07) in 1995-99 and a decrease of 3% per year in 1999-2003 (0.97, 0.95 to 0.99), with stable rates thereafter. The trend patterns for anencephaly and spina bifida were similar, but neither anomaly decreased substantially over time. The live birth prevalence of NTD generally decreased, especially for anencephaly. Registration problems or other data artefacts cannot be excluded as a partial explanation of the observed trends (or lack thereof) in the prevalence of NTD. WHAT THIS STUDY ADDS: In the absence of mandatory fortification, the prevalence of NTD has not decreased in Europe despite longstanding recommendations aimed at promoting peri-conceptional folic acid supplementation and the existence of voluntary folic acid fortification. FUNDING, COMPETING INTERESTS, DATA SHARING: The study was funded by the European Public Health Commission, EUROCAT Joint Action 2011-2013. HD and ML received support from the European Commission DG Sanco during the conduct of this study. No additional data are available.
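
A hedged sketch of the kind of trend model described in METHODS, fitted to synthetic data with statsmodels: a Poisson GLM of yearly case counts with a B-spline in year and a log-births offset. The registry random effects of the actual study are omitted here:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({"year": np.arange(1991, 2012)})
df["births"] = rng.integers(400_000, 700_000, size=len(df))
df["cases"] = rng.poisson(9.1e-4 * df["births"])   # about 9.1 per 10,000 births

# The B-spline in year captures a non-linear trend; the log(births) offset
# turns the count model into a model of prevalence per birth.
model = smf.glm(
    "cases ~ bs(year, df=4)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["births"]),
).fit()
print(np.exp(model.params))   # multiplicative effects on prevalence
```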

Relevance:

80.00%

Publisher:

Abstract:

New technologies emerging in the recent decade have brought new options for cross-platform computer graphics development. This master's thesis surveyed the possibilities for cross-platform 3D graphics development; all platform-dependent and non-real-time solutions were excluded. WebGL and two different OpenGL-based solutions were assessed via a demo application built with the most recent development tools. The results note the pros and cons of each solution.

Relevance:

80.00%

Publisher:

Abstract:

Complex networks arise naturally and spontaneously from all things that act as part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous but currently poorly understood. A number of human-designed algorithms, called graph models, have been proposed to describe the organizational behaviour of real-world networks, and they have recently enabled breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications, and the social sciences. These graph models represent significant human effort: deriving accurate ones is non-trivial, time-intensive, and challenging, and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure; it is also, to the best of the author's knowledge, the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than models currently used in the literature.
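
An illustrative sketch of the fitness evaluation at the heart of automatic graph model inference, assuming networkx: candidate generators are scored by how closely the graphs they produce match the target's average degree and clustering. The thesis evolves the generators with genetic programming; here two fixed candidates stand in for an evolved population:

```python
import networkx as nx

def fitness(candidate, target, n_samples=5):
    """Mean property distance between generated graphs and the target."""
    t_deg = sum(d for _, d in target.degree()) / target.number_of_nodes()
    t_cc = nx.average_clustering(target)
    score = 0.0
    for seed in range(n_samples):
        g = candidate(target.number_of_nodes(), seed)
        deg = sum(d for _, d in g.degree()) / g.number_of_nodes()
        score += abs(deg - t_deg) + abs(nx.average_clustering(g) - t_cc)
    return score / n_samples          # lower is better

target = nx.barabasi_albert_graph(200, 3, seed=42)
candidates = {
    "preferential_attachment": lambda n, s: nx.barabasi_albert_graph(n, 3, seed=s),
    "random_uniform": lambda n, s: nx.gnp_random_graph(n, 0.03, seed=s),
}
for name, gen in candidates.items():
    print(name, round(fitness(gen, target), 3))
```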