21 results for Event-based timing

in University of Queensland eSpace - Australia


Relevance: 100.00%

Abstract:

In recent years, the paradigm of component-based software engineering has become established in the construction of complex mission-critical systems. Due to this trend, there is a practical need for techniques that evaluate critical properties (such as safety, reliability, availability or performance) of these systems. In this paper, we review several high-level techniques for the evaluation of safety properties of component-based systems and propose a new evaluation model (State Event Fault Trees) that extends safety analysis towards a lower abstraction level. This model possesses a state-event semantics and strong encapsulation, which is especially useful for the evaluation of component-based software systems. Finally, we compare the techniques and give suggestions for their combined usage.
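
As a rough, hypothetical illustration of the state-event semantics mentioned in this abstract, the Python sketch below models a component whose top-level failure requires a triggering event to arrive while the component is in a hazardous state; all class, state and event names are invented for illustration and are not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """Toy state-event failure logic: events drive state transitions, and a
    trigger only causes failure when it arrives in a hazardous state."""
    state: str = "nominal"                 # current component state
    failed: bool = False                   # top-level failure flag
    hazardous_states: set = field(default_factory=lambda: {"degraded"})

    def on_event(self, event: str) -> None:
        if event == "fault_detected":
            self.state = "degraded"
        elif event == "repair":
            self.state = "nominal"
        elif event == "trigger" and self.state in self.hazardous_states:
            self.failed = True

c = Component()
for e in ["trigger", "fault_detected", "trigger"]:
    c.on_event(e)
print(c.failed)  # True: the trigger arrived while the component was degraded
```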

Relevance: 90.00%

Abstract:

Objective: The description and evaluation of the performance of a new real-time seizure detection algorithm in the newborn infant. Methods: The algorithm includes parallel fragmentation of the EEG signal into waves; wave-feature extraction and averaging; and elementary, preliminary and final detection. The algorithm detects EEG waves with heightened regularity, using wave intervals, amplitudes and shapes. The performance of the algorithm was assessed with the use of event-based and liberal and conservative time-based approaches and compared with the performance of Gotman's and Liu's algorithms. Results: The algorithm was assessed on multi-channel EEG records of 55 neonates, including 17 with seizures. The algorithm showed sensitivities ranging from 83% to 95% with positive predictive values (PPV) of 48-77%, and 2.0 false positive detections per hour. In comparison, Gotman's algorithm (with a 30 s gap-closing procedure) displayed sensitivities of 45-88% and PPV of 29-56%, with 7.4 false positives per hour; Liu's algorithm displayed sensitivities of 96-99% and PPV of 10-25%, with 15.7 false positives per hour. Conclusions: The wave-sequence-analysis-based algorithm displayed higher sensitivity, higher PPV and a substantially lower level of false positives than the two previously published algorithms. Significance: The proposed algorithm provides a basis for major improvements in neonatal seizure detection and monitoring.
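
A minimal sketch of the regularity idea described above (waves delimited in the signal, then epochs flagged when wave intervals and amplitudes are unusually regular) might look as follows in Python; the zero-crossing segmentation, the coefficient-of-variation test and the 0.25 threshold are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def regular_wave_epoch(eeg, fs, cv_threshold=0.25):
    """Flag an epoch whose waves show heightened regularity: segment at
    negative-to-positive zero crossings, then require a low coefficient of
    variation for both wave intervals and wave amplitudes."""
    crossings = np.where(np.diff(np.signbit(eeg).astype(int)) < 0)[0]
    if len(crossings) < 4:
        return False
    intervals = np.diff(crossings) / fs                      # wave durations (s)
    amplitudes = np.array([np.ptp(eeg[a:b])
                           for a, b in zip(crossings[:-1], crossings[1:])])
    cv_int = intervals.std() / intervals.mean()
    cv_amp = amplitudes.std() / amplitudes.mean()
    return bool(cv_int < cv_threshold and cv_amp < cv_threshold)

# Usage: slide over successive epochs of one EEG channel, e.g.
# flags = [regular_wave_epoch(epoch, fs=256) for epoch in epochs]
```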

Relevance: 80.00%

Abstract:

Experiments with simulators allow psychologists to better understand the causes of human errors and to build models of cognitive processes for use in human reliability assessment (HRA). This paper investigates an approach to task failure analysis based on patterns of behaviour, in contrast to more traditional event-based approaches. It considers, as a case study, a formal model of an air traffic control (ATC) system which incorporates controller behaviour. The cognitive model is formalised in the CSP process algebra. Patterns of behaviour are expressed as temporal logic properties. A model-checking technique is then used to verify whether the decomposition of the operator's behaviour into patterns is sound and complete with respect to the cognitive model. The decomposition is shown to be incomplete and a new behavioural pattern is identified, which appears to have been overlooked in the analysis of the data provided by the experiments with the simulator. This illustrates how formal analysis of operator models can yield fresh insights into how failures may arise in interactive systems.
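
The soundness and completeness check sketched in this abstract can be pictured, in a heavily simplified form, as asking whether every behaviour of the operator model matches at least one pattern; in the toy Python fragment below, finite traces stand in for the CSP cognitive model and plain predicates stand in for the temporal-logic properties, so all names and traces are hypothetical.

```python
# Toy stand-in for pattern coverage: traces of operator actions and
# pattern predicates; an uncovered trace signals an incomplete decomposition.
traces = [
    ("observe", "decide", "instruct"),                 # deliberate handling
    ("observe", "defer", "observe", "instruct"),       # deferred handling
    ("observe", "instruct"),                           # action without a decision step
]

patterns = {
    "deliberate": lambda t: "decide" in t and t.index("decide") < t.index("instruct"),
    "deferred":   lambda t: "defer" in t,
}

uncovered = [t for t in traces if not any(p(t) for p in patterns.values())]
print(uncovered)   # non-empty: a behaviour the current patterns fail to explain
```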

Relevance: 30.00%

Abstract:

Background: Hospital performance reports based on administrative data should distinguish differences in quality of care between hospitals from case mix related variation and random error effects. A study was undertaken to determine which of 12 diagnosis-outcome indicators measured across all hospitals in one state had significant risk adjusted systematic (or special cause) variation (SV) suggesting differences in quality of care. For those that did, we determined whether SV persists within hospital peer groups, whether indicator results correlate at the individual hospital level, and how many adverse outcomes would be avoided if all hospitals achieved indicator values equal to the best performing 20% of hospitals. Methods: All patients admitted during a 12 month period to 180 acute care hospitals in Queensland, Australia with heart failure (n = 5745), acute myocardial infarction (AMI) (n = 3427), or stroke (n = 2955) were entered into the study. Outcomes comprised in-hospital deaths, long hospital stays, and 30 day readmissions. Regression models produced standardised, risk adjusted diagnosis specific outcome event ratios for each hospital. Systematic and random variation in ratio distributions for each indicator were then apportioned using hierarchical statistical models. Results: Only five of 12 (42%) diagnosis-outcome indicators showed significant SV across all hospitals (long stays and same diagnosis readmissions for heart failure; in-hospital deaths and same diagnosis readmissions for AMI; and in-hospital deaths for stroke). Significant SV was seen for only two indicators within hospital peer groups (same diagnosis readmissions for heart failure in tertiary hospitals and in-hospital mortality for AMI in community hospitals). Only two pairs of indicators showed significant correlation. If all hospitals emulated the best performers, at least 20% of AMI and stroke deaths, heart failure long stays, and heart failure and AMI readmissions could be avoided. Conclusions: Diagnosis-outcome indicators based on administrative data require validation as markers of significant risk adjusted SV. Validated indicators allow quantification of realisable outcome benefits if all hospitals achieved best performer levels. The overall level of quality of care within single institutions cannot be inferred from the results of one or a few indicators.
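
The indicator construction outlined here (a patient-level risk model giving expected outcome counts, then an observed-to-expected ratio per hospital) can be sketched roughly as below; the synthetic data, covariates and use of a logistic model are assumptions for illustration, and the paper's hierarchical apportioning of systematic versus random variation is not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative only: fit a risk model on synthetic admissions, then form each
# hospital's standardised (observed / expected) outcome ratio.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n),
    "age": rng.normal(70, 10, n),
    "male": rng.integers(0, 2, n),
})
df["outcome"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.03 * (df["age"] - 70) - 2.0))))

risk_model = smf.logit("outcome ~ age + male", data=df).fit(disp=False)
df["expected"] = risk_model.predict(df)

# Ratio > 1: more adverse events than the hospital's case mix predicts.
ratios = df.groupby("hospital").apply(lambda g: g["outcome"].sum() / g["expected"].sum())
print(ratios.sort_values(ascending=False).head())
```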

Relevance: 30.00%

Abstract:

The efficiency of inhibitory control processes has been proposed as a mechanism constraining working-memory capacity. In order to investigate genetic influences on processes that may reflect interference control, event-related potential (ERP) activity recorded at frontal sites, during distracting and nondistracting conditions of a working-memory task, was examined in a sample of 509 twin pairs. The ERP component of interest was the slow wave (SW). Considerable overlap in the source of genetic influence was found, with a common genetic factor accounting for 37-45% of SW variance irrespective of condition. However, 3-8% of SW variance in the distracting condition was influenced by an independent genetic source. These results suggest that neural responses to irrelevant and distracting information, which may disrupt working-memory performance, differ in a fundamental way from perceptual and memory-based processing in a working-memory task. Furthermore, the results are consistent with the view that cognition is a complex genetic trait influenced by numerous genes of small influence.

Relevance: 30.00%

Abstract:

Firms have embraced electronic commerce as a means of doing business, either because they see it as a way to improve efficiency, grow market share, or expand into new markets, or because they view it as essential for survival. Recent research in the United States provides some evidence that the market does value investments in electronic commerce. Following research suggesting that, in certain circumstances, the market values noninnovative as well as innovative investments in new products, we partition electronic commerce investment project announcements into innovative and noninnovative to determine whether there are excess returns associated with these types of announcements. Apart from our overall results being consistent with the United States findings that the market values investments in electronic commerce projects, we also find that noninnovative investments are perceived as more valuable to the firm than innovative investments. On average, the market expects innovative investments to earn a return commensurate with their risk. We conclude that innovative electronic commerce projects are most likely seen by the capital market as easily replicable, and consequently as conferring little, if any, competitive advantage. On the other hand, we conclude from the noninnovative investment results that these types of investments are seen as being compatible with a firm's assets-in-place, in particular its information technology capabilities, a view consistent with the resource-based view of the firm.
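
The excess-return comparison described above rests on a standard event-study calculation; the sketch below shows the general shape of such a computation (a market-model fit over an estimation window, then cumulated abnormal returns over the event window) on hypothetical return series, and is not the authors' exact specification.

```python
import numpy as np

def cumulative_abnormal_return(stock_ret, market_ret, est_window, event_window):
    """Generic market-model event study: estimate alpha and beta over the
    estimation window, then cumulate abnormal returns over the event window."""
    beta, alpha = np.polyfit(market_ret[est_window], stock_ret[est_window], 1)
    abnormal = stock_ret[event_window] - (alpha + beta * market_ret[event_window])
    return abnormal.sum()

# Hypothetical usage with daily returns and an announcement at index 120:
# car = cumulative_abnormal_return(r_firm, r_mkt, slice(0, 100), slice(119, 122))
```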

Relevance: 30.00%

Abstract:

Background: Few studies have examined the potential benefits of specialist nurse-led programs of care involving home and clinic-based follow-up to optimise the post-discharge management of chronic heart failure (CHF). Objective: To determine the effectiveness of a hybrid program of clinic plus home-based intervention (C+HBI) in reducing recurrent hospitalisation in CHF patients. Methods: CHF patients with evidence of left ventricular systolic dysfunction admitted to two hospitals in Northern England were assigned to a C+HBI lasting 6 months post-discharge (n=58) or to usual post-discharge care (UC: n=48) via a cluster randomization protocol. The co-primary endpoints were death or unplanned readmission (event-free survival) and the rate of recurrent, all-cause readmission within 6 months of hospital discharge. Results: During study follow-up, more UC patients had an unplanned readmission for any cause (44% vs. 22%; P=0.0191; OR 1.95, 95% CI 1.10-3.48), whilst 7 (15%) versus 5 (9%) UC and C+HBI patients, respectively, died (P=NS). Overall, 15 (26%) C+HBI versus 21 (44%) UC patients experienced a primary endpoint. C+HBI was associated with a non-significant 45% reduction in the risk of death or readmission when adjusting for potential confounders (RR 0.55, 95% CI 0.28-1.08; P=0.08). Overall, C+HBI patients accumulated significantly fewer unplanned readmissions (15 vs. 45: P

Relevance: 30.00%

Abstract:

An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to the understanding of the temporal behaviour of faults and of single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces which influence the microscopic and macroscopic physics of these processes, and may thereby lead towards answers to those questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in the power of computers and the ability to use the power of parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law without any rate dependency, and with no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of event sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between events. In some of the larger events highly complex slip patterns are observed.
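
The stick-slip behaviour described here can be caricatured with a one-dimensional spring-slider toy, shown below: a single block driven through a spring at constant velocity with a rate-independent static/dynamic friction law. This is a deliberately crude analogue (all parameters are invented, and its events are all the same size), not the three-dimensional particle-based Lattice Solid Model used in the paper, where heterogeneity produces the broad range of event sizes reported.

```python
import numpy as np

k, v_drive, dt = 1.0, 1e-3, 1.0            # spring stiffness, driving velocity, time step
mu_s, mu_d, normal_force = 0.6, 0.4, 1.0   # static/dynamic friction, normal load

x_block, x_driver, events = 0.0, 0.0, []
for step in range(200_000):
    x_driver += v_drive * dt
    shear = k * (x_driver - x_block)        # elastic shear force on the "fault"
    if shear > mu_s * normal_force:         # static friction overcome: a slip event
        slip = (shear - mu_d * normal_force) / k   # slide until force drops to the dynamic level
        x_block += slip
        events.append((step, slip))

print(len(events), "slip events; mean slip:",
      np.mean([s for _, s in events]))
```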

Relevance: 30.00%

Abstract:

The estimation of P(S_n > u) by simulation, where S_n is the sum of independent, identically distributed random variables Y_1, ..., Y_n, is of importance in many applications. We propose two simulation estimators based upon the identity P(S_n > u) = n P(S_n > u, M_n = Y_n), where M_n = max(Y_1, ..., Y_n). One estimator uses importance sampling (for Y_n only), and the other uses conditional Monte Carlo, conditioning upon Y_1, ..., Y_{n-1}. Properties of the relative error of the estimators are derived and a numerical study is given in terms of the M/G/1 queue, in which n is replaced by an independent geometric random variable N. The conclusion is that the new estimators compare extremely favorably with previous ones. In particular, the conditional Monte Carlo estimator is the first heavy-tailed example of an estimator with bounded relative error. Further improvements are obtained in the random-N case by incorporating control variates and stratification techniques into the new estimation procedures.
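
A minimal sketch of the conditional Monte Carlo estimator built on the identity P(S_n > u) = n P(S_n > u, M_n = Y_n) is given below: conditioning on Y_1, ..., Y_{n-1} reduces the indicator to the tail probability P(Y_n > max(u - S_{n-1}, M_{n-1})), which must be available in closed form. The Pareto example, function names and replication count are illustrative choices, not the paper's numerical study.

```python
import numpy as np

def conditional_mc_estimate(u, n, sample_y, tail_prob, reps=100_000, rng=None):
    """Average n * P(Y_n > max(u - S_{n-1}, M_{n-1})) over independent draws of
    Y_1,...,Y_{n-1}; tail_prob must give P(Y > x) in closed form."""
    if rng is None:
        rng = np.random.default_rng()
    y = sample_y(rng, (reps, n - 1))               # Y_1,...,Y_{n-1}, one row per replication
    s, m = y.sum(axis=1), y.max(axis=1)            # S_{n-1} and M_{n-1}
    est = n * tail_prob(np.maximum(u - s, m))      # conditional expectation of the indicator
    return est.mean(), est.std(ddof=1) / np.sqrt(reps)

# Heavy-tailed example: Pareto(alpha) summands supported on [1, inf).
alpha = 1.5
mean, stderr = conditional_mc_estimate(
    u=1000.0, n=10,
    sample_y=lambda rng, size: rng.pareto(alpha, size) + 1.0,
    tail_prob=lambda x: np.where(x <= 1.0, 1.0, x ** -alpha),
)
print(mean, stderr)
```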

Relevance: 30.00%

Abstract:

Background: Data on the long-term benefits of nonspecific disease management programs are limited. We performed a long-term follow-up of a previously published randomized trial. Methods: We compared all-cause mortality and recurrent hospitalization during a median follow-up of 7.5 years in a heterogeneous cohort of patients with chronic illness initially exposed to a multidisciplinary, home-based intervention (HBI) (n = 260) or to usual postdischarge care (n = 268). Results: During follow-up, HBI had no impact on all-cause mortality (relative risk, 1.04; 95% confidence interval, 0.80-1.35) or event-free survival from death or unplanned hospitalization (relative risk, 1.03; 95% confidence interval, 0.86-1.24). Initial analysis suggested that HBI had only a marginal impact in reducing unplanned hospitalization, with 677 readmissions vs 824 for the usual care group (mean +/- SD rate, 0.72 +/- 0.96 vs 0.84 +/- 1.20 readmissions/patient per year; P = .08). When accounting for increased hospital activity in HBI patients with chronic obstructive pulmonary disease during follow-up for 2 years, post hoc analyses showed that HBI reduced readmissions by 14% within 2 years in patients without this condition (mean +/- SD rate, 0.54 +/- 0.72 vs 0.63 +/- 0.88 readmissions/patient per year; P = .04) and by 21% in all surviving patients within 3 to 8 years (mean +/- SD rate, 0.64 +/- 1.26 vs 0.81 +/- 1.61 readmissions/patient per year; P = .03). Overall, recurrent hospital costs were significantly lower (14%) in the HBI group (mean +/- SD, $823 +/- $1642 vs $960 +/- $1376 per patient per year; P = .045). Conclusion: This unique study suggests that a nonspecific HBI provides long-term cost benefits in a range of chronic illnesses, except for chronic obstructive pulmonary disease.

Relevance: 30.00%

Abstract:

Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single-pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete-event and discrete-time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties, but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation.
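
The synthetic, object-oriented style described in this abstract can be caricatured as discrete objects for drug particles and liver segments advanced by a stepwise loop, with outflow collected the way an effluent catheter would sample it; every class name, segment count and extraction probability below is an invented placeholder, not part of the authors' model.

```python
import random

class DrugParticle:
    def __init__(self, name, p_extract):
        self.name, self.p_extract = name, p_extract   # per-segment chance of uptake/metabolism
        self.position = 0                              # index of the current liver segment

class PerfusedLiver:
    def __init__(self, n_segments=5):
        self.n_segments = n_segments
        self.outflow = []                              # particles collected at the catheter

    def step(self, particles):
        for p in list(particles):
            if random.random() < p.p_extract:          # taken up in this segment
                particles.remove(p)
            else:
                p.position += 1
                if p.position >= self.n_segments:      # reached the outflow catheter
                    particles.remove(p)
                    self.outflow.append(p)

liver = PerfusedLiver()
dose = ([DrugParticle("sucrose", 0.02) for _ in range(500)] +
        [DrugParticle("antipyrine", 0.15) for _ in range(500)])
while dose:
    liver.step(dose)
print(sum(p.name == "sucrose" for p in liver.outflow),
      sum(p.name == "antipyrine" for p in liver.outflow))
```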