956 results for Event data recorders.


Relevance: 80.00%

Abstract:

Objective: To evaluate methods for monitoring monthly aggregated hospital adverse event data that display clustering, non-linear trends and possible autocorrelation. Design: Retrospective audit. Setting: The Northern Hospital, Melbourne, Australia. Participants: 171,059 patients admitted between January 2001 and December 2006. Measurements: The analysis is illustrated with 72 months of patient fall injury data using a modified Shewhart U control chart, and charts derived from a quasi-Poisson generalised linear model (GLM) and a generalised additive mixed model (GAMM) that included an approximate upper control limit. Results: The data were overdispersed and displayed a downward trend and possible autocorrelation. The downward trend was followed by a predictable period after December 2003. The GLM-estimated incidence rate ratio was 0.98 (95% CI 0.98 to 0.99) per month. The GAMM-fitted count fell from 12.67 (95% CI 10.05 to 15.97) in January 2001 to 5.23 (95% CI 3.82 to 7.15) in December 2006 (p<0.001). The corresponding values for the GLM were 11.9 and 3.94. Residual plots suggested that the GLM underestimated the rate at the beginning and end of the series and overestimated it in the middle. The data suggested a more rapid rate fall before 2004 and a steady state thereafter, a pattern reflected in the GAMM chart. The approximate upper two-sigma equivalent control limit in the GLM and GAMM charts identified 2 months that showed possible special-cause variation. Conclusion: Charts based on GAMM analysis are a suitable alternative to Shewhart U control charts with these data.
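The classical Shewhart U chart this study modifies can be sketched in a few lines. The monthly counts and patient-day exposures below are invented for illustration, not the study's data; the constant `k=3` gives the usual three-sigma limits, and the two-sigma limits mentioned in the abstract would use `k=2`.

```python
import math

def u_chart_limits(counts, exposures, k=3.0):
    """Shewhart U chart: pooled rate (centre line) and per-period control
    limits for event counts with varying exposure (e.g. patient-days)."""
    u_bar = sum(counts) / sum(exposures)  # pooled rate = centre line
    limits = []
    for n in exposures:
        half_width = k * math.sqrt(u_bar / n)  # wider limits when exposure is small
        limits.append((max(0.0, u_bar - half_width), u_bar + half_width))
    return u_bar, limits

# Illustrative monthly fall-injury counts and patient-day exposures
counts = [12, 9, 11, 5, 4]
exposures = [1000, 950, 1020, 980, 1000]
u_bar, limits = u_chart_limits(counts, exposures)
# Flag months whose observed rate exceeds the upper limit (special-cause signal)
signals = [c / n > hi for c, n, (lo, hi) in zip(counts, exposures, limits)]
```

A GLM- or GAMM-based chart replaces the constant centre line `u_bar` with a fitted trend, which is what lets it follow the non-linear decline described in the abstract instead of flagging the trend itself as special-cause variation.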

Relevance: 80.00%

Abstract:

This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood that a process fault will occur, and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks relative to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that process instances executed concurrently complete with significantly fewer faults and with lower fault severities when the recommendations provided by our recommendation system are taken into account.
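The resource-to-task assignment step can be illustrated without an ILP solver: for small instances, exhaustive search over one-to-one assignments finds the same optimum. The risk matrix below is hypothetical; the paper's actual formulation uses integer linear programming over decision-tree risk predictions.

```python
from itertools import permutations

def min_risk_assignment(risk):
    """Exhaustively search one-to-one assignments of resources (rows) to
    tasks (columns), minimising total predicted risk. A brute-force
    stand-in for the integer-linear-programming step; feasible only for
    small numbers of resources/tasks."""
    n = len(risk)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):  # perm[r] = task assigned to resource r
        cost = sum(risk[r][perm[r]] for r in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm

# risk[r][t]: predicted risk if resource r performs task t (illustrative values)
risk = [[0.9, 0.1, 0.4],
        [0.3, 0.8, 0.2],
        [0.5, 0.6, 0.1]]
cost, perm = min_risk_assignment(risk)
```

Here the optimum assigns resource 0 to task 1, resource 1 to task 0 and resource 2 to task 2; an ILP (or the Hungarian algorithm) reaches the same answer in polynomial time, which is what makes the concurrent-instance case tractable.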

Relevance: 80.00%

Abstract:

Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
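Comparing observed and modeled behavior can be made concrete with a deliberately naive conformance measure: the fraction of logged traces the model can reproduce. Real conformance checking uses token replay or alignments rather than exact trace membership; the traces below are invented.

```python
def fitness(log, model_traces):
    """Fraction of observed traces reproducible by the model: a crude
    conformance measure for keeping a model collection in sync with an
    event log. Real tools use token replay or alignments instead."""
    model = set(model_traces)
    ok = sum(1 for trace in log if trace in model)
    return ok / len(log)

# Illustrative event log (each trace is a tuple of activity names)
log = [("register", "check", "pay"),
       ("register", "check", "pay"),
       ("register", "pay")]           # deviating trace: skips "check"
model_traces = [("register", "check", "pay")]
score = fitness(log, model_traces)
```

A "liquid" model collection would monitor a score like this continuously and trigger re-discovery or repair of the model when it drifts away from the observed behavior.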

Relevance: 80.00%

Abstract:

Cowcod (Sebastes levis) is a large (100 cm FL), long-lived (maximum observed age 55 yr) demersal rockfish taken in multispecies commercial and recreational fisheries off southern and central California. It lives at 20–500 m depth: adults (>44 cm TL) inhabit rocky areas at 90–300 m and juveniles inhabit fine sand and clay at 40–100 m. Both sexes have similar growth and maturity. Both sexes recruit to the fishery before reaching full maturity. Based on age and growth data, the natural mortality rate is about M = 0.055/yr, but the estimate is uncertain. Biomass, recruitment, and mortality during 1951–98 were estimated in a delay-difference model with catch data and abundance indices. The same model gave less precise estimates for 1916–50 based on catch data and assumptions about virgin biomass and recruitment such as used in stock reduction analysis. Abundance indices, based on rare event data, included a habitat-area-weighted index of recreational catch per unit of fishing effort (CPUE index values were 0.003–0.07 fish per angler hour), a standardized index of proportion of positive tows in CalCOFI ichthyoplankton survey data (binomial errors, 0–13% positive tows/yr), and proportion of positive tows for juveniles in bottom trawl surveys (binomial errors, 0–30% positive tows/yr). Cowcod are overfished in the southern California Bight; biomass during the 1998 season was about 7% of the virgin level and recent catches have been near 20 metric tons (t)/yr. Projections based on recent recruitment levels indicate that biomass will decline at catch levels > 5 t/yr. Trend data indicate that recruitment will be poor in the near future. Recreational fishing effort in deep water has increased and has become more effective for catching cowcod. Areas with relatively high catch rates for cowcod are fewer and are farther offshore. Cowcod die after capture and cannot be released alive. Two areas recently closed to bottom fishing will help rebuild the cowcod stock.
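The kind of projection used to assess catch levels can be sketched with a deliberately simplified biomass update. This is not the delay-difference model actually fitted in the assessment (which tracks growth and lagged recruitment); the starting biomass, recruitment and catch values are hypothetical, with only M = 0.055/yr taken from the text.

```python
import math

def project_biomass(b0, m, recruits, catches):
    """Simplified biomass projection B[t+1] = s*(B[t] - C[t]) + R[t+1],
    where s = exp(-M) is annual natural survival. A stripped-down sketch
    of stock projection bookkeeping, not the full delay-difference model."""
    s = math.exp(-m)
    b = [b0]
    for c, r in zip(catches, recruits):
        b.append(max(0.0, s * (b[-1] - c) + r))
    return b

# Hypothetical values: 100 t starting biomass, weak recruitment, 5 t/yr catches
biomass = project_biomass(100.0, 0.055, recruits=[2.0, 2.0, 2.0],
                          catches=[5.0, 5.0, 5.0])
```

With recruitment this weak, biomass declines even at the 5 t/yr catch level, mirroring the abstract's conclusion that catches above 5 t/yr would drive the stock down under recent recruitment.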

Relevance: 80.00%

Abstract:

RFID technology can be used to its fullest potential only with software to supplement the hardware with powerful capabilities for data capture, filtering, counting and storage. The EPCglobal Network architecture encourages minimizing the amount of business logic embedded in the tags, readers and middleware. This creates the need for a Business Logic Layer above the event filtering layer that enhances basic observation events with business context - i.e. in addition to the (what, when, where) information about an observation, it adds context information about why the object was there. The purpose of this project is to develop an implementation of the Business Logic Layer. This application accepts observation event data (e.g. from the Application Level Events (ALE) standard interface), enriches them with business context and provides these enriched events to a repository of business-level events (e.g. via the EPC Information Services (EPCIS) capture interface). The strength of the application lies in the automatic addition of business context. It is quick and easy to adapt any business process to the framework suggested and equally easy to reconfigure it if the business process is changed. A sample application has been developed for a business scenario in the retail sector.
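The core idea of the Business Logic Layer, enriching a (what, when, where) observation with the "why", can be sketched as a rule lookup keyed on the read point. The rules, read-point names and EPC below are hypothetical; a real deployment would derive the context from the configured business process and push the enriched event to an EPCIS capture interface.

```python
from datetime import datetime

# Hypothetical rules mapping a read point to its business context ("why").
# In the framework described above these would come from the modeled process.
CONTEXT_RULES = {
    "warehouse-door-1": {"biz_step": "shipping", "disposition": "in_transit"},
    "store-backroom":   {"biz_step": "receiving", "disposition": "in_progress"},
}

def enrich(event):
    """Add business context to a (what, when, where) observation event,
    in the spirit of the Business Logic Layer between ALE and EPCIS.
    Unknown read points pass through unchanged."""
    enriched = dict(event)
    enriched.update(CONTEXT_RULES.get(event["where"], {}))
    return enriched

raw = {"what": "urn:epc:id:sgtin:0614141.107346.2017",   # illustrative EPC
       "when": datetime(2011, 4, 1, 9, 30),
       "where": "warehouse-door-1"}
event = enrich(raw)
```

Reconfiguring the process then amounts to editing `CONTEXT_RULES` (or the process model that generates it) rather than changing reader or middleware code, which is the adaptability the abstract claims.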

Relevance: 80.00%

Abstract:

Many of the challenges faced in health care delivery can be informed through building models. In particular, Discrete Conditional Survival (DCS) models, recently under development, can provide policymakers with a flexible tool to assess time-to-event data. The DCS model is capable of modelling the survival curve based on various underlying distribution types and is capable of clustering or grouping observations (based on other covariate information) external to the distribution fits. The flexibility of the model comes through the choice of data mining techniques that are available in ascertaining the different subsets and also in the choice of distribution types available in modelling these informed subsets. This paper presents an illustrated example of the Discrete Conditional Survival model being deployed to represent ambulance response-times by a fully parameterised model. This model is contrasted against use of a parametric accelerated failure-time model, illustrating the strength and usefulness of Discrete Conditional Survival models.
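The "cluster, then fit per cluster" idea behind the DCS model can be illustrated in miniature: split observations by a covariate and fit a separate survival distribution to each subset. This sketch ignores censoring and uses only exponential fits (rate = events / total observed time), whereas the DCS model supports several distribution types and data-mining-driven clustering; the times and groups below are invented.

```python
def fit_exponential_by_group(times, groups):
    """Fit a separate exponential survival distribution to each covariate-
    defined subset via the MLE rate = events / total observed time.
    A minimal stand-in for the DCS idea of conditioning distribution fits
    on externally derived clusters; censoring is ignored here."""
    totals, counts = {}, {}
    for t, g in zip(times, groups):
        totals[g] = totals.get(g, 0.0) + t
        counts[g] = counts.get(g, 0) + 1
    return {g: counts[g] / totals[g] for g in totals}

# Illustrative ambulance response times (minutes) split by one covariate
times  = [4.0, 6.0, 5.0, 12.0, 15.0, 9.0]
groups = ["urban", "urban", "urban", "rural", "rural", "rural"]
rates = fit_exponential_by_group(times, groups)
```

The fitted rates differ markedly between the hypothetical urban and rural subsets, which is exactly the heterogeneity a single parametric accelerated failure-time model would have to absorb in its covariate effects.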

Relevance: 80.00%

Abstract:

BACKGROUND: Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone.

METHODS: Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m²) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544).
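The log-rank machinery behind such survival comparisons can be sketched for two groups: at each event time, compare observed deaths in one group with those expected under equal hazards. This is the textbook unadjusted statistic on invented toy data, not the trial's adjusted Cox analysis.

```python
def logrank(times_a, events_a, times_b, events_b):
    """Two-group log-rank statistic (chi-squared, 1 df). events_* are
    1 for an observed event, 0 for right-censoring."""
    data = [(t, e, 0) for t, e in zip(times_a, events_a)] + \
           [(t, e, 1) for t, e in zip(times_b, events_b)]
    event_times = sorted({t for t, e, _ in data if e})
    obs_a = exp_a = var = 0.0
    for t in event_times:
        at_risk = [g for (tt, e, g) in data if tt >= t]   # still under observation
        n = len(at_risk)
        n_a = sum(1 for g in at_risk if g == 0)
        d = sum(1 for (tt, e, g) in data if tt == t and e)          # deaths at t
        d_a = sum(1 for (tt, e, g) in data if tt == t and e and g == 0)
        obs_a += d_a
        exp_a += d * n_a / n            # expected deaths in A under equal hazards
        if n > 1:                       # hypergeometric variance contribution
            var += d * (n_a / n) * (1.0 - n_a / n) * (n - d) / (n - 1)
    return (obs_a - exp_a) ** 2 / var

# Toy data: every subject experiences the event, group A fails earlier
stat = logrank([1.0, 2.0, 3.0], [1, 1, 1], [4.0, 5.0, 6.0], [1, 1, 1])
```

Comparing `stat` with the χ²(1) critical value 3.84 gives the familiar 5% two-sided test; the trial's HRs and CIs additionally adjust for stratification factors via Cox models.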

FINDINGS: 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc.

INTERPRETATION: Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy.

FUNDING: Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2012

Relevance: 80.00%

Abstract:

It is proposed to study the suspended sediment transport characteristics of river basins of Kerala and to model the suspended sediment discharge mechanism for typical micro-watersheds. The Pamba river basin is selected as a representative hydrologic regime for detailed studies of suspended sediment characteristics and their seasonal variation. The applicability of various erosion models was tested by comparison with observed event data (obtained by continuous monitoring of rainfall, discharge, and suspended sediment concentration for lower-order streams). Empirical, conceptual and physically distributed models were used for comparing model performance. Large variations in the discharge and sediment quantities were noticed during a particular year between the river basins investigated, and for an individual river basin across the years for which data were available. In general, the sediment yield pattern follows the seasonal distribution of rainfall, discharge and physiography of the land. This is consistent with similar studies made for other Indian rivers. It was observed from this study that the quantity of sediment transported downstream shows a decreasing trend over the years corresponding to an increase in discharge. For sound and sustainable management of coastal zones, it is important to understand the balance between erosion and retention and to quantify the exact amount of the sediments reaching this ecosystem. This, of course, necessitates a long time series of data and more focused research on the behaviour of each river system, both present and past. In this realm of river inputs to the ocean system, each of the 41 rivers of Kerala may have a dominant yet diversified role in influencing the coastal ecosystem, as reflected in this study of the major fraction of transport, namely the suspended sediments.
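A common empirical model in this setting is the sediment rating curve Qs = a·Q^b, fitted by least squares in log-log space. The sketch below uses synthetic data generated from a known curve (a = 0.5, b = 1.5) so the fit should recover the parameters; it is an illustration of the empirical-model class, not the study's calibrated models.

```python
import math

def fit_rating_curve(discharge, sediment):
    """Least-squares fit of a sediment rating curve Qs = a * Q**b in
    log-log space: regress log(Qs) on log(Q), then back-transform."""
    xs = [math.log(q) for q in discharge]
    ys = [math.log(s) for s in sediment]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic data generated from Qs = 0.5 * Q**1.5
discharge = [10.0, 20.0, 50.0, 100.0]
sediment = [0.5 * q ** 1.5 for q in discharge]
a, b = fit_rating_curve(discharge, sediment)
```

With real event data the points scatter around the curve and the exponent b summarises how sharply sediment load responds to discharge, which is why rating curves are a standard baseline against which conceptual and physically distributed models are compared.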

Relevance: 80.00%

Abstract:

Survival models deal with the modeling of time-to-event data. However, in some situations part of the population may no longer be subject to the event. Models that take this fact into account are called cure rate models. There are few studies about hypothesis tests in cure rate models. Recently a new test statistic, the gradient statistic, has been proposed. It shares the same asymptotic properties with the classic large-sample tests: the likelihood ratio, score and Wald tests. Some simulation studies have been carried out to explore the behavior of the gradient statistic in finite samples and compare it with the classic statistics in different models. The main objective of this work is to study and compare the performance of the gradient test and the likelihood ratio test in cure rate models. We first describe the models and present the main asymptotic properties of the tests. We perform a simulation study based on the promotion time model with Weibull distribution to assess the performance of the tests in finite samples. An application is presented to illustrate the studied concepts.
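The two statistics being compared can be shown side by side in a toy setting: testing H0: rate = λ0 in a plain exponential model (no censoring, no cure fraction), where everything has a closed form. The data are invented; in the paper's promotion-time cure model the same construction applies but requires numerical maximisation.

```python
import math

def gradient_and_lr(times, lam0):
    """Gradient statistic U(th0)*(th_hat - th0) and likelihood-ratio
    statistic 2*(l(th_hat) - l(th0)) for H0: rate = lam0 in an exponential
    model. Both are asymptotically chi-squared with 1 df under H0; the
    gradient statistic needs neither the information matrix (Wald/score)
    nor, in general, both likelihood maximisations."""
    d, total = len(times), sum(times)
    lam_hat = d / total                      # MLE of the rate
    score0 = d / lam0 - total                # score function U(lam0)
    grad = score0 * (lam_hat - lam0)         # gradient statistic
    loglik = lambda lam: d * math.log(lam) - lam * total
    lr = 2.0 * (loglik(lam_hat) - loglik(lam0))
    return grad, lr

times = [0.5, 1.2, 0.8, 2.0, 1.5]            # invented event times
grad, lr = gradient_and_lr(times, lam0=1.0)
```

Both statistics here are small and close to each other, consistent with the null; the paper's simulations probe exactly how well this asymptotic agreement holds in finite samples under the cure model.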

Relevance: 80.00%

Abstract:

Many recent survival studies propose modeling data with a cure fraction, i.e., data in which part of the population is not susceptible to the event of interest. This event may occur more than once for the same individual (recurrent event). We then have a scenario of recurrent event data in the presence of a cure fraction, which may appear in areas such as oncology, finance and industry, among others. This paper proposes a multiple time scale survival model to analyze recurrent events using a cure fraction. The objective is to analyze the efficiency of certain interventions, in terms of covariates and censoring, so that the studied event will not happen again. All estimates were obtained using a sampling-based approach, which allows information to be input beforehand with lower computational effort. Simulations were done based on a clinical scenario in order to observe some frequentist properties of the estimation procedure in the presence of small and moderate sample sizes. An application to a well-known set of real mammary tumor data is provided.
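A standard cure-fraction construction (the promotion-time model, also used in the gradient-test paper above) is easy to simulate: each subject carries N ~ Poisson(θ) latent causes, is cured when N = 0 (probability e^(-θ)), and otherwise fails at the minimum of N latency times. The parameters and Weibull latencies below are illustrative, and this sketch covers a single event per subject, not the paper's recurrent-event extension.

```python
import math
import random

def sample_poisson(mean, rng):
    """Poisson sampler via Knuth's multiplication-of-uniforms method."""
    limit, k, prod = math.exp(-mean), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def simulate_promotion_time(theta, shape, scale, n, rng):
    """Simulate the promotion-time cure model: cured if N == 0, otherwise
    the event time is the minimum of N Weibull latencies. Returns the
    observed cure fraction and the simulated event times."""
    cured, times = 0, []
    for _ in range(n):
        k = sample_poisson(theta, rng)
        if k == 0:
            cured += 1
        else:
            # random.weibullvariate(alpha, beta): alpha = scale, beta = shape
            times.append(min(rng.weibullvariate(scale, shape) for _ in range(k)))
    return cured / n, times

rng = random.Random(42)                       # seeded for reproducibility
cure_frac, times = simulate_promotion_time(theta=1.0, shape=1.5, scale=2.0,
                                           n=20000, rng=rng)
```

With θ = 1 the theoretical cure fraction is e^(-1) ≈ 0.368, and the simulated fraction should land close to it; a sampling-based fitting procedure runs this generative story in reverse.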

Relevance: 80.00%

Abstract:

Long-term survival models have historically been considered for analyzing time-to-event data with a long-term survivor fraction. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes of failure, while the remaining proportion p is cured or has not presented the event of interest during the time period of the study, have not been fully considered in the literature. In order to accommodate such situations, we present in this paper a new long-term survival model. The maximum likelihood estimation procedure is discussed, as well as interval estimation and hypothesis tests. A real dataset illustrates the methodology.
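The population survival function of this model class is easy to write down. As a minimal sketch, assume each of the competing causes has an exponential lifetime with a common rate; then the susceptible fraction fails at the minimum of the causes, giving S_pop(t) = p + (1-p)·exp(-causes·rate·t). The rate and fraction values below are illustrative, and the paper's model is more general than this exponential special case.

```python
import math

def population_survival(t, p, rate, causes=1):
    """Long-term survival mixture: a cured fraction p never fails, while
    the remaining 1-p faces `causes` independent competing exponential
    causes, so the susceptible failure time is exponential with rate
    causes*rate. S_pop(t) = p + (1-p)*exp(-causes*rate*t)."""
    return p + (1.0 - p) * math.exp(-causes * rate * t)

# Illustrative curve: 30% cured, rate 0.5 per unit time, one cause
curve = [population_survival(t, p=0.3, rate=0.5) for t in (0.0, 1.0, 5.0, 50.0)]
```

The signature of a cure fraction is visible immediately: the survival curve does not decay to zero but plateaus at p, which is what distinguishes these models from ordinary survival models in a Kaplan-Meier plot.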

Relevance: 80.00%

Abstract:

Use of microarray technology often leads to high-dimensional and low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data where censoring is present. Among standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time (AFT) model. Assessment of the two methods is conducted through simulation studies and through analysis of microarray data obtained from a set of patients with diffuse large B-cell lymphoma where time to survival is of interest. The approaches are shown to match or exceed the predictive performance of a Cox-based and an AFT-based variable selection method. The methods are moreover shown to be much more computationally efficient than their respective Cox- and AFT-based counterparts.
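The mechanism by which the elastic net performs variable selection is the soft-thresholding step inside its coordinate-descent fit. The sketch below shows one coordinate update in the standard textbook form (lasso part shrinks and zeroes, ridge part further shrinks); the variable names and penalty parameterisation are the common convention, not necessarily the exact one used in the paper's Cox/AFT adaptation.

```python
def soft_threshold(z, gamma):
    """Soft-thresholding operator: shrink z toward zero and set it exactly
    to zero when |z| <= gamma. This exact-zeroing is what makes the lasso
    part of the elastic net select variables."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def elastic_net_update(z, lam, alpha):
    """One coordinate-descent update for a standardised predictor:
    soft-threshold by the lasso penalty (alpha*lam), then shrink by the
    ridge penalty ((1-alpha)*lam). alpha=1 is the lasso, alpha=0 ridge."""
    return soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))

coef = elastic_net_update(3.0, lam=1.0, alpha=0.5)
```

In the high-dimensional microarray setting most coordinates fall inside the threshold and are set exactly to zero, which yields both the sparse gene signatures and the computational efficiency the abstract reports.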

Relevance: 80.00%

Abstract:

BACKGROUND: Transient left ventricular apical ballooning syndrome (TLVABS) is an acute cardiac syndrome mimicking ST-segment elevation myocardial infarction, characterized by transient wall-motion abnormalities involving apical and mid-portions of the left ventricle in the absence of significant obstructive coronary disease. METHODS: Searching the MEDLINE database, 28 case series met the eligibility criteria and were summarized in a narrative synthesis of the demographic characteristics, clinical features and pathophysiological mechanisms. RESULTS: TLVABS is observed in 0.7-2.5% of patients with suspected ACS, affects women in 90.7% (95% CI: 88.2-93.2%) with a mean age ranging from 62 to 76 years and most commonly presents with chest pain (83.4%, 95% CI: 80.0-86.7%) and dyspnea (20.4%, 95% CI: 16.3-24.5%) following an emotionally or physically stressful event. ECG on admission shows ST-segment elevations in 71.1% (95% CI: 67.2-75.1%) and is accompanied by usually mild elevations of troponins in 85.0% (95% CI: 80.8-89.1%). Despite dramatic clinical presentation and substantial risk of heart failure, cardiogenic shock and arrhythmias, LVEF improved from 20-49.9% to 59-76% within a mean time of 7-37 days, with an in-hospital mortality rate of 1.7% (95% CI: 0.5-2.8%), complete recovery in 95.9% (95% CI: 93.8-98.1%) and rare recurrence. The underlying etiology is thought to be based on an exaggerated sympathetic stimulation. CONCLUSION: TLVABS is a considerable differential diagnosis in ACS, especially in postmenopausal women with a preceding stressful event. Data on long-term follow-up are pending, and further studies will be necessary to clarify the etiology and reach consensus on the acute and long-term management of TLVABS.

Relevance: 80.00%

Abstract:

OBJECTIVE To compare the effects of antiplatelets and anticoagulants on stroke and death in patients with acute cervical artery dissection. DESIGN Systematic review with Bayesian meta-analysis. DATA SOURCES The reviewers searched MEDLINE and EMBASE from inception to November 2012, checked reference lists, and contacted authors. STUDY SELECTION Studies were eligible if they were randomised, quasi-randomised or observational comparisons of antiplatelets and anticoagulants in patients with cervical artery dissection. DATA EXTRACTION Data were extracted by one reviewer and checked by another. Bayesian techniques were used to appropriately account for studies with scarce event data and imbalances in the size of comparison groups. DATA SYNTHESIS Thirty-seven studies (1991 patients) were included. We found no randomised trial. The primary analysis revealed a large treatment effect in favour of antiplatelets for preventing the primary composite outcome of ischaemic stroke, intracranial haemorrhage or death within the first 3 months after treatment initiation (relative risk 0.32, 95% credibility interval 0.12 to 0.63), while the degree of between-study heterogeneity was moderate (τ² = 0.18). In an analysis restricted to studies of higher methodological quality, the possible advantage of antiplatelets over anticoagulants was less obvious than in the main analysis (relative risk 0.73, 95% credibility interval 0.17 to 2.30). CONCLUSION In view of these results and the safety advantages, easier usage and lower cost of antiplatelets, we conclude that antiplatelets should be given precedence over anticoagulants as a first-line treatment in patients with cervical artery dissection unless results of an adequately powered randomised trial suggest the opposite.