994 results for Event data recorders.


Relevance:

90.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

90.00%

Publisher:

Abstract:

Survival models are widely applied in engineering to model time-to-event data, where censored observations are a common issue. Whether parametric or not, such models may not fit heterogeneous data well. The present study relies on survival data for critical pumps, where traditional parametric regression can be improved upon. Accounting for censoring, we used an empirical method to split the data into two subgroups, fitted a separate model to each, and combined two distinct distributions in a mixture-model approach. We conclude that this is an effective way to fit data that does not follow a single standard parametric distribution and to obtain reliable parameter estimates. A constant cumulative hazard rate policy was also used to determine optimal inspection times from the fitted mixture model, which can be compared against current maintenance policies to decide whether changes should be introduced.
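
A minimal sketch of the core idea, fitting a two-component Weibull mixture to right-censored failure times by direct maximum likelihood (synthetic data and SciPy stand in for the study's actual pump data and estimation procedure):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Synthetic failure times from two sub-populations, right-censored at random.
t = np.concatenate([
    weibull_min.rvs(1.5, scale=100, size=120, random_state=rng),
    weibull_min.rvs(3.0, scale=400, size=80, random_state=rng),
])
c = rng.uniform(50, 600, size=t.size)      # censoring times
obs = np.minimum(t, c)                     # observed time
event = (t <= c).astype(float)             # 1 = failure observed, 0 = censored

def neg_loglik(params):
    # Mixing weight via a logit, shapes/scales via logs, to keep them valid.
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    k1, s1, k2, s2 = np.exp(params[1:])
    f = pi * weibull_min.pdf(obs, k1, scale=s1) \
        + (1 - pi) * weibull_min.pdf(obs, k2, scale=s2)
    S = pi * weibull_min.sf(obs, k1, scale=s1) \
        + (1 - pi) * weibull_min.sf(obs, k2, scale=s2)
    # Failures contribute the density, censored units the survival function.
    return -np.sum(event * np.log(f + 1e-300) + (1 - event) * np.log(S + 1e-300))

x0 = [0.0, np.log(1.2), np.log(150.0), np.log(2.5), np.log(300.0)]
res = minimize(neg_loglik, x0, method="Nelder-Mead", options={"maxiter": 5000})
print("mixing weight:", 1.0 / (1.0 + np.exp(-res.x[0])))
print("shape/scale of components 1 and 2:", np.exp(res.x[1:]))
```

Events contribute the mixture density to the likelihood while censored observations contribute the mixture survival function, which is exactly how censoring enters any parametric fit of this kind.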

Relevance:

80.00%

Publisher:

Abstract:

Numerous expert elicitation methods have been suggested for generalised linear models (GLMs). This paper compares three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression. These methods were trialled on two experts in order to model the habitat suitability of the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The first elicitation approach is a geographically assisted indirect predictive method with a geographic information system (GIS) interface. The second approach is a predictive indirect method which uses an interactive graphical tool. The third method uses a questionnaire to elicit expert knowledge directly about the impact of a habitat variable on the response. Two variables (slope and aspect) are used to examine the prior and posterior distributions of the three methods. The results indicate that there are some similarities and dissimilarities between the expert-informed priors of the two experts formulated from the different approaches. The choice of elicitation method depends on the statistical knowledge of the expert, their mapping skills, time constraints, accessibility to experts and the available funding. This trial reveals that expert knowledge can be important when modelling rare event data, such as threatened species, because experts can provide additional information that may not be represented in the dataset. However, care must be taken with the way in which this information is elicited and formulated.
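
As a hedged illustration of how an elicited judgement becomes a prior, the sketch below encodes a hypothetical expert's median and 90th percentile for one habitat variable's odds ratio as a normal prior on the log-odds coefficient, then computes a grid posterior from simulated presence/absence data; the paper's three elicitation tools are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical elicited judgements about one habitat variable's odds ratio.
or_median, or_p90 = 2.0, 5.0
mu = np.log(or_median)                          # prior mean, log-odds scale
sigma = (np.log(or_p90) - mu) / norm.ppf(0.9)   # matches the 90th percentile
print(f"elicited prior: beta ~ Normal({mu:.3f}, {sigma:.3f}^2)")

# Grid posterior for the coefficient, using simulated presence/absence data.
rng = np.random.default_rng(1)
x = rng.normal(size=200)                        # standardised covariate, e.g. slope
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.5 * x)))

beta = np.linspace(-2, 3, 2001)
eta = beta * x[:, None]                         # n_obs x n_grid linear predictor
loglik = (y[:, None] * eta - np.log1p(np.exp(eta))).sum(axis=0)
logpost = loglik + norm.logpdf(beta, mu, sigma)
post = np.exp(logpost - logpost.max())
step = beta[1] - beta[0]
post /= post.sum() * step                       # normalise the grid density
print("posterior mean of beta:", (beta * post).sum() * step)
```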

Relevance:

80.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated from failure event data alone; however, statistically sufficient failure event data are often difficult to obtain in practice due to poor data management, effective preventive maintenance, and the small populations of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspension data, and they carry significant information about the state and health of an asset: condition indicators reflect the level of degradation, while operating environment indicators accelerate or decelerate the asset's lifetime. When these data are available, an alternative to traditional reliability analysis is to model the indicators and their failure-generating mechanisms with a covariate-based hazard model.

The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the principle of the Proportional Hazard Model (PHM). Most of these models have attracted little attention in the field of machinery prognostics, and the prominence of PHM has to some extent stifled the development of alternatives, although several have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information, namely failure event data (observed and/or suspended), condition data, and operating environment data, in a single model for more effective hazard and reliability predictions. Moreover, condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. The existing models nevertheless treat these non-homogeneous covariates in the same way for hazard prediction. The central question is therefore how both kinds of indicator should be effectively modelled and integrated into a covariate-based hazard model.

This work presents a new approach to these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three sources of asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated. EHM assumes that the baseline hazard is a function of both time and the condition indicators. Condition indicators provide information about the health of an asset, so they update and reshape the baseline hazard of EHM according to the asset's health state at a given time t; examples include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators: they enter the covariate function of EHM and may raise or lower the hazard relative to the baseline. These indicators arise from the environment in which an asset operates and are not explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators may be null in EHM, condition indicators are always present, because they are observed and measured for as long as an asset remains operational.

EHM has several advantages over the existing covariate-based hazard models. First, it utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to predict hazard and reliability effectively. Second, it explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (the Weibull distribution) for the baseline hazard. In many industrial applications, however, failure event data are sparse and their analysis involves complex distributional shapes about which little is known; to avoid the restrictive distributional assumption of the semi-parametric form, a non-parametric, distribution-free EHM has also been developed. The availability of the model in two forms is a further merit. A case study using laboratory experiment data was conducted to validate the practicality of both the semi-parametric and non-parametric EHMs, and their performance was appraised by comparing their estimates with those of the existing covariate-based hazard models. The comparison demonstrated that both forms of EHM outperform the existing models. Future research directions are also identified, including parameter estimation with time-dependent covariate effects and missing data, application of EHM to repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
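
A schematic sketch of the model structure as described, with an assumed Weibull-type baseline modulated by a condition indicator and a log-linear term for the operating environment indicators; the functional forms and parameter values are illustrative, not the thesis's formulation.

```python
import numpy as np

def ehm_hazard(t, z, w, beta=1.8, eta=500.0, alpha=0.02, gamma=(0.3, -0.1)):
    """Hazard at time t for condition indicator z and environment vector w.
    The baseline depends on both time and condition; environment indicators
    scale it up or down through the covariate term."""
    baseline = (beta / eta) * (t / eta) ** (beta - 1) * np.exp(alpha * z)
    return baseline * np.exp(np.dot(gamma, w))

# Condition path: e.g. a vibration level drifting upward as the asset wears.
t_grid = np.linspace(1.0, 400.0, 400)
z_path = 0.01 * t_grid
w = np.array([1.2, 0.5])        # e.g. load and ambient-stress indicators

h = ehm_hazard(t_grid, z_path, w)
# Reliability R(t) = exp(-H(t)), with H the trapezoid-integrated hazard.
H = np.concatenate([[0.0], np.cumsum(0.5 * (h[1:] + h[:-1]) * np.diff(t_grid))])
R = np.exp(-H)
print("R(200 h) ~", R[np.searchsorted(t_grid, 200.0)])
```

Setting gamma to zeros recovers a hazard driven purely by time and condition, mirroring the remark that environment effects may be null while condition indicators persist.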

Relevance:

80.00%

Publisher:

Abstract:

Objective To evaluate methods for monitoring monthly aggregated hospital adverse event data that display clustering, non-linear trends and possible autocorrelation. Design Retrospective audit. Setting The Northern Hospital, Melbourne, Australia. Participants 171,059 patients admitted between January 2001 and December 2006. Measurements The analysis is illustrated with 72 months of patient fall injury data using a modified Shewhart U control chart, and charts derived from a quasi-Poisson generalised linear model (GLM) and a generalised additive mixed model (GAMM) that included an approximate upper control limit. Results The data were overdispersed and displayed a downward trend and possible autocorrelation. The downward trend was followed by a predictable period after December 2003. The GLM-estimated incidence rate ratio was 0.98 (95% CI 0.98 to 0.99) per month. The GAMM-fitted count fell from 12.67 (95% CI 10.05 to 15.97) in January 2001 to 5.23 (95% CI 3.82 to 7.15) in December 2006 (p<0.001). The corresponding values for the GLM were 11.9 and 3.94. Residual plots suggested that the GLM underestimated the rate at the beginning and end of the series and overestimated it in the middle. The data suggested a more rapid rate fall before 2004 and a steady state thereafter, a pattern reflected in the GAMM chart. The approximate upper two-sigma equivalent control limit in the GLM and GAMM charts identified 2 months that showed possible special-cause variation. Conclusion Charts based on GAMM analysis are a suitable alternative to Shewhart U control charts with these data.
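
For reference, a minimal sketch of the classical Shewhart U chart that these methods are compared against, using synthetic monthly counts and exposures; the paper's modified chart and the GAMM-derived limits are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = rng.uniform(2000, 3000, size=72)        # monthly exposure (e.g. bed-days)
x = rng.poisson(0.004 * n)                  # monthly adverse-event counts

u = x / n                                   # monthly rate
u_bar = x.sum() / n.sum()                   # centre line
ucl = u_bar + 3 * np.sqrt(u_bar / n)        # upper 3-sigma limit, per month
lcl = np.maximum(0, u_bar - 3 * np.sqrt(u_bar / n))

signals = np.where(u > ucl)[0]
print("months above the UCL:", signals + 1)
```

The per-month limits widen when the denominator is small; overdispersion, trend and autocorrelation of the kind reported above are precisely what this simple chart cannot absorb, motivating the GLM- and GAMM-based alternatives.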

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action to perform which minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements like task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to tasks to be performed, in order to deal with the interplay between risks relative to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that the process instances executed concurrently complete with significantly fewer faults and with lower fault severities, when the recommendations provided by our recommendation system are taken into account.
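
A minimal sketch of the cross-instance assignment step, under the simplifying assumptions of one pending task per instance and a precomputed matrix of predicted risks; it uses the Hungarian algorithm via SciPy rather than the paper's integer linear program.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# risk[r, t]: predicted process risk if resource r performs pending task t
# (in the paper these predictions come from decision trees over past logs).
risk = np.array([
    [0.10, 0.40, 0.35],
    [0.25, 0.15, 0.50],
    [0.30, 0.45, 0.05],
])
rows, cols = linear_sum_assignment(risk)    # minimises total predicted risk
for r, t in zip(rows, cols):
    print(f"resource {r} -> task {t} (predicted risk {risk[r, t]:.2f})")
print("total predicted risk:", risk[rows, cols].sum())
```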

Relevance:

80.00%

Publisher:

Abstract:

Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.

Relevance:

80.00%

Publisher:

Abstract:

Cowcod (Sebastes levis) is a large (100-cm-FL), long-lived (maximum observed age 55 yr) demersal rockfish taken in multispecies commercial and recreational fisheries off southern and central California. It lives at 20–500 m depth: adults (>44 cm TL) inhabit rocky areas at 90–300 m and juveniles inhabit fine sand and clay at 40–100 m. Both sexes have similar growth and maturity. Both sexes recruit to the fishery before reaching full maturity. Based on age and growth data, the natural mortality rate is about M = 0.055/yr, but the estimate is uncertain. Biomass, recruitment, and mortality during 1951–98 were estimated in a delay-difference model with catch data and abundance indices. The same model gave less precise estimates for 1916–50 based on catch data and assumptions about virgin biomass and recruitment such as used in stock reduction analysis. Abundance indices, based on rare event data, included a habitat-area-weighted index of recreational catch per unit of fishing effort (CPUE index values were 0.003–0.07 fish per angler hour), a standardized index of proportion of positive tows in CalCOFI ichthyoplankton survey data (binomial errors, 0–13% positive tows/yr), and proportion of positive tows for juveniles in bottom trawl surveys (binomial errors, 0–30% positive tows/yr). Cowcod are overfished in the southern California Bight; biomass during the 1998 season was about 7% of the virgin level and recent catches have been near 20 metric tons (t)/yr. Projections based on recent recruitment levels indicate that biomass will decline at catch levels > 5 t/yr. Trend data indicate that recruitment will be poor in the near future. Recreational fishing effort in deep water has increased and has become more effective for catching cowcod. Areas with relatively high catch rates for cowcod are fewer and are farther offshore. Cowcod die after capture and cannot be released alive. Two areas recently closed to bottom fishing will help rebuild the cowcod stock.
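
A minimal sketch of a delay-difference biomass projection of the kind described, using the Deriso-Schnute recursion with constant recruitment; all parameter values are illustrative, not the assessment's estimates.

```python
import numpy as np

M, rho = 0.055, 0.9      # natural mortality and Ford growth coefficient
R = 2.0                  # assumed constant annual recruitment (t of biomass)
catch = 5.0              # projected annual catch (t/yr)

def project(B0, B1, years=30):
    """Deriso-Schnute recursion:
    B[t+1] = (1 + rho) * s[t] * B[t] - rho * s[t] * s[t-1] * B[t-1] + R,
    with total survival s[t] = exp(-M) * (1 - catch / B[t])."""
    B = [B0, B1]
    s_prev = np.exp(-M) * (1.0 - catch / B0)
    for _ in range(years):
        s = np.exp(-M) * (1.0 - catch / B[-1])
        B_next = (1.0 + rho) * s * B[-1] - rho * s * s_prev * B[-2] + R
        B.append(max(B_next, 1e-6))      # keep the projection non-negative
        s_prev = s
    return np.array(B)

print(np.round(project(100.0, 100.0)[::5], 1))   # biomass every 5th year
```

Re-running the projection with a larger catch shows how quickly the trajectory turns downward, the qualitative behaviour behind the > 5 t/yr finding above.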

Relevance:

80.00%

Publisher:

Abstract:

RFID technology can be used to its fullest potential only with software to supplement the hardware with powerful capabilities for data capture, filtering, counting and storage. The EPCglobal Network architecture encourages minimizing the amount of business logic embedded in the tags, readers and middleware. This creates the need for a Business Logic Layer above the event filtering layer that enhances basic observation events with business context - i.e. in addition to the (what, when, where) information about an observation, it adds context information about why the object was there. The purpose of this project is to develop an implementation of the Business Logic Layer. This application accepts observation event data (e.g. from the Application Level Events (ALE) standard interface), enriches them with business context and provides these enriched events to a repository of business-level events (e.g. via the EPC Information Services (EPCIS) capture interface). The strength of the application lies in the automatic addition of business context. It is quick and easy to adapt any business process to the framework suggested and equally easy to reconfigure it if the business process is changed. A sample application has been developed for a business scenario in the retail sector.
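
A minimal sketch of the enrichment step, with hypothetical reader URIs and a static context lookup; the ALE and EPCIS interfaces are mimicked as plain dictionaries, and the vocabulary only loosely follows EPCIS.

```python
from datetime import datetime, timezone

# Static business context, e.g. loaded from the process configuration:
# read point -> business step and disposition (the "why" of an observation).
CONTEXT = {
    "urn:reader:dock-door-1": {"bizStep": "receiving", "disposition": "in_progress"},
    "urn:reader:backroom-3":  {"bizStep": "storing",   "disposition": "sellable_not_accessible"},
}

def enrich(observation: dict) -> dict:
    """Add business context to a (what, when, where) observation event."""
    ctx = CONTEXT.get(observation["readPoint"], {})
    return {**observation, **ctx,
            "recordTime": datetime.now(timezone.utc).isoformat()}

raw = {"epc": "urn:epc:id:sgtin:0614141.107346.2017",
       "eventTime": "2011-04-03T20:33:31Z",
       "readPoint": "urn:reader:dock-door-1"}
print(enrich(raw))   # enriched event, ready for an EPCIS-style capture call
```

Reconfiguring the business process then amounts to editing the context table rather than the capture pipeline, which is the adaptability the abstract emphasises.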

Relevance:

80.00%

Publisher:

Abstract:

Many of the challenges faced in health care delivery can be informed through building models. In particular, Discrete Conditional Survival (DCS) models, recently under development, can provide policymakers with a flexible tool to assess time-to-event data. The DCS model is capable of modelling the survival curve based on various underlying distribution types and is capable of clustering or grouping observations (based on other covariate information) external to the distribution fits. The flexibility of the model comes through the choice of data mining techniques that are available in ascertaining the different subsets and also in the choice of distribution types available in modelling these informed subsets. This paper presents an illustrated example of the Discrete Conditional Survival model being deployed to represent ambulance response-times by a fully parameterised model. This model is contrasted against use of a parametric accelerated failure-time model, illustrating the strength and usefulness of Discrete Conditional Survival models.
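
A minimal sketch of the two-stage DCS idea, clustering on a covariate and then fitting a separate parametric distribution to each subset; KMeans and log-normal fits stand in here for whatever data-mining technique and distribution types a given application would choose.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Synthetic ambulance response times (minutes), driven by distance to scene.
distance = np.concatenate([rng.normal(2, 0.5, 150), rng.normal(12, 2, 150)])
resp = rng.lognormal(np.log(4 + 0.6 * distance), 0.3)

# Stage 1: cluster observations on the covariate, external to the fits.
labels = KMeans(n_clusters=2, n_init=10, random_state=0) \
             .fit_predict(distance.reshape(-1, 1))

# Stage 2: fit a distribution to each informed subset.
for k in range(2):
    shape, loc, scale = stats.lognorm.fit(resp[labels == k], floc=0)
    print(f"cluster {k}: lognormal sigma={shape:.2f}, median={scale:.1f} min")
```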

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone.

METHODS: Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m²) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544).

FINDINGS: 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc.

INTERPRETATION: Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy.

FUNDING: Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.
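
A minimal sketch of the kind of adjusted Cox model behind the reported hazard ratios, on synthetic data with a single treatment covariate; `lifelines` stands in for the trial's actual analysis software, and the event-time mechanism is an assumed exponential.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
docetaxel = rng.integers(0, 2, n)        # 1 = SOC + Doc, 0 = SOC-only
# Exponential survival times with a true hazard ratio of 0.78 for docetaxel.
scale = 60.0 / np.where(docetaxel == 1, 0.78, 1.0)
time = rng.exponential(scale)
censor = rng.uniform(10.0, 90.0, n)      # administrative censoring times

df = pd.DataFrame({
    "months": np.minimum(time, censor),
    "died": (time <= censor).astype(int),
    "docetaxel": docetaxel,
})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()                      # HR = exp(coef), with 95% CI
```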

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2012

Relevance:

80.00%

Publisher:

Abstract:

It is proposed to study the suspended sediment transport characteristics of the river basins of Kerala and to model the suspended sediment discharge mechanism for typical micro-watersheds. The Pamba river basin is selected as a representative hydrologic regime for detailed study of suspended sediment characteristics and their seasonal variation. The applicability of various erosion models was tested by comparison with observed event data, obtained by continuous monitoring of rainfall, discharge, and suspended sediment concentration in lower-order streams; empirical, conceptual, and physically distributed models were used in the comparison. Large variations in discharge and sediment quantities were noticed both between the river basins investigated in a given year and, for an individual basin, between the years for which data were available. In general, the sediment yield pattern follows the seasonal distribution of rainfall and discharge and the physiography of the land, which conforms with similar studies of other Indian rivers. It was also observed that the quantity of sediment transported downstream shows a decreasing trend over the years, in contrast to the corresponding increase in discharge. For sound and sustainable management of coastal zones, it is important to understand the balance between erosion and retention and to quantify the exact amount of sediment reaching this ecosystem. This necessitates a sufficiently long time series of data and more focused research on the behaviour of each river system, both present and past. In this realm of river inputs to the ocean system, each of the 41 rivers of Kerala may have a dominant yet diversified role in influencing the coastal ecosystem, as reflected in this study of the major fraction of transport, namely the suspended sediments.
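
One of the simpler empirical models such event data can be tested against is a sediment rating curve, Qs = a * Q^b, fitted in log space; a minimal sketch with synthetic discharge and load pairs (the monitored Pamba data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(5)
Q = rng.uniform(5, 500, 80)                      # discharge (m^3/s)
Qs = 0.02 * Q**1.6 * rng.lognormal(0, 0.3, 80)   # suspended load (t/day)

# Linear fit in log-log space: log Qs = b * log Q + log a.
b, log_a = np.polyfit(np.log(Q), np.log(Qs), 1)
print(f"rating curve: Qs = {np.exp(log_a):.3f} * Q^{b:.2f}")
```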

Relevance:

80.00%

Publisher:

Abstract:

Optimistic bias is a commonly observed but poorly explained phenomenon. Our aim was to determine whether optimistic bias varied according to the nature of the event. Two event characteristics were explored: control and delay. A sample of 100 participants aged 18–30 years was randomly selected from the local residential telephone directory. Respondents were interviewed over the telephone. The highly structured interview schedule assessed respondents' perceptions of their own risk, and the risk of an average person of their age and sex for experiencing four negative life events: developing skin cancer, being involved in a serious car accident as the driver, being involved in a serious car accident as a passenger and having to wear a hearing aid. It also assessed respondents' perceptions of control and delay for each event. Data analysis using a repeated-measures MANOVA showed that optimistic bias occurred for all four events. Optimistic bias was significantly greater for the two events high in control (skin cancer and accident as the driver) than for those low in control (accident as a passenger and hearing aid). Delay was not related to the magnitude of optimistic bias. These findings have implications for health promotion campaigns and self-protective behaviors.

Relevance:

80.00%

Publisher:

Abstract:

Background Analysis of recurrent event data is frequently needed in clinical and epidemiological studies. An important issue in such analysis is how to account for the dependence of events within an individual and any unobserved heterogeneity of the event propensity across individuals. Methods We applied a number of conditional frailty and nonfrailty models in an analysis involving recurrent myocardial infarction events in the Long-Term Intervention with Pravastatin in Ischaemic Disease study. A multiple variable risk prediction model was developed for both males and females. Results A Weibull model with a gamma frailty term fitted the data better than other frailty models for each gender. Among nonfrailty models the stratified survival model fitted the data best for each gender. The relative risk estimated by the elapsed time model was close to that estimated by the gap time model. We found that a cholesterol-lowering drug, pravastatin (the intervention being tested in the trial), had a significant protective effect against the occurrence of myocardial infarction in men (HR = 0.71, 95% CI 0.60–0.83). However, the treatment effect was not significant in women, owing to the smaller sample size (HR = 0.75, 95% CI 0.51–1.10). There were no significant interactions between the treatment effect and each recurrent MI event (p = 0.24 for men and p = 0.55 for women). The risk of developing an MI event for a male who had an MI event during follow-up was about 3.4 (95% CI 2.6–4.4) times the risk for those who did not have an MI event. The corresponding relative risk for a female was about 7.8 (95% CI 4.4–13.6). Limitations The number of female patients was relatively small compared with their male counterparts, which may result in low statistical power to find real differences in the effect of treatment and other potential risk factors. Conclusions The conditional frailty model suggested that, after accounting for all the risk factors in the model, there was still unmeasured heterogeneity of the risk for myocardial infarction, indicating the effect of subject-specific risk factors. These risk prediction models can be used to classify cardiovascular disease patients into different risk categories and may be useful for the most effective targeting of preventive therapies for cardiovascular disease.
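
A minimal sketch of the gamma-frailty Weibull structure the study found to fit best: each individual's hazard is u * h0(t) with frailty u ~ Gamma(mean 1, variance theta), and integrating u out gives the population survival S(t) = (1 + theta * H0(t))^(-1/theta). Parameter values are illustrative, not the study's estimates.

```python
import numpy as np

k, lam, theta = 1.3, 0.02, 0.8      # Weibull shape/rate and frailty variance

def H0(t):
    """Cumulative baseline hazard of a Weibull with shape k and rate lam."""
    return (lam * t) ** k

def marginal_survival(t):
    return (1 + theta * H0(t)) ** (-1 / theta)

def marginal_hazard(t):
    # h0(t) shrunk by the frailty term: frail individuals fail early,
    # so the surviving population looks progressively more robust.
    return k * lam * (lam * t) ** (k - 1) / (1 + theta * H0(t))

t = np.array([1.0, 5.0, 10.0])
print("S(t):", marginal_survival(t))
print("h(t):", marginal_hazard(t))
```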