820 results for Risk assessment Mathematical models
Abstract:
Lean strategies have been developed to eliminate or reduce manufacturing waste and thus improve operational efficiency in manufacturing processes. However, implementing lean strategies requires substantial resources, and in practice manufacturers have difficulty selecting appropriate lean strategies within their resource constraints; no systematic methodology is currently available for doing so. In the lean transformation process, it is also critical to measure the current and desired leanness levels in order to evaluate lean implementation efforts clearly. Although many lean strategies are used to reduce or eliminate manufacturing waste, little effort has been directed towards properly assessing the leanness of manufacturing organizations. In practice, a single metric or a specific group of metrics (whether qualitative or quantitative) measures overall leanness only partially, and existing leanness assessment methodologies do not offer a comprehensive evaluation method that integrates both quantitative and qualitative lean measures into a single quantitative value for the overall leanness of an organization. This research aims to develop mathematical models and a systematic methodology for selecting appropriate lean strategies and evaluating leanness levels in manufacturing organizations. Mathematical models were formulated, and a methodology developed, for selecting appropriate lean strategies within a manufacturer's limited available resources to reduce its identified wastes. A leanness assessment model was developed using fuzzy concepts to assess the leanness level and to recommend an optimum leanness value for a manufacturing organization. The proposed leanness assessment model takes both quantitative and qualitative input factors into account. 
Based on programs developed in MATLAB and C#, a decision support tool (DST) was built for decision makers to select lean strategies and evaluate the leanness value using the proposed models and methodology, and hence to sustain lean implementation efforts. A case study was conducted to demonstrate the effectiveness of the proposed models and methodology. Its results suggested that, of the 10 wastes identified, the case organization (ABC Limited) could improve a maximum of six at the selected workstation within its resource limitations. The selected wastes were unnecessary motion, setup time, unnecessary transportation, inappropriate processing, work in process, and raw material inventory; the suggested lean strategies were 5S, Just-In-Time, the Kanban System, the Visual Management System (VMS), Cellular Manufacturing, a Standard Work Process using method-time measurement (MTM), and Single Minute Exchange of Die (SMED). From the suggested lean strategies, the impact of 5S was demonstrated by measuring the leanness level in two different situations at ABC. MTM was then suggested as a standard work process for further improvement of the current leanness value. The initial status of the organization showed a leanness value of 0.12. Applying 5S improved the leanness level significantly, to 0.19, and simulation of MTM as a standard work method showed the leanness value could be improved to 0.31. The optimum leanness value of ABC was calculated to be 0.64. These leanness values gave the case organization a quantitative indication of the impact of improvement initiatives on the overall leanness level. Sensitivity analysis and a t-test were also performed to validate the proposed model. This research advances the current knowledge base by developing mathematical models and methodologies to overcome the lean strategy selection and leanness assessment problems. 
By selecting appropriate lean strategies, a manufacturer can better prioritize implementation efforts and resources to maximize the benefits of lean implementation in its organization. The leanness index is used to evaluate an organization's current (pre-implementation) leanness state against the state after lean implementation and to establish a benchmark (the optimum leanness state). Hence, this research provides a continuous improvement tool for a lean manufacturing organization.
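The strategy selection problem described above — choosing which wastes to address when resources are limited — has the structure of a 0/1 knapsack optimization. The sketch below is illustrative only: the benefit scores, resource costs, and budget are hypothetical stand-ins, not the thesis's actual models or data.

```python
# Hypothetical waste-improvement options: name -> (benefit score, resource cost).
# All values are invented for illustration.
options = {
    "unnecessary motion": (8, 3),
    "setup time": (7, 4),
    "unnecessary transportation": (6, 2),
    "inappropriate processing": (5, 5),
}

def select_wastes(options, budget):
    """Pick the subset of wastes maximizing total benefit within the budget
    (0/1 knapsack solved by dynamic programming over resources spent)."""
    best = {0: (0, [])}  # resources spent -> (total benefit, chosen wastes)
    for name, (benefit, cost) in options.items():
        # Snapshot the current states so each option is used at most once.
        for spent in sorted(best, reverse=True):
            value, chosen = best[spent]
            new_spent = spent + cost
            if new_spent <= budget:
                candidate = (value + benefit, chosen + [name])
                if new_spent not in best or candidate[0] > best[new_spent][0]:
                    best[new_spent] = candidate
    return max(best.values())  # highest total benefit achievable

benefit, chosen = select_wastes(options, budget=9)
```

With a budget of 9 resource units, the best achievable subset here is motion, setup time, and transportation (total benefit 21), mirroring how a resource constraint can leave some identified wastes unaddressed.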
Abstract:
Background: Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study's aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods: A single-site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with 300 outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria were age ≥18 years, receiving anticancer treatment, and ability to provide written consent. Patients completed the Malnutrition Screening Tool (MST), and nutritional status was assessed using the PG-SGA. Data for the automated screening system were extracted from the pharmacy software program Charm, including body mass index (BMI) and weight records dating back up to six months. Results: The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system, predicted PG-SGA-classified malnutrition with 56.52% sensitivity, 35.43% specificity, 13.68% positive predictive value, and 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions: Both the automated screening system and the MST fell short of the accepted professional standard for sensitivity (80%) or specificity (60%) when compared with the PG-SGA. Although the MST remains the better predictor of malnutrition in this setting, uptake of the tool in the Oncology Day Care Unit remains challenging.
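The four screening metrics reported above follow directly from a 2×2 table of screening result against PG-SGA classification. In the sketch below the cell counts are reconstructed from the reported prevalence (17% of 300 ≈ 51 malnourished patients) and the MST's accuracy figures, so treat them as a plausible reconstruction rather than the study's raw data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening metrics from a 2x2 table against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # malnourished patients flagged
        "specificity": tn / (tn + fp),  # well-nourished patients cleared
        "ppv": tp / (tp + fp),          # flagged patients truly malnourished
        "npv": tn / (tn + fn),          # cleared patients truly well nourished
    }

# Counts reconstructed (not taken from the paper) so that 36/51 and 173/249
# reproduce the reported MST sensitivity and specificity.
mst = screening_metrics(tp=36, fp=76, fn=15, tn=173)
```

Note how the low prevalence drives the asymmetry between PPV (36/112 ≈ 32%) and NPV (173/188 ≈ 92%): with few true cases, even a moderately specific tool accumulates many false positives.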
Abstract:
Background: There is currently no early predictive marker of survival for patients receiving chemotherapy for malignant pleural mesothelioma (MPM). Tumour response may be predictive of overall survival (OS), though this has not been explored. We have therefore undertaken a combined analysis of OS, from a 42-day landmark, of 526 patients receiving systemic therapy for MPM. We also validate published progression-free survival rates (PFSRs) and a progression-free survival (PFS) prognostic-index model. Methods: Analyses included nine MPM clinical trials, incorporating six European Organisation for Research and Treatment of Cancer (EORTC) studies. OS from the landmark (day 42 after the start of treatment) was analysed by tumour response. The PFSR analysis included data from six non-EORTC MPM clinical trials. Prognostic-index validation was performed on one non-EORTC dataset with available survival data. Results: Median OS from the landmark was 12·8 months for patients with partial response (PR), 9·4 months for stable disease (SD), and 3·4 months for progressive disease (PD). Both PR and SD were associated with longer OS from the landmark compared with disease progression (both p < 0·0001). PFSRs for platinum-based combination therapies were consistent with published ranges for significant clinical activity. Effective separation between the PFS and OS curves validated the EORTC prognostic model, based on histology, stage, and performance status. Conclusion: Response to chemotherapy is associated with significantly longer OS from the landmark in patients with MPM. © 2012 Elsevier Ltd. All rights reserved.
Abstract:
Background: Findings from the phase 3 First-Line ErbituX in lung cancer (FLEX) study showed that the addition of cetuximab to first-line chemotherapy significantly improved overall survival compared with chemotherapy alone (hazard ratio [HR] 0·871, 95% CI 0·762-0·996; p=0·044) in patients with advanced non-small-cell lung cancer (NSCLC). To define patients benefiting most from cetuximab, we studied the association of tumour EGFR expression level with clinical outcome in FLEX study patients. Methods: We used prospectively collected tumour EGFR expression data to generate an immunohistochemistry score for FLEX study patients on a continuous scale of 0-300. We used response data to select an outcome-based discriminatory threshold immunohistochemistry score for EGFR expression of 200. Treatment outcome was analysed in patients with low (immunohistochemistry score <200) and high (≥200) tumour EGFR expression. The primary endpoint in the FLEX study was overall survival. We analysed patients from the FLEX intention-to-treat (ITT) population. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: Tumour EGFR immunohistochemistry data were available for 1121 of 1125 (99·6%) patients from the FLEX study ITT population. High EGFR expression was scored for 345 (31%) evaluable patients and low for 776 (69%) patients. For patients in the high EGFR expression group, overall survival was longer in the chemotherapy plus cetuximab group than in the chemotherapy alone group (median 12·0 months [95% CI 10·2-15·2] vs 9·6 months [7·6-10·6]; HR 0·73, 0·58-0·93; p=0·011), with no meaningful increase in side-effects. We recorded no corresponding survival benefit for patients in the low EGFR expression group (median 9·8 months [8·9-12·2] vs 10·3 months [9·2-11·5]; HR 0·99, 0·84-1·16; p=0·88). 
A treatment interaction test assessing the difference in the HRs for overall survival between the EGFR expression groups suggested a predictive value for EGFR expression (p=0·044). Interpretation: High EGFR expression is a tumour biomarker that can predict survival benefit from the addition of cetuximab to first-line chemotherapy in patients with advanced NSCLC. Assessment of EGFR expression could offer a personalised treatment approach in this setting. Funding: Merck KGaA. © 2012 Elsevier Ltd.
Abstract:
Objective: Modern series from high-volume esophageal centers report an approximate 40% 5-year survival in patients treated with curative intent and postoperative mortality rates of less than 4%. An objective analysis of factors that underpin current benchmarks within high-volume centers has not been performed. Methods: Three time periods were studied, 1990 to 1998 (period 1), 1999 to 2003 (period 2), and 2004 to 2008 (period 3), in which 471, 254, and 342 patients, respectively, with esophageal cancer were treated with curative intent. All data were prospectively recorded, and staging, pathology, treatment, operative, and oncologic outcomes were compared. Results: Five-year disease-specific survival was 28%, 35%, and 44%, and in-hospital postoperative mortality was 6.7%, 4.4%, and 1.7% for periods 1 to 3, respectively (P < .001). Period 3, compared with periods 1 and 2, respectively, was associated with significantly (P < .001) more early tumors (17% vs 4% and 6%), higher nodal yields (median 22 vs 11 and 18), and a higher R0 rate in surgically treated patients (81% vs 73% and 75%). The use of multimodal therapy increased (P < .05) across time periods. By multivariate analysis, age, T stage, N stage, vascular invasion, R status, and time period were significantly (P < .0001) associated with outcome. Conclusions: Improved survival with localized esophageal cancer in the modern era may reflect an increase of early tumors and optimized staging. Important surgical and pathologic standards, including a higher R0 resection rate and nodal yields, and lower postoperative mortality, were also observed. Copyright © 2012 by The American Association for Thoracic Surgery.
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain depends on clinical evaluation of risk, and a number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) of risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score, and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings, and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, analyzed using receiver operating characteristic (ROC) curves. Results: Two hundred and eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS, and Goldman tools for the endpoint of MACE was 0.54, 0.71, and 0.67, respectively, and the difference between the tools in predictive ability for MACE was highly significant [χ2(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
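An AUC such as those reported (0.54, 0.71, 0.67) can be read as the probability that a randomly chosen MACE patient receives a higher risk score than a randomly chosen non-MACE patient. A minimal sketch of that rank-based (Mann-Whitney) reading of the AUC, using invented scores rather than any of the study's data:

```python
def auc(event_scores, nonevent_scores):
    """Probability that a random event case outranks a random non-event case,
    counting ties as half; equivalent to the area under the ROC curve."""
    pairs = len(event_scores) * len(nonevent_scores)
    wins = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                wins += 1.0
            elif e == n:
                wins += 0.5
    return wins / pairs

# Invented example: risk scores for three MACE and three non-MACE patients.
example_auc = auc([4, 3, 2], [1, 2, 0])
```

An uninformative tool scores 0.5 (a coin flip), which is why the HFA's 0.54 indicates near-chance discrimination while 0.71 for the TIMI RS is meaningfully better.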
Abstract:
The contact lens industry has evolved and now provides many choices, including continuous wear, overnight orthokeratology, frequent-replacement lenses, daily-disposable lenses, and many alternatives in systems of care and maintenance. Epidemiologic studies to date have shown that how a lens is worn, particularly if worn overnight, can increase the risk of microbial keratitis. However, the risk of silicone hydrogel contact lenses worn on a continuous-wear basis has been evaluated only recently. This article summarizes recent research data on extended-wear silicone hydrogel lenses and discusses the challenges of early evaluations of silicone hydrogel lens safety. Finally, the relevance of this information to practitioners and contact lens wearers making choices about the risks and benefits of different products, and how they are used, is discussed.
Abstract:
Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint the domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. The method is then tested through a preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north-eastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for testing and reviewing the application of improved tools for governance risk assessment. © 2013 IOP Publishing Ltd.
Abstract:
The trust and credibility gap between institutional regulators and the public is rooted in fundamental social and cultural differences related to power and authority. It is also associated with the 'distance' of bureaucracies from those they serve. The nature of public concern about risk may be investigated by considering specific cognitive decision-making 'rules', such as the 'familiarity' of a hazard or the 'voluntariness' of exposure. A more complete appreciation of the 'how' and 'why' of public response to danger from industrial hazards can be gained by viewing these 'rules' within the broader context of mis-communication between 'elite' regulators and a highly diverse public. If the results of risk assessments are expressed in technical terms alone, it is unlikely that any real communication will occur. Further, if issues related to the 'remote' nature of much institutional decision making are not addressed, the 'gap' may be difficult to close.
Abstract:
The technical feasibility of roll motion control devices has been amply demonstrated for over 100 years. Performance, however, can still fall short of expectations because of difficulties associated with control system designs, which have proven to be far from trivial due to fundamental performance limitations and large variations of the spectral characteristics of wave-induced roll motion. This tutorial paper presents an account of the development of various ship roll motion control systems together with the challenges associated with their design. It discusses the assessment of performance and the applicability of different mathematical models, and it surveys the control methods that have been implemented and validated with full scale experiments. The paper also presents an outlook on what are believed to be potential areas of research within this topic.
Abstract:
Post-earthquake fire (PEF) is considered one of the highest-risk and most complicated problems affecting buildings in urban areas and can cause even more damage than the earthquake itself. However, most standards and codes ignore the implications of PEF, so buildings are not normally designed with PEF in mind. What is needed is for PEF factors to be routinely scrutinized and codified as part of the design process. A systematic approach is presented as a means of mitigating the risk of PEF in urban buildings. This covers both existing buildings, in terms of retrofit solutions, and those yet to be designed, for which a PEF factor is proposed. To ensure the mitigation strategy meets the defined criteria, a minimum time is defined – the safety-guaranteed time target – during which the safety of the inhabitants of a building is guaranteed.
Abstract:
Multivariate predictive models are widely used tools for assessing aquatic ecosystem health, and models have been successfully developed for the prediction and assessment of aquatic macroinvertebrates, diatoms, local stream habitat features, and fish. We evaluated the ability of a modelling method based on the River InVertebrate Prediction And Classification System (RIVPACS) to accurately predict freshwater fish assemblage composition and assess aquatic ecosystem health in rivers and streams of south-eastern Queensland, Australia. The predictive model was developed, validated, and tested in a region of comparatively high environmental variability due to the unpredictable nature of rainfall and river discharge. The model provided sufficiently accurate and precise predictions of species composition and was sensitive enough to distinguish test sites impacted by several common types of human disturbance (particularly impacts associated with catchment land use and the associated degradation of local riparian zones, in-stream habitat, and water quality). The total number of fish species available for prediction was low in comparison with similar applications of multivariate predictive models based on other indicator groups, yet the accuracy and precision of our model were comparable to the outcomes of such studies. In addition, our model, although developed for sites sampled on one occasion and in one season only (winter), was able to accurately predict fish assemblage composition at sites sampled during other seasons and years, provided they were not subject to unusually extreme environmental conditions (e.g. extended periods of low flow that restricted fish movement or resulted in habitat desiccation and local fish extinctions).
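RIVPACS-style models typically score a site with an observed/expected (O/E) taxa ratio: E sums the model-predicted occurrence probabilities of taxa expected at the site, and O counts how many of those taxa were actually collected, so an unimpacted site scores near 1. The sketch below shows only that final index step; the species names, probabilities, and 0.5 threshold are hypothetical illustrations, not this study's model.

```python
def oe_index(predicted, observed, threshold=0.5):
    """O/E ratio: predicted maps taxon -> modelled occurrence probability;
    observed is the set of taxa actually collected at the test site."""
    # Restrict to taxa the model expects with at least the threshold probability.
    expected_taxa = {t: p for t, p in predicted.items() if p >= threshold}
    e = sum(expected_taxa.values())            # expected number of those taxa
    o = sum(1 for t in expected_taxa if t in observed)  # how many were seen
    return o / e

# Hypothetical site: three taxa expected (eel narrowly so), two observed.
predicted = {"rainbowfish": 0.9, "gudgeon": 0.8, "eel": 0.6, "carp": 0.3}
observed = {"rainbowfish", "gudgeon"}
oe = oe_index(predicted, observed)  # O = 2, E = 0.9 + 0.8 + 0.6 = 2.3
```

An O/E well below 1, as here, is the model's signal that expected taxa are missing, which is how such models flag disturbed test sites.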
Abstract:
We consider the problem of combining opinions from different experts in an explicitly model-based way to construct a valid subjective prior for a Bayesian statistical approach. We propose a generic approach based on a hierarchical model that accounts for various sources of variation as well as for potential dependence between experts. We apply this approach to two problems. The first is a food risk assessment problem involving dose-response modelling for Listeria monocytogenes contamination of mice; two hierarchical levels of variation are considered (between and within experts), with a mathematically complex situation arising from the use of an indirect probit regression. The second concerns the time taken by PhD students in a particular school to submit their theses; it illustrates a situation where three hierarchical levels of variation are modelled but with a simpler underlying probability distribution (log-Normal).
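As an illustration of the precision-weighting idea behind such hierarchical pooling (a deliberately simplified stand-in for the paper's models, which additionally handle probit dose-response structure and dependence between experts), consider a two-level normal model: each expert i reports an estimate y_i of a quantity mu with within-expert variance s_i² plus a shared between-expert variance tau². Under a flat prior on mu, the posterior mean is the precision-weighted average of the expert estimates. All numbers below are hypothetical.

```python
def pool_experts(estimates, within_vars, tau2):
    """Posterior mean of mu under y_i ~ Normal(mu, s_i^2 + tau2), flat prior:
    a precision-weighted average, where precision = 1 / total variance."""
    weights = [1.0 / (s2 + tau2) for s2 in within_vars]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, estimates)) / total

# Two equally reliable experts: the pooled estimate is the plain mean.
mu_equal = pool_experts([2.0, 4.0], within_vars=[1.0, 1.0], tau2=1.0)

# A noisier second expert is down-weighted, pulling the estimate
# toward the more reliable first expert.
mu_unequal = pool_experts([2.0, 4.0], within_vars=[1.0, 9.0], tau2=1.0)
```

Larger tau² flattens the weights toward a simple average, which is one way hierarchical models encode how much the experts are trusted to agree.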
Abstract:
Early full-term pregnancy is one of the most effective natural protections against breast cancer. To investigate this effect, we have characterized the global gene expression and epigenetic profiles of multiple cell types from normal breast tissue of nulliparous and parous women and carriers of BRCA1 or BRCA2 mutations. We found significant differences in CD44+ progenitor cells, where the levels of many stem cell-related genes and pathways, including the cell-cycle regulator p27, are lower in parous women without BRCA1/BRCA2 mutations. We also noted a significant reduction in the frequency of CD44+p27+ cells in parous women and showed, using explant cultures, that parity-related signaling pathways play a role in regulating the number of p27+ cells and their proliferation. Our results suggest that pathways controlling p27+ mammary epithelial cells and the numbers of these cells relate to breast cancer risk and can be explored for cancer risk assessment and prevention.