965 results for Service Failure
Abstract:
Background: The aim of this study was to determine the effects of carvedilol on the costs related to the treatment of severe chronic heart failure (CHF). Methods: Costs for the treatment of heart failure within the National Health Service (NHS) in the United Kingdom (UK) were applied to resource utilisation data prospectively collected in all patients randomized into the Carvedilol Prospective Randomized Cumulative Survival (COPERNICUS) Study. Unit-specific, per diem (hospital bed day) costs were used to calculate expenditures due to hospitalizations. We also included costs of carvedilol treatment, general practitioner surgery/office visits, hospital out-patient clinic visits and nursing home care based on estimates derived from validated patterns of clinical practice in the UK. Results: The estimated cost of carvedilol therapy and related ambulatory care for the 1156 patients assigned to active treatment was £530,771 (£44.89 per patient/month of follow-up). However, patients assigned to carvedilol were hospitalised less often and accumulated fewer and less expensive days of admission. Consequently, the total estimated cost of hospital care was £3.49 million in the carvedilol group compared with £4.24 million for the 1133 patients in the placebo arm. The cost of post-discharge care was also lower in the carvedilol group than in the placebo group (£479,200 vs. £548,300). Overall, the cost per patient treated in the carvedilol group was £3,948 compared to £4,279 in the placebo group. This equated to a cost of £385.98 vs. £434.18, respectively, per patient/month of follow-up: an 11.1% reduction in health care costs in favour of carvedilol. Conclusions: These findings suggest that carvedilol treatment can not only increase survival and reduce hospital admissions in patients with severe CHF but also reduce health care costs in the process.
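As a quick illustrative check of the reported figures (not part of the study's analysis), the snippet below recomputes the relative saving from the two per-patient/month costs quoted above.

```python
# Illustrative arithmetic only: recompute the relative reduction in monthly
# health care cost from the two per-patient/month figures quoted in the abstract.
carvedilol_cost = 385.98   # pounds per patient/month, carvedilol arm
placebo_cost = 434.18      # pounds per patient/month, placebo arm

relative_reduction = (placebo_cost - carvedilol_cost) / placebo_cost
print(f"Relative reduction: {relative_reduction:.1%}")  # ~11.1%, matching the abstract
```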
Why are consumers with heart failure still receiving non-steroidal anti-inflammatory drugs (NSAIDs)?
Abstract:
Increased competition, geographically expanded marketplaces, technology replication and an ever more discerning consumer base are reasons why companies need to regularly reappraise their competencies in terms of the activities and functions they perform themselves. Where viable alternatives exist, companies should consider outsourcing non-core activities and functions. Within Supply Chain Management (SCM), it could be preferable if a “one stop shop” existed for companies seeking to outsource functions identified as non-core. “Traditionally” structured logistics service providers (LSPs), which have concentrated their service offer around warehousing and transport activities, are potentially at a crossroads: clients and potential clients require “new” services which, if provided, could increase LSPs’ revenues, whilst failure to provide them could result in clients seeking outsourced services elsewhere.
Abstract:
Risk management in healthcare comprises a group of complex actions implemented to improve the quality of healthcare services and guarantee patient safety. Risks cannot be eliminated, but they can be controlled with different risk assessment methods derived from industrial applications; among these, Failure Mode, Effects and Criticality Analysis (FMECA) is a widely used methodology. The main purpose of this work is the analysis of failure modes of the Home Care (HC) service provided by the local healthcare unit of Naples (ASL NA1), focusing attention on human and non-human factors according to the organizational framework selected by the WHO. © Springer International Publishing Switzerland 2014.
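The abstract does not spell out the FMECA scoring procedure; as background, the sketch below illustrates the conventional risk priority number (RPN) calculation used in FMECA-style analyses. The failure modes and ratings are hypothetical examples, not data from the study.

```python
# Conventional FMECA-style scoring: each failure mode is rated for severity (S),
# occurrence (O) and detectability (D), and the risk priority number
# RPN = S * O * D ranks which modes deserve attention first.
# The modes and ratings below are invented for illustration only.
failure_modes = [
    {"mode": "missed home visit",         "S": 7, "O": 4, "D": 3},
    {"mode": "medication dosing error",   "S": 9, "O": 2, "D": 5},
    {"mode": "incomplete patient record", "S": 5, "O": 6, "D": 4},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Highest-risk modes first.
for fm in sorted(failure_modes, key=lambda x: x["RPN"], reverse=True):
    print(f'{fm["mode"]}: RPN = {fm["RPN"]}')
```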
Abstract:
This dissertation provides a theory of the effects and determinants of an economy's level of social services. The dissertation focuses on how the provision of social services affects the effort decisions of workers, which ultimately determine the economy's level of output. A worker decides how much effort to contribute in relation to the level of social services he or she receives: the higher the level of social services received, the lower the cost (disutility) of providing effort. The government provides public infrastructure and social services (e.g. health services) in accordance with the economy's endowment of effort; in doing so, it takes the aggregate effort endowment as given. Since higher individual work effort raises the economy's total level of effort, failure by workers to coordinate their effort levels can result in some outcomes with low effort, low social services and low output, and others with high effort, high social services and high output. This dissertation therefore predicts that, in the context of social services, coordination failures in effort levels can lead to development traps.
Abstract:
Distributed applications are exposed as reusable components that are dynamically discovered and integrated to create new applications. These new applications, in the form of aggregate services, are vulnerable to failure due to the autonomous and distributed nature of their integrated components. This vulnerability creates the need for adaptability in aggregate services. The need for adaptation is accentuated for complex long-running applications, as found in scientific Grid computing, where distributed computing nodes may participate to solve computation- and data-intensive problems. Such applications integrate services for coordinated problem solving in areas such as Bioinformatics. For such applications, when a constituent service fails, the application fails, even though there are other nodes that could substitute for the failed service. This concern is not addressed in the specification of high-level composition languages such as the Business Process Execution Language (BPEL). We propose an approach to transparently autonomizing existing BPEL processes in order to make them modifiable at runtime and more resilient to failures in their execution environment. Because the adaptive behavior is introduced transparently, adaptation preserves the original business logic of the aggregate service and does not tangle the code for adaptive behavior with that of the aggregate service. The major contributions of this dissertation are as follows. First, we assessed the effectiveness of BPEL language support in developing adaptive mechanisms; as a result, we identified the strengths and limitations of BPEL and devised strategies to address those limitations. Second, we developed a technique to enhance existing BPEL processes transparently in order to support dynamic adaptation, proposing a framework that uses transparent shaping and generative programming to make BPEL processes adaptive. Third, we developed a technique to dynamically discover and bind to substitute services; our evaluation showed that dynamic utilization of components improves the flexibility of adaptive BPEL processes. Fourth, we developed an extensible policy-based technique to specify how to handle exceptional behavior, and a generic component that introduces adaptive behavior into multiple BPEL processes. Fifth, we identified ways to apply our work to facilitate adaptability in composite Grid services.
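The dissertation's actual mechanism operates on BPEL processes; purely as an illustration of the failover idea described above (dynamically discovering and binding to a substitute when a constituent service fails), here is a minimal sketch in Python with hypothetical service names and placeholder discovery/invocation functions.

```python
# Minimal sketch of fallback to dynamically discovered substitute services.
# Endpoints, the discovery routine and the invocation routine are all
# hypothetical placeholders, not the dissertation's framework.

class ServiceUnavailable(Exception):
    pass

def discover_substitutes(service_name):
    # Placeholder for a registry/directory lookup of equivalent services.
    return [f"{service_name}-mirror-1", f"{service_name}-mirror-2"]

def invoke(endpoint, payload):
    # Placeholder invocation: pretend only the second mirror is reachable.
    if endpoint.endswith("mirror-2"):
        return {"endpoint": endpoint, "result": "ok"}
    raise ServiceUnavailable(endpoint)

def call_with_fallback(service_name, payload):
    # Try the primary endpoint first, then each discovered substitute in turn.
    for endpoint in [service_name] + discover_substitutes(service_name):
        try:
            return invoke(endpoint, payload)
        except ServiceUnavailable:
            continue  # rebind to the next candidate
    raise RuntimeError(f"All candidates for {service_name} failed")

print(call_with_fallback("blast-alignment-service", {"sequence": "ACGT"}))
```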
Abstract:
In their dialogue entitled “The Food Service Industry Environment: Market Volatility Analysis,” Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, preface the discussion by saying: “Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry.” That weighty thesis statement presumes that hospitality executives/managers do not fully comprehend the environment in which they operate. The authors provide a contrast, which conventional wisdom would seem to support and satisfy. “Broadly speaking, the operating environment of an organization is represented by its task domain,” say the authors. “This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups.” These are dynamic actors and the underpinnings of change, say the authors by way of citation. “The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm. Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur,” De Noble and Olsen further contribute to support their thesis statement. The article is bloated with tables, both survey-driven and empirically driven, to illustrate market volatility, and that's not necessarily a bad thing. One such table is the Bates and Eldredge outline, Table 6 in the article. “This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm,” say De Noble and Olsen. “It is, however, only a suggested outline,” they advise. “…risk should be incorporated into every investment decision, especially in a volatile environment,” say the authors. De Noble and Olsen close with an intriguing formula to gauge volatility in an environment.
Abstract:
In recent years, hotels in Cyprus have encountered difficult economic times due to increasing customer demands and strong internal industry competition. The hospitality industry's main concern globally is to serve its customers' needs and desires, most of which are addressed through personal services. Hence, the hotel businesses that are able to provide quality services to their ever-demanding customers in a warm and efficient manner are those most likely to obtain a long-term competitive advantage over their rivals. Ironically, the quality of services frequently cannot be fully appreciated until something goes wrong, and then poor quality of service can have long-lasting effects on the customer base and, hence, often translates into a loss of business. Nevertheless, since the delivery of hospitality services always involves people, the issue must centre on the management of the human resource factor, and in particular on the way it interacts with itself and with guests in service encounters. In the eyes of guests, hospitality businesses will be viewed as successes or failures depending on the cumulative impact of the service encounters they have experienced on a personal level. Finally, since hotels are offering intangible and perishable personal service encounters, managing these services must be a paramount concern of any hotel business. As a preliminary exercise, visualize when you last visited a hotel or a restaurant, and then ask yourself: What did you feel about the quality of the experience? Was it a memorable one that you would recommend to others, or were there certain things which could have made the difference? Thus, the way personalized services are provided can make the difference in attracting and retaining long-term customers.
Abstract:
IMPORTANCE: Prevention strategies for heart failure are needed.
OBJECTIVE: To determine the efficacy of a screening program using brain-type natriuretic peptide (BNP) and collaborative care in an at-risk population in reducing newly diagnosed heart failure and prevalence of significant left ventricular (LV) systolic and/or diastolic dysfunction.
DESIGN, SETTING, AND PARTICIPANTS: The St Vincent's Screening to Prevent Heart Failure Study, a parallel-group randomized trial involving 1374 participants with cardiovascular risk factors (mean age, 64.8 [SD, 10.2] years) recruited from 39 primary care practices in Ireland between January 2005 and December 2009 and followed up until December 2011 (mean follow-up, 4.2 [SD, 1.2] years).
INTERVENTION: Patients were randomly assigned to receive usual primary care (control condition; n=677) or screening with BNP testing (n=697). Intervention-group participants with BNP levels of 50 pg/mL or higher underwent echocardiography and collaborative care between their primary care physician and specialist cardiovascular service.
MAIN OUTCOMES AND MEASURES: The primary end point was prevalence of asymptomatic LV dysfunction with or without newly diagnosed heart failure. Secondary end points included emergency hospitalization for arrhythmia, transient ischemic attack, stroke, myocardial infarction, peripheral or pulmonary thrombosis/embolus, or heart failure.
RESULTS: A total of 263 patients (41.6%) in the intervention group had at least 1 BNP reading of 50 pg/mL or higher. The intervention group underwent more cardiovascular investigations (control, 496 per 1000 patient-years vs intervention, 850 per 1000 patient-years; incidence rate ratio, 1.71; 95% CI, 1.61-1.83; P < .001) and received more renin-angiotensin-aldosterone system-based therapy at follow-up (control, 49.6%; intervention, 56.5%; P = .01). The primary end point of LV dysfunction with or without heart failure was met in 59 (8.7%) of 677 in the control group and 37 (5.3%) of 697 in the intervention group (odds ratio [OR], 0.55; 95% CI, 0.37-0.82; P = .003). Asymptomatic LV dysfunction was found in 45 (6.6%) of 677 control-group patients and 30 (4.3%) of 697 intervention-group patients (OR, 0.57; 95% CI, 0.37-0.88; P = .01). Heart failure occurred in 14 (2.1%) of 677 control-group patients and 7 (1.0%) of 697 intervention-group patients (OR, 0.48; 95% CI, 0.20-1.20; P = .12). The incidence rates of emergency hospitalization for major cardiovascular events were 40.4 per 1000 patient-years in the control group vs 22.3 per 1000 patient-years in the intervention group (incidence rate ratio, 0.60; 95% CI, 0.45-0.81; P = .002).
CONCLUSION AND RELEVANCE: Among patients at risk of heart failure, BNP-based screening and collaborative care reduced the combined rates of LV systolic dysfunction, diastolic dysfunction, and heart failure.
TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00921960.
Abstract:
INTRODUCTION: EGFR screening requires good-quality tissue, sensitivity and a rapid turn-around time (TAT). We report our experience of routine screening, describing sample type, TAT, specimen quality (cellularity and DNA yield), histopathological description, mutation result and clinical outcome. METHODS: Non-small cell lung cancer (NSCLC) sections were screened for EGFR mutations (M+) in exons 18-21. Clinical, pathological and screening outcome data were collected for year 1 of testing. Screening outcome alone was collected for year 2. RESULTS: In year 1, 152 samples were tested; most (72%) were diagnostic. TAT was 4.9 days (95% confidence interval (CI) = 4.5-5.5). EGFR-M+ prevalence was 11% and was higher (20%) among never-smoking women with adenocarcinomas (ADCs), but 30% of mutations occurred in current/ex-smoking men. EGFR-M+ tumours were non-mucinous ADCs and 100% thyroid transcription factor 1 positive (TTF1+). No mutations were detected in poorly differentiated NSCLC-not otherwise specified (NOS). There was a trend for improved overall survival (OS) among EGFR-M+ versus EGFR-M- patients (median OS = 78 versus 17 months). In year 1, the test failure rate was 19%, and failures were associated with scant cellularity and low DNA concentrations. However, 75% of samples with poor cellularity but representative of tumour were informative, and mutation prevalence was 9%. In year 2, 755 samples were tested; mutation prevalence was 13% and the test failure rate was only 5.4%. Even among samples with low DNA concentration (2.2 ng/μL), the mutation rate was 9.2%. CONCLUSION: Routine epidermal growth factor receptor (EGFR) screening using diagnostic samples is fast and feasible even on samples with poor cellularity and DNA content. Mutations tend to occur in better-differentiated non-mucinous TTF1+ ADCs. Whether these histological criteria may be useful to select patients for EGFR testing merits further investigation.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Failure analysis has been, throughout the years, a fundamental tool in the aerospace sector, supporting assessments performed by sustainment and design engineers, mainly related to failure modes and material suitability. The predicted service life of aircraft often exceeds 40 years, and the design assured life rarely accounts for all the in-service loads and in-service environmental threats that aging aircraft must deal with throughout their service lives. From the most conservative safe-life conceptual design approaches to the most recent on-condition based design approaches, assessing the condition and predicting the failure modes of components and materials are essential for the development of adequate preventive and corrective maintenance actions, as well as for the accomplishment and optimization of scheduled aircraft maintenance programs. Moreover, as the operational conditions of aircraft may vary significantly from operator to operator (especially for military aircraft), it is necessary to assess whether the defined maintenance programs are adequate to guarantee the continuous reliability and safe usage of the aircraft, preventing catastrophic failures which bear significant maintenance and repair costs and may lead to the loss of human lives. Thus, failure analysis and material investigations performed as part of aircraft accident and incident investigations are powerful tools of the utmost importance for safety assurance and cost reduction within the aeronautical and aerospace sectors. The Portuguese Air Force (PRTAF) has operated different aircraft throughout its long existence, in some cases operating a particular type of aircraft for more than 30 years, and has gathered a great amount of expertise in assessing failure modes of aircraft materials, conducting aircraft accident and incident investigations (sometimes with the participation of the aircraft manufacturers and/or other operators), and developing design and repair solutions for in-service problems. This paper addresses several studies to support the thesis that failure analysis plays a key role in flight safety improvement within the PRTAF. It presents a short summary of developed
Abstract:
The article examines a range of components of customer service from the point of view of marketing. It starts with an explanation of several features that are required for a company to crystallize the teamwork that will ultimately determine the success or failure of that company. These features are: engagement, cooperation, companionship, communication, motivation and leadership. Subsequently, the article presents a section which explores human relationships and conflict management within organizations, with emphasis on the attitudes, skills and personality types that human beings present as part of their essence. Finally, the text includes a section that highlights concepts related to customer service and the sales techniques that exist today.
Abstract:
Particular strengths of the MRC Needs for Care Assessment Schedule have been used to investigate the treatment status of patients with persistent psychiatric disability in ways that other needs assessment tools are unable to. One hundred and seventy-nine such patients from three settings (a private sector psychiatric hospital, two public sector day hospitals situated in the same town, and a high security hospital) were found to have a high level of need. Although there were differences between settings, overall these needs were well met in all three. The high level of persistent disability found amongst these patients could not be attributed to failure on the part of those treating them to use the best available methods, or to failure to comply or engage with treatment on the patients' part. In some two-thirds of instances, persistent disability was best explained by the fact that even the most suitable available treatments have to be considered only partially effective.
Abstract:
Concrete substructures are often subjected to environmental deterioration, such as sulfate and acid attack, which leads to severe damage and causes structural degradation or even failure. In order to improve the durability of concrete, High Performance Concrete (HPC), in which cement is partially replaced with pozzolanic materials, has become widely used. However, HPC degradation mechanisms in sulfate and acidic environments are not completely understood. It is therefore important to evaluate the performance of HPC in such conditions and to predict concrete service life by establishing degradation models. This study began with a review of available environmental data in the State of Florida. A total of seven bridges were inspected. Concrete cores were taken from the bridge piles and subjected to microstructural analysis using a Scanning Electron Microscope (SEM). Ettringite was found to be the product of sulfate attack under sulfate and acidic conditions. In order to quantitatively analyze the concrete deterioration level, an image processing program was designed using Matlab to obtain quantitative data, with crack percentage (A_crack/A_surface) used to evaluate concrete deterioration. Thereafter, correlation analysis was performed to find the correlation between five related variables and concrete deterioration. Environmental sulfate concentration and bridge age were found to be positively correlated with deterioration, while environmental pH level was negatively correlated. Besides environmental conditions, a concrete property factor, derived from laboratory testing data, was also included in the equation. Experimental tests were carried out implementing an accelerated expansion test under a controlled environment, with specimens of eight different mix designs prepared, and the effect of pozzolanic replacement rate was taken into consideration in the empirical equation. The empirical equation was validated against existing bridges, and results show that the proposed equations compared well with field test results, with a maximum deviation of ±20%. Two examples showing how to use the proposed equations are provided to guide practical implementation. In conclusion, the proposed approach of relating microcracks to deterioration is a better method than existing diffusion and sorption models, since sulfate attack causes cracking in concrete. The imaging technique provided in this study can also be used to quantitatively analyze concrete samples.
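The abstract does not detail the Matlab image-processing steps; as a purely illustrative analogue of the crack-percentage metric (A_crack/A_surface), the sketch below thresholds a grayscale image and reports the cracked fraction of the surface. The threshold value and the synthetic test image are assumptions, not the study's procedure.

```python
# Illustrative sketch of a crack-percentage measurement: treat dark pixels in a
# grayscale SEM-style image as cracks and report their share of the surface.
# The threshold and the synthetic test image are assumptions for demonstration.
import numpy as np

def crack_percentage(gray_image, threshold=60):
    """Return A_crack / A_surface for an 8-bit grayscale image."""
    crack_mask = gray_image < threshold      # dark pixels counted as crack area
    return crack_mask.sum() / crack_mask.size

# Synthetic 100x100 "surface" with a 4-pixel-wide dark band standing in for a crack.
img = np.full((100, 100), 200, dtype=np.uint8)
img[48:52, :] = 30
print(f"Crack percentage: {crack_percentage(img):.1%}")  # 4.0%
```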