859 results for Power quality improvement
Abstract:
OBJECTIVE: We aimed to evaluate whether including videothoracoscopy in a pleural empyema treatment algorithm would change the clinical outcome of such patients. METHODS: This was a quality-improvement study based on a retrospective review of patients who underwent pleural decortication for pleural empyema at our institution from 2002 to 2008. Under the old algorithm (January 2002 to September 2005), open decortication was the procedure of choice, and videothoracoscopy was performed only in sporadic mid-stage cases. Under the new algorithm (October 2005 to December 2008), videothoracoscopy became the first-line treatment option, whereas open decortication was performed only in patients with a thick pleural peel (>2 cm) observed on chest scan. The patients were divided into an old algorithm (n = 93) and a new algorithm (n = 113) group and compared. The main outcome variables were treatment failure (pleural space reintervention or death up to 60 days after medical discharge) and the occurrence of complications. RESULTS: Videothoracoscopy and open decortication were performed in 13 and 80 patients of the old algorithm group and in 81 and 32 patients of the new algorithm group, respectively (p < 0.01). Patients in the new algorithm group were older (41 +/- 1 vs. 46.3 +/- 16.7 years, p = 0.014) and had higher Charlson Comorbidity Index scores [0 (0-3) vs. 2 (0-4), p = 0.032]. Treatment failure was similar in both groups (19.35% vs. 24.77%, p = 0.35), although the complication rate was lower in the new algorithm group (48.3% vs. 33.6%, p = 0.04). CONCLUSIONS: Wider use of videothoracoscopy in pleural empyema treatment was associated with fewer complications and unaltered rates of mortality and reoperation, even though more severely ill patients underwent videothoracoscopic surgery.
Abstract:
The installation of induction distributed generators should be preceded by a careful study to determine whether the point of common coupling is suitable for transmitting the generated power while maintaining acceptable power quality and system stability. In this context, this paper presents a simple analytical formulation that allows a fast and comprehensive evaluation of the maximum power the induction generator can deliver without losing voltage stability. Moreover, the formulation can be used to identify the voltage stability issues that limit the generator output power. The entire formulation is developed using the equivalent circuit of the squirrel-cage induction machine. Simulation results validate the method, which can therefore serve as a guide to reduce the simulation effort needed to assess the maximum output power and voltage stability of induction generators. (C) 2011 Elsevier Ltd. All rights reserved.
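As a rough illustration of the kind of equivalent-circuit evaluation this abstract describes, the sketch below computes the air-gap power of a squirrel-cage induction machine as a function of slip and scans negative slips for the maximum power delivered in generating mode. All machine parameters here are illustrative assumptions, not values from the paper.

```python
# Per-phase equivalent circuit of a squirrel-cage induction machine.
# All parameter values below are illustrative assumptions.
V = 230.0            # phase voltage [V]
R1, X1 = 0.5, 1.2    # stator resistance / leakage reactance [ohm]
R2, X2 = 0.4, 1.0    # rotor resistance / leakage reactance, referred [ohm]
Xm = 30.0            # magnetizing reactance [ohm]

def air_gap_power(s):
    """Three-phase air-gap power at slip s (negative when generating)."""
    Zr = complex(R2 / s, X2)          # rotor branch
    Zm = complex(0.0, Xm)             # magnetizing branch
    Z = complex(R1, X1) + (Zr * Zm) / (Zr + Zm)
    I1 = V / Z                        # stator current
    I2 = I1 * Zm / (Zr + Zm)          # rotor current (current divider)
    return 3 * abs(I2) ** 2 * R2 / s  # sign follows the slip

# Scan negative slips to locate the maximum power the machine can deliver
best = min((air_gap_power(-k / 1000.0), -k / 1000.0) for k in range(1, 500))
print(f"max delivered power ~ {-best[0] / 1000:.1f} kW at slip {best[1]:.3f}")
```

A simulation-free scan like this is the kind of fast screening the analytical formulation enables; the paper's own formulas would replace the brute-force loop.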
Abstract:
Synchronous distributed generators are prone to operate islanded after contingencies, which is usually not allowed because of safety and power-quality issues. Several anti-islanding techniques therefore exist; however, most have technical limitations and are likely to fail in certain situations, so it is important to quantify whether a given scheme is adequate. In this context, this paper proposes an index to evaluate the effectiveness of the anti-islanding frequency-based relays commonly used to protect synchronous distributed generators. The method is based on a numerical index that indicates the portion of the overall analysis period during which the system is unprotected against islanding. Although this index can be calculated precisely from several electromagnetic transient simulations, a practical method is also proposed to calculate it directly from simple analytical formulas or lookup tables. The results show that the proposed approach can assist distribution engineers in assessing and setting anti-islanding protection schemes.
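A minimal sketch of the kind of index described: the fraction of the analysis period during which an islanding event would go undetected because the power imbalance falls inside the relay's non-detection zone. The NDZ threshold and the generation/load profiles below are illustrative assumptions, not values from the paper.

```python
# Assumed non-detection zone: the frequency relay fails to trip when the
# relative active-power mismatch at the moment of islanding is below 15%.
ndz_limit = 0.15

# Hourly generation and local load over one day [MW] (assumed profile)
p_gen = [2.0] * 24
p_load = [1.2, 1.3, 1.5, 1.7, 1.9, 2.0, 2.1, 2.2, 2.1, 1.9, 1.8, 1.7,
          1.8, 1.9, 2.0, 2.1, 2.0, 1.9, 1.7, 1.6, 1.5, 1.4, 1.3, 1.2]

# Hours in which an islanding event would NOT be detected
unprotected = sum(1 for g, l in zip(p_gen, p_load)
                  if abs(g - l) / g < ndz_limit)

index = unprotected / len(p_gen)  # share of the period left unprotected
print(f"unprotected-time index = {index:.2f}")
```

The paper replaces the direct per-interval check with analytical formulas or lookup tables, but the quantity being estimated is this same time fraction.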
Abstract:
Ventilator-associated pneumonia (VAP) remains one of the major causes of infection in the intensive care unit (ICU) and is associated with the length of hospital stay, the duration of mechanical ventilation, and the use of broad-spectrum antibiotics. We compared the frequency of VAP 10 months before (pre-intervention group) and 13 months after (post-intervention group) the introduction of a heat and moisture exchanger (HME) filter. This was a prospective before-and-after study performed in the ICU of a tertiary university hospital. Three hundred and fourteen patients were admitted to the ICU under mechanical ventilation, 168 of whom were included in the HH (heated humidifier) group and 146 in the HME group. The frequency of VAP per 1000 ventilator-days was similar for the HH and HME groups (18.7 vs 17.4, respectively; P = 0.97). The duration of mechanical ventilation (11 vs 12 days; P = 0.48) and the length of ICU stay (11 vs 12 days; P = 0.39) did not differ between the HH and HME groups. The chance of developing VAP was higher in patients with a longer ICU stay and a longer duration of mechanical ventilation, and this finding was unchanged when adjusted for the use of HME. The use of HME in intensive care did not reduce the incidence of VAP, the duration of mechanical ventilation, or the length of ICU stay in the study population.
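The rate reported above is a standard incidence density, VAP episodes per 1000 ventilator-days. A quick sketch of the calculation, with made-up counts since the abstract reports only the rates:

```python
def vap_rate(vap_cases, ventilator_days):
    """VAP episodes per 1000 ventilator-days (incidence density)."""
    return 1000.0 * vap_cases / ventilator_days

# Illustrative counts only -- the abstract does not report raw numbers
print(vap_rate(30, 1600))  # 30 episodes over 1600 ventilator-days
```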
Abstract:
Topologies of motor drive systems are studied with the aim of reducing common-mode (CM) currents. First, the mechanisms of CM current circulation are analysed: the origin of common-mode voltages, the circulating paths of the resulting CM currents, and their effects are discussed. A non-conventional drive system configuration is then proposed to reduce the CM currents and their effects. This configuration comprises a non-conventional inverter module wired to a motor with an unusual connection; the cable arrangement also differs from the standard solution. The proposed topology is compared with alternatives such as an active circuit for common-mode voltage compensation. The contribution of the configuration to the reduction of CM voltages and currents and their related interference is evaluated through numerical simulations. Results are presented and discussed regarding the suitability of the proposed configuration as a potential solution for reducing CM current effects, taking into account the state of the art and the implementation cost of drives.
Abstract:
Máster Universitario en Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)
Abstract:
It is well known that the best grape quality can be achieved only through an optimal source/sink ratio. Vine balance is in fact a key parameter controlling berry sugar, acidity, and secondary-metabolite content (Howell, 2001; Vanden Heuvel et al., 2004). Although yield reduction and quality improvement are not always strictly related, cluster thinning is considered a technique that can improve grape sugar and anthocyanin composition (Dokoozlian and Hirschfelt, 1995; Guidoni et al., 2002). Among the microclimatic variables that may affect grape composition, the effects of cluster light exposure and temperature, which probably act in a synergistic and complex way, have been widely explored, with positive though sometimes contradictory results (Spayd et al., 2001; Tarara et al., 2008). Pre-bloom and véraison defoliation are very efficient techniques for modifying cluster microclimate; pre-bloom defoliation additionally induces a lower berry-set percentage. On this basis, the aim of the first experiment of the thesis was to verify, in cv Sangiovese, the effects on ripening and berry composition of management techniques that may increase the source/sink ratio and/or promote light incidence on berries throughout ripening. An integrated agronomic, biochemical, and microarray approach was used to understand which mechanisms are involved in berry composition and may be conditioned during ripening in vines submitted to three treatments: a) cluster thinning (increasing the source/sink ratio); b) leaf removal at véraison (increasing cluster light exposure); c) pre-bloom defoliation (increasing both the source/sink ratio and cluster light exposure).
Vine response to leaf removal at véraison was further evaluated in a second experiment on three varieties (Cabernet Sauvignon, Nero d'Avola, Raboso Piave) chosen for their different genetic traits in terms of anthocyanin amount and composition. The integrated agronomic, biochemical, and microarray approach, employed to understand the mechanisms shaping berry composition in Sangiovese vines submitted to techniques that increase the source/sink ratio and induce microclimatic changes, yielded interesting results. This research confirmed the central role of the source/sink ratio in conditioning sugar metabolism and also revealed that carbohydrate availability is a crucial factor in triggering anthocyanin biosynthesis. The situation is more complex for pre-bloom defoliation, where source/sink and cluster-light effects combine to determine final berry composition. It follows that applying pre-bloom defoliation may be risky, as it depends heavily on seasonal conditions (rain and temperature) and on the physiological response of the vine (leaf-area recovery, photosynthetic compensation, lateral regrowth). Early induced stress could leave clusters at véraison at a disadvantage in triggering optimal ripening processes compared with untreated vines, and this condition could persist until harvest if no physiological recovery occurs. Certainly, the increased light exposure linked to defoliation treatments showed a positive and robust effect on flavonol biosynthesis, since in our conditions temperature did not differ greatly among treatments. Apart from this last aspect, which could also be confirmed for véraison defoliation, microclimatic changes by themselves seemed unable to induce any modification in berry composition.
Further studies are needed to understand whether the peculiar anthocyanin and flavonol composition observed with véraison defoliation could play an important role in both the colour intensity and the stability of wines.
Abstract:
Flicker is a power quality phenomenon referring to the cyclic instability of light intensity resulting from supply-voltage fluctuation, which in turn can be caused by disturbances introduced during power generation, transmission, or distribution. The standard EN 61000-4-15, recently adopted by the IEEE as IEEE Standard 1453, relies on analysis of the supply voltage, processed according to a model of the lamp – human eye – brain chain. For the lamp, an incandescent 60 W, 230 V, 50 Hz source is assumed. The human eye – brain model is represented by the so-called flicker curve, determined several years ago by statistically analyzing tests in which people were subjected to flicker of different magnitudes and frequencies. This standard approach to flicker evaluation has essentially two limitations. First, the annoyance index Pst it provides can be related to actual tiredness of the human visual system only if such an incandescent lamp is used. Second, the implemented response to flicker is "subjective", as it relies on people's answers about their perceptions. In the last 15 years, many scientific contributions have tackled these issues by investigating novel models of the eye-brain response to flicker that overcome the standard's strict dependence on the kind of light source. In this light, this thesis presents a contribution towards a new flickermeter: an improved visual-system model based on a physiological parameter, the mean pupil diameter, which allows a more "objective" representation of the response to flicker. The system used both to generate flicker and to measure the pupil diameter is illustrated, along with the results of several experiments performed on volunteers.
The intent has been to demonstrate that measuring this geometrical parameter can give reliable information about the response of the human visual system to light flicker.
Abstract:
Population growth in urban areas is a worldwide phenomenon. According to a recent United Nations report, over half of the world's population now lives in cities. Numerous health and environmental issues arise from this unprecedented urbanization. Recent studies have demonstrated the effectiveness of urban green spaces and the role they play in improving both the aesthetics of cities and the quality of life of their residents. In particular, urban green spaces provide ecosystem services such as urban air-quality improvement through the removal of pollutants that can cause serious health problems, carbon storage, carbon sequestration, and climate regulation through shading and evapotranspiration. Furthermore, epidemiological studies controlling for age, sex, marital status, and socio-economic status have provided evidence of a positive relationship between green space and the life expectancy of senior citizens. However, there is little information on the role of public green spaces in mid-sized cities in northern Italy. To address this need, a study was conducted to assess the ecosystem services of urban green spaces in the city of Bolzano, South Tyrol, Italy. In particular, we quantified the cooling effect of urban trees and the hourly amount of pollution removed by the urban forest. The information was gathered using field data collected through local hourly air-pollution readings, a tree inventory, and simulation models. During the study we quantified pollution removal for ozone, nitrogen dioxide, carbon monoxide, and particulate matter (<10 microns). We estimated the above-ground carbon stored and annually sequestered by the urban forest, and compared the results with transportation CO2 emissions to determine the CO2 offset potential of urban streetscapes. Furthermore, we assessed commonly used methods for estimating carbon stored and sequestered by urban trees in the city of Bolzano, and quantified ecosystem disservices such as hourly urban-forest emissions of volatile organic compounds.
Abstract:
In the manufacture of solid dosage forms, granulation is a complex sub-process with high relevance for the quality of the pharmaceutical product. Fluidized-bed granulation is a special granulation technique that combines the sub-processes of mixing, agglomeration, and drying in a single apparatus. Precisely because it combines several process stages, this technique places particular demands on a comprehensive understanding of the process. The consistent pursuit of the PAT approach, published as a guideline by the US regulatory authority (FDA) in 2004, laid the foundation for continuous process improvement through increased process understanding, higher quality, and cost reduction. The present work dealt with the optimization of the fluidized-bed granulation processes of two process-sensitive drug formulations using PAT.
For the enalapril formulation, a low-dose and highly active drug formulation, it was found that finer atomization of the granulation liquid yields considerably larger granules. Increasing the MassRatio reduces the droplet size, which leads to larger granules. To produce enalapril granules with a desired D50 particle size between 100 and 140 µm, the MassRatio must be set at a high level; to obtain enalapril granules with a D50 between 80 and 120 µm, the MassRatio must be set at a low level. The investigations showed that the MassRatio is an important parameter and can be used to control the particle size of the enalapril granules, provided that all other process parameters are held constant. Examination of the intersection plots makes it possible to determine suitable settings of the process parameters and influencing variables that lead to the desired granule and tablet properties. The position and size of the intersection region define the limits of the process parameters for producing the enalapril granules. If these limits, i.e. the "design space" of the process parameters, are respected, high product quality can be guaranteed.
To produce high-quality enalapril tablets with the chosen formulation, the enalapril granulation should be run with the following process parameters: low spray rate, high MassRatio, an inlet-air temperature above 50 °C, and an effective inlet-air volume below 180 Nm³/h. If, instead, a spray rate of 45 g/min and a medium MassRatio of 4.54 are used, the effective inlet-air volume must be at least 200 Nm³/h and the inlet-air temperature at least 60 °C in order to obtain predictably high tablet quality. Quality is built into the drug product during manufacture by keeping the process parameters of the enalapril granulation within the design space.
For the metformin formulation, a high-dose but less active drug formulation, it was found that the growth mechanism of the fine fraction of the metformin granules differs from that of the D50 and D90 size fractions. The growth mechanism of the granules depends on particle wetting by the sprayed liquid droplets and on the size ratio of particle to spray droplet. The influence of the MassRatio on the D10 size fraction of the granules is negligibly small.
With the help of disturbance-variable studies, the control efficiency of the process parameters was established for a low-dose (enalapril) and a high-dose (metformin) drug formulation, enabling extensive automation that reduces error sources by compensating for disturbances; for the entire process chain, this yields a self-contained PAT approach. The process parameters spray rate and inlet-air volume proved most suitable, whereas control via the inlet-air temperature proved sluggish.
Furthermore, manufacturing processes for granules and tablets were developed for two process-sensitive active ingredients. The robustness of these processes against disturbances was demonstrated, creating the prerequisites for real-time release in accordance with the PAT concept. Quality control of the product does not take place at the end of the production chain; instead, it is performed during the process itself and is based on a better understanding of the product and the process. Moreover, the consistent pursuit of the PAT approach provided the opportunity for continuous process improvement, quality enhancement, and cost reduction, thereby achieving the holistic goal of the PAT concept.
Abstract:
a. Introduction: the introductory chapter covers the general background of the work and sets out the aims, objectives, and structure of the thesis. b. Theoretical framework: supplier strategic importance and SM. The first step in designing this thesis was a review of the SCM literature. According to Strauss and Corbin (2008), "the initial question of a qualitative study is often broad and open", and one way to develop research questions is to examine the literature. The strategic role of the purchasing function in Supply Chain Management and the considerable impact of suppliers on company performance are discussed. c. Supplier Base Reduction: conceptual model. A conceptual model is proposed, based on the strategic value of SBR understood as a supplier-base management strategy built on its methods of standardization, elimination, and tiering. The model highlights the importance of carrying SBR out together with other highly relevant methods, tools, and strategies: purchasing strategy, spend analysis by purchasing category, purchasing performance measurement through dedicated KPIs (the most relevant being lead time and quality), and evaluation and segmentation of the supplier base. In this way it becomes straightforward to identify the critical suppliers to eliminate and the best-performing ones with which to establish partnership and integrated-supply relationships. d. Case study: Bonfiglioli Riduttori. After an overview of the corporate structure of Bonfiglioli Riduttori, its business units, its subsidiaries, and its main products, a brief analysis of the entire supply chain is presented. The need to raise company performance (given stringent market demands for quality and on-time delivery) and to preserve previously acquired competitiveness is then discussed.
The importance of the purchasing function in achieving company objectives is also emphasized. e. Application of the conceptual model to the case. The conceptual model provides the inputs for defining the executive plan of the case study. The following are covered: Pareto analysis by purchasing category; KPI monitoring by supplier and category (with an overall performance-measurement grid); supplier segmentation by category using Commodity Pyramids; and corrective actions, both general (the SBR techniques and partnerships with the best-performing suppliers) and specific (quality improvement, redefinition of delivery plans, shared production planning, application of bonuses or penalties). f. Results obtained and expected. After reporting the results for some example purchasing macro-categories, the overall results of rationalizing the supplier base and implementing partnership relationships are analyzed, together with the associated benefits for the organization's performance. A revision of the supplier-selection mechanisms is also proposed, through a new vendor-rating model designed to meet the set targets.
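The Pareto analysis by purchasing category mentioned above is commonly implemented as an ABC classification by cumulative spend share. A minimal sketch, where the category names and spend figures are illustrative assumptions:

```python
# ABC (Pareto) classification of purchase categories by cumulative spend.
# Category names and annual spend figures are illustrative assumptions.
spend = {"castings": 4.2e6, "bearings": 2.1e6, "seals": 0.9e6,
         "fasteners": 0.5e6, "paint": 0.2e6, "packaging": 0.1e6}

total = sum(spend.values())
cumulative, classes = 0.0, {}
for cat, value in sorted(spend.items(), key=lambda kv: -kv[1]):
    cumulative += value / total
    # Conventional cut-offs: A up to 80% of spend, B up to 95%, C the rest
    classes[cat] = "A" if cumulative <= 0.80 else ("B" if cumulative <= 0.95 else "C")

print(classes)
```

Class A categories would get the closest KPI monitoring and partnership effort, while class C categories are natural candidates for the standardization and elimination steps of SBR.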
Abstract:
PURPOSE OF REVIEW: Intensive care medicine consumes a high share of healthcare costs, and there is growing pressure to use the scarce resources efficiently. Accordingly, organizational issues and quality management have become an important focus of interest in recent years. Here, we will review current concepts of how outcome data can be used to identify areas requiring action. RECENT FINDINGS: Using recently established models of outcome assessment, wide variability between individual ICUs is found, both with respect to outcome and resource use. Such variability implies that there are large differences in patient care processes not only within the ICU but also in pre-ICU and post-ICU care. Indeed, measures to improve the patient process in the ICU (including care of the critically ill, patient safety, and management of the ICU) have been presented in a number of recently published papers. SUMMARY: Outcome assessment models provide an important framework for benchmarking. They may help the individual ICU to spot appropriate fields of action, plan and initiate quality improvement projects, and monitor the consequences of such activity.
Abstract:
Rising fuel prices and environmental concerns are threatening the stability of current electrical grid systems. These factors are pushing the automobile industry towards more efficient, hybrid vehicles. Current trends show petroleum being edged out in favor of electricity as the main vehicular motive force. The proposed methods create an optimized charging control schedule for all participating Plug-in Hybrid Electric Vehicles in a distribution grid. The optimization minimizes daily operating costs, reduces system losses, and improves power quality. This requires participation from Vehicle-to-Grid capable vehicles, load forecasting, and Locational Marginal Pricing market predictions. Vehicles equipped with bidirectional chargers further improve the optimization results by lowering peak demand and improving power quality.
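The core scheduling idea can be sketched as simple valley-filling: each vehicle's required energy is placed in the hours of its plug-in window where forecast demand is lowest, flattening the load profile. The load profile and vehicle data below are illustrative assumptions; a real implementation would optimize against forecast locational marginal prices and network losses rather than load alone.

```python
# Forecast base load over 24 hours [MW] (illustrative assumption)
base_load = [60, 55, 50, 48, 47, 50, 58, 70, 80, 85, 88, 90,
             92, 90, 88, 87, 85, 88, 95, 100, 95, 85, 75, 65]

# (arrival hour, departure hour, energy needed in 1-MWh charging slots)
vehicles = [(18, 7, 4), (20, 6, 3), (22, 8, 5)]  # assumed fleet

load = list(base_load)
for arrive, depart, energy in vehicles:
    # Hours available for charging, handling overnight wrap-around
    window = [h % 24 for h in range(arrive, arrive + (depart - arrive) % 24)]
    # Greedy valley-filling: charge one slot at a time in the
    # currently least-loaded hour of the vehicle's window
    for _ in range(energy):
        h = min(window, key=lambda x: load[x])
        load[h] += 1

print("peak before:", max(base_load), "MW, after:", max(load), "MW")
```

Because charging is steered into the overnight valley, the system peak is untouched while total energy served increases; bidirectional (V2G) chargers would additionally allow discharging during the peak hours.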
Abstract:
PURPOSE To develop internationally harmonised standards for programmes of training in intensive care medicine (ICM). METHODS Standards were developed by using consensus techniques. A nine-member nominal group of European intensive care experts developed a preliminary set of standards. These were revised and refined through a modified Delphi process involving 28 European national coordinators representing national training organisations using a combination of moderated discussion meetings, email, and a Web-based tool for determining the level of agreement with each proposed standard, and whether the standard could be achieved in the respondent's country. RESULTS The nominal group developed an initial set of 52 possible standards which underwent four iterations to achieve maximal consensus. All national coordinators approved a final set of 29 standards in four domains: training centres, training programmes, selection of trainees, and trainers' profiles. Only three standards were considered immediately achievable by all countries, demonstrating a willingness to aspire to quality rather than merely setting a minimum level. Nine proposed standards which did not achieve full consensus were identified as potential candidates for future review. CONCLUSIONS This preliminary set of clearly defined and agreed standards provides a transparent framework for assuring the quality of training programmes, and a foundation for international harmonisation and quality improvement of training in ICM.
Abstract:
OBJECTIVES To evaluate the impact of preoperative sepsis on risk of postoperative arterial and venous thromboses. DESIGN Prospective cohort study using the National Surgical Quality Improvement Program database of the American College of Surgeons (ACS-NSQIP). SETTING Inpatient and outpatient procedures in 374 hospitals of all types across the United States, 2005-12. PARTICIPANTS 2,305,380 adults who underwent surgical procedures. MAIN OUTCOME MEASURES Arterial thrombosis (myocardial infarction or stroke) and venous thrombosis (deep venous thrombosis or pulmonary embolism) in the 30 days after surgery. RESULTS Among all surgical procedures, patients with preoperative systemic inflammatory response syndrome or any sepsis had three times the odds of having an arterial or venous postoperative thrombosis (odds ratio 3.1, 95% confidence interval 3.0 to 3.1). The adjusted odds ratios were 2.7 (2.5 to 2.8) for arterial thrombosis and 3.3 (3.2 to 3.4) for venous thrombosis. The adjusted odds ratios for thrombosis were 2.5 (2.4 to 2.6) in patients with systemic inflammatory response syndrome, 3.3 (3.1 to 3.4) in patients with sepsis, and 5.7 (5.4 to 6.1) in patients with severe sepsis, compared with patients without any systemic inflammation. In patients with preoperative sepsis, both emergency and elective surgical procedures had a twofold increased odds of thrombosis. CONCLUSIONS Preoperative sepsis represents an important independent risk factor for both arterial and venous thromboses. The risk of thrombosis increases with the severity of the inflammatory response and is higher in both emergent and elective surgical procedures. Suspicion of thrombosis should be higher in patients with sepsis who undergo surgery.
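The adjusted odds ratios above come from multivariable modelling, but an unadjusted odds ratio and its Wald confidence interval can be computed directly from a 2x2 table. The counts below are illustrative assumptions, not the study data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald CI for a 2x2 table:
       a = exposed with outcome,   b = exposed without,
       c = unexposed with outcome, d = unexposed without."""
    ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    low = math.exp(math.log(ratio) - z * se)
    high = math.exp(math.log(ratio) + z * se)
    return ratio, low, high

# Illustrative counts only -- the abstract reports ORs, not the raw table
ratio, low, high = odds_ratio_ci(120, 880, 400, 8600)
print(f"OR = {ratio:.2f} (95% CI {low:.2f} to {high:.2f})")
```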