917 results for FMEA (Failure Mode Effects Analysis)
Abstract:
Natural systems face pressures exerted by natural physical-chemical forcings and a myriad of co-occurring human stressors that may interact to cause larger-than-expected effects, thereby presenting a challenge to ecosystem management. This thesis aimed to develop new information that can contribute to reducing the existing knowledge gaps hampering the holistic management of multiple stressors. I undertook a review of state-of-the-art methods to detect, quantify and predict stressor interactions, identifying techniques that could be applied in this thesis research. I then conducted a systematic review of saltmarsh multiple-stressor studies in conjunction with a multiple-stressor mapping exercise for the study system in order to infer potentially important synergistic stressor interactions. This analysis identified key stressors affecting the study system, but also pointed to gaps in driver and pressure data and flagged potentially overlooked stressors. Using field mesocosms, I explored how a local stressor (nutrient availability) affects the responses of saltmarsh vegetation to a global stressor (increased inundation) in different soil types. Results indicate that saltmarsh vegetation would be more drastically affected by increased inundation in low organic matter soils than in medium organic matter soils, and especially in estuaries already under high nutrient availability. In another field experiment, I examined the challenges of managing co-occurring and potentially interacting local stressors on saltmarsh vegetation: recreational trampling and smothering by deposition of excess macroalgal wrack due to high nutrient loads. Trampling and wrack prevention had interacting effects, causing non-linear responses of the vegetation to simulated management of these stressors, such that vegetation recovered only in those treatments simulating the combined prevention of both stressors.
During this research I detected, using molecular genetic methods, the widespread presence of Spartina anglica (and, to a lesser extent, S. townsendii), two non-native Spartina species previously unrecorded in the study areas.
Abstract:
The BLEVE, an acronym for Boiling Liquid Expanding Vapour Explosion, is one of the most dangerous accidents that can occur in pressure vessels. It can be defined as an explosion resulting from the failure of a vessel containing a pressure-liquefied gas stored at a temperature significantly above its boiling point at atmospheric pressure. This phenomenon frequently occurs when a vessel is engulfed by a fire: the heat causes the internal pressure to rise and the mechanical properties of the wall to decrease, with the consequent rupture of the tank and the instantaneous release of its whole content. After the breakage, the vapour flows out and expands, and the liquid phase starts boiling due to the pressure drop. The formation and propagation of a destructive shock wave may occur, together with the ejection of fragments, the generation of a fireball if the stored fluid is flammable and immediately ignited, or the atmospheric dispersion of a toxic cloud if the fluid contained in the vessel is toxic. Despite the many studies on the BLEVE mechanism, the exact causes and conditions of its occurrence remain elusive. In order to better understand this phenomenon, the present study first investigates the concept and definition of BLEVE. A historical analysis of the major events that have occurred over the past 60 years is described. An investigation of the principal causes of this event, including an analysis of the substances most frequently involved, is also presented. A description of the main effects of BLEVEs follows, focusing especially on the overpressure. The major aim of the present thesis, however, is to contribute, through a comparative analysis, to the validation of the main models in the literature for the calculation and prediction of the overpressure caused by BLEVEs. In line with this purpose, after a short overview of the available approaches, their ability to reproduce the trend of the overpressure is investigated.
The overpressure calculated with the different models is compared with values derived from past events and ad-hoc experiments, focusing especially on medium- and large-scale phenomena. The ability of the models to account for different filling levels of the reservoir and different substances is also analysed. The results of these calculations are extensively discussed. Finally, some concluding remarks are presented.
Abstract:
New treatment options for Niemann-Pick Type C (NPC) have recently become available. To assess the efficiency and efficacy of these new treatments, markers for disease status and progression are needed. Both the diagnosis and the monitoring of disease progression are challenging and mostly rely on clinical impression and functional testing of horizontal eye movements. Diffusion tensor imaging (DTI) provides information about microstructural integrity, especially of white matter. We show here in a case report how DTI and measures derived from this imaging method can serve as adjunct quantitative markers for disease management in Niemann-Pick Type C. Two approaches are taken. First, we compare the fractional anisotropy (FA) in the white matter globally between a 29-year-old NPC patient and 18 healthy age-matched controls and show the remarkable difference in FA relatively early in the course of the disease. Second, a voxelwise comparison of FA values reveals where white matter integrity is compromised locally and demonstrates an individualized analysis of FA changes before and after 1 year of treatment with miglustat. This method might be useful in future treatment trials for NPC to assess treatment effects.
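The global FA comparison described above (one patient against a small control group) can be sketched with a single-case statistic. The FA values below are purely hypothetical placeholders, not data from the report; the Crawford-Howell t is one common choice for comparing an individual against a small normative sample.

```python
from statistics import mean, stdev

def single_case_t(patient_score, control_scores):
    """Crawford-Howell t-statistic comparing one patient to a small
    control sample (treats the controls as a sample, not a population)."""
    n = len(control_scores)
    m = mean(control_scores)
    s = stdev(control_scores)
    return (patient_score - m) / (s * ((n + 1) / n) ** 0.5)

# Hypothetical global FA values for illustration only (not from the study):
controls = [0.48, 0.50, 0.49, 0.51, 0.47, 0.50, 0.49, 0.52, 0.48, 0.50,
            0.51, 0.49, 0.50, 0.47, 0.52, 0.49, 0.50, 0.48]  # 18 controls
patient = 0.42
t = single_case_t(patient, controls)  # strongly negative: FA below controls
```

A strongly negative t (relative to a t-distribution with n-1 degrees of freedom) would indicate a patient FA well below the control range.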
Abstract:
Objective To assess the outcome of patients who experienced treatment failure with antiretrovirals in sub-Saharan Africa. Methods Analysis of 11 antiretroviral therapy (ART) programmes in sub-Saharan Africa. World Health Organization (WHO) criteria were used to define treatment failure. All ART-naive patients aged ≥16 years who started with a non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimen and had at least 6 months of follow-up were eligible. For each patient who switched to a second-line regimen, 10 matched patients who remained on a non-failing first-line regimen were selected. Time was measured from the time of switching, from the corresponding time in matched patients, or from the time of treatment failure in patients who remained on a failing regimen. Mortality was analysed using Kaplan–Meier curves and random-effects Cox models. Results Of 16 591 adult patients starting ART, 382 (2.3%) switched to a second-line regimen. Another 323 patients (1.9%) did not switch despite developing immunological or virological failure. Cumulative mortality at 1 year was 4.2% (95% CI 2.2–7.8%) in patients who switched to a second-line regimen and 11.7% (7.3–18.5%) in patients who remained on a failing first-line regimen, compared with 2.2% (1.6–3.0%) in patients on a non-failing first-line regimen (P < 0.0001). Differences in mortality were not explained by nadir CD4 cell count, age or differential loss to follow-up. Conclusions Many patients who meet the criteria for treatment failure do not switch to a second-line regimen and die. There is an urgent need to clarify why many patients in sub-Saharan Africa remain on failing first-line ART.
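The Kaplan–Meier survival analysis mentioned in the Methods can be illustrated with a minimal estimator. The follow-up data below are invented for illustration and bear no relation to the study's cohorts.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times:  follow-up time for each patient
    events: 1 if the event (death) occurred, 0 if the patient was censored
    Returns a list of (time, survival probability) steps at event times."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)  # deaths + censored at t
        if deaths > 0:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up times (months) for illustration only:
times = [2, 3, 3, 5, 8, 12]
events = [1, 0, 1, 1, 0, 1]   # 1 = died, 0 = censored
curve = kaplan_meier(times, events)  # first step: (2, 5/6)
```

Censored patients leave the risk set without forcing a downward step, which is exactly why simple proportions would misestimate cumulative mortality here.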
Abstract:
OBJECTIVE: To determine the effect of glucosamine, chondroitin, or the two in combination on joint pain and on radiological progression of disease in osteoarthritis of the hip or knee. DESIGN: Network meta-analysis. Direct comparisons within trials were combined with indirect evidence from other trials by using a Bayesian model that allowed the synthesis of multiple time points. MAIN OUTCOME MEASURE: Pain intensity. Secondary outcome was change in minimal width of joint space. The minimal clinically important difference between preparations and placebo was prespecified at -0.9 cm on a 10 cm visual analogue scale. DATA SOURCES: Electronic databases and conference proceedings from inception to June 2009, expert contact, relevant websites. ELIGIBILITY CRITERIA FOR SELECTING STUDIES: Large-scale randomised controlled trials in more than 200 patients with osteoarthritis of the knee or hip that compared glucosamine, chondroitin, or their combination with placebo or head to head. RESULTS: 10 trials in 3803 patients were included. On a 10 cm visual analogue scale, the overall difference in pain intensity compared with placebo was -0.4 cm (95% credible interval -0.7 to -0.1 cm) for glucosamine, -0.3 cm (-0.7 to 0.0 cm) for chondroitin, and -0.5 cm (-0.9 to 0.0 cm) for the combination. For none of the estimates did the 95% credible intervals cross the boundary of the minimal clinically important difference. Industry-independent trials showed smaller effects than commercially funded trials (P=0.02 for interaction). The differences in changes in minimal width of joint space were all minute, with 95% credible intervals overlapping zero. CONCLUSIONS: Compared with placebo, glucosamine, chondroitin, and their combination do not reduce joint pain or have an impact on narrowing of joint space. Health authorities and health insurers should not cover the costs of these preparations, and new prescriptions to patients who have not received treatment should be discouraged.
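The abstract's claim that none of the 95% credible intervals cross the prespecified minimal clinically important difference can be verified directly from the reported numbers. A small sketch (values copied from the abstract; sign convention: more negative = more pain relief):

```python
# Point estimates and 95% credible intervals (cm on a 10 cm VAS)
# as reported in the abstract; the MCID was prespecified at -0.9 cm.
MCID = -0.9
effects = {
    "glucosamine": (-0.4, (-0.7, -0.1)),
    "chondroitin": (-0.3, (-0.7, 0.0)),
    "combination": (-0.5, (-0.9, 0.0)),
}

def interval_crosses_mcid(interval, mcid=MCID):
    """The interval crosses the MCID boundary only if its lower limit
    extends beyond (is more negative than) the MCID."""
    lower, _upper = interval
    return lower < mcid

# None of the reported intervals extend past the -0.9 cm boundary:
crossings = {name: interval_crosses_mcid(ci)
             for name, (_point, ci) in effects.items()}
```

Even the combination's lower limit only touches -0.9 cm, which is why the abstract concludes the effects are not clinically important.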
Abstract:
The Default Mode Network (DMN) is a higher-order functional neural network that displays activation during passive rest and deactivation during many types of cognitive tasks. Accordingly, the DMN is viewed as representing the neural correlate of internally generated self-referential cognition. This hypothesis implies that the DMN requires the involvement of cognitive processes such as declarative memory. The present study thus examines the spatial and functional convergence of the DMN and the semantic memory system. Using an active block-design functional magnetic resonance imaging (fMRI) paradigm and independent component analysis (ICA), we trace the DMN and fMRI signal changes evoked by semantic, phonological and perceptual decision tasks on visually presented words. Our findings show less deactivation during the semantic task than during the two non-semantic tasks, both for the entire DMN unit and within left-hemispheric DMN regions, i.e., the dorsal medial prefrontal cortex, the anterior cingulate cortex, the retrosplenial cortex, the angular gyrus, the middle temporal gyrus and the anterior temporal region, as well as the right cerebellum. These results demonstrate that well-known semantic regions are spatially and functionally involved in the DMN. The present study further supports the hypothesis of the DMN as an internal mentation system that involves declarative memory functions.
Abstract:
Unexplained differences between classes of antihypertensive drugs in their effectiveness in preventing stroke might be due to class effects on intraindividual variability in blood pressure. We did a systematic review to assess any such effects in randomised controlled trials.
Abstract:
Minimal marginal bone loss around implants during early healing has been considered acceptable. However, preservation of the marginal bone is related to soft tissue stability and esthetics. Implant designs and surfaces have been evaluated to determine their impact on the behavior of the crestal bone. The purpose of this study is to evaluate histologic marginal bone level changes around early loaded, chemically modified, sandblasted acid-etched-surfaced implants with a machined collar (MC) or no machined collar (NMC).
Abstract:
This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process, so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multicylinder diesel engine have been examined from a model-training perspective. A single-cylinder engine with external air handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a high difference between exhaust and intake manifold pressures (engine ΔP) during transients, it is recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed.
The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second mode is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, and uneven EGR distribution across cylinders has been shown to be present but unaccounted for by the ECM. The two modes and the associated phenomena are essential to understanding why transient emission models are calibration dependent and, furthermore, how to choose training data that will result in good model generalization.
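The delay-and-lag correction step mentioned above (aligning measured signals before model training) can be sketched as follows. The function and its parameters (`delay_s` for transport delay, `tau_s` for a first-order sensor time constant) are illustrative assumptions, not the thesis's actual processing code.

```python
def compensate_sensor(signal, dt, delay_s, tau_s):
    """Illustrative reconstruction of a fast signal from a slow sensor:
    first shift the trace back by the transport delay, then invert a
    first-order lag, y_true ~= y + tau * dy/dt, via finite differences.
    signal: list of samples; dt: sample period (s)."""
    shift = round(delay_s / dt)
    # Advance the trace by the transport delay; pad the tail by holding
    # the last sample (a simplification for this sketch).
    shifted = signal[shift:] + [signal[-1]] * shift
    out = []
    for k in range(len(shifted)):
        if k == 0:
            dydt = (shifted[1] - shifted[0]) / dt  # forward difference
        else:
            dydt = (shifted[k] - shifted[k - 1]) / dt  # backward difference
        out.append(shifted[k] + tau_s * dydt)
    return out
```

Differentiating a measured signal amplifies noise, so in practice this inversion would be combined with low-pass filtering; the sketch only shows the alignment logic.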
Abstract:
We investigated the effects of different encoding tasks and of manipulations of two supposedly surface parameters of music on implicit and explicit memory for tunes. In two experiments, participants were first asked either to categorize the instrument or to judge the familiarity of 40 unfamiliar short tunes. Subsequently, participants were asked to give explicit and implicit memory ratings for a list of 80 tunes, which included the 40 previously heard. Half of the 40 previously heard tunes differed in timbre (Experiment 1) or tempo (Experiment 2) from the first exposure. A third experiment compared similarity ratings of the tunes that varied in timbre or tempo. Analysis of variance (ANOVA) results suggest, first, that the encoding task made no difference for either memory mode. Secondly, timbre and tempo changes both impaired explicit memory, whereas tempo change additionally worsened implicit tune recognition. Results are discussed in the context of implicit memory for nonsemantic materials and possible differences between timbre and tempo in musical representations.
Abstract:
The two modes most widely used in Western music today convey opposite moods—a distinction that nonmusicians and even young children are able to make. However, the current studies provide evidence that, despite a strong link between mode and affect, mode perception is problematic. Nonmusicians found mode discrimination to be harder than discrimination of other melodic features, and they were not able to accurately classify major and minor melodies with these labels. Although nonmusicians were able to classify major and minor melodies using affective labels, they performed at chance in mode discrimination. Training, in the form of short lessons given to nonmusicians and the natural musical experience of musicians, improved performance, but not to ceiling levels. Tunes with high note density were classified as major, and tunes with low note density as minor, even though these features were actually unrelated in the experimental material. Although these findings provide support for the importance of mode in the perception of emotion, they clearly indicate that these mode perceptions are inaccurate, even in trained individuals, without the assistance of affective labeling.
Abstract:
We explored the ability of older (60-80 years old) and younger (18-23 years old) musicians and nonmusicians to judge the similarity of transposed melodies varying on rhythm, mode, and/or contour (Experiment 1) and to discriminate among melodies differing only in rhythm, mode, or contour (Experiment 2). Similarity ratings did not vary greatly among groups, with tunes differing only by mode being rated as most similar. In the same/different discrimination task, musicians performed better than nonmusicians, but we found no age differences. We also found that discrimination of major from minor tunes was difficult for everyone, even for musicians. Mode is apparently a subtle dimension in music, despite its deliberate use in composition and despite people's ability to label minor as "sad" and major as "happy."
Abstract:
The aim of this study was to investigate treatment failure (TF) in hospitalised community-acquired pneumonia (CAP) patients with regard to initial antibiotic treatment and economic impact. CAP patients were included in two open, prospective multicentre studies assessing the direct costs of in-patient treatment. Patients received either moxifloxacin (MFX) or a nonstandardised antibiotic therapy. Any change of antibiotic therapy to a broadened antibiotic spectrum after >72 h of treatment was considered TF. Overall, 1,236 patients (mean ± SD age 69.6 ± 16.8 yrs; 691 (55.9%) male) were included. TF occurred in 197 (15.9%) subjects and led to longer hospital stays (15.4 ± 7.3 days versus 9.8 ± 4.2 days; p < 0.001) and increased median treatment costs (€2,206 versus €1,284; p < 0.001). 596 (48.2%) patients received MFX and experienced less TF (10.9% versus 20.6%; p < 0.001). After controlling for confounders in multivariate analysis, the adjusted risk of TF was clearly reduced with MFX compared with β-lactam monotherapy (adjusted OR for MFX 0.43, 95% CI 0.27–0.68) and was more comparable with that of a β-lactam plus macrolide combination (BLM) (OR 0.68, 95% CI 0.38–1.21). In hospitalised CAP, TF is frequent and leads to prolonged hospital stay and increased treatment costs. Initial treatment with MFX or BLM is a possible strategy to prevent TF and may thus reduce treatment costs.
Abstract:
In autologous cell therapy, e.g. in melanocyte transplantation for vitiligo, a minimally invasive mode of transepidermal delivery of the isolated cells is of crucial importance to reduce potential side effects such as infections and scarring as well as to minimize the duration of sick leave.