928 results for "pressure analysis"


Relevance: 30.00%

Abstract:

This study focuses on a specific engine: a dual-spool, separate-flow turbofan with an Interstage Turbine Burner (ITB). The conventional turbofan has been modified to include a secondary isobaric burner, the ITB, in a transition duct between the high-pressure and low-pressure turbines. The preliminary design phase for this modified engine starts with an aerothermodynamic cycle analysis consisting of parametric (on-design) and performance (off-design) cycle analyses. In the parametric analysis, the modified engine's performance parameters are evaluated and compared with the baseline engine in terms of design limitations (maximum turbine inlet temperature), flight conditions (such as flight Mach number, ambient temperature, and pressure), and design choices (such as compressor pressure ratio, fan pressure ratio, and fan bypass ratio). A turbine cooling model is also included to account for the effect of cooling air on engine performance. The on-design results confirm the advantages of the ITB: higher specific thrust with only small increases in thrust specific fuel consumption, less cooling air, and less NOx production, provided that the main burner and ITB exit temperatures are properly specified. It is also important to identify the critical ITB temperature, beyond which the ITB is turned off and offers no advantage. Building on these encouraging parametric results, a detailed performance cycle analysis of the same engine is conducted for steady-state performance prediction. The off-design results show that the ITB engine at full throttle outperforms the baseline engine. Furthermore, the ITB engine at partial throttle settings exhibits higher thrust at lower specific fuel consumption and improved thermal efficiency over the baseline engine.
A mission analysis is also presented to predict fuel consumption in certain mission phases. Excel macro code (Visual Basic for Applications) and Excel worksheet cells are combined to enable Excel to perform these cycle analyses. These user-friendly programs compute and plot the data sequentially without forcing users to open other post-processing programs.
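The on-design relations behind such a parametric analysis can be illustrated with standard ideal-cycle equations. The sketch below is a minimal Python illustration for an ideal single-stream turbojet core, not the authors' dual-spool ITB model or their Excel/VBA implementation; all numerical inputs are invented for the example.

```python
import math

def ideal_specific_thrust(M0, T0, pi_c, Tt4, gamma=1.4, R=287.0):
    """Specific thrust (N per kg/s) from an ideal turbojet on-design cycle.
    M0: flight Mach number, T0: ambient temperature (K),
    pi_c: compressor pressure ratio, Tt4: turbine inlet temperature (K)."""
    a0 = math.sqrt(gamma * R * T0)                  # ambient speed of sound
    tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0 ** 2     # ram temperature ratio
    tau_c = pi_c ** ((gamma - 1.0) / gamma)         # compressor temperature ratio
    tau_l = Tt4 / T0                                # cycle temperature ratio
    tau_t = 1.0 - tau_r / tau_l * (tau_c - 1.0)     # turbine ratio (work balance)
    v9_a0 = math.sqrt(2.0 / (gamma - 1.0) * tau_l / (tau_r * tau_c)
                      * (tau_r * tau_c * tau_t - 1.0))
    return a0 * (v9_a0 - M0)

# Illustrative cruise point: M0 = 0.9 at 216.7 K, pi_c = 20, Tt4 = 1600 K
F_spec = ideal_specific_thrust(0.9, 216.7, 20.0, 1600.0)
```

Sweeping `pi_c` or `Tt4` in a loop reproduces the kind of design-choice carpet plots a parametric cycle analysis produces.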

Relevance: 30.00%

Abstract:

Typical internal combustion engines lose about 75% of the fuel energy through the engine coolant, the exhaust, and surface radiation. Most of this heat is a by-product of converting the fuel's chemical energy to mechanical energy, and the resulting thermal energy generally goes unutilized and is wasted. This report describes the analysis of a novel waste heat recovery (WHR) system that operates on a Rankine cycle. The system places a second piston within the existing piston to reduce losses associated with the compression and exhaust strokes of a four-stroke engine. Thermal energy recovered from the coolant and exhaust systems generates a high-temperature, high-pressure working fluid that powers the modified piston assembly. Cycle simulation shows that a large, stationary natural gas spark-ignition engine produces enough waste heat to operate the novel WHR system. With this system, the stationary engine running at 900 RPM and full load had a net power increase of 177.03 kW (240.7 HP), which improved the brake fuel conversion efficiency by 4.53%.
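The reported gain lends itself to a quick back-of-the-envelope check. The sketch below assumes (an assumption, not stated in the abstract) that the 4.53% figure is an absolute gain in brake fuel conversion efficiency, and back-computes the fuel energy input rate that this would imply.

```python
# Brake fuel conversion efficiency: eta = P_brake / (mdot_fuel * LHV)
P_recovered_kW = 177.03   # net power added by the WHR system (from the study)
delta_eta = 0.0453        # reported efficiency improvement (assumed absolute)

# If delta_eta = P_recovered / Q_fuel, the implied fuel energy input rate is:
Q_fuel_kW = P_recovered_kW / delta_eta   # roughly 3.9 MW of fuel energy
```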

Relevance: 30.00%

Abstract:

An extrusion die is used to continuously produce parts with a constant cross section, such as sheets, pipes, tire components, and more complex shapes such as window seals. For polymers, the die is fed by a screw extruder, which melts, mixes, and pressurizes the material through the rotation of either a single or a twin screw. The polymer can then be continuously forced through the die, producing a long part in the shape of the die outlet; the extruded section is then cut to the desired length. Generally, the primary target of a well-designed die is a uniform outlet velocity without excessively raising the pressure required to extrude the polymer through the die. Other properties such as temperature uniformity and residence time are also important but are not directly considered in this work. Designing dies for optimal outlet velocity variation using simple analytical equations is feasible for basic die geometries or simple channels, but the complexity of die geometries and of polymer material properties makes analytical design of complex dies difficult; such dies must instead be optimized iteratively, and an automated iterative method is desired. Automating the design and optimization of an extrusion die raises two issues. The first is how to generate a new mesh for each iteration. In this work, this is approached by modifying a Parasolid file that describes a CAD part, which is then used in commercial meshing software; skewing the initial mesh to produce a new geometry was also employed as a second option. The second issue is an optimization problem in the presence of noise stemming from variations in the mesh and cumulative truncation errors. A simplex method and a modified trust region method were employed for automated optimization of die geometries; the trust region method used a discrete derivative and a BFGS Hessian approximation.
To deal with noise in the objective function, the trust region method was modified to automatically adjust the discrete-derivative step size and the trust region based on changes in noise and function contour. Uniformity of the velocity at the die exit can generally be improved by increasing resistance across the die, but this is limited by the pressure capabilities of the extruder. In the optimization, a penalty factor that increases exponentially beyond the pressure limit is therefore applied. The penalty can be applied in two ways: only to designs that exceed the pressure limit, or to designs both above and below it. Both methods were tested and compared in this work.
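The two penalty strategies can be sketched as follows. This is a hypothetical Python illustration: the function names, the additive objective form, and the steepness constant `k` are assumptions for clarity, not the thesis's actual formulation.

```python
import math

def penalized_objective(velocity_std, pressure, p_limit, k=5.0, one_sided=True):
    """Outlet-velocity uniformity objective with an exponential pressure penalty.
    one_sided=True penalizes only designs exceeding the pressure limit;
    one_sided=False lets the penalty grow smoothly through the limit,
    so designs below the limit also feel a (small) gradient toward lower pressure."""
    excess = pressure / p_limit - 1.0       # relative distance from the limit
    if one_sided and excess <= 0.0:
        penalty = 0.0
    else:
        penalty = math.exp(k * excess)
    return velocity_std + penalty
```

A one-sided penalty leaves feasible designs untouched but gives the optimizer no pressure information until the limit is crossed; the two-sided variant trades a small bias for a smoother search landscape, which matters when the objective is already noisy.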

Relevance: 30.00%

Abstract:

The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos plate under the Caribbean plate. Located 30 km south of Guatemala City, Pacaya sits on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., Pacaya experienced a major collapse that left a horseshoe-shaped scarp which is still visible. In recent years, several smaller collapses (in 1961 and 2010) have affected its northwestern flank and are likely induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents suggest reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined from stratigraphical, lithological, and structural data and from material properties obtained by field survey and laboratory tests. Based on their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia, and Breccia-Lava. The Hoek-Brown failure criterion was applied to each unit, and the rock mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. The stability of the volcano was then evaluated in two-dimensional analyses performed with the Limit Equilibrium Method (LEM, ROCSCIENCE) and the Finite Element Method (FEM, PHASE 2 7.0). The stability analysis focused mainly on the modern Pacaya volcano built inside the collapse amphitheatre of "Old Pacaya".
Volcanic instability was assessed from the variability of the safety factor using deterministic, sensitivity, and probabilistic analyses, considering gravitational instability and external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is the south-western flank of the volcano; second, even the lowest safety factor value suggests that the edifice is stable under gravity alone, so an external triggering mechanism likely represents the destabilizing factor.
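The limit-equilibrium idea behind the safety factor can be illustrated with a pseudo-static planar-slide calculation. This is a generic textbook sketch, not the ROCSCIENCE/PHASE 2 models used in the paper, and the input values below are invented, not Pacaya material parameters.

```python
import math

def planar_fs(W, alpha_deg, c, A, phi_deg, k_h=0.0, U=0.0):
    """Pseudo-static factor of safety for a planar slide (per unit width).
    W: block weight (kN/m), alpha_deg: slide-plane dip (deg), c: cohesion (kPa),
    A: slide-plane area (m^2/m), phi_deg: friction angle (deg),
    k_h: horizontal seismic coefficient, U: uplift force, e.g. magma/water (kN/m)."""
    a = math.radians(alpha_deg)
    Fh = k_h * W                                   # pseudo-static seismic force
    normal = W * math.cos(a) - U - Fh * math.sin(a)
    resisting = c * A + normal * math.tan(math.radians(phi_deg))
    driving = W * math.sin(a) + Fh * math.cos(a)
    return resisting / driving

fs_gravity = planar_fs(W=1000.0, alpha_deg=30.0, c=50.0, A=20.0, phi_deg=35.0)
fs_seismic = planar_fs(W=1000.0, alpha_deg=30.0, c=50.0, A=20.0, phi_deg=35.0,
                       k_h=0.3)                    # with seismic loading
```

The pattern mirrors the abstract's conclusion: a factor of safety above 1 under gravity alone that drops substantially once an external trigger is added.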

Relevance: 30.00%

Abstract:

Strain rate significantly affects the strength of a material. The Split-Hopkinson Pressure Bar (SHPB) was initially used to study the behavior of metals at high strain rates (~10³ s⁻¹); later modifications to the original technique allowed the study of brittle materials such as ceramics, concrete, and rock. While material properties of wood at static and creep strain rates are readily available, data on the dynamic properties of wood are sparse. Previous SHPB work on wood has been limited in scope to only a few conditions, and the applicability of SHPB theory to wood has not been tested. Tests were conducted using a large-diameter (3.0 inch (75 mm)) SHPB. The strain rate and total strain applied to a specimen depend on the striker bar length and its velocity at impact; pulse shapers are used to further modify the strain rate and the shape of the strain pulse. A series of tests was used to determine the conditions needed to produce a strain rate, total strain, and pulse shape appropriate for wood specimens. Hard maple, consisting of sugar maple (Acer saccharum) and black maple (Acer nigrum), and eastern white pine (Pinus strobus) specimens were used to represent a dense hardwood and a low-density softwood. Specimens were machined to diameters of 2.5 and 3.0 inches, and an assortment of lengths was tested to determine appropriate specimen dimensions. Longitudinal specimens of 1.5 inch length and radial and tangential specimens of 0.5 inch length were found to be most suitable for SHPB testing. Stress/strain curves were generated from the SHPB data and validated with 6061-T6 aluminum and wood specimens. Stress was indirectly corroborated with gaged aluminum specimens, and specimen strain was assessed with strain gages, digital image analysis, and measurement of residual strain to confirm the strain calculated from the SHPB data.
The SHPB was found to be a useful tool for accurately assessing the material properties of wood at high strain rates (70 to 340 s⁻¹) and short load durations (70 to 150 μs to compressive failure).
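The reduction from bar strain signals to specimen response uses the standard one-wave SHPB equations: stress from the transmitted pulse, strain rate from the reflected pulse, and strain by time integration. The sketch below applies them to synthetic constant pulses; the bar properties and signal values are illustrative assumptions, not the thesis's measured data.

```python
def shpb_response(eps_r, eps_t, dt, E_bar, A_bar, A_spec, c0, L_spec):
    """One-wave SHPB data reduction.
    eps_r/eps_t: reflected and transmitted bar strain histories,
    dt: sample interval (s), E_bar: bar modulus (Pa), A_bar/A_spec: bar and
    specimen cross-sections (m^2), c0: bar wave speed (m/s), L_spec: specimen
    length (m). Returns specimen stress (Pa), strain, and strain rate (1/s)."""
    stress = [E_bar * A_bar / A_spec * e for e in eps_t]
    rate = [-2.0 * c0 / L_spec * e for e in eps_r]
    strain, running = [], 0.0
    for r in rate:                        # rectangular-rule time integration
        running += r * dt
        strain.append(running)
    return stress, strain, rate

# Synthetic constant pulses (illustrative values only)
eps_r = [-0.001] * 10                     # reflected wave (compression negative)
eps_t = [0.0005] * 10                     # transmitted wave
stress, strain, rate = shpb_response(
    eps_r, eps_t, dt=1e-6, E_bar=69e9,    # aluminum-like bar modulus
    A_bar=4.418e-3, A_spec=3.167e-3,      # ~75 mm bar, ~63.5 mm specimen
    c0=5000.0, L_spec=0.038)              # bar wave speed; 1.5 in specimen
```

Note that these illustrative inputs land the strain rate near 263 s⁻¹, inside the 70 to 340 s⁻¹ range the study reports for wood.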

Relevance: 30.00%

Abstract:

A fundamental combustion model for spark-ignition engines is studied in this report. The model is implemented in SIMULINK to simulate engine outputs (mass fraction burned and in-cylinder pressure) under various operating conditions. The combustion model comprises turbulent flame propagation and eddy burning processes based on the literature [1]; both are simulated with a zero-dimensional method, and the flame is assumed to be spherical. To predict pressure, temperature, and other in-cylinder variables, a two-zone thermodynamic model is used. The model's predictions match engine test data well across various engine speeds, loads, spark ignition timings, and air-fuel mass ratios. The developed model is used to study cyclic variation and combustion stability under lean (or diluted) combustion conditions. Several variation sources are introduced into the combustion model to reproduce engine behavior observed in experimental data, and the relations between combustion stability and the amount of introduced variation are analyzed at various lean combustion levels.
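For context, the burn curve such a model predicts is often summarized by the Wiebe function, a common zero-dimensional surrogate for mass fraction burned. The report's model itself uses turbulent entrainment and eddy burning rather than this closed form, and the parameters below are conventional defaults, not values from the report.

```python
import math

def wiebe_mfb(theta, theta0, dtheta, a=5.0, m=2.0):
    """Wiebe mass-fraction-burned curve.
    theta: crank angle (deg), theta0: start of combustion (deg),
    dtheta: burn duration (deg), a/m: conventional efficiency/shape constants.
    Returns the cumulative mass fraction burned in [0, 1)."""
    if theta < theta0:
        return 0.0
    x = min((theta - theta0) / dtheta, 1.0)    # normalized burn progress
    return 1.0 - math.exp(-a * x ** (m + 1.0))
```

Fitting `theta0`, `dtheta`, `a`, and `m` cycle-by-cycle to measured pressure traces is one common way to quantify the cyclic variation the report studies.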

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: Transient elevation of arterial blood pressure (BP) is frequent in acute ischemic stroke and may help to increase perfusion of tissue jeopardized by ischemia. If this is true, recanalization may eliminate the need for this BP elevation. METHODS: We analyzed BP in 149 patients with acute ischemic stroke on admission to the hospital and 1 and 12 hours after intraarterial thrombolysis. BP values of patients with adequate recanalization were compared with those of patients with inadequate recanalization. Recanalization was graded on cerebral arteriography after thrombolysis using Thrombolysis in Myocardial Infarction (TIMI) grades. RESULTS: Systolic, mean, and diastolic arterial BP decreased significantly from admission to 12 hours after thrombolysis in all patients (P<0.001). Before thrombolysis, patients with adequate and inadequate recanalization showed equal systolic (147.4 and 148.0 mm Hg), mean (102.1 and 104.1 mm Hg), and diastolic (79.5 and 82.1 mm Hg) BP values. Twelve hours after thrombolysis, patients with adequate recanalization had lower values than those with inadequate recanalization (systolic BP, 130 versus 139.9 mm Hg; mean BP, 86.8 versus 92.2 mm Hg; diastolic BP, 65.2 versus 68.3 mm Hg). Two-way repeated-measures ANOVA showed a significant group × time interaction for systolic BP, indicating a larger systolic BP decrease when recanalization succeeded (P=0.019). CONCLUSIONS: The course of elevated systolic, but not diastolic, BP after acute ischemic stroke was inversely associated with the degree of vessel recanalization: when recanalization failed, systolic BP remained elevated longer than when it succeeded.

Relevance: 30.00%

Abstract:

BACKGROUND: Difference in pulse pressure (dPP) reliably predicts fluid responsiveness in patients. We have developed a respiratory variation (RV) monitoring device (RV monitor) that continuously records both airway pressure and arterial blood pressure (ABP). We compared RV monitor measurements with manual dPP measurements. METHODS: ABP and airway pressure (PAW) from 24 patients were recorded. Data were fed to the RV monitor to calculate dPP and systolic pressure variation in two ways: (a) from both ABP and PAW (RV algorithm) and (b) from ABP only (RV(slim) algorithm). Additionally, ABP and PAW were recorded intraoperatively at 10-min intervals for later manual calculation of dPP, and interobserver variability was determined. Manual dPP assessments were used for comparison with the automated measurements, and RV(slim) measurements were compared with RV measurements to estimate the importance of the PAW signal. RESULTS: For the 24 patients, 174 measurements (6-10 per patient) were recorded. Six observers assessed dPP manually in the first 8 patients (10-min intervals, 53 measurements); no interobserver variability occurred because a computer-assisted method was used. Bland-Altman analysis showed acceptable bias and limits of agreement for the two automated methods compared with the manual method (RV: -0.33% +/- 8.72%; RV(slim): -1.74% +/- 7.97%). The difference between RV and RV(slim) measurements was small (bias -1.05%, limits of agreement 5.67%). CONCLUSIONS: Measurements of the automated device are comparable with those obtained by human observers using a computer-assisted method. The importance of the PAW signal is questionable.
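The manual dPP assessment referenced here follows the standard pulse-pressure-variation formula: the maximum-minus-minimum pulse pressure over one respiratory cycle, divided by their mean. Below is a minimal Python sketch; beat detection is omitted, the example pressures are invented, and this is not the RV monitor's algorithm.

```python
def delta_pp(sys_bp, dia_bp):
    """Pulse-pressure variation (dPP, %) over one respiratory cycle.
    sys_bp/dia_bp: per-beat systolic and diastolic pressures (mmHg),
    assumed to span exactly one mechanical breath."""
    pp = [s - d for s, d in zip(sys_bp, dia_bp)]   # per-beat pulse pressure
    pp_max, pp_min = max(pp), min(pp)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Invented beats across one breath: PP swings between 32 and 40 mmHg
dpp = delta_pp([120, 118, 112, 110, 115], [80, 79, 78, 78, 79])
```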

Relevance: 30.00%

Abstract:

OBJECTIVE: In search of an optimal compression therapy for venous leg ulcers, a systematic review and meta-analysis was performed of randomized controlled trials (RCTs) comparing compression systems based on stockings (MCS) with various bandages. METHODS: RCTs were retrieved from six sources and reviewed independently. The primary endpoint, completion of healing within a defined time frame, and the secondary endpoints, time to healing and pain, were entered into a meta-analysis using the tools of the Cochrane Collaboration; additional subjective endpoints were summarized. RESULTS: Eight RCTs (published 1985-2008) fulfilled the predefined criteria. Data presentation was adequate and showed moderate heterogeneity. The studies included 692 patients (21-178 per study; mean age 61 years; 56% women). A total of 688 ulcerated legs were analyzed, with ulcers present for 1 week to 9 years and sizing 1 to 210 cm². The observation period ranged from 12 to 78 weeks. Patient and ulcer characteristics were evenly distributed in three studies, favored the stocking groups in four, and favored the bandage group in one. The pressure exerted by stockings and bandages was reported in seven and two studies, amounting to 31-56 and 27-49 mm Hg, respectively. The proportion of ulcers healed was greater with stockings than with bandages (62.7% vs 46.6%; P < .00001). The average time to healing (seven studies, 535 patients) was 3 weeks shorter with stockings (P = .0002). In no study did bandages perform better than MCS. Pain was assessed in three studies (219 patients), revealing an important advantage of stockings (P < .0001); other subjective parameters and nursing issues also favored MCS. CONCLUSIONS: Leg compression with stockings is clearly better than compression with bandages, has a positive impact on pain, and is easier to use.

Relevance: 30.00%

Abstract:

OBJECTIVE: To evaluate the association between arterial blood pressure (ABP) during the first 24 h and mortality in sepsis. DESIGN: Retrospective cohort study. SETTING: Multidisciplinary intensive care unit (ICU). PATIENTS AND PARTICIPANTS: A total of 274 septic patients. INTERVENTIONS: None. MEASUREMENTS AND RESULTS: Hemodynamic and laboratory parameters were extracted from a PDMS database. The hourly time integral of ABP drops below clinically relevant systolic arterial pressure (SAP), mean arterial pressure (MAP), and mean perfusion pressure (MPP = MAP - central venous pressure) levels was calculated for the first 24 h after ICU admission and compared with 28-day mortality. Binary and linear regression models (adjusted for SAPS II as a measure of disease severity) and a receiver operating characteristic (ROC) analysis were applied. The areas under the ROC curve were largest for the hourly time integrals of ABP drops below a MAP of 60 mmHg (0.779 vs. 0.764 for drops below a MAP of 55 mmHg; P ≤ 0.01) and below an MPP of 45 mmHg. No association between the hourly time integrals of ABP drops below specific SAP levels and mortality was detected. One or more episodes of MAP < 60 mmHg increased the risk of death by a factor of 2.96 (95% CI, 1.06-10.36; P = 0.04). The area under the ROC curve for predicting the need for renal replacement therapy was highest for the hourly time integral of ABP drops below a MAP of 75 mmHg. CONCLUSIONS: A MAP ≥ 60 mmHg may be as safe as higher MAP levels during the first 24 h of ICU therapy in septic patients, although a higher MAP may be required to maintain kidney function.
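The exposure measure used here can be sketched as a discrete time integral of MAP excursions below a threshold, normalized per hour of observation. The sampling interval and rectangular discretization below are assumptions for illustration, not the study's exact method.

```python
def hourly_integral_below(map_series, threshold, dt_h):
    """Hourly time integral of MAP drops below a threshold.
    map_series: MAP samples (mmHg), threshold: MAP level (mmHg),
    dt_h: sample interval in hours. Returns mmHg*h per hour observed."""
    area = sum((threshold - m) * dt_h for m in map_series if m < threshold)
    total_h = len(map_series) * dt_h
    return area / total_h

# One hour of 15-min samples with two excursions below MAP 60 mmHg
burden = hourly_integral_below([65, 58, 55, 62], threshold=60.0, dt_h=0.25)
```

Comparing this burden (rather than a simple below/above flag) against 28-day mortality captures both the depth and the duration of hypotensive episodes.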

Relevance: 30.00%

Abstract:

The Twentieth Century Reanalysis (20CR) is an atmospheric dataset of 56 ensemble members that covers the entire globe and reaches back to 1871. To assess its suitability for studying past extremes, we analysed a prominent extreme event, the Galveston Hurricane, which made landfall in Texas, USA, in September 1900. The ensemble mean of 20CR shows a track of the pressure minimum with a small standard deviation among the 56 ensemble members in the area of the Gulf of Mexico. However, there are systematic differences between the assimilated "Best Track" from the International Best Track Archive for Climate Stewardship (IBTrACS) and the ensemble mean track in 20CR: east of the Strait of Florida, the tracks derived from 20CR lie systematically northeast of the assimilated track, while in the Gulf of Mexico they are shifted to the southwest of the IBTrACS position. The hurricane can also be observed in the wind field, which shows cyclonic rotation and a relatively calm zone at the centre of the hurricane. The 20CR data reproduce the pressure gradient and the cyclonic wind field; regarding the amplitude of the wind speeds, however, the ensemble mean values from 20CR are significantly lower than the wind speeds known from measurements.

Relevance: 30.00%

Abstract:

The meteorological circumstances that led to the Blizzard of March 1888, which hit New York, are analysed in Version 2 of the Twentieth Century Reanalysis (20CR). The potential of this dataset for studying historical extreme events has not yet been fully explored; a detailed analysis of 20CR data alongside other sources (including historical instrumental data and weather maps) for extremes such as the March 1888 blizzard may give insights into the limitations of 20CR. We find that 20CR reproduces the circulation pattern as well as the temperature development very well. For the absolute values of variables such as snowfall or minimum and maximum surface pressure, there is an underestimation of the observed extremes, which may be due to the low spatial resolution of 20CR and the fact that only the ensemble mean is considered. Despite this drawback, the dataset allows us to gain new information thanks to its complete spatial and temporal coverage.

Relevance: 30.00%

Abstract:

INTRODUCTION Low systolic blood pressure (SBP) is an important secondary insult following traumatic brain injury (TBI), but its exact relationship with outcome is not well characterised. Although an SBP of <90 mmHg represents the threshold for hypotension in consensus TBI treatment guidelines, recent studies suggest redefining hypotension at higher levels. This study therefore aimed to fully characterise the association between admission SBP and mortality to further inform resuscitation endpoints. METHODS We conducted a multicentre cohort study using data from the largest European trauma registry. Consecutive adult patients with AIS head scores >2 admitted directly to specialist neuroscience centres between 2005 and July 2012 were studied. Multilevel logistic regression models were developed to examine the association between admission SBP and 30-day inpatient mortality, adjusted for confounders including age and severity of injury and accounting for differential quality of hospital care. RESULTS 5057 patients were included in complete case analyses. Admission SBP showed a smooth U-shaped association with outcome in bivariate analysis, with increasing mortality at both lower and higher values and no evidence of a threshold effect. Adjusting for confounding slightly attenuated the association between mortality and SBP at levels <120 mmHg and abolished the relationship at higher SBP values. Case-mix-adjusted odds of death were 1.5 times greater at <120 mmHg, doubled at <100 mmHg, tripled at <90 mmHg, and six times greater at SBP <70 mmHg (p<0.01). CONCLUSIONS These findings indicate that TBI studies should model SBP as a continuous variable and suggest that current TBI treatment guidelines, which define hypotension as SBP <90 mmHg, should be reconsidered.

Relevance: 30.00%

Abstract:

Identifying and comparing different steady states is an important task in clinical decision making. Data from unequal sources, comprising diverse patient status information, have to be interpreted, and an expressive representation is key to comparing results. In this contribution we suggest a criterion that calculates a context-sensitive value based on analysis of variance, and we discuss its advantages and limitations with reference to a clinical data example obtained during anesthesia. Different plasma target levels of the anesthetic propofol were preset to reach and maintain clinically desirable steady-state conditions with target-controlled infusion (TCI). At the same time, systolic blood pressure was monitored, depth of anesthesia was recorded using the bispectral index (BIS), and propofol plasma concentrations were determined in venous blood samples. The presented analysis of variance (ANOVA) is used to quantify how accurately steady states can be monitored and compared using the three methods of measurement.
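The core of such an ANOVA-based criterion can be sketched in plain Python: an F statistic across steady-state segments, where a larger F means the segments are more clearly separated relative to the within-segment variability. This is the generic computation, not the authors' specific context-sensitive criterion.

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA across measurement segments.
    groups: list of lists, one per steady-state segment."""
    k = len(groups)                                    # number of segments
    n = sum(len(g) for g in groups)                    # total sample count
    grand = sum(sum(g) for g in groups) / n            # grand mean
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two well-separated "steady states" of an invented monitored variable
f_separated = one_way_anova_f([[1.0, 2.0, 3.0], [11.0, 12.0, 13.0]])
f_identical = one_way_anova_f([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]])
```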

Relevance: 30.00%

Abstract:

This study tests whether cognitive failures mediate the effects of work-related time pressure and time control on commuting accidents and near-accidents. Participants were 83 employees (56% female) who commuted by vehicle between their regular places of residence and work. The Workplace Cognitive Failure Scale (WCFS) asked for the frequency of failures in memory function, attention regulation, and action execution. Time pressure and time control at work were assessed with the Instrument for Stress-Oriented Task Analysis (ISTA). Commuting accidents in the last 12 months were reported by 10% of participants, and half of the sample reported commuting near-accidents in the last 4 weeks. Cognitive failure significantly mediated the influence of time pressure at work on near-accidents, even when age, gender, neuroticism, conscientiousness, commuting duration, commuting distance, and time pressure during commuting were controlled for. Time control was negatively related to cognitive failure and neuroticism, but no association with commuting accidents or near-accidents was found. Time pressure at work is likely to increase cognitive load and may therefore increase cognitive failures during work and during commuting; time pressure at work can thus decrease commuting safety. The results suggest that reducing time pressure at work should improve commuting safety.
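The mediation logic (time pressure → cognitive failure → near-accidents) can be sketched as a Baron-Kenny style indirect effect a·b: a is the slope of the mediator on the predictor, and b is the partial slope of the outcome on the mediator controlling for the predictor. This plain-Python illustration omits the covariates controlled for in the study and uses invented data.

```python
def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    """Sample covariance (n-1 denominator)."""
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def indirect_effect(x, m, y):
    """Indirect (mediated) effect a*b of x on y through mediator m.
    a: slope of m on x; b: partial slope of y on m controlling for x,
    from the closed-form two-predictor least-squares solution."""
    a = cov(x, m) / cov(x, x)
    det = cov(x, x) * cov(m, m) - cov(x, m) ** 2
    b = (cov(y, m) * cov(x, x) - cov(y, x) * cov(x, m)) / det
    return a * b

# Invented data: mediator tracks predictor, outcome tracks mediator
x = [1.0, 2.0, 3.0, 4.0, 5.0]          # e.g. time pressure score
m = [2.0, 4.0, 6.0, 8.0, 11.0]         # e.g. cognitive failure score
y = [3.0 * v for v in m]               # e.g. near-accident count
ie = indirect_effect(x, m, y)
```

In practice the indirect effect is tested with a Sobel test or bootstrap confidence interval rather than read off directly.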