155 results for Drip loss


Relevance:

20.00%

Publisher:

Abstract:

The updated second edition of this text begins with an overview of theories underpinning loss and grief, followed by a comprehensive outline of the author’s ‘range of response to loss’ (RRL) model, which emerged from her own research and experience. The RRL model provides a framework to explore variability in how people respond to grief, and case studies are used to demonstrate its application in practice. This is followed by an outline of the author’s ‘adult attitude to grief’ scale, which can be used to map a person’s grief and generate a grief profile.

Relevance:

20.00%

Publisher:

Abstract:

Evidence-based thermal care recommendations designed to minimize heat loss immediately at birth are readily available; however, hypothermia persists as a global challenge, especially when caring for the most immature and smallest preterm infants. In this narrative overview we aim to provide the reader with a succinct summary of the causes and consequences of hypothermia, the extent of the problem (rates of hypothermia), principles of good thermal care, delivery room preventative measures, the research evidence underpinning existing interventions, current issues in practice, and the way forward. Given the plethora of research literature in this subject area, our article focuses primarily on evidence derived from systematic reviews and randomized or quasi-randomized controlled trials assessing the effectiveness of interventions to prevent hypothermia in the most vulnerable (preterm/low birth weight) infants, where the intervention or combination of interventions is applied immediately at birth.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Web-based programs are a potential medium for supporting weight loss because of their accessibility and wide reach. Research is warranted to determine the shorter- and longer-term effects of these programs in relation to weight loss and other health outcomes.

OBJECTIVE: The aim was to evaluate the effects of a Web-based component of a weight loss service (Imperative Health) in an overweight/obese population at risk of cardiovascular disease (CVD) using a randomized controlled design and a true control group.

METHODS: A total of 65 overweight/obese adults at high risk of CVD were randomly allocated to 1 of 2 groups. Group 1 (n=32) was provided with the Web-based program, which supported positive dietary and physical activity changes and assisted in managing weight. Group 2 continued with their usual self-care (n=33). Assessments were conducted face-to-face. The primary outcome was between-group change in weight at 3 months. Secondary outcomes included between-group change in anthropometric measurements, blood pressure, lipid measurements, physical activity, and energy intake at 3, 6, and 12 months. Interviews were conducted to explore participants' views of the Web-based program.

RESULTS: Retention rates for the intervention vs control groups were 78% (25/32) vs 97% (32/33) at 3 months, 66% (21/32) vs 94% (31/33) at 6 months, and 53% (17/32) vs 88% (29/33) at 12 months. Intention-to-treat analysis, using the baseline observation carried forward imputation method, revealed that the intervention group lost more weight relative to the control group at 3 months (mean -3.41, 95% CI -4.70 to -2.13 kg vs mean -0.52, 95% CI -1.55 to 0.52 kg, P<.001) and at 6 months (mean -3.47, 95% CI -4.95 to -1.98 kg vs mean -0.81, 95% CI -2.23 to 0.61 kg, P=.02), but not at 12 months (mean -2.38, 95% CI -3.48 to -0.97 kg vs mean -1.80, 95% CI -3.15 to -0.44 kg, P=.77). More intervention group than control group participants lost ≥5% of their baseline body weight at 3 months (34%, 11/32 vs 3%, 1/33, P<.001) and at 6 months (41%, 13/32 vs 18%, 6/33, P=.047), but not at 12 months (22%, 7/32 vs 21%, 7/33, P=.95). Versus the control group, the intervention group showed improvements in total cholesterol and triglycerides and adopted more positive dietary and physical activity behaviors for up to 3 months; however, these improvements were not sustained.

CONCLUSIONS: Although the intervention group had high attrition levels, this study provides evidence that this Web-based program can be used to initiate clinically relevant weight loss and lower CVD risk up to 3-6 months based on the proportion of intervention group participants losing ≥5% of their body weight versus control group. It also highlights a need for augmenting Web-based programs with further interventions, such as in-person support to enhance engagement and maintain these changes.
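The intention-to-treat analysis above uses the baseline observation carried forward (BOCF) imputation method. Purely as an illustration of that imputation step (not the trial's own analysis code), the following Python/pandas sketch uses hypothetical participant IDs, visits, and weights; missing follow-up weights are replaced by the baseline value, so missed visits contribute zero weight change.

    import pandas as pd

    # Hypothetical long-format weight data; None marks a missed follow-up visit.
    df = pd.DataFrame({
        "id":     [1, 1, 1, 2, 2, 2],
        "visit":  ["baseline", "3m", "6m", "baseline", "3m", "6m"],
        "weight": [92.0, 88.5, None, 101.0, None, None],
    })

    # Baseline weight per participant, carried forward into missing visits.
    baseline = df[df["visit"] == "baseline"].set_index("id")["weight"]
    df["weight_bocf"] = df["weight"].fillna(df["id"].map(baseline))

    # Change from baseline under BOCF (missing visits give a change of 0).
    df["change_bocf"] = df["weight_bocf"] - df["id"].map(baseline)
    print(df)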


Relevance:

20.00%

Publisher:

Abstract:

In this paper, our previous work on Principal Component Analysis (PCA) based fault detection is extended to the dynamic monitoring and detection of loss-of-main in power systems using wide-area synchrophasor measurements. In the previous work, a static PCA model was built and shown to be capable of detecting and extracting system fault events; however, its false alarm rate was high. To address this problem, this paper uses the well-known 'time lag shift' method to include the dynamic behaviour of the system in the PCA model, based on synchronized measurements from Phasor Measurement Units (PMUs); the result is termed Dynamic Principal Component Analysis (DPCA). Compared with the static PCA approach, as well as with traditional passive mechanisms of loss-of-main detection, the proposed DPCA procedure describes how the synchrophasors are linearly auto- and cross-correlated, based on a singular value decomposition of the augmented, time-lagged synchrophasor matrix. As with the static PCA method, two statistics, T² and Q, with associated confidence limits, are calculated to form intuitive charts with which engineers or operators can monitor the loss-of-main situation in real time. The effectiveness of the proposed methodology is evaluated on loss-of-main monitoring of a real system, using historic data recorded from PMUs installed at several locations in the UK/Ireland power system.
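To make the time-lag-shift construction and the T²/Q monitoring statistics concrete, here is a minimal numerical sketch, not the authors' implementation: synthetic data stands in for the PMU synchrophasors, the two-lag window and 95% variance retention are assumptions, and the confidence limits used for alarming are omitted.

    import numpy as np

    # Synthetic data standing in for wide-area PMU synchrophasor measurements:
    # 500 samples of 6 channels. A real model would be trained on healthy-system data.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 6))

    def time_lag_augment(X, lags=2):
        """Stack the data matrix with its own lagged copies (the 'time lag shift')."""
        n = X.shape[0] - lags
        return np.hstack([X[i:i + n] for i in range(lags + 1)])

    Xa = time_lag_augment(X, lags=2)
    Xa = (Xa - Xa.mean(axis=0)) / Xa.std(axis=0)   # standardise each column

    # PCA via singular value decomposition of the augmented matrix.
    U, s, Vt = np.linalg.svd(Xa, full_matrices=False)
    var = s**2 / (Xa.shape[0] - 1)                 # variance of each component
    k = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.95)) + 1
    P = Vt[:k].T                                   # retained loadings

    def t2_q(x):
        """T^2 (within-model) and Q/SPE (residual) statistics for one sample."""
        t = x @ P
        T2 = float(np.sum(t**2 / var[:k]))
        resid = x - t @ P.T
        return T2, float(resid @ resid)

    print(t2_q(Xa[0]))

In practice the loadings would be built from healthy-system records, and reference distributions for T² and Q would supply the confidence limits before any alarm logic is applied.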

Relevance:

20.00%

Publisher:

Abstract:

To determine the effect of microbial metabolites on the release of root exudates from perennial ryegrass, seedlings were pulse-labelled with [14C]-CO2 in the presence of a range of soil micro-organisms. Microbial inoculants were spatially separated from roots by Millipore membranes so that root infection did not occur; using this technique, only microbial metabolites could affect root exudation. The effect of microbial metabolites on carbon assimilation, distribution, and root exudation was determined for 15 microbial species. Assimilation of the pulse label varied by more than 3.5-fold depending on the inoculant. Distribution of the label between roots and shoots also varied with inoculant, but the carbon pool most sensitive to inoculation was root exudation. In the absence of a microbial inoculant, only 1% of the assimilated label was exuded. Inoculation of the microcosms always increased exudation, but the percentage exuded varied greatly, within the range of 3-34%.

Relevance:

20.00%

Publisher:

Abstract:

Single-zone modelling is used to assess three collections of 1D impeller loss models, with an automotive turbocharger centrifugal compressor used for the evaluation. The individual 1D losses are presented relative to each other at three tip speeds to provide a visual description of each author’s perception of the relative importance of each loss. The losses are compared through their resulting predictions of pressure ratio and efficiency, which are in turn compared with test data; from this comparison, a combination of the 1D loss collections is identified as providing the best performance prediction. 3D CFD simulations have also been carried out for the same geometry using a single-passage model, and a method of extracting 1D losses from the CFD results is described and used to draw further comparisons with the 1D losses. A 1D scroll volute model has been added to the single-passage CFD results, giving good agreement with the test data. Shortcomings in the existing 1D loss models are identified as a result of the comparisons with the 3D CFD losses. Further comparisons are drawn between the predicted 1D data, the 3D CFD simulation results, and the test data using a non-dimensional method to highlight where the current errors in the 1D prediction exist.
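As a rough illustration of how single-zone 1D loss modelling feeds a performance prediction, the sketch below sums hypothetical internal and parasitic impeller losses (specific enthalpy terms) into an isentropic efficiency estimate. The loss names, magnitudes, Euler work, and the internal/parasitic split are illustrative assumptions, not values or formulations from the paper or from any of the three loss collections.

    # Illustrative 1D impeller losses as specific enthalpy deficits [J/kg].
    losses = {
        "incidence":      1200.0,
        "blade_loading":  2500.0,
        "skin_friction":  1800.0,
        "clearance":       900.0,
        "mixing":          700.0,
        "disc_friction":   400.0,   # parasitic: adds work without raising pressure
        "recirculation":   600.0,   # parasitic
    }

    euler_work = 65_000.0  # J/kg, assumed Euler work input at one operating point

    internal = sum(v for k, v in losses.items()
                   if k not in ("disc_friction", "recirculation"))
    parasitic = losses["disc_friction"] + losses["recirculation"]

    # Isentropic enthalpy rise = Euler work minus internal losses;
    # actual work absorbed = Euler work plus parasitic losses.
    eta = (euler_work - internal) / (euler_work + parasitic)
    print(f"Predicted impeller isentropic efficiency: {eta:.3f}")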

Relevance:

20.00%

Publisher:

Abstract:

Single-zone modelling is used to assess different collections of 1D impeller loss models. Three collections of loss models have been identified in the literature, and the background to each is discussed. Each collection is evaluated using three modern automotive turbocharger-style centrifugal compressors, and the performance predictions of the collections are compared, with an empirical data set taken from standard hot gas stand tests for each turbocharger used as the baseline. Compressor range is also predicted: impeller diffusion ratio is shown to be a useful means of predicting compressor surge in 1D, and choke is predicted using basic compressible flow theory. The compressor designer can use this as a guide to identify the most compatible collection of losses for turbocharger compressor design applications, and the analysis indicates the most appropriate collection for the design of automotive turbocharger centrifugal compressors.
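As an illustration of the kind of basic compressible flow estimate used for choke prediction, the sketch below computes the choked (sonic) mass flow through an assumed effective throat area at assumed inlet stagnation conditions; none of the numbers come from the paper's compressors.

    import math

    # Choked mass flow through an effective throat (basic compressible flow theory).
    gamma, R = 1.4, 287.0          # air
    T0, p0 = 298.0, 101_325.0      # assumed inlet stagnation temperature [K] and pressure [Pa]
    A_throat = 1.2e-3              # assumed effective inducer throat area [m^2]

    m_dot_choke = (A_throat * p0 / math.sqrt(T0) *
                   math.sqrt(gamma / R) *
                   (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))))
    print(f"Choking mass flow ~ {m_dot_choke:.3f} kg/s")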

Relevance:

20.00%

Publisher:

Abstract:

High Voltage Direct Current (HVDC) electric power transmission is a promising technology for integrating offshore wind farms and interconnecting power grids in different regions. To maintain the DC voltage, droop control is widely used. Transmission line loss constitutes an important part of the total power loss in a multi-terminal HVDC (MTDC) scheme. In this paper, the relationship between droop controller design and transmission loss is investigated, and different MTDC layout configurations are compared to examine the effect of droop controller design on transmission loss.
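To illustrate the kind of relationship being studied, the sketch below couples a simple proportional voltage-droop characteristic to an I²R line-loss calculation for a single DC line; the droop form, gain, set-points, and line resistance are illustrative assumptions rather than the paper's controller design.

    # Assumed droop characteristic: P = P_ref - (V - V_ref) / k.
    V_ref, P_ref = 400e3, 200e6     # nominal DC voltage [V] and power set-point [W]
    k = 0.05 * V_ref / P_ref        # droop gain: 5% voltage deviation over rated power
    R_line = 1.5                    # assumed line resistance [ohm]

    def droop_power(V):
        """Converter power order for a measured DC voltage V."""
        return P_ref - (V - V_ref) / k

    for V in (0.98 * V_ref, V_ref, 1.02 * V_ref):
        P = droop_power(V)
        I = P / V                   # line current for that injection
        loss = I**2 * R_line        # I^2 R transmission loss on this line
        print(f"V = {V/1e3:6.1f} kV  P = {P/1e6:7.1f} MW  line loss = {loss/1e6:.2f} MW")

A stiffer or softer droop gain shifts how power (and hence current) redistributes for a given voltage deviation, which is why the controller design feeds directly into the transmission loss.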

Relevance:

20.00%

Publisher:

Abstract:

Public concern over biodiversity loss is often rationalized as a threat to ecosystem functioning, but biodiversity-ecosystem functioning (BEF) relations are hard to quantify empirically at large scales. We use a realistic marine food-web model, resolving species over five trophic levels, to study how total fish production changes with species richness. This complex model predicts that BEF relations, on average, follow simple Michaelis-Menten curves when species are deleted at random. These curves are shaped mainly by the release of fish from predation, rather than by the release from competition expected in simpler communities. Ordering species deletions by decreasing body mass or trophic level, representing 'fishing down the food web', accentuates prey-release effects and results in unimodal relationships. In contrast, simultaneous unselective harvesting diminishes these effects and produces an almost linear BEF relation, with maximum multispecies fisheries yield at approximately 40% of initial species richness. These findings have important implications for the valuation of marine biodiversity.
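For reference, the Michaelis-Menten form reported for the average BEF relation under random deletion is a simple saturating curve; the sketch below evaluates it with assumed (not fitted) parameters.

    # Michaelis-Menten form for the average BEF relation under random species
    # deletion; F_max and K_half are illustrative assumptions, not model outputs.
    F_max, K_half = 1.0, 5.0     # asymptotic production and half-saturation richness

    def bef_mm(richness):
        """Total fish production as a saturating function of species richness."""
        return F_max * richness / (K_half + richness)

    for S in (1, 5, 10, 20, 40):
        print(f"richness = {S:2d}  relative production = {bef_mm(S):.2f}")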

Relevance:

20.00%

Publisher:

Abstract:

PTF11iqb was initially classified as a Type IIn event caught very early after explosion. It showed narrow Wolf-Rayet (WR) spectral features on day 2, but the narrow emission weakened quickly and the spectrum morphed to resemble those of Types II-L and II-P. At late times, Hα emission exhibited a complex, multipeaked profile reminiscent of SN 1998S. In terms of spectroscopic evolution, we find that PTF11iqb was a near twin of SN 1998S, although with weaker interaction with circumstellar material (CSM) at early times and stronger CSM interaction at late times. We interpret the spectral changes as caused by early interaction with asymmetric CSM that is quickly (by day 20) enveloped by the expanding SN ejecta photosphere, but then revealed again after the end of the plateau when the photosphere recedes. The light curve can be matched with a simple model for weak CSM interaction added to the light curve of a normal SN II-P. This plateau requires that the progenitor had an extended H envelope like a red supergiant, consistent with the slow progenitor wind speed indicated by the narrow emission. The cool supergiant progenitor is significant because PTF11iqb showed WR features in its early spectrum, meaning that the presence of such WR features in an early SN spectrum does not necessarily indicate a WR-like progenitor. [abridged] Overall, PTF11iqb bridges SNe IIn with the weaker pre-SN mass loss seen in SNe II-L and II-P, implying a continuum between these types.

Relevance:

20.00%

Publisher:

Abstract:

In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at the portfolio level requires a simulation of up to 800,000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture the risk for a global multi-peril reinsurance portfolio covering a range of perils, including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge, riverine flooding, and wildfire. Such simulations are both computation- and data-intensive, making the application of high-performance computing techniques desirable.

In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading, and careful hand tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units and Intel Xeon Phi many-core accelerators.
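As a toy illustration of how a probable maximum loss (PML) figure is read off such a simulation (ignoring the treaty structure, secondary uncertainty, and performance engineering the paper addresses), the sketch below builds a small year-loss table from assumed event-frequency and severity distributions and takes a return-period quantile.

    import numpy as np

    rng = np.random.default_rng(42)
    n_trials = 10_000                                # reduced; the paper describes up to 800,000
    n_events = rng.poisson(1000, size=n_trials)      # catastrophic events per simulated year

    # Aggregate annual loss per trial: assumed heavy-tailed event severities, summed.
    annual_loss = np.array([rng.lognormal(mean=12.0, sigma=1.5, size=n).sum()
                            for n in n_events])

    # PML at a 1-in-250-year return period = 99.6th percentile of annual loss.
    pml_250 = np.quantile(annual_loss, 1 - 1 / 250)
    print(f"1-in-250-year PML ~ {pml_250:,.0f}")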

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES:

To compare the methods used by 3 large randomized trials of glaucoma treatment to estimate the incidence of visual field progression, applying these methods to a common data set of annually obtained visual field measurements from patients with glaucoma followed up for an average of 6 years.

METHODS:

The methods used by the Advanced Glaucoma Intervention Study (AGIS), the Collaborative Initial Glaucoma Treatment Study (CIGTS), and the Early Manifest Glaucoma Trial (EMGT) were applied to 67 eyes of 56 patients with glaucoma enrolled in a 10-year natural history study of glaucoma, using Program 30-2 of the Humphrey Field Analyzer (Humphrey Instruments, San Leandro, Calif). The incidence of apparent visual field progression was estimated for each method, the extent of agreement between the methods was calculated, and time to apparent progression was compared.

RESULTS:

The proportion of patients progressing was 11%, 22%, and 23% with the AGIS, CIGTS, and EMGT methods, respectively. Clinical assessment identified 23% of patients as having progressed, but only half of these were also identified by the CIGTS or EMGT methods. The CIGTS and EMGT methods had comparable incidence rates, but only half of those identified by one method were also identified by the other.

CONCLUSIONS:

The EMGT and CIGTS methods produced rates of apparent progression that were twice those of the AGIS method. Although EMGT, CIGTS, and clinical assessment rates were comparable, they did not identify the same patients as having had field progression.
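The agreement calculation described in the methods can be illustrated with a toy example: given hypothetical progression calls for ten eyes under two methods (not data from the study), the sketch below counts eyes flagged by both methods, by either method, and the overall proportion of agreement.

    # Hypothetical progression calls (1 = progressed) for ten eyes under two methods.
    cigts = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
    emgt  = [1, 0, 0, 0, 1, 1, 0, 0, 0, 0]

    both      = sum(c and e for c, e in zip(cigts, emgt))
    either    = sum(c or e for c, e in zip(cigts, emgt))
    agreement = sum(c == e for c, e in zip(cigts, emgt)) / len(cigts)

    print(f"Flagged by both: {both}, by either: {either}, "
          f"overall agreement: {agreement:.0%}")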