4 results for External parameters

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

60.00%

Publisher:

Abstract:

The ability of the one-dimensional lake model FLake to represent the mixolimnion temperatures under tropical conditions was tested for three locations in East Africa: Lake Kivu and Lake Tanganyika's northern and southern basins. Meteorological observations from surrounding automatic weather stations were corrected and used to drive FLake, whereas a comprehensive set of water temperature profiles served to evaluate the model at each site. Careful forcing data correction and model configuration made it possible to reproduce the observed mixed layer seasonality at Lake Kivu and Lake Tanganyika (northern and southern basins), with correct representation of both the mixed layer depth and water temperatures. At Lake Kivu, mixolimnion temperatures predicted by FLake were found to be sensitive both to minimal variations in the external parameters and to small changes in the meteorological driving data, in particular wind velocity. In each case, small modifications may lead to a regime switch, from the correctly represented seasonal mixed layer deepening to either completely mixed or permanently stratified conditions from ~10 m downwards. In contrast, model temperatures were found to be robust close to the surface, with acceptable predictions of near-surface water temperatures even when the seasonal mixing regime is not reproduced. FLake can thus be a suitable tool to parameterise tropical lake water surface temperatures within atmospheric prediction models. Finally, FLake was used to attribute the seasonal mixing cycle at Lake Kivu to variations in the near-surface meteorological conditions. It was found that the annual mixing down to 60 m during the main dry season is primarily due to enhanced lake evaporation and secondarily to the decreased incoming longwave radiation, both causing a significant heat loss from the lake surface and associated mixolimnion cooling.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Bolt-kit systems are increasingly used as an alternative to conventional external cerebrospinal fluid (CSF) drainage systems. Since 2009 we have regularly used bolt-kit external ventricular drainage (EVD) systems with silver-bearing catheters, inserted manually with a hand drill and skull screws, for emergency ventriculostomy. For non-emergency situations, we use conventional ventriculostomy with subcutaneously tunneled silver-bearing catheters, performed in the operating room with a pneumatic drill. This retrospective analysis compared the two techniques in terms of infection rates. METHODS 152 patients (aged 17-85 years, mean = 55.4 years) were included in the final analysis; 95 received bolt-kit silver-bearing catheters and 57 received conventionally implanted silver-bearing catheters. The primary endpoint combined infection parameters: occurrence of positive CSF culture, colonization of catheter tips, or elevated CSF white blood cell counts (>4/μl). Secondary outcome parameters were presence of microorganisms in CSF or on catheter tips. Incidence of increased CSF cell counts and number of patients with catheter malposition were also compared. RESULTS The primary outcome, the combined infection parameters (occurrence of either positive CSF culture, colonization of the catheter tips, or raised CSF white blood cell counts >4/μl), was not significantly different between the groups (58.9% bolt-kit group vs. 63.2% conventionally implanted group, p = 0.61, chi-square test). The bolt-kit group was non-inferior and not superior to the conventional group (relative risk reduction of 6.7%; 90% confidence interval: -19.9% to 25.6%). Secondary outcomes showed no statistically significant difference in the incidence of microorganisms in CSF (2.1% bolt-kit vs. 5.3% conventionally implanted; p = 0.30; chi-square test).
CONCLUSIONS This analysis indicates that silver-bearing EVD catheters implanted with a bolt-kit system outside the operating room do not significantly elevate the risk of CSF infection compared with conventional implant methods.
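The reported relative risk reduction can be reproduced from the abstract's own figures. A minimal sketch, assuming event counts of 56/95 and 36/57 (back-calculated to match the reported rates of 58.9% and 63.2%; the abstract does not state the raw counts):

```python
# Reproduce the reported relative risk reduction (RRR) from the abstract.
# The event counts below are assumptions consistent with the reported
# percentages, not values stated in the abstract.
events_bolt, n_bolt = 56, 95        # bolt-kit group: 56/95 ≈ 58.9%
events_conv, n_conv = 36, 57        # conventional group: 36/57 ≈ 63.2%

risk_bolt = events_bolt / n_bolt    # primary-outcome risk, bolt-kit
risk_conv = events_conv / n_conv    # primary-outcome risk, conventional

rrr = 1 - risk_bolt / risk_conv     # relative risk reduction
print(f"RRR = {rrr:.1%}")           # ≈ 6.7%, matching the abstract
```

The wide 90% confidence interval reported for this estimate (-19.9% to 25.6%) is what supports the non-inferiority conclusion despite the point estimate favoring the bolt-kit group.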

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Predicting long-term survival after admission to hospital is helpful for clinical, administrative and research purposes. The Hospital-patient One-year Mortality Risk (HOMR) model was derived and internally validated to predict the risk of death within 1 year after admission. We conducted an external validation of the model in a large multicentre study. METHODS We used administrative data for all nonpsychiatric admissions of adult patients to hospitals in the provinces of Ontario (2003-2010) and Alberta (2011-2012), and to the Brigham and Women's Hospital in Boston (2010-2012), to calculate each patient's HOMR score at admission. The HOMR score is based on a set of parameters that captures patient demographics, health burden and severity of acute illness. We determined patient status (alive or dead) 1 year after admission using population-based registries. RESULTS The 3 validation cohorts (n = 2,862,996 in Ontario, 210,595 in Alberta and 66,683 in Boston) were distinct from each other and from the derivation cohort. The overall risk of death within 1 year after admission was 8.7% (95% confidence interval [CI] 8.7% to 8.8%). The HOMR score was strongly and significantly associated with risk of death in all populations and was highly discriminative, with a C statistic ranging from 0.89 (95% CI 0.87 to 0.91) to 0.92 (95% CI 0.91 to 0.92). Observed and expected outcome risks were similar (median absolute difference in the percentage dying within 1 year: 0.3%, interquartile range 0.05%-2.5%). INTERPRETATION The HOMR score, calculated using routinely collected administrative data, accurately predicted the risk of death among adult patients within 1 year after admission to hospital for nonpsychiatric indications. Similar performance was seen when the score was used in geographically and temporally diverse populations. The HOMR model can be used for risk adjustment in analyses of health administrative data to predict long-term survival among hospital patients.
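The C statistic reported above measures discrimination: the probability that a randomly chosen patient who died received a higher score than a randomly chosen patient who survived. A minimal pairwise illustration on toy scores (the data are invented for illustration and have no relation to the HOMR cohorts):

```python
from itertools import product

def c_statistic(scores_dead, scores_alive):
    """Concordance index: P(score_dead > score_alive); ties count 0.5."""
    pairs = list(product(scores_dead, scores_alive))
    wins = sum(1.0 if d > a else 0.5 if d == a else 0.0 for d, a in pairs)
    return wins / len(pairs)

# Toy scores: a higher score should indicate a higher 1-year death risk.
dead = [0.9, 0.8, 0.6]
alive = [0.7, 0.4, 0.3, 0.2]
print(c_statistic(dead, alive))  # 11 of 12 pairs concordant ≈ 0.917
```

A value of 0.5 means no discrimination (random ordering) and 1.0 means perfect separation, so the 0.89-0.92 range reported for HOMR indicates strong discrimination.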

Relevance:

30.00%

Publisher:

Abstract:

The prevalence of keel bone damage as well as external egg parameters of 2 pure lines divergently selected for high (H) and low (L) bone strength were investigated in 2 aviary systems under commercial conditions. A standard LSL hybrid was used as a reference group. Birds were kept mixed per genetic line (77 hens each of the H and L lines and 201 or 206 hens of the LSL line, respectively, per pen) in 8 pens of 2 aviary systems differing in design. Keel bone status and body mass of 20 focal hens per line and pen were assessed at 17, 18, 23, 30, 36, 43, 52, and 63 wk of age. External egg parameters (i.e., egg mass, eggshell breaking strength, thickness, and mass) were measured using 10 eggs per line at both 38 and 57 wk of age. Body parameters (i.e., tarsus and third primary wing feather length, used to calculate an index of wing loading) were recorded at 38 wk of age, and mortality was recorded per genetic line throughout the laying cycle. Bone mineral density (BMD) of 15 keel bones per genetic line was measured after slaughter to confirm assignment of the experimental lines. We found a greater BMD in the H line compared with the L and LSL lines. Fewer keel bone fractures and deviations, a poorer external egg quality, as well as a lower index of wing loading were found in the H line compared with the L line. Mortality was lower and production parameters (e.g., laying performance) were higher in the LSL line compared with the 2 experimental lines. Aviary design affected the prevalence of keel bone damage, body mass, and mortality. We conclude that selection for specific traits associated with bone strength, as well as the related differences in body morphology (i.e., lower index of wing loading), has potential to reduce keel bone damage in commercial settings. Also, the housing environment (i.e., aviary design) may have additive effects.