69 results for On-site observations


Relevance: 80.00%

Abstract:

BACKGROUND: Vasopressor-induced hypertension is routinely indicated for prevention and treatment of cerebral vasospasm (CVS) after subarachnoid haemorrhage (SAH). Mechanisms underlying patients' clinical improvement during vasopressor-induced hypertension remain incompletely understood. The aim of this study was to evaluate angiographic effects of normovolaemic norepinephrine (NE)-induced hypertension therapy on the rabbit basilar artery (BA) after SAH. METHODS: Cerebral vasospasm was induced using the one-haemorrhage rabbit model; sham-operated animals served as controls. Five days later the animals underwent follow-up angiography prior to and during NE-induced hypertension. Changes in diameter of the BA were digitally calculated as mean µm ± SEM (standard error of the mean). FINDINGS: Significant CVS of 14.2% was documented in the BA of the SAH animals on day 5 compared to the baseline angiogram on day 0 (n = 12, p < 0.01), whereas the BA of the control animals remained statistically unchanged (n = 12, p > 0.05). During systemic administration of NE, mean arterial pressure increased from 70.0 ± 1.9 mmHg to 136.0 ± 2.1 mmHg in the SAH group (n = 12, p < 0.001) and from 72.0 ± 3.1 to 137.8 ± 1.3 mmHg in the control group (n = 12, p < 0.001). On day 5 after SAH, a significant dilatation of the BA in response to norepinephrine could be demonstrated in both groups. The diameter of the BA in the SAH group increased from 640.5 ± 17.5 µm to 722.5 ± 23.7 µm (n = 12, p < 0.05). In the control group the diameter increased from 716.8 ± 15.5 µm to 779.9 ± 24.1 µm (n = 12, p < 0.05). CONCLUSION: This study demonstrated that NE-induced hypertension causes angiographic dilatation of the BA in the SAH rabbit model. Based on these observations, it can be hypothesised that clinical improvement during vasopressor-induced hypertension therapy after SAH might be explained by cerebral vasodilatation mechanisms that lead to improvement of cerebral blood flow.
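The relative diameter changes reported above follow directly from the quoted group means. A minimal arithmetic sketch, using only the figures given in the abstract:

```python
# Percent change in basilar artery (BA) diameter, computed from the mean
# values quoted in the abstract (illustrative arithmetic only).

def pct_change(before_um: float, after_um: float) -> float:
    """Percent change of vessel diameter relative to the starting value."""
    return (after_um - before_um) / before_um * 100.0

# SAH group: 640.5 um before NE vs 722.5 um during NE-induced hypertension
sah_dilation = pct_change(640.5, 722.5)
# Control group: 716.8 um before NE vs 779.9 um during NE
ctrl_dilation = pct_change(716.8, 779.9)

print(f"SAH group dilation:     {sah_dilation:.1f}%")   # about 12.8%
print(f"Control group dilation: {ctrl_dilation:.1f}%")  # about 8.8%
```

The spastic SAH vessels, although still narrower in absolute terms, dilated proportionally somewhat more than the controls, which is one way to read the angiographic response described above.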

Relevance: 80.00%

Abstract:

The role of platelets in hemostasis is to produce a plug to arrest bleeding. During thrombocytopenia, spontaneous bleeding is seen in some patients but not in others; the reason for this is unknown. Here, we subjected thrombocytopenic mice to models of dermatitis, stroke, and lung inflammation. The mice showed massive hemorrhage that was limited to the area of inflammation and was not observed in uninflamed thrombocytopenic mice. Endotoxin-induced lung inflammation during thrombocytopenia triggered substantial intra-alveolar hemorrhage leading to profound anemia and respiratory distress. By imaging the cutaneous Arthus reaction through a skin window, we observed in real time the loss of vascular integrity and the kinetics of skin hemorrhage in thrombocytopenic mice. Bleeding, observed mostly from venules, occurred as early as 20 minutes after challenge, pointing to a continuous need for platelets to maintain vascular integrity in inflamed microcirculation. Inflammatory hemorrhage was not seen in genetically engineered mice lacking major platelet adhesion receptors or their activators (αIIbβ3, glycoprotein Ibα [GPIbα], GPVI, and calcium and diacylglycerol-regulated guanine nucleotide exchange factor I [CalDAG-GEFI]), thus indicating that firm platelet adhesion was not necessary for their supporting role. While platelets were previously shown to promote endothelial activation and recruitment of inflammatory cells, they also appear indispensable to maintain vascular integrity in inflamed tissue. Based on our observations, we propose that inflammation may cause life-threatening hemorrhage during thrombocytopenia.

Relevance: 80.00%

Abstract:

Within the scope of a comprehensive assessment of the degree of soil erosion in Switzerland, common methods have been used in the past, including test plot measurements, artificial rainfall simulation, and erosion modelling. In addition, mapping guidelines for all visible erosion features have been developed since the 1970s and are being successfully applied in many research and soil conservation projects. Erosion damage has been continuously mapped over a period of 9 years in a test region in the central Bernese plateau. In 2005, two additional study areas were added. The present paper assesses the data gathered and provides a comparison of the three study areas within a period of one year (from October 2005 to October 2006), focusing on the on-site impacts of soil erosion. During this period, about 11 erosive rainfall events occurred. Average soil loss rates mapped at each study site amounted to 0.7 t ha⁻¹, 1.2 t ha⁻¹ and 2.3 t ha⁻¹, respectively. About one fourth of the total arable land showed visible erosion damage. Maximum soil losses of about 70 t ha⁻¹ occurred on individual farm plots. Average soil erosion patterns are widely used to underline the severity of an erosion problem (e.g. impacts on water bodies). But since severe rainfall events, wheel tracks, headlands, and other “singularities” often cause high erosion rates, analysis of extreme erosion patterns such as maximum values leads to a more differentiated understanding and to appropriate conclusions for the planning and design of soil protection measures. The study contains an assessment of soil erosion in Switzerland, emphasizing questions about extent, frequency and severity. At the same time, the effects of different types of land management are investigated in the field, aiming at the development of meaningful impact indicators of (un-)sustainable agriculture/soil erosion risk as well as the validation of erosion models. The results illustrate that conservation agriculture, including no-till, strip tillage and in-mulch seeding, plays an essential role in reducing soil loss as compared to conventional tillage.

Relevance: 80.00%

Abstract:

The Andean piedmont of eastern Bolivia is situated at the southern margin of Amazonia, characterized by an overall humid climate regime with a marked contrast between the rainy and dry seasons. The nearby Subandean foothills deliver abundant sandy sediments to the piedmont, leading to a complex array of sediments and paleosol horizons. Within this setting, the present study analyzes four profiles of paleosol-sediment sequences along incised ephemeral streams near Santa Cruz de la Sierra, with a focus on past pedogenic variability in the context of the regional late Quaternary geomorphic and environmental evolution. Based on field observations, micromorphological analysis, and geochemical and clay mineralogical data, five classes of paleosol horizons could be distinguished. The individual paleosol horizons, as well as the sediments in which they developed, were interpreted regarding their paleoenvironmental significance, taking into consideration the various controls on soil formation with particular focus on changes of local environmental conditions through time. Thus, three different pathways of soil formation were established. On the late Quaternary timescale, the results suggest a strong relation between paleoenvironmental conditions (climate, vegetation etc.), soil environment (soil water flow, micro-environment) and the type of paleosol horizons developed in the study area. The formation of “red beds” (Bw horizons) implies very dry soil environments under dominantly dry conditions, which seem to have prevailed in the study area some time before ∼18 cal ka BP. Moderately dry but markedly seasonal environmental conditions with a long dry season and strong seasonal contrasts in soil water flow could explain the formation of moderately developed Bwt horizons around ∼18 cal ka BP and during much of the mid-Holocene. The formation of Bt horizons and/or clay lamellae, in relation to intense neoformation of clay and dominant clay illuviation by soil water, points to wet conditions similar to today, which have probably prevailed in the study area before ∼8 cal ka BP and since ∼5 cal ka BP.

Relevance: 80.00%

Abstract:

Bovine spongiform encephalopathy (BSE) rapid tests and routine BSE-testing laboratories are subject to strict regulations for approval. Due to the lack of BSE-positive control samples, however, full assay validation at the level of individual test runs and continuous monitoring of test performance on-site is difficult. Most rapid tests use synthetic prion protein peptides, but it is not known to what extent they reflect the assay performance on field samples, and whether they are sufficient to indicate on-site assay quality problems. To address this question, we compared the test scores of the provided kit peptide controls to those of standardized weak BSE-positive tissue samples in individual test runs as well as continuously over time by quality control charts in two widely used BSE rapid tests. Our results reveal only a weak correlation between the weak positive tissue control and the peptide control scores. We identified kit-lot-related shifts in the assay performances that were not reflected by the peptide control scores. Vice versa, not all shifts indicated by the peptide control scores indeed reflected a shift in the assay performance. In conclusion, these data highlight that the use of the kit peptide controls for continuous quality control purposes may result in unjustified rejection or acceptance of test runs. However, standardized weak positive tissue controls in combination with Shewhart-CUSUM control charts appear to be reliable in continuously monitoring assay performance on-site to identify undesired deviations.
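The Shewhart-CUSUM monitoring mentioned in the conclusion can be sketched in a few lines. The control scores, target mean, standard deviation, and design constants below are hypothetical illustrations, not values from the study:

```python
# Minimal tabular CUSUM over a series of control-sample test scores.
# k (reference value) and h (decision threshold) are conventional CUSUM
# design constants; scores, mean, and sigma here are made up.

def cusum(scores, mean, sigma, k=0.5, h=4.0):
    """Return indices at which the CUSUM signals a shift in assay performance."""
    up = down = 0.0
    signals = []
    for i, x in enumerate(scores):
        z = (x - mean) / sigma          # standardized score
        up = max(0.0, up + z - k)       # accumulates upward drift
        down = max(0.0, down - z - k)   # accumulates downward drift
        if up > h or down > h:
            signals.append(i)
            up = down = 0.0             # reset after a signal
    return signals

# Hypothetical weak-positive tissue control scores with a slow upward drift:
scores = [1.0, 1.1, 0.9, 1.0, 1.05, 1.1, 1.15, 1.1, 1.15, 1.2]
print(cusum(scores, mean=1.0, sigma=0.1))
```

Because a CUSUM accumulates small deviations from the target, it can flag the kind of gradual kit-lot-related drift described above that a single-run control limit would miss.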

Relevance: 80.00%

Abstract:

Lichens are a key component of forest biodiversity. However, a comprehensive study analyzing lichen species richness in relation to several management types, extending over different regions and forest stages and including information on site conditions, is missing for temperate European forests. In three German regions (Schwäbische Alb, Hainich-Dün, Schorfheide-Chorin), the so-called Biodiversity Exploratories, we studied lichen species richness in 631 forest plots of 400 m² comprising different management types (unmanaged, selection cutting, deciduous and coniferous age-class forests resulting from clear cutting or shelterwood logging), various stand ages, and site conditions typical for large parts of temperate Europe. We analyzed how lichen species richness responds to management and habitat variables (standing biomass, cover of deadwood, cover of rocks). We found strong regional differences, with the highest lichen species richness in the Schwäbische Alb, probably driven by regional differences in former air pollution, precipitation, and habitat variables. Overall, unmanaged forests harbored 22% more threatened lichen species than managed age-class forests. In general, total, corticolous, and threatened lichen species richness did not differ among management types of deciduous forests. However, in the Schwäbische Alb region, deciduous forests had 61% more lichen species than coniferous forests, with 279% more threatened and 76% more corticolous lichen species. Old deciduous age classes were richer in corticolous lichen species than young ones, while old coniferous age classes were poorer than young ones. Overall, our findings highlight the importance of stand continuity for conservation. To increase total and threatened lichen species richness we suggest (1) conserving unmanaged forests, (2) promoting silvicultural methods assuring stand continuity, (3) conserving old trees in managed forests, (4) promoting stands of native deciduous tree species instead of coniferous plantations, and (5) increasing the amount of deadwood in forests.

Relevance: 80.00%

Abstract:

There is a wealth of smaller-scale studies on the effects of forest management on plant diversity. However, studies comparing plant species diversity in forests with different management types and intensity, extending over different regions and forest stages, and including detailed information on site conditions are missing. We studied vascular plants on 1500 forest plots of 20 m × 20 m in three regions of Germany (Schwäbische Alb, Hainich-Dün, Schorfheide-Chorin). In all regions, our study plots comprised different management types (unmanaged, selection cutting, deciduous and coniferous age-class forests, which resulted from clear cutting or shelterwood logging), various stand ages, site conditions, and levels of management-related disturbances. We analyzed how overall richness and richness of different plant functional groups (trees, shrubs, herbs, herbaceous species typically growing in forests, and herbaceous light-demanding species) responded to the different management types. On average, plant species richness was 13% higher in age-class than in unmanaged forests, and did not differ between deciduous age-class and selection forests. In age-class forests of the Schwäbische Alb and Hainich-Dün, coniferous stands had higher species richness than deciduous stands. Among age-class forests, older stands with large quantities of standing biomass were slightly poorer in shrub and light-demanding herb species than younger stands. Among deciduous forests, the richness of herbaceous forest species was generally lower in unmanaged than in managed forests, and it was even 20% lower in unmanaged than in selection forests in Hainich-Dün. Overall, these findings show that disturbances by management generally increase plant species richness. This suggests that total plant species richness is not suited as an indicator for the conservation status of forests, but rather indicates disturbances.

Relevance: 80.00%

Abstract:

OBJECTIVES In dental research, multiple site observations within patients or observations taken at various time intervals are commonplace. These clustered observations are not independent; statistical analysis should be amended accordingly. This study aimed to assess whether adjustment for clustering effects during statistical analysis was undertaken in five specialty dental journals. METHODS Thirty recent consecutive issues of Orthodontics (OJ), Periodontology (PJ), Endodontology (EJ), Maxillofacial (MJ) and Paediatric Dentistry (PDJ) journals were hand searched. Articles requiring adjustment for clustering effects were identified, and the statistical techniques used were scrutinized. RESULTS Of 559 studies considered to have inherent clustering effects, adjustment for this was made in the statistical analysis in 223 (39.1%). Studies published in the Periodontology specialty accounted for clustering effects in the statistical analysis more often than articles published in other journals (OJ vs. PJ: OR=0.21, 95% CI: 0.12, 0.37, p<0.001; MJ vs. PJ: OR=0.02, 95% CI: 0.00, 0.07, p<0.001; PDJ vs. PJ: OR=0.14, 95% CI: 0.07, 0.28, p<0.001; EJ vs. PJ: OR=0.11, 95% CI: 0.06, 0.22, p<0.001). A positive correlation was found between the increasing prevalence of clustering effects in individual specialty journals and correct statistical handling of clustering (r=0.89). CONCLUSIONS The majority of studies examined in the 5 dental specialty journals (60.9%) failed to account for clustering effects in statistical analysis where indicated, raising the possibility of inappropriate decreases in p-values and the risk of inappropriate inferences.
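The journal comparisons above are odds ratios from 2×2 tables (adjusted vs. not adjusted, one journal vs. another). A sketch of the standard OR and Wald 95% CI computation; the counts used here are hypothetical, not the study's data:

```python
import math

# Odds ratio with a 95% Wald confidence interval from a 2x2 table of
# adjusted/not-adjusted article counts in two journals. Counts are made up.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: adjusted/not adjusted in journal 1; c/d: the same in journal 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 20/80 articles adjusted in journal A vs 60/40 in journal B
or_, lo, hi = odds_ratio_ci(20, 80, 60, 40)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
```

An OR well below 1 with a CI excluding 1, as in the MJ vs. PJ comparison above, indicates the first journal adjusts for clustering significantly less often than the reference journal.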

Relevance: 80.00%

Abstract:

AIM To examine the association of alcohol-related mortality and other causes of death with neighbourhood density of alcohol-selling outlets for on-site consumption. DESIGN, SETTING AND PARTICIPANTS Longitudinal study of the adult Swiss population (n = 4 376 873) based on census records linked to mortality data from 2001 to 2008. MEASUREMENTS Sex-specific hazard ratios (HR) for death and 95% confidence intervals (95% CI) were calculated using Cox models adjusting for age, educational level, occupational attainment, marital status and other potential confounders. The density of alcohol-selling outlets within 1000 m of the residence was calculated using geocodes of outlets and residences. FINDINGS Compared with >17 outlets within 1000 m, the HR for alcohol-related mortality in men was 0.95 (95% CI: 0.89-1.02) for 8-17 outlets, 0.84 (95% CI: 0.77-0.90) for 3-7 outlets, 0.76 (95% CI: 0.68-0.83) for 1-2 outlets and 0.60 (95% CI: 0.51-0.72) for 0 outlets. The gradient in women was somewhat steeper, with an HR of 0.39 (95% CI: 0.26-0.60) comparing 0 with >17 outlets. Mortality from mental and behavioural causes and from lung cancer was also associated with the density of alcohol-selling outlets: HRs comparing 0 outlets with >17 outlets were 0.64 (95% CI: 0.52-0.79) and 0.79 (95% CI: 0.72-0.88), respectively, in men, and 0.46 (95% CI: 0.27-0.78) and 0.63 (95% CI: 0.52-0.77), respectively, in women. There were weak associations in the same direction with all-cause mortality in men but not in women. CONCLUSIONS In Switzerland, alcohol-related mortality is associated with the density of outlets around the place of residence. Community-level interventions to reduce alcohol outlet density may usefully complement existing interventions.
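The exposure measure above, outlet density within 1000 m of the residence, can be computed from geocodes with a great-circle distance. A minimal sketch using the haversine formula; all coordinates below are hypothetical, not from the study:

```python
import math

# Count alcohol-selling outlets within a radius of a residence, given
# (lat, lon) geocodes. Haversine great-circle distance on a spherical Earth.

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outlet_density(residence, outlets, radius_m=1000):
    """Number of outlets within radius_m of the residence."""
    return sum(1 for o in outlets if haversine_m(*residence, *o) <= radius_m)

residence = (46.95, 7.45)   # hypothetical geocode (Bern area)
outlets = [(46.951, 7.451), (46.96, 7.47), (47.00, 7.45)]
print(outlet_density(residence, outlets))
```

In a full analysis this count would then be categorized (0, 1-2, 3-7, 8-17, >17 outlets) and entered as the exposure in the Cox models described above.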

Relevance: 80.00%

Abstract:

Aims The effects of a system based on minimally trained first responders (FR) dispatched simultaneously with the emergency medical services (EMS) of the local hospital in a mixed urban and rural area in Northwestern Switzerland were examined. Methods and results In this prospective study, 500 voluntary fire fighters received a 4-h training in basic life support using automated external defibrillation (AED). FR and EMS were simultaneously dispatched in a two-tier rescue system. During the years 2001–2008, response times, resuscitation interventions and outcomes were monitored. 1334 emergencies were included. The FR reached the patients (mean age 60.4 ± 19 years; 65% male) within 6 ± 3 min after emergency calls, compared to 12 ± 5 min by the EMS (p < 0.0001). Seventy-six percent of the 297 out-of-hospital cardiac arrests (OHCAs) occurred at home. Only 3 emergencies with resuscitation attempts occurred at the main railway station equipped with an on-site AED. FR were on the scene before arrival of the EMS in 1166 (87.4%) cases. Of these, the FR used an AED in 611 patients for monitoring or defibrillation. CPR was initiated by the FR in 164 patients (68.9% of 238 resuscitated patients). 124 patients were defibrillated, of whom 93 (75.0%) were defibrillated first by the FR. Eighteen patients (of whom 13 were defibrillated by the FR) were discharged from hospital in good neurological condition. Conclusions Minimally trained fire fighters integrated in an EMS as FR contributed substantially to an increase in the survival rate of OHCAs in a mixed urban and rural area.

Relevance: 80.00%

Abstract:

BACKGROUND Contagious Bovine Pleuropneumonia (CBPP) is the most important chronic pulmonary disease of cattle on the African continent, causing severe economic losses. The disease, caused by infection with Mycoplasma mycoides subsp. mycoides, is transmitted by animal contact and develops slowly into a chronic form, preventing an early clinical diagnosis. Because available vaccines confer a low protection rate and short-lived immunity, the rapid diagnosis of infected animals combined with traditional curbing measures is seen as the best way to control the disease. While traditional labour-intensive bacteriological methods for the detection of M. mycoides subsp. mycoides have been replaced by molecular genetic techniques in the last two decades, these latter approaches require well-equipped laboratories and specialized personnel for the diagnosis. This is a handicap in areas where CBPP is endemic and early diagnosis is essential. RESULTS We present a rapid, sensitive and specific diagnostic tool for M. mycoides subsp. mycoides detection based on isothermal loop-mediated amplification (LAMP) that is applicable to field conditions. The primer set developed is highly specific and sensitive enough to diagnose clinical cases without prior cultivation of the organism. The LAMP assay detects M. mycoides subsp. mycoides DNA directly from crude samples of pulmonary/pleural fluids and serum/plasma within an hour using a simple dilution protocol. Photometric detection of LAMP products allows real-time visualisation of the amplification curve, and the application of a melting curve/re-association analysis provides a means of quality assurance based on the predetermined strand-inherent temperature profile supporting the diagnosis. CONCLUSION The CBPP LAMP, developed in a robust kit format, can be run on a battery-driven mobile device to rapidly detect M. mycoides subsp. mycoides infections from clinical or post mortem samples. The stringent innate quality control allows a conclusive on-site diagnosis of CBPP, such as during farm or slaughterhouse inspections.

Relevance: 80.00%

Abstract:

PRINCIPLES To evaluate the validity and feasibility of a novel photography-based home assessment (PhoHA) protocol as a possible substitute for on-site home assessment (OsHA). METHODS A total of 20 patients aged ≥65 years who were hospitalised in a rehabilitation centre for musculoskeletal disorders affecting mobility participated in this prospective validation study. For PhoHA, occupational therapists rated photographs and measurements of patients' homes provided by patients' confidants. For OsHA, occupational therapists conducted a conventional home visit. RESULTS Information obtained by PhoHA was 79.1% complete (1,120 environmental factors identified by PhoHA vs 1,416 by OsHA). Of the 1,120 factors, 749 had dichotomous scores (potential hazards) and 371 continuous scores (measurements with tape measure). Validity of PhoHA for detecting potential hazards was good (sensitivity 78.9%, specificity 84.9%), except for two subdomains (pathways, slippery surfaces). Pearson's correlation coefficient for the validity of measurements was 0.87 (95% confidence interval [CI]: 0.80-0.92, p < 0.001). Agreement between methods was 0.52 (95% CI: 0.34-0.67, p < 0.001, Cohen's kappa coefficient) for dichotomous scores and 0.86 (95% CI: 0.79-0.91, p < 0.001, intraclass correlation coefficient) for continuous scores. Costs of PhoHA were 53.0% lower than those of OsHA (p < 0.001). CONCLUSIONS PhoHA has good concurrent validity for environmental assessment if instructions for confidants are improved. PhoHA is potentially a cost-effective method for environmental assessment.
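The agreement statistic quoted above for the dichotomous (hazard present/absent) scores is Cohen's kappa. A sketch of the computation; the rating lists below are hypothetical, not the study's data:

```python
# Cohen's kappa: chance-corrected agreement between two dichotomous
# assessments (e.g. hazard identified by PhoHA vs by OsHA). Ratings made up.

def cohens_kappa(r1, r2):
    """Kappa for two equal-length binary (0/1) rating lists."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n   # raw agreement
    p1_yes = sum(r1) / n
    p2_yes = sum(r2) / n
    # Agreement expected by chance from each rater's marginal rates:
    expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
    return (observed - expected) / (1 - expected)

photo  = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]   # PhoHA: hazard identified?
onsite = [1, 1, 0, 1, 1, 0, 0, 1, 0, 0]   # OsHA reference
print(round(cohens_kappa(photo, onsite), 2))
```

Kappa discounts the agreement two raters would reach by chance alone, which is why the 0.52 reported above is lower than the raw percentage agreement between the two methods.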

Relevance: 80.00%

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance: 80.00%

Abstract:

The Penninic nappes in the Swiss Alps formed during continental collision between the Adriatic and European plates in Cenozoic times. Although intensely studied, the finite geometry of the basement-bearing Penninic nappes in western Switzerland has remained a matter of debate for decades (e.g., the “Siviez-Mischabel dilemma”), and the paleogeographic origin of various nappes has been disputed. Here, we present new structural data for the central part of the Penninic Bernard nappe complex, which contains pre-Permian basement and Permo-Mesozoic metasedimentary units. Our lithological and structural observations indicate that the discrepancy between the different structural models proposed for the Bernard nappe complex can be explained by a lateral discontinuity. In the west, the presence of a Permian graben caused complex isoclinal folding, whereas in the east, the absence of such a graben resulted mainly in imbricate thrusting. The overall geometry of the Bernard nappe complex is the result of three main deformation phases: (1) detachment of Mesozoic cover sediments along Triassic evaporites (Evolène phase) during the early stages of collision, (2) Eocene top-to-the-N(NW) nappe stacking (Anniviers phase), and (3) subsequent backfolding and backshearing (Mischabel phase). The southward localized backshearing is key to understanding the structural position and paleogeographic origin of units such as the Frilihorn and Cimes Blanches “nappes” and the Antrona ophiolites. Based on these observations, we present a new tectonic model for the entire Penninic region of western Switzerland and discuss this model in terms of continental collision zone processes.

Relevance: 80.00%

Abstract:

Macrophages are an important line of defence against invading pathogens. Human macrophages derived by different methods were tested for their suitability as models to investigate Listeria monocytogenes (Lm) infection and compared to macrophage-like THP-1 cells. Human primary monocytes were isolated by either positive or negative immunomagnetic selection and differentiated in the presence of granulocyte macrophage colony-stimulating factor (GM-CSF) or macrophage colony-stimulating factor (M-CSF) into pro- or anti-inflammatory macrophages, respectively. Regardless of the isolation method, GM-CSF-derived macrophages (GM-Mφ) stained positive for CD206 and M-CSF-derived macrophages (M-Mφ) for CD163. THP-1 cells did not express CD206 or CD163 following incubation with phorbol myristate acetate (PMA), M-CSF or GM-CSF alone or in combination. Upon infection with Lm, all primary macrophages showed good survival at high multiplicities of infection whereas viability of THP-1 was severely reduced even at lower bacterial numbers. M-Mφ generally showed high phagocytosis of Lm. Strikingly, phagocytosis of Lm by GM-Mφ was markedly influenced by the method used for isolation of monocytes. GM-Mφ derived from negatively isolated monocytes showed low phagocytosis of Lm whereas GM-Mφ generated from positively selected monocytes displayed high phagocytosis of Lm. Moreover, incubation with CD14 antibody was sufficient to enhance phagocytosis of Lm by GM-Mφ generated from negatively isolated monocytes. By contrast, non-specific phagocytosis of latex beads by GM-Mφ was not influenced by treatment with CD14 antibody. Furthermore, phagocytosis of Lactococcus lactis, Escherichia coli, human cytomegalovirus and the protozoan parasite Leishmania major by GM-Mφ was not enhanced upon treatment with CD14 antibody, indicating that this effect is specific for Lm. Based on these observations, we propose macrophages derived by ex vivo differentiation of negatively selected human primary monocytes as the most suitable model to study Lm infection of macrophages.