983 results for CATCHMENT-AREA
Abstract:
Up to 88% of cavernous malformations (CMs) of the central nervous system can become symptomatic and cause long-term disability. The aim of this study was to document the characteristics of CMs in the catchment area of our institution.
Abstract:
We examined the seasonal variability of spontaneous cervical artery dissection (sCAD) by analysing prospectively collected data from 352 patients with 380 sCAD (361 symptomatic sCAD; 305 carotid and 75 vertebral artery dissections) admitted to two university hospitals with a catchment area of 2,200,000 inhabitants between 1985 and 2004. Presenting symptoms and signs of the 380 sCAD were ischaemic stroke in 241 (63%), transient ischaemic attack in 40 (11%), retinal ischaemia in seven (2%), and non-ischaemic in 73 (19%) cases; 19 (5%) were asymptomatic sCAD. A seasonal pattern was observed, with a higher frequency of sCAD in winter (31.3%; 95% confidence interval (CI): 26.5 to 36.4; p=0.021) than in spring (25.5%; 95% CI: 21.1 to 30.3), summer (23.5%; 95% CI: 19.3 to 28.3), and autumn (19.7%; 95% CI: 15.7 to 24.1). Although the cause of seasonality in sCAD is unclear, the winter peaks of infection, hypertension, and aortic dissection suggest common underlying mechanisms.
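Seasonal confidence intervals like those above can be reproduced from raw counts with a binomial-proportion interval; a minimal Python sketch (the study's exact CI method is not stated, so a Wilson score interval is assumed, and the winter count is back-calculated from the reported 31.3% of 380 cases):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 380 sCAD in total; winter share reported as 31.3% -> roughly 119 cases (assumption)
lo, hi = wilson_ci(119, 380)
print(f"winter: {119 / 380:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The interval this yields is close to, though not identical with, the published 26.5-36.4, consistent with the authors having used a slightly different (e.g. exact) method.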
Abstract:
Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models due to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether or not latent class regression models adhere to standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov Chain Monte Carlo estimation procedure. Unlike standard maximum likelihood implementations for latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any functions of parameters. It is this convenience that allows us to provide the diagnostic methods that we introduce. As a motivating example we present an analysis focusing on the association between depression and socioeconomic status, using data from the Epidemiologic Catchment Area study. We consider a latent class regression analysis investigating the association between depression and socioeconomic status measures, where the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid due to the violation of model assumptions. The violation of these assumptions is clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
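The conditional-independence assumption that these diagnostics check can be illustrated by simulation: given the latent class, item responses should be uncorrelated. A minimal Python sketch (the two-class structure and class-conditional probabilities below are invented for illustration):

```python
import random

random.seed(0)

# Hypothetical 2-class latent structure: P(item = 1 | class) for two binary items
p_item = {0: (0.2, 0.3), 1: (0.8, 0.7)}

def within_class_phi(cls, n=100_000):
    """Phi (Pearson) correlation of two binary items simulated within one class."""
    p1, p2 = p_item[cls]
    draws = [(random.random() < p1, random.random() < p2) for _ in range(n)]
    m1 = sum(x for x, _ in draws) / n
    m2 = sum(y for _, y in draws) / n
    cov = sum((x - m1) * (y - m2) for x, y in draws) / n
    return cov / ((m1 * (1 - m1) * m2 * (1 - m2)) ** 0.5)

# Conditional independence holds by construction, so both values are near zero
print(within_class_phi(0), within_class_phi(1))
```

In the diagnostics the paper describes, the analogous quantities are computed from MCMC posterior draws rather than from simulated classes; a clearly non-zero within-class association flags a violation.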
Abstract:
This thesis presents a paleoclimatic/paleoenvironmental study conducted on clastic cave sediments of the Moravian Karst, Czech Republic. The study is based on environmental magnetic techniques, yet a wide range of other scientific methods was used to obtain a clearer picture of the Quaternary climate. My thesis also presents an overview of the significance of cave deposits for paleoclimatic reconstructions, explains basic environmental magnetic techniques and offers background information on the study area – a famous karst region in Central Europe with a rich history. In Kulna Cave, magnetic susceptibility variations, and in particular variations in pedogenic susceptibility, yield a detailed record of the palaeoenvironmental conditions during the Last Glacial Stage. The Kulna long-term climatic trends agree with the deep-sea SPECMAP record, while the short-term oscillations correlate with rapid changes in the North Atlantic sea surface temperatures. Kulna Cave sediments reflect the intensity of pedogenesis controlled by short-term warmer events and precipitation over the mid-continent and provide a link between continental European climate and sea surface temperatures in the North Atlantic during the Last Glacial Stage. Given the number of independent climate proxies determined from the entrance facies of the cave and their high resolution, Kulna is an extremely important site for studying Late Pleistocene climate. In the interior of Spiralka Cave, a five-meter-high section of fine-grained sediments deposited during floods yields information on the climatic and environmental conditions of the last millennium. In the upper 1.5 meters of this profile, mineral magnetic and other non-magnetic data indicate that susceptibility variations are controlled by the concentration of magnetite and its magnetic grain size. Comparison of our susceptibility record to the instrumental record of winter temperature anomalies shows a remarkable correlation.
This correlation is explained by the coupling of flooding events, cultivation of land and pedogenetic processes in the cave catchment area. A combination of mineral magnetic and geochemical proxies yields a detailed picture of the rapidly evolving climate of the recent past and tracks both natural and human-induced environmental changes taking place in the broader region.
Abstract:
Changes in land cover alter the water balance components of a catchment, due to strong interactions between soils, vegetation and the atmosphere. Therefore, hydrological climate impact studies should also integrate scenarios of associated land cover change. To reflect two severe climate-induced changes in land cover, we applied scenarios of glacier retreat and forest cover increase that were derived from the temperature signals of the climate scenarios used in this study. The climate scenarios were derived from ten regional climate models from the ENSEMBLES project. Their respective temperature and precipitation changes between the scenario period (2074–2095) and the control period (1984–2005) were used to run a hydrological model. The relative importance of each of the three types of scenarios (climate, glacier, forest) was assessed through an analysis of variance (ANOVA). Altogether, 15 mountainous catchments in Switzerland were analysed, exhibiting different degrees of glaciation during the control period (0–51%) and different degrees of forest cover increase under scenarios of change (12–55% of the catchment area). The results show that even an extreme change in forest cover is negligible with respect to changes in runoff, but it is crucial where changes in evaporation or soil moisture are concerned. For the latter two variables, the relative impact of forest change is proportional to the magnitude of its change. For changes that concern 35% of the catchment area or more, the effect of forest change on summer evapotranspiration is equally or even more important than the climate signal. For catchments with a glaciation of 10% or more in the control period, the glacier retreat significantly determines summer and annual runoff. The most important source of uncertainty in this study, though, is the climate scenario, and it is highly recommended to apply an ensemble of climate scenarios in impact studies.
The results presented here are valid for the climatic region they were tested for, i.e. a humid, mid-latitude mountainous environment. They might differ for regions where evaporation is a major component of the water balance, for example. Nevertheless, a hydrological climate-impact study that additionally assesses the impacts of forest and glacier change has not been presented before, and it provides insight into the question of whether it is necessary to account for land cover changes as part of climate change impacts on hydrological systems.
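The ANOVA-based attribution of uncertainty to the three scenario types can be sketched as a main-effect sum-of-squares decomposition over a balanced factorial of scenario runs. A toy Python illustration (the factor levels and runoff changes below are invented, not taken from the study):

```python
from itertools import product

# Hypothetical runoff changes (%) for a balanced factorial of
# 3 climate x 2 glacier x 2 forest scenarios; all values invented
climates, glaciers, forests = [0, 1, 2], [0, 1], [0, 1]
effect = {(c, g, f): -10 * c - 4 * g + 1 * f
          for c, g, f in product(climates, glaciers, forests)}

grand = sum(effect.values()) / len(effect)
total_ss = sum((v - grand) ** 2 for v in effect.values())

def main_effect_ss(levels, axis):
    """ANOVA main-effect sum of squares for one scenario factor."""
    ss = 0.0
    for lev in levels:
        cell = [v for k, v in effect.items() if k[axis] == lev]
        mean = sum(cell) / len(cell)
        ss += len(cell) * (mean - grand) ** 2
    return ss

for name, levels, ax in [("climate", climates, 0),
                         ("glacier", glaciers, 1),
                         ("forest", forests, 2)]:
    share = main_effect_ss(levels, ax) / total_ss
    print(f"{name}: {share:.0%} of scenario variance")
```

With these invented numbers the climate factor dominates and forest change is negligible for runoff, mirroring the qualitative pattern the abstract reports.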
Abstract:
Introduction: Lesotho was among the first countries to adopt decentralization of care from hospitals to nurse-led health centres (HCs) to scale up the provision of antiretroviral therapy (ART). We compared outcomes between patients who started ART at HCs and hospitals in two rural catchment areas in Lesotho. Methods: The two catchment areas comprise two hospitals and 12 HCs. Patients ≥16 years starting ART at a hospital or HC between 2008 and 2011 were included. Loss to follow-up (LTFU) was defined as not returning to the facility for ≥180 days after the last visit, no follow-up (no FUP) as not returning after starting ART, and retention in care as alive and on ART at the facility. The data were analysed using logistic regression, competing risk regression and Kaplan-Meier methods. Multivariable analyses were adjusted for sex, age, CD4 cell count, World Health Organization stage, catchment area and type of ART. All analyses were stratified by gender. Results: Of 3747 patients, 2042 (54.5%) started ART at HCs. Both women and men at hospitals had more advanced clinical and immunological stages of disease than those at HCs. Over 5445 patient-years, 420 died and 475 were LTFU. Kaplan-Meier estimates for three-year retention were 68.7 and 69.7% at HCs and hospitals, respectively, among women (p=0.81) and 68.8% at HCs versus 54.7% at hospitals among men (p<0.001). These findings persisted in adjusted analyses, with similar retention at HCs and hospitals among women (odds ratio (OR): 0.89, 95% confidence interval (CI): 0.73-1.09) and higher retention at HCs among men (OR: 1.53, 95% CI: 1.20-1.96). The latter result was mainly driven by a lower proportion of patients LTFU at HCs (OR: 0.68, 95% CI: 0.51-0.93). Conclusions: In rural Lesotho, overall retention in care did not differ significantly between nurse-led HCs and hospitals. 
However, men seemed to benefit most from starting ART at HCs, as they were more likely to remain in care in these facilities compared to hospitals.
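Retention figures like the three-year estimates above come from the Kaplan-Meier product-limit estimator; a minimal pure-Python sketch with invented follow-up data (a real analysis would use a survival package such as lifelines or R's survival):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time for each patient
    events : 1 if the endpoint (death/LTFU) occurred, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, steps = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        while i < len(data) and data[i][0] == t:  # group tied times
            d += data[i][1]
            n += 1
            i += 1
        if d:  # survival drops only at event times
            surv *= 1 - d / at_risk
            steps.append((t, surv))
        at_risk -= n
    return steps

# Toy cohort: months on ART, 1 = left care, 0 = censored (illustrative only)
times  = [3, 6, 6, 12, 18, 24, 30, 36, 36, 36]
events = [1, 1, 0, 1, 0, 1, 0, 0, 0, 0]
print(kaplan_meier(times, events))
```

Stratifying such curves by facility type and sex, as the study did, amounts to running the estimator separately on each subgroup and comparing the curves (e.g. with a log-rank test).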
Abstract:
High-resolution seismic profiles and sediment cores from Lake Ledro combined with soil and riverbed samples from the lake's catchment area are used to assess the recurrence of natural hazards (earthquakes and flood events) in the southern Italian Alps during the Holocene. Two well-developed deltas and a flat central basin are identified on seismic profiles in Lake Ledro. Lake sediments have been finely laminated in the basin since 9000 cal. yr BP and frequently interrupted by two types of sedimentary events (SEs): light-coloured massive layers and dark-coloured graded beds. Optical analysis (quantitative organic petrography) of the organic matter present in soil, riverbed and lacustrine samples together with lake sediment bulk density and grain-size analysis illustrate that light-coloured layers consist of a mixture of lacustrine sediments and mainly contain algal particles similar to the ones observed in background sediments. Light-coloured layers thicker than 1.5 cm in the main basin of Lake Ledro are synchronous with numerous coeval mass-wasting deposits remoulding the slopes of the basin. They are interpreted as subaquatic mass-movements triggered by historical and pre-historical regional earthquakes dated to AD 2005, AD 1891, AD 1045 and 1260, 2545, 2595, 3350, 3815, 4740, 7190, 9185 and 11,495 cal. yr BP. Dark-coloured SEs develop high-amplitude reflections in front of the deltas and in the deep central basin. These beds are mainly made of terrestrial organic matter (soils and lignocellulosic debris) and are interpreted as resulting from intense hyperpycnal flood events. Mapping and quantifying the amount of soil material accumulated in the Holocene hyperpycnal flood deposits of the sequence allows estimating that the equivalent soil thickness eroded over the catchment area reached up to 5 mm during the largest Holocene flood events. Such significant soil erosion is interpreted as resulting from the combination of heavy rainfall and snowmelt.
The recurrence of flash-flood events during the Holocene was, however, not high enough to affect pedogenic processes, and it highlights several wet regional periods during the Holocene. The Holocene period is divided into four phases of environmental evolution. Over the first half of the Holocene, a progressive stabilization of the soils present throughout the catchment of Lake Ledro was associated with a progressive reforestation of the area and was interrupted only during the wet 8.2 ka event, when soil destabilization was particularly pronounced. Lower soil erosion was recorded during the mid-Holocene climatic optimum (8000-4200 cal. yr BP) and was associated with higher algal production. Between 4200 and 3100 cal. yr BP, both a wetter climate and human activities within the drainage basin drastically increased soil erosion rates. Finally, from 3100 cal. yr BP to the present day, the data suggest increasing and changing human land use.
Abstract:
The endemic New Zealand longfin eel Anguilla dieffenbachii (hereafter, longfin eel) is overfished, and in the southern South Island, New Zealand, rivers have recently become predominantly male. This study examined length and age at sexual differentiation in male eels in the Aparima River catchment (area, 1,375 km²; mean flow, 20 m³·s⁻¹) and the sex ratio and distribution of eels throughout the catchment. Longfin eels differentiated into males mostly at lengths from 300 to 460 mm and ages from 10 to 25+ years. Females were rare: of 738 eels examined for sexual differentiation, 466 were males and 5 were females, and a few others, not examined, were large enough to be female. These counts suggest a male:female ratio among differentiated longfin eels of 68:1. Of 31 differentiated shortfin eels A. australis, less common in the Aparima River, 26 were females. Male longfin eels were distributed throughout the main stem and tributaries; undifferentiated eels were more prevalent in lower and middle reaches and in the main stem than in upper reaches and tributaries. In other studies, male longfin eels predominated in commercial catches in the Aparima and four other southernmost rivers, by 2.4:1 to 13.6:1 males to females. The Aparima River had the most skewed sex ratio. Longfin eel catches from the Aparima River will become more male-predominated because few sublegal-size females were present. The length-frequency distributions of eels in the present samples and in the commercial catches were truncated just above the minimum legal size (about 460 mm), showing that few females escape the fishery. Historically, females predominated in these rivers. The recent change in sex ratio is attributable partly to selective harvest of females and partly to changes in the structure of the population from fishing, such that differentiation into males has been favored.
Longevity, delayed sexual maturity, semelparity, and endemism with a restricted range make the longfin eel particularly vulnerable to overfishing.
Abstract:
OBJECTIVE The ACCESS treatment model offers assertive community treatment embedded in an integrated care program to patients with psychoses. Compared to standard care and within a controlled study, it proved to be more effective in terms of service disengagement and illness outcomes in patients with schizophrenia spectrum disorders over 12 months. ACCESS was implemented into clinical routine and its effectiveness assessed over 24 months in severe schizophrenia spectrum disorders and bipolar I disorder with psychotic features (DSM-IV) in a cohort study. METHOD All 115 patients treated in ACCESS (from May 2007 to October 2009) were included in the ACCESS II study. The primary outcome was rate of service disengagement. Secondary outcomes were change of psychopathology, severity of illness, psychosocial functioning, quality of life, satisfaction with care, medication nonadherence, length of hospital stay, and rates of involuntary hospitalization. RESULTS Only 4 patients (3.4%) disengaged with the service. Another 11 (9.6%) left because they moved outside the catchment area. Patients received a mean of 1.6 outpatient contacts per week. Involuntary admissions decreased from 34.8% in the 2 previous years to 7.8% during ACCESS (P < .001). Mixed models repeated-measures analyses revealed significant improvements among all patients in psychopathology (effect size d = 0.64, P < .001), illness severity (d = 0.84, P = .03), functioning level (d = 0.65, P < .001), quality of life (d = 0.50, P < .001), and client satisfaction (d = 0.11, P < .001). At 24 months, 78.3% were fully adherent to medication, compared to 25.2% at baseline (P = .002). CONCLUSIONS ACCESS was successfully implemented in clinical routine and maintained excellent rates of service engagement and other outcomes in patients with schizophrenia spectrum disorders or bipolar I disorder with psychotic features over 24 months. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT01888627.
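The improvement effect sizes (d) reported above are standardized mean differences; a minimal sketch of a pooled-SD Cohen's d with invented symptom scores (the study itself derived its effect sizes from mixed-models repeated-measures analyses, so this is only the underlying idea):

```python
import math

def cohens_d(pre, post):
    """Cohen's d for two score samples using the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    m1, m2 = sum(pre) / n1, sum(post) / n2
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical psychopathology scores at baseline and 24 months (lower = better)
baseline  = [80, 75, 90, 85, 70, 95, 88, 77]
follow_up = [60, 58, 72, 70, 55, 80, 69, 61]
print(f"d = {cohens_d(baseline, follow_up):.2f}")
```

By the usual convention, d around 0.5 is a medium effect and around 0.8 a large one, which is how the reported values of 0.50-0.84 are read.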
Abstract:
BACKGROUND Transient ischemic attacks (TIA) are stroke warning signs and emergency situations, and, if immediately investigated, doctors can intervene to prevent strokes. Nevertheless, many patients delay going to the doctor, and doctors might delay urgently needed investigations and preventative treatments. We set out to determine how much general practitioners (GPs) and hospital physicians (HPs) knew about stroke risk after TIA, and to measure their referral rates. METHODS We used a structured questionnaire to ask GPs and HPs in the catchment area of the University Hospital of Bern to estimate a patient's risk of stroke after TIA. We also assessed their referral behavior. We then statistically analysed their reasons for deciding not to immediately refer patients. RESULTS Of the 1545 physicians, 40% (614) returned the survey. Of these, 75% (457) overestimated stroke risk within 24 hours, and 40% (245) overestimated risk within 3 months after TIA. Only 9% (53) underestimated stroke risk within 24 hours and 26% (158) underestimated risk within 3 months; 78% (473) of physicians overestimated the amount that carotid endarterectomy reduces stroke risk; 93% (543) would rigorously investigate the cause of a TIA, but only 38% (229) would refer TIA patients for urgent investigations "very often". Physicians most commonly gave these reasons for not making emergency referrals: the patient's advanced age; the patient's preference; the patient was multimorbid; and the patient needed long-term care. CONCLUSIONS Although physicians overestimate stroke risk after TIA, their rate of emergency referral is modest, mainly because they tend not to refer multimorbid and elderly patients at the appropriate rate. Since old and frail patients benefit from urgent investigations and treatment after TIA as much as younger patients, future educational campaigns should focus on the importance of emergency evaluations for all TIA patients.
Abstract:
A deeper understanding of past vegetation dynamics is required to better assess future vegetation responses to global warming in the Alps. Lake sediments from Lac de Bretaye, a small subalpine lake in the Northern Swiss Alps (1780 m a.s.l.), were analysed to reconstruct past vegetation dynamics for the entire Holocene, using pollen, macrofossil and charcoal analyses as main proxies. The results show that timberline reached the lake’s catchment area at around 10,300 cal. BP, supporting the hypothesis of a delayed postglacial afforestation in the Northern Alps. At the same time, thermophilous trees such as Ulmus, Tilia and Acer established in the lowlands and expanded to the altitude of the lake, forming distinctive boreo-nemoral forests with Betula, Pinus cembra and Larix decidua. From about 5000 to 3500 cal. BP, thermophilous trees declined because of increasing human land use, mainly driven by the mass expansion of Picea abies and severe anthropogenic fire activity. From the Bronze Age onwards (c. 4200–2800 cal. BP), grazing indicators and high values for charcoal concentration and influx attest an intensifying human impact, fostering the expansion of Alnus viridis and Picea abies. Hence, biodiversity in alpine meadows increased, whereas forest diversity declined, as can be seen in other regional records. We argue that the anticipated climate change and decreasing human impact in the Alps today will not only lead to an upward movement of timberline with consequent loss of area for grasslands, but also to a disruption of Picea abies forests, which may allow the re-expansion of thermophilous tree species.
Abstract:
A descriptive study of demographic and psychosocial factors believed to be associated with employment was carried out through face-to-face interviews with 417 chronically mentally ill patients. Subjects had been hospitalized a minimum of two times for psychiatric treatment, had been discharged from at least one of these hospitalizations in the two years prior to the study, and were currently residing within a specific community mental health center catchment area in Texas. The study group ranged in age from 16 to 68 years and over one-half had chart diagnoses of schizophrenia. A structured interview was developed which addressed current employment status, length of current employment, job title of current or last job, and detailed work history for the prior five years. Four measures of social support were included in the interview. Each subject was asked to identify one recent work and one recent non-work situation which had been stressful or very demanding. A coping questionnaire was verbally administered to measure the ways in which subjects had coped with these specific stressful situations. Analysis of results revealed that 27 percent of the sample was gainfully employed at time of interview. Differences between the employed and unemployed groups were analyzed by t-test and chi-square. The employed demonstrated significantly more weeks of employment in the prior five years than the unemployed. The current jobs of the employed required a significantly higher relationship to "things" or inanimate objects than the last jobs of the unemployed. Subjects diagnosed as schizophrenic were significantly less likely to be employed than subjects with other diagnoses. Employed subjects scored significantly higher on three of four measures of social support than unemployed subjects, including reported frequency of social group attendance and/or meetings with mental health professionals.
Problem-focused coping was used significantly more by the employed than by the unemployed to deal with stressful situations in the work, but not the non-work, context.
Abstract:
Global and local climatic forcing, e.g. concentration of atmospheric CO2 or insolation, influence the distribution of C3 and C4 plants in southwest Africa. C4 plants dominate in more arid and warmer areas and are favoured by lower pCO2 levels. Several studies have assessed past and present continental vegetation by the analysis of terrestrial n-alkanes in near-coastal deep sea sediments using single samples or a small number of samples from a given climatic stage. The objectives of this study were to evaluate vegetation changes in southwest Africa with regard to climatic changes during the Late Pleistocene and the Holocene and to elucidate the potential of single sample simplifications. We analysed two sediment cores at high resolution, altogether ca. 240 samples, from the Southeast Atlantic Ocean (20°S and 12°S) covering the time spans of 18 to 1 ka and 56 to 2 ka, respectively. Our results for 20°S showed marginally decreasing C4 plant domination (of ca. 5%) during deglaciation based on average chain length (ACL27-33 values) and carbon isotopic composition of the C31 and C33 n-alkanes. Values for single samples from 18 ka and the Holocene overlap and, thus, are not significantly representative of the climatic stages they derive from. In contrast, at 12°S the n-alkane parameters show a clear difference of plant type for the Late Pleistocene (C4 plant domination, 66% C4 on average) and the Holocene (C3 plant domination, 40% C4 on average). During deglaciation vegetation change highly correlates with the increase in pCO2 (r² = 0.91). Short-term climatic events such as Heinrich Stadials or Antarctic warming periods are not reflected by vegetation changes in the catchment area. Instead, smaller vegetation fluctuations during the Late Pleistocene occur in accordance with local variations of insolation.
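The reported r² = 0.91 between deglacial vegetation change and pCO2 is a squared Pearson correlation; a minimal Python sketch with invented series (both series and the resulting value are illustrative only, not the study's data):

```python
import math

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2

# Invented deglacial trend: rising pCO2 (ppm) against declining %C4 vegetation
pco2 = [190, 200, 215, 230, 245, 260, 270]
c4   = [66, 64, 59, 54, 50, 44, 41]
print(f"r^2 = {r_squared(pco2, c4):.3f}")
```

Note that r² is sign-blind: the strong negative relationship (less C4 as pCO2 rises) still yields a value near 1.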
Abstract:
This paper proposes and develops a tool for "managing the risk of contamination of water resources", inspired by methods commonly used in environmental impact assessments, such as the importance matrix and risk assessment. The tool is applied to the oasis of the lower Tunuyán River, whose catchment is located on the eastern flank of the Cordillera de Los Andes, in the province of Mendoza, Argentina. The proposed method consists of determining, for each Management Unit (UM, for its Spanish initials): 1. the vulnerability of the territory; 2. the hazard posed by the effluent; 3. the risk classes; 4. the risk-management priority index; these variables are then mapped. The resulting databases can be analysed from different perspectives and, in turn, updated as knowledge deepens about the attributes that determine the hazard of a discharge (e.g. type of effluent, timing, flow rate and discharge location) and the vulnerability of the UM (e.g. aquifer type, water-table depth, ground permeability, soil quality, etc.). This management tool produces a dynamic diagnosis of the situation, since it can be refined through research into the variables involved in the contamination of water by effluents. It is also a practical tool because it ranks management priorities according to a graduated order for applying contamination-risk management measures. Given the worldwide trend of glacier shrinkage driven by global warming and its negative impact on river flows, it is indispensable and urgent to establish management priorities to preserve the quality of water resources.
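The per-unit risk-classification step can be sketched as a lookup that combines a territory-vulnerability score with an effluent-hazard score. The paper does not give its actual scoring scheme, so the 1-3 scales, thresholds, and unit names below are all hypothetical:

```python
def risk_class(vulnerability, hazard):
    """Combine 1-3 vulnerability and hazard scores into a risk class (toy rule)."""
    score = vulnerability * hazard  # ranges 1..9
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Made-up Management Units: (vulnerability, hazard) scores
units = {"UM-1": (3, 2), "UM-2": (1, 2), "UM-3": (2, 3)}
for um, (v, h) in sorted(units.items()):
    print(um, risk_class(v, h))
```

Ranking units by this class (and, in the paper, by a further priority index) is what lets management measures be applied in graduated order.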