942 results for Seasonal and interannual monitoring


Relevance: 100.00%

Publisher:

Abstract:

The Adriatic sturgeon, Acipenser naccarii (Bonaparte, 1836), is a highly threatened species due to human activities, particularly overfishing and habitat destruction. Its peculiar ecology and biology (restricted range and anadromy) make the species particularly vulnerable. In March 2010 the IUCN classified the Adriatic sturgeon as critically endangered on the Red List of Threatened Species. Owing to its rapid decline since the 1980s, there is at present no evidence of natural reproduction in the wild, which makes the Adriatic sturgeon dependent on captive breeding programs that need to be improved in order to be effective for the survival of the species. To this end, this study aims to characterize an artificial restocking population of Adriatic sturgeon with both genetic and physiological analyses, in order to establish an efficient restocking program for future reproductions. The research is structured on two levels. First, genetic analysis of 9 microsatellite loci provides information on parentage assignment and kinship among the sampled individuals, and allows prediction of which reproduction events would best increase genetic diversity through the estimation of multilocus pairwise band-sharing coefficients. Second, physiological analysis: testosterone (T) concentrations were measured in each individual for sexing, avoiding sacrifice of the animals or invasive examination of the gonads. Combining these interdisciplinary analyses provides an overall picture that identifies the main broodstock participating in reproduction events and optimal potential future participants, ensuring sound management and monitoring of the restocking program.
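
A minimal sketch of how a multilocus pairwise band-sharing (similarity) coefficient might be computed from microsatellite genotypes is shown below; the locus names, genotype encoding, and values are illustrative assumptions, not the study's actual data or software.

```python
# Hypothetical sketch: multilocus pairwise band-sharing coefficient between
# two individuals genotyped at several microsatellite loci. Genotypes are
# encoded as (allele_a, allele_b) tuples per locus; locus names and allele
# sizes below are illustrative only.

def band_sharing(geno_x, geno_y):
    """Average per-locus similarity Sxy = 2 * shared alleles / (nx + ny)."""
    scores = []
    for locus, alleles_x in geno_x.items():
        alleles_y = geno_y[locus]
        shared = sum(min(alleles_x.count(a), alleles_y.count(a))
                     for a in set(alleles_x))
        scores.append(2.0 * shared / (len(alleles_x) + len(alleles_y)))
    return sum(scores) / len(scores)

ind_1 = {"Afu19": (120, 124), "Afu39": (88, 92), "AoxD161": (150, 150)}
ind_2 = {"Afu19": (120, 128), "Afu39": (92, 92), "AoxD161": (150, 154)}
# Lower coefficients indicate less related pairs, which are preferable
# mates when the goal is to increase genetic diversity.
print(band_sharing(ind_1, ind_2))
```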

Relevance: 100.00%

Publisher:

Abstract:

New treatment options for Niemann-Pick Type C (NPC) have recently become available. To assess the efficiency and efficacy of these new treatments, markers for disease status and progression are needed. Both the diagnosis and the monitoring of disease progression are challenging and mostly rely on clinical impression and functional testing of horizontal eye movements. Diffusion tensor imaging (DTI) provides information about microstructural integrity, especially of white matter. We show here, in a case report, how DTI and measures derived from this imaging method can serve as adjunct quantitative markers for disease management in Niemann-Pick Type C. Two approaches are taken. First, we compare the fractional anisotropy (FA) in the white matter globally between a 29-year-old NPC patient and 18 healthy age-matched controls and show the remarkable difference in FA relatively early in the course of the disease. Second, a voxelwise comparison of FA values reveals where white matter integrity is compromised locally and demonstrates an individualized analysis of FA changes before and after 1 year of treatment with miglustat. This method might be useful in future treatment trials for NPC to assess treatment effects.
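
For context, fractional anisotropy can be computed from the three eigenvalues of the diffusion tensor using the standard formula; the sketch below uses made-up eigenvalues, not the patient data from this report.

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """Standard FA formula from the three diffusion-tensor eigenvalues."""
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return np.sqrt(0.5 * num / den)

# Illustrative eigenvalues (mm^2/s): intact white matter is strongly
# anisotropic (high FA), while degraded white matter drifts toward 0.
print(fractional_anisotropy(1.6e-3, 0.4e-3, 0.3e-3))  # approx. 0.75
print(fractional_anisotropy(0.9e-3, 0.8e-3, 0.7e-3))  # approx. 0.12
```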

Relevance: 100.00%

Publisher:

Abstract:

Extracranial application of diffusion-weighted magnetic resonance imaging (MRI) has gained increasing importance in recent years. As a result of technical advances, this non-invasive functional technique has also been applied in head and neck radiology for several clinical indications. In cancer imaging, diffusion-weighted MRI can be performed for tumour detection and characterization, monitoring of treatment response, and differentiation of recurrence from post-therapeutic changes after radiotherapy. Promising results have also been reported recently for lymph node staging. This review article provides an overview of potential applications of diffusion-weighted MRI in the head and neck, with the main focus on its applications in oncology.
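
The quantitative readout underlying most of these applications is the apparent diffusion coefficient (ADC); a minimal two-b-value estimate is sketched below. The b-values and signal intensities are assumptions for illustration only.

```python
import math

def adc_two_point(s_b0, s_b1, b0=0.0, b1=1000.0):
    """ADC (mm^2/s) from signal at two b-values: ADC = ln(S0/S1) / (b1 - b0)."""
    return math.log(s_b0 / s_b1) / (b1 - b0)

# Illustrative signals: restricted diffusion (e.g. densely cellular tumour
# tissue) loses less signal at high b, giving a lower ADC than free water.
print(adc_two_point(1000.0, 400.0))   # ~0.9e-3 mm^2/s
print(adc_two_point(1000.0, 100.0))   # ~2.3e-3 mm^2/s, closer to free water
```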

Relevance: 100.00%

Publisher:

Abstract:

To check the effectiveness of campaigns preventing drug abuse, or to identify local effects of efforts against drug trafficking, it is beneficial to know the consumed amounts of substances at high spatial and temporal resolution. The analysis of drugs of abuse in wastewater (WW) has the potential to provide this information. In this study, the reliability of WW drug consumption estimates is assessed and a novel method is presented to calculate the total uncertainty in observed WW cocaine (COC) and benzoylecgonine (BE) loads. Specifically, uncertainties resulting from discharge measurements, chemical analysis and the applied sampling scheme were addressed and three approaches are presented. These consist of (i) a generic model-based procedure to investigate the influence of the sampling scheme on the uncertainty of observed or expected drug loads, (ii) a comparative analysis of two analytical methods (high-performance liquid chromatography-tandem mass spectrometry and gas chromatography-mass spectrometry), including an extended cross-validation by influent profiling over several days, and (iii) monitoring of COC and BE concentrations in the WW of the largest Swiss sewage treatment plants. In addition, the COC and BE loads observed in the sewage treatment plant of the city of Berne were used to back-calculate COC consumption. The estimated mean daily consumed amount was 107 ± 21 g of pure COC, corresponding to 321 g of street-grade COC.
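
A minimal sketch of the kind of back-calculation involved, scaling a measured benzoylecgonine load by the COC/BE molar-mass ratio and an assumed urinary excretion fraction, is shown below; the BE load, excretion fraction, and purity are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical back-calculation of cocaine consumption from a daily
# benzoylecgonine (BE) load measured in wastewater. All numbers below
# are illustrative assumptions, not the values used in the study.

MW_COC = 303.4             # g/mol, cocaine
MW_BE = 289.3              # g/mol, benzoylecgonine
EXCRETION_FRACTION = 0.45  # assumed fraction of a COC dose excreted as BE

def cocaine_consumed(be_load_g_per_day, purity=0.33):
    """Return (pure COC, street-grade COC) in g/day from a BE load."""
    pure = be_load_g_per_day * (MW_COC / MW_BE) / EXCRETION_FRACTION
    return pure, pure / purity

pure, street = cocaine_consumed(be_load_g_per_day=40.0)
print(f"{pure:.0f} g pure COC/day, {street:.0f} g street-grade")
```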

Relevance: 100.00%

Publisher:

Abstract:

Background: With expanding pediatric antiretroviral therapy (ART) access, children will begin to experience treatment failure and require second-line therapy. We evaluated the probability and determinants of virologic failure and switching in children in South Africa. Methods: Pooled analysis of routine individual data from children who initiated ART in 7 South African treatment programs with 6-monthly viral load and CD4 monitoring produced Kaplan-Meier estimates of the probability of virologic failure (2 consecutive unsuppressed viral loads with the second being >1000 copies/mL, after ≥24 weeks of therapy) and of switch to second-line therapy. Cox proportional hazards models stratified by program were used to determine predictors of these outcomes. Results: The 3-year probability of virologic failure among 5485 children was 19.3% (95% confidence interval: 17.6 to 21.1). Use of nevirapine or ritonavir alone in the initial regimen (compared with efavirenz) and exposure to prevention of mother-to-child transmission regimens were independently associated with failure [adjusted hazard ratios (95% confidence interval): 1.77 (1.11 to 2.83), 2.39 (1.57 to 3.64) and 1.40 (1.02 to 1.92), respectively]. Among 252 children with ≥1 year of follow-up after failure, 38% were switched to second-line therapy. The median (interquartile range) time between failure and switch was 5.7 (2.9-11.0) months. Conclusions: Triple ART based on nevirapine or ritonavir as a single protease inhibitor seems to be associated with a higher risk of virologic failure. A low proportion of virologically failing children were switched.
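
As a sketch of the kind of estimate reported here, the snippet below computes a Kaplan-Meier probability of virologic failure by 3 years from hypothetical per-child follow-up times and failure indicators, using the lifelines package; the data frame and variable names are assumptions, not the pooled cohort data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical per-child records: months of follow-up after ART start and
# whether confirmed virologic failure (2 unsuppressed VLs, 2nd >1000 c/mL)
# occurred; children without failure are censored at the last visit.
df = pd.DataFrame({
    "months": [36, 14, 22, 36, 8, 30, 36, 18, 27, 36],
    "failed": [0,  1,  1,  0,  1, 0,  0,  1,  0,  0],
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["failed"])

surv_36 = kmf.survival_function_at_times(36).iloc[0]
print(f"3-year probability of virologic failure: {1 - surv_36:.1%}")
```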

Relevance: 100.00%

Publisher:

Abstract:

Diagnosis, staging, and treatment monitoring are still suboptimal for most genitourinary tumours. Diffusion-weighted magnetic resonance imaging (DW-MRI) has already shown promise as a noninvasive imaging modality in the early detection of microstructural and functional changes in several pathologies of various organs.

Relevance: 100.00%

Publisher:

Abstract:

Development of new personal mobile and wireless devices for healthcare has become essential due to our aging population, characterized by a constant rise in chronic diseases that require complex treatment and close monitoring. Personal telehealth devices allow patients to receive appropriate treatment, follow up with their doctors, and report any emergency without requiring a caregiver to be present, thus increasing their quality of life in a cost-effective fashion. This paper includes a brief overview of personal telehealth systems, a survey of 100 consecutive ED patients aged >65 years, and introduces "Limmex", a new GSM-based technology packaged in a wristwatch. With the push of a button, Limmex can initiate multiple emergency calls and establish mobile communication between the patient and a preselected person, institution, or a search and rescue service. To the best of our knowledge, Limmex is the first of its kind worldwide.

Relevance: 100.00%

Publisher:

Abstract:

Background: During the Soviet era, malaria was close to eradication in Tajikistan. Since the early 1990s, the disease has been on the rise and has become endemic in large areas of southern and western Tajikistan. The standard national treatment for Plasmodium vivax is based on primaquine. This entails the risk of severe haemolysis for patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency. Seasonal and geographical distribution patterns as well as the frequency of G6PD deficiency were analysed with a view to improving understanding of the current malaria situation in Tajikistan. Methods: Spatial and seasonal distribution was analysed by applying a risk model that included key environmental factors such as temperature and the availability of mosquito breeding sites. The frequency of G6PD deficiency was studied at the health service level, in a cross-sectional sample of 382 adult men. Results: Analysis revealed high rates of malaria transmission in most districts of the southern province of Khatlon, as well as in some zones of the northern province of Sughd. Three categories of risk areas were identified: (i) zones at relatively high malaria risk with high current incidence rates, where malaria control and prevention measures should be taken at all stages of the transmission cycle; (ii) zones at relatively high malaria risk with low current incidence rates, where malaria prevention measures are recommended; and (iii) zones at intermediate or low malaria risk with low current incidence rates, where no particular measures appear necessary. The average prevalence of G6PD deficiency was 2.1%, with apparent differences between ethnic groups and geographical regions. Conclusion: The study clearly indicates that malaria is a serious health issue in specific regions of Tajikistan. Transmission is mainly determined by temperature; consequently, locations at lower altitude are more malaria-prone. The frequency of G6PD deficiency is too moderate to require fundamental changes to the standard national treatment of P. vivax cases.
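
The three-way zoning described in the results can be expressed as a simple decision rule combining modelled risk and current incidence; the thresholds and values below are illustrative assumptions, not the cut-offs of the published risk model.

```python
# Illustrative mapping of a district's modelled malaria risk and current
# incidence onto the three intervention categories described above.
# Threshold values are assumptions for the sketch, not the study's cut-offs.

def zone_category(relative_risk, incidence_per_1000):
    high_risk = relative_risk >= 1.5            # assumed cut-off
    high_incidence = incidence_per_1000 >= 1.0  # assumed cut-off
    if high_risk and high_incidence:
        return "(i) control + prevention at all stages of transmission"
    if high_risk:
        return "(ii) prevention measures recommended"
    return "(iii) no particular measures required"

print(zone_category(2.3, 4.2))   # e.g. a hypothetical lowland district
print(zone_category(1.8, 0.2))
print(zone_category(0.6, 0.1))
```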

Relevance: 100.00%

Publisher:

Abstract:

OBJECT: Glycerol is considered a marker of cell membrane degradation and thus of cellular lysis. Recently, it has become feasible to measure cerebral extracellular fluid (ECF) glycerol concentrations via microdialysis at the patient's bedside. The aim of this study was therefore to investigate the ECF concentration and time course of glycerol after severe traumatic brain injury (TBI) and its relationship to patient outcome and other monitoring parameters. METHODS: Starting as soon as possible after injury and continuing for up to 4 days, 76 severely head-injured patients were monitored using a microdialysis probe (cerebral glycerol) and a Neurotrend sensor (brain tissue PO2) placed in uninjured brain tissue, as confirmed by computerized tomography scanning. The mean brain tissue glycerol concentration in all monitored patients decreased significantly from 206 +/- 31 micromol/L on Day 1 to 9 +/- 3 micromol/L on Day 4 after injury (p < 0.0001). Note, however, that there was no significant difference in the time course between patients with a favorable outcome (Glasgow Outcome Scale [GOS] Scores 4 and 5) and those with an unfavorable outcome (GOS Scores 1-3). Significantly increased glycerol concentrations were observed when brain tissue PO2 was less than 10 mm Hg or when cerebral perfusion pressure was less than 70 mm Hg. CONCLUSIONS: Based on the results of the present study, one can infer that microdialysate glycerol is a marker of severe tissue damage, as seen immediately after brain injury or during profound tissue hypoxia. Because brain tissue glycerol levels do not yet add clinically significant new information, however, routine monitoring of this parameter following traumatic brain injury needs further validation.
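
The monitoring relationship described in the results (glycerol rising when brain tissue PO2 or cerebral perfusion pressure falls below threshold) can be summarised as a simple bedside flag; the thresholds follow the abstract, but the function and sample values are illustrative.

```python
# Flag monitoring epochs in which elevated microdialysate glycerol would be
# expected according to the thresholds reported above (PbtO2 < 10 mm Hg or
# CPP < 70 mm Hg). Sample values are illustrative, not patient data.

def hypoxia_or_hypoperfusion(pbto2_mmhg, cpp_mmhg):
    return pbto2_mmhg < 10 or cpp_mmhg < 70

epochs = [(25, 82), (8, 75), (18, 64)]  # (PbtO2, CPP) per monitoring epoch
for pbto2, cpp in epochs:
    flag = "at risk" if hypoxia_or_hypoperfusion(pbto2, cpp) else "ok"
    print(f"PbtO2={pbto2} mm Hg, CPP={cpp} mm Hg -> {flag}")
```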

Relevance: 100.00%

Publisher:

Abstract:

The most important early pathomechanism in traumatic brain injury (TBI) is alteration of the resting membrane potential. This may be mediated via voltage- or agonist-dependent ion channels (e.g. glutamate-dependent channels) and may result in a consequent increase in metabolism, with increased oxygen consumption, as the ATP-dependent pumps attempt to restore ionic balance. We hypothesize that glutamate is an important agonist in this process and may induce an increase in lactate, potassium and brain tissue CO2, and hence a decrease in brain pH. Further, we propose that an increase in lactate is thus not an indicator of anaerobic metabolic conditions, as has been thought for many years. We therefore analyzed a total of 85 patients with TBI and a Glasgow Coma Scale (GCS) score < 8 using microdialysis together with brain tissue oxygen, CO2 and pH monitoring. Cerebral blood flow (CBF) studies were performed to test the relationship between regional cerebral blood flow (rCBF) and the metabolic determinants. Glutamate was significantly correlated with lactate (p < 0.0001), potassium (p < 0.0001), brain tissue pH (p = 0.0005), and brain tissue CO2 (p = 0.006). rCBF was inversely correlated with glutamate, lactate and potassium. 44% of high lactate values were observed with brain tissue oxygen values above the threshold level for cell damage. These results support the hypothesis of a glutamate-driven increase in metabolism, with secondary traumatic depolarization and possibly hyperglycolysis. Further, we demonstrate evidence for lactate production under aerobic conditions in humans after TBI. Finally, when reduced rCBF is observed, high dialysate glutamate, lactate and potassium values are usually seen, suggesting that ischemia worsens these TBI-induced changes.
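
A minimal sketch of the kind of correlation analysis described here, relating dialysate glutamate to lactate and potassium, is given below using synthetic data and SciPy's Spearman correlation; the arrays are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
glutamate = rng.gamma(shape=2.0, scale=5.0, size=200)         # µmol/L, synthetic
lactate = 1.5 + 0.12 * glutamate + rng.normal(0, 0.5, 200)    # mmol/L, synthetic
potassium = 2.8 + 0.05 * glutamate + rng.normal(0, 0.3, 200)  # mmol/L, synthetic

# Spearman rank correlation and p-value for each pairing.
for name, series in [("lactate", lactate), ("potassium", potassium)]:
    rho, p = spearmanr(glutamate, series)
    print(f"glutamate vs {name}: rho={rho:.2f}, p={p:.2g}")
```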

Relevance: 100.00%

Publisher:

Abstract:

Accurate seasonal to interannual streamflow forecasts based on climate information are critical for optimal management and operation of water resources systems. Because most water supply systems are multipurpose, operating them to meet increasing demand under the growing stresses of climate variability and climate change, population and economic growth, and environmental concerns can be very challenging. This study investigated improvements in water resources systems management through the use of seasonal climate forecasts. Hydrological persistence (streamflow and precipitation) and large-scale recurrent oceanic-atmospheric patterns, such as the El Niño/Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO), the North Atlantic Oscillation (NAO), the Atlantic Multidecadal Oscillation (AMO), the Pacific North American pattern (PNA), and customized sea surface temperature (SST) indices, were investigated for their potential to improve streamflow forecast accuracy and increase forecast lead time in a river basin in central Texas. First, an ordinal polytomous logistic regression approach is proposed as a means of incorporating multiple predictor variables into a probabilistic forecast model. Forecast performance is assessed through a cross-validation procedure, using distributions-oriented metrics, and implications for decision making are discussed. Results indicate that, of the predictors evaluated, only hydrologic persistence and Pacific Ocean sea surface temperature patterns associated with ENSO and PDO provide forecasts which are statistically better than climatology. Second, a class of data mining techniques known as tree-structured models is investigated to address the nonlinear dynamics of climate teleconnections and to screen promising probabilistic streamflow forecast models for river-reservoir systems. Results show that tree-structured models can effectively capture the nonlinear features hidden in the data. Skill scores of probabilistic forecasts generated by both classification trees and logistic regression trees indicate that seasonal inflows throughout the system can be predicted with sufficient accuracy to improve water management, especially in the winter and spring seasons in central Texas. Lastly, a simplified two-stage stochastic economic-optimization model is proposed to investigate improvements in water use efficiency and the potential value of using seasonal forecasts, under the assumption of optimal decision making under uncertainty. Model results demonstrate that incorporating the probabilistic inflow forecasts into the optimization model provides a significant improvement in seasonal water contract benefits over climatology, with lower average deficits (increased reliability) for a given average contract amount, or improved mean contract benefits for a given level of reliability. The results also illustrate the trade-off between the expected contract amount and reliability, i.e., larger contracts can be signed at greater risk.
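
To illustrate one of the approaches mentioned, tree-structured probabilistic forecasts of seasonal inflow categories from climate predictors assessed by cross-validation, here is a compact scikit-learn sketch; the predictor set, tercile targets, and synthetic data are assumptions, not the study's dataset or software.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(42)
n = 300
# Synthetic predictors: prior-season streamflow (persistence), an ENSO index
# and a PDO index. Values are illustrative, standardized anomalies.
X = np.column_stack([
    rng.normal(size=n),   # antecedent streamflow
    rng.normal(size=n),   # ENSO index
    rng.normal(size=n),   # PDO index
])
# Synthetic target: seasonal inflow tercile (0=dry, 1=normal, 2=wet), loosely
# tied to persistence and ENSO so the tree has a signal to find.
score = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.7, size=n)
y = np.digitize(score, np.quantile(score, [1 / 3, 2 / 3]))

# Cross-validated tercile probabilities from a shallow classification tree.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")
print("forecast tercile probabilities for the first 3 seasons:")
print(np.round(proba[:3], 2))
```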

Relevance: 100.00%

Publisher:

Abstract:

Peru is a developing country with abundant fresh water resources, yet the lack of infrastructure leaves much of the population without access to safe water for domestic uses. The author of this report was a Peace Corps Volunteer in the sector of water & sanitation in the district of Independencia, Ica, Peru. Independencia is located in the arid coastal region of the country, receiving on average 15 mm of rain annually. The water source for this district comes from the Pisco River, originating in the Andean highlands and outflowing into the Pacific Ocean near the town of Pisco, Peru. The objectives of this report are to assess the water supply and sanitation practices, model the existing water distribution system, and make recommendations for future expansion of the distribution system in the district of Independencia, Peru. The assessment of water supply will be based on the results from community surveys done in the district of Independencia, water quality testing done by a detachment of the U.S. Navy, as well as on the results of a hydraulic model built in EPANET 2.0 to represent the distribution system. Sanitation practice assessments will be based on the surveys as well as observations from the author while living in Peru. Recommendations for system expansions will be made based on results from the EPANET model and the municipality’s technical report for the existing distribution system. Household water use and sanitation surveys were conducted with 84 families in the district, revealing that upwards of 85% store their domestic water in regularly washed containers with lids. Over 80% of those surveyed are drinking water that is treated, mostly boiled. Of those surveyed, over 95% reported washing their hands and over 60% mentioned at least one critical time for hand washing when asked for specific instances. From the surveys, it was also discovered that over 80% of houses are properly disposing of excrement, in either latrines or septic tanks. There were 43 families interviewed with children five years of age or under, and just over 18% reported the child had a case of diarrhea within the last month at the time of the interview. Finally, from the surveys it was calculated that the average water use per person per day is about 22 liters. Water quality testing carried out by a detachment of the U.S. Navy revealed that the water intended for drinking in the houses surveyed was not suitable for consumption, with a median E. coli most probable number of 47/100 ml for the 61 houses sampled. The median total coliform count was 3,000 colony-forming units per 100 ml. EPANET was used to simulate the water delivery system and evaluate its performance. EPANET is designed for continuous water delivery systems, assuming all pipes are always flowing full. To account for the intermittent nature of the system, multiple EPANET network models were created to simulate how water is routed to the different parts of the system throughout the day. The models were created from interviews with the water technicians and a map of the system created using handheld GPS units. The purpose is to analyze the performance of the water system that services approximately 13,276 people in the district of Independencia, Peru, as well as to provide recommendations for future growth and improvement of the service level. Performance evaluation of the existing system is based on meeting 25 liters per person per day while maintaining positive pressure at all nodes in the network.
The future performance is based on meeting a minimum pressure of 20 psi in the main line, as proposed by Chase (2000). The EPANET model results yield an average nodal pressure for all communities of 71 psi, with a range from 1.3 to 160 psi. Thus, if the current water delivery schedule obtained from the local municipality is followed, all communities should have sufficient pressure to deliver 25 l/p/d, with the exception of Los Rosales, which can only supply 3.25 l/p/d. However, if the line to Los Rosales were increased from one to four inches, the system could supply this community with 25 l/p/d. The district of Independencia could greatly benefit from increasing the service level to 24-hour water delivery and a minimum of 50 l/p/d, so that communities without reliable access due to insufficient pressure would become equal beneficiaries of this invaluable resource. To evaluate the feasibility of this, EPANET was used to model the system with a range of population growth rates, system lifetimes, and demands. In order to meet a minimum pressure of 20 psi in the main line, the 6-inch diameter main line must be increased and approximately two miles of trench must be excavated up to 30 feet deep. The sections of the main line that must be excavated are miles 0-1 and 1.5-2.5, and the first 3.4 miles of the main line must be increased from 6 to 16 inches, contracting to 10 inches for the remaining 5.8 miles. Doing this would allow 24-hour water delivery and provide 50 l/p/d for a range of population growth rates and system lifetimes. It is expected that improving the water delivery service would reduce the morbidity and mortality from diarrheal diseases by decreasing the recontamination of the water due to transport and household storage, as well as by maintaining continuous pressure in the system to prevent infiltration of contaminated groundwater. However, this expansion must be carefully planned so as not to affect aquatic ecosystems or other districts utilizing water from the Pisco River. It is recommended that stream gaging of the Pisco River and precipitation monitoring of the surrounding watershed be initiated in order to begin a hydrological study that would be integrated into the district’s water resource planning. It is also recommended that the district begin routine water quality testing, with the results available to the public.
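
The effect of upsizing the main line can be illustrated with the Hazen-Williams friction-loss formula used in pipe-network hydraulics; the sketch below compares head loss per mile for 6-inch and 16-inch pipe at an assumed design flow. The flow, peaking factor, and roughness coefficient are assumptions for illustration, not values from the report.

```python
# Hazen-Williams head loss (SI form): hf = 10.67 * L * Q**1.852 / (C**1.852 * D**4.87)
# with L in m, Q in m^3/s, D in m. Flow and roughness below are illustrative.

def hazen_williams_headloss(flow_m3s, length_m, diameter_m, c=140):
    return 10.67 * length_m * flow_m3s ** 1.852 / (c ** 1.852 * diameter_m ** 4.87)

MILE_M = 1609.34
INCH_M = 0.0254

# Assumed peak design flow: ~13,276 people at 50 l/p/d with a peaking factor of 3.
flow = 13276 * 50 / 1000 / 86400 * 3   # m^3/s, illustrative
for d_in in (6, 16):
    hf = hazen_williams_headloss(flow, MILE_M, d_in * INCH_M)
    print(f"{d_in}-inch main: ~{hf:.2f} m head loss per mile")
# The 6-inch line loses orders of magnitude more head per mile than the
# 16-inch line, which is why the upstream main must be upsized to keep
# 20 psi at the downstream nodes.
```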

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: The provision of highly active antiretroviral therapy (HAART) in resource-limited settings follows a public health approach, which is characterised by a limited number of regimens and the standardisation of clinical and laboratory monitoring. In industrialized countries, doctors prescribe from the full range of available antiretroviral drugs, supported by resistance testing and frequent laboratory monitoring. We compared virologic response, changes to first-line regimens, and mortality in HIV-infected patients starting HAART in South Africa and Switzerland. METHODS AND FINDINGS: We analysed data from the Swiss HIV Cohort Study and two HAART programmes in townships of Cape Town, South Africa. We included treatment-naïve patients aged 16 y or older who had started treatment with at least three drugs since 2001, and excluded intravenous drug users. Data from a total of 2,348 patients from South Africa and 1,016 patients from the Swiss HIV Cohort Study were analysed. Median baseline CD4+ T cell counts were 80 cells/µl in South Africa and 204 cells/µl in Switzerland. In South Africa, patients started with one of four first-line regimens, which was subsequently changed in 514 patients (22%). In Switzerland, 36 first-line regimens were used initially, and these were changed in 539 patients (53%). In most patients HIV-1 RNA was suppressed to 500 copies/ml or less within one year: 96% (95% confidence interval [CI] 95%-97%) in South Africa and 96% (94%-97%) in Switzerland, and 26% (22%-29%) and 27% (24%-31%), respectively, developed viral rebound within two years. Mortality was higher in South Africa than in Switzerland during the first months of HAART: adjusted hazard ratios were 5.90 (95% CI 1.81-19.2) during months 1-3 and 1.77 (0.90-3.50) during months 4-24. CONCLUSIONS: Compared to the highly individualised approach in Switzerland, programmatic HAART in South Africa resulted in similar virologic outcomes, with relatively few changes to initial regimens. Further innovation and resources are required in South Africa to both achieve more timely access to HAART and improve the prognosis of patients who start HAART with advanced disease.
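
The adjusted hazard ratios quoted here come from Cox proportional hazards regression; a minimal lifelines sketch on made-up data is shown below. The column names and values are assumptions, not the cohort data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical patient-level data: months of follow-up, death indicator,
# cohort (1 = South Africa, 0 = Switzerland) and baseline CD4 count.
df = pd.DataFrame({
    "months":       [3, 24, 12, 24, 6, 24, 18, 24, 2, 24],
    "died":         [1, 0,  1,  0,  1, 0,  0,  0,  1, 0],
    "south_africa": [1, 1,  1,  1,  1, 0,  0,  0,  0, 0],
    "baseline_cd4": [40, 110, 75, 200, 30, 220, 180, 300, 90, 250],
})

# Fit a Cox model; exp(coef) is the adjusted hazard ratio per covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```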

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: The retention of patients in antiretroviral therapy (ART) programmes is an important issue in resource-limited settings. Loss to follow-up can be substantial, but it is unclear what the outcomes are in patients who are lost to programmes. METHODS AND FINDINGS: We searched the PubMed, EMBASE, Latin American and Caribbean Health Sciences Literature (LILACS), Indian Medlars Centre (IndMed) and African Index Medicus (AIM) databases and the abstracts of three conferences for studies that traced patients lost to follow-up to ascertain their vital status. Main outcomes were the proportion of patients traced, the proportion found to be alive and the proportion that had died. Where available, we also examined the reasons why some patients could not be traced, why patients found to be alive did not return to the clinic, and the causes of death. We combined mortality data from several studies using random-effects meta-analysis. Seventeen studies were eligible. All were from sub-Saharan Africa, except one study from India, and none were conducted in children. A total of 6420 patients (range 44 to 1343 patients) were included. Patients were traced using telephone calls, home visits and through social networks. Overall the vital status of 4021 patients could be ascertained (63%, range across studies: 45% to 86%); 1602 patients had died. The combined mortality was 40% (95% confidence interval 33%-48%), with substantial heterogeneity between studies (P<0.0001). Mortality in African programmes ranged from 12% to 87% of patients lost to follow-up. Mortality was inversely associated with the rate of loss to follow-up in the programme: it declined from around 60% to 20% as the percentage of patients lost to the programme increased from 5% to 50%. Among patients not found, telephone numbers and addresses were frequently incorrect or missing. Common reasons for not returning to the clinic were transfer to another programme, financial problems and improving or deteriorating health. Causes of death were available for 47 deaths: 29 (62%) died of an AIDS-defining illness. CONCLUSIONS: In ART programmes in resource-limited settings a substantial minority of adults lost to follow-up cannot be traced, and among those traced 20% to 60% had died. Our findings have implications both for patient care and the monitoring and evaluation of programmes.
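
A compact sketch of DerSimonian-Laird random-effects pooling of study-level mortality proportions, the kind of combination described above, is given below; the study data are invented for illustration and do not reproduce the review's estimates.

```python
import math

# (deaths, patients traced) per study -- invented numbers for illustration.
studies = [(30, 120), (55, 90), (12, 80), (140, 300), (25, 45)]

# Per-study proportion and variance (normal approximation on the raw scale).
p = [d / n for d, n in studies]
v = [pi * (1 - pi) / n for pi, (_, n) in zip(p, studies)]

# DerSimonian-Laird between-study variance tau^2.
w = [1 / vi for vi in v]
p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)

# Random-effects pooled estimate and 95% CI.
w_re = [1 / (vi + tau2) for vi in v]
p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled mortality: {p_re:.1%} (95% CI {p_re - 1.96*se:.1%} to {p_re + 1.96*se:.1%})")
```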

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Elevated lactate and interleukin-6 (IL-6) levels were shown to correlate with mortality and multiple organ dysfunction in severely traumatized patients. The purpose of this study was to test whether an association exists between 24-hour lactate clearance, IL-6 and procalcitonin (PCT) levels, and the development of infectious complications in trauma patients. METHODS: A total of 1757 consecutive trauma patients with an Injury Severity Score (ISS) > 16 admitted over a 10-year period were retrospectively analyzed over a 21-day period. Exclusion criteria included death within 72 h of admission (24.5%), late admission > 12 h after injury (16%), and age < 16 years (0.5%). Data are stated as the median (range). RESULTS: Altogether, 1032 trauma patients (76.2% male) with an average age of 38 years, a median ISS of 29 (16-75), and an Acute Physiology, Age, and Chronic Health Evaluation (APACHE) II score of 14 (0-40) were evaluated. The in-hospital mortality (>3 days) was 10%. Patients with insufficient 24-hour lactate clearance had a high rate of overall mortality and infections. Elevated early serum procalcitonin on days 1 to 5 after trauma was strongly associated with the subsequent development of sepsis (p < 0.01) but not with nonseptic infections. The kinetics of IL-6 were similar to those of PCT but did differentiate between infected and noninfected patients after day 5. CONCLUSIONS: This study demonstrates that elevated early procalcitonin and IL-6 levels and inadequate 24-hour lactate clearance help identify trauma patients who develop septic and nonseptic infectious complications. Definition of specific cutoff values and early monitoring of these parameters may help direct early surgical and antibiotic therapy and reduce infectious mortality.