871 results for "scalable to larger studies"


Relevance:

100.00%

Publisher:

Abstract:

Cape Verde is considered part of Sahelian Africa, where drought and desertification are common occurrences. The main activity of the rural population is rain-fed agriculture, which over time has been increasingly challenged by high temporal and spatial rainfall variability, lack of inputs, limited land area, fragmentation of land, steep slopes, pests, lack of mechanization and loss of topsoil to water erosion. Human activities, largely poor farming practices and deforestation (Gomez, 1989), have accelerated natural erosion processes, shifting the balance between soil erosion and soil formation (Norton, 1987). According to previous studies, vegetation cover is one of the most important factors controlling soil loss (Cyr et al., 1995; Hupy, 2004; Zhang et al., 2004; Zhou et al., 2006). For this reason, reforestation is a cornerstone of Cape Verdean policy to combat desertification. After independence in 1975, the Cape Verde government had pressing and closely entangled environmental and socio-economic issues to address, as long-term desertification had resulted in a lack of soil cover, severe soil erosion and a scarcity of water resources and fuel wood. Across the archipelago, desertification resulted from a variety of processes, including poor farming practices, soil erosion by water and wind, soil and water salinity in coastal areas due to over-pumping and seawater intrusion, drought and unplanned urbanization (DGA-MAAP, 2004). All these issues directly affected socio-economic vulnerability in rural areas, where about 70% of people depended directly or indirectly on agriculture in 1975. By joining the Inter-State Committee for the Fight against Drought in the Sahel in 1975, the government of Cape Verde gained structured support to address these issues more efficiently. Present-day policies and strategies were defined on the basis of rational use of resources and human effort and were incorporated into three successive national plans: the National Development Plan (NDP) (1982–1986), the NDP (1986–1990) and the NDP (1991–1995) (Carvalho

Relevance:

100.00%

Publisher:

Abstract:

The Neolithic was marked by a transition from small and relatively egalitarian groups to much larger groups with increased stratification. But the dynamics of this transition remain poorly understood. It is hard to see how despotism can arise without coercion, yet coercion could not easily have occurred in an egalitarian setting. Using a quantitative model of evolution in a patch-structured population, we demonstrate that the interaction between demographic and ecological factors can overcome this conundrum. We model the coevolution of individual preferences for hierarchy alongside the degree of despotism of leaders and the dispersal preferences of followers. We show that voluntary leadership without coercion can evolve in small groups when leaders help to solve coordination problems related to resource production, for example coordinating the construction of an irrigation system. Our model predicts that the transition to larger despotic groups will then occur when: (i) surplus resources lead to demographic expansion of groups, removing the viability of an acephalous niche in the same area and so locking individuals into hierarchy; and (ii) high dispersal costs limit followers' ability to escape a despot. Empirical evidence suggests that these conditions were probably met, for the first time, during the subsistence intensification of the Neolithic.
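
To make the hinge of the argument concrete, here is a minimal, purely illustrative sketch of the follower's stay-or-leave decision; the payoff terms and parameter values are assumptions for illustration, not the authors' patch-structured simulation.

```python
# Illustrative sketch of the follower-retention logic described above.
# All payoff terms and parameters are assumptions; the published model
# is a full evolutionary simulation in a patch-structured population.

def follower_stays(coordination_benefit: float,
                   despot_share: float,
                   dispersal_cost: float,
                   acephalous_niche_available: bool) -> bool:
    """Return True if staying under the leader beats the outside option."""
    payoff_stay = coordination_benefit * (1.0 - despot_share)
    # Outside option: found or join an acephalous group, if land is available.
    payoff_leave = (1.0 if acephalous_niche_available else 0.0) - dispersal_cost
    return payoff_stay >= payoff_leave

# Small groups, free land, cheap dispersal: a despot cannot retain followers.
print(follower_stays(coordination_benefit=2.0, despot_share=0.6,
                     dispersal_cost=0.1, acephalous_niche_available=True))
# Demographic expansion fills the landscape and dispersal becomes costly:
# even a large despot share is tolerated, "locking in" hierarchy.
print(follower_stays(coordination_benefit=2.0, despot_share=0.6,
                     dispersal_cost=0.8, acephalous_niche_available=False))
```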

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: With the large amount of biological data that is currently publicly available, many investigators combine multiple data sets to increase the sample size and potentially also the power of their analyses. However, technical differences ("batch effects") as well as differences in sample composition between the data sets may significantly affect the ability to draw generalizable conclusions from such studies. FOCUS: The current study focuses on the construction of classifiers and the use of cross-validation to estimate their performance. In particular, we investigate the impact of batch effects, and of differences in sample composition between batches, on the accuracy of the classification performance estimate obtained via cross-validation. This focus on estimation bias is the main difference from previous studies, which have mostly examined predictive performance and how it relates to the presence of batch effects. DATA: We work with simulated data sets. To obtain realistic intensity distributions, we use real gene expression data as the basis for our simulation. Random samples from this expression matrix are selected and assigned to group 1 (e.g., 'control') or group 2 (e.g., 'treated'). We introduce batch effects and select some features to be differentially expressed between the two groups. We consider several scenarios, most importantly different levels of confounding between group and batch. METHODS: We focus on well-known classifiers: logistic regression, Support Vector Machines (SVM), k-nearest neighbors (kNN) and Random Forests (RF). Feature selection is performed with the Wilcoxon test or the lasso. Parameter tuning and feature selection, as well as the estimation of each classifier's prediction performance, are performed within a nested cross-validation scheme. The estimated classification performance is then compared to that obtained when applying the classifier to independent data.
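
A much-reduced sketch of this setup follows; the dimensions, effect sizes and the choice of an SVM (standing in for the full classifier panel) are illustrative assumptions, not the study's parameters. It shows the key structural point: feature selection and tuning live inside the inner loop of the nested cross-validation.

```python
# Sketch: simulate group labels confounded with batch, add a batch effect,
# then estimate classifier accuracy with nested cross-validation.
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, p = 200, 500
y = rng.integers(0, 2, n)                    # group 1 vs group 2
# Batch partially confounded with group: 80% of 'treated' land in batch 1.
batch = np.where(rng.random(n) < np.where(y == 1, 0.8, 0.2), 1, 0)
X = rng.normal(size=(n, p))
X[y == 1, :10] += 1.0                        # 10 truly differential features
X[batch == 1] += rng.normal(0.5, 0.1, p)     # additive per-feature batch effect

# Nested CV: selection and tuning happen INSIDE the inner loop, so the
# outer accuracy estimate is not biased by information leakage.
pipe = Pipeline([("select", SelectKBest(f_classif, k=20)),
                 ("clf", SVC())])
inner = GridSearchCV(pipe, {"clf__C": [0.1, 1, 10]},
                     cv=StratifiedKFold(5, shuffle=True, random_state=1))
outer_scores = cross_val_score(inner, X, y,
                               cv=StratifiedKFold(5, shuffle=True, random_state=2))
print(f"nested-CV accuracy: {outer_scores.mean():.2f}")
# With confounding, this estimate can still diverge from accuracy on an
# independent, batch-free test set; that divergence is the bias at issue.
```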

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Barbiturate-induced coma can be used in patients to treat intractable intracranial hypertension when other therapies, such as osmotic therapy and sedation, have failed. Despite control of intracranial pressure, cerebral infarction may still occur in some patients, and the effect of barbiturates on outcome remains uncertain. In this study, we examined the relationship between barbiturate infusion and brain tissue oxygen (PbtO2). METHODS: Ten volume-resuscitated brain-injured patients who were treated with pentobarbital infusion for intracranial hypertension and underwent PbtO2 monitoring were studied in a neurosurgical intensive care unit at a university-based Level I trauma center. PbtO2, intracranial pressure (ICP), mean arterial pressure, cerebral perfusion pressure (CPP), and brain temperature were continuously monitored and compared in settings in which barbiturates were or were not administered. RESULTS: Data were available from 1595 hours of PbtO2 monitoring. When pentobarbital administration began, the mean ICP, CPP, and PbtO2 were 18 +/- 10, 72 +/- 18, and 28 +/- 12 mm Hg, respectively. During the 3 hours before barbiturate infusion, the maximum ICP was 24 +/- 13 mm Hg and the minimum CPP was 65 +/- 20 mm Hg. In the majority of patients (70%), we observed an increase in PbtO2 associated with pentobarbital infusion. Within this group, logistic regression analysis demonstrated that a higher likelihood of compromised brain oxygen (PbtO2 < 20 mm Hg) was associated with a decrease in pentobarbital dose after controlling for ICP and other physiological parameters (P < 0.001). In the remaining 3 patients, pentobarbital was associated with lower PbtO2 levels. These patients had higher ICP, lower CPP, and later initiation of barbiturates compared with patients whose PbtO2 increased. CONCLUSION: Our preliminary findings suggest that pentobarbital administered for intractable intracranial hypertension is associated with a significant and independent increase in PbtO2 in the majority of patients. However, in some patients with more compromised brain physiology, pentobarbital may have a negative effect on PbtO2, particularly if administered late. Larger studies are needed to examine the relationship between barbiturates and cerebral oxygenation in brain-injured patients with refractory intracranial hypertension and to determine whether PbtO2 responses can help guide therapy.
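
The kind of model described in the results can be sketched as follows, on synthetic data; the variable names, units and effect directions are illustrative assumptions rather than the study's records. Note that a pooled logistic fit like this ignores within-patient correlation across repeated hourly observations, which a real analysis of monitoring data would need to address.

```python
# Illustrative sketch: probability of compromised brain oxygen
# (PbtO2 < 20 mm Hg) as a function of pentobarbital dose, controlling
# for ICP and CPP. Entirely synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500                                   # synthetic hourly observations
dose = rng.uniform(0, 5, n)               # pentobarbital dose (assumed units)
icp = rng.normal(18, 8, n)                # intracranial pressure, mm Hg
cpp = rng.normal(70, 15, n)               # cerebral perfusion pressure, mm Hg
# Assumed direction from the abstract: lower dose -> higher odds of low PbtO2.
logit = 0.5 - 0.6 * dose + 0.05 * icp - 0.02 * cpp
low_pbto2 = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([dose, icp, cpp]))
fit = sm.Logit(low_pbto2.astype(float), X).fit(disp=0)
print(fit.summary(xname=["const", "dose", "ICP", "CPP"]))
```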

Relevance:

100.00%

Publisher:

Abstract:

Lung transplantation is an established therapy for end-stage pulmonary disorders in selected patients without significant comorbidities. The particular constraints associated with organ transplantation from deceased donors involve specific allocation rules in order to optimise the medical efficacy of the procedure. Comparison of different policies adopted by national transplant agencies reveals that an optimal and unique allocation system is an elusive goal, and that practical, geographical and logistic parameters must be taken into account. A solution to attenuate the imbalance between the number of lung transplant candidates and the limited availability of organs is to consider marginal donors. In particular, assessment and restoration of gas exchange capacity ex vivo in explanted lungs is a new and promising approach that some lung transplant programmes have started to apply in clinical practice. Chronic lung allograft dysfunction, and especially bronchiolitis obliterans, remains the major medium- and long-term problem in lung transplantation with a major impact on survival. Although there is to date no cure for established bronchiolitis obliterans, new preventive strategies have the potential to limit the burden of this feared complication. Unfortunately, randomised prospective studies are infrequent in the field of lung transplantation, and data obtained from larger studies involving kidney or liver recipients are not always relevant for this purpose.

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. METHODS: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). RESULTS: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (80%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. Nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 of 19 (74%)) had a favorable outcome (CPC 1 or 2). CONCLUSIONS: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA. These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
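
The interval arithmetic in the abstract can be reproduced with exact (Clopper-Pearson) binomial confidence limits. The study's own CI method is not stated, but these bounds match the reported 74 to 100% and 0 to 18%:

```python
# Exact (Clopper-Pearson) 95% CIs for a PPV of 12/12 and a
# false-positive rate of 0/19, as quoted in the abstract.
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact two-sided CI for a binomial proportion k/n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# PPV for mortality: all 12 patients with a nonreactive background died.
print(clopper_pearson(12, 12))   # ~(0.74, 1.00) -> "74 to 100%"
# False positives: 0 of 19 survivors had a nonreactive background.
print(clopper_pearson(0, 19))    # ~(0.00, 0.18) -> "0 to 18%"
```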

Relevance:

100.00%

Publisher:

Abstract:

We investigated respiratory responses during film clip viewing and their relation to the affective dimensions of valence and arousal. Seventy-six subjects participated in a study using a between-groups design. All participants first viewed an emotionally neutral film clip. They were then presented with one of four emotional film clips: a positive high-arousal, a positive low-arousal, a negative high-arousal or a negative low-arousal clip. Respiration, skin conductance level, heart rate, corrugator activity and affective judgments were measured. Expiratory time was shorter, and inspiratory duty cycle, mean expiratory flow and minute ventilation were larger, during the high-arousal clips than during the low-arousal clips. The pleasantness of the stimuli had no influence on any respiratory measure. These findings confirm the importance of arousal in respiratory responding but also reveal differences from previous studies using visual and auditory stimuli.
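
For reference, the respiratory measures named here have standard breath-by-breath definitions. The sketch below uses made-up values; the study derived the measures from recorded respiration traces.

```python
# Standard definitions of the respiratory measures named above, computed
# from one breath's inspiratory time (Ti), expiratory time (Te) and tidal
# volume (Vt). The numeric inputs are invented for illustration.

def respiratory_measures(ti_s: float, te_s: float, vt_l: float) -> dict:
    ttot = ti_s + te_s                            # total breath duration
    return {
        "expiratory_time_s": te_s,
        "inspiratory_duty_cycle": ti_s / ttot,    # Ti / Ttot
        "mean_expiratory_flow_l_s": vt_l / te_s,  # Vt / Te
        "minute_ventilation_l_min": vt_l * 60.0 / ttot,  # Vt * breaths/min
    }

# A high-arousal pattern per the abstract: shorter Te raises the duty
# cycle, the mean expiratory flow and the minute ventilation.
print(respiratory_measures(ti_s=1.5, te_s=2.0, vt_l=0.55))
print(respiratory_measures(ti_s=1.5, te_s=3.0, vt_l=0.55))
```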

Relevance:

100.00%

Publisher:

Abstract:

Delta(9)-Tetrahydrocannabinol (THC) is frequently found in the blood of drivers suspected of driving under the influence of cannabis or involved in traffic crashes. The present study used a double-blind crossover design to compare the effects of medium (16.5 mg THC) and high doses (45.7 mg THC) of hemp milk decoctions, or of a medium dose of dronabinol (20 mg synthetic THC, Marinol), on several skills required for safe driving. Forensic interpretation of cannabinoid blood concentrations was attempted using the models proposed by Daldrup (the cannabis influence factor, or CIF) and by Huestis and coworkers. First, the time-concentration profiles of THC, 11-hydroxy-Delta(9)-tetrahydrocannabinol (11-OH-THC, the active metabolite of THC) and 11-nor-9-carboxy-Delta(9)-tetrahydrocannabinol (THCCOOH) in whole blood were determined by gas chromatography-mass spectrometry with negative ion chemical ionization. Compared to smoking studies, relatively low concentrations were measured in blood. The highest mean THC concentration (8.4 ng/mL) was reached 1 h after ingestion of the strongest decoction. The mean maximum 11-OH-THC level (12.3 ng/mL) slightly exceeded that of THC. THCCOOH reached its highest mean concentration (66.2 ng/mL) 2.5-5.5 h after intake. Individual blood levels showed considerable intersubject variability. Willingness to drive was influenced by the importance of the requested task: under significant cannabinoid influence, participants refused to drive when asked whether they would agree to carry out unimportant tasks (e.g., driving a friend to a party). Most participants reported a significant feeling of intoxication and did not enjoy the effects, notably those felt after drinking the strongest decoction. Road sign and tracking tests revealed obvious and statistically significant differences between placebo and treatments, with marked impairment after ingestion of the strongest decoction. A CIF value greater than 10, which relies on the molar ratio of the main active to inactive cannabinoids, was found to correlate with a strong feeling of intoxication; it also matched a significant decrease in willingness to drive and a significant impairment in tracking performance. Model II proposed by Huestis et al. (1992) provided at best a rough estimate of the time of oral administration, with 27% of actual values falling outside the 95% confidence interval. The sum of THC and 11-OH-THC blood concentrations provided a better estimate of impairment than THC alone. This controlled clinical study demonstrates the negative influence of medium and high oral doses of THC or dronabinol on fitness to drive.
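
A sketch of the molar active-to-inactive ratio underlying the CIF follows. The abstract does not give Daldrup's exact scaling or which conjugated metabolites enter the denominator, so only the plain molar ratio is computed here, using standard molecular weights.

```python
# Molar active/inactive cannabinoid ratio, the quantity the CIF relies on.
# Molecular weights (g/mol) are standard values: THC 314.5, 11-OH-THC 330.5,
# THCCOOH 344.4. The full CIF definition (scaling, glucuronide terms) is
# not stated in the abstract and is omitted here.

MW = {"THC": 314.5, "OH_THC": 330.5, "THCCOOH": 344.4}

def active_inactive_molar_ratio(thc_ng_ml: float, oh_thc_ng_ml: float,
                                thccooh_ng_ml: float) -> float:
    active = thc_ng_ml / MW["THC"] + oh_thc_ng_ml / MW["OH_THC"]
    inactive = thccooh_ng_ml / MW["THCCOOH"]
    return active / inactive

# Peak mean concentrations reported above for the strongest decoction.
# Note the peaks occur at different times, so this is illustrative only.
print(active_inactive_molar_ratio(8.4, 12.3, 66.2))  # ~0.33
```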

Relevance:

100.00%

Publisher:

Abstract:

Background: In recent years, planaria have emerged as an important model system for research into stem cells and regeneration. Attention is focused on their unique stem cells, the neoblasts, which can differentiate into any cell type present in the adult organism. Sequencing of the Schmidtea mediterranea genome and some expressed sequence tag projects have generated extensive data on the genetic profile of these cells. However, little information is available on their protein dynamics. Results: We developed a proteomic strategy to identify neoblast-specific proteins. Here we describe the method and discuss the results in comparison to the genomic high-throughput analyses carried out in planaria and to proteomic studies using other stem cell systems. We also show functional data for some of the candidate genes selected in our proteomic approach. Conclusions: We have developed an accurate and reliable mass-spectrometry-based proteomics approach to complement previous genomic studies and to achieve a more accurate understanding and description of the molecular and cellular processes related to the neoblasts.

Relevance:

100.00%

Publisher:

Abstract:

Coma after cardiac arrest (CA) is an important cause of admission to the ICU. Prognosis of post-CA coma has significantly improved over the past decade, particularly because of aggressive postresuscitation care and the use of therapeutic targeted temperature management (TTM). TTM and the sedatives used to maintain controlled cooling might delay neurologic reflexes and reduce the accuracy of clinical examination. In the early ICU phase, patients who will make a good recovery may often be indistinguishable, on neurologic examination alone, from those who will eventually have a poor prognosis. Prognostication of post-CA coma has therefore evolved toward a multimodal approach that combines neurologic examination with EEG and evoked potentials. Blood biomarkers (e.g., neuron-specific enolase [NSE] and S100-β protein) are useful complements for coma prognostication; however, results vary among commercial laboratory assays, and applying a single cutoff level (e.g., > 33 μg/L for NSE) for poor prognostication is not recommended. Neuroimaging, mainly diffusion MRI, is emerging as a promising tool for prognostication, but its precise role needs further study before it can be widely used. This multimodal approach might reduce false-positive rates of poor prognosis, thereby providing optimal prognostication of comatose CA survivors. The aim of this review is to summarize studies and the principal tools presently available for outcome prediction and to describe a practical approach to the multimodal prognostication of coma after CA, with a particular focus on neuromonitoring tools. We also propose an algorithm for the optimal use of such multimodal tools during the early ICU phase of post-CA coma.
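
A deliberately simplified sketch of the multimodal principle follows; the chosen markers and the two-marker concordance rule are illustrative assumptions, not the authors' clinical algorithm.

```python
# Simplified illustration of multimodal prognostication: no single test
# decides, and poor outcome is predicted only when independent modalities
# agree. Markers and the >= 2 concordance rule are assumptions made for
# this sketch, not a clinical decision rule.

def multimodal_prognosis(eeg_background_reactive: bool,
                         epileptiform_activity: bool,
                         ssep_n20_bilaterally_absent: bool,
                         nse_rising: bool) -> str:
    poor_markers = sum([not eeg_background_reactive,
                        epileptiform_activity,
                        ssep_n20_bilaterally_absent,
                        nse_rising])
    if poor_markers >= 2:
        return "poor prognosis likely (concordant findings)"
    if poor_markers == 0:
        return "no poor-prognosis markers; continue observation"
    return "isolated/discordant finding; defer and repeat testing"

print(multimodal_prognosis(eeg_background_reactive=False,
                           epileptiform_activity=True,
                           ssep_n20_bilaterally_absent=False,
                           nse_rising=False))
```

Using a rising NSE trend rather than a fixed cutoff mirrors the review's caution that single cutoff levels vary across commercial assays.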

Relevance:

100.00%

Publisher:

Abstract:

We investigated the contribution of postictal memory testing to lateralizing the epileptic focus and predicting memory outcome after surgery for temporal lobe epilepsy (TLE). Forty-five patients with TLE underwent interictal, postictal and postoperative assessment of verbal and nonverbal memory. Surgery consisted of anterior temporal lobectomy (36), selective isolated amygdalohippocampectomy (6) or amygdalohippocampectomy coupled to lesionectomy (3). Postictal and postoperative, but not interictal, verbal memory scores were significantly lower in left TLE than in right TLE. Nonverbal memory showed no significant difference between left and right TLE in any condition. Postictal memory was significantly correlated with postoperative memory, but the effect disappeared when the lateralization of the focus was taken into account. Postictal verbal memory is a useful bedside tool that can help lateralize the epileptic focus. Larger studies are needed to further estimate its value in predicting postoperative outcome.
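
The control analysis has a simple statistical form: regress postoperative on postictal memory with focus laterality as a covariate, and check whether the postictal effect survives. A sketch on synthetic data, constructed (as the study found) so that laterality drives both scores:

```python
# Sketch of the adjustment analysis described above. Synthetic data;
# the group sizes, score scales and effect sizes are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 45
left_tle = rng.integers(0, 2, n)                 # 1 = left-sided focus
# Laterality lowers both scores (verbal memory worse in left TLE).
postictal = 50 - 8 * left_tle + rng.normal(0, 5, n)
postop = 48 - 7 * left_tle + rng.normal(0, 5, n)

print(np.corrcoef(postictal, postop)[0, 1])      # raw correlation: nonzero
X = sm.add_constant(np.column_stack([postictal, left_tle]))
fit = sm.OLS(postop, X).fit()
print(fit.pvalues[1])  # postictal effect after controlling for laterality
```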

Relevance:

100.00%

Publisher:

Abstract:

Treatment of colonic diverticular disease has evolved in recent years. Most episodes are simple and can be successfully treated with antibiotics alone. For complicated diverticulitis, a strong trend is developing toward less invasive therapies, including interventional radiology and laparoscopic lavage, in an effort to avoid the morbidity and discomfort of a diverting colostomy. Based on a better understanding of the natural history of the disease, the indication for prophylactic colectomy after a few episodes of simple diverticulitis has been seriously challenged. For those patients who need a colectomy, single-port laparoscopy, NOTES and transanal specimen extraction are being proposed. However, larger studies are needed to confirm the hypothetical advantages of these evolving techniques.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Plasma adiponectin is strongly associated with various components of the metabolic syndrome, type 2 diabetes and cardiovascular outcomes. Concentrations are highly heritable and differ between men and women. We therefore aimed to investigate the genetics of plasma adiponectin in men and women. METHODS: We combined genome-wide association scans of three population-based studies including 4659 persons. For the replication stage in 13795 subjects, we selected the 20 top signals of the combined analysis, as well as the 10 top signals with p-values less than 1.0 x 10(-4) from each of the men- and women-specific analyses. We further selected 73 SNPs that were consistently associated with metabolic syndrome parameters in previous genome-wide association studies to check for their association with plasma adiponectin. RESULTS: The ADIPOQ locus showed genome-wide significant p-values in the combined (p=4.3 x 10(-24)) as well as in both the women- and men-specific analyses (p=8.7 x 10(-17) and p=2.5 x 10(-11), respectively). None of the other 39 top-signal SNPs showed evidence for association in the replication analysis. None of the 73 SNPs from metabolic syndrome loci exhibited association with plasma adiponectin (p>0.01). CONCLUSIONS: We identified the ADIPOQ gene as the only major gene for plasma adiponectin, explaining 6.7% of the phenotypic variance. We further found that neither this gene nor any of the metabolic syndrome loci explained the sex differences observed for plasma adiponectin. Larger studies are needed to identify more moderate genetic determinants of plasma adiponectin.
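
The "variance explained" figure has a standard computation: the squared correlation between genotype dosage at the locus and the phenotype. A sketch on synthetic data, with an assumed allele frequency and an effect size chosen to land near the quoted 6.7%:

```python
# Variance explained by a single biallelic SNP, as R^2 between genotype
# dosage and phenotype. Allele frequency and effect size are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, maf, beta_snp = 10_000, 0.3, 0.41
dosage = rng.binomial(2, maf, n)             # 0/1/2 copies of effect allele
adiponectin = beta_snp * dosage + rng.normal(0, 1, n)

# R^2 = squared correlation = share of phenotypic variance explained.
r2 = np.corrcoef(dosage, adiponectin)[0, 1] ** 2
# Expectation under this model: beta^2 * 2*maf*(1-maf) / Var(phenotype).
genetic_var = beta_snp**2 * 2 * maf * (1 - maf)
expected = genetic_var / (genetic_var + 1.0)   # residual variance is 1
print(r2, expected)                            # both ~0.066, i.e. ~6.6%
```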

Relevance:

100.00%

Publisher:

Abstract:

The rate of food consumption is a major factor affecting success in scramble competition for a limited amount of easy-to-find food. Accordingly, several studies report positive genetic correlations between larval competitive ability and feeding rate in Drosophila; both become enhanced in populations evolving under larval crowding. Here, we report the experimental evolution of enhanced competitive ability in populations of D. melanogaster previously maintained for 84 generations at low density on an extremely poor larval food. In contrast to previous studies, greater competitive ability was not associated with the evolution of higher feeding rate; if anything, the correlation between the two traits across lines tended to be negative. Thus, enhanced competitive ability may be favored by nutritional stress even when competition is not intense, and competitive ability may be decoupled from the rate of food consumption.