96 results for bottom-up effect
Abstract:
The present study examined the bottom-up influence of emotional context on response inhibition, an issue that remains largely unstudied in children. Sixty-two participants aged 6 to 13 years were assessed with three stop-signal tasks: one with circles, one with neutral faces, and one with emotional faces (happy and sad). Results showed that emotional context altered response inhibition in childhood; however, no interaction between age and emotional influence on response inhibition was found. Positive emotions were recognized faster than negative emotions, but valence had no significant influence on response inhibition abilities.
Abstract:
Since the advent of high-throughput DNA sequencing technologies, the ever-increasing rate at which genomes are published has created new challenges, notably at the level of genome annotation. Even though gene predictors and annotation software are increasingly efficient, the ultimate validation still lies in observing the predicted gene product(s). Mass spectrometry-based proteomics provides the high-throughput technology needed to obtain evidence of protein presence and, from the identified sequences, to confirm or invalidate predicted annotations. Here we review the different strategies used to perform an MS-based proteogenomics experiment with a bottom-up approach. We start from the strengths and weaknesses of the different database construction strategies, based on different kinds of genomic information (whole genome, ORF, cDNA, EST or RNA-Seq data), which are then used for matching mass spectra to peptides and proteins. We also review the important points to consider for a correct statistical assessment of the peptide identifications. Finally, we provide references for tools used to map and visualize the peptide identifications back to the original genomic information.
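To make the database-construction step concrete, here is a minimal Python sketch, assuming Biopython is available, of the whole-genome strategy: translating a DNA sequence in all six reading frames and keeping stop-to-stop stretches long enough to yield identifiable peptides. The sequence and length threshold are illustrative placeholders, not values from the review.

```python
# A minimal sketch, assuming Biopython, of the whole-genome database strategy:
# six-frame translation, keeping stop-to-stop stretches as candidate entries.
from Bio.Seq import Seq

def six_frame_entries(dna, min_len=20):
    """Yield putative protein sequences from all six reading frames."""
    seq = Seq(dna)
    for strand in (seq, seq.reverse_complement()):
        for frame in range(3):
            sub = strand[frame:]
            sub = sub[: len(sub) - len(sub) % 3]  # keep whole codons only
            for stretch in str(sub.translate()).split("*"):
                if len(stretch) >= min_len:  # long enough to yield peptides
                    yield stretch

genome_dna = "ATGGCTGCTAAAGGTACTGAAGCTTTCCGTGGT" * 3  # placeholder sequence
database = list(six_frame_entries(genome_dna))
print(len(database), "candidate database entries")
```

Each retained stretch would then be digested in silico and indexed so that experimental spectra can be matched back to genomic coordinates.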
Abstract:
For some time now, public authorities have been advocating cooperation between higher education institutions. While the motivations for launching a cooperation are certainly diverse, notions of portfolio consolidation and efficiency appear to predominate. Against this background, the present work aims to provide an overview of cooperation between Swiss higher education institutions, by compiling an inventory of existing cooperations and discussing the factors that influence their long-term continuation. The main results of our study reveal a high density of cooperations characterized by a great diversity of forms. For a cooperation project to succeed, the partners must develop a shared scientific and institutional vision, maintain mutual trust, and continue to see added value in the project. Making a cooperation permanent presupposes its integration into the strategy and regular structures of the institution. The sine qua non condition for achieving this is the interest of the institutions concerned, which can only emerge from an autonomous, bottom-up process.
Abstract:
The pharmacokinetics (PK) of efavirenz (EFV) is characterized by marked interpatient variability that correlates with its pharmacodynamics (PD). In vitro-in vivo extrapolation (IVIVE) is a "bottom-up" approach that combines drug data with system information to predict PK and PD. The aim of this study was to simulate EFV PK and PD after dose reductions. At the standard dose, the simulated probability was 80% for viral suppression and 28% for central nervous system (CNS) toxicity. After a dose reduction to 400 mg, the probabilities of viral suppression were reduced to 69, 75, and 82%, and those of CNS toxicity were 21, 24, and 29% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. With reduction of the dose to 200 mg, the probabilities of viral suppression decreased to 54, 62, and 72% and those of CNS toxicity decreased to 13, 18, and 20% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. These findings indicate how dose reductions might be applied in patients with favorable genetic characteristics.
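As a toy illustration of the kind of "bottom-up" building block an IVIVE framework assembles, the following Python sketch simulates a one-compartment oral PK profile with genotype-dependent clearance. All parameter values (clearances, volume, absorption rate, bioavailability) are invented for illustration and are not the study's inputs or outputs.

```python
# A hypothetical one-compartment oral PK sketch; the genotype-specific
# clearances and all other parameters are illustrative placeholders.
import numpy as np

def conc_profile(dose_mg, cl_l_h, v_l=250.0, ka=0.6, f=0.6, t_end=24.0, n=97):
    """Concentration-time curve for first-order absorption and elimination."""
    t = np.linspace(0.0, t_end, n)
    ke = cl_l_h / v_l
    c = (f * dose_mg * ka) / (v_l * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))
    return t, c

# Hypothetically, slower metabolizers (516TT) retain higher trough levels,
# which is why a reduced dose may still achieve viral suppression.
for genotype, cl in {"516GG": 12.0, "516GT": 9.0, "516TT": 5.0}.items():
    t, c = conc_profile(dose_mg=400, cl_l_h=cl)
    print(f"{genotype}: C24h ~ {c[-1]:.2f} mg/L")
```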
Abstract:
In the vast majority of bottom-up proteomics studies, protein digestion is performed using only mammalian trypsin. Although it is clearly the best enzyme available, the sole use of trypsin rarely yields complete sequence coverage, even for abundant proteins. It is commonly assumed that this is because many tryptic peptides are either too short or too long to be identified by RPLC-MS/MS. We show through in silico analysis that 20-30% of the total sequence of three proteomes (Schizosaccharomyces pombe, Saccharomyces cerevisiae, and Homo sapiens) is expected to be covered by Large post-Trypsin Peptides (LpTPs) with Mr above 3000 Da. We then established a size-exclusion chromatography method to fractionate complex yeast tryptic digests into pools of peptides based on size. We found that secondary digestion of LpTPs followed by LC-MS/MS analysis leads to a significant increase in identified proteins and a 32-50% relative increase in average sequence coverage compared with trypsin digestion alone. Applying the developed strategy to the phosphoproteomes of S. pombe and a human cell line identified a significant fraction of novel phosphosites. Overall, our data indicate that specifically targeting LpTPs can complement standard bottom-up workflows to reveal a largely neglected portion of the proteome.
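The in silico step described above can be reproduced in outline: digest a protein with trypsin's cleavage rule (C-terminal to K/R, not before proline) and measure the fraction of sequence falling in peptides above 3000 Da. The sketch below uses standard average residue masses and a made-up demo sequence.

```python
# Sketch of in silico tryptic digestion and LpTP-fraction estimation.
import re

# Average amino acid residue masses (Da); peptide mass = residues + water.
RES = {"G": 57.05, "A": 71.08, "S": 87.08, "P": 97.12, "V": 99.13,
       "T": 101.10, "C": 103.14, "L": 113.16, "I": 113.16, "N": 114.10,
       "D": 115.09, "Q": 128.13, "K": 128.17, "E": 129.12, "M": 131.19,
       "H": 137.14, "F": 147.18, "R": 156.19, "Y": 163.18, "W": 186.21}

def tryptic_peptides(protein):
    """Cleave C-terminal to K/R unless the next residue is proline."""
    return [p for p in re.split(r"(?<=[KR])(?!P)", protein) if p]

def lptp_fraction(protein, cutoff=3000.0):
    """Fraction of the sequence carried by peptides above the mass cutoff."""
    mass = lambda p: sum(RES[a] for a in p) + 18.02  # add one water
    large = sum(len(p) for p in tryptic_peptides(protein) if mass(p) > cutoff)
    return large / len(protein)

demo = "MKWVTFISLLFLFSSAYSRGVFRRDAHKSEVAHRFKDLGEENFKALVLIAFAQYLQQCPF"
print(f"LpTP fraction of demo sequence: {lptp_fraction(demo):.2f}")
```

Run over a whole proteome FASTA, this is the computation behind the 20-30% figure quoted above.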
Abstract:
BACKGROUND: Among the many definitions of frailty, the frailty phenotype defined by Fried et al. is one of the few constructs that has been repeatedly validated: first in the Cardiovascular Health Study (CHS) and subsequently in other large North American cohorts. In Europe, the Survey of Health, Aging and Retirement in Europe (SHARE) is a gold mine of individual, economic and health information that can provide insight into frailty across diverse population settings. A recent adaptation of the original five CHS frailty criteria was proposed to make use of SHARE data and measure frailty in the European population. To test the validity of the SHARE-operationalized frailty phenotype, this study evaluates its prospective association with adverse health outcomes. METHODS: Data are from 11,015 community-dwelling men and women aged 60+ participating in waves 1 and 2 of SHARE, a population-based survey. Multivariate logistic regression analyses were used to assess the 2-year follow-up effect of the SHARE-operationalized frailty phenotype on the incidence of disability (among those disability-free at baseline) and on worsening disability and morbidity, adjusting for age, sex, income, and baseline morbidity and disability. RESULTS: At 2-year follow-up, frail individuals were at increased risk of developing mobility (OR 3.07, 95% CI 1.02-9.36), IADL (OR 5.52, 95% CI 3.76-8.10) and BADL (OR 5.13, 95% CI 3.53-7.44) disability; of worsening mobility (OR 2.94, 95% CI 2.19-3.93), IADL (OR 4.43, 95% CI 3.19-6.15) and BADL (OR 4.53, 95% CI 3.14-6.54) disability; and of worsening morbidity (OR 1.77, 95% CI 1.35-2.32). These associations were also significant among the prefrail, though with a lower magnitude of effect. CONCLUSIONS: The SHARE-operationalized frailty phenotype is significantly associated with all tested health outcomes, independent of baseline morbidity and disability, in community-dwelling men and women aged 60 and older living in Europe. The robustness of these results validates the use of this phenotype in the SHARE survey for future research on frailty in Europe.
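The adjusted odds ratios above come from multivariate logistic regression; a minimal sketch of that kind of model, with synthetic data and hypothetical variable names standing in for the SHARE dataset, might look as follows (using statsmodels).

```python
# Hedged sketch: adjusted logistic regression with ORs and 95% CIs.
# The data frame and variable names are invented, not the SHARE data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "iadl_disability": rng.integers(0, 2, n),               # outcome at follow-up
    "frailty": rng.choice(["robust", "prefrail", "frail"], n),
    "age": rng.normal(70, 7, n),
    "sex": rng.choice(["m", "f"], n),
    "income": rng.normal(50, 15, n),
    "baseline_morbidity": rng.integers(0, 5, n),
})

model = smf.logit(
    "iadl_disability ~ C(frailty, Treatment('robust')) + age + sex"
    " + income + baseline_morbidity", data=df).fit(disp=False)

# Exponentiated coefficients give ORs with 95% CIs, as reported above.
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```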
Abstract:
BACKGROUND: In 2005, the first "cost of disorders of the brain in Europe" study of the European Brain Council (EBC) showed that these costs place a substantial economic burden on Swiss society. In 2010, an improved update covering a broader range of disorders was analysed. This report presents the new findings for Switzerland and discusses the changes. METHODS: Data are derived from the EBC 2010 census study, which estimates the 12-month prevalence of 12 groups of disorders of the brain and calculates costs (direct health-care costs, direct non-medical costs and indirect costs) by combining top-down and bottom-up cost approaches using existing data. RESULTS: The most frequent disorder was headache (2.3 million persons). Anxiety disorders were found in 1 million persons and sleep disorders in 700,000 persons. Annual costs for all assessed disorders total 14.5 billion EUR, corresponding to about 1,900 EUR per inhabitant per year. Mood disorders, psychotic disorders and dementias (approx. 2 billion EUR each) were the most costly. Costs per person were highest for neurological/neurosurgery-relevant disorders, e.g. neuromuscular disorders, brain tumour and multiple sclerosis (38,000 to 24,000 EUR). CONCLUSION: The estimates of the EBC 2010 study for Switzerland provide a basis for health-care planning. The increases in size and costs compared with 2005 are mostly due to the inclusion of new disorders (e.g., sleep disorders), the re-definition of others (e.g., headache), and an increase in younger cohorts. We suggest research and preventive measures coordinated between governmental bodies, private health care and pharmaceutical companies.
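A quick arithmetic check of the per-inhabitant figure, assuming a Swiss population of roughly 7.8 million in 2010 (an external approximation, not a number from the study):

```python
# Consistency check: total cost divided by approximate 2010 population.
total_cost_eur = 14.5e9
population = 7.8e6  # rough 2010 figure, an assumption for this check
print(f"{total_cost_eur / population:.0f} EUR per inhabitant")  # ~1859, i.e. about 1,900
```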
Abstract:
Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins: multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is used for routine high-throughput protein identification and quantitation from complex biological samples: proteins are first enzymatically digested into small peptides (usually less than ca. 3 kDa), which are then identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, proteins are digested into long (3-15 kDa) peptides via restricted proteolysis, followed by MS/MS analysis of the resulting digest. Advances in high-resolution MS and allied techniques have made routine implementation of the middle-down approach possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.
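The three approaches are distinguished mainly by analyte mass; a toy classifier using the thresholds quoted above (under ca. 3 kDa for bottom-up digests, 3-15 kDa for middle-down) makes the partition explicit.

```python
# Toy illustration of the mass ranges separating the three approaches,
# with thresholds taken from the abstract above.
def proteomics_regime(analyte_mass_da):
    if analyte_mass_da < 3_000:
        return "bottom-up (short proteolytic peptides)"
    if analyte_mass_da <= 15_000:
        return "middle-down (restricted proteolysis, 3-15 kDa)"
    return "top-down (intact protein)"

for m in (1_200, 8_000, 42_000):
    print(m, "Da ->", proteomics_regime(m))
```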
Abstract:
Working memory, commonly defined as the ability to transiently hold mental representations online and to manipulate them, is known to be a core deficit in schizophrenia. The aim of the present study was to investigate the visuo-spatial component of working memory in schizophrenia, and more precisely the extent to which dynamic visuo-spatial information processing is impaired in schizophrenia patients. For this purpose we used a computerized paradigm in which 29 patients with schizophrenia (DSM-IV; Diagnostic Interview for Genetic Studies, DIGS) and 29 age- and sex-matched control subjects had to memorize the trajectory of a plane moving across the computer screen and to identify the observed trajectory among 9 plots presented together. Each trajectory could be viewed up to 3 times if needed. The results showed no difference between schizophrenia patients and controls in the number of correct trajectories identified after the first presentation. However, when we considered the mean number of correct trajectories over the 3 trials, schizophrenia patients performed significantly worse than controls (Mann-Whitney, p = 0.002). These findings suggest that, although schizophrenia patients are able to memorize some dynamic trajectories as well as controls, they do not profit from repeated presentation of the trajectory. This is congruent with the hypothesis that schizophrenia induces an imbalance between local and global information processing: patients may be able to focus on details of the trajectory, which can allow them to find the right target (bottom-up processes), but may have difficulty drawing on previous experience to filter incoming information (top-down processes) and thereby enhance their visuo-spatial working memory abilities.
Abstract:
In bottom-up proteomics, rapid and efficient protein digestion is crucial for data reliability, yet sample preparation remains one of the rate-limiting steps in proteomics workflows. In this study, we compared the conventional trypsin digestion procedure with two accelerated digestion protocols, based on shorter reaction times and microwave-assisted digestion, for the preparation of membrane-enriched protein fractions of the human pathogenic bacterium Staphylococcus aureus. The resulting peptides were analyzed by Shotgun IPG-IEF, a methodology that separates peptides by IPG-IEF before the conventional LC-MS/MS steps of shotgun proteomics. Data obtained on two LC-MS/MS platforms showed that the accelerated digestion protocols, especially the one relying on microwave irradiation, enhanced the cleavage specificity of trypsin and thus improved digestion efficiency, particularly for hydrophobic and membrane proteins. Combining high-throughput proteomics with accelerated, efficient sample preparation should enhance the practicability of proteomics by reducing the time from sample collection to results.
Abstract:
Energy demand is an important constraint on neural signaling. Several methods have been proposed to assess the energy budget of the brain based on a bottom-up approach, in which the energy demands of individual biophysical processes are first estimated independently and then summed to compute the brain's total energy budget. Here, we address this question using a novel approach that makes use of published datasets reporting average cerebral glucose and oxygen utilization in humans and rodents during different activation states. Our approach allows us (1) to decipher neuron-glia compartmentalization in energy metabolism and (2) to compute a precise state-dependent energy budget for the brain. Under the assumption that the fraction of energy used for signaling is proportional to the cycling of neurotransmitters, we find that in the activated state, most of the energy (approximately 80%) is oxidatively produced and consumed by neurons to support neuron-to-neuron signaling. Glial cells, while contributing only a small fraction of energy production (approximately 6%), actually take up a significant fraction of glucose (50% or more) from the blood and provide neurons with glucose-derived energy substrates. Our results suggest that glycolysis occurs to a significant extent in astrocytes, whereas most of the oxygen is utilized in neurons. As a consequence, a transfer of glucose-derived metabolites from glial cells to neurons has to take place. Furthermore, we find that the amplitude of this transfer is correlated with (1) the activity level of the brain (the greater the activity, the more metabolites are shuttled from glia to neurons) and (2) the oxidative activity in astrocytes (with higher glial pyruvate metabolism, fewer metabolites are shuttled from glia to neurons). While some details of a bottom-up biophysical approach have to be simplified, our method allows a straightforward assessment of the brain's energy budget from macroscopic measurements with minimal underlying assumptions.
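The macroscopic logic can be sketched in a few lines: the measured oxygen utilization (CMRO2) fixes how much glucose is fully oxidized, and textbook ATP yields then split production into glycolytic and oxidative parts. The input rates and the yields (~2 ATP/glucose glycolytic, ~30 ATP/glucose oxidative) below are illustrative approximations, not the paper's fitted values.

```python
# Sketch of a macroscopic brain energy budget from utilization rates.
def atp_budget(cmr_glc, cmr_o2):
    """Rates in umol/g/min; returns (glycolytic, oxidative) ATP production."""
    glucose_oxidized = cmr_o2 / 6.0          # 6 O2 consumed per glucose oxidized
    atp_glycolytic = 2.0 * cmr_glc           # all glucose passes through glycolysis
    atp_oxidative = 30.0 * glucose_oxidized  # approximate oxidative yield
    return atp_glycolytic, atp_oxidative

glyc, ox = atp_budget(cmr_glc=0.40, cmr_o2=2.0)  # illustrative activated-state rates
print(f"oxidative share of ATP production: {ox / (ox + glyc):.0%}")
```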
Abstract:
OBJECTIVE: Critically ill patients are at high risk of malnutrition, and insufficient nutritional support remains a widespread problem despite guidelines. The aim of this study was to measure the clinical impact of a two-step interdisciplinary quality nutrition program. DESIGN: Prospective interventional study over three periods (A, baseline; B and C, intervention periods). SETTING: Mixed intensive care unit within a university hospital. PATIENTS: Five hundred seventy-two patients (age 59 ± 17 yrs) requiring >72 hrs of intensive care unit treatment. INTERVENTION: Two-step quality program: 1) bottom-up implementation of a feeding guideline; and 2) additional presence of an intensive care unit dietitian. The nutrition protocol was based on the European guidelines. MEASUREMENTS AND MAIN RESULTS: Anthropometric data, intensive care unit severity scores, energy delivery, cumulated energy balance (daily, day 7, and discharge), feeding route (enteral, parenteral, combined, none, oral), length of intensive care unit and hospital stay, and mortality were collected. Altogether, 5800 intensive care unit days were analyzed. Patients in period A were healthier, with lower Simplified Acute Physiologic Scale scores and a lower proportion of "rapidly fatal" McCabe scores. Energy delivery and balance increased gradually: the impact was particularly marked on the cumulated energy deficit at day 7, which improved from -5870 kcal to -3950 kcal (p < .001). Feeding technique changed significantly, with a progressive increase in days with nutrition therapy (A: 59% of days, B: 69%, C: 71%, p < .001); use of enteral nutrition increased from A to B (stable in C), and days on combined and parenteral nutrition increased progressively. Oral energy intakes were low (mean: 385 kcal/day, 6 kcal/kg/day). Hospital mortality increased with severity of condition in periods B and C. CONCLUSION: A bottom-up protocol improved nutritional support. The presence of the intensive care unit dietitian brought significant additional progress, related to early introduction and route of feeding, and achieved an overall better early energy balance.
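The cumulated energy balance reported above is simply the running sum of delivered minus prescribed energy over the ICU stay; a sketch with invented numbers:

```python
# Cumulated energy balance: running sum of (delivered - target) per day.
def cumulative_balance(delivered_kcal, target_kcal):
    assert len(delivered_kcal) == len(target_kcal)
    balance, out = 0.0, []
    for given, target in zip(delivered_kcal, target_kcal):
        balance += given - target
        out.append(balance)
    return out

delivered = [300, 600, 900, 1200, 1500, 1700, 1800]  # invented ramp-up over 7 days
target = [1800] * 7                                  # invented fixed daily goal
print(cumulative_balance(delivered, target)[-1])     # day-7 deficit: -4600 kcal
```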
Abstract:
1. Biogeographical models of species' distributions are essential tools for assessing impacts of changing environmental conditions on natural communities and ecosystems. Practitioners need more reliable predictions to integrate into conservation planning (e.g. reserve design and management). 2. Most models still largely ignore or inappropriately take into account important features of species' distributions, such as spatial autocorrelation, dispersal and migration, biotic and environmental interactions. Whether distributions of natural communities or ecosystems are better modelled by assembling individual species' predictions in a bottom-up approach or modelled as collective entities is another important issue. An international workshop was organized to address these issues. 3. We discuss more specifically six issues in a methodological framework for generalized regression: (i) links with ecological theory; (ii) optimal use of existing data and artificially generated data; (iii) incorporating spatial context; (iv) integrating ecological and environmental interactions; (v) assessing prediction errors and uncertainties; and (vi) predicting distributions of communities or collective properties of biodiversity. 4. Synthesis and applications. Better predictions of the effects of impacts on biological communities and ecosystems can emerge only from more robust species' distribution models and better documentation of the uncertainty associated with these models. An improved understanding of causes of species' distributions, especially at their range limits, as well as of ecological assembly rules and ecosystem functioning, is necessary if further progress is to be made. A better collaborative effort between theoretical and functional ecologists, ecological modellers and statisticians is required to reach these goals.
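As a concrete instance of the generalized-regression framework discussed in point 3, a minimal species distribution model can be fitted as a binomial GLM of presence/absence on environmental predictors; the data and predictor names below are simulated for illustration.

```python
# Hedged sketch of a generalized-regression species distribution model:
# presence/absence ~ environmental predictors, binomial GLM (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
env = pd.DataFrame({
    "temperature": rng.normal(8, 4, n),       # invented predictor values
    "precipitation": rng.normal(1200, 300, n),
})
# Simulate occupancy with a unimodal (quadratic) response to temperature.
eta = -2 + 0.9 * env.temperature - 0.05 * env.temperature ** 2
env["presence"] = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

sdm = smf.glm("presence ~ temperature + I(temperature ** 2) + precipitation",
              data=env, family=sm.families.Binomial()).fit()
print(sdm.summary().tables[1])
```

The quadratic term lets the fitted response peak at intermediate temperatures, the kind of ecologically motivated model structure point 3(i) calls for.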
Abstract:
This study presents an innovative methodology for forensic image analysis for event reconstruction. The methodology is based on experience from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the communities of practice in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed technical procedures and SOPs. It emerged from a grounded-theory approach: data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and, ultimately, to formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and the participants' reflexivity.
Abstract:
Social responsibility, especially in the fields of education, society and peace, is one of the cornerstones of the Olympic ideal and strategic vision (contributing to building a better world through sport). The article reviews the literature on organizational social responsibility (OSR) and the relationship between sport/Olympism and OSR in order to examine the conditions governing the implementation and success of the International Olympic Committee's strategic vision. Several ways in which the IOC could promote a more ambitious and better-integrated social strategy are identified: revising its performance model, notably by evaluating it and presenting it in a social responsibility report; promoting the adoption of OSR initiatives and strategies within the Olympic System from the bottom up rather than from the top down; sharing best practices for promoting and developing "sport for all" across countries; creating a World Agency for Development through Sport, or partnering with and funding the International Platform on Sport and Development; and creating a World Agency for the International Governance of Sport. Two possible scenarios for the future of Olympic responsibility are finally discussed: a strategy of "small steps", and a more ambitious local and global social strategy through sport and Olympism.