934 results for BOTTOM-UP
Abstract:
Understanding drivers of biodiversity patterns is of prime importance in this era of severe environmental crisis. More diverse plant communities have been postulated to represent a larger functional trait-space, more likely to sustain a diverse assembly of herbivore species. Here, we expand this hypothesis to integrate environmental, functional and phylogenetic variation of plant communities as factors explaining the diversity of lepidopteran assemblages along elevation gradients in the Swiss Western Alps. In line with expectations, we found that the association between butterflies and their host plants is highly phylogenetically structured. Multiple regression analyses showed the combined effect of climate, functional traits and phylogenetic diversity in structuring butterfly communities. Furthermore, we provide the first evidence that plant phylogenetic beta diversity is the major driver explaining butterfly phylogenetic beta diversity. Along ecological gradients, the bottom-up control of herbivore diversity is thus driven by phylogenetically structured turnover of plant traits as well as by environmental variables.
Abstract:
The present study examined the bottom-up influence of emotional context on response inhibition, an issue that remains largely unstudied in children. Sixty-two participants, aged 6 to 13 years, were assessed with three stop-signal tasks: one with circles, one with neutral faces, and one with emotional faces (happy and sad). Results showed that emotional context altered response inhibition ability in childhood. However, no interaction between age and emotional influence on response inhibition was found. Positive emotions were recognized faster than negative emotions, but valence did not have a significant influence on response inhibition abilities.
Abstract:
Previous electrophysiological studies revealed that human faces elicit an early visual event-related potential (ERP) within the occipito-temporal cortex, the N170 component. Although face perception has been proposed to rely on automatic processing, the impact of selective attention on the N170 remains controversial in both young and elderly individuals. Using early visual ERPs and alpha power analysis, we assessed the influence of aging on selective attention to faces during delayed-recognition tasks for face and letter stimuli, examining 36 elderly and 20 young adults with preserved cognition. Face recognition performance worsened with age. Aging induced a latency delay of the N1 component for faces and letters, as well as of the face N170 component. In contrast with letters, ignored faces elicited larger N1 and N170 components than attended faces in both age groups. This counterintuitive attention effect on face processing persisted when scenes replaced letters. Unlike young adults, elderly subjects failed to suppress irrelevant letters when attending to faces. Whereas attended stimuli induced a parietal alpha-band desynchronization within 300-1000 ms post-stimulus, with a bilateral-to-right distribution for faces and a left lateralization for letters, ignored and passively viewed stimuli elicited a central alpha synchronization that was larger over the right hemisphere. Aging delayed the latency of this alpha synchronization for both face and letter stimuli and reduced its amplitude for ignored letters. These results suggest that, due to their social relevance, human faces may cause paradoxical attention effects on early visual ERP components, but they still undergo classical top-down control as a function of endogenous selective attention. Aging does not affect the bottom-up alerting mechanism for faces but reduces the top-down suppression of distracting letters, possibly impinging upon face recognition, and more generally delays the top-down suppression of task-irrelevant information.
Abstract:
The plant architecture hypothesis predicts that variation in host plant architecture influences insect herbivore community structure, dynamics and performance. In this study we evaluated the effects of Macairea radula (Melastomataceae) architecture on the abundance of galls induced by a moth (Lepidoptera: Gelechiidae). Plant architecture and gall abundance were recorded directly on 58 arbitrarily chosen M. radula host plants in the rainy season of 2006 in an area of Cerrado vegetation, southeastern Brazil. Plant height, dry biomass, number of branches, number of shoots and leaf abundance were used as predictor variables of gall abundance and larval survival. Gall abundance correlated positively with host plant biomass and branch number. In contrast, no correlation (p > 0.05) was found between gall abundance and shoot number or the number of leaves per plant. Of a total of 124 galls analyzed, 67.7% survived, 14.5% were attacked by parasitoids, and 17.7% died of unknown causes. Larval survival and parasitism were not influenced by the architectural complexity of the host plant. Our results partially corroborate the plant architecture hypothesis, but since parasitism was not related to plant architecture, we argue that bottom-up effects may be more important than top-down effects in controlling the population dynamics of this galling lepidopteran. Because galling insects often decrease plant fitness, the potential of galling insects to select for architecturally less complex plants is discussed.
Abstract:
Since the advent of high-throughput DNA sequencing technologies, the ever-increasing rate at which genomes are published has generated new challenges, notably at the level of genome annotation. Even though gene predictors and annotation software are increasingly efficient, the ultimate validation still lies in observing the predicted gene product(s). Mass spectrometry-based proteomics provides the high-throughput technology needed to provide evidence of protein presence and, from the identified sequences, to confirm or invalidate predicted annotations. We review here the different strategies used to perform an MS-based proteogenomics experiment with a bottom-up approach. We start from the strengths and weaknesses of the different database construction strategies, based on different kinds of genomic information (whole genome, ORF, cDNA, EST or RNA-Seq data), which are then used for matching mass spectra to peptides and proteins. We also review the important points to consider for a correct statistical assessment of the peptide identifications. Finally, we provide references for tools used to map and visualize the peptide identifications back onto the original genomic information.
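To make the database-construction step concrete, the following is a minimal sketch of one such strategy, six-frame translation of raw genomic sequence into candidate protein entries that spectra can later be matched against. It uses Biopython; the input sequence, length cutoff and naming scheme are illustrative assumptions, not the procedure of any specific tool reviewed here.

```python
# Minimal sketch: build candidate protein entries from raw genomic
# sequence by six-frame translation (one of the database-construction
# strategies discussed above). Illustrative only.
from Bio.Seq import Seq

def six_frame_entries(dna: str, min_len: int = 20):
    """Yield (header, protein fragment) pairs for all six reading frames."""
    seq = Seq(dna)
    for strand_name, strand in (("fwd", seq), ("rev", seq.reverse_complement())):
        for frame in range(3):
            sub = strand[frame:]
            sub = sub[: len(sub) - len(sub) % 3]  # trim to a multiple of 3
            protein = str(sub.translate())
            # Split on stop codons; keep stretches long enough to be useful.
            for i, orf in enumerate(protein.split("*")):
                if len(orf) >= min_len:
                    yield f"{strand_name}_frame{frame}_orf{i}", orf

# Example: build a small search database from one contig (placeholder sequence).
contig = "ATGGCT" * 40
database = dict(six_frame_entries(contig, min_len=10))
```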
Abstract:
The Department's 2007 Greenhouse Gas Inventory is a refinement of previous statewide inventories. It is a bottom-up inventory of two sectors: fossil fuel combustion at federally recognized major sources of air pollution, and fossil fuel combustion and ethanol fermentation at dry mill ethanol plants. This is the first bottom-up greenhouse gas inventory conducted for Iowa and, to the Department's knowledge, the first bottom-up greenhouse gas inventory of ethanol plants in the nation. In a bottom-up inventory, facility-specific activity data are used to calculate emissions. In a top-down inventory, aggregate activity data are used to calculate emissions. For example, this bottom-up inventory calculates greenhouse gas emissions from fossil fuel combustion at each individual facility instead of using the total amount of fossil fuel combusted statewide, which would be a top-down inventory method. The advantage of a bottom-up inventory is that its calculations are more accurate than those of a top-down inventory. However, because the two methods differ, the results of a bottom-up inventory are not directly comparable to those of a top-down inventory.
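A minimal sketch of the distinction described above, assuming hypothetical facility data and an illustrative emission factor (neither taken from the Iowa inventory), contrasts how a bottom-up total (summed per facility) and a top-down total (one aggregate figure) are computed:

```python
# Bottom-up vs. top-down CO2 estimate from fossil fuel combustion.
# Facility data and the emission factor are hypothetical illustrations.
EMISSION_FACTOR_T_CO2_PER_TJ = 94.6  # illustrative factor for a coal-like fuel

# Bottom-up: facility-specific activity data (fuel burned, in TJ).
facility_fuel_tj = {
    "facility_A": 1200.0,
    "facility_B": 850.0,
    "facility_C": 430.0,
}
bottom_up_t_co2 = sum(
    fuel * EMISSION_FACTOR_T_CO2_PER_TJ for fuel in facility_fuel_tj.values()
)

# Top-down: a single aggregate, state-wide activity figure.
statewide_fuel_tj = 2480.0
top_down_t_co2 = statewide_fuel_tj * EMISSION_FACTOR_T_CO2_PER_TJ

print(f"bottom-up total: {bottom_up_t_co2:,.0f} t CO2")
print(f"top-down total:  {top_down_t_co2:,.0f} t CO2")
```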
Abstract:
For some time, public actors have been advocating the establishment of cooperation between higher education institutions. While institutions enter into cooperation for a variety of reasons, the public discourse is dominated by notions of portfolio consolidation and efficiency. Against this background, the present work aims to provide an overview of cooperation between Swiss higher education institutions. This objective translates into an inventory of existing cooperation arrangements and a discussion of the factors that influence their long-term continuation. The main results of our study reveal a high density of cooperation arrangements characterized by a great diversity of forms. For a cooperation project to succeed, the partners must develop a shared scientific and institutional vision, maintain mutual trust and continue to see added value in the project. Making a cooperation permanent presupposes its integration into the strategy and regular structures of the higher education institution. The sine qua non for achieving this is the interest of the institutions concerned, which can only emerge from an autonomous, bottom-up process.
Abstract:
Inhibitory control, a core component of executive functions, refers to our ability to suppress intended or ongoing cognitive or motor processes. Mostly based on Go/NoGo paradigms, a considerable body of literature reports that inhibitory control of responses to "NoGo" stimuli is mediated by top-down mechanisms manifesting ∼200 ms after stimulus onset within frontoparietal networks. However, whether inhibitory functions in humans can be trained, and the neurophysiological mechanisms supporting such training, remain unresolved. We addressed these issues by contrasting auditory evoked potentials (AEPs) to left-lateralized "Go" and right-lateralized "NoGo" stimuli recorded at the beginning versus the end of 30 min of active auditory spatial Go/NoGo training, as well as during passive listening to the same stimuli before versus after the training session, generating two separate 2 × 2 within-subject designs. Training improved Go/NoGo proficiency. Response times to Go stimuli decreased. During active training, AEPs to NoGo, but not Go, stimuli modulated topographically with training 61-104 ms after stimulus onset, indicative of changes in the underlying brain network. Source estimations revealed that this modulation followed from decreased activity within left parietal cortices, which in turn predicted the extent of behavioral improvement. During passive listening, in contrast, effects were limited to topographic modulations of AEPs in response to Go stimuli over the 31-81 ms interval, mediated by decreased right anterior temporoparietal activity. We discuss our results in terms of the development of an automatic, bottom-up form of inhibitory control with training and a differential effect of Go/NoGo training during active executive control versus passive listening conditions.
Abstract:
The pharmacokinetics (PK) of efavirenz (EFV) is characterized by marked interpatient variability that correlates with its pharmacodynamics (PD). In vitro-in vivo extrapolation (IVIVE) is a "bottom-up" approach that combines drug data with system information to predict PK and PD. The aim of this study was to simulate EFV PK and PD after dose reductions. At the standard dose, the simulated probability was 80% for viral suppression and 28% for central nervous system (CNS) toxicity. After a dose reduction to 400 mg, the probabilities of viral suppression were reduced to 69, 75, and 82%, and those of CNS toxicity were 21, 24, and 29% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. With reduction of the dose to 200 mg, the probabilities of viral suppression decreased to 54, 62, and 72% and those of CNS toxicity decreased to 13, 18, and 20% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. These findings indicate how dose reductions might be applied in patients with favorable genetic characteristics.
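For readability, the simulated probabilities reported above can be organized as a small lookup by dose and 516 genotype. The numeric values are those given in this abstract; the 600 mg figure for the standard dose and the data structure itself are assumptions made only for illustration.

```python
# Simulated probabilities of viral suppression and CNS toxicity by
# efavirenz dose and 516 genotype, as reported in the abstract above.
# The standard dose is assumed here to be 600 mg (illustrative label).
probabilities = {
    # dose_mg: {genotype: (p_viral_suppression, p_cns_toxicity)}
    600: {"all": (0.80, 0.28)},
    400: {"516GG": (0.69, 0.21), "516GT": (0.75, 0.24), "516TT": (0.82, 0.29)},
    200: {"516GG": (0.54, 0.13), "516GT": (0.62, 0.18), "516TT": (0.72, 0.20)},
}

# Example lookup: expected outcome for a 516TT carrier at 400 mg.
p_supp, p_tox = probabilities[400]["516TT"]
print(f"400 mg, 516TT: suppression {p_supp:.0%}, CNS toxicity {p_tox:.0%}")
```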
Abstract:
A simple holographic model is presented and analyzed that describes chiral symmetry breaking and the physics of the meson sector in QCD. This is a bottom-up model that incorporates string theory ingredients like tachyon condensation, which is expected to be the main manifestation of chiral symmetry breaking in the holographic context. As a model for glue the Kuperstein-Sonnenschein background is used. The structure of the flavor vacuum is analyzed in the quenched approximation. Chiral symmetry breaking is shown at zero temperature. Above the deconfinement transition chiral symmetry is restored. A complete holographic renormalization is performed and the chiral condensate is calculated for different quark masses, both at zero and non-zero temperatures. The 0++, 0-+, 1++, 1-- meson trajectories are analyzed and their masses and decay constants are computed. The asymptotic trajectories are linear. The model has one phenomenological parameter beyond those of QCD that affects the 1++ and 0-+ sectors. Fitting this parameter we obtain very good agreement with data. The model improves in several ways on the popular hard-wall and soft-wall bottom-up models.
Abstract:
In the vast majority of bottom-up proteomics studies, protein digestion is performed using only mammalian trypsin. Although it is clearly the best enzyme available, the sole use of trypsin rarely leads to complete sequence coverage, even for abundant proteins. It is commonly assumed that this is because many tryptic peptides are either too short or too long to be identified by RPLC-MS/MS. We show through in silico analysis that 20-30% of the total sequence of three proteomes (Schizosaccharomyces pombe, Saccharomyces cerevisiae, and Homo sapiens) is expected to be covered by Large post-Trypsin Peptides (LpTPs) with Mr above 3000 Da. We then established size exclusion chromatography to fractionate complex yeast tryptic digests into pools of peptides based on size. We found that secondary digestion of LpTPs followed by LC-MS/MS analysis leads to a significant increase in identified proteins and a 32-50% relative increase in average sequence coverage compared to trypsin digestion alone. Application of the developed strategy to analyze the phosphoproteomes of S. pombe and of a human cell line identified a significant fraction of novel phosphosites. Overall, our data indicate that specific targeting of LpTPs can complement standard bottom-up workflows to reveal a largely neglected portion of the proteome.
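The following is a minimal sketch of the kind of in silico analysis described above, assuming the classic trypsin cleavage rule (after K/R, not before P) and an average residue mass to approximate the 3 kDa cutoff; it is not the published pipeline, only an illustration of how the LpTP fraction of a proteome could be estimated.

```python
# Estimate the fraction of residues falling in Large post-Trypsin
# Peptides (LpTPs, > ~3 kDa) after an in silico tryptic digest.
# Cleavage rule, residue mass and cutoff are illustrative assumptions.
import re

AVG_RESIDUE_MASS = 110.0   # rough average amino-acid residue mass (Da)
LPTP_CUTOFF_DA = 3000.0

def tryptic_peptides(protein: str):
    """Cleave after K or R unless followed by P (classic trypsin rule)."""
    return [p for p in re.split(r"(?<=[KR])(?!P)", protein) if p]

def lptp_fraction(proteome: list[str]) -> float:
    """Fraction of total residues contained in peptides heavier than ~3 kDa."""
    total = covered = 0
    for protein in proteome:
        for pep in tryptic_peptides(protein):
            total += len(pep)
            if len(pep) * AVG_RESIDUE_MASS > LPTP_CUTOFF_DA:
                covered += len(pep)
    return covered / total if total else 0.0

# Example with two toy sequences (placeholders, not real proteins).
print(lptp_fraction(["MKTAYIAKQR" * 5, "GASQFP" * 60]))
```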
Abstract:
We present an agent-based model with the aim of studying how macro-level dynamics of spatial distances among interacting individuals in a closed space emerge from micro-level dyadic and local interactions. Our agents moved on a lattice (referred to as a room) using a model implemented in a computer program called P-Space in order to minimize their dissatisfaction, defined as a function of the discrepancy between the real distance and the ideal, or desired, distance between agents. Ideal distances evolved in accordance with the agent's personal and social space, which changed throughout the dynamics of the interactions among the agents. In the first set of simulations we studied the effects of the parameters of the function that generated ideal distances, and in a second set we explored how group macro-level behavior depended on model parameters and other variables. We learned that certain parameter values yielded consistent patterns in the agents' personal and social spaces, which in turn led to avoidance and approaching behaviors in the agents. We also found that the spatial behavior of the group of agents as a whole was influenced by the values of the model parameters, as well as by other variables such as the number of agents. Our work demonstrates that the bottom-up approach is a useful way of explaining macro-level spatial behavior. The proposed model is also shown to be a powerful tool for simulating the spatial behavior of groups of interacting individuals.
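The following is a minimal sketch of the modelling idea described above, not the P-Space implementation: agents on a small lattice greedily move to whichever neighbouring cell lowers a dissatisfaction score, here defined as the summed squared discrepancy between actual and ideal inter-agent distances. The lattice size, the single shared ideal distance and the greedy update rule are simplifying assumptions.

```python
# Minimal agent-based sketch: agents minimize dissatisfaction, defined as
# the squared gap between actual and ideal distances to all other agents.
import math
import random

ROOM = 20      # lattice is ROOM x ROOM cells
IDEAL = 4.0    # one shared ideal distance, for simplicity

def dissatisfaction(pos, others, ideal=IDEAL):
    return sum((math.dist(pos, o) - ideal) ** 2 for o in others)

def step(agents):
    """Each agent moves to the neighbouring cell (or stays put) that
    minimizes its dissatisfaction with respect to all other agents."""
    for i, pos in enumerate(agents):
        others = agents[:i] + agents[i + 1:]
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if 0 <= pos[0] + dx < ROOM and 0 <= pos[1] + dy < ROOM]
        agents[i] = min(moves, key=lambda p: dissatisfaction(p, others))
    return agents

agents = [(random.randrange(ROOM), random.randrange(ROOM)) for _ in range(8)]
for _ in range(50):
    agents = step(agents)
print(agents)
```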
Abstract:
BACKGROUND: In 2005, findings of the first "cost of disorders of the brain in Europe" study of the European Brain Council (EBC) showed that these costs impose a substantial economic burden on Swiss society. In 2010 an improved update covering a broader range of disorders was analysed. This report presents the new findings for Switzerland and discusses the changes. METHODS: Data are derived from the EBC 2010 census study, which estimates the 12-month prevalence of 12 groups of disorders of the brain and calculates costs (direct health-care costs, direct non-medical costs and indirect costs) by combining top-down and bottom-up cost approaches using existing data. RESULTS: The most frequent disorder was headache (2.3 million persons). Anxiety disorders were found in 1 million persons and sleep disorders in 700,000 persons. Annual costs for all assessed disorders total 14.5 billion EUR, corresponding to about 1,900 EUR per inhabitant per year. Mood disorders, psychotic disorders and dementias (approx. 2 billion EUR each) were the most costly. Costs per person were highest for neurological/neurosurgery-relevant disorders, e.g. neuromuscular disorders, brain tumour and multiple sclerosis (38,000 to 24,000 EUR). CONCLUSION: The estimates of the EBC 2010 study for Switzerland provide a basis for health-care planning. The increases in size and costs compared to 2005 are mostly due to the inclusion of new disorders (e.g., sleep disorders) or the re-definition of others (e.g., headache), and to an increase in younger cohorts. We suggest research and preventive measures coordinated between governmental bodies, private health-care providers and pharmaceutical companies.
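As a quick arithmetic check of the per-inhabitant figure quoted above, a back-of-the-envelope calculation with an assumed Swiss population of roughly 7.8 million (around 2010, an assumption not taken from the study) reproduces the reported order of magnitude:

```python
# Back-of-the-envelope check of the per-inhabitant cost figure.
total_cost_eur = 14.5e9   # total annual cost reported in the abstract
population = 7.8e6        # assumed Swiss population around 2010
print(f"{total_cost_eur / population:,.0f} EUR per inhabitant per year")
# ~1,859 EUR, consistent with the "about 1,900 EUR" reported above.
```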
Abstract:
Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins, where multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is used for routine high-throughput protein identification and quantitation from complex biological samples. The proteins are first enzymatically digested into small (usually less than ca. 3 kDa) peptides, which are identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, the proteins are digested into long (3-15 kDa) peptides via restricted proteolysis, followed by MS/MS analysis of the resulting digest. With advancements in high-resolution MS and allied techniques, routine implementation of the middle-down approach has become possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.