863 results for Bottom-up learning
Abstract:
A simple holographic model is presented and analyzed that describes chiral symmetry breaking and the physics of the meson sector in QCD. This is a bottom-up model that incorporates string-theory ingredients such as tachyon condensation, which is expected to be the main manifestation of chiral symmetry breaking in the holographic context. As a model for the glue sector, the Kuperstein-Sonnenschein background is used. The structure of the flavor vacuum is analyzed in the quenched approximation. Chiral symmetry breaking is demonstrated at zero temperature; above the deconfinement transition chiral symmetry is restored. A complete holographic renormalization is performed and the chiral condensate is calculated for different quark masses, both at zero and non-zero temperatures. The 0++, 0−+, 1++, 1−− meson trajectories are analyzed and their masses and decay constants are computed. The asymptotic trajectories are linear. The model has one phenomenological parameter beyond those of QCD, which affects the 1++ and 0−+ sectors. Fitting this parameter, we obtain very good agreement with data. The model improves in several ways on the popular hard-wall and soft-wall bottom-up models.
Abstract:
We present an agent-based model with the aim of studying how macro-level dynamics of spatial distances among interacting individuals in a closed space emerge from micro-level dyadic and local interactions. Our agents moved on a lattice (referred to as a room) using a model implemented in a computer program called P-Space in order to minimize their dissatisfaction, defined as a function of the discrepancy between the real distance and the ideal, or desired, distance between agents. Ideal distances evolved in accordance with the agent's personal and social space, which changed throughout the dynamics of the interactions among the agents. In the first set of simulations we studied the effects of the parameters of the function that generated ideal distances, and in a second set we explored how group macro-level behavior depended on model parameters and other variables. We learned that certain parameter values yielded consistent patterns in the agents' personal and social spaces, which in turn led to avoidance and approaching behaviors in the agents. We also found that the spatial behavior of the group of agents as a whole was influenced by the values of the model parameters, as well as by other variables such as the number of agents. Our work demonstrates that the bottom-up approach is a useful way of explaining macro-level spatial behavior. The proposed model is also shown to be a powerful tool for simulating the spatial behavior of groups of interacting individuals.
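The abstract specifies the driving quantity (dissatisfaction as a function of the gap between real and ideal inter-agent distance) but not its functional form or the update rule used in P-Space. The following minimal Python sketch illustrates the idea under stated assumptions: a squared-discrepancy dissatisfaction, a fixed common ideal distance, and greedy moves to neighboring lattice cells. The constants ROOM and IDEAL and the update rule are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of the dyadic-dissatisfaction idea described above.
# Assumptions (not from the paper): squared discrepancy between real and
# ideal distance, a fixed common ideal distance, and greedy moves to the
# neighboring cell that lowers an agent's total dissatisfaction.
import math
import random

ROOM = 20        # the lattice ("room") is ROOM x ROOM cells (assumed size)
N_AGENTS = 8
IDEAL = 4.0      # common ideal inter-agent distance (assumed value)

positions = [(random.randrange(ROOM), random.randrange(ROOM)) for _ in range(N_AGENTS)]

def dissatisfaction(i, pos, positions):
    """Sum of squared gaps between real and ideal distances for agent i at pos."""
    return sum((math.dist(pos, other) - IDEAL) ** 2
               for j, other in enumerate(positions) if j != i)

def step(positions):
    """Each agent moves to the adjacent cell (or stays) that minimizes its dissatisfaction."""
    for i, (x, y) in enumerate(positions):
        candidates = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if 0 <= x + dx < ROOM and 0 <= y + dy < ROOM]
        positions[i] = min(candidates, key=lambda p: dissatisfaction(i, p, positions))
    return positions

for _ in range(100):
    positions = step(positions)
print(positions)
```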
Abstract:
BACKGROUND: In 2005, findings of the first "cost of disorders of the brain in Europe" study of the European Brain Council (EBC) showed that these costs place a substantial economic burden on Swiss society. In 2010, an improved update covering a broader range of disorders was analysed. This report presents the new findings for Switzerland and discusses the changes. METHODS: Data are derived from the EBC 2010 census study, which estimates the 12-month prevalence of 12 groups of disorders of the brain and calculates costs (direct health-care costs, direct non-medical costs and indirect costs) by combining top-down and bottom-up cost approaches using existing data. RESULTS: The most frequent disorder was headache (2.3 million persons affected). Anxiety disorders were found in 1 million persons and sleep disorders in 700,000 persons. Annual costs for all assessed disorders total 14.5 billion EUR, corresponding to about 1,900 EUR per inhabitant per year. Mood disorders, psychotic disorders and dementias (approx. 2 billion EUR each) were the most costly. Costs per person were highest for neurological/neurosurgery-relevant disorders, e.g. neuromuscular disorders, brain tumour and multiple sclerosis (38,000 to 24,000 EUR). CONCLUSION: The estimates of the EBC 2010 study for Switzerland provide a basis for health-care planning. The increase in size and costs compared with 2005 is mostly due to the inclusion of new disorders (e.g., sleep disorders), to the re-definition of others (e.g., headache) and to an increase in younger cohorts. We suggest research and preventive measures coordinated between governmental bodies, private health-care and pharmaceutical companies.
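As a rough back-of-envelope consistency check (not part of the study itself): 14.5 billion EUR divided by about 1,900 EUR per inhabitant implies roughly 7.6 million inhabitants, which is consistent with the Swiss resident population around 2010; equivalently, 14.5 × 10^9 EUR / 7.6 × 10^6 inhabitants ≈ 1,900 EUR per inhabitant per year.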
Abstract:
Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins, where multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is used for routine high-throughput protein identification and quantitation from complex biological samples. The proteins are first enzymatically digested into small (usually less than ca. 3 kDa) peptides, which are then identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, the proteins are digested into long (3-15 kDa) peptides via restricted proteolysis, followed by MS/MS analysis of the obtained digest. With advancements in high-resolution MS and allied techniques, routine implementation of the middle-down approach has become possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.
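To make the quoted size ranges concrete, the following Python sketch performs a toy in silico digestion and bins the resulting peptides into the bottom-up (< 3 kDa) and middle-down (3-15 kDa) ranges mentioned above. It is an illustration only, not the authors' workflow: it assumes a trypsin-like cleavage rule (after K/R, not before P), estimates peptide mass with the common ~110 Da-per-residue rule of thumb, and uses an arbitrary example sequence.

```python
# Toy in silico digestion illustrating the peptide size ranges quoted above.
# Assumptions: trypsin-like cleavage and ~110 Da per residue; real
# middle-down workflows use restricted proteolysis and exact masses.
import re

AVG_RESIDUE_DA = 110.0   # rough average residue mass
WATER_DA = 18.0          # added once per peptide

def digest(sequence):
    """Split a protein sequence after K or R, except when followed by P."""
    return [p for p in re.split(r"(?<=[KR])(?!P)", sequence) if p]

def approx_mass(peptide):
    return len(peptide) * AVG_RESIDUE_DA + WATER_DA

def classify(peptides):
    bins = {"bottom-up (<3 kDa)": [], "middle-down (3-15 kDa)": [], "larger": []}
    for p in peptides:
        m = approx_mass(p)
        if m < 3000:
            bins["bottom-up (<3 kDa)"].append(p)
        elif m <= 15000:
            bins["middle-down (3-15 kDa)"].append(p)
        else:
            bins["larger"].append(p)
    return bins

if __name__ == "__main__":
    # Arbitrary toy sequence, used only to exercise the functions above.
    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
    for name, group in classify(digest(seq)).items():
        print(name, [(p, round(approx_mass(p))) for p in group])
```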
Abstract:
Working memory, commonly defined as the ability to transiently hold mental representations online and to manipulate these representations, is known to be a core deficit in schizophrenia. The aim of the present study was to investigate the visuo-spatial component of working memory in schizophrenia, and more precisely to what extent dynamic visuo-spatial information processing is impaired in schizophrenia patients. For this purpose we used a computerized paradigm in which 29 patients with schizophrenia (DSM-IV, Diagnostic Interview for Genetic Studies) and 29 age- and sex-matched control subjects (DIGS) had to memorize the trajectory of a plane moving across the computer screen and to identify the observed trajectory among 9 plots presented together. Each trajectory could be seen a maximum of 3 times if needed. The results showed no difference between schizophrenia patients and controls in the number of correct trajectories identified after the first presentation. However, when we determined the mean number of correct trajectories over the 3 trials, we observed that schizophrenia patients performed significantly worse than controls (Mann-Whitney, p = 0.002). These findings suggest that, although schizophrenia patients are able to memorize some dynamic trajectories as well as controls, they do not profit from the repetition of the trajectory presentation. These findings are congruent with the hypothesis that schizophrenia could induce an imbalance between local and global information processing: the patients may be able to focus on details of the trajectory, which could allow them to find the right target (bottom-up processes), but may have difficulty referring to previous experience in order to filter incoming information (top-down processes) and enhance their visuo-spatial working memory abilities.
Abstract:
Portland cement concrete (PCC) pavement undergoes repeated environmental load-related deflection resulting from temperature and moisture variations across the pavement depth. This has been recognized as causing PCC pavement curling and warping since the mid-1920s. Slab curvature can be further magnified under repeated traffic loads and may ultimately lead to fatigue failures, including top-down and bottom-up transverse, longitudinal, and corner cracking. It is therefore important to measure the “true” degree of curling and warping in PCC pavements, not only for quality control (QC) and quality assurance (QA) purposes, but also for a better understanding of its relationship to long-term pavement performance. Although several approaches and devices—including linear variable differential transducers (LVDTs), digital indicators, and some profilers—have been proposed for measuring curling and warping, their application in the field is limited by cost, inconvenience, and complexity of operation. This research therefore explores the development of an economical and simple device for measuring curling and warping in concrete pavements with accuracy comparable to or better than existing methodologies. Technical requirements were identified to establish assessment criteria for development, and field tests were conducted to modify the device for further enhancement. The finalized device is about 12 inches in height and 18 pounds in weight, and its manufacturing cost is just $320. Detailed development procedures and evaluation results for the new curling and warping measuring device are presented and discussed, with a focus on achieving reliable curling and warping measurements in a cost-effective manner.
Abstract:
In bottom-up proteomics, rapid and efficient protein digestion is crucial for data reliability. However, sample preparation remains one of the rate-limiting steps in proteomics workflows. In this study, we compared the conventional trypsin digestion procedure with two accelerated digestion protocols, based on shorter reaction times and on microwave-assisted digestion, for the preparation of membrane-enriched protein fractions of the human pathogenic bacterium Staphylococcus aureus. The produced peptides were analyzed by Shotgun IPG-IEF, a methodology relying on separation of peptides by IPG-IEF before the conventional LC-MS/MS steps of shotgun proteomics. Data obtained on two LC-MS/MS platforms showed that the accelerated digestion protocols, especially the one relying on microwave irradiation, enhanced the cleavage specificity of trypsin and thus improved digestion efficiency, especially for hydrophobic and membrane proteins. The combination of high-throughput proteomics with accelerated and efficient sample preparation should enhance the practicability of proteomics by reducing the time from sample collection to obtaining results.
Abstract:
Energy demand is an important constraint on neural signaling. Several methods have been proposed to assess the energy budget of the brain based on a bottom-up approach, in which the energy demand of individual biophysical processes is first estimated independently and then summed to compute the brain's total energy budget. Here, we address this question using a novel approach that makes use of published datasets reporting average cerebral glucose and oxygen utilization in humans and rodents during different activation states. Our approach allows us (1) to decipher neuron-glia compartmentalization in energy metabolism and (2) to compute a precise state-dependent energy budget for the brain. Under the assumption that the fraction of energy used for signaling is proportional to the cycling of neurotransmitters, we find that in the activated state most of the energy (approximately 80%) is oxidatively produced and consumed by neurons to support neuron-to-neuron signaling. Glial cells, while contributing only a small fraction of energy production (approximately 6%), actually take up a significant fraction of glucose (50% or more) from the blood and provide neurons with glucose-derived energy substrates. Our results suggest that glycolysis occurs to a significant extent in astrocytes, whereas most of the oxygen is utilized in neurons. As a consequence, a transfer of glucose-derived metabolites from glial cells to neurons has to take place. Furthermore, we find that the amplitude of this transfer is correlated with (1) the activity level of the brain: the larger the activity, the more metabolites are shuttled from glia to neurons; and (2) the oxidative activity in astrocytes: with higher glial pyruvate metabolism, fewer metabolites are shuttled from glia to neurons. While some of the details of a bottom-up biophysical approach have to be simplified, our method allows for a straightforward assessment of the brain's energy budget from macroscopic measurements with minimal underlying assumptions.
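The macroscopic bookkeeping described above can be illustrated with a simple stoichiometric calculation. The sketch below uses placeholder whole-brain rates (not the datasets analysed in the study) and textbook yields: glycolysis nets about 2 ATP per glucose, while complete oxidation of one glucose consumes 6 O2 and yields roughly 30 ATP.

```python
# Illustrative top-down energy bookkeeping from whole-brain metabolic rates.
# The input rates are placeholder values, not the published datasets used
# in the study; yields follow textbook stoichiometry.
CMR_GLC = 0.30   # glucose utilization, umol / g / min (placeholder)
CMR_O2 = 1.60    # oxygen utilization,  umol / g / min (placeholder)

glucose_oxidized = CMR_O2 / 6.0                  # glucose fully oxidized (6 O2 each)
glucose_to_lactate = CMR_GLC - glucose_oxidized  # glucose not fully oxidized

atp_oxidative = glucose_oxidized * 30.0    # ~30 ATP per fully oxidized glucose
atp_glycolytic = glucose_to_lactate * 2.0  # ~2 ATP per glucose ending as lactate
atp_total = atp_oxidative + atp_glycolytic

ogi = CMR_O2 / CMR_GLC  # oxygen-glucose index; 6 would mean fully oxidative

print(f"OGI = {ogi:.2f}")
print(f"oxidative ATP  : {atp_oxidative:.2f} umol/g/min "
      f"({100 * atp_oxidative / atp_total:.0f}% of total)")
print(f"glycolytic ATP : {atp_glycolytic:.2f} umol/g/min")
```

With these placeholder rates the oxidative pathway dominates ATP production, consistent with the statement above that most of the energy is oxidatively produced.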
Abstract:
OBJECTIVE: Critically ill patients are at high risk of malnutrition. Insufficient nutritional support remains a widespread problem despite guidelines. The aim of this study was to measure the clinical impact of a two-step interdisciplinary quality nutrition program. DESIGN: Prospective interventional study over three periods (A, baseline; B and C, intervention periods). SETTING: Mixed intensive care unit within a university hospital. PATIENTS: Five hundred seventy-two patients (age 59 ± 17 yrs) requiring >72 hrs of intensive care unit treatment. INTERVENTION: Two-step quality program: 1) bottom-up implementation of a feeding guideline; and 2) additional presence of an intensive care unit dietitian. The nutrition protocol was based on the European guidelines. MEASUREMENTS AND MAIN RESULTS: Anthropometric data, intensive care unit severity scores, energy delivery, cumulated energy balance (daily, day 7, and discharge), feeding route (enteral, parenteral, combined, none-oral), length of intensive care unit and hospital stay, and mortality were collected. Altogether 5800 intensive care unit days were analyzed. Patients in period A were healthier, with lower Simplified Acute Physiology Score values and a lower proportion of "rapidly fatal" McCabe scores. Energy delivery and balance increased gradually: the impact was particularly marked on the cumulated energy deficit on day 7, which improved from -5870 kcal to -3950 kcal (p < .001). Feeding technique changed significantly, with a progressive increase in days with nutrition therapy (A: 59% of days, B: 69%, C: 71%, p < .001); use of enteral nutrition increased from A to B (stable in C), and days on combined and parenteral nutrition increased progressively. Oral energy intakes were low (mean: 385 kcal/day, 6 kcal/kg/day). Hospital mortality increased with severity of condition in periods B and C. CONCLUSION: A bottom-up protocol improved nutritional support. The presence of the intensive care unit dietitian provided significant additional progress, which was related to early introduction and route of feeding and achieved an overall better early energy balance.
Abstract:
The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that emphasizes assigning meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by applying it to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as challenges and implications of repurposing tags for the extraction of domain-level concepts.
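The abstract describes the multi-phase approach only at a high level. As a minimal sketch of what one early phase might look like, the Python snippet below applies simple frequency and co-occurrence heuristics to a toy repository of tagged resources; the data, thresholds and scoring are illustrative assumptions and not the authors' actual heuristics.

```python
# Minimal sketch: surfacing candidate domain-level concepts from user tags.
# The repository, thresholds and scoring are illustrative assumptions,
# not the multi-phase heuristics described in the paper.
from collections import Counter
from itertools import combinations

# Toy tag repository: each resource carries a set of user-assigned tags.
tagged_resources = [
    {"python", "programming", "tutorial"},
    {"python", "web", "framework", "django"},
    {"programming", "java", "tutorial"},
    {"web", "css", "design"},
    {"python", "data", "pandas"},
]

MIN_TAG_FREQ = 2       # a tag must label at least this many resources
MIN_COOCCURRENCE = 2   # a tag pair must co-occur at least this often

tag_freq = Counter(tag for tags in tagged_resources for tag in tags)
pair_freq = Counter(
    pair for tags in tagged_resources for pair in combinations(sorted(tags), 2)
)

# Phase 1: frequent tags become candidate domain-level concepts.
candidate_concepts = {t for t, n in tag_freq.items() if n >= MIN_TAG_FREQ}

# Phase 2: frequently co-occurring candidates suggest related concepts.
related = [
    (a, b, n) for (a, b), n in pair_freq.items()
    if n >= MIN_COOCCURRENCE and a in candidate_concepts and b in candidate_concepts
]

print("candidate concepts:", sorted(candidate_concepts))
print("related concept pairs:", related)
```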
Abstract:
1. Biogeographical models of species' distributions are essential tools for assessing impacts of changing environmental conditions on natural communities and ecosystems. Practitioners need more reliable predictions to integrate into conservation planning (e.g. reserve design and management). 2. Most models still largely ignore or inappropriately take into account important features of species' distributions, such as spatial autocorrelation, dispersal and migration, biotic and environmental interactions. Whether distributions of natural communities or ecosystems are better modelled by assembling individual species' predictions in a bottom-up approach or modelled as collective entities is another important issue. An international workshop was organized to address these issues. 3. We discuss more specifically six issues in a methodological framework for generalized regression: (i) links with ecological theory; (ii) optimal use of existing data and artificially generated data; (iii) incorporating spatial context; (iv) integrating ecological and environmental interactions; (v) assessing prediction errors and uncertainties; and (vi) predicting distributions of communities or collective properties of biodiversity. 4. Synthesis and applications. Better predictions of the effects of impacts on biological communities and ecosystems can emerge only from more robust species' distribution models and better documentation of the uncertainty associated with these models. An improved understanding of causes of species' distributions, especially at their range limits, as well as of ecological assembly rules and ecosystem functioning, is necessary if further progress is to be made. A better collaborative effort between theoretical and functional ecologists, ecological modellers and statisticians is required to reach these goals.
Abstract:
Further genetic gains in wheat yield are required to match expected increases in demand. This may require the identification of physiological attributes able to produce such improvement, as well as the genetic bases controlling those traits, in order to facilitate their manipulation. In the present paper, a theoretical framework of source and sink limitation to wheat yield is presented, and the fine-tuning of crop development as an alternative for increasing yield potential is discussed. Following a top-down approach, most crop physiologists have agreed that the main attribute explaining past genetic gains in yield was harvest index (HI). By virtue of that previous success, no further gains may be expected in HI, and an alternative must be found. Using a bottom-up approach, the present paper first provides evidence of the generalized sink-limited condition of grain growth, establishing that, for further increases in yield potential, sink strength during grain filling has to be increased. The focus should be on further increasing grain number per m², through fine-tuning pre-anthesis developmental patterns. The phase of rapid spike growth (RSGP) is critical for grain number determination, and increasing spike growth during pre-anthesis would result in an increased number of grains. This might be achieved by lengthening the duration of the phase (though without altering flowering time), as there is genotypic variation in the proportion of pre-anthesis time elapsed either before or after the onset of the stem elongation phase. Photoperiod sensitivity during the RSGP could then be used as a genetic tool to further increase grain number, since slower development results in smoother floret development, so that more floret primordia reach the fertile floret stage and are able to produce a grain. Far less progress has been achieved on the genetic control of this attribute. None of the well-known major Ppd alleles seems to be consistently responsible for RSGP sensitivity. Alternatives for identifying the genetic factors responsible for this sensitivity (e.g. quantitative trait locus (QTL) identification in mapping populations) are being considered.
Abstract:
Environmental sounds are highly complex stimuli whose recognition depends on the interaction of top-down and bottom-up processes in the brain. Their semantic representations have been shown to yield repetition suppression effects, i.e. a decrease in activity during exposure to a sound that is perceived as belonging to the same source as a preceding sound. Making use of the high spatial resolution of 7T fMRI, we investigated the representations of sound objects within early-stage auditory areas on the supratemporal plane. The primary auditory cortex was identified by means of tonotopic mapping, and the non-primary areas by comparison with previous histological studies. Repeated presentations of different exemplars of the same sound source, as compared to the presentation of different sound sources, yielded significant repetition suppression effects within a subset of early-stage areas. This effect was found within the right hemisphere in primary areas A1 and R as well as in two non-primary areas on the antero-medial part of the planum temporale, and within the left hemisphere in A1 and a non-primary area on the medial part of Heschl's gyrus. Thus several, but not all, early-stage auditory areas encode the meaning of environmental sounds.
Abstract:
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practice in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and, ultimately, to formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and the participants' reflexivity.
Abstract:
The aim of this Master's thesis is to define and provide guidelines for creating a unified, harmonized management system in an international construction products company. In this work, a management system refers to the whole formed by the ISO 9001 quality system and the ISO 14001 environmental system. As a starting point, the company has several decentralized management systems. The purpose is to provide a development proposal for implementing a unified overall solution at a sensible scale. The thesis examines what kind of management system would best suit the company's operations at present and in the near future. The study was carried out using internal and external benchmarking as well as management system alternatives created within the company. The internal benchmarking was conducted in cooperation with two internal units, and the external benchmarking involved four external partners. The management system alternatives created within the company address different certification solutions, and the alternatives are evaluated with a SWOT analysis together with the results obtained from the external benchmarking. The thesis also addresses the document management of the management system and its further development. Based on the results, certification of the management system would in the future best be carried out by business group. The electronic document management of the management system should be made more efficient, and the management system should have an interface to the company's enterprise resource planning system. The transferability of management system documents across organizational levels improves with a unified electronic system, and the use of the company's official language could be extended in the management system documentation. The structural development of the system would be carried out through development measures proceeding bottom-up in the organization, while taking group-level guidance into account.