921 results for Bottom-up processes


Relevance: 80.00%

Abstract:

A simple holographic model is presented and analyzed that describes chiral symmetry breaking and the physics of the meson sector in QCD. This is a bottom-up model that incorporates string theory ingredients like tachyon condensation, which is expected to be the main manifestation of chiral symmetry breaking in the holographic context. As a model for the glue sector, the Kuperstein-Sonnenschein background is used. The structure of the flavor vacuum is analyzed in the quenched approximation. Chiral symmetry breaking is shown at zero temperature. Above the deconfinement transition chiral symmetry is restored. A complete holographic renormalization is performed and the chiral condensate is calculated for different quark masses, both at zero and non-zero temperatures. The 0++, 0-+, 1++, 1-- meson trajectories are analyzed and their masses and decay constants are computed. The asymptotic trajectories are linear. The model has one phenomenological parameter beyond those of QCD that affects the 1++ and 0-+ sectors. Fitting this parameter, we obtain very good agreement with data. The model improves in several ways on the popular hard-wall and soft-wall bottom-up models.
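For orientation, "asymptotically linear trajectories" refers to the Regge-like behaviour of the squared masses of highly excited mesons; stated as a standard fact about such bottom-up models (not a formula quoted from the abstract), the contrast with the hard-wall model is roughly:

```latex
m_n^2 \simeq c\, n \quad (n \to \infty) \quad \text{(linear, soft-wall-like trajectories)}
\qquad \text{vs.} \qquad
m_n^2 \simeq c'\, n^2 \quad \text{(hard-wall models)}
```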

Relevance: 80.00%

Abstract:

In the vast majority of bottom-up proteomics studies, protein digestion is performed using only mammalian trypsin. Although it is clearly the best enzyme available, the sole use of trypsin rarely leads to complete sequence coverage, even for abundant proteins. It is commonly assumed that this is because many tryptic peptides are either too short or too long to be identified by RPLC-MS/MS. We show through in silico analysis that 20-30% of the total sequence of three proteomes (Schizosaccharomyces pombe, Saccharomyces cerevisiae, and Homo sapiens) is expected to be covered by Large post-Trypsin Peptides (LpTPs) with Mr above 3000 Da. We then established a size exclusion chromatography protocol to fractionate complex yeast tryptic digests into pools of peptides based on size. We found that secondary digestion of LpTPs followed by LC-MS/MS analysis leads to a significant increase in identified proteins and a 32-50% relative increase in average sequence coverage compared to trypsin digestion alone. Application of the developed strategy to the phosphoproteomes of S. pombe and of a human cell line identified a significant fraction of novel phosphosites. Overall, our data indicate that specific targeting of LpTPs can complement standard bottom-up workflows to reveal a largely neglected portion of the proteome.
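A minimal sketch of the kind of in silico analysis described above: digest a sequence with trypsin's canonical rule (cleave after K or R, not before P) and tally how much of the sequence falls in peptides above 3 kDa. The 110 Da average residue mass and the toy sequence are illustrative assumptions, not values from the paper.

```python
import re

AVG_RESIDUE_MASS = 110.0  # rough average amino-acid residue mass in Da (assumption)

def tryptic_peptides(sequence):
    """In silico trypsin digest: cleave after K or R, except when followed by P."""
    return [p for p in re.split(r'(?<=[KR])(?!P)', sequence) if p]

def fraction_in_large_peptides(sequence, mass_cutoff_da=3000.0):
    """Fraction of residues falling in peptides heavier than mass_cutoff_da."""
    peptides = tryptic_peptides(sequence)
    large = sum(len(p) for p in peptides if len(p) * AVG_RESIDUE_MASS > mass_cutoff_da)
    return large / len(sequence)

# Toy sequence, purely for illustration (not a real proteome entry):
seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
print(f"{fraction_in_large_peptides(seq):.0%} of residues lie in peptides > 3 kDa")
```

A real analysis would use exact residue masses rather than a flat average, but the rule of thumb is enough to show how LpTP coverage of a proteome can be estimated.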

Relevance: 80.00%

Abstract:

We present an agent-based model with the aim of studying how macro-level dynamics of spatial distances among interacting individuals in a closed space emerge from micro-level dyadic and local interactions. Our agents moved on a lattice (referred to as a room) using a model implemented in a computer program called P-Space in order to minimize their dissatisfaction, defined as a function of the discrepancy between the real distance and the ideal, or desired, distance between agents. Ideal distances evolved in accordance with each agent's personal and social space, which changed throughout the dynamics of the interactions among the agents. In the first set of simulations we studied the effects of the parameters of the function that generated ideal distances, and in a second set we explored how group macro-level behavior depended on model parameters and other variables. We learned that certain parameter values yielded consistent patterns in the agents' personal and social spaces, which in turn led to avoidance and approach behaviors in the agents. We also found that the spatial behavior of the group of agents as a whole was influenced by the values of the model parameters, as well as by other variables such as the number of agents. Our work demonstrates that the bottom-up approach is a useful way of explaining macro-level spatial behavior. The proposed model is also shown to be a powerful tool for simulating the spatial behavior of groups of interacting individuals.
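A minimal sketch of the dissatisfaction-driven movement rule described above: dissatisfaction is taken as a function of the gap between real and ideal inter-agent distances, and each agent steps to the neighbouring lattice cell that reduces it. The quadratic form and the greedy update are illustrative assumptions, not the actual P-Space implementation.

```python
import itertools

def dissatisfaction(pos, others, ideal):
    """Sum of squared gaps between real and ideal distances (assumed quadratic form)."""
    return sum((((pos[0] - q[0]) ** 2 + (pos[1] - q[1]) ** 2) ** 0.5 - ideal[j]) ** 2
               for j, q in enumerate(others))

def step(pos, others, ideal, size=20):
    """Move to the neighbouring (or current) lattice cell that minimizes dissatisfaction."""
    moves = [(pos[0] + dx, pos[1] + dy)
             for dx, dy in itertools.product((-1, 0, 1), repeat=2)]
    moves = [(x, y) for x, y in moves if 0 <= x < size and 0 <= y < size]
    return min(moves, key=lambda p: dissatisfaction(p, others, ideal))

# Toy run: one agent adjusting to two stationary agents with different ideal distances.
agent, others, ideal = (0, 0), [(10, 10), (15, 2)], [3.0, 8.0]
for _ in range(20):
    agent = step(agent, others, ideal)
print("final position:", agent)
```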

Relevance: 80.00%

Abstract:

BACKGROUND: In 2005, findings of the first "cost of disorders of the brain in Europe" study of the European Brain Council (EBC) showed that these costs place a substantial economic burden on Swiss society. In 2010 an improved update with a broader range of disorders was analysed. This report shows the new findings for Switzerland and discusses the changes. METHODS: Data are derived from the EBC 2010 census study that estimates the 12-month prevalence of 12 groups of disorders of the brain and calculates costs (direct health-care costs, direct non-medical costs and indirect costs) by combining top-down and bottom-up cost approaches using existing data. RESULTS: The most frequent disorder was headache (2.3 million persons). Anxiety disorders were found in 1 million persons and sleep disorders in 700,000 persons. Annual costs for all assessed disorders total 14.5 billion EUR, corresponding to about 1,900 EUR per inhabitant per year. Mood disorders, psychotic disorders and dementias (approx. 2 billion EUR each) were the most costly. Costs per person were highest for neurological/neurosurgery-relevant disorders, e.g. neuromuscular disorders, brain tumour and multiple sclerosis (from about 38,000 down to 24,000 EUR). CONCLUSION: The estimates of the EBC 2010 study for Switzerland provide a basis for health care planning. The increase in size and costs compared with 2005 is mostly due to the inclusion of new disorders (e.g., sleep disorders), the re-definition of others (e.g., headache) and an increase in younger cohorts. We suggest research and preventive measures coordinated between governmental bodies, private health-care providers and pharmaceutical companies.
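As a quick consistency check on the per-inhabitant figure (taking the Swiss population around 2010 as roughly 7.8 million, an outside assumption not stated in the abstract):

```latex
\frac{14.5 \times 10^{9}\ \text{EUR}}{\approx 7.8 \times 10^{6}\ \text{inhabitants}} \approx 1{,}860\ \text{EUR per inhabitant per year}
```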

Relevance: 80.00%

Abstract:

Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins, where multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is utilized for routine high-throughput protein identification and quantitation from complex biological samples. The proteins are first enzymatically digested into small (usually less than ca. 3 kDa) peptides, which are then identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, the proteins are digested into long (3-15 kDa) peptides via restricted proteolysis, followed by MS/MS analysis of the resulting digest. With advances in high-resolution MS and allied techniques, routine implementation of the middle-down approach has become possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.
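The mass ranges quoted above map onto the three workflows roughly as in the trivial sketch below; the cut-offs are the approximate values from the abstract, and in practice the choice of workflow depends on far more than analyte mass.

```python
def workflow_for_mass(analyte_mass_da):
    """Roughly map an analyte's mass onto the three workflows described above."""
    if analyte_mass_da < 3_000:        # small tryptic peptides
        return "bottom-up"
    if analyte_mass_da <= 15_000:      # long peptides from restricted proteolysis
        return "middle-down"
    return "top-down"                  # intact proteins

for mass in (1_200, 8_000, 42_000):
    print(mass, "Da ->", workflow_for_mass(mass))
```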

Relevance: 80.00%

Abstract:

Portland cement concrete (PCC) pavement undergoes repeated environmental load-related deflection resulting from temperature and moisture variations across the pavement depth. This has been recognized as causing PCC pavement curling and warping since the mid-1920s. Slab curvature can be further magnified under repeated traffic loads and may ultimately lead to fatigue failures, including top-down and bottom-up transverse, longitudinal, and corner cracking. It is therefore important to measure the "true" degree of curling and warping in PCC pavements, not only for quality control (QC) and quality assurance (QA) purposes, but also for a better understanding of its relationship to long-term pavement performance. Although several approaches and devices—including linear variable differential transducers (LVDTs), digital indicators, and some profilers—have been proposed for measuring curling and warping, their application in the field is subject to cost, inconvenience, and complexity of operation. This research therefore explores the development of an economical and simple device for measuring curling and warping in concrete pavements with accuracy comparable to or better than existing methodologies. Technical requirements were identified to establish assessment criteria for development, and field tests were conducted to modify and further enhance the device. The finalized device is about 12 inches in height and 18 pounds in weight, and its manufacturing cost is just $320. Detailed development procedures and evaluation results for the new curling and warping measuring device are presented and discussed, with a focus on achieving reliable curling and warping measurements in a cost-effective manner.

Relevance: 80.00%

Abstract:

In bottom-up proteomics, rapid and efficient protein digestion is crucial for data reliability. However, sample preparation remains one of the rate-limiting steps in proteomics workflows. In this study, we compared the conventional trypsin digestion procedure with two accelerated digestion protocols, based on shorter reaction times and on microwave-assisted digestion, for the preparation of membrane-enriched protein fractions of the human pathogenic bacterium Staphylococcus aureus. The peptides produced were analyzed by Shotgun IPG-IEF, a methodology relying on separation of peptides by IPG-IEF before the conventional LC-MS/MS steps of shotgun proteomics. Data obtained on two LC-MS/MS platforms showed that the accelerated digestion protocols, especially the one relying on microwave irradiation, enhanced the cleavage specificity of trypsin and thus improved digestion efficiency, particularly for hydrophobic and membrane proteins. The combination of high-throughput proteomics with accelerated and efficient sample preparation should enhance the practicability of proteomics by reducing the time from sample collection to results.

Relevance: 80.00%

Abstract:

OBJECTIVE: Critically ill patients are at high risk of malnutrition. Insufficient nutritional support remains a widespread problem despite guidelines. The aim of this study was to measure the clinical impact of a two-step interdisciplinary quality nutrition program. DESIGN: Prospective interventional study over three periods (A, baseline; B and C, intervention periods). SETTING: Mixed intensive care unit within a university hospital. PATIENTS: Five hundred seventy-two patients (age 59 ± 17 yrs) requiring >72 hrs of intensive care unit treatment. INTERVENTION: Two-step quality program: 1) bottom-up implementation of a feeding guideline; and 2) additional presence of an intensive care unit dietitian. The nutrition protocol was based on the European guidelines. MEASUREMENTS AND MAIN RESULTS: Anthropometric data, intensive care unit severity scores, energy delivery, cumulated energy balance (daily, day 7, and discharge), feeding route (enteral, parenteral, combined, none-oral), length of intensive care unit and hospital stay, and mortality were collected. Altogether 5800 intensive care unit days were analyzed. Patients in period A were healthier, with lower Simplified Acute Physiology Score values and a lower proportion of "rapidly fatal" McCabe scores. Energy delivery and balance increased gradually: the impact was particularly marked on the cumulated energy deficit on day 7, which improved from -5870 kcal to -3950 kcal (p < .001). Feeding technique changed significantly, with a progressive increase in days with nutrition therapy (A: 59% of days, B: 69%, C: 71%, p < .001); use of enteral nutrition increased from A to B (stable in C), and days on combined and parenteral nutrition increased progressively. Oral energy intakes were low (mean: 385 kcal/day, 6 kcal/kg/day). Hospital mortality increased with severity of condition in periods B and C. CONCLUSION: A bottom-up protocol improved nutritional support. The presence of the intensive care unit dietitian provided significant additional progression, which was related to early introduction and route of feeding and which achieved an overall better early energy balance.
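Expressed per day (simple division of the day-7 figures quoted above, assuming the deficit accrues evenly over the first week):

```latex
\frac{-5870\ \text{kcal}}{7\ \text{days}} \approx -840\ \text{kcal/day}
\qquad \longrightarrow \qquad
\frac{-3950\ \text{kcal}}{7\ \text{days}} \approx -565\ \text{kcal/day}
```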

Relevance: 80.00%

Abstract:

The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that emphasizes assigning meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by applying it to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as challenges and implications of repurposing tags for the extraction of domain-level concepts.
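A minimal sketch of one common way to mine domain-level concepts from a tag repository (frequency and co-occurrence filtering); this is a generic illustration, not the specific multi-phase heuristics the paper proposes.

```python
from collections import Counter
from itertools import combinations

def candidate_concepts(tagged_items, min_tag_freq=5, min_pair_freq=3):
    """tagged_items: one set of user-assigned tags per bookmarked resource."""
    tag_freq = Counter(t for tags in tagged_items for t in tags)
    frequent = {t for t, c in tag_freq.items() if c >= min_tag_freq}
    pair_freq = Counter()
    for tags in tagged_items:
        pair_freq.update(combinations(sorted(t for t in tags if t in frequent), 2))
    # Frequent tags become candidate domain concepts; frequent pairs hint at related concepts.
    related = [pair for pair, c in pair_freq.items() if c >= min_pair_freq]
    return sorted(frequent), related

items = [{"python", "tutorial", "web"}, {"python", "web", "flask"}, {"python", "tutorial"}]
print(candidate_concepts(items, min_tag_freq=2, min_pair_freq=2))
```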

Relevance: 80.00%

Abstract:

1. Biogeographical models of species' distributions are essential tools for assessing impacts of changing environmental conditions on natural communities and ecosystems. Practitioners need more reliable predictions to integrate into conservation planning (e.g. reserve design and management). 2. Most models still largely ignore or inappropriately take into account important features of species' distributions, such as spatial autocorrelation, dispersal and migration, biotic and environmental interactions. Whether distributions of natural communities or ecosystems are better modelled by assembling individual species' predictions in a bottom-up approach or modelled as collective entities is another important issue. An international workshop was organized to address these issues. 3. We discuss more specifically six issues in a methodological framework for generalized regression: (i) links with ecological theory; (ii) optimal use of existing data and artificially generated data; (iii) incorporating spatial context; (iv) integrating ecological and environmental interactions; (v) assessing prediction errors and uncertainties; and (vi) predicting distributions of communities or collective properties of biodiversity. 4. Synthesis and applications. Better predictions of the effects of impacts on biological communities and ecosystems can emerge only from more robust species' distribution models and better documentation of the uncertainty associated with these models. An improved understanding of causes of species' distributions, especially at their range limits, as well as of ecological assembly rules and ecosystem functioning, is necessary if further progress is to be made. A better collaborative effort between theoretical and functional ecologists, ecological modellers and statisticians is required to reach these goals.
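As a concrete, minimal instance of the "generalized regression" framework referred to above, a single-species presence/absence model might look like the following (logistic regression on synthetic data; the covariates and data are made up purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic environmental covariates at 500 sites (e.g. standardized temperature and rainfall).
X = rng.normal(size=(500, 2))
# Synthetic presence/absence generated from a known logistic response to those covariates.
p = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * X[:, 0] + 0.8 * X[:, 1])))
y = rng.binomial(1, p)

# Generalized regression (a GLM with logit link) relating species occurrence to environment.
model = LogisticRegression().fit(X, y)

# Predicted probability of presence at a new site.
print(model.predict_proba([[1.0, -0.5]])[0, 1])
```

Stacking many such single-species predictions is the bottom-up community assembly option the abstract contrasts with modelling communities as collective entities.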

Relevance: 80.00%

Abstract:

Further genetic gains in wheat yield are required to match expected increases in demand. This may require the identification of physiological attributes able to produce such improvement, as well as the genetic bases controlling those traits in order to facilitate their manipulation. In the present paper, a theoretical framework of source and sink limitation to wheat yield is presented and the fine-tuning of crop development as an alternative for increasing yield potential is discussed. Following a top-down approach, most crop physiologists have agreed that the main attribute explaining past genetic gains in yield was harvest index (HI). By virtue of previous success, no further gains may be expected in HI and an alternative must be found. Using a bottom-up approach, the present paper firstly provides evidence on the generalized sink-limited condition of grain growth, determining that for further increases in yield potential, sink strength during grain filling has to be increased. The focus should be on further increasing grain number per m², through fine-tuning pre-anthesis developmental patterns. The phase of the rapid spike growth period (RSGP) is critical for grain number determination, and increasing spike growth during pre-anthesis would result in an increased number of grains. This might be achieved by lengthening the duration of the phase (though without altering flowering time), as there is genotypic variation in the proportion of pre-anthesis time elapsed either before or after the onset of the stem elongation phase. Photoperiod sensitivity during RSGP could then be used as a genetic tool to further increase grain number, since slower development results in smoother floret development, so that more floret primordia reach the fertile floret stage and are able to produce a grain. Far less progress has been achieved on the genetic control of this attribute. None of the well-known major Ppd alleles seems to be consistently responsible for RSGP sensitivity. Alternatives for identifying the genetic factors responsible for this sensitivity (e.g. quantitative trait locus (QTL) identification in mapping populations) are being considered.
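The source/sink argument above rests on the standard agronomic decomposition of yield (a textbook identity, not a formula from the paper):

```latex
\text{Yield} = \text{grain number m}^{-2} \times \text{average grain weight},
\qquad
\text{Yield} = \text{above-ground biomass} \times \text{HI}
```

With HI near its practical ceiling and grain growth generally sink-limited, the remaining lever in this identity is grain number per m², which is set largely during the rapid spike growth period.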

Relevance: 80.00%

Abstract:

This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity.

Relevance: 80.00%

Abstract:

The objective of my thesis is to assess mechanisms of ecological community control in macroalgal communities in the Baltic Sea. In the top-down model, predatory fish feed on invertebrate mesograzers, partly releasing algae from grazing pressure. Such a reciprocal relationship is called a trophic cascade. In the bottom-up model, nutrients increase biomass in the food chain. The nutrients are first assimilated by algae and, via the food chain, also increase the abundance of grazers and predators. Previous studies on oceanic shores have described these two regulative mechanisms in the grazer-alga link, but how they interact in the trophic cascades from fish to algae is still inadequately known. Because the top-down and bottom-up mechanisms are predicted to depend on environmental disturbances, such as wave stress and light, I studied these models at two distinct water depths. The thesis is based on five factorial field experiments, all conducted in the Finnish Archipelago Sea. In all the experiments, I studied macroalgal colonization - either density, filament length or biomass - on submerged colonization substrates. By excluding predatory fish and mesograzers from the algal communities, the studies compared the strength of top-down control against natural algal communities. In addition, part of the experimental units were exposed to enriched nitrogen and phosphorus concentrations, which enabled testing of bottom-up control. These two models of community control were further investigated in shallow (<1 m) and deep (ca. 3 m) water. Moreover, the control mechanisms were also expected to depend on grazer species; therefore, different grazer species were enclosed in experimental units and their impacts on macroalgal communities were followed specifically. Community control on Baltic rocky shores was found to follow theoretical predictions that had not previously been confirmed by field studies. Predatory fish limited grazing impact, which was seen as denser algal communities and longer algal filaments. Nutrient enrichment increased the density and filament length of annual algae and thus changed the species composition of the algal community. The perennial alga Fucus vesiculosus and the red alga Ceramium tenuicorne suffered from the increased nutrient availability. The enriched nutrient conditions led to a denser grazer fauna, thereby causing strong top-down control over both the annual and perennial macroalgae. The strength of the top-down control seemed to depend on the density and diversity of grazers and predators as well as on the species composition of the macroalgal assemblages. Nutrient enrichment, however, weakened the limiting impact of predatory fish on the grazer fauna, because fish stocks did not respond to the enhanced resources as quickly as the invertebrate fauna. According to the environmental stress model, environmental disturbances weaken top-down control. For example, on a wave-exposed shore, wave stress affects animals close to the surface more than those deeper on the shore. Mesograzers were efficient consumers at both depths, while predation by fish was weaker in shallow water. Thus, the results supported the environmental stress model, which predicts that environmental disturbance has a stronger effect the higher a species is in the food chain. This thesis assessed the mechanisms of community control in three-level food chains and did not take into account higher predators. Such predators in the Baltic Sea include, for example, cormorants, seals, the white-tailed sea eagle, cod and salmon. All of these predators were until recently, or still are, subject to intensive fishing, hunting and persecution, and their stocks have only recently increased in the region. Therefore, it is possible that future densities of top predators may yet alter the strengths of the controlling mechanisms in the Baltic littoral zone.

Relevância:

80.00% 80.00%

Publicador:

Resumo:

The objective of this research was to study the role of key individuals in facilitating technology-enabled bottom-up innovation in a large-organization context. The development of innovation was followed from the point of view of the individual actor (key individual) in two cases, through three levels: individual, team and organization, using knowledge creation and innovation models. The study provides a theoretical synthesis and the framework through which the analysis is driven. The results indicate that in bottom-up initiated innovations the role of key individuals is still crucial, but that innovation today is a collective effort in which several entrepreneurial key individuals act: the innovator, the user champion and the organizational sponsor, whose collaboration and developing interaction drive the innovation further. Teamwork is functional and fluent, but it meets great problems in its interaction with the organization. Large organizations should develop their practices and their ability to react to emerging bottom-up initiatives, in order to embed innovation in the organization and gain sustainable innovation. In addition, bottom-up initiated innovations are demonstrations of people's knowing and tacit knowledge, and are therefore a means of renewing the organization.