11 results for Maximizing

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

10.00%

Publisher:

Abstract:

Degeneration of the intervertebral disc, sometimes associated with low back pain and abnormal spinal motions, represents a major health issue with high costs. A non-invasive assessment of degeneration via qualitative or quantitative MRI (magnetic resonance imaging) is possible, yet no relation between the mechanical properties and T2 maps of the intervertebral disc (IVD) has been established, even though T2 relaxation time values quantify the degree of degeneration. Therefore, MRI scans and mechanical tests were performed on 14 human lumbar intervertebral segments freed from posterior elements and all soft tissues excluding the IVD. Degeneration was evaluated in each specimen using morphological criteria, qualitative T2-weighted images and quantitative axial T2 map data, and stiffness was calculated from the load-deflection curves of in vitro compression, torsion, lateral bending and flexion/extension tests. In addition to mean T2, the Otsu threshold of T2 (TOTSU), computed with a robust and automatic histogram-based method that finds the optimal threshold maximizing the separation of two classes of values, was calculated for the anterior, posterior, left and right regions of each annulus fibrosus (AF). While mean T2 and degeneration grading schemes were not related to the IVDs' mechanical properties, TOTSU computed in the posterior AF correlated significantly with those classifications as well as with all stiffness values. TOTSU should therefore be included in future degeneration grading schemes.
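To make the thresholding step concrete, the sketch below applies Otsu's method to a set of T2 values. It is a minimal illustration on assumed synthetic data; the variable names (t2_values, n_bins) and the example values are placeholders, not quantities from the study.

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, bin_edges = np.histogram(values, bins=n_bins)
    bin_centers = (bin_edges[:-1] + bin_edges[1:]) / 2.0
    weights = hist / hist.sum()

    # Class probabilities and means for every candidate split point.
    w0 = np.cumsum(weights)            # probability mass below the threshold
    w1 = 1.0 - w0                      # probability mass above the threshold
    cum_mean = np.cumsum(weights * bin_centers)
    mu_total = cum_mean[-1]
    mu0 = cum_mean / np.where(w0 > 0, w0, 1.0)
    mu1 = (mu_total - cum_mean) / np.where(w1 > 0, w1, 1.0)

    # Between-class variance; Otsu's threshold maximizes it.
    sigma_b2 = w0 * w1 * (mu0 - mu1) ** 2
    return bin_centers[np.argmax(sigma_b2)]

# Example: synthetic T2 values (ms) mimicking two tissue classes within an AF region.
rng = np.random.default_rng(0)
t2_values = np.concatenate([rng.normal(45, 8, 500), rng.normal(90, 15, 300)])
print(f"T_OTSU = {otsu_threshold(t2_values):.1f} ms")
```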

Relevance:

10.00%

Publisher:

Abstract:

New direct-acting antivirals (DAAs) that inhibit hepatitis C virus (HCV) replication are increasingly used for the treatment of chronic hepatitis C. Marked pharmacokinetic variability and a high potential for drug-drug interactions between DAAs and numerous drug classes have been identified. In addition, ribavirin (RBV), commonly associated with hemolytic anemia, often requires dose adjustment, advocating for therapeutic drug monitoring (TDM) in patients under combined antiviral therapy. However, an assay for the simultaneous analysis of RBV and DAAs constitutes an analytical challenge because of the large differences in polarity among these drugs, ranging from hydrophilic (RBV) to highly lipophilic (telaprevir [TVR]). Moreover, TVR is characterized by erratic behavior on standard octadecyl-based reversed-phase column chromatography and must be separated from VRT-127394, its inactive C-21 epimer metabolite. We have developed a convenient assay employing simple plasma protein precipitation followed by high-performance liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS) for the simultaneous determination of plasma levels of RBV, boceprevir, and TVR, as well as its metabolite VRT-127394. This new, simple, rapid, and robust HPLC-MS/MS assay offers an efficient means of real-time TDM aimed at maximizing efficacy while minimizing the toxicity of antiviral therapy.

Relevance:

10.00%

Publisher:

Abstract:

A major objective in ecology is to find general patterns and to establish the rules and underlying mechanisms that generate those patterns. Nevertheless, most of our current insights in ecology are based on case studies of one or a few species, whereas multi-species experimental studies remain rare. We underline the power of the multi-species experimental approach for addressing general ecological questions, e.g., on species' environmental responses or on patterns of among- and within-species variation. We present simulations showing that the accuracy of estimates of between-group differences is increased by maximizing the number of species rather than the number of populations or individuals per species. Thus, the more species a multi-species experiment includes, the more powerful it is. In addition, we discuss some inevitable methodological challenges of multi-species experiments. While we acknowledge the value of single- or few-species experiments, we strongly advocate the use of multi-species experiments for addressing ecological questions at a more general level.
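The simulation logic can be illustrated with a minimal sketch (not the authors' code): assuming that among-species variation in the treatment response dominates within-species variation, spreading a fixed total sample size over more species yields a more accurate estimate of the mean between-group difference. All parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
TRUE_EFFECT = 1.0     # true mean treatment effect across all species
SPECIES_SD = 1.0      # among-species variation in the treatment response
RESIDUAL_SD = 0.5     # within-species (individual-level) variation

def rmse_of_estimate(n_species, n_individuals, n_sims=2000):
    """Root-mean-square error of the estimated mean treatment effect."""
    errors = np.empty(n_sims)
    for i in range(n_sims):
        # Each species responds to the treatment with its own random effect size.
        species_response = rng.normal(TRUE_EFFECT, SPECIES_SD, n_species)
        control = rng.normal(0.0, RESIDUAL_SD, (n_species, n_individuals))
        treated = species_response[:, None] + rng.normal(0.0, RESIDUAL_SD,
                                                         (n_species, n_individuals))
        errors[i] = (treated - control).mean() - TRUE_EFFECT
    return np.sqrt(np.mean(errors ** 2))

# Same total sample size (100 individuals per group), allocated differently:
print("  4 species x 25 individuals:", round(rmse_of_estimate(4, 25), 3))
print(" 25 species x  4 individuals:", round(rmse_of_estimate(25, 4), 3))
```

With these assumed variances, the many-species design roughly halves the error of the estimated between-group difference, because the species-level term dominates and shrinks with the number of species rather than with the number of individuals.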

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work was to clarify the mechanism taking place in field-enhanced sample injection coupled to sweeping and micellar EKC (FESI-Sweep-MEKC), using two acidic high-conductivity buffers (HCBs), phosphoric acid or sodium phosphate buffer, with a view to maximizing sensitivity enhancements. Using cationic model compounds in acidic media, a chemometric approach and simulations with SIMUL5 were implemented. Experimental design first enabled identification of the significant factors and their potential interactions. Simulations demonstrated the formation of moving boundaries during sample injection, which originate at the initial sample/HCB and HCB/buffer discontinuities and gradually change the compositions of the HCB and the background electrolyte (BGE). With sodium phosphate buffer, the HCB conductivity increased during the injection, leading to more efficient preconcentration by stacking (about 1.6 times) than with phosphoric acid alone, for which the conductivity decreased during injection. For the same injection time at constant voltage, however, a lower amount of analytes was injected with sodium phosphate buffer than with phosphoric acid, so sensitivity enhancements were lower for the whole FESI-Sweep-MEKC process. In order to maximize sensitivity enhancements, it is therefore proposed to work with sodium phosphate buffer as the HCB and to apply a constant current during sample injection.

Relevance:

10.00%

Publisher:

Abstract:

RATIONALE People often face decisions that pit self-interested behavior aimed at maximizing personal reward against normative behavior such as acting cooperatively, which benefits others. The threat of social sanctions for defying the fairness norm prevents people from behaving overly selfishly. Thus, normative behavior is influenced both by seeking rewards and by avoiding punishment. However, the neurochemical processes mediating the impact of these influences remain unknown. Several lines of evidence link the dopaminergic system to reward processing and to punishment processing, but this evidence stems from studies in non-social contexts. OBJECTIVES The present study investigates dopaminergic drug effects on individuals' reward seeking and punishment avoidance in social interaction. METHODS Two hundred and one healthy male participants were randomly assigned to receive 300 mg of L-3,4-dihydroxyphenylalanine (L-DOPA) or a placebo before playing an economic bargaining game. This game involved two conditions: one in which unfair behavior could be punished and one in which it could not. RESULTS In the absence of punishment threats, L-DOPA administration led to more selfish behavior, likely mediated through an increase in reward seeking. In contrast, L-DOPA administration had no significant effect on behavior when participants faced punishment threats. CONCLUSIONS The results of this study extend the role of the dopaminergic system in reward seeking to human social interactions. They show that even a single dose of a dopaminergic drug may bring selfish behavior to the fore, which in turn may shed new light on potential causal relationships between the dopaminergic system and norm-abiding behavior in certain clinical subpopulations.

Relevance:

10.00%

Publisher:

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimates. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. In addition, the dimensionality reduction performed by FPCA provides a diagnostic of the quality of the error model, assessing both the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
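A minimal sketch of the idea follows, using synthetic curves in place of geostatistical realizations and standard PCA as a stand-in for FPCA on discretized responses: a regression is learned from proxy-curve scores to exact-curve scores on the learning set and then used to predict the exact response of new realizations from the proxy alone. All names, data, and model choices below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_learn, n_new, n_times = 50, 200, 100
t = np.linspace(0.0, 1.0, n_times)

# Synthetic "exact" responses and systematically biased "proxy" responses.
params = rng.uniform(0.5, 2.0, (n_learn + n_new, 1))
exact = np.exp(-params * t) + 0.02 * rng.standard_normal((n_learn + n_new, n_times))
proxy = 0.9 * np.exp(-1.1 * params * t)   # proxy carries a known bias

# Learning set: realizations for which both solvers are run.
pca_proxy = PCA(n_components=3).fit(proxy[:n_learn])
pca_exact = PCA(n_components=3).fit(exact[:n_learn])
scores_proxy = pca_proxy.transform(proxy[:n_learn])
scores_exact = pca_exact.transform(exact[:n_learn])

# Error model: map proxy-space scores to exact-space scores.
error_model = LinearRegression().fit(scores_proxy, scores_exact)

# Predict the exact response of new realizations from the proxy alone.
pred_scores = error_model.predict(pca_proxy.transform(proxy[n_learn:]))
pred_exact = pca_exact.inverse_transform(pred_scores)
rmse = np.sqrt(np.mean((pred_exact - exact[n_learn:]) ** 2))
print(f"RMSE of predicted exact curves: {rmse:.3f}")
```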

Relevance:

10.00%

Publisher:

Abstract:

Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from instantiating service VMs in the correct order with an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all of their applications are met. They are also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure provider is interested in optimally provisioning the virtual resources onto the available physical infrastructure so that operational costs are minimized while the performance of tenants' applications is maximized. Motivated by the complexities associated with managing and scaling distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
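As a hypothetical illustration of the kind of SLA-driven scaling rule the thesis describes, the sketch below adjusts the VM count of a service tier based on a monitored response time compared against an SLA-defined guarantee. The class name, thresholds, and limits are assumptions for illustration, not the thesis' actual rule format.

```python
from dataclasses import dataclass

@dataclass
class ScalingRule:
    sla_response_ms: float        # SLA-defined response-time guarantee
    scale_out_ratio: float = 0.9  # scale out when load nears the SLA limit
    scale_in_ratio: float = 0.5   # scale in when comfortably below it
    min_vms: int = 1
    max_vms: int = 10

    def decide(self, current_vms: int, observed_response_ms: float) -> int:
        """Return the new VM count for the service tier."""
        if observed_response_ms > self.scale_out_ratio * self.sla_response_ms:
            return min(current_vms + 1, self.max_vms)   # approaching SLA violation
        if observed_response_ms < self.scale_in_ratio * self.sla_response_ms:
            return max(current_vms - 1, self.min_vms)   # over-provisioned
        return current_vms

rule = ScalingRule(sla_response_ms=200.0)
print(rule.decide(current_vms=3, observed_response_ms=190.0))  # -> 4 (scale out)
print(rule.decide(current_vms=3, observed_response_ms=80.0))   # -> 2 (scale in)
```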

Relevance:

10.00%

Publisher:

Abstract:

Microbial functions in host physiology are a result of microbiota-host co-evolution. We show that cold exposure leads to a marked shift in the microbiota composition, referred to as the cold microbiota. Transplantation of the cold microbiota to germ-free mice is sufficient to increase insulin sensitivity of the host and enable tolerance to cold, partly by promoting white fat browning, leading to increased energy expenditure and fat loss. During prolonged cold, however, body weight loss is attenuated by adaptive mechanisms that maximize caloric uptake and increase intestinal, villus, and microvillus lengths. This increased absorptive surface is transferable with the cold microbiota, leading to altered intestinal gene expression that promotes tissue remodeling and suppression of apoptosis, an effect diminished by co-transplanting the most cold-downregulated strain, Akkermansia muciniphila, during the cold microbiota transfer. Our results demonstrate that the microbiota is a key factor orchestrating overall energy homeostasis during periods of increased demand.

Relevance:

10.00%

Publisher:

Abstract:

The lifespan of plants ranges from a few weeks in annuals to thousands of years in trees. It is hard to explain such extreme longevity considering that DNA replication errors inevitably cause mutations. Without purging through meiotic recombination, the accumulation of somatic mutations will eventually result in mutational meltdown, a phenomenon known as Muller’s ratchet. Nevertheless, the lifespan of trees is limited more often by incidental disease or structural damage than by genetic aging. The key determinants of tree architecture are the axillary meristems, which form in the axils of leaves and grow out to form branches. The number of branches is low in annual plants, but in perennial plants iterative branching can result in thousands of terminal branches. Here, we use stem cell ablation and quantitative cell-lineage analysis to show that axillary meristems are set aside early, analogous to the metazoan germline. While neighboring cells divide vigorously, axillary meristem precursors maintain a quiescent state, with only 7–9 cell divisions occurring between the apical and axillary meristem. During iterative branching, the number of branches increases exponentially, while the number of cell divisions increases linearly. Moreover, computational modeling shows that stem cell arrangement and positioning of axillary meristems distribute somatic mutations around the main shoot, preventing their fixation and maximizing genetic heterogeneity. These features slow down Muller’s ratchet and thereby extend lifespan.
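A toy calculation (not the authors' model) illustrates the contrast: if each branching round roughly doubles the number of branches while adding only the reported 7-9 cell divisions to any single lineage, branch number grows exponentially while the divisions, and hence the mutational load, accumulated along any one lineage grow only linearly. The doubling per round and the fixed division count are simplifying assumptions.

```python
DIVISIONS_PER_ROUND = 8   # within the reported 7-9 divisions per branching round

for rounds in range(1, 6):
    branches = 2 ** rounds                      # exponential growth in branch number
    divisions = DIVISIONS_PER_ROUND * rounds    # linear growth along any one lineage
    print(f"round {rounds}: {branches:3d} branches, "
          f"{divisions:2d} divisions in any single lineage")
```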