13 results for Bulk parameter approach

at Université de Lausanne, Switzerland


Relevance:

40.00%

Publisher:

Abstract:

In the context of Systems Biology, computer simulations of gene regulatory networks provide a powerful tool to validate hypotheses and to explore possible system behaviors. Nevertheless, modeling a system poses challenges of its own: in particular, model calibration is often difficult due to insufficient data. When considering developmental systems, for example, mostly qualitative data describing the developmental trajectory are available, while common calibration techniques rely on high-resolution quantitative data. Focusing on the calibration of differential equation models for developmental systems, this study investigates different approaches to utilizing the available data to overcome these difficulties. More specifically, the fact that developmental processes are hierarchically organized is exploited to increase convergence rates of the calibration process and to save computation time. Using a gene regulatory network model for stem cell homeostasis in Arabidopsis thaliana, the performance of the investigated approaches is evaluated, documenting considerable gains provided by the proposed hierarchical approach.
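As a rough illustration of the hierarchical idea only, the sketch below calibrates a toy two-gene ODE model in two stages: the upstream parameters are fitted first against the upstream trajectory, then frozen while the downstream parameters are fitted. The model, data, and parameter split are invented for illustration and are not those of the Arabidopsis study.

```python
# Hypothetical sketch of hierarchical calibration: upstream parameters are
# fitted first, then frozen while downstream parameters are fitted.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def grn_rhs(t, x, k):
    # Toy two-gene cascade: k = (production, decay, coupling, decay2)
    return [k[0] - k[1] * x[0],
            k[2] * x[0] - k[3] * x[1]]

def simulate(k, t_eval):
    sol = solve_ivp(grn_rhs, (0.0, t_eval[-1]), [0.0, 0.0],
                    args=(k,), t_eval=t_eval)
    return sol.y

# Coarse "qualitative" targets: expression levels at a few time points.
t_obs = np.array([1.0, 2.0, 4.0, 8.0])
obs = np.array([[0.6, 0.9, 1.0, 1.0],    # gene 1 trajectory
                [0.1, 0.3, 0.7, 0.9]])   # gene 2 trajectory

def cost(k_free, k_fixed, fixed_mask, genes):
    # Least-squares misfit over the selected genes, with part of the
    # parameter vector held fixed.
    k = np.empty(4)
    k[fixed_mask] = k_fixed
    k[~fixed_mask] = k_free
    sim = simulate(k, t_obs)
    return np.sum((sim[genes] - obs[genes]) ** 2)

# Stage 1: fit the upstream parameters (k0, k1) against gene 1 only.
mask1 = np.array([False, False, True, True])   # k2, k3 held at a guess
res1 = minimize(cost, x0=[1.0, 1.0], args=([1.0, 1.0], mask1, [0]))

# Stage 2: freeze (k0, k1) at the stage-1 fit and calibrate (k2, k3).
mask2 = np.array([True, True, False, False])
res2 = minimize(cost, x0=[1.0, 1.0], args=(list(res1.x), mask2, [1]))
print("stage 1 fit:", res1.x, "stage 2 fit:", res2.x)
```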

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
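The two-step logic can be sketched in a deliberately simplified 1D setting: a sequential Gaussian simulation downscales the coarse geophysical estimate (used here as a locally varying mean, a common simplification) conditioned on sparse borehole data, and a petrophysical relation, assumed linear with residual scatter, translates the result into hydraulic conductivity. All grids, covariance parameters, and the petrophysical relation below are illustrative assumptions, not the study's actual configuration.

```python
# Simplified 1D sketch: (1) downscale a coarse geophysical field by
# conditional sequential Gaussian simulation, (2) translate into log K.
import numpy as np

rng = np.random.default_rng(0)
x_fine = np.linspace(0.0, 100.0, 101)            # fine grid [m]

# Coarse, spatially exhaustive ERT-derived estimate, interpolated to the
# fine grid and used as a locally varying mean (standardized units).
ec_coarse = np.interp(x_fine, [0.0, 50.0, 100.0], [0.4, -0.6, 0.9])

def exp_cov(h, sill=1.0, rang=15.0):
    # Exponential covariance model for the fine-scale residuals (assumed).
    return sill * np.exp(-h / rang)

def sequential_gaussian_sim(x, x_cond, z_cond):
    """Visit nodes in random order; krige from everything simulated or
    conditioned so far and draw from the local Gaussian distribution."""
    xs, zs = list(x_cond), list(z_cond)
    z = np.full(len(x), np.nan)
    for i in rng.permutation(len(x)):
        d = np.abs(np.subtract.outer(np.asarray(xs), np.asarray(xs)))
        C = exp_cov(d) + 1e-8 * np.eye(len(xs))
        c = exp_cov(np.abs(np.asarray(xs) - x[i]))
        w = np.linalg.solve(C, c)
        mu = w @ np.asarray(zs)
        var = max(exp_cov(0.0) - w @ c, 1e-10)
        z[i] = rng.normal(mu, np.sqrt(var))
        xs.append(x[i])
        zs.append(z[i])
    return z

# Step 1: downscale. Conditioning data = residuals of fine-scale
# electrical-conductivity logs at three "boreholes".
x_bh, ec_bh = [10.0, 50.0, 90.0], [0.5, -0.8, 1.1]
resid_bh = [ec - np.interp(xb, x_fine, ec_coarse)
            for xb, ec in zip(x_bh, ec_bh)]
ec_fine = ec_coarse + sequential_gaussian_sim(x_fine, x_bh, resid_bh)

# Step 2: petrophysical translation calibrated on collocated data
# (assumed linear relation in log space with residual scatter).
a, b, sigma_res = -0.9, -4.0, 0.3
logK = a * ec_fine + b + rng.normal(0.0, sigma_res, ec_fine.size)
print("simulated log10(K) range: %.2f to %.2f" % (logK.min(), logK.max()))
```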

Relevance:

30.00%

Publisher:

Abstract:

The fatty acids from cocoa butters of different origins, varieties, and suppliers and from a number of cocoa butter equivalents (Illexao 30-61, Illexao 30-71, Illexao 30-96, Choclin, Coberine, Chocosine-Illipe, Chocosine-Shea, Shokao, Akomax, Akonord, and Ertina) were investigated by bulk stable carbon isotope analysis and compound-specific isotope analysis. The interpretation is based on principal component analysis combining the fatty acid concentrations with the bulk and molecular isotopic data. The scatterplot of the first two principal components allowed detection of the addition of vegetable fats to cocoa butters. Enrichment in the heavy carbon isotope (C-13) of the bulk cocoa butter and of the individual fatty acids is related to mixing with other vegetable fats and possibly to thermally or oxidatively induced degradation during processing (e.g., drying and roasting of the cocoa beans or deodorization of the pressed fat) or storage. The feasibility of the analytical approach for authenticity assessment is discussed.
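A minimal sketch of the chemometric step might look as follows: fatty acid concentrations and bulk and compound-specific delta-13C values are combined into one standardized feature matrix and projected onto the first two principal components. All numbers below are invented placeholders, not data from the study.

```python
# Illustrative PCA on a combined concentration + isotope feature matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = samples (cocoa butters and equivalents); columns =
# [%C16:0, %C18:0, %C18:1, bulk d13C, d13C of C16:0, d13C of C18:0]
X = np.array([
    [26.1, 35.4, 34.2, -31.2, -33.0, -32.1],   # genuine cocoa butter
    [25.8, 36.0, 33.9, -31.5, -33.2, -32.4],   # genuine cocoa butter
    [30.5, 40.2, 25.1, -29.0, -30.5, -29.8],   # cocoa butter equivalent
    [24.9, 34.8, 35.0, -30.1, -31.8, -30.9],   # suspected blend
])

# Standardize so concentrations and isotope ratios weigh comparably,
# then inspect the scores on the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for label, (pc1, pc2) in zip(["CB", "CB", "CBE", "blend?"], scores):
    print(f"{label:7s} PC1={pc1:+.2f} PC2={pc2:+.2f}")
```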

Relevance:

30.00%

Publisher:

Abstract:

The paper proposes an approach aimed at detecting optimal combinations of model parameters to achieve the most representative description of uncertainty in model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVM is particularly well suited to classification problems in high-dimensional spaces, which it handles in a non-parametric and non-linear way. The SVM decision boundaries determine the regions subject to the largest uncertainty in the cost function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to combinations of parameter values.
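The following hedged sketch illustrates the workflow with a cheap stand-in for the forward simulator: an SVM separates good-fitting from bad-fitting parameter vectors, and new candidates are evaluated preferentially where the classifier is least certain, i.e., near the decision boundary. The cost function, threshold, and sampling choices are toy assumptions.

```python
# Sketch: SVM-guided exploration of a 2D parameter space.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def cost(theta):
    # Cheap surrogate for an expensive forward flow simulation.
    return np.sum((theta - 0.3) ** 2, axis=-1)

# Initial design: run the "forward model" on a small random sample and
# label each parameter vector as good-fitting (1) or not (0).
theta = rng.uniform(0.0, 1.0, size=(60, 2))
good = (cost(theta) < 0.05).astype(int)

for it in range(3):
    svm = SVC(kernel="rbf", probability=True).fit(theta, good)
    # Propose many candidates, keep those closest to the decision boundary
    # (predicted probability near 0.5), and run the model only on them.
    cand = rng.uniform(0.0, 1.0, size=(500, 2))
    p = svm.predict_proba(cand)[:, 1]
    pick = cand[np.argsort(np.abs(p - 0.5))[:20]]
    theta = np.vstack([theta, pick])
    good = np.append(good, (cost(pick) < 0.05).astype(int))

print("total model runs:", theta.shape[0],
      "| good-fitting models found:", int(good.sum()))
```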

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: HIV-1 RNA viral load is a key parameter for reliable treatment monitoring of HIV-1 infection. Accurate HIV-1 RNA quantitation can be impaired by primer and probe sequence polymorphisms resulting from the tremendous genetic diversity and ongoing evolution of HIV-1. A novel dual HIV-1 target amplification approach was realized in the quantitative COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0 (HIV-1 TaqMan test v2.0) to cope with the high genetic diversity of the virus. OBJECTIVES AND STUDY DESIGN: The performance of the new assay was evaluated for sensitivity, dynamic range, precision, subtype inclusivity, diagnostic and analytical specificity, interfering substances, and correlation with its predecessor, the COBAS AmpliPrep/COBAS TaqMan HIV-1 test (HIV-1 TaqMan test v1.0), in patient specimens. RESULTS: The new assay demonstrated a sensitivity of 20 copies/mL and a linear measuring range of 20-10,000,000 copies/mL, with a lower limit of quantitation of 20 copies/mL. HIV-1 Group M subtypes and HIV-1 Group O were quantified within +/-0.3 log(10) of the assigned titers. Specificity was 100% in 660 tested specimens; no cross-reactivity was found for 15 pathogens, nor any interference from endogenous substances or 29 drugs. Good comparability with the predecessor assay was demonstrated in 82 positive patient samples. Among selected clinical samples, 35 of 66 specimens were underquantitated by the predecessor assay; all were quantitated correctly by the new assay. CONCLUSIONS: The dual-target approach of the HIV-1 TaqMan test v2.0 enables superior HIV-1 Group M subtype coverage, including HIV-1 Group O detection. Correct quantitation of specimens underquantitated by the HIV-1 TaqMan test v1.0 was demonstrated.

Relevance:

30.00%

Publisher:

Abstract:

The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based solely on two GB/SA parameters, was tested on two different external sets of molecules. On the Martel druglike test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to that of six well-established empirical methods. On the 17-drug test set, our model outperformed all compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach, together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and three-dimensional molecular graphics capability, lays the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
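Only the model form is sketched below, on synthetic data: a multiple linear regression from two implicit-solvent descriptors to log Po/w, checked by five-fold cross-validation. The actual iLOGP descriptors, coefficients, and training set are not reproduced here.

```python
# Sketch of a two-descriptor multiple linear model for log Po/w, with
# five-fold cross-validation as in the described internal validation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 200
# Columns: e.g. a GB polar solvation term and an SA nonpolar term
# (synthetic stand-ins for the two GB/SA parameters).
X = np.column_stack([rng.normal(-10.0, 3.0, n), rng.normal(300.0, 60.0, n)])
logp = 0.15 * X[:, 0] + 0.012 * X[:, 1] + rng.normal(0.0, 0.4, n)

model = LinearRegression().fit(X, logp)
r2_cv = cross_val_score(model, X, logp, cv=5, scoring="r2")
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("5-fold CV r2: %.2f +/- %.2f" % (r2_cv.mean(), r2_cv.std()))
```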

Relevance:

30.00%

Publisher:

Abstract:

Normal ageing is associated with characteristic changes in brain microstructure. Although in vivo neuroimaging captures spatial and temporal patterns of age-related changes of anatomy at the macroscopic scale, our knowledge of the underlying (patho)physiological processes at cellular and molecular levels is still limited. The aim of this study is to explore brain tissue properties in normal ageing using quantitative magnetic resonance imaging (MRI) alongside conventional morphological assessment. Using a whole-brain approach in a cohort of 26 adults, aged 18-85 years, we performed voxel-based morphometric (VBM) analysis and voxel-based quantification (VBQ) of diffusion tensor, magnetization transfer (MT), R1, and R2* relaxation parameters. We found age-related reductions in cortical and subcortical grey matter volume paralleled by changes in fractional anisotropy (FA), mean diffusivity (MD), MT and R2*. The latter were regionally specific depending on their differential sensitivity to microscopic tissue properties. VBQ of white matter revealed distinct anatomical patterns of age-related change in microstructure. Widespread and profound reduction in MT contrasted with local FA decreases paralleled by MD increases. R1 reductions and R2* increases were observed to a smaller extent in overlapping occipito-parietal white matter regions. We interpret our findings, based on current biophysical models, as a fingerprint of age-dependent brain atrophy and underlying microstructural changes in myelin, iron deposits and water. The VBQ approach we present allows for systematic unbiased exploration of the interaction between imaging parameters and extends current methods for detection of neurodegenerative processes in the brain. The demonstrated parameter-specific distribution patterns offer insights into age-related brain structure changes in vivo and provide essential baseline data for studying disease against a background of healthy ageing.
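A conceptual sketch of the voxel-wise statistics behind a VBQ-style analysis is given below, on synthetic data: at every voxel a linear model of a quantitative parameter (here a stand-in for MT) is regressed against age and the age slope is tested. Real VBQ operates on spatially normalized, smoothed parameter maps within SPM; shapes, values, and the effect built into the data are toy assumptions.

```python
# Toy voxel-wise regression of a quantitative MRI parameter against age.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_subj, shape = 26, (4, 4, 4)                  # tiny toy "brain"
age = rng.uniform(18.0, 85.0, n_subj)

# Synthetic MT-like maps with an age-related decline in one region.
mt = rng.normal(1.0, 0.05, (n_subj,) + shape)
mt[:, :2, :, :] -= 0.003 * (age - age.mean())[:, None, None, None]

X = np.column_stack([np.ones(n_subj), age])    # design: intercept + age
Y = mt.reshape(n_subj, -1)                     # subjects x voxels
beta, res, *_ = np.linalg.lstsq(X, Y, rcond=None)

dof = n_subj - 2
sigma2 = res / dof                             # per-voxel error variance
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_map = (beta[1] / se).reshape(shape)          # t-statistic of the age slope
p_map = 2 * stats.t.sf(np.abs(t_map), dof)
print("voxels with p < 0.001:", int((p_map < 0.001).sum()))
```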

Relevance:

30.00%

Publisher:

Abstract:

The Organization of the Thesis

The remainder of the thesis comprises five chapters and a conclusion. The next chapter formalizes the envisioned theory into a tractable model. Section 2.2 presents a formal description of the model economy: the individual heterogeneity, the individual objective, the UI setting, the population dynamics, and the equilibrium. The welfare and efficiency criteria for qualifying various equilibrium outcomes are proposed in section 2.3. The fourth section shows how the model-generated information can be computed.

Chapter 3 transposes the model from chapter 2 into conditions that enable its use in the analysis of individual labor market strategies and their implications for the labor market equilibrium. In section 3.2 the Swiss labor market data sets, stylized facts, and the UI system are presented. The third section outlines and motivates the parameterization method. In section 3.4 the model's replication ability is evaluated and some aspects of the parameter choice are discussed. Numerical solution issues can be found in the appendix.

Chapter 4 examines the determinants of search-strategic behavior in the model economy and its implications for the labor market aggregates. In section 4.2, the unemployment duration distribution is examined and related to search strategies. Section 4.3 shows how search-strategic behavior is influenced by UI eligibility, and section 4.4 how it is determined by individual heterogeneity. The composition effects generated by search strategies in labor market aggregates are examined in section 4.5. The last section evaluates the model's replication of the empirical unemployment escape frequencies reported in Sheldon [67].

Chapter 5 applies the model economy to examine the effects on the labor market equilibrium of shocks to the labor market risk structure, to the deep underlying labor market structure, and to the UI setting. Section 5.2 examines the effects of the labor market risk structure on the labor market equilibrium and labor market strategic behavior. The effects of alterations in the deep economic structural parameters of the labor market, i.e. individual preferences and production technology, are shown in section 5.3. Finally, the impacts of the UI setting on the labor market are studied in section 5.4. This section also evaluates the role of UI authority monitoring and the differences in the way changes in the replacement rate and the UI benefit duration affect the labor market.

In chapter 6 the model economy is applied in counterfactual experiments to assess several aspects of the Swiss labor market movements in the nineties. Section 6.2 examines the two equilibria characterizing the Swiss labor market in the nineties: the "growth" equilibrium with a "moderate" UI regime and the "recession" equilibrium with a more "generous" UI. Section 6.3 evaluates the isolated effects of the structural shocks, while the isolated effects of the UI reforms are analyzed in section 6.4. Particular dimensions of the UI reforms, namely the duration, replacement rate, and tax rate effects, are studied in section 6.5, while labor market equilibria without benefits are evaluated in section 6.6. In section 6.7 the structural and institutional interactions that may act as unemployment amplifiers are discussed in view of the obtained results. A welfare analysis based on individual welfare in different structural and UI settings is presented in the eighth section. Finally, the results are related to the more favorable unemployment trends after 1997.

The conclusion evaluates the features embodied in the model economy with respect to the resulting model dynamics, in order to derive lessons from the model design. The thesis ends by proposing guidelines for future improvements of the model and directions for further research.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Suction-based wound healing devices with open-pore foam interfaces are widely used to treat complex tissue defects. The impact of changes in the physicochemical parameters of the wound interface has not been investigated. METHODS: Full-thickness wounds in diabetic mice were treated with an occlusive dressing or a suction device with a polyurethane foam interface varying in mean pore diameter. Wound surface deformation on day 2 was measured on fixed tissues. Histologic cross-sections were analyzed for granulation tissue thickness (hematoxylin and eosin), myofibroblast density (α-smooth muscle actin), blood vessel density (platelet endothelial cell adhesion molecule-1), and cell proliferation (Ki67) on day 7. RESULTS: Foam-induced wound surface deformation increased with pore diameter: 15 percent (small pores), 60 percent (medium pores), and 150 percent (large pores). The extent of wound strain correlated with granulation tissue thickness, which increased 1.7-fold in wounds treated with small-pore foam, 2.5-fold with medium-pore foam, and 4.9-fold with large-pore foam (p < 0.05) compared with wounds treated with an occlusive dressing. All polyurethane foams increased the number of myofibroblasts over the occlusive dressing, with the maximal presence in large-pore foam-treated wounds compared with all other groups (p < 0.05). CONCLUSIONS: The pore size of the interface material of suction devices has a significant impact on the wound healing response. Larger pores increased wound surface strain, tissue growth, and the transformation of contractile cells. Modifying pore size is a powerful approach for meeting the biological needs of specific wounds.

Relevance:

30.00%

Publisher:

Abstract:

This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses, which are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in the blood of car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. In a second part, the model is used to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
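A minimal sketch of the decision step, under strong simplifying assumptions (replicate measurements with known Gaussian analytical error and a conjugate Gaussian prior on the true concentration), computes the posterior probability that the true THC level exceeds the 1.5 μg/l threshold. All numbers are invented; this is not the article's actual model.

```python
# Conjugate Normal-Normal posterior for a true concentration, and the
# posterior probability of exceeding a legal threshold.
import numpy as np
from scipy import stats

measurements = np.array([1.62, 1.48, 1.71])   # ug/L, replicate analyses
sigma = 0.15                                  # known analytical SD (assumed)
mu0, tau0 = 1.0, 2.0                          # vague Normal prior (assumed)

n = measurements.size
# Posterior precision = data precision + prior precision.
tau_post = 1.0 / np.sqrt(n / sigma**2 + 1.0 / tau0**2)
mu_post = tau_post**2 * (measurements.sum() / sigma**2 + mu0 / tau0**2)

# Probability that the true concentration exceeds the 1.5 ug/L threshold.
p_over = stats.norm.sf(1.5, loc=mu_post, scale=tau_post)
print(f"posterior mean {mu_post:.3f} ug/L, "
      f"P(concentration > 1.5 ug/L) = {p_over:.3f}")
```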

Relevance:

30.00%

Publisher:

Abstract:

Plasticity in cancer stem-like cells (CSC) may provide a key basis for cancer heterogeneity and therapeutic response. In this study, we assessed the effect of combining a drug that abrogates CSC properties with standard-of-care therapy in a Ewing sarcoma family tumor (ESFT). Emergence of CSC in this setting has been shown to arise from a defect in TARBP2-dependent microRNA maturation, which can be corrected by exposure to the fluoroquinolone enoxacin. In the present work, primary ESFT from four patients containing CD133(+) CSC subpopulations ranging from 3% to 17% of total tumor cells were subjected to treatment with enoxacin, doxorubicin, or both drugs. Primary ESFT CSC and bulk tumor cells displayed divergent responses to standard-of-care chemotherapy and enoxacin. Doxorubicin, which targets the tumor bulk, displayed toxicity toward primary adherent ESFT cells in culture but not to CSC-enriched ESFT spheres. Conversely, enoxacin, which enhances miRNA maturation by stimulating TARBP2 function, induced apoptosis but only in ESFT spheres. In combination, the two drugs markedly depleted CSCs and strongly reduced primary ESFTs in xenograft assays. Our results identify a potentially attractive therapeutic strategy for ESFT that combines mechanism-based targeting of CSC using a low-toxicity antibiotic with a standard-of-care cytotoxic drug, offering immediate applications for clinical evaluation.

Relevance:

30.00%

Publisher:

Abstract:

Many drug exposures occur unintentionally at the beginning of pregnancy. Conversely, continuing drug treatment may be necessary in women who wish to become pregnant. In these situations risk evaluation has to be carried out in a precise and differentiated manner, taking into account both the risk to the fetus and maternal health. Teratovigilance services can provide thorough information that helps avoid unwarranted treatment discontinuations or pregnancy terminations. In return, physicians' follow-up reports on the outcomes of pregnancies exposed to one or several therapeutic agents increase the body of knowledge available to health professionals and pregnant women.

Relevance:

30.00%

Publisher:

Abstract:

We present a new global method for the identification of hotspots in conservation and ecology. The method is based on the identification of spatial structure properties through cumulative relative frequency distribution curves, and is tested with two case studies: the identification of fish density hotspots and of terrestrial vertebrate species diversity hotspots. Results from the frequency distribution method are compared with those from standard techniques among local, partially local, and global methods. The main advantage of our approach is that it is independent of the selection of any threshold, neighborhood, or other parameter that affects most currently available methods for hotspot analysis. The two case studies show how such elements of arbitrariness in the traditional methods influence both the size and location of the identified hotspots, and how this new global method can be used for a more objective selection of hotspots.
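One plausible reading of the curve-based criterion, sketched below on synthetic data, is to sort the cell values, build the cumulative relative frequency curve on normalized axes, and flag as hotspots the cells beyond the point where the curve's slope drops to 45 degrees, so that no user-chosen threshold enters. This is an interpretation of the described approach, not the authors' code.

```python
# Threshold-free hotspot selection from a cumulative relative frequency curve.
import numpy as np

rng = np.random.default_rng(5)
density = rng.lognormal(mean=0.0, sigma=1.0, size=1000)  # e.g. fish density

vals = np.sort(density)
x = vals / vals.max()                         # normalized value axis
y = np.arange(1, vals.size + 1) / vals.size   # cumulative relative frequency

slope = np.gradient(y, x)                     # local slope of the curve
steep = np.where(slope > 1.0)[0]              # stretch steeper than 45 degrees
knee = min(steep[-1] + 1, vals.size - 1)      # first point past that stretch
threshold = vals[knee]

hotspots = density > threshold
print(f"derived threshold = {threshold:.2f}; "
      f"{hotspots.sum()} of {density.size} cells flagged as hotspots")
```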