964 results for Data Migration Processes Modeling
Abstract:
MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to over-representation of noise in GRNs and hence to a lack of correspondence with biological observations. RESULTS: In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. AVAILABILITY: Algorithms are made available under our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
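To make the SIN/SIF distinction concrete, here is a minimal Python sketch on a toy three-gene Boolean network; the network, the probabilities and the exact reading of SIF noise (an update function that would change a gene's value fails with some probability) are illustrative assumptions, not the GenYsis implementation.

```python
import random

# Toy three-gene Boolean network (illustrative only; not the T-helper or T-cell
# activation networks from the paper).  Each rule maps the current state to the
# gene's next Boolean value.
RULES = {
    "a": lambda s: s["c"],                  # a is activated by c
    "b": lambda s: s["a"] and not s["c"],   # b is activated by a and repressed by c
    "c": lambda s: not s["b"],              # c is repressed by b
}

def step_sin(state, p):
    """SIN-style noise: synchronous update, then each gene flips independently with probability p."""
    nxt = {g: f(state) for g, f in RULES.items()}
    return {g: (not v) if random.random() < p else v for g, v in nxt.items()}

def step_sif(state, p):
    """SIF-style noise (one common reading): an update function that would change a
    gene's value fails with probability p, so the gene keeps its previous value;
    genes whose regulators dictate no change are left untouched."""
    nxt = {}
    for g, f in RULES.items():
        out = f(state)
        if out != state[g] and random.random() < p:
            out = state[g]
        nxt[g] = out
    return nxt

# Compare how often each noise model perturbs a fixed point of the network.
state = {"a": True, "b": False, "c": True}
flips_sin = sum(step_sin(state, 0.05) != state for _ in range(1000))
flips_sif = sum(step_sif(state, 0.05) != state for _ in range(1000))
print(f"perturbed steady states out of 1000: SIN={flips_sin}, SIF={flips_sif}")
```

At a fixed point the SIF-style rule introduces no spurious transitions, while the SIN-style rule keeps perturbing the state, which is the intuition behind the over-representation of noise mentioned above.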
Abstract:
Aim: The aim of the study was to investigate the influence of dietary intake of a commercial hydrolyzed collagen (Gelatine Royal®) on bone remodeling in pre-pubertal children. Methods: A randomized double-blind study was carried out in 60 children (9.42 ± 1.31 years) divided into three groups according to the amount of partially hydrolyzed collagen taken daily for 4 months: placebo (G-I, n = 18), collagen (G-II, n = 20) and collagen + calcium (G-III, n = 22) groups. Analyses of the following biochemical markers were carried out: total and bone alkaline phosphatase (tALP and bALP), osteocalcin, tartrate-resistant acid phosphatase (TRAP), type I collagen carboxy-terminal telopeptide, lipids, calcium, 25-hydroxyvitamin D, insulin-like growth factor 1 (IGF-1), thyroid-stimulating hormone, free thyroxine and intact parathormone. Results: There was a significantly greater increase in serum IGF-1 in G-III than in G-II (p < 0.01) or G-I (p < 0.05) during the study period, and a significantly greater increase in plasma tALP in G-III than in G-I (p < 0.05). Serum bALP behavior differed significantly (p < 0.05) between G-II (increase) and G-I (decrease). Plasma TRAP behavior differed significantly between G-II and G-I (p < 0.01) and between G-III and G-II (p < 0.05). Conclusion: Daily dietary intake of hydrolyzed collagen seems to have a potential role in enhancing bone remodeling at key stages of growth and development.
Abstract:
Isotopic data are currently becoming an important source of information regarding sources, evolution and mixing processes of water in hydrogeologic systems. However, it is not clear how to treat the geochemical and the isotopic data together statistically. We propose to introduce the isotopic information as new parts and to apply compositional data analysis to the resulting enlarged composition. The results are equivalent to downscaling the classical isotopic delta variables, because these are already relative (as required in the compositional framework) and isotopic variations are almost always very small. This methodology is illustrated and tested with a study of the Llobregat River Basin (Barcelona, NE Spain), where it is shown that the isotopic variations, though very small, complement the geochemical principal components and help in the better identification of pollution sources.
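As a rough illustration of appending isotopic information as new parts of a composition before a log-ratio analysis, the sketch below closes a toy geochemical composition, appends rescaled delta values as extra parts and applies a centred log-ratio (clr) transform; the values and the rescaling are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (compositional closure)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform of a closed composition."""
    g = np.exp(np.mean(np.log(x)))          # geometric mean of the parts
    return np.log(x / g)

# Toy major-ion composition (mg/L) for one water sample -- illustrative values only.
ions = np.array([120.0, 35.0, 22.0, 310.0, 45.0])     # e.g. Ca, Mg, Na, HCO3, SO4

# Isotopic deltas (e.g. d18O, d2H in permil) are already relative quantities; one way
# to include them as "new parts" is to map them onto a small positive scale before closure.
deltas = np.array([-7.8, -52.0])
delta_parts = 1.0 + deltas / 1000.0          # assumed downscaling; keeps variations tiny

composition = closure(np.concatenate([ions, delta_parts]))
print(clr(composition))
```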
Abstract:
Empirical literature on the analysis of the efficiency of measures for reducing persistent government deficits has mainly focused on the direct explanation of the deficit. By contrast, this paper aims at modeling government revenue and expenditure within a simultaneous framework and deriving the fiscal balance (surplus or deficit) equation as the difference between the two variables. This setting enables one not only to judge how relevant the explanatory variables are in explaining the fiscal balance but also to understand their impact on revenue and/or expenditure. Our empirical results, obtained using a panel data set on Swiss cantons for the period 1980-2002, confirm the relevance of the approach followed here by providing unambiguous evidence of a simultaneous relationship between revenue and expenditure. They also reveal strong dynamic components in revenue, expenditure, and fiscal balance. Among the significant determinants of the public fiscal balance we find not only the usual business cycle elements, but also, and more importantly, institutional factors such as the number of administrative units and the ease with which people can resort to direct-democracy instruments such as popular initiatives and referendums.
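One compact way to write the setting described above is as a simultaneous two-equation panel system from which the fiscal balance is derived rather than modeled directly; the notation below is an illustrative sketch, not the authors' exact specification (the lags, covariates and error structure are assumptions).

```latex
\begin{aligned}
R_{it} &= \alpha_R + \rho_R R_{i,t-1} + \beta_R E_{it} + \gamma_R' X_{it} + u_{it}\\
E_{it} &= \alpha_E + \rho_E E_{i,t-1} + \beta_E R_{it} + \gamma_E' Z_{it} + v_{it}\\
B_{it} &= R_{it} - E_{it}
\end{aligned}
```

Here R, E and B are revenue, expenditure and fiscal balance of canton i in year t, and X and Z collect business-cycle and institutional covariates; the simultaneity comes from each equation containing the other endogenous variable, and the lagged terms capture the dynamic components mentioned in the results.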
Abstract:
In this work, a new method is proposed for estimating the quality of the final product of batch processes in real time. This method makes it possible to reduce the time needed to obtain the quality results of the laboratory analyses. A principal component analysis (PCA) model, built with historical data collected under normal operating conditions, is used to discern whether a finished batch is normal or not. A fault signature is computed for the abnormal batches and passed through a classification model for its estimation. The study proposes a method to use the information from contribution plots based on fault signatures, where the indicators represent the behavior of the variables throughout the different stages of the process. A data set composed of the fault signatures of historical abnormal batches is built to search for patterns and to train the classification models that estimate the outcome of future batches. The proposed methodology has been applied to a sequencing batch reactor (SBR). Several classification algorithms are tested to demonstrate the possibilities of the proposed methodology.
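A minimal sketch of the monitoring scheme described above, assuming batch-wise unfolded data, a PCA model built on normal operating conditions, an SPE-based abnormality check and a classifier trained on per-variable fault signatures; the control limit, the signature definition and all data are simplified placeholders, not the study's exact method.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Historical normal-operation batches, unfolded to one row per batch
# (batches x (variables * time points)); placeholder data.
X_noc = rng.normal(size=(200, 60))
pca = PCA(n_components=5).fit(X_noc)

def spe(X):
    """Squared prediction error (Q statistic) of each batch w.r.t. the PCA model."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

# Simple control limit: empirical 99th percentile of SPE on normal batches (assumption).
limit = np.quantile(spe(X_noc), 0.99)

def fault_signature(x):
    """Per-variable contributions to the SPE, used here as the batch's fault signature."""
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))[0]
    return (x - x_hat) ** 2

# Train a classifier on signatures of historical abnormal batches with known quality labels.
X_abn = rng.normal(loc=0.5, size=(80, 60))          # placeholder abnormal batches
y_abn = rng.integers(0, 2, size=80)                 # placeholder quality outcome
clf = RandomForestClassifier().fit(np.array([fault_signature(x) for x in X_abn]), y_abn)

# At the end of a new batch: flag it if SPE exceeds the limit, then estimate its quality.
x_new = rng.normal(loc=0.5, size=60)
if spe(x_new.reshape(1, -1))[0] > limit:
    print("abnormal batch, predicted quality class:",
          clf.predict(fault_signature(x_new).reshape(1, -1))[0])
```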
Abstract:
Purpose: HIV-infected patients present an increased cardiovascular risk (CVR) of multifactorial origin, usually lower in women than in men. Information by gender about the prevalence of modifiable risk factors is scarce. Methods: Coronator is a cross-sectional survey of a representative sample of HIV-infected patients on ART in 10 hospitals across Spain in 2011. Variables include sociodemographics, CVR factors and 10-year CV disease risk estimation (Regicor: Framingham score adapted to the Spanish population). Results: We included 860 patients (76.3% male) with no history of CVD. Median age was 45.6 years; 84.1% were Spaniards; 29.9% of women were IDUs. Median time since HIV diagnosis was 10 years for men and 13 years for women (p=0.001); 28% had an AIDS diagnosis. Median CD4 cell count was 596 cells/mm3, and 88% had an undetectable viral load. Median time on ART was 91 and 108 months, respectively (p=0.017). There was a family history of early CVD in 113 men (17.9%) and 41 women (20.6%). Classical CVR factors are described in the table. Median (IQR) Regicor score was 3% (2-5) for men and 2% (1-3) for women (p<0.001), and the proportion of subjects with mid-high risk (>5%) was 26.1% for men and 9.4% for women (p<0.001). Conclusions: In this population of HIV-infected patients, women have a lower cardiovascular risk than men, partly due to higher levels of HDL cholesterol. Of note is the high frequency of smoking, abdominal obesity and sedentary lifestyle in our population.
Abstract:
OBJECTIVES: Theory of mind (ToM) performance in aging and dementia of the Alzheimer type (DAT) has been a growing interest of researchers, and recent theoretical trends in ToM development have led to a focus on determining the cognitive skills involved in ToM performance. The aim of the present review is to answer three main questions: How is ToM assessed in aging and DAT? How does ToM performance evolve in aging and DAT? Do cognitive processes influence ToM performance in aging and DAT? METHOD: A systematic review was conducted to provide a targeted overview of recent studies relating ToM performance to cognitive processes in aging and DAT. RESULTS: Results suggest a decrease in ToM performance that is more pronounced in complex ToM tasks. Moreover, the review highlights the strong involvement of executive functions, especially inhibition, and of reasoning skills in ToM task achievement. CONCLUSION: Current data suggest that the structure of ToM tasks itself could lead to poor performance, especially in populations with reduced cognitive abilities.
Abstract:
Major climatic and geological events, but also population history (secondary contacts), have generated cycles of population isolation and connection over long and short periods. Recent empirical and theoretical studies suggest that fast evolutionary processes might be triggered by such events, as commonly illustrated in ecology by the adaptive radiation of cichlid fishes (isolation and reconnection of lakes and watersheds) and in epidemiology by the fast adaptation of the influenza virus (isolation and reconnection in hosts). We test whether cyclic population isolation and connection provide the raw material (standing genetic variation) for species evolution and diversification. Our analytical results demonstrate that population isolation and connection can provide populations with a large excess of genetic diversity compared with what is expected at equilibrium. This excess is either cyclic (high allele turnover) or accumulates with time, depending on the duration of the isolation and connection periods and on the mutation rate. We show that diversification rates of animal clades are associated with specific periods of climatic cycles in the Quaternary. We finally discuss the importance of our results for macroevolutionary patterns and for the inference of population history from genomic data.
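The qualitative claim that cycles of isolation and connection can sustain an excess of standing genetic variation can be illustrated with a toy simulation; the two-deme Wright-Fisher model with infinite-alleles mutation below is an assumption made for illustration, not the paper's analytical model.

```python
import random
from itertools import count

def simulate(generations=2000, n=200, mu=1e-3, period=200, m=0.1):
    """Two Wright-Fisher demes; migration at rate m only during 'connection' phases
    (alternating with isolation every `period` generations).  Returns the number of
    distinct alleles over time, a crude proxy for standing genetic variation."""
    new_allele = count()
    demes = [[next(new_allele)] * n, [next(new_allele)] * n]
    n_alleles = []
    for g in range(generations):
        connected = (g // period) % 2 == 1          # isolation phase, then connection phase
        nxt = []
        for d, deme in enumerate(demes):
            offspring = []
            for _ in range(n):
                source = demes[1 - d] if (connected and random.random() < m) else deme
                allele = random.choice(source)       # random parent (genetic drift)
                if random.random() < mu:             # infinite-alleles mutation
                    allele = next(new_allele)
                offspring.append(allele)
            nxt.append(offspring)
        demes = nxt
        n_alleles.append(len(set(demes[0]) | set(demes[1])))
    return n_alleles

diversity = simulate()
print("distinct alleles in the final generation:", diversity[-1])
```

Varying `period`, `m` and `mu` in this sketch reproduces the qualitative behaviour described above: diversity either cycles with the isolation/connection phases or builds up over time.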
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to about 70% of the trading volume in some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signals they generate pass the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts in market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented from scratch in MATLAB as a part of this thesis. No other mathematical or statistical software was used.
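The thesis itself was implemented in MATLAB; purely as an illustrative sketch, the following Python code estimates Ornstein-Uhlenbeck parameters for a spread by an AR(1) regression and generates mean-reversion signals that use only information available up to each time t (so each entry and exit is a Markov time). The thresholds and the simulated spread are placeholders, not the thesis' calibrated strategies.

```python
import numpy as np

def fit_ou(spread, dt=1.0):
    """Estimate Ornstein-Uhlenbeck parameters (kappa, theta, sigma) of a spread
    series by regressing X_{t+1} on X_t (exact AR(1) discretisation of the OU SDE)."""
    x, y = spread[:-1], spread[1:]
    b, a = np.polyfit(x, y, 1)                    # y ~ b*x + a
    kappa = -np.log(b) / dt                       # b = exp(-kappa*dt)
    theta = a / (1.0 - b)                         # long-run mean
    resid = y - (a + b * x)
    sigma = np.std(resid) * np.sqrt(2.0 * kappa / (1.0 - b ** 2))
    return kappa, theta, sigma

def signals(spread, theta, sigma_eq, z_entry=2.0, z_exit=0.5):
    """Mean-reversion signals computed from the history up to each time t
    (so each entry/exit is a Markov time): +1 long spread, -1 short, 0 flat."""
    pos, out = 0, []
    for x in spread:
        z = (x - theta) / sigma_eq
        if pos == 0 and abs(z) > z_entry:
            pos = -int(np.sign(z))                # bet on reversion towards theta
        elif pos != 0 and abs(z) < z_exit:
            pos = 0
        out.append(pos)
    return np.array(out)

# Toy usage on a simulated mean-reverting spread (illustrative, not market data).
rng = np.random.default_rng(1)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = x[t - 1] + 0.05 * (0.0 - x[t - 1]) + 0.1 * rng.normal()

kappa, theta, sigma = fit_ou(x)
sigma_eq = sigma / np.sqrt(2.0 * kappa)           # stationary std. dev. of the OU process
print(signals(x, theta, sigma_eq)[-5:])
```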
Abstract:
The CIAOW study (Complicated intra-abdominal infections worldwide observational study) is a multicenter observational study conducted in 68 medical institutions worldwide during a six-month study period (October 2012-March 2013). The study included patients older than 18 years undergoing surgery or interventional drainage to address complicated intra-abdominal infections (IAIs). 1,898 patients with a mean age of 51.6 years (range 18-99) were enrolled in the study; 777 patients (41%) were women and 1,121 (59%) were men. Among these patients, 1,645 (86.7%) were affected by community-acquired IAIs while the remaining 253 (13.3%) suffered from healthcare-associated infections. Intraperitoneal specimens were collected from 1,190 (62.7%) of the enrolled patients. 827 patients (43.6%) were affected by generalized peritonitis while 1,071 (56.4%) suffered from localized peritonitis or abscesses. The overall mortality rate was 10.5% (199/1,898). According to stepwise multivariate analysis (PR = 0.005 and PE = 0.001), several criteria were found to be independent variables predictive of mortality, including patient age (OR = 1.1; 95% CI = 1.0-1.1; p < 0.0001), the presence of small bowel perforation (OR = 2.8; 95% CI = 1.5-5.3; p < 0.0001), a delayed initial intervention (a delay exceeding 24 hours) (OR = 1.8; 95% CI = 1.5-3.7; p < 0.0001), ICU admission (OR = 5.9; 95% CI = 3.6-9.5; p < 0.0001) and patient immunosuppression (OR = 3.8; 95% CI = 2.1-6.7; p < 0.0001).
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model. Moreover, the extent of the adjustment for error is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
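A minimal sketch of the two-part idea, assuming simulated data: part one models the probability of any consumption on the single 24-hour recall, part two models the (log) amount among consumers, and the calibrated intake is their product. The covariates, transforms and lognormal back-correction are assumptions, not the exact EPIC specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
n = 5000
ffq = rng.gamma(2.0, 50.0, size=n)                     # error-prone self-reported intake
covars = rng.normal(size=(n, 2))                       # e.g. age, BMI (placeholders)
X = np.column_stack([ffq, covars])

# Simulated single-replicate 24-hour recall with excess zeroes (reference measurement).
consumed = rng.random(n) < 1 / (1 + np.exp(-(0.01 * ffq - 1.0)))
amount = np.where(consumed, np.exp(0.8 * np.log(ffq + 1) + rng.normal(0, 0.5, n)), 0.0)

# Part 1: probability of consumption.  Part 2: log amount among consumers.
p_model = LogisticRegression(max_iter=1000).fit(X, consumed)
a_model = LinearRegression().fit(X[consumed], np.log(amount[consumed]))

p_hat = p_model.predict_proba(X)[:, 1]
resid_var = np.var(np.log(amount[consumed]) - a_model.predict(X[consumed]))
amount_hat = np.exp(a_model.predict(X) + 0.5 * resid_var)   # lognormal back-transform correction

calibrated_intake = p_hat * amount_hat   # used in place of the error-prone intake in the hazard model
print(calibrated_intake[:5])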
Abstract:
BACKGROUND: Multiple sclerosis (MS) is a neurodegenerative, autoimmune disease of the central nervous system. Genome-wide association studies (GWAS) have identified over one hundred polymorphisms with modest individual effects on MS susceptibility, and they have confirmed the main individual effect of the Major Histocompatibility Complex. Additional risk loci containing immunologically relevant genes were found to be significantly overrepresented. Nonetheless, it is accepted that most of the genetic architecture underlying susceptibility to the disease remains to be defined. Candidate association studies of the leukocyte immunoglobulin-like receptor LILRA3 gene in MS have been repeatedly reported, with inconsistent results. OBJECTIVES: In an attempt to shed some light on these controversial findings, a combined analysis was performed including the previously published datasets and three newly genotyped cohorts. Both the wild-type and the deleted LILRA3 alleles were discriminated in a single-tube PCR amplification, and the resulting products were visualized by their different electrophoretic mobilities. RESULTS AND CONCLUSION: Overall, this meta-analysis involved 3,200 MS patients and 3,069 matched healthy controls, and it did not show a significant association of the LILRA3 deletion [carriers of the LILRA3 deletion: p = 0.25, OR (95% CI) = 1.07 (0.95-1.19)], even after stratification by gender and by the HLA-DRB1*15:01 risk allele.
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region between the outflow cannula of an LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create grids suitable for numerical simulations. Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstructing the patient geometry. This is done using the Insight Toolkit library (www.itk.org). Finally, the Vascular Modeling Toolkit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five different patients with left ventricular assistance who underwent a CT-scan exam. Results: This method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution. Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
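A rough sketch of the image pre-processing chain described above using SimpleITK's filter objects (the original work uses the ITK library directly, and the VMTK/gmsh meshing steps are not reproduced here); file names, window levels, seed coordinates and all filter parameters are placeholders.

```python
import SimpleITK as sitk

# Hypothetical input: a contrast-enhanced CT volume of the aorta and the outflow cannula.
img = sitk.Cast(sitk.ReadImage("ct_series.nii.gz"), sitk.sitkFloat32)

# 1. Reduce the level window around the vessel lumen (placeholder HU range), rescale to [0, 1].
img = sitk.IntensityWindowing(img, 0.0, 600.0, 0.0, 1.0)

# 2. Contrast-limited adaptive histogram equalization.
clahe = sitk.AdaptiveHistogramEqualizationImageFilter()
clahe.SetAlpha(0.3)
clahe.SetBeta(0.3)
img = clahe.Execute(img)

# 3. Gradient anisotropic diffusion to reduce noise while preserving edges.
diff = sitk.GradientAnisotropicDiffusionImageFilter()
diff.SetTimeStep(0.0625)
diff.SetConductanceParameter(2.0)
diff.SetNumberOfIterations(10)
img = diff.Execute(img)

# 4. Watershed segmentation on the gradient magnitude, then keep the region containing
#    a seed point placed inside the aorta (seed index is a placeholder).
grad = sitk.GradientMagnitude(img)
ws = sitk.MorphologicalWatershedImageFilter()
ws.SetLevel(0.05)
ws.SetMarkWatershedLine(False)
labels = ws.Execute(grad)
aorta_label = labels[(120, 140, 60)]
mask = sitk.BinaryThreshold(labels, lowerThreshold=aorta_label, upperThreshold=aorta_label,
                            insideValue=1, outsideValue=0)
mask = sitk.BinaryFillhole(mask)     # simple morphological clean-up

sitk.WriteImage(mask, "aorta_mask.nii.gz")
```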
Abstract:
The chemical composition of sediments and rocks, as well as their distribution at the Martian surface, represents a long-term archive of the processes which have formed the planetary surface. A survey of chemical compositions by means of Compositional Data Analysis represents a valuable tool to extract direct evidence for weathering processes and allows weathering and sedimentation rates to be quantified. clr-biplot techniques are applied for the visualization of chemical relationships across the surface ("chemical maps"). The variability among individual suites of data is further analyzed by means of clr-PCA, in order to extract chemical alteration vectors between fresh rocks and their crusts and to assess the different source reservoirs accessible to soil formation. Both techniques are applied to elucidate the influence of remote weathering by combined analysis of several soil-forming branches. Vector analysis in the simplex provides the opportunity to study atmosphere-surface interactions, including the role and composition of volcanic gases.
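A small clr-PCA sketch on a toy table of rock and soil analyses, showing how the loadings of the first principal component can be read as a chemical alteration vector; the oxide values and labels are illustrative placeholders, not actual Martian surface data.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows are analyses (fresh rocks, rock crusts, soils), columns are element oxides;
# all values are placeholders for illustration.
oxides = ["SiO2", "FeO", "MgO", "CaO", "SO3"]
X = np.array([
    [45.0, 18.0, 9.0, 7.0, 5.0],    # "fresh rock"
    [43.0, 19.0, 8.0, 6.5, 8.0],    # "rock crust"
    [41.0, 20.0, 7.5, 6.0, 12.0],   # "soil"
    [44.0, 18.5, 8.8, 6.8, 6.0],
])

def clr(X):
    """Centred log-ratio transform applied row-wise (each row is one composition);
    clr is invariant to the total, so explicit closure is not needed here."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

pca = PCA(n_components=2).fit(clr(X))

# Loadings of the first component: which parts grow at the expense of which others
# along the fresh-rock -> crust -> soil trend (the "alteration vector").
for name, loading in zip(oxides, pca.components_[0]):
    print(f"{name:>5}: {loading:+.2f}")
```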
Abstract:
Ecosystems provide a multitude of resources and ecological services that are useful to humans. Biodiversity is an essential component of those ecosystems and guarantees many services. To assure the permanence of ecosystem services for future generations, measures should be taken to conserve biodiversity.
For this purpose, the acquisition of detailed information on how the biodiversity implicated in ecosystem functioning is distributed in space is essential. Species distribution models (SDMs) are empirical models relating field observations to environmental predictors based on statistically derived response surfaces that fit the realized niche. These models result in spatial predictions indicating the locations of the most suitable environment for the species and may potentially be applied to predict the composition of communities and their functional properties. The main objective of this thesis was to provide more accurate projections of species and community distributions under current and future climate in mountains by considering not solely abiotic but also biotic drivers of species distribution. Mountain areas and alpine ecosystems are considered particularly sensitive to global changes and are also sources of essential ecosystem services. This thesis had three main goals: (i) a better ecological understanding of biotic interactions and how they shape the distribution of species and communities, (ii) the development of a novel approach to the spatial modeling of biodiversity that can account for biotic interactions, and (iii) ecologically more realistic projections of future species distributions and of the future composition and structure of communities. Focusing on butterflies and bumblebees in interaction with the vegetation, I detected biotic interactions important for the species distribution and community composition of both plants and insects along environmental gradients. I identified the signature of environmental filtering processes at high elevation, confirming the suitability of SDMs for reproducing patterns of filtering. Using these case studies, I improved SDMs by incorporating biotic interactions and by accounting for non-deterministic processes and uncertainty with a probability-based approach. I used the improved models to forecast the distribution of species through past and future climate changes. SDM hindcasting allowed a better understanding of the spatial range dynamics of Trollius europaeus in Europe, which are at the origin of the species' intra-specific genetic diversity, and identified the risk of loss of this genetic diversity caused by climate change. By simulating the future distribution of all bumblebee species in the western Swiss Alps under nine climate change scenarios for the 21st century, I found that the functional diversity of this pollinator guild will be largely affected by climate change through the loss of high-elevation specialists. In turn, this will have important consequences for alpine plant pollination.
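A minimal sketch of an individual SDM and of probabilistic stacking (S-SDM), assuming simulated presence/absence data and logistic-regression response curves; the predictors, species responses and richness summary are illustrative assumptions, not the thesis models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_sites, n_species = 500, 10

# Abiotic predictors per site (placeholders); quadratic terms allow unimodal responses.
env = np.column_stack([rng.uniform(500, 3000, n_sites),      # elevation (m)
                       rng.uniform(0, 25, n_sites)])         # mean summer temperature (degC)
X = np.column_stack([env, env ** 2])

# Simulated presence/absence with species-specific elevation optima (placeholders).
optima = rng.uniform(800, 2600, n_species)
P_true = 1 / (1 + np.exp(((env[:, [0]] - optima) / 400.0) ** 2 - 1.0))
Y = rng.random((n_sites, n_species)) < P_true

# One logistic-regression SDM per species.
sdms = [LogisticRegression(max_iter=2000).fit(X, Y[:, s]) for s in range(n_species)]

# Probabilistic stacking: expected richness per site is the sum of occurrence
# probabilities rather than a sum of thresholded presences.
prob = np.column_stack([m.predict_proba(X)[:, 1] for m in sdms])
expected_richness = prob.sum(axis=1)
print("expected richness, first 5 sites:", np.round(expected_richness[:5], 2))
```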