926 results for Model-Data Integration and Data Assimilation


Relevance: 100.00%

Abstract:

Background: The variety of DNA microarray formats and datasets presently available offers an unprecedented opportunity to perform insightful comparisons of heterogeneous data. Cross-species studies, in particular, have the power of identifying conserved, functionally important molecular processes. Validation of discoveries can now often be performed in readily available public data, which frequently requires cross-platform studies. Cross-platform and cross-species analyses require matching probes on different microarray formats. This can be achieved using the information in microarray annotations and additional molecular biology databases, such as orthology databases. Although annotations and other biological information are stored using modern database models (e.g. relational), they are very often distributed and shared as tables in text files, i.e. flat file databases. This common flat database format thus provides a simple and robust solution to flexibly integrate various sources of information and a basis for the combined analysis of heterogeneous gene expression profiles.

Results: We provide annotationTools, a Bioconductor-compliant R package to annotate microarray experiments and integrate heterogeneous gene expression profiles using annotation and other molecular biology information available as flat file databases. First, annotationTools contains a specialized set of functions for mining this widely used database format in a systematic manner. It thus offers a straightforward solution for annotating microarray experiments. Second, building on these basic functions and relying on the combination of information from several databases, it provides tools to easily perform cross-species analyses of gene expression data. Here, we present two example applications of annotationTools that are of direct relevance for the analysis of heterogeneous gene expression profiles, namely a cross-platform mapping of probes and a cross-species mapping of orthologous probes using different orthology databases. We also show how to perform an explorative comparison of disease-related transcriptional changes in human patients and in a genetic mouse model.

Conclusion: The R package annotationTools provides a simple solution to handle microarray annotation and orthology tables, as well as other flat molecular biology databases. Thereby, it allows easy integration and analysis of heterogeneous microarray experiments across different technological platforms or species.
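As an illustration of the flat-table probe matching described above (a sketch of the general idea in Python/pandas, not the annotationTools R API; the probe identifiers and table contents below are made up):

```python
import pandas as pd

# Toy flat-file-style annotation tables (in practice read with pd.read_csv(..., sep="\t")).
# Probe identifiers are invented for illustration; the gene/ortholog pairs are real symbols.
human = pd.DataFrame({"probe_id": ["h_1", "h_2", "h_3"],
                      "gene_symbol": ["TP53", "GAPDH", "SNCA"]})
mouse = pd.DataFrame({"probe_id": ["m_1", "m_2", "m_3"],
                      "gene_symbol": ["Trp53", "Gapdh", "Snca"]})
orthologs = pd.DataFrame({"human_symbol": ["TP53", "GAPDH", "SNCA"],
                          "mouse_symbol": ["Trp53", "Gapdh", "Snca"]})

# Cross-species probe mapping: human probe -> human gene -> mouse ortholog -> mouse probe.
mapping = (human.rename(columns={"probe_id": "human_probe", "gene_symbol": "human_symbol"})
                .merge(orthologs, on="human_symbol")
                .merge(mouse.rename(columns={"probe_id": "mouse_probe",
                                             "gene_symbol": "mouse_symbol"}),
                       on="mouse_symbol"))
print(mapping[["human_probe", "human_symbol", "mouse_symbol", "mouse_probe"]])
```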

Relevance: 100.00%

Abstract:

BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitate accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system, specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: Actin beta (Actb), Glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had once previously been demonstrated to be stable in RT-qPCR arrays. All the genes tested in this study with the geNorm algorithm presented gene stability values (M-values) low enough to qualify as potential reference genes in both DRG and spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off, with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide an accurate normalization.
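The geNorm stability measure M used for the ranking is the mean, over all other candidate genes, of the standard deviation of the pairwise log2 expression ratios (lower M = more stable). A minimal sketch of that computation on a synthetic expression matrix:

```python
import numpy as np

def genorm_m_values(expr):
    """expr: samples x genes array of relative expression quantities (linear scale).
    Returns the geNorm stability measure M for each gene (lower = more stable)."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        # Pairwise variation V_jk = SD over samples of log2(gene_j / gene_k).
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return m

# Synthetic example: 8 samples, 4 candidate reference genes.
rng = np.random.default_rng(0)
expr = rng.lognormal(mean=0.0, sigma=0.2, size=(8, 4))
print(genorm_m_values(expr))
```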

Relevance: 100.00%

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
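The binomial-with-unknown-N difficulty mentioned above can be illustrated with a short profile-likelihood sketch (synthetic data; the paper's actual heuristic additionally exploits the multinomial-logit structure, the variety of offer sets and qualitative arrival-rate knowledge):

```python
import numpy as np
from scipy.stats import binom

# Synthetic bookings over independent selling horizons with the same market size N
# and purchase probability p (both treated as unknown in estimation).
rng = np.random.default_rng(1)
true_N, true_p = 200, 0.15
sales = rng.binomial(true_N, true_p, size=20)

def profile_loglik(N, sales):
    # For a fixed N, the MLE of p is mean(sales) / N; plug it back into the likelihood.
    p_hat = sales.mean() / N
    return binom.logpmf(sales, N, p_hat).sum()

candidates = np.arange(sales.max(), 1001)
loglik = np.array([profile_loglik(N, sales) for N in candidates])
print("Profile-MLE of N:", candidates[loglik.argmax()])
# The profile likelihood is typically very flat in N, which is exactly the
# indeterminacy/non-robustness issue discussed in the abstract.
```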

Relevance: 100.00%

Abstract:

Context There are no evidence syntheses available to guide clinicians on when to titrate antihypertensive medication after initiation. Objective To model the blood pressure (BP) response after initiating antihypertensive medication. Data sources Electronic databases including Medline, Embase, the Cochrane Register and reference lists, up to December 2009. Study selection Trials that initiated antihypertensive medication as single therapy in hypertensive patients who were either drug naive or had a placebo washout from previous drugs. Data extraction Office BP measurements at a minimum of two-weekly intervals for a minimum of 4 weeks. An asymptotic approach model of BP response was assumed, and non-linear mixed effects modelling was used to calculate model parameters. Results Eighteen trials that recruited 4168 patients met the inclusion criteria. The time to reach 50% of the maximum estimated BP-lowering effect was about 1 week (systolic 0.91 weeks, 95% CI 0.74 to 1.10; diastolic 0.95, 0.75 to 1.15). Models incorporating drug class as a source of variability did not improve the fit of the data. Incorporating the presence of a titration schedule improved model fit for both systolic and diastolic pressure. Titration increased both the predicted maximum effect and the time taken to reach 50% of the maximum (systolic 1.2 vs 0.7 weeks; diastolic 1.4 vs 0.7 weeks). Conclusions Estimates of the maximum efficacy of antihypertensive agents can be made early after starting therapy. This knowledge will guide clinicians in deciding when a newly started antihypertensive agent is likely or unlikely to be effective at controlling BP.
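The abstract does not state the exact parameterisation of the asymptotic approach model; one common form consistent with the reported half-time, given purely as an illustrative sketch, is

```latex
% One plausible asymptotic-approach form (an assumption, not the paper's stated equation):
% the BP reduction at time t after starting therapy approaches a maximum effect E_max.
\[
  \Delta BP(t) \;=\; E_{\max}\,\bigl(1 - e^{-k t}\bigr),
  \qquad
  t_{50} \;=\; \frac{\ln 2}{k},
\]
% so a reported t_50 of about 0.9 to 1.0 weeks corresponds to k of roughly
% 0.7 to 0.77 per week.
```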

Relevance: 100.00%

Abstract:

Plant growth analysis presents difficulties related to the statistical comparison of growth rates, and the analysis of variance of primary data could guide the interpretation of results. The objective of this work was to evaluate the analysis of variance of data from distinct harvests of an experiment, focusing especially on the homogeneity of variances and the choice of an adequate ANOVA model. Data from five experiments covering different crops and growth conditions were used. Of the total number of variables, 19% were originally homoscedastic, 60% became homoscedastic after logarithmic transformation, and 21% remained heteroscedastic after transformation. Data transformation did not affect the F test in one experiment, whereas in the other experiments transformation modified the F test, usually reducing the number of significant effects. Even when transformation did not alter the F test, mean comparisons led to divergent interpretations. The mixed ANOVA model, considering harvest as a random effect, reduced the number of significant effects for every factor whose F test was modified by this model. Examples illustrated that the analysis of variance of primary variables provides a tool for identifying significant differences in growth rates. The analysis of variance imposes restrictions on the experimental design, thereby eliminating some advantages of the functional growth analysis.
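As a sketch of the kind of mixed ANOVA described (log-transformed response with harvest treated as a random effect), using synthetic data and hypothetical column names rather than the study's own data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic growth-analysis data: dry mass measured for two treatments over five
# harvests; the column names are hypothetical stand-ins.
rng = np.random.default_rng(42)
harvests = np.repeat([1, 2, 3, 4, 5], 8)
treatment = np.tile(["A", "B"], 20)
dry_mass = np.exp(0.5 * harvests + 0.3 * (treatment == "B") +
                  rng.normal(0, 0.2, size=40))
df = pd.DataFrame({"harvest": harvests, "treatment": treatment,
                   "dry_mass": dry_mass})

# Logarithmic transformation to stabilise the variance, harvest as a random effect.
model = smf.mixedlm("np.log(dry_mass) ~ C(treatment)", df, groups="harvest")
result = model.fit()
print(result.summary())
```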

Relevance: 100.00%

Abstract:

Many people worldwide live with a disability, i.e. limitations in functioning. The prevalence is expected to increase due to demographic change and the growing importance of non-communicable disease and injury. To date, many epidemiological studies have used simple dichotomous measures of disability, even though the WHO's International Classification of Functioning, Disability, and Health (ICF) provides a multi-dimensional framework of functioning. We aimed to examine associations of socio-economic status (SES) and social integration with 3 core domains of functioning (impairment, pain, limitations in activity and participation) and with perceived health. We conducted a secondary analysis of representative cross-sectional data from the Swiss Health Survey 2007, including 10,336 female and 8,424 male Swiss residents aged 15 or older. Guided by a theoretical ICF-based model, 4 mixed effects Poisson regressions were fitted in order to explain functioning and perceived health by indicators of SES and social integration. Analyses were stratified by age group (15-30, 31-54, ≥55 years). In all age groups, SES and social integration were significantly associated with functional and perceived health. Among the functional domains, impairment and pain were closely related, and both were associated with limitations in activity and participation. SES, social integration and functioning were related to perceived health. We found pronounced social inequalities in functioning and perceived health, supporting our theoretical model. Social factors play a significant role in the experience of health, even in a wealthy country such as Switzerland. These findings await confirmation in other, particularly lower-resourced, settings.
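A simplified sketch of the type of regression described, with hypothetical variable names and synthetic data, fitting a plain fixed-effects Poisson model rather than the mixed-effects models actually used in the study:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic survey-like data; the variable names are hypothetical stand-ins for the
# SES and social-integration indicators used in the study.
rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "education_years": rng.integers(9, 21, n),
    "lives_alone": rng.integers(0, 2, n),
    "age_group": rng.choice(["15-30", "31-54", "55+"], n),
})
rate = np.exp(1.0 - 0.05 * df["education_years"] + 0.3 * df["lives_alone"])
df["activity_limitations"] = rng.poisson(rate)

# Simplified fixed-effects Poisson regression (the study fitted mixed-effects models).
model = smf.glm("activity_limitations ~ education_years + lives_alone + C(age_group)",
                data=df, family=sm.families.Poisson())
print(model.fit().summary())
```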

Relevance: 100.00%

Abstract:

The aim of this study is to analyze how European integration and, especially, changes in ownership have affected banking efficiency in Central and Eastern European countries, which have recently experienced this process more intensely. Using a stochastic frontier approach (SFA) applied to panel data, we estimate bank efficiency levels in a sample of 189 banks from 12 countries over the period 2000 to 2008 and analyze the influence of several bank characteristics on these efficiency levels. The results show that European integration has significantly improved the cost efficiency of banks in these countries but that profit efficiency has significantly decreased. We find very small differences between ownership types and only a very small impact of foreign ownership on cost efficiency, showing that the entry of foreign ownership is not enough to explain the significant variations in banking efficiency after accession.
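The generic stochastic cost-frontier specification underlying this type of SFA analysis (the study's exact functional form and variables are not given in the abstract) is

```latex
\[
  \ln C_{it} \;=\; f(y_{it}, w_{it}; \beta) \;+\; v_{it} \;+\; u_{it},
  \qquad
  v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \ge 0,
\]
% where C_{it} is bank i's cost in year t, y_{it} its outputs, w_{it} its input
% prices, v_{it} random noise and u_{it} a one-sided inefficiency term; cost
% efficiency is then CE_{it} = \exp(-u_{it}). For the profit frontier the
% inefficiency term enters with the opposite sign.
```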

Relevance: 100.00%

Abstract:

This thesis was carried out for a global electronics company. The work addresses a challenge created by increasing globalization and tightening competition: the case company must determine how it can reach its growth targets in the future as well, by acquiring new customers and by maintaining an increasingly global presence. The objective of the study was to find a suitable model for identifying and selecting potential key accounts, and to test and modify the chosen model according to the needs of the case company. In particular, the collection of raw data, customer attractiveness criteria and the target market niche required attention in the study. The literature review focused on business-to-business markets, different approaches to customer relationship management, and the definition of key accounts. The fundamentals of CRM, KAM and Customer Insight thinking were presented together with different key account identification models. The chosen Cheverton model was tested and modified in the empirical part of the work. The empirical contribution of the study is a modified model for identifying potential key accounts. It helps decision makers proceed systematically and in an organized manner, step by step, towards a list of potential customers in a given market area. The work provides a tool for this process and lays a foundation for future research and actions.

Relevance: 100.00%

Abstract:

The cellular DNA repair hRAD51 protein has been shown to restrict HIV-1 integration both in vitro and in vivo. To investigate its regulatory functions, we performed a pharmacological analysis of the retroviral integration modulation by hRAD51. We found that, in vitro, chemical activation of hRAD51 stimulates its integration inhibitory properties, whereas inhibition of hRAD51 decreases the integration restriction, indicating that the modulation of HIV-1 integration depends on the hRAD51 recombinase activity. Cellular analyses demonstrated that cells exhibiting high hRAD51 levels prior to de novo infection are more resistant to integration. On the other hand, when hRAD51 was activated during integration, cells were more permissive. Altogether, these data establish the functional link between hRAD51 activity and HIV-1 integration. Our results highlight the multiple and opposite effects of the recombinase during integration and provide new insights into the cellular regulation of HIV-1 replication.

Relevance: 100.00%

Abstract:

The aim of this study is to investigate the volatility spillover effect and market integration between the BRIC countries. Motivated by the existing literature on market integration between developed and emerging markets, we investigate market linkages using a multivariate asymmetric GARCH BEKK model. The increasing globalization of financial markets, and the consequently higher volatility transfer between markets, makes it ever more important to understand market integration between the BRIC countries. We investigate stock market integration and volatility transfer between the BRIC countries from 1998 to 2007, using daily data. The empirical results show that there are international diversification benefits among Brazil, Russia, China and India. The U.S. influence on these countries has been weak, even though the U.S. economy has been leading the global financial markets. From a Finnish point of view, diversification benefits are robust, although we find some correlation with Russia and China.
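The conditional covariance recursion of a multivariate asymmetric BEKK(1,1) model, in its standard form (the paper's exact specification is not reproduced in the abstract), is

```latex
\[
  H_t \;=\; C'C \;+\; A'\,\varepsilon_{t-1}\varepsilon_{t-1}'\,A
           \;+\; B'\,H_{t-1}\,B
           \;+\; D'\,\eta_{t-1}\eta_{t-1}'\,D,
  \qquad
  \eta_{t-1} = \min(\varepsilon_{t-1}, 0),
\]
% where H_t is the conditional covariance matrix of the market returns,
% \varepsilon_{t-1} is the lagged shock vector, C is upper triangular, the
% asymmetric term D'\eta\eta'D is one common way of adding leverage effects,
% and the off-diagonal elements of A and B capture cross-market (spillover)
% effects in shocks and volatility, respectively.
```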

Relevance: 100.00%

Abstract:

In this work a fuzzy linear system is used to solve the Leontief input-output model with fuzzy entries. For solving this model, we assume that the consumption matrix for the different sectors of the economy and the demand are known. These assumptions depend heavily on the information obtained from the industries, and hence uncertainties are involved in this information. The aim of this work is to model these uncertainties and to address them through fuzzy entries such as fuzzy numbers and LR-type fuzzy numbers (triangular and trapezoidal). A fuzzy linear system is developed using fuzzy data and solved using the Gauss-Seidel algorithm. Numerical examples show the efficiency of this algorithm. The famous example from Prof. Leontief, in which he solved the production levels for the U.S. economy in 1958, is also further analyzed.
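In the crisp case the open Leontief model reads x = Ax + d, i.e. (I - A)x = d; the fuzzy version replaces the entries of A and d with fuzzy numbers and solves the resulting fuzzy linear system. A minimal crisp sketch of the underlying Gauss-Seidel iteration on synthetic three-sector data:

```python
import numpy as np

def gauss_seidel(M, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve M x = b by Gauss-Seidel iteration (M should be suitable, e.g. I - A
    for a productive consumption matrix A)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Use already-updated components for j < i and old ones for j > i.
            s = M[i, :i] @ x[:i] + M[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / M[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Synthetic 3-sector consumption matrix A and final demand d (made-up numbers).
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.2],
              [0.3, 0.2, 0.1]])
d = np.array([100.0, 50.0, 80.0])
x = gauss_seidel(np.eye(3) - A, d)       # production levels per sector
print(x, np.allclose((np.eye(3) - A) @ x, d))
```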

Relevance: 100.00%

Abstract:

A model to manage even-aged stands was developed using a modification of the Buckman model. Data from Eucalyptus urophylla and Eucalyptus cloeziana stands located in the Northern region of Minas Gerais State, Brazil, were used in the formulation of the system. The proposed model generated precise and unbiased estimates in non-thinned stands.

Relevance: 100.00%

Abstract:

The climate variability between the growth and harvest of sugar cane is very important because it directly affects yield. The MODIS sensor has spatial and temporal resolutions suitable for monitoring the variability of vegetative vigor over the land surface and, from that, for generating temporal profiles. Agrometeorological data from the ECMWF model are free, easy to access and represent reality well. In this study, the period between sugar cane growth and harvest in the state of Sao Paulo, Brazil, was determined by selecting temporal profiles of NDVI behavior. For each period, the precipitation, evapotranspiration, global radiation, length (days) and degree-days were accumulated. The periods were presented in map format at the MODIS spatial resolution of 250 meters. The results showed the spatial variability of the climate variables and their relationship to the reality represented by official data.
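As an illustrative sketch of how such variables can be accumulated over the NDVI-derived growth-to-harvest window for one pixel (synthetic daily series; the base temperature of 10 °C is an assumption):

```python
import numpy as np
import pandas as pd

# Synthetic daily agrometeorological series for one pixel; in practice these would
# come from the ECMWF model fields resampled to the MODIS 250 m grid.
rng = np.random.default_rng(3)
dates = pd.date_range("2005-10-01", "2006-08-31", freq="D")
daily = pd.DataFrame({
    "tmax": rng.normal(30, 3, len(dates)),
    "tmin": rng.normal(18, 3, len(dates)),
    "precip_mm": rng.gamma(0.5, 8, len(dates)),
    "et0_mm": rng.normal(4, 1, len(dates)),
    "radiation_mj": rng.normal(18, 4, len(dates)),
}, index=dates)

# Growth and harvest dates as they would be selected from the NDVI temporal profile.
start, end = pd.Timestamp("2005-11-10"), pd.Timestamp("2006-07-20")
window = daily.loc[start:end]

t_base = 10.0  # assumed base temperature (deg C)
degree_days = (((window["tmax"] + window["tmin"]) / 2 - t_base).clip(lower=0)).sum()
summary = {
    "length_days": len(window),
    "precip_mm": window["precip_mm"].sum(),
    "et0_mm": window["et0_mm"].sum(),
    "radiation_mj": window["radiation_mj"].sum(),
    "degree_days": degree_days,
}
print(summary)
```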

Relevance: 100.00%

Abstract:

This thesis presents a set of methods and models for estimating iron and slag flows in the blast furnace hearth and taphole. The main focus was on predicting taphole flow patterns and estimating the effects of various taphole conditions on the drainage behavior of the blast furnace hearth. All models were based on a general understanding of the typical tap cycle of an industrial blast furnace, and some of them were evaluated on short-term process data from the reference furnace. A computational fluid dynamics (CFD) model was built and applied to simulate the complicated hearth flows and thus to predict the regions of the hearth exposed to erosion under various operating conditions. Key boundary variables of the CFD model were provided by a simplified drainage model based on first principles. By examining the evolution of the liquid outflow rates measured from the furnace studied, the drainage model was improved to include the effects of taphole diameter and length. The estimated slag delays showed good agreement with the observed ones. The liquid flows in the taphole were further studied using two different models, and the results of both indicated that separated flow of iron and slag is more likely to occur in the taphole when the liquid outflow rates are comparable during tapping. The drainage process was simulated with an integrated model based on an overall balance analysis, showing that the high in-furnace overpressure can compensate for the resistances induced by the liquid flows in the hearth and through the taphole. Finally, a multiphase CFD model including the interfacial forces between immiscible liquids was developed, and both the actual iron-slag system and a laboratory-scale water-oil system were simulated. The model was demonstrated to be a useful tool for simulating hearth flows and for gaining understanding of the complex phenomena in the drainage of the blast furnace.
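A hedged sketch of the kind of overall balance such a drainage model rests on, with the driving pressure (furnace overpressure plus liquid head) consumed by flow resistances in the hearth and in a taphole of diameter d_t and length L_t (the correlations actually used in the thesis are not given in the abstract):

```latex
\[
  \Delta p \;=\; p_{\text{furnace}} - p_{\text{atm}} + \rho g h
  \;\approx\;
  \underbrace{\Delta p_{\text{hearth}}}_{\text{flow through the coke bed}}
  \;+\;
  \underbrace{\tfrac{1}{2}\rho u^2\Bigl(K + f\,\tfrac{L_t}{d_t}\Bigr)}_{\text{taphole losses}},
\]
% with u the liquid velocity in the taphole, f a friction factor and K an
% entrance-loss coefficient; the outflow rate then scales with the taphole
% cross-section, Q = u \pi d_t^2 / 4.
```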

Relevance: 100.00%

Abstract:

The need to integrate EC (Electronic Commerce) with enterprise systems arises from the integrated nature of enterprise systems. The proven ability of EC to provide competitive advantages to organizations forces enterprises to adopt EC and integrate it with their enterprise systems. Integration is a complex task whose goal is to facilitate a seamless flow of information and data between different systems within and across enterprises. Different systems run on different platforms; to integrate systems with different platforms and infrastructures, integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards are required. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, suggest various solutions to address EC and enterprise systems integration problems. There is only limited literature covering the integration of EC and enterprise systems in detail: most studies in this area have focused on the factors that influence the adoption of EC by enterprises, or provide only limited information about a specific platform or integration methodology. Therefore, this thesis was conducted to cover the technical details of EC and enterprise systems integration, addressing both the adoption factors and the integration solutions. A broad body of literature was reviewed and different solutions were investigated, covering the main enterprise integration approaches as well as the most popular integration technologies. Moreover, various methodologies for integrating EC and enterprise systems were studied in detail and different solutions were examined. The factors influencing the adoption of EC in enterprises were studied on the basis of previous literature and categorized into technical, social, managerial, financial, and human resource factors. Integration technologies were categorized based on three levels of integration: data, application, and process. In addition, different integration approaches were identified and categorized based on their communication style and platform. Finally, different EC integration solutions were investigated and categorized based on the identified integration approaches. By considering these different aspects of integration, this study is a useful asset for architects, developers, and system integrators seeking to integrate and adopt EC with enterprise systems.