885 results for Artificial Information Models
Abstract:
Cancer is a major cause of morbidity and mortality worldwide, with a disease burden estimated to increase in the coming decades. Disease heterogeneity and limited information on cancer biology and disease mechanisms are aspects that 2D cell cultures fail to address. We review the current "state-of-the-art" in 3D Tissue Engineering (TE) models developed for and used in cancer research. Scaffold-based TE models and microfluidics are assessed for their potential to fill the gap between 2D models and clinical application. Recent advances in combining the principles of 3D TE models and microfluidics are discussed, with a special focus on biomaterials and the most promising chip-based 3D models.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
The use of genome-scale metabolic models has been rapidly increasing in fields such as metabolic engineering. An important part of a metabolic model is the biomass equation, since this reaction will ultimately determine the predictive capacity of the model in terms of essentiality and flux distributions. Thus, in order to obtain a reliable metabolic model, the biomass precursors and their coefficients must be as precise as possible. Ideally, determination of the biomass composition would be performed experimentally, but when no experimental data are available this is established by approximation to closely related organisms. Computational methods, however, can extract some information from the genome, such as amino acid and nucleotide compositions. The main objectives of this study were to compare the biomass composition of several organisms and to evaluate how biomass precursor coefficients affected the predictability of several genome-scale metabolic models by comparing predictions with experimental data in the literature. For that, the biomass macromolecular composition was experimentally determined and the amino acid composition was both experimentally and computationally estimated for several organisms. Sensitivity analysis studies were also performed with the Escherichia coli iAF1260 metabolic model concerning specific growth rates and flux distributions. The results obtained suggest that the macromolecular composition is conserved among related organisms. In contrast, experimental data on amino acid composition show no clear similarity among related organisms. It was also observed that the impact of macromolecular composition on specific growth rates and flux distributions is larger than the impact of amino acid composition, even when data from closely related organisms are used.
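For readers unfamiliar with this kind of sensitivity analysis, the sketch below illustrates the general idea of perturbing biomass precursor coefficients in a genome-scale model and checking the predicted growth rate, using COBRApy. The SBML file path, the biomass reaction ID, the crude amino-acid filter and the ±10% perturbation are illustrative assumptions, not the study's actual protocol.

```python
# Illustrative sketch (not the study's protocol): perturb biomass coefficients
# in a genome-scale model and observe the predicted growth rate.
# Assumes COBRApy is installed and an SBML file for iAF1260 is available locally.
import cobra

model = cobra.io.read_sbml_model("iAF1260.xml")  # hypothetical local path
# Biomass reaction ID may differ depending on the SBML source.
biomass = model.reactions.get_by_id("BIOMASS_Ec_iAF1260_core_59p81M")

base_growth = model.optimize().objective_value
print(f"Baseline predicted growth rate: {base_growth:.4f} 1/h")

# Scale the biomass coefficients of one precursor class (here: amino acids,
# picked out by a crude ID filter) by +/-10% and re-optimize.
for factor in (0.9, 1.1):
    with model:  # context manager reverts the changes on exit
        for met, coeff in list(biomass.metabolites.items()):
            if met.id.endswith("__L_c") or met.id == "gly_c":  # crude amino-acid filter (assumption)
                biomass.add_metabolites({met: coeff * (factor - 1.0)}, combine=True)
        growth = model.optimize().objective_value
        print(f"Amino-acid coefficients x{factor:.1f}: growth = {growth:.4f} 1/h")
```

The same pattern extends naturally to flux distributions by inspecting the solution's fluxes instead of only the objective value.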
Abstract:
PhD thesis in Industrial and Systems Engineering.
Abstract:
Background: 30-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% had ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed by using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25% and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death at hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors of mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
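As a generic illustration of the kind of Cox proportional-hazards modelling described above, the sketch below fits such a model with the lifelines library. The data frame, column names and values are entirely hypothetical; this is not the study's data or code.

```python
# Minimal sketch of a Cox proportional-hazards model in the spirit of the
# pre-implantation model above. All data and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data: time to cardiac death/transplant (months),
# event indicator, and three pre-implantation predictors.
df = pd.DataFrame({
    "followup_months":     [34, 12, 40, 8, 27, 51, 19, 33, 45, 6, 22, 38],
    "death_or_tx":         [0,  1,  0,  1, 0,  0,  1,  0,  0,  1, 1,  0],
    "rv_dysfunction":      [0,  1,  0,  1, 1,  0,  0,  0,  1,  1, 0,  0],
    "ef_below_25":         [0,  0,  1,  1, 0,  0,  1,  1,  0,  1, 0,  0],
    "high_dose_diuretics": [1,  1,  0,  1, 0,  0,  1,  0,  0,  1, 0,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="death_or_tx")
cph.print_summary()                              # hazard ratios with confidence intervals
print(cph.predict_partial_hazard(df.head(2)))    # relative risk for individual patients
```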
Abstract:
Information sharing in oligopoly has been analyzed by assuming that firms behave as a sole economic agent. In this paper I assume that ownership and management are separated. Managers are allowed to falsely report their costs to owners and rivals. Under such circumstances, if owners want to achieve information sharing they must use managerial contracts that implement truthful cost reporting by managers as a dominant strategy. I show that, contrary to the classical result, without the inclusion of message-dependent payments in managerial contracts there will be no information sharing. On the other hand, with the inclusion of such publicly observable payments and credible ex-ante commitment by owners not to modify these payments, there will be perfect information sharing without the need for third parties. Keywords: Information sharing, Delegation, Managerial contracts. JEL classification numbers: D21, D82, L13, L21
Abstract:
This paper studies the impact of instrumental voting on information demand and mass media behaviour during electoral campaigns. If voters act instrumentally, then information demand should increase with the closeness of an election. Mass media are modeled as profit-maximizing firms that take into account information demand, the value of customers to advertisers and the marginal cost of customers. Information supply should be larger in electoral constituencies where the contest is expected to be closer, where population density is higher, and where customers are on average more profitable for advertisers. The impact of electorate size is theoretically undetermined. These conclusions are then tested, with supportive results, on data from the 1997 general election in Britain.
Abstract:
This paper examines competition in a spatial model of two-candidate elections, where one candidate enjoys a quality advantage over the other candidate. The candidates care about winning and also have policy preferences. There is two-dimensional private information. Candidate ideal points as well as their tradeoffs between policy preferences and winning are private information. The distribution of this two-dimensional type is common knowledge. The location of the median voter's ideal point is uncertain, with a distribution that is commonly known by both candidates. Pure strategy equilibria always exist in this model. We characterize the effects of increased uncertainty about the median voter, the effect of candidate policy preferences, and the effects of changes in the distribution of private information. We prove that the distribution of candidate policies approaches the mixed equilibrium of Aragones and Palfrey (2002a), when both candidates' weights on policy preferences go to zero.
Abstract:
We analyze the effects of uncertainty and private information on horizontal mergers. Firms face uncertain demands or costs and receive private signals. They may decide to merge, sharing their private information. If the uncertainty parameters are independent and the signals are perfect, uncertainty generates an informational advantage only for the merging firms, increasing merger incentives and decreasing free-riding effects. Thus, mergers become more profitable and stable. These results generalize to the case of correlated parameters if the correlation is not too severe, and to perfect correlation if the firms receive noisy signals. From a normative point of view, mergers are socially less harmful than in deterministic markets and may even be welfare enhancing. If the signals are instead publicly observed, uncertainty does not necessarily give more incentives to merge, and mergers are not always less socially harmful.
Abstract:
We study markets where the characteristics or decisions of certain agents are relevant but not known to their trading partners. Assuming exclusive transactions, the environment is described as a continuum economy with indivisible commodities. We characterize incentive efficient allocations as solutions to linear programming problems and appeal to duality theory to demonstrate the generic existence of external effects in these markets. Because under certain conditions such effects may generate non-convexities, randomization emerges as a theoretical possibility. In characterizing market equilibria we show that, consistent with the personalized nature of transactions, prices are generally non-linear in the underlying consumption. On the other hand, external effects may have critical implications for market efficiency. With adverse selection, in fact, cross-subsidization across agents with different private information may be necessary for optimality, and so the market need not even achieve an incentive efficient allocation. In contrast, for the case of a single commodity, we find that when informational asymmetries arise after the trading period (e.g. moral hazard; ex post hidden types) external effects are fully internalized at a market equilibrium.
Abstract:
We analyze a continuous-time bilateral double auction in the presence of two-sided incomplete information and a smallest money unit. A distinguishing feature of our model is that intermediate concessions are not observable by the adversary: they are only communicated to a passive auctioneer. An alternative interpretation is that of mediated bargaining. We show that an equilibrium using only the extreme agreements always exists and display the necessary and sufficient condition for the existence of (perfect Bayesian) equilibria which yield intermediate agreements. For the symmetric case with uniform type distribution we numerically calculate the equilibria. We find that the equilibrium which does not use compromise agreements is the least efficient; the remaining equilibria, however, yield lower social welfare the more compromise agreements they use.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
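A stylized sketch of the general idea — estimating a conditional moment at a trial parameter by Nadaraya-Watson kernel smoothing over a long simulation and matching it to the data — is given below for a simple AR(1) example. The model, kernel, bandwidth, instruments and weighting matrix are illustrative choices, not the paper's estimator.

```python
# Stylized sketch: conditional moments at a trial parameter are approximated by
# kernel smoothing (Nadaraya-Watson) over a long simulation, then matched to the
# observed data with a method-of-moments objective. All choices are illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def simulate_ar1(theta, n, rng):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

# "Observed" data generated at a true parameter of 0.6 (for the demo only).
data = simulate_ar1(0.6, 500, rng)
x_data, y_data = data[:-1], data[1:]

def kernel_conditional_mean(x_eval, x_sim, y_sim, h=0.3):
    """Nadaraya-Watson estimate of E[y | x] at the points x_eval (Gaussian kernel)."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / h) ** 2)
    return (w @ y_sim) / w.sum(axis=1)

def smm_objective(theta):
    # Fixed simulation seed = common random numbers, so the objective is smooth in theta.
    sim = simulate_ar1(theta, 20_000, np.random.default_rng(1))
    m_hat = kernel_conditional_mean(x_data, sim[:-1], sim[1:])
    resid = y_data - m_hat                       # E[resid * instrument] should be ~0
    moments = np.array([resid.mean(), (resid * x_data).mean()])
    return moments @ moments                     # identity weighting matrix

est = minimize_scalar(smm_objective, bounds=(-0.95, 0.95), method="bounded")
print("estimated theta:", est.x)
```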
Abstract:
This paper provides evidence on the sources of co-movement in monthly US and UK stock price movements by investigating the role of macroeconomic and financial variables in a bivariate system with time-varying conditional correlations. Cross-country commonality in response is uncovered, with changes in the US Federal Funds rate, UK bond yields and oil prices having similar negative effects in both markets. Other variables also play a role, especially for the UK market. These effects do not, however, explain the marked increase in cross-market correlations observed from around 2000, which we attribute to time variation in the correlations of shocks to these markets. A regime-switching smooth transition model captures this time variation well and shows that the correlations increase dramatically around 1999-2000. JEL classifications: C32, C51, G15. Keywords: international stock returns, DCC-GARCH model, smooth transition conditional correlation GARCH model, model evaluation.
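As a purely illustrative aside, the snippet below sketches the logistic smooth-transition correlation function that underlies models of this type, with calendar time as the transition variable; the parameter values are arbitrary and the full DCC/STCC-GARCH estimation is not reproduced.

```python
# Minimal sketch of a smooth transition in the conditional correlation with
# calendar time as the transition variable: the cross-market correlation moves
# from rho_low to rho_high through a logistic transition function. Parameter
# values are arbitrary, chosen only to mimic a correlation increase around 1999-2000.
import numpy as np

def smooth_transition_correlation(t, rho_low, rho_high, gamma, c):
    """rho_t = (1 - G(t)) * rho_low + G(t) * rho_high, with logistic G(t)."""
    g = 1.0 / (1.0 + np.exp(-gamma * (t - c)))
    return (1.0 - g) * rho_low + g * rho_high

# Monthly index covering, say, 1990-2010, rescaled to [0, 1].
months = np.arange(252)
t = months / months.max()

rho_t = smooth_transition_correlation(
    t,
    rho_low=0.3, rho_high=0.75,   # pre- and post-transition correlations (illustrative)
    gamma=25.0,                   # transition speed
    c=0.48,                       # transition midpoint (~1999-2000 in this indexing)
)
print(rho_t[:3], rho_t[-3:])      # correlation rises from ~0.3 to ~0.75
```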
Abstract:
One of the largest resources of biological sequence data is the vast number of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors. Therefore, when analyzing EST sequences computationally, such errors must be taken into account. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains performance in detecting coding sequences.
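For orientation, the toy below shows Viterbi decoding in a two-state (coding/non-coding) hidden Markov model; the actual model described above additionally includes codon-level error states and start/stop-site submodels, and all probabilities used here are invented for illustration.

```python
# Toy Viterbi decoding for a two-state HMM over nucleotides, in the same spirit
# as the coding/non-coding HMM described above (the real model is richer).
# All probabilities are made up for illustration.
import numpy as np

states = ["noncoding", "coding"]
symbols = {"A": 0, "C": 1, "G": 2, "T": 3}

log_start = np.log([0.7, 0.3])
log_trans = np.log([[0.95, 0.05],     # noncoding -> noncoding/coding
                    [0.10, 0.90]])    # coding    -> noncoding/coding
log_emit = np.log([[0.25, 0.25, 0.25, 0.25],   # noncoding: uniform base usage
                   [0.20, 0.30, 0.30, 0.20]])  # coding: GC-biased (illustrative)

def viterbi(seq):
    obs = np.array([symbols[b] for b in seq])
    n_states, n = len(states), len(obs)
    delta = np.full((n, n_states), -np.inf)   # best log-probability per state
    back = np.zeros((n, n_states), dtype=int) # backpointers
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, n):
        scores = delta[t - 1][:, None] + log_trans   # scores[i, j]: from i to j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi("ATGCGCGGCCGCTAA"))
```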
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These methods extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
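As background, the core weight-updating recursion used in dynamic model averaging (in the style of Raftery-type forgetting) can be sketched as follows; the forgetting factor, the two toy candidate models and the simulated data are illustrative stand-ins, not the paper's specification.

```python
# Sketch of the model-probability recursion behind dynamic model averaging:
# prior weights are flattened with a forgetting factor alpha, then updated with
# each model's one-step-ahead predictive likelihood. The two "models" below are
# deliberately trivial (constant-mean forecasts); in the application described
# above they would be regressions on different blocks of predictors.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 60),     # regime favouring model 0
                    rng.normal(2.0, 1.0, 60)])    # regime favouring model 1

model_means = np.array([0.0, 2.0])  # each toy "model" forecasts a fixed mean
alpha = 0.95                        # forgetting factor (illustrative)
w = np.array([0.5, 0.5])            # initial model probabilities

weights_path = []
for y_t in y:
    # Prediction step: flatten weights toward uniform via the forgetting factor.
    w_pred = w ** alpha
    w_pred /= w_pred.sum()
    # Update step: multiply by each model's predictive likelihood for y_t.
    lik = norm.pdf(y_t, loc=model_means, scale=1.0)
    w = w_pred * lik
    w /= w.sum()
    weights_path.append(w.copy())

weights_path = np.array(weights_path)
print("weight on model 0 at t=59 :", round(weights_path[59, 0], 3))
print("weight on model 1 at t=119:", round(weights_path[119, 1], 3))
```

Dynamic model selection simply forecasts with the single model carrying the highest weight at each date, while dynamic model averaging combines all models using these weights.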