939 results for Markov process modeling
Abstract:
Nonsyndromic cleft lip and palate (NSCL/P) is a complex disease resulting from failure of fusion of the facial primordia, a complex developmental process that includes the epithelial-mesenchymal transition (EMT). Detection of differential gene transcription between NSCL/P patients and control individuals offers an interesting alternative for investigating pathways involved in disease manifestation. Here we compared the transcriptomes of 6 dental pulp stem cell (DPSC) cultures from NSCL/P patients and 6 controls. Eighty-seven differentially expressed genes (DEGs) were identified. The most significant putative gene network comprised 13 of the 87 DEGs, of which 8 encode extracellular proteins: ACAN, COL4A1, COL4A2, GDF15, IGF2, MMP1, MMP3 and PDGFa. Through clustering analyses we also observed that MMP3, ACAN, COL4A1 and COL4A2 exhibit co-regulated expression. Interestingly, MMP3 is known to cleave a wide range of extracellular proteins, including collagens IV, V, IX and X, proteoglycans, fibronectin and laminin. It is also capable of activating other MMPs. Moreover, MMP3 had previously been associated with NSCL/P. The same general pattern was observed in a further sample, confirming the involvement of synchronized gene expression patterns that differ between NSCL/P patients and controls. These results show the robustness of our methodology for the detection of differentially expressed genes using the RankProd method. In conclusion, DPSCs from NSCL/P patients exhibit gene expression signatures, involving genes associated with mechanisms of extracellular matrix modeling and palate EMT processes, which differ from those observed in controls. This comparative approach should lead to a more rapid identification of gene networks predisposing to this complex malformation syndrome than conventional gene mapping technologies.
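The abstract does not reproduce the statistic itself; for reference, the rank product underlying the RankProd method is conventionally computed for gene g over k pairwise comparisons as

\[
RP_g = \left( \prod_{i=1}^{k} r_{g,i} \right)^{1/k},
\]

where r_{g,i} is the rank of gene g's expression change in comparison i; consistently small ranks give a small rank product, whose significance is then assessed by permutation.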
Abstract:
In this paper, we present different "frailty" models to analyze longitudinal data in the presence of covariates. These models incorporate the extra-Poisson variability and the possible correlation among the repeated count data for each individual. Considering a CD4 count data set from HIV-infected patients, we develop a hierarchical Bayesian analysis of the different proposed models using Markov Chain Monte Carlo methods. We also discuss some Bayesian discrimination criteria for the choice of the best model.
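As an illustration only (the paper's exact specification may differ), a common gamma-frailty formulation for the repeated counts y_{ij} of individual i at occasion j is

\[
y_{ij} \mid w_i \sim \mathrm{Poisson}(\mu_{ij}\, w_i), \qquad
\log \mu_{ij} = \mathbf{x}_{ij}^{\top} \boldsymbol{\beta}, \qquad
w_i \sim \mathrm{Gamma}(1/\theta,\ 1/\theta),
\]

where the individual-level frailty w_i induces both the extra-Poisson variability and the within-subject correlation, and posterior inference for \boldsymbol{\beta} and \theta proceeds by MCMC under suitable priors.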
Abstract:
Human parasitic diseases are a foremost threat to human health and welfare around the world. Trypanosomiasis is a very serious infectious disease against which the currently available drugs are limited and not effective. Therefore, there is an urgent need for new chemotherapeutic agents. One attractive drug target is the major cysteine protease from Trypanosoma cruzi, cruzain. In the present work, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) studies were conducted on a series of thiosemicarbazone and semicarbazone derivatives as inhibitors of cruzain. Molecular modeling studies were performed in order to identify the preferred binding mode of the inhibitors in the enzyme active site, and to generate structural alignments for the three-dimensional quantitative structure-activity relationship (3D QSAR) investigations. Statistically significant models were obtained (CoMFA, r² = 0.96 and q² = 0.78; CoMSIA, r² = 0.91 and q² = 0.73), indicating their predictive ability for untested compounds. The models were externally validated employing a test set, and the predicted values were in good agreement with the experimental results. The final QSAR models and the information gathered from the 3D CoMFA and CoMSIA contour maps provided important insights into the chemical and structural basis involved in the molecular recognition process of this family of cruzain inhibitors, and should be useful for the design of new structurally related analogs with improved potency. (C) 2009 Elsevier Inc. All rights reserved.
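For reference, the cross-validated q² reported for CoMFA/CoMSIA models is conventionally the leave-one-out statistic

\[
q^2 = 1 - \frac{\sum_i \left( y_i - \hat{y}_i^{\mathrm{LOO}} \right)^2}{\sum_i \left( y_i - \bar{y} \right)^2},
\]

where \hat{y}_i^{\mathrm{LOO}} is the activity predicted for compound i by a model fitted without it; values of q² above roughly 0.5 are usually taken as evidence of internal predictive ability.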
Abstract:
When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or from non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.
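The abstract describes the factored, approximate solution only at a high level. The sketch below is not the paper's algorithm; it is a minimal flat (non-factored) illustration of the pessimistic max-min Bellman backup over an interval credal set, where the inner minimization over interval-bounded transition probabilities is solved greedily. The names `robust_value_iteration` and `worst_case_expectation` are hypothetical.

```python
import numpy as np

def worst_case_expectation(lower, upper, values):
    """Minimize sum_j p_j * values[j] over probability vectors with
    lower[j] <= p_j <= upper[j] and sum_j p_j = 1 (interval credal set).
    Assumes sum(lower) <= 1 <= sum(upper). Greedy: start at the lower
    bounds, then push the remaining mass onto the cheapest successors."""
    p = lower.copy()
    slack = 1.0 - p.sum()
    for j in np.argsort(values):               # smallest values first
        add = min(upper[j] - p[j], slack)
        p[j] += add
        slack -= add
        if slack <= 1e-12:
            break
    return float(p @ values)

def robust_value_iteration(R, P_lo, P_hi, gamma=0.95, tol=1e-6):
    """Pessimistic (max-min) value iteration for a flat MDP-IP.
    R[s, a]             : immediate reward
    P_lo/P_hi[s, a, s'] : interval bounds on transition probabilities."""
    n_states, n_actions = R.shape
    V = np.zeros(n_states)
    while True:
        Q = np.empty((n_states, n_actions))
        for s in range(n_states):
            for a in range(n_actions):
                Q[s, a] = R[s, a] + gamma * worst_case_expectation(
                    P_lo[s, a], P_hi[s, a], V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)      # robust values and policy
        V = V_new
```

The factored methods in the paper avoid enumerating states and approximate the inner optimization; this sketch only conveys the underlying robust backup.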
Abstract:
Before signing electronic contracts, a rational agent should estimate the expected utilities of these contracts and calculate the violation risks related to them. In order to perform such pre-signing procedures, the agent has to be capable of computing a policy that takes into account the norms and sanctions in the contracts. The contribution of this work is threefold. First, we present the Normative Markov Decision Process, an extension of the Markov Decision Process for explicitly representing norms. In order to illustrate the usage of our framework, we model an example in a simulated aerospace aftermarket. Second, we specify an algorithm for identifying the states of the process which characterize the violation of norms. Finally, we show how to compute policies with our framework and how to calculate the risk of violating the norms in the contracts when adopting a particular policy.
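The abstract does not detail how the violation risk is computed; one simple reading, shown in the hypothetical sketch below, is the probability of ever reaching a norm-violating state under a fixed policy, obtained as a reachability fixed point (the helper name and the absorbing-violation assumption are ours, not the paper's).

```python
import numpy as np

def violation_risk(P, policy, violating, tol=1e-10):
    """Probability of ever reaching a norm-violating state under a fixed policy.
    P[a, s, s'] : transition probabilities; policy[s] : chosen action;
    violating   : boolean mask over states (treated as absorbing here)."""
    n = P.shape[1]
    risk = violating.astype(float)             # risk is 1 inside violating states
    while True:
        new = risk.copy()
        for s in range(n):
            if not violating[s]:
                new[s] = P[policy[s], s] @ risk
        if np.max(np.abs(new - risk)) < tol:
            return new
        risk = new
```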
Abstract:
This paper performs a thorough statistical examination of the time-series properties of the daily market volatility index (VIX) from the Chicago Board Options Exchange (CBOE). The motivation lies not only in the widespread consensus that the VIX is a barometer of overall market sentiment with respect to investors' risk appetite, but also in the fact that many trading strategies rely on the VIX index for hedging and speculative purposes. Preliminary analysis suggests that the VIX index displays long-range dependence. This is well in line with the strong empirical evidence in the literature supporting long memory in both options-implied and realized variances. We thus resort to both parametric and semiparametric heterogeneous autoregressive (HAR) processes for modeling and forecasting purposes. Our main findings are as follows. First, we confirm the evidence in the literature that there is a negative relationship between the VIX index and the S&P 500 index return, as well as a positive contemporaneous link with the volume of the S&P 500 index. Second, the term spread has a slightly negative long-run impact on the VIX index, when possible multicollinearity and endogeneity are controlled for. Finally, we cannot reject the linearity of the above relationships, either in sample or out of sample. As for the latter, we actually show that it is quite hard to beat the pure HAR process because of the very persistent nature of the VIX index.
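As a concrete, minimal sketch of the modeling strategy (assuming the standard HAR(1, 5, 22) specification on the VIX level; the paper's parametric and semiparametric variants are richer), the HAR regressors are rolling daily, weekly and monthly averages used to predict the next day's value. The helper name and the simulated series below are illustrative only.

```python
import numpy as np

def har_design(vix, horizons=(1, 5, 22)):
    """Build HAR regressors: rolling means of the series over (roughly)
    daily, weekly and monthly windows, aligned to predict the next day."""
    T = len(vix)
    max_h = max(horizons)
    X = np.column_stack([
        np.array([vix[t - h:t].mean() for t in range(max_h, T)])
        for h in horizons
    ])
    y = vix[max_h:]
    return np.column_stack([np.ones(len(y)), X]), y

# Hypothetical usage with a simulated series standing in for the VIX:
rng = np.random.default_rng(0)
vix = 20 + np.cumsum(rng.normal(0, 0.3, 1000))   # placeholder data, not CBOE data
X, y = har_design(vix)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS fit of the HAR(1,5,22) model
```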
Abstract:
Corporate governance has been in the spotlight for the past two decades, being the subject of numerous studies all over the world. Governance is pictured as a broad and diverse theme, evolving through different routes to form distinct systems. This scenario, together with two types of agency problems (investor vs. management and minorities vs. controlling shareholders), produces different definitions of governance. Usually, studies investigate whether corporate governance structures influence firm performance and company valuation. This approach implies that investors can identify those impacts and later take them into consideration when making investment decisions. However, behavioral finance theory shows that investors do not always make rational decisions, and therefore the modus operandi of those professionals needs to be understood. This research therefore aimed to investigate to what extent Brazilian corporate governance standards and practices influence the investment decision-making process of equity market professionals from the sell-side and buy-side. This exploratory study was carried out through qualitative and quantitative approaches. In the qualitative phase, 8 practitioners were interviewed and 3 dimensions emerged: understanding, pertinence and practice. Based on the interviews' findings, a questionnaire was formulated and distributed to buy-siders and sell-siders who cover Brazilian stocks; 117 respondents from all over the world contributed to the study. The data obtained were analyzed through structural equation modeling and descriptive statistics. The 3 dimensions became 5 constructs: definition (institutionalized governance, informal governance), pertinence (relevance) and practice (valuation process, structured governance assessment). The results of this thesis suggest there is no definitive answer, as the extent to which governance will influence an investment decision process depends on a number of circumstances which compose the context. The only certainty is the need to present a "corporate governance behavior", rather than simply establishing rules and regulations at the firm and country level.
Abstract:
Generalized hypercompetitiveness in the world markets has created the need to offer better products to potential and actual clients in order to gain an advantage over other competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents), so that they are processed more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every intervenient are taken into account. Their requirements must be defined with precision and extensively analyzed, and consequently the system must be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, this process suggests two methodologies to be applied in the initial phases of the lifecycle of the Software Engineering process: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy in order to identify BPs while constructing the enterprise's information structure, and finalizes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both Business Process Management and Software Engineering activities. MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for Multimedia. It proposes the analysis of user tasks as the basis for the design of: (i) the user interface; (ii) the system behaviour, which is modeled by means of patterns that can combine Multimedia and standard information; and (iii) the database and media contents. This thesis presents these approaches theoretically, accompanied by examples from a real project, which provide the necessary support for understanding the techniques used.
Abstract:
In a world where organizations are ever more complex, knowledge of the organizational self is a growing necessity. The DEMO methodology sets as its goal the specification of the organizational self, capturing the essence of the organization in a way that is independent of its implementation and that is also coherent, consistent, complete, modular and objective. But having such a notion of the organizational self is of little use if it is not shared by the organization's actors. To achieve this goal in a society that has grown attached to technology, and where time is of utmost importance, a tool such as a semantic Wikipedia may be the perfect way of making the information accessible. However, to establish the DEMO methodology on such a platform, there is a need to create bridges between its modeling components and the semantic Wikipedia. It is on this aspect that our thesis focuses: trying to establish and implement, using a case study, the principles of a way of transforming the DEMO methodology diagrams into comprehensive pages on a semantic Wikipedia, while keeping them as abstract as possible to allow extensibility and generalization to all diagrams without losing any valuable information, so that, if desired, those diagrams may be recreated from the semantic pages, making this process a full cycle.
Abstract:
Nowadays, more than half of computer development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proven to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is platform-implementation independent, lacking the possibility of saving and propagating possible changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as a basis for a wiki system, to allow the modeling of any organization, independently of its implementation, as well as the previously mentioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods to enable an automatic conversion of DEMO Ontological Models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems that support BPMN.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this work, the Markov chain is the tool used in the modeling and convergence analysis of the genetic algorithm, both in its standard version and in the other versions that the genetic algorithm allows. In addition, we intend to compare the performance of the standard version with the fuzzy version, believing that the latter gives the genetic algorithm a greater ability to find a global optimum, as expected of global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools used to find solutions to optimization problems, and to its effectiveness in finding good-quality solutions, which are acceptable given that there may not be another algorithm able to obtain the optimal solution for many of these problems. However, the algorithm's behavior depends not only on how the problem is represented but also on how its operators are defined, ranging from the standard version, in which the parameters are kept fixed, to versions with variable parameters. Therefore, to achieve good performance with this algorithm, an adequate criterion for choosing its parameters is necessary, especially the mutation rate, the crossover rate, and even the population size. It is important to remember that, in implementations in which the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain results in a homogeneous chain, whereas when the parameters are allowed to vary during execution, the modeling Markov chain becomes non-homogeneous. Therefore, in an attempt to improve the algorithm's performance, some studies have tried to set the parameters through strategies that capture intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality ones. Strategies for feature extraction can use either crisp or fuzzy techniques, in the latter case by means of a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm, both in its standard version and in the others. In order to evaluate the performance of the non-homogeneous algorithm, tests are applied to compare the standard genetic algorithm with the fuzzy genetic algorithm, in which the mutation rate is adjusted by a fuzzy controller. To do so, we select optimization problems whose number of solutions varies exponentially with the number of variables.
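The abstract gives no implementation details; the following is a purely hypothetical sketch of a binary genetic algorithm in which the mutation rate is adapted from population diversity, a crude stand-in for the fuzzy controller. The point it illustrates is that a time-varying mutation rate makes the chain that models the population sequence non-homogeneous, whereas fixed parameters yield a homogeneous chain.

```python
import numpy as np

rng = np.random.default_rng(42)

def onemax(pop):
    """Toy fitness: number of ones in each individual."""
    return pop.sum(axis=1)

def adapt_mutation_rate(diversity, lo=0.005, hi=0.1):
    """Stand-in for the fuzzy controller: raise the mutation rate as the
    population diversity collapses, lower it while diversity is still high."""
    return hi - (hi - lo) * diversity

def genetic_algorithm(n_bits=30, pop_size=40, generations=200, p_cross=0.8):
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(generations):
        fit = onemax(pop)
        diversity = pop.std(axis=0).mean() / 0.5     # 1 = fully mixed, 0 = converged
        p_mut = adapt_mutation_rate(diversity)       # time-varying parameter
        # binary tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[idx[np.arange(pop_size), fit[idx].argmax(axis=1)]]
        # one-point crossover on consecutive pairs
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < p_cross:
                cut = rng.integers(1, n_bits)
                children[i, cut:], children[i + 1, cut:] = (
                    parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        # bit-flip mutation with the adapted rate
        flips = rng.random(children.shape) < p_mut
        children = np.where(flips, 1 - children, children)
        pop = children
    return pop[onemax(pop).argmax()]

best = genetic_algorithm()
```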
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A cure kinetic model is an integral part of composite process simulation, used to predict the degree of cure and the amount of heat generated. The parameters involved in kinetic models are usually determined empirically from isothermal or dynamic differential scanning calorimetry (DSC) data. In this work, DSC and rheological techniques were used to investigate some of the kinetic parameters of the cure reactions of carbon/F161 epoxy prepreg and to evaluate the cure cycle used to manufacture polymeric composites for aeronautical applications. As a result, it was observed that the F161 prepreg exhibits cure kinetics with a total reaction order of 1.2-1.9. (c) 2006 Springer Science + Business Media, Inc.
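The abstract does not state the rate equation used; a common nth-order form, given here purely as illustration, is

\[
\frac{d\alpha}{dt} = k(T)\,(1 - \alpha)^{n}, \qquad k(T) = A \exp\!\left(-\frac{E_a}{RT}\right),
\]

where \alpha is the degree of cure, n is the total reaction order (1.2-1.9 for the F161 prepreg studied here), A is the pre-exponential factor, E_a the activation energy, R the gas constant and T the absolute temperature; the DSC heat flow is proportional to d\alpha/dt.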
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)