939 results for Markov process modeling
Abstract:
Colorectal cancer is the fourth most commonly diagnosed cancer in the United States. Every year about 147,000 people are diagnosed with colorectal cancer and 56,000 people lose their lives to the disease. Most cases of hereditary nonpolyposis colorectal cancer (HNPCC) and 12% of sporadic colorectal cancers show microsatellite instability. Colorectal cancer is a multistep progressive disease: it starts from a mutation in a normal colorectal cell that grows into a clone of cells, which accumulates further mutations and finally develops into a malignant tumor. In terms of molecular evolution, colorectal tumor progression represents the acquisition of sequential mutations. Clinical studies use biomarkers such as microsatellites or single nucleotide polymorphisms (SNPs) to study mutation frequencies in colorectal cancer. Microsatellite data obtained from single genome equivalent PCR or small pool PCR can be used to infer tumor progression. Since tumor progression is similar to population evolution, we used the coalescent approach, which is well established in population genetics, to analyze this type of data; coalescent theory can infer a sample's evolutionary path through the analysis of microsatellite data. The simulation results indicate that the constant population size pattern and the rapid tumor growth pattern produce different genetic polymorphism patterns. The simulation results were compared with experimental data collected from HNPCC patients. The preliminary results show that the mutation rate in 6 HNPCC patients ranges from 0.001 to 0.01. The patients' polymorphism patterns are similar to the constant population size pattern, which implies that tumor progression occurs through multilineage persistence rather than clonal sequential evolution. These results should be further verified using a larger dataset.
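The stepwise accumulation of microsatellite mutations described above can be illustrated with a minimal forward simulation. This is an illustrative sketch only: the ancestral repeat length, lineage count and generation count are assumed for the example, and the per-generation mutation rate is chosen inside the 0.001-0.01 range reported; it is not the coalescent analysis used in the study.

```python
import random

def simulate_microsatellite(n_lineages=50, n_generations=200, mu=0.005, seed=1):
    """Simulate microsatellite repeat lengths under a single-step
    (stepwise) mutation model: each generation, each lineage mutates
    with probability mu, gaining or losing one repeat unit."""
    rng = random.Random(seed)
    lengths = [20] * n_lineages  # ancestral repeat count (arbitrary)
    for _ in range(n_generations):
        for i in range(n_lineages):
            if rng.random() < mu:
                lengths[i] += rng.choice((-1, 1))
    return lengths

lengths = simulate_microsatellite()
n_alleles = len(set(lengths))  # polymorphism: number of distinct repeat lengths
```

Comparing the spread of `lengths` under constant versus growing population sizes is the kind of polymorphism-pattern contrast the abstract refers to.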
Abstract:
Previous degradation studies carried out by the research team on a number of different mortars have shown that the observed degradation depends neither exclusively on the equilibrium pH of the solution nor on the relative solubility of the aggressive anions. In our tests no reason was found that would explain why anions of equal solubility but lower pH are less aggressive than others. The aim of this paper is to study the behavior of cement pastes in aggressive environments. As observed in previous research, this behavior is not easily explained by the usual parameters alone (pH, solubility, etc.). Consequently, the paper studies whether the physicochemical characteristics of the solution are more important in certain environments than specific pH values. It seeks a degradation model that, starting from the physicochemical parameters of the solution, allows the different behaviors shown by cements of different composition to be interpreted. To that end, the rates of degradation of the solid phases were computed for each environment considered. Three cements have been studied: CEM I 42.5R/SR, CEM II/A-V 42.5R and CEM IV/B-(P-V) 32.5 N. The pastes were exposed to five environments: 0.35 M sodium acetate/acetic acid, 0.17 M sodium sulfate solution, a solution representing natural water, saturated calcium hydroxide solution, and the laboratory environment. The attack mechanism was intended to be unidirectional; to achieve this, all sides of the cylinders were sealed except the attacked surface. The cylinders were taken out of the exposure environments after 2, 4, 7, 14, 30, 58 and 90 days. Variations both in the aggressive solutions and in the solid phases at different depths have been characterized. At each age and depth the calcium, magnesium and iron contents were analyzed. The evolution of the hydrated phases was studied using thermal analysis, and changes in crystalline compounds were analyzed using X-ray diffraction.
The sodium sulphate and natural-water solutions stabilize an outer pH near 8 within a short time; however, the stability of the most pH-dependent phases is not the same. Despite the similar pH and the possibility of a gypsum layer forming near the calcium-leaching surface, this stability is greater than in other sulphate solutions. Variations in the stability of the solids formed by inverse diffusion determine the rate of degradation.
Abstract:
Usability plays an important role in satisfying users' needs. There are many recommendations in the HCI literature on how to improve software usability. Our research focuses on those recommendations that affect the system architecture rather than just the interface. However, improving software usability in aspects that affect the architecture increases the analyst's workload and the development complexity. This paper proposes a solution based on model-driven development. We propose representing functional usability mechanisms abstractly by means of conceptual primitives. The analyst will use these primitives to incorporate functional usability features at the early stages of the development process. Following the model-driven development paradigm, these features are then automatically transformed into subsequent steps of development, a practice that is hidden from the analyst.
Abstract:
Unripe banana flour (UBF) is produced from bananas that have not undergone the maturation process and is an interesting alternative for reducing fruit losses related to inappropriate handling or fast ripening. UBF is considered a functional ingredient that improves glycemic and plasma insulin levels in blood and has also shown efficacy in controlling satiety and insulin resistance. The aim of this work was to study the drying process of unripe banana slabs (Musa cavendishii, Nanicão), developing a transient drying model through mathematical modeling of simultaneous moisture and heat transfer. After characterization of the raw material, the drying process was conducted at 40 ºC, 50 ºC and 60 ºC; the product temperature was recorded using thermocouples, and the air velocity inside the chamber was 4 m·s⁻¹. The experimental data made it possible to validate a diffusion model based on Fick's second law and Fourier's law. For this purpose, the sorption isotherms were measured and fitted to the GAB model, estimating an equilibrium moisture content (Xe) of 1.76 g H2O/100 g d.b. at 60 ºC and 10% relative humidity (RH); the thermophysical properties (k, Cp, ρ) were also measured for use in the model. Five cases were considered: i) constant thermophysical properties; ii) variable properties; iii) estimation of the mass transfer coefficient (hm), heat transfer coefficient (h) and effective diffusivity (De): 4.91×10⁻⁵ m·s⁻¹, 134 W·m⁻²·K⁻¹ and 3.278×10⁻¹⁰ m²·s⁻¹ at 60 ºC, respectively; iv) variable De, which showed third-order polynomial behavior as a function of moisture content; v) shrinkage, which affected the mathematical model especially in the first 3 hours of the process: the thickness contracted by about (30.34 ± 1.29)% of its initial value, and two decreasing drying rate periods (DDR I and DDR II) were found, with De of 3.28×10⁻¹⁰ m²·s⁻¹ and 1.77×10⁻¹⁰ m²·s⁻¹, respectively.
COMSOL Multiphysics simulations could then be performed using the heat and mass transfer coefficients estimated by the mathematical model.
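A minimal numerical sketch of the kind of Fickian drying model described above: an explicit finite-difference solution of Fick's second law for a slab sealed on all sides but one, using the reported De at 60 ºC and an Xe of 0.0176 g/g d.b. The initial moisture content, half-thickness and grid are assumed for illustration; this is not the authors' COMSOL model, and shrinkage and coupled heat transfer are ignored.

```python
import numpy as np

def dry_slab(De=3.28e-10, half_thk=0.005, X0=2.0, Xe=0.0176,
             t_end=3600.0, n=51):
    """Explicit finite-difference solution of Fick's second law,
    dX/dt = De * d2X/dz2, for a slab dried through one face:
    zero-flux (sealed) boundary at z=0, equilibrium moisture Xe
    at the exposed surface. Returns mean moisture after t_end seconds."""
    dz = half_thk / (n - 1)
    dt = 0.4 * dz**2 / De           # explicit-scheme stability: dt <= dz^2/(2 De)
    X = np.full(n, X0)
    for _ in range(int(t_end / dt)):
        Xn = X.copy()
        Xn[1:-1] = X[1:-1] + De * dt / dz**2 * (X[2:] - 2*X[1:-1] + X[:-2])
        Xn[0] = Xn[1]               # sealed face: zero moisture flux
        Xn[-1] = Xe                 # exposed face held at equilibrium moisture
        X = Xn
    return X.mean()

X_mean_1h = dry_slab()              # mean moisture content after one hour
```

With these assumed dimensions, moisture only penetrates about 1 mm in the first hour, consistent with the slow diffusivities reported.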
Abstract:
Presentation submitted to the PSE Seminar, Chemical Engineering Department, Center for Advanced Process Decision-making (CAPD), Carnegie Mellon University, Pittsburgh (USA), October 2012.
Terrain classification based on Markov random field texture modeling of SAR and SAR coherency images
Abstract:
Stochastic differential equations arise naturally in a range of contexts, from financial to environmental modeling. Current solution methods are limited in their representation of the posterior process in the presence of data. In this work, we present a novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations. The method is applied to two simple problems: the Ornstein-Uhlenbeck process, for which the exact solution is known and can be used for comparison, and the double-well system, for which standard approaches such as the ensemble Kalman smoother fail to provide a satisfactory result. Experiments show that our variational approximation is viable and that the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
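For a concrete picture of the Ornstein-Uhlenbeck process mentioned above, a standard Euler-Maruyama discretisation can generate sample paths. All parameter values here are assumed for illustration; this is generic SDE simulation, not the variational posterior approximation proposed in the paper.

```python
import numpy as np

def euler_maruyama_ou(theta=1.0, mu=0.0, sigma=0.5, x0=2.0,
                      T=5.0, n=1000, seed=0):
    """Euler-Maruyama discretisation of the Ornstein-Uhlenbeck SDE
    dX = theta*(mu - X) dt + sigma dW: at each step the drift pulls
    X toward mu while a Gaussian increment of variance sigma^2*dt
    adds noise."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dw
    return x

path = euler_maruyama_ou()   # one sample path of length n+1
```

Because the OU transition density is Gaussian and known in closed form, paths like this give an exact benchmark against which approximate posteriors can be checked.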
Abstract:
Compact, visual documentation of information business modeling is a major prerequisite for comprehending its basic concepts, as well as for its effective application and improvement. Documenting this process is related to modeling it; thus, the process of information business modeling can be represented with its own tools. On this basis, the authors suggest an approach to representing the process of information business modeling, and a profile for documenting it has been developed for the purpose.
Abstract:
This paper is dedicated to modelling network maintenance based on a live example, the maintenance of an ATM banking network, where any problem means money loss. A full analysis is made in order to distinguish valuable from non-valuable parameters based on a complex analysis of the available data. Correlation analysis helps to evaluate the provided data and to produce a comprehensive solution for increasing the effectiveness of network maintenance.
Abstract:
Performance-based maintenance contracts differ significantly from the material- and method-based contracts that have traditionally been used to maintain roads. Road agencies around the world have moved towards the performance-based contract approach because it offers several advantages, such as cost savings, better budgeting certainty, and better customer satisfaction through better road services and conditions. In these contracts, payments for road maintenance are explicitly linked to the contractor successfully meeting certain clearly defined minimum performance indicators. Quantitative evaluation of the cost of performance-based contracts is difficult because of the complexity of the pavement deterioration process. Based on a probabilistic analysis of failures to achieve multiple performance criteria over the length of the contract period, an effort has been made to develop a model capable of estimating the cost of these performance-based contracts. One of the essential functions of such a model is to predict the performance of the pavement as accurately as possible. Future pavement degradation is predicted using a Markov chain process, which requires estimating transition probabilities from previous deterioration rates for similar pavements. Transition probabilities were derived using historical pavement condition rating data, both for predicting pavement deterioration when there is no maintenance and for predicting pavement improvement when maintenance activities are performed. A methodological framework has been developed to estimate the cost of maintaining a road based on multiple performance criteria such as cracking, rutting and roughness. The application of the developed model has been demonstrated via a real case study of Miami Dade Expressways (MDX), using pavement condition rating data from the Florida Department of Transportation (FDOT) for a typical performance-based asphalt pavement maintenance contract.
Results indicated that the pavement performance model developed could predict pavement deterioration quite accurately. Sensitivity analysis shows that the model is very responsive to even slight changes in pavement deterioration rate and performance constraints. It is expected that the use of this model will assist highway agencies and contractors in arriving at a fair contract value for executing long-term performance-based pavement maintenance works.
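The Markov-chain prediction step described in this abstract can be sketched in a few lines: a condition-rating distribution is propagated year by year through a do-nothing transition matrix. The four condition states and every probability below are hypothetical illustrations, not values derived from the FDOT data.

```python
import numpy as np

# Hypothetical 4-state condition ratings (1 = best ... 4 = failed) and an
# illustrative do-nothing transition matrix; each row sums to 1, and
# state 4 is absorbing (no improvement without maintenance).
P = np.array([[0.80, 0.20, 0.00, 0.00],
              [0.00, 0.75, 0.25, 0.00],
              [0.00, 0.00, 0.70, 0.30],
              [0.00, 0.00, 0.00, 1.00]])

state = np.array([1.0, 0.0, 0.0, 0.0])   # all pavement starts in state 1
for year in range(5):                     # propagate 5 years, no maintenance
    state = state @ P

prob_failed = state[3]   # probability of having reached the failed state
```

In a performance-based contract model, distributions like `state` would be checked each year against the minimum performance indicators to estimate the probability (and expected cost) of triggering maintenance.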
Abstract:
A class of multi-process models is developed for collections of time-indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
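A generative sketch of the binomial case of this model class, with all parameters assumed for illustration: a latent AR(1) state drives the logit (natural) parameter, so the counts inherit its autocorrelation. Only the forward simulation is shown; fitting via the Pólya-Gamma Gibbs sampler is not.

```python
import numpy as np

def simulate_dynamic_binomial(T=100, n_trials=50, phi=0.9, tau=0.3, seed=0):
    """Simulate an autocorrelated binomial time series: an AR(1) latent
    state psi_t drives the natural (logit) parameter, and counts are
    drawn as y_t ~ Binomial(n_trials, sigmoid(psi_t))."""
    rng = np.random.default_rng(seed)
    psi = np.zeros(T)
    for t in range(1, T):
        psi[t] = phi * psi[t - 1] + rng.normal(0.0, tau)  # latent AR(1) state
    p = 1.0 / (1.0 + np.exp(-psi))                        # logit link
    y = rng.binomial(n_trials, p)                         # observed counts
    return psi, y

psi, y = simulate_dynamic_binomial()
```

The multinomial and Poisson variants in the framework replace the binomial observation equation while keeping a state-space model on the natural parameter.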
Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.
The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.
The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age specific latent natural ability class and a performance enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of the player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.
All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.
Abstract:
A RET network consists of a network of photo-active molecules called chromophores that can participate in inter-molecular energy transfer called resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called closed-diffusive exciton valve (C-DEV) in which the input to output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEVs can be used to introduce computing power in biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics in RET devices are stochastic in nature, making them suitable for stochastic computing in which true random distribution generation is critical.
In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications.
We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.
Abstract:
Current research shows a relationship between healthcare architecture and patient-related outcomes. The planning and design of new healthcare environments is a complex process; the needs of the various end-users of the environment must be considered, including the patients, the patients' significant others, and the staff. The aim of this study was to explore the experiences of healthcare professionals participating in group modelling utilizing system dynamics in the pre-design phase of new healthcare environments. We engaged healthcare professionals in a series of workshops using system dynamics to discuss the planning of healthcare environments at the beginning of a construction project, and then interviewed them about their experience. An explorative and qualitative design was used to describe participants' experiences of participating in the group modelling projects. Participants (n=20) were recruited from a larger intervention study using group modelling and system dynamics in planning and design projects. The interviews were analysed by qualitative content analysis. Two themes were formed, representing the experiences in the group modelling process: 'Partaking in the G-M created knowledge and empowerment' and 'Partaking in the G-M was different from what was expected and required time and skills'. The method can support participants in design teams to focus more on their healthcare organization, their care activities and their aims rather than on detailed layout solutions. This clarification is important when decisions about the design are discussed and prepared, and will most likely lead to greater readiness for the future building process.