939 results for Markov process modeling


Relevance: 30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance: 30.00%

Publisher:

Abstract:

In this thesis, a predictive analytical and numerical modeling approach for the orthogonal cutting process is proposed to calculate temperature distributions and, subsequently, force and stress distributions. The proposed models include a constitutive model for the material being cut based on the work of Weber, a shear-plane model based on Merchant's model, a friction model based on Zorev's approach, a tool-wear model based on the work of Waldorf, and a thermal model based on the work of Komanduri and Hou, with a fractional heat partition representing the non-uniform distribution of heat at the interfaces, extended to encompass the contributions to the global temperature rise of the chip, the tool and the workpiece. The models proposed in this work avoid, as much as possible, empirically based values or expressions and simplifying assumptions. From a thermo-physical point of view, the results are affected not only by the chosen mechanical or cutting parameters but also by their coupling effects, rather than only by the direct effect of varying each parameter in isolation, as simpler models assume. The models were implemented in the MATLAB environment. Since all the required parameters for AISI 1045 and AISI O2 could be found in the literature, these materials were used for the simulations in order to avoid arbitrary assumptions.
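As a point of reference for the shear-plane part of such a model, the sketch below evaluates Merchant's classical shear-angle and force relations. It is written in Python rather than the MATLAB environment used in the thesis, and the numerical values (rake angle, friction coefficient, shear flow stress) are illustrative placeholders, not the AISI 1045 or AISI O2 parameters referred to above.

```python
import math

def merchant_cutting_forces(t0, w, alpha, beta, tau_s):
    """Minimal Merchant-circle estimate of orthogonal cutting forces.

    t0     : uncut chip thickness [m]
    w      : width of cut [m]
    alpha  : tool rake angle [rad]
    beta   : friction angle [rad], beta = atan(mu)
    tau_s  : shear flow stress of the work material [Pa]
    """
    # Merchant's shear-angle solution (minimum-energy assumption)
    phi = math.pi / 4 + alpha / 2 - beta / 2
    # Force acting on the shear plane
    F_s = tau_s * t0 * w / math.sin(phi)
    # Resolve the resultant into cutting and thrust components
    R = F_s / math.cos(phi + beta - alpha)
    F_c = R * math.cos(beta - alpha)   # cutting (tangential) force
    F_t = R * math.sin(beta - alpha)   # thrust (feed) force
    return phi, F_c, F_t

# Example with placeholder values: 0.1 mm uncut chip, 3 mm width, 6 deg rake,
# friction coefficient 0.5, shear flow stress 450 MPa
phi, F_c, F_t = merchant_cutting_forces(
    t0=0.1e-3, w=3e-3, alpha=math.radians(6),
    beta=math.atan(0.5), tau_s=450e6)
print(f"shear angle = {math.degrees(phi):.1f} deg, Fc = {F_c:.0f} N, Ft = {F_t:.0f} N")
```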

Relevance: 30.00%

Publisher:

Abstract:

Polysaccharides are gaining increasing attention as potentially environmentally friendly and sustainable building blocks in many fields of the (bio)chemical industry. Microbial production of polysaccharides is envisioned as a promising path, since higher biomass growth rates are possible and therefore higher productivities may be achieved compared to vegetable or animal polysaccharide sources. This Ph.D. thesis focuses on the modeling and optimization of the production of a particular microbial polysaccharide, namely extracellular polysaccharides (EPS) produced by the bacterial strain Enterobacter A47. Enterobacter A47 was found to be a metabolically versatile organism in terms of its adaptability to complex media, notably capable of achieving high growth rates in media containing the glycerol byproduct of the biodiesel industry. However, industrial implementation of this production process is still hampered by a largely unoptimized process. The kinetic rates of the bioreactor operation depend heavily on operational parameters such as temperature, pH, stirring and aeration rate. The increase of culture broth viscosity is a common feature of this culture and has a major impact on overall performance. This complicates the mathematical modeling of the process, limiting the ability to understand, control and optimize productivity. To tackle this difficulty, data-driven methodologies such as Artificial Neural Networks can be employed to incorporate additional process data and complement the known mathematical description of the fermentation kinetics. In this Ph.D. thesis, we adopted such a hybrid modeling framework, which enabled the incorporation of temperature, pH and viscosity effects into the fermentation kinetics in order to improve the dynamic modeling and optimization of the process. A model-based optimization method was implemented to design optimal bioreactor control strategies in the sense of EPS productivity maximization. It is also critical to understand EPS synthesis at the level of the bacterial metabolism, since the production of EPS is a tightly regulated process. Methods of pathway analysis provide a means to unravel the fundamental pathways and their controls in bioprocesses. In the present Ph.D. thesis, a novel methodology called Principal Elementary Mode Analysis (PEMA) was developed and implemented, which made it possible to identify which cellular fluxes are activated under different conditions of temperature and pH. It is shown that differences in these two parameters affect the chemical composition of EPS; hence they are critical for the regulation of product synthesis. In future studies, the knowledge provided by PEMA could foster the development of metabolically meaningful control strategies that target the EPS sugar content and other product quality parameters.
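A minimal sketch of the serial hybrid structure described above is given below: a small neural network maps temperature, pH and viscosity to specific kinetic rates, and those rates drive the mass-balance ODEs of a batch bioreactor. All weights, yield coefficients and the viscosity proxy are illustrative placeholders, not the fitted model from the thesis.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Black-box part: a tiny feed-forward net mapping (T, pH, viscosity) to
# specific rates (mu, q_eps). Weights here are random placeholders, not fitted.
W1 = np.random.default_rng(0).normal(size=(5, 3)) * 0.1
b1 = np.zeros(5)
W2 = np.random.default_rng(1).normal(size=(2, 5)) * 0.1
b2 = np.array([0.15, 0.05])

def rates_ann(T, pH, visc):
    h = np.tanh(W1 @ np.array([T / 40.0, pH / 10.0, visc / 100.0]) + b1)
    mu, q_eps = np.maximum(W2 @ h + b2, 0.0)   # non-negative specific rates [1/h]
    return mu, q_eps

# White-box part: mass balances of a batch bioreactor (biomass, substrate, EPS)
def bioreactor(t, y, T, pH):
    X, S, P = y                      # concentrations [g/L]
    visc = 1.0 + 0.5 * P             # crude viscosity proxy (illustrative)
    mu, q_eps = rates_ann(T, pH, visc)
    Y_xs, Y_ps = 0.5, 0.3            # yield coefficients (illustrative)
    dX = mu * X if S > 0 else 0.0
    dS = -(mu / Y_xs + q_eps / Y_ps) * X if S > 0 else 0.0
    dP = q_eps * X if S > 0 else 0.0
    return [dX, dS, dP]

# Simulate 48 h at 30 degC and pH 7 from 0.1 g/L biomass and 40 g/L glycerol
sol = solve_ivp(bioreactor, (0, 48), [0.1, 40.0, 0.0], args=(30.0, 7.0))
print("final EPS concentration [g/L]:", sol.y[2, -1])
```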

Relevance: 30.00%

Publisher:

Abstract:

Master's dissertation in Sustainable Construction and Rehabilitation (Construção e Reabilitação Sustentáveis)

Relevance: 30.00%

Publisher:

Abstract:

When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized vision of how functionalities behave regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such representation is a complex task due to the existence of interrelated multidomains, which, most of the time, results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model-based approach that expresses the AAL requirements in a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture as nonfunctional requirements to support the implementation of the AAL architecture in cloud contexts.

Relevance: 30.00%

Publisher:

Abstract:

The influence of the hip joint formulation on the kinematic response of a model of human gait is investigated in this work. To accomplish this goal, the fundamental issues of the modeling process of a planar hip joint under the framework of multibody systems are revisited. In particular, formulations for ideal, dry and lubricated revolute joints are described and utilized to represent the interaction of the femur head inside the acetabulum of the hip bone. In this process, the main kinematic and dynamic aspects of hip joints are analyzed. In a simple manner, the forces generated during human gait, for both dry and lubricated hip joint models, are computed in terms of the system's state variables and subsequently introduced into the dynamic equations of motion of the multibody system as external generalized forces. Moreover, a human multibody model is considered that incorporates the different approaches for the hip articulation, namely the ideal-joint, dry and lubricated models. Finally, several computational simulations based on the different approaches are performed, and the main results are presented and compared to identify differences among the methodologies and procedures adopted in this work. The input conditions to the models correspond to experimental data captured from an adult male during normal gait. In general, the obtained results in terms of positions do not differ significantly among the hip joint models. In sharp contrast, the velocities and accelerations vary significantly. The effect of the hip joint modeling approach is clearly measurable and visible in terms of peaks and oscillations of the velocities and accelerations. In general, with the dry hip model, intra-joint force peaks can be observed, which can be associated with multiple impacts between the femur head and the cup. In turn, when the lubricant is present, the system's response tends to be smoother due to the damping effects of the synovial fluid.
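For the dry-joint case, a common way to compute such forces from the state variables is a penalty (continuous) contact model applied to the clearance between cup and femur head. The hedged sketch below follows the widely used Hertz-plus-hysteretic-damping form, in the spirit of the Lankarani-Nikravesh model, and is not necessarily the exact formulation adopted in the paper; all parameter names are illustrative.

```python
def clearance_joint_contact_force(e, e_dot, R_cup, R_head, K, restitution, v_impact):
    """Sketch of a continuous contact force for a dry revolute joint with
    clearance: Hertz-type elastic term plus a hysteretic damping term.

    e           : eccentricity between cup and femur-head centres [m]
    e_dot       : time derivative of the eccentricity [m/s]
    R_cup,
    R_head      : cup and femur-head radii [m]
    K           : generalized contact stiffness [N/m^1.5]
    restitution : coefficient of restitution [-]
    v_impact    : initial impact velocity [m/s]
    """
    clearance = R_cup - R_head
    delta = e - clearance            # penetration depth
    if delta <= 0.0:
        return 0.0                   # no contact, no force
    damping = 3.0 * (1.0 - restitution**2) / (4.0 * v_impact)
    # Elastic Hertzian term (exponent 1.5) times an energy-dissipation factor;
    # the result enters the equations of motion as an external generalized force.
    return K * delta**1.5 * (1.0 + damping * e_dot)
```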

Relevance: 30.00%

Publisher:

Abstract:

Integrated master's dissertation in Civil Engineering

Relevance: 30.00%

Publisher:

Abstract:

The main purpose of the poster is to present how the Unified Modeling Language (UML) can be used for diagnosing and optimizing real industrial production systems. Using a car-radio production line as a case study, the poster shows the modeling process that can be followed during the analysis phase of complex control applications. To guarantee continuity in the mapping between models, the authors propose guidelines for transforming the use case diagrams into a single object diagram, which is the main diagram for the subsequent phases of development.

Relevance: 30.00%

Publisher:

Abstract:

This work focuses on the modeling and numerical approximation of population balance equations (PBEs) for the simulation of different phenomena occurring in process engineering. The population balance equation (PBE) is considered to be a statement of continuity: it tracks the change in the particle size distribution as particles are born, die, grow or leave a given control volume. In population balance models, one independent variable represents time and the other(s) are property coordinate(s), e.g., the particle volume (size) in the present case. They typically describe the temporal evolution of number density functions and have been used to model various processes such as granulation, crystallization, polymerization, emulsion and cell dynamics. Semi-discrete high-resolution schemes are proposed for solving PBEs arising in one- and two-dimensional batch crystallization models. The schemes are discrete in the property coordinates but continuous in time, and the resulting ordinary differential equations can be solved by any standard ODE solver. To improve the numerical accuracy of the schemes, a moving mesh technique is introduced in both the one- and two-dimensional cases ...
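To make the semi-discrete (method-of-lines) idea concrete, the sketch below discretizes a pure-growth batch crystallization PBE in the size coordinate and hands the resulting ODE system to a standard solver. For brevity it uses a first-order upwind flux instead of the high-resolution, flux-limited discretizations proposed in the work, and all numbers are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Size grid (property coordinate): crystal length L in micrometres
L = np.linspace(0.0, 200.0, 401)
dL = L[1] - L[0]

def growth_rate(L, t):
    return 1.0 + 0.0 * L          # size-independent growth G [um/s], illustrative

def pbe_rhs(t, n):
    """Semi-discrete PBE dn/dt = -d(G n)/dL with a first-order upwind flux
    (the thesis itself uses high-resolution, flux-limited schemes)."""
    G = growth_rate(L, t)
    flux = G * n                              # convective flux in the size coordinate
    dn = np.zeros_like(n)
    dn[1:] = -(flux[1:] - flux[:-1]) / dL     # upwind difference (G > 0)
    dn[0] = -(flux[0] - 0.0) / dL             # zero nucleation inflow at L = 0
    return dn

# Initial seed distribution: a Gaussian pulse of crystals around 20 um
n0 = np.exp(-0.5 * ((L - 20.0) / 5.0) ** 2)
sol = solve_ivp(pbe_rhs, (0.0, 60.0), n0, method="RK45", max_step=0.5)
print("total particle number (0th moment):", np.trapz(sol.y[:, -1], L))
```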

Relevance: 30.00%

Publisher:

Abstract:

The Mont Collon mafic complex is one of the best preserved examples of the Early Permian magmatism in the Central Alps, related to the intra-continental collapse of the Variscan belt. It mostly consists (> 95 vol.%) of ol+hy-normative plagioclase-wehrlites and olivine- and cpx-gabbros with cumulitic structures, crosscut by acid dikes. Pegmatitic gabbros, troctolites and anorthosites outcrop locally. A well-preserved cumulative sequence is exposed in the Dents de Bertol area (center of the intrusion). PT calculations indicate that this layered magma chamber was emplaced at mid-crustal levels, at about 0.5 GPa and 1100 °C. The Mont Collon cumulitic rocks record little magmatic differentiation, as illustrated by the restricted range of clinopyroxene mg-number (Mg#(cpx) = 83-89). Whole-rock incompatible trace-element contents (e.g. Nb, Zr, Ba) vary widely and without correlation with major-element composition. These features are characteristic of an in-situ crystallization process with variable amounts of interstitial liquid L trapped between the cumulus mineral phases. LA-ICPMS measurements show that the trace-element distribution in the latter is homogeneous, pointing to subsolidus re-equilibration between crystals and interstitial melts. A quantitative model based on Langmuir's in-situ crystallization equation successfully reproduced the REE concentrations in cumulitic minerals of all rock facies of the intrusion. The calculated amounts of interstitial liquid L vary between 0 and 35% for degrees of differentiation F of 0 to 20%, relative to the least evolved facies of the intrusion. L values correlate well with the modal proportions of interstitial amphibole and with the whole-rock incompatible trace-element concentrations (e.g. Zr, Nb) of the tested samples. However, the in-situ crystallization model reaches its limitations with rocks containing a high modal content of REE-bearing minerals (e.g. zircon), such as the pegmatitic gabbros. Dikes of anorthositic composition, locally crosscutting the layered lithologies, evidence that the Mont Collon rocks evolved in an open system, with mixing of intercumulus liquids of different origins and possibly contrasting compositions. The proposed model cannot resolve these complex open systems, but migrating liquids could be partly responsible for the observed dispersion of points in some correlation diagrams. The absence of significant differentiation, with recurrent lithologies in the cumulitic pile of Dents de Bertol, points to an efficiently convective magma chamber with possible periodic replenishment. (c) 2005 Elsevier B.V. All rights reserved.
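The link between trapped interstitial liquid and whole-rock incompatible-element contents can be illustrated with a simplified mass balance. This is only the limiting closed-system case, not Langmuir's full in-situ crystallization equation used in the paper.

```latex
% Simplified mass balance for a cumulate containing a fraction L of trapped
% interstitial liquid (illustrative only).
%   C_WR : whole-rock concentration      C_L : interstitial liquid concentration
%   D    : bulk crystal/liquid partition coefficient   L : trapped liquid fraction
\[
  C_{\mathrm{WR}} \;=\; (1 - L)\, D\, C_{L} \;+\; L\, C_{L}
                  \;=\; C_{L}\,\bigl[\,D + L\,(1 - D)\,\bigr]
\]
% For a strongly incompatible element (D << 1), C_WR is approximately L * C_L,
% which is why whole-rock Zr or Nb contents scale with the amount of trapped
% interstitial melt.
```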

Relevance: 30.00%

Publisher:

Abstract:

This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling OLS forecasts perform well.
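The "simple, rolling OLS" benchmark mentioned above can be sketched as follows: an autoregression re-estimated by least squares on a fixed-length window of the most recent observations, producing one-step-ahead forecasts. The function below is an illustrative implementation, not the authors' code; window length and lag order are placeholders.

```python
import numpy as np

def rolling_ols_forecasts(y, window=60, lags=1):
    """One-step-ahead forecasts from an AR(lags) model re-estimated by OLS on a
    rolling window of the most recent observations. Returns an array aligned
    with y; entries are NaN where not enough history is available."""
    y = np.asarray(y, dtype=float)
    forecasts = np.full(y.shape, np.nan)
    for t in range(window + lags, len(y)):
        ys = y[t - window - lags:t]            # estimation sample ending at t-1
        cols = [np.ones(window)]
        for k in range(1, lags + 1):           # lag-k regressors
            cols.append(ys[lags - k:lags - k + window])
        X = np.column_stack(cols)
        target = ys[lags:]                     # targets: the last `window` observations
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        x_new = np.concatenate(([1.0], y[t - lags:t][::-1]))  # most recent lags
        forecasts[t] = x_new @ beta
    return forecasts

# e.g. rolling_ols_forecasts(series, window=60, lags=1) on a macroeconomic series
```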


Relevance: 30.00%

Publisher:

Abstract:

In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels at the time of fusing the labels itself. More specifically, we formulate the label fusion problem with MRF-based neighborhood priors as an energy minimization problem containing a unary data term and a pairwise smoothness term. We present how the existing fusion methods like majority voting, global weighted voting and local weighted voting can be reframed to profit from the proposed framework, for generating more accurate segmentations as well as more contiguous segmentations by getting rid of holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
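A hedged sketch of the kind of energy being minimized is given below: the unary term is the negative log of the per-voxel label probabilities produced by any of the voting schemes mentioned, and the pairwise term is a Potts penalty over 6-neighbours. The minimizer shown (iterated conditional modes) is a simple stand-in; the paper does not necessarily use ICM, and all parameter names are illustrative.

```python
import numpy as np

def mrf_label_fusion(vote_probs, beta=0.5, n_iter=5):
    """MRF-regularized label fusion sketch.

    vote_probs : (X, Y, Z, L) array of per-voxel label probabilities from any
                 voting scheme (majority, global or local weighting).
    beta       : Potts smoothness weight for disagreeing 6-neighbours.
    """
    eps = 1e-12
    unary = -np.log(vote_probs + eps)            # unary data term, shape (X, Y, Z, L)
    labels = np.argmax(vote_probs, axis=-1)      # initial guess: plain voting
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    X, Y, Z, L = vote_probs.shape
    for _ in range(n_iter):                      # ICM sweeps
        for x in range(X):
            for y in range(Y):
                for z in range(Z):
                    cost = unary[x, y, z].copy()
                    for dx, dy, dz in offsets:
                        nx, ny, nz = x + dx, y + dy, z + dz
                        if 0 <= nx < X and 0 <= ny < Y and 0 <= nz < Z:
                            # Potts penalty for each neighbour with a different label
                            cost += beta * (np.arange(L) != labels[nx, ny, nz])
                    labels[x, y, z] = np.argmin(cost)
    return labels
```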

Relevance: 30.00%

Publisher:

Abstract:

PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock-particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use became wider and addressed different tectonic-geomorphic problems. This paper describes several major recent improvements in the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end with describing some developments that are currently underway or should take place in the foreseeable future. (C) 2012 Elsevier B.V. All rights reserved.
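For reference, the heat production-diffusion-advection equation that such a thermal-kinematic code solves can be written in the following generic form; the symbols and notation here are generic, not necessarily those of the PECUBE documentation.

```latex
%   T : temperature, t : time, v : rock-advection (kinematic/uplift) velocity,
%   kappa : thermal diffusivity, H : volumetric heat production, rho c : heat capacity.
\[
  \frac{\partial T}{\partial t} + \mathbf{v}\cdot\nabla T
  \;=\; \nabla\cdot\bigl(\kappa\,\nabla T\bigr) + \frac{H}{\rho c},
\]
% subject to a surface boundary condition T = T_s(x, y, t) that follows the
% (possibly time-varying) topography, and a basal temperature or heat-flux condition.
```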

Relevance: 30.00%

Publisher:

Abstract:

Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables that influence exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, which is interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and permits, to a certain extent, an evaluation of model selection uncertainty, which is seldom mentioned in current practice.
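The core quantities of the Burnham-Anderson approach (Akaike weights and model-averaged effects) are straightforward to compute. The sketch below shows the standard formulas with illustrative numbers only, not values from the Lausanne VOC data set.

```python
import numpy as np

def akaike_weights(aic_values):
    """Akaike weights: w_i = exp(-0.5 * delta_i) / sum_j exp(-0.5 * delta_j),
    where delta_i = AIC_i - min(AIC). Interpreted as the probability that
    model i is the best approximating model in the set."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

def model_averaged_effect(betas, weights):
    """Multimodel-averaged coefficient of a determinant: weighted mean of its
    estimate across the model set (use 0 where the model omits the variable)."""
    return float(np.dot(weights, betas))

# Illustrative numbers only
w = akaike_weights([1012.4, 1010.1, 1015.8])
print("Akaike weights:", w)
print("model-averaged effect:", model_averaged_effect([0.30, 0.35, 0.0], w))
```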