939 results for Markov process modeling


Relevance: 30.00%

Publisher:

Abstract:

The problem of desiccation cracks in soils has received increasing attention in the last few years, in both experimental investigations and modeling. Experimental research has mainly focused on the behavior of slurries subjected to drying in plates of different shapes, sizes and thicknesses. The main objectives of these studies were to learn about the process of crack formation under controlled environmental conditions, and also to better understand the effect of different factors (e.g. soil type, boundary conditions, soil thickness) on the morphology of the crack network. As for the numerical modeling, different approaches have been suggested lately to describe the behavior of drying cracks in soils. One aspect that is still difficult to describe properly is the crack pattern observed in desiccated soils. This work presents a novel technique to model the behavior of drying soils. The crack patterns observed in desiccation tests on circular plates are simulated with the main objective of predicting the effect of soil thickness on the crack pattern. Good agreement between experimental results and model predictions is observed.

Relevance: 30.00%

Publisher:

Abstract:

Graduate Program in Computer Science - IBILCE

Relevance: 30.00%

Publisher:

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
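Not part of the abstract: a minimal sketch of the two-level normal hierarchy the abstract has in mind, with hypothetical, synthetic numbers. With known variances, the conditional posterior mean of each group-level parameter is a precision-weighted average of the group sample mean and the population mean, so every estimate is shrunk toward the population mean.

```python
import numpy as np

# Hypothetical hierarchy: group means theta_j ~ N(mu, tau^2),
# observations y_ij ~ N(theta_j, sigma^2), variances assumed known.
rng = np.random.default_rng(0)
mu, tau, sigma, n_groups, n_obs = 5.0, 1.0, 2.0, 8, 10

theta = rng.normal(mu, tau, size=n_groups)               # latent group means
y = rng.normal(theta[:, None], sigma, size=(n_groups, n_obs))

ybar = y.mean(axis=1)                                    # raw group estimates
prec_data, prec_prior = n_obs / sigma**2, 1.0 / tau**2
# Precision-weighted ("shrinkage") estimates: the hierarchy borrows
# strength across groups, pulling each raw mean toward mu.
shrunk = (prec_data * ybar + prec_prior * mu) / (prec_data + prec_prior)
```

Each `shrunk[j]` lies between `ybar[j]` and `mu` by construction, which is the basic mechanism that makes hierarchical models robust to small within-group sample sizes.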

Relevance: 30.00%

Publisher:

Abstract:

Forward modeling is commonly applied to gravity field data of impact structures to determine the main gravity anomaly sources. In this context, we have developed 2.5-D gravity models of the Serra da Cangalha impact structure for the purpose of investigating geological bodies/structures underneath the crater. Interpretation of the models was supported by ground magnetic data acquired along profiles, as well as by high-resolution aeromagnetic data. Ground magnetic data reveal the presence of short-wavelength anomalies probably related to shallow magnetic sources that could have been emplaced during the cratering process. Aeromagnetic data show that the basement underneath the crater occurs at an average depth of about 1.9 km, whereas in the region beneath the central uplift it is raised to 0.51 km below the current surface. These depths are also supported by 2.5-D gravity models showing a gentle relief for the basement beneath the central uplift area. Geophysical data were used to provide further constraints for numerical modeling of crater formation, which provided important information on the structural modification that affected the rocks underneath the crater, as well as on shock-induced modifications of target rocks. The results showed that the morphology is consistent with the current observations of the crater and that Serra da Cangalha was formed by a meteorite of approximately 1.4 km diameter striking at 12 km/s.
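Not from the paper: the simplest possible gravity forward model, the vertical anomaly of a buried sphere along a surface profile, just to illustrate what "forward modeling" computes before it is compared with observed data. The depth value echoes the ~1.9 km basement figure above; the radius and density contrast are arbitrary assumptions.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly (m/s^2) along profile coordinate x (m)
    over a buried sphere of given radius (m) and density contrast
    drho (kg/m^3): gz = G * M * z / (x^2 + z^2)^(3/2)."""
    mass = 4.0 / 3.0 * np.pi * radius**3 * drho
    return G * mass * depth / (x**2 + depth**2) ** 1.5

x = np.linspace(-5000.0, 5000.0, 201)
gz = sphere_gz(x, depth=1900.0, radius=500.0, drho=-300.0)
# The anomaly is symmetric about the source and extremal directly above it.
```

Real 2.5-D modeling replaces the sphere with polygonal bodies of finite strike length, but the fitting logic (predict, compare with the measured profile, adjust geometry/density) is the same.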

Relevance: 30.00%

Publisher:

Abstract:

The classic conservative approach to thermal process design can lead to over-processing, especially for laminar flow, when a significant distribution of temperature and of residence time occurs. In order to optimize quality retention, a more comprehensive model is required. A model comprising differential equations for mass and heat transfer is proposed for the simulation of the continuous thermal processing of a non-Newtonian food in a tubular system. The model takes into account the contribution from heating and cooling sections, the heat exchange with the ambient air, and the effective diffusion associated with non-ideal laminar flow. The case study of soursop juice processing was used to test the model. Various simulations were performed to evaluate the effect of the model assumptions. A significant difference in the predicted lethality was observed between the classic approach and the proposed model. The main advantage of the model is its flexibility to represent different aspects with a small computational time, making it suitable for process evaluation and design. (C) 2012 Elsevier Ltd. All rights reserved.
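Not from the paper: the lethality the abstract refers to is conventionally quantified as an equivalent time at a reference temperature, F = ∫ 10^((T − Tref)/z) dt. A minimal sketch, using the usual F0 reference values (Tref = 121.1 °C, z = 10 °C):

```python
import numpy as np

def lethality_F(t_min, T_celsius, T_ref=121.1, z=10.0):
    """Equivalent time at T_ref (min): F = integral of 10**((T-T_ref)/z) dt,
    evaluated with the trapezoidal rule over the time-temperature history."""
    t = np.asarray(t_min, dtype=float)
    rate = 10.0 ** ((np.asarray(T_celsius, dtype=float) - T_ref) / z)
    return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

# Holding exactly at T_ref for 10 min gives F = 10 min by definition;
# holding one z-value (10 C) hotter multiplies the rate by 10.
t = np.linspace(0.0, 10.0, 101)
F_ref = lethality_F(t, np.full_like(t, 121.1))   # 10.0
F_hot = lethality_F(t, np.full_like(t, 131.1))   # 100.0
```

The "classic conservative approach" evaluates this integral for the fastest (least-heated) fluid element only; the paper's model instead integrates it over the full temperature and residence-time distribution.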

Relevance: 30.00%

Publisher:

Abstract:

Transplantation brings hope for many patients. A multidisciplinary approach in this field aims at creating biologically functional tissues to be used as implants and prostheses. The freeze-drying process allows the fundamental properties of these materials to be preserved, making future manipulation and storage easier. Optimizing a freeze-drying cycle is of great importance, since it aims at reducing the costs of this time- and energy-consuming process while increasing product quality. Mathematical modeling is a tool that supports a better understanding of the behavior of the process variables and consequently aids optimization studies. Freeze-drying microscopy is a technique usually applied to determine critical temperatures of liquid formulations. It has been used in this work to determine the sublimation rates during the freeze-drying of a biological tissue. The sublimation rates were measured from the speed of the moving interface between the dried and the frozen layer under 21.33, 42.66 and 63.99 Pa. The studied variables were used in a theoretical model to simulate various temperature profiles of the freeze-drying process. Good agreement between the experimental and the simulated results was found.
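Not from the paper: a sketch of how a sublimation rate can be extracted from front-position data of the kind freeze-drying microscopy produces. All numbers below are synthetic; the ice density and ice volume fraction are assumed values for illustration.

```python
import numpy as np

# Synthetic dried/frozen interface position (m) versus time (s).
t = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
x_front = np.array([0.0, 1.2e-5, 2.4e-5, 3.6e-5, 4.8e-5])

# Interface speed from a linear fit (m/s).
v = np.polyfit(t, x_front, 1)[0]

# Per unit area, the sublimation mass flux is the ice mass removed per
# unit of front advance (assumed ice density and volume fraction).
rho_ice, eps_ice = 917.0, 0.8        # kg/m^3, ice volume fraction
flux = rho_ice * eps_ice * v         # kg m^-2 s^-1
```

Repeating the fit at each chamber pressure (21.33, 42.66, 63.99 Pa in the study) gives the pressure dependence of the rate that feeds the theoretical temperature-profile model.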

Relevance: 30.00%

Publisher:

Abstract:

We report the first tungsten isotopic measurements in stardust silicon carbide (SiC) grains recovered from the Murchison carbonaceous chondrite. The isotopes (182,183,184,186)W and (179,180)Hf were measured on both an aggregate (KJB fraction) and single stardust SiC grains (LS+LU fraction) believed to have condensed in the outflows of low-mass carbon-rich asymptotic giant branch (AGB) stars with close-to-solar metallicity. The SiC aggregate shows small deviations from terrestrial (= solar) composition in the (182)W/(184)W and (183)W/(184)W ratios, with deficits in (182)W and (183)W with respect to (184)W. The (186)W/(184)W ratio, however, shows no apparent deviation from the solar value. Tungsten isotopic measurements in single mainstream stardust SiC grains revealed lower than solar (182)W/(184)W, (183)W/(184)W, and (186)W/(184)W ratios. We have compared the SiC data with theoretical predictions of the evolution of W isotopic ratios in the envelopes of AGB stars. These ratios are affected by the slow neutron-capture process and match the SiC data regarding their (182)W/(184)W, (183)W/(184)W, and (179)Hf/(180)Hf isotopic compositions, although a small adjustment in the s-process production of (183)W is needed in order to have a better agreement between the SiC data and model predictions. The models cannot explain the (186)W/(184)W ratios observed in the SiC grains, even when the current (185)W neutron-capture cross section is increased by a factor of two. Further study is required to better assess how model uncertainties (e.g., the formation of the (13)C neutron source, the mass-loss law, the modeling of the third dredge-up, and the efficiency of the (22)Ne neutron source) may affect current s-process predictions.

Relevance: 30.00%

Publisher:

Abstract:

Solar reactors can be attractive in photodegradation processes due to their lower electrical energy demand. The performance of a solar reactor for two flow configurations, i.e., plug flow and mixed flow, is compared based on experimental results with a pilot-scale solar reactor. Aqueous solutions of phenol were used as a model for industrial wastewater containing organic contaminants. Batch experiments were carried out under clear sky, resulting in removal rates in the range of 96-100%. The dissolved organic carbon removal rate was simulated by an empirical model based on neural networks, which was adjusted to the experimental data, resulting in a correlation coefficient of 0.9856. This approach enabled the estimation of effects of process variables which could not be evaluated from the experiments. Simulations with different reactor configurations indicated relevant aspects for the design of solar reactors.
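Not from the paper, whose network architecture is unspecified: a minimal one-hidden-layer network fitted by gradient descent, as a stand-in for the empirical neural-network model mentioned above. The data are synthetic (a removal-fraction curve versus one normalized process variable).

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 40)[:, None]
y = 0.96 + 0.04 / (1.0 + np.exp(-4.0 * x))       # synthetic removal data

# One hidden tanh layer, linear output, trained on mean-squared error.
W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr, losses = 0.1, []
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)                     # forward pass
    err = (h @ W2 + b2) - y
    losses.append(float(np.mean(err**2)))
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)  # backprop, output layer
    gh = err @ W2.T * (1 - h**2)                 # backprop, hidden layer
    gW1 = x.T @ gh / len(x); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

Once fitted, such a surrogate can be evaluated at operating points not covered by the experiments, which is how the abstract's "effects of process variables which could not be evaluated from the experiments" are estimated.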

Relevance: 30.00%

Publisher:

Abstract:

A systematic approach to model nonlinear systems using norm-bounded linear differential inclusions (NLDIs) is proposed in this paper. The resulting NLDI model is suitable for the application of linear control design techniques and, therefore, it is possible to fulfill certain specifications for the underlying nonlinear system, within an operating region of interest in the state-space, using a linear controller designed for this NLDI model. Hence, a procedure to design a dynamic output feedback controller for the NLDI model is also proposed in this paper. One of the main contributions of the proposed modeling and control approach is the use of the mean-value theorem to represent the nonlinear system by a linear parameter-varying model, which is then mapped into a polytopic linear differential inclusion (PLDI) within the region of interest. To avoid the combinatorial problem that is inherent to polytopic models for medium- and large-sized systems, the PLDI is transformed into an NLDI, and the whole process is carried out ensuring that all trajectories of the underlying nonlinear system are also trajectories of the resulting NLDI within the operating region of interest. Furthermore, it is also possible to choose a particular structure for the NLDI parameters to reduce the conservatism in the representation of the nonlinear system by the NLDI model, and this feature is also one important contribution of this paper. Once the NLDI representation of the nonlinear system is obtained, the paper proposes the application of a linear control design method to this representation. The design is based on quadratic Lyapunov functions and formulated as a search problem over a set of bilinear matrix inequalities (BMIs), which is solved using a two-step separation procedure that maps the BMIs into a set of corresponding linear matrix inequalities. Two numerical examples are given to demonstrate the effectiveness of the proposed approach.
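Not the paper's procedure, but a sketch of one ingredient it relies on: by the mean-value theorem, f(x) = J(c)x for some c on the segment [0, x], so writing J(x) = A0 + D(x) with A0 = J(0) and bounding ||D|| over the operating region yields a norm-bounded inclusion ẋ = (A0 + p)x with ||p|| ≤ δ. The pendulum-like system below is a hypothetical example.

```python
import numpy as np

# Example system: x1' = x2, x2' = -sin(x1) - 0.5*x2.
def jac(x1):
    """Jacobian of f; it depends only on x1 for this system."""
    return np.array([[0.0, 1.0], [-np.cos(x1), -0.5]])

A0 = jac(0.0)                                  # linearization at the origin
grid = np.linspace(-1.0, 1.0, 201)             # operating region: x1 in [-1, 1]
delta = max(np.linalg.norm(jac(x1) - A0, 2) for x1 in grid)
# Here D(x) = [[0, 0], [1 - cos(x1), 0]], so delta = 1 - cos(1) ~ 0.4597,
# and every trajectory of the nonlinear system inside the region is also a
# trajectory of the inclusion x' = (A0 + p)x with ||p||_2 <= delta.
```

A grid maximum is only a sketch; the paper's construction instead exploits structure in the NLDI parameters to reduce conservatism, and for general systems a guaranteed bound needs interval or analytic arguments rather than sampling.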

Relevance: 30.00%

Publisher:

Abstract:

Polynomial Chaos Expansion (PCE) is widely recognized as a flexible tool to represent different types of random variables/processes. However, applications to real, experimental data are still limited. In this article, PCE is used to represent the random time-evolution of metal corrosion growth in marine environments. The PCE coefficients are determined in order to represent data of 45 corrosion coupons tested by Jeffrey and Melchers (2001) at Taylors Beach, Australia. Accuracy of the representation and possibilities for model extrapolation are considered in the study. Results show that reasonably accurate smooth representations of the corrosion process can be obtained; the accuracy of the representation is limited by the use of a smooth model to describe non-smooth corrosion data. Random corrosion leads to time-variant reliability problems, due to resistance degradation over time. Time-variant reliability problems are not trivial to solve, especially under random process loading. Two example problems are solved herein, showing how the developed PCE representations can be employed in reliability analysis of structures subject to marine corrosion. Monte Carlo Simulation is used to solve the resulting time-variant reliability problems; an accurate and more computationally efficient solution is also presented.
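Not from the article: a minimal PCE computation for a case with a known answer. A lognormal random coefficient A = exp(mu + s·Z), Z ~ N(0,1) (e.g. a hypothetical random corrosion-rate parameter) has the closed-form probabilists'-Hermite expansion A = exp(mu + s²/2) · Σ (sᵏ/k!) Heₖ(Z), so projecting with Gauss-Hermite quadrature should recover those coefficients.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

mu, s, order = -2.0, 0.3, 6
xq, wq = He.hermegauss(40)     # nodes/weights for weight exp(-x^2/2)
wq = wq / wq.sum()             # normalize: raw weights sum to sqrt(2*pi)

coeffs = []
for k in range(order + 1):
    Hek = He.hermeval(xq, [0.0] * k + [1.0])   # He_k at the nodes
    # Projection: a_k = E[A * He_k(Z)] / E[He_k(Z)^2], with E[He_k^2] = k!
    ak = float(np.sum(wq * np.exp(mu + s * xq) * Hek)) / math.factorial(k)
    coeffs.append(ak)
# coeffs[k] matches exp(mu + s^2/2) * s^k / k! to quadrature accuracy.
```

For experimental data such as the corrosion coupons, the projection integral is replaced by a least-squares fit of the same basis against the measurements, but the coefficient interpretation is identical.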

Relevance: 30.00%

Publisher:

Abstract:

Micelles composed of amphiphilic copolymers linked to a radioactive element are used in nuclear medicine, predominantly for diagnostic applications. A relevant advantage of polymeric micelles in aqueous solution is their resulting particle size, which can vary from 10 to 100 nm in diameter. In this review, polymeric micelles labeled with radioisotopes, including technetium (99mTc) and indium (111In), and their clinical applications in several diagnostic techniques, such as single photon emission computed tomography (SPECT), gamma-scintigraphy, and nuclear magnetic resonance (NMR), are discussed. The use of micelles for the diagnosis of lymphatic ducts and sentinel lymph nodes also receives special attention. Notably, these diagnostic techniques can be considered a significant tool for functionally exploring body systems as well as investigating molecular pathways involved in disease processes. The use of molecular modeling methodologies and computer-aided drug design strategies can also yield valuable information for the rational design and development of novel radiopharmaceuticals.

Relevance: 30.00%

Publisher:

Abstract:

The discovery and development of a new drug are time-consuming, difficult and expensive. This complex process has evolved from classical methods into an integration of modern technologies and innovative strategies addressed to the design of new chemical entities to treat a variety of diseases. The development of new drug candidates is often limited by initial compounds lacking reasonable chemical and biological properties for further lead optimization. Huge libraries of compounds are frequently selected for biological screening using a variety of techniques and standard models to assess potency, affinity and selectivity. In this context, it is very important to study the pharmacokinetic profile of the compounds under investigation. Recent advances have been made in the collection of data and the development of models to assess and predict pharmacokinetic properties (ADME - absorption, distribution, metabolism and excretion) of bioactive compounds in the early stages of drug discovery projects. This paper provides a brief perspective on the evolution of in silico ADME tools, addressing challenges, limitations, and opportunities in medicinal chemistry.

Relevance: 30.00%

Publisher:

Abstract:

Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, in this thesis an approach is proposed for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves – under certain assumptions – the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically-specified behavior of the individual software units into thread templates, which will have to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies about a video animation repainting system and the implementation of a leader election algorithm, in order to summarize the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration in the architecture-centric verification tool TwoTowers.

Relevance: 30.00%

Publisher:

Abstract:

The object of the present study is the process of gas transport in nano-sized materials, i.e. systems having structural elements of the order of nanometers. The aim of this work is to advance the understanding of the gas transport mechanism in such materials, for which traditional models are often not suitable, by providing a correct interpretation of the relationship between diffusive phenomena and structural features. This result would allow the development of new materials with permeation properties tailored to the specific application, especially in packaging systems. The methods used to achieve this goal were a detailed experimental characterization and different simulation methods. The experimental campaign concerned the determination of oxygen permeability and diffusivity in different sets of organic-inorganic hybrid coatings prepared via the sol-gel technique. The polymeric samples coated with these hybrid layers experienced a remarkable enhancement of the barrier properties, which was explained by the strong interconnection at the nano-scale between the organic moiety and the silica domains. An analogous characterization was performed on microfibrillated cellulose films, which presented a remarkable barrier effect toward oxygen when dry, whereas in the presence of water the performance drops significantly. The very low value of water diffusivity at low activities is also an interesting characteristic related to the structural properties of the material. Two different approaches to simulation were then considered: the diffusion of oxygen through polymer-layered silicates was modeled on a continuum scale with CFD software, while the properties of n-alkanethiolate self-assembled monolayers on gold were analyzed from a molecular point of view by means of a molecular dynamics algorithm.
Modeling transport properties in layered nanocomposites, resulting from the ordered dispersion of impermeable flakes in a 2-D matrix, allowed the calculation of the enhancement of the barrier effect in relation to the structural parameters of the platelets, leading to the derivation of a new expression. On this basis, randomly distributed systems were simulated and the results were analyzed to evaluate the different contributions to the overall effect. The study of more realistic three-dimensional geometries revealed a perfect correspondence with the 2-D approximation. A completely different approach was applied to simulate the effect of temperature on oxygen transport through self-assembled monolayers; the structural information obtained from equilibrium MD simulations showed that raising the temperature makes the monolayer less ordered and consequently less crystalline. This disorder produces a decrease in the barrier free energy and lowers the overall resistance to oxygen diffusion, making the monolayer more permeable to small molecules.
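The thesis derives its own barrier expression, which is not reproduced here; as context, the classical tortuosity model for fully aligned impermeable flakes is Nielsen's, a sketch of which follows. The example volume fraction and aspect ratio are arbitrary.

```python
def nielsen_relative_permeability(phi, alpha):
    """Nielsen (1967) tortuosity model for fully aligned impermeable
    flakes: P/P0 = (1 - phi) / (1 + alpha * phi / 2), where phi is the
    flake volume fraction and alpha the flake aspect ratio
    (width / thickness). Valid for dilute, well-dispersed systems."""
    return (1.0 - phi) / (1.0 + alpha * phi / 2.0)

# 5 vol% of flakes with aspect ratio 100 reduces permeability roughly
# 3.7-fold relative to the neat matrix.
rel = nielsen_relative_permeability(0.05, 100.0)   # 0.95 / 3.5
```

Numerical simulations of the kind described above are typically benchmarked against such closed-form models before being extended to random or three-dimensional platelet arrangements.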

Relevance: 30.00%

Publisher:

Abstract:

The term "Brain Imaging" identifies a set of techniques to analyze the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are largely used in the study of brain activity. In addition to clinical usage, analysis of brain activity is gaining popularity in other recent fields, i.e. Brain Computer Interfaces (BCI) and the study of cognitive processes. In this context, usage of classical solutions (e.g. fMRI, PET-CT) could be unfeasible, due to their low temporal resolution, high cost and limited portability. For these reasons alternative low-cost techniques are an object of research, typically based on simple recording hardware and on an intensive data elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG the potentials are directly generated by neuronal activity, while in EIT by the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a trade-off between physical accuracy and technical feasibility, which currently severely limits the capabilities of these techniques. Moreover, elaboration of the recorded data requires computationally intensive regularization techniques, which burdens applications with strict temporal constraints (such as BCI). This work focuses on the parallel implementation of a work-flow for EEG and EIT data processing.
The resulting software is accelerated using multi-core GPUs, in order to provide solutions in reasonable times and address the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
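Not from the thesis: the regularization step mentioned above typically amounts to solving a Tikhonov-regularized least-squares problem for each data frame. A dense NumPy sketch with a hypothetical lead-field matrix; a GPU implementation maps the same matrix operations to device kernels.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam ||x||^2 via the normal equations
    (A^T A + lam I) x = A^T b - a core linear-algebra kernel of many
    EEG/EIT inverse solvers."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(2)
A = rng.normal(size=(64, 16))          # hypothetical lead-field matrix
x_true = rng.normal(size=16)
b = A @ x_true + 0.01 * rng.normal(size=64)
x_hat = tikhonov_solve(A, b, lam=1e-3)
# Larger lam damps the solution norm, trading fidelity for stability.
```

In a real-time BCI setting, A^T A + lam*I can be factorized once per head model, leaving only a cheap triangular solve per incoming frame, which is what makes GPU batching of many frames effective.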