958 results for Process Modeling


Relevance: 30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Publisher:

Abstract:

The tensile deformation behavior of a range of supersaturated Mg-Al solid solutions and an as-cast magnesium alloy AM60 has been studied. The Mg-Al alloys were tested at room temperature, while the alloy AM60 was tested in the temperature range 293-573 K. The differences in the deformation behavior of the alloys are discussed in terms of hardening and softening processes. In order to identify which processes were active, the stress dependence of the strain-hardening coefficient was assessed using Lukac and Balik's model of hardening and softening. The analysis indicates that hardening involves solid solution hardening and interaction with forest dislocations and non-dislocation obstacles such as second-phase particles. Cross slip is not a significant recovery process in the temperature range 293-423 K. At temperatures between 473 and 523 K, the analysis suggests that softening is controlled by cross slip and climb of dislocations. At temperatures above 523 K, softening seems to be controlled by dynamic recrystallisation. (C) 2004 Elsevier B.V. All rights reserved.
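A generic numerical sketch of the analysis step this abstract describes: computing the strain-hardening coefficient theta = d(sigma)/d(epsilon) from a tensile curve so that its stress dependence can be examined. The Voce-type flow curve below is a synthetic stand-in, and the code is not the Lukac-Balik model itself.

```python
# Synthetic illustration only: compute theta = d(sigma)/d(epsilon) from
# a made-up Voce-type flow curve; this is not the Lukac-Balik model.
import numpy as np

eps = np.linspace(0.0, 0.12, 400)                    # true strain
sigma = 90.0 + 110.0 * (1.0 - np.exp(-25.0 * eps))   # flow stress, MPa (assumed)

theta = np.gradient(sigma, eps)                      # strain-hardening rate, MPa
# Plotting theta (or sigma * theta) against sigma and fitting the
# model's hardening/softening terms identifies the active processes.
for s, t in list(zip(sigma, theta))[::80]:
    print(f"sigma = {s:6.1f} MPa   theta = {t:7.1f} MPa")
```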

Relevance: 30.00%

Publisher:

Abstract:

We examine the transport of methane in microporous carbon by performing equilibrium and nonequilibrium molecular dynamics simulations over a range of pore sizes, densities, and temperatures. We interpret these simulation results using two models of the transport process. At low densities, we consider a molecular flow model, in which intermolecular interactions are neglected, and find excellent agreement between transport diffusion coefficients determined from simulation and those predicted by the model. Simulation results indicate that the model can be applied up to fluid densities of the order of 0.1-1 nm^-3. Above these densities, we consider a slip flow model, combining hydrodynamic theory with a slip condition at the solid-fluid interface. As the diffusion coefficient at low densities can be accurately determined by the molecular flow model, we also consider a model where the slip condition is supplied by the molecular flow model. We find that both density-dependent models provide a useful means of estimating the transport coefficient that compares well with simulation. (C) 2004 American Institute of Physics.
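A minimal sketch of the low-density molecular flow regime described above, using the classical kinetic-theory (Knudsen) diffusion coefficient for a cylindrical pore; the pore diameter and temperature are illustrative assumptions, not parameters from the paper.

```python
# Knudsen diffusion: at low density, molecules collide with pore walls
# rather than each other, so D = (d_pore / 3) * mean molecular speed.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
N_A = 6.02214076e23         # Avogadro's number, 1/mol

def knudsen_diffusivity(pore_diameter_m: float, temperature_k: float,
                        molar_mass_kg: float) -> float:
    """Kinetic-theory Knudsen diffusion coefficient, m^2/s."""
    mass = molar_mass_kg / N_A                              # one molecule, kg
    v_mean = math.sqrt(8.0 * K_B * temperature_k / (math.pi * mass))
    return pore_diameter_m / 3.0 * v_mean

# Methane (CH4, 16.04 g/mol) in an assumed 1 nm pore at 300 K:
print(knudsen_diffusivity(1e-9, 300.0, 16.04e-3))           # ~2e-7 m^2/s
```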

Relevance: 30.00%

Publisher:

Abstract:

Superplastic bulging is the most successful industrial application of superplastic forming (SPF), but the non-uniform wall thickness distribution of parts formed by it is a common technical problem yet to be overcome. Based on a rigid-viscoplastic finite element program developed by the authors for simulating the sheet superplastic forming process, combined with the prediction of microstructure variations (such as grain growth and cavity growth), a simple and efficient preform design method is proposed and applied to the design of a preform mould for manufacturing parts with uniform wall thickness. Examples of formed parts are presented to demonstrate that the technology can be used to improve the uniformity of wall thickness to meet practical requirements. (C) 2004 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers and their resellers promote as what should be happening, the aim of this research was to determine whether practitioners still embrace conceptual modeling seriously. In addition, what are the most popular techniques and tools used for conceptual modeling, and what are the major purposes for which conceptual modeling is used? The study found that the top six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects). Technique use was also found to follow an inverted U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence analysts' decisions to continue using modeling, viz., communication (using diagrams) to/from stakeholders, internal knowledge (or lack thereof) of techniques, user expectations management, understanding of models' integration into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

A steady-state mathematical model for co-current spray drying was developed for sugar-rich foods with the application of the glass transition temperature concept. A maltodextrin-sucrose solution was used as a sugar-rich food model. The model included mass, heat and momentum balances for a single drying droplet, as well as the temperature and humidity profile of the drying medium. A log-normal volume distribution of the droplets was generated at the exit of the rotary atomizer. This distribution was discretised into a number of bins, forming a system of non-linear first-order differential equations as a function of the axial distance along the drying chamber. The model was used to calculate the changes of droplet diameter, density, temperature, moisture content and velocity in association with the change of air properties along the axial distance. The difference between the outlet air temperature and the glass transition temperature of the final products (ΔT) was considered as an indicator of stickiness of the particles in the spray drying process. The calculated and experimental ΔT values were close, indicating successful validation of the model. (c) 2004 Elsevier Ltd. All rights reserved.
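The stickiness indicator ΔT can be illustrated with the standard Gordon-Taylor mixing rule for the glass transition temperature of a solids/water mixture; the component Tg values, the constant k, and the outlet air temperature below are illustrative assumptions, not the paper's calibrated numbers.

```python
# Hedged sketch: Delta_T = T_outlet_air - T_g(final product), with Tg
# from the Gordon-Taylor rule. All parameter values are assumed.

def gordon_taylor_tg(w_solids: float, tg_solids_c: float,
                     tg_water_c: float = -135.0, k: float = 6.0) -> float:
    """Glass transition temperature (deg C) of a solids/water mixture."""
    w_water = 1.0 - w_solids
    return (w_solids * tg_solids_c + k * w_water * tg_water_c) / (
        w_solids + k * w_water)

t_outlet_air_c = 75.0                                  # assumed outlet air temp
tg_product = gordon_taylor_tg(w_solids=0.97, tg_solids_c=100.0)
delta_t = t_outlet_air_c - tg_product
# Larger Delta_T -> product further above Tg -> stickier particles.
print(f"Tg = {tg_product:.1f} C, Delta_T = {delta_t:.1f} C")
```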

Relevance: 30.00%

Publisher:

Abstract:

The leaching of elements from the surface of charged fly ash particles is known to be an unsteady process. The mass transfer resistance provided by the diffuse double layer has been quantified as one of the reasons for this delayed leaching. In this work, a model based on mass transfer principles for predicting the concentration of calcium hydroxide in the diffuse double layer is presented. The significant difference between the predicted calcium hydroxide concentration and the experimentally measured values is explained.
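A generic film-theory sketch of the kind of mass-transfer balance the abstract refers to, with the diffuse double layer lumped into a single resistance; the coefficient and saturation concentration are rough illustrative values, not the paper's model.

```python
# Generic film-theory balance: flux across the double layer is
# proportional to the concentration difference; the layer's resistance
# sets the lumped coefficient k_l_a. All values are assumed.
import numpy as np

k_l_a = 1e-3      # lumped mass-transfer coefficient * area, 1/s (assumed)
c_sat = 0.02      # Ca(OH)2 solubility-limited surface conc., mol/L (approx.)
dt, t_end = 1.0, 3600.0

c_bulk = 0.0
for _ in np.arange(0.0, t_end, dt):
    c_bulk += dt * k_l_a * (c_sat - c_bulk)

print(f"bulk Ca(OH)2 after 1 h: {c_bulk:.4f} mol/L")
```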

Relevance: 30.00%

Publisher:

Abstract:

We present a new method of modeling the imaging of laser beams in the presence of diffraction. Our method is based on the concept of first orthogonally expanding the resultant diffraction field (which would otherwise have been obtained by laborious application of the Huygens diffraction principle) and then representing it by an effective multimodal laser beam with different beam parameters. We show not only that the process of obtaining the new beam parameters is straightforward but also that it permits a different interpretation of the diffraction-caused focal shift in laser beams. All of the criteria that we have used to determine the minimum number of higher-order modes needed to accurately represent the diffraction field show that the mode-expansion method is numerically efficient. Finally, the characteristics of the mode-expansion method are such that it allows modeling of a vast array of diffraction problems, regardless of the characteristics of the incident laser beam, the diffracting element, or the observation plane. (C) 2005 Optical Society of America.
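The core of the mode-expansion idea can be sketched numerically: project a diffracted (here, hard-aperture-clipped) field onto Hermite-Gaussian modes and check how few terms reproduce it. The aperture, waist, and mode count are illustrative assumptions, not the paper's cases.

```python
# Expand a clipped 1-D Gaussian field in normalised Hermite-Gaussian
# modes via overlap integrals, then measure the reconstruction error.
import numpy as np
from scipy.special import eval_hermite
from math import factorial, pi, sqrt

w0 = 1.0                                    # beam waist (arbitrary units)
x = np.linspace(-5 * w0, 5 * w0, 4001)
dx = x[1] - x[0]

def hg_mode(n: int) -> np.ndarray:
    """Normalised 1-D Hermite-Gaussian mode of order n."""
    u = eval_hermite(n, sqrt(2.0) * x / w0) * np.exp(-(x / w0) ** 2)
    return u / sqrt(2.0 ** n * factorial(n) * w0 * sqrt(pi / 2.0))

field = np.exp(-(x / w0) ** 2) * (np.abs(x) < 1.5 * w0)   # clipped beam

# Symmetric field -> only even-order modes contribute.
orders = range(0, 21, 2)
coeffs = [np.sum(field * hg_mode(n)) * dx for n in orders]
recon = sum(c * hg_mode(n) for c, n in zip(coeffs, orders))
err = np.sum((field - recon) ** 2) * dx / (np.sum(field ** 2) * dx)
print(f"relative L2 error with 11 even modes: {err:.2e}")
```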

Relevance: 30.00%

Publisher:

Abstract:

Increasingly, large areas of native tropical forest are being transformed into a mosaic of human-dominated land uses with scattered mature remnants and secondary forests. In general, at the end of the land clearing process, the landscape will have two forest components: a stable component of surviving mature forests, and a dynamic component of secondary forests of different ages. As the proportion of mature forest continues to decline, secondary forests play an increasing role in the conservation and restoration of biodiversity. This paper aims to predict and explain spatial and temporal patterns in the age of remnant mature and secondary forests in lowland Colombian landscapes. We analyse the age distributions of forest fragments using detailed temporal land cover data derived from aerial photographs. Ordinal logistic regression analysis was applied to model the spatial dynamics of mature and secondary forest patches. In particular, the effects of soil fertility, accessibility and auto-correlated neighbourhood terms on forest age and time of isolation of remnant patches were assessed. In heavily transformed landscapes, forests account for approximately 8% of the total landscape area, of which three quarters is secondary forest. Secondary forest growth adjacent to mature forest patches increases mean patch size and core area, and therefore plays an important ecological role in maintaining landscape structure. The regression models show that forest age is positively associated with the amount of neighbouring forest, and negatively associated with the amount of neighbouring secondary vegetation: the older the forest, the less secondary vegetation there is adjacent to it. Accessibility and soil fertility also have a negative but variable influence on the age of forest remnants. The probability of future clearing, if current conditions hold, is higher for regenerated than for mature forests. The challenge of biodiversity conservation and restoration in dynamic and spatially heterogeneous landscape mosaics composed of mature and secondary forests is discussed. (c) 2004 Elsevier B.V. All rights reserved.
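A hedged sketch of the ordinal logistic regression step, fitted here with statsmodels' OrderedModel on synthetic stand-ins for the predictors named above (soil fertility, accessibility, neighbouring forest cover); the data and coefficients are illustrative only, not the paper's land-cover dataset.

```python
# Fit an ordinal (proportional-odds) logit to synthetic forest-age data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "soil_fertility": rng.normal(size=n),
    "accessibility": rng.normal(size=n),
    "neighbour_forest": rng.normal(size=n),
})
# Latent "age score": older where neighbouring forest is high, younger
# where fertility/accessibility (clearing pressure) are high (assumed).
latent = (-0.8 * X.soil_fertility - 0.6 * X.accessibility
          + 1.2 * X.neighbour_forest + rng.logistic(size=n))
age_class = pd.cut(latent, bins=[-np.inf, -1, 1, np.inf],
                   labels=["young", "intermediate", "mature"], ordered=True)

model = OrderedModel(age_class, X, distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```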

Relevance: 30.00%

Publisher:

Abstract:

The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed software based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis, NeuroScale, PhiVis, and Locally Linear Embedding (LLE). In addition, it provides global and local regression facilities, supporting regression algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function network (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to take care of while creating a new model, and provides information about how to install and use the tool. The user manual does not require readers to have familiarity with the algorithms it implements; basic computing skills are enough to operate the software.

Relevance: 30.00%

Publisher:

Abstract:

Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to take care of while creating a new model, and provides information about how to install and use the tool. The user manual does not require readers to have familiarity with the algorithms it implements; basic computing skills are enough to operate the software.
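As a stand-in for the tool's own projection facilities, the sketch below shows the conventional PCA baseline the manual mentions, implemented with scikit-learn on placeholder data; it is not miniDVMS code.

```python
# Map high-dimensional data onto a 2-D visualisation space with PCA.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))            # placeholder for real data

proj = PCA(n_components=2).fit_transform(X)
plt.scatter(proj[:, 0], proj[:, 1], s=8)
plt.xlabel("PC 1"); plt.ylabel("PC 2")
plt.title("2-D PCA projection (conventional baseline)")
plt.show()
```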

Relevance: 30.00%

Publisher:

Abstract:

Stochastic differential equations arise naturally in a range of contexts, from financial to environmental modeling. Current solution methods are limited in their representation of the posterior process in the presence of data. In this work, we present a novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations. The method is applied to two simple problems: the Ornstein-Uhlenbeck process, for which the exact solution is known and can be used for comparison, and the double-well system, for which standard approaches such as the ensemble Kalman smoother fail to provide a satisfactory result. Experiments show that our variational approximation is viable and that the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
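A small sketch of the first benchmark mentioned above: simulating an Ornstein-Uhlenbeck path with the Euler-Maruyama scheme and comparing against the known stationary law. This is generic SDE machinery with assumed parameter values, not the paper's variational Gaussian-process smoother.

```python
# Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW.
import numpy as np

theta, mu, sigma = 1.0, 0.0, 0.5       # OU parameters (assumed values)
dt, n_steps = 1e-3, 200_000

rng = np.random.default_rng(42)
x = np.empty(n_steps)
x[0] = mu
for i in range(1, n_steps):
    x[i] = x[i-1] + theta * (mu - x[i-1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# The stationary distribution is N(mu, sigma^2 / (2*theta)).
print("sample var:", x[n_steps // 2:].var())
print("exact  var:", sigma**2 / (2 * theta))
```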

Relevance: 30.00%

Publisher:

Abstract:

The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high-efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial-scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high-efficiency cyclones for char removal, and a disk and doughnut quench column combined with a wet-walled electrostatic precipitator, which is mounted directly on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and the reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr. This is an area that should be considered in the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry-wood-fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and if longer operating runs had been attempted to offset product losses occurring due to the difficulties in collecting all available product from a large-scale collection unit. The liquids collection system was highly efficient, and modelling determined a liquid collection efficiency of above 99% on a mass basis. This was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which enabled mass measurements of the condensable product exiting the product collection unit; these showed that the collection efficiency was in excess of 99% on a mass basis.
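The scale-up logic can be illustrated with a back-of-envelope calculation: throughput follows from the measured ablation rate times the biomass/surface contact area and the wood density. The contact area and density below are assumptions chosen for illustration, not values from the thesis.

```python
# Throughput = ablation rate * wood density * contact area.
ablation_rate = 0.63e-3   # m/s, measured for pine at 525 C (from the text)
wood_density = 500.0      # kg/m^3, typical pine (assumed)
contact_area = 0.026      # m^2 (~260 cm^2) pressed on the hot surface (assumed)

throughput_kg_s = ablation_rate * wood_density * contact_area
print(f"predicted throughput: {throughput_kg_s * 3600:.1f} kg/hr")
# With ~260 cm^2 of contact this gives ~29 kg/hr, the same order as the
# ~30 kg/hr theoretical maximum quoted in the text.
```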

Relevance: 30.00%

Publisher:

Abstract:

The process of manufacturing system design frequently includes modeling, and usually this means applying a technique such as discrete event simulation (DES). However, the computer tools currently available for applying this technique enable only a superficial representation of the people that operate within the systems. This is a serious limitation because the performance of people remains central to the competitiveness of many manufacturing enterprises. Therefore, this paper explores the use of probability density functions to represent the variation of worker activity times within DES models.
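A minimal sketch of the paper's proposal: drawing worker activity times from a probability density function inside a DES model, here using SimPy as an assumed stand-in for a commercial DES tool; the lognormal parameters are illustrative, not fitted to real workers.

```python
# Worker activity times sampled from a lognormal PDF in a DES model.
import random
import simpy

def worker(env: simpy.Environment, name: str, jobs: int):
    for job in range(jobs):
        # Activity time varies between cycles: a lognormal draw instead
        # of a fixed standard time (mu/sigma of the underlying normal).
        task_time = random.lognormvariate(mu=1.0, sigma=0.35)
        yield env.timeout(task_time)
        print(f"{env.now:7.2f}  {name} finished job {job + 1}")

env = simpy.Environment()
env.process(worker(env, "operator-1", jobs=5))
env.run()
```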

Relevance: 30.00%

Publisher:

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.