981 results for Digital computer simulation.


Relevance: 90.00%

Abstract:

For metal and metal halide vapor lasers excited by a high-frequency pulsed discharge, the thermal effect, caused mainly by the radial temperature distribution, is of considerable importance for stable laser operation and for improving the laser output characteristics. A short survey of the analytical and numerical-analytical mathematical models obtained for the temperature profile in a high-powered He-SrBr2 laser is presented. The models are described by the steady-state heat conduction equation with mixed-type nonlinear boundary conditions for an arbitrary form of the volume power density. A complete model of the radial heat flow between the two tubes is established for precisely calculating the inner wall temperature. The models are applied to simulate temperature profiles for a newly designed laser. The author's software prototype LasSim is used to implement the mathematical models and carry out the simulations.
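The steady-state radial heat conduction problem described above can be sketched numerically. The snippet below solves the axisymmetric equation for an arbitrary volume power density by direct quadrature of its integral form; the conductivity, radius, heating density and wall temperature are illustrative assumptions, not parameters of the He-SrBr2 laser or of LasSim.

```python
def radial_temperature(q, k, R, T_wall, n=2000):
    """Steady-state T(r) for an axisymmetric cylinder with volumetric
    heating q(r) [W/m^3], thermal conductivity k, outer radius R and a
    fixed wall temperature T_wall, using the integral form of the heat
    conduction equation:
        T(r) = T_wall + (1/k) * int_r^R (1/r') int_0^{r'} q(s) s ds dr'
    Both integrals are evaluated with the trapezoid rule."""
    dr = R / n
    rs = [i * dr for i in range(n + 1)]
    # inner integral I(r') = int_0^{r'} q(s) s ds
    inner = [0.0]
    for i in range(1, n + 1):
        a, b = rs[i - 1], rs[i]
        inner.append(inner[-1] + 0.5 * (q(a) * a + q(b) * b) * dr)
    # outer integral, accumulated backwards from the wall inward
    T = [0.0] * (n + 1)
    T[n] = T_wall
    for i in range(n - 1, -1, -1):
        fa = inner[i] / rs[i] if rs[i] > 0 else 0.0   # limit at r = 0
        fb = inner[i + 1] / rs[i + 1]
        T[i] = T[i + 1] + 0.5 * (fa + fb) * dr / k
    return rs, T

# Illustrative uniform heating of 1 MW/m^3 in a 2 cm tube at a 700 K wall
rs, T = radial_temperature(lambda r: 1.0e6, k=0.5, R=0.02, T_wall=700.0)
```

For uniform heating this reproduces the parabolic analytic profile T(r) = T_wall + q(R² − r²)/(4k), peaking on the tube axis.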

Relevance: 90.00%

Abstract:

We discuss some main points of computer-assisted proofs based on reliable numerical computations. Such so-called self-validating numerical methods, in combination with exact symbolic manipulations, result in very powerful mathematical software tools. These tools allow proving mathematical statements (existence of a fixed point, of a solution of an ODE, of a zero of a continuous function, of a global minimum within a given range, etc.) using a digital computer. To validate the assertions of the underlying theorems, fast finite-precision arithmetic is used. The results are absolutely rigorous. To demonstrate the power of reliable symbolic-numeric computations, we investigate in some detail the verification of very long periodic orbits of chaotic dynamical systems. The verification is done directly in Maple, e.g. using the Maple Power Tool intpakX or, more efficiently, the C++ class library C-XSC.
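The flavour of such self-validating computations can be sketched with a few lines of interval arithmetic. The toy class below omits the outward rounding that real tools such as intpakX or C-XSC perform at every operation, so it is illustrative only: evaluating f(x) = x² − 2 over two intervals encloses the range of f on each, and the resulting sign change proves (up to rounding) that a zero of the continuous function lies between them.

```python
class Interval:
    """Minimal interval arithmetic. Real self-validating libraries
    round the lower bound down and the upper bound up at every step;
    this sketch uses plain floating point, so it is not rigorous."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def f(x):
    # f(x) = x*x - 2, whose positive zero is sqrt(2)
    return x * x - Interval(2.0, 2.0)

# Range enclosures on either side of sqrt(2):
left  = f(Interval(1.0, 1.4))   # enclosure lies entirely below zero
right = f(Interval(1.5, 2.0))   # enclosure lies entirely above zero
```

Since f is continuous, the verified sign change implies a zero of f in [1.4, 1.5] by the intermediate value theorem; production tools then bisect or apply an interval Newton step to tighten the enclosure.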

Relevance: 90.00%

Abstract:

This paper presents the European Leonardo da Vinci Transfer of Innovation project "Teacher training to improve attractiveness and quality of management education through the simulation tool 'Emerald Forest'", which focuses on using a computer simulation tool to make teaching and learning in economics more attractive. An overview of the use of computer systems, and of serious games in particular, in education is also provided. "Education is not the filling of a pail, but the lighting of a fire" - William Butler Yeats

Relevance: 90.00%

Abstract:

Sol-gel-synthesized bioactive glasses may be formed via a hydrolysis-condensation reaction, silica being introduced in the form of tetraethyl orthosilicate (TEOS) and calcium typically added in the form of calcium nitrate. The synthesis reaction proceeds in an aqueous environment; the resultant gel is dried before stabilization by heat treatment. These materials, being amorphous, are complex at the level of their atomic-scale structure, and their bulk properties may only be properly understood on the basis of that structural insight. Thus, a full understanding of their structure-property relationship may only be achieved through the application of a coherent suite of leading-edge experimental probes, coupled with the cogent use of advanced computer simulation methods. Using as an exemplar a calcia-silica sol-gel glass of the kind developed by Larry Hench, to whose memory this paper is dedicated, we illustrate the successful use of high-energy X-ray and neutron scattering (diffraction) methods, magic-angle spinning solid-state NMR, and molecular dynamics simulation as components of a powerful methodology for the study of amorphous materials.
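One structural quantity that links the simulation and diffraction sides of such a methodology is the pair distribution function: diffraction measures its Fourier transform, while it can be computed directly from simulated coordinates. The sketch below computes g(r) for an arbitrary set of particles in a periodic cubic box; the random configuration is a stand-in for illustration, not a glass model.

```python
import math
import random

def rdf(coords, box, r_max, n_bins):
    """Radial distribution function g(r) for particles in a cubic box
    of side `box` with periodic boundary conditions (minimum-image
    convention). Each pair within r_max is histogrammed, then each bin
    is normalised by the ideal-gas count for its spherical shell."""
    n = len(coords)
    dr = r_max / n_bins
    hist = [0] * n_bins
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for a in range(3):
                d = coords[i][a] - coords[j][a]
                d -= box * round(d / box)      # minimum image
                d2 += d * d
            r = math.sqrt(d2)
            if r < r_max:
                hist[int(r / dr)] += 2         # count the pair both ways
    rho = n / box ** 3
    g = []
    for b in range(n_bins):
        shell = 4.0 * math.pi * ((b + 1) ** 3 - b ** 3) * dr ** 3 / 3.0
        g.append(hist[b] / (shell * rho * n))
    return g

random.seed(0)
box = 10.0
coords = [[random.uniform(0, box) for _ in range(3)] for _ in range(200)]
g = rdf(coords, box, r_max=5.0, n_bins=25)   # ideal gas: g(r) ~ 1
```

For a real glass model the peaks of the partial g(r) (e.g. Si-O, Ca-O) give the bond lengths and coordination environments that are compared against the X-ray and neutron data.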

Relevance: 90.00%

Abstract:

The peculiarities of Roman architecture, town planning, and landscape architecture are visible in many of the empire's remaining cities. However, an evaluation of the landscapes and an analysis of the urban fabric, spatial compositions, and the concepts and characteristics of its open spaces are missing for Jerash (Gerasa in antiquity) in Jordan. Those missing elements will be discussed in this work, as an example of an urban arrangement that survived through different civilizations in history.

To address the characteristics of the exterior spaces in Jerash, a study of the major concepts of planning in Classical Antiquity will be conducted, followed by a comparative analysis of the quality of space and architectural composition in Jerash. Through intensive investigation of the data available for the area under study, the historical method used in this paper illustrates the uniqueness of the site's urban morphology and architectural disposition.

An analysis will be performed to compare the design composition of the landscape, urban fabric, and open space of Jerash as a provincial Roman city with its existing excavated remains. Such an analysis will provide new information about the roles these factors and their relationships played in determining the design layout of the city. Information will be acquired on the relationship between void and solid, the shaping of space, the ground and ceiling, the composition of city elements, the ancient landscapes, and the relationship between the land and architecture.

A computer simulation of a portion of the city will be developed to enable researchers, students, and citizens interested in Jordan's past to visualize more clearly what the city looked like in its prime. Such a simulation could contribute to a revival of the old city of Jerash and help promote its tourism.

Relevance: 90.00%

Abstract:

Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, which takes advantage of these atomic clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials will be able to be tailored to perform as no previous material could.

The idea of the ionized cluster beam (ICB) thin-film deposition technique was first proposed by Takagi in 1972. It was based on using a supersonic jet source to produce, ionize, and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for the formation of cluster beams suitable for thin-film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development.

In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices.

The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory.

A novel in situ real-time mass, energy and velocity measurement apparatus has been designed, built and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from problems associated with other methods of cluster size measurement, such as the need for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and the lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured value of the background gas mass is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes.

Selective-area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
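The mass determination from the measured energy and pulsed-beam delay follows directly from m = 2E/v². The sketch below applies it; the beam energy, flight path and delay are hypothetical values chosen for illustration, not the apparatus's actual parameters.

```python
def cluster_mass_amu(energy_ev, flight_path_m, delay_s):
    """Cluster mass in atomic mass units from m = 2E/v^2, where the
    velocity v comes from the pulsed-beam delay over a known flight
    path and E is the kinetic energy from the electrostatic analyzer."""
    E = energy_ev * 1.602176634e-19        # eV -> J
    v = flight_path_m / delay_s            # m/s
    m_kg = 2.0 * E / v ** 2
    return m_kg / 1.66053906660e-27        # kg -> amu

# e.g. a 1 keV cluster crossing an assumed 0.5 m path in 50 microseconds
m = cluster_mass_amu(1000.0, 0.5, 50e-6)
```

With these assumed numbers the cluster comes out near 1,930 amu, i.e. on the order of 30 zinc atoms, showing how modest timing resolution suffices for large clusters.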

Relevance: 90.00%

Abstract:

The physics of self-organization and complexity is manifested on a variety of biological scales, from large ecosystems to the molecular level. Protein molecules exhibit characteristics of complex systems in terms of their structure, dynamics, and function. Proteins have the extraordinary ability to fold to a specific functional three-dimensional shape, starting from a random coil, in a biologically relevant time. How they accomplish this is one of the secrets of life. In this work, theoretical research into understanding this remarkable behavior is discussed. Thermodynamic and statistical mechanical tools are used to investigate protein folding dynamics and stability.

Theoretical analyses of the results from computer simulation of the dynamics of a four-helix bundle show that excluded-volume entropic effects are very important in protein dynamics and crucial for protein stability. The dramatic effects of changing the size of sidechains imply that the strategic placement of amino acid residues of a particular size may be an important consideration in protein engineering.

Another investigation deals with modeling protein structural transitions as a phase transition. Using finite-size scaling theory, the nature of the unfolding transition of a four-helix bundle protein was investigated, and critical exponents for the transition were calculated for various hydrophobic strengths in the core. It is found that the order of the transition changes from first to higher order as the strength of the hydrophobic interaction in the core region is significantly increased.

Finally, a detailed kinetic and thermodynamic analysis was carried out in a model two-helix bundle, and the connection between the structural free-energy landscape and folding kinetics was quantified. I show how simple protein engineering, by changing the hydropathy of a small number of amino acids, can enhance protein folding by significantly changing the free-energy landscape so that kinetic traps are removed. The results have general applicability in protein engineering as well as in understanding the underlying physical mechanisms of protein folding.
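The thermodynamic side of such two-state descriptions is easy to make concrete. The sketch below gives the equilibrium folded fraction of a two-state protein as a function of the folding free energy; the free-energy values are illustrative assumptions, not results from the bundle models above.

```python
import math

def folded_fraction(dG_kcal, T=300.0):
    """Equilibrium folded fraction of a two-state protein, where
    dG = G_unfolded - G_folded in kcal/mol:
        P_folded = 1 / (1 + exp(-dG / RT))"""
    R = 1.987204e-3                  # gas constant, kcal/(mol*K)
    return 1.0 / (1.0 + math.exp(-dG_kcal / (R * T)))

# A hypothetical hydropathy change that stabilises the folded state
# by 2 kcal/mol sharply raises the folded population at 300 K:
before, after = folded_fraction(1.0), folded_fraction(3.0)
```

Since RT is about 0.6 kcal/mol at 300 K, shifts of only a few kcal/mol, i.e. a handful of residue substitutions, move the equilibrium from marginal to strongly folded.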

Relevance: 90.00%

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model used in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and mostly lead to heuristic solutions.

The first phase of the study is focused on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical.

The second phase of this study is focused on the development of an effective solution approach for the large-scale problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation as the tool, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most-profit rule performs best. The shifting bottleneck and earliest operation finish time rules are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio.

The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show that it improves total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for solving it at industrial scale.
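The structure of the concurrent decisions can be illustrated at toy scale. The sketch below enumerates order subsets for a hypothetical single-machine instance, checking capacity and due-date feasibility under an earliest-due-date sequence; all order data are invented, and exactly this combinatorial explosion is what makes industrial instances impractical for exact solvers and motivates decomposition heuristics.

```python
from itertools import combinations

# Hypothetical orders as (profit, processing_time, due_date) -- invented data
orders = [(40, 3, 5), (30, 2, 4), (50, 4, 6), (25, 2, 7)]
capacity = 8   # total machine time available

def best_selection(orders, capacity):
    """Brute-force order selection for a toy single-machine MTO instance:
    choose the subset with maximum total profit that fits within the
    capacity and can be sequenced earliest-due-date-first (EDD) with no
    order finishing after its due date."""
    best, best_profit = (), 0
    for k in range(len(orders) + 1):
        for subset in combinations(orders, k):
            if sum(o[1] for o in subset) > capacity:
                continue                      # violates capacity
            t, feasible = 0, True
            for profit, proc, due in sorted(subset, key=lambda o: o[2]):
                t += proc                     # schedule in EDD order
                if t > due:
                    feasible = False          # an order would be late
                    break
            if feasible:
                p = sum(o[0] for o in subset)
                if p > best_profit:
                    best, best_profit = subset, p
    return best, best_profit

selected, profit = best_selection(orders, capacity)
```

Here the most profitable single order (profit 50) is rejected because accepting it makes another due date unattainable, which is precisely the interaction between selection, sizing, and scheduling that the paper's iterative approach exploits.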

Relevance: 90.00%

Abstract:

The worldwide trend toward managing and operating hydroelectric power plants remotely is forcing experts to rethink their machine monitoring and diagnostic strategies. It has also led to a reduction in the on-site expert staff responsible for operating and maintaining the technical systems and for responding to any eventuality. For this reason, expert systems have been under development for several years to compensate for these human-resource deficiencies. Although such systems have reached interesting levels of independence, they still require an accompanying expert who can interpret the evidence, issue a diagnosis, and make a decision. One aspect that still needs improvement is that of false alarms, which can produce a "cry wolf" effect and end up causing the system to be deactivated. Another way to face this new operating dynamic is to outsource the technical diagnostic service, which can give acceptable results, but not always in the case of hydroelectric plants. These plants are generally located at remote sites, at times cut off by geographic and climatic conditions, so it is not possible to react quickly to an eventuality when the expert and his instruments are not nearby. A convenient solution is, in fact, the centralization of expertise for monitoring and technical diagnostic services, supported by a portable and ideally non-invasive platform that remains with the machines at all times and can be consulted on-line. In this way a small number of experts will have permanent access to the variables or symptoms that define the technical state of the machinery; they will analyze the symptomatic signals, evaluate the results, issue judgments, and prepare executive reports that ultimately reach the administrator or the person in charge of operations. This alternative will relieve burdens associated with the monitoring and diagnostic processes: instrumentation and sensors, cabling, signal conditioning, digital data acquisition, signal processing, equipment administration and management, reporting of results, recommendations, etc. This project proposes, in two stages, the design of a technological platform that can support this alternative: in detail, the design of an integrated data-acquisition system that is portable, modular, and scalable, suited to monitoring the main diagnostic variables of a hydroelectric plant; although it does not incorporate an expert system, it does offer state-of-the-art tools for analysis, diagnosis, and decision making.

Relevance: 90.00%

Abstract:

We apply Agent-Based Modeling and Simulation (ABMS) to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Despite the fact that we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for developing organizational capabilities in the future. Our multi-disciplinary research team has worked with a UK department store to collect data and capture perceptions about operations from actors within departments. Based on this case study work, we have built a simulator that we present in this paper. We then use the simulator to gather empirical evidence regarding two specific management practices: empowerment and employee development.
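A minimal agent-based sketch conveys the mechanics of such a simulator, though it is not the authors' model: staff agents serve customer agents, and an empowerment parameter raises the chance that a service interaction ends in a satisfied customer. All parameters, including the assumed linear effect of empowerment, are illustrative assumptions.

```python
import random

def run_shift(n_customers=1000, n_staff=5, empowerment=0.5, seed=42):
    """Fraction of satisfied customers over one simulated shift. Each
    customer agent requests help; a free staff agent serves them, and
    the empowerment level (0..1) raises the probability that the
    interaction succeeds. All numbers here are assumptions."""
    rng = random.Random(seed)
    p_success = 0.4 + 0.5 * empowerment   # assumed linear effect
    satisfied = served = 0
    waiting = n_customers
    while waiting > 0:
        batch = min(n_staff, waiting)     # staff serve in parallel per tick
        waiting -= batch
        served += batch
        satisfied += sum(rng.random() < p_success for _ in range(batch))
    return satisfied / served

# Compare a low-empowerment and a high-empowerment policy:
low, high = run_shift(empowerment=0.1), run_shift(empowerment=0.9)
```

A real simulator replaces the single success probability with heterogeneous agent states (experience, mood, workload) calibrated from the case-study data, which is where emergent, department-level effects appear.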

Relevance: 90.00%

Abstract:

In our research we investigate the output accuracy of discrete-event simulation models and agent-based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, as it can be implemented in both modelling approaches using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study example, both discrete-event simulation and agent-based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
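The discrete-event side of such a comparison can be sketched with a simple resource queue; this is a generic illustration, not the paper's model. Customers arrive at random times, wait for the earliest free fitting room, and occupy it for a random duration, so the average wait is the policy metric one would compare across scenarios.

```python
import heapq
import random

def fitting_room_sim(n_rooms=4, n_customers=200, seed=1):
    """Average customer waiting time (minutes) for a bank of fitting
    rooms over an 8-hour day. Arrival and changing times are
    illustrative uniform draws, not fitted to any real department."""
    rng = random.Random(seed)
    arrivals = sorted(rng.uniform(0.0, 480.0) for _ in range(n_customers))
    rooms = [0.0] * n_rooms          # min-heap: time each room is next free
    heapq.heapify(rooms)
    total_wait = 0.0
    for t in arrivals:
        free = heapq.heappop(rooms)  # earliest room to become free
        start = max(t, free)         # wait only if no room is free yet
        total_wait += start - t
        heapq.heappush(rooms, start + rng.uniform(3.0, 10.0))
    return total_wait / n_customers

# A capacity scenario comparison, as in a multi-scenario experiment:
one_room, six_rooms = fitting_room_sim(n_rooms=1), fitting_room_sim(n_rooms=6)
```

The agent-based counterpart would attach reactive rules to individual staff and customer agents (e.g. balking, staff reallocation) instead of treating the rooms as a passive resource pool.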

Relevance: 90.00%

Abstract:

Molecular simulation can provide valuable guidance in establishing clear links between structure and function to enable the design of new polymer-based materials. However, molecular simulation of thermoset polymers in particular, such as epoxies, presents specific challenges, chiefly in the credible preparation of polymerised samples. Despite this need, a comprehensive, reproducible and robust process for accomplishing this using molecular simulation is still lacking. Here, we introduce a clear and reproducible cross-linking protocol to reliably generate three-dimensional epoxy cross-linked polymer structures for use in molecular simulations. This protocol is sufficiently detailed to allow complete reproduction of our results, and is applicable to any general thermoset polymer. Amongst our developments, key features include a reproducible procedure for the calculation of partial atomic charges, a reliable process for generating and validating an equilibrated liquid precursor mixture, and the establishment of a novel, robust and reproducible protocol for generating the three-dimensional cross-linked solid polymer. We use these structures as input to subsequent molecular dynamics simulations to calculate a range of thermo-mechanical properties, which compare favourably with experimental data. Our general protocol provides a benchmark for the process of simulating epoxy polymers, and can be readily translated to prepare and model epoxy samples that are dynamically cross-linked in the presence of surfaces and nanostructures.
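The core of any such protocol's bonding step is a distance-based search for reactive pairs. The sketch below is a generic single pass, not this paper's protocol: each unreacted epoxide site bonds to the nearest still-available amine site within a capture radius, and real workflows alternate such passes with relaxation dynamics so the new bonds can equilibrate. The coordinates and cutoff are illustrative.

```python
import math
import random

def crosslink_pass(epoxide_sites, amine_sites, cutoff):
    """One distance-based bonding pass: each unreacted epoxide site is
    bonded to the nearest still-available amine site within `cutoff`.
    Returns the list of (epoxide_index, amine_index) bonds formed."""
    bonds, used = [], set()
    for i, e in enumerate(epoxide_sites):
        best, best_d = None, cutoff
        for j, a in enumerate(amine_sites):
            if j in used:
                continue                 # each amine site reacts once here
            d = math.dist(e, a)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            bonds.append((i, best))
            used.add(best)
    return bonds

# Illustrative random reactive-site positions in a 20 A box
random.seed(3)
epoxides = [[random.uniform(0, 20) for _ in range(3)] for _ in range(30)]
amines   = [[random.uniform(0, 20) for _ in range(3)] for _ in range(30)]
bonds = crosslink_pass(epoxides, amines, cutoff=5.0)
conversion = len(bonds) / len(epoxides)   # degree of cure after one pass
```

Iterating bond passes and relaxation until the target conversion is reached, while tracking charge reassignment on the reacted atoms, is what distinguishes a reproducible protocol from an ad hoc one.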

Relevance: 80.00%

Abstract:

This paper uses dynamic computer simulation techniques to apply a vibration-based, multi-criteria procedure for damage assessment in multiple-girder composite bridges. In addition to changes in natural frequencies, the procedure incorporates two methods, namely the modal flexibility method and the modal strain energy method. Using numerically simulated modal data obtained through finite element analysis software, algorithms based on the change in modal flexibility and modal strain energy before and after damage are derived and used as indices for assessing the structural health state. The feasibility and capability of the approach are demonstrated through numerical studies of the proposed structure with six damage scenarios. It is concluded that the modal strain energy method is competent for application to multiple-girder composite bridges, as evidenced by the example treated in this paper.
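The modal flexibility index can be made concrete in a few lines. The sketch below assembles the flexibility matrix from mass-normalised mode shapes and natural frequencies, and takes the maximum column change between intact and damaged states as a per-DOF damage indicator; the mode shapes and frequencies are invented illustrations, not the bridge model's data.

```python
import math

def modal_flexibility(modes, freqs_hz):
    """Modal flexibility matrix F = sum_i phi_i phi_i^T / omega_i^2
    from mass-normalised mode shapes. The 1/omega^2 weighting means a
    truncated set of lower modes already gives a good estimate of F."""
    n = len(modes[0])
    F = [[0.0] * n for _ in range(n)]
    for phi, f in zip(modes, freqs_hz):
        w2 = (2.0 * math.pi * f) ** 2
        for a in range(n):
            for b in range(n):
                F[a][b] += phi[a] * phi[b] / w2
    return F

def flexibility_damage_index(F_intact, F_damaged):
    """Per-DOF damage indicator: maximum absolute column change of F."""
    n = len(F_intact)
    return [max(abs(F_damaged[a][b] - F_intact[a][b]) for a in range(n))
            for b in range(n)]

# Invented 3-DOF example: damage lowers the first natural frequency
modes = [[0.5, 1.0, 0.5], [1.0, 0.0, -1.0]]
F_intact  = modal_flexibility(modes, [2.0, 5.0])
F_damaged = modal_flexibility(modes, [1.8, 5.0])
index = flexibility_damage_index(F_intact, F_damaged)
```

Damage reduces stiffness, lowering frequencies and hence increasing flexibility, so the columns with the largest change point toward the damaged region.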

Relevance: 80.00%

Abstract:

Introduction: Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is an FEA technique whose analysis is derived from a single 2D radiographic image. Methods: 18 excised human femora had previously been quantitative computed tomography scanned, from which 2D BMD-equivalent radiographic images were derived, and mechanically tested to failure in a stance-loading configuration. A 3D proximal femur shape was generated from each 2D radiographic image and used to construct 3D-FEA models. Results: The coefficient of determination (R²) for predicting failure load was 54.5% for BMD and 80.4% for 3D-FEXI. Conclusions: This ex vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD. This approach may be readily extended to routine clinical BMD images derived by dual-energy X-ray absorptiometry. Crown Copyright © 2009 Published by Elsevier Ltd on behalf of IPEM. All rights reserved.
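The coefficient of determination quoted above is straightforward to compute from paired measurements and predictions. The failure-load numbers in the sketch below are made up for illustration; they are not the study's data.

```python
def r_squared(y, y_pred):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot, the
    statistic used to compare predictors of failure load."""
    mean = sum(y) / len(y)
    ss_tot = sum((v - mean) ** 2 for v in y)        # total variance
    ss_res = sum((v - p) ** 2 for v, p in zip(y, y_pred))  # residuals
    return 1.0 - ss_res / ss_tot

# Hypothetical measured failure loads (N) and model predictions
measured  = [4200.0, 5100.0, 3300.0, 6000.0, 4700.0]
predicted = [4000.0, 5300.0, 3500.0, 5800.0, 4900.0]
r2 = r_squared(measured, predicted)
```

An R² of 80.4% versus 54.5% means 3D-FEXI leaves roughly half as much of the failure-load variance unexplained as BMD does.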

Relevance: 80.00%

Abstract:

Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is an FEA technique whose analysis is derived from a single 2D radiographic image. This ex vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD.