903 results for Modeling and Simulation


Relevance:

100.00%

Publisher:

Abstract:

Aluminum alloyed with small atomic fractions of Sc, Zr, and Hf has been shown to exhibit high-temperature microstructural stability that may improve high-temperature mechanical behavior. These quaternary alloys were designed using thermodynamic modeling to increase the volume fraction of precipitated tri-aluminide phases and thereby improve thermal stability. When aged during a multi-step, isochronal heat treatment, two compositions showed a secondary room-temperature hardness peak of up to 700 MPa at 450°C. Elevated-temperature hardness profiles also indicated an increase in hardness between 200 and 300°C, attributed to the precipitation of Al3Sc; however, no secondary hardness response was observed from the Al3Zr or Al3Hf phases in this alloy.
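The thermodynamic calculations themselves are not reproduced in the abstract; as a minimal illustrative sketch (not the thesis's method), the equilibrium volume fraction of an Al3X tri-aluminide phase can be estimated from a lever-rule mass balance. Every numerical value below is an assumed placeholder, not data from the work.

    # Hedged sketch: lever-rule estimate of Al3X precipitate volume fraction.
    # All numbers below are illustrative assumptions, not data from the thesis.
    def al3x_volume_fraction(c0, c_matrix_eq, c_precip=0.25,
                             v_molar_matrix=10.0e-6, v_molar_precip=10.5e-6):
        """c0: overall atomic fraction of solute X in the alloy
           c_matrix_eq: equilibrium solute fraction left in the Al matrix
           c_precip: solute fraction in Al3X (1 X atom per 4 atoms = 0.25)
           v_molar_*: molar volumes in m^3/mol (assumed values)."""
        # Lever rule on atomic fractions gives the mole fraction of precipitate phase
        f_mole = (c0 - c_matrix_eq) / (c_precip - c_matrix_eq)
        # Convert mole fraction to volume fraction using the molar volumes
        return (f_mole * v_molar_precip) / (
            f_mole * v_molar_precip + (1.0 - f_mole) * v_molar_matrix)

    # Example: 0.06 at.% Sc with an assumed 0.01 at.% residual matrix solubility
    print(al3x_volume_fraction(c0=0.0006, c_matrix_eq=0.0001))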

Relevance:

100.00%

Publisher:

Abstract:

For the past three decades the automotive industry has faced two main conflicting challenges: improving fuel economy and meeting emissions standards. This has driven engineers and researchers around the world to develop engines and powertrains that can meet these two daunting challenges. Focusing on internal combustion engines, there are very few options for enhancing their performance beyond the current standards without increasing the price considerably. Homogeneous Charge Compression Ignition (HCCI) is one combustion technique with the potential to partially meet the current critical challenges, including CAFE standards and stringent EPA emissions standards. HCCI operates on very lean mixtures compared to current SI engines, resulting in very low combustion temperatures and ultra-low NOx emissions. When controlled accurately, these engines also produce ultra-low soot. On the other hand, HCCI engines suffer from high unburnt hydrocarbon and carbon monoxide emissions. The technology also faces an acute combustion control problem which, if not dealt with properly, yields highly unfavorable operating conditions and exhaust emissions. This thesis contains two main parts: one deals with developing an HCCI experimental setup, and the other focuses on developing a grey-box modelling technique to control HCCI exhaust gas emissions. The experimental part gives complete details of the modifications made to the stock engine to run it in HCCI mode. It also provides details and specifications of all the sensors, actuators, and other auxiliary parts attached to the conventional SI engine in order to run and monitor the engine in SI mode and in future SI-HCCI mode-switching studies. In the latter part, around 600 data points from two different HCCI setups on two different engines are studied, and a grey-box model for emission prediction is developed. The grey-box model is trained with 75% of the data, and the remaining data are used for validation. An average 70% increase in accuracy for predicting engine performance is found when using the grey-box model instead of an empirical (black-box) model. The grey-box model offers a solution to the difficulty of real-time control of an HCCI engine and is the first control-oriented model in the literature for predicting HCCI engine emissions.
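The structure of the thesis's grey-box model is not given in the abstract; as a minimal sketch of the general idea (a physics-motivated prior corrected by a data-driven term fitted on training data), with assumed variable names and a toy prior:

    # Hedged sketch of a grey-box emissions model: a simple physics-motivated
    # term plus a linear data-driven correction fitted by least squares.
    # The prior, variable names, and synthetic data are illustrative assumptions.
    import numpy as np

    def physics_prior(equivalence_ratio, intake_temp_K):
        # Toy prior: leaner mixtures and cooler intake -> more unburnt HC
        return 1.0 / np.maximum(equivalence_ratio, 1e-3) + 0.01 * (400.0 - intake_temp_K)

    def fit_grey_box(X_train, y_train):
        """X_train columns: [equivalence_ratio, intake_temp_K]; y_train: measured emissions."""
        residual = y_train - physics_prior(X_train[:, 0], X_train[:, 1])
        A = np.column_stack([X_train, np.ones(len(X_train))])
        coeffs, *_ = np.linalg.lstsq(A, residual, rcond=None)  # black-box correction
        return coeffs

    def predict_grey_box(coeffs, X):
        A = np.column_stack([X, np.ones(len(X))])
        return physics_prior(X[:, 0], X[:, 1]) + A @ coeffs

    # Synthetic example with a 75/25 train/validation split, as in the thesis
    rng = np.random.default_rng(0)
    X = np.column_stack([rng.uniform(0.2, 0.5, 600), rng.uniform(350, 450, 600)])
    y = physics_prior(X[:, 0], X[:, 1]) + 0.1 * X[:, 0] + rng.normal(0, 0.05, 600)
    coeffs = fit_grey_box(X[:450], y[:450])
    print(np.mean(np.abs(predict_grey_box(coeffs, X[450:]) - y[450:])))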

Relevance:

100.00%

Publisher:

Abstract:

The Pleistocene carbonate rock of the Biscayne Aquifer in south Florida contains laterally extensive, bioturbated oolitic zones characterized by interconnected touching-vug megapores that channelize most flow and make the aquifer extremely permeable. Standard petrophysical laboratory techniques may not be capable of accurately measuring such high permeabilities, so innovative procedures that can measure high permeabilities were applied. These fragile rocks cannot easily be cored or cut into shapes convenient for permeability measurements; for the laboratory measurement, a 3D epoxy-resin printed rock core was therefore produced from computed tomography data obtained from an outcrop sample. Permeability measurements were conducted using a viscous fluid to permit easily observable head gradients (~2 cm over 1 m) while maintaining low-Reynolds-number flow. For a second, independent permeability estimate, Lattice Boltzmann Method flow simulations were computed on the 3D core renderings. Agreement between the two estimates indicates that an accurate permeability was obtained that can be applied in future studies.
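The abstract implies a Darcy-type interpretation of the column test; as a minimal sketch (illustrative values, not the study's data), the intrinsic permeability follows from the measured head gradient, flow rate, and fluid properties:

    # Hedged sketch: intrinsic permeability from a Darcy column experiment.
    # All numerical values are illustrative assumptions, not the study's data.
    def intrinsic_permeability(Q, A, dh, L, mu, rho, g=9.81):
        """Q: volumetric flow rate [m^3/s], A: core cross-section [m^2],
           dh: head drop [m] over length L [m], mu: dynamic viscosity [Pa s],
           rho: fluid density [kg/m^3]. Returns intrinsic permeability k [m^2]."""
        hydraulic_gradient = dh / L
        # Darcy's law with a viscous fluid: Q = (k * rho * g / mu) * A * (dh / L)
        return Q * mu / (rho * g * A * hydraulic_gradient)

    # Example: viscous fluid, ~2 cm head drop over a 1 m column (as in the abstract)
    print(intrinsic_permeability(Q=1e-5, A=0.01, dh=0.02, L=1.0,
                                 mu=0.1, rho=900.0))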

Relevance:

100.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and is formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified against the partial-order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools must consider the trade-off between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity-violation predictions: 1) a post-prediction analysis method that increases coverage while ensuring precision; and 2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
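McPatom itself is not described in detail here; as a minimal sketch of the general idea it builds on (two threads, one shared variable, and the standard unserializable three-access interleaving patterns), with an assumed event encoding:

    # Hedged sketch: flag single-variable atomicity-violation patterns on a trace:
    # two consecutive accesses by one thread to a variable with an interleaved
    # remote access forming one of the standard unserializable patterns.
    # The event encoding (thread, op, var) is an assumption for illustration.
    UNSERIALIZABLE = {('r', 'w', 'r'), ('w', 'w', 'r'), ('w', 'r', 'w'), ('r', 'w', 'w')}

    def find_violations(trace):
        """trace: list of (thread_id, op, var) events with op in {'r', 'w'}."""
        violations = []
        for i, (t1, op1, v1) in enumerate(trace):
            for j in range(i + 1, len(trace)):
                t2, op2, v2 = trace[j]
                if t2 == t1 and v2 == v1:
                    # next access by the same thread to the same variable
                    for k in range(i + 1, j):
                        tr, opr, vr = trace[k]
                        if tr != t1 and vr == v1 and (op1, opr, op2) in UNSERIALIZABLE:
                            violations.append((trace[i], trace[k], trace[j]))
                    break
        return violations

    # Example: thread 0 reads x, thread 1 writes x, thread 0 writes x (lost update)
    print(find_violations([(0, 'r', 'x'), (1, 'w', 'x'), (0, 'w', 'x')]))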

Relevance:

100.00%

Publisher:

Abstract:

Minimization of undesirable temperature gradients in all dimensions of a planar solid oxide fuel cell (SOFC) is central to the thermal management and commercialization of this electrochemical reactor. This article explores the operating variables that most affect the temperature gradient in a multilayer SOFC stack and presents a trade-off optimization. Three promising approaches are numerically tested via a model-based sensitivity analysis. The numerically efficient thermo-chemical model previously developed by the authors for cell-scale investigations (Tang et al. Chem. Eng. J. 2016, 290, 252-262) is integrated and extended in this work to allow further thermal studies at commercial scales. Initially, the most common approach to minimizing a stack's thermal inhomogeneity, i.e., the use of excess air, is critically assessed. Subsequently, the adjustment of inlet gas temperatures is introduced as a complementary methodology to reduce the efficiency loss caused by the application of excess air. As another practical approach, regulation of the oxygen fraction in the cathode coolant stream is examined from both technical and economic viewpoints. Finally, a multiobjective optimization calculation is conducted to find an operating condition in which the stack's efficiency is maximized and its temperature gradient minimized.
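The article's thermo-chemical model is not reproduced here; as a minimal sketch of a weighted-sum trade-off between efficiency and temperature gradient over a grid of operating points (the objective functions and numbers below are toy surrogates, not the paper's model):

    # Hedged sketch: weighted-sum scan of a two-objective trade-off
    # (maximize efficiency, minimize temperature gradient) versus excess air.
    # Both objective functions are toy surrogates with assumed coefficients.
    import numpy as np

    def efficiency(excess_air_ratio):
        # Toy surrogate: more excess air -> more blower/preheating loss
        return 0.55 - 0.05 * (excess_air_ratio - 1.0)

    def max_temp_gradient(excess_air_ratio):
        # Toy surrogate: more excess air -> better cooling -> smaller gradient (K/cm)
        return 12.0 / excess_air_ratio

    candidates = np.linspace(1.0, 8.0, 71)            # excess-air ratios to scan
    for w in (0.2, 0.5, 0.8):                          # weight placed on efficiency
        score = w * efficiency(candidates) - (1 - w) * 0.01 * max_temp_gradient(candidates)
        best = candidates[np.argmax(score)]
        print(f"weight on efficiency {w}: best excess-air ratio = {best:.2f}")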

Relevance:

100.00%

Publisher:

Abstract:

In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theory and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion of common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the "tyranny" of classes. They attribute a number of problems in information modeling to inherent classification, that is, to the disregard for the fact that instances can be conceptualized independently of any class assignment. By separating instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the critique of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres of information modeling have their own connotations, but they are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling through the layered information model can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools such as classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schemas, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve, and interoperate in a networked environment, knowledge organization specialists must consider the kind of semantic class independence that Parsons and Wand propose for information modeling.
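The layered model is described only conceptually in the abstract; as a minimal sketch of the core idea (instances carry properties and exist independently of any class, while classes are views defined over properties), with hypothetical item names and properties:

    # Hedged sketch of a two-layer model: instances live on a base layer with
    # properties only; classes are separate, property-based predicates.
    # Item identifiers, properties, and class definitions are hypothetical.
    instances = {
        "item42": {"title": "Coastal hydrology report", "format": "pdf", "year": 2003},
        "item43": {"creator": "J. Doe", "format": "dataset"},
    }

    # Class layer: each class is a predicate over instance properties
    classes = {
        "Document": lambda p: "title" in p,
        "Dataset":  lambda p: p.get("format") == "dataset",
        "Recent":   lambda p: p.get("year", 0) >= 2000,
    }

    def classify(instance_id):
        """Return every class an instance currently satisfies; it may satisfy
        none, one, or several, and its classes change as its properties change."""
        props = instances[instance_id]
        return [name for name, pred in classes.items() if pred(props)]

    print(classify("item42"))   # ['Document', 'Recent']
    print(classify("item43"))   # ['Dataset']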

Relevance:

100.00%

Publisher:

Abstract:

This article reports a combined thermodynamic, spectroscopic, and computational study of the interactions and structure of binary mixtures of hydrogenated and fluorinated substances that simultaneously interact through strong hydrogen bonding. Four binary mixtures of hydrogenated and fluorinated alcohols have been studied, namely (ethanol + 2,2,2-trifluoroethanol (TFE)), (ethanol + 2,2,3,3,4,4,4-heptafluoro-1-butanol), (1-butanol (BuOH) + TFE), and (BuOH + 2,2,3,3,4,4,4-heptafluoro-1-butanol). Excess molar volumes and vibrational spectra of all four binary mixtures have been measured as a function of composition at 298 K, and molecular dynamics simulations have been performed. The systems display complex behavior when compared with mixtures of hydrogenated alcohols and with mixtures of alkanes and perfluoroalkanes. The combined analysis of the results from the different approaches indicates that this behavior results from a balance between preferential hydrogen bonding between the hydrogenated and fluorinated alcohols and the unfavorable dispersion forces between the hydrogenated and fluorinated chains. As the chain length increases, the contribution of dispersion increases and overcomes the contribution of the hydrogen bonds. In terms of liquid structure, the simulations suggest the possibility of segregation between the hydrogenated and fluorinated segments, a hypothesis corroborated by the spectroscopic results. Furthermore, a quantitative analysis of the infrared spectra reveals that the presence of fluorinated groups induces conformational changes in the hydrogenated chains, from the usually preferred all-trans arrangement to more globular arrangements involving gauche conformations. Conformational rearrangements of the CCOH dihedral angle upon mixing are also revealed by the spectra.
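Excess molar volumes are measured but not defined in the abstract; as a short reminder sketch (the densities and the mixture value below are rough illustrative assumptions, not the paper's data), V^E follows from the mixture density and the pure-component molar volumes:

    # Hedged sketch: excess molar volume of a binary mixture from densities,
    # V_E = V_mix - (x1*V1 + x2*V2), with molar volumes V_i = M_i / rho_i.
    # Densities below are rough illustrative values near 298 K, not measured data.
    def excess_molar_volume(x1, M1, rho1, M2, rho2, rho_mix):
        """x1: mole fraction of component 1; M in kg/mol; rho in kg/m^3.
        Returns V_E in m^3/mol."""
        x2 = 1.0 - x1
        v1, v2 = M1 / rho1, M2 / rho2
        M_mix = x1 * M1 + x2 * M2
        return M_mix / rho_mix - (x1 * v1 + x2 * v2)

    # Example: equimolar ethanol (1) + 2,2,2-trifluoroethanol (2), assumed densities
    print(excess_molar_volume(x1=0.5, M1=0.04607, rho1=789.0,
                              M2=0.10004, rho2=1382.0, rho_mix=1110.0))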

Relevance:

100.00%

Publisher:

Abstract:

This book presents results of advanced applications of computational modeling and simulation to the dynamics of various flows and to heat and mass transfer in different fields of science and engineering.

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on modelling and optimization of the dry grinding process for automotive gear production. An FEM model was implemented with the aim of predicting process temperatures and preventing grinding thermal defects on the material surface. In particular, the model was conceived to facilitate the choice of grinding parameters during the design and execution of the dry-hard finishing process developed and patented by the company Samputensili Machine Tools (EMAG Group) for automotive gears. The proposed model allows analysis of the influence of the technological parameters, including the grinding wheel specifications. Automotive gears finished by the dry-hard finishing process are expected to reach the same quality target as gears finished by the conventional wet grinding process, with the advantage of reducing production costs and environmental pollution. However, grinding involves very high specific pressures and large amounts of heat absorbed by the material, so removing the lubricant increases the risk of thermal defects. An incorrect choice of process parameters could cause grinding burns, which inevitably affect the mechanical performance of the ground component. A modelling phase of the process can therefore help enhance the mechanical characteristics of the components and avoid waste during production. A hierarchical FEM model was implemented to predict dry grinding temperatures, built as the interconnection of a microscopic and a macroscopic approach: a microscopic single-grain grinding model was linked to a macroscopic thermal model to predict dry grinding process temperatures and thus forecast the thermal-cycle effect caused by the process parameters and the grinding wheel specification. Good agreement between the model and the experiments was achieved, making dry-hard finishing an efficient and reliable technology to implement in the automotive gear industry.
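The thesis's hierarchical FEM model is not given here; as a minimal sketch of the macroscopic ingredient only (surface temperature rise under a grinding heat-flux pulse, computed with a 1D explicit finite-difference scheme), with assumed material and process values:

    # Hedged sketch: 1D explicit finite-difference estimate of the surface
    # temperature rise under a grinding heat-flux pulse. All material and
    # process values are illustrative assumptions, not the thesis's data.
    import numpy as np

    k, rho, cp = 50.0, 7800.0, 500.0          # steel-like conductivity, density, heat capacity
    alpha = k / (rho * cp)                    # thermal diffusivity [m^2/s]
    q_flux = 2.0e8                            # heat flux entering the workpiece [W/m^2]
    t_contact = 1.0e-3                        # contact (pulse) duration [s]

    nx, depth = 200, 2.0e-3                   # grid points over a 2 mm depth
    dx = depth / nx
    dt = 0.4 * dx**2 / alpha                  # stable explicit time step
    T = np.zeros(nx)                          # temperature rise above ambient [K]

    t = 0.0
    while t < t_contact:
        Tn = T.copy()
        # interior nodes: explicit update of the heat equation
        T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
        # surface node: imposed heat-flux boundary condition (ghost-node form)
        T[0] = Tn[0] + alpha * dt / dx**2 * (2 * Tn[1] - 2 * Tn[0]) \
               + 2 * dt * q_flux / (rho * cp * dx)
        t += dt

    print(f"peak surface temperature rise = {T[0]:.0f} K")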

Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with optimization techniques and modeling of vehicular networks. Using models based on integer linear programming (ILP) and heuristics, it was possible to study the performance of 5G networks for vehicular applications. Thanks to the Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) paradigms, it was possible to study the performance of different classes of service, such as Ultra-Reliable Low-Latency Communications (URLLC) and enhanced Mobile Broadband (eMBB), and how functional splits can improve network resource management. Two protection techniques have been studied: Shared Path Protection (SPP) and Dedicated Path Protection (DPP). With these protections, different network reliability requirements can be met according to the needs of the end user. Furthermore, thanks to a simulator developed in Python, it was possible to study the dynamic allocation of resources in a 5G metro network. Through different provisioning algorithms and dynamic resource management techniques, useful results have been obtained for understanding the needs of the vehicular networks that will exploit 5G. Finally, two models are shown for reconfiguring backup resources when shared path protection is used.
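The thesis's simulator is not included here; as a minimal sketch of the basic building block of dynamic provisioning (min-hop routing with a link-capacity check and reservation), on a hypothetical three-node topology with assumed capacities and demands:

    # Hedged sketch: shortest-path provisioning with a link-capacity check,
    # a basic building block of a dynamic 5G resource-allocation simulator.
    # Topology, capacities, and demands are hypothetical illustrations.
    import heapq

    capacity = {("A", "B"): 10, ("B", "C"): 10, ("A", "C"): 4}   # residual units per link
    graph = {}
    for (u, v), c in list(capacity.items()):
        capacity[(v, u)] = c
        graph.setdefault(u, []).append(v)
        graph.setdefault(v, []).append(u)

    def provision(src, dst, demand):
        """Find a min-hop path with enough residual capacity and reserve it."""
        dist, prev, heap = {src: 0}, {}, [(0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                break
            for v in graph[u]:
                if capacity[(u, v)] >= demand and d + 1 < dist.get(v, float("inf")):
                    dist[v], prev[v] = d + 1, u
                    heapq.heappush(heap, (d + 1, v))
        if dst not in dist:
            return None                       # request blocked
        path, node = [dst], dst
        while node != src:
            node = prev[node]
            path.append(node)
        path.reverse()
        for u, v in zip(path, path[1:]):      # reserve capacity along the path
            capacity[(u, v)] -= demand
            capacity[(v, u)] -= demand
        return path

    print(provision("A", "C", 6))   # routed via B: the direct A-C link lacks capacity
    print(provision("A", "C", 6))   # second identical request is blocked (None)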

Relevance:

100.00%

Publisher:

Abstract:

With the increase in distributed generation, DC microgrids have become more and more common in the electrical network. Converters are necessary to connect devices in a microgrid, but they are also a source of disturbances because of their switching operation. In this thesis, measurement and simulation of the conducted emissions of a DC/DC buck converter in the 2-150 kHz frequency range are studied.
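The measurement setup and circuit models are not described in the abstract; as a minimal sketch of why switching at a few tens of kHz produces conducted-emission components in the 2-150 kHz band (idealized chopped input current with assumed switching frequency, duty cycle, and current level):

    # Hedged sketch: spectrum of an idealized buck-converter input current.
    # Switching frequency, duty cycle, and current level are assumed values.
    import numpy as np

    f_sw, duty, I_on = 20e3, 0.4, 5.0          # 20 kHz switching, 40% duty, 5 A on-state
    fs, T = 10e6, 10e-3                        # 10 MHz sampling, 10 ms window
    t = np.arange(0, T, 1 / fs)
    i_in = I_on * ((t * f_sw) % 1.0 < duty)    # idealized chopped current from the DC bus

    spectrum = np.abs(np.fft.rfft(i_in)) / len(t) * 2.0
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    band = (freqs >= 2e3) & (freqs <= 150e3)
    # Print the strongest components falling in the 2-150 kHz conducted-emission band
    top = np.argsort(spectrum[band])[-5:][::-1]
    for idx in top:
        print(f"{freqs[band][idx] / 1e3:7.1f} kHz : {spectrum[band][idx]:.3f} A")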

Relevance:

100.00%

Publisher:

Abstract:

The time-dependent CP asymmetries of the $B^0\to\pi^+\pi^-$ and $B^0_s\to K^+K^-$ decays and the time-integrated CP asymmetries of the $B^0\to K^+\pi^-$ and $B^0_s\to\pi^+K^-$ decays are measured, using the $pp$ collision data collected with the LHCb detector and corresponding to the full Run 2. The results are compatible with previous LHCb determinations of these quantities, except for the CP-violation parameters of the $B^0_s\to K^+K^-$ decay, which show a discrepancy exceeding 3 standard deviations between different data-taking periods. The investigations being conducted to understand this discrepancy are documented. The measurement of the CKM matrix element $|V_{cb}|$ using $B^0_{s}\to D^{(*)-}_s\mu^+ \nu_\mu$ decays is also reported, using the $pp$ collision data collected with the LHCb detector and corresponding to the full Run 1. The measurement yields $|V_{cb}| = (41.4\pm0.6\pm0.9\pm1.2)\times 10^{-3}$, where the first uncertainty is statistical, the second is systematic, and the third is due to external inputs. This result is compatible with the world averages and constitutes the first measurement of $|V_{cb}|$ at a hadron collider and the first ever performed with decays of the $B^0_s$ meson. The analysis also provides the first measurements of the branching ratios and form-factor parameters of the signal decay modes. The final part of this thesis reports a study of the characteristics governing the response of an electromagnetic calorimeter (ECAL) operating in the high-luminosity regime foreseen for the LHCb Upgrade 2. A fast and flexible simulation framework was developed for this purpose. The physics performance of different ECAL configurations is evaluated using samples of fully simulated $B^0\to \pi^+\pi^-\pi^0$ and $B^0\to K^{*0}e^+e^-$ decays. The results are used to guide the development of the future ECAL and are reported in the Framework Technical Design Report of the LHCb Upgrade 2 detector.
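The abstract does not spell out the observables; for reference (standard textbook definitions, not quoted from the thesis), the time-dependent CP asymmetry is $A_{CP}(t) \equiv \frac{\Gamma_{\bar{B}{}^0\to f}(t)-\Gamma_{B^0\to f}(t)}{\Gamma_{\bar{B}{}^0\to f}(t)+\Gamma_{B^0\to f}(t)} \simeq -C_f\cos(\Delta m\,t)+S_f\sin(\Delta m\,t)$, where $C_f$ and $S_f$ parametrize CP violation in the decay and in the interference between mixing and decay, respectively; the approximation neglects the decay-width difference, which is a good approximation for the $B^0$ but not for the $B^0_s$, whose full expression contains additional hyperbolic terms in $\Delta\Gamma_s\,t/2$.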

Relevance:

100.00%

Publisher:

Abstract:

The microstructure of 6XXX aluminum alloys deeply affects the mechanical, crash, corrosion, and aesthetic properties of extruded profiles. Unfortunately, grain structure evolution during manufacturing is a complex phenomenon, because several process and material parameters, such as alloy chemical composition, temperature, extrusion speed, tool geometries, quenching, and thermal treatment parameters, affect grain evolution during the manufacturing process. The aim of this PhD thesis was the analysis of recrystallization kinetics during the hot extrusion of 6XXX aluminum alloys and the development of reliable recrystallization models to be used in FEM codes for microstructure prediction at the die-design stage. Experimental activities were carried out to acquire data for the development and validation of the recrystallization models and to investigate the effect of process parameters and die design on the microstructure of the final component. The experimental campaign reported in this thesis involved the extrusion of AA6063, AA6060, and AA6082 profiles with different process parameters in order to provide a reliable amount of data for model validation. Particular focus was placed on the evolution of the PCG defect during the extrusion of medium-strength alloys such as AA6082. Several die designs and process conditions were analysed to understand the influence of each on the recrystallization behaviour of the investigated alloy. From the numerical point of view, innovative models for microstructure prediction were developed and validated on the extrusion of industrial-scale profiles with complex geometries, showing good agreement in terms of grain size and surface recrystallization prediction. The results suggest that the developed models are reliable and can be applied in industry for process and material-property optimization at the die-design stage.
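The thesis's specific model formulation is not given in the abstract; as a minimal sketch of the classical JMAK (Avrami) kinetics commonly used as a starting point for recrystallization modeling (the coefficients below are assumed illustrative values, not fitted to the thesis's alloys):

    # Hedged sketch: JMAK (Avrami) recrystallized-fraction kinetics,
    # X(t) = 1 - exp(-k * t^n), with illustrative (assumed) coefficients.
    import math

    def recrystallized_fraction(t_s, k=0.05, n=2.0):
        """t_s: annealing time in seconds; k, n: assumed Avrami coefficients."""
        return 1.0 - math.exp(-k * t_s ** n)

    def time_to_fraction(X, k=0.05, n=2.0):
        """Inverse relation: time needed to reach recrystallized fraction X."""
        return (-math.log(1.0 - X) / k) ** (1.0 / n)

    for t in (1, 5, 10, 20):
        print(f"t = {t:>2} s -> X = {recrystallized_fraction(t):.2f}")
    print(f"time for 95% recrystallization = {time_to_fraction(0.95):.1f} s")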