982 results for Validated Interval Software
Abstract:
This paper describes the implementation of a Boussinesq-type model and extends its application to tsunami wave runup on clustered islands (multiple adjacent conical islands). An extensively validated two-dimensional Boussinesq-type model is employed to examine the interaction between a propagating solitary wave and multiple idealised conical islands, with particular emphasis on the combined effect of two adjustable parameters on the maximum soliton runup heights: the spacing-interval-to-diameter ratio between adjacent islands, S/D, and the rotation angle of the structural configuration, θ. An extensive parameter study of the combined effect of varying θ and S/D on the maximum soliton runup over the multiple conical islands is then carried out, and the distributions of the maximum runup heights on each conical island are obtained and compared for the twin-island cases. A worst-case study is performed for each configuration with respect to the enhancement of the maximum wave runup heights by the multiple conical islands. It is found that nonlinear wave diffraction, reflection and refraction play a significant role in varying the maximum soliton runup heights on the islands. Comparatively large maximum soliton runups are generally predicted for the merged and bottom-mounted clustered islands. Furthermore, the joints of the clustered, merged islands are shown to bear the brunt of the tsunami wave attack, while conical islands positioned in the shadow regions behind surrounding islands withstand relatively less extreme wave impact. Although these numerical investigations are considerable simplifications of real island clusters, they give critical insight into important hydrodynamic characteristics of the interaction between an extreme wave event and a group of clustered conical islands, and thus provide useful engineering guidance for extreme wave mitigation and coastal development. Copyright © 2012 by the International Society of Offshore and Polar Engineers (ISOPE).
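As a rough sketch of the parameter study described above, the loop below sweeps a grid of S/D and θ values and records the maximum runup for each island configuration. The solver call boussinesq_max_runup is a hypothetical placeholder for the validated Boussinesq-type model (which the paper does not publish as code), and the grid values are illustrative only.

```python
import itertools

def boussinesq_max_runup(s_over_d: float, theta_deg: float) -> float:
    """Hypothetical stand-in for the validated 2D Boussinesq-type solver:
    returns the maximum soliton runup height for one island configuration."""
    return 1.0  # dummy value; the real numerical model would be called here

# Illustrative grids for the two adjustable parameters of the study.
spacing_ratios = [0.0, 0.5, 1.0, 1.5, 2.0]       # S/D; 0.0 ~ merged islands
rotation_angles = [0, 15, 30, 45, 60, 75, 90]    # theta, in degrees

runups = {(s, t): boussinesq_max_runup(s, t)
          for s, t in itertools.product(spacing_ratios, rotation_angles)}

# Worst case: the configuration with the largest maximum runup.
worst_config = max(runups, key=runups.get)
print(worst_config, runups[worst_config])
```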
Abstract:
Computational fluid dynamics (CFD) simulations are becoming increasingly widespread with the advent of more powerful computers and more sophisticated software. The aim of these developments is to facilitate more accurate reactor design and optimization methods compared to traditional lumped-parameter models. However, for CFD to be a trusted method, it must be validated using experimental data acquired at sufficiently high spatial resolution. This article validates an in-house CFD code by comparison with flow-field data obtained using magnetic resonance imaging (MRI) for a packed bed with a column-to-particle diameter ratio of 2. Flows characterized by inlet Reynolds numbers, based on particle diameter, of 27, 55, 111, and 216 are considered. The code employs preconditioning to solve directly for pressure in low-velocity flow regimes. Excellent agreement was found between the MRI and CFD data, with the relative error between the experimentally determined and numerically predicted flow fields lying in the range of 3-9%. © 2012 American Institute of Chemical Engineers (AIChE).
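Numerically, a validation of this kind reduces to a relative error norm between the measured and simulated velocity fields. The sketch below is our own illustration of such a comparison (not the authors' code), assuming both fields are sampled on the same voxel grid:

```python
import numpy as np

def relative_error(u_mri: np.ndarray, u_cfd: np.ndarray) -> float:
    """Relative L2 error between the measured (MRI) and predicted (CFD)
    flow fields, assumed to be sampled on the same voxel grid."""
    return np.linalg.norm(u_cfd - u_mri) / np.linalg.norm(u_mri)

# Example with synthetic fields on a 3D grid (illustrative only).
rng = np.random.default_rng(0)
u_mri = rng.normal(size=(32, 32, 64))
u_cfd = u_mri + 0.05 * rng.normal(size=u_mri.shape)  # ~5% perturbation
print(f"relative error: {relative_error(u_mri, u_cfd):.1%}")
```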
Abstract:
Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior) and of the error (the loss function). The responses of a Bayesian ideal observer depend crucially on these internal representations, which have previously been studied only for simple distributions. To study the nature of these representations, we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal, while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and the feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects integrated the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality) seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed) distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
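The Bayesian observer-actor logic referred to above can be sketched numerically: combine a prior over intervals with a noisy-measurement likelihood, then choose the response that minimises the expected loss. The sketch below is our own simplification, assuming an illustrative Gaussian likelihood and quadratic loss (under which the optimal response is the posterior mean); it is not the authors' model code.

```python
import numpy as np

def bayes_estimate(t_measured, t_grid, prior, sigma):
    """Posterior-mean (least-squares-optimal) reproduction of a time interval.

    t_grid : grid of candidate true intervals
    prior  : experimentally imposed distribution over t_grid (sums to 1)
    sigma  : std of the Gaussian measurement noise (illustrative choice)
    """
    likelihood = np.exp(-0.5 * ((t_measured - t_grid) / sigma) ** 2)
    posterior = likelihood * prior
    posterior /= posterior.sum()
    # Quadratic loss -> the optimal estimate is the posterior mean.
    return np.sum(t_grid * posterior)

# A bimodal prior over intervals (illustrative), as in the complex conditions.
t_grid = np.linspace(0.4, 1.2, 400)
prior = np.exp(-0.5 * ((t_grid - 0.6) / 0.05) ** 2) \
      + np.exp(-0.5 * ((t_grid - 1.0) / 0.05) ** 2)
prior /= prior.sum()
print(bayes_estimate(0.8, t_grid, prior, sigma=0.08))
```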
Abstract:
A three-dimensional (3D) numerical model is proposed to solve electromagnetic problems involving the transport current and background field of a high-Tc superconducting (HTS) system. The model is characterized by the E-J power law and the H-formulation, and is successfully implemented using finite element software. We first discuss the model in detail, including the mesh methods, boundary conditions and computing time. To validate the 3D model, we calculate the AC loss and trapped-field solution for a bulk material and compare the results with previously verified 2D solutions and an analytical solution. We then apply the model to typical problems that cannot be tackled by 2D models, such as superconducting bulk arrays and twisted conductors. The new 3D model could be a powerful tool for researchers and engineers investigating problems of greater complexity.
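For reference, the E-J power law and the H-formulation named above take the following standard forms in the applied superconductivity literature (quoted generically, not from this paper):

```latex
\mathbf{E} = E_0 \left(\frac{|\mathbf{J}|}{J_c}\right)^{n} \frac{\mathbf{J}}{|\mathbf{J}|},
\qquad
\nabla \times \mathbf{E} = -\mu_0 \frac{\partial \mathbf{H}}{\partial t},
\qquad
\mathbf{J} = \nabla \times \mathbf{H},
```

where E_0 is the electric-field criterion (commonly 1 μV/cm), J_c the critical current density, and n the flux-creep exponent; the H-formulation takes the magnetic field components as the state variables solved by the finite element software.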
Abstract:
Free software and open source projects are often perceived to be of high quality. It has been suggested that the high level of quality found in some free software projects is related to the open development model which promotes peer review. While the quality of some free software projects is comparable to, if not better than, that of closed source software, not all free software projects are successful and of high quality. Even mature and successful projects face quality problems; some of these are related to the unique characteristics of free software and open source as a distributed development model led primarily by volunteers. In exploratory interviews performed with free software and open source developers, several common quality practices as well as actual quality problems have been identified. The results of these interviews are presented in this paper in order to take stock of the current status of quality in free software projects and to act as a starting point for the implementation of quality process improvement strategies.
Abstract:
BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with SARAF, an independently developed module for calculating in-core fuel composition and spent-fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in k-eigenvalue and nuclide density predictions was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. The difference between the ANS Standard data and the BGCore predictions was found not to exceed 5%.
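The depletion coupling described above ultimately amounts to solving the Bateman equations dN/dt = A·N for the nuclide density vector N, where the matrix A collects decay constants and flux-weighted one-group reaction rates. A minimal sketch with a toy three-nuclide chain (our own illustration, not BGCore's SARAF module; all rates invented for demonstration):

```python
import numpy as np
from scipy.linalg import expm

# Toy 3-nuclide chain: N0 -(capture)-> N1 -(decay)-> N2 (illustrative rates, 1/s).
capture_rate = 1e-8   # flux-weighted one-group capture rate of nuclide 0
decay_rate   = 1e-6   # decay constant of nuclide 1

A = np.array([
    [-capture_rate, 0.0,          0.0],
    [ capture_rate, -decay_rate,  0.0],
    [ 0.0,           decay_rate,  0.0],
])

N0 = np.array([1.0e24, 0.0, 0.0])   # initial nuclide densities, 1/m^3
t = 3.15e7                          # one-year burnup step, seconds
N_t = expm(A * t) @ N0              # Bateman solution via matrix exponential
print(N_t)
```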
Abstract:
In the modern engineering design cycle the use of computational tools becomes a necessity. The complexity of the engineering systems under consideration for design increases dramatically as the demand for advanced and innovative design concepts and engineering products expands. At the same time, advancements in the available technology, in terms of computational resources and power as well as the intelligence of the design software, accommodate these demands and make such tools a viable approach to real-world engineering problems. This class of design optimisation problems is by nature multi-disciplinary. In the present work we establish enhanced optimisation capabilities within the Nimrod/O tool for massively distributed execution of computational tasks through cluster and computational grid resources, and develop the potential to combine and benefit from all the available technological advancements, both software and hardware. We develop the interface between a Free Form Deformation geometry management in-house code, the 2D airfoil aerodynamic efficiency evaluation tool XFoil, and the well-established multi-objective heuristic optimisation algorithm NSGA-II. A simple airfoil design problem has been defined to demonstrate the functionality of the design system, but also to provide a framework for future developments and testing with other state-of-the-art optimisation algorithms such as the Multi-Objective Genetic Algorithm (MOGA) and the Multi-Objective Tabu Search (MOTS) techniques. Ultimately, heavily computationally expensive industrial design cases that could not be investigated before can be realised within the presented framework. © 2012 by the authors. Published by the American Institute of Aeronautics and Astronautics, Inc.
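The optimisation wiring described above can be sketched in a few lines. The version below uses the pymoo library's NSGA-II as a stand-in for the Nimrod/O setup described in the paper, and the objective evaluations are dummy placeholders for the real FFD-to-XFoil pipeline:

```python
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class AirfoilProblem(ElementwiseProblem):
    """Two-objective airfoil design: minimise drag, maximise lift (negated so
    both objectives are minimised). The design variables play the role of FFD
    control-point displacements; the objective evaluations here are dummy
    placeholders for the real FFD + XFoil pipeline."""

    def __init__(self, n_ctrl_pts=8):
        super().__init__(n_var=n_ctrl_pts, n_obj=2,
                         xl=-0.05 * np.ones(n_ctrl_pts),
                         xu= 0.05 * np.ones(n_ctrl_pts))

    def _evaluate(self, x, out, *args, **kwargs):
        drag = float(np.sum(x ** 2))       # placeholder for an XFoil Cd run
        lift = float(np.sum(np.abs(x)))    # placeholder for an XFoil Cl run
        out["F"] = [drag, -lift]

result = minimize(AirfoilProblem(), NSGA2(pop_size=40),
                  ("n_gen", 50), seed=1, verbose=False)
print(result.F[:5])  # a slice of the resulting Pareto front
```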
Abstract:
The delivery of integrated product and service solutions is growing in the aerospace industry, driven by the potential for increased profits. Such solutions require a life-cycle view at the design phase in order to support the delivery of the equipment. The influence of uncertainty associated with design for services is an increasing challenge due to information and knowledge constraints, and there is a lack of frameworks that define and quantify the relationship between information, knowledge, and uncertainty. Driven by this gap, this paper presents a framework illustrating the link between uncertainty and knowledge within the design-for-services context in the aerospace industry. The paper combines industrial interaction and a literature review to first define the design attributes, the associated knowledge requirements, and the uncertainties experienced. The framework is then applied in three cases through the development of causal loop models (CLMs), which are validated by industrial and academic experts. The concepts and inter-linkages are developed with the intention of producing a software prototype. Future recommendations are also included. © 2014 CIRP.
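A causal loop model of the kind described can be represented as a signed directed graph, from which reinforcing and balancing feedback loops are enumerated mechanically. A minimal sketch using networkx (our own illustration; the node names and edge polarities are invented for demonstration, not taken from the paper's CLMs):

```python
import networkx as nx

# Signed digraph: edge sign +1 = "same direction", -1 = "opposite direction".
clm = nx.DiGraph()
clm.add_edge("knowledge availability", "design uncertainty", sign=-1)
clm.add_edge("design uncertainty", "service cost risk", sign=+1)
clm.add_edge("service cost risk", "investment in knowledge capture", sign=+1)
clm.add_edge("investment in knowledge capture", "knowledge availability", sign=+1)

for loop in nx.simple_cycles(clm):
    edges = list(zip(loop, loop[1:] + loop[:1]))   # close the cycle
    polarity = 1
    for u, v in edges:
        polarity *= clm[u][v]["sign"]
    kind = "reinforcing" if polarity > 0 else "balancing"
    print(f"{kind} loop: {' -> '.join(loop)}")
```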
Abstract:
A technique based on the integration of the product of the amplified spontaneous emission spectrum and a phase function over one mode interval is proposed for measuring the gain spectrum of Fabry-Perot semiconductor lasers, and a gain correction factor related to the response function of the optical spectrum analyzer (OSA) is obtained to improve the accuracy of the measured gain spectrum. Gain spectra differing by less than 1.3 cm⁻¹ from 1500 to 1600 nm are obtained for a 250-μm-long semiconductor laser at OSA resolutions of 0.06, 0.1, 0.2, and 0.5 nm. The corresponding gain correction factor is about 9 cm⁻¹ at the resolution of 0.5 nm. The gain spectrum measured at a resolution of 0.5 nm has the same accuracy as that obtained by the Hakki-Paoli method at a resolution of 0.06 nm for a laser with a mode interval of 1.3 nm.
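For reference, the Hakki-Paoli method against which the proposed technique is benchmarked extracts the net modal gain from the peak-to-valley ratio r of the sub-threshold Fabry-Perot mode spectrum (standard form from the literature, not this paper's integration formula):

```latex
g_{\mathrm{net}} = \frac{1}{L}\,\ln\!\left(\frac{\sqrt{r}-1}{\sqrt{r}+1}\right)
                 + \frac{1}{L}\,\ln\!\left(\frac{1}{\sqrt{R_1 R_2}}\right),
```

where L is the cavity length (here 250 μm), r = P_max/P_min is the ratio of adjacent peak and valley intensities of the amplified spontaneous emission spectrum, and R_1, R_2 are the facet reflectivities.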