75 results for Validated Interval Software


Relevance:

20.00%

Publisher:

Abstract:

The importance of software keeps growing rapidly and consistently for many organizations. The growth of software functionality in manufactured products and the emergence of digital media (convergent spaces combining digital content, software, and multiple channels to market) are recent examples of organizational changes in which software has assumed a central position in corporate strategy. This paper analyzes the alignment between strategic objectives and software development processes at software companies and proposes a methodology to ensure that development processes are aligned with the corporate capabilities required to exploit future market opportunities. The methodology includes the categorization of software companies according to their core capabilities and the customization of the technology roadmapping technique for software companies. The research process included case studies and a survey. © 2006 PICMET.

Relevance:

20.00%

Publisher:

Abstract:

Offshore software development has been identified as one of the most striking manifestations of contemporary globalisation and as evidence of placelessness, the idea that information and communication technologies have rendered location irrelevant. Research in the International Business and Information Systems fields, in contrast, has suggested that not all locations are equal and has identified a number of characteristics that may influence the attractiveness of a location for multinational investment and offshoring, respectively. These literatures, however, focus almost exclusively on quantitative, economic characteristics that are seen as fixed and as applying uniformly throughout a whole country. They therefore offer little guidance on the suitability of particular locations as offshoring destinations, especially in countries without a track record in offshore software development. Drawing on two cases of nearshore software development centres set up by offshore service providers in the Caribbean, this paper illustrates that, while the initial decision to establish the ventures reflected a logic of placelessness, characteristics of these particular locations affected their subsequent success. Through the findings, we therefore develop a typology of espoused, unanticipated and remediable locational characteristics, which illustrates that locational attractiveness may vary significantly within countries and that offshore service providers and government agencies can modify locational characteristics to their advantage.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a wafer-level three-dimensional simulation model of the Gate Commutated Thyristor (GCT) under inductive switching conditions. The simulations are validated by extensive experimental measurements. To the authors' knowledge, such a complex simulation domain has not been used before. This method allows the in-depth study of large-area devices such as GCTs, Gate Turn-Off Thyristors (GTOs) and Phase Control Thyristors (PCTs). The model captures complex phenomena, such as current filamentation and subsequent failure, which allows us to predict the Maximum Controllable turn-off Current (MCC) and the Safe Operating Area (SOA), which was previously impossible using 2D distributed models. © 2012 IEEE.

Relevance:

20.00%

Publisher:

Abstract:

A CLI/.NET 4.0 research prototype platform coded in C# and Windows Presentation Foundation (WPF).

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the implementation of a Boussinesq-type model and extends its application to tsunami wave runup on clustered islands (multiple adjacent conical islands). An extensively validated two-dimensional Boussinesq-type model is employed to examine the interaction between a propagating solitary wave and multiple idealised conical islands, with particular emphasis on the combined effect of two adjustable parameters on the maximum soliton runup heights: the spacing-to-diameter ratio between adjacent conical islands, S/D, and the rotation angle of the structural configuration, θ. An extensive parameter study on the combined effect of altering θ and S/D on the maximum soliton runup over the multi-conical islands is then carried out, and the distributions of the maximum runup heights on each conical island are obtained and compared for the twin-island cases. A worst-case study is performed for each case with respect to the enhancement of the maximum wave runup heights by the multi-conical islands. It is found that nonlinear wave diffraction, reflection and refraction play a significant role in varying the maximum soliton runup heights on multi-conical islands. Comparatively large maximum soliton runups are generally predicted for the merged, bottom-mounted clustered islands. Furthermore, the joints of the clustered, merged islands are shown to suffer the most severe tsunami wave attack. Conical islands positioned in the shadow regions behind the surrounding islands are found to withstand relatively less extreme wave impact. Although these numerical investigations are considerable simplifications of multi-conical-island settings, they give a critical insight into certain important hydrodynamic characteristics of the interaction between an extreme wave event and a group of clustered conical islands, thus providing useful engineering guidance for extreme wave mitigation and coastal development. Copyright © 2012 by the International Society of Offshore and Polar Engineers (ISOPE).
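
The kind of parameter study described above can be organised as a sweep over the two adjustable parameters. The sketch below is only illustrative: `run_boussinesq_case` is a hypothetical placeholder for the validated Boussinesq solver (no such API is given in the abstract), and the grid values are assumptions.

```python
# Hedged sketch of the S/D vs. theta parameter study; not the paper's code.
import itertools

def run_boussinesq_case(s_over_d, theta_deg):
    """Placeholder for one Boussinesq simulation; should return the
    maximum soliton runup height (m) for this island configuration."""
    return 0.0  # dummy value so the sweep below runs end to end

def parameter_study(s_over_d_values, theta_values):
    """Sweep the spacing/diameter ratio and rotation angle, keep the worst case."""
    results = {}
    for s_over_d, theta in itertools.product(s_over_d_values, theta_values):
        results[(s_over_d, theta)] = run_boussinesq_case(s_over_d, theta)
    worst = max(results, key=results.get)   # configuration with the largest runup
    return results, worst

# Assumed example grid: merged islands (S/D = 0) up to widely spaced ones.
results, worst = parameter_study([0.0, 0.5, 1.0, 2.0], [0.0, 15.0, 30.0, 45.0])
```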

Relevance:

20.00%

Publisher:

Abstract:

Computational fluid dynamics (CFD) simulations are becoming increasingly widespread with the advent of more powerful computers and more sophisticated software. The aim of these developments is to facilitate more accurate reactor design and optimization methods compared to traditional lumped-parameter models. However, in order for CFD to be a trusted method, it must be validated using experimental data acquired at sufficiently high spatial resolution. This article validates an in-house CFD code by comparison with flow-field data obtained using magnetic resonance imaging (MRI) for a packed bed with a particle-to-column diameter ratio of 2. Flows characterized by inlet Reynolds numbers, based on particle diameter, of 27, 55, 111, and 216 are considered. The code used employs preconditioning to directly solve for pressure in low-velocity flow regimes. Excellent agreement was found between the MRI and CFD data, with the relative error between the experimentally determined and numerically predicted flow fields lying in the range of 3-9%. © 2012 American Institute of Chemical Engineers (AIChE).
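
As a rough illustration of the comparison described above, the sketch below computes a particle-diameter-based Reynolds number and a relative L2 difference between co-registered MRI and CFD velocity fields. The choice of norm is an assumption, since the article only reports relative errors of 3-9%, and the example values are synthetic.

```python
# Illustrative post-processing sketch (not the paper's in-house code).
import numpy as np

def particle_reynolds(inlet_velocity, particle_diameter, kinematic_viscosity):
    """Re_p = U * d_p / nu, i.e. a Reynolds number based on particle diameter."""
    return inlet_velocity * particle_diameter / kinematic_viscosity

def relative_error(u_mri, u_cfd):
    """Relative L2 difference between two velocity fields on a common grid."""
    return np.linalg.norm(u_cfd - u_mri) / np.linalg.norm(u_mri)

# Synthetic fields on a (nx, ny, nz, 3) grid; real use would interpolate the
# MRI and CFD data onto the same grid first.
u_mri = np.random.rand(16, 16, 16, 3)
u_cfd = u_mri + 0.05 * np.random.rand(16, 16, 16, 3)
print(particle_reynolds(0.01, 0.02, 1e-6))   # -> 200, illustrative values only
print(relative_error(u_mri, u_cfd))          # a few percent for this example
```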

Relevance:

20.00%

Publisher:

Abstract:

Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior) and of the error (the loss function). The responses of a Bayesian ideal observer depend crucially on these internal representations, which have only been previously studied for simple distributions. To study the nature of these representations we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects were integrating the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality) seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed) distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
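
A minimal Bayesian observer of the kind invoked here combines a prior over intervals with a noisy measurement and responds so as to minimise the expected loss. The sketch below assumes Gaussian measurement noise and a squared-error loss (making the posterior mean optimal); the paper's actual noise models, priors and feedback mappings are richer, so this only illustrates the inference step.

```python
# Minimal Bayesian ideal-observer sketch for interval reproduction.
import numpy as np

def bayes_estimate(t_measured, t_grid, prior, sigma):
    """Posterior-mean estimate of the interval, assuming Gaussian measurement
    noise of width sigma and a squared-error loss."""
    likelihood = np.exp(-0.5 * ((t_measured - t_grid) / sigma) ** 2)
    posterior = likelihood * prior
    posterior /= posterior.sum()
    return float(np.sum(posterior * t_grid))

# Example: a uniform prior on 600-1000 ms pulls a noisy 1100 ms measurement
# back toward the body of the distribution (the classic central-tendency bias).
t_grid = np.linspace(400.0, 1200.0, 801)                      # ms
prior = ((t_grid >= 600.0) & (t_grid <= 1000.0)).astype(float)
prior /= prior.sum()
print(bayes_estimate(1100.0, t_grid, prior, sigma=80.0))      # < 1100 ms
```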

Relevance:

20.00%

Publisher:

Abstract:

A three-dimensional (3D) numerical model is proposed to solve the electromagnetic problems involving transport current and background field of a high-Tc superconducting (HTS) system. The model is characterized by the E-J power law and H-formulation, and is successfully implemented using finite element software. We first discuss the model in detail, including the mesh methods, boundary conditions and computing time. To validate the 3D model, we calculate the ac loss and trapped-field solution for a bulk material and compare the results with previously verified 2D solutions and an analytical solution. We then apply our model to some typical problems, such as superconducting bulk arrays and twisted conductors, which cannot be tackled by 2D models. The new 3D model could be a powerful tool for researchers and engineers to investigate problems with a greater level of complexity.
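
The constitutive relation referred to here is the standard E-J power law, E = E_c (J/J_c)^n, which closes the H-formulation equations. The sketch below shows that relation together with a crude per-cycle loss estimate; the E_c criterion is the conventional 1e-4 V/m, while the J_c and n values are illustrative assumptions rather than the parameters of the modelled bulk.

```python
# E-J power-law sketch with an illustrative per-cycle loss estimate.
import numpy as np

E_C = 1e-4  # V/m, the conventional electric-field criterion defining Jc

def e_field(J, Jc=3e8, n=21):
    """Local electric field (V/m) from E = E_c * (|J|/Jc)**n * sign(J)."""
    return E_C * (np.abs(J) / Jc) ** n * np.sign(J)

def loss_per_cycle(J_t, dt, Jc=3e8, n=21):
    """Dissipated energy density over one cycle: integral of E*J dt (J/m^3)."""
    E_t = e_field(J_t, Jc, n)
    return float(np.sum(E_t * J_t) * dt)

# Example: a 50 Hz current density swinging slightly above the assumed Jc.
dt = 1e-5
t = np.arange(0.0, 0.02, dt)                     # one cycle
J_t = 1.05 * 3e8 * np.sin(2 * np.pi * 50 * t)    # A/m^2
print(loss_per_cycle(J_t, dt))
```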

Relevance:

20.00%

Publisher:

Abstract:

Free software and open source projects are often perceived to be of high quality. It has been suggested that the high level of quality found in some free software projects is related to the open development model which promotes peer review. While the quality of some free software projects is comparable to, if not better than, that of closed source software, not all free software projects are successful and of high quality. Even mature and successful projects face quality problems; some of these are related to the unique characteristics of free software and open source as a distributed development model led primarily by volunteers. In exploratory interviews performed with free software and open source developers, several common quality practices as well as actual quality problems have been identified. The results of these interviews are presented in this paper in order to take stock of the current status of quality in free software projects and to act as a starting point for the implementation of quality process improvement strategies.

Relevance:

20.00%

Publisher:

Abstract:

BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with the SARAF module, an independently developed code for calculating in-core fuel composition and spent-fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during the calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in the predicted k-eigenvalue and nuclide densities was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. It was found that the difference between the ANS Standard data and the BGCore predictions does not exceed 5%.
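
The fuel-composition bookkeeping behind such a system reduces, per burnup step, to solving the coupled decay and transmutation (Bateman) equations dN/dt = A N, where A collects decay constants and one-group reaction rates. The toy chain and rates below are invented purely for illustration; BGCore's actual multi-group coupling to MCNP4C is far more involved.

```python
# Generic depletion-step sketch: advance nuclide densities with expm(A*dt).
import numpy as np
from scipy.linalg import expm

def deplete(N0, A, dt):
    """Advance nuclide densities N0 over a time step dt (s) for dN/dt = A N."""
    return expm(A * dt) @ N0

# Toy decay chain 0 -> 1 -> 2 with made-up decay constants (1/s).
lam0, lam1 = 1e-6, 5e-7
A = np.array([[-lam0,   0.0, 0.0],
              [ lam0, -lam1, 0.0],
              [  0.0,  lam1, 0.0]])
N0 = np.array([1.0e24, 0.0, 0.0])               # atoms/cm^3
print(deplete(N0, A, dt=30 * 24 * 3600.0))      # densities after 30 days
```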

Relevance:

20.00%

Publisher:

Abstract:

In the modern engineering design cycle the use of computational tools becomes a necessity. The complexity of the engineering systems under consideration increases dramatically as the demand for advanced and innovative design concepts and engineering products expands. At the same time, advancements in the available technology, in terms of computational resources and power as well as the intelligence of the design software, accommodate these demands and make computational design a viable approach to real-world engineering problems. This class of design optimisation problems is by nature multi-disciplinary. In the present work we establish enhanced optimisation capabilities within the Nimrod/O tool for massively distributed execution of computational tasks through cluster and computational grid resources, and develop the potential to combine and benefit from all the available technological advancements, both software and hardware. We develop the interface between an in-house Free Form Deformation geometry-management code, the 2D airfoil aerodynamic efficiency evaluation tool XFoil, and the well-established multi-objective heuristic optimisation algorithm NSGA-II. A simple airfoil design problem has been defined to demonstrate the functionality of the design system, but also to provide a framework for future developments and testing with other state-of-the-art optimisation algorithms, such as the Multi-Objective Genetic Algorithm (MOGA) and Multi-Objective Tabu Search (MOTS) techniques. Ultimately, computationally expensive industrial design cases that could not be investigated before can be realised within the presented framework. © 2012 by the authors. Published by the American Institute of Aeronautics and Astronautics, Inc.
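
The structure of one objective evaluation in such a pipeline can be sketched as follows. Both `ffd_deform` and `run_xfoil` are hypothetical placeholders (neither the in-house FFD code nor an XFoil binding is specified in the abstract), and the two design points and the negated lift-to-drag objectives are assumptions made purely to illustrate what NSGA-II would consume.

```python
# Hedged sketch of an FFD + XFoil objective evaluation for NSGA-II.
import numpy as np

def ffd_deform(baseline_coords, control_deltas):
    """Placeholder for the in-house Free Form Deformation geometry code."""
    return baseline_coords          # replace with the real deformation

def run_xfoil(coords, alpha_deg, reynolds):
    """Placeholder returning (Cl, Cd) from an XFoil analysis of `coords`."""
    return 0.8, 0.01                # dummy values so the sketch executes

def objectives(control_deltas, baseline_coords, reynolds=1e6):
    """Objective vector for a minimising NSGA-II: negated L/D at two assumed
    design points (2 and 8 degrees angle of attack)."""
    coords = ffd_deform(baseline_coords, control_deltas)
    return np.array([-cl / cd for cl, cd in
                     (run_xfoil(coords, a, reynolds) for a in (2.0, 8.0))])

# Example call with a dummy baseline airfoil and no deformation.
print(objectives(np.zeros(8), np.zeros((100, 2))))
```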

Relevance:

20.00%

Publisher:

Abstract:

The delivery of integrated product and service solutions is growing in the aerospace industry, driven by the potential for increased profits. Such solutions require a life-cycle view at the design phase in order to support the delivery of the equipment. The influence of uncertainty associated with design for services is increasingly a challenge due to information and knowledge constraints. There is a lack of frameworks that define and quantify the relationship between information, knowledge and uncertainty. Driven by this gap, the paper presents a framework to illustrate the link between uncertainty and knowledge within the design context for services in the aerospace industry. The paper combines industrial interaction and a literature review to initially define the design attributes, the associated knowledge requirements and the uncertainties experienced. The framework is then applied in three cases through the development of causal loop models (CLMs), which are validated by industrial and academic experts. The concepts and inter-linkages are developed with the intention of developing a software prototype. Future recommendations are also included. © 2014 CIRP.