941 results for test case optimization
Abstract:
This research has explored the relationship between system test complexity and tacit knowledge. This thesis proposes that the process of system testing (comprising test planning, test development, test execution, test fault analysis, test measurement, and case management) is directly affected both by complexity associated with the system under test and by other sources of complexity that are independent of the system under test but related to the wider process of system testing. While a certain amount of knowledge related to the system under test is inherently tacit and therefore difficult to make explicit, it has been found that a significant amount of knowledge relating to these other sources of complexity can indeed be made explicit. Although the importance of explicit knowledge has been reinforced by this research, there is a lack of evidence to suggest that the availability of tacit knowledge to a test team is any less important to the process of system testing when operating in a traditional software development environment. Participants commonly expressed the sentiment that, even though a considerable amount of explicit knowledge relating to the system is freely available, a good deal of the knowledge demanded for effective system testing is actually tacit in nature (approximately 60% of participants operating in a traditional development environment and 60% of participants operating in an agile development environment expressed similar sentiments). To provide for the availability of tacit knowledge relating to the system under test, and indeed for both the explicit and tacit knowledge required by system testing in general, an appropriate knowledge management structure needs to be in place, irrespective of the development methodology employed.
Abstract:
Goodness-of-fit tests have been studied by many researchers. Among them, an alternative statistical test for uniformity was proposed by Chen and Ye (2009). Xiong (2010) used that test to assess normality in the case where both the location and scale parameters of the normal distribution are known. The purpose of the present thesis is to extend the result to the case where the parameters are unknown. A table of critical values for the test statistic is obtained using Monte Carlo simulation. The performance of the proposed test is compared with the Shapiro-Wilk test and the Kolmogorov-Smirnov test. Monte Carlo simulation results show that the proposed test performs better than the Kolmogorov-Smirnov test in many cases. The Shapiro-Wilk test remains the most powerful test overall, although in some cases the test proposed in the present research performs better.
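As a hedged illustration of the Monte Carlo procedure the abstract describes, the sketch below tabulates a critical value for a normality test with estimated parameters and runs a small power comparison against Shapiro-Wilk and Kolmogorov-Smirnov. The spacing-based statistic is a placeholder for illustration only, not the Chen and Ye (2009) statistic.

```python
# Sketch: Monte Carlo critical value for a normality test with estimated
# (unknown) parameters, plus a small power comparison. The statistic below
# is a placeholder based on spacings of the probability-integral-transformed
# sample; the actual Chen-Ye (2009) statistic may differ.
import numpy as np
from scipy import stats

def uniformity_statistic(u):
    """Placeholder spacing-based statistic on values in (0, 1)."""
    u = np.sort(u)
    n = len(u)
    spacings = np.diff(np.concatenate(([0.0], u, [1.0])))
    return n * np.sum((spacings - 1.0 / (n + 1)) ** 2)

def test_statistic(x):
    """Estimate mu and sigma, transform to (0, 1), apply the statistic."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    return uniformity_statistic(stats.norm.cdf(x, loc=mu, scale=sigma))

def critical_value(n, alpha=0.05, reps=20000, seed=None):
    """Monte Carlo critical value under the normal null with estimated parameters."""
    rng = np.random.default_rng(seed)
    sims = np.array([test_statistic(rng.standard_normal(n)) for _ in range(reps)])
    return np.quantile(sims, 1 - alpha)

if __name__ == "__main__":
    n, alpha = 30, 0.05
    rng = np.random.default_rng(0)
    cv = critical_value(n, alpha, reps=5000, seed=1)
    # Empirical power against an exponential alternative.
    rejections = {"proposed": 0, "shapiro": 0, "ks": 0}
    reps = 1000
    for _ in range(reps):
        x = rng.exponential(size=n)
        rejections["proposed"] += test_statistic(x) > cv
        rejections["shapiro"] += stats.shapiro(x).pvalue < alpha
        rejections["ks"] += stats.kstest(
            x, "norm", args=(x.mean(), x.std(ddof=1))).pvalue < alpha
    for name, count in rejections.items():
        print(f"{name}: empirical power ~ {count / reps:.2f}")
```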
Abstract:
The effectiveness of an optimization algorithm can be reduced to its ability to navigate an objective function's topology. Hybrid optimization algorithms combine various optimization algorithms under a single meta-heuristic so that the hybrid is more robust, computationally efficient, and/or accurate than its constituent algorithms. This thesis proposes a novel meta-heuristic that uses search vectors to select the constituent algorithm appropriate for a given objective function. The hybrid is shown to perform competitively against several existing hybrid and non-hybrid optimization algorithms over a set of three hundred test cases. This thesis also proposes a general framework for evaluating the effectiveness of hybrid optimization algorithms. Finally, it presents an improved Method of Characteristics code with novel boundary conditions, which characterizes pipelines better than previous codes. This code is coupled with the hybrid optimization algorithm to optimize the operation of real-world piston pumps.
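A minimal sketch of the general hybrid idea, not the thesis's search-vector meta-heuristic: two constituent optimizers from scipy share an evaluation budget, and credit from recent improvement decides which one runs next on the given objective.

```python
# Illustrative hybrid: alternate short runs of a local and a global
# optimizer, allocating budget to whichever produced the larger recent
# improvement. The Rastrigin function is a standard multimodal test case.
import numpy as np
from scipy.optimize import minimize, differential_evolution

def rastrigin(x):
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def hybrid_minimize(f, bounds, rounds=10, seed=0):
    rng = np.random.default_rng(seed)
    best_x = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
    best_f = f(best_x)
    credit = {"local": 1.0, "global": 1.0}  # recent improvement per algorithm

    for _ in range(rounds):
        # Pick the constituent algorithm with the larger recent improvement.
        algo = max(credit, key=credit.get)
        if algo == "local":
            res = minimize(f, best_x, method="Nelder-Mead",
                           options={"maxfev": 200})
        else:
            res = differential_evolution(f, bounds, maxiter=20,
                                         seed=int(rng.integers(1 << 31)),
                                         polish=False)
        improvement = max(best_f - res.fun, 0.0)
        credit[algo] = 0.5 * credit[algo] + improvement
        if res.fun < best_f:
            best_f, best_x = res.fun, np.asarray(res.x)
    return best_x, best_f

if __name__ == "__main__":
    x, fx = hybrid_minimize(rastrigin, [(-5.12, 5.12)] * 5)
    print("best value found:", fx)
```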
Abstract:
The design and analysis of conceptually different cooling systems for human heart preservation are numerically investigated. A heart cooling container with the required connections was designed for a normal-size human heart. A three-dimensional, high-resolution human heart geometric model obtained from CT-angiography data was used for the simulations. Nine cooling designs are introduced in this research. The first design (Case 1) used a cooling gelatin only outside of the heart. In the second design (Case 2), the internal parts of the heart were cooled by pumping a cooling liquid through both the heart's pulmonary and systemic circulation systems. An unsteady conjugate heat transfer analysis was performed to simulate the temperature field variations within the heart during the cooling process. Case 3 simulated the currently used cooling method, in which the coolant is stagnant. Case 4 was a combination of Case 1 and Case 2. A linear thermoelasticity analysis was performed to assess the stresses applied to the heart during the cooling process. In Cases 5 through 9, the coolant solution was used for both internal and external cooling. For the external circulation in Cases 5 and 6, two inlets and two outlets were designed on the walls of the cooling container. Case 5 used laminar flows for the coolant circulations inside and outside of the heart, while the effects of turbulent flow on cooling of the heart were studied in Case 6. In Case 7, an additional inlet was designed on the cooling container wall to create a jet impinging on the hot region of the heart's wall. Unsteady periodic inlet velocities were applied in Cases 8 and 9. The average temperature of the heart in Case 5 was +5.0 °C after 1500 s of cooling. A multi-objective constrained optimization was performed for Case 5, with the inlet velocities of the two internal and one external coolant circulations as the three design variables. The three objectives were to minimize the average temperature of the heart, the wall shear stress, and the total volumetric flow rate. The only constraint was to keep the von Mises stress below the ultimate tensile stress of the heart's tissue.
Abstract:
Flame retardants (FRs) are added to materials to enhance the fire safety level of readily combustible polymers. Although FRs have been purported to aid in preventing fires in some cases, they have also become a significant cause for concern given the vast data on their environmental persistence and adverse health effects in humans and animals. Evidence since the 1980s has shown that Canadians, Americans and Europeans have detectable levels of FRs in their bodies. North Americans in particular carry high levels of these chemicals due to stringent flammability standards and the greater use of polybrominated diphenyl ethers (PBDEs) in North America compared with Europe. FRs have been detected in household dust, and some evidence suggests that TVs could be a significant source of exposure. It is therefore imperative to revisit the flammability standard (UL94V) that allows for FR use in TV plastic materials by providing a risk-versus-benefit analysis to determine whether this standard provides a fire safety benefit and whether it plays a major role in FR exposure. This report first examines the history of televisions and the progression to the UL94V flammability test standard to understand why FRs were first added to the polymers used in manufacturing TVs; this practice arose from the fire hazards posed by plastic materials in cathode-ray tube (CRT) TVs, which had an "instant-on" feature, high voltages and high operating temperatures. In providing a risk-versus-benefit analysis, this paper argues that 1) a market survey shows the current flammability test standard (UL94V) to be outdated and lacking relevance to current technology, as flat, thin, energy-efficient Liquid Crystal Displays (LCDs) now dominate over the traditionally used heavy, bulky and energy-intensive CRTs; 2) FRs do not impart fire safety benefits, given the lack of a valid fire safety concern (such as reduced internal and external ignition and fire hazard) and the lack of valid fire data and hazard for television fires in general; and 3) the standard is overly stringent, as it does not consider the risk of exposure to FRs in household dust arising from the proliferation and greater use of televisions in households. Therefore, this report argues that the UL94V standard has become trapped in history and needs to be updated, as it may play a major role in FR exposure.
Abstract:
Strategic supply chain optimization (SCO) problems are often modelled as two-stage optimization problems, in which the first-stage variables represent decisions on the development of the supply chain and the second-stage variables represent decisions on its operation. When uncertainty is explicitly considered, the problem becomes an intractable infinite-dimensional optimization problem, which is usually solved approximately via a scenario or a robust approach. This paper proposes a novel synergy of the scenario and robust approaches for strategic SCO under uncertainty. Two formulations are developed, namely a naïve robust scenario formulation and an affinely adjustable robust scenario formulation. It is shown that both can be recast as tractable deterministic optimization problems if the uncertainty is bounded with the infinity-norm, and that the uncertain equality constraints can be reformulated into deterministic constraints without any assumption on the uncertainty region. Case studies of a classical farm planning problem and an energy and bioproduct SCO problem demonstrate the advantages of the proposed formulations over the classical scenario formulation. The proposed formulations can not only generate solutions with guaranteed feasibility, or indicate infeasibility of a problem, but also achieve optimal expected economic performance with smaller numbers of scenarios.
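For illustration only, the following sketch sets up a tiny scenario-based two-stage problem of the general kind described: a first-stage capacity decision is made before demand is known, and per-scenario recourse purchases cover any shortfall. The costs, demands and probabilities are invented, and this is the classical scenario formulation, not the paper's robust scenario formulations.

```python
# Minimal scenario-based two-stage linear program solved with scipy.
import numpy as np
from scipy.optimize import linprog

capacity_cost = 1.0   # first-stage unit cost of installed capacity
recourse_cost = 3.0   # second-stage unit cost of emergency purchases
demands = np.array([80.0, 100.0, 130.0])   # demand in each scenario
probs = np.array([0.3, 0.5, 0.2])          # scenario probabilities

S = len(demands)
# Decision vector z = [x, y_1, ..., y_S]; minimize expected total cost.
c = np.concatenate(([capacity_cost], recourse_cost * probs))
# Constraints x + y_s >= d_s written as -x - y_s <= -d_s for linprog.
A_ub = np.zeros((S, 1 + S))
A_ub[:, 0] = -1.0
A_ub[np.arange(S), 1 + np.arange(S)] = -1.0
b_ub = -demands

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + S))
print("first-stage capacity:", res.x[0])
print("per-scenario recourse:", res.x[1:])
print("expected cost:", res.fun)
```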
Abstract:
This paper is concerned with the strategic optimization of a typical industrial chemical supply chain, which involves a material purchase and transportation network, several manufacturing plants with on-site material and product inventories, a product transportation network and several regional markets. To address large uncertainties in customer demands at the different regional markets, a novel robust scenario formulation, recently developed by the authors, is tailored and applied to the strategic optimization. Case study results show that the robust scenario formulation works well for this real industrial supply chain system; it outperforms the deterministic formulation and the classical scenario-based stochastic programming formulation by generating better expected economic performance and solutions that are guaranteed to be feasible for all uncertainty realizations. The robust scenario problem exhibits a decomposable structure that Benders decomposition can exploit for efficient solution, so the application of Benders decomposition to the strategic optimization is also discussed. The case study results show that Benders decomposition can reduce the solution time by almost an order of magnitude when the number of scenarios in the problem is large.
Abstract:
The study examines the short-run and long-run causality running from real economic growth to real foreign direct investment (RFDI) inflows. Other variables, such as education (a combination of primary, secondary and tertiary enrolment is used as a proxy for education), real development finance and unskilled labour, are also included. Time series data covering the period 1983-2013 are examined. First, I applied the Augmented Dickey-Fuller (ADF) technique to test for unit roots in the variables; the findings show that all variables are integrated of order one, I(1). Thereafter, the Johansen cointegration test (JCT) was conducted to establish the relationship among the variables, and both the trace and maximum eigenvalue statistics at the 5% level of significance indicate three cointegrating equations. A vector error correction model (VECM) was then applied to capture the short-run and long-run causality running from education, economic growth, real development finance and unskilled labour to real foreign direct investment inflows in the Republic of Rwanda. The findings show no short-run causality running from education, real development finance, real GDP and unskilled labour to real FDI inflows, but long-run causality does exist. This can be interpreted to mean that, in the short run, education, development finance and economic growth do not influence inflows of foreign direct investment in Rwanda, but they do in the long run. From a policy perspective, the Republic of Rwanda should focus more on the long-term goal of investing in education to improve human capital, undertake policy reforms that promote economic growth, and promote good governance to attract development finance, especially from Nordic countries (particularly Norway and Denmark).
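A hedged sketch of this econometric pipeline (ADF unit-root tests, Johansen cointegration test, then a VECM) using statsmodels on synthetic placeholder data; the actual Rwandan series (RFDI, GDP, education, development finance, unskilled labour) are not reproduced here.

```python
# ADF unit-root tests, Johansen cointegration test, and a VECM on
# synthetic cointegrated series sharing a common stochastic trend.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(42)
T = 31  # annual observations, e.g. 1983-2013
common_trend = np.cumsum(rng.normal(size=T))
data = pd.DataFrame({
    "rfdi": common_trend + rng.normal(scale=0.5, size=T),
    "gdp": 0.8 * common_trend + rng.normal(scale=0.5, size=T),
    "education": 0.5 * common_trend + rng.normal(scale=0.5, size=T),
})

# 1. ADF unit-root test on each series (null: the series has a unit root).
for col in data:
    stat, pvalue, *_ = adfuller(data[col])
    print(f"ADF {col}: stat={stat:.2f}, p-value={pvalue:.2f}")

# 2. Johansen cointegration test (trace statistic vs. 5% critical values).
joh = coint_johansen(data, det_order=0, k_ar_diff=1)
for r, (trace, cv) in enumerate(zip(joh.lr1, joh.cvt[:, 1])):
    print(f"rank <= {r}: trace={trace:.2f}, 5% critical value={cv:.2f}")

# 3. VECM separating short-run dynamics from the long-run relationship.
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.summary())
```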
Abstract:
The problem of decentralized sequential detection is studied in this thesis, where the local sensors are memoryless, receive independent observations, and receive no feedback from the fusion center. In addition to the traditional criteria of detection delay and error probability, we introduce a new constraint: the number of communications between the local sensors and the fusion center. This metric reflects both the cost of establishing communication links and the overall energy consumption over time. A new formulation for communication-efficient decentralized sequential detection is proposed in which the overall detection delay is minimized subject to constraints on both the error probabilities and the communication cost. Two types of problems are investigated under this formulation: decentralized hypothesis testing and decentralized change detection. In the former case, an asymptotically person-by-person optimum detection framework is developed, in which the fusion center performs a sequential probability ratio test based on dependent observations. The proposed algorithm utilizes not only the reported statistics from the local sensors but also the reporting times. The asymptotic relative efficiency of the proposed algorithm with respect to the centralized strategy is expressed in closed form. When the probabilities of false alarm and missed detection are close to one another, a reduced-complexity algorithm is proposed based on a Poisson arrival approximation. Decentralized change detection with a communication cost constraint is also investigated. A person-by-person optimum change detection algorithm is proposed, in which transmissions of sensing reports are modeled as a Poisson process and the optimum threshold value is obtained through dynamic programming. An alternative method with a simpler fusion rule is also proposed, in which the threshold values are determined by a combination of sequential detection analysis and constrained optimization. In both the decentralized hypothesis testing and change detection problems, tradeoffs in parameter choices are investigated through Monte Carlo simulations.
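For reference, here is a plain centralized Wald sequential probability ratio test of the kind the fusion center is described as performing. This is only a baseline sketch on i.i.d. Gaussian observations, not the decentralized person-by-person optimum scheme developed in the thesis.

```python
# Wald's SPRT with thresholds set from target error probabilities.
import numpy as np
from scipy import stats

def sprt(observations, f0, f1, alpha=0.01, beta=0.01):
    """Return (decision, stopping_time) for H0 vs H1 via Wald's SPRT."""
    upper = np.log((1 - beta) / alpha)   # accept H1 when the LLR exceeds this
    lower = np.log(beta / (1 - alpha))   # accept H0 when the LLR drops below this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += np.log(f1(x)) - np.log(f0(x))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(observations)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f0 = lambda x: stats.norm.pdf(x, loc=0.0)   # H0: mean 0
    f1 = lambda x: stats.norm.pdf(x, loc=0.5)   # H1: mean 0.5
    data = rng.normal(loc=0.5, size=1000)       # data generated under H1
    decision, n = sprt(data, f0, f1)
    print(f"decision={decision} after {n} samples")
```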
Abstract:
Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground, and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science for responding to similar challenges; however, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals' protective behaviors in response to an epidemic and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest that such a model can be useful (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for the development of similar models that could assist with communication planning for epidemics.
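A toy sketch of the kind of agent-based model discussed (not the TELL ME model itself): agents adopt a protective behavior with a probability that rises with prevalence and with communication intensity, and protection lowers infection risk. All parameter values are invented for illustration.

```python
# Minimal agent-based illustration of behavior adoption during an epidemic.
import numpy as np

rng = np.random.default_rng(1)
N, STEPS = 1000, 100
COMMS = 0.3             # communication intensity (0-1), assumed constant
BASE_INFECTION = 0.08   # per-step infection scale when unprotected
PROTECTION_FACTOR = 0.3 # relative risk for protected agents

infected = rng.random(N) < 0.01
protective = np.zeros(N, dtype=bool)

for t in range(STEPS):
    prevalence = infected.mean()
    # Behavior: adoption probability grows with prevalence and communication.
    adopt_p = np.clip(0.5 * prevalence + 0.2 * COMMS, 0, 1)
    protective |= rng.random(N) < adopt_p
    # Epidemic: protected agents face a reduced infection probability.
    risk = BASE_INFECTION * prevalence * np.where(protective, PROTECTION_FACTOR, 1.0)
    infected |= rng.random(N) < risk

print(f"final prevalence: {infected.mean():.2f}, "
      f"protective behavior uptake: {protective.mean():.2f}")
```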
Abstract:
The occurrence of hand grindstones at Cogotas I archaeological sites is considered a common feature. Given that a raw material of distant provenance is frequently involved, determining its source is a basic factor in the search for a better understanding of resource management and for any Political Economy approach. To progress in these directions, an overall study should be planned using selected grindstones, with a view to covering diverse sub-zones of the Cogotas I dispersal area, especially given its considerable distance from the granite basement source. Such a study may today include diverse analytical procedures combining successive geographic, petrographic, mineralogical and geochemical criteria. To check the plausibility of the proposed methodology, a preliminary test was carried out on two granite grindstones obtained from the archaeological excavation at the Castronuño (Valladolid) Cogotian site, which is fifty km away from an inferred source area presumably located at Peñausende (Zamora). The result validates the proposed operational process, yielding knowledge that is generalizable to other similar situations.
Abstract:
The article presents a study of a CEFR B2-level reading subtest that is part of the Slovenian national secondary school leaving examination in English as a foreign language, and compares test-takers' actual performance (objective difficulty) with test-takers' and experts' perceptions of item difficulty (subjective difficulty). The study also analyses the test-takers' comments on item difficulty obtained from a while-reading questionnaire. The results are discussed within the framework of existing research on (the assessment of) reading comprehension and are addressed with regard to their implications for item writing, FL teaching and curriculum development.
Abstract:
The VLT-FLAMES Tarantula Survey (VFTS) has secured mid-resolution spectra of over 300 O-type stars in the 30 Doradus region of the Large Magellanic Cloud. A homogeneous analysis of such a large sample requires automated techniques, an approach that will also be needed for the upcoming analysis of the spectroscopic surveys of the Northern and Southern Hemispheres supplementing the Gaia measurements. We point out the importance of Gaia for the study of O stars, summarize the O-star science case of VFTS, and present a test of the automated modeling technique using synthetically generated data. The method employs a genetic-algorithm-based optimization technique in combination with fastwind model atmospheres. It is found to be robust and able to recover the main photospheric parameters accurately. Precise wind parameters can be obtained as well; however, as expected, the rate of acceleration of the flow is poorly constrained for dwarf stars.
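A schematic sketch of genetic-algorithm fitting in the spirit of the described method. The real analysis couples the GA to fastwind model atmospheres; here a toy Gaussian absorption-line model stands in for the atmosphere code, and the GA operators (tournament selection, blend crossover, Gaussian mutation) are generic choices.

```python
# Genetic-algorithm recovery of parameters from a synthetic "spectrum".
import numpy as np

rng = np.random.default_rng(0)
wavelength = np.linspace(-5, 5, 200)

def toy_line_profile(depth, width):
    """Stand-in for a model atmosphere: a single Gaussian absorption line."""
    return 1.0 - depth * np.exp(-0.5 * (wavelength / width) ** 2)

true_params = (0.4, 1.2)
observed = toy_line_profile(*true_params) + rng.normal(scale=0.01, size=wavelength.size)

def fitness(params):
    return -np.sum((toy_line_profile(*params) - observed) ** 2)

bounds = np.array([[0.0, 1.0], [0.1, 3.0]])           # (depth, width) ranges
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 2))
for generation in range(80):
    scores = np.array([fitness(p) for p in pop])
    new_pop = []
    for _ in range(len(pop)):
        i, j = rng.integers(len(pop), size=2)          # tournament selection
        parent_a = pop[i] if scores[i] > scores[j] else pop[j]
        k, l = rng.integers(len(pop), size=2)
        parent_b = pop[k] if scores[k] > scores[l] else pop[l]
        w = rng.random()
        child = w * parent_a + (1 - w) * parent_b      # blend crossover
        child += rng.normal(scale=0.05, size=2)        # Gaussian mutation
        new_pop.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(p) for p in pop])]
print("recovered parameters:", best, "true:", true_params)
```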
Abstract:
This paper describes a methodology of using individual engineering undergraduate student projects as a means of effectively and efficiently developing new Design-Build-Test (DBT) learning experiences and challenges.
A key aspect of the rationale for this approach is that it benefits all parties. The student undertaking the individual project gets an authentic experience of producing a functional artefact as the result of a design process that addresses conception, design, implementation and operation. The supervising faculty member benefits from live prototyping of new curriculum content and resources with a student who is at a similar level of knowledge and experience to the intended end users of the DBT outputs. The multiple students who ultimately undertake the DBT experiences and challenges benefit from a learning experience that has been "road tested" and optimised.
To demonstrate the methodology, the paper describes a case study example of an individual project completed in 2015. This resulted in a DBT design challenge with the theme of designing a catapult for throwing table tennis balls, the device being made from components laser cut from medium density fibreboard (MDF). Further, three different modes of operation are described which use the same resource materials but operate over different timescales and with different learning outcomes, ranging from an icebreaker exercise focused on developing team dynamics through to a full DBT experience in which students experience the full impact of their design decisions by competing against other students with a catapult they have designed and built themselves.
Abstract:
This study investigates topology optimization of energy-absorbing structures in which material damage is accounted for in the optimization process. The objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that detects when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. The continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy-absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt; it identifies when the optimization has plateaued and is no longer likely to provide improved designs if continued for further iterations. This provides the designer with a rational way to determine how long to run the optimization and to avoid wasting computational resources on unnecessary iterations. A case study is presented to demonstrate the use of this method.
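A minimal sketch of the continuity (load-path) check idea: given the elements that survive damage or removal, a breadth-first search over shared nodes verifies that the loaded nodes are still connected to the supports. The small mesh is a made-up example, not the paper's implementation.

```python
# Load-path connectivity check over the surviving finite elements.
from collections import deque

def load_path_exists(elements, loaded_nodes, support_nodes):
    """elements: iterable of node-index tuples for each surviving element."""
    # Nodes are considered connected if they belong to the same element.
    adjacency = {}
    for element in elements:
        for a in element:
            adjacency.setdefault(a, set()).update(n for n in element if n != a)

    frontier = deque(n for n in loaded_nodes if n in adjacency)
    visited = set(frontier)
    while frontier:
        node = frontier.popleft()
        if node in support_nodes:
            return True
        for neighbor in adjacency.get(node, ()):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(neighbor)
    return False

# Example: a 3-element strip; removing the middle element breaks the load path.
intact = [(0, 1, 4, 3), (1, 2, 5, 4), (2, 6, 7, 5)]
damaged = [(0, 1, 4, 3), (2, 6, 7, 5)]
print(load_path_exists(intact, loaded_nodes={0, 3}, support_nodes={6, 7}))   # True
print(load_path_exists(damaged, loaded_nodes={0, 3}, support_nodes={6, 7}))  # False
```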