50 results for Software testing. Problem-oriented programming. Teaching methodology
Abstract:
Functional and non-functional concerns require different programming effort, different techniques and different methodologies when attempting to program efficient parallel/distributed applications. In this work we present a "programmer-oriented" methodology based on formal tools that permits reasoning about parallel/distributed program development and refinement. The proposed methodology is semi-formal in that it does not require the exploitation of highly formal tools and techniques, while providing palatable and effective support to programmers developing parallel/distributed applications, in particular when handling non-functional concerns.
Abstract:
Nurse rostering is a difficult search problem with many constraints. In the literature, a number of approaches have been investigated, including penalty function methods, to tackle these constraints within genetic algorithm frameworks. In this paper, we investigate an extension of a previously proposed stochastic ranking method, which has demonstrated superior performance to other constraint handling techniques when tested against a set of constrained optimisation benchmark problems. An initial experiment on nurse rostering problems demonstrates that the stochastic ranking method is better at finding feasible solutions but fails to obtain good results with regard to the objective function. To improve the performance of the algorithm, we hybridise it with a recently proposed simulated annealing hyper-heuristic within a local search and genetic algorithm framework. The hybrid algorithm shows significant improvement over both the genetic algorithm with stochastic ranking and the simulated annealing hyper-heuristic alone. The hybrid algorithm also considerably outperforms the methods in the literature with the previously best-known results.
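The paper itself gives no code, but the stochastic ranking procedure it builds on (Runarsson and Yao's bubble-sort-like ranking) is simple enough to sketch. In this minimal Python version, `objective` and `violation` are assumed callables returning a candidate's objective value and total constraint violation, and pf = 0.45 is the commonly used comparison probability.

```python
import random

def stochastic_rank(population, objective, violation, pf=0.45):
    # Bubble-sort-like stochastic ranking (Runarsson & Yao, 2000):
    # adjacent individuals are compared by objective value whenever both
    # are feasible, or with probability pf; otherwise by constraint
    # violation. Lower objective and lower violation are better.
    ranked = list(population)
    n = len(ranked)
    for _ in range(n):
        swapped = False
        for i in range(n - 1):
            a, b = ranked[i], ranked[i + 1]
            va, vb = violation(a), violation(b)
            if (va == 0 and vb == 0) or random.random() < pf:
                out_of_order = objective(a) > objective(b)
            else:
                out_of_order = va > vb
            if out_of_order:
                ranked[i], ranked[i + 1] = b, a
                swapped = True
        if not swapped:
            break
    return ranked
```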
Abstract:
Self-compacting concrete (SCC) flows into place and around obstructions under its own weight to fill the formwork completely and self-compact without any segregation or blocking. Eliminating the need for compaction leads to better quality concrete and a substantial improvement in working conditions. This investigation aimed to show the applicability of genetic programming (GP) to modelling and formulating the fresh and hardened properties of SCC containing pulverised fuel ash (PFA), based on experimental data. Twenty-six mixes were made with a water-to-binder ratio (W/B) of 0.38 to 0.72, 183–317 kg/m3 of cement, 29–261 kg/m3 of PFA, and 0 to 1% of superplasticizer, by mass of powder. The properties of the SCC mixes modelled by genetic programming were the slump flow, the J-Ring combined with the Orimet, the J-Ring combined with the cone, and the compressive strength at 7, 28 and 90 days. The GP models were constructed from training and testing data drawn from the experimental results obtained in this study. The results of the genetic programming models are compared with the experimental results and are found to be quite accurate. GP showed strong potential as a feasible tool for modelling the fresh properties and the compressive strength of SCC containing PFA, and produced analytical predictions of these properties as functions of the mix ingredients. The results showed that the GP model thus developed is not only capable of accurately predicting the slump flow, the J-Ring combined with the Orimet, the J-Ring combined with the cone, and the compressive strength used in the training process, but can also effectively predict these properties for new mixes designed within the practical range of variation of the mix ingredients.
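The paper's GP system is not reproduced here, but the general technique, evolving expression trees that map mix ingredients to a measured property, can be sketched in a few dozen lines. In this toy Python version the variable names (`wb`, `cement`, `pfa`, `sp`) and the truncation-selection loop are illustrative assumptions, not the paper's setup; training data would be rows of mix proportions with a measured target `y`.

```python
import operator
import random

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
VARS = ['wb', 'cement', 'pfa', 'sp']   # hypothetical mix-ingredient inputs

def random_tree(depth=3):
    # Grow a random expression tree over the inputs and small constants.
    if depth == 0 or random.random() < 0.3:
        return random.choice(VARS + [round(random.uniform(-1.0, 1.0), 2)])
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, row):
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, row), evaluate(right, row))
    return row[tree] if isinstance(tree, str) else tree

def mse(tree, data):
    # Fitness: mean squared error against the measured property 'y'.
    return sum((evaluate(tree, row) - row['y']) ** 2 for row in data) / len(data)

def mutate(tree, depth=2):
    # Replace a random subtree with a freshly grown one.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

def evolve(data, pop_size=200, generations=60):
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda t: mse(t, data))
        survivors = population[:pop_size // 4]          # truncation selection
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return min(population, key=lambda t: mse(t, data))
```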
Abstract:
This paper describes the testing of a novel flexible masonry concrete arch system which requires no centering in the construction phase and no steel reinforcement in the long term. The arch is constructed from a 'flat pack' system, using polymer reinforcement to support the self-weight of the concrete voussoirs, and behaves as a masonry arch once in the arch form. The paper outlines the construction of a prototype arch and the load testing of the backfilled arch ring. Comparisons with the results from analysis software are also made. The arch had a load carrying capacity far in excess of the current Highways Agency (United Kingdom) design wheel loads.
Abstract:
Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers to make necessary trade-off decisions for resolving conflicts. However, in most distributed development approaches, such as viewpoints-based development, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. This disagreement in the local levels of priority assigned to the same shared requirements statement often puts developers into a dilemma during the inconsistency handling process. The main contribution of this paper is a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate ways of modifying the given inconsistent requirements specification, in the sense of the ordering relation over all the consistent subsets of the requirements specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal from among these proposals.
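The paper works with formal merging operators, which are not reproduced here. Purely as a loose illustration, the Python sketch below shows one greedy reading of the priority vector-based idea: each viewpoint maps requirement IDs to local priorities (1 = highest), vectors of local priorities are compared lexicographically, and requirements are retained in global priority order subject to a consistency check, which is left as a placeholder callable.

```python
def global_priority_vectors(viewpoints):
    # Each viewpoint maps requirement -> local priority (1 = highest).
    # A requirement's vector collects its local priorities; sorting the
    # vector and comparing lexicographically is a simple stand-in for
    # the paper's priority vector-based construction. The default 99
    # for requirements a viewpoint does not rank is an assumption.
    reqs = set().union(*viewpoints)
    return {r: tuple(sorted(vp.get(r, 99) for vp in viewpoints)) for r in reqs}

def prioritized_repair(viewpoints, consistent):
    # Greedily keep requirements in global priority order, skipping any
    # that would make the retained set inconsistent. `consistent` is a
    # placeholder for a real (e.g. logic-based) consistency check.
    vectors = global_priority_vectors(viewpoints)
    kept = []
    for req in sorted(vectors, key=vectors.get):
        if consistent(kept + [req]):
            kept.append(req)
    return kept
```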
Abstract:
This case study examines how the lean ideas behind the Toyota production system can be applied to software project management. It is a detailed investigation of the performance of a nine-person software development team employed by BBC Worldwide, based in London. The data, collected in 2009, involved direct observation of the development team, the kanban boards, the daily stand-up meetings, semistructured interviews with a wide variety of staff, and statistical analysis. The evidence shows that over the 12-month period, lead time to deliver software improved by 37%, consistency of delivery rose by 47%, and defects reported by customers fell by 24%. The significance of this work lies in showing that the use of lean methods, including visual management, team-based problem solving, smaller batch sizes, and statistical process control, can improve software development. It also summarizes key differences between agile and lean approaches to software development. The conclusion is that the performance of the software development team was improved by adopting a lean approach. The faster delivery, with a focus on creating the highest value for the customer, also reduced both technical and market risks. A drawback is that the approach may not fit well with existing corporate standards.
Abstract:
A BSP superstep is a distributed computation comprising a number of simultaneously executing processes which may generate asynchronous messages. A superstep terminates with a barrier which enforces a global synchronisation and delivers all ongoing communications. Multilevel supersteps can utilise barriers in which subsets of processes, interacting through shared memories, are locally synchronised (partitioned synchronisation). In this paper a state-based semantics, closely related to the classical sequential programming model, is derived for distributed BSP with partitioned synchronisation.
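To make the model concrete (this is an illustration, not the paper's state-based semantics), the Python sketch below mimics a BSP superstep with threads: messages sent during a superstep are buffered and delivered only when the barrier trips, so each process sees a consistent state at the start of the next superstep.

```python
import threading

class BSP:
    # Minimal BSP-style runtime: sends are buffered in outboxes and the
    # barrier's action delivers them, mirroring the model in which a
    # superstep's communications complete at synchronisation.
    def __init__(self, n):
        self.n = n
        self.outbox = [[] for _ in range(n)]   # messages awaiting the barrier
        self.inbox = [[] for _ in range(n)]    # messages visible after it
        self.lock = threading.Lock()
        self.barrier = threading.Barrier(n, action=self._deliver)

    def _deliver(self):                        # runs once per barrier trip
        self.inbox, self.outbox = self.outbox, [[] for _ in range(self.n)]

    def send(self, dest, msg):
        with self.lock:
            self.outbox[dest].append(msg)

    def sync(self):                            # end of the superstep
        self.barrier.wait()

def worker(bsp, pid):
    bsp.send((pid + 1) % bsp.n, f"hello from {pid}")  # superstep 0: communicate
    bsp.sync()                                        # barrier delivers messages
    print(pid, "received", bsp.inbox[pid])            # superstep 1: use them

bsp = BSP(4)
threads = [threading.Thread(target=worker, args=(bsp, p)) for p in range(4)]
for t in threads: t.start()
for t in threads: t.join()
```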
Abstract:
A Web-service-based approach is presented which enables geographically dispersed users to share software resources over the Internet. A service-oriented software sharing system has been developed, which consists of shared applications, client applications and three types of services: an application proxy service, a proxy implementation service and an application manager service. With the aid of these services, the client applications interact with the shared applications to carry out a software sharing task. The approach satisfies the requirements of copyright protection and reuse of legacy code. In this paper, the role of the Web services and the architecture of the system are presented first, followed by a case study to illustrate the approach.
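The paper names the three service types but this listing gives no interfaces, so the Python sketch below is purely hypothetical: it shows one plausible division of labour in which a client-facing proxy forwards operations to a server-side implementation wrapping the legacy application, so the shared code never leaves its host.

```python
class ApplicationManagerService:
    # Registry of shared applications (hypothetical interface; the paper
    # names the three services but their APIs are not given here).
    def __init__(self):
        self.registry = {}
    def register(self, name, proxy):
        self.registry[name] = proxy
    def lookup(self, name):
        return self.registry[name]

class ProxyImplementationService:
    # Server-side adapter wrapping the legacy shared application.
    def __init__(self, shared_app):
        self.app = shared_app
    def dispatch(self, operation, *args):
        return getattr(self.app, operation)(*args)

class ApplicationProxyService:
    # Client-facing facade: forwards calls to the proxy implementation,
    # which supports copyright protection and legacy-code reuse because
    # only results, never code, cross the network.
    def __init__(self, impl):
        self.impl = impl
    def invoke(self, operation, *args):
        return self.impl.dispatch(operation, *args)
```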
Abstract:
Background: Unexplained persistent breathlessness in patients with difficult asthma despite multiple treatments is a common clinical problem. Cardiopulmonary exercise testing (CPX) may help identify the mechanism causing these symptoms, allowing appropriate management.
Methods: This was a retrospective analysis of patients attending a specialist-provided service for difficult asthma who proceeded to CPX as part of our evaluation protocol. Patient demographics, lung function, and use of health care and rescue medication were compared with those in patients with refractory asthma. Medication use 6 months following CPX was compared with treatment during CPX.
Results: Of 302 sequential referrals, 39 patients underwent CPX. A single explanatory feature was identified in 30 patients and two features in nine patients: hyperventilation (n = 14), exercise-induced bronchoconstriction (n = 8), submaximal test (n = 8), normal test (n = 8), ventilatory limitation (n = 7), deconditioning (n = 2), cardiac ischemia (n = 1). Compared with patients with refractory asthma, patients without “pulmonary limitation” on CPX were prescribed similar doses of inhaled corticosteroid (ICS) (median, 1,300 µg [interquartile range (IQR), 800-2,000 µg] vs 1,800 µg [IQR, 1,000-2,000 µg]) and similar numbers of rescue oral steroid courses in the previous year (median, 5 [1-6] vs 5 [1-6]). In this group, 6 months post-CPX, ICS doses were reduced (median, 1,300 µg [IQR, 800-2,000 µg] to 800 µg [IQR, 400-1,000 µg]; P < .001) and additional medication treatment was withdrawn (n = 7). Patients with pulmonary limitation had unchanged ICS doses post-CPX, and additional therapies were introduced.
Conclusions: In difficult asthma, CPX can confirm that persistent exertional breathlessness is due to asthma but can also identify other contributing factors. Patients with nonpulmonary limitation are prescribed inappropriately high doses of steroid therapy, and CPX can identify the primary mechanism of breathlessness, facilitating steroid reduction.
Abstract:
The contemporary dominance of visuality has turned our understanding of space into a mode of unidirectional experience that externalizes the other sensual capacities of the body while perceiving the built environment. This affects not only architectural practice but also architectural education, where introducing the concept of space is often challenging, especially for students who have limited spatial and sensual training. Considering that an architectural work is not perceived as a series of retinal pictures but as a repeated multi-sensory experience, the problem definitions in the design studio need to be disengaged from the dominance of a ‘focused vision’ and re-constructed in a holistic manner. One method of addressing this is to enable the students to draw on their own sensual experiences of the built environment as part of their design processes. This paper focuses on a particular approach to second-year architectural design teaching which has been followed in the Department of Architecture at Izmir University of Economics for the last three years. The very first architectural project of the studio and the program, entitled ‘Sensing Spaces’, is conducted as a multi-staged design process including ‘sense games, analyses of organs and their interpretations into space’. The objectives of this four-week project are to explore the sense of space through the design of a three-dimensional assembly, to create an awareness of the significance of the senses in the design process and to experiment with re-interpreted forms of bodily parts. Hence, the students are encouraged to explore architectural space through their ‘tactile, olfactory, auditory, gustative and visual stimuli’. In this paper, based on a series of examples, architectural space is examined beyond its boundaries of structure, form and function, and spatial design is considered as an activity of re-constructing the built environment through an awareness of the bodily senses.
Abstract:
This paper describes the development of a novel metaheuristic that combines an electromagnetic-like mechanism (EM) and the great deluge algorithm (GD) for the university course timetabling problem. This well-known timetabling problem assigns lectures to specific numbers of timeslots and rooms, maximizing the overall quality of the timetable while taking various constraints into account. EM is a population-based stochastic global optimization algorithm, based on the theory of physics, that simulates the attraction and repulsion of sample points in moving toward optimality. GD is a local search procedure that allows worse solutions to be accepted based on a given upper boundary or ‘level’. In this paper, the dynamic force calculated from the attraction-repulsion mechanism is used as a decreasing rate to update the ‘level’ within the search process. The proposed method has been applied to a range of benchmark university course timetabling test problems from the literature. Moreover, the viability of the method has been tested by comparing its results with other reported results from the literature, demonstrating that the method is able to produce solutions that improve on those currently published. We believe this is due to the combination of both approaches and the ability of the resultant algorithm to drive all solutions toward convergence throughout the search process.
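As a point of reference (not the paper's hybrid), plain great deluge acceptance is easy to sketch in Python. In the paper the 'level' decay rate is derived dynamically from the EM attraction-repulsion force; the fixed `level_rate` parameter below is a stand-in for that, and `neighbour` and `cost` are assumed problem-specific callables.

```python
def great_deluge(initial, neighbour, cost, level_rate, steps=100_000):
    # Classic great deluge (Dueck): accept a candidate if it improves on
    # the current solution or stays below the falling 'level'. In the
    # paper the decrement applied to the level comes from the EM
    # attraction-repulsion force; level_rate is a fixed stand-in here.
    current = best = initial
    level = cost(initial)
    for _ in range(steps):
        candidate = neighbour(current)
        c = cost(candidate)
        if c <= cost(current) or c <= level:
            current = candidate
            if c < cost(best):
                best = candidate
        level -= level_rate        # lower the water level every iteration
    return best
```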
Abstract:
This paper describes the design, application, and evaluation of a user-friendly, flexible, scalable and inexpensive Advanced Educational Parallel (AdEPar) digital signal processing (DSP) system based on TMS320C25 digital processors to implement DSP algorithms. This system will be used in the DSP laboratory by graduate students working on advanced topics such as developing parallel DSP algorithms. Graduating senior students who have gained some experience in DSP can also use the system. The DSP laboratory has proved to be a useful tool in the hands of the instructor for teaching the mathematically oriented topics of DSP that are often difficult for students to grasp. The DSP laboratory with assigned projects has greatly improved the ability of the students to understand such complex topics as the fast Fourier transform algorithm, linear and circular convolution, and the theory and design of infinite impulse response (IIR) and finite impulse response (FIR) filters. The user-friendly PC software support of the AdEPar system makes it easy for students to develop DSP programs. This paper gives the architecture of the AdEPar DSP system. The communication between processors and the PC-DSP processor communication are explained. The parallel debugger kernels and the restrictions of the system are described. Programming in the AdEPar system is explained, and two benchmarks (a parallel FFT and DES) are presented to show the system performance.
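For readers unfamiliar with the FFT projects mentioned above, a sequential radix-2 Cooley-Tukey FFT is shown below in Python as a reference point; the AdEPar benchmark is a parallel implementation on TMS320C25 hardware, which is not reproduced here.

```python
import cmath

def fft(x):
    # Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two.
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddled[k] for k in range(n // 2)] +
            [even[k] - twiddled[k] for k in range(n // 2)])
```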
Abstract:
As a promising method for pattern recognition and function estimation, least squares support vector machines (LS-SVM) express training in terms of solving a linear system instead of the quadratic programming problem of conventional support vector machines (SVM). In this paper, by using the information provided by the equality constraint, we transform the minimization problem with a single equality constraint in LS-SVM into an unconstrained minimization problem, and then propose reduced formulations for LS-SVM. With this transformation, the number of invocations of the conjugate gradient (CG) method, a highly time-consuming step in obtaining the numerical solution, is reduced from two (as proposed by Suykens et al., 1999) to one. A comparison of the computational speed of our method with the CG method of Suykens et al. and with the first-order and second-order SMO methods on several benchmark data sets shows a reduction in training time of up to 44%.
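The heart of LS-SVM training is the linear system the abstract refers to. As a minimal illustration (using a direct solver rather than the paper's reduced CG formulation), the Python sketch below sets up and solves the standard LS-SVM regression system with an RBF kernel; `gamma` and `sigma` are assumed hyperparameters.

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVM (regression form): training reduces to one linear
    # system  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    # A direct solver is used here for clarity; the paper's contribution
    # concerns cutting the number of CG solves for this same system.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))          # RBF kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                        # alpha, bias b

def lssvm_predict(X_train, alpha, b, x, sigma=1.0):
    k = np.exp(-((X_train - x) ** 2).sum(-1) / (2.0 * sigma ** 2))
    return k @ alpha + b
```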
Abstract:
The efficient development of multi-threaded software has, for many years, been an unsolved problem in computer science. Finding a solution to this problem has become urgent with the advent of multi-core processors. Furthermore, the problem has become more complicated because multi-cores are everywhere (desktop, laptop, embedded system). As such, they execute generic programs which exhibit very different characteristics than the scientific applications that have been the focus of parallel computing in the past.
Implicitly parallel programming is an approach to parallel programming that promises high productivity and efficiency and rules out synchronization errors and race conditions by design. There are two main ingredients to implicitly parallel programming: (i) a conventional sequential programming language that is extended with annotations that describe the semantics of the program and (ii) an automatic parallelizing compiler that uses the annotations to increase the degree of parallelization.
It is extremely important that the annotations and the automatic parallelizing compiler are designed with the target application domain in mind. In this paper, we discuss the Paralax approach to implicitly parallel programming and we review how the annotations and the compiler design help to successfully parallelize generic programs. We evaluate Paralax on SPECint benchmarks, which are a model for such programs, and demonstrate scalable speedups, up to a factor of 6 on 8 cores.
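Paralax annotates sequential C code for an automatic parallelizing compiler; its actual annotation syntax is not shown in this listing. Purely to illustrate the idea of semantics-carrying annotations enabling safe parallelization, here is a hypothetical Python analogue in which a 'pure' marker lets a toy runtime parallelize a map without risking races.

```python
from concurrent.futures import ProcessPoolExecutor

def pure(fn):
    # Hypothetical annotation: marks fn as side-effect free. Paralax
    # annotates sequential C for a parallelizing compiler; a decorator
    # only mimics the idea.
    fn.is_pure = True
    return fn

@pure
def score(item):
    return sum(ord(c) for c in item)      # independent per-item work

def auto_map(fn, items):
    # A toy 'parallelizing compiler': run fn in parallel only when the
    # annotation guarantees iterations cannot race.
    if getattr(fn, "is_pure", False):
        with ProcessPoolExecutor() as pool:
            return list(pool.map(fn, items))
    return [fn(x) for x in items]

if __name__ == "__main__":
    print(auto_map(score, ["spec", "int", "benchmarks"]))
```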