162 results for Exception Handling. Exceptional Behavior. Exception Policy. Software Testing. Design Rules
in Queensland University of Technology - ePrints Archive
Abstract:
This paper considers the debate about the relationship between globalization and media policy from the perspective provided by a current review of the Australian media classification scheme. Drawing upon the author’s recent experience in being ‘inside’ the policy process, as Lead Commissioner on the Australian National Classification Scheme Review, it is argued that theories of globalization – including theories of neoliberal globalization – fail to adequately capture the complexities of the reform process, particularly around the relationship between regulation and markets. The paper considers the pressure points for media content policies arising from media globalization, and the wider questions surrounding media content policies in an age of media convergence.
Abstract:
Various tools have been developed to assist designers in making interfaces easier to use, although none yet offers a complete solution. Through previous work we have established that intuitive interaction is based on past experience. From this we have developed theory around intuitive interaction, a continuum and a conceptual tool for intuitive use. We then trialled our tool. Firstly, one designer used the tool to design a camera. Secondly, seven groups of postgraduate students re-designed various products using our tool. We then chose one of these, a microwave, and prototyped the new and original microwave interfaces on a touchscreen. We tested them on three different age groups. We found that the new design was more intuitive and was rated by participants as more familiar. Therefore, design interventions based on our intuitive interaction theory can work. Work is ongoing to develop the tool further.
Abstract:
A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with the use of machine learning algorithms which use examples of fault-prone and not fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between module and classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources, the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data (Naive Bayes and the Support Vector Machine), and predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features.
A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into 2D rank sum space. SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
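As a loose illustration of the metric-based classification idea described above (not the thesis's actual data, metrics or models), the sketch below trains a Gaussian Naive Bayes classifier over two hypothetical module metrics, lines of code and cyclomatic complexity:

```python
import math

# Hypothetical module data: (lines of code, cyclomatic complexity) metrics,
# labelled fault-prone (1) or not fault-prone (0). Illustrative values only.
modules = [
    ((120, 4), 0), ((80, 3), 0), ((95, 2), 0),
    ((900, 25), 1), ((750, 18), 1), ((1100, 30), 1),
]

def fit(data):
    """Estimate per-class mean and variance of each metric (Gaussian NB)."""
    stats = {}
    for label in {y for _, y in data}:
        rows = [x for x, y in data if y == label]
        per_metric = []
        for column in zip(*rows):
            mu = sum(column) / len(column)
            var = sum((v - mu) ** 2 for v in column) / len(column) + 1e-9
            per_metric.append((mu, var))
        stats[label] = per_metric
    return stats

def predict(stats, x):
    """Pick the class with the highest Gaussian log-likelihood."""
    def loglik(params):
        return sum(-0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
                   for v, (mu, var) in zip(x, params))
    return max(stats, key=lambda label: loglik(stats[label]))

stats = fit(modules)
print(predict(stats, (850, 22)))  # 1: large, complex module flagged fault-prone
print(predict(stats, (100, 3)))   # 0: small, simple module
```

Modules predicted fault-prone can then be queued first for testing effort, which is the targeting idea the abstract describes.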
Abstract:
Model-based testing (MBT) relies on models of a system under test and/or its environment to derive test cases for the system. This paper discusses the process of MBT and defines a taxonomy that covers the key aspects of MBT approaches. It is intended to help with understanding the characteristics, similarities and differences of those approaches, and with classifying the approach used in a particular MBT tool. To illustrate the taxonomy, a description of how three different examples of MBT tools fit into the taxonomy is provided.
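As a minimal sketch of the MBT idea of deriving tests from a model of the system under test (the turnstile state machine and the transition-coverage criterion below are illustrative choices, not taken from the paper):

```python
from collections import deque

# Hypothetical behaviour model of a system under test: a coin-operated
# turnstile, given as (state, event) -> next state transitions.
model = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def derive_tests(model, start):
    """BFS over the model; return one shortest event path exercising
    each transition (a simple transition-coverage criterion)."""
    tests = {}
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        for (src, event), dst in model.items():
            if src != state:
                continue
            transition = (src, event, dst)
            if transition not in tests:
                tests[transition] = path + [event]
            if dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [event]))
    return sorted(tests.values(), key=len)

for test in derive_tests(model, "locked"):
    print(test)  # four event sequences, one per transition
```

In taxonomy terms, this toy generator would classify as an offline approach using a transition-based model and a structural (transition) coverage criterion.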
Abstract:
A discussion of 2008/2009 developments in Australian educational policy, with specific reference to the adoption of US and UK trends in accountability, testing and school reform.
Abstract:
This paper presents the details of experimental studies on the shear behaviour of a recently developed, cold-formed steel beam known as LiteSteel Beam (LSB). The LSB section has a unique shape of a channel beam with two rectangular hollow flanges and is produced by a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. To date, no research has been undertaken on the shear behaviour of LiteSteel beams with torsionally rigid, rectangular hollow flanges. In the present investigation, experimental studies involving more than 30 shear tests were carried out to investigate the shear behaviour of 13 different LSB sections. It was found that the current design rules in cold-formed steel structures design codes are very conservative for the shear design of LiteSteel beams. Significant improvements to web shear buckling occurred due to the presence of rectangular hollow flanges while considerable post-buckling strength was also observed. Experimental results are presented and compared with corresponding predictions from the current design codes in this paper. Appropriate improvements have been proposed for the shear strength of LSBs based on AS/NZS 4600 design equations.
Abstract:
Scalable high-resolution tiled display walls are becoming increasingly important to decision makers and researchers because high pixel counts in combination with large screen areas facilitate content rich, simultaneous display of computer-generated visualization information and high-definition video data from multiple sources. This tutorial is designed to cater for new users as well as researchers who are currently operating tiled display walls or 'OptiPortals'. We will discuss the current and future applications of display wall technology and explore opportunities for participants to collaborate and contribute in a growing community. Multiple tutorial streams will cover both hands-on practical development, as well as policy and method design for embedding these technologies into the research process. Attendees will be able to gain an understanding of how to get started with developing similar systems themselves, in addition to becoming familiar with typical applications and large-scale visualisation techniques. Presentations in this tutorial will describe current implementations of tiled display walls that highlight the effective usage of screen real-estate with various visualization datasets, including collaborative applications such as visualcasting, classroom learning and video conferencing. A feature presentation for this tutorial will be given by Jurgen Schulze from Calit2 at the University of California, San Diego. Jurgen is an expert in scientific visualization in virtual environments, human-computer interaction, real-time volume rendering, and graphics algorithms on programmable graphics hardware.
Abstract:
The reporting and auditing of patient dose is an important component of radiotherapy quality assurance. The manual extraction of dose-volume metrics is time consuming and undesirable when auditing the dosimetric quality of a large cohort of patient plans. A dose assessment application was written to overcome this, allowing the calculation of various dose-volume metrics for large numbers of plans exported from treatment planning systems. This application expanded on the DICOM-handling functionality of the MCDTK software suite. The software extracts dose values in the volume of interest by using a ray casting point-in-polygon algorithm, where the polygons have been defined by the contours in the RTSTRUCT file...
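The ray-casting point-in-polygon test mentioned above can be sketched as follows. This is a generic even-odd implementation, not the MCDTK code; the square contour is an illustrative stand-in for an RTSTRUCT contour slice:

```python
# Decide whether a dose-grid point lies inside a structure contour by
# casting a horizontal ray from the point: an odd number of polygon-edge
# crossings means the point is inside.
def point_in_polygon(x, y, polygon):
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the ray's y-coordinate can cross it.
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray's line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

contour = [(0, 0), (10, 0), (10, 10), (0, 10)]  # a square contour slice
print(point_in_polygon(5, 5, contour))   # True: point inside the contour
print(point_in_polygon(15, 5, contour))  # False: point outside
```

Running this test over every dose-grid point against each contour slice yields the voxels whose dose values enter the dose-volume metrics.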
Abstract:
Extant models of decision making in social neurobiological systems have typically explained task dynamics as characterized by transitions between two attractors. In this paper, we model a three-attractor task exemplified in a team sport context. The model showed that an attacker–defender dyadic system can be described by the angle x between a vector connecting the participants and the try line. This variable was proposed as an order parameter of the system and could be dynamically expressed by integrating a potential function. Empirical evidence has revealed that this kind of system has three stable attractors, with a potential function of the form V(x) = −k1x + k2x^2/2 − x^4/4 + x^6/6, where k1 and k2 are two control parameters. Random fluctuations were also observed in system behavior, modeled as white noise εt, leading to the motion equation dx/dt = −dV/dx + Q^0.5 εt, where Q is the noise variance. The model successfully mirrored the behavioral dynamics of agents in a social neurobiological system, exemplified by interactions of players in a team sport.
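The motion equation above can be integrated numerically; below is a minimal Euler-Maruyama sketch. The parameter values k1, k2, Q and the starting points are illustrative choices (picked so that the potential has multiple stable attractors), not values fitted in the paper:

```python
import math
import random

# dV/dx for V(x) = -k1*x + k2*x**2/2 - x**4/4 + x**6/6
def dV(x, k1, k2):
    return -k1 + k2 * x - x ** 3 + x ** 5

def simulate(x0, k1, k2, Q, dt=0.01, steps=5000, seed=1):
    """Euler-Maruyama integration of dx/dt = -dV/dx + sqrt(Q)*eps_t."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        x += -dV(x, k1, k2) * dt + math.sqrt(Q * dt) * rng.gauss(0, 1)
    return x

# With k1 = 0, k2 = 0.2 the potential has stable attractors at x = 0 and
# x ~ +/-0.85; with small noise, trajectories settle into the nearest basin.
print(round(simulate(x0=0.7, k1=0.0, k2=0.2, Q=0.001), 2))  # near 0.85
print(round(simulate(x0=0.1, k1=0.0, k2=0.2, Q=0.001), 2))  # near 0.0
```

Increasing Q, or tilting the potential via k1, lets the simulated order parameter hop between attractors, which is the qualitative behavior the model is meant to capture.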
Abstract:
Bug fixing is a highly cooperative work activity where developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker 1) supported the information needs of different stakeholders, 2) established common ground, and 3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design around these findings.
Abstract:
Amphibian is a 10’00’’ musical work which explores new musical interfaces and approaches to hybridising performance practices from the popular music, electronic dance music and computer music traditions. The work is designed to be presented in a range of contexts associated with the electro-acoustic, popular and classical music traditions. The work is for two performers using two synchronised laptops, an electric guitar and a custom designed gestural interface for vocal performers, the e-Mic (Extended Mic-stand Interface Controller). This interface was developed by one of the co-authors, Donna Hewitt. The e-Mic allows a vocal performer to manipulate the voice in real time through the capture of physical gestures via an array of sensors (pressure, distance, tilt) along with ribbon controllers and an X-Y joystick microphone mount. Performance data are then sent to a computer, running audio-processing software, which is used to transform the audio signal from the microphone. In this work, data is also exchanged between performers via a local wireless network, allowing performers to work with shared data streams. The duo employs the gestural conventions of guitarist and singer (i.e. 'a band' in a popular music context), but transforms these sounds and gestures into new digital music. The gestural language of popular music is deliberately subverted and taken into a new context. The piece thus explores the nexus between the sonic and performative practices of electro-acoustic music and intelligent electronic dance music (‘idm’). This work was situated in the research fields of new musical interfacing, interaction design, experimental music composition and performance. The contexts in which the research was conducted were live musical performance and studio music production. The work investigated new methods for musical interfacing, performance data mapping, hybrid performance and compositional practices in electronic music. The research methodology was practice-led.
New insights were gained from the iterative experimental workshopping of gestural inputs, musical data mapping, inter-performer data exchange, software patch design, and data and audio processing chains. In respect of interfacing, there were innovations in the design and implementation of a novel sensor-based gestural interface for singers, the e-Mic, one of the few existing gestural controllers for singers. This work explored the compositional potential of sharing real-time performance data between performers and deployed novel methods for inter-performer data exchange and mapping. As regards stylistic and performance innovation, the work explored and demonstrated an approach to the hybridisation of the gestural and sonic language of popular music with recent ‘post-digital’ approaches to laptop-based experimental music. The development of the work was supported by an Australia Council Grant. Research findings have been disseminated via a range of international conference publications, recordings, radio interviews (ABC Classic FM), broadcasts, and performances at international events and festivals. The work was curated into the major Australian international festival, Liquid Architecture, and was selected by an international music jury (through blind peer review) for presentation at the International Computer Music Conference in Belfast, N. Ireland.
Abstract:
An important aspect of designing any product is validation. The virtual design process (VDP) is an alternative to hardware prototyping in which designs can be analysed without manufacturing physical samples. In recent years, VDPs have been developed mainly for animation and film applications. This paper proposes a virtual reality design process model for one such application, its use as a validation tool. This technique is used to generate a complete design guideline and validation tool for product design. To support the design process of a product, a virtual environment and VDP method were developed that support validation and an initial design cycle performed by a designer. A car carrier product model is used as the illustration for which the virtual design was generated. The loading and unloading sequence of the model for the prototype was generated using automated reasoning techniques and was completed by interactively animating the product in the virtual environment before the complete design was built. Using the VDP, critical analyses such as loading, unloading, Australian Design Rules (ADR) compliance and clearance were carried out. The process saves the time and money of physical sampling and, to a large extent, of complete math-model generation. Since only schematic models are required, it also saves time in math modelling and in handling larger assemblies of high complexity. This extension of the VDP to design evaluation is unique and was developed and implemented successfully. In this paper, a Toll Logistics and J Smith and Sons car carrier, developed under the author’s responsibility, is used to illustrate our approach of generating design validation via the VDP.
Abstract:
Measuring quality attributes of object-oriented designs (e.g. maintainability and performance) has been covered by a number of studies. However, these studies have not considered security as much as other quality attributes. Also, most security studies focus on the level of individual program statements. This approach makes it hard and expensive to discover and fix vulnerabilities caused by design errors. In this work, we focus on the security design of an object-oriented application and define a number of security metrics. These metrics allow designers to discover and fix security vulnerabilities at an early stage, and help compare the security of various alternative designs. In particular, we propose seven security metrics to measure Data Encapsulation (accessibility) and Cohesion (interactions) of a given object-oriented class from the point of view of potential information flow.
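As a toy illustration of this kind of design-level metric (a simplified stand-in, not one of the paper's seven definitions), the sketch below scores a class design by the fraction of its attributes that are accessible from outside the class and so widen the potential information flow:

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    visibility: str  # "private", "protected", or "public"

def attribute_accessibility(attributes):
    """Fraction of attributes accessible outside the class; lower values
    indicate stronger data encapsulation in the design."""
    if not attributes:
        return 0.0
    exposed = sum(1 for a in attributes if a.visibility != "private")
    return exposed / len(attributes)

# Hypothetical class design for a bank account
account = [
    Attribute("balance", "private"),
    Attribute("owner", "public"),
    Attribute("pin", "private"),
    Attribute("branch", "protected"),
]
print(attribute_accessibility(account))  # 0.5: two of four attributes exposed
```

Because such a metric needs only the class design (attribute names and visibilities), it can be computed and compared across alternative designs before any method bodies are written, which is the early-stage benefit the abstract argues for.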