14 results for automated software testing

in Aston University Research Archive

Relevance:

80.00%

Publisher:

Abstract:

In earlier work we proposed the idea of requirements-aware systems that could introspect about the extent to which their goals were being satisfied at runtime. When combined with requirements monitoring and self-adaptive capabilities, requirements awareness should help optimize goal satisfaction even in the presence of changing runtime context. In this paper we describe initial progress towards the realization of requirements-aware systems with REAssuRE. REAssuRE focuses on the explicit representation of assumptions made at design time. When such assumptions are shown not to hold, REAssuRE can trigger system adaptations to alternative goal realization strategies.
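
REAssuRE's implementation is not described in this abstract, so the following is only a minimal, hypothetical Python sketch of the general mechanism it outlines: design-time assumptions are re-checked against observed runtime context, and a violated assumption triggers a switch to an alternative goal-realisation strategy. All names (Assumption, RequirementsAwareSystem, rtt_ms, cache_locally) are invented for illustration.

```python
# Hypothetical sketch of runtime assumption monitoring driving adaptation.
# None of these names come from REAssuRE itself; they only illustrate the idea.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Assumption:
    name: str
    holds: Callable[[dict], bool]   # checked against observed runtime context

class RequirementsAwareSystem:
    def __init__(self, assumptions, strategies, default="primary"):
        self.assumptions = assumptions    # design-time assumptions
        self.strategies = strategies      # alternative goal-realisation strategies
        self.active = default

    def monitor(self, context: dict) -> str:
        """Re-check assumptions against runtime context; adapt if one fails."""
        for a in self.assumptions:
            if not a.holds(context):
                # assumption violated -> switch to an alternative strategy
                self.active = self.strategies[a.name]
                break
        return self.active

system = RequirementsAwareSystem(
    assumptions=[Assumption("low_latency_network",
                            lambda ctx: ctx.get("rtt_ms", 0) < 100)],
    strategies={"low_latency_network": "cache_locally"},
)
print(system.monitor({"rtt_ms": 250}))  # -> "cache_locally"
```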

Relevance:

80.00%

Publisher:

Abstract:

The primary aim of this thesis was to investigate the in vivo ocular morphological and contractile changes occurring within the accommodative apparatus prior to the onset of presbyopia, with particular reference to ciliary muscle changes with age and the origin of a myopic shift in refraction during incipient presbyopia. Commissioned semi-automated software proved capable of extracting accurate and repeatable measurements from crystalline lens and ciliary muscle Anterior Segment Optical Coherence Tomography (AS-OCT) images and reduced the subjectivity of AS-OCT image analysis. AS-OCT was utilised to document longitudinal changes in ciliary muscle morphology within an incipient presbyopic population (n=51). A significant antero-inwards shift of ciliary muscle mass was observed after 2.5 years. Furthermore, in a subgroup study (n=20), an accommodative antero-inwards movement of ciliary muscle mass was evident. After 2.5 years, the centripetal response of the ciliary muscle significantly attenuated during accommodation, whereas the antero-posterior mobility of the ciliary muscle remained invariant. Additionally, longitudinal measurement of ocular biometry revealed a significant increase in crystalline lens thickness and a corresponding decrease in anterior chamber depth after 2.5 years (n=51). Lenticular changes appear to be a determinant of the changes in refraction during incipient presbyopia. During accommodation, a significant increase in crystalline lens thickness and axial length was observed, whereas anterior chamber depth decreased (n=20). The change in ocular biometry per dioptre of accommodation exerted remained invariant after 2.5 years. Cross-sectional ocular biometric data were collected to quantify accommodative axial length changes from early adulthood to advanced presbyopia (n=72). Accommodative axial length elongation significantly attenuated during presbyopia, which was consistent with a significant increase in ocular rigidity during presbyopia. The studies presented in this thesis support the Helmholtz theory of accommodation and, despite the reduction in centripetal ciliary muscle contractile response with age, primarily implicate lenticular changes in the development of presbyopia.

Relevance:

80.00%

Publisher:

Abstract:

Background: Summarised retinal vessel diameters are linked to systemic vascular pathology. Monochromatic images provide the best contrast for measuring vessel calibres. However, when obtaining images with a dual-wavelength oximeter, the red-free image can be extracted from the green channel information alone, which in turn will reduce the number of photographs taken at a given time. This will reduce patient exposure to the camera flash and could provide images of sufficient quality to reliably measure vessel calibres. Methods: We obtained retinal images of one eye of 45 healthy participants. Central retinal arteriolar and central retinal venular equivalents (CRAE and CRVE, respectively) were measured using semi-automated software from two monochromatic images: one taken with a red-free filter and one extracted from the green channel of a dual-wavelength oximetry image. Results: Participants were aged between 21 and 62 years; all were normotensive (SBP: 115 (12) mmHg; DBP: 72 (10) mmHg) and had normal intra-ocular pressures (12 (3) mmHg). Bland-Altman analysis revealed good agreement of CRAE and CRVE as obtained from both images (mean bias CRAE = 0.88; CRVE = 2.82). Conclusions: Summarised retinal vessel calibre measurements obtained from oximetry images are in good agreement with those obtained using red-free photographs.
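
As a rough illustration of the image-handling step described above (obtaining a red-free-equivalent monochrome image from the green channel of a colour retinal photograph), here is a minimal sketch using Pillow and NumPy; the file names are placeholders, and none of the oximetry-specific or vessel-measurement processing is shown.

```python
# Minimal sketch: extract the green channel of an RGB fundus image as a
# red-free-equivalent monochrome image (file names are placeholders).
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("fundus_oximetry.png").convert("RGB"))
green = rgb[:, :, 1]                       # channel order is R, G, B
Image.fromarray(green).save("red_free_equivalent.png")
```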

Relevance:

40.00%

Publisher:

Abstract:

Many software engineers have found that it is difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect the knowledge about formal software models and other related documents using semantic technology. We first propose a methodology, with tool support, to automatically derive ontological metadata from formal software models and semantically describe them. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. A method with a prototype tool is presented to enhance semantic querying of software models and other artefacts. © 2014.
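
The paper's own toolchain is not reproduced here; the snippet below is only a hypothetical rdflib sketch of what "ontological metadata derived from a formal model" could look like: a Z schema (the classic BirthdayBook example) and its declarations published as RDF triples so they can be queried and linked to other artefacts. The vocabulary (fm:Schema, fm:declares, fm:tracedTo) is invented for illustration.

```python
# Hypothetical sketch: describe a Z schema as RDF so it can be shared/queried.
# The vocabulary (fm:Schema, fm:declares, fm:tracedTo) is invented.
from rdflib import Graph, Namespace, Literal, RDF

FM = Namespace("http://example.org/formal-models#")
g = Graph()
g.bind("fm", FM)

schema = FM["BirthdayBook"]                  # a Z schema from the specification
g.add((schema, RDF.type, FM.Schema))
g.add((schema, FM.declares, Literal("known")))
g.add((schema, FM.declares, Literal("birthday")))
g.add((schema, FM.tracedTo, Literal("requirements document, section 3.2")))

print(g.serialize(format="turtle"))
```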

Relevance:

30.00%

Publisher:

Abstract:

While the retrieval of existing designs to prevent unnecessary duplication of parts is a recognised strategy in the control of design costs, the available techniques to achieve this, even in product data management systems, are limited in performance or require large resources. A novel system has been developed based on a new version of an existing coding system (CAMAC) that allows automatic coding of engineering drawings and their subsequent retrieval using a drawing of the desired component as the input. The ability to find designs using a detail drawing rather than textual descriptions is a significant achievement in itself. Previous testing of the system has demonstrated this capability, but if a means could be found to find parts from a simple sketch, its practical application would be much more effective. This paper describes the development and testing of such a search capability using a database of over 3000 engineering components.
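
CAMAC's coding scheme itself is not described in the abstract, so the toy sketch below stands in for (and is not) that approach: it merely illustrates retrieval-by-example in general, reducing each drawing to a coarse binary occupancy grid and matching a query drawing by smallest Hamming distance.

```python
# Toy illustration of retrieval-by-example (NOT the CAMAC coding scheme):
# reduce each drawing to a coarse occupancy grid and match a query drawing
# to the stored entry at the smallest Hamming distance.
import numpy as np

def signature(image: np.ndarray, grid=(16, 16)) -> np.ndarray:
    """Downsample a binary drawing to a coarse occupancy grid."""
    gy, gx = grid
    h, w = image.shape
    cropped = image[: h - h % gy, : w - w % gx]
    blocks = cropped.reshape(gy, cropped.shape[0] // gy, gx, cropped.shape[1] // gx)
    return (blocks.mean(axis=(1, 3)) > 0.1).ravel()

def retrieve(query: np.ndarray, signatures: dict) -> str:
    """Return the name of the stored drawing closest to the query."""
    q = signature(query)
    return min(signatures, key=lambda name: np.count_nonzero(q != signatures[name]))

# Two synthetic "drawings" on blank 256x256 canvases.
bracket = np.zeros((256, 256), dtype=bool)
bracket[20:120, 20:40] = True
bracket[100:120, 20:200] = True
shaft = np.zeros((256, 256), dtype=bool)
shaft[120:136, 10:246] = True

signatures = {"bracket": signature(bracket), "shaft": signature(shaft)}
print(retrieve(shaft, signatures))   # -> "shaft"
```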

Relevance:

30.00%

Publisher:

Abstract:

Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve high-reliability software, methods should be adopted which avoid introducing faults (fault avoidance); then testing should be carried out to identify any faults which persist (error removal). Finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in the system design specification and performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysis of the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language where interprocess interaction takes place by communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods, such as Petri nets, can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential', which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples are presented to show how the tool works in the early design phase for fault prevention, before the program is ever run.
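
The thesis tool translates Occam programs into Petri nets before analysing them; as a stand-alone illustration of the second half of that pipeline only, here is a minimal place/transition-net reachability search in plain Python that flags dead markings (markings in which no transition is enabled). The toy net and its place names are invented; this is not the tool described above.

```python
# Minimal sketch of Petri-net reachability analysis that flags dead markings
# (markings in which no transition is enabled). Generic illustration only;
# the net below is a toy model of two processes claiming two channels in
# opposite order -- the classic recipe for deadlock.
from collections import deque

# transition name -> (tokens consumed per place, tokens produced per place)
transitions = {
    "p_takes_a": ({"p_idle": 1, "a": 1}, {"p_has_a": 1}),
    "p_takes_b": ({"p_has_a": 1, "b": 1}, {"p_done": 1, "a": 1, "b": 1}),
    "q_takes_b": ({"q_idle": 1, "b": 1}, {"q_has_b": 1}),
    "q_takes_a": ({"q_has_b": 1, "a": 1}, {"q_done": 1, "a": 1, "b": 1}),
}
initial = {"p_idle": 1, "q_idle": 1, "a": 1, "b": 1}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n}

def deadlocks(initial):
    """Breadth-first reachability search collecting dead, unfinished markings."""
    seen, queue, dead = set(), deque([initial]), []
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        firable = [(pre, post) for pre, post in transitions.values() if enabled(m, pre)]
        if not firable and not ("p_done" in m and "q_done" in m):
            dead.append(m)                      # stuck before both processes finish
        for pre, post in firable:
            queue.append(fire(m, pre, post))
    return dead

print(deadlocks(initial))   # -> [{'p_has_a': 1, 'q_has_b': 1}]
```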

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates how existing software engineering techniques can be employed, adapted and integrated for the development of systems of systems. Starting from existing system-of-systems (SoS) studies, we identify computing paradigms and techniques that have the potential to help address the challenges associated with SoS development, and propose an SoS development framework that combines these techniques in a novel way. This framework addresses the development of a class of IT systems of systems characterised by high variability in the types of interactions between their component systems, and by relatively small numbers of such interactions. We describe how the framework supports the dynamic, automated generation of the system interfaces required to achieve these interactions, and present a case study illustrating the development of a data-centre SoS using the new framework.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this research is to design and build a groupware system which will allow members of a distributed group more flexibility in performing software inspection. Software inspection, which is part of non-execution-based testing in software development, is a group activity. The groupware system aims to improve the acceptability of groupware and to improve software quality by providing a software inspection tool that is flexible and adaptable. The groupware system provides a flexible structure for software inspection meetings. It extends the structure of the software inspection meeting itself, allowing software inspection meetings to use all four quadrants of the space-time matrix: face-to-face, distributed synchronous, distributed asynchronous, and same place-different time. This opens up new working possibilities. The flexibility and adaptability of the system allow work to switch rapidly between synchronous and asynchronous interaction. A model for a flexible groupware system was developed, based on a review of the literature and questionnaires. A prototype based on the model was built using Java and WWW technology. To test the effectiveness of the system, an evaluation was conducted, with questionnaires used to gather responses from the users. The evaluation ascertained that the model developed is flexible and adaptable to the different working modes, and that the system is capable of supporting several different models of the software inspection process.

Relevance:

30.00%

Publisher:

Abstract:

A small lathe has been modified to work under microprocessor control to enhance the facilities which the lathe offers and to provide a wider operating range with relevant economic gains. The result of these modifications is better operating system characteristics. A system of electronic circuits has been developed, utilising the latest technology, to replace the pegboard and its associated obsolete electrical components. Software for the system includes control programmes for the implementation of the original pegboard operation, and several sample machine-code programmes are included, covering a wide spectrum of applications, including diagnostic testing of the control system. It is concluded that it is possible to carry out a low-cost retrofit on existing machine tools to enhance their range of capabilities.

Relevance:

30.00%

Publisher:

Abstract:

DEA literature continues apace but software has lagged behind. This session uses suitably selected data to present newly developed software which includes many of the most recent DEA models. The software enables the user to address a variety of issues not frequently found in existing DEA software, such as:
- Assessments under a variety of possible assumptions of returns to scale, including NIRS and NDRS
- Scale elasticity computations
- Numerous input/output variables and a truly unlimited number of assessment units (DMUs)
- Panel data analysis
- Analysis of categorical data (multiple categories)
- Malmquist Index and its decompositions
- Computation of super-efficiency
- Automated removal of super-efficient outliers under user-specified criteria
- Graphical presentation of results
- Integrated statistical tests
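
For context on what such software computes, the efficiency score of a single DMU under the most basic DEA model (input-oriented CCR, not one of the newer models listed above) is the solution of a small linear program; the sketch below uses scipy.optimize.linprog on made-up data.

```python
# Minimal sketch: input-oriented CCR DEA efficiency via linear programming
# (made-up data; the basic model only, not the newer models listed above).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0, 4.0],      # inputs  (rows: inputs, cols: DMUs)
              [5.0, 4.0, 8.0, 3.0]])
Y = np.array([[10.0, 12.0, 11.0, 9.0]])  # outputs (rows: outputs, cols: DMUs)
n = X.shape[1]

def ccr_efficiency(o: int) -> float:
    """Efficiency of DMU o: min theta s.t. X@lam <= theta*X[:,o], Y@lam >= Y[:,o]."""
    c = np.r_[1.0, np.zeros(n)]                         # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                   # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # -Y@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```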

Relevance:

30.00%

Publisher:

Abstract:

The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
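
PolyPaver's polynomial-approximation algorithm is not reproduced here; purely to give a flavour of the kind of specification discussed (an inclusion between interval expressions involving an integral), here is a toy Python check that a Riemann-sum enclosure of the integral of x^2 over [0, 1] contains the exact value 1/3 and lies inside a stated tolerance interval.

```python
# Toy illustration of an interval-style check (not PolyPaver's algorithm):
# enclose the integral of x^2 over [0, 1] between lower and upper Riemann
# sums and verify the enclosure satisfies an inclusion-style specification.
from fractions import Fraction

def riemann_enclosure(f, a, b, n):
    """Enclosure of the integral of a monotone increasing f on [a, b]."""
    h = Fraction(b - a, n)
    xs = [a + i * h for i in range(n + 1)]
    lower = sum(f(x) * h for x in xs[:-1])   # left sums underestimate
    upper = sum(f(x) * h for x in xs[1:])    # right sums overestimate
    return lower, upper

lo, hi = riemann_enclosure(lambda x: x * x, 0, 1, 1000)
exact = Fraction(1, 3)

# Inclusion-style specification: the enclosure contains the exact integral
# and is itself contained in [1/3 - 1/1000, 1/3 + 1/1000].
assert lo <= exact <= hi
assert exact - Fraction(1, 1000) <= lo and hi <= exact + Fraction(1, 1000)
print(float(lo), float(hi))
```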

Relevance:

30.00%

Publisher:

Abstract:

Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique. Some were successful in detecting field defects comparable to those found with the standard SAP visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield Intersector analysis protocol to the standard HFA test when there is suspicious glaucomatous visual field loss. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field HFA 24-2 tests, optical coherence tomography of the optic nerve head, and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the Hemifield Sector Analysis (HSA) protocol. Retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield Intersector analysis of mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the 3 groups (ANOVA p<0.001 with a 95% CI). The differences between superior and inferior hemifields were statistically significant in 11/11 sectors in the glaucoma patient group (t-test p<0.001), partially significant (5/11 sectors) in the glaucoma suspect group (t-test p<0.01), and not significant for most sectors in the normal group (only 1/11 was significant) (t-test p<0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, while for glaucoma suspects they were 89% and 79%. The use of SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol in addition to SAP analysis can provide information about focal visual field differences across the horizontal midline and confirm suspicious field defects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous field loss. The Intersector analysis protocol can detect early field changes not detected by the standard HFA test.

Relevance:

30.00%

Publisher:

Abstract:

Lowering glucose levels, while avoiding hypoglycaemia, can be challenging in insulin-treated patients with diabetes. We evaluated the role of the ambulatory glucose profile in optimising glycaemic control in this population. Insulin-treated patients with type 1 and type 2 diabetes were recruited into a prospective, multicentre, 100-day study and randomised to control (n = 28) or intervention (n = 59) groups. The intervention group used the ambulatory glucose profile, generated by continuous glucose monitoring, to assess daily glucose levels, whereas the controls relied on capillary glucose testing. Patients were reviewed at days 30 and 45 by the health care professional to adjust insulin therapy. Comparing the first and last 2 weeks of the study, ambulatory glucose profile-monitored type 2 diabetes patients (n = 28) showed increased time in euglycaemia (mean ± standard deviation) by 1.4 ± 3.5 h/day (p = 0.0427), associated with a reduction in HbA1c from 77 ± 15 to 67 ± 13 mmol/mol (p = 0.0002) without increased hypoglycaemia. Type 1 diabetes patients (n = 25) showed a reduction in hypoglycaemia from 1.4 ± 1.7 to 0.8 ± 0.8 h/day (p = 0.0472), associated with a marginal HbA1c decrease from 75 ± 10 to 72 ± 8 mmol/mol (p = 0.0508). Largely similar findings were observed when comparing the intervention and control groups at the end of the study. In conclusion, the ambulatory glucose profile helps glycaemic management in insulin-treated diabetes patients by increasing time spent in euglycaemia and decreasing HbA1c in type 2 diabetes patients, while reducing hypoglycaemia in type 1 diabetes patients.

Relevance:

30.00%

Publisher:

Abstract:

Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using semantic web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
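
The paper's own ontologies are not reproduced here; purely as a flavour of the approach (architecture vocabulary as OWL classes and properties, with a concrete configuration as instances), here is a small rdflib sketch. The class and property names are invented, and real consistency checking would be delegated to an OWL reasoner rather than performed by this script.

```python
# Hypothetical sketch: a tiny architecture ontology (classes + a property)
# and one concrete configuration as instances. Vocabulary is invented;
# consistency checking would be delegated to an external OWL reasoner.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

ARCH = Namespace("http://example.org/architecture#")
g = Graph()
g.bind("arch", ARCH)

# Terminology: component/connector classes and a communication property.
for cls in (ARCH.Component, ARCH.Connector):
    g.add((cls, RDF.type, OWL.Class))
g.add((ARCH.connectedTo, RDF.type, OWL.ObjectProperty))
g.add((ARCH.connectedTo, RDFS.domain, ARCH.Component))
g.add((ARCH.connectedTo, RDFS.range, ARCH.Component))

# One concrete configuration: a client-server pair.
g.add((ARCH.Client, RDF.type, ARCH.Component))
g.add((ARCH.Server, RDF.type, ARCH.Component))
g.add((ARCH.Client, ARCH.connectedTo, ARCH.Server))

print(g.serialize(format="turtle"))
```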