9 results for Correctness
in Aston University Research Archive
Abstract:
In this thesis we present an approach to automated verification of floating point programs. Existing techniques for automated generation of correctness theorems are extended to produce proof obligations for accuracy guarantees and absence of floating point exceptions. A prototype automated real number theorem prover is presented, demonstrating a novel application of function interval arithmetic in the context of subdivision-based numerical theorem proving. The prototype is tested on correctness theorems for two simple yet nontrivial programs, proving exception freedom and tight accuracy guarantees automatically. The experiments show how function intervals can be used to combat the information loss problems that limit the applicability of traditional interval arithmetic in the context of hard real number theorem proving.
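As a toy illustration of the information-loss problem the abstract refers to (a sketch under assumed definitions, not the thesis prover): plain interval arithmetic treats the two occurrences of x in x - x as independent variables, and subdividing the input domain shrinks the resulting over-approximation.

```python
# Minimal interval type; only subtraction is needed for this illustration.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # Interval subtraction ignores that both operands may be the same
        # variable, so x - x over [0, 1] yields [-1, 1], not [0, 0].
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def width(self):
        return self.hi - self.lo


def width_after_subdivision(lo, hi, pieces):
    """Evaluate f(x) = x - x piecewise; return the worst-case result width."""
    step = (hi - lo) / pieces
    worst = 0.0
    for i in range(pieces):
        x = Interval(lo + i * step, lo + (i + 1) * step)
        worst = max(worst, (x - x).width())
    return worst
```

The over-approximation width scales with the subinterval width, which is why subdivision-based provers can discharge goals that a single whole-domain evaluation cannot.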
Abstract:
Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve high-reliability software, methods should be adopted which avoid introducing faults (fault avoidance); then testing should be carried out to identify any faults which persist (error removal). Finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in system design specification and the performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysis of the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language where interprocess interaction takes place by communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies, such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems.
Where Occam is used as a design language, state space methods such as Petri-nets can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri-net), which is used for modelling and analysis of the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential', which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples are given to show how the tool works in the early design phase for fault prevention before the program is ever run.
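The reachability analysis described above can be sketched as a breadth-first exploration of markings that flags any marking with no enabled transition. This is a minimal illustrative model under an assumed net encoding (places as counters, transitions as pre/post maps), not the tool itself.

```python
from collections import deque

def enabled(marking, pre):
    # A transition is enabled when every pre-place holds enough tokens.
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Consume pre-tokens and produce post-tokens in a fresh marking.
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def deadlocks(initial, transitions):
    """BFS over reachable markings; return those with no enabled transition."""
    seen, dead = set(), []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        succ = [fire(m, pre, post)
                for pre, post in transitions if enabled(m, pre)]
        if not succ:
            dead.append(m)  # 'deadlock potential': nothing can fire here
        queue.extend(succ)
    return dead
```

A real tool would also fold in the translation from Occam channel communications to net transitions; here the net is supplied directly.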
Abstract:
Techniques are developed for the visual interpretation of drainage features from satellite imagery. The process of interpretation is formalised by the introduction of objective criteria. Problems of assessing the accuracy of maps are recognized, and a method is developed for quantifying the correctness of an interpretation, in which the more important features are given an appropriate weight. A study was made of imagery from a variety of landscapes in Britain and overseas, from which maps of drainage networks were drawn. The accuracy of the mapping was assessed in absolute terms, and also in relation to the geomorphic parameters used in hydrologic models. Results are presented relating the accuracy of interpretation to image quality, subjectivity and the effects of topography. It is concluded that the visual interpretation of satellite imagery gives maps of sufficient accuracy for the preliminary assessment of water resources, and for the estimation of geomorphic parameters. An examination is made of the use of remotely sensed data in hydrologic models. It is proposed that the spectral properties of a scene are holistic, and are therefore more efficient than conventional catchment characteristics. Key hydrologic parameters were identified, and were estimated from streamflow records. The correlation between hydrologic variables and spectral characteristics was examined, and regression models for streamflow were developed, based solely on spectral data. Regression models were also developed using conventional catchment characteristics, whose values were estimated using satellite imagery. It was concluded that models based primarily on variables derived from remotely sensed data give results which are as good as, or better than, models using conventional map data. The holistic properties of remotely sensed data are realised only in undeveloped areas. In developed areas an assessment of current land-use is a more useful indication of hydrologic response.
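The regression step can be illustrated with a minimal ordinary-least-squares fit for a single predictor; the function name and data are hypothetical, not taken from the thesis.

```python
def fit_simple_ols(x, y):
    """Least-squares intercept and slope for y ≈ a + b*x (one predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Closed-form OLS: b = cov(x, y) / var(x), a = mean(y) - b * mean(x).
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b
```

A streamflow model of the kind described would use several spectral-band predictors rather than one, but the fitting principle is the same.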
Abstract:
What does it mean to be white and working class in modern Britain? The Joseph Rowntree Foundation’s studies of traditionally white estates in Bradford, London, Coventry and Birmingham are part of a growing body of research into ‘white identities’. This paper:
• identifies common findings from JRF research into traditionally white estates, in the context of other similar work;
• suggests how issues of white identity can be better understood and makes recommendations for policy and practice.
Key points:
• Profound economic and social change has increased isolation and fear in traditionally white estates. Residents often claim that things were better in the past.
• ‘Estatism’ refers to specific social dynamics associated with council estates and prejudice towards residents based on where they live. This can result in lowered self-esteem and reluctance to participate in community campaigns.
• People on traditionally white estates often feel they are not listened to by outside agencies. Consultations can raise hopes but ultimately reinforce disengagement. Initiatives to ensure equality have become associated with political correctness (‘PC’).
• White working-class people feel they are bound by values of hard work, reciprocity and support. They are frustrated by the closure of, and lack of access to, community facilities. The social class system simultaneously disadvantages the working class while giving advantage to other classes.
• There is a strong desire for allocation of resources to be fair, with a widespread perception that minorities are given preference. Blaming incomers for decline is common, with the target of blame differing between sites. Participants did not want to be considered racist and felt that labelling ideas as racist prevents discussion. Similarly, the term ‘PC’ can also be used to shut down debate.
• Recommendations include community-twinning, new ways of accessing local authorities, involvement from the private sector in disadvantaged areas and local panels to define and develop the ‘Big Society’. Initiatives aimed solely at white working-class people are unlikely to be successful.
Abstract:
Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.
Abstract:
Multicast is an efficient approach to saving network bandwidth for multimedia streaming services. To provide Quality of Service (QoS) for multimedia services while maintaining the bandwidth efficiency of multicast, admission control for multicast sessions is needed. Probe-based multicast admission control (PBMAC) schemes are a class of scalable and simple admission control for multicast, and the probing scheme is the essence of PBMAC. In this paper, after a detailed survey of three existing probing schemes, we evaluate these schemes using simulation and analysis in two respects: admission correctness and group scalability. Admission correctness of the schemes is compared through simulation investigation. Analytical models for group scalability are derived and validated by simulation results. The evaluation results illustrate the advantages and weaknesses of each scheme, which should help in choosing a proper probing scheme for a given network.
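A single probe-based admission decision of the general kind surveyed can be sketched as follows; the loss model, parameter names, and threshold are illustrative assumptions, not any of the three schemes evaluated in the paper.

```python
import random

def probe_admission(link_capacity, current_load, session_rate,
                    n_probes=1000, loss_threshold=0.01, rng=None):
    """Admit the session only if the measured probe-loss ratio is low.

    Probes are sent at the session's rate; each probe is dropped with a
    probability equal to the fraction of traffic that would exceed the
    link capacity (a deliberately crude congestion model).
    """
    rng = rng or random.Random(0)  # seeded for reproducibility
    offered = current_load + session_rate
    overload = max(0.0, offered - link_capacity)
    drop_p = overload / offered
    lost = sum(rng.random() < drop_p for _ in range(n_probes))
    return lost / n_probes <= loss_threshold
```

Admission correctness in this setting means the decision matches the true state of the link: admitting when capacity remains, refusing when it does not.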
Abstract:
Research in the present thesis is focused on the norms, strategies, and approaches which translators employ when translating humour in Children's Literature from English into Greek. It is based on process-oriented descriptive translation studies, since the focus is on investigating the process of translation. Viewing translation as a cognitive process and a problem-solving activity, this thesis employs Think-aloud protocols (TAPs) in order to investigate translators' minds. As it is not possible to directly observe the human mind at work, an attempt is made to ask the translators themselves to reveal their mental processes in real time by verbalising their thoughts while carrying out a translation task involving humour. In this study, thirty participants at three different levels of expertise in translation competence, i.e. ten beginner, ten competent, and ten expert translators, were requested to translate two humorous extracts from the fictional diary novel The Secret Diary of Adrian Mole, Aged 13 ¾ by Sue Townsend (1982) from English into Greek. As they translated, they were asked to verbalise their thoughts and reason them, whenever possible, so that their strategies and approaches could be detected, and so that, subsequently, the norms that govern these strategies and approaches could be revealed. The thesis consists of four parts: the introduction, the literature review, the study, and the conclusion, and is developed in eleven chapters. The introduction contextualises the study within translation studies (TS) and presents its rationale, research questions, aims, and significance. Chapters 1 to 7 present an extensive and inclusive literature review identifying the principles and axioms that guide and inform the study.
In these seven chapters the following areas are critically introduced: Children's Literature (Chapter 1), Children's Literature Translation (Chapter 2), Norms in Children's Literature (Chapter 3), Strategies in Children's Literature (Chapter 4), Humour in Children's Literature Translation (Chapter 5), Development of Translation Competence (Chapter 6), and Translation Process Research (Chapter 7). In Chapters 8 to 11 the fieldwork is described in detail. The pilot and the main study are described with reference to the environments and settings, the participants, the researcher-observer, the data and its analysis, and the limitations of the study. The findings of the study are presented and analysed in Chapter 9. Three models are then suggested for systematising translators' norms, strategies, and approaches, thus filling the existing gap in the field. Pedagogical norms (e.g. appropriateness/correctness, familiarity, simplicity, comprehensibility, and toning down), literary norms (e.g. sound of language and fluency), and source-text norms (e.g. equivalence) were revealed to be the most prominent general and specific norms governing the translators' strategies and approaches in the process of translating humour in ChL. The data also revealed that monitoring and communication strategies (e.g. additions, omissions, and exoticism) were the prevalent strategies employed by translators. In Chapter 10 the main findings and outcomes, along with potential secondary benefits (beneficial outcomes), are discussed on the basis of the research questions and aims of the study, and the implications of the study are tackled in Chapter 11. In the conclusion, suggestions for future directions are given and final remarks noted.
Abstract:
Metrology processes contribute to entire manufacturing systems and can have a considerable impact on the return from financial investment in coordinate measuring systems. However, there is a lack of generic methodologies to quantify their economic value in today’s industry. To address this problem, a mathematical model is proposed in this paper by statistical deductive reasoning. This is done by defining the relationships between the Process Capability Index, measurement uncertainty and the tolerance band. The correctness of the mathematical model is demonstrated by a case study. Finally, several comments and suggestions on evaluating and maximizing the benefits of metrology investment are given.
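One commonly cited relationship of this kind, shown here as an assumed illustration rather than the paper's exact model, is that measurement uncertainty u inflates the observed process spread and so deflates the observed capability index relative to the true one:

```python
import math

# Assumed relations (illustrative, not necessarily the paper's model):
#   true capability:     Cp_true = T / (6 * sigma_process)
#   observed capability: Cp_obs  = T / (6 * sqrt(sigma_process**2 + u**2))
# where T is the tolerance band, sigma_process the true process standard
# deviation, and u the standard measurement uncertainty.

def observed_cp(tolerance_band, sigma_process, u):
    """Capability index as seen through a measurement with uncertainty u."""
    return tolerance_band / (6.0 * math.sqrt(sigma_process**2 + u**2))
```

Under these relations, halving u recovers part of the apparent capability, which is one way to express the economic value of better metrology.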
Abstract:
Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using semantic web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
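The flavour of such a structural consistency check can be sketched in plain Python (assumed names and data layout; the paper itself delegates this kind of check to OWL/SWRL reasoners rather than hand-written code): a configuration conforms only if every connector joins two declared components.

```python
def consistent(configuration):
    """Check one structural constraint: connectors join declared components.

    `configuration` is a dict with a list of component names under
    "components" and (source, destination) pairs under "connectors".
    """
    components = set(configuration["components"])
    return all(src in components and dst in components
               for src, dst in configuration["connectors"])
```

An ontology reasoner generalises this: the constraint is stated once in OWL/SWRL, and every instance configuration is checked against it automatically.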