20 results for Technology software
in Aston University Research Archive
Abstract:
Corpus Linguistics is a young discipline. The earliest work was done in the 1960s, but corpora only began to be widely used by lexicographers and linguists in the late 1980s, by language teachers in the late 1990s, and by language students only very recently. This course in corpus linguistics was held at the Departamento de Linguistica Aplicada, E.T.S.I. de Minas, Universidad Politecnica de Madrid, from 15 to 19 June 1998. About 45 teachers registered for the course: 30% held PhDs in linguistics, 20% in literature, and the rest were doctoral candidates or qualified English teachers. The course was designed to introduce the use of corpora and other computational resources in teaching and research, with special reference to scientific and technological discourse in English. Each participant had a computer networked with the lecturer's machine, whose display could be projected onto a large screen. Application programs were loaded onto the central server, and telnet and a web browser were available. COBUILD gave us permission to access the 323-million-word Bank of English corpus, Mike Scott allowed us to use his WordSmith Tools software, and Tim Johns gave us a copy of his MicroConcord program.
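To illustrate the kind of concordancing such tools perform, the following is a minimal key-word-in-context (KWIC) display of the sort produced by MicroConcord or WordSmith Tools; this Python sketch and its toy corpus are purely illustrative and not part of the original course materials:

    # Illustrative key-word-in-context (KWIC) concordance, the core display
    # produced by tools such as MicroConcord and WordSmith Tools. The toy
    # corpus and node word below are invented for demonstration.
    def kwic(text, node, width=30):
        tokens = text.split()
        for i, token in enumerate(tokens):
            if token.lower().strip('.,;:') == node.lower():
                left = ' '.join(tokens[max(0, i - 6):i])
                right = ' '.join(tokens[i + 1:i + 7])
                print(f"{left[-width:]:>{width}}  {token}  {right[:width]}")

    corpus = ("The software was installed on the central server. "
              "Application software was loaded and each participant "
              "could run the software from a networked computer.")
    kwic(corpus, "software")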
Abstract:
The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the distribution of historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to model grid size, contaminant loading rates, length of stress periods, and the historical frequency of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented as tables, graphs and spatial maps. Varying the model grid size indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with grid size. The contaminant plume also migrates faster on coarse grids than on finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases non-linearly with the frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a major strength of the method and a significant advantage over contemporary risk and vulnerability methods.
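The cell-level sampling rule and the exceedance count described above can be sketched as follows; this Python fragment is a minimal illustration under invented parameters (PROB_POLLUTION, THRESHOLD, the stand-in transport function), not the actual MODFLOW-2000/MT3DMS coupling used in the thesis:

    # Minimal sketch of the Monte Carlo sampling and risk count described
    # above. All names and numbers (PROB_POLLUTION, THRESHOLD, the stand-in
    # transport function) are invented; the thesis couples this logic to
    # MODFLOW-2000 and MT3DMS, which is not reproduced here.
    import random

    N_REALISATIONS = 1000        # repeated realisations
    N_CELLS = 100                # active model cells
    PROB_POLLUTION = 0.05        # probability of a pollution event per cell
    THRESHOLD = 300.0            # user-defined concentration magnitude

    def simulate_concentration(source_terms):
        """Stand-in for the transport model: concentration at a
        monitoring point given the generated source terms."""
        return 50.0 + 4000.0 * sum(source_terms) / len(source_terms)

    exceedances = 0
    for _ in range(N_REALISATIONS):
        # At each active cell, a random number below the occurrence
        # probability triggers a synthetic contaminant source term.
        terms = [1.0 if random.random() < PROB_POLLUTION else 0.0
                 for _cell in range(N_CELLS)]
        if simulate_concentration(terms) > THRESHOLD:
            exceedances += 1

    # Risk is quantified as the frequency with which the user-defined
    # concentration magnitude is exceeded across realisations.
    print(f"risk = {exceedances / N_REALISATIONS:.3f}")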
Abstract:
Purpose - The main objective of the paper is to develop a risk management framework for software development projects from the developers' perspective. Design/methodology/approach - This study uses a combined qualitative and quantitative technique, with the active involvement of stakeholders, to identify, analyze and respond to risks. The entire methodology is explained through a case study of a software development project in a public sector organization in Barbados. Findings - An analytical approach to managing risk in software development ensures effective delivery of projects to clients. Research limitations/implications - The proposed risk management framework has been applied to a single case. Practical implications - Software development projects are characterized by technical complexity, market and financial uncertainties, and the availability of competent manpower. Successful project accomplishment therefore depends on addressing those issues throughout the project phases, and effective risk management ensures the success of projects. Originality/value - There are several studies on managing risks in software development and information technology (IT) projects. Most identify and prioritize risks through empirical research in order to suggest mitigating measures. Although these are important to clients for future projects, they fail to provide any framework for risk management from the software developers' perspective. The few studies that do introduce a risk management framework for software development mostly present it from the clients' perspective, and very little effort has been made to integrate it with the software development cycle. As software developers absorb a considerable amount of risk, an integrated framework for managing risks in software development from the developers' perspective is needed. © Emerald Group Publishing Limited.
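As a minimal illustration of the analytical half of such a framework, the sketch below ranks identified risks by exposure (probability times impact); the risk items and scores are invented, and the paper's stakeholder-driven qualitative identification and response steps are not shown:

    # Illustrative sketch of the quantitative half of a risk-analysis step:
    # ranking identified risks by exposure (probability x impact). The risk
    # items and scores are invented; the paper's stakeholder-driven
    # qualitative identification and response steps are not shown.
    risks = [
        {"risk": "requirements volatility", "probability": 0.6, "impact": 8},
        {"risk": "key developer attrition", "probability": 0.3, "impact": 9},
        {"risk": "third-party API changes", "probability": 0.4, "impact": 5},
    ]
    for r in risks:
        r["exposure"] = r["probability"] * r["impact"]

    # Respond to the highest-exposure risks first.
    for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
        print(f"{r['risk']:28s} exposure = {r['exposure']:.1f}")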
Abstract:
Expert systems, and artificial intelligence more generally, can provide a useful means of representing decision-making processes. By linking expert systems software to simulation software, an effective means of including these decision-making processes in a simulation model can be achieved. This paper demonstrates how a commercial off-the-shelf simulation package (Witness) can be linked to an expert systems package (XpertRule) through a Visual Basic interface. The methodology adopted could be used for models, and possibly software, other than those presented here.
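The general pattern, a simulation consulting a rule-based expert system at each decision point, can be sketched as follows; this Python stand-in does not use the actual Witness or XpertRule interfaces, and its rules and state variables are invented:

    # Sketch of the general pattern described above: a simulation loop
    # delegating decision points to a rule-based "expert system". The paper
    # links Witness to XpertRule through a Visual Basic interface; this
    # Python stand-in, its rules and its state variables are all invented.
    def route_job(state):
        """Toy rule base deciding which machine receives the next job."""
        if state["queue_a"] > 5 and state["machine_b_up"]:
            return "machine_b"   # rule: relieve a long queue when B is up
        return "machine_a"       # default rule

    state = {"queue_a": 7, "machine_b_up": True}
    for step in range(3):
        decision = route_job(state)   # simulation consults the rule base
        print(f"step {step}: send job to {decision}")
        state["queue_a"] = max(0, state["queue_a"] - 1)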
Abstract:
Purpose - To consider the role of technology in knowledge management in organizations, both actual and desired. Design/methodology/approach - Facilitated, computer-supported group workshops were conducted with 78 people from ten different organizations. The objective of each workshop was to review the current state of knowledge management in that organization and develop an action plan for the future. Findings - Only three organizations had adopted a strongly technology-based "solution" to knowledge management problems, and these followed three substantially different routes. There was a clear emphasis on the use of general information technology tools to support knowledge management activities, rather than tools specific to knowledge management. Research limitations/implications - Further research is needed to help organizations make the best use of generally available software, such as intranets and e-mail, for knowledge management. Many issues, especially human ones, relate to the implementation of any technology. Participation was restricted to organizations that wished to produce an action plan for knowledge management, so the findings may represent only "average" organizations rather than the very best practice. Practical implications - Each organization must resolve four tensions: between the quantity and quality of information/knowledge, between centralized and decentralized organization, between head office and organizational knowledge, and between "push" and "pull" processes. Originality/value - Although it is the group rather than the individual that determines what counts as knowledge, hardly any previous studies of knowledge management have collected data in a group context.
Abstract:
Using current software engineering technology, the robustness required of safety-critical software is not assurable. However, different approaches can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (error removal); and finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in system design specification, and performance analysis of the model, are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error-prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language in which interprocess interaction takes place through communication. This may lead to deadlock due to communication failure. Systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies, such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods such as Petri-nets can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri-net), which is used for modelling and analysing the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential', which the user can explore further. Finally, the tool has been applied to a number of Occam programs; two examples show how it works in the early design phase for fault prevention, before the program is ever run.
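The second stage, reachability analysis over the generated Petri-net, can be illustrated with the following minimal Python sketch; the toy two-process net is invented, not a translated Occam program:

    # Minimal sketch of the tool's second stage: building the reachability
    # tree of a Petri-net and flagging markings with no enabled transition.
    # The toy two-process net below is invented, not a translated Occam
    # program; the user judges whether a flagged marking is a genuine
    # deadlock or normal termination.
    from collections import deque

    # Each transition: (preconditions, postconditions) over place markings.
    transitions = {
        "p1_send": ({"p1_ready": 1, "chan_free": 1}, {"p1_wait": 1}),
        "p2_recv": ({"p2_ready": 1, "p1_wait": 1},
                    {"p2_done": 1, "chan_free": 1}),
    }

    def enabled(marking, pre):
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    initial = {"p1_ready": 1, "p2_ready": 1, "chan_free": 1}
    seen, frontier = set(), deque([initial])
    while frontier:
        marking = frontier.popleft()
        key = tuple(sorted(marking.items()))
        if key in seen:
            continue
        seen.add(key)
        fireable = [t for t, (pre, _) in transitions.items()
                    if enabled(marking, pre)]
        if not fireable:
            print("deadlock potential:",
                  {p: n for p, n in marking.items() if n})
        for t in fireable:
            pre, post = transitions[t]
            frontier.append(fire(marking, pre, post))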
Abstract:
The objective of this research is to design and build a groupware system which will allow members of a distributed group more flexibility in performing software inspection. Software inspection, which is part of non-execution-based testing in software development, is a group activity. The groupware system aims to improve the acceptability of groupware and to improve software quality by providing a software inspection tool that is flexible and adaptable. It provides a flexible structure for software inspection meetings and extends the structure of the inspection meeting itself, allowing meetings to use all four quadrants of the space-time matrix: face-to-face, distributed synchronous, distributed asynchronous, and same place-different time. This opens up new working possibilities, and the flexibility and adaptability of the system allow work to switch rapidly between synchronous and asynchronous interaction. A model for a flexible groupware system was developed, based on a review of the literature and on questionnaires. A prototype based on the model was built using Java and WWW technology. To test the effectiveness of the system, an evaluation was conducted, with questionnaires used to gather responses from the users. The evaluation ascertained that the model developed is flexible and adaptable to the different working modes, and that the system is capable of supporting several different models of the software inspection process.
Abstract:
The subject of investigation of the present research is the use of smart hydrogels with fibre optic sensor technology. The aim was to develop a cost-effective sensor platform for the detection of water in hydrocarbon media, and of dissolved inorganic analytes, namely potassium, calcium and aluminium. The fibre optic sensors in this work depend upon the use of hydrogels either to entrap chemotropic agents or to respond to external environmental changes by changing their inherent properties, such as refractive index (RI). A review of current fibre optic sensing technology showed that the main principles utilised are the measurement of either signal loss or a change in wavelength of the light transmitted through the system. The signal-loss principle relies on changing the conditions required for total internal reflection to occur. Hydrogels are cross-linked polymer networks that swell but do not dissolve in aqueous environments. Smart hydrogels are synthetic materials that exhibit properties additional to those inherent in their structure. In order to control the non-inherent properties, the hydrogels were fabricated with the addition of chemotropic agents. For the detection of water, hydrogels of low refractive index were synthesized using fluorinated monomers. Sulfonated monomers were used for their extreme hydrophilicity as a means of water sensing through an RI change. To enhance the sensing capability of the hydrogel, chemotropic agents such as pH indicators and cobalt salts were used. The system comprises the smart hydrogel coated onto an exposed section of the fibre optic core, connected to an interrogation system measuring the difference in the signal. The information obtained was analysed using purpose-designed software. The developed sensor platform showed that an increase in the target species caused an increase in the signal lost from the sensor system, allowing detection of the target species. The system has potential applications in areas such as clinical point of care, water detection in fuels, and the detection of dissolved ions in the water industry.
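The signal-loss principle can be made concrete with the textbook critical-angle relation (standard optics, not a formula taken from the thesis):

    % Critical angle for total internal reflection at the interface between
    % the fibre core and the hydrogel coating (requires n_coating < n_core):
    \theta_c = \arcsin\!\left(\frac{n_{\mathrm{coating}}}{n_{\mathrm{core}}}\right)

As the hydrogel responds and its refractive index n_coating rises towards n_core, the critical angle increases, fewer rays remain totally internally reflected, and the measured signal loss grows.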
Abstract:
A small lathe has been modified to work under microprocessor control to enhance the facilities which the lathe offers, providing a wider operating range with attendant economic gains. These modifications give the system better operating characteristics. A system of electronic circuits has been developed, utilising the latest technology, to replace the pegboard and its associated obsolete electrical components. Software for the system includes control programmes implementing the original pegboard operation, and several sample machine-code programmes are included, covering a wide spectrum of applications, including diagnostic testing of the control system. It is concluded that it is possible to carry out a low-cost retrofit on existing machine tools to enhance their range of capabilities.
Abstract:
There has been little research in health and safety management concerning the application of information technology to the field. This thesis attempts to stimulate interest in this area by analysing the value of proprietary health and safety software to proactive health and safety management. The thesis is based upon the detailed software evaluation of seven pieces of proprietary health and safety software. It features a discussion concerning the development of information technology and health and safety management, a review of the key issues identified during the software evaluations, an analysis of the commercial market for this type of software, and a consideration of the broader issues which surround the use of this software. It also includes practical guidance for the evaluation, selection, implementation and maintenance of all health and safety management software. This includes a comprehensive software evaluation chart. The implications of the research are considered for proprietary health and safety software, the application of information technology to health and safety management, and for future research.
Abstract:
Information technology is at the centre of today's business environment. The increasing importance of e-commerce and the integration of information systems into all areas of a business mean it is crucial for managers to understand and implement IS (information systems). This major text, now in its second edition, provides the skills and knowledge necessary to choose the right systems, and to develop and manage them effectively. Business Information Systems: Technology, Development and Management assumes no prior knowledge of IS or IT, and emphasises the importance of IS to management decision making. It takes a three-part structure: Part One covers hardware and software technologies; Part Two looks at information systems analysis and design; and Part Three describes the strategic management of IS. This successful format allows each section to be studied alongside individual modules, and enables students to focus clearly on specific areas and to use the book for more than one course. The book is suitable for college, undergraduate and postgraduate students taking courses with modules in the practical IT skills of selecting, implementing, managing and using BIS. The practical sections are also of use to managers in industry involved in the development and use of IS.
Abstract:
The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, it is crucial to create proper semantic metadata for formal software models and their related software artefacts. In this paper, a methodology with tool support is proposed to automatically derive ontological metadata from formal software models and to describe them semantically.
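A minimal sketch of such a derivation step, using the rdflib library; the namespace, vocabulary and toy input schema below are invented for illustration and are not the paper's actual ontology or tool:

    # Sketch of deriving RDF metadata from a formal model element, in the
    # spirit of the methodology described above. The namespace, vocabulary
    # and toy input schema are invented stand-ins.
    from rdflib import Graph, Literal, Namespace, RDF

    FM = Namespace("http://example.org/formal-models#")
    g = Graph()

    # A toy "formal model" element: a Z-style schema with declared variables.
    schema = {"name": "BirthdayBook", "variables": ["known", "birthday"]}

    node = FM[schema["name"]]
    g.add((node, RDF.type, FM.Schema))
    for var in schema["variables"]:
        g.add((node, FM.declaresVariable, Literal(var)))

    print(g.serialize(format="turtle"))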
Abstract:
The first edition of the workshop Model-driven Software Adaptation (M-ADAPT'07) took place at the Technische Universität Berlin, in conjunction with the International Conference ECOOP'07, in the beautiful and buzzing city of Berlin on 30 July 2007. The workshop was organized by Gordon Blair, Nelly Bencomo, and Robert France. Participants explored how to develop appropriate model-driven approaches to model, analyze, and validate the volatile properties of the behaviour of adaptive systems and their environments. This report gives an overview of the presentations as well as an account of the fruitful discussions that took place at M-ADAPT'07. © 2008 Springer-Verlag Berlin Heidelberg.
Abstract:
Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.
Abstract:
Many software engineers find it difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, it is crucial to create proper semantic metadata for formal software models and their related software artefacts. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology with tool support to automatically derive ontological metadata from formal software models and to describe them semantically. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. A method with a prototype tool is presented to enhance semantic querying of software models and other artefacts. © 2014.
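The semantic-query step can be illustrated with a small SPARQL query over metadata of the kind described above (again via rdflib); the vocabulary and data are invented stand-ins, not the paper's actual Z/OZ ontology:

    # Sketch of the semantic-query step: a SPARQL query over metadata
    # derived from formal models. The vocabulary and data are invented.
    from rdflib import Graph

    g = Graph()
    g.parse(data="""
        @prefix fm: <http://example.org/formal-models#> .
        fm:BirthdayBook a fm:Schema ;
            fm:declaresVariable "known", "birthday" .
        fm:AddBirthday a fm:Operation ;
            fm:refinesSchema fm:BirthdayBook .
    """, format="turtle")

    query = """
        PREFIX fm: <http://example.org/formal-models#>
        SELECT ?op WHERE { ?op a fm:Operation ;
                               fm:refinesSchema fm:BirthdayBook . }
    """
    for row in g.query(query):
        print("operation linked to BirthdayBook:", row.op)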