Abstract:
The aim of this research was to investigate the integration of computer-aided drafting and finite-element analysis in a linked computer-aided design procedure and to develop the necessary software. The Bézier surface patch was used for surface representation to bridge the gap between the rather separate fields of drafting and finite-element analysis, because such surfaces are defined by analytical functions which allow systematic and controlled variation of shape and provide continuous derivatives up to any required degree. The objectives of this research were achieved by establishing: (i) a package which interprets the engineering drawings of plate and shell structures and prepares the Bézier net necessary for surface representation; (ii) a general-purpose stand-alone meshed-surface modelling package for surface representation of plates and shells using the Bézier surface patch technique; (iii) a translator which adapts the geometric description of plate and shell structures, as given by the meshed-surface modeller, to the form needed by the finite-element analysis package. The translator was extended to suit fan impellers by taking advantage of their sectorial symmetry. The linking processes were carried out for simple test structures and for simplified and actual fan impellers to verify the flexibility and usefulness of the linking technique adopted. Finite-element results for thin plate and shell structures showed excellent agreement with those obtained by other investigators, while results for the simplified and actual fan impellers also showed good agreement with those obtained in an earlier investigation in which the finite-element analysis input data were prepared manually. Some extensions of this work are also discussed.
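To make the surface representation concrete, the sketch below evaluates a bicubic Bézier patch from a 4x4 control net; the sampled points are the kind of geometric data a mesher would pass on as finite-element input. This is a minimal illustration written for this summary, not code from the thesis, and all names in it are invented.

```python
# Illustrative sketch (not the thesis software): evaluating a bicubic Bezier
# surface patch from a 4x4 control net, the representation used to link
# drafting output to finite-element meshing.
from math import comb

def bernstein(n: int, i: int, t: float) -> float:
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_patch_point(net, u: float, v: float):
    """Point on the patch S(u, v) = sum_ij B_i(u) B_j(v) P_ij.

    `net` is a 4x4 grid of (x, y, z) control points (the 'Bezier net').
    """
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein(3, i, u) * bernstein(3, j, v)
            px, py, pz = net[i][j]
            x += w * px; y += w * py; z += w * pz
    return (x, y, z)

# A uniform grid of (u, v) samples yields mesh nodes for finite-element input.
flat_net = [[(i, j, 0.0) for j in range(4)] for i in range(4)]
print(bezier_patch_point(flat_net, 0.5, 0.5))  # centre of a flat patch
```

Because the patch is analytic, derivatives (and hence surface normals for shell elements) can be evaluated exactly at any (u, v), which is the property the abstract highlights.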
Abstract:
This thesis describes the procedure and results from four years' research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lift and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which had previously defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site conditions. Ranges of cost, time and performance data were represented by probability density functions, defined by constant, uniform, normal and beta distributions. These variables, together with a network of the interrelationships between services components, provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
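The core of the approach described above is Monte Carlo sampling of element costs from assumed distributions, summed over many trials to give a frequency distribution of total cost. The sketch below illustrates that idea only; the services elements, distributions and figures are invented for illustration, not data from the study or the VERT program.

```python
# Illustrative sketch of the Monte Carlo idea behind the cost model (not VERT):
# element costs drawn from assumed distributions are summed over many trials
# to approximate the distribution of total services cost.
import random

# Hypothetical cost models per services element (all figures invented):
# each entry returns one sampled cost in pounds.
elements = {
    "heating_ventilating": lambda: random.normalvariate(250_000, 30_000),
    "electrical":          lambda: random.uniform(180_000, 260_000),
    "lifts":               lambda: 90_000,  # constant
    # beta distribution rescaled onto an assumed (min, max) cost range
    "fire_protection":     lambda: 40_000 + random.betavariate(2, 5) * 60_000,
}

totals = sorted(sum(draw() for draw in elements.values()) for _ in range(10_000))

# Percentiles approximate the cumulative frequency curve the estimator reads off.
for p in (10, 50, 90):
    print(f"P{p}: ~{totals[int(len(totals) * p / 100)]:,.0f}")
```

Reading off, say, the 10th and 90th percentiles of the simulated totals is what gives the estimator the "cost ranges" and degree of risk mentioned in the abstract, rather than a single-point estimate.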
Abstract:
The civil engineering industry generally regards new methods and technology with a high degree of scepticism, preferring to use traditional and trusted methods. During the 1980s, competition for civil engineering consultancy work worldwide became fierce. Halcrow recognised the need to maintain and improve their competitive edge over other consultants, and the use of new technology in the form of microcomputers was seen as one way to maintain and improve their reputation in the world. This thesis examines the role of microcomputers in civil engineering consultancy, with particular reference to overseas projects. The involvement of civil engineers with computers, both past and present, has been investigated, and a survey of the use of microcomputers by consultancies was carried out; the results are presented and analysed. A review of the state of the art of microcomputer technology was made. Various case studies were carried out to examine the feasibility of using microcomputers on overseas projects. One case study involved the examination of two projects in Bangladesh and is used to illustrate the requirements and problems encountered in such situations. Two programming applications were undertaken: a dynamic programming model of a single-site reservoir and a simulation of the Bangladesh gas grid system. A cost-benefit analysis of a water resources project using microcomputers in the Aguan Valley, Honduras, was also carried out. Although the initial cost of microcomputers is often small, the overall costs can prove to be very high and are likely to exceed the costs of traditional computer methods. A planned approach to the use of microcomputers is essential in order to reap the expected benefits, and recommendations for the implementation of such an approach are presented.
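One of the two programming applications mentioned, the dynamic programming model of a single-site reservoir, follows a classic backward-recursion pattern: the storage level is the state, the release is the decision, and value iteration runs backwards over the planning periods. The sketch below shows that pattern under invented inflows, capacity and benefit function; it is not the thesis program.

```python
# Illustrative sketch of a single-site reservoir dynamic program. Inflows,
# capacity and the benefit function are invented for illustration.
# State: storage level at the start of each period; decision: release volume.

INFLOWS = [4, 6, 3, 5]          # assumed inflow per period (storage units)
CAPACITY = 10                   # reservoir capacity
LEVELS = range(CAPACITY + 1)    # discretised storage states

def benefit(release: int) -> float:
    """Assumed concave benefit of releasing water (e.g. irrigation value)."""
    return release ** 0.5

def solve() -> dict:
    # value[s] = best total benefit from the current period onward, in state s
    value = {s: 0.0 for s in LEVELS}
    for inflow in reversed(INFLOWS):            # backward recursion over periods
        new_value = {}
        for s in LEVELS:
            best = float("-inf")
            for release in range(0, s + inflow + 1):
                nxt = min(s + inflow - release, CAPACITY)  # spill above capacity
                best = max(best, benefit(release) + value[nxt])
            new_value[s] = best
        value = new_value
    return value

print(solve()[5])   # best achievable benefit starting half full
```

A real reservoir study would add evaporation losses, minimum-release constraints and stochastic inflows, but the state-decision-recursion skeleton is the same.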
Abstract:
This thesis considers the computer simulation of moist agglomerate collisions using the discrete element method (DEM). The study is confined to pendular-state moist agglomerates, in which liquid is present either as adsorbed immobile films or as pendular liquid bridges, and the interparticle force is modelled as the adhesive contact force plus the interstitial liquid bridge force. Algorithms used to model the contact force due to surface adhesion, tangential friction and particle deformation have been derived by other researchers and are briefly described in the thesis. A theoretical study of the pendular liquid bridge force between spherical particles has been made, and algorithms for modelling the pendular liquid bridge force between spherical particles have been developed and incorporated into the Aston version of the DEM program TRUBAL. It has been found that, for static liquid bridges, the more explicit criterion for specifying the stable solution and critical separation is provided by the total free energy. The critical separation is given to a good approximation by the cube root of the liquid bridge volume, and the 'gorge method' of evaluation based on the toroidal approximation leads to errors in the calculated force of less than 10%. Three-dimensional computer simulations of an agglomerate impacting orthogonally with a wall are reported. The results demonstrate the effectiveness of adding viscous binder to prevent attrition, a common practice in process engineering. Results of simulated agglomerate-agglomerate collisions show that, for collinear agglomerate impacts, there is an optimum velocity which results in a near-spherical coalesced agglomerate and hence minimises attrition due to subsequent collisions. The relationship between the optimum impact velocity and the liquid viscosity and surface tension is illustrated. The effect of varying the angle of impact on the coalescence/attrition behaviour is also reported. (DX 187, 340).
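For readers unfamiliar with the 'gorge method', the sketch below evaluates one common statement of the toroidal approximation for the bridge force between equal spheres, together with the cube-root rupture rule quoted above. The geometry, sign convention and example parameters are assumptions for illustration; the TRUBAL implementation itself is not reproduced here.

```python
# Illustrative sketch of the toroidal ('gorge') approximation for the pendular
# liquid bridge force between two equal spheres. Geometry and sign convention
# follow one common statement of the approximation; conventions vary in the
# literature, so treat this as a sketch rather than a reference implementation.
from math import pi, sin, cos

def bridge_force(R, S, beta, theta, gamma):
    """Gorge-method estimate of the attractive bridge force.

    R: sphere radius, S: surface separation, beta: half-filling angle (rad),
    theta: contact angle (rad), gamma: surface tension.
    """
    r1 = (S / 2 + R * (1 - cos(beta))) / cos(beta + theta)  # meridional radius
    r2 = R * sin(beta) - r1 * (1 - sin(beta + theta))       # neck (gorge) radius
    # surface-tension term at the neck + Laplace pressure term over the neck area
    return 2 * pi * gamma * r2 + pi * r2**2 * gamma * (1 / r1 - 1 / r2)

def critical_separation(volume):
    """Rupture distance: cube root of bridge volume, per the abstract."""
    return volume ** (1 / 3)

# Assumed example: 1 mm beads, water-like surface tension, small bridge.
print(bridge_force(R=0.5e-3, S=1e-6, beta=0.35, theta=0.0, gamma=0.072))
print(critical_separation(1e-12))  # ~1e-4 m for a 1e-12 m^3 bridge
```

Evaluating the force at the neck (the 'gorge') avoids integrating over the full bridge profile, which is what keeps the toroidal approximation cheap enough for use inside a DEM time step.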
Abstract:
The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for the scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
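The screening step of such a computerised selection system reduces, in essence, to filtering a property database against application requirements, as the minimal sketch below illustrates. The materials records and limits shown are invented for illustration, not entries from the Aston system.

```python
# Illustrative sketch of the property-screening step in a computerised
# materials selection system (records and limits invented).
materials = [
    {"name": "mild steel", "density": 7850, "yield_MPa": 250, "cost_per_kg": 0.8},
    {"name": "Al 6061-T6", "density": 2700, "yield_MPa": 276, "cost_per_kg": 2.5},
    {"name": "Ti-6Al-4V",  "density": 4430, "yield_MPa": 880, "cost_per_kg": 20.0},
]

def screen(records, min_yield, max_density, max_cost):
    """Keep only candidates meeting every application requirement."""
    return [m for m in records
            if m["yield_MPa"] >= min_yield
            and m["density"] <= max_density
            and m["cost_per_kg"] <= max_cost]

# Requirements for a hypothetical lightweight structural part:
for m in screen(materials, min_yield=250, max_density=5000, max_cost=5.0):
    print(m["name"])   # -> Al 6061-T6
```

With millions of property records, the value of computerisation lies in running many such screens quickly and consistently, which is impractical by hand.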
Abstract:
There is a great deal of literature about the initial stages of innovative design, the process whereby a completely new product is conceived, invented and developed. In industry, however, the continuing success of a company is more often achieved by improving or developing existing designs to maintain their marketability. Unfortunately, this process of design by evolution is less well documented. This thesis reports the way in which this process was improved for the sponsoring company. The improvements were achieved by implementing a new form of computer-aided design (C.A.D.) system, whose advent enabled the company both to shorten the design and development time and to review the principles underlying the existing design procedures. C.A.D. was a new venture for the company, and care had to be taken to ensure that the new procedures were compatible with the existing design office environment; in particular, they had to be acceptable to the design office staff. The C.A.D. system that was produced guides the designer from the draft specification to the first prototype layout. The computer presents the consequences of the designer's decisions clearly and fully, often by producing charts and sketches. The C.A.D. system and the necessary peripheral facilities were implemented, monitored and maintained, and the system structure was left sufficiently flexible for maintenance to be undertaken quickly and effectively. The problems encountered during implementation are well documented in this thesis.
Abstract:
The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and with how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. An architecture for a support environment for SSD, the Intellipse system, is proposed, based on the integration of KBS and non-KBS tools for individual design tasks within SSD. The Intellipse system has two modes of operation, Advisor and Designer. The design, implementation and user evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers, and some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made, and on the basis of this analysis a structured development method for KBSs is proposed: the POLITE model. Some initial results of applying this method to KBS development are discussed, and several areas for further research and development are identified.
Abstract:
Over recent years, the role of engineering in promoting a sustainable society has received much public attention [1], with particular emphasis given to the need to promote the future prosperity and security of society through the recruitment and education of more engineers [2,3]. From an employment perspective, the Leitch Review [4] suggested that ‘generic’ transferable employability skills development should constitute a more substantial part of university education. This paper argues that the global drivers impacting engineering education [5] correlate strongly with those underpinning the Leitch Review, so the question of how to promote transferable employability skills within the wider engineering curriculum is increasingly relevant. By exploring the use of heritage in the engineering curriculum as a way to promote learning and engage students, a less familiar approach to study is discussed. This approach moves away from stereotypical notions of information technology as representing the pinnacle of innovation in education. Taking the student experience as its starting point, the paper draws upon the findings of an exploratory study critically analysing the pedagogical value of using heritage in engineering education. It discusses a teaching approach in which engineering students are taken out of their ‘comfort zone’, away from the classroom, laboratory and computer, to a heritage site some 100 miles from the university. The primary learning objective underpinning this approach is to develop students’ transferable skills by encouraging them to consider how to apply theoretical concepts to a previously unexplored situation. By reflecting upon students’ perceptions of the value of this approach, and by identifying how heritage may be utilised as an innovative learning and teaching approach in engineering education, this paper makes a notable contribution to current pedagogical debates in the discipline.
Abstract:
As a subset of the Internet of Things (IoT), the Web of Things (WoT) shares many characteristics with wireless sensor and actuator networks (WSANs) and ubiquitous computing systems (Ubicomp). Yet to a far greater degree than the IoT, WSANs or Ubicomp, the WoT will integrate physical and information objects, necessitating a means to model and reason about a range of context types that have hitherto received little or no attention from the RE community. RE practice is only now developing the means to support WSANs and Ubicomp system development, including faltering first steps in the representation of context. We argue that these techniques will need to be developed further, with a particular focus on rich context types, if RE is to support WoT application development. © 2012 Springer-Verlag.
Abstract:
The goal of this roadmap paper is to summarize the state of the art and to identify critical challenges for the systematic software engineering of self-adaptive systems. The paper is partitioned into four parts, one for each of the identified essential views of self-adaptation: modelling dimensions, requirements, engineering, and assurances. For each view, we present the state of the art and the challenges that our community must address. This roadmap paper is a result of the Dagstuhl Seminar 08031 on "Software Engineering for Self-Adaptive Systems," which took place in January 2008. © 2009 Springer Berlin Heidelberg.
Abstract:
We argue that, for certain constrained domains, elaborate model transformation technologies, implemented from scratch in general-purpose programming languages, are unnecessary for model-driven engineering; instead, lightweight configuration of commercial off-the-shelf productivity tools suffices. In particular, in the CancerGrid project, we have been developing model-driven techniques for the generation of software tools to support clinical trials. A domain metamodel captures the community's best practice in trial design. A scientist authors a trial protocol, modelling their trial by instantiating the metamodel; customized software artifacts to support trial execution are generated automatically from the scientist's model. The metamodel is expressed as an XML Schema, in such a way that it can be instantiated by completing a form to generate a conformant XML document. The same process works at a second level for trial execution: among the artifacts generated from the protocol are models of the data to be collected, and the clinician conducting the trial instantiates such models in reporting observations, again by completing a form to create a conformant XML document representing the data gathered during that observation. Simple standard form management tools are all that is needed. Our approach is applicable to a wide variety of information-modelling domains: not just clinical trials, but also electronic public sector computing, customer relationship management, document workflow, and so on. © 2012 Springer-Verlag.
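The pattern described, an XML Schema metamodel instantiated by form-filling to yield a conformant document, can be illustrated with standard tooling. The sketch below uses the lxml library to validate an invented trial instance against an invented schema; neither is the CancerGrid metamodel, and both are deliberately minimal.

```python
# Minimal sketch of the form-to-conformant-document pattern, using lxml
# (pip install lxml). The schema and instance are invented stand-ins.
from lxml import etree

TRIAL_XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="trial">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="title" type="xs:string"/>
        <xs:element name="arms" type="xs:positiveInteger"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

# A completed 'form' becomes an XML document claimed to conform to the schema.
TRIAL_XML = b"<trial><title>Hypothetical phase II trial</title><arms>2</arms></trial>"

schema = etree.XMLSchema(etree.fromstring(TRIAL_XSD))
doc = etree.fromstring(TRIAL_XML)
print(schema.validate(doc))   # True: the instance conforms to the metamodel
```

The point of the paper's argument is visible even at this scale: the schema does the modelling work, and off-the-shelf form and validation tools do the rest, with no bespoke transformation code.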
Abstract:
* The research work reviewed in this paper was carried out in the context of the Russian Foundation for Basic Research funded project “Adaptable Intelligent Interfaces Research and Development for Distance Learning Systems” (grant No. 02-01-81019). The authors wish to acknowledge the co-operation of the Byelorussian partners in this project.