12 results for scientific methodology
in Greenwich Academic Literature Archive - UK
Abstract:
The Symposium, “Towards the sustainable use of Europe’s forests”, with the sub-title “Forest ecosystem and landscape research: scientific challenges and opportunities”, lists three fundamental substantive areas of research: Forest management and practices, Ecosystem processes and functional ecology, and Environmental economics and sociology. This paper argues that essential catalytic elements are missing! Without these elements there is a great danger that the aimed-for world leadership in the forest sciences will not materialize. What are the missing elements? All the sciences, and in particular biology, environmental sciences, sociology, economics, and forestry, have evolved to include good scientific methodology. Good methodology is imperative in the design and analysis of research studies, in the management of research data, and in the interpretation of research findings. The methodological disciplines of Statistics, Modelling and Informatics (“SMI”) are crucial elements of a proposed Centre of European Forest Science, and the full involvement of professionals in these methodological disciplines is needed if the research of the Centre is to be world-class. The Distributed Virtual Institute (DVI) for Statistics, Modelling and Informatics in Forestry and the Environment (SMIFE) is a consortium whose aim is to provide world-class methodological support and collaboration to European research in the areas of Forestry and the Environment. It is suggested that DVI: SMIFE should be a formal partner in the proposed Centre for European Forest Science.
Abstract:
The shared-memory programming model can be an effective way to achieve parallelism on shared-memory parallel computers. Historically, however, the lack of a standard for directive-based programming and limited scalability have affected its take-up. Recent advances in hardware and software technologies have improved both the performance of parallel programs using compiler directives and, with the introduction of OpenMP, their portability. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorize the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. These demonstrate the toolkit's great potential for quickly parallelising serial programs, as well as the good performance achievable on up to 300 processors for hybrid message-passing/directive parallelisations.
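As an illustration of the kind of directive placement the abstract describes, the sketch below (not taken from the toolkit itself; the function and its name are hypothetical) shows a loop with no loop-carried dependence, the case where an OpenMP work-sharing directive can be applied directly:

```cpp
#include <vector>
#include <cstddef>

// A loop an interprocedural analysis would classify as fully parallel:
// each iteration writes a distinct z[i], so no loop-carried dependence
// exists and a work-sharing directive is safe. (Illustrative sketch only.)
std::vector<double> axpy(double a, const std::vector<double>& x,
                         const std::vector<double>& y) {
    std::vector<double> z(x.size());
    #pragma omp parallel for
    for (std::ptrdiff_t i = 0; i < static_cast<std::ptrdiff_t>(x.size()); ++i)
        z[i] = a * x[i] + y[i];  // independent iterations
    return z;
}
```

Loops with reductions or true dependences need different treatment (reduction clauses, ordered constructs, or serialisation), which is why loop categorisation matters.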
Abstract:
This paper describes the use of a blackboard architecture for building a hybrid case-based reasoning (CBR) system. The Smartfire fire field modelling package has been built using this architecture and includes a CBR component. The architecture allows qualitative spatial reasoning knowledge from domain experts to be integrated into the system, which can then be used for the automatic set-up of fire field models. This enables fire safety practitioners who are not expert in modelling techniques to use a fire modelling tool. The paper discusses the integrating powers of the architecture, which is based on a common knowledge representation comprising a metric diagram and place vocabulary, with mechanisms for adaptation and conflict resolution built on the blackboard.
Proposed methodology for the use of computer simulation to enhance aircraft evacuation certification
Abstract:
In this paper a methodology for applying computer simulation to the evacuation certification of aircraft is suggested. It involves the use of computer simulation, historic certification data, component testing, and full-scale certification trials. The methodology sets out a framework for how computer simulation should be undertaken in a certification environment, drawing on experience from both the marine and building industries. In addition, a phased introduction of computer models to certification is suggested. The first step is the use of computer simulation in conjunction with full-scale testing. The combination of a full-scale trial, computer simulation and, if necessary, component testing provides better insight into aircraft evacuation performance capabilities by generating a performance probability distribution rather than a single datum. Once further confidence in the technique is established, the requirement for the full-scale demonstration could be dropped. The second step in the adoption of computer simulation for certification involves the introduction of several scenarios based on, for example, exit availability, informed by accident analysis. The final step would be the introduction of more realistic accident scenarios. This would require the continued development of aircraft evacuation modeling technology to include additional behavioral features common in real accident scenarios.
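The contrast between a single datum and a performance probability distribution can be sketched with a toy Monte Carlo loop. Everything here is illustrative: the per-trial time model (a normal distribution with hypothetical mean and spread) stands in for a full evacuation simulation such as the one the paper envisages.

```cpp
#include <random>
#include <vector>
#include <algorithm>

// Repeated simulated evacuations yield a distribution of total evacuation
// times, from which a percentile (e.g. the 95th) can be reported instead
// of the single datum a one-off trial gives. The time model is purely
// illustrative, not a real evacuation model.
std::vector<double> evacuation_times(int runs, unsigned seed) {
    std::mt19937 gen(seed);
    std::normal_distribution<double> per_run(75.0, 5.0);  // hypothetical seconds
    std::vector<double> times(runs);
    for (double& t : times) t = per_run(gen);
    std::sort(times.begin(), times.end());
    return times;  // sorted sample: percentiles can be read off directly
}

// Nearest-rank percentile of an ascending-sorted sample, 0 <= p <= 1.
double percentile(const std::vector<double>& sorted, double p) {
    std::size_t idx = static_cast<std::size_t>(p * (sorted.size() - 1));
    return sorted[idx];
}
```

A certification argument could then be framed on a tail statistic of this distribution rather than on the outcome of one trial.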
Abstract:
This paper details a computational methodology for analysing the structural behaviour of historic composite structures. The modelling approach is based on finite element analysis and has been developed to enable efficient and inexpensive computational mechanics analysis of complex composite structures. The discussion focuses primarily on the modelling methodology and on the analysis of structural designs that comprise structural beam components acting as stiffeners to a wider shell part of the structure. A computational strategy for the analysis of this type of composite structure, exploiting its representation through smeared shell models, is detailed in the paper.
Abstract:
The parallelization of real-world compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, an environment is needed that assists the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools addressing the main tasks, such as code parallelization, debugging and optimization, are available. The ParaWise and CAPO parallelization tools are discussed; these enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. High-quality interprocedural dependence analysis and user-tool interaction are also highlighted; both are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for benchmark and real-world application codes that have been parallelized are presented and show the benefits of using the environment.
Abstract:
Evaluating ship layout for human factors (HF) issues using simulation software such as maritimeEXODUS can be a long and complex process. The analysis requires the identification of relevant evaluation scenarios, encompassing evacuation and normal operations; the development of appropriate measures which can be used to gauge the performance of crew and vessel; and, finally, the interpretation of considerable simulation data. Currently, the only agreed guidelines for evaluating the HF performance of ship design relate to evacuation, and so conclusions drawn concerning the overall suitability of a ship design by one naval architect can be quite different from those of another. The complexity of the task grows as the size and complexity of the vessel increase and as the number and type of evaluation scenarios considered increase. Equally, it can be extremely difficult for fleet operators to set HF design objectives for new vessel concepts. The challenge for naval architects is to develop a procedure that allows both accurate and rapid assessment of the HF issues associated with vessel layout and crew operating procedures. In this paper we present a systematic and transparent methodology for assessing the HF performance of ship design which is both discriminating and diagnostic. The methodology is demonstrated using two variants of a hypothetical naval ship.
Abstract:
This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the development of new advanced technologies in the area of micro- and nano-systems. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB) milling. This process has been modelled to provide knowledge of how a pre-defined geometry can be achieved through direct milling. The geometry characterisation is obtained using Reduced Order Models (ROMs), generated from the results of a mathematical model of the Focused Ion Beam, together with Design of Experiments (DoE) methods. The focus of this work is the design-flow methodology, which includes an approach for incorporating process parameter uncertainties into the process optimisation modelling framework. A discussion of the impact of the process parameters, and of their variations, on the quality and performance of the fabricated structure is also presented. The design task is to identify the optimal process conditions, by altering the process parameters, so that the required reliability and confidence of the application is achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
Abstract:
The article consists of a PowerPoint presentation on an integrated reliability and prognostics prediction methodology for power electronic modules. The areas discussed include: the power electronics flagship; design for reliability; the IGBT module; design for manufacture; power module components; reliability prediction techniques; failure-based reliability; etc.
Abstract:
A Concise Intro to Image Processing using C++ presents state-of-the-art image processing methodology, including current industrial practices for image compression, image de-noising methods based on partial differential equations, and new image compression methods such as fractal image compression and wavelet compression. It includes elementary concepts of image processing and related fundamental tools with coding examples as well as exercises. With a particular emphasis on illustrating fractal and wavelet compression algorithms, the text covers image segmentation, object recognition, and morphology. An accompanying CD-ROM contains code for all algorithms.
Abstract:
Today, the key to commercial success in manufacturing is the timely development of new products that are not only functionally fit for purpose but offer high performance and quality throughout their entire lifecycle. In principle, this demands the introduction of a fully developed and optimised product from the outset. To accomplish this, manufacturing companies must leverage existing knowledge in their current technical, manufacturing and service capabilities. This is especially true in the field of tolerance selection and application, the subject area of this research. Tolerance knowledge must be readily available and deployed as an integral part of the product development process. This paper describes a methodology and framework, currently under development at a UK manufacturer, to achieve this objective.
Abstract:
The paper reports on an investigation of the rheological behaviour of new lead-free solder paste formulations for use in flip-chip assembly applications. The study is made up of three parts: the evaluation of the effect of plate geometry, the effect of temperature and processing environment, and the effect of torsional frequencies on the rheological measurements. Different plate geometries and rheological tests were used to evaluate the new formulations in terms of wall-slip characteristics, linear viscoelastic region and shear-thinning behaviour. A technique which combines creep-recovery and dynamic frequency sweep tests was used to further characterise the paste structure, rheological behaviour and processing performance of the new paste formulations. The technique demonstrated in this study has wide utility for R&D personnel involved in new paste formulation, for implementing quality control procedures used in paste manufacture and packaging, and for qualifying new flip-chip assembly lines.
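Shear-thinning behaviour of the kind these tests characterise is commonly described by the Ostwald-de Waele power-law model. The sketch below is a generic illustration of that model, not the paper's measured data; the constants K and n are hypothetical, not solder-paste values.

```cpp
#include <cmath>

// Ostwald-de Waele power-law model: apparent viscosity
//   eta(rate) = K * rate^(n - 1)
// where K is the consistency index and n the flow-behaviour index.
// For a shear-thinning paste n < 1, so viscosity falls as shear rate rises.
double apparent_viscosity(double K, double n, double shear_rate) {
    return K * std::pow(shear_rate, n - 1.0);
}
```

Fitting K and n to rotational-rheometer data (as the plate-geometry tests described above provide) gives a compact summary of a paste's shear-thinning behaviour for quality-control comparisons.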