18 results for Data structures (Computer science)
in Greenwich Academic Literature Archive - UK
Abstract:
This paper presents a generic framework that can be used to describe study plans using meta-data. The context of this research and the associated technologies and standards are presented. The approach adopted here was developed within the mENU project, which aims to provide a model for a European Networked University. The methodology for the design of the generic framework is discussed and the main design requirements are presented. The approach is based on a set of templates containing the meta-data required for the description of programs of study and consisting of generic building elements annotated appropriately. The process followed to develop the templates is presented, together with a set of evaluation criteria to test the suitability of the approach. The template structure is presented and example templates are shown. A first evaluation of the approach has shown that the proposed framework can provide a flexible and effective means for the generic description of study plans for the purposes of a networked university.
Proposed methodology for the use of computer simulation to enhance aircraft evacuation certification
Abstract:
In this paper a methodology for the application of computer simulation to the evacuation certification of aircraft is suggested. This involves the use of computer simulation, historic certification data, component testing, and full-scale certification trials. The methodology sets out a framework for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. In addition, a phased introduction of computer models to certification is suggested. As a first step, this involves the use of computer simulation in conjunction with full-scale testing. The combination of full-scale trials, computer simulation (and if necessary component testing) provides better insight into aircraft evacuation performance capabilities by generating a performance probability distribution rather than a single datum. Once further confidence in the technique is established, the requirement for the full-scale demonstration could be dropped. The second step in the adoption of computer simulation for certification involves the introduction of several scenarios based on, for example, exit availability, informed by accident analysis. The final step would be the introduction of more realistic accident scenarios. This would require the continued development of aircraft evacuation modeling technology to include additional behavioral features common in real accident scenarios.
Abstract:
Evacuation analysis of passenger and commercial shipping can be undertaken using computer-based simulation tools such as maritimeEXODUS. These tools emulate human shipboard behaviour during emergency scenarios; however, they are largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. If these tools and procedures are to be applied to naval vessels, there is a clear requirement to understand the behaviour of well-trained naval personnel interacting with the fixtures and fittings that are exclusive to warships. Human factors trials using Royal Navy training facilities were recently undertaken to collect data to improve our understanding of the performance of naval personnel in warship environments. The trials were designed and conducted by staff from the Fire Safety Engineering Group (FSEG) of the University of Greenwich on behalf of the Sea Technology Group (STG), Defence Procurement Agency. The trials involved a selection of RN volunteers with sea-going experience in warships, operating and traversing structural components under different angles of heel. This paper describes the trials and some of the collected data.
Abstract:
This paper investigates the use of the acoustic emission (AE) monitoring technique for identifying the damage mechanisms present in paper during its production process. The microscopic structure of paper consists of a random mesh of paper fibres connected by hydrogen bonds. This implies the existence of two damage mechanisms: the failure of a fibre-fibre bond and the failure of a fibre. This paper describes a hybrid mathematical model which couples the mechanics of the mass-spring model to the acoustic wave propagation model for generating the acoustic signal emitted by complex structures of paper fibres under strain. The derivation of the mass-spring model can be found in [1,2], with details of the acoustic wave equation found in [3,4]. The numerical implementation of the vibro-acoustic model is discussed in detail, with particular emphasis on the damping present in the numerical model. The hybrid model uses an implicit solver which intrinsically introduces artificial damping to the solution. The artificial damping is shown to affect the frequency response of the mass-spring model, so certain restrictions on the simulation time step must be enforced for the model to produce physically accurate results. The hybrid mathematical model is used to simulate small fibre networks to provide information on the acoustic response of each damage mechanism. The simulated AEs are then analysed using a continuous wavelet transform (CWT), described in [5], which provides a two-dimensional time-frequency representation of the signal. The AEs from the two damage mechanisms show different characteristics in the CWT, so it is possible to define a fibre-fibre bond failure by the following criteria. The dominant frequency components of the AE must be at approximately 250 kHz or 750 kHz. The strongest frequency component may be at either approximately 250 kHz or 750 kHz.
The duration of the frequency component at approximately 250 kHz is longer than that of the frequency component at approximately 750 kHz. Similarly, the criteria for identifying a fibre failure are as follows. The dominant frequency component of the AE must be greater than 800 kHz. The duration of the dominant frequency component must be less than 5.00E-06 seconds. The dominant frequency component must be present at the front of the AE. Essentially, the failure of a fibre-fibre bond produces a low-frequency wave and the failure of a fibre produces a high-frequency pulse. Using these theoretical criteria, it is now possible to train an intelligent classifier such as the Self-Organising Map (SOM) [6] using the experimental data. First, certain features must be extracted from the CWTs of the AEs for use in training the SOM. For this work, each CWT is divided into 200 windows of 5E-06 s in duration covering a 100 kHz frequency range. The power ratio for each window is then calculated and used as a feature. Having extracted the features from the AEs, the SOM can now be trained, but care is required so that both damage mechanisms are adequately represented in the training set. This is an issue with paper, as the failure of the fibre-fibre bonds is the prevalent damage mechanism. Once a suitable training set is found, the SOM can be trained and its performance analysed. For the SOM described in this work, there is a good chance that it will correctly classify the experimental AEs.
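The frequency and duration criteria stated in the abstract above can be expressed as a simple rule check. This is a minimal illustrative sketch, not the authors' implementation: the feature inputs (dominant frequency, duration, whether the dominant component sits at the front of the emission) are assumed to have been extracted from the CWT of a recorded AE, and the 50 kHz tolerance around the 250/750 kHz components is an assumed value.

```python
# Hypothetical rule-based classifier for the two paper damage mechanisms.
# Feature names and the 50 kHz band tolerance are illustrative assumptions.

def classify_ae(dominant_freq_hz, duration_s, at_front):
    """Return 'fibre', 'bond', or 'unknown' per the stated criteria."""
    # Fibre failure: high-frequency (> 800 kHz), short (< 5e-6 s) pulse
    # located at the front of the acoustic emission.
    if dominant_freq_hz > 800e3 and duration_s < 5e-6 and at_front:
        return "fibre"
    # Fibre-fibre bond failure: dominant component near 250 kHz or 750 kHz.
    if abs(dominant_freq_hz - 250e3) < 50e3 or abs(dominant_freq_hz - 750e3) < 50e3:
        return "bond"
    return "unknown"

print(classify_ae(900e3, 2e-6, True))   # fibre-failure signature
print(classify_ae(250e3, 2e-5, False))  # bond-failure signature
```

In practice such hand-crafted rules would serve only as a labelling aid; the abstract's approach trains a Self-Organising Map on windowed CWT power-ratio features instead.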
Abstract:
Serial Analysis of Gene Expression (SAGE) is a relatively new method for monitoring gene expression levels and is expected to contribute significantly to progress in cancer treatment by enabling precise and early diagnosis. A promising application of SAGE gene expression data is the classification of tumors. In this paper, we build three event models (the multivariate Bernoulli model, the multinomial model and the normalized multinomial model) for SAGE data classification. Both binary classification and multicategory classification are investigated. Experiments on two SAGE datasets show that the multivariate Bernoulli model performs well with small feature sizes, the multinomial model performs better at large feature sizes, and the normalized multinomial model performs well with medium feature sizes. The multinomial model achieves the highest overall accuracy.
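The multinomial event model mentioned above can be sketched as a naive Bayes classifier over tag counts: the class posterior is proportional to the class prior times the product of per-tag probabilities raised to the observed counts, with Laplace smoothing. This is a minimal sketch under stated assumptions, not the paper's implementation; the toy count vectors and class labels below are invented placeholders standing in for SAGE tag counts.

```python
import math
from collections import defaultdict

# Multinomial naive Bayes sketch: log P(c|x) ∝ log P(c) + Σ_t x_t log θ_{c,t},
# with Laplace (add-alpha) smoothing on the per-class tag counts.

def train_multinomial(docs, labels, vocab_size, alpha=1.0):
    totals = defaultdict(lambda: [alpha] * vocab_size)  # smoothed tag counts
    class_counts = defaultdict(int)
    for x, y in zip(docs, labels):
        class_counts[y] += 1
        for t, c in enumerate(x):
            totals[y][t] += c
    params = {}
    for y, counts in totals.items():
        s = sum(counts)
        params[y] = ([math.log(c / s) for c in counts],
                     math.log(class_counts[y] / len(docs)))
    return params

def predict(params, x):
    def score(y):
        logtheta, logprior = params[y]
        return logprior + sum(c * lt for c, lt in zip(x, logtheta))
    return max(params, key=score)

# Toy stand-in data: 3-tag "libraries" for two classes.
docs = [[5, 0, 1], [4, 1, 0], [0, 6, 2], [1, 5, 3]]
labels = ["tumour", "tumour", "normal", "normal"]
model = train_multinomial(docs, labels, vocab_size=3)
print(predict(model, [3, 0, 1]))  # → tumour
```

The multivariate Bernoulli variant would instead binarize each count (tag present/absent), which is why it tends to do better only at small feature sizes.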
Abstract:
The passenger response time distributions adopted by the International Maritime Organisation (IMO) in their assessment of the assembly time for passenger ships involve two key assumptions. The first is that the response time distribution assumes the form of a uniform random distribution, and the second concerns the actual response times. These two assumptions are core to the validity of the IMO analysis but are not based on real data, being the recommendations of an IMO committee. In this paper, response time data collected from assembly trials conducted at sea on a real passenger vessel using actual passengers are presented and discussed. Unlike the IMO-specified response time distributions, the data collected from these trials display a log-normal distribution, similar to that found in land-based environments. Based on these data, response time distributions for use in the IMO assembly analysis for the day and night scenarios are suggested.
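The distributional point above can be illustrated by drawing response times from a log-normal rather than a uniform distribution. This is a hedged sketch only: the parameters `mu` and `sigma` are arbitrary placeholders, not the values fitted in the paper's trials.

```python
import random

# Sample passenger response times (in seconds) from a log-normal
# distribution, as the trial data suggest, instead of the uniform
# random distribution assumed by IMO. Parameters are placeholders.

def sample_response_times(n, mu=3.0, sigma=0.8, seed=42):
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

times = sample_response_times(1000)
# Log-normal samples are strictly positive and right-skewed: a long
# tail of slow responders, which a uniform distribution cannot capture.
print(min(times), max(times))
```

The right-skewed tail matters for assembly analysis because the slowest responders, not the average ones, dominate the total assembly time.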
Abstract:
The Logit-Logistic (LL), Johnson's SB, and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the (skewness-kurtosis) region covered, and each has been used for modeling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known, and much-used bivariate generalization of SB, and the bivariate distributions with LL, SB, and Beta as marginals, constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness-of-fit is compared using minus log-likelihood (equivalent, for models with equal parameter counts, to Akaike's Information Criterion, AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P, and SB-2P.
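The selection rule above is worth making explicit: since AIC = 2k − 2 ln L, models with the same number of parameters k rank identically under AIC and under minus log-likelihood. A small sketch with made-up log-likelihood values (not the paper's fitted values):

```python
# AIC-based ranking of candidate bivariate models. The log-likelihood
# values and parameter count below are illustrative placeholders.

def aic(log_likelihood, k):
    """Akaike's Information Criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * log_likelihood

# Hypothetical maximised log-likelihoods for the four bivariate models,
# each with the same parameter count k, so AIC order = -lnL order.
fits = {"SBB": -512.3, "LL-2P": -515.9, "GBD-2P": -519.4, "SB-2P": -523.1}
k = 9  # placeholder parameter count, identical across models

ranking = sorted(fits, key=lambda m: aic(fits[m], k))
print(ranking)  # → ['SBB', 'LL-2P', 'GBD-2P', 'SB-2P']
```

With unequal parameter counts the 2k penalty would matter, which is the usual reason to prefer AIC over raw likelihood comparisons.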
Abstract:
This paper will analyse two of the likely damage mechanisms present in a paper fibre matrix when placed under controlled stress conditions: fibre/fibre bond failure and fibre failure. The failure process associated with each damage mechanism will be presented in detail, focusing on the change in mechanical and acoustic properties of the surrounding fibre structure before and after failure. To present this complex process mathematically, geometrically simple fibre arrangements will be chosen, based on certain assumptions regarding the structure and strength of paper, to model the damage mechanisms. The fibre structures are then formulated in terms of a hybrid vibro-acoustic model based on a coupled mass/spring system and the pressure wave equation. The model will be presented in detail in the paper. The simulation of the simple fibre structures serves two purposes: it highlights the physical and acoustic differences of each damage mechanism before and after failure, and it shows the differences between the two damage mechanisms when compared with one another. The results of the simulations are given in the form of pressure wave contours, time-frequency graphs and Continuous Wavelet Transform (CWT) diagrams. The analysis of the results leads to criteria by which the two damage mechanisms can be identified. Using these criteria it was possible to verify the results of the simulations against experimental acoustic data. The models developed in this study are of specific practical interest in the paper-making industry, where acoustic sensors may be used to monitor continuous paper production. The same techniques may be adopted more generally to correlate acoustic signals to damage mechanisms in other fibre-based structures.
Abstract:
This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique, but they provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.
Abstract:
This study investigates the use of computer-modelled versus directly experimentally determined fire hazard data for assessing survivability within buildings, using evacuation models incorporating Fractional Effective Dose (FED) models. The objective is to establish a link between effluent toxicity, measured using a variety of small- and large-scale tests, and building evacuation. For the scenarios under consideration, fire simulation is typically used to determine the time at which non-survivable conditions develop within the enclosure, for example, when smoke or toxic effluent falls below a critical height deemed detrimental to evacuation, or when the radiative fluxes reach a critical value leading to the onset of flashover. The evacuation calculation would then be used to determine whether people within the structure could evacuate before these critical conditions develop.
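The FED concept referred to above can be sketched as a running dose integral: at each time step the occupant accumulates a fraction of an incapacitating dose, and incapacitation is deemed to occur once the cumulative FED reaches 1. This is an illustrative sketch only; the concentration series and the incapacitating Ct product below are invented placeholders, not values from the study.

```python
# Minimal Fractional Effective Dose (FED) accumulation sketch.
# Incapacitation is assumed when cumulative FED >= 1.

def fed_exposure(concentrations_ppm, dt_s, ct_incapacitation_ppm_s):
    """Return (final FED, time of incapacitation in seconds, or None)."""
    fed = 0.0
    for step, c in enumerate(concentrations_ppm):
        # Dose fraction for this time step: concentration * duration,
        # divided by the (assumed) incapacitating concentration-time product.
        fed += c * dt_s / ct_incapacitation_ppm_s
        if fed >= 1.0:
            return fed, (step + 1) * dt_s
    return fed, None

# Placeholder: linearly rising CO concentration sampled every 10 s.
conc = [200 * i for i in range(1, 31)]
fed, t_inc = fed_exposure(conc, dt_s=10.0, ct_incapacitation_ppm_s=135000.0)
print(t_inc)  # time at which the occupant's cumulative FED first reaches 1
```

Coupling such a dose calculation to an evacuation model then reduces the survivability question to whether each occupant's egress time precedes their incapacitation time.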
Abstract:
This paper describes the employment of semantic and conceptual structures in module design, specifically course modules. Additionally, it suggests other uses of these structures in aiding teaching and learning.
Abstract:
Two evacuation trials were conducted within Brazilian library facilities by FSEG staff in January 2005. These represent some of the first such trials conducted in Brazil. The purpose of these evacuation trials was to collect pre-evacuation time data from a population with a cultural background different from that found in Western Europe. In total, some 34 pre-evacuation times were collected from the experiments, ranging from 5 to 98 seconds with a mean pre-evacuation time of 46.7 seconds.
Abstract:
This paper examines the influence of exit availability on evacuation time for a narrow-body aircraft under certification trial conditions using computer simulation. A narrow-body aircraft which has previously passed the certification trial is used as the test configuration. While maintaining the certification requirement of 50% of the available exits, six different exit configurations are examined. These include the standard certification configuration (one exit from each exit pair) and five other exit configurations based on commonly occurring exit combinations found in accidents. These configurations are based on data derived from the AASK database, and the evacuation simulations are performed using the airEXODUS evacuation simulation software. The results show that the certification practice of using half the available exits, predominantly down one side of the aircraft, is neither statistically relevant nor challenging. For the aircraft cabin layout examined, the exit configuration used in the certification trial produces the shortest egress times. Furthermore, three of the six exit combinations investigated result in predicted egress times in excess of 90 seconds, suggesting that the aircraft would not satisfy the certification requirement under these conditions.
Abstract:
This paper presents an investigation into applying Case-Based Reasoning to multiple heterogeneous case bases using agents. The adaptive CBR process and the architecture of the system are presented. A case study is presented to illustrate and evaluate the approach. The process of creating and maintaining the dynamic data structures is discussed. The similarity metrics employed by the system are used to support the optimisation of the collaboration between the agents, which is based on the use of a blackboard architecture. The blackboard architecture is shown to support efficient collaboration between the agents to achieve an efficient overall CBR solution, while case-based reasoning methods allow the system to adapt and “learn” new collaborative strategies for the overall CBR problem-solving process.