993 results for Castel Gala


Abstract:

Evacuation analysis of passenger and commercial shipping can be undertaken using computer-based simulation tools such as maritimeEXODUS. These tools emulate human shipboard behaviour during emergency scenarios; however, they are largely based on the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. If these tools and procedures are to be applied to naval vessels, there is a clear requirement to understand the behaviour of well-trained naval personnel interacting with the fixtures and fittings that are exclusive to warships. Human factors trials using Royal Navy training facilities were recently undertaken to collect data to improve our understanding of the performance of naval personnel in warship environments. The trials were designed and conducted by staff from the Fire Safety Engineering Group (FSEG) of the University of Greenwich on behalf of the Sea Technology Group (STG), Defence Procurement Agency. The trials involved a selection of RN volunteers with sea-going experience in warships operating and traversing structural components under different angles of heel. This paper describes the trials and some of the collected data.

Abstract:

Signage systems are widely used in buildings to provide wayfinding information, assisting in navigation during the normal circulation of pedestrians and, more importantly, providing exiting information during emergencies. An important consideration in determining the effectiveness of signs is establishing the region from which a sign is visible to occupants, the so-called Visibility Catchment Area (VCA). This paper attempts to factor the observation angle of the observer into the determination of the VCA of signs, using both experimental and theoretical analysis.
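
The abstract does not give the model itself, but one simple way to see how observation angle shrinks a sign's catchment area is to assume that the legibility distance falls off with the cosine of the viewing angle (the sign's apparent width projected towards the observer). The sketch below uses that assumption only for illustration; the head-on distance R0, the cosine fall-off and the flat-wall geometry are hypothetical choices, not the paper's experimental or theoretical result.

```python
import math

def max_viewing_distance(r0, theta):
    """Maximum distance at which the sign is taken to be legible when viewed
    at angle `theta` (radians) from the sign's normal.

    Assumes a simple cosine fall-off: the apparent width of the sign shrinks
    as cos(theta), so the legible range is assumed to shrink proportionally.
    `r0` is the legibility distance for head-on viewing (theta = 0)."""
    if abs(theta) >= math.pi / 2:
        return 0.0                      # the face of the sign cannot be seen
    return r0 * math.cos(theta)

def inside_vca(r0, x, y):
    """True if an observer at (x, y) lies inside the sign's VCA.

    The sign sits at the origin with its normal along +y, so the observation
    angle is the angle between +y and the line from the sign to the observer."""
    r = math.hypot(x, y)
    if r == 0.0:
        return True
    theta = math.atan2(abs(x), y)       # angle measured from the sign's normal
    return y > 0 and r <= max_viewing_distance(r0, theta)

if __name__ == "__main__":
    R0 = 15.0                           # hypothetical head-on legibility distance (m)
    print(inside_vca(R0, 0.0, 14.0))    # nearly head-on, within range  -> True
    print(inside_vca(R0, 10.0, 10.0))   # 45 degrees off-axis           -> False
```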

Abstract:

The WTC evacuation of 11 September 2001 provides an unrepeatable opportunity to probe into and understand the very nature of evacuation dynamics and, with this improved understanding, contribute to the design of safer, more evacuation-efficient, yet highly functional high-rise buildings. Following 9/11, the Fire Safety Engineering Group (FSEG) of the University of Greenwich embarked on a study of survivor experiences from the WTC Twin Towers evacuation. The experiences were collected from published accounts appearing in the print and electronic mass media and are stored in a relational database specifically developed for this purpose. Using these accounts and other available sources of information, FSEG also undertook a series of numerical simulations of the WTC North Tower. This paper presents an overview of the results from both studies.

Abstract:

Fractal video compression is a relatively new video compression method. Its attraction lies in the high compression ratio and the simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines offer one way forward. In this study we partition the matching search, which accounts for the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm is able to achieve a high speedup in these distributed environments.
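
As a rough illustration of how the matching search can be partitioned into small tasks, the sketch below distributes range-block matching across worker processes. It uses Python's multiprocessing as a stand-in for the DCOM and .NET Remoting environments of the study, a plain mean-absolute-error match instead of a full fractal transform (no contrast/brightness fitting or isometries), and same-size domain blocks; the block size and domain pool are illustrative choices only.

```python
import numpy as np
from multiprocessing import Pool

BLOCK = 8  # range-block size in pixels; domain blocks use the same size in this toy sketch

def best_match(args):
    """Search all candidate domain blocks for the one closest to a range block.

    Returns (range_index, best_domain_index, error).  A full encoder would
    also fit brightness/contrast parameters and try the eight isometries;
    this stand-in uses plain mean-absolute error only."""
    range_idx, range_block, domain_blocks = args
    errors = [np.mean(np.abs(range_block - d)) for d in domain_blocks]
    best = int(np.argmin(errors))
    return range_idx, best, float(errors[best])

def encode_frame(frame, n_workers=4):
    """Partition the matching search over range blocks across worker processes."""
    h, w = frame.shape
    range_blocks = [frame[i:i + BLOCK, j:j + BLOCK]
                    for i in range(0, h - BLOCK + 1, BLOCK)
                    for j in range(0, w - BLOCK + 1, BLOCK)]
    # Domain pool: blocks sampled on a coarser grid (another toy choice).
    domain_blocks = [frame[i:i + BLOCK, j:j + BLOCK]
                     for i in range(0, h - BLOCK + 1, 2 * BLOCK)
                     for j in range(0, w - BLOCK + 1, 2 * BLOCK)]
    tasks = [(k, r, domain_blocks) for k, r in enumerate(range_blocks)]
    with Pool(n_workers) as pool:
        return pool.map(best_match, tasks)   # each task is an independent small search

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))             # stand-in for one video frame
    codes = encode_frame(frame)
    print(len(codes), "range blocks matched")
```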

Abstract:

Fractal image compression is a relatively recent image compression method which is simple to use and often achieves a high compression ratio. These advantages make it suitable for situations with a single encoding and many decodings, as required in video on demand, archive compression, etc. Two fundamental fractal compression methods, the cube-based and the frame-based methods, are commonly studied; however, each has advantages and disadvantages. This paper gives an extension of these fundamental compression methods based on the concept of adaptive partition. Experimental results show that algorithms based on adaptive partition can obtain a much higher compression ratio than algorithms based on fixed partition while maintaining the quality of the decompressed images.
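
The abstract does not spell out the adaptive partition scheme, so the sketch below shows only the general idea in quadtree form: a block is encoded as a single unit if it can be matched within a tolerance, and is otherwise split into four smaller blocks. The matching step here is a trivial stand-in (distance of the block from its own mean) rather than a real fractal domain search, and the tolerance and block-size limits are hypothetical.

```python
import numpy as np

def match_error(block):
    """Stand-in for the fractal matching step: how far the block is from its
    own mean (a flat block 'matches' perfectly).  A real encoder would search
    a domain pool for the best affine match."""
    return float(np.mean(np.abs(block - block.mean())))

def adaptive_partition(image, x, y, size, tol, min_size, codes):
    """Quadtree-style adaptive partition: split a block into four quadrants
    whenever its matching error exceeds `tol` and it is still larger than
    `min_size`; otherwise store a single code for the whole block."""
    block = image[y:y + size, x:x + size]
    err = match_error(block)
    if err <= tol or size <= min_size:
        codes.append((x, y, size, err))      # one transform covers the block
        return
    half = size // 2
    for dy in (0, half):
        for dx in (0, half):
            adaptive_partition(image, x + dx, y + dy, half, tol, min_size, codes)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    img[:32, :32] = 0.5                      # a smooth region needs only one large block
    codes = []
    adaptive_partition(img, 0, 0, 64, tol=0.05, min_size=8, codes=codes)
    print(len(codes), "blocks after adaptive partitioning")
```

Smooth regions end up covered by a few large blocks while detailed regions are refined, which is where the higher compression ratio of adaptive partitioning over a fixed partition comes from.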

Abstract:

A parallel time-domain algorithm is described for the time-dependent nonlinear Black-Scholes equation; the algorithm may be used to build financial analysis tools that help traders make rapid and systematic evaluations of buy/sell contracts. It is particularly suitable for problems that do not require fine detail at every intermediate time step, and hence applies well to the present problem.
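
The paper's parallel time-domain algorithm for the nonlinear equation is not reproduced here. As a minimal building block, the sketch below applies explicit finite-difference time stepping to the standard (linear) Black-Scholes equation for a European call; a time-domain decomposition would apply steps of this kind independently on each sub-interval and then reconcile the interface values. All market parameters and grid sizes are hypothetical.

```python
import numpy as np

def bs_explicit_call(sigma=0.2, r=0.05, strike=100.0, expiry=1.0,
                     s_max=300.0, n_s=300, n_t=5000):
    """Explicit finite-difference time stepping for a European call under the
    standard (linear) Black-Scholes PDE, stepped backward from expiry.

    Parameter values are illustrative.  n_t must be large enough for the
    explicit scheme to be stable (roughly dt <= dS^2 / (sigma^2 * s_max^2))."""
    ds = s_max / n_s
    dt = expiry / n_t
    s = np.linspace(0.0, s_max, n_s + 1)
    v = np.maximum(s - strike, 0.0)                    # payoff at expiry
    for m in range(n_t):
        tau = (m + 1) * dt                             # time remaining to expiry
        d2v = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / ds ** 2
        d1v = (v[2:] - v[:-2]) / (2.0 * ds)
        v_new = v.copy()
        v_new[1:-1] = v[1:-1] + dt * (0.5 * sigma ** 2 * s[1:-1] ** 2 * d2v
                                      + r * s[1:-1] * d1v
                                      - r * v[1:-1])
        v_new[0] = 0.0                                 # a call is worthless at S = 0
        v_new[-1] = s_max - strike * np.exp(-r * tau)  # deep in-the-money asymptote
        v = v_new
    return s, v

if __name__ == "__main__":
    s, v = bs_explicit_call()
    i = int(np.searchsorted(s, 100.0))
    # Should be close to the value given by the analytic Black-Scholes formula.
    print("approximate call value at S = 100:", round(float(v[i]), 2))
```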

Abstract:

This paper describes a Framework for e-Learning and presents the findings of a study investigating whether the use of Blended Learning can fulfill, or at least accommodate, some of the human requirements presently neglected by current e-Learning systems. The study evaluates the in-house system Teachmat and discusses how the use of Blended Learning has become increasingly prevalent as a result of the system's enhancement and expansion, its relationship to human and pedagogical issues, and both the positive and negative implications of this reality. [From the Authors]

Abstract:

This paper investigates the use of the acoustic emission (AE) monitoring technique for identifying the damage mechanisms present in paper that are associated with its production process. The microscopic structure of paper consists of a random mesh of paper fibres connected by hydrogen bonds, which implies the existence of two damage mechanisms: the failure of a fibre-fibre bond and the failure of a fibre. The paper describes a hybrid mathematical model that couples the mechanics of a mass-spring model to an acoustic wave propagation model in order to generate the acoustic signal emitted by complex structures of paper fibres under strain. The derivation of the mass-spring model can be found in [1,2], with details of the acoustic wave equation in [3,4].

The numerical implementation of the vibro-acoustic model is discussed in detail, with particular emphasis on the damping present in the numerical model. The hybrid model uses an implicit solver, which intrinsically introduces artificial damping into the solution. This artificial damping is shown to affect the frequency response of the mass-spring model, so certain restrictions on the simulation time step must be enforced for the model to produce physically accurate results.

The hybrid mathematical model is used to simulate small fibre networks and so provide information on the acoustic response of each damage mechanism. The simulated AEs are then analysed using a continuous wavelet transform (CWT), described in [5], which provides a two-dimensional time-frequency representation of the signal. The AEs from the two damage mechanisms show different characteristics in the CWT, so a fibre-fibre bond failure can be identified by the following criteria:

- the dominant frequency components of the AE must be at approximately 250 kHz or 750 kHz;
- the strongest frequency component may be at either approximately 250 kHz or 750 kHz;
- the duration of the frequency component at approximately 250 kHz is longer than that of the frequency component at approximately 750 kHz.

Similarly, the criteria for identifying a fibre failure are:

- the dominant frequency component of the AE must be greater than 800 kHz;
- the duration of the dominant frequency component must be less than 5.00E-06 seconds;
- the dominant frequency component must be present at the front of the AE.

Essentially, the failure of a fibre-fibre bond produces a low-frequency wave and the failure of a fibre produces a high-frequency pulse. Using these theoretical criteria, it is possible to train an intelligent classifier such as the Self-Organising Map (SOM) [6] on the experimental data. First, certain features must be extracted from the CWTs of the AEs for use in training the SOM: each CWT is divided into 200 windows, each 5.00E-06 s in duration and covering a 100 kHz frequency range, and the power ratio of each window is calculated and used as a feature. Having extracted the features, the SOM can be trained, although care is required so that both damage mechanisms are adequately represented in the training set; this is an issue with paper because the failure of fibre-fibre bonds is the prevalent damage mechanism. Once a suitable training set is found, the SOM can be trained and its performance analysed. For the SOM described in this work, there is a good chance that it will correctly classify the experimental AEs.
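
As a rough sketch of the feature-extraction step described above, the code below divides the CWT of an AE signal into fixed time-frequency windows and returns each window's share of the total power as a feature vector suitable for training a classifier such as a SOM. It uses PyWavelets' Morlet CWT as a stand-in for the transform of [5]; the sampling rate, record length, analysis frequencies and the toy test signal are all hypothetical, chosen only so that the default settings yield the 200 windows (20 windows of 5.00E-06 s by 10 bands of 100 kHz) mentioned in the text.

```python
import numpy as np
import pywt

def cwt_power_ratio_features(signal, fs, t_window=5e-6, f_window=100e3, f_max=1e6):
    """Divide the CWT of an AE signal into fixed time-frequency windows and
    return the power ratio of each window (window power / total power)."""
    dt = 1.0 / fs
    # Analysis frequencies: several per 100 kHz band, up to f_max (illustrative).
    freqs = np.arange(25e3, f_max, 25e3)
    scales = pywt.central_frequency("morl") / (freqs * dt)
    coefs, coef_freqs = pywt.cwt(signal, scales, "morl", sampling_period=dt)
    power = np.abs(coefs) ** 2                          # (n_freqs, n_samples)

    n_t = int(round(len(signal) * dt / t_window))       # number of time windows
    n_f = int(round(f_max / f_window))                  # number of frequency bands
    grid = np.zeros((n_f, n_t))
    t_idx = np.minimum((np.arange(len(signal)) * dt / t_window).astype(int), n_t - 1)
    f_idx = np.minimum((coef_freqs / f_window).astype(int), n_f - 1)
    for i in range(power.shape[0]):                     # accumulate power per window
        for j in range(power.shape[1]):
            grid[f_idx[i], t_idx[j]] += power[i, j]
    return (grid / grid.sum()).ravel()                  # power ratios, one per window

if __name__ == "__main__":
    fs = 5e6                                   # 5 MHz sampling (hypothetical)
    t = np.arange(0, 100e-6, 1.0 / fs)         # 100 us record -> 20 time windows
    # Toy AE: a ~250 kHz burst, i.e. a bond-failure-like low-frequency event.
    sig = np.exp(-((t - 30e-6) / 10e-6) ** 2) * np.sin(2 * np.pi * 250e3 * t)
    features = cwt_power_ratio_features(sig, fs)
    print(features.shape)                      # (200,) with these settings
```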

Abstract:

In this paper we revisit a study on e-Learning and suggestions for developing a framework for e-Learning. The original study in 2005 looked at e-Learning, specifically e-Tutoring, and the issues that surround it. However, re-examining these findings led to the realization that, whilst most courses were not fully "e", many were in essence using Blended Learning to varying degrees. It is concluded that the encroachment of a Blended Learning approach has been an indirect consequence of the extension and enhancement of the in-house course management technologies now employed. The pros and cons of this situation are identified and discussed. In addition, we summarize the positions of participants in the workshop on Developing a Framework for e-Learning.

Abstract:

We develop a fully polynomial-time approximation scheme (FPTAS) for minimizing the weighted total tardiness on a single machine, provided that all due dates are equal. The FPTAS is obtained by converting an especially designed pseudopolynomial dynamic programming algorithm.
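
The paper's especially designed dynamic program is not reproduced here. As an illustration of the kind of pseudopolynomial DP such an FPTAS typically starts from, the sketch below computes the optimal weighted total tardiness for a common due date exactly, in O(n^2 P) time with P the total processing time. It relies on the observation (used here for illustration, not taken from the paper) that some optimal schedule consists of the on-time jobs in any order, one "straddling" job, and the remaining tardy jobs in non-increasing w/p order; an FPTAS would additionally scale and round the cost values carried by such a DP, a step omitted here.

```python
def min_weighted_tardiness_common_due_date(jobs, d):
    """Exact pseudopolynomial DP for minimising sum of w_j * T_j on one machine
    when every job has the same due date d, where T_j = max(0, C_j - d).

    For each choice of straddling job, the remaining jobs are scanned in
    non-decreasing w/p order and either kept on time or prepended to the
    tardy tail at the end of the schedule; the DP state is the total
    processing time of that tail.  `jobs` is a list of (processing_time,
    weight) pairs with integer processing times."""
    n = len(jobs)
    P = sum(p for p, _ in jobs)
    INF = float("inf")
    best = INF
    for s in range(n):                               # enumerate the straddling job
        p_s, w_s = jobs[s]
        rest = sorted((jobs[k] for k in range(n) if k != s),
                      key=lambda jw: jw[1] / jw[0])  # non-decreasing w/p
        f = [INF] * (P + 1)                          # f[t]: best cost of a tail of length t
        f[0] = 0.0
        for p, w in rest:
            g = f[:]                                 # option 1: job stays on time
            for t in range(P - p + 1):               # option 2: job joins the tardy tail
                if f[t] < INF:
                    cost = f[t] + w * max(0, P - t - d)   # it completes at time P - t
                    if cost < g[t + p]:
                        g[t + p] = cost
            f = g
        for t in range(P + 1):                       # close the schedule with the straddler
            if f[t] < INF and P - p_s - t <= d:      # on-time jobs must really fit before d
                early = P - p_s - t                  # total length of the on-time block
                best = min(best, f[t] + w_s * max(0, early + p_s - d))
    return best

if __name__ == "__main__":
    jobs = [(11, 100), (1, 10)]   # (processing time, weight); hypothetical data
    print(min_weighted_tardiness_common_due_date(jobs, d=10))   # optimal cost: 120
```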

Abstract:

The paper considers an on-line single machine scheduling problem where the goal is to minimize the makespan. The jobs are partitioned into families and a setup is performed every time the machine starts processing a batch of jobs of the same family. The scheduler is aware of the number of families and knows the setup time of each family, although information about a job only becomes available when that job is released. We give a lower bound on the competitive ratio of any on-line algorithm. Moreover, for the case of two families, we provide an algorithm with a competitive ratio that achieves this lower bound. As the number of families increases, the lower bound approaches 2, and we give a simple algorithm with a competitive ratio of 2.