11 results for response time
in Greenwich Academic Literature Archive - UK
Abstract:
The International Maritime Organisation (IMO) has adopted the use of computer simulation to assist in the assessment of the assembly time for passenger ships. A key parameter required for this analysis, and specified as part of the IMO guidelines, is the passenger response time distribution. It is demonstrated in this paper that the IMO specified response time distribution assumes an unrealistic mathematical form. This unrealistic form can lead to serious congestion issues being overlooked in the evacuation analysis and to incorrect conclusions concerning the suitability of a vessel design. In light of these results, it is vital that the IMO undertake research to generate passenger response time data suitable for use in the evacuation analysis of passenger ships. Until this type of data becomes readily available, it is strongly recommended that, rather than continuing to use the artificial and unrepresentative form of the response time distribution, the IMO should adopt plausible and more realistic response time data derived from land-based applications. © 2005: Royal Institution of Naval Architects.
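As an illustration of why the mathematical form of the response time distribution matters, the sketch below (with entirely illustrative parameters, not the IMO-specified values) compares how many passengers begin to move within the same short interval under a uniform distribution and under a skewed log-normal one; the peak of that count is what drives congestion at stairs and assembly stations.

```python
# A minimal sketch, assuming illustrative parameters only: a uniform response time
# distribution spreads responses evenly in time, while a skewed log-normal clusters
# them, so the peak number of passengers starting to move in any 30 s bin differs.
import numpy as np

rng = np.random.default_rng(0)
n, bin_width = 2_000, 30.0

uniform_rt = rng.uniform(0.0, 300.0, n)                              # assumed 0-300 s range
lognormal_rt = rng.lognormal(mean=np.log(60.0), sigma=0.7, size=n)   # assumed skewed alternative

for name, rt in [("uniform", uniform_rt), ("log-normal", lognormal_rt)]:
    counts, _ = np.histogram(rt, bins=np.arange(0.0, rt.max() + bin_width, bin_width))
    print(f"{name:>10}: peak of {counts.max()} passengers responding in one 30 s bin")
```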
Abstract:
The passenger response time distributions adopted by the International Maritime Organisation (IMO) in their assessment of the assembly time for passenger ships involve two key assumptions. The first is that the response time distribution assumes the form of a uniform random distribution; the second concerns the actual response times. These two assumptions are core to the validity of the IMO analysis but are not based on real data, being the recommendations of an IMO committee. In this paper, response time data collected from assembly trials conducted at sea on a real passenger vessel using actual passengers are presented and discussed. Unlike the IMO specified response time distributions, the data collected from these trials display a log-normal distribution, similar to that found in land-based environments. Based on these data, response time distributions for use in the IMO assembly analysis for the day and night scenarios are suggested.
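A minimal sketch of how trial data of this kind might be turned into a usable distribution, assuming the log-normal form reported above; the response times below are invented for illustration and are not the trial data.

```python
# Fit a log-normal to observed response times by estimating the mean and standard
# deviation of log(t), then draw synthetic response times from the fitted model.
# The observed values are hypothetical, not the sea-trial measurements.
import numpy as np

observed = np.array([22.0, 35.0, 41.0, 58.0, 63.0, 80.0, 95.0, 140.0, 210.0])  # seconds, made up

log_t = np.log(observed)
mu, sigma = log_t.mean(), log_t.std(ddof=1)     # parameters of the fitted log-normal

rng = np.random.default_rng(1)
synthetic_rt = rng.lognormal(mean=mu, sigma=sigma, size=1000)   # for use in a simulation
print(f"fitted mu={mu:.2f}, sigma={sigma:.2f}, "
      f"fitted median={np.exp(mu):.1f} s, synthetic median={np.median(synthetic_rt):.1f} s")
```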
Abstract:
This work explores the impact of response time distributions on high-rise building evacuation. The analysis utilises response times extracted from printed accounts and interviews of evacuees from the WTC North Tower evacuation of 11 September 2001. Evacuation simulations produced using these “real” response time distributions are compared with simulations produced using instant and engineering response time distributions. Results suggest that while typical engineering approximations to the response time distribution may produce reasonable evacuation times for up to 90% of the building population, using this approach may underestimate the total evacuation time by as much as 61%. These observations are applicable to situations involving large high-rise buildings in which travel times are generally expected to be greater than response times.
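The sketch below uses made-up numbers (not the WTC data) to illustrate the mechanism: a truncated engineering-style response distribution and a long-tailed one can give broadly similar times for the 90th-percentile occupant while differing substantially for the last occupant out, which is what sets the total evacuation time.

```python
# A minimal sketch with hypothetical travel and response times: total time out for
# each occupant is taken as response time plus travel time; compare a truncated
# "engineering" response distribution against a long-tailed one.
import numpy as np

rng = np.random.default_rng(2)
n = 5_000
travel = rng.uniform(300.0, 1200.0, n)                  # assumed travel times, seconds

engineering_rt = rng.uniform(0.0, 300.0, n)             # truncated approximation
long_tailed_rt = rng.lognormal(np.log(120.0), 1.0, n)   # long-tailed alternative

for name, rt in [("engineering", engineering_rt), ("long-tailed", long_tailed_rt)]:
    total = rt + travel
    print(f"{name:>12}: 90th pct={np.percentile(total, 90):6.0f} s, "
          f"last occupant out={total.max():6.0f} s")
```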
Abstract:
The performance of loadsharing algorithms for heterogeneous distributed systems is investigated by simulation. The systems considered are networks of workstations (nodes) which differ in processing power. Two parameters are proposed for characterising system heterogeneity, namely the variance and skew of the distribution of processing power among the network nodes. A variety of networks are investigated, with the same number of nodes and total processing power, but with the processing power distributed differently among the nodes. Two loadsharing algorithms are evaluated, at overall system loadings of 50% and 90%, using job response time as the performance metric. Comparison is made with the ideal situation of ‘perfect sharing’, where it is assumed that the communication delays are zero and that complete knowledge is available about job lengths and the loading at the different nodes, so that an arriving job can be sent to the node where it will be completed in the shortest time. The algorithms studied are based on those already in use for homogeneous networks, but were adapted to take account of system heterogeneity. Both algorithms take into account the differences in the processing powers of the nodes in their location policies, but differ in the extent to which they ‘discriminate’ against the slower nodes. It is seen that the relative performance of the two is strongly influenced by the system utilisation and the distribution of processing power among the nodes.
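A minimal sketch of the ‘perfect sharing’ baseline and of the two heterogeneity parameters mentioned above, using hypothetical node powers and job lengths; arrivals are treated as back-to-back and communication delays as zero, in line with the idealisation described.

```python
# Perfect-sharing placement: with full knowledge of job lengths and node backlogs,
# send each arriving job to the node where it would finish earliest. Node powers
# and job lengths are assumed for illustration.
import numpy as np

rng = np.random.default_rng(3)
node_power = np.array([4.0, 2.0, 1.0, 1.0])   # relative processing powers (assumed)
backlog = np.zeros_like(node_power)           # outstanding work per node, in work units

# Characterise heterogeneity as in the paper: variance and skew of the power distribution.
mean_p = node_power.mean()
variance = node_power.var()
skew = ((node_power - mean_p) ** 3).mean() / node_power.std() ** 3
print(f"heterogeneity: variance={variance:.2f}, skew={skew:.2f}")

for job_len in rng.exponential(10.0, size=20):       # hypothetical job lengths
    finish = (backlog + job_len) / node_power        # completion time of this job on each node
    best = int(np.argmin(finish))                    # node that would finish it soonest
    backlog[best] += job_len

print("remaining work time per node:", np.round(backlog / node_power, 1))
```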
Abstract:
This paper presents a proactive approach to load sharing and describes the architecture of a scheme, Concert, based on this approach. A proactive approach is characterized by a shift of emphasis from reacting to load imbalance to avoiding its occurrence. In contrast, in a reactive load sharing scheme, activity is triggered when a processing node is either overloaded or underloaded. The main drawback of this approach is that a load imbalance is allowed to develop before costly corrective action is taken. Concert is a load sharing scheme for loosely-coupled distributed systems. Under this scheme, load and task behaviour information is collected and cached in advance of when it is needed. Concert uses Linux as a platform for development. Implemented partially in kernel space and partially in user space, it achieves transparency to users and applications whilst keeping the extent of kernel modifications to a minimum. Non-preemptive task transfers are used exclusively, motivated by lower complexity, lower overheads and faster transfers. The goal is to minimize the average response-time of tasks. Concert is compared with other schemes by considering the level of transparency it provides with respect to users, tasks and the underlying operating system.
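The sketch below illustrates the proactive idea in a few lines; it is not the Concert implementation (which lives partly in kernel space), and the node names and load probe are hypothetical. Load information is gathered and cached ahead of need, so a placement decision only reads the cache rather than querying nodes when a task arrives.

```python
# Proactive load sharing, minimally sketched: a periodically refreshed cache of
# per-node load, consulted at placement time without any on-demand query.
import random
import time

class LoadCache:
    """Periodically refreshed snapshot of per-node load."""
    def __init__(self, nodes, refresh_s=1.0):
        self.nodes = nodes
        self.refresh_s = refresh_s
        self.loads = {n: 0.0 for n in nodes}
        self.last_refresh = 0.0

    def refresh_if_stale(self, probe):
        # Collect load information in advance of when it is needed.
        if time.monotonic() - self.last_refresh >= self.refresh_s:
            self.loads = {n: probe(n) for n in self.nodes}
            self.last_refresh = time.monotonic()

    def least_loaded(self):
        # Placement decision reads only the cached snapshot.
        return min(self.loads, key=self.loads.get)

def fake_probe(node):          # stand-in for a real load probe
    return random.random()

cache = LoadCache(["node-a", "node-b", "node-c"])
cache.refresh_if_stale(fake_probe)
print("place next task on:", cache.least_loaded())
```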
Abstract:
This paper concerns a preliminary numerical simulation study of the evacuation of the World Trade Centre North Tower on 11 September 2001 using the buildingEXODUS evacuation simulation software. The analysis makes use of response time data derived from a study of survivor accounts appearing in the public domain. While exact geometric details of the building were not available for this study, the building geometry was approximated from descriptions available in the public domain. The study attempts to reproduce the events of 11 September 2001 and pursue several ‘what if’ questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom.
Abstract:
This article concerns an investigation of the full scale evacuation of a building with a configuration similar to that of the World Trade Center (WTC) North Tower using computer simulation. A range of evacuation scenarios is explored in order to better understand the evacuation of the WTC on 11 September 2001. The analysis makes use of response time data derived from a study of published WTC survivor accounts. Geometric details of the building are obtained from architects' plans while the total building population used in the scenarios is based on estimates produced by the National Institute of Standards and Technology formal investigation into the evacuation. This paper attempts to approximate the events of 11 September 2001 and pursue several `what if' questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom. More generally, this paper explores issues associated with the practical limits of building size that can be expected to be efficiently evacuated using stairs alone.
Abstract:
Fluid structure interaction, as applied to flexible structures, has wide application in diverse areas such as flutter in aircraft, flow in elastic pipes and blood vessels, and extrusion of metals through dies. However, a comprehensive computational model of these multi-physics phenomena is a considerable challenge. Until recently, work in this area focused on one phenomenon and represented the behaviour of the other more simply, even to the extent, in metal forming for example, of ignoring the deformation of the die entirely. More recently, strategies for solving the full coupling between the fluid and solid mechanics behaviour have been developed. Conventionally, the computational modelling of fluid structure interaction is problematical since computational fluid dynamics (CFD) is solved using finite volume (FV) methods while computational structural mechanics (CSM) is based entirely on finite element (FE) methods. In the past, the concurrent but rather disparate development paths of the finite element and finite volume methods have resulted in numerical software tools for CFD and CSM that are different in almost every respect, and progress in modelling the emerging multi-physics problem of fluid structure interaction in a consistent manner is therefore frustrated. Unless the fluid-structure coupling is one-way, very weak, or both, transferring and filtering data from one mesh and solution procedure to another may lead to significant problems in computational convergence. Using a novel three-phase technique, the full interaction between the fluid and the dynamic structural response is represented. The procedure is demonstrated on some challenging applications in complex three-dimensional geometries involving aircraft flutter, metal forming and blood flow in arteries.
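One ingredient of the data-transfer issue noted above can be sketched very simply: passing a field (here, pressure) from fluid-mesh points on an interface onto a non-matching structural mesh by interpolation. The meshes and values below are made up, and a real coupled solver must also conserve loads and iterate the exchange to convergence.

```python
# A minimal, one-dimensional sketch of transferring interface data between
# non-matching fluid and structural meshes by linear interpolation.
import numpy as np

fluid_x = np.linspace(0.0, 1.0, 11)                          # fluid-mesh points on the interface
fluid_pressure = 1.0e5 + 2.0e3 * np.sin(np.pi * fluid_x)     # hypothetical pressure field

struct_x = np.linspace(0.0, 1.0, 7)                          # coarser structural mesh, same interface
struct_pressure = np.interp(struct_x, fluid_x, fluid_pressure)

print(np.round(struct_pressure, 1))
```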
Abstract:
The aim of integrating computational mechanics (FEA and CFD) and optimization tools is to dramatically speed up the design process in different application areas concerning reliability in electronic packaging. Design engineers in the electronics manufacturing sector may use these tools to predict key design parameters and configurations (i.e. material properties, product dimensions, design at PCB level, etc.) that will guarantee the required product performance. In this paper a modeling strategy coupling computational mechanics techniques with numerical optimization is presented and demonstrated with two problems. The integrated modeling framework is obtained by coupling the multi-physics analysis tool PHYSICA with the numerical optimization package VisualDOC into a fully automated design tool for applications in electronic packaging. Thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability and lifetime under thermal cycling. A thermal management design based on multi-physics analysis with coupled thermal-flow-stress modeling is also discussed. The Response Surface Modeling approach, in conjunction with Design of Experiments statistical tools, is demonstrated and subsequently used by the numerical optimization techniques as part of this modeling framework. Predictions for reliable electronic assemblies are achieved in an efficient and systematic manner.
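A minimal sketch of the response-surface idea described above (not PHYSICA or VisualDOC, and with an invented one-variable model): evaluate an expensive analysis at a small design-of-experiments sample, fit a cheap quadratic surface to the results, then optimise the surface instead of the full model.

```python
# Response surface modelling in one design variable: fit y ~ a*x^2 + b*x + c to a
# few sampled evaluations of a stand-in "expensive" model, then take the surrogate
# optimum. The model and sample are hypothetical.
import numpy as np

def expensive_model(x):               # stand-in for a coupled thermo-mechanical simulation
    return (x - 0.3) ** 2 + 0.05 * np.sin(8.0 * x)

doe_x = np.linspace(0.0, 1.0, 5)      # small design-of-experiments sample
doe_y = expensive_model(doe_x)

A = np.column_stack([doe_x ** 2, doe_x, np.ones_like(doe_x)])
(a, b, c), *_ = np.linalg.lstsq(A, doe_y, rcond=None)   # least-squares quadratic fit

x_opt = -b / (2.0 * a)                # minimiser of the fitted quadratic
print(f"surrogate optimum near x={x_opt:.3f}, predicted y={a*x_opt**2 + b*x_opt + c:.4f}")
```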
Abstract:
The September 11th 2001 impact on the World Trade Centre (WTC) resulted in one of the most significant evacuations of a high-rise building in modern times. The UK High-rise Evacuation Evaluation Database (HEED) study aimed to capture and collate the experiences and behaviours of WTC evacuees in a database that would facilitate and encourage future research, which in turn would influence the design, construction and use of safer built environments. A data elicitation tool designed for the purpose comprised a pre-interview questionnaire followed by a one-to-one interview protocol consisting of free-flow narratives and semi-structured interviews of WTC evacuees. This paper, one in a series dealing with issues relating to the successful evacuations of towers 1 and 2, focuses on cue recognition and response patterns within WTC1. Results are presented by vertical floor clusters and include information regarding the cues experienced, activities prior and subsequent to occupants first becoming aware that something was wrong, perceived personal risk, time taken to respond and the inter-relationships between them. The results indicate differences in occupant activities across the floor clusters and suggest that these differences can be explained in terms of the perception of risk and the nature and extent of the cues received by the participants.
Abstract:
Pulse design is investigated for time-reversal (TR) imaging as applied to ultrawideband (UWB) breast cancer detection. It has previously been shown that a suitably designed UWB pulse may help to improve imaging performance for a single-tumor breast phantom with predetermined lesion properties. The current work considers the following more general and practical situations: the presence of multiple malignancies with unknown tumor size and dielectric properties. Four pulse selection criteria are proposed, each focusing on one of the following aspects: eliminating signal clutter generated by tissue inhomogeneities, canceling mutual interference among tumors, improving image resolution, and suppressing artifacts created by sidelobes of the target response. By applying the proposed criteria, the shape parameters of UWB waveforms with desirable characteristics are identified through a search over all possible pulses. A simulation example using a numerical breast phantom comprising two tumors and a structured clutter distribution demonstrates the effectiveness of the proposed approach. Specifically, a tradeoff between image resolution and signal-to-clutter contrast (SCC) is observed in the selection of the excitation waveforms.
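The selection-by-search idea above can be sketched with an entirely synthetic forward model: score each candidate pulse shape parameter against a crude proxy for one criterion (signal-to-clutter contrast) and keep the best. The pulse family, the toy target response and the clutter model below are assumptions for illustration, not the paper's formulation.

```python
# Grid search over a pulse shape parameter (Gaussian width) using a toy
# signal-to-clutter score. Everything here is a synthetic stand-in.
import numpy as np

t = np.linspace(-2e-9, 2e-9, 401)                       # time axis, seconds

def gaussian_pulse(width):
    return np.exp(-(t / width) ** 2)

def signal_to_clutter(pulse, rng):
    tumor_response = np.convolve(pulse, pulse, mode="same")            # toy target response
    clutter = np.convolve(pulse, rng.normal(0, 0.2, t.size), mode="same")  # toy clutter
    return np.max(np.abs(tumor_response)) / (np.std(clutter) + 1e-12)

rng = np.random.default_rng(4)
widths = np.linspace(50e-12, 500e-12, 10)               # candidate shape parameters
scores = [signal_to_clutter(gaussian_pulse(w), rng) for w in widths]
best = widths[int(np.argmax(scores))]
print(f"best pulse width by this proxy criterion: {best*1e12:.0f} ps")
```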