Abstract:
This paper describes a protocol for dynamically configuring wireless sensor nodes into logical clusters. The concept is to be able to inject an overlay configuration into an ad-hoc network of sensor nodes or similar devices, and have the network configure itself organically. The devices are arbitrarily deployed and initially have no information whatsoever concerning physical location, topology, density or neighbourhood. The Emergent Cluster Overlay (ECO) protocol is totally self-configuring and has several novel features, including nodes that self-determine their mobility based on patterns of neighbour discovery, and a target cluster size that is specified externally (by the sensor network application) rather than being directly coupled to radio communication range or node packing density. Cluster head nodes are automatically assigned as part of the cluster configuration process, at no additional cost. ECO is ideally suited to applications of wireless sensor networks in which localized groups of sensors act cooperatively to provide a service. This includes situations where service dilution is used (dynamically identifying redundant nodes to conserve their resources).
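The abstract does not give the protocol's message exchange, but the core idea of growing clusters up to an externally specified size can be pictured with a hypothetical sketch (not the ECO protocol itself; the target size, function name and discovery order are assumptions): nodes join an unfilled neighbouring cluster where possible, otherwise they found a new cluster and become its head.

```python
# Illustrative sketch only: greedy cluster formation up to an application-specified size.
import random

TARGET_CLUSTER_SIZE = 3  # assumed value, supplied by the sensor network application

def form_clusters(neighbours):
    """neighbours: dict mapping each node to the set of nodes within radio range."""
    cluster_of, clusters = {}, {}
    for node in neighbours:                       # arbitrary discovery order
        candidates = [cluster_of[n] for n in neighbours[node]
                      if n in cluster_of
                      and len(clusters[cluster_of[n]]) < TARGET_CLUSTER_SIZE]
        if candidates:
            head = random.choice(candidates)      # join an unfilled neighbouring cluster
        else:
            head = node                           # found a new cluster and become its head
            clusters[head] = []
        clusters[head].append(node)
        cluster_of[node] = head
    return clusters                               # {cluster_head: [member nodes]}

print(form_clusters({1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}))
```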
Abstract:
This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is a powerful and flexible policy-expression language, AGILE, which facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies, including signal processing, automated trend analysis and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables the different technologies to interoperate seamlessly, each performing various aspects of self-management within a single application. The various technologies are implemented as object components. Self-management behaviour is specified using the policy language semantics to bind the various components together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. Additional benefits include standardisation of the application programming interface, terminology and semantics, and the need for only a single point of embedding.
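As a purely illustrative sketch (the policy syntax, metric names and component names below are invented, not the AGILE language itself), policy-driven binding of self-management components can be pictured as a replaceable rule table that is evaluated at run time, so the composition of behaviours can change without recompiling the application.

```python
# Hypothetical illustration of policy-driven component binding; all names are assumptions.
policy = [
    # (condition on monitored metrics, name of the component to invoke)
    (lambda m: m["cpu_load"] > 0.8,     "trend_analyser"),
    (lambda m: m["signal_noise"] > 0.3, "signal_filter"),
]

components = {
    "trend_analyser": lambda m: print("running trend analysis on", m),
    "signal_filter":  lambda m: print("filtering noisy signal in", m),
}

def evaluate(metrics):
    # The policy list can be replaced at run time, making the
    # self-management behaviour dynamically composable.
    for condition, name in policy:
        if condition(metrics):
            components[name](metrics)

evaluate({"cpu_load": 0.9, "signal_noise": 0.1})
```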
Abstract:
The Logit-Logistic (LL), Johnson's SB, and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the skewness-kurtosis region covered, and each has been used for modelling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known and much-used bivariate generalization of SB, and the bivariate distributions with LL, SB and Beta marginals constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness of fit is compared using the minus log-likelihood (equivalent to Akaike's Information Criterion, AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P, and SB-2P.
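For reference, the standard Plackett construction joins two given marginals through a one-parameter copula; the exact parameterisation used in the paper may differ, and the association parameter ψ shown here is part of the sketch rather than taken from the source. The AIC definition is the standard one.

```latex
% Plackett construction: join marginals F_X(x), F_Y(y) through the one-parameter copula C_\psi.
\[
  H(x,y) = C_\psi\bigl(F_X(x),\,F_Y(y)\bigr), \qquad
  C_\psi(u,v) = \frac{S - \sqrt{S^{2} - 4\psi(\psi-1)\,uv}}{2(\psi-1)}, \qquad
  S = 1 + (\psi-1)(u+v),
\]
\[
  \text{with } C_\psi(u,v) \to uv \text{ (independence) as } \psi \to 1, \qquad
  \mathrm{AIC} = 2k - 2\ln L \text{ for } k \text{ estimated parameters.}
\]
```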
Abstract:
Semi-Lagrangian time integration is used with the finite difference method to provide accurate, stable prices for Asian options, with or without early exercise. These are combined with coordinate transformations for computational efficiency and compared with published results.
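As background on the technique (a minimal 1-D sketch, not the authors' Asian-option scheme; the grid, velocity and time step are arbitrary), a semi-Lagrangian step advances the solution by tracing each grid point back along its characteristic and interpolating the previous solution at the departure point, which is what gives the method its stability for large time steps.

```python
# Minimal 1-D semi-Lagrangian advection step: interpolate the old solution at
# the foot of each characteristic. Illustrative only.
import numpy as np

def semi_lagrangian_step(u, x, velocity, dt):
    """Advance u (values on grid x) by dt under du/dt + velocity * du/dx = 0."""
    departure = x - velocity * dt        # foot of the characteristic for each grid point
    return np.interp(departure, x, u)    # linear interpolation at the departure points

x = np.linspace(0.0, 1.0, 201)
u = np.exp(-200.0 * (x - 0.3) ** 2)      # initial profile
for _ in range(100):
    u = semi_lagrangian_step(u, x, velocity=0.5, dt=0.005)
```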
Abstract:
Solder joints are often the cause of failure in electronic devices, failing due to cyclic creep-induced ductile fatigue. This paper will review the modelling methods available to predict the lifetime of SnPb and SnAgCu solder joints under thermo-mechanical cycling conditions such as power cycling, accelerated thermal cycling and isothermal testing; the methods do not apply to other damage mechanisms such as vibration or drop-testing. Analytical methods, such as those recommended by the IPC, are covered; these are simple to use but limited in capability. Finite element modelling methods are reviewed, along with the necessary constitutive laws and fatigue laws for solder; these offer the most accurate predictions at the current time. Research on state-of-the-art damage mechanics methods is also presented, although these have not undergone enough experimental validation to be recommended at present.
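As an example of the kind of analytical fatigue law reviewed, a Coffin-Manson-type relation converts a cyclic plastic (creep) strain range into cycles to failure. The ductility coefficient and exponent below are placeholders for illustration, not recommended values for any particular alloy and not taken from the paper.

```python
# Coffin-Manson-type fatigue calculation: delta_gamma/2 = eps_f * (2*Nf)**c, solved for Nf.
# The constants are illustrative only.
def cycles_to_failure(plastic_strain_range, eps_f=0.325, c=-0.5):
    """Return the predicted number of cycles to failure Nf."""
    return 0.5 * (plastic_strain_range / (2.0 * eps_f)) ** (1.0 / c)

# Example: a cyclic shear strain range of 1% per thermal cycle.
print(cycles_to_failure(0.01))   # roughly 2.1e3 cycles with these placeholder constants
```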
Abstract:
Parallel processing techniques have been used in the past to provide high-performance computing resources for activities such as Computational Fluid Dynamics. This is normally achieved using specialized hardware and software, the expense of which would be difficult to justify for many fire engineering practices. In this paper, we demonstrate how typical office-based PCs attached to a local area network have the potential to offer the benefits of parallel processing with minimal costs associated with the purchase of additional hardware or software. A dynamic load balancing scheme was devised to allow the effective use of the software on heterogeneous PC networks. This scheme ensured that the impact of the parallel processing task on other computer users on the network was minimized, thus allowing practical parallel processing within a conventional office environment. Copyright © 2006 John Wiley & Sons, Ltd.
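A minimal sketch of the kind of dynamic load balancing described, assuming work is redistributed in proportion to each PC's recently measured processing rate so that faster, or less busy, machines receive more work (the function name and numbers are illustrative, not the paper's scheme):

```python
# Redistribute cells in proportion to the measured per-PC processing rate.
def rebalance(total_cells, measured_rates):
    """measured_rates: cells per second achieved by each PC over the last interval."""
    total_rate = sum(measured_rates)
    shares = [int(total_cells * r / total_rate) for r in measured_rates]
    shares[0] += total_cells - sum(shares)        # give any rounding remainder to PC 0
    return shares

# A busy office PC (120 cells/s) is given far fewer cells than an idle fast one (400 cells/s).
print(rebalance(100000, [250.0, 400.0, 120.0]))   # -> [32468, 51948, 15584]
```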
Abstract:
This paper reports on the findings of a study on improving interaction design for visually impaired students, focusing upon the cognitive criteria for information visualisation.
Abstract:
This paper will analyse two of the likely damage mechanisms present in a paper fibre matrix when placed under controlled stress conditions: fibre/fibre bond failure and fibre failure. The failure process associated with each damage mechanism will be presented in detail, focusing on the change in mechanical and acoustic properties of the surrounding fibre structure before and after failure. To present this complex process mathematically, geometrically simple fibre arrangements will be chosen, based on certain assumptions regarding the structure and strength of paper, to model the damage mechanisms. The fibre structures are then formulated in terms of a hybrid vibro-acoustic model based on a coupled mass/spring system and the pressure wave equation. The model will be presented in detail in the paper. The simulation of the simple fibre structures serves two purposes: it highlights the physical and acoustic differences of each damage mechanism before and after failure, and it shows the differences between the two damage mechanisms when compared with one another. The results of the simulations are given in the form of pressure wave contours, time-frequency graphs and Continuous Wavelet Transform (CWT) diagrams. The analysis of the results leads to criteria by which the two damage mechanisms can be identified. Using these criteria it was possible to verify the results of the simulations against experimental acoustic data. The models developed in this study are of specific practical interest in the paper-making industry, where acoustic sensors may be used to monitor continuous paper production. The same techniques may be adopted more generally to correlate acoustic signals to damage mechanisms in other fibre-based structures.
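A highly simplified sketch of the coupled mass/spring idea (not the paper's full hybrid vibro-acoustic model, and all parameter values are invented): two fibre masses are joined by a "bond" spring whose stiffness is released once its extension exceeds a failure threshold, producing a displacement transient of the kind a CWT could subsequently analyse.

```python
# Two-mass, three-spring toy model of a fibre/fibre bond failing under load.
import numpy as np

m, k_fibre, k_bond = 1.0e-6, 50.0, 20.0   # illustrative mass (kg) and stiffnesses (N/m)
fail_extension = 1.0e-4                   # illustrative bond failure threshold (m)
dt, steps = 1.0e-5, 2000

x = np.zeros(2)                           # displacements of the two fibre masses
v = np.array([0.0, 1.0])                  # second mass given an initial velocity
history = []
for _ in range(steps):
    ext = x[1] - x[0]                     # extension of the fibre/fibre bond spring
    if abs(ext) > fail_extension:
        k_bond = 0.0                      # bond failure: the coupling is lost
    f = np.array([-k_fibre * x[0] + k_bond * ext,
                  -k_fibre * x[1] - k_bond * ext])
    v = v + dt * f / m                    # semi-implicit (symplectic) Euler step
    x = x + dt * v
    history.append(x.copy())              # transient before/after failure
```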
Abstract:
This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique but also provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.
Abstract:
Computer egress simulation has the potential to be used in large-scale incidents to provide live advice to incident commanders. While there are many considerations which must be taken into account when applying such models to live incidents, one of the first concerns the computational speed of the simulations. No matter how important the insight provided by the simulation, numerical hindsight will not prove useful to an incident commander. Thus, for this type of application to be useful, it is essential that the simulation can be run many times faster than real time. Parallel processing is a method of reducing run times for very large computational simulations by distributing the workload amongst a number of CPUs. In this paper we examine the development of a parallel version of the buildingEXODUS software. The parallel strategy implemented is based on a systematic partitioning of the problem domain onto an arbitrary number of sub-domains. Each sub-domain is computed on a separate processor and runs its own copy of the EXODUS code. The software has been designed to work on typical office-based networked PCs but will also function on a Windows-based cluster. Two evaluation scenarios using the parallel implementation of EXODUS are described: a large open area and a 50-storey high-rise building scenario. Speed-ups of up to 3.7 are achieved using up to six computers, with the high-rise building evacuation simulation achieving run times 6.4 times faster than real time.
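A toy illustration of the partitioning strategy (not the buildingEXODUS implementation; grid size, processor count and the strip-based decomposition are assumptions of the sketch): the problem domain is split into contiguous sub-domains, one per processor, and each processor simulates only the occupants inside its own region.

```python
# Split a floor-plan grid into contiguous row strips, one sub-domain per processor.
import numpy as np

def partition_strips(n_rows, n_cols, n_procs):
    """Return an array mapping each cell (row, col) to the rank of its owning processor."""
    owner = np.empty((n_rows, n_cols), dtype=int)
    rows_per_proc = np.array_split(np.arange(n_rows), n_procs)
    for rank, rows in enumerate(rows_per_proc):
        owner[rows, :] = rank
    return owner

owner = partition_strips(n_rows=120, n_cols=80, n_procs=6)
print([int((owner == r).sum()) for r in range(6)])   # cells per sub-domain: [1600, ...]
```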
Abstract:
This study investigates the use of computer-modelled versus directly experimentally determined fire hazard data for assessing survivability within buildings, using evacuation models incorporating Fractional Effective Dose (FED) models. The objective is to establish a link between effluent toxicity, measured using a variety of small- and large-scale tests, and building evacuation. For the scenarios under consideration, fire simulation is typically used to determine the time at which non-survivable conditions develop within the enclosure, for example, when the smoke or toxic effluent layer falls below a critical height which is deemed detrimental to evacuation, or when the radiative fluxes reach a critical value leading to the onset of flashover. The evacuation calculation would then be used to determine whether people within the structure could evacuate before these critical conditions develop.
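A minimal sketch of the FED idea, assuming the simplest Ct-product form for a single toxicant (the critical dose value and concentration series below are illustrative, not recommended engineering values): the dose received in each time step is expressed as a fraction of an incapacitating dose and accumulated, with FED reaching 1 marking non-survivable exposure.

```python
# Accumulate a Fractional Effective Dose from a CO concentration history.
def time_to_incapacitation(co_ppm_series, dt_min, critical_ct=35000.0):
    """co_ppm_series: CO concentration (ppm) at successive time steps of dt_min minutes."""
    fed = 0.0
    for step, c in enumerate(co_ppm_series):
        fed += c * dt_min / critical_ct      # fraction of the critical ppm*min dose this step
        if fed >= 1.0:
            return (step + 1) * dt_min       # minutes of exposure until FED reaches 1
    return None                              # exposure remained below the critical dose

print(time_to_incapacitation([800, 1500, 3000, 6000, 8000, 8000, 8000], dt_min=1.0))  # -> 7.0
```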
Abstract:
A toxicity model based on dividing the computational domain into two parts, a control region (CR) and a transport region (TR), for species calculation was recently developed. The model can be incorporated with either the heat source approach or the eddy dissipation model (EDM). The work described in this paper is a further application of the toxicity model with modifications of the EDM for vitiated fires. In the modified EDM, chemical reaction occurs only within the CR. This is consistent with the approach used in the species concentration calculations within the toxicity model, in which yields of combustion products change only within the CR. A vitiated large room-corridor fire, in which the carbon monoxide (CO) concentrations are very high and the temperatures are relatively low at locations distant from the original fire source, is simulated using the modified EDM coupled with the toxicity model. Compared with the EDM, the modified EDM provides significant improvements in the predictions of temperatures at remote locations. Predictions of species concentrations at various locations follow the measured trends. Good agreement between the measured and predicted species concentrations is obtained at the vitiated fire stage.
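A conceptual sketch of the modification (not the actual CFD code; the rate constant, stoichiometric ratio and cell fields are assumptions): an eddy-dissipation-type, mixing-limited reaction rate is evaluated only in cells flagged as belonging to the control region, while transport-region cells carry species without reacting, consistent with yields changing only within the CR.

```python
# Mixing-limited reaction rate restricted to control-region (CR) cells.
def reaction_rate(cell, A=4.0, s=4.0):
    """Eddy-dissipation-type fuel consumption rate; zero outside the control region."""
    if not cell["in_control_region"]:
        return 0.0                                  # TR cell: transport only, no reaction
    rho, eps_over_k = cell["rho"], cell["eps"] / cell["k"]
    return A * rho * eps_over_k * min(cell["Y_fuel"], cell["Y_oxygen"] / s)

cr_cell = {"in_control_region": True, "rho": 1.1, "eps": 0.5, "k": 0.2,
           "Y_fuel": 0.02, "Y_oxygen": 0.20}
print(reaction_rate(cr_cell))                       # -> 0.22 with these illustrative values
```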
Abstract:
This work explores the impact of response time distributions on high-rise building evacuation. The analysis utilises response times extracted from printed accounts and interviews of evacuees from the WTC North Tower evacuation of 11 September 2001. Evacuation simulations produced using these “real” response time distributions are compared with simulations produced using instant and engineering response time distributions. Results suggest that while typical engineering approximations to the response time distribution may produce reasonable evacuation times for up to 90% of the building population, using this approach may underestimate total evacuation times by as much as 61%. These observations are applicable to situations involving large high-rise buildings in which travel times are generally expected to be greater than response times.
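A hedged sketch of the kind of comparison described, assuming total evacuation time is approximated by the slowest occupant's response time plus travel time, and using a log-normal response time distribution with invented parameters (not fitted WTC values and not the simulation actually used in the study):

```python
# Compare an instant-response assumption with a distributed (log-normal) response time.
import numpy as np

rng = np.random.default_rng(0)
n_occupants = 1000
travel = rng.uniform(5.0, 25.0, n_occupants)              # minutes to walk out (illustrative)

instant_total = travel.max()                              # response time assumed zero
response = rng.lognormal(mean=1.0, sigma=0.8, size=n_occupants)
distributed_total = (response + travel).max()

print(f"instant-response estimate:     {instant_total:.1f} min")
print(f"distributed-response estimate: {distributed_total:.1f} min")
print(f"underestimate: {100 * (1 - instant_total / distributed_total):.0f}%")
```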