Abstract:
The purpose of this paper is to demonstrate the potential of the EXODUS evacuation model in building environments. The latest PC/workstation version of EXODUS is described and is also applied to a large hypothetical supermarket/restaurant complex measuring 50 m x 40 m. A range of scenarios is presented where population characteristics (such as size, individual travel speeds, and individual response times), and enclosure configuration characteristics (such as number of exits, size of exits, and opening times of exits) are varied. The results demonstrate a wide range of occupant behavior including overtaking, queuing, redirection, and conflict avoidance. Evacuation performance is measured by a number of model predicted parameters including individual exit flow rates, overall evacuation flow rates, total evacuation time, average evacuation time per occupant, average travel distance, and average wait time. The simulations highlight the profound impact that variations in individual travel speeds and occupant response times have in determining the overall evacuation performance.
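The EXODUS software itself is not reproduced here, but the kind of individual-based simulation the abstract describes can be illustrated with a heavily simplified sketch: occupants on a coarse grid, each with an individual travel speed and response time, head for their nearest exit and queue when blocked, and the run reports the same style of performance measures (total evacuation time, average time per occupant, average wait time). All geometry, parameter ranges and behavioural rules below are illustrative assumptions, not EXODUS's.

```python
# Minimal agent-based evacuation sketch (not the EXODUS model itself).
# Each occupant has an individual travel speed and response time; occupants
# head for their nearest exit, move cell by cell, and queue when blocked.
import math
import random

random.seed(1)

WIDTH, HEIGHT = 50, 40          # enclosure footprint in metres (1 m grid cells)
EXITS = [(0, 20), (49, 20)]     # two hypothetical exits on opposite walls
DT = 0.5                        # time step, s

class Occupant:
    def __init__(self):
        self.pos = (random.randrange(WIDTH), random.randrange(HEIGHT))
        self.speed = random.uniform(0.8, 1.5)       # travel speed, m/s
        self.response = random.uniform(0.0, 30.0)   # response time, s
        self.budget = 0.0                           # distance earned but not yet walked, m
        self.out_time = None
        self.wait = 0.0

def step(occupants, occupied, t):
    for o in occupants:
        if o.out_time is not None or t < o.response:
            continue
        exit_cell = min(EXITS, key=lambda e: math.dist(e, o.pos))
        if o.pos == exit_cell:                      # reached an exit: leave the enclosure
            o.out_time = t
            occupied.discard(o.pos)
            continue
        o.budget += o.speed * DT
        while o.budget >= 1.0 and o.pos != exit_cell:
            dx = exit_cell[0] - o.pos[0]
            dy = exit_cell[1] - o.pos[1]
            nxt = (o.pos[0] + (dx > 0) - (dx < 0),
                   o.pos[1] + (dy > 0) - (dy < 0))
            if nxt in occupied and nxt not in EXITS:
                o.wait += DT                        # blocked by another occupant: queue
                break
            occupied.discard(o.pos)
            occupied.add(nxt)
            o.pos = nxt
            o.budget -= 1.0

occupants = [Occupant() for _ in range(200)]
occupied = {o.pos for o in occupants}
t = 0.0
while any(o.out_time is None for o in occupants) and t < 600.0:
    step(occupants, occupied, t)
    t += DT

out = [o for o in occupants if o.out_time is not None]
print(f"evacuated {len(out)}/{len(occupants)} occupants in {t:.1f} s")
print(f"average evacuation time per occupant: {sum(o.out_time for o in out) / len(out):.1f} s")
print(f"average wait time: {sum(o.wait for o in occupants) / len(occupants):.1f} s")
```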
Abstract:
This paper presents a genetic algorithm for finding a constrained minimum spanning tree. The problem is of relevance in the design of minimum cost communication networks, where there is a need to connect all the terminals at a user site to a terminal concentrator in a multipoint (tree) configuration, while ensuring that link capacity constraints are not violated. The approach used maintains a distinction between genotype and phenotype, which produces superior results to those found using a direct representation in a previous study.
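As a rough illustration of the genotype/phenotype distinction the abstract refers to, the sketch below evolves vectors of random keys (the genotype) and uses a greedy decoder to turn each vector into a capacity-feasible spanning tree rooted at the concentrator (the phenotype). The instance, the decoder and the GA settings are illustrative assumptions, not the encoding used in the paper.

```python
# GA sketch with an indirect (genotype -> phenotype) encoding for a toy
# capacitated minimum spanning tree instance. Node 0 is the concentrator.
import random

random.seed(0)

N = 12                      # terminals 1..N
CAPACITY = 4                # max total demand carried by any link into the concentrator
demand = {i: 1 for i in range(1, N + 1)}
coords = {i: (random.random(), random.random()) for i in range(N + 1)}

def cost(a, b):
    (xa, ya), (xb, yb) = coords[a], coords[b]
    return ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5

def decode(keys):
    """Greedy decoder: attach terminals in key order to the cheapest feasible parent."""
    order = sorted(range(1, N + 1), key=lambda i: keys[i - 1])
    group = {0: None}       # node -> which concentrator subtree it belongs to
    load = {}               # subtree id -> total demand
    edges, total = [], 0.0
    for node in order:
        best = None
        for parent in group:
            g = node if parent == 0 else group[parent]
            if parent != 0 and load[g] + demand[node] > CAPACITY:
                continue    # attaching here would overload this concentrator link
            c = cost(parent, node)
            if best is None or c < best[0]:
                best = (c, parent, g)
        c, parent, g = best
        group[node] = g
        load[g] = load.get(g, 0) + demand[node]
        edges.append((parent, node))
        total += c
    return total, edges

def evolve(pop_size=40, generations=200):
    pop = [[random.random() for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: decode(k)[0])
        elite = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            child[random.randrange(N)] = random.random()    # mutation: re-draw one key
            children.append(child)
        pop = elite + children
    return decode(min(pop, key=lambda k: decode(k)[0]))

best_cost, best_edges = evolve()
print(f"best tree cost: {best_cost:.3f}")
print("edges (parent, child):", best_edges)
```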
Abstract:
Belief revision is a well-researched topic within AI. We argue that the new model of distributed belief revision as discussed here is suitable for general modelling of judicial decision making, along with the extant approach as known from jury research. The new approach to belief revision is of general interest, whenever attitudes to information are to be simulated within a multi-agent environment with agents holding local beliefs yet interacting with, and influencing, other agents who are deliberating collectively. In the approach proposed, it is the entire group of agents, not an external supervisor, that integrates the different opinions. This is achieved through an election mechanism. The principle of "priority to the incoming information", as known from AI models of belief revision, is problematic when applied to factfinding by a jury. The present approach incorporates a computable model for local belief revision, such that a principle of recoverability is adopted. By this principle, any previously held belief must belong to the current cognitive state if consistent with it. For the purposes of jury simulation such a model calls for refinement. Yet we claim it constitutes a valid basis for an open system where other AI functionalities (or outer stimuli) could attempt to handle other aspects of the deliberation which are more specific to legal narratives, to argumentation in court, and then to the debate among the jurors.
Abstract:
Belief revision is a well-researched topic within Artificial Intelligence (AI). We argue that the new model of belief revision as discussed here is suitable for general modelling of judicial decision making, along with the extant approach as known from jury research. The new approach to belief revision is of general interest, whenever attitudes to information are to be simulated within a multi-agent environment with agents holding local beliefs yet by interacting with, and influencing, other agents who are deliberating collectively. The principle of 'priority to the incoming information', as known from AI models of belief revision, is problematic when applied to factfinding by a jury. The present approach incorporates a computable model for local belief revision, such that a principle of recoverability is adopted. By this principle, any previously held belief must belong to the current cognitive state if consistent with it. For the purposes of jury simulation such a model calls for refinement. Yet, we claim, it constitutes a valid basis for an open system where other AI functionalities (or outer stimuli) could attempt to handle other aspects of the deliberation which are more specific to legal narratives, to argumentation in court, and then to the debate among the jurors.
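A toy sketch of the recoverability principle mentioned in both abstracts: after each revision by incoming information, any previously held belief that is again consistent with the current cognitive state is restored to it. Beliefs are reduced to literals with an explicit table of mutually exclusive pairs, and the juror framing is purely illustrative; this is not the paper's computable model.

```python
# Toy propositional belief revision with a recoverability pass.
EXCLUSIVE = {frozenset({"alibi_credible", "seen_at_scene"})}   # illustrative conflict table

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def consistent_with(state, lit):
    if negate(lit) in state:
        return False
    return all(frozenset({lit, other}) not in EXCLUSIVE for other in state)

class Agent:
    def __init__(self, initial):
        self.state = set(initial)
        self.history = set(initial)              # everything the agent has ever believed

    def revise(self, incoming):
        """Priority to incoming information, followed by a recoverability pass."""
        self.state = {b for b in self.state if consistent_with({incoming}, b)}
        self.state.add(incoming)
        self.history.add(incoming)
        # Recoverability: re-admit previously held beliefs that are now consistent.
        for old in sorted(self.history - self.state):
            if consistent_with(self.state, old):
                self.state.add(old)

juror = Agent({"alibi_credible"})
juror.revise("seen_at_scene")      # conflicting testimony retracts the alibi belief
print(juror.state)                 # {'seen_at_scene'}
juror.revise("~seen_at_scene")     # the testimony is later discredited ...
print(juror.state)                 # ... and the alibi belief is recovered
```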
Abstract:
This paper presents the computational modelling of welding phenomena within a versatile numerical framework. The framework embraces models from both the fields of computational fluid dynamics (CFD) and computational solid mechanics (CSM). With regard to the CFD modelling of the weld pool fluid dynamics, heat transfer and phase change, cell-centred finite volume (FV) methods are employed. Additionally, novel vertex-based FV methods are employed with regard to the elasto-plastic deformation associated with the CSM. The FV methods are included within an integrated modelling framework, PHYSICA, which can be readily applied to unstructured meshes. The modelling techniques are validated against a variety of reference solutions.
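The coupled weld-pool and deformation models cannot be condensed into a few lines, but the cell-centred finite volume style referred to in the abstract can be shown on the smallest possible example: steady one-dimensional heat conduction with fixed end temperatures. Material data and mesh size below are arbitrary.

```python
# Minimal cell-centred finite volume sketch: steady 1-D heat conduction with
# fixed temperatures at both ends.
import numpy as np

L = 0.1            # domain length, m
k = 50.0           # thermal conductivity, W/(m K)
n = 20             # number of control volumes
T_left, T_right = 1500.0, 300.0   # boundary temperatures, K

dx = L / n
A = np.zeros((n, n))
b = np.zeros(n)

for i in range(n):
    # Flux balance over cell i: the sum of face conductances times temperature
    # differences is zero (no volumetric source in this sketch).
    if i == 0:                       # west face is the boundary (half-cell distance)
        A[i, i] += 2 * k / dx
        b[i] += 2 * k / dx * T_left
    else:
        A[i, i] += k / dx
        A[i, i - 1] -= k / dx
    if i == n - 1:                   # east face is the boundary
        A[i, i] += 2 * k / dx
        b[i] += 2 * k / dx * T_right
    else:
        A[i, i] += k / dx
        A[i, i + 1] -= k / dx

T = np.linalg.solve(A, b)
print("cell-centre temperatures (K):")
print(np.round(T, 1))
```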
Abstract:
A vertex-based finite volume (FV) method is presented for the computational solution of quasi-static solid mechanics problems involving material non-linearity and infinitesimal strains. The problems are analysed numerically with fully unstructured meshes that consist of a variety of two- and three-dimensional element types. A detailed comparison between the vertex-based FV and the standard Galerkin FE methods is provided with regard to discretization, solution accuracy and computational efficiency. For some problem classes a direct equivalence of the two methods is demonstrated, both theoretically and numerically. However, for other problems some interesting advantages and disadvantages of the FV formulation over the Galerkin FE method are highlighted.
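The equivalence noted in the abstract can be seen in a minimal setting: for linear elements on a uniform one-dimensional mesh, the Galerkin FE stiffness matrix for the Laplace operator coincides with the flux-balance matrix of a vertex-based FV scheme whose control volumes are bounded by element midpoints. The sketch below assembles both matrices and checks that they agree; it is only an illustration, not the paper's two- and three-dimensional formulation.

```python
# 1-D illustration of the FV/FE equivalence: both discretise d/dx(k dT/dx) = 0.
import numpy as np

n_nodes = 6
h = 0.2           # element length
k = 10.0          # conductivity

# Galerkin FE assembly: each linear element contributes (k/h) * [[1,-1],[-1,1]].
K_fe = np.zeros((n_nodes, n_nodes))
for e in range(n_nodes - 1):
    ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K_fe[e:e + 2, e:e + 2] += ke

# Vertex-based FV assembly: the control volume around each node is bounded by
# element midpoints; the flux across each midpoint face is k*(T_j - T_i)/h.
K_fv = np.zeros((n_nodes, n_nodes))
for e in range(n_nodes - 1):
    i, j = e, e + 1
    c = k / h
    K_fv[i, i] += c; K_fv[i, j] -= c
    K_fv[j, j] += c; K_fv[j, i] -= c

print("max |K_fe - K_fv| =", np.abs(K_fe - K_fv).max())   # 0.0 in this case
```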
Abstract:
In this paper, a method for the integration of several numerical analysis techniques used in microsystems design and failure analysis is presented. The analysis techniques are categorized into four groups: high-fidelity analysis tools, i.e. the finite element (FE) method; fast analysis tools, i.e. reduced order modeling (ROM); optimization tools; and probability-based analysis tools. The characteristics of these four tools are investigated, the interactions between them are discussed, and a methodology for coupling them is offered. This methodology consists of three stages, namely reduced order modeling, deterministic optimization and probabilistic optimization. Using this methodology, a case study for the optimization of a solder joint is conducted. It is shown that these analysis techniques interact with and complement one another; applying them in combination exploits the advantages of each and satisfies a variety of design requirements. The case study shows that the coupling of the different tools proposed in this paper is effective and efficient, and that it is highly relevant to the design and reliability analysis of microsystems.
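The three-stage methodology described in the abstract can be caricatured in a few lines: a handful of expensive evaluations feed a reduced order (surrogate) model, a deterministic optimum is found on the cheap surrogate, and a probabilistic stage then accounts for parameter scatter around the candidate designs. In the sketch below a simple analytic function stands in for the finite element model of the solder joint, and the surrogate form, bounds and scatter are all illustrative assumptions.

```python
# Schematic three-stage pipeline: ROM -> deterministic optimisation -> probabilistic optimisation.
import numpy as np

rng = np.random.default_rng(0)

def fe_model(x):
    """Stand-in for an expensive FE evaluation: 'fatigue damage' vs. joint height x (mm)."""
    return (x - 0.35) ** 2 + 0.02 * np.sin(25 * x)

# Stage 1: reduced order model -- fit a quadratic surrogate to a few FE samples.
samples = np.linspace(0.1, 0.6, 7)
rom = np.poly1d(np.polyfit(samples, fe_model(samples), deg=2))

# Stage 2: deterministic optimisation on the cheap surrogate.
candidates = np.linspace(0.1, 0.6, 2001)
x_det = candidates[np.argmin(rom(candidates))]

# Stage 3: probabilistic optimisation -- account for manufacturing scatter in x
# and pick the design with the lowest mean damage plus a penalty on spread.
def robust_cost(x_nominal, sigma=0.02, n=2000):
    x = rng.normal(x_nominal, sigma, n)
    d = rom(x)
    return d.mean() + 3.0 * d.std()

coarse = np.linspace(0.1, 0.6, 101)
x_rob = min(coarse, key=robust_cost)
print(f"deterministic optimum: {x_det:.3f} mm")
print(f"robust optimum:        {x_rob:.3f} mm")
```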
Abstract:
This paper presents two multilevel refinement algorithms for the capacitated clustering problem. Multilevel refinement is a collaborative technique capable of significantly aiding the solution process for optimisation problems. The central methodologies of the technique are filtering solutions from the search space and reducing the level of problem detail to be considered at each level of the solution process. The first multilevel algorithm uses a simple tabu search while the other executes a standard local search procedure. Both algorithms demonstrate that the multilevel technique is capable of aiding the solution process for this combinatorial optimisation problem.
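A compact sketch of the multilevel idea applied to a toy capacitated clustering instance: items are merged pairwise into coarser "super-items", the coarse problem is solved greedily, and the solution is projected back and refined at the original level with a simple relocation local search. The instance, capacities and the plain hill-climbing refinement (standing in for the paper's tabu search) are illustrative assumptions.

```python
# Multilevel refinement sketch for a toy capacitated clustering instance.
import math
import random

random.seed(3)

N_POINTS, N_CLUSTERS, CAPACITY = 60, 4, 18
points = [((random.random(), random.random()), 1) for _ in range(N_POINTS)]  # (position, demand)

def centroid(members):
    w = sum(wt for _, wt in members)
    return (sum(x * wt for (x, _), wt in members) / w,
            sum(y * wt for (_, y), wt in members) / w)

def cost(assign, items):
    total = 0.0
    for c in range(N_CLUSTERS):
        members = [items[i] for i in range(len(items)) if assign[i] == c]
        if members:
            cen = centroid(members)
            total += sum(wt * math.dist(pos, cen) for pos, wt in members)
    return total

def coarsen(items):
    """Pair each item with its nearest unmatched neighbour to form super-items."""
    unmatched, merged, parent = set(range(len(items))), [], {}
    while unmatched:
        i = unmatched.pop()
        if unmatched:
            j = min(unmatched, key=lambda k: math.dist(items[i][0], items[k][0]))
            unmatched.remove(j)
            (xi, yi), wi = items[i]
            (xj, yj), wj = items[j]
            w = wi + wj
            merged.append((((xi * wi + xj * wj) / w, (yi * wi + yj * wj) / w), w))
            parent[i] = parent[j] = len(merged) - 1
        else:
            merged.append(items[i])
            parent[i] = len(merged) - 1
    return merged, parent

def greedy_assign(items):
    """Assign heaviest super-items first to the least-loaded feasible cluster."""
    load, assign = [0] * N_CLUSTERS, [None] * len(items)
    for i in sorted(range(len(items)), key=lambda k: -items[k][1]):
        feasible = [c for c in range(N_CLUSTERS) if load[c] + items[i][1] <= CAPACITY]
        best = min(feasible, key=lambda c: load[c])
        assign[i], load[best] = best, load[best] + items[i][1]
    return assign

def refine(assign, items):
    """Hill-climbing relocation moves that respect the capacity constraint."""
    load = [0] * N_CLUSTERS
    for i, c in enumerate(assign):
        load[c] += items[i][1]
    improved = True
    while improved:
        improved, base = False, cost(assign, items)
        for i in range(len(items)):
            for c in range(N_CLUSTERS):
                if c == assign[i] or load[c] + items[i][1] > CAPACITY:
                    continue
                old = assign[i]
                assign[i] = c
                new = cost(assign, items)
                if new + 1e-9 < base:
                    load[old] -= items[i][1]
                    load[c] += items[i][1]
                    base, improved = new, True
                else:
                    assign[i] = old
    return assign

coarse, parent = coarsen(points)
coarse_assign = refine(greedy_assign(coarse), coarse)
fine_assign = [coarse_assign[parent[i]] for i in range(N_POINTS)]   # project down a level
print(f"projected coarse solution cost: {cost(fine_assign, points):.3f}")
print(f"after fine-level refinement:    {cost(refine(fine_assign, points), points):.3f}")
```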
Abstract:
A common problem faced by fire safety engineers in the field of evacuation analysis concerns the optimal design of an arbitrarily complex structure in order to minimise evacuation times. How does the engineer determine the best solution? In this study we introduce the concept of numerical optimisation techniques to address this problem. The study makes use of the buildingEXODUS evacuation model coupled with classical optimisation theory, including Design of Experiments (DoE) and Response Surface Models (RSM). We demonstrate the technique using a relatively simple problem: determining the optimal location for a single exit in a square room.
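The optimisation loop the abstract outlines can be sketched without the evacuation model itself: a crude analytic estimate of evacuation time stands in for a buildingEXODUS run, a small Design of Experiments samples a few exit positions along one wall, a quadratic Response Surface Model is fitted to the results, and the optimum is read off the cheap surface. Room size, occupant layout and the time estimate are illustrative assumptions.

```python
# DoE + RSM sketch for placing a single exit on one wall of a 10 m x 10 m room.
import numpy as np

rng = np.random.default_rng(7)
occupants = rng.uniform(0, 10, size=(150, 2))      # (x, y) occupant positions

def evac_time(exit_x):
    """Stand-in 'simulation': walking time of the farthest occupant plus a
    queueing term set by the exit flow rate."""
    d = np.hypot(occupants[:, 0] - exit_x, occupants[:, 1])   # exit on wall y = 0
    walking = d.max() / 1.2                                   # 1.2 m/s travel speed
    queueing = len(occupants) / 1.33                          # ~1.33 persons/s through the exit
    return walking + queueing

# Design of Experiments: five exit positions along the wall.
doe_x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
doe_t = np.array([evac_time(x) for x in doe_x])

# Response Surface Model: quadratic fit of evacuation time vs. exit position.
rsm = np.poly1d(np.polyfit(doe_x, doe_t, deg=2))

# Optimise on the cheap surface instead of re-running the simulator.
grid = np.linspace(0.5, 9.5, 1001)
best_x = grid[np.argmin(rsm(grid))]
print(f"RSM-predicted optimal exit position: x = {best_x:.2f} m")
print(f"stand-in simulator time there:       {evac_time(best_x):.1f} s")
```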
Abstract:
There has been a significant increase in interest in parents who are considered to be outside of normative discourses, specifically the 'moral panic' relating to an increase in the demography of teenage mothers in the UK (SEU, 1999, 2003; Swann et al., 2003). Recently, research has turned to the experiences of parenting from the father's perspective (Daniel and Taylor, 1999, 2001), although there remains a significant gap focusing on the experiences of young fathers. It is argued by Swann et al. (2003) that young fathers are a difficult group to access, and this has limited the amount and type of studies conducted, with many studies on young parents looking at the role of the father through the eyes of the mother. This contribution focuses on the use of narrative interviews with a small group of young, vulnerable, socially excluded fathers who are users of the statutory social services in the UK. The article looks specifically at the ethical and practical challenges of working with this group and offers insights into the use of the narrative method and the ethical dilemmas resulting from it.
Abstract:
Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction there comes the need of acknowledging learning game experiences also in the context of educational assessment. Learning analytics provides new opportunities for supporting assessment in and of educational games. We give an overview of current learning analytics methods in this field and reflect on existing challenges. An approach of providing reusable software assets for interaction assessment and evaluation in games is presented. This is part of a broader initiative of making available advanced methodologies and tools for supporting applied game development.
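As a purely hypothetical illustration of what a reusable interaction-assessment asset might look like, the sketch below logs loosely xAPI-style interaction events emitted by game code and aggregates a few simple learning-analytics indicators per player. The event fields, verbs and metrics are invented for the example and do not describe any actual RAGE asset API.

```python
# Hypothetical reusable interaction-assessment asset: event logging plus
# per-player aggregation of simple learning-analytics indicators.
import time
from collections import defaultdict

class InteractionTracker:
    def __init__(self):
        self.events = []

    def log(self, player, verb, obj, success=None):
        """Record one interaction event (loosely xAPI-style actor/verb/object)."""
        self.events.append({"t": time.time(), "player": player,
                            "verb": verb, "object": obj, "success": success})

    def summary(self, player):
        """Aggregate simple per-player indicators for assessment dashboards."""
        evts = [e for e in self.events if e["player"] == player]
        attempts = [e for e in evts if e["verb"] == "attempted"]
        solved = [e for e in evts if e["verb"] == "completed" and e["success"]]
        by_object = defaultdict(int)
        for e in attempts:
            by_object[e["object"]] += 1
        return {"attempts": len(attempts),
                "completed": len(solved),
                "success_rate": len(solved) / len(attempts) if attempts else 0.0,
                "attempts_per_task": dict(by_object)}

tracker = InteractionTracker()
tracker.log("learner-1", "attempted", "bridge-puzzle")
tracker.log("learner-1", "attempted", "bridge-puzzle")
tracker.log("learner-1", "completed", "bridge-puzzle", success=True)
print(tracker.summary("learner-1"))
```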
Abstract:
The established (digital) leisure game industry is historically one dominated by large international hardware vendors (e.g. Sony, Microsoft and Nintendo) and major publishers, supported by a complex network of development studios, distributors and retailers. New modes of digital distribution and development practice are challenging this business model, and the leisure games industry landscape is one experiencing rapid change. The established (digital) leisure games industry, at least anecdotally, appears reluctant to participate actively in the applied games sector (Stewart et al., 2013). There are a number of potential explanations as to why this may be the case, including a concentration on large-scale consolidation of their (proprietary) platforms, content, entertainment brand and credibility, which arguably could be weakened by association with the conflicting notion of purposefulness (in applied games) in market niches without clear business models or quantifiable returns on investment. In contrast, the applied games industry exhibits the characteristics of an emerging, immature industry, namely: weak interconnectedness, limited knowledge exchange, an absence of harmonising standards, limited specialisations, limited division of labour and arguably insufficient evidence of the products' efficacies (Stewart et al., 2013; Garcia Sanchez, 2013), and could, arguably, be characterised as a dysfunctional market. To test these assertions, the Realising an Applied Gaming Ecosystem (RAGE) project will develop a number of self-contained gaming assets to be actively employed in the creation of a number of applied games to be implemented and evaluated as regional pilots across a variety of European educational, training and vocational contexts. RAGE is a European Commission Horizon 2020 project with twenty (pan-European) partners from industry, research and education, with the aim of developing, transforming and enriching advanced technologies from the leisure games industry into self-contained gaming assets (i.e. solutions showing economic value potential) that could support a variety of stakeholders including teachers, students and, significantly, game studios interested in developing applied games. RAGE will provide these assets together with a large quantity of high-quality knowledge resources through a self-sustainable Ecosystem: a social space that connects research, the gaming industries, intermediaries, education providers, policy makers and end-users in order to stimulate the development and application of applied games in educational, training and vocational contexts. The authors identify barriers (real and perceived) and opportunities facing stakeholders in engaging, exploring new emergent business models, and developing, establishing and sustaining an applied gaming ecosystem in Europe.
Abstract:
The EU-based industry for non-leisure games is an emerging business. As such it is still fragmented and needs to achieve critical mass to compete globally. Nevertheless, its growth potential is widely recognized. To become competitive, the relevant applied gaming communities and SMEs require support in generating innovation potential. The European project Realizing an Applied Gaming Ecosystem (RAGE) aims to address this challenge. RAGE will help by making available an interoperable set of advanced technology assets tuned to applied gaming, as well as proven practices of using asset-based applied games in various real-world contexts, and finally centralized access to a wide range of applied gaming software modules, services and related documents, media and educational resources within an online community portal called the RAGE Ecosystem. The Ecosystem is based on an integrational, user-centered approach to Knowledge Management and Innovation Processes, realized as a service-based implementation.