991 results for "Application technology"
Abstract:
In this paper we present some work concerned with the development and testing of a simple solid fuel combustion model incorporated within a Computational Fluid Dynamics (CFD) framework. The model is intended for use in engineering applications of fire field modeling and represents an extension of this technique to situations involving the combustion of solid fuels. The CFD model is coupled with a simple thermal pyrolysis model for combustible solid non-charring fuels, a six-flux radiation model and an eddy-dissipation model for gaseous combustion. The model is then used to simulate a series of small-scale room fire experiments in which the target solid fuel is polymethylmethacrylate. The numerical predictions produced by this coupled model are found to be in very good agreement with experimental data. Furthermore, numerical predictions of the relationship between the air entrained into the fire compartment and the ventilation factor produce a characteristic linear correlation with a constant of proportionality of 0.38 kg/(s·m^(5/2)). The simulation results also suggest that the model is capable of predicting the onset of "flashover"-type behavior within the fire compartment.
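To make the reported correlation concrete: the entrained air mass flow scales linearly with the ventilation factor A·√H, where A is the vent opening area in m² and H the opening height in m, i.e. ṁ ≈ 0.38·A·√H. A minimal sketch in Python; the function name and the example doorway dimensions are illustrative, not taken from the paper:

import math

def entrained_air_flow(vent_area_m2, vent_height_m):
    """Entrained air mass flow (kg/s) from the reported linear correlation."""
    ventilation_factor = vent_area_m2 * math.sqrt(vent_height_m)  # units: m^(5/2)
    return 0.38 * ventilation_factor  # constant of proportionality in kg/(s*m^(5/2))

# Example: a 0.8 m x 2.0 m doorway (area 1.6 m^2, height 2.0 m) -> ~0.86 kg/s
print(entrained_air_flow(1.6, 2.0))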
Abstract:
This paper presents data relating to occupant pre-evacuation times from university and hospital outpatient facilities. Although the two occupancies are entirely different, they do employ relatively similar procedures: members of staff sweep areas to encourage individuals to evacuate. However, the manner in which the dependent population reacts to these procedures is quite different. In the hospital case, the patients only evacuated once a member of the nursing staff had instructed them to do so, while in the university evacuation, the students were less dependent upon the actions of the staff, with over 50% of them evacuating with no prior prompting. In addition, the student pre-evacuation time was found to be dependent on their level of engagement in various activities.
Abstract:
Computer-based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, the implementation of safer and more rigorous certification criteria, cabin crew training and post-mortem accident investigation. As the risk of personal injury and the costs involved in performing full-scale certification trials are high, the development and use of these evacuation modelling tools are essential. Furthermore, evacuation models provide insight into the evacuation process that is impossible to derive from a single certification trial. The airEXODUS evacuation model has been under development since 1989 with support from the UK CAA and the aviation industry. In addition to describing the capabilities of the airEXODUS evacuation model, this paper describes the findings of a recent CAA project aimed at investigating model accuracy in predicting past certification trials. Furthermore, airEXODUS is used to examine issues related to the Blended Wing Body (BWB) and Very Large Transport Aircraft (VLTA). These radical new aircraft concepts pose considerable challenges to designers, operators and certification authorities. BWB concepts involving one or two decks with possibly four or more aisles offer even greater challenges. Can the largest exits currently available cope with the passenger flow arising from four or five aisles? Do we need to consider new concepts in exit design? Should the main aisle be made wider to accommodate more passengers? In this paper we discuss various evacuation-related issues associated with VLTA and BWB aircraft and demonstrate how computer-based evacuation models can be used to investigate these issues through examination of aisle/exit configurations for BWB cabin layouts.
Abstract:
This paper describes a Framework for e-Learning and presents the findings of a study investigating whether the use of Blended Learning can fulfill, or at least accommodate, some of the human requirements presently neglected by current e-Learning systems. The study evaluates the in-house system Teachmat, and discusses how the use of Blended Learning has become increasingly prevalent as a result of the system's enhancement and expansion, its relationship to human and pedagogical issues, and both the positive and negative implications of this development. [From the Authors]
Abstract:
The efficient remediation of heavy metal-bearing sediment has been one of the top priorities of ecosystem protection. Cement-based solidification/stabilization (s/s) is an option for reducing the mobility of heavy metals in the sediment and the subsequent hazard to human beings and animals. This work uses sodium carbonate as an internal carbon source for accelerated carbonation and gaseous CO2 as an external carbon source to overcome the deleterious effects of heavy metals on strength development and to improve the effectiveness of s/s of heavy metal-bearing sediment. In addition to compressive strength and porosity measurements, leaching tests following the Chinese solid waste extraction procedure for leaching toxicity (sulfuric acid and nitric acid method, HJ/T299-2007), the German leaching procedure (DIN 38414-S4) and the US toxicity characteristic leaching procedure (TCLP) were conducted. The experimental results indicated that the sediment solidified by accelerated carbonation was capable of meeting all performance criteria for disposal at a Portland cement dosage of 10 wt.% and a solid/water ratio of 1:1. The concentrations of mercury and of the other heavy metals in the leachates were below 0.10 mg/L and 5 mg/L, respectively, complying with the Chinese regulatory levels (GB5085-2007). Compared to hydration alone, accelerated carbonation improved the compressive strength of the solidified sediment by more than 100% and significantly reduced the leaching concentrations of heavy metals. It is considered that accelerated carbonation technology combining Na2CO3 and CO2 may be practically applied to cement-based s/s of heavy metal-bearing sediment.
Abstract:
A particle swarm optimisation approach is used to determine the accuracy and experimental relevance of six disparate cure kinetics models. The cure processes of two commercially available thermosetting polymer materials utilised in microelectronics manufacturing applications have been studied using a differential scanning calorimetry (DSC) system. Numerical models have been fitted to the experimental data using a particle swarm optimisation algorithm, which enables the ultimate accuracy of each of the models to be determined. The particle swarm optimisation approach to model fitting proves to be relatively rapid and effective in determining the optimal coefficient set for the cure kinetics models. Results indicate that the single-step autocatalytic model is able to represent the curing process more accurately than more complex models, with ultimate accuracy likely to be limited by inaccuracies in the processing of the experimental data.
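As a rough illustration of the fitting procedure described above, the sketch below fits the single-step autocatalytic model dα/dt = k·α^m·(1 - α)^n to synthetic rate data with a basic particle swarm; the data, parameter bounds and swarm hyper-parameters are placeholders, not the paper's DSC measurements or algorithm settings:

import numpy as np

def cure_rate(alpha, k, m, n):
    # Single-step autocatalytic cure kinetics: d(alpha)/dt = k * alpha^m * (1 - alpha)^n
    return k * np.power(alpha, m) * np.power(1.0 - alpha, n)

rng = np.random.default_rng(0)
alpha = np.linspace(0.05, 0.95, 50)                                # degree of cure
rate = cure_rate(alpha, 0.8, 0.4, 1.3) + rng.normal(0, 0.002, 50)  # synthetic "DSC" rates

def sse(p):
    return np.sum((cure_rate(alpha, *p) - rate) ** 2)  # sum of squared residuals

lo = np.array([0.01, 0.0, 0.0])  # lower bounds for (k, m, n)
hi = np.array([5.0, 3.0, 3.0])   # upper bounds for (k, m, n)
pos = rng.uniform(lo, hi, (30, 3))  # 30 particles
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([sse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(200):
    r1, r2 = rng.random((30, 3)), rng.random((30, 3))
    # Inertia plus attraction toward personal and global bests.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([sse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()
print("fitted (k, m, n):", np.round(gbest, 3))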
Abstract:
This paper presents an approach for detecting local damage in large-scale frame structures by utilizing regularization methods for ill-posed problems. A direct relationship between the change in stiffness caused by local damage and the measured modal data for the damaged structure is developed, based on the perturbation method for structural dynamic systems. Thus, the measured incomplete modal data can be adopted directly in damage identification without requiring model reduction techniques, and common regularization methods can be effectively employed to solve the resulting equations. Damage indicators are appropriately chosen to reflect both the location and severity of local damage in individual components of frame structures, such as in brace members and at beam-column joints. The Truncated Singular Value Decomposition (TSVD) solution incorporating the Generalized Cross Validation (GCV) method is introduced to evaluate the damage indicators for cases where realistic errors exist in the modal data measurements. Results for a 16-story building model structure show that structural damage can be correctly identified at a detailed level using only limited information from the measured noisy modal data for the damaged structure.
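To make the regularized solution step concrete: the perturbation formulation leads to a linear system S·d ≈ Δz between the damage indicators d and the measured modal changes Δz, and TSVD stabilizes the ill-posed inversion by discarding the small singular values that amplify measurement noise. A minimal sketch, using a randomly generated ill-conditioned sensitivity matrix and a fixed truncation level in place of the paper's GCV-selected one:

import numpy as np

def tsvd_solve(S, dz, k):
    # Truncated SVD solution of S @ d = dz, keeping the k largest singular values.
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ dz) / s[:k])

rng = np.random.default_rng(1)
# Illustrative ill-conditioned sensitivity of modal data to 16 damage indicators.
S = rng.standard_normal((40, 16)) * np.logspace(0, -4, 16)  # column scales decay
d_true = np.zeros(16)
d_true[[3, 11]] = 0.2                      # stiffness loss in two members
dz = S @ d_true + rng.normal(0, 1e-6, 40)  # noisy measured modal changes

d_est = tsvd_solve(S, dz, k=10)  # the paper selects the truncation level via GCV
print(np.round(d_est, 3))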
Abstract:
This is the first report from ALT’s new Annual Survey, launched in December 2014. The survey was primarily for ALT members (individuals, or staff at an organisation which is an organisational member), but it could also be filled in by others, perhaps those interested in taking out membership. The report and data highlight emerging work areas that are important to the survey respondents. Analysis of the survey responses indicates a number of areas ALT should continue to support and develop. Priorities for the membership are ‘Intelligent use of learning technology’ and ‘Research and practice’; aligned to this is the value respondents place on communication via the ALT Newsletter/News, social media and Research in Learning Technology. The survey also reveals that ‘Data and Analytics’ and ‘Open Education’ are areas which the majority of respondents find increasingly important, and the community may therefore benefit from development opportunities ALT can provide. The survey is also a reminder that ALT has an essential role in enabling members to develop research and practice in areas which might be considered of minority interest. For example, whilst the majority of respondents did not indicate areas such as ‘Digital and Open Badges’ and ‘Game Based Learning’ as important, there are still members who consider these areas very significant and increasingly valuable, and as such ALT will continue to better support these groups within our community. Whilst ALT has conducted previous surveys of its membership, this is the first iteration in this form. ALT has committed to surveying the sector on an annual basis, refining the core question set while trying to preserve the opportunity for longitudinal analysis.
Abstract:
This is the second report of ALT’s Annual Survey, which was open for responses between 1 December 2015 and 17 January 2016. In total, 196 responses have been analysed as part of this report. As a number of core questions remain unchanged between this and the 2014 edition of the survey, the analysis also includes data from the previous survey.
Abstract:
To seize the potential of serious games, the RAGE project - funded by the Horizon 2020 Programme of the European Commission - will make available an interoperable set of advanced technology components (software assets) that support game studios in serious game development. This paper describes the overall software architecture and design conditions that are needed for the easy integration and reuse of such software assets in existing game platforms. Based on the component-based software engineering paradigm, the RAGE architecture takes into account the portability of assets to different operating systems, different programming languages and different game engines. It avoids dependencies on external software frameworks and minimizes code that may hinder integration with game engine code. Furthermore, it relies on a limited set of standard software patterns and well-established coding practices. The RAGE architecture has been successfully validated by implementing and testing basic software assets in four major programming languages (C#, C++, Java and TypeScript/JavaScript). A demonstrator implementation of asset integration with an existing game engine was created and validated. The presented RAGE architecture paves the way for the large-scale development and application of cross-engine reusable software assets for enhancing the quality and diversity of serious gaming.
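By way of illustration only: the paper validates assets in C#, C++, Java and TypeScript/JavaScript, but the decoupling idea can be sketched in a few lines of Python. The asset keeps engine-specific services behind a narrow interface supplied by the host, so the asset itself carries no game engine or framework dependencies. All class and method names here are invented for the sketch and are not RAGE's actual API:

from abc import ABC, abstractmethod

class LoggerBridge(ABC):
    """Narrow interface the host game engine implements for the asset."""
    @abstractmethod
    def log(self, message: str) -> None: ...

class DifficultyAdaptationAsset:
    """Self-contained 'software asset': no engine imports, only the bridge."""
    def __init__(self, bridge: LoggerBridge):
        self.bridge = bridge
        self.difficulty = 0.5

    def update(self, player_success_rate: float) -> float:
        # Nudge difficulty toward the observed success rate.
        self.difficulty += 0.1 * (player_success_rate - self.difficulty)
        self.bridge.log(f"difficulty adjusted to {self.difficulty:.2f}")
        return self.difficulty

class ConsoleLogger(LoggerBridge):
    """Host-side implementation; a real engine would route this to its own logging."""
    def log(self, message: str) -> None:
        print(f"[engine] {message}")

asset = DifficultyAdaptationAsset(ConsoleLogger())
asset.update(0.8)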
Abstract:
The emergence of Grid computing technology has opened up an unprecedented opportunity for biologists to share and access data, resources and tools in an integrated environment, leading to a greater chance of knowledge discovery. GeneGrid is a Grid computing framework that seamlessly integrates a myriad of heterogeneous resources spanning multiple administrative domains and locations. It provides scientists with an integrated environment for streamlined access to a number of bioinformatics programs and databases through a simple and intuitive interface. It acts as a virtual bioinformatics laboratory by allowing scientists to create, execute and manage workflows that represent bioinformatics experiments. A number of cooperating Grid services interact in an orchestrated manner to provide this functionality. This paper gives insight into the details of the architecture, components and implementation of GeneGrid.
Abstract:
To investigate the possible biotechnological application of the phenomenon of low pH-inducible phosphate uptake and polyphosphate accumulation, previously reported using pure microbial cultures under laboratory conditions, a 2000 L activated sludge pilot plant was constructed at a municipal sewage treatment works. When operated as a single-stage reactor, this removed more than 60% of influent phosphate from primary settled sewage at a pH of 6.0, as opposed to approximately 30% at the works' typical operational pH of 7.0-7.3, yet without any deleterious effect on other treatment parameters. At these pH values the phosphorus content of the sludge was, respectively, 4.2% and 2.0%. At pH 6.0 some 33.9% of sludge microbial cells were observed to contain polyphosphate inclusions; the corresponding value at pH 7.0 was 18.7%. Such a process may serve as a prototype for the development of alternative biological and chemical options for phosphate removal from wastewaters.
Abstract:
This paper examines the relation between technical possibilities, liberal logics, and the concrete reconfiguration of markets. It focuses on the enrolling of innovations in communication and information technologies into the markets traditionally dominated by stock exchanges. With the development of capacities to trade on-screen, the power of incumbent market makers has been challenged as a less stable array of competing quasi-public and private marketplaces emerges. Developing a case study of the Toronto Stock Exchange, I argue that narrative emphasis on the performative power of sociotechnical innovations, the deterritorialisation of financial relations, and the erosion of state capacities needs qualification. A case is made for the importance of developing an understanding of: the spaces of encounter between emerging social technologies and property rights, rules of exchange, and structures of governance; and the interplay of orderings of different institutional composition and spatial reach in the reconfiguration of market architectures. Only then can a better grasp be gained of the evolving dynamics between making markets, the regulatory powers of the state, and their delimitations.
Abstract:
The paper deals with the use of a food-grade coagulant (guar gum) as a replacement for synthetic coagulants in potable water treatment.