952 results for C. Computational simulation
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests in particular applied to the analysis of crime factors. The relationship between pairs of factors has also been studied extensively, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, so a deeper insight into its complex nature is needed. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic-algorithm and dynamic-reduct-based techniques for reduct identification, and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results help improve our understanding of the factors contributing to violent crime and highlight the existence of hidden and intangible relationships between crime factors.
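As a rough illustration of the reduct-search component only (a simplified Python sketch on a synthetic information system, without the fuzzy-logic element and not the algorithm proposed in the article), a binary particle swarm can search for small attribute subsets whose rough-set dependency degree on the decision attribute remains high:

```python
# Simplified sketch, not the article's method: binary PSO over attribute masks,
# scored by the rough-set dependency degree of the decision attribute plus a
# small parsimony bonus. The information system below is synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(60, 8))           # discretised condition attributes
y = (X[:, 0] + X[:, 3] > 2).astype(int)        # decision depends on attrs 0 and 3

def dependency_degree(mask):
    """Fraction of cases whose equivalence class under the selected
    attributes maps to a single decision value (the positive region)."""
    if not mask.any():
        return 0.0
    keys = [tuple(row) for row in X[:, mask]]
    consistent = 0
    for k in set(keys):
        idx = [i for i, kk in enumerate(keys) if kk == k]
        if len(set(y[idx])) == 1:
            consistent += len(idx)
    return consistent / len(y)

def fitness(mask):
    # Reward consistency, lightly penalise subset size.
    return dependency_degree(mask) + 0.05 * (1 - mask.sum() / mask.size)

n_particles, n_attrs, n_iter = 20, X.shape[1], 60
pos = rng.random((n_particles, n_attrs))       # continuous positions, thresholded to bits
vel = np.zeros_like(pos)
masks = pos > 0.5
pbest = masks.copy()
pbest_fit = np.array([fitness(m) for m in masks])
gbest = pbest[np.argmax(pbest_fit)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(float) - pos)
           + 1.5 * r2 * (gbest.astype(float) - pos))
    pos = np.clip(pos + vel, 0.0, 1.0)
    masks = pos > 0.5
    for i, m in enumerate(masks):
        f = fitness(m)
        if f > pbest_fit[i]:
            pbest[i], pbest_fit[i] = m.copy(), f
    gbest = pbest[np.argmax(pbest_fit)].copy()

print("candidate reduct (attribute indices):", np.flatnonzero(gbest))
```

Keeping the per-particle bests rather than a single winner is what allows several distinct reducts, and hence several rule sets, to be recovered from one run.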
Abstract:
Recent advances in hardware development coupled with the rapid adoption and broad applicability of cloud computing have introduced widespread heterogeneity in data centers, significantly complicating the management of cloud applications and data center resources. This paper presents the CACTOS approach to cloud infrastructure automation and optimization, which addresses heterogeneity by combining in-depth analysis of application behavior with insights from commercial cloud providers. The aim of the approach is threefold: to model applications and data center resources, to simulate applications and resources for planning and operation, and to optimize application deployment and resource use in an autonomic manner. The approach is based on case studies from the areas of business analytics, enterprise applications, and scientific computing.
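For readers unfamiliar with this class of problem, the Python sketch below shows one very simple form of deployment optimization over heterogeneous hosts: greedy first-fit-decreasing placement. It is an illustration only; the host capacities, component demands and names are hypothetical and it does not represent the CACTOS toolkit.

```python
# Illustrative sketch only: first-fit-decreasing placement of application
# components onto heterogeneous hosts. Capacities and demands are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu: float                       # free vCPUs
    mem: float                       # free memory (GB)
    placed: list = field(default_factory=list)

@dataclass
class Component:
    name: str
    cpu: float
    mem: float

def place(components, hosts):
    """Sort components by demand and put each on the first host that still
    has enough CPU and memory; return the names that could not be placed."""
    unplaced = []
    for c in sorted(components, key=lambda c: (c.cpu, c.mem), reverse=True):
        for h in hosts:
            if h.cpu >= c.cpu and h.mem >= c.mem:
                h.cpu -= c.cpu
                h.mem -= c.mem
                h.placed.append(c.name)
                break
        else:
            unplaced.append(c.name)
    return unplaced

hosts = [Host("big-node", 32, 128), Host("small-node", 8, 32)]
apps = [Component("analytics-db", 16, 64), Component("web-tier", 4, 8),
        Component("batch-worker", 8, 16)]
leftover = place(apps, hosts)
for h in hosts:
    print(h.name, "->", h.placed)
print("unplaced:", leftover)
```

An autonomic optimizer of the kind described above would re-run such a placement decision continuously against live monitoring data rather than a static list.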
Abstract:
Natural ventilation is a sustainable solution to maintaining healthy and comfortable environmental conditions in buildings. However, the effective design, construction and operation of naturally ventilated buildings require a good understanding of the complex airflow patterns caused by buoyancy and wind effects. The work presented in this article employed a 3D computational fluid dynamics (CFD) analysis to investigate the environmental conditions and thermal comfort of the occupants of a highly glazed, naturally ventilated meeting room. This analysis was facilitated by real-time field measurements performed in an operating building and a previously developed formal calibration methodology for producing reliable CFD models of indoor environments. Since creating an accurate CFD model of an occupied space in a real-life scenario requires a high level of CFD expertise, trusted experimental data and an ability to interpret model input parameters, the calibration methodology guided the development of a robust and reliable CFD model of the indoor environment. This calibrated CFD model was then used to investigate indoor environmental conditions and to evaluate thermal comfort indices for the occupants of the room. Thermal comfort expresses occupants' satisfaction with the thermal environment in buildings by defining the range of indoor thermal environmental conditions acceptable to a majority of occupants. In this study, the thermal comfort analysis, supported by both field measurements and CFD simulation results, confirmed satisfactory and optimal room operation in terms of the thermal environment for the investigated real-life scenario. © 2013 Elsevier Ltd.
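For context, thermal comfort indices of the kind evaluated in such studies are commonly Fanger's PMV and PPD (ISO 7730). The Python sketch below computes them from environmental and personal parameters; the inputs are hypothetical example values for a meeting room, not the paper's calibrated model.

```python
# Minimal sketch of the Fanger PMV/PPD indices (ISO 7730). Inputs are hypothetical.
import math

def pmv_ppd(ta, tr, vel, rh, met=1.2, clo=0.7, wme=0.0):
    """ta: air temp (C), tr: mean radiant temp (C), vel: air speed (m/s),
    rh: relative humidity (%), met: metabolic rate (met), clo: clothing (clo)."""
    pa = rh * 10.0 * math.exp(16.6536 - 4030.183 / (ta + 235.0))  # vapour pressure (Pa)
    icl = 0.155 * clo                    # clothing insulation (m2K/W)
    m = met * 58.15                      # metabolic rate (W/m2)
    mw = m - wme * 58.15
    fcl = 1.0 + 1.29 * icl if icl <= 0.078 else 1.05 + 0.645 * icl
    hcf = 12.1 * math.sqrt(vel)          # forced-convection coefficient
    taa, tra = ta + 273.0, tr + 273.0
    # Iteratively solve for the clothing surface temperature.
    tcla = taa + (35.5 - ta) / (3.5 * icl + 0.1)
    p1 = icl * fcl
    p2, p3, p4 = p1 * 3.96, p1 * 100.0, p1 * taa
    p5 = 308.7 - 0.028 * mw + p2 * (tra / 100.0) ** 4
    xn, xf, hc = tcla / 100.0, tcla / 50.0, hcf
    for _ in range(150):
        if abs(xn - xf) <= 0.00015:
            break
        xf = (xf + xn) / 2.0
        hcn = 2.38 * abs(100.0 * xf - taa) ** 0.25
        hc = max(hcf, hcn)
        xn = (p5 + p4 * hc - p2 * xf ** 4) / (100.0 + p3 * hc)
    tcl = 100.0 * xn - 273.0
    # Heat losses: skin diffusion, sweating, respiration (latent and dry),
    # radiation, convection.
    hl1 = 3.05e-3 * (5733.0 - 6.99 * mw - pa)
    hl2 = 0.42 * (mw - 58.15) if mw > 58.15 else 0.0
    hl3 = 1.7e-5 * m * (5867.0 - pa)
    hl4 = 0.0014 * m * (34.0 - ta)
    hl5 = 3.96 * fcl * (xn ** 4 - (tra / 100.0) ** 4)
    hl6 = fcl * hc * (tcl - ta)
    ts = 0.303 * math.exp(-0.036 * m) + 0.028
    pmv = ts * (mw - hl1 - hl2 - hl3 - hl4 - hl5 - hl6)
    ppd = 100.0 - 95.0 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)
    return pmv, ppd

print(pmv_ppd(ta=24.0, tr=25.0, vel=0.15, rh=50.0))  # hypothetical room conditions
```

A calibrated CFD model supplies the local air temperature, radiant temperature and air speed at each occupant location that feed such an index.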
Abstract:
The liquid structure of pyridine-acetic acid mixtures has been investigated using neutron scattering at various mole fractions of acetic acid, χHOAc = 0.33, 0.50 and 0.67, and compared to the structures of neat pyridine and acetic acid. The data have been modelled using Empirical Potential Structure Refinement (EPSR) with a 'free proton' reference model, which has no prejudicial weighting towards the existence of either molecular or ionised species. Analysis of the neutron scattering results shows the existence of hydrogen-bonded acetic acid chains with pyridine inclusions, rather than the formation of an ionic liquid by proton transfer.
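As a loose illustration of how hydrogen-bonded contacts can be identified in a simulated liquid structure (a generic geometric criterion, not EPSR, applied to synthetic coordinates), the following Python sketch counts O-H···N contacts using distance and angle cut-offs:

```python
# Generic sketch, not EPSR: count O-H...N hydrogen bonds between acetic acid
# hydroxyl groups and pyridine nitrogens using a distance/angle criterion.
# Coordinates and box size below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
box = 20.0                                   # cubic box edge (angstrom), hypothetical
o_pos = rng.random((50, 3)) * box            # hydroxyl O positions
h_pos = o_pos + rng.normal(scale=0.4, size=o_pos.shape)  # each H near its O
n_pos = rng.random((50, 3)) * box            # pyridine N positions

def min_image(d):
    """Minimum-image convention for a cubic periodic box."""
    return d - box * np.round(d / box)

def is_hbond(o, h, n, r_max=3.5, angle_min=140.0):
    """O...N distance below r_max and O-H...N angle at H above angle_min (deg)."""
    if np.linalg.norm(min_image(n - o)) > r_max:
        return False
    v1, v2 = min_image(o - h), min_image(n - h)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return angle >= angle_min

count = sum(is_hbond(o, h, n) for o, h in zip(o_pos, h_pos) for n in n_pos)
print("O-H...N hydrogen bonds found:", count)
```

Counting such contacts against O-H···O contacts within acid chains is one way a refined structural model distinguishes chain formation from proton transfer.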
Abstract:
Bioresorbable polymers such as PLA have an important role to play in the development of temporary implantable medical devices, with significant benefits over traditional therapies. However, the development of new devices is hindered by the high manufacturing costs associated with difficulties in processing the material. A major problem is the lack of insight into material degradation during processing. In this work, a method of quantifying degradation of PLA using IR spectroscopy coupled with computational chemistry and chemometric modeling is examined. It is shown that the method can predict the quantity of degradation products in solid-state samples with reasonably good accuracy, indicating the potential to adapt the method into an on-line sensor for monitoring PLA degradation in real time during processing.
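A minimal sketch of the chemometric step, assuming a partial least squares (PLS) calibration between IR spectra and degradation-product content, is shown below. The spectra and concentrations are synthetic, and the model is illustrative rather than the method developed in the work.

```python
# Illustrative chemometric calibration on synthetic data: PLS regression
# mapping IR spectra to degradation-product content. A real calibration would
# use measured FTIR spectra and reference assays.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_wavenumbers = 80, 300

# Synthetic spectra: noise plus a degradation band whose intensity scales with
# the (known) concentration, on a flat baseline.
conc = rng.uniform(0.0, 5.0, n_samples)                  # wt% degradation products
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 120) / 8.0) ** 2)
spectra = (rng.normal(scale=0.02, size=(n_samples, n_wavenumbers))
           + np.outer(conc, band) + 0.1)

X_train, X_test, y_train, y_test = train_test_split(
    spectra, conc, test_size=0.25, random_state=0)

# Fit a 3-component PLS calibration and check it on held-out spectra.
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()
print("R^2 on held-out spectra:", round(r2_score(y_test, pred), 3))
```

In an on-line setting the same fitted model would be applied to each incoming spectrum, giving a degradation estimate in real time during processing.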
Abstract:
The energetics of the low-temperature adsorption and decomposition of nitrous oxide, N2O, on flat and stepped platinum surfaces were calculated using density-functional theory (DFT). The results show that the preferred adsorption site for N2O is an atop site, bound upright via the terminal nitrogen. The molecule is only weakly chemisorbed to the platinum surface. The decomposition barriers on flat (111) surfaces and stepped (211) surfaces are similar. While the barrier for N2O dissociation is relatively small, the surface rapidly becomes poisoned by adsorbed oxygen. These findings are supported by experimental results of pulsed N2O decomposition with 5% Pt/SiO2 and bismuth-modified Pt/C catalysts. At low temperature, decomposition occurs but self-poisoning by O(ads) prevents further decomposition. At higher temperatures some desorption of O2 is observed, allowing continued catalytic activity. The study with bismuth-modified Pt/C catalysts showed that, although the activation barriers calculated for both terraces and steps were similar, the actual rate was different for the two surfaces. Steps were found experimentally to be more active than terraces, and this is attributed to differences in the preexponential term. © 2004 Elsevier Inc. All rights reserved.
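The closing point, that equal barriers with different preexponential factors give different rates, can be illustrated with a back-of-the-envelope Arrhenius comparison; the numbers below are hypothetical and are not the paper's DFT values.

```python
# Hypothetical numbers: with the same activation barrier, the ratio of
# Arrhenius rate constants on steps and terraces is set entirely by the
# preexponential factors, at every temperature.
import math

R = 8.314                         # J/(mol K)
Ea = 85e3                         # J/mol, same barrier assumed for both site types
A_step, A_terrace = 1e13, 1e11    # s^-1, hypothetical prefactors

def rate(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R T))."""
    return A * math.exp(-Ea / (R * T))

for T in (400.0, 600.0):
    ratio = rate(A_step, Ea, T) / rate(A_terrace, Ea, T)
    print(f"T = {T:.0f} K: k_step / k_terrace = {ratio:.0f}")
# The exponential terms cancel, so the ratio equals A_step / A_terrace.
```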
Abstract:
Simulation offers a safe opportunity for students to practise clinical procedures without exposure and risk of harm to real patients (Partin et al, 2011). Simulation is recognised to increase students' confidence in their ability to make critical decisions (McCaughey and Traynor, 2010). Within Queen's University Belfast, simulation for obstetric emergency training based on the ethos of 'Practical Obstetric Multi-Professional Training' [PROMPT] (Draycott et al, 2008) has been developed for midwifery students and is now uniquely embedded within the pre-registration curriculum. An important aspect of the PROMPT training is the use of low-fidelity simulation as opposed to high-tech support (Crofts et al, 2008). Studies have reflected that low-fidelity simulation can be an effective tool for promoting student confidence (Tosterud, 2013; Hughes et al, 2013). Students are given the opportunity to experience obstetric emergencies within a safe environment, and evaluation has indicated that students feel safe and report an increase in confidence and self-efficacy. The immediacy of the feedback offered by simulated situations encourages an exploration of beliefs and attitudes, particularly with peers, promoting a deeper sense of learning (Stoneham and Feltham, 2009). This paper will discuss why low-fidelity simulation can effectively enhance the student experience and promote self-efficacy.
Abstract:
The increasing complexity and scale of cloud computing environments, due to widespread data centre heterogeneity, make measurement-based evaluations highly difficult to achieve. The use of simulation tools to support decision making in cloud computing environments is therefore an increasing trend. However, the data required to model cloud computing environments with an appropriate degree of accuracy are typically large in volume, very difficult to collect without some form of automation, often not available in a suitable format, and time consuming to gather manually. In this research, an automated method for cloud computing topology definition, data collection and model creation is presented, within the context of a suite of tools that have been developed and integrated to support these activities.
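As a minimal illustration of automated topology definition and model creation (the fields and structure are hypothetical and do not represent the tool suite presented in this research), the Python sketch below gathers a few host facts and serialises them into a simple machine-readable data centre model that a simulator could consume:

```python
# Hypothetical sketch of automated topology capture: collect per-host facts
# and wrap them in a simple serialisable topology model.
import json
import os
import platform
from datetime import datetime, timezone

def collect_local_host():
    """Collect a few facts about the machine this script runs on."""
    return {
        "hostname": platform.node(),
        "os": platform.system(),
        "arch": platform.machine(),
        "cpu_count": os.cpu_count(),
    }

def build_topology(hosts):
    """Wrap collected host records in a minimal topology model."""
    return {
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "hosts": hosts,
        "links": [],   # network links would be discovered by a fuller collector
    }

model = build_topology([collect_local_host()])
print(json.dumps(model, indent=2))
```

In practice such collection would run across every host in the data centre and feed the resulting model directly into the simulation tooling rather than printing it.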