68 results for 190202 Computer Gaming and Animation


Relevance: 100.00%

Abstract:

Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time scales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficacious in terms of computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
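A minimal Python sketch of such a hierarchical, on-the-fly correlator is given below. It illustrates the multiple-tau idea rather than the paper's implementation; the buffer length p, the coarse-graining factor m and the class name are illustrative choices.

```python
import numpy as np

class MultiTauCorrelator:
    """On-the-fly autocorrelator with hierarchical (multiple-tau) lag spacing.

    Level 0 stores the most recent p samples at full resolution; each higher
    level stores the signal block-averaged over m samples of the level below,
    so the lag spacing grows geometrically while memory stays O(levels * p).
    """

    def __init__(self, num_levels=8, p=16, m=2):
        self.p, self.m = p, m
        self.buffers = [[] for _ in range(num_levels)]      # newest sample first
        self.pending = [[] for _ in range(num_levels)]      # block being averaged
        self.acc = np.zeros((num_levels, p))                # sum of x(t) * x(t - j*dt)
        self.counts = np.zeros((num_levels, p), dtype=int)  # products accumulated per lag

    def add(self, x, level=0):
        buf = self.buffers[level]
        buf.insert(0, x)
        if len(buf) > self.p:
            buf.pop()
        for j, xj in enumerate(buf):                        # accumulate this level's lags
            self.acc[level, j] += x * xj
            self.counts[level, j] += 1
        if level + 1 < len(self.buffers):                   # coarse-grain into next level
            self.pending[level].append(x)
            if len(self.pending[level]) == self.m:
                self.add(np.mean(self.pending[level]), level + 1)
                self.pending[level] = []

    def correlation(self):
        """Return (lags, C(lag)); lags are in units of the sampling interval."""
        lags, values = [], []
        for level in range(len(self.buffers)):
            dt = self.m ** level
            for j in range(self.p):
                if self.counts[level, j] == 0:
                    continue
                if level > 0 and j < self.p // self.m:      # covered by finer levels
                    continue
                lags.append(j * dt)
                values.append(self.acc[level, j] / self.counts[level, j])
        order = np.argsort(lags)
        return np.array(lags)[order], np.array(values)[order]
```

Because each level block-averages the one below it, the lag spacing grows geometrically, which is what keeps memory and cost modest for dynamics spanning many decades in time.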

Relevance: 100.00%

Abstract:

This paper reports on a study of computer-mediated communication within the context of a distance MA in TEFL programme which used an e-mail discussion list and then a discussion board. The study focused on the computer/Internet access and skills of the target population and their CMC needs and wants. Data were collected from 63 questionnaires and 6 in-depth interviews with students. Findings indicate that computer use and access to the Internet are widespread within the target population. In addition, most respondents indicated some competence in Internet use. No single factor emerged as an overriding explanation for lack of personal use. There was limited use of the CMC tools provided on the course for student–student interaction, mainly attributable to time constraints. However, most respondents said that they would like more CMC interaction with tutors. The main factor which would contribute to greater Internet use was training. The paper concludes with recommendations and suggestions for learner training in this area.

Relevance: 100.00%

Abstract:

Aim: To determine the prevalence and nature of prescribing errors in general practice; to explore the causes, and to identify defences against error. Methods: 1) Systematic reviews; 2) Retrospective review of unique medication items prescribed over a 12 month period to a 2% sample of patients from 15 general practices in England; 3) Interviews with 34 prescribers regarding 70 potential errors; 15 root cause analyses, and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified relating to the prescriber, patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.

Relevance: 100.00%

Abstract:

The United Nations Intergovernmental Panel on Climate Change (IPCC) makes it clear that climate change is due to human activities, and it recognises buildings as a distinct sector among the seven analysed in its 2007 Fourth Assessment Report. Global concerns have escalated regarding carbon emissions and sustainability in the built environment. The built environment is a human-made setting to accommodate human activities, including buildings and transport, and covers an interdisciplinary field addressing design, construction, operation and management. Specifically, sustainable buildings are expected to achieve high performance throughout the life-cycle of siting, design, construction, operation, maintenance and demolition, in the following areas:
• energy and resource efficiency;
• cost effectiveness;
• minimisation of emissions that negatively impact global warming, indoor air quality and acid rain;
• minimisation of waste discharges; and
• maximisation of fulfilling the requirements of occupants’ health and wellbeing.
Professionals in the built environment sector, for example urban planners, architects, building scientists, engineers, facilities managers, performance assessors and policy makers, will play a significant role in delivering a sustainable built environment. Delivering a sustainable built environment needs an integrated approach, and so it is essential for built environment professionals to have interdisciplinary knowledge in building design and management. Building and urban designers need to have a good understanding of the planning, design and management of buildings in terms of low carbon and energy efficiency. There are a limited number of traditional engineers who know how to design environmental systems (services engineers) in great detail. Yet there is a very large market for technologists with multi-disciplinary skills who are able to identify the need for, envision and manage the deployment of a wide range of sustainable technologies, both passive (architectural) and active (engineering systems), and select the appropriate approach. Employers seek applicants with skills in analysis, decision-making/assessment, computer simulation and project implementation. An integrated approach is expected in practice, which encourages built environment professionals to think ‘out of the box’ and learn to analyse real problems using the most relevant approach, irrespective of discipline. The Design and Management of Sustainable Built Environment book aims to produce readers able to apply fundamental scientific research to solve real-world problems in the general area of sustainability in the built environment. The book contains twenty chapters covering climate change and sustainability, urban design and assessment (planning, travel systems, urban environment), urban management (drainage and waste), buildings (indoor environment, architectural design and renewable energy), simulation techniques (energy and airflow), management (end-user behaviour, facilities and information), assessment (materials and tools), procurement, and case studies (the BRE Science Park). Chapters one and two present general global issues of climate change and sustainability in the built environment. Chapter one illustrates that applying the concepts of sustainability to the urban environment (buildings, infrastructure, transport) raises some key issues for tackling climate change, resource depletion and energy supply.
Buildings, and the way we operate them, play a vital role in tackling global greenhouse gas emissions. Holistic thinking and an integrated approach to delivering a sustainable built environment are highlighted. Chapter two demonstrates the important role that buildings (their services and appliances) and building energy policies play in this area. Substantial investment is required to implement such policies, much of which will earn a good return. Chapters three and four discuss urban planning and transport. Chapter three stresses the importance of using modelling techniques at the early stage for strategic master-planning of a new development and a retrofit programme. A general framework for sustainable urban-scale master planning is introduced. This chapter also addresses the need to develop a more holistic and pragmatic view of how the built environment performs, and in particular of how people plan, design and use it, in order to produce tools that help design for a higher level of sustainability. Chapter four discusses microcirculation, an emerging and challenging area that relates to changing travel behaviour in the quest for urban sustainability. The chapter outlines the main drivers for travel behaviour and choices, the workings of the transport system and its interaction with urban land use. It also covers the new approach to managing urban traffic to maximise economic, social and environmental benefits. Chapters five and six present topics related to urban microclimates, including thermal and acoustic issues. Chapter five discusses urban microclimates and the urban heat island effect, as well as the interrelationship of urban design (urban forms and textures) with energy consumption and urban thermal comfort. It introduces models that can be used to analyse microclimates for a careful and considered approach to planning sustainable cities. Chapter six discusses urban acoustics, focusing on urban noise evaluation and mitigation. Various prediction and simulation methods for sound propagation in micro-scale urban areas, as well as techniques for large-scale urban noise-mapping, are presented. Chapters seven and eight discuss urban drainage and waste management. The growing demand for housing and commercial developments in the 21st century, as well as the environmental pressure caused by climate change, has increased the focus on sustainable urban drainage systems (SUDS). Chapter seven discusses the SUDS concept, which is an integrated approach to surface water management. It takes into consideration quality, quantity and amenity aspects to provide a more pleasant habitat for people as well as increasing the biodiversity value of the local environment. Chapter eight discusses the main issues in urban waste management. It points out that population increases, land use pressures, and technical and socio-economic influences have become inextricably interwoven, making it ever more challenging to ensure a safe means of dealing with humanity’s waste. Sustainable building design needs to consider healthy indoor environments, minimising energy for heating, cooling and lighting, and maximising the utilisation of renewable energy. Chapter nine considers how people respond to the physical environment and how that is used in the design of indoor environments. It considers environmental components such as thermal, acoustic, visual, air quality and vibration, and their interaction and integration.
Chapter ten introduces the concept of passive building design and its relevant strategies, including passive solar heating, shading, natural ventilation, daylighting and thermal mass, in order to minimise heating and cooling loads as well as energy consumption for artificial lighting. Chapter eleven discusses the growing importance of integrating Renewable Energy Technologies (RETs) into buildings, the range of technologies currently available and what to consider during technology selection processes in order to minimise carbon emissions from burning fossil fuels. The chapter draws to a close by highlighting the issues concerning system design, the need for careful integration and management of RETs once installed, and the need for home owners and operators to understand the characteristics of the technology in their building. Computer simulation tools play a significant role in sustainable building design because, as the modern built environment design (building and systems) becomes more complex, it requires tools to assist in the design process. Chapter twelve gives an overview of the primary benefits and users of simulation programs and the role of simulation in the construction process, and examines the validity and interpretation of simulation results. Chapter thirteen focuses particularly on the Computational Fluid Dynamics (CFD) simulation method used for optimisation and performance assessment of technologies and solutions for sustainable building design, and on its application through a series of case studies. People and building performance are intimately linked. A better understanding of occupants’ interaction with the indoor environment is essential to building energy and facilities management. Chapter fourteen focuses on the issue of occupant behaviour: principally its impact on building performance, and the influence of building performance on occupants. Chapter fifteen explores the discipline of facilities management and the contribution that this emerging profession makes to securing sustainable building performance. The chapter highlights a much greater diversity of opportunities in sustainable building design that extend well into the operational life. Chapter sixteen reviews the concepts of modelling information flows and the use of Building Information Modelling (BIM), describing these techniques and how these aspects of information management can help drive sustainability. An explanation is offered concerning why information management is the key to ‘life-cycle’ thinking in sustainable building and construction. Measurement of building performance and sustainability is a key issue in delivering a sustainable built environment. Chapter seventeen identifies the means by which construction materials can be evaluated with respect to their sustainability. It identifies the key issues that impact the sustainability of construction materials and the methodologies commonly used to assess them. Chapter eighteen focuses on the topics of green building assessment, green building materials, and sustainable construction and operation. Commonly used assessment tools such as the BRE Environmental Assessment Method (BREEAM), Leadership in Energy and Environmental Design (LEED) and others are introduced. Chapter nineteen discusses sustainable procurement, which is one of the areas to have naturally emerged from the overall sustainable development agenda. It aims to ensure that current use of resources does not compromise the ability of future generations to meet their own needs.
Chapter twenty is a best-practice exemplar: the BRE Innovation Park, which features a number of demonstration buildings that have been built to the UK Government’s Code for Sustainable Homes. It showcases the very latest innovative methods of construction and cutting-edge technology for sustainable buildings. In summary, the Design and Management of Sustainable Built Environment book is the result of the co-operation and dedication of the individual chapter authors. We hope readers benefit from gaining a broad interdisciplinary knowledge of design and management in the built environment in the context of sustainability. We believe that the knowledge and insights of our academic and professional colleagues from different institutions and disciplines illuminate a way of delivering a sustainable built environment through holistic, integrated design and management approaches. Last, but not least, I would like to take this opportunity to thank all the chapter authors for their contributions. I would like to thank David Lim for his assistance in the editorial work and proofreading.

Relevance: 100.00%

Abstract:

This paper provides a comparative study of the performance of cross-flow and counter-flow M-cycle heat exchangers for dew point cooling. It is recognised that evaporative cooling systems offer a low energy alternative to conventional air conditioning units. Recently emerged dew point cooling, as the renovated evaporative cooling configuration, is claimed to have much higher cooling output than the conventional evaporative modes owing to its use of M-cycle heat exchangers. Cross-flow and counter-flow heat exchangers, as the available structures for M-cycle dew point cooling, were theoretically and experimentally investigated to identify the difference in cooling effectiveness between the two under parallel structural/operational conditions, to optimise the geometrical sizes of the exchangers and to suggest their favourable operational conditions. Through development of a dedicated computer model and case-by-case experimental testing and validation, a parametric study of the cooling performance of the counter-flow and cross-flow heat exchangers was carried out. The results showed that the counter-flow exchanger offered greater (around 20% higher) cooling capacity, as well as greater (15%–23% higher) dew-point and wet-bulb effectiveness, when equal in physical size and under the same operating conditions. The cross-flow system, however, had a greater (10% higher) coefficient of performance (COP). As the increased cooling effectiveness will lead to reduced air volume flow rate, smaller system size and lower cost, whilst size and cost are the inherent barriers to the use of dew point cooling as an alternative to conventional cooling systems, the counter-flow system is considered to offer practical advantages over the cross-flow system that would aid the uptake of this low energy cooling alternative. In line with the increased global demand for energy for cooling buildings, driven largely by the economic boom in emerging developing nations and by recognised global warming, the research results will be of significant importance in promoting deployment of low energy dew point cooling systems, helping to reduce energy use in cooling buildings and to cut the associated carbon emissions.
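For readers unfamiliar with the effectiveness metrics compared above, the sketch below shows their standard definitions for dew point coolers; the numerical values are invented for illustration and are not results from the study.

```python
def dew_point_effectiveness(t_dry_in, t_out, t_dewpoint_in):
    """Dew-point effectiveness: delivered cooling relative to the maximum
    possible cooling, i.e. down to the inlet air's dew-point temperature."""
    return (t_dry_in - t_out) / (t_dry_in - t_dewpoint_in)


def wet_bulb_effectiveness(t_dry_in, t_out, t_wetbulb_in):
    """Wet-bulb effectiveness: delivered cooling relative to the inlet
    dry-bulb minus wet-bulb depression; values above 1 indicate
    sub-wet-bulb (dew point) cooling."""
    return (t_dry_in - t_out) / (t_dry_in - t_wetbulb_in)


# Illustrative inlet conditions: 32 °C dry bulb, 21 °C wet bulb, 15 °C dew point,
# with supply air delivered at 20 °C.
print(dew_point_effectiveness(32.0, 20.0, 15.0))  # ~0.71
print(wet_bulb_effectiveness(32.0, 20.0, 21.0))   # ~1.09, i.e. cooling below the wet bulb
```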

Relevance: 100.00%

Abstract:

OBJECTIVES: The prediction of protein structure and the precise understanding of protein folding and unfolding processes remains one of the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. METHODS: To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis. RESULTS: To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and meta-data of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse. CONCLUSIONS: Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.

Relevance: 100.00%

Abstract:

Point and click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users though, older adults experience greater difficulties performing cursor positioning tasks, and this can present limitations to using a computer easily and effectively. Target expansion is a technique for improving pointing performance, where the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet can still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve times by up to 14%, and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e. after the cursor has reached the expanded target area or even after it has reached the original target area. Participants’ subjective feedback on target expansion was generally favorable, which lends further support for the technique.
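The studies do not prescribe a single expansion function, but a minimal sketch of the general technique (a circular target that grows once the cursor enters an activation zone) might look like the following; the activation distance, maximum scale and function name are hypothetical.

```python
def expanded_radius(base_radius, cursor_distance,
                    activation_distance=150.0, max_scale=2.0):
    """Grow a circular target as the cursor approaches.

    The target keeps its base size until the cursor is within
    activation_distance pixels of its centre, then scales linearly up to
    max_scale times the base radius as the cursor closes in.
    """
    if cursor_distance >= activation_distance:
        return base_radius
    progress = 1.0 - cursor_distance / activation_distance
    return base_radius * (1.0 + (max_scale - 1.0) * progress)


print(expanded_radius(16.0, 200.0))  # 16.0: outside the activation zone
print(expanded_radius(16.0, 75.0))   # 24.0: halfway through the approach
print(expanded_radius(16.0, 0.0))    # 32.0: fully expanded at the target
```

The findings above suggest the benefit persists even when expansion is triggered late in the movement, so a relatively small activation distance may still help.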

Relevance: 100.00%

Abstract:

Modeling the vertical penetration of photosynthetically active radiation (PAR) through the ocean, and its utilization by phytoplankton, is fundamental to simulating marine primary production. The variation of attenuation and absorption of light with wavelength suggests that photosynthesis should be modeled at high spectral resolution, but this is computationally expensive. To model primary production in global 3d models, a balance between computer time and accuracy is necessary. We investigate the effects of varying the spectral resolution of the underwater light field and the photosynthetic efficiency of phytoplankton (α∗), on primary production using a 1d coupled ecosystem ocean turbulence model. The model is applied at three sites in the Atlantic Ocean (CIS (∼60°N), PAP (∼50°N) and ESTOC (∼30°N)) to include the effect of different meteorological forcing and parameter sets. We also investigate three different methods for modeling α∗ – as a fixed constant, varying with both wavelength and chlorophyll concentration [Bricaud, A., Morel, A., Babin, M., Allali, K., Claustre, H., 1998. Variations of light absorption by suspended particles with chlorophyll a concentration in oceanic (case 1) waters. Analysis and implications for bio-optical models. J. Geophys. Res. 103, 31033–31044], and using a non-spectral parameterization [Anderson, T.R., 1993. A spectrally averaged model of light penetration and photosynthesis. Limnol. Oceanogr. 38, 1403–1419]. After selecting the appropriate ecosystem parameters for each of the three sites we vary the spectral resolution of light and α∗ from 1 to 61 wavebands and study the results in conjunction with the three different α∗ estimation methods. The results show modeled estimates of ocean primary productivity are highly sensitive to the degree of spectral resolution and α∗. For accurate simulations of primary production and chlorophyll distribution we recommend a spectral resolution of at least six wavebands if α∗ is a function of wavelength and chlorophyll, and three wavebands if α∗ is a fixed value.
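To make the trade-off concrete, here is a minimal sketch (not the authors' model) of spectrally resolved attenuation of PAR using a per-waveband Beer-Lambert law; the three-band irradiances and attenuation coefficients are illustrative values only.

```python
import numpy as np

def par_at_depth(surface_irradiance, attenuation, depth):
    """Spectrally resolved PAR at depth: each waveband decays with its own
    attenuation coefficient and the bands are then summed.

    surface_irradiance : band irradiances at the surface (W m^-2)
    attenuation        : band attenuation coefficients (m^-1)
    depth              : depth in metres (positive downwards)
    """
    return float(np.sum(surface_irradiance * np.exp(-attenuation * depth)))


# Hypothetical three-band split (blue/green/red) of 100 W m^-2 of surface PAR.
i0 = np.array([40.0, 35.0, 25.0])
k = np.array([0.02, 0.05, 0.40])           # red light is attenuated far faster

print(par_at_depth(i0, k, 20.0))           # spectral estimate at 20 m (~40 W m^-2)
print(100.0 * np.exp(-np.mean(k) * 20.0))  # a single broadband k gives ~4 W m^-2
```

The two estimates differ by an order of magnitude in this contrived case, which illustrates why the number of wavebands, and how α∗ is treated, matters for modelled primary production.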

Relevance: 100.00%

Abstract:

Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states, in IEEE 754 arithmetic, are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with order one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
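As a rough illustration of how a transarithmetic totalises division (and not the authors' hardware design), the sketch below maps division by zero to positive infinity, negative infinity or nullity. IEEE NaN is used here only as a stand-in for nullity and differs from it; for example, nullity equals itself whereas NaN does not.

```python
import math

NULLITY = float('nan')  # stand-in only: transreal nullity equals itself, NaN does not

def transreal_div(a, b):
    """Total division in the style of transreal arithmetic: never raises.

    a / 0 is +inf for a > 0, -inf for a < 0, and nullity for a == 0.
    """
    if b != 0:
        return a / b
    if a > 0:
        return math.inf
    if a < 0:
        return -math.inf
    return NULLITY


print(transreal_div(1.0, 0.0))   # inf
print(transreal_div(-2.0, 0.0))  # -inf
print(transreal_div(0.0, 0.0))   # nan, standing in for nullity
```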

Relevance: 100.00%

Abstract:

Long Term Evolution (LTE) based networks lack native support for Circuit Switched (CS) services. The Evolved Packet System (EPS), which includes the Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) and the Evolved Packet Core (EPC), is a purely all-IP packet system. This introduces the problem of how to provide voice call support when a user is within an LTE network and how to ensure voice service continuity when the user moves out of LTE coverage. Different technologies have been proposed for providing voice service to LTE users and for ensuring the service continues outside LTE networks. The aim of this paper is to analyze and evaluate the overall performance of these technologies along with Single Radio Voice Call Continuity (SRVCC) Inter-RAT handover to Universal Terrestrial Radio Access Network / GSM EDGE Radio Access Network (UTRAN/GERAN). The possible solutions for providing voice call and service continuity over LTE-based networks are Circuit Switched Fall Back (CSFB), Voice over LTE via Generic Access (VoLGA), Voice over LTE (VoLTE) based on IMS/MMTel with SRVCC, and Over The Top (OTT) services like Skype. This paper focuses mainly on the 3GPP standard solutions to implement voice over LTE. The paper compares various aspects of these solutions and suggests a possible roadmap that mobile operators can adopt to provide seamless voice over LTE.

Relevance: 100.00%

Abstract:

The 3D shape of an object and its 3D location have traditionally been thought of as very separate entities, although both can be described within a single 3D coordinate frame. Here, 3D shape and location are considered as two aspects of a view-based approach to representing depth, avoiding the use of 3D coordinate frames.

Relevance: 100.00%

Abstract:

The set of transreal numbers is a superset of the real numbers. It totalises real arithmetic by defining division by zero in terms of three definite, non-finite numbers: positive infinity, negative infinity and nullity. Elsewhere in this proceedings we extended continuity and limits from the real domain to the transreal domain; here we extend the real derivative to the transreal derivative. This continues to demonstrate that transreal analysis contains real analysis and operates at singularities where real analysis fails. Hence computer programs that rely on computing derivatives, such as those used in scientific, engineering and financial applications, are extended to operate at singularities where they currently fail. This promises to make software that computes derivatives both more competent and more reliable. We also extend the integration of absolutely convergent functions from the real domain to the transreal domain.

Relevance: 100.00%

Abstract:

The IEEE 754 standard for floating-point arithmetic is widely used in computing. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. The IEEE infinities are said to have the behaviour of limits. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. We elucidate the transreal tangent and extend real limits to transreal limits. Arguing from this firm foundation, we maintain that there are three category errors in the IEEE 754 standard. Firstly the claim that IEEE infinities are limits of real arithmetic confuses limiting processes with arithmetic. Secondly a defence of IEEE negative zero confuses the limit of a function with the value of a function. Thirdly the definition of IEEE NaNs confuses undefined with unordered. Furthermore we prove that the tangent function, with the infinities given by geometrical construction, has a period of an entire rotation, not half a rotation as is commonly understood. This illustrates a category error, confusing the limit with the value of a function, in an important area of applied mathematics: trigonometry. We briefly consider the wider implications of this category error. Another paper proposes transreal arithmetic as a basis for floating-point arithmetic; here we take the profound step of proposing transreal arithmetic as a replacement for real arithmetic to remove the possibility of certain category errors in mathematics. Thus we propose both theoretical and practical advantages of transmathematics. In particular we argue that implementing transreal analysis in trans-floating-point arithmetic would extend the coverage, accuracy and reliability of almost all computer programs that exploit real analysis: essentially all programs in science and engineering and many in finance, medicine and other socially beneficial applications.
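The IEEE 754 behaviours discussed above can be observed directly in any language that exposes IEEE floats; the short Python snippet below merely demonstrates them and does not implement the proposed transreal replacement.

```python
import math

neg_zero = -0.0
print(neg_zero == 0.0)               # True: -0 and +0 compare equal...
print(math.copysign(1.0, neg_zero))  # -1.0: ...yet -0 still carries a sign

nan = float('nan')
print(nan == nan)                    # False: NaN is unordered, even against itself
print(nan < 1.0, nan >= 1.0)         # False False: ordered comparisons with NaN fail

print(1.0 / float('inf'))            # 0.0: the infinities behave like limits here
```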