877 results for 190202 Computer Gaming and Animation
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
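The abstract describes G-Rex's REST style only at a high level. As a rough illustration of that general pattern (not of the actual G-Rex API, whose endpoints are not given here), the following Python sketch submits a run to a hypothetical REST job service and streams the output back while the run is in progress; the base URL, endpoint paths and response format are all assumptions made for the example.

```python
# Illustrative sketch of a REST-style remote-execution client.
# The service URL, endpoint paths and response fields below are hypothetical;
# they show only the general interaction pattern the abstract describes
# (submit a run, then stream output back while the run is in progress),
# not the real G-Rex API.
import requests

BASE_URL = "http://example.org/jobservice"   # hypothetical service

def submit_run(input_files, params):
    """Upload input files and start a remote model run."""
    files = {name: open(path, "rb") for name, path in input_files.items()}
    try:
        resp = requests.post(f"{BASE_URL}/runs", files=files, data=params)
        resp.raise_for_status()
    finally:
        for f in files.values():
            f.close()
    return resp.json()["run_id"]             # assumed response format

def stream_output(run_id, dest):
    """Download output incrementally so it never accumulates remotely."""
    with requests.get(f"{BASE_URL}/runs/{run_id}/output", stream=True) as r:
        r.raise_for_status()
        with open(dest, "wb") as out:
            for chunk in r.iter_content(chunk_size=1 << 20):
                out.write(chunk)

if __name__ == "__main__":
    run_id = submit_run({"namelist": "namelist_cfg"}, {"model": "nemo"})
    stream_output(run_id, "model_output.tar")
```

Because the interaction is plain HTTP, the same steps could in principle be driven from curl or a Web browser, which is the kind of lightweight client access the G-Rex design emphasises.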
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
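Step (3) above is the only change a scientist makes to an existing workflow: a local mpirun launch becomes a GRexRun call inside an otherwise unchanged resubmission loop. The Python sketch below is an invented illustration of that substitution for a yearly-resubmitted run; the GRexRun argument order and the pre/post-processing helper scripts are placeholders, not the real NEMO workflow or client syntax.

```python
# Sketch of a yearly resubmission loop in which the only change to an existing
# workflow is that the local "mpirun" launch is replaced by a call to the
# G-Rex client. "GRexRun" here is a placeholder invocation; argument order and
# the helper scripts are illustrative, not the real NEMO workflow.
import subprocess

YEARS = range(1, 51)          # e.g. a 50-year simulation, one job per year

def run_year(year, use_grex=True):
    subprocess.run(["./preprocess.sh", str(year)], check=True)   # placeholder
    if use_grex:
        # Remote execution: inputs are uploaded and outputs streamed back
        # (and deleted remotely) by the middleware while the run is in progress.
        launch = ["GRexRun", "nemo.exe", "-np", "40"]
    else:
        # Original local launch that the G-Rex call replaces.
        launch = ["mpirun", "-np", "40", "./nemo.exe"]
    subprocess.run(launch, check=True)
    subprocess.run(["./postprocess.sh", str(year)], check=True)  # placeholder

if __name__ == "__main__":
    for year in YEARS:
        run_year(year)
```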
Abstract:
These notes have been issued on a small scale in 1983 and 1987 and on request at other times. This issue follows two items of news. First, Walter Colquitt and Luther Welsh found the 'missed' Mersenne prime M110503 and advanced the frontier of complete Mp-testing to 139,267. In so doing, they terminated Slowinski's significant string of four consecutive Mersenne primes. Second, a team of five established a non-Mersenne number as the largest known prime. This result terminated the 1952-89 reign of Mersenne primes. All the original Mersenne numbers with p < 258 were factorised some time ago. The Sandia Laboratories team of Davis, Holdridge & Simmons, with a little assistance from a CRAY machine, cracked M211 in 1983 and M251 in 1984. They contributed their results to the 'Cunningham Project', care of Sam Wagstaff. That project is now moving apace thanks to developments in technology, factorisation and primality testing. New levels of computer power and new computer architectures motivated by the open-ended promise of parallelism are now available. Once again, the suppliers may be offering free buildings with the computer. However, the Sandia '84 CRAY-1 implementation of the quadratic-sieve method is now outpowered by the number-field sieve technique. This is deployed either on purpose-built hardware or on large syndicates of collaborating standard processors, even distributed world-wide. New factorisation techniques of both special and general applicability have been defined and deployed. The elliptic-curve method finds large factors with helpful properties, while the number-field sieve approach is breaking down composites with over one hundred digits. The material is updated on an occasional basis to follow the latest developments in primality-testing large Mp and factorising smaller Mp; all dates derive from the published literature or referenced private communications. Minor corrections, additions and changes merely advance the issue number after the decimal point. The reader is invited to report any errors and omissions that have escaped the proof-reading, to answer the unresolved questions noted and to suggest additional material associated with this subject.
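The complete Mp-testing referred to above is carried out with the Lucas-Lehmer test, the standard primality test for Mersenne numbers (the test itself is not described in these notes). The short Python sketch below applies it to small exponents; exponents of the order of 110503 need far more efficient arithmetic than this naive version.

```python
# Lucas-Lehmer test for Mersenne numbers M_p = 2**p - 1 (p an odd prime).
# This is the standard test behind complete Mp-testing; the exponents
# discussed in the notes are far too large for this naive sketch.
def lucas_lehmer(p):
    """Return True iff M_p = 2**p - 1 is prime (p must be an odd prime)."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

if __name__ == "__main__":
    small_primes = [3, 5, 7, 11, 13, 17, 19, 31, 61, 89, 107, 127]
    mersenne_exponents = [p for p in small_primes if lucas_lehmer(p)]
    print(mersenne_exponents)  # [3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]
```

Practical Mp-testing at the sizes discussed in the notes relies on much faster multiplication (for example FFT-based methods) than the built-in big-integer arithmetic used in this sketch.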
Abstract:
Landscape narrative, combining landscape and narrative, has been employed to create storytelling layouts and interpretive information in some famous botanic gardens. In order to assess the educational effectiveness of using "landscape narrative" in landscape design, the Heng-Chun Tropical Botanical Garden in Taiwan was chosen as the research target for an empirical study. Based on cognitive theory and the affective responses of environmental psychology, computer simulations and video recordings were used to create five themed display areas with landscape narrative elements. Two groups of pupils watched the simulated films and were then given an evaluation test and questionnaire to determine the effectiveness of the landscape narrative. When the content was well associated and matched with the narrative landscape, comprehension and retention of the content increased significantly. The results also indicated that visual preference for the narrative landscape scenes increased. This empirical study can be regarded as a successful model of integrating landscape narrative with interpretation practice that can be applied to the design of new theme displays in botanic gardens, improving both the effectiveness of interpretation plans and the visual preference of visitors.
Abstract:
Procurement is one of the major business operations in the public service sector. The advance of information and communication technology (ICT) pushes this business operation to increase its efficiency and to foster collaboration between the organization and its suppliers. This leads to a shift from traditional procurement transactions to an e-procurement paradigm. Such a change affects business processes, information management and decision making. E-procurement involves various stakeholders who engage in activities based on different social and cultural practices. The design of an e-procurement system may therefore involve the analysis of complex situations. This paper describes an approach that uses the problem articulation method to support such analysis. The approach is applied to a case study from the UAE.
Abstract:
We present three components of a virtual research environment developed for the ongoing Roman excavation at Silchester. These components (Recycle Bridge, XDB cross-database search, and Arch3D) provide additional services around the existing core of the system, which runs on the Integrated Archaeological Database (IADB). They provide, respectively, embedding of legacy applications into portals, cross-database searching, and 3D visualisation of stratigraphic information.
Abstract:
The work reported in this paper proposes a novel synergy between parallel computing and swarm robotics to offer a new computing paradigm, 'swarm-array computing', that can harness and apply autonomic computing to parallel computing systems. One of the three approaches proposed for swarm-array computing, based on landscapes of intelligent cores, in which the cores of a parallel computing system are abstracted as swarm agents, is investigated. In the proposed approach a task is executed and transferred seamlessly between cores, thereby achieving the self-ware properties that characterize autonomic computing. FPGAs are considered as an experimental platform, taking into account their application in space robotics. The feasibility of the proposed approach is validated on the SeSAm multi-agent simulator.
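The central idea, cores abstracted as swarm agents that carry a task and hand it over when a core fails so that execution continues, can be pictured with a toy simulation. Everything in the sketch below (the set of cores, the random failure model, the hand-over rule) is an invented illustration of the general concept, not the SeSAm model or the FPGA platform used in the paper.

```python
# Toy illustration of the "landscape of intelligent cores" idea: each core is
# treated as an agent; if the core carrying a task "fails", the task migrates
# to a healthy core so execution continues. This is an invented sketch of the
# general concept, not the paper's SeSAm/FPGA experiment.
import random

random.seed(1)

class Core:
    def __init__(self, ident):
        self.ident = ident
        self.healthy = True

    def step(self, failure_rate=0.1):
        """Randomly fail with the given probability."""
        if self.healthy and random.random() < failure_rate:
            self.healthy = False

def run_task(cores, work_units=20):
    current = next(c for c in cores if c.healthy)
    for unit in range(work_units):
        current.step()
        if not current.healthy:
            # Self-healing behaviour: migrate the task to a healthy core.
            survivors = [c for c in cores if c.healthy]
            if not survivors:
                raise RuntimeError("no healthy cores left")
            current = survivors[0]
            print(f"unit {unit}: task migrated to core {current.ident}")
    print("task completed on core", current.ident)

if __name__ == "__main__":
    run_task([Core(i) for i in range(8)])
```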
Abstract:
Ubiquitous healthcare is an emerging area of technology that uses a large number of environmental and patient sensors and actuators to monitor and improve patients' physical and mental condition. Tiny sensors gather data on almost any physiological characteristic that can be used to diagnose health problems. This technology faces some challenging ethical questions, ranging from small-scale individual issues of trust and efficacy to societal issues of health and longevity gaps related to economic status. It presents particular problems in combining the still-developing fields of computer, information and media ethics with established medical ethics. This article describes a practice-based ethics approach, considering in particular the areas of privacy, agency, equity and liability. It raises questions that practitioners will be forced to face as they develop ubiquitous healthcare systems. Medicine is a controlled profession whose practice is commonly restricted by government-appointed authorities, whereas computer software and hardware development is notoriously lacking in such regimes.
Abstract:
Changes in atmospheric temperature have a particular importance in climate research because climate models consistently predict a distinctive vertical profile of trends. With increasing greenhouse gas concentrations, the surface and troposphere are consistently projected to warm, with an enhancement of that warming in the tropical upper troposphere. Hence, attempts to detect this distinct 'fingerprint' have been a focus for observational studies. The topic acquired heightened importance following the 1990 publication of an analysis of satellite data which challenged the reality of the projected tropospheric warming. This review documents the evolution over the last four decades of understanding of tropospheric temperature trends and their likely causes. Particular focus is given to the difficulty of producing homogenized datasets, with which to derive trends, from both radiosonde and satellite observing systems, because of the many systematic changes over time. The value of multiple independent analyses is demonstrated. Paralleling developments in observational datasets, increased computer power and improved understanding of climate forcing mechanisms have led to refined estimates of temperature trends from a wide range of climate models and a better understanding of internal variability. It is concluded that there is no reasonable evidence of a fundamental disagreement between tropospheric temperature trends from models and observations when uncertainties in both are treated comprehensively.
Abstract:
A unique parameterization of the perspective projections in all whole-numbered dimensions is reported. The algorithm for generating a perspective transformation from parameters and for recovering parameters from a transformation is a modification of the Givens orthogonalization algorithm. The algorithm for recovering a perspective transformation from a perspective projection is a modification of Roberts' classical algorithm. Both algorithms have been implemented in Pop-11 with call-out to the NAG Fortran libraries. Preliminary Monte Carlo tests show that the transformation algorithm is highly accurate, but that the projection algorithm cannot recover magnitude and shear parameters accurately. However, there is reason to believe that the projection algorithm might improve significantly with the use of many corresponding points, or with multiple perspective views of an object. Previous parameterizations of the perspective transformations in the computer graphics and computer vision literature are discussed.
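As a rough illustration of how parameters can be recovered from a perspective transformation by orthogonalization, the Python sketch below splits a synthetic 3x4 camera matrix into an upper-triangular calibration factor and a rotation using SciPy's RQ factorisation. This is a standard textbook decomposition chosen only to mirror the flavour of the Givens-based approach; it is not the paper's parameterization, nor its Pop-11/NAG implementation.

```python
# Illustration of recovering parameters from a perspective transformation by
# orthogonalization: a 3x4 camera matrix P = K [R | t] is split into an
# upper-triangular K (magnitude/shear-like parameters) and a rotation R via an
# RQ factorisation, which here plays the role of the Givens orthogonalization
# mentioned in the abstract. Standard textbook decomposition, not the paper's.
import numpy as np
from scipy.linalg import rq

def decompose_camera(P):
    """Split P (3x4) into intrinsics K, rotation R and translation t."""
    M = P[:, :3]
    K, R = rq(M)                        # M = K @ R, K upper triangular
    # Resolve the usual RQ sign ambiguity so K has a positive diagonal.
    signs = np.diag(np.sign(np.diag(K)))
    K, R = K @ signs, signs @ R
    t = np.linalg.solve(K, P[:, 3])
    return K / K[2, 2], R, t

if __name__ == "__main__":
    # Build a synthetic camera and check that the parameters are recovered.
    K_true = np.array([[800.0, 2.0, 320.0],
                       [0.0, 780.0, 240.0],
                       [0.0,   0.0,   1.0]])
    angle = 0.3
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.1, -0.2, 2.0])
    P = K_true @ np.hstack([R_true, t_true[:, None]])
    K, R, t = decompose_camera(P)
    print(np.allclose(K, K_true), np.allclose(R, R_true), np.allclose(t, t_true))
```

The sign-fixing step resolves the ambiguity inherent in the RQ factorisation so that the recovered calibration factor has a positive diagonal; recovery is exact here only because the synthetic matrix is noise-free.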