32 results for Computer systems organization: general - emerging technologies

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 100.00%

Abstract:

In search of better ways to deliver knowledge, traditional universities have expanded their modes of teaching and integrated cost-effective e-learning systems. Universities' use of information and communication technologies has grown tremendously over the last decade. The Arab Open University (AOU) in Bahrain was the first institution there to adopt an e-learning system and, to ensure its efficient use, aimed to evaluate good and bad practices, detect errors and determine areas for further improvement in usage. This study critically evaluated students' perception of the e-learning system in Bahrain and recommended changes to improve students' e-learning usage. Results of the study indicated that, in general, students have favourable perceptions toward using the e-learning system. This study has shown that technology acceptance is the factor that contributes most to students' perception of, and satisfaction with, the e-learning system.

Relevance: 100.00%

Abstract:

Organizational issues are inhibiting the implementation and strategic use of information technologies (IT) in the construction sector. This paper focuses on these issues and explores processes by which emerging technologies can be introduced into construction organizations. The paper is based on a case study conducted in a major house-building company that was implementing a virtual reality (VR) system for internal design review in its regional offices. Interviews were conducted with different members of the organization to explore the introduction process and the use of the system. The case study findings provide insight into the process of change, the constraints that inhibit IT implementation, and the relationship between new technology and work patterns within construction organizations. They suggest that (1) user-developer communications are critical for the successful implementation of non-diffused innovations in the construction industry; and (2) successful uptake of IT requires both strategic decision-making by top management and decision-making by technical managers.

Relevance: 100.00%

Abstract:

Aim: To determine the prevalence and nature of prescribing errors in general practice, to explore their causes, and to identify defences against error.

Methods: (1) Systematic reviews; (2) retrospective review of unique medication items prescribed over a 12-month period to a 2% sample of patients from 15 general practices in England; (3) interviews with 34 prescribers regarding 70 potential errors; 15 root cause analyses; and six focus groups involving 46 primary health care team members.

Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error was identified, relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology.

Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.

Relevance: 100.00%

Abstract:

There is a growing need for massive computational resources for the analysis of new astronomical datasets. To tackle this problem, we present here our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g. AstroGrid) and the computational grid (e.g. TeraGrid, COSMOS, etc.). We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We discuss our planned usage of the VOTechBroker in computing a huge number of n-point correlation functions from the SDSS data and massive model-fitting of millions of CMBfast models to WMAP data. We also discuss other applications, including the determination of the XMM Cluster Survey selection function and the construction of new WMAP maps.
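
Since the abstract's central object is a broker that abstracts the submission and management of many jobs, a minimal sketch of that pattern may help. This is not VOTechBroker's actual interface: every class, method and parameter below is an assumption made for illustration, with a local process pool standing in for the distributed grid.

```python
import concurrent.futures
from dataclasses import dataclass, field

@dataclass
class Job:
    """One computational task, e.g. an n-point correlation run on one data chunk."""
    name: str
    func: callable
    args: tuple = field(default_factory=tuple)

class Broker:
    """Minimal broker: one interface hiding submission, tracking and result
    collection for a large batch of jobs (a process pool stands in for the grid)."""
    def __init__(self, max_workers=4):
        self._pool = concurrent.futures.ProcessPoolExecutor(max_workers)
        self._futures = {}

    def submit(self, job):
        self._futures[job.name] = self._pool.submit(job.func, *job.args)

    def collect(self):
        # Block until every job finishes; return {job name: result}.
        return {name: f.result() for name, f in self._futures.items()}

def fit_model(model_id):
    return model_id ** 0.5          # stand-in for fitting one CMBfast model

if __name__ == "__main__":
    broker = Broker()
    for i in range(100):            # a batch of 100 stand-in model fits
        broker.submit(Job(f"model-{i}", fit_model, (i,)))
    print(len(broker.collect()), "jobs completed")
```

The point of the pattern is that the caller only ever sees submit and collect; where and how the jobs actually run is the broker's concern.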

Relevance: 100.00%

Abstract:

Buildings affect people in various ways. They can help us to work more effectively; they also present a wide range of stimuli for our senses to react to. Intelligent buildings are designed to be aesthetic in sensory terms: not just visually appealing, but ones in which occupants experience delight, freshness, airiness, daylight, views out and social ambience. All these factors contribute to a general aesthetic which gives pleasure and affects one's mood. If there is to be a common vision, it is essential for architects, engineers and clients to work closely together throughout the planning, design, construction and operational stages, which represent the conception, birth and life of the building. There has to be an understanding of how patterns of work are best suited to a particular building form served by appropriate environmental systems. A host of technologies is emerging to help these processes, but in the end it is how we think about achieving responsive buildings that matters. Intelligent buildings should cope with social and technological changes and also be adaptable to short-term and long-term human needs.

We live through our senses. They rely on stimulation from the tasks we are focused on and the people around us, but also from the physical environment. We breathe air and its quality affects the olfactory system; temperature is felt by thermoreceptors in the skin; sound enters our ears; the visual scene is beheld by our eyes. All these stimuli are transmitted along the sensory nervous system to the brain for processing, from which physiological and psychological reactions and judgements are formed depending on perception, expectancies and past experiences. It is clear that the environmental setting plays a role in this sensory process. This is the essence of sensory design. Space plays its part as well. The flow of communication is partly electronic, but also largely by people meeting face to face. Our sense of space wants different things at different times: sometimes privacy, at other times the satisfaction of social needs, besides the organizational requirement for effective human communication throughout the building. In general, if the senses are satisfied, people feel better and work better.

Relevance: 100.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale grid resources such as the UK National Grid Service.
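The REST interaction the abstract describes, submit a run and then poll and download output while the job is still executing, can be sketched on the client side. This is not G-Rex's actual API: the endpoint paths, parameter names and JSON fields below are all invented for illustration, and only the overall pattern (progressive output transfer over plain HTTP) comes from the abstract.

```python
import time
import requests                           # third-party HTTP client

BASE = "http://example.ac.uk/grex/nemo"   # hypothetical service URL

def run_and_stream(input_files):
    """Submit a run, then repeatedly download new output while it executes,
    so results never accumulate on the remote system. input_files is a
    dict of field name -> open file object. All endpoints are assumptions."""
    job = requests.post(f"{BASE}/jobs", files=input_files).json()
    job_url = f"{BASE}/jobs/{job['id']}"
    while True:
        state = requests.get(job_url).json()
        for name in state.get("new_outputs", []):
            data = requests.get(f"{job_url}/outputs/{name}").content
            with open(name, "wb") as f:   # save locally for monitoring tools
                f.write(data)
        if state["status"] in ("FINISHED", "FAILED"):
            return state["status"]
        time.sleep(30)                    # poll interval
```

Because the protocol is plain HTTP, the same interaction could equally be driven from a browser or curl, which is exactly the lightweight-client property the abstract emphasises.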

Relevance: 100.00%

Abstract:

How can a bridge be built between autonomic computing approaches and parallel computing systems? How can autonomic computing approaches be extended towards building reliable systems? How can existing technologies be merged to provide a solution for self-managing systems? The work reported in this paper aims to answer these questions by proposing Swarm-Array Computing, a novel technique inspired by swarm robotics and built on the foundations of the autonomic and parallel computing paradigms. Two approaches based on intelligent cores and intelligent agents are proposed to achieve autonomy in parallel computing systems. The feasibility of the proposed approaches is validated on a multi-agent simulator.
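As a toy illustration of the agent-based approach (one plausible reading of the abstract, not the paper's actual algorithm), the sketch below has agents carry tasks across an array of cores and migrate away from failed cores, the kind of self-managing behaviour autonomic computing aims at. All names and the failure probability are assumptions.

```python
import random

class Core:
    """One processing element in the array; may fail at random."""
    def __init__(self, ident):
        self.ident, self.alive = ident, True

class Agent:
    """Carries a task and keeps it running: if the host core fails,
    the agent migrates the task to a healthy core."""
    def __init__(self, task, core):
        self.task, self.core = task, core

    def step(self, cores):
        if not self.core.alive:
            healthy = [c for c in cores if c.alive]
            if not healthy:
                raise RuntimeError("no healthy cores left")
            self.core = random.choice(healthy)
            print(f"{self.task} migrated to core {self.core.ident}")

def simulate(n_cores=8, n_agents=4, steps=20, fail_p=0.05):
    cores = [Core(i) for i in range(n_cores)]
    agents = [Agent(f"task-{i}", cores[i]) for i in range(n_agents)]
    for _ in range(steps):
        for c in cores:
            if c.alive and random.random() < fail_p:
                c.alive = False           # inject a core failure
        for a in agents:
            a.step(cores)

if __name__ == "__main__":
    simulate()
```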

Relevance: 100.00%

Abstract:

Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven algorithms for direct methods can take a very long time, as their cost grows with the size of the matrix. The computational complexity of stochastic Monte Carlo methods depends only on the number of chains and the length of those chains. The computing power needed by inherently parallel Monte Carlo methods can be satisfied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, show how the method can be implemented on the Grid, and demonstrate how efficiently the method scales on multiple processors.
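To make the technique concrete, here is a minimal serial sketch of Monte Carlo matrix inversion via the Neumann series A^{-1} = sum_k B^k with B = I - A, where each term is sampled by a random walk. It illustrates the general method only, not the paper's load-balanced Grid implementation, and the function name and parameter choices are assumptions.

```python
import numpy as np

def mc_inverse(A, n_chains=5000, chain_len=25, rng=None):
    """Estimate A^{-1} from the Neumann series sum_k (I - A)^k, sampled by
    independent random walks. Converges only if the spectral radius of
    B = I - A is below 1 (e.g. a diagonally dominant A with unit diagonal)."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    B = np.eye(n) - A
    # Walk transition probabilities, proportional to |B| row-wise
    # (assumes every row of B has at least one nonzero entry).
    P = np.abs(B) / np.abs(B).sum(axis=1, keepdims=True)
    inv = np.zeros((n, n))
    for i in range(n):                      # estimate row i of A^{-1}
        for _ in range(n_chains):
            state, weight = i, 1.0
            inv[i, state] += weight         # k = 0 term (B^0 = I)
            for _ in range(chain_len):
                nxt = rng.choice(n, p=P[state])
                weight *= B[state, nxt] / P[state, nxt]
                state = nxt
                inv[i, state] += weight     # k-th term of the series
    # Chains are independent, so rows (or chains) parallelise trivially,
    # which is what makes the method attractive for Grid deployment.
    return inv / n_chains

if __name__ == "__main__":
    A = np.array([[1.0, 0.2, 0.1],
                  [0.1, 1.0, 0.2],
                  [0.2, 0.1, 1.0]])
    print(np.round(mc_inverse(A, rng=42), 2))
    print(np.round(np.linalg.inv(A), 2))    # deterministic reference
```

The cost is governed by n_chains and chain_len rather than by any direct-method complexity in n, which is the complexity property the abstract highlights.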

Relevance: 100.00%

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data that is used to populate the second component, and a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories — this is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.
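The requirement to move analysis programs to the primary repositories, rather than moving large simulation datasets out, is a "code to data" pattern. The toy sketch below shows only that idea: the site names, data and example analysis are all invented for illustration.

```python
# Toy "code to data" dispatch: each site runs the analysis locally and
# returns only the (small) result, never the raw trajectories.
SITES = {
    "site-A": [0.1, 0.4, 0.35, 0.2],   # stand-ins for local simulation data
    "site-B": [0.6, 0.55, 0.7],
}

def run_at_site(site, analysis):
    """Pretend remote execution: apply the analysis where the data lives."""
    return analysis(SITES[site])

def fraction_folded(trajectory, cutoff=0.5):
    """Example analysis: fraction of frames below a folding cutoff."""
    return sum(x < cutoff for x in trajectory) / len(trajectory)

results = {site: run_at_site(site, fraction_folded) for site in SITES}
print(results)   # only small summaries cross the network
```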

Relevance: 100.00%

Abstract:

The construction industry is widely criticised as fragmented, and there are mounting calls for it to change. The espoused change calls for collaboration as well as the embrace of innovation in design, construction and across the supply chain. Innovation and the application of emerging technologies, such as building information modelling (BIM), are seen as enablers for integrating the processes and 'integrating the team'. A questionnaire survey was conducted to ascertain change in construction with regard to design management, innovation and the application of BIM as cutting-edge pathways for collaboration. The respondents to the survey held an array of designations across the construction industry, such as construction managers, designers, engineers, design coordinators, design managers, architects, architectural technologists and surveyors. There was general agreement among most respondents that the design team was responsible for design management in their organisation. There is a perception that the design manager and the client are the catalysts for advancing innovation. The current state of the industry in terms of incorporating BIM technologies poses a challenge as well as an opportunity for accomplishment. BIM technologies represent a paradigm shift in the way buildings are designed, constructed and maintained. This paradigm shift calls for collectively rethinking the curriculum for educating building professionals.

Relevance: 100.00%

Abstract:

We outline our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g. AstroGrid) and the computational grid. We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We present our planned usage of the VOTechBroker in computing a huge number of n-point correlation functions from the SDSS, as well as fitting over a million CMBfast models to the WMAP data.

Relevance: 100.00%

Abstract:

Body Sensor Networks (BSNs) have recently been introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach to storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of the data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier application-level architecture that integrates a Cloud computing platform with BSN data-stream middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study on the real-time monitoring and analysis of the cardiac data streams of many individuals.
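
As a toy illustration of the online-analysis tier such an architecture enables (not BodyCloud's actual programming abstractions), the sketch below keeps a sliding window of heart-rate samples per subject and raises an alert on a windowed mean; every class name, value and threshold is an assumption.

```python
from collections import deque
from statistics import mean

class SensorNode:
    """Toy body-sensor node: yields heart-rate samples (assumed data)."""
    def __init__(self, samples):
        self.samples = samples
    def stream(self):
        yield from self.samples

class CloudAnalyser:
    """Toy online-analysis tier: keeps a sliding window per subject and
    flags readings whose windowed mean exceeds a threshold."""
    def __init__(self, window=5, threshold=100):
        self.windows = {}
        self.window, self.threshold = window, threshold
    def ingest(self, subject, value):
        w = self.windows.setdefault(subject, deque(maxlen=self.window))
        w.append(value)
        if mean(w) > self.threshold:
            print(f"alert: {subject} mean HR {mean(w):.0f} over last {len(w)} samples")

if __name__ == "__main__":
    analyser = CloudAnalyser()
    node = SensorNode([72, 75, 110, 120, 125, 130, 90])
    for hr in node.stream():
        analyser.ingest("subject-1", hr)
```

In a community deployment the same ingest path would be fed by many nodes at once, which is why the abstract stresses a scalable Cloud back end rather than per-device processing.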