32 results for cyber-physical system (CPS)

in CentAUR: Central Archive, University of Reading - UK


Relevance: 100.00%

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads users cannot judge whether using virtualization is a good idea. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization, and in measuring the additional cost of turning on monitoring and logging. The knowledge gained from assessing these benchmarks on different systems will help a range of users understand the use of virtualization systems. We assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1); these virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). To assess each virtualization system, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) offered by cloud providers, and logging is a means of auditing the services bought and used from commercial providers. We assess the virtualization systems on three different platforms: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are available at the University of Reading. A functional virtualization system is multi-layered and driven by privileged components. A virtualization system can host multiple guest operating systems, each running in its own domain, and it schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources; each guest operating system in turn schedules its own applications.
Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which guest operating systems can run. No modifications are needed in the guest OS or application: the guest is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run in the virtual machines: these guests are aware that they are running on a virtual machine, and in return provide near-native performance. Both approaches can be deployed across various virtualized systems. Para-virtualization is an OS-assisted form of virtualization in which the device drivers of the guest operating system coordinate with those of the host operating system, reducing the performance overheads. Para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed. It has been shown [0] that para-virtualization does not impose a significant performance overhead in high-performance computing, which in turn has implications for the use of cloud computing to host HPC applications. This apparent improvement has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation.
To support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
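The measurement procedure above amounts to timing the same benchmark in each configuration and expressing the slowdown relative to a baseline. A minimal sketch of that bookkeeping (the timings and names are invented for illustration, not the paper's results):

```python
# Hypothetical sketch of the overhead comparison described above:
# run time on bare metal vs. para-virtualized, with and without
# logging, expressed as percentage overheads. Timings are invented.
def overhead_pct(baseline, measured):
    """Slowdown of `measured` relative to `baseline`, in percent."""
    return 100.0 * (measured - baseline) / baseline

timings = {                      # seconds for one Netlib-style benchmark run
    "bare_metal": 100.0,
    "para_virt": 104.0,
    "para_virt_logging": 109.0,
}
virt_overhead = overhead_pct(timings["bare_metal"], timings["para_virt"])
logging_overhead = overhead_pct(timings["para_virt"],
                                timings["para_virt_logging"])
```

Separating the two ratios, as the paper does, distinguishes the cost of virtualization itself from the extra cost of the monitoring needed for SLA auditing.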

Relevance: 100.00%

Abstract:

Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficacious in terms of computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
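As a rough illustration of the multiple-tau idea described above, the sketch below coarse-grains the signal between correlator levels so that the lag spacing doubles per level, giving logarithmic cost and memory in the longest lag. It is a simplified toy (the class and parameter names are ours), not the production correlator analysed in the paper:

```python
import numpy as np

class MultipleTau:
    """Toy multiple-tau correlator: each level stores `m` coarse-grained
    samples; pairs are averaged and passed up, so lag spacing doubles
    from level to level."""

    def __init__(self, m=16, levels=5):
        self.m, self.levels = m, levels
        self.buf = [[] for _ in range(levels)]   # recent samples per level
        self.acc = np.zeros((levels, m))         # correlation accumulators
        self.cnt = np.zeros((levels, m))         # number of samples per lag
        self.pending = [None] * levels           # half-filled averaging slot

    def push(self, x, level=0):
        if level >= self.levels:
            return
        buf = self.buf[level]
        buf.append(x)
        if len(buf) > self.m:
            buf.pop(0)
        # correlate the newest sample against the stored history
        for lag in range(len(buf)):
            self.acc[level, lag] += x * buf[-1 - lag]
            self.cnt[level, lag] += 1
        # coarse-grain: average pairs and push the mean to the next level
        if self.pending[level] is None:
            self.pending[level] = x
        else:
            self.push(0.5 * (self.pending[level] + x), level + 1)
            self.pending[level] = None

    def result(self):
        """Return (lag, C(lag)) pairs; higher levels cover longer lags."""
        out = []
        for lev in range(self.levels):
            step = 2 ** lev
            for lag in range(self.m):
                if self.cnt[lev, lag] > 0 and (lev == 0 or lag >= self.m // 2):
                    out.append((lag * step, self.acc[lev, lag] / self.cnt[lev, lag]))
        return sorted(out)
```

For a constant input the estimated correlation is 1 at every lag, which makes a convenient sanity check; the error estimates derived in the paper quantify the bias that coarse-graining introduces for decaying correlations.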

Relevance: 100.00%

Abstract:

Hamiltonian dynamics describes the evolution of conservative physical systems. Originally developed as a generalization of Newtonian mechanics, describing gravitationally driven motion from the simple pendulum to celestial mechanics, it also applies to such diverse areas of physics as quantum mechanics, quantum field theory, statistical mechanics, electromagnetism, and optics – in short, to any physical system for which dissipation is negligible. Dynamical meteorology consists of the fundamental laws of physics, including Newton’s second law. For many purposes, diabatic and viscous processes can be neglected and the equations are then conservative. (For example, in idealized modeling studies, dissipation is often only present for numerical reasons and is kept as small as possible.) In such cases dynamical meteorology obeys Hamiltonian dynamics. Even when nonconservative processes are not negligible, it often turns out that separate analysis of the conservative dynamics, which fully describes the nonlinear interactions, is essential for an understanding of the complete system, and the Hamiltonian description can play a useful role in this respect. Energy budgets and momentum transfer by waves are but two examples.
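The simple pendulum mentioned above gives the canonical illustration; with angle $q$, conjugate momentum $p$, mass $m$, length $l$ and gravity $g$, Hamilton's equations read:

```latex
H(q,p) = \frac{p^{2}}{2 m l^{2}} - m g l \cos q,
\qquad
\dot{q} = \frac{\partial H}{\partial p} = \frac{p}{m l^{2}},
\qquad
\dot{p} = -\frac{\partial H}{\partial q} = -m g l \sin q .
```

Since $H$ has no explicit time dependence, $\mathrm{d}H/\mathrm{d}t = 0$ along trajectories: energy is conserved, which is precisely the dissipation-free property singled out above.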

Relevance: 30.00%

Abstract:

In the U.K., dental students are required to train and practise on real human tissues at a very early stage of their courses. Currently, human tissues such as decayed teeth are mounted in a human-head-like physical model. The problems with these models in teaching are: (1) every student operates on a different tooth, since each tooth is unique; (2) the process cannot be recorded for examination purposes; and (3) the same training is not repeatable. The aim of the PHATOM Project is to develop a dental training system using haptic technology. This paper documents the project background, specification, research and development of the first prototype system. It also discusses the research into visual display, haptic devices and haptic rendering, including stereo vision, motion parallax, volumetric modelling, surface remapping algorithms, and the analysis and design of the system. A new volumetric-to-surface model transformation algorithm is also introduced. The paper concludes with future work on the system's development and research.
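One basic ingredient of haptic rendering of the kind discussed above is penalty-based force feedback: when the stylus penetrates a virtual surface, push back in proportion to the penetration depth. The sketch below is a 1-D caricature under that assumption; the stiffness value and function name are illustrative, not taken from the PHATOM prototype:

```python
# Minimal penalty-based haptic rendering sketch: a virtual tooth
# surface at depth 0 pushes back on the stylus in proportion to
# penetration. Stiffness is illustrative, not a PHATOM parameter.
def render_force(stylus_depth, stiffness=800.0):
    """Return restoring force (N) for a stylus at `stylus_depth` (m).

    Positive depth means the stylus tip is inside the virtual material;
    outside the material the rendered force is zero.
    """
    return stiffness * stylus_depth if stylus_depth > 0 else 0.0

force_in_contact = render_force(0.002)   # 2 mm penetration
force_free = render_force(-0.010)        # stylus above the surface
```

Real systems run such a loop at around 1 kHz so the rendered surface feels stiff rather than spongy.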

Relevance: 30.00%

Abstract:

North African dust is important for climate through its direct radiative effect on solar and terrestrial radiation and its role in the biogeochemical system. The Dust Outflow and Deposition to the Ocean project (DODO) aimed to characterize the physical and optical properties of airborne North African dust in two seasons and to use these observations to constrain model simulations, with the ultimate aim of being able to quantify the deposition of iron to the North Atlantic Ocean. The in situ properties of dust from airborne campaigns measured during February and August 2006, based at Dakar, Senegal, are presented here. Average values of the single scattering albedo (0.99, 0.98), mass specific extinction (0.85 m^2 g^-1, 1.14 m^2 g^-1), asymmetry parameter (0.68, 0.68), and refractive index (1.53 - 0.0005i, 1.53 - 0.0014i) for the accumulation mode were found to differ by varying degrees between the dry and wet season, respectively. It is hypothesized that these differences are due to different source regions and transport processes which also differ between the DODO campaigns. Elemental ratios of Ca/Al were found to differ between the dry and wet season (1.1 and 0.5, respectively). Differences in vertical profiles are found between seasons and between land and ocean locations and reflect the different dynamics of the seasons. Using measurements of the coarse mode size distribution and illustrative Mie calculations, the optical properties are found to be very sensitive to the presence and amount of coarse mode of mineral dust, and the importance of accurate measurements of the coarse mode of dust is highlighted.
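The single scattering albedo quoted above is simply the scattering fraction of total extinction; given scattering and absorption coefficients, the relation is one line (an illustrative helper of ours, not the DODO processing code):

```python
# Single scattering albedo: fraction of extinction due to scattering.
# beta_sca and beta_abs are scattering and absorption coefficients
# in the same units (e.g. per metre); values below are illustrative.
def single_scattering_albedo(beta_sca, beta_abs):
    return beta_sca / (beta_sca + beta_abs)

omega0 = single_scattering_albedo(0.99, 0.01)   # strongly scattering aerosol
```

Values near 1, as measured for the accumulation mode here, mean absorption (and hence the imaginary part of the refractive index) is small.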

Relevance: 30.00%

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.
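The appeal of adaptive mesh refinement is that resolution follows the solution rather than being fixed a priori. A toy 1-D sketch of the principle (our own illustration, not a weather- or climate-model scheme): bisect any interval across which the field changes too rapidly.

```python
import numpy as np

def refine(x, f, tol=0.1, max_iter=10):
    """Toy 1-D adaptive refinement: insert a midpoint wherever the
    change in f across an interval exceeds tol."""
    for _ in range(max_iter):
        fx = f(x)
        bad = np.abs(np.diff(fx)) > tol          # under-resolved intervals
        if not bad.any():
            break
        mids = 0.5 * (x[:-1][bad] + x[1:][bad])  # bisect only those
        x = np.sort(np.concatenate([x, mids]))
    return x

# A steep front near x = 0 attracts resolution; smooth regions stay coarse.
x0 = np.linspace(-1.0, 1.0, 11)
x = refine(x0, np.tanh, tol=0.1)
```

The same refine-where-needed principle, applied in two or three dimensions with proper flux matching between refinement levels, is what full AMR frameworks provide, and is what would let atmosphere and ocean models track localized multiscale features without uniformly high resolution.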

Relevance: 30.00%

Abstract:

An investigation is made of the impact of a full linearized physical (moist) parameterization package on extratropical singular vectors (SVs) using the ECMWF integrated forecasting system (IFS). Comparison is made for one particular period with a dry physical package including only vertical diffusion and surface drag. The crucial extra ingredient in the full package is found to be the large-scale latent heat release. Consistent with basic theory, its inclusion results in a shift to smaller horizontal scales and enhanced growth for the SVs. Whereas, for the dry SVs, T42 resolution is sufficient, the moist SVs require T63 to resolve their structure and growth. A 24-h optimization time appears to be appropriate for the moist SVs because of the larger growth of moist SVs compared with dry SVs. Like dry SVs, moist SVs tend to occur in regions of high baroclinicity, but their location is also influenced by the availability of moisture. The most rapidly growing SVs appear to enhance or reduce large-scale rain in regions ahead of major cold fronts. The enhancement occurs in and ahead of a cyclonic perturbation and the reduction in and ahead of an anticyclonic perturbation. Most of the moist SVs for this situation are slightly modified versions of the dry SVs. However, some occur in new locations and have particularly confined structures. The most rapidly growing SV is shown to exhibit quite linear behavior in the nonlinear model as it grows from 0.5 to 12 hPa in 1 day. For 5 times this amplitude the structure is similar but the growth is about half as the perturbation damps a potential vorticity (PV) trough or produces a cutoff, depending on its sign.
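Singular vectors of the kind computed here are, in essence, the leading right singular vectors of the tangent-linear propagator over the optimization time: the initial perturbations whose amplification is largest. A toy illustration with a hypothetical 3-variable linear model (not the ECMWF IFS):

```python
import numpy as np

# Toy linear "forecast model": a perturbation evolves as x(T) = M x(0).
# The fastest-growing initial perturbation over the interval is the
# leading right singular vector of M; its growth factor is the leading
# singular value. M is an invented example matrix.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.5, 1.0],
              [0.0, 0.0, 0.8]])

U, s, Vt = np.linalg.svd(M)
leading_sv = Vt[0]          # optimal initial perturbation (unit norm)
growth = s[0]               # amplification over the optimization time

# check: evolving the leading SV gives exactly the leading growth
assert np.isclose(np.linalg.norm(M @ leading_sv), growth)
```

In the real problem M is never formed explicitly: the SVs are found iteratively by repeated integrations of the tangent-linear model and its adjoint, which is why adding linearized moist physics to those integrations changes the SVs obtained.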

Relevance: 30.00%

Abstract:

A methodology is presented for the development of a combined seasonal weather and crop productivity forecasting system. The first stage of the methodology is the determination of the spatial scale(s) on which the system could operate; this determination has been made for the case of groundnut production in India. Rainfall is a dominant climatic determinant of groundnut yield in India. The relationship between yield and rainfall has been explored using data from 1966 to 1995. On the all-India scale, seasonal rainfall explains 52% of the variance in yield. On the subdivisional scale, correlations vary between r^2 = 0.62 (significance level p < 10^-4) and a negative correlation with r^2 = 0.1 (p = 0.13). The spatial structure of the relationship between rainfall and groundnut yield has been explored using empirical orthogonal function (EOF) analysis. A coherent, large-scale pattern emerges for both rainfall and yield. On the subdivisional scale (~300 km), the first principal component (PC) of rainfall is correlated well with the first PC of yield (r^2 = 0.53, p < 10^-4), demonstrating that the large-scale patterns picked out by the EOFs are related. The physical significance of this result is demonstrated. Use of larger averaging areas for the EOF analysis resulted in lower and (over time) less robust correlations. Because of this loss of detail when using larger spatial scales, the subdivisional scale is suggested as an upper limit on the spatial scale for the proposed forecasting system. Further, district-level EOFs of the yield data demonstrate the validity of upscaling these data to the subdivisional scale. Similar patterns have been produced using data on both of these scales, and the first PCs are very highly correlated (r^2 = 0.96). Hence, a working spatial scale has been identified, typical of that used in seasonal weather forecasting, that can form the basis of crop modeling work for the case of groundnut production in India.
Last, the change in correlation between yield and seasonal rainfall during the study period has been examined using seasonal totals and monthly EOFs. A further link between yield and subseasonal variability is demonstrated via analysis of dynamical data.
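EOF analysis as used above extracts the dominant spatial pattern and its time series from a space-time data matrix; numerically it is a singular value decomposition of the anomaly matrix. A self-contained sketch on synthetic data (not the Indian rainfall or yield records):

```python
import numpy as np

# Toy EOF / principal-component analysis: EOFs are the right singular
# vectors of the (time x space) anomaly matrix. We embed one coherent
# spatial mode in noise and recover it. All data here are synthetic.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 6.0 * np.pi, 30)
pattern = np.sin(np.linspace(0.0, np.pi, 12))        # one spatial mode
data = np.outer(np.sin(t), pattern) + 0.1 * rng.standard_normal((30, 12))

anom = data - data.mean(axis=0)                      # remove the time mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eof1 = Vt[0]                                         # leading spatial pattern
pc1 = U[:, 0] * s[0]                                 # its time series (PC)
explained = s[0] ** 2 / np.sum(s ** 2)               # variance fraction
```

Correlating the leading PC of one field with the leading PC of another, as done for rainfall and yield above, then tests whether the two dominant patterns covary.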

Relevance: 30.00%

Abstract:

This article presents a prototype model based on a wireless sensor actuator network (WSAN) aimed at optimizing both energy consumption of environmental systems and well-being of occupants in buildings. The model is a system consisting of the following components: a wireless sensor network, 'sense diaries', environmental systems such as heating, ventilation and air-conditioning systems, and a central computer. A multi-agent system (MAS) is used to derive and act on the preferences of the occupants. Each occupant is represented by a personal agent in the MAS. The sense diary is a new device designed to elicit feedback from occupants about their satisfaction with the environment. The roles of the components are: the WSAN collects data about physical parameters such as temperature and humidity from an indoor environment; the central computer processes the collected data; the sense diaries leverage trade-offs between energy consumption and well-being, in conjunction with the agent system; and the environmental systems control the indoor environment.
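The trade-off that the sense diaries and agents mediate can be sketched as a scalar optimization: pick the setpoint that minimizes occupant discomfort plus an energy penalty. Everything below (the cost form, weights and function names) is an illustrative assumption of ours, not the article's algorithm:

```python
# Hypothetical sketch of the energy/well-being trade-off: each personal
# agent contributes its occupant's preferred temperature; the controller
# scores candidate setpoints (degrees C) and picks the cheapest.
def choose_setpoint(preferences, outdoor, energy_weight=0.05,
                    candidates=range(16, 27)):
    """Return the candidate setpoint with the lowest combined cost."""
    def cost(s):
        discomfort = sum((s - p) ** 2 for p in preferences)   # well-being term
        energy = energy_weight * abs(s - outdoor)             # crude HVAC proxy
        return discomfort + energy
    return min(candidates, key=cost)

setpoint = choose_setpoint([20, 22, 23], outdoor=5)
```

Raising `energy_weight` biases the choice toward the outdoor temperature (less conditioning), which is exactly the kind of trade-off the sense-diary feedback is meant to calibrate.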

Relevance: 30.00%

Abstract:

This article describes an application of computers to a consumer-based production engineering environment. Particular consideration is given to the utilisation of low-cost computer systems for the visual inspection of components on a production line in real time. The process of installation is discussed, from identifying the need for artificial vision and justifying the cost, through to choosing a particular system and designing the physical and program structure.
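The core of low-cost visual inspection on a production line is a fast pass/fail decision per component. A minimal sketch of one common approach, comparing each captured image against a golden reference (thresholds, array shapes and names are illustrative assumptions, not the article's system):

```python
import numpy as np

# Toy real-time inspection check: reject a part if too many pixels
# deviate from a "golden" reference image. Thresholds are illustrative.
def inspect(image, reference, diff_tol=30, reject_frac=0.02):
    """Return True (pass) if the deviating-pixel fraction is acceptable."""
    bad = np.abs(image.astype(int) - reference.astype(int)) > diff_tol
    return bad.mean() <= reject_frac

ref = np.full((64, 64), 128, dtype=np.uint8)     # reference component image
good = ref.copy()                                # in-spec part
flawed = ref.copy()
flawed[10:20, 10:20] = 255                       # simulated surface defect
```

The per-frame cost is a single pass over the pixels, which is what makes such checks feasible in real time on modest hardware.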

Relevance: 30.00%

Abstract:

Researchers in the rehabilitation engineering community have been designing and developing a variety of passive/active devices to help persons with limited upper extremity function to perform essential daily manipulations. Devices range from low-end tools such as head/mouth sticks to sophisticated robots using vision and speech input. While almost all of the high-end equipment developed to date relies on visual feedback alone to guide the user, providing no tactile or proprioceptive cues, the "low-tech" head/mouth sticks deliver better "feel" because of the inherent force feedback through physical contact with the user's body. However, the disadvantage of a conventional head/mouth stick is that it can only function in a limited workspace and its performance is limited by the user's strength. It therefore seems reasonable to attempt to develop a system that combines the advantages of the two approaches: the power and flexibility of robotic systems with the sensory feedback of a headstick. The system presented in this paper reflects this design philosophy. It contains a pair of master-slave robots, with the master operated by the user's head and the slave acting as a telestick. Described in this paper are the design, control strategies, implementation and performance evaluation of the head-controlled force-reflecting telestick system.
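The master-slave coupling with force reflection can be caricatured in one dimension: the slave servos toward the master position, and any contact force at the slave end is fed back to the user through the master. The gains, the rigid "wall" and the function name below are illustrative assumptions, not the paper's controller:

```python
# 1-D sketch of force-reflecting master-slave coupling: the slave
# tracks the head-driven master, and contact with a virtual wall at
# `wall` produces a force reflected back to the user. Gains invented.
def telestick_step(master_pos, slave_pos, wall=1.0, kp=50.0, k_wall=500.0):
    slave_cmd = kp * (master_pos - slave_pos)    # slave servos to master
    contact = max(0.0, slave_pos - wall)         # penetration into the wall
    reflected = -k_wall * contact                # force felt by the user
    return slave_cmd, reflected
```

The reflected term is what restores the "feel" of a physical headstick that purely vision-guided robotic aids lack.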

Relevance: 30.00%

Abstract:

We use ellipsometry to investigate a transition in the morphology of a sphere-forming diblock copolymer thin-film system. At an interface the diblock morphology may differ from the bulk when the interfacial tension favours wetting of the minority domain, thereby inducing a sphere-to-lamella transition. In a small, favourable window in energetics, one may observe this transition simply by adjusting the temperature. Ellipsometry is ideally suited to the study of the transition because the additional interface created by the wetting layer affects the polarisation of light reflected from the sample. Here we study thin films of poly(butadiene-ethylene oxide) (PB-PEO), which order to form PEO minority spheres in a PB matrix. As temperature is varied, the reversible transition from a partially wetting layer of PEO spheres to a full wetting layer at the substrate is investigated.

Relevance: 30.00%

Abstract:

We review the proposal of the International Committee for Weights and Measures (Comité International des Poids et Mesures, CIPM), currently being considered by the General Conference on Weights and Measures (Conférence Générale des Poids et Mesures, CGPM), to revise the International System of Units (Le Système International d'Unités, SI). The proposal includes new definitions for four of the seven base units of the SI, and a new form of words to present the definitions of all the units. The objective of the proposed changes is to adopt definitions referenced to constants of nature, taken in the widest sense, so that the definitions may be based on what are believed to be true invariants. In particular, whereas in the current SI the kilogram, ampere, kelvin and mole are linked to exact numerical values of the mass of the international prototype of the kilogram, the magnetic constant (permeability of vacuum), the triple-point temperature of water and the molar mass of carbon-12, respectively, in the new SI these units are linked to exact numerical values of the Planck constant, the elementary charge, the Boltzmann constant and the Avogadro constant, respectively. The new wording expresses the definitions in a simple and unambiguous manner without the need for the distinction between base and derived units. The importance of relations among the fundamental constants to the definitions, and the importance of establishing a mise en pratique for the realization of each definition, are also discussed.

Relevance: 30.00%

Abstract:

Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. Considering this process breaks an important symmetry and couples the dynamics of fast and slow variables, with ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec^-1) is introduced in the system, as shown with standard multiscale analysis and made clear by several lines of numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex-system dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables purely on the basis of scale analysis can be catastrophic: it leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and causes the model to lose the ability to describe intrinsically multiscale processes.
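For Ec = 0 the truncation above contains a decoupled generalized Lorenz system; the classic three-variable Lorenz equations, integrated here with a standard RK4 step, illustrate the kind of chaotic core involved (standard textbook parameters, not the 10-ODE system of the paper):

```python
import numpy as np

# Classic Lorenz-63 system integrated with fourth-order Runge-Kutta.
# Parameters are the standard chaotic choice (sigma=10, rho=28, beta=8/3).
def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4(f, v, dt):
    k1 = f(v)
    k2 = f(v + 0.5 * dt * k1)
    k3 = f(v + 0.5 * dt * k2)
    k4 = f(v + dt * k3)
    return v + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

v = np.array([1.0, 1.0, 1.0])
for _ in range(2000):            # 20 time units at dt = 0.01
    v = rk4(lorenz, v, 0.01)     # trajectory settles onto the attractor
```

The coupled 10-ODE system studied in the paper adds slow, dissipation-driven variables on top of such a fast chaotic core, which is what produces the long-term memory and amplitude modulation described above.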

Relevance: 30.00%

Abstract:

We present an outlook on the thermodynamics of the climate system. First, we construct an equivalent Carnot engine with efficiency η and frame the Lorenz energy cycle in a macroscale thermodynamic context. Then, by exploiting the second law, we prove that the lower bound to the entropy production is η times the integrated absolute value of the internal entropy fluctuations. An exergetic interpretation is also proposed. Finally, the controversial maximum entropy production principle is reinterpreted as requiring the joint optimization of heat transport and mechanical work production. These results provide tools for climate change analysis and for the validation of climate models.
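The equivalent-engine picture above rests on the textbook Carnot relation between efficiency and reservoir temperatures. A minimal numerical sketch (the temperatures are illustrative round numbers, not the effective warm- and cold-pool temperatures derived in the paper):

```python
# Carnot efficiency of an engine working between a warm and a cold
# reservoir: eta = 1 - T_cold / T_warm. Temperatures in kelvin are
# illustrative stand-ins for the climate system's warm/cold regions.
def carnot_efficiency(t_warm, t_cold):
    return 1.0 - t_cold / t_warm

eta = carnot_efficiency(300.0, 270.0)   # a 10% ideal efficiency
```

In the macroscale framing above, η multiplied by the integrated absolute internal entropy fluctuations then bounds the entropy production from below.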