24 results for new degree program

in CentAUR: Central Archive University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
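Because of the REST architectural style, a job can in principle be driven with nothing more than HTTP requests. The sketch below illustrates that interaction pattern in Python; the service URL, endpoint paths and response fields are hypothetical stand-ins, not the real G-Rex API, and the logic simply mirrors the create/start/stream-output cycle described in the abstract.

```python
import time
import requests

BASE = "http://cluster.example.ac.uk/grex/services/nemo"  # hypothetical service URL

# 1. Create a job instance and upload a prepared input file.
resp = requests.post(f"{BASE}/instances", files={"input": open("namelist", "rb")})
job_url = resp.json()["url"]  # hypothetical response field

# 2. Start the run; in the workflow script this call stands in for "mpirun".
requests.post(f"{job_url}/control", json={"action": "start"})

# 3. While the run is in progress, fetch finished output files as they appear
#    and remove them from the remote system, mimicking what G-Rex automates.
while True:
    status = requests.get(f"{job_url}/status").json()
    for name in status.get("finished_outputs", []):
        data = requests.get(f"{job_url}/outputs/{name}").content
        with open(name, "wb") as f:
            f.write(data)
        requests.delete(f"{job_url}/outputs/{name}")
    if status["state"] != "RUNNING":
        break
    time.sleep(30)
```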

Relevance:

30.00%

Publisher:

Abstract:

The measurement of public attitudes towards the criminal law has become an important area of research in recent years because of the perceived desirability of ensuring that the legal system reflects broader societal values. In particular, studies into public perceptions of crime seriousness have attempted to measure the degree of concordance that exists between law and public opinion in the organization and enforcement of criminal offences. These understandings of perceived crime seriousness are particularly important in relation to high-profile issues where public confidence in the law is central to the legal agenda, such as the enforcement of work-related fatality cases. A need to respond to public concern over this issue was cited as a primary justification for the introduction of the Corporate Manslaughter and Corporate Homicide Act 2007. This article will suggest that, although literature looking at the perceived seriousness of corporate crime and, particularly, health and safety offences is limited in volume and generalist in scope, important lessons can be gleaned from existing literature, and pressing questions are raised that demand further empirical investigation.

Relevance:

30.00%

Publisher:

Abstract:

The atmospheric composition of West Africa reflects the interaction of various dynamical and chemical systems (i.e. biogenic, urban, convective and long-range transport) with signatures from local to continental scales. Recent measurements performed during the African Monsoon Multidisciplinary Analyses (AMMA) observational periods in 2005 and 2006 provide new data that allow new insight into the processes within these systems that control the distribution of ozone and its precursors. Using these new data and recently published results, we provide an overview of these systems, with a particular emphasis on ozone distributions over West Africa during the wet season.

Relevance:

30.00%

Publisher:

Abstract:

A 2D porous material, Cu3(tmen)3(tma)2(H2O)2·6.5H2O [tmen = N,N,N',N'-tetramethylethane-1,2-diamine; tmaH3 = 1,3,5-benzenetricarboxylic acid/trimesic acid], has been synthesized and characterized by X-ray single crystal analysis, variable-temperature magnetic measurements, IR spectra and XRPD pattern. The complex consists of 2D layers built from three crystallographically independent Cu(tmen) moieties bridged by tma anions. Of the three copper ions, Cu(1) and Cu(2) present distorted square pyramidal coordination geometry, while the third exhibits a severely distorted octahedral environment. The Cu(1)(tmen) and Cu(2)(tmen) building blocks bridged by tma anions give rise to chains with a zig-zag motif, which are cross-connected by Cu(3)(tmen)-tma polymers sharing the metal ions Cu(2) through pendant tma carboxylates. The resulting 2D architecture extends in the crystallographic ab-plane. The adjacent sheets are embedded through the Cu(3)(tmen)-tma chains, leaving H2O-filled channels. There are 6.5 lattice water molecules per formula unit, some of which are disordered. Upon heating, the lattice water molecules are eliminated without destroying the crystal morphology, and the compound rehydrates reversibly on exposure to a humid atmosphere. Magnetic data of the complex have been fitted considering isolated irregular Cu3 triangles (three different J parameters) by applying the CLUMAG program. The best fit indicates three closely comparable J parameters, with very weak antiferromagnetic interactions operative between the metal centers.
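For context, fitting "isolated irregular Cu3 triangles (three different J parameters)" conventionally means diagonalising an isotropic Heisenberg exchange Hamiltonian for three S = 1/2 centres. The expression below is the generic standard form (sign conventions vary between fitting programs), not necessarily the exact convention used by CLUMAG:

```latex
\hat{H} = -J_{1}\,\hat{S}_{1}\cdot\hat{S}_{2}
          -J_{2}\,\hat{S}_{2}\cdot\hat{S}_{3}
          -J_{3}\,\hat{S}_{3}\cdot\hat{S}_{1},
\qquad S_{1}=S_{2}=S_{3}=\tfrac{1}{2}.
```

In this convention, negative J values correspond to the weak antiferromagnetic interactions reported.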

Relevance:

30.00%

Publisher:

Abstract:

A novel series of polyaromatic ionomers with similar equivalent weights but very different sulphonic acid distributions along the ionomer backbone has been designed and prepared. By synthetically organising the sequence distribution so that it consists of fully defined ionic segments (containing singlets, doublets or quadruplets of sulphonic acid groups) alternating strictly with equally well-defined nonionic spacer segments, a new class of polymers, which may be described as microblock ionomers, has been developed. These materials exhibit very different properties and morphologies from analogous randomly substituted systems. Progressively extending the nonionic spacer length in the repeat unit (maintaining a constant equivalent weight by increasing the degree of sulphonation of the ionic segment) leads to an increasing degree of nanophase separation between hydrophilic and hydrophobic domains in these materials. Membranes cast from ionomers with the more highly phase-separated morphologies show significantly higher onset temperatures for uncontrolled swelling in water. This new type of ionomer design has enabled the fabrication of swelling-resistant hydrocarbon membranes, suitable for fuel cell operation, with much higher ion exchange capacities (>2 meq g−1) than those previously reported in the literature. When tested in a fuel cell at high temperature (120 °C) and low relative humidity (35% RH), the best microblock membrane matched the performance of Nafion 112. Moreover, comparative low-load cycle testing of membrane-electrode assemblies suggests that the durability of the new membranes under conditions of high temperature and low relative humidity is superior to that of conventional perfluorinated materials.
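The constant-equivalent-weight constraint in this design is a simple mass balance: equivalent weight (EW) is the polymer mass carrying one mole of sulphonic acid groups, and ion exchange capacity is its reciprocal (IEC in meq g−1 = 1000/EW). A minimal sketch with illustrative numbers, not taken from the paper:

```python
def equivalent_weight(repeat_unit_mass: float, so3h_per_repeat: int) -> float:
    """Equivalent weight (g per mole of -SO3H); IEC (meq/g) = 1000 / EW."""
    return repeat_unit_mass / so3h_per_repeat

# Illustrative only: doubling the spacer-extended repeat-unit mass while
# doubling the number of -SO3H groups in the ionic segment leaves the
# equivalent weight (and hence the IEC) unchanged.
print(equivalent_weight(1000.0, 2), 1000 / equivalent_weight(1000.0, 2))  # 500.0  2.0 meq/g
print(equivalent_weight(2000.0, 4), 1000 / equivalent_weight(2000.0, 4))  # 500.0  2.0 meq/g
```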

Relevance:

30.00%

Publisher:

Abstract:

Once unit-cell dimensions have been determined from a powder diffraction data set and the crystal system is therefore known (e.g. orthorhombic), the method presented by Markvardsen, David, Johnson & Shankland [Acta Cryst. (2001), A57, 47-54] can be used to generate a table ranking the extinction symbols of the given crystal system according to probability. Markvardsen et al. tested a computer program (ExtSym) implementing the method against Pawley refinement outputs generated using the TF12LS program [David, Ibberson & Matthewman (1992). Report RAL-92-032. Rutherford Appleton Laboratory, Chilton, Didcot, Oxon, UK]. Here, it is shown that ExtSym can be used successfully with many well-known powder diffraction analysis packages, namely DASH [David, Shankland, van de Streek, Pidcock, Motherwell & Cole (2006). J. Appl. Cryst. 39, 910-915], FullProf [Rodriguez-Carvajal (1993). Physica B, 192, 55-69], GSAS [Larson & Von Dreele (1994). Report LAUR 86-748. Los Alamos National Laboratory, New Mexico, USA], PRODD [Wright (2004). Z. Kristallogr. 219, 1-11] and TOPAS [Coelho (2003). Bruker AXS GmbH, Karlsruhe, Germany]. In addition, a precise description of the optimal input for ExtSym is given to enable other software packages to interface with ExtSym and to allow the improvement/modification of existing interfacing scripts. ExtSym takes as input the powder data in the form of integrated intensities and error estimates for these intensities. The output returned by ExtSym is demonstrated to be strongly dependent on the accuracy of these error estimates, and the reason for this is explained. ExtSym is tested against a wide range of data sets, confirming the algorithm to be very successful at ranking the published extinction symbol as the most likely.
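An interfacing script therefore has one essential job: export each reflection's integrated intensity together with its error estimate from the refinement package. The snippet below is a hypothetical illustration of that export step; the column layout is assumed for illustration only (the paper itself specifies the precise format ExtSym expects), and since ExtSym's ranking depends strongly on the accuracy of the sigmas, those values should be propagated directly from the refinement.

```python
# Hypothetical export step for an ExtSym interfacing script: one line per
# reflection with its integrated intensity and error estimate. The real
# ExtSym input format is specified in the paper; this layout is illustrative.
reflections = [
    # (h, k, l, integrated_intensity, sigma)
    (1, 0, 0, 152.3, 4.1),
    (0, 1, 1,  87.6, 3.2),
    (1, 1, 0,  12.4, 2.8),
]

with open("pawley_intensities.txt", "w") as f:
    for h, k, l, intensity, sigma in reflections:
        f.write(f"{h:3d} {k:3d} {l:3d} {intensity:12.4f} {sigma:10.4f}\n")
```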

Relevance:

30.00%

Publisher:

Abstract:

DFT and TD-DFT calculations (ADF program) were performed in order to analyze the electronic structure of the [M3(CO)12] clusters (M = Ru, Os) and interpret their electronic spectra. The highest occupied molecular orbitals are M-M bonding (sigma), involving different M-M bonds, both for Ru and Os. They participate in low-energy excitation processes, and their depopulation should weaken M-M bonds in general. While the LUMO is M-M and M-CO anti-bonding (sigma*), the next, higher-lying empty orbitals have a main contribution from CO (pi*) and either a small (Ru) or an almost negligible (Os) contribution from the metal atoms. The main difference between the two clusters comes from the different nature of these low-energy unoccupied orbitals, which have a larger metal contribution in the case of ruthenium. The photochemical reactivity of the two clusters is reexamined and compared to earlier interpretations.

Relevance:

30.00%

Publisher:

Abstract:

Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target at an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich so as to model technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
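As a concrete illustration of the approach recommended here, the sketch below implements a toy ABM in which each householder agent follows a simple rule set combining a budget threshold (limited economic rationality) with social influence from neighbours, so that adoption of a low-carbon technology diffuses over time. All parameter values and the adoption rule itself are illustrative assumptions, not results from the paper.

```python
import random

class Householder:
    """One agent: adopts a low-carbon technology under a simple rule set."""
    def __init__(self, budget_threshold: float, social_weight: float):
        self.adopted = False
        self.budget_threshold = budget_threshold  # limited economic rationality
        self.social_weight = social_weight        # susceptibility to neighbours

    def step(self, cost: float, adoption_rate: float) -> None:
        # Adopt if the technology looks affordable, with the perceived cost
        # nudged down as more neighbours adopt (a simple diffusion mechanism).
        perceived_cost = cost * (1.0 - self.social_weight * adoption_rate)
        if not self.adopted and perceived_cost <= self.budget_threshold:
            self.adopted = True

random.seed(1)
agents = [Householder(random.uniform(500, 3000), random.uniform(0.0, 0.5))
          for _ in range(1000)]
cost = 4000.0
for year in range(2010, 2031):
    rate = sum(a.adopted for a in agents) / len(agents)
    for a in agents:
        a.step(cost, rate)
    cost *= 0.95  # assumed annual decline in technology cost
    print(year, f"{sum(a.adopted for a in agents) / len(agents):.2%}")
```

Running the loop shows the characteristic diffusion pattern: no uptake while the technology is unaffordable, then accelerating adoption as falling costs and social influence reinforce each other.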

Relevance:

30.00%

Publisher:

Abstract:

In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
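Schematically, the splitting described above, together with the typical magnitudes quoted, can be summarised as follows (notation simplified from the paper):

```latex
\dot{S}_{\mathrm{mat}} = \dot{S}_{\mathrm{vert}} + \dot{S}_{\mathrm{hor}},
\qquad
\dot{S}_{\mathrm{mat}} \approx 55~\mathrm{mW\,m^{-2}\,K^{-1}},
\qquad
\dot{S}_{\mathrm{vert}} \approx 0.9\,\dot{S}_{\mathrm{mat}},
\quad
\dot{S}_{\mathrm{hor}} \approx 0.1\,\dot{S}_{\mathrm{mat}}.
```

That is, roughly 50 mW m−2 K−1 of the material entropy production is attributable to vertical processes and only about 5 mW m−2 K−1 to the large-scale horizontal heat transport, which is why a two-box (purely horizontal) picture is inadequate while a four-box model can capture both contributions.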

Relevance:

30.00%

Publisher:

Abstract:

Leisure is in the vanguard of a social and cultural revolution which is replacing the former East/West political bipolarity with a globalised economic system in which the new Europe has a central rôle. Within this revolution, leisure, including recreation, culture and tourism, is constructed as the epitome of successful capitalist development; the very legitimisation of the global transmogrification from a production to a consumption orientation. While acting as a direct encouragement to the political transformation in many eastern European states, it is uncertain how the issue of leisure policy is being handled, given its centrality to the new economic order. This paper therefore examines the experience of western Europe, considering in particular the degree to which the newly-created Department of National Heritage in the UK provides a potential model for leisure development and policy integration in the new Europe. Despite an official rhetoric of support and promotion of leisure activities, reflecting the growing economic significance of tourism and the positive relationship between leisure provision and regional economic development, the paper establishes that in the place of the traditional rôle of the state in promoting leisure interests, the introduction of the Department has signified a shift to the use of leisure to promote the Government's interests, particularly in regenerating citizen rights claims towards the market. While an institution such as the Department of National Heritage may have relevance to emerging states as an element in the maintenance of political hegemony, therefore, it is questionable how far it can be viewed as a promoter or protector of leisure as a signifier of a newly-won political, economic and cultural freedom throughout Europe.

Relevance:

30.00%

Publisher:

Abstract:

Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.

Relevance:

30.00%

Publisher:

Abstract:

Empathy is the lens through which we view, and respond to, others' emotion expressions. In this study, empathy and facial emotion recognition were investigated in adults with autism spectrum conditions (ASC; N=314), parents of a child with ASC (N=297) and IQ-matched controls (N=184). Participants completed a self-report measure of empathy (the Empathy Quotient [EQ]) and a modified version of the Karolinska Directed Emotional Faces Task (KDEF) using an online test interface. Results showed that mean scores on the EQ were significantly lower in fathers (p<0.05) but not mothers (p>0.05) of children with ASC compared to controls, whilst both males and females with ASC obtained significantly lower EQ scores (p<0.001) than controls. On the KDEF, statistical analyses revealed poorer overall performance by adults with ASC (p<0.001) compared to the control group. When the six distinct basic emotions were analysed separately, the ASC group showed impaired performance across five out of six expressions (happy, sad, angry, afraid and disgusted). Parents of a child with ASC were not significantly worse than controls at recognising any of the basic emotions, after controlling for age and non-verbal IQ (all p>0.05). Finally, results indicated significant differences between males and females with ASC for emotion recognition performance (p<0.05) but not for self-reported empathy (p>0.05). These findings suggest that self-reported empathy deficits in fathers of autistic probands are part of the 'broader autism phenotype'. This study also reports new findings of sex differences amongst people with ASC in emotion recognition, as well as replicating previous work demonstrating empathy difficulties in adults with ASC. The use of empathy measures as quantitative endophenotypes for ASC is discussed.
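A minimal sketch of the style of group comparison reported here (recognition accuracy compared across groups while controlling for age and non-verbal IQ) is shown below. The file and column names are hypothetical, and this illustrates the general analysis pattern rather than the study's actual pipeline.

```python
# ANCOVA-style group comparison controlling for age and non-verbal IQ,
# in the spirit of the analyses described; all names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("kdef_scores.csv")  # hypothetical file: one row per participant

# Linear model: emotion-recognition accuracy by group (ASC / parent / control),
# adjusting for the two covariates used in the study.
model = smf.ols("accuracy ~ C(group) + age + nonverbal_iq", data=df).fit()
print(model.summary())
```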

Relevance:

30.00%

Publisher:

Abstract:

The ground-based Atmospheric Radiation Measurement Program (ARM) and NASA Aerosol Robotic Network (AERONET) routinely monitor clouds using zenith radiances at visible and near-infrared wavelengths. Using the transmittance calculated from such measurements, we have developed a new retrieval method for cloud effective droplet size and conducted extensive tests for non-precipitating liquid water clouds. The underlying principle is to combine a liquid-water-absorbing wavelength (i.e., 1640 nm) with a non-water-absorbing wavelength for acquiring information on cloud droplet size and optical depth. For simulated stratocumulus clouds with liquid water path less than 300 g m−2 and horizontal resolution of 201 m, the retrieval method underestimates the mean effective radius by 0.8 μm, with a root-mean-squared error of 1.7 μm and a relative deviation of 13%. For actual observations with a liquid water path less than 450 g m−2 at the ARM Oklahoma site during 2007–2008, our 1.5-min-averaged retrievals are generally larger by around 1 μm than those from combined ground-based cloud radar and microwave radiometer at a 5-min temporal resolution. We also compared our retrievals to those from combined shortwave flux and microwave observations for relatively homogeneous clouds, showing that the bias between these two retrieval sets is negligible, but the error of 2.6 μm and the relative deviation of 22% are larger than those found in our simulation case. Finally, the transmittance-based cloud effective droplet radii agree to better than 11% with satellite observations and have a negative bias of 1 μm. Overall, the retrieval method provides reasonable cloud effective radius estimates, which can enhance the cloud products of both ARM and AERONET.
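A common way to implement a two-wavelength retrieval of this kind is to precompute a lookup table of modelled transmittances over a grid of optical depths and effective radii, then invert observations by finding the best match. The sketch below shows that generic pattern only; the forward model is a crude placeholder (the real method uses full radiative-transfer calculations), and all coefficients and ranges are illustrative.

```python
import numpy as np

def forward_model(tau: float, r_eff: float) -> tuple[float, float]:
    """Placeholder for a radiative-transfer calculation of zenith transmittance
    at a non-absorbing and a water-absorbing (1640 nm) wavelength. The real
    retrieval uses full radiative transfer; these exponentials are illustrative."""
    t_vis = np.exp(-0.05 * tau)                       # mainly optical depth
    t_nir = np.exp(-0.05 * tau) * np.exp(-0.02 * r_eff)  # plus droplet absorption
    return t_vis, t_nir

# Precompute a lookup table over plausible (tau, r_eff) ranges.
taus = np.linspace(1, 100, 200)
radii = np.linspace(2, 30, 120)  # micrometres
table = np.array([[forward_model(t, r) for r in radii] for t in taus])

def retrieve(t_vis_obs: float, t_nir_obs: float) -> tuple[float, float]:
    """Nearest-neighbour inversion: pick the (tau, r_eff) pair whose modelled
    transmittances best match the observed pair."""
    cost = (table[..., 0] - t_vis_obs) ** 2 + (table[..., 1] - t_nir_obs) ** 2
    i, j = np.unravel_index(np.argmin(cost), cost.shape)
    return taus[i], radii[j]

tau_hat, reff_hat = retrieve(0.14, 0.08)
print(f"tau = {tau_hat:.1f}, r_eff = {reff_hat:.1f} um")
```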

Relevance:

30.00%

Publisher:

Abstract:

A new rheology that explicitly accounts for the subcontinuum anisotropy of the sea ice cover is implemented into the Los Alamos sea ice model. This is in contrast to all models of sea ice included in global circulation models, which use an isotropic rheology. The model contains one new prognostic variable, the local structure tensor, which quantifies the degree of anisotropy of the sea ice, and two parameters that set the time scale of the evolution of this tensor. The anisotropic rheology provides a subcontinuum description of the mechanical behavior of sea ice and accounts for a continuum-scale stress with a large shear to compression ratio and a tensile stress component. Results over the Arctic of a stand-alone version of the model are presented, and anisotropic model sensitivity runs are compared with a reference elasto-visco-plastic simulation. Under realistic forcing, sea ice quickly becomes highly anisotropic over large length scales, as is observed from satellite imagery. The influence of the new rheology on the state and dynamics of the sea ice cover is discussed. Our reference anisotropic run reveals that the new rheology leads to a substantial change in the spatial distribution of ice thickness and ice drift relative to the reference standard visco-plastic isotropic run, with ice thickness regionally increased by more than 1 m and ice speed reduced by up to 50%.
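To make the new prognostic variable concrete: a local structure tensor is a 2x2 symmetric, unit-trace tensor whose eigenvalue splitting measures the degree of anisotropy (eigenvalues 1/2, 1/2 for isotropic ice). The sketch below evolves such a tensor by simple linear relaxation towards a preferred alignment on a prescribed time scale; this relaxation form and all parameter values are illustrative assumptions, not the model's actual evolution equation.

```python
import numpy as np

def evolve_structure_tensor(A: np.ndarray, A_target: np.ndarray,
                            dt: float, tau: float) -> np.ndarray:
    """Relax the 2x2 structure tensor A toward A_target on time scale tau.
    Linear relaxation is an illustrative stand-in for the sea ice model's
    actual evolution equation."""
    A = A + dt * (A_target - A) / tau
    return A / np.trace(A)  # keep the tensor unit-trace

def anisotropy(A: np.ndarray) -> float:
    """Degree of anisotropy: eigenvalue difference, 0 (isotropic) to 1."""
    w = np.linalg.eigvalsh(A)
    return float(w[-1] - w[0])

A = 0.5 * np.eye(2)             # initially isotropic ice cover
A_target = np.diag([0.9, 0.1])  # assumed preferred alignment
for _ in range(60):             # 60 days with a daily time step
    A = evolve_structure_tensor(A, A_target, dt=86400.0, tau=10 * 86400.0)
print("anisotropy after 60 days:", round(anisotropy(A), 3))
```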