31 results for post-Newtonian approximation to general relativity

in CentAUR: Central Archive, University of Reading - UK


Relevance: 100.00%

Abstract:

The climate belongs to the class of non-equilibrium, forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows the response of the system to be computed in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has well-recognized prototypical value: it is a spatially extended one-dimensional model and features the basic ingredients of the actual atmosphere, such as dissipation, advection and external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters make it possible to express these properties as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations with general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale, by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way of approaching the problems of climate sensitivity, climate prediction, and climate change from a radically new perspective.
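
For reference, the Lorenz 96 model used as a test bed has a standard formulation; the sketch below (a minimal illustration assuming the conventional form with cyclic boundary conditions, forcing parameter F, and the average energy as a global observable) shows the kind of unperturbed simulation from which the expectation values entering the response formulas are computed.

```python
import numpy as np

def lorenz96_tendency(x, F):
    """Standard Lorenz 96 tendencies: advection, linear dissipation, forcing.

    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices.
    """
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x0, F, dt=0.01, n_steps=10_000):
    """Fourth-order Runge-Kutta integration of the Lorenz 96 model."""
    x = x0.copy()
    for _ in range(n_steps):
        k1 = lorenz96_tendency(x, F)
        k2 = lorenz96_tendency(x + 0.5 * dt * k1, F)
        k3 = lorenz96_tendency(x + 0.5 * dt * k2, F)
        k4 = lorenz96_tendency(x + dt * k3, F)
        x += (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

# Example: 40 variables in a chaotic regime (F = 8); the average energy
# 0.5 * mean(x**2) is a global observable of the kind discussed above.
rng = np.random.default_rng(0)
x = integrate(rng.standard_normal(40), F=8.0)
print(0.5 * np.mean(x ** 2))
```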

Relevance: 100.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front end to larger-scale Grid resources such as the UK National Grid Service.
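
As an illustration of step (3), the sketch below shows the only change the abstract describes to an existing workflow script: substituting the model launch command. The script name run_nemo.sh and the use of a simple textual substitution are hypothetical placeholders; G-Rex's actual client options are not documented here.

```python
import re
from pathlib import Path

def adapt_workflow(script: Path) -> str:
    """Hypothetical helper for step (3): adapt an existing NEMO workflow
    script so model launches go through the G-Rex client instead of MPI.
    Pre- and post-processing commands in the script are left untouched."""
    text = script.read_text()
    # Replace e.g. "mpirun -np 40 ./nemo.exe" with "GRexRun -np 40 ./nemo.exe".
    return re.sub(r"\bmpirun\b", "GRexRun", text)

# The script path is a placeholder for a real workflow script.
print(adapt_workflow(Path("run_nemo.sh")))
```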

Relevance: 100.00%

Abstract:

This paper critically reviews the application of post-development analysis to sustainable development by employing a defined target for post-development analysis: the Environmental Kuznets Curve (EKC). Data from the 2005 Environmental Sustainability Index (ESI) for 146 countries are used to generate statistically significant EKC models, and the approach is deconstructed by employing post-development theory. While an ESI-derived EKC is an easy target for post-development critique, both rest on foundations that are not easily dismissed. Neither is the typical post-development 'alternative' of encouraging 'endogenous discourse' and grassroots movements at odds with sustainable development. As a result, this paper argues that sustainable development theory already incorporates much of the critique, and many of the alternatives, raised by post-developmentalists. Indeed, what is more disconcerting is that sustainable development readily encompasses ideas as apparently divergent as the ESI, the EKC, and the post-development critique and its solutions.
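
The EKC is conventionally estimated as a quadratic in log income; a sketch of the canonical specification (the paper's exact regressors and functional form are not given in the abstract, so the notation is assumed):

```latex
E_i = \beta_0 + \beta_1 \ln(\mathrm{GDP}_i) + \beta_2 \left[\ln(\mathrm{GDP}_i)\right]^2 + \varepsilon_i,
\qquad \beta_2 < 0,
```

where E_i is an environmental degradation indicator for country i, and the inverted-U turning point falls at ln(GDP*) = -β₁/(2β₂).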

Relevance: 100.00%

Abstract:

Within the context of active vision, scant attention has been paid to the execution of motion saccades: rapid re-adjustments of the direction of gaze made to attend to moving objects. In this paper we first develop a methodology for, and give real-time demonstrations of, the use of motion detection and segmentation processes to initiate capture saccades towards a moving object. The saccade is driven by both the position and the velocity of the moving target, under the assumption of constant target velocity, using prediction to overcome the delay introduced by visual processing. We next demonstrate the use of a first-order approximation to the segmented motion field to compute bounds on the time-to-contact in the presence of looming motion. If the bound falls below a safe limit, a panic saccade is fired, moving the camera away from the approaching object. We then describe the use of image motion to realize smooth pursuit, tracking using velocity information alone, where the camera is moved so as to null a single constant image motion fitted within a central image region. Finally, we glue together capture saccades with smooth pursuit, thus effecting changes both in what is being attended to and in how it is being attended to. To couple the different visual activities of waiting, saccading, pursuing and panicking, we use a finite state machine, which provides inherent robustness outside of visual processing and provides a means of repeated exploration. We demonstrate in repeated trials that the transition from saccadic motion to tracking is more likely to succeed using position and velocity control than using position alone.
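
The finite state machine coupling the four visual activities can be sketched as follows; the transition predicates (target_seen, on_target, and so on) are placeholders standing in for the motion detection, fixation and time-to-contact tests described above, not the paper's actual interface.

```python
from enum import Enum, auto

class State(Enum):
    WAITING = auto()    # no target: run motion detection/segmentation
    SACCADING = auto()  # capture saccade driven by position + velocity
    PURSUING = auto()   # smooth pursuit, nulling image motion
    PANICKING = auto()  # panic saccade away from a looming object

def step(state: State, target_seen: bool, on_target: bool,
         target_lost: bool, ttc_unsafe: bool) -> State:
    """One transition of the gaze-control state machine (a sketch)."""
    if ttc_unsafe:                      # time-to-contact bound below safe limit
        return State.PANICKING
    if state is State.WAITING and target_seen:
        return State.SACCADING          # fire a capture saccade
    if state is State.SACCADING and on_target:
        return State.PURSUING           # hand over to smooth pursuit
    if state in (State.PURSUING, State.PANICKING) and target_lost:
        return State.WAITING            # resume exploration
    return state
```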

Relevance: 100.00%

Abstract:

Undirected graphical models are widely used in statistics, physics and machine vision. However, Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of it has focussed on the important practical case where the data consist of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first approach, particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) to avoid the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is given in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior, owing to the use of MCMC to simulate from the latent graphical model in lieu of being able to do so exactly in general. The supplementary appendix also describes the nature of the resulting approximation.
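
A minimal sketch of the exchange algorithm for a fully observed Ising model is given below (a flat prior and symmetric random-walk proposal are assumed; the paper's setting additionally embeds this within particle MCMC to handle noisy observations). As the abstract notes, the auxiliary draw uses MCMC, here Gibbs sweeps, in lieu of exact simulation.

```python
import numpy as np

def unnorm_loglik(y, theta):
    """Unnormalised Ising log-likelihood: theta times the sum of
    neighbour agreements on a periodic 2-D lattice (a sketch)."""
    return theta * (np.sum(y * np.roll(y, 1, 0)) + np.sum(y * np.roll(y, 1, 1)))

def gibbs_sweeps(theta, shape, n_sweeps, rng):
    """Approximate draw from the Ising model by Gibbs sampling -- the same
    'MCMC in lieu of exact simulation' compromise the abstract describes."""
    y = rng.choice([-1, 1], size=shape)
    for _ in range(n_sweeps):
        for i in range(shape[0]):
            for j in range(shape[1]):
                nb = (y[(i + 1) % shape[0], j] + y[i - 1, j]
                      + y[i, (j + 1) % shape[1]] + y[i, j - 1])
                p = 1.0 / (1.0 + np.exp(-2.0 * theta * nb))
                y[i, j] = 1 if rng.random() < p else -1
    return y

def exchange_step(theta, y_obs, rng, step=0.1, n_sweeps=50):
    """One exchange-algorithm update (Murray et al., 2006): the intractable
    normalising constants cancel in the acceptance ratio."""
    theta_prop = theta + step * rng.standard_normal()
    y_aux = gibbs_sweeps(theta_prop, y_obs.shape, n_sweeps, rng)
    log_alpha = (unnorm_loglik(y_obs, theta_prop) + unnorm_loglik(y_aux, theta)
                 - unnorm_loglik(y_obs, theta) - unnorm_loglik(y_aux, theta_prop))
    return theta_prop if np.log(rng.random()) < log_alpha else theta

rng = np.random.default_rng(0)
y_obs = gibbs_sweeps(0.3, (16, 16), 100, rng)  # stand-in "observed" data
theta = 0.1
for _ in range(200):
    theta = exchange_step(theta, y_obs, rng)
print(theta)
```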

Relevance: 100.00%

Abstract:

An analytical model of orographic gravity wave drag due to sheared flow past elliptical mountains is developed. The model extends the domain of applicability of the well-known Phillips model to wind profiles that vary relatively slowly in the vertical, so that they may be treated using a WKB approximation. The model illustrates how linear processes associated with wind profile shear and curvature affect the drag force exerted by the airflow on mountains, and how it is crucial to extend the WKB approximation to second order in the small perturbation parameter for these effects to be taken into account. For the simplest wind profiles, the normalized drag depends only on the Richardson number, Ri, of the flow at the surface and on the aspect ratio, γ, of the mountain. For a linear wind profile, the drag decreases as Ri decreases, and this variation is faster when the wind is across the mountain than when it is along the mountain. For a wind that rotates with height while maintaining its magnitude, the drag generally increases as Ri decreases, by an amount depending on γ and on the incidence angle. The results from WKB theory are compared with exact linear results and also with results from a non-hydrostatic, nonlinear numerical model, showing generally encouraging agreement down to values of Ri of order one.
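
For orientation, a sketch of the quantities involved (the notation is assumed, not taken from the paper): the Richardson number governing the normalized drag, and the WKB expansion of the vertical wavenumber of the mountain waves in a small parameter ε measuring the slowness of the vertical wind variation,

```latex
\mathrm{Ri} = \frac{N^2}{\left(dU/dz\right)^2},
\qquad
m(z) \approx m_0(z) + \varepsilon\, m_1(z) + \varepsilon^2 m_2(z),
```

where N is the Brunt-Väisälä frequency and U(z) the background wind. The shear and curvature corrections to the drag enter only at O(ε²), which is why the expansion must be carried to second order.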

Relevance: 100.00%

Abstract:

The judiciousness of American felon suffrage policies has long been the subject of scholarly debate, not least due to the large number of affected Americans: an estimated 5.3 million citizens are ineligible to vote as a result of a criminal conviction. This article offers comparative law and international human rights perspectives and aims to make two main contributions to the American and global discourse. After an introduction in Part I, Part II offers comparative law perspectives on challenges to disenfranchisement legislation, juxtaposing U.S. case law against recent judgments rendered by courts in Canada, South Africa, Australia, and by the European Court of Human Rights. The article submits that owing to its unique constitutional stipulations, as well as to a general reluctance to engage foreign legal sources, U.S. jurisprudence lags behind an emerging global jurisprudential trend that increasingly views convicts' disenfranchisement as a suspect practice and subjects it to judicial review. This transnational judicial discourse follows a democratic paradigm and adopts a "residual liberty" approach to criminal justice that considers convicts to be rights-holders. The discourse rejects regulatory justifications for convicts' disenfranchisement, and instead sees disenfranchisement as a penal measure. In order to determine its suitability as a punishment, the adverse effects of disenfranchisement are weighed against its purported social benefits, using balancing or proportionality review. Part III analyzes the international human rights treaty regime. It assesses, in particular, Article 25 of the International Covenant on Civil and Political Rights ("ICCPR"), which proclaims that "every citizen" has a right to vote without "unreasonable restrictions." The analysis concludes that the phrase "unreasonable restrictions" is generally interpreted in a manner which tolerates certain forms of disenfranchisement, whereas other forms (such as life disenfranchisement) may be incompatible with treaty obligations. This article submits that disenfranchisement is a normatively flawed punishment. It fails to treat convicts as politically equal community members, degrades them, and causes them grave harms both as individuals and as members of social groups. These adverse effects outweigh the purported social benefits of disenfranchisement. Furthermore, as a core component of the right to vote, voter eligibility should cease to be subjected to balancing or proportionality review. The presumed facilitative nature of the right to vote makes suffrage less susceptible to deference-based objections regarding the judicial review of legislation, as well as to cultural relativity objections to furthering the international standardization of human rights obligations. In view of this, the article proposes the adoption of a new optional protocol to the ICCPR proscribing convicts' disenfranchisement. The article draws analogies between the proposed protocol and the ICCPR's "Optional Protocol Aiming at the Abolition of the Death Penalty." If adopted, the proposed protocol would strengthen the current trajectory towards expanding convicts' suffrage that emanates from the invigorated transnational judicial discourse.

Relevance: 100.00%

Abstract:

Major Depressive Disorder (MDD) has been associated with biased processing and abnormal regulation of negative and positive information, which may result from compromised coordinated activity of prefrontal and subcortical brain regions involved in evaluating emotional information. We tested whether patients with MDD show distributed changes in functional connectivity with a set of independently derived brain networks that have shown high correspondence with different task demands, including stimulus salience and emotional processing. We further explored whether connectivity during emotional word processing is related to the tendency to engage in positive or negative emotional states. In this study, 25 medication-free MDD patients without current or past comorbidity, and matched controls (n = 25), performed an emotional word-evaluation task during functional MRI. Using a dual regression approach, we derived individual spatial connectivity maps representing each subject's connectivity with each standard network, and used them to evaluate between-group differences and effects of positive and negative emotionality (extraversion and neuroticism, respectively, as measured with the NEO-FFI). Results showed decreased functional connectivity of the medial prefrontal cortex, ventrolateral prefrontal cortex, and ventral striatum with the fronto-opercular salience network in MDD patients compared to controls. In patients, abnormal connectivity was related to extraversion, but not neuroticism. These results confirm the hypothesis of a relative (para)limbic-cortical decoupling that may explain dysregulated affect in MDD. As connectivity of these regions with the salience network was related to extraversion, but not to general depression severity or negative emotionality, dysfunction of this network may be responsible for the failure to sustain engagement in rewarding behavior.
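
The two stages of dual regression can be sketched as follows (a minimal version using ordinary least squares; production pipelines typically add demeaning and variance normalisation, which are omitted here):

```python
import numpy as np

def dual_regression(data, group_maps):
    """Dual regression sketch: map standard network spatial maps onto one
    subject's fMRI data.

    data       : (n_timepoints, n_voxels) subject data, flattened in space
    group_maps : (n_networks, n_voxels) standard/group-level spatial maps
    returns    : subject-specific time series and spatial connectivity maps
    """
    # Stage 1: regress the spatial maps against each volume to obtain one
    # time series per network.
    ts, *_ = np.linalg.lstsq(group_maps.T, data.T, rcond=None)
    # Stage 2: regress those time series against each voxel's signal to
    # obtain subject-specific spatial (connectivity) maps.
    maps, *_ = np.linalg.lstsq(ts.T, data, rcond=None)
    return ts, maps
```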

Relevance: 100.00%

Abstract:

Across two studies, we examined the association between adiposity, restrictive feeding practices and cortical processing bias to food stimuli in children. We assessed the P3b event-related potential (ERP) during visual oddball tasks in which the frequently presented stimulus was non-food and the infrequently presented stimulus was either a food (Study 1) or a non-food (Study 2) item. Children responded to the infrequently presented stimulus, and response accuracy and speed were recorded. Restrictive feeding practices and children's height and weight were also measured. In Study 1, the difference in P3b amplitude for infrequently presented food stimuli, relative to frequently presented non-food stimuli, was negatively associated with adiposity and, after controlling for adiposity, positively associated with restrictive feeding practices. There was no association between the P3b amplitude difference and adiposity or restriction in Study 2, suggesting that the effects seen in Study 1 were not due to general attentional processes. Taken together, our results suggest that attentional salience, as indexed by P3b amplitude, may be important for understanding the neural correlates of adiposity and restrictive feeding practices in children.
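
The P3b amplitude difference at the heart of both studies amounts to a difference wave between conditions; a sketch of one common way to compute it (the latency window and the use of a mean-amplitude measure are assumptions, not the paper's exact pipeline):

```python
import numpy as np

def p3b_difference(erp_infrequent, erp_frequent, times, window=(0.3, 0.6)):
    """Mean-amplitude P3b difference between oddball conditions (a sketch).

    erp_* : (n_channels, n_times) trial-averaged ERPs per condition
    times : (n_times,) sample times in seconds
    """
    sel = (times >= window[0]) & (times <= window[1])
    # Difference wave in the assumed P3b latency range, averaged over the window.
    return (erp_infrequent[:, sel] - erp_frequent[:, sel]).mean(axis=1)
```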

Relevance: 100.00%

Abstract:

While the private sector has long been in the vanguard of shaping and managing urban environs, under the New Labour government business actors were also heralded as key agents in the delivery of sustainable places. Policy interventions, such as Business Improvement Districts (BIDs), saw business-led local partnerships positioned as key drivers in the production of economically, socially and environmentally sustainable urban communities. This research considers how one business-led body, the South Bank Employers' Group (SBEG), has inserted itself into, and influenced, local (re)development trajectories. Interview, observational and archival data are used to explore how, in a neighbourhood noted for its turbulent and conflictual development past, SBEG has led a series of regeneration programmes that it asserts will create a "better South Bank for all". A belief in consensual solutions underscored New Labour's urban agenda and cast regeneration as a politically neutral process in which different stakeholders can reach mutually beneficial solutions (Southern, 2001). For authors such as Mouffe (2005), the search for consensus represents a move towards a 'post-political' approach to governing, in which the (necessarily) antagonistic nature of the political is denied. The research utilises writings on the 'post-political' condition to frame an empirical exploration of regeneration at the neighbourhood level. It shows how SBEG has brokered a consensual vision of regeneration with the aim of overriding past disagreements about local development. While this may be seen as an attempt to enact what Honig (1993: 3) calls the 'erasure of resistance from political orderings' by assuming control of regeneration agendas (see also Baeten, 2009), the research shows that 'resistances' to SBEG's activities continue to be expressed in a series of ways. These resistances suggest that, while increasingly 'post-political' in character, local place shaping continues to evidence what Massey (2005: 10) calls the 'space of loose ends and missing links' from which political activity can, at least potentially, emerge.

Relevance: 100.00%

Abstract:

This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Using a pre-test/post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to the general speaking and listening practice of the syllabus. The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlations. The results indicate that after the intervention, while some fluency gains were achieved by the CG, the EG produced language that was statistically significantly more fluent, demonstrating a faster speech and articulation rate, longer runs and higher phonation time ratios. The significant correlations obtained between measures of accuracy and learners' pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings of the study have significant implications for L2 pedagogy, highlighting the positive impact of instruction on the development of fluency.
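
The fluency measures named here have standard temporal-fluency definitions; a sketch (the abstract does not spell out its exact operationalisations, so these are the usual formulations):

```latex
\text{speech rate} = \frac{N_{\text{syll}}}{T_{\text{total}}},
\qquad
\text{articulation rate} = \frac{N_{\text{syll}}}{T_{\text{phonation}}},
\qquad
\text{phonation time ratio} = \frac{T_{\text{phonation}}}{T_{\text{total}}},
```

where T_total includes pauses, T_phonation excludes them, and a 'run' is the stretch of speech between two pauses.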

Relevance: 100.00%

Abstract:

European air quality legislation has reduced emissions of air pollutants across Europe since the 1970s, affecting air quality, human health and regional climate. We used a coupled composition-climate model to simulate the impacts of European air quality legislation and technology measures implemented between 1970 and 2010. We contrast simulations using two emission scenarios: one with actual emissions in 2010, and the other with the emissions that would have occurred in 2010 in the absence of technological improvements and end-of-pipe treatment measures in the energy, industrial and road transport sectors. European emissions of sulphur dioxide, black carbon (BC) and organic carbon in 2010 are 53%, 59% and 32% lower, respectively, than the emissions that would have occurred in 2010 in the absence of legislative and technology measures. These emission reductions decreased simulated European annual mean concentrations of fine particulate matter (PM2.5) by 35%, sulphate by 44%, BC by 56% and particulate organic matter by 23%. The reduction in PM2.5 concentrations is calculated to have prevented 80,000 (37,000-116,000, at the 95% confidence interval) premature deaths annually across the European Union, resulting in a perceived financial benefit to society of US$232 billion annually (1.4% of 2010 EU GDP). The reduction in aerosol concentrations due to legislative and technology measures caused a positive change in the aerosol radiative effect at the top of the atmosphere, reduced atmospheric absorption, and increased the amount of solar radiation incident at the surface over Europe. We used an energy budget approximation to estimate that these changes in the radiative balance have increased European annual mean surface temperature and precipitation by 0.45 ± 0.11 °C and 13 ± 0.8 mm yr⁻¹, respectively. Our results show that the implementation of European legislation and technological improvements to reduce the emission of air pollutants has improved air quality and human health over Europe, as well as having an unintended impact on the regional radiative balance and climate.
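
In its simplest form, the energy budget approximation referred to is a linear relation between forcing and response; a sketch (the symbols λ and ΔF are illustrative, and their values are not given in the abstract):

```latex
\Delta T \approx \lambda\, \Delta F,
```

where ΔF (W m⁻²) is the change in radiative forcing caused by the aerosol changes and λ (K (W m⁻²)⁻¹) is a climate sensitivity parameter; an analogous scaling links the perturbation to the atmospheric energy budget to the precipitation change.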

Relevance: 100.00%

Abstract:

Radiation schemes in general circulation models currently make a number of simplifications when accounting for clouds, one of the most important being the removal of horizontal inhomogeneity. A new scheme is presented that attempts to account for the neglected inhomogeneity by using two regions of cloud in each vertical level of the model as opposed to one. One of these regions is used to represent the optically thinner cloud in the level, and the other represents the optically thicker cloud. So, along with the clear-sky region, the scheme has three regions in each model level and is referred to as “Tripleclouds.” In addition, the scheme has the capability to represent arbitrary vertical overlap between the three regions in pairs of adjacent levels. This scheme is implemented in the Edwards–Slingo radiation code and tested on 250 h of data from 12 different days. The data are derived from cloud retrievals using radar, lidar, and a microwave radiometer at Chilbolton, southern United Kingdom. When the data are grouped into periods equivalent in size to general circulation model grid boxes, the shortwave plane-parallel albedo bias is found to be 8%, while the corresponding bias is found to be less than 1% using Tripleclouds. Similar results are found for the longwave biases. Tripleclouds is then compared to a more conventional method of accounting for inhomogeneity that multiplies optical depths by a constant scaling factor, and Tripleclouds is seen to improve on this method both in terms of top-of-atmosphere radiative flux biases and internal heating rates.
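
The plane-parallel albedo bias arises because albedo is a concave function of optical depth, so computing the albedo of the horizontally averaged cloud overestimates the average albedo (Jensen's inequality). The sketch below illustrates the effect, and how splitting the cloudy fraction into two regions in the manner of Tripleclouds reduces it; the albedo function and the lognormal optical-depth distribution are illustrative assumptions, not the Edwards-Slingo scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_albedo(tau):
    """Simple concave albedo-optical depth relation (illustrative only)."""
    return tau / (tau + 7.0)

# An inhomogeneous cloud field: lognormally distributed optical depths.
tau = rng.lognormal(mean=1.0, sigma=0.8, size=100_000)

plane_parallel = toy_albedo(tau.mean())   # one homogeneous cloudy region
true_mean = toy_albedo(tau).mean()        # fully resolved inhomogeneity

# Tripleclouds-style approximation: two cloudy regions, split at the
# median into optically thinner and thicker halves of equal area.
lo, hi = tau[tau <= np.median(tau)], tau[tau > np.median(tau)]
tripleclouds = 0.5 * (toy_albedo(lo.mean()) + toy_albedo(hi.mean()))

print("plane-parallel bias:", plane_parallel - true_mean)
print("two-region bias:    ", tripleclouds - true_mean)
```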

Relevance: 100.00%

Abstract:

In many practical situations where spatial rainfall estimates are needed, rainfall occurs as a spatially intermittent phenomenon. An efficient geostatistical method for rainfall estimation in the case of intermittency has previously been published, comprising the estimation of two independent components: a binary random function for modeling the intermittency, and a continuous random function that models the rainfall inside the rainy areas. The final rainfall estimates are obtained as the product of the estimates of these two random functions. However, the published approach does not contain a method for estimating uncertainties. The contribution of this paper is the presentation of the indicator maximum likelihood estimator, from which the local conditional distribution of the rainfall value at any location may be derived using an ensemble approach. From the conditional distribution, representations of uncertainty such as the estimation variance and confidence intervals can be obtained. An approximation to the variance can be calculated more simply by assuming that rainfall intensity is independent of location within the rainy area. The methodology has been validated using simulated and real rainfall data sets. The results of these case studies show good agreement between predicted uncertainties and measured errors obtained from the validation data.
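
Under the independence assumption mentioned, a simplified variance follows directly from writing the rainfall as the product of the binary and continuous components; a sketch with assumed notation:

```latex
Z(s) = I(s)\,Y(s),
\qquad
\mathbb{E}[Z] = p\,\mu_Y,
\qquad
\operatorname{Var}(Z) = p\,\sigma_Y^2 + p(1-p)\,\mu_Y^2,
```

where p is the probability that location s is rainy and μ_Y, σ_Y² are the mean and variance of the rainfall intensity inside the rainy area.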