Abstract:
While the private sector has long been in the vanguard of shaping and managing urban environs, under the New Labour government business actors were also heralded as key agents in the delivery of sustainable places. Policy interventions, such as Business Improvement Districts (BIDs), saw business-led local partnerships positioned as key drivers in the production of economically, socially and environmentally sustainable urban communities. This research considers how one business-led body, South Bank Employers’ Group (SBEG), has inserted itself into, and influenced, local (re)development trajectories. Interview, observational and archival data are used to explore how, in a neighbourhood noted for its turbulent and conflictual development past, SBEG has led on a series of regeneration programmes that it asserts will create a “better South Bank for all”. A belief in consensual solutions underscored New Labour’s urban agenda and cast regeneration as a politically neutral process in which different stakeholders can reach mutually beneficial solutions (Southern, 2001). For authors such as Mouffe (2005), the search for consensus represents a move towards a ‘post-political’ approach to governing in which the (necessarily) antagonistic nature of the political is denied. The research utilises writings on the ‘post-political’ condition to frame an empirical exploration of regeneration at the neighbourhood level. It shows how SBEG has brokered a consensual vision of regeneration with the aim of overriding past disagreements about local development. While this may be seen as an attempt to enact what Honig (1993: 3) calls the ‘erasure of resistance from political orderings’ by assuming control of regeneration agendas (see also Baeten, 2009), the research shows that ‘resistances’ to SBEG’s activities continue to be expressed in a series of ways. These resistances suggest that, while increasingly ‘post-political’ in character, local place shaping continues to evidence what Massey (2005: 10) calls the ‘space of loose ends and missing links’ from which political activity can, at least potentially, emerge.
Abstract:
Wall plaster sequences from the Neolithic town of Çatalhöyük have been analysed and compared to three types of natural sediment found in the vicinity of the site, using a range of analytical techniques. Block samples containing the plaster sequences were removed from the walls of several different buildings on the East Mound. Sub-samples were examined by IR spectroscopy, X-ray diffraction and X-ray fluorescence to determine the overall mineralogical and elemental composition, whilst thin sections were studied using optical polarising microscopy, IR microscopy and environmental scanning electron microscopy with energy-dispersive X-ray analysis. The results of this study have shown that there are two types of wall plaster found in the sequences and that the sediments used to produce these were obtained from at least two distinct sources. In particular, the presence of clay, calcite and magnesian calcite in the foundation plasters suggested that these were prepared predominantly from a marl source. On the other hand, the finishing plasters were found to contain dolomite with a small amount of clay and no calcite, revealing that softlime was used in their preparation. Whilst marl is located directly below and around Çatalhöyük, the nearest source of softlime is 6.5 km away, an indication that the latter was important to the Neolithic people, possibly due to the whiter colour (5Y 8/1) of this sediment. Furthermore, the same two plaster types were found on each wall of Building 49, the main building studied in this research, and in all five buildings investigated, suggesting that the use of these sources was an established practice for the inhabitants of several different households across the site.
Abstract:
Children with an autism spectrum disorder (ASD) may be vulnerable to social isolation and bullying. We measured the friendship, fighting/bullying and victimization experiences of 10–12-year-old children with an ASD (N = 100) using parent, teacher and child self-report. Parent and teacher reports were compared with those for an IQ-matched group of children with special educational needs (SEN) without ASD (N = 80) and with UK population data. Parents and teachers reported a lower prevalence of friendships compared to population norms and to children with SEN without an ASD. Parents but not teachers reported higher levels of victimization than the SEN group. Half of the children with an ASD reported having friendships that involved mutuality. By teacher report, children with an ASD who were less socially impaired and in mainstream school experienced higher levels of victimization than more socially impaired children, whereas for more socially impaired children victimization did not vary by school placement. Strategies are required to support and improve the social interaction skills of children with an ASD, to enable them to develop and maintain meaningful peer friendships and avoid victimization.
Abstract:
This article assesses the extent to which it is ‘fair’ for the government to require owner-occupiers to draw on the equity accumulated in their home to fund their social care costs. The question is stimulated by the report of the Commission on Funding of Care and Support, Fairer Care Funding (the Dilnot Commission) and the subsequent Care Act 2014. The enquiry is located within the framework of social citizenship and the new social contract. It argues that the individualistic, contractarian approach, exemplified by the Dilnot Commission and reflected in the Act, raises questions when considered from the perspective of intergenerational fairness. We argue that our concerns with the Act could be addressed by inculcating an expectation of drawing on housing wealth to fund older age: a policy of asset-based welfare.
Abstract:
This paper outlines some of the physics opportunities available with the GSI RISING active stopper and presents preliminary results from an experiment aimed at performing beta-delayed gamma-ray spectroscopic studies in heavy, neutron-rich nuclei produced following the projectile fragmentation of a 1 GeV per nucleon 208Pb primary beam. The energy response of the silicon active stopping detector for both heavy secondary fragments and beta particles is demonstrated, and preliminary results on the decays of neutron-rich tantalum (Ta) to tungsten (W) isotopes are presented as examples of the potential of this technique to allow new structural studies in hitherto experimentally unreachable heavy, neutron-rich nuclei. The resulting spectral information inferred from excited states in the tungsten daughter nuclei is compared with results from axially symmetric Hartree–Fock calculations of the nuclear shape and suggests a change in ground state structure for the N = 116 isotone 190W compared to the lighter isotopes of this element.
Abstract:
While state-of-the-art models of Earth's climate system have improved tremendously over the last 20 years, nontrivial structural flaws still hinder their ability to forecast the decadal dynamics of the Earth system realistically. Contrasting the skill of these models not only with each other but also with empirical models can reveal the space and time scales on which simulation models exploit their physical basis effectively and quantify their ability to add information to operational forecasts. The skill of decadal probabilistic hindcasts for annual global-mean and regional-mean temperatures from the EU Ensemble-Based Predictions of Climate Changes and Their Impacts (ENSEMBLES) project is contrasted with several empirical models. Both the ENSEMBLES models and a “dynamic climatology” empirical model show probabilistic skill above that of a static climatology for global-mean temperature. The dynamic climatology model, however, often outperforms the ENSEMBLES models. The fact that empirical models display skill similar to that of today's state-of-the-art simulation models suggests that empirical forecasts can improve decadal forecasts for climate services, just as in weather, medium-range, and seasonal forecasting. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast evaluations. Doing so would clarify the extent to which state-of-the-art simulation models provide information beyond that available from simpler empirical models and clarify current limitations in using simulation forecasting for decision support. Ultimately, the skill of simulation models based on physical principles is expected to surpass that of empirical models in a changing climate; their direct comparison provides information on progress toward that goal, which is not available in model–model intercomparisons.
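To make the type of comparison described above concrete, the sketch below scores a probabilistic temperature forecast against a static climatology using the ignorance (negative log) score, one common choice for evaluating probability forecasts. It uses synthetic data and Gaussian forecast densities as stand-ins; it illustrates the skill comparison in general, not the ENSEMBLES evaluation itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for 40 years of observed annual global-mean temperature
# anomalies, plus a 9-member ensemble hindcast for each year (illustrative only).
obs = rng.normal(0.2, 0.15, size=40)
ensemble = obs[:, None] + rng.normal(0.0, 0.12, size=(40, 9))

def ignorance(mu, sigma, outcome):
    """Ignorance (negative log-likelihood) score of a Gaussian forecast density."""
    return 0.5 * np.log(2 * np.pi * sigma**2) + (outcome - mu) ** 2 / (2 * sigma**2)

# Forecast density: a Gaussian fitted to each year's ensemble.
fc = ignorance(ensemble.mean(axis=1), ensemble.std(axis=1, ddof=1), obs)

# Static climatology baseline: a single Gaussian fitted to the whole record.
# (A real evaluation would fit this out of sample; see the next sketch.)
clim = ignorance(obs.mean(), obs.std(ddof=1), obs)

# Positive mean difference: the forecast adds information over climatology.
print(f"mean skill over static climatology: {np.mean(clim - fc):.3f} nats")
```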
Abstract:
Simulation models are widely employed to make probability forecasts of future conditions on seasonal to annual lead times. Added value in such forecasts is reflected in the information they add, either to purely empirical statistical models or to simpler simulation models. An evaluation of seasonal probability forecasts from the Development of a European Multimodel Ensemble system for seasonal to inTERannual prediction (DEMETER) and ENSEMBLES multi-model ensemble experiments is presented. Two particular regions are considered: Nino3.4 in the Pacific and the Main Development Region in the Atlantic; these regions were chosen before any spatial distribution of skill was examined. The ENSEMBLES models are found to have skill against the climatological distribution on seasonal time-scales. For models in ENSEMBLES that have a clearly defined predecessor model in DEMETER, the improvement from DEMETER to ENSEMBLES is discussed. Due to the long lead times of the forecasts and the evolution of observation technology, the forecast-outcome archive for seasonal forecast evaluation is small; arguably, evaluation data for seasonal forecasting will always be precious. Issues of information contamination from in-sample evaluation are discussed and impacts (both positive and negative) of variations in cross-validation protocol are demonstrated. Other difficulties due to the small forecast-outcome archive are identified. The claim that the multi-model ensemble provides a ‘better’ probability forecast than the best single model is examined and challenged. Significant forecast information beyond the climatological distribution is also demonstrated in a persistence probability forecast. The ENSEMBLES probability forecasts add significantly more information to empirical probability forecasts on seasonal time-scales than on decadal scales. Current operational forecasts might be enhanced by melding information from both simulation models and empirical models. Simulation models based on physical principles are sometimes expected, in principle, to outperform empirical models; direct comparison of their forecast skill provides information on progress toward that goal.
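The in-sample contamination issue raised above can be made concrete with a small sketch: if the climatological baseline is fitted to the full record, it has already "seen" each outcome it is scored against, flattering the baseline; a leave-one-out protocol removes that advantage. This is a generic illustration of the protocol choice, not the paper's exact procedure.

```python
import numpy as np

def ignorance(mu, sigma, x):
    """Negative log density of a Gaussian forecast at the outcome x."""
    return 0.5 * np.log(2 * np.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

def skill_vs_climatology(fc_mu, fc_sigma, obs, leave_one_out=True):
    """Mean ignorance skill of a forecast relative to a climatological baseline.

    With leave_one_out=True the climatology is refitted for each verification
    year with that year withheld, so the baseline never sees the outcome it
    is scored against."""
    scores = []
    for i, x in enumerate(obs):
        rest = np.delete(obs, i) if leave_one_out else obs
        clim = ignorance(rest.mean(), rest.std(ddof=1), x)
        scores.append(clim - ignorance(fc_mu[i], fc_sigma[i], x))
    return float(np.mean(scores))
```

With an archive as short as the seasonal forecast-outcome record, the two protocols can give noticeably different skill estimates, which is the sensitivity to cross-validation protocol the abstract reports.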
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
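As a rough illustration of what a "common uncertainty reporting format" could look like in practice, the record below collects the elements the recommendations call for: explicit assumptions, the method used, and what was left unquantified. The field names are hypothetical choices for this sketch, not an EQUIP specification.

```python
from dataclasses import dataclass, field

@dataclass
class UncertaintyStatement:
    # Hypothetical common reporting record; field names are illustrative only.
    variable: str                      # quantity and period the statement covers
    central_estimate: float
    interval: tuple[float, float]      # e.g. (5th, 95th percentile)
    interval_interpretation: str       # "frequentist", "Bayesian credible", ...
    method: str                        # how the range was produced
    key_assumptions: list[str] = field(default_factory=list)
    unquantified_sources: list[str] = field(default_factory=list)

stmt = UncertaintyStatement(
    variable="regional mean temperature change, 2040-2069 vs 1971-2000",
    central_estimate=1.8,
    interval=(1.1, 2.9),
    interval_interpretation="Bayesian credible interval",
    method="weighted multi-model ensemble",
    key_assumptions=["models treated as exchangeable", "stationary model bias"],
    unquantified_sources=["structural error shared across models"],
)
```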
Abstract:
In recent years several methodologies have been developed to combine and interpret ensembles of climate models with the aim of quantifying uncertainties in climate projections. Constrained climate model forecasts have been generated by combining various choices of metrics used to weight individual ensemble members, with diverse approaches to sampling the ensemble. The forecasts obtained are often significantly different, even when based on the same model output. Therefore, a climate model forecast classification system can serve two roles: to provide a way for forecast producers to self-classify their forecasts; and to provide information on the methodological assumptions underlying the forecast generation and its uncertainty when forecasts are used for impacts studies. In this review we propose a possible classification system based on choices of metrics and sampling strategies. We illustrate the impact of some of the possible choices in the uncertainty quantification of large-scale projections of temperature and precipitation changes, and briefly discuss possible connections between climate forecast uncertainty quantification and decision-making approaches in the climate change context.
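The proposed classification is essentially two-dimensional: which metric (if any) is used to weight the ensemble members, and how the ensemble is sampled. A minimal encoding of that idea, with illustrative category names of our own choosing rather than the review's taxonomy, might look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ForecastClass:
    """Classification of a constrained climate forecast along the two axes
    discussed above. Category names are illustrative, not a fixed taxonomy."""
    metric: str    # e.g. "unweighted", "historical-mean skill", "trend skill"
    sampling: str  # e.g. "multi-model", "perturbed-physics", "resampled"

# Two forecasts built from the same model output can fall into different
# classes, and hence imply different uncertainty ranges:
a = ForecastClass(metric="unweighted", sampling="multi-model")
b = ForecastClass(metric="trend skill", sampling="resampled")
print(a, b, sep="\n")
```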
Abstract:
Incorporating a prediction into future planning and decision making is advisable only if we have judged the prediction’s credibility. This is notoriously difficult and controversial in the case of predictions of future climate. By reviewing epistemic arguments about climate model performance, we discuss how to make and justify judgments about the credibility of climate predictions. We propose a new bounding argument that justifies basing such judgments on the past performance of possibly dissimilar prediction problems. This encourages a more explicit use of data in making quantitative judgments about the credibility of future climate predictions, and in training users of climate predictions to become better judges of credibility. We illustrate the approach using decadal predictions of annual mean, global mean surface air temperature.
Abstract:
We make use of the Skyrme effective nuclear interaction within the time-dependent Hartree-Fock framework to assess the effect of inclusion of the tensor terms of the Skyrme interaction on the fusion window of the 16O–16O reaction. We find that the lower fusion threshold, around the barrier, is quite insensitive to these details of the force, but the higher threshold, above which the nuclei pass through each other, changes by several MeV between different tensor parametrisations. The results suggest that fusion properties may eventually become part of the evaluation or fitting process for effective nuclear interactions.
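For context, the time-dependent Hartree-Fock evolution used here takes the standard textbook form (this is the general framework, not a formula specific to this paper):

```latex
i\hbar\,\frac{\partial \rho(t)}{\partial t} \;=\; \bigl[\, h[\rho(t)]\,,\; \rho(t) \,\bigr]
```

where \rho is the one-body density matrix and h[\rho] is the single-particle Hamiltonian derived self-consistently from the Skyrme energy density functional, here including the tensor terms whose parametrisation is varied.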
Abstract:
The nuclear time-dependent Hartree-Fock model formulated in three-dimensional space, based on the full standard Skyrme energy density functional complemented with the tensor force, is presented. Full self-consistency is achieved by the model. The application to the isovector giant dipole resonance is discussed in the linear limit, ranging from spherical nuclei (16O and 120Sn) to systems displaying axial or triaxial deformation (24Mg, 28Si, 178Os, 190W and 238U). Particular attention is paid to the spin-dependent terms from the central sector of the functional, recently included together with the tensor. They turn out to be capable of producing a qualitative change in the strength distribution in this channel. The effect on the deformation properties is also discussed. The quantitative effects on the linear response are small and, overall, the giant dipole energy remains unaffected. Calculations are compared to predictions from the (quasi)particle random-phase approximation and experimental data where available, finding good agreement.
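In the linear limit referred to above, the computed quantity is the strength function of the isovector dipole operator; schematically, using standard definitions rather than notation taken from the paper:

```latex
S_F(E) \;=\; \sum_{\nu} \bigl|\langle \nu \,|\, \hat{F} \,|\, 0 \rangle\bigr|^{2}\,\delta(E - E_{\nu}),
\qquad
\hat{F} \;=\; \frac{NZ}{A}\left(\frac{1}{Z}\sum_{p} z_p \;-\; \frac{1}{N}\sum_{n} z_n\right)
```

In TDHF practice S_F(E) is typically extracted from the Fourier transform of the dipole moment's time response to a small initial boost, which reproduces the sum above in the small-amplitude limit up to convention-dependent factors.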
Abstract:
The role of the tensor terms in the Skyrme interaction is studied for their effect in dynamic calculations where non-zero contributions to the mean field may arise, even when the starting nucleus or nuclei are even-even and have no active time-odd potentials in the ground state. We study collisions in the test-bed 16O-16O system, and give a qualitative analysis of the behaviour of the time-odd tensor-kinetic density, which only appears in the mean-field Hamiltonian in the presence of the tensor force. We find that an axial excitation of this density is induced by the collision.