57 results for variance change point detection


Relevance:

30.00%

Publisher:

Abstract:

The misuse of Personal Protective Equipment results in health risks among smallholders in developing countries, and education is often proposed to promote safer practices. However, evidence points to limited effects of education. This paper presents a System Dynamics model which allows the identification of risk-minimizing policies for behavioural change. The model is based on the IAC framework and survey data. It represents farmers' decision-making from an agent-oriented standpoint. The most successful intervention strategy was the one which intervened in the long term, targeted key stocks in the system and was diversified. However, the results suggest that, under these conditions, no policy is able to trigger a self-sustaining behavioural change. Two implementation approaches were suggested by experts. One, based on constant social control, corresponds to a change of the current model's parameters. The other, based on participation, would lead farmers to new thinking, i.e. changes in their decision-making structure.
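The stock-and-flow logic behind such a System Dynamics model can be sketched in a few lines of Python: a single "safe-practice adopters" stock whose inflow operates only while an intervention runs and which decays afterwards. All rates and names here are hypothetical illustrations, not the paper's calibrated model:

```python
def simulate_adoption(adoption_rate=0.10, decay_rate=0.05,
                      intervention_end=50, steps=200, dt=1.0):
    """Euler-integrate one stock: the fraction of farmers using safe
    practices. Inflow operates only while the intervention runs;
    outflow (relapse to unsafe practices) is always active."""
    adopters, history = 0.0, []
    for t in range(steps):
        inflow = adoption_rate * (1.0 - adopters) if t < intervention_end else 0.0
        outflow = decay_rate * adopters
        adopters += dt * (inflow - outflow)
        history.append(adopters)
    return history

h = simulate_adoption()
# Adoption climbs while the intervention runs, then relaxes back
# toward zero once it stops, i.e. the change is not self-sustaining.
```

The trajectory reproduces the qualitative finding quoted above: without a structural change in decision-making, the behaviour reverts once the intervention ends.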

Relevance:

30.00%

Publisher:

Abstract:

The relevance of regional policy for less favoured regions (LFRs) reveals itself when policy-makers must reconcile competitiveness with social cohesion through the adaptation of competition or innovation policies. The vast literature in this area generally builds on an overarching concept of ‘social capital’ as the necessary relational infrastructure for collective action diversification and policy integration, in a context much influenced by a dynamic of industrial change and a necessary balance between the creation and diffusion of ‘knowledge’ through learning. This relational infrastructure or ‘social capital’ is centred on people’s willingness to cooperate and ‘envision’ futures as a result of “social organization, such as networks, norms and trust that facilitate action and cooperation for mutual benefit” (Putnam, 1993: 35). Advocates of this interpretation of ‘social capital’ have adopted the ‘new growth’ thinking behind ‘systems of innovation’ and ‘competence building’, arguing that networks have the potential to make both public administration and markets more effective as well as ‘learning’ trajectories more inclusive of the development of society as a whole. This essay aims to better understand the role of ‘social capital’ in the production and reproduction of uneven regional development patterns, and to critically assess the limits of a ‘systems concept’ and an institution-centred approach to comparative studies of regional innovation. These aims are discussed in light of the following two assertions: i) learning behaviour, from an economic point of view, has its determinants, and ii) the positive economic outcomes of ‘social capital’ cannot be taken as a given. It is suggested that an agent-centred approach to comparative research best addresses the ‘learning’ determinants and the consequences of social networks on regional development patterns. A brief discussion of the current debate on innovation surveys has been provided to illustrate this point.

Relevance:

30.00%

Publisher:

Abstract:

The family of theories dubbed ‘luck egalitarianism’ represent an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when they result from, or the extent to which they result from, choice, but are unjust when they result from, or the extent to which they result from, luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities, even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether the person can be held responsible, or the extent to which they can be held responsible, should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether or not, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just, and so the question of responsibility is not settled but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone – if responsibility can weaken then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken – that we are not necessarily fully responsible for a choice for ever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint and that responsibility can weaken Dynamic Luck Egalitarianism (DLE). 
In conclusion I offer a preliminary discussion of what kind of policies DLE would support.

Relevance:

30.00%

Publisher:

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long-term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed instream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short-term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.

Relevance:

30.00%

Publisher:

Abstract:

We investigate the super-Brownian motion with a single point source in dimensions 2 and 3, as constructed by Fleischmann and Mueller in 2004. Using analytic facts we derive the long-time behaviour of the mean in dimensions 2 and 3, thereby complementing previous work of Fleischmann, Mueller and Vogt. Using spectral theory and martingale arguments we prove a version of the strong law of large numbers for the two-dimensional superprocess with a single point source and finite variance.

Relevance:

30.00%

Publisher:

Abstract:

The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series; a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third order sample (auto) bicorrelations at lags h,k≥1, is also a predetermined constant, different from that in the second order case, for any stationary time series of arbitrary length.
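The predetermined-constant result is easy to check numerically: for any stationary series of length n, the sum of the usual sample autocorrelations at lags 1 to n−1 equals −1/2, regardless of the data. A minimal illustration (not the paper's code):

```python
import numpy as np

def sample_acf_sum(x):
    """Sum of the usual sample autocorrelations r_h for h = 1..n-1."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()                 # deviations sum exactly to zero
    c0 = np.dot(d, d)                # n times the lag-0 autocovariance
    return sum(np.dot(d[:-h], d[h:]) for h in range(1, len(x))) / c0

rng = np.random.default_rng(0)
for series in (rng.normal(size=50),               # white noise
               np.cumsum(rng.normal(size=300))):  # highly persistent
    print(round(sample_acf_sum(series), 10))      # -0.5 both times
```

The identity follows because the squared sum of the deviations is zero, so the off-diagonal cross-products must sum to minus half the lag-0 term; this is why the statistic carries no information about long memory.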

Relevance:

30.00%

Publisher:

Abstract:

A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered as an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally we show how using the latest estimates of φ from climate models with a mean value of 1.6—as opposed to previously reported values of 1.4—can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15 %. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65 %, equivalent to almost 200 trillion dollars over 200 years.
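The gain from adding φ can be illustrated with a toy regression on synthetic "model" data; all numbers below are hypothetical, not the CMIP3 values analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 22                                    # number of models, as in CMIP3
tcr = rng.normal(1.8, 0.4, n)             # hypothetical TCR values (K)
phi = rng.normal(1.6, 0.15, n)            # hypothetical land-sea warming ratios
regional_dT = 1.2 * tcr + 0.8 * phi + rng.normal(0, 0.05, n)

def r_squared(X, y):
    """Fraction of inter-model variance explained by a linear fit."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()

r2_tcr = r_squared(tcr[:, None], regional_dT)
r2_both = r_squared(np.column_stack([tcr, phi]), regional_dT)
print(r2_tcr, r2_both)   # adding phi raises the explained variance
```

Because φ is generated independently of TCR here, it contributes genuinely new information to the fit, mirroring the paper's argument that the two metrics are uncorrelated across models.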

Relevance:

30.00%

Publisher:

Abstract:

The dynamics of Northern Hemisphere major midwinter stratospheric sudden warmings (SSWs) are examined using transient climate change simulations from the Canadian Middle Atmosphere Model (CMAM). The simulated SSWs show good overall agreement with reanalysis data in terms of composite structure, statistics, and frequency. Using observed or model sea surface temperatures (SSTs) is found to make no significant difference to the SSWs, indicating that the use of model SSTs in the simulations extending into the future is not an issue. When SSWs are defined by the standard (wind based) definition, an absolute criterion, their frequency is found to increase by ~60% by the end of this century, in conjunction with a ~25% decrease in their temperature amplitude. However, when a relative criterion based on the northern annular mode index is used to define the SSWs, no future increase in frequency is found. The latter is consistent with the fact that the variance of 100-hPa daily heat flux anomalies is unaffected by climate change. The future increase in frequency of SSWs using the standard method is a result of the weakened climatological mean winds resulting from climate change, which make it easier for the SSW criterion to be met. A comparison of winters with and without SSWs reveals that the weakening of the climatological westerlies is not a result of SSWs. The Brewer–Dobson circulation is found to be stronger by ~10% during winters with SSWs, which is a value that does not change significantly in the future.
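The standard wind-based criterion flags a major SSW when the wintertime zonal-mean zonal wind (conventionally at 10 hPa, 60°N) reverses from westerly to easterly. A toy detector along those lines, with illustrative thresholds rather than the paper's exact criteria:

```python
def detect_ssw_days(u10, min_gap=20):
    """Flag central dates of major warmings in a daily zonal-mean
    zonal-wind series u10 (m/s; e.g. at 10 hPa, 60N): days on which a
    westerly wind reverses to easterly, with events closer together
    than min_gap days merged into one warming."""
    events, last = [], -min_gap - 1
    for day in range(1, len(u10)):
        if u10[day] < 0.0 <= u10[day - 1] and day - last > min_gap:
            events.append(day)
            last = day
    return events

# Synthetic winter: strong westerlies with two brief reversals.
u = [20.0] * 40 + [-5.0] * 5 + [15.0] * 40 + [-3.0] * 5 + [18.0] * 10
print(detect_ssw_days(u))   # -> [40, 85]
```

An absolute criterion like this becomes easier to meet as the climatological westerlies weaken, which is exactly the mechanism the abstract identifies behind the projected frequency increase.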

Relevance:

30.00%

Publisher:

Abstract:

Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application in the Northern Hemisphere winter season. NCEP-Reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed both for case studies and from a climatological point of view. Results show that the choice of settings is of major relevance, especially for the tracking of smaller-scale and fast-moving systems. With an appropriate setting the algorithm is capable of automatically tracking different types of cyclones at the same time: both fast-moving and developing systems over the large ocean basins and smaller-scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds and areas of cyclogenesis and -lysis, gives detailed information on typical cyclone life cycles for different regions. The lowering of the spatial and temporal resolution of the input data from full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast-moving systems, which is relevant for the cyclone track density. Lowering spatial resolution alone mainly reduces the number of weak cyclones.
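The detection step of such an algorithm can be sketched as a search for sufficiently deep local minima in the MSLP field; the implementation below is a hypothetical simplification, not the adapted algorithm described here:

```python
import numpy as np

def find_lows(mslp, depth=100.0):
    """Grid points (i, j) that are local minima of mean sea level
    pressure (Pa), at least `depth` Pa below the mean of their eight
    neighbours -- a toy version of the detection step in cyclone
    identification (the threshold is illustrative)."""
    lows = []
    ni, nj = mslp.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            window = mslp[i - 1:i + 2, j - 1:j + 2]
            if mslp[i, j] > window.min():
                continue                      # not the lowest point locally
            neighbour_mean = (window.sum() - mslp[i, j]) / 8.0
            if neighbour_mean - mslp[i, j] >= depth:
                lows.append((i, j))
    return lows

field = np.full((10, 10), 101500.0)           # uniform background (Pa)
field[5, 5] = 100500.0                        # a deep synthetic low
field[2, 2] = 101450.0                        # too shallow to count
print(find_lows(field))                       # -> [(5, 5)]
```

The `depth` parameter plays the role of the tunable settings whose sensitivity the abstract discusses: loosening it admits the weaker, smaller-scale systems, tightening it keeps only deep lows.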

Relevance:

30.00%

Publisher:

Abstract:

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from emission-driven rather than concentration-driven perturbed parameter ensembles of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission-driven rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business-as-usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) of the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. For both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenario used to drive GCM ensembles lies towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range.
The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon cycle range. These high-end simulations can be linked to sampling a number of stronger carbon cycle feedbacks and to sampling climate sensitivities above 4.5 K. This latter aspect highlights the priority of identifying real-world climate-sensitivity constraints which, if achieved, would lead to reductions in the upper bound of projected global mean temperature change. The ensembles of simulations presented here provide a framework to explore relationships between present-day observables and future changes, while the large spread of future-projected changes highlights the ongoing need for such work.

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the spaces and power relations of ethical foodscapes. Ethics can offer a commodity a valuable unique selling point in a competitive marketplace but managing the changeable and multiple motivations for stakeholder participation throughout the commodity chain in order to utilise this opportunity is a complex negotiation. Through exploring the spaces and relations within three South African–UK ethical wine networks, the discursive tactics used to sustain these are uncovered. The discourses of Fairtrade, Black Economic Empowerment and organics are highly adaptive, interacting with each other in such a way as to always be contextually appealing. This ‘tactical mutability’ is combined with ‘scales of knowing’, which, this paper argues, are essential for network durability. ‘Scales of knowing’ refers to the recognition by stakeholders of the potential for different articulations of a discourse within the network, which combines with ‘tactical mutability’ to allow for a scalar, contextual and ‘knowing’ (im)mutability to ensure the discourse’s continued appeal. However, even when one discourse is the ‘lead’ it always folds within it linkages to other ethical discourses at work, suggesting that ethical practice is mutually supportive discursively. This means that at the producer end ethical interactions may offer more capacity to enact genuine transformation than the solo operations of a discourse.

Relevance:

30.00%

Publisher:

Abstract:

We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K decade⁻¹, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.

Relevance:

30.00%

Publisher:

Abstract:

Future climate change projections are often derived from ensembles of simulations from multiple global circulation models using heuristic weighting schemes. This study provides a more rigorous justification for this by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
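The choice between the general and the additive framework can be made concrete with a plain one-way analysis of variance: partition the spread of ensemble-member responses into between-model and within-model components and test whether the response is model dependent. A minimal sketch (not the paper's nested frameworks):

```python
import numpy as np

def one_way_anova_f(responses):
    """F statistic for 'is the mean climate response model dependent?'
    responses: dict mapping model name -> array of ensemble-member
    responses. A large F (relative to F_{k-1, n-k}) favours the general,
    model-dependent framework; a small F favours the simpler additive one."""
    groups = [np.asarray(g, dtype=float) for g in responses.values()]
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical cyclone-frequency changes from two models, 3 members each:
f = one_way_anova_f({"modelA": [1.0, 2.0, 3.0], "modelB": [2.0, 3.0, 4.0]})
print(f)   # -> 1.5
```

When F is statistically indistinguishable from 1, the between-model variation is no larger than internal variability, which is the situation in which the weighted multimodel mean can be read as an estimate of the actual response.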

Relevance:

30.00%

Publisher:

Abstract:

Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, due to its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR due to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow, and estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, for both the case of a single image containing flooding, and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. 
Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.

Relevance:

30.00%

Publisher:

Abstract:

Taste and smell detection threshold measurements are frequently time-consuming, especially when the method involves reversing the concentrations presented in order to replicate and improve the accuracy of results. These multiple replications are likely to cause sensory and cognitive fatigue, which may be more pronounced in elderly populations. A new rapid detection threshold methodology was developed that quickly located the likely position of each individual's sensory detection threshold and then refined this by providing multiple concentrations around this point to determine their threshold. This study evaluates the reliability and validity of this method. Findings indicate that this new rapid methodology was appropriate for identifying differences in sensory detection thresholds between different populations and has the positive benefit of providing a shorter assessment of detection thresholds. The results indicated that this method is appropriate for determining individual as well as group detection thresholds.
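Rapid threshold procedures of this general kind are often built on an adaptive up-down staircase; the sketch below is a generic, hypothetical illustration rather than the methodology developed in this study:

```python
def staircase_threshold(responds, start, step_factor=2.0, reversals_needed=6):
    """Toy adaptive (up-down) staircase for a detection threshold.
    `responds(conc)` -> True if the taster detects concentration
    `conc`. The concentration is divided by step_factor after a
    detection and multiplied after a miss; the threshold estimate is
    the mean of the concentrations at the recorded reversals."""
    conc, direction = float(start), 0
    reversals = []
    while len(reversals) < reversals_needed:
        new_direction = -1 if responds(conc) else 1
        if direction and new_direction != direction:
            reversals.append(conc)        # staircase changed direction
        direction = new_direction
        conc = conc * step_factor if new_direction > 0 else conc / step_factor
    return sum(reversals) / len(reversals)

# Simulated taster with a deterministic true threshold of 1.0:
estimate = staircase_threshold(lambda c: c >= 1.0, start=16.0)
print(estimate)   # -> 0.75 (brackets the true threshold of 1.0)
```

The staircase homes in on the threshold quickly from a distant starting concentration, which is the property that makes such designs attractive when fatigue limits the number of presentations.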