957 results for Cumulative Residual


Relevance: 20.00%

Abstract:

Two experiments implement and evaluate a training scheme for learning to apply frequency formats to probability judgements couched in terms of percentages. Results indicate that both conditional and cumulative probability judgements can be improved in this manner; however, the scheme is insufficient to promote any deeper understanding of the problem structure. In both experiments, training on one problem type only (either conditional or cumulative risk judgements) resulted in an inappropriate transfer of the learned method at test. The obstacles facing a frequency-based training programme for teaching appropriate use of probability data are discussed. Copyright (c) 2006 John Wiley & Sons, Ltd.

Relevance: 20.00%

Abstract:

The tendency to neglect base rates in judgment under uncertainty may be "notorious," as Barbey & Sloman (B&S) suggest, but it is neither inevitable (as they document; see also Koehler 1996) nor unique. Here we would like to point out another line of evidence connecting ecological rationality to dual processes: the failure of individuals to appropriately judge cumulative probability.
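Base-rate neglect of the kind discussed here is commonly mitigated by restating a Bayesian problem in natural frequencies rather than percentages. A minimal sketch using the classic mammography-style numbers (illustrative values, not taken from this abstract):

```python
# Restate a percentage-format Bayesian problem as counts in a reference class.
# Base rate 1%, hit rate 80%, false-positive rate 9.6% (illustrative numbers).
population = 10_000
sick = int(population * 0.01)        # 100 people have the condition
healthy = population - sick          # 9,900 do not
true_pos = int(sick * 0.80)          # 80 sick people test positive
false_pos = int(healthy * 0.096)     # 950 healthy people test positive

# In frequency format, P(sick | positive) is a simple ratio of counts:
p_sick_given_pos = true_pos / (true_pos + false_pos)
print(round(p_sick_given_pos, 3))    # ≈ 0.078 — far below the 80% hit rate
```

The counts make the size of the healthy-but-positive group visible, which is exactly the information that percentage formats obscure.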

Relevance: 20.00%

Abstract:

This paper summarizes the theory of simple cumulative risks—for example, the risk of food poisoning from the consumption of a series of portions of tainted food. Problems concerning such risks are extraordinarily difficult for naïve individuals, and the paper explains the reasons for this difficulty. It describes how naïve individuals usually attempt to estimate cumulative risks, and it outlines a computer program that models these methods. This account predicts that estimates can be improved if problems of cumulative risk are framed so that individuals can focus on the appropriate subset of cases. The paper reports two experiments that corroborated this prediction. They also showed that whether problems are stated in terms of frequencies (80 out of 100 people got food poisoning) or in terms of percentages (80% of people got food poisoning) did not reliably affect accuracy.
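Under independence, the simple cumulative risks described above follow one rule: the probability of at least one event over n exposures is 1 − (1 − p)^n. A minimal sketch, with the 80%-per-portion figure from the abstract and a hypothetical number of portions:

```python
def cumulative_risk(p: float, n: int) -> float:
    """Probability of at least one event in n independent exposures,
    each with per-exposure probability p."""
    return 1.0 - (1.0 - p) ** n

# Framing over the appropriate subset of cases: out of 100 people, how many
# escape the event on every one of n exposures?
p, n = 0.80, 3                            # 80 out of 100 per portion; n is illustrative
escape_all = round(100 * (1 - p) ** n)    # 100 * 0.2**3 = 0.8, i.e. about 1 person
print(cumulative_risk(p, n))              # ≈ 0.992
```

Focusing on the "escape every time" subset is the framing the paper predicts should help, because the complement is a single conjunction of counts rather than an iterated percentage.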

Relevance: 20.00%

Abstract:

Certain deghosters suffer from distortion caused by the quadrature-forming nature of the IF VSB filter operating on a ghosted IF signal. By analysing this effect a priori, a specific deghoster solution is found: the phase-quadrature-detected IF signal is used to cancel the VSB-induced ghost quadrature distortions from the detected in-phase signal before deghosting takes place.

Relevance: 20.00%

Abstract:

Synthetic pyrethroid insecticides are degraded almost entirely by ultraviolet (UV)-catalysed oxidation. A bioassay using the beetle Tribolium confusum duVal caged on bandages soaked in 0.04% a.i. cypermethrin showed large differences in residual insecticide-life under three plastic films available for cladding polytunnels. Cypermethrin exposed to a UV film that transmitted 70% of UVB and 80% of UVA killed all beetles for 8 weeks, compared to only 3 weeks for cypermethrin exposed in a clear plastic envelope. Cypermethrin under a UV-absorbing film that reduced the transmission of UVB and UVA to 14% and 50%, respectively, gave a complete kill for 17 weeks. Reducing the transmission of UVB to virtually zero, and that of UVA to only 3%, using a UV-opaque film prolonged the effective life of the cypermethrin residue to 26 weeks, and some beetles were still killed for a further 11 weeks. Even after this time, beetles exposed to cypermethrin from the UV-opaque treatment were still affected by the insecticide, and only showed near-normal mobility after 24 months of pesticide exposure to the UV-opaque film. These results have implications for the recommended intervals between cypermethrin treatment and crop harvest, and on the time of introduction of insect-based biological control agents, when UV-opaque films are used in commercial horticulture.
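The dose dependence reported above is consistent with simple first-order photodegradation, in which residual life scales inversely with the transmitted UV dose rate. A sketch under that assumption (the rate constant and efficacy threshold are hypothetical, chosen only to reproduce the qualitative ordering of the four films, not the measured weeks):

```python
import math

def effective_life(k_full_sun: float, uv_transmission: float,
                   threshold_fraction: float = 0.05) -> float:
    """Weeks until a first-order-decaying residue falls below a lethal
    threshold, assuming the decay rate scales with transmitted UV."""
    k = k_full_sun * uv_transmission      # per-week rate under this film
    return math.log(1.0 / threshold_fraction) / k

# Hypothetical calibration: choose k so a clear envelope gives ~3 weeks.
k_clear = math.log(1 / 0.05) / 3.0
for name, transmission in [("clear envelope", 1.0), ("UV film", 0.75),
                           ("UV-absorbing", 0.30), ("UV-opaque", 0.03)]:
    print(name, round(effective_life(k_clear, transmission), 1))
```

The real data are steeper than this one-parameter model (8 rather than 4 weeks for the UV film), but the inverse relationship between UV transmission and residual life is the point of the sketch.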

Relevance: 20.00%

Abstract:

The global temperature response to increasing atmospheric CO2 is often quantified by metrics such as equilibrium climate sensitivity and transient climate response [1]. These approaches, however, do not account for carbon cycle feedbacks and therefore do not fully represent the net response of the Earth system to anthropogenic CO2 emissions. Climate–carbon modelling experiments have shown that: (1) the warming per unit CO2 emitted does not depend on the background CO2 concentration [2]; (2) the total allowable emissions for climate stabilization do not depend on the timing of those emissions [3, 4, 5]; and (3) the temperature response to a pulse of CO2 is approximately constant on timescales of decades to centuries [3, 6, 7, 8]. Here we generalize these results and show that the carbon–climate response (CCR), defined as the ratio of temperature change to cumulative carbon emissions, is approximately independent of both the atmospheric CO2 concentration and its rate of change on these timescales. From observational constraints, we estimate CCR to be in the range 1.0–2.1 °C per trillion tonnes of carbon (Tt C) emitted (5th to 95th percentiles), consistent with twenty-first-century CCR values simulated by climate–carbon models. Uncertainty in land-use CO2 emissions and aerosol forcing, however, means that higher observationally constrained values cannot be excluded. The CCR, when evaluated from climate–carbon models under idealized conditions, represents a simple yet robust metric for comparing models, which aggregates both climate feedbacks and carbon cycle feedbacks. CCR is also likely to be a useful concept for climate change mitigation and policy; by combining the uncertainties associated with climate sensitivity, carbon sinks and climate–carbon feedbacks into a single quantity, the CCR allows CO2-induced global mean temperature change to be inferred directly from cumulative carbon emissions.
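Because CCR is approximately constant, the implied warming is a simple linear function of cumulative emissions. A minimal sketch using the 5th–95th percentile range quoted above:

```python
def warming_from_emissions(cumulative_TtC: float, ccr: float) -> float:
    """Global mean warming (deg C) implied by cumulative carbon emissions,
    for a given carbon-climate response (deg C per trillion tonne C)."""
    return ccr * cumulative_TtC

CCR_LOW, CCR_HIGH = 1.0, 2.1   # deg C per Tt C (5th-95th percentiles, as above)
emitted = 1.0                  # 1 Tt C of cumulative emissions
print(warming_from_emissions(emitted, CCR_LOW),
      warming_from_emissions(emitted, CCR_HIGH))   # 1.0 to 2.1 deg C
```

The linearity is the whole value of the metric: no concentration pathway or emission timing is needed to read off the temperature implication.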

Relevance: 20.00%

Abstract:

Global efforts to mitigate climate change are guided by projections of future temperatures [1]. But the eventual equilibrium global mean temperature associated with a given stabilization level of atmospheric greenhouse gas concentrations remains uncertain [1, 2, 3], complicating the setting of stabilization targets to avoid potentially dangerous levels of global warming [4, 5, 6, 7, 8]. Similar problems apply to the carbon cycle: observations currently provide only a weak constraint on the response to future emissions [9, 10, 11]. Here we use ensemble simulations of simple climate-carbon-cycle models, constrained by observations and projections from more comprehensive models, to simulate the temperature response to a broad range of carbon dioxide emission pathways. We find that the peak warming caused by a given cumulative carbon dioxide emission is better constrained than the warming response to a stabilization scenario. Furthermore, the relationship between cumulative emissions and peak warming is remarkably insensitive to the emission pathway (timing of emissions or peak emission rate). Hence policy targets based on limiting cumulative emissions of carbon dioxide are likely to be more robust to scientific uncertainty than emission-rate or concentration targets. Total anthropogenic emissions of one trillion tonnes of carbon (3.67 trillion tonnes of CO2), about half of which has already been emitted since industrialization began, result in a most likely peak carbon-dioxide-induced warming of 2 °C above pre-industrial temperatures, with a 5–95% confidence interval of 1.3–3.9 °C.

Relevance: 20.00%

Abstract:

The Southern Ocean circulation consists of a complicated mixture of processes and phenomena that arise at different temporal and spatial scales and need to be parametrized in state-of-the-art climate models. The temporal and spatial scales that give rise to the present-day residual mean circulation are investigated here by calculating the Meridional Overturning Circulation (MOC) in density coordinates from an eddy-permitting global model. The region sensitive to the temporal decomposition is located between 38°S and 63°S and is associated with the eddy-induced transport. The "Bolus" component of the residual circulation corresponds to the eddy-induced transport; it is dominated by timescales between 1 month and 1 year. The temporal behavior of the transient eddies is examined by splitting the "Bolus" component into a "Seasonal", an "Eddy" and an "Inter-monthly" component, respectively representing the correlation between density and velocity fluctuations due to the average seasonal cycle, due to mesoscale eddies, and due to large-scale motion on timescales longer than one month that is not due to the seasonal cycle. The "Seasonal" bolus cell is important at all latitudes near the surface. The "Eddy" bolus cell is dominant in the thermocline between 50°S and 35°S and over the whole ocean depth at the latitude of the Drake Passage. The "Inter-monthly" bolus cell is important in all density classes and is maximal in the Brazil–Malvinas Confluence and the Agulhas Return Current. The spatial decomposition indicates that a large part of the Eulerian mean circulation is recovered for spatial scales larger than 11.25°, implying that small-scale meanders in the Antarctic Circumpolar Current (ACC), near the Subantarctic and Polar Fronts, and near the Subtropical Front are important in the compensation of the Eulerian mean flow.
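The temporal split described above is a Reynolds-type decomposition: the bolus (eddy-induced) flux is the time-mean of density and velocity anomaly products, and the "Seasonal" part is the flux carried by the climatological monthly cycle alone. A toy sketch on synthetic monthly data (variable names and amplitudes are illustrative, not the model's):

```python
import numpy as np

rng = np.random.default_rng(0)
months = 120                                   # 10 years of monthly means
t = np.arange(months)
v = 0.1 * np.sin(2 * np.pi * t / 12) + 0.05 * rng.standard_normal(months)
rho = 0.2 * np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(months)

# Total bolus flux: time-mean of v' rho', primes = deviations from the record mean.
vp, rp = v - v.mean(), rho - rho.mean()
bolus_total = np.mean(vp * rp)

# "Seasonal": flux carried by the average seasonal cycle alone.
v_clim = v.reshape(-1, 12).mean(axis=0)        # climatological monthly means
r_clim = rho.reshape(-1, 12).mean(axis=0)
bolus_seasonal = np.mean((v_clim - v_clim.mean()) * (r_clim - r_clim.mean()))

# Remainder: "Eddy" plus "Inter-monthly" variability, lumped together here.
bolus_residual = bolus_total - bolus_seasonal
print(bolus_total, bolus_seasonal, bolus_residual)
```

In the paper the remainder is further split by timescale (shorter versus longer than one month, excluding the seasonal cycle); the sketch stops at the seasonal/non-seasonal cut.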

Relevance: 20.00%

Abstract:

This paper evaluates environmental externality when the structure of the externality is cumulative. The evaluation exercise is based on the assumption that the agents in question form conjectural variations. A number of environments fall within this classification and have received due attention in the literature: environmental externality, oligopoly and the analysis of the private provision of public goods. Although heterogeneous, these environments possess considerable analytical homogeneity and permit a general model treatment. We highlight the general analytical approach by focusing on the latter context, in which debate centers on four issues: the existence of free-riding, the extent to which contributions are matched equally across individuals, the nature of conjectures consistent with equilibrium, and the allocative inefficiency of alternative regimes. This paper resolves each of these issues, with the following conclusions: a consistent-conjectures equilibrium exists in the private provision of public goods, and it is the monopolistic-conjectures equilibrium. Agents act identically, contributing positive amounts of the public good in an efficient allocation of resources. There is complete matching of contributions among agents, no free-riding, and the allocation is independent of the number of members within the community. Thus the Olson conjecture—that inefficiency is exacerbated by community size—has no foundation in a consistent-conjectures, cumulative-externality context.

Relevance: 20.00%

Abstract:

A parameterization of mesoscale eddies in coarse-resolution ocean general circulation models (GCM) is formulated and implemented using a residual-mean formalism. In that framework, mean buoyancy is advected by the residual velocity (the sum of the Eulerian and eddy-induced velocities) and modified by a residual flux which accounts for the diabatic effects of mesoscale eddies. The residual velocity is obtained by stepping forward a residual-mean momentum equation in which eddy stresses appear as forcing terms. Study of the spatial distribution of eddy stresses, derived by using them as control parameters to "fit" the residual-mean model to observations, supports the idea that eddy stresses can be likened to a vertical down-gradient flux of momentum with a coefficient which is constant in the vertical. The residual eddy flux is set to zero in the ocean interior, where mesoscale eddies are assumed to be quasi-adiabatic, but is parameterized by a horizontal down-gradient diffusivity near the surface where eddies develop a diabatic component as they stir properties horizontally across steep isopycnals. The residual-mean model is implemented and tested in the MIT general circulation model. It is shown that the resulting model (1) has a climatology that is superior to that obtained using the Gent and McWilliams parameterization scheme with a spatially uniform diffusivity and (2) allows one to significantly reduce the (spurious) horizontal viscosity used in coarse resolution GCMs.
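The vertical down-gradient momentum flux invoked above can be sketched in a few lines: the eddy stress is −K ∂u/∂z with a coefficient K held constant in the vertical (the profile and the value of K below are hypothetical, for illustration only):

```python
import numpy as np

def eddy_stress(u: np.ndarray, dz: float, K: float) -> np.ndarray:
    """Vertical down-gradient eddy stress tau = -K du/dz at layer interfaces,
    with the coefficient K constant in the vertical."""
    return -K * np.diff(u) / dz

# Hypothetical zonal-mean flow decaying with depth (m/s), surface downward.
u = np.array([0.30, 0.22, 0.15, 0.10, 0.06])
tau = eddy_stress(u, dz=500.0, K=1000.0)        # K in m^2/s, illustrative
print(tau)   # all positive: momentum is fluxed down the vertical shear
```

Using the stresses as forcing in the residual-mean momentum equation, rather than as a buoyancy diffusivity, is the distinguishing choice of this formalism.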

Relevance: 20.00%

Abstract:

Wernicke’s aphasia occurs following a stroke to classical language comprehension regions in the left temporoparietal cortex. Consequently, auditory-verbal comprehension is significantly impaired in Wernicke’s aphasia but the capacity to comprehend visually presented materials (written words and pictures) is partially spared. This study used fMRI to investigate the neural basis of written word and picture semantic processing in Wernicke’s aphasia, with the wider aim of examining how the semantic system is altered following damage to the classical comprehension regions. Twelve participants with Wernicke’s aphasia and twelve control participants performed semantic animate-inanimate judgements and a visual height judgement baseline task. Whole brain and ROI analysis in Wernicke’s aphasia and control participants found that semantic judgements were underpinned by activation in the ventral and anterior temporal lobes bilaterally. The Wernicke’s aphasia group displayed an “over-activation” in comparison to control participants, indicating that anterior temporal lobe regions become increasingly influential following reduction in posterior semantic resources. Semantic processing of written words in Wernicke’s aphasia was additionally supported by recruitment of the right anterior superior temporal lobe, a region previously associated with recovery from auditory-verbal comprehension impairments. Overall, the results concord with models which indicate that the anterior temporal lobes are crucial for multimodal semantic processing and that these regions may be accessed without support from classic posterior comprehension regions.

Relevance: 20.00%

Abstract:

Numerical models of the atmosphere combine a dynamical core, which approximates solutions to the adiabatic, frictionless governing equations for fluid dynamics, with tendencies arising from the parametrization of other physical processes. Since potential vorticity (PV) is conserved following fluid flow in adiabatic, frictionless circumstances, it is possible to isolate the effects of non-conservative processes by accumulating PV changes in an air-mass relative framework. This "PV tracer technique" is used to accumulate separately the effects on PV of each of the different non-conservative processes represented in a numerical model of the atmosphere. Dynamical cores are not exactly conservative because they introduce, explicitly or implicitly, some level of dissipation and adjustment of prognostic model variables which acts to modify PV. Here, the PV tracer technique is extended to diagnose the cumulative effect of the non-conservation of PV by a dynamical core and its characteristics relative to the PV modification by parametrized physical processes. Quantification using the Met Office Unified Model reveals that the magnitude of the non-conservation of PV by the dynamical core is comparable to that from physical processes. Moreover, the residual of the PV budget, when tracing the effects of the dynamical core and physical processes, is at least an order of magnitude smaller than the PV tracers associated with the most active physical processes. The implication of this work is that the non-conservation of PV by a dynamical core can be assessed in case studies with a full suite of physics parametrizations and directly compared with the PV modification by parametrized physical processes. The non-conservation of PV by the dynamical core is shown to move the position of the extratropical tropopause, while the parametrized physical processes have a lesser effect at the tropopause level.
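The budget check described above amounts to: the change in PV following an air mass should equal the sum of the accumulated tracers, with anything left over reported as a residual. A schematic sketch with made-up tendency values (the process names and numbers are illustrative, not the Unified Model's output):

```python
# Accumulated PV tracers for one air mass (PVU, illustrative values only).
tracers = {
    "radiation":      -0.12,
    "convection":      0.35,
    "boundary_layer":  0.08,
    "microphysics":    0.20,
    "dynamical_core":  0.25,   # non-conservation by the core, traced explicitly
}

pv_final, pv_initial = 2.78, 2.03   # PV at the end and start of the trajectory
residual = (pv_final - pv_initial) - sum(tracers.values())

# The budget closes when the residual is much smaller than the leading tracers.
print(round(residual, 4))
```

Tracing the dynamical core explicitly, as in the paper, is what shrinks the residual to well below the most active tracers.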

Relevance: 20.00%

Abstract:

Parties to the United Nations Framework Convention on Climate Change (UNFCCC) have requested guidance on common greenhouse gas metrics in accounting for Nationally Determined Contributions (NDCs) to emission reductions [1]. Metric choice can affect the relative emphasis placed on reductions of ‘cumulative climate pollutants’ such as carbon dioxide versus ‘short-lived climate pollutants’ (SLCPs), including methane and black carbon [2, 3, 4, 5, 6]. Here we show that the widely used 100-year global warming potential (GWP100) effectively measures the relative impact of both cumulative pollutants and SLCPs on realized warming 20–40 years after the time of emission. If the overall goal of climate policy is to limit peak warming, GWP100 therefore overstates the importance of current SLCP emissions unless stringent and immediate reductions of all climate pollutants result in temperatures nearing their peak soon after mid-century [7, 8, 9, 10], which may be necessary to limit warming to “well below 2 °C” (ref. 1). The GWP100 can be used to approximately equate a one-off pulse emission of a cumulative pollutant and an indefinitely sustained change in the rate of emission of an SLCP [11, 12, 13]. The climate implications of traditional CO2-equivalent targets are ambiguous unless contributions from cumulative pollutants and SLCPs are specified separately.
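The pulse-versus-sustained-rate equivalence noted above can be sketched as: a sustained step change of ΔE in an SLCP's emission rate behaves roughly like a one-off CO2 pulse of GWP100 × H × ΔE, with H = 100 years. A minimal sketch (the methane GWP100 value of 28 is illustrative):

```python
H = 100.0   # GWP time horizon, years

def equivalent_co2_pulse(delta_rate: float, gwp100: float) -> float:
    """One-off CO2 pulse (units of delta_rate * years) roughly matching the
    warming from an indefinitely sustained SLCP emission-rate change."""
    return gwp100 * H * delta_rate

# A sustained increase of 1 Mt CH4 per year, with GWP100 ~ 28 (illustrative):
print(equivalent_co2_pulse(1.0, 28.0))   # 2800.0 Mt CO2
```

Reporting cumulative pollutants and SLCPs separately, as the abstract recommends, avoids having to pick between the pulse and sustained-rate interpretations at all.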