818 results for Reliability assessments
Abstract:
A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
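The reliability condition described above (forecast probabilities matching observed relative frequencies over many cases) can be illustrated with a minimal sketch; the synthetic data and bin count below are illustrative assumptions, not details from the study:

```python
import numpy as np

def reliability_table(forecast_probs, outcomes, n_bins=5):
    """Bin forecast probabilities and compare each bin's mean forecast
    probability with the observed event frequency in that bin."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (forecast_probs >= lo) & (forecast_probs < hi)
        if mask.any():
            rows.append((forecast_probs[mask].mean(), outcomes[mask].mean()))
    return rows

# Synthetic example: a perfectly reliable forecast system, in which
# events really do occur with the forecast probability
rng = np.random.default_rng(0)
p = rng.uniform(0, 1, 100_000)           # forecast probabilities
y = rng.uniform(0, 1, p.size) < p        # observed binary outcomes
for fcst, obs in reliability_table(p, y):
    print(f"forecast {fcst:.2f}  observed {obs:.2f}")  # pairs roughly match
```

For a reliable system the two columns agree within sampling noise; an overconfident ensemble, as diagnosed in the abstract, would show observed frequencies pulled toward the climatological mean relative to the forecast probabilities.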
Abstract:
In recent years a number of chemistry-climate models have been developed with an emphasis on the stratosphere. Such models cover a wide range of time scales of integration and vary considerably in complexity. The results of specific diagnostics are here analysed to examine the differences amongst individual models and observations, and to assess the consistency of model predictions, with a particular focus on polar ozone. For example, many models indicate a significant cold bias in high latitudes, the “cold pole problem”, particularly in the southern hemisphere during winter and spring. This is related to wave propagation from the troposphere, which can be improved by increasing model horizontal resolution and by including non-orographic gravity wave drag. As a result of the widely differing modelled polar temperatures, different amounts of polar stratospheric clouds are simulated, which in turn result in varying ozone values in the models. The results are also compared to determine the possible future behaviour of ozone, with an emphasis on the polar regions and mid-latitudes. All models predict eventual ozone recovery, but give a range of results concerning its timing and extent. Differences in the simulation of gravity waves and planetary waves, as well as model resolution, are likely major sources of uncertainty for this issue. In the Antarctic, the ozone hole has probably almost reached its maximum depth, although the vertical and horizontal extent of depletion may increase slightly further over the next few years. According to the model results, Antarctic ozone recovery could begin any year within the range 2001 to 2008. The few models which have been integrated sufficiently far indicate that full recovery of ozone to 1980 levels may not occur in the Antarctic until about the year 2050. For the Arctic, most models indicate that small ozone losses may continue for a few more years and that recovery could begin any year within the range 2004 to 2019.
The start of ozone recovery in the Arctic is therefore expected to appear later than in the Antarctic.
Abstract:
Background: Parental overprotection has commonly been implicated in the development and maintenance of childhood anxiety disorders. Overprotection has been assessed using questionnaire and observational methods interchangeably; however, the extent to which these methods access the same construct has received little attention. Edwards (2008) and Edwards et al. (2010) developed a promising parent-report measure of overprotection (OP) and reported that, with parents of pre-school children, the measure correlated with observational assessments and predicted changes in child anxiety symptoms. We aimed to validate the use of the OP measure with mothers of children in middle childhood, and to examine its association with child and parental anxiety. Methods: Mothers of 90 children (60 clinically anxious, 30 non-anxious) aged 7–12 years completed the measure and engaged in a series of mildly stressful tasks with their child. Results: The internal reliability of the measure was good, and scores correlated significantly with observations of maternal overprotection in a challenging puzzle task. Contrary to expectations, OP was not significantly associated with child anxiety status or symptoms, but was significantly associated with maternal anxiety symptoms. Limitations: Participants were predominantly from affluent social groups and of non-minority status. Overprotection is a broad construct; the use of specific behavioural sub-dimensions may be preferable. Conclusions: The findings support the use of the OP measure to assess parental overprotection among 7–12 year-old children; however, they suggest that parental responses may be more closely related to the degree of parental rather than child anxiety.
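Internal reliability of a questionnaire measure such as the OP scale is commonly summarized with Cronbach's alpha; the abstract does not name the statistic used, so the following is a generic sketch on synthetic item scores:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha, an internal-consistency reliability coefficient,
    computed from an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                            # number of items
    item_var = scores.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Synthetic questionnaire: 6 items all driven by one latent trait,
# so internal consistency should be high
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))
scores = trait + 0.5 * rng.normal(size=(200, 6))
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Values above roughly 0.7 to 0.8 are conventionally read as "good" internal reliability, consistent with the Results statement above.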
Abstract:
Modern transaction cost economics (TCE) thinking has developed into a key intellectual foundation of international business (IB) research, but the Williamsonian version has faced substantial criticism for adopting the behavioral assumption of opportunism. In this paper we assess both the opportunism concept and existing alternatives such as trust within the context of IB research, especially work on multinational enterprise (MNE) governance. Case analyses of nine global MNEs illustrate an alternative to the opportunism assumption that captures more fully the mechanisms underlying failed commitments inside the MNE. As a substitute for the often-criticized assumption of opportunism, we propose the envelope concept of bounded reliability (BRel), an assumption that represents more accurately and more completely the reasons for failed commitments, without invalidating the other critical assumption in conventional TCE (and internalization theory) thinking, namely the widely accepted envelope concept of bounded rationality (BRat). Bounded reliability as an envelope concept includes two main components, within the context of global MNE management: opportunism as intentional deceit, and benevolent preference reversal. The implications for IB research of adopting the bounded reliability concept are far reaching, as this concept may increase the legitimacy of comparative institutional analysis in the social sciences.
Abstract:
Aerosols affect the Earth's energy budget directly by scattering and absorbing radiation and indirectly by acting as cloud condensation nuclei and, thereby, affecting cloud properties. However, large uncertainties exist in current estimates of aerosol forcing because of incomplete knowledge concerning the distribution and the physical and chemical properties of aerosols, as well as aerosol-cloud interactions. In recent years, a great deal of effort has gone into improving measurements and datasets. It is thus feasible to shift the estimates of aerosol forcing from largely model-based to increasingly measurement-based. Our goal is to assess current observational capabilities and identify uncertainties in the aerosol direct forcing through comparisons of different methods with independent sources of uncertainties. Here we assess the aerosol optical depth (τ), the direct radiative effect (DRE) by natural and anthropogenic aerosols, and the direct climate forcing (DCF) by anthropogenic aerosols, focusing on satellite and ground-based measurements supplemented by global chemical transport model (CTM) simulations. The multi-spectral MODIS measures global distributions of τ on a daily scale, with a high accuracy of ±(0.03 + 0.05τ) over ocean. The annual average τ is about 0.14 over the global ocean, of which about 21% ± 7% is contributed by human activities, as estimated from the MODIS fine-mode fraction. The multi-angle MISR derives an annual average τ of 0.23 over global land with an uncertainty of ~20% or ±0.05. These high-accuracy aerosol products and broadband flux measurements from CERES make it feasible to obtain observational constraints for the aerosol direct effect, especially over the global ocean. A number of measurement-based approaches estimate the clear-sky DRE (on solar radiation) at the top-of-atmosphere (TOA) to be about -5.5 ± 0.2 W m-2 (median ± standard error from various methods) over the global ocean.
Accounting for thin cirrus contamination of the satellite-derived aerosol field reduces the TOA DRE to -5.0 W m-2. Because of a lack of measurements of aerosol absorption and the difficulty of characterizing land surface reflection, estimates of the DRE over land and at the ocean surface are currently realized through a combination of satellite retrievals, surface measurements, and model simulations, and are less constrained. Over the ocean, the surface DRE is estimated to be -8.8 ± 0.7 W m-2. Over land, an integration of satellite retrievals and model simulations derives a DRE of -4.9 ± 0.7 W m-2 and -11.8 ± 1.9 W m-2 at the TOA and surface, respectively. CTM simulations derive a wide range of DRE estimates that on average are smaller than the measurement-based DRE by about 30-40%, even after accounting for thin cirrus and cloud contamination. A number of issues remain. Current estimates of the aerosol direct effect over land are poorly constrained. Uncertainties of DRE estimates are also larger on regional scales than on a global scale, and large discrepancies exist between different approaches. The characterization of aerosol absorption and vertical distribution remains challenging. The aerosol direct effect in the thermal infrared range and in cloudy conditions remains relatively unexplored and quite uncertain, because of a lack of global systematic aerosol vertical profile measurements. A coordinated research strategy needs to be developed for the integration and assimilation of satellite measurements into models to constrain model simulations. Enhanced measurement capabilities in the next few years and high-level scientific cooperation will further advance our knowledge.
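The over-ocean MODIS figures quoted above can be combined in a back-of-the-envelope calculation; the simple linear error scaling below is an assumption for illustration, not the paper's uncertainty analysis:

```python
# Figures quoted in the abstract (MODIS, global ocean)
tau_ocean = 0.14        # annual-mean aerosol optical depth
anthro_frac = 0.21      # anthropogenic fraction from fine-mode fraction
anthro_frac_err = 0.07  # its quoted uncertainty

def modis_ocean_accuracy(tau):
    """Per-retrieval MODIS accuracy over ocean: +/-(0.03 + 0.05*tau)."""
    return 0.03 + 0.05 * tau

# Anthropogenic AOD, propagating only the fraction uncertainty
tau_anthro = tau_ocean * anthro_frac
tau_anthro_err = tau_ocean * anthro_frac_err
print(f"anthropogenic AOD over ocean ~ {tau_anthro:.3f} +/- {tau_anthro_err:.3f}")
print(f"retrieval accuracy at tau = 0.14: +/-{modis_ocean_accuracy(tau_ocean):.3f}")
```

Note that the per-retrieval accuracy envelope at this τ is comparable to the anthropogenic component itself, which is one way to see why the anthropogenic fraction carries a sizeable relative uncertainty.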
Abstract:
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles, for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
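The spread-error ratio used as the reliability diagnostic can be sketched directly from its definition; the synthetic ensembles below (member noise either matched to, or smaller than, the true forecast-error scale) are illustrative assumptions:

```python
import numpy as np

def spread_error_ratio(ensemble, obs):
    """Mean ensemble spread divided by RMSE of the ensemble mean.
    ensemble: (n_members, n_forecasts); obs: (n_forecasts,).
    ~1 -> well dispersed; <1 -> under-dispersed (overconfident);
    >1 -> over-dispersed."""
    err = ensemble.mean(axis=0) - obs
    rmse = np.sqrt((err ** 2).mean())
    spread = ensemble.std(axis=0, ddof=1).mean()
    return spread / rmse

rng = np.random.default_rng(0)
signal = rng.normal(size=10_000)                     # predictable component
obs = signal + rng.normal(scale=1.0, size=signal.size)

# Member perturbations matching the true error scale -> ratio near one
members = signal + rng.normal(scale=1.0, size=(50, signal.size))
print(f"well-dispersed  : {spread_error_ratio(members, obs):.2f}")

# Too little spread -> small ratio, i.e. an overconfident ensemble
under = signal + rng.normal(scale=0.3, size=(50, signal.size))
print(f"under-dispersed : {spread_error_ratio(under, obs):.2f}")
```

The under-dispersed case mimics the short-lead behaviour reported above, where the initialized ensembles are overconfident.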
Abstract:
This paper investigates urban canopy layer (UCL) ventilation under neutral atmospheric conditions with the same building area density (λp = 0.25) and frontal area density (λf = 0.25) but various urban sizes, building height variations, overall urban forms and wind directions. Turbulent airflows are first predicted by CFD simulations with the standard k-ε model, evaluated against wind tunnel data. Then air change rates per hour (ACH) and the canopy purging flow rate (PFR) are numerically analysed to quantify the rate of air exchange and the net ventilation capacity induced by mean flows and turbulence. With a parallel approaching wind (θ = 0°), the velocity ratio first decreases in the adjustment region, followed by the fully-developed region where the flow reaches a balance. Although the flow quantities remain macroscopically constant, ACH decreases and overall UCL ventilation becomes worse as urban size rises from 390 m to 5 km. Theoretically, if urban size is infinite, ACH may reach a minimum value depending on local roof ventilation, and this minimum rises from 1.7 to 7.5 as the standard deviation of building height increases (0% to 83.3%). Overall UCL ventilation capacity (PFR) with a square overall urban form (Lx = Ly = 390 m) is better at θ = 0° than under oblique winds (θ = 15°, 30°, 45°); it exceeds that of a staggered urban form under all wind directions (θ = 0° to 45°), but is less than that of a rectangular urban form (Lx = 570 m, Ly = 270 m) under most wind directions (θ = 30° to 90°). Further investigations are still required to quantify the net ventilation efficiency induced by mean flows and turbulence.
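The ACH diagnostic itself is simply a normalized volume exchange rate; a minimal sketch follows, where the flow rate and canopy dimensions are hypothetical values for illustration, not results from the study:

```python
def air_changes_per_hour(inflow_m3_s, canopy_volume_m3):
    """ACH: how many times per hour the canopy air volume is replaced,
    given the effective ventilation flow through the canopy."""
    return inflow_m3_s * 3600.0 / canopy_volume_m3

# Hypothetical 390 m x 390 m district with a mean canopy height of 24 m
# and an assumed effective ventilation flow of 500 m^3/s
volume = 390.0 * 390.0 * 24.0
print(f"ACH = {air_changes_per_hour(500.0, volume):.2f}")
```

Holding the flow fixed while enlarging the district shrinks ACH, which is the qualitative mechanism behind the reported decrease of ACH with urban size.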
Abstract:
Low-power medium access control (MAC) protocols used for communication among energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, it is possible to include an error correction scheme and transmit/receive diversity: redundant information can be added to transmitted packets in order to recover data from corrupted packets, and transmit/receive diversity via multiple antennas can improve the error resiliency of transmissions. Both schemes may be used in conjunction to further improve performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated in low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power), and for a given communication situation it must be decided which methods should be employed. The authors' results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
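The idea of adding redundancy so that data can be recovered from corrupted packets can be sketched with the simplest error-correcting code, a repetition code with majority-vote decoding; the code choice, channel error rate and packet size below are illustrative assumptions, not the scheme evaluated by the authors:

```python
import numpy as np

def repetition_encode(bits, r=3):
    """Add redundancy by transmitting each bit r times."""
    return np.repeat(bits, r)

def repetition_decode(coded, r=3):
    """Recover each bit by majority vote over its r copies."""
    return (coded.reshape(-1, r).sum(axis=1) > r // 2).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 1000)                      # a 1000-bit packet
coded = repetition_encode(bits)
received = coded ^ (rng.random(coded.size) < 0.05)   # 5% channel bit flips
decoded = repetition_decode(received)
print(f"residual bit-error rate: {(decoded != bits).mean():.4f}")
```

With a raw channel bit-error rate p, majority voting over three copies reduces the residual rate to roughly 3p²(1-p) + p³, at the cost of tripling the airtime, which is exactly the cost/benefit trade-off the study weighs against diversity.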
Abstract:
Embedded computer systems equipped with wireless communication transceivers are nowadays used in a vast number of application scenarios. Energy consumption is important in many of these scenarios, as systems are battery operated and long maintenance-free operation is required. To achieve this goal, embedded systems employ low-power communication transceivers and protocols. However, currently used protocols cannot operate efficiently when communication channels are highly erroneous. In this study, we show how average diversity combining (ADC) can be used in state-of-the-art low-power communication protocols. This novel approach improves transmission reliability and, in consequence, energy consumption and transmission latency in the presence of erroneous channels. Using a testbed, we show that highly erroneous channels are indeed a common occurrence in situations where low-power systems are used, and we demonstrate that ADC dramatically improves low-power communication.
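The combining step, averaging several received copies of the same packet before deciding each bit, can be sketched as follows; the BPSK mapping, noise level and number of receptions are illustrative assumptions, not details from the study:

```python
import numpy as np

def average_diversity_combine(copies):
    """Average the soft values of several received copies of a packet,
    then hard-decide each bit (>= 0 -> 1, < 0 -> 0)."""
    return (np.mean(copies, axis=0) >= 0).astype(int)

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 2000)
symbols = 2 * bits - 1                               # BPSK: 0 -> -1, 1 -> +1
# Four noisy receptions of the same packet (retransmissions or antennas)
copies = symbols + rng.normal(scale=1.2, size=(4, bits.size))

single = (copies[0] >= 0).astype(int)                # decide from one copy
combined = average_diversity_combine(copies)         # decide after combining
print(f"single-copy BER: {(single != bits).mean():.3f}")
print(f"combined BER   : {(combined != bits).mean():.3f}")
```

Averaging n independent copies reduces the noise standard deviation by a factor of √n, so combining turns several individually corrupted receptions into one usable packet instead of discarding them all.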
Abstract:
Morocco constitutes an important centre of plant diversity and speciation in the Mediterranean Basin. However, numerous species are threatened by issues ranging from human activities to global climatic change. In this study, we present the conservation assessments and Red Listing of the endemic Moroccan monocotyledons according to International Union for Conservation of Nature (IUCN) criteria and categories. For each species, we include basic taxonomic information, local names and synonyms, uses, a distribution map, extent of occurrence, area of occupancy, population size and trend, a description of habitats and ecological requirements, and a discussion of the threats affecting the species and habitats. We assessed the threatened status of the endemic Moroccan monocotyledons at the species level (59 species) using the IUCN Red List criteria and categories (Version 3.1). This study shows the high extinction risk to the Moroccan monocotyledon flora, with 95% of species threatened (20% Critically Endangered, 50% Endangered, 25% Vulnerable) and only 5% not threatened (2% Near Threatened and 3% Least Concern). The flora is thus of conservation concern, a fact that is poorly recognized both nationally and internationally. This study presents the first and so far only national IUCN Red Data List for a large group of Moroccan plants, and thus provides an overview of the threatened Moroccan flora. This IUCN Red List is an important first step towards the recognition of the danger to Moroccan biodiversity hotspots, the conservation of threatened species and the raising of public awareness at national and international levels.