827 results for Absence reliability


Relevance:

20.00%

Publisher:

Abstract:

Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier Score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
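The binary-case decomposition rests on the fact that a strictly proper score is minimised in expectation only by the true probability. As a minimal sketch (not taken from the article), the Brier score illustrates this numerically; the grid search and the true probability q = 0.3 are illustrative choices:

```python
def brier(p, y):
    """Brier score of forecast probability p for binary outcome y in {0, 1}."""
    return (p - y) ** 2

def expected_brier(p, q):
    """Expected Brier score when the event truly occurs with probability q."""
    return q * brier(p, 1) + (1 - q) * brier(p, 0)

# Strict propriety: with true probability q = 0.3, the expectation is
# minimised only by the honest forecast p = q.
q = 0.3
scores = {p / 100: expected_brier(p / 100, q) for p in range(101)}
best = min(scores, key=scores.get)
```

On this grid the minimiser is exactly p = 0.3; any other forecast scores worse in expectation, which is the defining property of a strictly proper rule.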

Relevance:

20.00%

Publisher:

Abstract:

Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides an overly optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
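As a hedged illustration of the empirical (sample-average) decomposition discussed here, the sketch below computes the classical reliability, resolution and uncertainty terms of the Brier score by grouping an archive by unique forecast value; the toy data are invented, and the identity BS = REL − RES + UNC holds exactly for such discrete forecasts:

```python
from collections import defaultdict

def brier_decomposition(forecasts, outcomes):
    """Empirical Murphy decomposition of the Brier score, grouping the
    archive by unique forecast value: BS = REL - RES + UNC."""
    n = len(forecasts)
    groups = defaultdict(list)
    for f, y in zip(forecasts, outcomes):
        groups[f].append(y)
    obar = sum(outcomes) / n                # overall event frequency
    rel = res = 0.0
    for f, ys in groups.items():
        ok = sum(ys) / len(ys)              # observed frequency given forecast f
        rel += len(ys) * (f - ok) ** 2      # reliability (penalty, smaller is better)
        res += len(ys) * (ok - obar) ** 2   # resolution (reward, larger is better)
    return rel / n, res / n, obar * (1 - obar)

forecasts = [0.2, 0.2, 0.2, 0.8, 0.8]
outcomes = [0, 0, 1, 1, 1]
rel, res, unc = brier_decomposition(forecasts, outcomes)
bs = sum((f - y) ** 2 for f, y in zip(forecasts, outcomes)) / len(forecasts)
```

Here BS = REL − RES + UNC holds exactly. As the abstract warns, on a small archive REL and RES estimated this way are biased, so BS − REL is an overly optimistic estimate of the score attainable by recalibration.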

Relevance:

20.00%

Publisher:

Abstract:

A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.

Relevance:

20.00%

Publisher:

Abstract:

Chaotic traffic, prevalent in many countries, is marked by a large number of vehicles driving at different speeds without following any predefined speed lanes. Such traffic rules out any planning algorithm based on the maintenance of speed lanes and lane changes. The absence of speed lanes may imply more bandwidth and easier overtaking in cases where vehicles vary considerably in both size and speed. Inspired by the performance of artificial potential fields in the planning of mobile robots, we propose here lateral potentials as measures that enable vehicles to decide their lateral positions on the road. Each vehicle is subjected to a potential from obstacles and vehicles in front, road boundaries, obstacles and vehicles to the side, and higher-speed vehicles to the rear. All these potentials are lateral and only govern steering; a separate speed control mechanism handles longitudinal control of the vehicle. The proposed system is shown to perform well for obstacle avoidance, vehicle following and overtaking behaviors.
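The abstract does not reproduce the paper's potential functions, but the idea can be sketched with assumed forms: an edge-repulsion term for the road boundaries and a Gaussian repulsion for each obstacle, with the vehicle steering toward the lateral position of minimum total potential. All function shapes and parameter values below are illustrative assumptions:

```python
from math import exp

def boundary_potential(y, road_width):
    """Repulsive potential from both road edges; grows near a boundary."""
    eps = 1e-3
    return 1.0 / max(y, eps) + 1.0 / max(road_width - y, eps)

def obstacle_potential(y, y_obs, strength=2.0, width=1.0):
    """Gaussian-shaped lateral repulsion around an obstacle at lateral position y_obs."""
    return strength * exp(-((y - y_obs) / width) ** 2)

def best_lateral_position(road_width, obstacle_ys, steps=200):
    """Choose the lateral position minimising the summed lateral potential."""
    candidates = [road_width * i / steps for i in range(1, steps)]
    def total(y):
        return boundary_potential(y, road_width) + sum(
            obstacle_potential(y, yo) for yo in obstacle_ys)
    return min(candidates, key=total)

# With no obstacles the minimum sits at the road centre; an obstacle near
# the left edge shifts the preferred lateral position to the right.
y_free = best_lateral_position(7.0, [])
y_avoid = best_lateral_position(7.0, [2.0])
```

Only the lateral coordinate enters these potentials, matching the paper's separation of steering (lateral) from speed control (longitudinal).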

Relevance:

20.00%

Publisher:

Abstract:

Key point summary

• Cerebellar ataxias are progressive, debilitating diseases with no known treatment; they are associated with defective motor function and, in particular, abnormalities of Purkinje cells.
• Mutant mice with deficits in Ca2+ channel auxiliary α2δ-2 subunits are used as models of cerebellar ataxia.
• Our data in the du2J mouse model show an association between the ataxic phenotype exhibited by homozygous du2J/du2J mice and increased irregularity of Purkinje cell firing.
• We show that both heterozygous +/du2J and homozygous du2J/du2J mice completely lack the strong presynaptic modulation of neuronal firing by cannabinoid CB1 receptors that is exhibited by litter-matched control mice.
• These results show that the du2J ataxia model is associated with deficits in CB1 receptor signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity due to reduced α2δ-2 subunit expression. Knowledge of such deficits may help design therapeutic agents to combat ataxias.

Abstract

Cerebellar ataxias are a group of progressive, debilitating diseases often associated with abnormal Purkinje cell (PC) firing and/or degeneration. Many animal models of cerebellar ataxia display abnormalities in Ca2+ channel function. The ‘ducky’ du2J mouse model of ataxia and absence epilepsy represents a clean knock-out of the auxiliary Ca2+ channel subunit α2δ-2, and has been associated with deficient Ca2+ channel function in the cerebellar cortex. Here, we investigate the effects of the du2J mutation on PC layer (PCL) and granule cell (GC) layer (GCL) neuronal spiking activity and on inhibitory neurotransmission at interneurone–Purkinje cell (IN-PC) synapses. Increased neuronal firing irregularity was seen in the PCL and, to a less marked extent, in the GCL in du2J/du2J, but not +/du2J, mice; these data suggest that the ataxic phenotype is associated with a lack of precision of PC firing that may also impinge on GC activity and requires expression of two du2J alleles to manifest fully. The du2J mutation had no clear effect on spontaneous inhibitory postsynaptic current (sIPSC) frequency at IN-PC synapses, but was associated with increased sIPSC amplitudes. The du2J mutation ablated cannabinoid CB1 receptor (CB1R)-mediated modulation of spontaneous neuronal spike firing and CB1R-mediated presynaptic inhibition of synaptic transmission at IN-PC synapses in both +/du2J and du2J/du2J mutants; these effects occurred in the absence of changes in CB1R expression. These results demonstrate that the du2J ataxia model is associated with deficient CB1R signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity and the ataxic phenotype.

Relevance:

20.00%

Publisher:

Abstract:

Planning of autonomous vehicles in the absence of speed lanes is a less-researched problem. However, it is an important step toward extending the possibility of autonomous vehicles to countries where speed lanes are not followed. The advantages of non-lane-oriented traffic include larger traffic bandwidth and more overtaking, features that are highlighted when vehicles vary in speed and size. In the most general case, the road would be filled with a complex grid of static obstacles and vehicles of varying speeds. The optimal travel plan consists of a set of maneuvers that enables a vehicle to avoid obstacles and to overtake vehicles in an optimal manner and, in turn, enables other vehicles to overtake. The desired characteristics of this planning scenario include near completeness and near optimality in real time in an unstructured environment, with vehicles essentially displaying a high degree of cooperation and enabling every possible (safe) overtaking procedure to be completed as soon as possible. Challenges addressed in this paper include a (fast) method for initial path generation using an elastic strip, (re-)defining the notion of completeness specific to the problem, and inducing the notion of cooperation in the elastic strip. Using this approach, vehicular behaviors of overtaking, cooperation, vehicle following, obstacle avoidance, etc., are demonstrated.
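A minimal sketch of elastic-strip relaxation, under assumed force models (the tension and repulsion gains below are invented, not taken from the paper): interior waypoints are pulled straight by internal tension and pushed away from nearby obstacles, while the endpoints stay fixed:

```python
from math import hypot

def elastic_strip(path, obstacles, k_in=0.5, k_ext=1.0, influence=2.0, iters=100):
    """Relax interior waypoints of a 2-D path: internal tension pulls the
    strip toward a straight line, obstacles within the influence radius push
    it away. Endpoints are held fixed."""
    pts = [list(p) for p in path]
    for _ in range(iters):
        for i in range(1, len(pts) - 1):
            x, y = pts[i]
            # internal force: toward the midpoint of the two neighbours
            fx = k_in * ((pts[i - 1][0] + pts[i + 1][0]) / 2 - x)
            fy = k_in * ((pts[i - 1][1] + pts[i + 1][1]) / 2 - y)
            # external force: away from each obstacle in range
            for ox, oy in obstacles:
                d = hypot(x - ox, y - oy)
                if 1e-6 < d < influence:
                    push = k_ext * (influence - d) / d
                    fx += push * (x - ox)
                    fy += push * (y - oy)
            pts[i][0] += 0.1 * fx   # small step toward force equilibrium
            pts[i][1] += 0.1 * fy
    return pts

# A straight path grazing an obstacle is deflected away from it.
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
relaxed = elastic_strip(path, obstacles=[(2.0, 0.1)])
```

The paper's cooperation and completeness extensions build on top of such a relaxation loop; this sketch shows only the basic strip mechanics.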

Relevance:

20.00%

Publisher:

Abstract:

Modern transaction cost economics (TCE) thinking has developed into a key intellectual foundation of international business (IB) research, but the Williamsonian version has faced substantial criticism for adopting the behavioral assumption of opportunism. In this paper we assess both the opportunism concept and existing alternatives such as trust within the context of IB research, especially work on multinational enterprise (MNE) governance. Case analyses of nine global MNEs illustrate an alternative to the opportunism assumption that captures more fully the mechanisms underlying failed commitments inside the MNE. As a substitute for the often-criticized assumption of opportunism, we propose the envelope concept of bounded reliability (BRel), an assumption that represents more accurately and more completely the reasons for failed commitments, without invalidating the other critical assumption in conventional TCE (and internalization theory) thinking, namely the widely accepted envelope concept of bounded rationality (BRat). Bounded reliability as an envelope concept includes two main components, within the context of global MNE management: opportunism as intentional deceit, and benevolent preference reversal. The implications for IB research of adopting the bounded reliability concept are far reaching, as this concept may increase the legitimacy of comparative institutional analysis in the social sciences.

Relevance:

20.00%

Publisher:

Abstract:

The first genome-wide association study for BMI identified a polymorphism, rs7566605, 10 kb upstream of the insulin-induced gene 2 (INSIG2) transcription start site, as the most significantly associated variant in children and adults. Subsequent studies, however, showed inconsistent association of this polymorphism with obesity traits. This polymorphism has been hypothesized to alter INSIG2 expression leading to inhibition of fatty acid and cholesterol synthesis. Hence, we investigated the association of the INSIG2 rs7566605 polymorphism with obesity- and lipid-related traits in Danish and Estonian children (930 boys and 1,073 girls) from the European Youth Heart Study (EYHS), a school-based, cross-sectional study of pre- and early pubertal children. The association between the polymorphism and obesity traits was tested using additive and recessive models adjusted for age, age-group, gender, maturity and country. Interactions were tested by including the interaction terms in the model. Despite having sufficient power (98%) to detect the previously reported effect size for association with BMI, we did not find significant effects of rs7566605 on BMI (additive, P = 0.68; recessive, P = 0.24). Accordingly, the polymorphism was not associated with overweight (P = 0.87) or obesity (P = 0.34). We also did not find association with waist circumference (WC), sum of four skinfolds, or with total cholesterol, triglycerides, low-density lipoprotein, or high-density lipoprotein. There were no gender-specific (P = 0.55), age-group-specific (P = 0.63) or country-specific (P = 0.56) effects. There was also no evidence of interaction between genotype and physical activity (P = 0.95). Despite an adequately powered study, our findings suggest that rs7566605 is not associated with obesity-related traits and lipids in the EYHS.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this study was to evaluate the association of PPARG coactivator 1 alpha (PPARGC1A), peroxisome proliferator-activated receptor gamma (PPARG), and uncoupling protein 1 (UCP1) gene polymorphisms with the metabolic syndrome (MS) in an Asian Indian population. Nine common polymorphisms were genotyped via polymerase chain reaction restriction fragment length polymorphism and direct sequencing in 950 normal glucose-tolerant subjects and 550 type 2 diabetic subjects, chosen randomly from the Chennai Urban Rural Epidemiological Study, an ongoing population-based study in Southern India. Among the nine polymorphisms examined, only the Thr394Thr variant of the PPARGC1A gene was significantly associated with diabetes and obesity. The frequency of the GA genotype of the Thr394Thr variant was 16% (138/887) in the non-MS group and 22% (136/613) in the MS group; this genotype frequency was significantly higher in the MS group in both males (p = 0.01) and females (p = 0.05) compared to the non-MS group. Logistic regression analysis revealed that the odds ratio for MS for the susceptible GA genotype of Thr394Thr was 1.411 [95% CI: 1.03-1.84, p = 0.012]. In multiple logistic regression analysis, however, there was no association of this polymorphism as an independent factor with MS. Hence, the study shows that the polymorphisms in the PPARGC1A, PPARG and UCP1 genes are not associated with MS in Asian Indians.

Relevance:

20.00%

Publisher:

Abstract:

Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
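The necessary condition described here, a spread-error ratio close to one, can be sketched directly; the toy forecast archive below is invented, and sample variance is used as the spread measure:

```python
from math import sqrt

def spread_error_ratio(ensembles, observations):
    """RMS ensemble spread divided by the RMS error of the ensemble mean.
    A ratio below one indicates under-dispersion (overconfident forecasts);
    a ratio above one indicates over-dispersion."""
    n = len(ensembles)
    spread2 = err2 = 0.0
    for members, obs in zip(ensembles, observations):
        m = len(members)
        mean = sum(members) / m
        spread2 += sum((x - mean) ** 2 for x in members) / (m - 1)  # sample variance
        err2 += (mean - obs) ** 2
    return sqrt(spread2 / n) / sqrt(err2 / n)

# A toy archive whose members cluster far too tightly around the ensemble
# mean relative to the actual errors: clearly under-dispersed.
ensembles = [[-0.1, 0.1], [0.9, 1.1], [1.9, 2.1]]
observations = [1.0, 0.0, 1.0]
ratio = spread_error_ratio(ensembles, observations)
```

With a ratio well below one, this archive would, like the short-lead DePreSys forecasts described above, yield overconfident probabilities.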


Relevance:

20.00%

Publisher:

Abstract:

The inhibitory effects of toxin-producing phytoplankton (TPP) on zooplankton modulate the dynamics of marine plankton. In this article, we employ simple mathematical models to compare theoretically the dynamics of phytoplankton–zooplankton interaction in situations where TPP are present with those where TPP are absent. We consider two sets of three-component interaction models: one that does not include the effect of TPP and one that does. The negative effect of TPP on zooplankton is described by a non-linear interaction term. Extensive theoretical analyses of the models have been performed to understand the qualitative behaviour of the model systems around every possible equilibrium. The results of local-stability analysis and numerical simulations demonstrate that the two model systems differ qualitatively with regard to oscillations and stability. The model system that does not include TPP is asymptotically stable around the coexistence equilibrium, whereas the system that includes TPP oscillates for a range of parameter values associated with the toxin-inhibition rate and competition coefficients. Our analysis suggests that the qualitative dynamics of plankton–zooplankton interactions are very likely to be altered by the presence of TPP species, and therefore the effects of TPP should be considered carefully when modelling plankton dynamics.
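The article's exact equations are not given in this summary, so the sketch below uses a generic three-component model (non-toxic phytoplankton P1, toxic phytoplankton P2, zooplankton Z) with a saturating non-linear toxin-inhibition term of the kind described; all rate parameters are illustrative assumptions:

```python
def plankton_step(p1, p2, z, dt=0.01,
                  r1=1.0, r2=0.8, k=2.0,   # phytoplankton growth rates, shared carrying capacity
                  a=0.5, b=0.3,            # zooplankton grazing rates on P1 and P2 (the TPP)
                  e=0.4, d=0.2,            # conversion efficiency and zooplankton mortality
                  theta=0.6, mu=0.5):      # toxin-inhibition rate and half-saturation constant
    """One Euler step of an illustrative TPP model: logistic growth of both
    phytoplankton groups, grazing by zooplankton, and a saturating (non-linear)
    toxin loss term acting on the zooplankton."""
    s = p1 + p2
    dp1 = r1 * p1 * (1 - s / k) - a * p1 * z
    dp2 = r2 * p2 * (1 - s / k) - b * p2 * z
    dz = e * (a * p1 + b * p2) * z - d * z - theta * p2 * z / (mu + p2)
    return p1 + dt * dp1, p2 + dt * dp2, z + dt * dz

# Integrate forward from an arbitrary initial state; with these parameters
# the populations remain positive and bounded.
state = (0.5, 0.5, 0.2)
for _ in range(2000):
    state = plankton_step(*state)
```

Setting theta = 0 recovers a TPP-free variant of the same structure, which is the kind of with/without comparison the article performs (there via local-stability analysis rather than simulation alone).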

Relevance:

20.00%

Publisher:

Abstract:

Low-power medium access control (MAC) protocols used for communication between energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, it is possible to include an error correction scheme and transmit/receive diversity: redundant information can be added to transmitted packets in order to recover data from corrupted packets, and transmit/receive diversity via multiple antennas can improve the error resiliency of transmissions. Both schemes may be used in conjunction to further improve performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated into low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power), and for a given communication situation it must be decided which methods should be employed. The authors' results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
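As a hedged illustration of adding redundancy to recover data from corrupted packets, the simplest forward error correction scheme, a triple repetition code with majority-vote decoding, is sketched below (the abstract does not specify the paper's actual coding scheme):

```python
def encode(bits, r=3):
    """Repetition code: transmit each bit r times (the added redundancy)."""
    return [b for b in bits for _ in range(r)]

def decode(symbols, r=3):
    """Majority vote over each group of r symbols; corrects up to
    (r - 1) // 2 bit flips per group without retransmission."""
    return [int(sum(symbols[i:i + r]) > r // 2)
            for i in range(0, len(symbols), r)]

def corrupt(symbols, flip_positions):
    """Model an erroneous channel by flipping the given bit positions."""
    out = list(symbols)
    for i in flip_positions:
        out[i] ^= 1
    return out

# A packet survives one flipped bit per repetition group.
data = [1, 0, 1, 1]
received = corrupt(encode(data), flip_positions=[0, 5, 10])
recovered = decode(received)
```

Real low-power MACs would use a stronger code (e.g. a block or convolutional code) for better rate, but the trade-off is the same one the study quantifies: redundancy and decoding cost in exchange for fewer retransmissions.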