90 results for Defeasible conditional
Abstract:
The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. Radar performance was shown to depend on type of rainfall, with better results obtained with deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step), as opposed to shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
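The Nash coefficients quoted above are Nash-Sutcliffe efficiencies computed from point radar-rain gauge comparisons at the event time step. A minimal sketch of that criterion, using hypothetical event totals rather than the study's data, is:

```python
import numpy as np

def nash_sutcliffe(radar, gauge):
    """Nash-Sutcliffe efficiency between radar estimates and gauge references.

    1.0 is a perfect match; values near 0.9 correspond to the agreement
    reported for deep convective events, 0.6-0.8 for shallow/frontal cases.
    """
    radar = np.asarray(radar, dtype=float)
    gauge = np.asarray(gauge, dtype=float)
    return 1.0 - np.sum((radar - gauge) ** 2) / np.sum((gauge - gauge.mean()) ** 2)

# Hypothetical event-total rainfall (mm) at collocated radar pixels and gauges.
gauge = np.array([120.0, 85.0, 200.0, 150.0, 60.0])
radar = np.array([110.0, 90.0, 190.0, 160.0, 55.0])
print(nash_sutcliffe(radar, gauge))  # close to 1 when radar tracks the gauges
```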
Abstract:
In this paper we address two topical questions: How do the quality of governance and agricultural intensification impact on spatial expansion of agriculture? Which aspects of governance are more likely to ensure that agricultural intensification allows sparing land for nature? Using data from the Food and Agriculture Organization, the World Bank, the World Database on Protected Areas, and the Yale Center for Environmental Law and Policy, we estimate a panel data model for six South American countries and quantify the effects of major determinants of agricultural land expansion, including various dimensions of governance, over the period 1970–2006. The results indicate that the effect of agricultural intensification on agricultural expansion is conditional on the quality and type of governance. When considering conventional aspects of governance, agricultural intensification leads to an expansion of agricultural area when governance scores are high. When looking specifically at environmental aspects of governance, intensification leads to a spatial contraction of agriculture when governance scores are high, signaling a sustainable intensification process.
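The conditional effect described here is typically captured by interacting an intensification measure with a governance score in a fixed-effects panel regression. A minimal sketch of such a specification, with hypothetical variable names and data rather than the study's own, is:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per country-year.
df = pd.DataFrame({
    "country": ["A"] * 4 + ["B"] * 4,
    "year": [2000, 2001, 2002, 2003] * 2,
    "ag_area": [10.0, 10.5, 11.0, 11.2, 20.0, 20.2, 20.1, 20.0],
    "yield_index": [1.0, 1.1, 1.2, 1.3, 1.0, 1.05, 1.1, 1.2],  # intensification proxy
    "governance": [0.2, 0.2, 0.3, 0.3, 0.8, 0.8, 0.9, 0.9],     # governance score
})

# Country fixed effects via C(country); the interaction term lets the
# marginal effect of intensification on agricultural area vary with governance.
model = smf.ols("ag_area ~ yield_index * governance + C(country) + year", data=df).fit()
print(model.params)
```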
Abstract:
BACKGROUND: Integrin-linked kinase (ILK) and its associated complex of proteins are involved in many cellular activation processes, including cell adhesion and integrin signaling. We have previously demonstrated that mice with induced platelet ILK deficiency show reduced platelet activation and aggregation, but only a minor bleeding defect. Here, we explore this apparent disparity between the cellular and hemostatic phenotypes. METHODS: The impact of ILK inhibition on integrin αIIbβ3 activation and degranulation was assessed with the ILK-specific inhibitor QLT0267, and a conditional ILK-deficient mouse model was used to assess the impact of ILK deficiency on in vivo platelet aggregation and thrombus formation. RESULTS: Inhibition of ILK reduced the rate of both fibrinogen binding and α-granule secretion, but was accompanied by only a moderate reduction in the maximum extent of platelet activation or aggregation in vitro. The reduction in the rate of fibrinogen binding occurred prior to degranulation or translocation of αIIbβ3 to the platelet surface. The change in the rate of platelet activation in the absence of functional ILK led to a reduction in platelet aggregation in vivo, but did not change the size of thrombi formed following laser injury of the cremaster arteriole wall in ILK-deficient mice. It did, however, result in a marked decrease in the stability of thrombi formed in ILK-deficient mice. CONCLUSION: Taken together, the findings of this study indicate that, although ILK is not essential for platelet activation, it plays a critical role in facilitating rapid platelet activation, which is essential for stable thrombus formation.
Abstract:
Windstorm Kyrill affected large parts of Europe in January 2007 and caused widespread havoc and loss of life. In this study the formation of a secondary cyclone, Kyrill II, along the occluded front of the mature cyclone Kyrill and the occurrence of severe wind gusts as Kyrill II passed over Germany are investigated with the help of high-resolution regional climate model simulations. Kyrill underwent an explosive cyclogenesis south of Greenland as the storm crossed polewards of an intense upper-level jet stream. Later in its life cycle, secondary cyclogenesis occurred just west of the British Isles. The formation of Kyrill II along the occluded front was associated (a) with frontolytic strain and (b) with strong diabatic heating in combination with a developing upper-level shortwave trough. Sensitivity studies with reduced latent heat release feature a similar development but a weaker secondary cyclone, revealing the importance of diabatic processes during the formation of Kyrill II. Kyrill II moved further towards Europe and its development was favored by a split jet structure aloft, which maintained the cyclone’s exceptionally deep core pressure (below 965 hPa) for at least 36 hours. The occurrence of hurricane-force winds related to the strong cold front over North and Central Germany is analyzed using convection-permitting simulations. The lower troposphere exhibits conditional instability, a turbulent flow and evaporative cooling. Simulation at high spatio-temporal resolution suggests that the downward mixing of high-momentum air (wind speeds at 875 hPa widely exceeded 45 m s⁻¹) accounts for the widespread severe surface wind gusts, which is in agreement with the observed widespread losses.
Abstract:
The performance of rank dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity plus three ways of linking deterministic and stochastic models: the differences in utilities, the differences in certainty equivalents and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
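The curvature result concerns the probability weighting function. A minimal sketch using the one-parameter Tversky-Kahneman form, which is only one of several specifications such studies compare, illustrates the inverse-S versus S-shaped cases:

```python
import numpy as np

def tk_weight(p, gamma):
    """Tversky-Kahneman one-parameter probability weighting function.

    gamma < 1 gives the familiar inverse-S shape (small probabilities
    overweighted); gamma > 1 gives the S shape that many individuals
    in such estimates exhibit.
    """
    p = np.asarray(p, dtype=float)
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

p = np.linspace(0.01, 0.99, 5)
print(tk_weight(p, gamma=0.6))  # inverse-S: w(p) > p for small p
print(tk_weight(p, gamma=1.4))  # S-shape: w(p) < p for small p
```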
Abstract:
We investigated the plume structure of a piezo-electric sprayer system, set up to release ethanol in a wind tunnel, using a fast-response mini-photoionization detector. We recorded the plume structure of four different piezo-sprayer configurations: the sprayer alone; with a 1.6-mm steel mesh shield; with a 3.2-mm steel mesh shield; and with a 5-cm circular upwind baffle. We measured a 12 × 12-mm core at the center of the plume, and both a horizontal and a vertical cross-section of the plume, all at 100, 200, and 400 mm downwind of the odor source. Significant differences in plume structure were found among all configurations in terms of conditional relative mean concentration, intermittency, ratio of peak concentration to conditional mean concentration, and cross-sectional area of the plume. We then measured the flight responses of the almond moth, Cadra cautella, to odor plumes generated with the sprayer alone, and with the upwind baffle piezo-sprayer configuration, releasing a 13:1 ratio of (9Z,12E)-tetradecadienyl acetate and (Z)-9-tetradecenyl acetate diluted in ethanol at release rates of 1, 10, 100, and 1,000 pg/min. For each configuration, differences in pheromone release rate resulted in significant differences in the proportions of moths performing oriented flight and landing behaviors. Additionally, there were apparent differences in the moths’ behaviors between the two sprayer configurations, although this requires confirmation with further experiments. This study provides evidence that both pheromone concentration and plume structure affect moth orientation behavior and demonstrates that care is needed when setting up experiments that use a piezo-electric release system to ensure the optimal conditions for behavioral observations.
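The plume statistics compared across configurations (conditional mean concentration, intermittency, and the peak-to-conditional-mean ratio) are computed by conditioning a detector time series on samples above a detection threshold. A minimal sketch with a hypothetical signal and threshold:

```python
import numpy as np

def plume_statistics(signal, threshold):
    """Conditional statistics of a photoionization-detector time series.

    'Conditional' quantities are computed only over samples where the
    signal exceeds the detection threshold, i.e. odor is actually present.
    """
    signal = np.asarray(signal, dtype=float)
    in_odor = signal > threshold
    intermittency = in_odor.mean()          # fraction of time odor is present
    cond_mean = signal[in_odor].mean()      # conditional mean concentration
    peak_ratio = signal.max() / cond_mean   # peak / conditional mean concentration
    return intermittency, cond_mean, peak_ratio

rng = np.random.default_rng(0)
trace = rng.gamma(0.3, 2.0, size=5000)      # hypothetical intermittent signal
print(plume_statistics(trace, threshold=0.5))
```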
Abstract:
The Monte Carlo Independent Column Approximation (McICA) is a flexible method for representing subgrid-scale cloud inhomogeneity in radiative transfer schemes. It does, however, introduce conditional random errors but these have been shown to have little effect on climate simulations, where spatial and temporal scales of interest are large enough for effects of noise to be averaged out. This article considers the effect of McICA noise on a numerical weather prediction (NWP) model, where the time and spatial scales of interest are much closer to those at which the errors manifest themselves; this, as we show, means that noise is more significant. We suggest methods for efficiently reducing the magnitude of McICA noise and test these methods in a global NWP version of the UK Met Office Unified Model (MetUM). The resultant errors are put into context by comparison with errors due to the widely used assumption of maximum-random-overlap of plane-parallel homogeneous cloud. For a simple implementation of the McICA scheme, forecasts of near-surface temperature are found to be worse than those obtained using the plane-parallel, maximum-random-overlap representation of clouds. However, by applying the methods suggested in this article, we can reduce noise enough to give forecasts of near-surface temperature that are an improvement on the plane-parallel maximum-random-overlap forecasts. We conclude that the McICA scheme can be used to improve the representation of clouds in NWP models, with the provision that the associated noise is sufficiently small.
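The conditional random error arises because McICA pairs each spectral interval with a single randomly generated cloudy subcolumn rather than averaging over all subcolumns. A toy sketch, not the MetUM implementation, showing how that sampling noise shrinks as more subcolumn samples are averaged:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy grid box: cloud fraction 0.4, with fixed cloudy and clear-sky "fluxes".
cloud_fraction, flux_cloudy, flux_clear = 0.4, 100.0, 300.0
exact = cloud_fraction * flux_cloudy + (1 - cloud_fraction) * flux_clear  # ICA reference

def mcica_estimate(n_samples):
    """Average the flux over n randomly sampled cloudy/clear subcolumns."""
    cloudy = rng.random(n_samples) < cloud_fraction
    return np.where(cloudy, flux_cloudy, flux_clear).mean()

for n in (10, 100, 10000):
    errors = [mcica_estimate(n) - exact for _ in range(200)]
    print(n, np.std(errors))   # unbiased, but random error shrinks roughly as 1/sqrt(n)
```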
Abstract:
The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various different mechanisms they ensure initial conditions that are predominantly in linear balance and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations and means the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, then each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.
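For context, the equivalent-weights scheme builds on the generic particle-filter idea of weighting ensemble members (particles) by the likelihood of the observations; the extra proposal-density terms discussed above are what keep those weights from degenerating. A minimal sketch of the basic weight update only, not the equivalent-weights machinery itself, with hypothetical scalar states:

```python
import numpy as np

def update_weights(particles, observation, obs_std, weights):
    """Generic importance-weight update: weight each particle by the
    likelihood of the observation given that particle's state.
    (The equivalent-weights filter modifies the proposal so that, after
    this step, the weights are nearly equal; that step is not shown.)
    """
    likelihood = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    new_weights = weights * likelihood
    return new_weights / new_weights.sum()

particles = np.array([0.8, 1.0, 1.2, 2.0])   # hypothetical scalar particle states
weights = np.full(4, 0.25)
print(update_weights(particles, observation=1.1, obs_std=0.2, weights=weights))
```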
Abstract:
Real estate securities have a number of distinct characteristics that differentiate them from stocks generally. Key amongst them is that underpinning the firms are both real and investment assets. The connections between the underlying macro-economy and listed real estate firms are therefore clearly demonstrated and of heightened importance. To consider the linkages with the underlying macro-economic fundamentals, we extract the ‘low-frequency’ volatility component from aggregate volatility shocks in 11 international markets over the 1990-2014 period. This is achieved using Engle and Rangel’s (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effect pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong and positive association with most of the macroeconomic risk proxies examined. These include interest rates, inflation, GDP and foreign exchange rates.
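In the Spline-GARCH framework the conditional variance is factored into a slowly varying low-frequency component and a unit-mean short-run GARCH component; the low-frequency part is what is related to the macro variables. A minimal simulation sketch of that decomposition, with illustrative parameter values and a simple smooth curve standing in for the spline:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000
alpha, beta = 0.08, 0.90                      # illustrative GARCH parameters

# Low-frequency component tau_t: a smooth, slowly varying variance trend
# (Engle and Rangel model it with an exponential quadratic spline).
t = np.arange(T)
tau = 0.5 + 0.4 * np.sin(2 * np.pi * t / T) ** 2

g = np.ones(T)                                # short-run GARCH component, unit mean
r = np.zeros(T)                               # returns
for i in range(1, T):
    g[i] = (1 - alpha - beta) + alpha * r[i - 1] ** 2 / tau[i - 1] + beta * g[i - 1]
    r[i] = np.sqrt(tau[i] * g[i]) * rng.standard_normal()

# sqrt(tau) is the "low-frequency volatility" examined against macro proxies.
print(np.sqrt(tau[::250]))
```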
Abstract:
Seamless phase II/III clinical trials in which an experimental treatment is selected at an interim analysis have been the focus of much recent research interest. Many of the methods proposed are based on the group sequential approach. This paper considers designs of this type in which the treatment selection can be based on short-term endpoint information for more patients than have primary endpoint data available. We show that in such a case, the familywise type I error rate may be inflated if previously proposed group sequential methods are used and the treatment selection rule is not specified in advance. A method is proposed to avoid this inflation by considering the treatment selection that maximises the conditional error given the data available at the interim analysis. A simulation study is reported that illustrates the type I error rate inflation and compares the power of the new approach with two other methods: a combination testing approach and a group sequential method that does not use the short-term endpoint data, both of which also strongly control the type I error rate. The new method is also illustrated through application to a study in Alzheimer's disease.
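The conditional error of a design is the probability, under the null hypothesis, of rejecting at the final analysis given the data observed at the interim. A minimal sketch for a single normally distributed test statistic with information fraction t at the interim, using illustrative numbers rather than the paper's design:

```python
from scipy.stats import norm

def conditional_error(z_interim, info_fraction, final_crit):
    """P(reject at the final analysis | interim statistic), under H0.

    With information fraction t, Z_final = sqrt(t)*Z_interim + sqrt(1-t)*Z_rest,
    where Z_rest ~ N(0,1) independently of the interim data under the null.
    """
    t = info_fraction
    return 1.0 - norm.cdf((final_crit - (t ** 0.5) * z_interim) / (1.0 - t) ** 0.5)

# Example: halfway through the trial, interim z = 1.0,
# one-sided final critical value 1.96.
print(conditional_error(z_interim=1.0, info_fraction=0.5, final_crit=1.96))
```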
Abstract:
Transreal numbers provide a total semantics containing classical truth values, dialetheic, fuzzy and gap values. A paraconsistent Sheffer Stroke generalises all classical logics to a paraconsistent form. We introduce logical spaces of all possible worlds and all propositions. We operate on a proposition, in all possible worlds, at the same time. We define logical transformations, possibility and necessity relations, in proposition space, and give a criterion to determine whether a proposition is classical. We show that proofs, based on the conditional, infer gaps only from gaps and that negative and positive infinity operate as bottom and top values.
Abstract:
This article provides new insights into the dependence of firm growth on age along the entire distribution of growth rates, and conditional on survival. Using data from the European Firms in a Global Economy survey, and adopting a quantile regression approach, we uncover evidence for a sample of French, Italian and Spanish manufacturing firms with more than ten employees in the period from 2001 to 2008. We find that: (1) young firms grow faster than old firms, especially in the highest growth quantiles; (2) young firms face the same probability of declining as their older counterparts; (3) results are robust to the inclusion of other firm characteristics such as labor productivity, capital intensity and the financial structure; (4) high growth is associated with younger chief executive officers and other attributes that capture the attitude of the firm toward growth and change. The effect of age on firm growth is rather similar across countries.
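A minimal sketch of the quantile-regression approach, estimating the association between firm age and growth at different quantiles of the growth distribution with statsmodels and purely hypothetical data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
age = rng.integers(2, 50, size=n)
# Hypothetical data-generating process: younger firms have more dispersed growth.
growth = 0.05 - 0.001 * age + rng.gumbel(0, 0.1 / np.log(age + 1), size=n)
df = pd.DataFrame({"growth": growth, "age": age})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("growth ~ age", df).fit(q=q)
    print(q, fit.params["age"])   # the coefficient on age can differ across quantiles
```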
Abstract:
The C-type lectin receptor CLEC-2 signals through a pathway that is critically dependent on the tyrosine kinase Syk. We show that homozygous loss of either protein results in defects in brain vascular and lymphatic development, lung inflation, and perinatal lethality. Furthermore, we find that conditional deletion of Syk in the hematopoietic lineage, or conditional deletion of CLEC-2 or Syk in the megakaryocyte/platelet lineage, also causes defects in brain vascular and lymphatic development, although the mice are viable. In contrast, conditional deletion of Syk in other hematopoietic lineages had no effect on viability or brain vasculature and lymphatic development. We show that platelets, but not platelet releasate, modulate the migration and intercellular adhesion of lymphatic endothelial cells through a pathway that depends on CLEC-2 and Syk. These studies found that megakaryocyte/platelet expression of CLEC-2 and Syk is required for normal brain vasculature and lymphatic development and that platelet CLEC-2 and Syk directly modulate lymphatic endothelial cell behavior in vitro.
Abstract:
Civil wars are the most common type of large-scale violent conflict. They are long, brutal and continue to harm societies long after the shooting stops. Post-conflict countries face extraordinary challenges with respect to development and security. In this paper we examine how countries can recover economically from these devastating conflicts and how international interventions can help to build lasting peace. We revisit the aid and growth debate and confirm that aid does not increase growth in general. However, we find that countries experience increased growth after the end of the war and that aid helps to make the most of this peace dividend. Aid is only growth-enhancing once the violence has stopped; in violent post-war societies aid has no growth-enhancing effect. We also find that good governance is robustly correlated with growth; however, we cannot confirm that aid increases growth conditional on good policies. We examine various aspects of aid and governance by disaggregating the aid and governance variables. Our analysis does not provide a clear picture of which types of aid and policy should be prioritized. We find little evidence for a growth-enhancing effect of UN missions and suggest that case studies may provide better insight into the relationship between security guarantees and economic stabilization.
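Findings of this kind typically come from growth regressions in which aid is interacted with a post-war indicator, so that its effect is allowed to differ between war and peace years. A minimal sketch with hypothetical variables and data, not the study's own specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical country-year panel: GDP growth, aid (% of GDP), a post-war
# dummy and a governance score.
df = pd.DataFrame({
    "country": ["X"] * 6 + ["Y"] * 6,
    "growth":  [1.0, -2.0, 3.5, 4.0, 3.8, 3.0, 0.5, 0.0, 2.5, 3.0, 2.8, 2.6],
    "aid":     [2.0, 2.5, 6.0, 5.0, 4.0, 3.0, 1.0, 1.5, 4.0, 3.5, 3.0, 2.5],
    "postwar": [0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1],
    "governance": [0.3, 0.3, 0.4, 0.5, 0.5, 0.6, 0.2, 0.2, 0.3, 0.3, 0.4, 0.4],
})

# The aid:postwar interaction captures whether aid is growth-enhancing
# only once the violence has stopped.
model = smf.ols("growth ~ aid * postwar + governance + C(country)", data=df).fit()
print(model.params)
```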
Abstract:
Using an international, multi-model suite of historical forecasts from the World Climate Research Programme (WCRP) Climate-system Historical Forecast Project (CHFP), we compare the seasonal prediction skill in boreal wintertime between models that resolve the stratosphere and its dynamics (“high-top”) and models that do not (“low-top”). We evaluate hindcasts that are initialized in November, and examine the model biases in the stratosphere and how they relate to boreal wintertime (Dec-Mar) seasonal forecast skill. We are unable to detect more skill in the high-top ensemble-mean than the low-top ensemble-mean in forecasting the wintertime North Atlantic Oscillation, but model performance varies widely. Increasing the ensemble size clearly increases the skill for a given model. We then examine two major processes involving stratosphere-troposphere interactions (the El Niño-Southern Oscillation/ENSO and the Quasi-biennial Oscillation/QBO) and how they relate to predictive skill on intra-seasonal to seasonal timescales, particularly over the North Atlantic and Eurasia regions. High-top models tend to have a more realistic stratospheric response to El Niño and the QBO compared to low-top models. Enhanced conditional wintertime skill over high-latitudes and the North Atlantic region during winters with El Niño conditions suggests a possible role for a stratospheric pathway.
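Conditional skill here means forecast skill evaluated only over winters in a given ENSO state. A minimal sketch of stratifying an anomaly correlation by El Niño winters, using hypothetical hindcast and observation series rather than CHFP output:

```python
import numpy as np

rng = np.random.default_rng(3)
n_winters = 30
nino34 = rng.normal(0, 1, n_winters)                     # hypothetical ENSO index
obs_nao = -0.4 * nino34 + rng.normal(0, 1, n_winters)    # observed NAO anomalies
fcst_nao = -0.4 * nino34 + rng.normal(0, 1, n_winters)   # ensemble-mean hindcast

def corr(a, b):
    """Anomaly correlation between two series."""
    return np.corrcoef(a, b)[0, 1]

el_nino = nino34 > 0.5                                   # stratify by ENSO state
print("all winters:    ", corr(fcst_nao, obs_nao))
print("El Nino winters:", corr(fcst_nao[el_nino], obs_nao[el_nino]))
```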