43 results for Independence of irrelevant alternatives


Relevance: 100.00%

Abstract:

We present a new coefficient-based retrieval scheme for estimating sea surface temperature (SST) from the Along Track Scanning Radiometer (ATSR) instruments. The new coefficients are banded by total column water vapour (TCWV), obtained from numerical weather prediction analyses. TCWV banding reduces simulated regional retrieval biases to < 0.1 K, compared with biases of ~ 0.2 K for global coefficients. Further, detailed treatment of the instrumental viewing geometry reduces simulated view-angle-related biases from ~ 0.1 K to < 0.005 K for dual-view retrievals using channels at 11 and 12 μm. A novel analysis of trade-offs related to the noise level assumed when defining coefficients is undertaken, and we conclude that adding a small nominal level of noise (0.01 K) is optimal for our purposes. When applied to ATSR observations, some inter-algorithm biases appear as TCWV-related differences in SSTs estimated from different channel combinations. The final step in coefficient determination is to adjust the offset coefficient in each TCWV band to match results from a reference algorithm. This reference uses the dual-view observations at 3.7 and 11 μm. The adjustment is independent of in situ measurements, preserving the independence of the retrievals. The choice of reference is partly motivated by uncertainty in the calibration of the 12 μm channel of the Advanced ATSR. Lastly, we model the sensitivities of the new retrievals to changes in TCWV and in true SST, confirming that dual-view SSTs are most appropriate for climatological applications.
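A minimal sketch of how such a coefficient-based, TCWV-banded retrieval could be applied is given below; the linear form, the band edges, the coefficient values and the function name retrieve_sst are illustrative assumptions, not the coefficients or code of the paper.

```python
import numpy as np

# Illustrative TCWV band edges (kg m^-2) and per-band coefficients for a
# dual-view retrieval using brightness temperatures at 11 and 12 um in the
# nadir (n) and forward (f) views. All values are placeholders, not the
# published ATSR coefficients.
TCWV_BAND_EDGES = np.array([0.0, 15.0, 30.0, 45.0, 75.0])
COEFFS = {
    # band index: (offset, a_11n, a_12n, a_11f, a_12f)
    0: (1.2, 3.1, -2.2, 0.9, -0.8),
    1: (1.5, 3.3, -2.4, 1.0, -0.9),
    2: (1.9, 3.6, -2.7, 1.1, -1.0),
    3: (2.4, 4.0, -3.1, 1.3, -1.2),
}

def retrieve_sst(bt_11n, bt_12n, bt_11f, bt_12f, tcwv):
    """Linear SST retrieval with coefficients selected by TCWV band (illustrative)."""
    band = int(np.clip(np.searchsorted(TCWV_BAND_EDGES, tcwv) - 1, 0, len(COEFFS) - 1))
    a0, a1, a2, a3, a4 = COEFFS[band]
    return a0 + a1 * bt_11n + a2 * bt_12n + a3 * bt_11f + a4 * bt_12f

# Example: one observation with an NWP-analysed TCWV of 28 kg m^-2
print(retrieve_sst(288.6, 287.9, 287.1, 286.2, tcwv=28.0))
```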

Relevance: 100.00%

Abstract:

In this paper, the monetary policy independence of European nations in the years before European Economic and Monetary Union (EMU) is investigated using cointegration techniques. Daily data are used to assess pairwise relationships between individual EMU nations and the 'lead' nation, Germany, to test the hypothesis that Germany was the dominant European nation prior to EMU. By and large our econometric investigations support this hypothesis, and lead us to conclude that the only European nation to lose monetary policy independence as a result of monetary union was Germany. Our results have important policy implications. Given that the loss of monetary policy independence is generally viewed as the main cost of monetary unification, our findings suggest a reconsideration of the costs and benefits of monetary integration. A country can only lose what it has, and in Europe the countries that joined EMU, save Germany, apparently did not have much to lose, at least not in terms of monetary independence. Instead, they actually gained monetary policy influence by getting a seat on the ECB's governing council, which is responsible for setting interest rate policy in the euro area.
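As an illustration of the kind of pairwise test involved, the sketch below runs an Engle-Granger cointegration test on simulated daily interest-rate series using statsmodels; the simulated data and the choice of test are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint

# Hypothetical daily interest-rate series; in the study these would be market
# rates for an EMU member state and for Germany.
rng = np.random.default_rng(0)
germany = np.cumsum(rng.normal(0.0, 0.02, 2000)) + 5.0   # random walk around 5%
member = germany + rng.normal(0.0, 0.05, 2000)           # closely tracks the German rate

rates = pd.DataFrame({"germany": germany, "member": member})

# Engle-Granger test: the null hypothesis is "no cointegration".
t_stat, p_value, _ = coint(rates["member"], rates["germany"])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value indicates the two rates share a common stochastic trend,
# i.e. the member state's rate is tied to the German rate.
```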

Relevance: 100.00%

Abstract:

Svalgaard (2014) has recently pointed out that the calibration of the Helsinki magnetic observatory's H-component variometer was probably in error in published data for the years 1866–1874.5, and that this makes the interdiurnal variation index based on daily means, IDV(1d) (Lockwood et al., 2013a), and the interplanetary magnetic field strength derived from it (Lockwood et al., 2013b), too low around the peak of solar cycle 11. We use data from the modern Nurmijärvi station, relatively close to the site of the original Helsinki Observatory, to confirm a 30% underestimation in this interval, and hence our results are fully consistent with the correction derived by Svalgaard. We show that the best method for recalibration uses the Helsinki Ak(H) and aa indices and is accurate to ±10%. This makes it preferable to recalibration using either the sunspot number or the diurnal range of geomagnetic activity, which we find to be accurate to ±20%. In the case of Helsinki data during cycle 11, the two recalibration methods produce very similar corrections, which are here confirmed using newly digitised data from the nearby St Petersburg observatory and also using declination data from Helsinki. However, we show that the IDV index is, compared with later years, too similar to the sunspot number before 1872, revealing that the independence of the two data series has been lost; either the geomagnetic data used to compile IDV have been corrected using sunspot numbers, or vice versa, or both. We present corrected data sequences for both the IDV(1d) index and the reconstructed IMF (interplanetary magnetic field). We also analyse the relationship between the derived near-Earth IMF and the sunspot number, and point out the relevance of the prior history of solar activity, in addition to the contemporaneous value, to estimating any "floor" value of the near-Earth interplanetary field.
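The rescaling idea behind such a recalibration can be sketched with synthetic numbers as below; the reference index, the "well-calibrated" slope and the 30% error are stand-ins used only to show the regression-based correction, not the actual Helsinki analysis.

```python
import numpy as np

# Synthetic stand-ins: a reference activity index (playing the role of Ak(H)/aa)
# and an H-based index whose calibration is 30% too low in one segment.
rng = np.random.default_rng(1)
reference = rng.gamma(2.0, 10.0, 500)                 # reference index values
true_h = 2.0 * reference + rng.normal(0.0, 3.0, 500)  # correctly calibrated index
observed_h = 0.7 * true_h                             # miscalibrated segment (30% low)

# Regress the suspect segment on the reference and rescale so its slope matches
# the slope found in well-calibrated years (2.0 in this synthetic example).
slope_bad = np.polyfit(reference, observed_h, 1)[0]
slope_good = 2.0
correction = slope_good / slope_bad
corrected_h = observed_h * correction

print(f"estimated correction factor: {correction:.2f}")   # ~1.43, i.e. 1/0.7
```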

Relevance: 100.00%

Abstract:

Based on a combined internet and mail survey in Germany, the independence of indicators of trust in public authorities from indicators of attitudes toward genetically modified (GM) food is tested. Despite evidence of a link between trust indicators on the one hand and evaluations of benefits and perceived likelihoods of risks on the other, correlations with other factors are found to be moderate on average. However, the trust indicators exhibit only a moderate relation with the respondents' preference for either sole public control or cooperation between public and private bodies in the monitoring of GM food distribution. Instead, age and location in either the new or the old Länder are found to be significantly related to such preferences.
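A hedged sketch of the two kinds of test described, a rank correlation between indicators and an independence test against region, is given below with synthetic survey data; the variable names and scales are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr, chi2_contingency

# Synthetic stand-in for the survey data (variable names are hypothetical).
rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "trust_authorities": rng.integers(1, 6, n),       # 1-5 Likert item
    "perceived_benefit": rng.integers(1, 6, n),       # 1-5 Likert item
    "prefers_public_control": rng.integers(0, 2, n),  # 0 = mixed bodies, 1 = public only
    "new_laender": rng.integers(0, 2, n),             # 0 = old, 1 = new Laender
})

# Rank correlation between a trust indicator and an attitude indicator.
rho, p = spearmanr(df["trust_authorities"], df["perceived_benefit"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

# Independence test between monitoring preference and region.
table = pd.crosstab(df["prefers_public_control"], df["new_laender"])
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f} (p = {p:.3f})")
```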

Relevance: 100.00%

Abstract:

Contemporary research in generative second language (L2) acquisition has attempted to address observable target-deviant aspects of L2 grammars within a UG-continuity framework (e.g. Lardiere 2000; Schwartz 2003; Sprouse 2004; Prévost & White 1999, 2000). With this in mind, the independence of pragmatic and syntactic development, independently observed elsewhere (e.g. Grodzinsky & Reinhart 1993; Lust et al. 1986; Pacheco & Flynn 2005; Serratrice, Sorace & Paoli 2004), becomes particularly interesting. In what follows, I examine the resetting of the Null-Subject Parameter (NSP) by English learners of L2 Spanish. I argue that insensitivity to the associated discourse-pragmatic constraints on the discursive distribution of overt and null subjects accounts for errors that might otherwise appear to result from syntactic deficits. It is demonstrated that, despite target-deviant performance, the majority of learners must have native-like syntactic competence, given their knowledge of the Overt Pronoun Constraint (Montalbetti 1984), a principle associated with the Spanish-type setting of the NSP.

Relevance: 100.00%

Abstract:

This special issue is focused on the assessment of algorithms for the observation of Earth's climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth's climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.

Relevance: 100.00%

Abstract:

This paper applies an attribute-based stated choice experiment (CE) approach to estimate the value that society places on changes to the size of the badger population in England and Wales. The study was undertaken in the context of a rising incidence of bovine tuberculosis (bTB) in cattle and the government's review of current bTB control policy. This review includes consideration of culling badgers to reduce bTB in cattle, since badgers are thought to be an important wildlife reservoir for the disease. The design of the CE involved four attributes (size of badger population, cattle slaughtered due to bTB, badger management strategy and household tax) at four levels, with eight choice sets of two alternatives presented to respondents. Telephone interviews were undertaken with over 400 respondents, which elicited their attitudes and preferences concerning badgers, bTB in cattle and badger management strategies. The study estimated a willingness to pay of £0.10 per household per year per 100,000 badgers and £1.52 per household per year per 10,000 cattle slaughtered due to bTB, which aggregated to £22 per badger and £3,298 per bTB slaughtered animal across all households in England and Wales. Management strategy toward badgers had a very high valuation, highlighting the emotive issue of badger culling for respondents and the importance of government policy towards badgers.
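The aggregation behind the per-badger and per-animal figures can be reconstructed as below, assuming roughly 21.7 million households in England and Wales; the household count is an assumption used only to show the arithmetic.

```latex
% Aggregation of household willingness to pay (household count is an assumption).
\[
\frac{\pounds 0.10 \times 21.7\times 10^{6}\ \text{households}}{100{,}000\ \text{badgers}}
  \approx \pounds 22\ \text{per badger},
\qquad
\frac{\pounds 1.52 \times 21.7\times 10^{6}\ \text{households}}{10{,}000\ \text{cattle}}
  \approx \pounds 3{,}298\ \text{per slaughtered animal}.
\]
```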

Relevance: 100.00%

Abstract:

Most active-contour methods are based either on maximizing the image contrast under the contour or on minimizing the sum of squared distances between contour and image 'features'. The Marginalized Likelihood Ratio (MLR) contour model uses a contrast-based measure of goodness-of-fit for the contour and thus falls into the first class. The point of departure from previous models lies in marginalizing this contrast measure over unmodelled shape variations. The MLR model naturally leads to the EM Contour algorithm, in which pose optimization is carried out by iterated least squares, as in feature-based contour methods. The difference with respect to other feature-based algorithms is that the EM Contour algorithm minimizes squared distances from Bayes least-squares (marginalized) estimates of contour locations, rather than from 'strongest features' in the neighborhood of the contour. Within the framework of the MLR model, alternatives to the EM algorithm can also be derived: one of these alternatives is the empirical-information method. Tracking experiments demonstrate the robustness of pose estimates given by the MLR model, and support the theoretical expectation that the EM Contour algorithm is more robust than either feature-based methods or the empirical-information method.
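The sketch below is not the MLR/EM Contour algorithm itself; it is a minimal, translation-only illustration of iterated least-squares pose fitting in which soft Gaussian weights stand in for marginalized estimates of contour location.

```python
import numpy as np

# Minimal sketch: fit a 2D translation aligning a circular model contour to noisy
# "image" edge points by iterated, weighted least squares along the contour normals.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
model = 50.0 * np.column_stack([np.cos(t), np.sin(t)])   # model contour points
normals = np.column_stack([np.cos(t), np.sin(t)])        # outward unit normals
true_shift = np.array([6.0, -4.0])
observed = model + true_shift + rng.normal(0.0, 1.0, model.shape)

translation = np.zeros(2)
for _ in range(20):
    predicted = model + translation
    # "E-step": residual of each observation along its normal, with a Gaussian
    # weight standing in for a marginalized contour-location estimate.
    resid = np.einsum("ij,ij->i", observed - predicted, normals)
    weights = np.exp(-0.5 * (resid / 3.0) ** 2)
    # "M-step": weighted least-squares update of the translation.
    A = np.einsum("i,ij,ik->jk", weights, normals, normals)
    b = np.einsum("i,i,ij->j", weights, resid, normals)
    translation += np.linalg.solve(A, b)

print("estimated shift:", translation)   # close to (6, -4)
```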

Relevance: 100.00%

Abstract:

Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example 'rise', is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) when participants completed a lexical decision task (deciding whether a string of letters is a real word or not). Reaction times for motion words, such as 'rise' or 'fall', were slower when the direction of visual motion and the 'motion' of the word were incongruent, but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is not likely to be influenced by strategies, our results support close contact between semantic information and perceptual systems.

Relevance: 100.00%

Abstract:

Assigning probabilities to alleged relationships, given DNA profiles, requires, among other things, calculation of a likelihood ratio (LR). Such calculations usually assume independence of genes: this assumption is not appropriate when the tested individuals share recent ancestry due to population substructure. Adjusted LR formulae, incorporating the coancestry coefficient F(ST), are presented here for various two-person relationships, and the issue of mutations in parentage testing is also addressed.
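A standard building block used in such F(ST)-adjusted calculations is the Balding-Nichols conditional allele probability; the relationship-specific LR formulae in the paper are constructed from expressions of this general kind.

```latex
% Balding-Nichols conditional probability of observing allele A_i, given that m
% copies of A_i have already been seen among n sampled alleles from the same
% subpopulation (p_i is the population frequency of A_i):
\[
\Pr\!\left(A_i \mid m \text{ of } n \text{ observed alleles are } A_i\right)
  = \frac{m\,F_{ST} + (1 - F_{ST})\,p_i}{1 + (n - 1)\,F_{ST}} .
\]
% Setting F_ST = 0 recovers the usual independence assumption Pr(A_i) = p_i.
```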

Relevance: 100.00%

Abstract:

In the last ten years, Regulatory Impact Analysis has become the instrument providing the groundwork for evidence-based regulatory decisions in most developed countries. However, the increase in quantity has not been matched by an increase in quality. In Italy, Regulatory Impact Analysis has been in place for ten years on paper, but in practice it has not been performed consistently. Of particular interest is the case of independent regulatory authorities, which have been required to apply Regulatory Impact Analysis since 2003. This paper explores how Regulatory Impact Analysis is carried out by examining in depth how an individual case, on the Regulation for Quality of Service, was executed by the Autorità per l'energia elettrica e il gas. The aim is to provide a picture of the process leading to the final Regulatory Impact Analysis report, rather than just a study of its content. The case illustrates how Regulatory Impact Analysis, when properly employed, can be an important aid to regulatory decisions, not only by assessing ex ante the economic impacts of regulatory proposals in terms of costs, benefits and risks, but also by opening up the range of policy alternatives and systematically considering stakeholder opinions as part of the decision-making process. The case also highlights several difficulties, analytical and process-related, that emerge in practical applications. Finally, it shows that the experience and expertise built up by the regulatory authority over the years had a significant impact on the quality of the analysis.

Relevance: 100.00%

Abstract:

Although the independence and causality of the association have not been fully established, non-fasting (postprandial) triglyceride (TG) concentrations have emerged as a clinically significant cardiovascular disease (CVD) risk factor. In the current review, findings from three insightful prospective studies in the area, namely the Women's Health Study, the Copenhagen City Heart Study and the Norwegian Counties Study, are discussed. An overview is provided of the likely etiological basis for the association between postprandial TG and CVD, with a focus on both lipid and non-lipid (inflammation, hemostasis and vascular function) risk factors. The impact of various lifestyle and physiological determinants is considered, in particular genetic variation and meal fat composition. Furthermore, although data are limited, some information is provided on the relative and interactive impact of a number of modulators of lipemia. It is evident that, relative to age, gender and body mass index (known modulators of postprandial lipemia), the contribution of identified gene variants to the heterogeneity observed in the postprandial response is likely to be relatively small. Finally, we highlight the need for the development of a standardised 'fat tolerance test' for use in clinical trials, to allow the integration and comparison of data from individual studies.

Relevance: 100.00%

Abstract:

Certain algebraic combinations of single scattering albedo and solar radiation reflected from, or transmitted through, vegetation canopies do not vary with wavelength. These "spectrally invariant relationships" are the consequence of wavelength independence of the extinction coefficient and scattering phase function in vegetation. In general, this wavelength independence does not hold in the atmosphere, but in cloud-dominated atmospheres the total extinction and total scattering phase function vary only weakly with wavelength. This paper identifies the atmospheric conditions under which the spectrally invariant approximation can accurately describe the extinction and scattering properties of cloudy atmospheres. The validity of the assumptions and the accuracy of the approximation are tested with 1D radiative transfer calculations using publicly available radiative transfer models: Discrete Ordinate Radiative Transfer (DISORT) and Santa Barbara DISORT Atmospheric Radiative Transfer (SBDART). It is shown, for cloudy atmospheres with cloud optical depth above 3 and for spectral intervals that exclude strong water vapor absorption, that the spectrally invariant relationships found in vegetation canopy radiative transfer are valid to better than 5%. The physics behind this phenomenon, its mathematical basis, and possible applications to remote sensing and climate are discussed.
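The check described can be sketched as follows: in the vegetation literature the invariant takes the form R(λ)/ω(λ) = p R(λ) + c with a wavelength-independent slope p, and the test is whether the same linear relation holds for cloudy-sky reflectances. In the sketch below, synthetic values stand in for DISORT/SBDART output and are assumptions for illustration only.

```python
import numpy as np

# Synthetic stand-in for spectral output of a 1D radiative transfer model:
# per-band single-scattering albedo and a reflectance that obeys the invariant
# relation R = c * omega / (1 - p * omega), plus small noise.
rng = np.random.default_rng(4)
omega = np.linspace(0.85, 1.0, 30)          # single-scattering albedo per spectral band
p_true, c_true = 0.6, 0.2                   # "recollision"-type slope and intercept
reflect = c_true * omega / (1.0 - p_true * omega)
reflect *= 1.0 + rng.normal(0.0, 0.01, omega.size)

# Spectral-invariance check: regress R/omega against R; near-linearity with a
# stable slope p across bands is the spectrally invariant behaviour.
slope, intercept = np.polyfit(reflect, reflect / omega, 1)
resid = reflect / omega - (slope * reflect + intercept)
rel_err = np.max(np.abs(resid) / (reflect / omega))
print(f"p ~ {slope:.3f}, c ~ {intercept:.3f}, max relative residual {rel_err:.2%}")
```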

Relevance: 100.00%

Abstract:

We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three Along-Track Scanning Radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K per decade, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability, of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.

Relevance: 100.00%

Abstract:

Parasitic infections with gastrointestinal nematodes (GINs) still represent a major worldwide pathological threat associated with the outdoor production of various livestock species. Because of the widespread resistance to synthetic chemical anthelmintics, there is a strong impetus to explore novel approaches for a more integrated management of these infections. The use of nutraceuticals in the control of GINs is one of the alternatives that has been widely studied over the past 20 years. The objectives of this review are: i) to define and illustrate the concept of 'nutraceutical' in the context of veterinary parasitology, based on data obtained for the most studied GIN models in small ruminants, the tannin-containing legumes (Fabaceae); ii) to illustrate how the 'nutraceutical concept' could be extended to other plants, other livestock production systems and other GI parasitic diseases; and iii) to explain how this concept is opening up new research fields for a better understanding of the interactions between the host, the digestive parasites and the environment.