973 results for Empirical Comparison


Relevance:

30.00%

Publisher:

Abstract:

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN). Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records. We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate of increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal.
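
To make the construction concrete, below is a minimal sketch (not the authors' code) of building an empirical noise envelope: residuals between an observed GMT series and a forced signal are block-bootstrapped to bound the decadal trends that unforced variability alone could produce. The data, names, and the block-bootstrap choice are all illustrative assumptions.

```python
# Minimal sketch of an empirical "envelope of unforced noise": bootstrap
# the residuals (observed GMT minus a forced signal) in blocks, so that
# autocorrelation is roughly preserved, and ask how large a decadal
# trend pure unforced noise can generate. Entirely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def block_bootstrap(residuals, n_samples=1000, block=10):
    """Resample residuals in contiguous blocks of `block` years."""
    n = len(residuals)
    draws = np.empty((n_samples, n))
    for i in range(n_samples):
        starts = rng.integers(0, n - block, size=n // block + 1)
        draws[i] = np.concatenate(
            [residuals[s:s + block] for s in starts])[:n]
    return draws

# Toy data standing in for observed GMT anomalies (deg C).
years = np.arange(1900, 2000)
forced = 0.007 * (years - years[0])        # hypothetical forced signal
noise = np.convolve(rng.normal(0, 0.08, years.size), [0.6, 0.4], "same")
observed = forced + noise

envelope = block_bootstrap(observed - forced)

# Decadal warming rates consistent with unforced noise alone:
trends = np.array([np.polyfit(np.arange(10), d[:10], 1)[0]
                   for d in envelope])
lo, hi = np.percentile(trends, [2.5, 97.5])
print(f"95% envelope of unforced decadal trends: "
      f"{lo:+.4f} to {hi:+.4f} deg C/yr")
```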

Relevance:

30.00%

Publisher:

Abstract:

Cigarette smuggling reduces the price of cigarettes, thwarts youth access restrictions, reduces government revenue, and undercuts the ability of taxes to reduce consumption. The tobacco industry often opposes increases to tobacco taxes on the claim that greater taxes induce more smuggling. To date, little is known about the magnitude of smuggling in the Philippines. This information is necessary to effectively address illicit trade and to measure the impacts of tax changes and the introduction of secure tax markings on illicit trade. This study employs two gap discrepancy methods to estimate the magnitude of illicit trade in cigarettes for the Philippines between 1994 and 2009. First, domestic consumption is compared with tax-paid sales to measure the consumption of illicit cigarettes. Second, imports recorded by the Philippines are compared with exports to the Philippines reported by trade partners to measure smuggling. Domestic consumption fell short of tax-paid sales for all survey years. The magnitude of these differences and a comparison with a prevalence survey for 2009 suggest a high level of survey under-reporting of smoking. In the late 1990s and the mid-2000s, the Philippines experienced two sharp declines in trade discrepancies, from a high of $750 million in 1995 to a low of $133.7 million in 2008. Discrepancies made up more than one-third of the domestic market in 1995, but only 10 percent in 2009. Hong Kong, Singapore, and China together account for more than 80 percent of the cumulative discrepancies over the period and 74 percent of the discrepancy in 2009. The presence of large discrepancies supports the need to implement an effective tax marking and tobacco track-and-trace system to reduce illicit trade and support tax collection. The absence of a relation between tax changes and smuggling suggests that potential increases in the excise tax should not be discouraged by illicit trade. Finally, the identification of specific trade partners as primary sources of illicit trade may facilitate targeted efforts in cooperation with these governments to reduce illicit trade.
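
A rough illustration of the two gap-discrepancy calculations follows; every figure is hypothetical, and the study's actual data sources and adjustments are not reproduced here.

```python
# Minimal sketch of the two gap-discrepancy estimates described above,
# with made-up numbers (not the study's data).

# Gap 1: illicit consumption = surveyed consumption - tax-paid sales.
surveyed_consumption = 90.0e9   # sticks/year, hypothetical survey estimate
tax_paid_sales = 95.0e9         # sticks/year, hypothetical official sales
gap_consumption = surveyed_consumption - tax_paid_sales
# A negative gap, as the study found, points to survey under-reporting.

# Gap 2: smuggling proxy = partner-reported exports - recorded imports.
partner_exports_usd = {"Hong Kong": 60e6, "Singapore": 25e6, "China": 20e6}
recorded_imports_usd = 30e6
trade_discrepancy = sum(partner_exports_usd.values()) - recorded_imports_usd

print(f"consumption gap: {gap_consumption / 1e9:+.1f} bn sticks")
print(f"trade discrepancy: ${trade_discrepancy / 1e6:.1f} m")
```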

Relevance:

30.00%

Publisher:

Abstract:

Satellite-derived remote-sensing reflectance (Rrs) can be used for mapping biogeochemically relevant variables, such as the chlorophyll concentration and the Inherent Optical Properties (IOPs) of the water, at global scale for use in climate-change studies. Prior to generating such products, suitable algorithms have to be selected that are appropriate for the purpose. Algorithm selection needs to account for both qualitative and quantitative requirements. In this paper we develop an objective methodology designed to rank the quantitative performance of a suite of bio-optical models. The objective classification is applied using the NASA bio-Optical Marine Algorithm Dataset (NOMAD). Using in situ Rrs as input to the models, the performance of eleven semi-analytical models, as well as five empirical chlorophyll algorithms and an empirical diffuse attenuation coefficient algorithm, is ranked for spectrally-resolved IOPs, chlorophyll concentration and the diffuse attenuation coefficient at 489 nm. The sensitivity of the objective classification and the uncertainty in the ranking are tested using a Monte-Carlo approach (bootstrapping). Results indicate that the performance of the semi-analytical models varies depending on the product and wavelength of interest. For chlorophyll retrieval, empirical algorithms perform better than semi-analytical models, in general. The performance of these empirical models reflects either their immunity to scale errors or instrument noise in Rrs data, or simply that the data used for model parameterisation were not independent of NOMAD. Nonetheless, uncertainty in the classification suggests that the performance of some semi-analytical algorithms at retrieving chlorophyll is comparable with the empirical algorithms. For phytoplankton absorption at 443 nm, some semi-analytical models also perform with similar accuracy to an empirical model. We discuss the potential biases, limitations and uncertainty in the approach, as well as additional qualitative considerations for algorithm selection for climate-change studies. Our classification has the potential to be routinely implemented, such that the performance of emerging algorithms can be compared with existing algorithms as they become available. In the long-term, such an approach will further aid algorithm development for ocean-colour studies.
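
The ranking-with-bootstrap idea can be sketched as follows; the models, error statistic, and data below are stand-ins, not the NOMAD algorithms or the paper's actual scoring scheme.

```python
# Minimal sketch of a bootstrap-based objective ranking: score each
# model against matched in situ values, resample the match-ups, and
# inspect how stable the ranking is. Entirely illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 300
true_chl = 10 ** rng.uniform(-2, 1, n)          # mg m^-3, hypothetical

# Hypothetical retrievals: truth corrupted by different error levels.
models = {
    "empirical_A":      true_chl * 10 ** rng.normal(0.00, 0.10, n),
    "semianalytical_B": true_chl * 10 ** rng.normal(0.00, 0.15, n),
    "semianalytical_C": true_chl * 10 ** rng.normal(0.05, 0.12, n),
}

def rmse_log10(pred, obs):
    """Root-mean-square error in log10 space, a common ocean-colour metric."""
    return np.sqrt(np.mean((np.log10(pred) - np.log10(obs)) ** 2))

ranks = {name: [] for name in models}
for _ in range(1000):                           # bootstrap resamples
    idx = rng.integers(0, n, n)
    scores = {m: rmse_log10(p[idx], true_chl[idx])
              for m, p in models.items()}
    for rank, m in enumerate(sorted(scores, key=scores.get), start=1):
        ranks[m].append(rank)

for m, r in ranks.items():
    r = np.array(r)
    print(f"{m}: median rank {np.median(r):.0f}, "
          f"best in {(r == 1).mean():.0%} of resamples")
```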

Relevance:

30.00%

Publisher:

Abstract:

Transparency in the nonprofit sector and in foundations, as an element that strengthens stakeholders' confidence in the organization, has been documented by several studies in recent decades. Transparency can be considered in various fields and through different channels. In this study we focus on the analysis of the organizational and economic transparency of foundations, as shown through the information they disclose voluntarily on their websites. We review previously published theoretical studies in order to situate foundations within the framework of the social economy. This framework focuses on the accountability foundations provide in relation to their social function and management, especially through the most recent channel for transparent disclosure, the website. Within this framework, an index was constructed to quantify the voluntary information shown on each foundation's website. The index was developed ad hoc for this study and applied to a group of large corporate foundations. Applying the index yields two kinds of results, at a descriptive level and at an inferential level. We analyze the statistical relation between the economic and organizational transparency offered on the websites, using the quantified variables in a multiple linear regression. This empirical analysis allows us to draw conclusions about the level of transparency these organizations offer with respect to their organizational and financial information, and to explain the relation between the two.
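
A minimal sketch of the inferential step, assuming a plain ordinary-least-squares formulation; the variable names and index sub-scores are hypothetical, not the study's actual items.

```python
# Minimal sketch: regress an economic-transparency score on
# organizational-transparency sub-scores from a website-disclosure
# index. Variables and data are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(2)
n = 60                                 # hypothetical sample of foundations
governance = rng.uniform(0, 1, n)      # organizational sub-scores in [0, 1]
activities = rng.uniform(0, 1, n)
economic = 0.5 * governance + 0.3 * activities + rng.normal(0, 0.1, n)

# OLS with an intercept via least squares.
X = np.column_stack([np.ones(n), governance, activities])
beta, *_ = np.linalg.lstsq(X, economic, rcond=None)
print(f"intercept={beta[0]:.3f}  governance={beta[1]:.3f}  "
      f"activities={beta[2]:.3f}")
```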

Relevance:

30.00%

Publisher:

Abstract:

Students' academic performance is a key concern for all agents involved in a higher-education quality programme. However, there is no unanimity on how to measure it: some professionals assess only cognitive aspects, while others lean towards assessing the acquisition of certain skills. The need to train professionals who can respond to companies' demands and compete internationally in a global labour market requires training that goes beyond memorization. Critical and logical thinking are among the written-language skills demanded in the social sciences. The objective of this study is to demonstrate empirically the impact of voluntary assignments on students' academic performance. Our hypothesis is that students who complete high-quality voluntary assignments are the more motivated ones and, therefore, those with higher grades. An experiment was carried out with students of "Financial Accounting II" during the 2012/13 academic year at the Business and Economics School of the UCM. A series of voluntary assignments involving the preparation of accounting essays was proposed in order to develop skills and competencies complementing the lessons in the curriculum of the subject. At the end of the course, whether or not an essay had been submitted, together with its critical and reflective quality and style, was compared with the final grade obtained. Our findings show a relationship between voluntarily submitted papers of quality and the final grade obtained in the course. These results indicate that students' intrinsic motivation is a key element in their academic performance, while the teacher's role is to act as a motivating element throughout the learning process.

Relevance:

30.00%

Publisher:

Abstract:

Social exclusion and social capital are widely used concepts with multiple and ambiguous definitions. Their meanings and indicators partially overlap, and thus they are sometimes used interchangeably to refer to the inter-relations of economy and society. Both ideas could benefit from further specification and differentiation. The causes of social exclusion and the consequences of social capital have received the fullest elaboration, to the relative neglect of the outcomes of social exclusion and the genesis of social capital. This article identifies the similarities and differences between social exclusion and social capital. We compare the intellectual histories and theoretical orientations of each term, their empirical manifestations and their place in public policy. The article then moves on to elucidate further each set of ideas. A central argument is that the conflation of these notions partly emerges from a shared theoretical tradition, but also from insufficient theorizing of the processes in which each phenomenon is implicated. A number of suggestions are made for sharpening their explanatory focus, in particular better differentiating between cause and consequence, contextualizing social relations and social networks, and subjecting the policy 'solutions' that follow from each perspective to critical scrutiny. Placing the two in dialogue is beneficial for the further development of each.

Relevance:

30.00%

Publisher:

Abstract:

Ecological coherence is a multifaceted conservation objective that includes some potentially conflicting concepts. These concepts include the extent to which the network maximises diversity (including genetic diversity) and the extent to which protected areas interact with non-reserve locations. To examine the consequences of different selection criteria, the preferred location to complement protected sites was examined using samples taken from four locations around each of two marine protected areas: Strangford Lough and Lough Hyne, Ireland. Three different measures of genetic distance were used: FST, Dest and a measure of allelic dissimilarity, along with a direct assessment of the total number of alleles in different candidate networks. Standardized site scores were used for comparisons across methods and selection criteria. The average score for Castlehaven, a site relatively close to Lough Hyne, was highest, implying that this site would capture the most genetic diversity while ensuring the highest degree of interaction between protected and unprotected sites. Patterns around Strangford Lough were more ambiguous, potentially reflecting the weaker genetic structure around this protected area in comparison to Lough Hyne. Similar patterns were found across species with different dispersal capacities, indicating that methods based on genetic distance could be used to help maximise ecological coherence in reserve networks.

Highlights:
• Ecological coherence is a key component of marine protected area network design.
• Coherence contains a number of competing concepts.
• Genetic information from field populations can help guide assessments of coherence.
• Average choice across different concepts of coherence was consistent among species.
• Measures can be combined to compare the coherence of different network designs.
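
A minimal sketch of combining several genetic-distance measures through standardized site scores; the site names (other than Castlehaven), distance values, and the averaging rule are hypothetical, not the study's data or exact method.

```python
# Minimal sketch: z-score each genetic-distance measure over the
# candidate sites, then average the z-scores so measures on different
# scales become comparable. Values are illustrative.
import numpy as np

sites = ["Castlehaven", "SiteB", "SiteC", "SiteD"]  # candidates near an MPA
# Columns: FST, Dest, allelic dissimilarity (each site vs the MPA).
dist = np.array([
    [0.08, 0.12, 0.30],
    [0.05, 0.07, 0.22],
    [0.03, 0.05, 0.18],
    [0.06, 0.09, 0.25],
])

z = (dist - dist.mean(axis=0)) / dist.std(axis=0)   # per-measure z-scores
score = z.mean(axis=1)   # average standardized site score across measures
print(dict(zip(sites, score.round(2))))
print("preferred complement:", sites[int(score.argmax())])
```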

Relevance:

30.00%

Publisher:

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources of elements. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDF). The ECDF method was most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each of the elements pose in these areas to determine potential risk to receptors.
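
A minimal sketch of the ECDF-based step, under illustrative assumptions about the break-detection rule and the percentile used; it is not the NBC or ULBL procedure itself.

```python
# Minimal sketch of using an ECDF to separate background from elevated
# concentrations: sort the data, look for the largest log-gap in the
# upper tail as a candidate domain break, and take an upper percentile
# as an illustrative threshold. Data and rules are assumptions.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical soil arsenic concentrations (mg/kg): lognormal background
# plus a small elevated sub-population.
conc = np.sort(np.concatenate([
    rng.lognormal(mean=1.5, sigma=0.4, size=950),
    rng.lognormal(mean=3.0, sigma=0.3, size=50),
]))
ecdf = np.arange(1, conc.size + 1) / conc.size

tail = conc[ecdf > 0.90]
break_point = tail[np.argmax(np.diff(np.log(tail)))]
print(f"candidate domain break near {break_point:.1f} mg/kg")
print(f"illustrative upper threshold: {np.percentile(conc, 95):.1f} mg/kg")
```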

Relevance:

30.00%

Publisher:

Abstract:

Since the financial crisis, risk-based portfolio allocations have gained a great deal of popularity. This increase in popularity is primarily due to the fact that they make no assumptions about the expected returns of the assets in the portfolio. These portfolios implicitly put risk management at the heart of asset allocation, hence their recent appeal. This paper serves as a comparison of four well-known risk-based portfolio allocation methods: minimum variance, maximum diversification, inverse volatility and equally weighted risk contribution. Empirical backtests are performed over rising-interest-rate periods from 1953 to 2015. Additionally, I compare these portfolios to simpler allocation methods, such as equal weighting and a 60/40 asset-allocation mix. This paper helps to answer the question of whether these portfolios can survive in a rising interest rate environment.
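
A minimal sketch of two of the weighting schemes compared (inverse volatility and unconstrained minimum variance) alongside the equal-weight benchmark; the covariance matrix is hypothetical and constraints such as long-only are ignored.

```python
# Minimal sketch of risk-based weights from a covariance matrix alone:
# no expected-return inputs, as the paper emphasizes. Illustrative data.
import numpy as np

cov = np.array([             # hypothetical annualized covariance matrix
    [0.040, 0.006, 0.002],   # equities
    [0.006, 0.010, 0.001],   # bonds
    [0.002, 0.001, 0.020],   # commodities
])

vol = np.sqrt(np.diag(cov))
w_iv = (1 / vol) / (1 / vol).sum()     # inverse volatility

ones = np.ones(len(cov))
w_mv = np.linalg.solve(cov, ones)      # w proportional to inv(cov) @ 1
w_mv /= w_mv.sum()                     # minimum variance (full investment)

w_ew = ones / len(cov)                 # equally weighted benchmark

for name, w in [("inverse vol", w_iv), ("min variance", w_mv),
                ("equal weight", w_ew)]:
    print(f"{name:13s} weights={np.round(w, 3)} "
          f"vol={np.sqrt(w @ cov @ w):.3f}")
```

Equally weighted risk contribution has no closed form in general and is typically found numerically, which is why it is omitted from this sketch.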

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies the properties and usability of operators called t-norms, t-conorms and uninorms, as well as many-valued implications and equivalences. Weights and a generalized mean are embedded into these operators for aggregation, and the resulting operators are used for comparison tasks; for this reason they are referred to as comparison measures. The thesis illustrates how these operators can be weighted with differential evolution and aggregated with a generalized mean, and the kinds of comparison measures that can be achieved by this procedure. New operators suitable for comparison measures are suggested: combination measures based on the use of t-norms and t-conorms, the generalized 3Π-uninorm, and pseudo-equivalence measures based on S-type implications. The empirical part of the thesis demonstrates how these new comparison measures work in the field of classification, for example in the classification of medical data. The second application area is from the field of sports medicine and involves an expert system for defining an athlete's aerobic and anaerobic thresholds. The core of the thesis offers definitions for comparison measures and illustrates that there is no actual difference between the results achieved in comparison tasks by comparison measures based on distance and those based on many-valued logical structures. The approach in this thesis has been highly practical, and all usage of the measures has been validated mainly by practical testing. In general, many different types of operators suitable for comparison tasks have been presented in the fuzzy-logic literature, but there has been little or no experimental work with these operators.
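
A minimal sketch of a weighted comparison measure in this spirit: per-feature similarities (here the Lukasiewicz-style equivalence 1 − |x − y|, a stand-in for the thesis's operators) are aggregated with a weighted generalized mean. Names, weights, and data are illustrative.

```python
# Minimal sketch of a comparison measure: feature-wise fuzzy
# equivalences aggregated with a weighted power (generalized) mean.
import numpy as np

def generalized_mean(values, weights, p):
    """Weighted power mean: p=1 arithmetic, p->0 geometric, p=-1 harmonic."""
    v = np.asarray(values, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    if abs(p) < 1e-12:
        return float(np.exp(np.sum(w * np.log(v))))
    return float(np.sum(w * v ** p) ** (1.0 / p))

def compare(x, y, weights, p=1.0):
    """Similarity in [0, 1] of two feature vectors in [0, 1]^n."""
    sims = 1.0 - np.abs(np.asarray(x) - np.asarray(y))  # pseudo-equivalence
    return generalized_mean(sims, weights, p)

# Toy classification by comparison with class "ideal vectors", in the
# spirit of the thesis's medical-data experiments; in the real setting
# the weights might be tuned by differential evolution.
ideals = {"class_A": [0.9, 0.2, 0.7], "class_B": [0.3, 0.8, 0.4]}
sample = [0.8, 0.3, 0.6]
w = [2.0, 1.0, 1.0]
scores = {c: compare(sample, v, w) for c, v in ideals.items()}
print(scores, "->", max(scores, key=scores.get))
```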

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). Special cases of this model include log-normal, affine, and GARCH diffusion models. Drawing on previous empirical work, we show that the standard deviation of the noise is not negligible relative to the mean and the standard deviation of the integrated volatility, even if one considers returns at five-minute intervals. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
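
A minimal sketch of the realized-versus-integrated distinction on a toy diffusion; the volatility path below is illustrative, not the eigenfunction stochastic volatility model.

```python
# Minimal sketch: realized variance at a 5-minute-like sampling
# frequency is a noisy measure of the integrated variance over the day.
import numpy as np

rng = np.random.default_rng(4)
n_fine, step = 23400, 300     # 1-second grid; sample every 5 minutes
dt = 1.0 / n_fine

# Toy time-varying volatility over one trading day.
sigma = 0.2 * np.exp(0.3 * np.sin(np.linspace(0, 2 * np.pi, n_fine)))
returns = sigma * np.sqrt(dt) * rng.normal(size=n_fine)

integrated_var = np.sum(sigma**2 * dt)     # the target quantity
log_price = np.cumsum(returns)
realized_var = np.sum(np.diff(log_price[::step])**2)  # squared 5-min returns

print(f"integrated variance: {integrated_var:.6f}")
print(f"realized variance:   {realized_var:.6f}")
print(f"measurement noise:   {realized_var - integrated_var:+.6f}")
```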

Relevance:

30.00%

Publisher:

Abstract:

The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a basis derived from Hermite polynomials. To obtain the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. We therefore propose a weighted linear regression approach, where all k-order polynomials are used as predictand variables and weights are proportional to the reference density. Finally, for the case of 2-order Hermite polynomials (normal reference) and 1-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison among different rocks of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
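
A minimal sketch of the weighted-regression route to coordinates for the normal-reference case, using probabilists' Hermite polynomials; the grid, polynomial order, and densities are illustrative choices, not the contribution's exact setup.

```python
# Minimal sketch: regress the log-ratio of a density to the normal
# reference on Hermite polynomials, weighting by the reference density,
# to obtain coordinates with respect to the Hermite-derived basis.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

x = np.linspace(-5, 5, 801)
ref = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)     # standard normal reference
mu, s = 0.5, 1.2                                 # density to be represented
target = np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

y = np.log(target / ref)        # log-ratio with respect to the reference
X = hermevander(x, 4)           # He_0 ... He_4 evaluated on the grid
w = ref / ref.sum()             # weights proportional to reference density

# Weighted least squares: solve (X' W X) beta = X' W y.
XtW = X.T * w
coords = np.linalg.solve(XtW @ X, XtW @ y)
print(np.round(coords, 4))      # log-ratio of two normals is quadratic,
                                # so the He_3 and He_4 terms should be ~0
```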

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the development of technical education in England, Germany and France during the nineteenth century, contrasting their similarities and differences. It also analyses the role of the State in the provision of technical education in these countries. The article suggests that a high scientific and technical standard in teaching contributed significantly to a country becoming an economic power. For example, the growing technical superiority of the Germans over the British in activities such as chemical production, dyes, iron and steel has been attributed to the fact that the British persisted in the use of empirical methods that constituted a barrier to improvement and adaptation, while the Germans developed a university and polytechnic education system with strong industrial ties that allowed Germany to become the greatest industrial power in Europe by the beginning of the twentieth century.

Relevance:

30.00%

Publisher:

Abstract:

This paper reports CFD and experimental results on the characteristics of wall confluent jets in a room. The results presented show the behaviour of wall confluent jets in terms of velocity profiles, the spreading rate of the jets on the surface, jet decay, and related quantities. The empirical equations derived are compared with those for other types of air jets. In addition, the flow in wall confluent jets is compared with the flow from a displacement ventilation supply, with regard to vertical and horizontal spreading on the floor. It is concluded that the jet momentum of wall confluent jets is conserved better than in other jets; wall confluent jets therefore spread further over the floor than displacement flow.