931 results for Mean-value deviance (MVD)
Abstract:
In this paper the authors construct a theory about how the expansion of higher education could be associated with several factors that indicate a decline in the quality of degrees. They assume that the expansion of tertiary education takes place through three channels, and show how these channels are likely to reduce average study time, lower academic requirements and average wages, and inflate grades. First, universities have an incentive to increase their student body through public and private funding schemes beyond a level at which they can keep their academic requirements high. Second, due to skill-biased technological change, employers have an incentive to recruit staff with a higher education degree. Third, students have an incentive to acquire a college degree due to employers’ preferences for such qualifications; the university application procedures; and through the growing social value placed on education. The authors develop a parsimonious dynamic model in which a student, a college and an employer repeatedly make decisions about requirement levels, performance and wage levels. Their model shows that if i) universities have the incentive to decrease entrance requirements, ii) employers are more likely to employ staff with a higher education degree and iii) all types of students enrol in colleges, the final grade will not necessarily induce weaker students to study more to catch up with more able students. In order to re-establish a quality-guarantee mechanism, entrance requirements should be set at a higher level.
Abstract:
BACKGROUND: Recent studies have demonstrated that exercise capacity is an independent predictor of mortality in women. Normative values of exercise capacity for age in women have not been well established. Our objectives were to construct a nomogram to permit determination of predicted exercise capacity for age in women and to assess the predictive value of the nomogram with respect to survival. METHODS: A total of 5721 asymptomatic women underwent a symptom-limited, maximal stress test. Exercise capacity was measured in metabolic equivalents (MET). Linear regression was used to estimate the mean MET achieved for age. A nomogram was established to allow the percentage of predicted exercise capacity to be estimated on the basis of age and the exercise capacity achieved. The nomogram was then used to determine the percentage of predicted exercise capacity for both the original cohort and a referral population of 4471 women with cardiovascular symptoms who underwent a symptom-limited stress test. Survival data were obtained for both cohorts, and Cox survival analysis was used to estimate the rates of death from any cause and from cardiac causes in each group. RESULTS: The linear regression equation for predicted exercise capacity (in MET) on the basis of age in the cohort of asymptomatic women was as follows: predicted MET = 14.7 - (0.13 x age). The risk of death among asymptomatic women whose exercise capacity was less than 85 percent of the predicted value for age was twice that among women whose exercise capacity was at least 85 percent of the age-predicted value (P<0.001). Results were similar in the cohort of symptomatic women. CONCLUSIONS: We have established a nomogram for predicted exercise capacity on the basis of age that is predictive of survival among both asymptomatic and symptomatic women. These findings could be incorporated into the interpretation of exercise stress tests, providing additional prognostic information for risk stratification.
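The regression equation and the 85% cutoff reported above can be expressed as a short calculation. This is a minimal sketch: the function names are illustrative; only the formula (predicted MET = 14.7 − 0.13 × age) and the 85% threshold come from the abstract.

```python
def predicted_met(age):
    """Predicted exercise capacity (MET) for age, per the cohort regression."""
    return 14.7 - 0.13 * age

def percent_of_predicted(achieved_met, age):
    """Achieved capacity as a percentage of the age-predicted value."""
    return 100.0 * achieved_met / predicted_met(age)

def elevated_risk(achieved_met, age, cutoff=85.0):
    """True if capacity falls below 85% of predicted, the group whose
    mortality risk was roughly doubled in the study."""
    return percent_of_predicted(achieved_met, age) < cutoff

# Example: a 50-year-old woman achieving 6 MET.
# Predicted = 14.7 - 0.13 * 50 = 8.2 MET; 6 / 8.2 is about 73%, below 85%.
```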
Abstract:
Background Regression to the mean (RTM) is a statistical phenomenon that can make natural variation in repeated data look like real change. It happens when unusually large or small measurements tend to be followed by measurements that are closer to the mean. Methods We give some examples of the phenomenon, and discuss methods to overcome it at the design and analysis stages of a study. Results The effect of RTM in a sample becomes more noticeable with increasing measurement error and when follow-up measurements are only examined on a sub-sample selected using a baseline value. Conclusions RTM is a ubiquitous phenomenon in repeated data and should always be considered as a possible cause of an observed change. Its effect can be alleviated through better study design and use of suitable statistical methods.
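The mechanism described above can be demonstrated with a small simulation (an illustrative sketch, not taken from the paper): a stable trait is measured twice with independent error, extreme subjects are selected on the baseline measurement only, and their follow-up mean drifts back toward the population mean despite zero real change.

```python
import random

random.seed(42)

# A stable trait measured twice with independent error:
# no true change occurs between baseline and follow-up.
n = 20000
true_values = [random.gauss(100, 10) for _ in range(n)]
baseline = [t + random.gauss(0, 10) for t in true_values]
followup = [t + random.gauss(0, 10) for t in true_values]

# Select the "extreme" subjects using the baseline measurement only.
selected = [i for i in range(n) if baseline[i] > 115]

mean_base = sum(baseline[i] for i in selected) / len(selected)
mean_follow = sum(followup[i] for i in selected) / len(selected)

# mean_follow lands between mean_base and the population mean of 100:
# an apparent "improvement" produced entirely by regression to the mean.
```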
Abstract:
The recent deregulation in electricity markets worldwide has heightened the importance of risk management in energy markets. Assessing Value-at-Risk (VaR) in electricity markets is arguably more difficult than in traditional financial markets because the distinctive features of the former result in a highly unusual distribution of returns: electricity returns are highly volatile, display seasonalities in both their mean and volatility, exhibit leverage effects and volatility clustering, and feature extreme levels of skewness and kurtosis. With electricity applications in mind, this paper proposes a model that accommodates autoregression and weekly seasonals in both the conditional mean and conditional volatility of returns, as well as leverage effects via an EGARCH specification. In addition, extreme value theory (EVT) is adopted to explicitly model the tails of the return distribution. Compared to a number of other parametric models and simple historical-simulation approaches, the proposed EVT-based model performs well in forecasting out-of-sample VaR. In addition, statistical tests show that the proposed model provides appropriate interval coverage in both unconditional and, more importantly, conditional contexts. Overall, the results are encouraging in suggesting that the proposed EVT-based model is a useful technique in forecasting VaR in electricity markets. (c) 2005 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.
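The EVT component described above can be illustrated with a generic peaks-over-threshold sketch: excesses over a high empirical threshold are fitted with a generalized Pareto distribution, and the fitted tail is inverted to obtain the VaR quantile. This is not the paper's model (no EGARCH, no seasonals); it uses a simple method-of-moments GPD fit, and exponential synthetic losses for a stable demonstration, whereas electricity returns would have a much heavier tail.

```python
import random

def pot_var(losses, threshold_q=0.90, var_q=0.99):
    """Peaks-over-threshold VaR: fit a GPD to excesses over a high
    empirical threshold (method-of-moments) and invert the tail."""
    n = len(losses)
    xs = sorted(losses)
    u = xs[int(threshold_q * n)]                 # empirical threshold
    excesses = [x - u for x in losses if x > u]
    nu = len(excesses)
    m = sum(excesses) / nu                       # mean excess
    s2 = sum((e - m) ** 2 for e in excesses) / (nu - 1)
    xi = 0.5 * (1.0 - m * m / s2)                # GPD shape (moment estimator)
    sigma = m * (1.0 - xi)                       # GPD scale
    # Tail quantile: VaR_q = u + (sigma/xi) * (((n/nu)*(1-q))**(-xi) - 1)
    return u + (sigma / xi) * (((n / nu) * (1.0 - var_q)) ** (-xi) - 1.0)

random.seed(7)
# Synthetic "loss" series standing in for electricity-return losses.
losses = [random.expovariate(1.0) for _ in range(5000)]
var99 = pot_var(losses)   # close to the true 99% quantile, ln(100) ~ 4.6
```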
Abstract:
Public values are moving from a research concern to policy discourse and management practice. There are, though, different readings of what public values actually mean. Reflection suggests two distinct strands of thinking: a generative strand that sees public value emerging from processes of public debate; and an institutional interpretation that views public values as the attributes of government producers. Neither perspective seems to offer a persuasive account of how the public gains from strengthened public values. Key propositions on values are generated from comparison of influential texts. A provisional framework is presented of the values base of public institutions and the loosely coupled public propositions flowing from these values. Value propositions issue from different governing contexts, which are grouped into policy frames that then compete with other problem frames for citizens’ cognitive resources. Vital democratic commitments to pluralism require public values to be distributed in competition with other, respected, frames.
Abstract:
In recent years, the luxury market has entered a period of very modest growth, which has been dubbed the ‘new normal’, where varying tourist flows, currency fluctuations, and shifting consumer tastes dictate the terms. The modern luxury consumer is fickle. Millennials in particular – people born in the 1980s and 1990s – embody this new form of demanding luxury consumer with particular tastes and values. Modern consumers, and specifically millennials, want experiences and free time, and are interested in a brand’s societal position and environmental impact. The purpose of this thesis is to investigate the luxury value perceptions of millennials in higher education in Europe, given that many of the most prominent luxury goods companies in the world originate from Europe. Perceived luxury value is herein examined from the individual’s perspective. As values and value perceptions are complex constructs, qualitative research methods are justified. The data for this thesis were gathered by means of a group interview. The interview participants all study hospitality management in a private college, and each represents a different nationality. Cultural theories and research on luxury and luxury values provide the scientific foundation for this thesis, and a multidimensional luxury value model is used as a theoretical tool in sorting and analyzing the data. The results show that millennials in Europe value much more than simply modern and hard luxury. Functional, financial, individual, and social aspects are all present in perceived luxury value, though some more in a negative sense than others. Conspicuous, status-seeking consumption is mostly frowned upon, as is the consumption of luxury goods for the sake of satisfying social requisites and peer pressure.
Most of the positive value perceptions are attributed to the functional dimension, as luxury products are seen to come with a promise of high quality and reliability, which justifies any price premiums. Ecological and ethical aspects of luxury are already a contemporary trend, but perceived even more as an important characteristic of luxury in the future. Most importantly, having time is fundamental. Depending on who is asked, luxury can mean anything, just as much as it can mean nothing.
Abstract:
The widespread impact of exotic fishes, especially Oreochromis niloticus and Lates niloticus, together with overfishing in the Victoria and Kyoga lake basins during the 1950s and 1960s caused endemic species such as the previously most important Oreochromis esculentus to become virtually extinct in the two lakes by the 1970s. Based on reports of the presence of this native species in some satellite lakes within the two lake basins, a set of satellite lakes in the Victoria basin (the Nabugabo lakes: Kayanja and Kayugi) was sampled between 1997 and 2002 with the objective of assessing their value as conservation sites for O. esculentus. Other satellite lakes (Mburo and Kachera), also in the Victoria basin, and Lemwa, Kawi and Nabisojjo, in the Kyoga basin, were sampled for comparison. Among the Nabugabo lakes, O. esculentus was more abundant in Lake Kayanja (20.1 % of the total fish catch by weight) than in Lake Kayugi (1.4 %). The largest fish examined (38.7 cm TL) was caught in Lake Kayugi (also the largest in all satellite lakes sampled), while the smallest (6.6 cm TL) was from Lake Kayanja. Fish from Lake Kayugi had a higher condition factor K (1.89±0.02) than fish from Lake Kayanja (1.53±0.01), which was the second highest among the satellite lakes after Lake Kawi (1.92±0.2). Diatoms, especially Aulacoseira, previously known to be the best food for O. esculentus in Lake Victoria, were the most frequently encountered item (93.2 %) in fish stomachs from Lake Kayugi. In Lake Kayanja the dominant food item was the blue-green alga Planktolyngbya, while Microcystis was the most abundant diet item in fish from the other satellite lakes. There were more male than female fish (ratios 1:0.91 and 1:0.79 in lakes Kayugi and Kayanja respectively), comparable to the situation in Lake Victoria before the species was depleted. The highest mean fecundity (771±218 eggs) was recorded in Lake Kayugi, compared with Lake Kayanja (399±143).
Based on the results from Lake Kayugi, where diatoms dominated the diet of O. esculentus and where the largest, most fecund and healthiest fish were found, this lake would be the most valuable site for the conservation of O. esculentus and the best source of fish for restocking and captive propagation. The lake is therefore recommended for protection from overexploitation and misuse.
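The condition factor K reported for these lakes is conventionally Fulton's index, computed from weight in grams and total length in centimetres. Note this is an assumption for illustration: the abstract reports K values but does not state the formula used.

```python
def fulton_k(weight_g, length_cm):
    """Fulton's condition factor: K = 100 * W / L^3 (W in g, L in cm).
    Values around 1.5-2.0, as reported for the satellite lakes,
    indicate fish in good condition."""
    return 100.0 * weight_g / length_cm ** 3

# Hypothetical example: a 500 g fish of 30 cm total length.
k = fulton_k(500, 30)   # 100 * 500 / 27000, about 1.85
```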
Abstract:
Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. 
As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox implementing this methodology is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
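The transformation step described above, a time-varying normalization by a running mean and running standard deviation, can be sketched in a few lines. This is a simplified illustration under stated assumptions (a plain centered moving average as the low-pass filter); it does not reproduce the tsEva toolbox's filters or its GEV/GPD fitting.

```python
import math
import random

def running_mean(x, w):
    """Centered moving average with the window clamped at the series edges."""
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - w // 2), min(n, i + w // 2 + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def transform_stationary(x, w=101):
    """Remove the slowly varying mean, then rescale by the running
    standard deviation, yielding an approximately stationary series."""
    mu = running_mean(x, w)
    resid = [xi - mi for xi, mi in zip(x, mu)]
    var = running_mean([r * r for r in resid], w)
    sd = [math.sqrt(v) for v in var]
    return [r / s for r, s in zip(resid, sd)], mu, sd

random.seed(1)
# Non-stationary toy series: linear trend plus noise whose spread grows.
n = 2000
x = [0.01 * t + (1.0 + t / n) * random.gauss(0, 1) for t in range(n)]
y, mu, sd = transform_stationary(x)

mean_y = sum(y) / n
std_y = math.sqrt(sum((v - mean_y) ** 2 for v in y) / n)
# Stationary EVA (e.g. a GEV fit to block maxima of y) can now be applied
# and the fitted distribution back-transformed with mu and sd.
```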
Abstract:
In this study we examined the impact of weather variability and tides on the transmission of Barmah Forest virus (BFV) disease and developed a weather-based forecasting model for BFV disease in the Gladstone region, Australia. We used seasonal autoregressive integrated moving-average (SARIMA) models to determine the contribution of weather variables to BFV transmission after the time-series data of response and explanatory variables were made stationary through seasonal differencing. We obtained data on the monthly counts of BFV cases, weather variables (e.g., mean minimum and maximum temperature, total rainfall, and mean relative humidity), high and low tides, and the population size in the Gladstone region between January 1992 and December 2001 from the Queensland Department of Health, Australian Bureau of Meteorology, Queensland Department of Transport, and Australian Bureau of Statistics, respectively. The SARIMA model shows that the 5-month moving average of minimum temperature (β = 0.15, p-value < 0.001) was statistically significantly and positively associated with BFV disease, whereas high tide in the current month (β = −1.03, p-value = 0.04) was statistically significantly and inversely associated with it. No significant association was found for the other variables. These results may be applied to forecast the occurrence of BFV disease and to allocate public health resources for BFV control and prevention.
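The stationarity step mentioned above, seasonal differencing of a monthly series, can be shown in a few lines. This is a generic sketch of the standard preprocessing step; the study's SARIMA fitting itself is not reproduced, and the toy numbers are invented for illustration.

```python
def seasonal_difference(y, period=12):
    """Seasonal differencing: d[t] = y[t] - y[t - period].
    Removes a repeating seasonal cycle (and reduces a linear trend
    to a constant), a standard step before fitting a SARIMA model."""
    return [y[t] - y[t - period] for t in range(period, len(y))]

# Toy monthly series: a fixed 12-month cycle plus a linear trend.
seasonal = [5, 3, 1, 0, 0, 1, 3, 6, 9, 11, 12, 8]
y = [seasonal[t % 12] + 0.5 * t for t in range(48)]

d = seasonal_difference(y)
# The cycle cancels and the trend becomes constant:
# every differenced value equals 0.5 * 12 = 6.0.
```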