709 results for Empirical penalties
Abstract:
Liquidity is a fundamentally important facet of investments, but there is no single measure that quantifies it perfectly. Instead, a range of measures are necessary to capture different dimensions of liquidity such as the breadth and depth of markets, the costs of transacting, the speed with which transactions can occur and the resilience of prices to trading activity. This article considers how different dimensions have been measured in financial markets and for various forms of real estate investment. The purpose of this exercise is to establish the range of liquidity measures that could be used for real estate investments before considering which measures and questions have been investigated so far. Most measures reviewed here are applicable to public real estate, but not all can be applied to private real estate assets or funds. Use of a broader range of liquidity measures could help real estate researchers tackle issues such as quantification of illiquidity premiums for the real estate asset class or different types of real estate, and how liquidity differences might be incorporated into portfolio allocation models.
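To give one concrete example of a price-impact measure from this literature, the sketch below computes the Amihud (2002) illiquidity ratio, the average of absolute return over dollar volume; the data and column names are hypothetical, not from any study reviewed here.

```python
import numpy as np
import pandas as pd

def amihud_illiquidity(returns: pd.Series, dollar_volume: pd.Series) -> float:
    """Amihud (2002) price-impact measure: mean of |return| / dollar volume.

    Higher values mean a given trading volume moves the price more,
    i.e. lower liquidity.
    """
    valid = dollar_volume > 0  # skip no-trade days to avoid division by zero
    return (returns[valid].abs() / dollar_volume[valid]).mean()

# Hypothetical daily data for a single listed real estate security.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ret": rng.normal(0.0, 0.01, 250),          # daily returns
    "dollar_vol": rng.uniform(1e5, 1e7, 250),   # daily dollar volume
})
print(amihud_illiquidity(df["ret"], df["dollar_vol"]))
```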
Abstract:
Purpose – This paper sheds light on the practice of incomplete corporate disclosure of quantitative greenhouse gas (GHG) emissions and investigates whether external stakeholder pressure influences the existence, and separately the completeness, of voluntary GHG emissions disclosures by 431 European companies. Design/methodology/approach – A classification of reporting completeness is developed with respect to the scope, type and reporting boundary of GHG emissions, based on the guidelines of the GHG Protocol, the Global Reporting Initiative and the Carbon Disclosure Project. Logistic regression analysis is applied to examine whether proxies for exposure to climate change concerns from different stakeholder groups influence the existence and/or completeness of quantitative GHG emissions disclosure. Findings – From 2005 to 2009, on average only 15 percent of companies that disclose GHG emissions report them in a manner that the authors consider complete. Results of the regression analyses suggest that external stakeholder pressure is a determinant of the existence but not of the completeness of emissions disclosure. The findings are consistent with stakeholder theory arguments that companies respond to external stakeholder pressure to report GHG emissions, but also with legitimacy theory claims that firms can use carbon disclosure, in this case the incomplete reporting of emissions, as a symbolic act to address legitimacy exposures. Practical implications – Bringing corporate GHG emissions disclosure in line with recommended guidelines will require either more direct stakeholder pressure or, perhaps, a mandated disclosure regime. In the meantime, users of the data will need to consider carefully the relevance of the reported data and develop the competencies needed to detect and control for its incompleteness. A more troubling concern is that stakeholders may instead grow to accept less-than-complete disclosure. Originality/value – The paper represents the first large-scale empirical study of the completeness of companies' disclosure of quantitative GHG emissions and is the first to analyze these disclosures in the context of stakeholder pressure and its relation to legitimation.
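As an illustration of the kind of analysis described above, here is a minimal sketch of a logistic regression on a binary disclosure indicator; the data, variable names and pressure proxies are hypothetical stand-ins, not the authors' dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical firm-level data: the binary outcome is whether a firm
# discloses GHG emissions; regressors proxy for stakeholder pressure.
rng = np.random.default_rng(1)
n = 431
df = pd.DataFrame({
    "discloses": rng.integers(0, 2, n),
    "env_media_coverage": rng.normal(size=n),   # proxy: media pressure
    "regulated_sector": rng.integers(0, 2, n),  # proxy: regulatory pressure
    "log_assets": rng.normal(10, 1, n),         # control: firm size
})
X = sm.add_constant(df[["env_media_coverage", "regulated_sector", "log_assets"]])
model = sm.Logit(df["discloses"], X).fit(disp=False)
print(model.summary())
```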
Abstract:
Small and medium-sized enterprises (SMEs) play an important role in the European economy. A critical challenge faced by SME leaders, as a consequence of the continuing digital technology revolution, is how to optimally align business strategy with digital technology so as to fully leverage the potential these technologies offer in pursuit of longevity and growth. There is a paucity of empirical research examining how e-leadership in SMEs drives successful alignment between business strategy and digital technology, fostering longevity and growth. To address this gap, in this paper we develop an empirically derived e-leadership model. We first develop a theoretical model of e-leadership drawing on strategic alignment theory. This provides a theoretical foundation for how SMEs can harness digital technology in support of their business strategy, enabling sustainable growth. An in-depth empirical study was then undertaken, interviewing 42 successful European SME leaders to validate, advance and substantiate the theoretically driven model. The outcome of this two-stage process – deductive development of a theoretically driven e-leadership model and inductive advancement toward a complete model through in-depth interviews with successful European SME leaders – is an e-leadership model with specific constructs fostering effective strategic alignment. The resulting diagnostic model enables SME decision makers to exercise effective e-leadership by creating productive alignment between business strategy and digital technology, improving longevity and growth prospects.
Abstract:
In search of better ways to deliver knowledge, traditional universities have expanded their delivery channels and integrated cost-effective e-learning systems. Universities' use of information and communication technologies has grown tremendously over the last decade. The Arab Open University (AOU) in Bahrain was the first institution there to adopt an e-learning system; to ensure the system is used efficiently, this study aimed to evaluate good and bad practices, detect errors and identify areas for further improvement in usage. The study critically evaluated students' perception of the e-learning system in Bahrain and recommended changes to improve students' e-learning usage. The results indicated that, in general, students have favourable perceptions toward using the e-learning system. The study also showed that technology acceptance is the most important factor contributing to students' perception of, and satisfaction with, the e-learning system.
Abstract:
The matrix-tolerance hypothesis suggests that the species most abundant in the inter-habitat matrix are less vulnerable to fragmentation of their habitat. This model was tested with leaf-litter frogs in the Atlantic Forest, where the fragmentation process is older and more severe than in the Amazon, where the model was first developed. Frog abundance data from the agricultural matrix, forest fragments and continuous forest localities were used. We found the expected negative correlation between the abundance of frogs in the matrix and their vulnerability to fragmentation; however, results varied with fragment size and species traits. Smaller fragments exhibited a stronger matrix-vulnerability correlation than intermediate fragments, while no significant relationship was observed for large fragments. Moreover, some species that avoid the matrix were not sensitive to a decrease in patch size, and the opposite was also true, indicating significant departures from what the model predicts. Most of the species that use the matrix were forest species with aquatic larval development, but these species do not necessarily respond to fragmentation or fragment size, and thus they strongly affect the strength of the expected relationship. The main relationship predicted by the matrix-tolerance hypothesis was therefore observed in the Atlantic Forest; however, we note that the predictions of this hypothesis can be substantially affected by fragment size and by species traits. We propose that the matrix-tolerance model should be broadened into a more effective model that includes other patch characteristics, particularly fragment size, and individual species traits (e.g., reproductive mode and habitat preference).
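The core test described here is a rank correlation between species' abundance in the matrix and their vulnerability to fragmentation. A minimal sketch follows, under simulated data (the species counts and vulnerability index are hypothetical, not the study's):

```python
import numpy as np
from scipy.stats import spearmanr

# Correlate each species' abundance in the agricultural matrix with an
# index of its vulnerability to fragmentation (e.g., decline in abundance
# from continuous forest to fragments).  Data here are simulated.
rng = np.random.default_rng(5)
matrix_abundance = rng.poisson(5, size=30).astype(float)       # 30 species
vulnerability = -0.5 * matrix_abundance + rng.normal(0, 2, 30)

rho, p = spearmanr(matrix_abundance, vulnerability)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # expect a negative rho
```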
Abstract:
Universal properties of the Coulomb interaction energy apply to all many-electron systems. Bounds on the exchange-correlation energy, in particular, are important for the construction of improved density functionals. Here we investigate one such universal property, the Lieb-Oxford lower bound, for ionic and molecular systems. In recent work [J. Chem. Phys. 127, 054106 (2007)], we observed that for atoms and electron liquids this bound may be substantially tightened. Calculations for a few ions and molecules suggested the same tendency, but were not conclusive due to the small number of systems considered. Here we extend that analysis to many different families of ions and molecules, and find that for these, too, the bound can be empirically tightened by a similar margin as for atoms and electron liquids. Tightening the Lieb-Oxford bound will have consequences for the performance of various approximate exchange-correlation functionals.
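For reference, the bound discussed here is conventionally written as below; the constant C <= 2.273 is the value established by Lieb and Oxford, and the empirical tightening studied in the paper concerns how much smaller an effective constant suffices for real systems.

```latex
% Lieb-Oxford lower bound on the exchange-correlation energy of a
% many-electron system with density n(r).
E_{\mathrm{xc}}[n] \;\ge\; -C \int n(\mathbf{r})^{4/3} \, \mathrm{d}^3 r,
\qquad C \le 2.273 .
```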
Exact penalties for variational inequalities with applications to nonlinear complementarity problems
Abstract:
In this paper, we present a new reformulation of the KKT system associated with a variational inequality as a semismooth equation. The reformulation is derived from the concept of differentiable exact penalties for nonlinear programming. The best theoretical results are presented for nonlinear complementarity problems, where simple, verifiable conditions ensure that the penalty is exact. We close the paper with some preliminary computational tests on the use of a semismooth Newton method to solve the equation derived from the new reformulation. We also compare its performance with that of the Newton method applied to classical reformulations based on the Fischer-Burmeister function and on the minimum function. The new reformulation combines the best features of the classical ones, being as easy to solve as the reformulation that uses the Fischer-Burmeister function while requiring as few Newton steps as the one based on the minimum.
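As background to the classical reformulations the paper compares against, here is a minimal sketch (not the paper's exact-penalty method) of a Newton iteration on the Fischer-Burmeister reformulation of a small linear complementarity problem; the smoothing constant and the toy problem are illustrative assumptions.

```python
import numpy as np

EPS = 1e-12  # tiny smoothing so the kink at (0, 0) has a classical Jacobian

def fischer_burmeister(a, b):
    """phi(a, b) = sqrt(a^2 + b^2) - a - b; phi = 0 iff a >= 0, b >= 0, a*b = 0."""
    return np.sqrt(a**2 + b**2 + EPS) - a - b

def solve_lcp_fb(M, q, tol=1e-10, max_iter=50):
    """Newton's method on the Fischer-Burmeister reformulation of the LCP:
    find x >= 0 with Mx + q >= 0 and x'(Mx + q) = 0."""
    x = np.zeros(len(q))
    for _ in range(max_iter):
        F = M @ x + q
        Phi = fischer_burmeister(x, F)
        if np.linalg.norm(Phi) < tol:
            break
        r = np.sqrt(x**2 + F**2 + EPS)
        # Jacobian of Phi(x): diag(x/r - 1) + diag(F/r - 1) @ M
        J = np.diag(x / r - 1.0) + np.diag(F / r - 1.0) @ M
        x = x + np.linalg.solve(J, -Phi)
    return x

M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, 1.0])
x = solve_lcp_fb(M, q)           # expected solution: x = (0.5, 0)
print(x, M @ x + q)              # x >= 0, Mx + q >= 0, complementary
```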
Abstract:
Predictors of random effects are usually based on the popular mixed effects (ME) model, developed under the assumption that the sample is obtained from a conceptually infinite population; such predictors are employed even when the actual population is finite. Two alternatives that incorporate the finite nature of the population are obtained from the superpopulation model proposed by Scott and Smith (1969, Estimation in multi-stage surveys, J. Amer. Statist. Assoc. 64, 830-840) or from the finite population mixed model recently proposed by Stanek and Singer (2004, Predicting random effects from finite population clustered samples with response error, J. Amer. Statist. Assoc. 99, 1119-1130). Predictors derived under the latter model, with the additional assumptions that all variance components are known and that within-cluster variances are equal, have smaller mean squared error (MSE) than the competitors based on either the ME or Scott and Smith's models. As population variances are rarely known, we propose method of moments estimators to obtain empirical predictors and conduct a simulation study to evaluate their performance. The results suggest that the finite population mixed model empirical predictor is more stable than its competitors since, in terms of MSE, it is either the best or the second best, and when second best, its performance lies within acceptable limits. When both cluster and unit intra-class correlation coefficients are very high (e.g., 0.95 or more), the performance of the empirical predictors derived under the three models is similar.
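To make the method of moments step concrete, the sketch below shows the textbook ANOVA estimators of the between- and within-cluster variance components for a balanced one-way layout; this illustrates the general idea only, not the authors' exact estimators, and the data are simulated.

```python
import numpy as np

def anova_variance_components(y, cluster):
    """Method-of-moments (ANOVA) estimates for a balanced one-way layout
    y_ij = mu + b_i + e_ij.  Returns (sigma2_between, sigma2_within)."""
    clusters = np.unique(cluster)
    k = len(clusters)
    m = len(y) // k                       # common cluster size (balanced)
    grand = y.mean()
    means = np.array([y[cluster == c].mean() for c in clusters])
    ssb = m * ((means - grand) ** 2).sum()
    ssw = sum(((y[cluster == c] - mu_c) ** 2).sum()
              for c, mu_c in zip(clusters, means))
    msw = ssw / (k * (m - 1))             # within mean square
    msb = ssb / (k - 1)                   # between mean square
    return max((msb - msw) / m, 0.0), msw # truncate negative estimates at 0

# Hypothetical balanced clustered sample: 20 clusters of size 5.
rng = np.random.default_rng(2)
k, m = 20, 5
cluster = np.arange(k).repeat(m)
y = 10 + rng.normal(0, 2.0, k).repeat(m) + rng.normal(0, 1.0, k * m)
print(anova_variance_components(y, cluster))
```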
Abstract:
Gross domestic product (GDP) is generally considered the most important index and the most comprehensive measure of the size of an economy. This paper empirically investigates the relationship between transport infrastructure (with a focus on highways) and GDP growth based on a production function approach. Physical stocks of transport infrastructure were used instead of monetary data to measure public capital, together with several other variables (labor and private capital) hypothesized to affect economic growth. The model was then estimated on panel data covering the period 1992 to 2004, and a comparison was made between developed and developing countries. Results indicate that physical infrastructure units are positively and significantly related to economic growth. Furthermore, an interesting finding is that the output elasticity with respect to physical units is higher for developed countries than for developing countries.
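A log-linearised production function of the kind described here takes the following form (an illustrative specification with assumed variable names, not necessarily the paper's exact model); the coefficient on the infrastructure stock is the output elasticity compared across country groups.

```latex
% Y = GDP, K = private capital, L = labour, I = physical stock of
% transport infrastructure, for country i in year t.
\ln Y_{it} = \alpha + \beta_K \ln K_{it} + \beta_L \ln L_{it}
           + \beta_I \ln I_{it} + \varepsilon_{it}
```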
Abstract:
This paper empirically investigates the impact of inward FDI (foreign direct investment) on China's international trade at the country level using panel data from 1984 to 2007. Two separate transformed models, based on the gravity equation and on the econometric models of some previous studies, are used to estimate the effect of FDI inflows on exports and imports respectively. The estimation results confirm a complementary relationship between FDI inflows and China's trade, for both exports and imports, which has also been supported by previous empirical studies.
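A gravity-type export equation of the sort such studies build on looks as follows (an illustrative specification under assumed notation, not the paper's exact model):

```latex
% X_jt = China's exports to partner j in year t; D_j = distance to
% partner j; FDI_t = inward FDI; the import equation is analogous.
\ln X_{jt} = \beta_0 + \beta_1 \ln(\mathit{GDP}^{\mathrm{CN}}_{t}\,\mathit{GDP}_{jt})
           + \beta_2 \ln D_{j} + \beta_3 \ln \mathit{FDI}_{t}
           + \varepsilon_{jt}
```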
Abstract:
The increase in foreign students in countries such as the US, the UK and France suggests that the international ‘education industry’ is growing in importance. The purpose of this paper is to investigate the empirical determinants of international student mobility. A secondary purpose is to give tentative policy suggestions to host and source countries, and also to provide some recommendations to students who want to study abroad. Using pooled cross-sectional time series data for the US over the period 1993-2006, we estimate an econometric model of enrolment rates of foreign students in the US. Our results suggest that tuition fees, US federal support of education, and the size of the ‘young’ generation in source countries have a significant influence on international student mobility. We also consider other factors that may be relevant in this context.
Abstract:
The aim of the study was to see whether any relationship between government spending and unemployment could be found empirically. To test whether government spending affects unemployment, a statistical model was applied to data from Sweden. The data were quarterly, from 1994 until 2012; unit-root tests were conducted and the variables were transformed to first differences to ensure stationarity. This transformation turned the variables into growth rates, which meant that the interpretation deviated slightly from the original goal. Other studies reviewed indicate that when government spending increases and/or taxes decrease, output increases. Studies show that unemployment decreases when the government spending/GDP ratio increases. Some studies also indicated that, with an already large government sector, increasing spending could have a negative effect on output. The model was a VAR model with unemployment, output, interest rate, taxes and government spending. Also included in the model were a linear trend and three quarterly dummies. The model used 7 lags. The results were not statistically significant for most lags but indicated that, as the government spending growth rate increases, holding everything else constant, the unemployment growth rate increases. The result for taxes was even less statistically significant and indicates no relationship with unemployment. Post-estimation tests indicated problems with non-normality in the model, so the results should be interpreted with some scepticism.
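A minimal sketch of a VAR of the kind described, using statsmodels; the series are simulated stand-ins for the Swedish data, and the deterministic trend and quarterly dummies used in the thesis are omitted for brevity.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated quarterly levels standing in for the Swedish series,
# 1994Q1-2012Q4 (76 quarters).
rng = np.random.default_rng(3)
idx = pd.date_range("1994-01-01", periods=76, freq="QS")
levels = pd.DataFrame(
    rng.normal(size=(76, 5)).cumsum(axis=0), index=idx,
    columns=["unemployment", "output", "interest", "taxes", "gov_spending"],
)

# First-difference to growth rates, as in the study, for stationarity.
growth = levels.diff().dropna()

res = VAR(growth).fit(7)  # 7 lags, as in the thesis
print(res.summary())
```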
Abstract:
Combinatorial optimization problems are one of the most important types of problems in operational research. Heuristic and metaheuristic algorithms are widely applied to find good solutions. However, a common problem is that these algorithms do not guarantee that the solution coincides with the optimum; hence, many solutions to real-world OR problems are afflicted with uncertainty about their quality. The main aim of this thesis is to investigate the usability of statistical bounds for evaluating the quality of heuristic solutions to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds perform well in providing informative quality assessments under appropriate parameter settings, and they outperform the commonly used Lagrangian bounds. It is also demonstrated that the statistical bounds are comparable with deterministic bounds in quadratic assignment problems. As to the empirical research: environmental pollution has become a worldwide problem, and transportation causes a great amount of pollution. A new method for calculating and comparing the CO2 emissions of online and brick-and-mortar retailing is proposed; it leads to the conclusion that online retailing has significantly lower CO2 emissions. Another problem is that the Swedish regional division is under revision, and the border effect on public service accessibility concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services, and consequently the highest achievable economic and social utility may not be attained.
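For intuition about statistical bounds of this type, here is a sketch of the classic extreme-value approach (in the spirit of Golden and Alt): objective values from independent heuristic restarts are fitted with a Weibull distribution whose location parameter estimates the unknown optimum. The heuristic values here are simulated rather than produced by an actual p-median solver, and the distributional parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

# Treat the objective values of independent heuristic restarts as draws
# whose lower endpoint is the unknown optimum; the fitted Weibull
# location parameter then serves as a statistical estimate of that
# endpoint.  In practice the values would come from repeated heuristic
# runs on, e.g., a p-median instance.
rng = np.random.default_rng(4)
true_optimum = 1000.0
heuristic_values = true_optimum + weibull_min.rvs(
    1.5, scale=25.0, size=40, random_state=rng)

shape, loc, scale = weibull_min.fit(heuristic_values)
print(f"estimated statistical bound (location): {loc:.1f}")
print(f"best heuristic value found:             {heuristic_values.min():.1f}")
```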
Abstract:
Background: Constructive alignment (CA) is a pedagogical approach that emphasizes alignment between the intended learning outcomes (ILOs), teaching and learning activities (TLAs) and assessment tasks (ATs), as well as the creation of a teaching/learning environment in which students can actively create their knowledge. Objectives: This paper aims to investigate the extent to which courses in the Computer Engineering and Informatics departments at Dalarna University, Sweden are constructively aligned. The study is based on empirical observations of teachers' perceptions of the implementation of CA in their courses. Methods: Ten teachers (five from each department) were asked to fill in a paper-based questionnaire, which included a number of questions related to issues in implementing CA in courses. Results: Responses to the items of the questionnaire were mixed. Teachers clearly state the ILOs in their courses and try to align the TLAs and ATs to the ILOs. Computer Engineering teachers do not communicate the ILOs to the students as explicitly as Informatics teachers do. In addition, Computer Engineering teachers stated that their students are less active in learning activities than those of Informatics teachers. When asked for their subjective ratings of their teaching methods, all teachers stated that their current teaching is teacher-centered but that they try to shift the focus of activity from themselves to the students. Conclusions: From the teachers' perspective, the courses are partially constructively aligned. The courses are “aligned”, i.e. ILOs, TLAs and ATs are aligned with each other, but they are not “constructive” since, according to the teachers, student engagement in learning activities was low, especially in the Computer Engineering department.