966 results for Expected revenue
Abstract:
Innovation continues to be high on the agenda in construction. It is widely considered to be an essential prerequisite of improved performance both for the sector at large and for individual firms. Success stories dominate the parts of the academic literature that rely heavily on the recollections of key individuals. A complementary interpretation focuses on the way innovation champions in hindsight interpret, justify and legitimize the diffusion of innovations. Emphasis is put on the temporal dimension of interpretation and how this links to rhetorical strategies and impression management tactics. Rhetorical theories are drawn upon to analyse the accounts given by innovation champions in seven facilities management organizations. In particular, the three persuasive appeals in classic rhetoric are used to highlight the rhetorical justifications mobilized in the descriptions of what took place. The findings demonstrate the usefulness of rhetorical theories in complementing studies of innovation.
Abstract:
Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd.
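The abstract does not detail the EERM's internal algorithms. As a minimal, illustrative sketch of revenue-aware provisioning across multiple service level agreements (the greedy policy, class names, and numbers below are assumptions for illustration, not SORMA's actual design):

```python
# Illustrative sketch only: a greedy admission policy for SLA requests.
# The EERM described in the abstract is not documented here; this merely
# demonstrates the idea of revenue-aware resource provisioning.
from dataclasses import dataclass

@dataclass
class SlaRequest:
    name: str
    revenue: float   # payment if the SLA is admitted and honored
    capacity: float  # resource units the SLA reserves

def admit_requests(requests, total_capacity):
    """Greedily admit SLAs in descending revenue-per-capacity order."""
    admitted, remaining = [], total_capacity
    for req in sorted(requests, key=lambda r: r.revenue / r.capacity, reverse=True):
        if req.capacity <= remaining:
            admitted.append(req)
            remaining -= req.capacity
    return admitted

requests = [SlaRequest("A", 100.0, 40.0), SlaRequest("B", 90.0, 20.0),
            SlaRequest("C", 50.0, 30.0)]
print([r.name for r in admit_requests(requests, 60.0)])  # ['B', 'A']
```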
Abstract:
Where joint forest management has been introduced into Tanzania, ‘volunteer’ patrollers take responsibility for enforcing restrictions over the harvesting of forest resources, often receiving as an incentive a share of the collected fine revenue. Using an optimal enforcement model, we explore how that share, and whether villagers have alternative sources of forest products, determine the effort patrollers put into enforcement and whether they choose to take a bribe rather than honestly reporting the illegal collection of forest resources. Without funds for paying and monitoring patrollers, policy makers face trade-offs over illegal extraction, forest protection and revenue generation through fine collection.
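The abstract does not reproduce the enforcement model itself; the following is a minimal sketch of the incentive structure it describes, with all notation assumed here rather than taken from the paper. A patroller who receives share s of fine F and detects violations with probability p(e) at effort cost c(e) solves:

```latex
% Illustrative notation only; not the paper's actual specification.
\max_{e}\; s\,F\,p(e) - c(e)
\qquad\Longrightarrow\qquad
s\,F\,p'(e) = c'(e),
```

so effort rises with the fine share s. Facing a bribe offer b, honest reporting pays sF per detected violation, so under this sketch the patroller accepts the bribe whenever b > sF; a larger share both raises effort and makes bribery less attractive, consistent with the abstract's focus on how the share shapes both enforcement effort and honesty.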
Abstract:
This article applies FIMIX-PLS segmentation methodology to detect and explore unanticipated reactions to organisational strategy among stakeholder segments. For many large organisations today, the tendency to apply a “one-size-fits-all” strategy to members of a stakeholder population, commonly driven by a desire for simplicity, efficiency and fairness, may actually result in unanticipated consequences amongst specific subgroups within the target population. This study argues that it is critical for organisations to understand the varying and potentially harmful effects of strategic actions across differing, and previously unidentified, segments within a stakeholder population. The case of a European revenue service that currently focuses its strategic actions on building trust and compliant behaviour amongst taxpayers is used as the context for this study. FIMIX-PLS analysis is applied to a sample of 501 individual taxpayers, while a novel PLS-based approach for assessing measurement model invariance that can be applied to both reflective and formative measures is also introduced for the purpose of multi-group comparisons. The findings suggest that individual taxpayers can be split into two equal-sized segments with highly differentiated characteristics and reactions to organisational strategy and communications. Compliant behaviour in the first segment (n = 223), labelled “relationships centred on trust,” is mainly driven through positive service experiences and judgements of competence, while judgements of benevolence lead to the unanticipated reaction of increasing distrust among this group. Conversely, compliant behaviour in the second segment (n = 278), labelled “relationships centred on distrust,” is driven by the reduction of fear and scepticism towards the revenue service, which is achieved through signalling benevolence, reduced enforcement and the lower incidence of negative stories. In this segment, the use of enforcement has the unanticipated and counterproductive effect of ultimately reducing compliant behaviour.
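FIMIX-PLS itself is typically run in dedicated PLS software; the sketch below is only a rough illustration of the underlying idea (capturing unobserved heterogeneity by fitting a finite mixture and assigning respondents to latent segments), using a plain Gaussian mixture on synthetic scores. The data, dimensions, and two-segment choice are assumptions for illustration, not the study's data or method.

```python
# Rough illustration only: FIMIX-PLS fits a finite mixture within a PLS
# path model; here a plain Gaussian mixture stands in for that idea by
# splitting respondents into latent segments from survey-style scores.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-in for 501 taxpayers' latent-variable scores
# (e.g., trust, distrust, service experience); not the study's data.
segment_a = rng.normal(loc=[1.0, -0.5, 0.8], scale=0.4, size=(223, 3))
segment_b = rng.normal(loc=[-0.6, 1.0, -0.3], scale=0.4, size=(278, 3))
scores = np.vstack([segment_a, segment_b])

gmm = GaussianMixture(n_components=2, random_state=0).fit(scores)
labels = gmm.predict(scores)
print(np.bincount(labels))  # sizes of the two recovered segments
```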
Abstract:
In their comment on my 1990 article, Yeh, Suwanakul, and Mai extend my analysis-which focused attention exclusively on firm output-to allow for simultaneous endogeneity of price, aggregate output, and numbers of firms. They show that, with downward- sloping demand, industry output adjusts positively to revenue-neutral changes in the marginal rate of taxation. This result is significant for two reasons. First, we are more often interested in predictions about aggregate phenomena than we are in predictions about individual firms. Indeed, firm-level predictions are frequently irrefutable since firm data are often unavailable. Second, the authors derive their result under a set of conditions that appear to be more general than those invoked in my 1990 article. In particular, they circumvent the need to invoke specific assumptions about the nature of firms' aversions toward risk. I consider this a useful extension and I appreciate the careful scrutiny of my paper.
Abstract:
Upper air observations from radiosondes and microwave satellite instruments do not indicate any global warming during the last 19 years, contrary to surface measurements, where a warming trend has supposedly been found. This is somewhat difficult to reconcile, since climate model experiments indicate the reverse trend, namely, that upper tropospheric air should warm faster than the surface. To contribute toward an understanding of this difficulty, we have here undertaken specific experiments to study the effect on climate of the decrease in stratospheric ozone and the Mount Pinatubo eruption in 1991. The associated forcing was added to the forcing from greenhouse gases, sulfate aerosols (direct and indirect effect), and tropospheric ozone, which was investigated in a separate series of experiments. Furthermore, we have undertaken an ensemble study in order to explore the natural variability of an advanced climate model exposed to such a forcing over 19 years. The results show that the reduction of stratospheric ozone cools not only the lower stratosphere but also the troposphere, in particular its upper and middle parts. In the upper troposphere the cooling from stratospheric ozone leads to a significant reduction of greenhouse warming. The modeled stratospheric aerosols from Mount Pinatubo generate a climate response (stratospheric warming and tropospheric cooling) in good agreement with microwave satellite measurements. Finally, analysis of a series of experiments with both the stratospheric ozone and the Mount Pinatubo effect shows considerable variability in climate response, suggesting that an evolution with no warming over the period is as likely as one showing modest warming. However, the observed combination of no warming in the midtroposphere and clear warming at the surface is not found in the model simulations.
Abstract:
Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
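As a minimal sketch of the probability integral transform (PIT) idea mentioned above, on simulated data rather than the SPF densities: if the forecast CDF is correctly specified, the PIT values should be i.i.d. uniform on [0, 1], which can be checked roughly with a Kolmogorov-Smirnov test.

```python
# Minimal sketch of a PIT-based density-forecast check on simulated data.
# If the forecast CDF F_t is correct, z_t = F_t(y_t) should be i.i.d.
# Uniform(0, 1); a KS test against the uniform gives a rough check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
mu_true, sigma_true = 2.0, 1.0            # data-generating process (assumed)
y = rng.normal(mu_true, sigma_true, n)    # realized outcomes (e.g., inflation)

# Forecaster's density: here deliberately well specified.
z = stats.norm.cdf(y, loc=mu_true, scale=sigma_true)  # PIT values

print(stats.kstest(z, "uniform"))  # large p-value: no evidence of miscalibration
```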
Abstract:
In probabilistic decision tasks, an expected value (EV) of a choice is calculated, and after the choice has been made, this can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is computed as the probability of obtaining a reward multiplied by the RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and with EV and confirmed in a conjunction analysis to extend toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive inputs from the orbitofrontal cortex, including the ventral striatum, midbrain, and inferior frontal gyrus, were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in basic and crucial decision-making parameters, low expected outcomes, and uncertainty.
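A minimal sketch of this bookkeeping (the task values and learning rate below are illustrative assumptions, not parameters from the study):

```python
# Minimal sketch of the expected-value / TD-error computation described
# above; all numbers are illustrative assumptions.
p_reward = 0.7           # probability that the chosen option pays off
reward_magnitude = 10.0  # RM delivered when the reward occurs
ev = p_reward * reward_magnitude  # expected value of the choice

outcome = 10.0           # reward actually obtained on this trial (0.0 if none)
td_error = outcome - ev  # temporal-difference prediction error

alpha = 0.1              # learning rate (assumed)
ev_updated = ev + alpha * td_error
print(ev, td_error, ev_updated)  # 7.0 3.0 7.3
```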
Abstract:
The aim of this work was to present alternatives to the traditional way of measuring market risk for Brazilian financial assets. We sought to cover as many of the risk factors present in Brazil as possible, using the main proxies for fixed-income instruments. In times of volatility, market risk management is heavily criticized for relying on models grounded in the normal distribution; herein lies both VaR's greatest contribution and the greatest criticism levelled at it. In addition, the Brazilian market is characterized by extreme illiquidity in the secondary market, even for certain types of federal government bonds. The first step was to survey the academic literature on the subject, in Brazil and abroad. To our surprise, little has been written in Brazil about stable distributions applied to financial markets, whether in risk management, option pricing, or portfolio management. We then selected the variables to be used, seeking to cover a large share of Brazilian financial assets, tested whether the normality assumption holds, and only then modelled the risk measures, VaR and ES, for the chosen assets. The theoretical and practical conditions were in place: market demand (criticism of the widely used Gaussian method), broad asset coverage (despite possible questions about liquidity), and academic experience and international knowledge (gathered through a detailed and careful review of the literature in the main outlets). We thus analysed four main approaches to computing risk measures, whether coherent (ES) or not (VaR). This work can serve as an initial input for more ambitious studies, for example those incorporating several assets into a portfolio of linear risks, or even assets with non-directional risk.
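As a minimal sketch of the two risk measures discussed, computed here non-parametrically on simulated heavy-tailed returns (the historical method and all numbers are illustrative assumptions; the thesis itself compares Gaussian and stable-distribution approaches):

```python
# Minimal sketch of historical (non-parametric) VaR and ES at 99%;
# returns are simulated here, standing in for Brazilian fixed-income proxies.
import numpy as np

rng = np.random.default_rng(2)
returns = rng.standard_t(df=3, size=5000) * 0.01  # heavy-tailed toy returns

alpha = 0.99
var = -np.quantile(returns, 1 - alpha)   # loss threshold, reported as positive
es = -returns[returns <= -var].mean()    # mean loss beyond the VaR threshold
print(f"VaR(99%) = {var:.4f}, ES(99%) = {es:.4f}")
```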
Abstract:
The present study evaluates the performance of the Delegacias da Receita Federal (Brazilian Federal Revenue offices) by establishing a parametric, cost-based efficiency frontier, using a stochastic model that splits the error term into two components, one random and the other arising from each unit's inefficiency. The work is based on data for the years 2006 and 2008 in a cross-sectional analysis and assesses the public policy of unifying the state bodies responsible for collecting taxes at the federal level, the Secretaria da Receita Federal (SRF) and the Secretaria da Receita Previdenciária (SRP), enacted by Law 11,457 of 16 March 2007. The main objective of the research is to determine whether the decentralized units of the Receita Federal, notably the Delegacias, are operating efficiently in the task of collecting taxes, given the resources made available for their activities. Among the agency's many activities, the output evaluated here is tax collection, which provides the State with the resources to implement public policies. The results indicate that regions with a large number of firms opting for the SIMPLES tax regime, as well as jurisdictions containing firms classified as DIFERENCIADAS because of their size, raise the costs of the Delegacias. Units located in state capitals improved their performance after the unification. Moreover, a higher proportion of tax auditors (Auditores Fiscais) relative to a Delegacia's total staff reduces inefficiency. The work aims to contribute to the evaluation of this new management model introduced in Brazil's federal tax administration.
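A standard way to write the composed-error cost frontier the abstract describes (the half-normal inefficiency term is an assumption for illustration; the abstract does not state the exact specification):

```latex
% Stochastic cost frontier with composed error; specification assumed.
\ln C_i = f(y_i, w_i; \beta) + v_i + u_i,
\qquad v_i \sim N(0, \sigma_v^2), \quad u_i \sim N^{+}(0, \sigma_u^2),
```

where C_i is the observed cost of unit i, v_i is symmetric random noise, and the nonnegative term u_i captures each Delegacia's cost inefficiency.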
Abstract:
This paper investigates the role of the consumption-wealth ratio in predicting future stock returns through a panel approach. We follow the theoretical framework proposed by Lettau and Ludvigson (2001), in which a model derived from a nonlinear consumer's budget constraint establishes the link between the consumption-wealth ratio and stock returns. Using the G7's quarterly aggregate and financial data from the first quarter of 1981 to the first quarter of 2014, we build an unbalanced panel that we use both for estimating the parameters of the cointegrating residual from the shared trend among consumption, asset wealth and labor income, cay, and for performing in-sample and out-of-sample forecasting regressions. Due to the panel structure, we propose methodologies for estimating cay and producing forecasts that differ from those applied by Lettau and Ludvigson (2001). The results indicate that cay is in fact a strong and robust predictor of future stock returns at intermediate and long horizons, but performs poorly in predicting one- or two-quarter-ahead stock returns.
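In that framework, cay is the estimated cointegrating residual among log consumption, asset wealth, and labor income; schematically (notation assumed here, following common usage for Lettau and Ludvigson (2001); exact estimator details are in the paper):

```latex
% cay as the estimated cointegrating residual (schematic notation):
\widehat{cay}_t = c_t - \hat{\beta}_a a_t - \hat{\beta}_y y_t,
```

where c_t, a_t, and y_t denote log consumption, asset wealth, and labor income, respectively.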
Abstract:
Using the theoretical framework of Lettau and Ludvigson (2001), we perform an empirical investigation of how widespread the predictability of cay (a modified consumption-wealth ratio) is once we consider a set of important countries from a global perspective. We chose to work with the G7 countries, which represent more than 64% of net global wealth and 46% of global GDP at market exchange rates. We evaluate the forecasting performance of cay using a panel-data approach, since applying cointegration and other time-series techniques is now standard practice in the panel-data literature. Hence, we generalize Lettau and Ludvigson's tests to a panel of important countries. We employ macroeconomic and financial quarterly data for the G7 countries, forming an unbalanced panel. For most countries, data are available from the early 1990s until 2014Q1, but for the U.S. economy they are available from 1981Q1 through 2014Q1. The results of an exhaustive empirical investigation are overwhelmingly in favor of the predictive power of cay in forecasting future stock returns and excess returns.
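A minimal sketch of the kind of long-horizon predictive regression evaluated here, estimated by plain OLS on simulated series (the study's panel estimators for the G7 are not reproduced; every name and number below is an illustrative assumption):

```python
# Minimal sketch of a predictive regression r_{t,t+h} = a + b*cay_t + e,
# estimated by OLS on simulated data; illustrative only.
import numpy as np

rng = np.random.default_rng(3)
T, h = 120, 4                          # quarters of data, forecast horizon
cay = rng.normal(size=T - h)           # predictor observed at time t
beta_true = 0.5
ret_fwd = beta_true * cay + rng.normal(scale=2.0, size=T - h)  # h-quarter-ahead return

X = np.column_stack([np.ones(T - h), cay])
coef, *_ = np.linalg.lstsq(X, ret_fwd, rcond=None)
print(f"alpha = {coef[0]:.3f}, beta = {coef[1]:.3f}")  # beta near 0.5
```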
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)