834 results for use value


Relevance: 30.00%

Abstract:

For more than a decade, Value-at-Risk (VaR) has been used by financial institutions and non-financial corporations to control the market risk of investment portfolios. Because parametric methods assume that the returns of the market risk factors are normally distributed, some risk managers prefer historical simulation methods to compute portfolio VaR. The main criticism of traditional historical simulation, however, is that it assigns the same weight in the distribution to every return observed in the period. This work tests the historical simulation model with volatility updating proposed by Hull and White (1998) on Brazilian stock market data and compares its performance with that of the traditional model. The results show that the Hull and White model performs better in forecasting portfolio losses and adapts faster to periods of breaks in market volatility.
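To make the mechanics concrete, the sketch below implements the volatility-updated historical simulation idea in Python: each past return is rescaled by the ratio of the current volatility estimate to the volatility prevailing when it was observed, and the VaR is read off the empirical quantile of the rescaled returns. The EWMA volatility estimator, the 0.94 decay factor, and the simulated return series are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def ewma_vol(returns, lam=0.94):
    """Exponentially weighted volatility estimate for each date (RiskMetrics-style)."""
    var = np.empty_like(returns)
    var[0] = returns[0] ** 2              # seed with the first squared return
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(var)

def hw_var(returns, alpha=0.01, lam=0.94):
    """Historical-simulation VaR with Hull-White (1998)-style volatility updating."""
    vol = ewma_vol(returns, lam)
    scaled = returns * vol[-1] / vol      # rescale each past return to current volatility
    return -np.quantile(scaled, alpha)    # loss quantile, reported as a positive number

# Illustrative use on simulated daily returns (placeholder for real portfolio data)
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.02, 500)
print(f"1-day 99% VaR: {hw_var(r, alpha=0.01):.4f}")
```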

Relevance: 30.00%

Abstract:

It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before conducting additional tests on parameter values. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models that allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test to investigate the presence of reduced rank models. Their performance is evaluated in a Monte-Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
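For reference, a compact statement of the kind of present-value relationship being tested, in a standard Campbell-Shiller-style parameterization; the symbols θ and λ are generic placeholders rather than the paper's exact notation:

```latex
% Present-value model linking Y_t (e.g., an asset price) to expected future y_t (e.g., dividends):
Y_t \;=\; \theta\,(1-\lambda)\sum_{i=0}^{\infty}\lambda^{i}\,\mathbb{E}_t\,y_{t+i},
\qquad 0<\lambda<1 .
% Defining the spread S_t = Y_t - \theta y_t, the model implies both cointegration
% (S_t is stationary when Y_t and y_t are I(1)) and an orthogonality condition,
%   \mathbb{E}_t\!\left[S_{t+1} - \lambda^{-1}S_t + \theta\,\Delta y_{t+1}\right] = 0 ,
% i.e., the model's forecast error is uncorrelated with the date-t information set,
% which is the kind of moment condition a GMM test can exploit.
```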

Relevance: 30.00%

Abstract:

The value of life methodology has been recently applied to a wide range of contexts as a means to evaluate welfare gains attributable to mortality reductions and health improvements. Yet, it suffers from an important methodological drawback: it does not incorporate into the analysis child mortality, individuals’ decisions regarding fertility, and their altruism towards offspring. Two interrelated dimensions of fertility choice are potentially essential in evaluating life expectancy and health related gains. First, child mortality rates can be very important in determining welfare in a context where individuals choose the number of children they have. Second, if altruism motivates fertility, life expectancy gains at any point in life have a twofold effect: they directly increase utility via increased survival probabilities, and they increase utility via increased welfare of the offspring. We develop a manageable way to deal with value of life valuations when fertility choices are endogenous and individuals are altruistic towards their offspring. We use the methodology developed in the paper to value the reductions in mortality rates experienced by the US between 1965 and 1995. The calculations show that, with a very conservative set of parameters, altruism and fertility can easily double the value of mortality reductions for a young adult, when compared to results obtained using the traditional value of life methodology.
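One way to see the "twofold effect" described above is a stylized age-recursion for lifetime welfare. This is a schematic illustration in the spirit of altruistic (Barro-Becker-type) preferences, not the paper's actual specification:

```latex
% Stylized value of an adult of age a, with consumption c_a, fertility n_a,
% survival probability p_a to age a+1, and altruism weight \phi(n_a):
V_a \;=\; u(c_a) \;+\; \beta\,p_a\,V_{a+1} \;+\; \phi(n_a)\,V_0^{\mathrm{child}} .
% A mortality reduction raises p_a directly (the usual value-of-life channel) and
% also raises V_0^{child} through the children's own survival gains, which feeds
% back into V_a through the altruism term \phi(n_a) V_0^{child}.
```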

Relevance: 30.00%

Abstract:

In June 2014 Brazil hosted the FIFA World Cup, and in August 2016 Rio de Janeiro will host the Summer Olympics. These two major sporting events draw tens of thousands of air travelers through Brazil's airports, which are currently in the midst of a national modernization program to address years of infrastructure neglect and insufficient capacity. Raising Brazil's major airports to the standards air travelers experience at major airports elsewhere in the world is more than a matter of building or remodeling facilities; processes must also be examined and reworked to enhance traveler experience and satisfaction. This research paper examines the key interface between airports and airline passengers, the airport check-in procedures, according to how much value and waste is associated with them. In particular, the paper makes use of a value stream mapping construct for services proposed by Martins, Cantanhede, and Jardim (2010). The uniqueness of this construct is that it attributes to each activity a certain percentage and magnitude of value or waste, which can then be ordered and prioritized for improvement. Working against a fairly commonly expressed notion in Brazil that Brazil's airports are inferior to those of economically advanced countries, the paper examines Rio's two major airports, Galeão International and Santos Dumont, in comparison with Washington, D.C.'s Washington National and Dulles International airports. The paper seeks to accomplish three goals:
- Determine whether there are differences in airport passenger check-in procedures between U.S. and Brazilian airports in terms of passenger value
- Present options for Brazilian government or private sector authorities to consider adopting or implementing at Brazilian airports to maximize passenger value
- Validate the Martins et al. construct for use in evaluating airport check-in procedures
Observations and analysis proved surprising in that all airports and service providers follow essentially the same check-in processes but execute them differently, yet still achieve similar overall performance in terms of value and waste. Although only a few activities are categorized as completely wasteful (and therefore removed in the revised value stream map of check-in activities), the weighting and categorization of individual activities according to their value (or waste) gives decision-makers a means to prioritize possible corrective actions. Various overall recommendations are presented based on this analysis. Most importantly, this paper demonstrates the viability of using the construct developed by Martins et al. to examine airport operations, as well as its applicability to the study of other service industry processes.
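As an illustration of how a value/waste weighting can drive prioritization, the Python sketch below scores a hypothetical check-in value stream and ranks activities by their waste contribution. The activity names, times, and value shares are invented for the example, and the scoring is a simplified rendering of the general idea rather than the exact construct of Martins et al.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    minutes: float       # average time the passenger spends in the activity
    value_share: float   # fraction of the activity judged value-adding (0.0 to 1.0)

    @property
    def waste_minutes(self) -> float:
        return self.minutes * (1.0 - self.value_share)

# Hypothetical check-in value stream (names and numbers are illustrative only)
stream = [
    Activity("Queue for counter", 12.0, 0.0),
    Activity("Document check",     2.0, 0.8),
    Activity("Bag drop",           3.0, 0.9),
    Activity("Walk to security",   4.0, 0.3),
]

total = sum(a.minutes for a in stream)
waste = sum(a.waste_minutes for a in stream)
print(f"Total lead time: {total:.1f} min, of which waste: {waste:.1f} min "
      f"({100 * waste / total:.0f}%)")

# Rank activities by waste contribution to prioritize corrective actions
for a in sorted(stream, key=lambda a: a.waste_minutes, reverse=True):
    print(f"{a.name:20s} waste: {a.waste_minutes:5.1f} min")
```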

Relevance: 30.00%

Abstract:

The goal of this study is to propose the implementation of a statistical model for volatility estimation that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models commonly used for risk measurement. The parameters are estimated from daily Ibovespa quotes from January 2009 to December 2014, and the empirical accuracy of the models is assessed through out-of-sample tests comparing the VaR figures obtained for January through December 2014. Explanatory variables were introduced in an attempt to improve the models; the Dow Jones index, the American counterpart of the Ibovespa, was chosen because it exhibited properties such as high correlation, Granger causality, and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not work directly with the variance but with its reciprocal, the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of financial series, and the results favored its use; the model is therefore an efficient and parsimonious specification for estimating and forecasting volatility, since it has only one parameter to estimate, which represents a paradigm shift relative to conditional heteroskedasticity models.
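A minimal sketch of the single-parameter idea, assuming the gamma-discount recursion usually associated with local scale models: the precision (reciprocal variance) of the series is tracked with one discount factor ω, and a VaR is then read off the filtered variance under a normal approximation. The priors, the value of ω, and the normal quantile are simplifying assumptions, not the paper's exact filtering equations.

```python
import numpy as np
from scipy import stats

def local_scale_filter(returns, omega=0.95, a0=1.0, b0=1e-4):
    """Gamma-discount filter for the precision (1/variance) of a return series.

    a and b are the shape and rate of the gamma posterior for the precision;
    omega is the single discount parameter of the model.
    """
    a, b = a0, b0
    var_pred = np.empty(len(returns))
    for t, r in enumerate(returns):
        var_pred[t] = b / a            # one-step-ahead variance estimate
        a = omega * a + 0.5            # discount the old information,
        b = omega * b + 0.5 * r ** 2   # then update with the new observation
    return var_pred

# Illustrative 99% one-day VaR from the filtered variance (normal approximation)
rng = np.random.default_rng(1)
r = rng.standard_t(df=5, size=1000) * 0.01   # heavy-tailed simulated returns
var_pred = local_scale_filter(r)
z = stats.norm.ppf(0.01)
print(f"Last 99% VaR estimate: {-z * np.sqrt(var_pred[-1]):.4f}")
```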

Relevance: 30.00%

Abstract:

This paper examines the value of analysts' recommendations in the Brazilian stock market. We studied a sample of 294 weeks of recommendations made public by the best-selling newspaper in Brazil, using six different investment strategies and time horizons. The main conclusion is that it is possible to beat the Brazilian market indexes Ibovespa and IBrX by following the analysts' stock recommendations. The best strategies are buying only the recommended stocks and buying the recommended stocks whose difference between target and market prices is greater than 25% and less than or equal to 50%. The performance of the six strategies is analyzed through the use of bootstrap and Monte Carlo techniques.
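To illustrate the kind of bootstrap evaluation mentioned above, the sketch below resamples the weekly excess returns of a strategy over a benchmark and reports a one-sided bootstrap p-value. The simulated return series, sample size, and significance convention are placeholders, not the paper's data or exact test.

```python
import numpy as np

def bootstrap_excess_return(strategy, benchmark, n_boot=10_000, seed=0):
    """Bootstrap the mean weekly excess return of a strategy over a benchmark.

    Returns the point estimate and the fraction of resamples with a
    non-positive mean excess return (a one-sided bootstrap p-value).
    """
    rng = np.random.default_rng(seed)
    excess = np.asarray(strategy) - np.asarray(benchmark)
    means = np.array([
        rng.choice(excess, size=len(excess), replace=True).mean()
        for _ in range(n_boot)
    ])
    return excess.mean(), (means <= 0).mean()

# Illustrative use with simulated weekly returns (placeholders for real data)
rng = np.random.default_rng(42)
strategy  = rng.normal(0.004, 0.03, 294)   # 294 weeks, as in the sample studied
benchmark = rng.normal(0.002, 0.03, 294)
est, pval = bootstrap_excess_return(strategy, benchmark)
print(f"Mean weekly excess return: {est:.4f}, bootstrap p-value: {pval:.3f}")
```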

Relevance: 30.00%

Abstract:

A new paradigm is shaping the world: evolutionary innovations on all fronts, new information technologies, huge mobility of capital, the use of risky financial tools, the globalization of production, new emerging powers, and the impact of consumer concerns on governmental policies. These phenomena are reshaping the world and forcing the advent of a new world order in the multilateral monetary, financial, and trading system. The effects of this new paradigm are also transforming global governance. The political and economic orders established after the Second World War, centered on the multilateral model of the UN, IMF, World Bank, and GATT and led by the developed countries, are facing significant challenges. The rise of China and other emerging countries has shifted the old model toward a polycentric world, in which the governance of these organizations is challenged by emerging countries demanding a greater role on the decision-making boards of these international bodies. As a consequence, multilateralism is being confronted by polycentrism. Negotiations for a more representative voting process and the pressure for new rules to cope with the new demands are paralyzing important decisions. This scenario is seriously affecting not only the monetary and financial systems but also the multilateral trading system. International trade faces significant challenges: a serious deadlock in concluding the last round of multilateral negotiations at the WTO; the fragmentation of trade rules through the multiplication of preferential and mega agreements; the arrival of a new model of global production and trade led by global value chains, which threatens the old trade order; and the imposition of new sets of regulations, by private bodies commanded by transnational corporations to support global value chains and by non-governmental organizations to reflect the concerns of consumers in the North, based on their precautionary attitude toward the sustainability of products made around the world. The lack of any multilateral order in this new regulation is creating a cacophony of rules and developing a new regulatory war of the Global North against the Global South. The objective of this paper is to explore how these challenges are affecting the trading system and how it can evolve to manage these new trends.

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

In agricultural crops, weeds must be controlled so that they do not negatively affect the yield and quality of the harvested product. Thus, small numbers of weeds in a field are, in most cases, not a problem, except in seed production. In turfgrass, by contrast, there is no harvested production component. The value of a lawn lies in its inherent aesthetic quality and usability. Aesthetic quality is the beauty and value the turf adds to a managed landscape. Usability may be the durability of a sports field or the reduction in soil loss from water or wind erosion. The presence of any weed in turf can reduce its aesthetic quality and usability. While it is possible to reduce weed populations using cultural or mechanical practices, they cannot be completely eliminated. The use of herbicides is the only way to fully control weeds in turfgrass areas. This review covers the main herbicides used on turfgrass in the United States with respect to their modes of action, herbicide families, and primary uses on turf.

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Aim. Duplex scanning has been used in the evaluation of the aorta and proximal arteries of the lower extremities, but it has limitations in evaluating the arteries of the leg. The use of ultrasonographic contrast (USC) may help improve image quality in these arteries. The objective of the present study was to verify whether USC increases the diagnostic accuracy for patency of the leg arteries and whether it reduces the time needed to perform duplex scanning. Methods. Twenty patients with critical ischemia (20 lower extremities) were examined by standard duplex scanning, duplex scanning with contrast, and digital subtraction arteriography (DSA). The 3 arteries of the leg were each divided into 3 segments, for a total of 9 segments per limb. Each segment was evaluated for patency in order to compare the 3 diagnostic methods. Standard duplex scanning and duplex scanning with contrast were also compared in terms of the quality of the color-coded Doppler signal and of the spectral curve, and of the time needed to perform the exams. Results. Duplex scanning with contrast was similar to arteriography with respect to patency diagnosis (p>0.3) and even superior in some of the segments. Standard duplex scanning was inferior to arteriography and to duplex scanning with contrast (p<0.001). There were improvements of 70% in the intensity of the color-coded Doppler signal and of 76% in the spectral curve after the use of contrast. The time needed to perform the examinations was 23.7 minutes for standard duplex scanning and 16.9 minutes for duplex scanning with contrast (p<0.001). Conclusion. The use of ultrasonographic contrast increased the accuracy of the diagnosis of patency of the leg arteries and reduced the time needed to perform duplex scanning.

Relevance: 30.00%

Abstract:

Although cassava is native to Brazil, it is still underused, especially with regard to its aerial part. To study cassava's potential for animal feed, this work evaluated the characteristics of the plant's aerial part when subjected to ensiling and haymaking. The treatments were: aerial part ensiled without wilting (PAS); aerial part ensiled after 24 hours of wilting (PAE); and aerial part made into hay (PAF). Chemical analyses were carried out to evaluate the parameters that determine the nutritive value of the silage and the hay. Wilting raised the dry matter content from 25% in the fresh material to 27.7%, without changing the soluble carbohydrate content (33.3% and 35.5% of DM in PAS and PAE, respectively) or the buffering capacity (204 mmol kg-1 DM in PAS and 195 mmol kg-1 DM in PAE). Neither the pH (3.57 in the unwilted silage and 3.60 in PAE) nor the acid detergent insoluble nitrogen (ADIN) content (11.32% of total nitrogen in DM in PAS and 9.99% in PAE) differed between the silages, but ADIN was higher in the hay (15.39%). However, wilting increased ammonia nitrogen (from 6.5% of total nitrogen in DM in PAS to 13.0% in PAE). Volatile fatty acid contents were not affected by wilting. The ensiling process reduced free hydrocyanic acid (HCN) contents without, however, changing cyanohydrin levels.

Relevance: 30.00%

Abstract:

Mutilation of the extremities was very frequent in patients affected by leprosy in the past; although it is now much less common, it is still seen, mainly in patients with long-term disease. In general, mutilation of the nose and ears is caused by the bacillus, whereas mutilation of the hands and feet is a consequence of chronic trauma. Leprosy must be treated chronically, and any decision to interrupt therapy is based on laboratory tests and biopsy. Scintigraphy is a non-invasive procedure that could be of great value in determining disease activity. We studied eight patients (five males and three females, aged 64-73 years) who presented with mutilation of the nose (2), ear (1), feet (3), or foot and hand (2). Conventional three-phase bone scintigraphy (750 MBq) and X-ray examinations of the affected areas were performed in all patients. Bone scintigraphy was abnormal in four patients (the presence of bacilli was confirmed by biopsy in two of them) and normal in the other four. In all patients except the one with ear mutilation, radiography showed only the absence of bone. We conclude that bone scintigraphy is very useful for determining disease activity in cases of mutilation caused by leprosy. It seems to be superior to conventional radiography and may enable bone biopsies to be avoided.

Relevance: 30.00%

Abstract:

The modern approach to the development of new chemical entities against complex diseases, especially neglected endemic diseases such as tuberculosis and malaria, is based on the use of defined molecular targets. Among its advantages, this approach allows (i) the search for and identification of lead compounds with defined molecular mechanisms against a defined target (e.g., enzymes from defined pathways), (ii) the analysis of a great number of compounds with a favorable cost/benefit ratio, (iii) the development, even in the initial stages, of compounds with selective toxicity (the fundamental principle of chemotherapy), and (iv) the evaluation of plant extracts as well as of pure substances. The current use of such technology, unfortunately, is concentrated in developed countries, especially in large pharmaceutical companies. This fact contributes significantly to hampering the development of innovative new compounds to treat neglected diseases. The large biodiversity within the territory of Brazil puts the country in a strategic position to develop the rational and sustained exploration of new metabolites of therapeutic value. The country's territory covers a wide range of climates, soil types, and altitudes, providing a unique set of selective pressures for the adaptation of plant life in these scenarios. Chemical diversity is also driven by these forces, in an attempt to best fit the plant communities to the particular abiotic stresses, fauna, and microbes that co-exist with them. Certain areas of vegetation (Amazonian Forest, Atlantic Forest, Araucaria Forest, Cerrado-Brazilian Savanna, and Caatinga) are rich in species and types of environments that can be used to search for natural compounds active against tuberculosis, malaria, and chronic-degenerative diseases. The present review describes some strategies to search for natural compounds, whose choice can be based on ethnobotanical and chemotaxonomical studies, and to screen them for their ability to bind to immobilized drug targets and to inhibit their activities. Molecular cloning, gene knockout, protein expression and purification, N-terminal sequencing, and mass spectrometry are the methods of choice to provide homogeneous drug targets for immobilization by optimized chemical reactions. Plant extract preparation, fractionation of promising plant extracts, propagation protocols, and the definition of in planta studies to maximize product yield of the species producing active compounds have to be performed to provide a continuing supply of bioactive materials. Chemical characterization of natural compounds, determination of the mode of action by kinetics and spectroscopic methods (MS, X-ray, NMR), as well as in vitro and in vivo biological assays, chemical derivatization, and structure-activity relationships have to be carried out to provide a thorough body of knowledge on which to base the search for natural compounds or their derivatives with biological activity.