837 results for "Upkeep of assets"
Resumo:
The exploitation and manipulation of desire are among the chief hallmarks of consumer culture. In societies where this culture predominates, consumption appears as the criterion of humanization, and the meaning of life, the ethical-mythical core around which society organizes itself, is the pursuit of accumulating wealth in order to consume ever more. Some studies have demonstrated the sacred aspects of this culture, which has become a veritable religion of everyday life, with its devotions, spiritualities, myths and rites. Likewise, studies have been showing how this culture shapes pedagogical projects. These findings are not accidental, since religion and education are fundamental elements in the origin and maintenance of any human culture and society. They can, however, also be elements of transformation. Paulo Freire signaled his interest in creating a Pedagogy of Desire, understanding this theme to be of fundamental importance in the struggle to overcome social exclusion, but unfortunately he did not have time to formulate it. The work of René Girard reinforces the thesis that religion is a fundamental process for human societies, given its real function in the origin of culture. According to Girard, religion is humanity's educator in the process of humanization and socialization, and its most notable characteristic is precisely that of educating desire, which, owing to its mimetic nature, constantly generates violence. In research on the relations between Religion/Theology and Education, recent work has examined the theological and spiritual presuppositions of educational proposals. There are many points of convergence between Paulo Freire and René Girard, some of them complementary. The dialogue between these two authors proves highly fruitful for discussing the theme of desire in relation to spirituality and education.
This work is an attempt to gather elements that favor the elaboration of a Pedagogy of Desire based on the contributions of the Sciences of Religion. (AU)
Resumo:
The various theories of capital structure have attracted interest and motivated numerous studies on the subject, yet no consensus has been reached. Another apparently little-explored theme is the firm's life cycle and how it may influence capital structure. This study aimed to verify which determinants are most relevant to firms' indebtedness and whether these determinants change with the firm's life-cycle stage, drawing on the Trade-Off, Pecking Order and Agency theories. To that end, fixed-effects panel analysis was applied to a sample of Brazilian publicly traded companies, using secondary data available from Economática® for the period 2005 to 2013 and the BM&FBOVESPA sector classification. The main result is the same behavior across the full sample and the high- and low-growth groups for book leverage: the Profitability determinant shows a negative relation, while Growth Opportunity and Size show positive relations. For the high- and low-growth groups, some determinants produced different results: uniqueness was significant in both groups, positive in the low-growth group and negative in the high-growth group, while collateral value of assets and non-debt tax shield were significant only in the low-growth group. For market-value leverage, significance was observed for the non-debt tax shield and uniqueness. These results reinforce the argument that the life cycle influences capital structure.
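The fixed-effects panel estimation used in the study above can be illustrated with a minimal within-estimator: demean each variable inside each firm, then pool the demeaned observations and run OLS through the origin. The sketch below is illustrative only, with a single regressor and made-up data, not the study's actual Economática specification.

```python
def fixed_effects_slope(panel):
    """Within-estimator for y = a_i + b*x: demean x and y inside each
    firm (removing the firm fixed effect a_i), then pool and estimate
    the common slope b by OLS through the origin.
    panel: dict mapping firm id -> list of (x, y) observations."""
    xs, ys = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            xs.append(x - mx)
            ys.append(y - my)
    # OLS slope with no intercept on the demeaned data
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Two hypothetical firms with different intercepts but the same slope 2:
panel = {
    "A": [(1, 7), (2, 9), (3, 11)],   # y = 5 + 2x
    "B": [(0, -3), (1, -1), (2, 1)],  # y = -3 + 2x
}
print(fixed_effects_slope(panel))  # recovers the slope despite the firm effects
```

Because the within-transformation wipes out each firm's intercept, the estimator recovers the common slope even though pooled OLS with a single intercept would be biased here.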
Resumo:
This note presents a contingent-claims approach to strategic capacity planning. We develop models for capacity choice and expansion decisions in a single firm environment where investment is irreversible and demand is uncertain. These models illustrate specifically the relevance of path-dependent options analysis to planning capacity investments when the firm adopts demand tracking or average capacity strategies. It is argued that Asian/average type real options can explain hysteresis phenomena in addition to providing superior control of assets in place.
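The Asian/average-type option at the heart of the note can be sketched with a plain Monte Carlo pricer for an arithmetic-average call under geometric Brownian motion. This is a toy illustration under stated assumptions (all parameters hypothetical), not the note's full real-options model with irreversible capacity investment.

```python
import math
import random

def asian_call_mc(s0, k, r, sigma, t, steps, paths, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call.
    The payoff depends on the whole price path (path dependence),
    which is why such options suit average-capacity strategies."""
    rng = random.Random(seed)
    dt = t / steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    total_payoff = 0.0
    for _ in range(paths):
        s = s0
        path_sum = 0.0
        for _ in range(steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            path_sum += s
        average_price = path_sum / steps
        total_payoff += max(average_price - k, 0.0)
    return math.exp(-r * t) * total_payoff / paths

price = asian_call_mc(s0=100.0, k=100.0, r=0.05, sigma=0.2,
                      t=1.0, steps=50, paths=2000)
```

Because the payoff averages the price over the path, the effective volatility is lower than for a plain call on the terminal price, which is one intuition behind the smoother control of assets in place that the note attributes to average-type options.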
Resumo:
This research analyzes the factors underlying brand equity and purchase intention, examining the relationships between brand-equity components, the factors that influence them, and purchase intent, in a comparison centered on South Korea and the United Kingdom, in order to derive brand asset management plans. The study finds that information search and product knowledge have positive (+) effects, and that brand attitude and brand knowledge positively (+) affect brand loyalty and brand value. In the United Kingdom, brand value and brand loyalty positively (+) affect purchase intent, whereas in the other market no such effect is found.
Resumo:
In the EU as it stands today, we can speak only of a "transfer union", not of a single market. Euro-denominated money flows also transmit competitiveness in distorted form: the work/performance embodied in goods and physical assets and, above all, in financial instruments is mispriced. In such a framework, what we call the free-rider problem arises especially easily: one can consume without actual or measurable delivery of performance, or indeed without payment, and gain access to free resources too cheaply. The eurozone is also imperfect in many of its transmission mechanisms. Among the many member states squeezed into the sovereign-debt press there are small, medium-sized and large countries alike. This fact, together with the general growth and labor-market problems, clearly signals "systemic disturbances", which in this paper we call a performance transmission problem and which we therefore interpret with the help of an unusual but all the more telling analogy: an electric power transmission system. We show why a good large corporation is a better financial planner than a state of the same size. _____ Why are ill-defined transfer mechanisms diverting valuable assets and resources to the wrong destinations within the EU? Why do we witness ongoing pressure in the EU banking sector and in government finances? We offer an unusual answer to these questions: we apply an analogy from physics (an electricity generation and distribution network) to show the transmission inefficiency and waste, respectively, of the EU distribution mechanisms. We demonstrate that there are inherent flaws in both the measurement and the distribution of assets and resources among the key EU markets: goods, money and factor markets. In addition, we find that when an international equalizer mechanism is at work (cohesion funds are allocated), many of these equity functions are at risk with respect to their reliable measurement; the metered load factors are especially at risk, as are the loss/waste factors.
The map of desired outcomes does not match the real outcomes, since EU transfers are in general put to work with low efficiency.
Resumo:
Prior research has established that idiosyncratic volatility of the securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC - GARCH) model to account for the dynamics of the conditional covariance matrices that are employed. The stochastic aspects of the expected return of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time-series of the securities are fitted into a probability distribution that matches the time-series characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an increase in risk-return performance. Choosing the Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the comparison for benchmarking, the new greedy technique clearly outperforms others using a sample of the S&P500 and the Russell 1000 securities. 
The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
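A minimal sketch of the greedy idea, assuming a sample-variance objective rather than the dissertation's VaR criterion and IDCC-GARCH covariances: repeatedly hand a small slice of weight to whichever asset most improves the objective. The data and step size below are hypothetical.

```python
def greedy_weights(returns, step=0.05):
    """Greedy portfolio allocation: in each round, give `step` of
    weight to whichever asset most reduces the portfolio's sample
    variance, until the weights sum to one.
    returns: list of rows, one per period, each row one return per asset."""
    n_assets = len(returns[0])
    n_rounds = round(1.0 / step)
    weights = [0.0] * n_assets

    def variance(w):
        port = [sum(wi * ri for wi, ri in zip(w, row)) for row in returns]
        mean = sum(port) / len(port)
        return sum((x - mean) ** 2 for x in port) / len(port)

    for _ in range(n_rounds):
        best_j, best_var = None, None
        for j in range(n_assets):
            trial = weights[:]
            trial[j] += step
            v = variance(trial)
            if best_var is None or v < best_var:
                best_j, best_var = j, v
        weights[best_j] += step
    return weights
```

Each round is a local, myopic choice, which is what makes the approach cheap enough for large-scale optimizations; the dissertation's contribution lies in pairing such a scheme with simulated returns and dynamic covariance estimates.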
Resumo:
A plethora of recent literature on asset pricing provides plenty of empirical evidence on the importance of liquidity, governance and adverse selection of equity in the pricing of assets, together with more traditional factors such as market beta and the Fama-French factors. However, the literature has usually stressed that these factors are priced individually. In this dissertation we argue that these factors may be related to each other, so not only individual but also joint tests of their significance are called for. In three related essays, we examine the liquidity premium in the context of the finer three-digit SIC industry classification, the joint importance of liquidity and governance factors, and governance and adverse selection. Recent studies by Core, Guay and Rusticus (2006) and Ben-Rephael, Kadan and Wohl (2010) find that governance and liquidity premiums have dwindled in recent years. One reason could be that liquidity is very unevenly distributed across industries, which could affect the interpretation of prior liquidity studies. Thus, in the first chapter we analyze the relation between industry clustering and liquidity risk following the finer industry classification suggested by Johnson, Moorman and Sorescu (2009). In the second chapter, we examine the dwindling influence of the governance factor when taken simultaneously with liquidity. We argue that this happens because governance characteristics are potentially a proxy for information asymmetry that may be better captured by the market liquidity of a company's shares. Hence, we jointly examine both factors, governance and liquidity, in a series of standard asset pricing tests. Our results reconfirm the importance of governance and liquidity in explaining stock returns, independently corroborating the findings of Amihud (2002) and Gompers, Ishii and Metrick (2003). Moreover, governance is not subsumed by liquidity. 
Lastly, we analyze the relation between governance and adverse selection, and again corroborate previous findings of a priced governance factor. Furthermore, we ascertain the importance of microstructure measures in asset pricing by employing Huang and Stoll's (1997) method to extract an adverse selection variable and finding evidence for its explanatory power in four-factor regressions.
Resumo:
In a globalized society, the relations between heritage and tourism play out in an ambiguous reality, shaped between the interests of preservation and aspirations for economic benefit. On the one hand, the city, as a main generator of cultural offerings, needs to treat its heritage as a development axis, finding in the promotion of cultural tourism a strategy to support the high cost of recovering and maintaining its historical center and its cultural expressions. On the other hand, new demands arise, leading tourism projects to turn to the cultural factor in shaping their products, which allows municipalities to attract the growing cultural-tourism segment. From this perspective, this study takes a focused cross-section in analyzing Natal's Historical City Center, in order to understand how this neglected cultural heritage has been used by the municipal administration for tourism. Understanding heritage as a reference for identity and memory, as well as a cultural symbol of Natal society, and as an element surrounded by complex and strictly particular situations, a qualitative approach was identified as necessary for its deep understanding. The in-depth case study developed in two stages: first, bibliographical and documentary research; thereafter, the interpretation of data collected through semi-structured interviews with municipal administrators and local residents. The survey results show that the official representatives of heritage are concerned with preserving the material dimension of the city's architectural heritage but still cannot reach and sensitize the local population, which ought to take part in a process that should be democratic and should strengthen these people's sense of belonging. 
Finally, the study indicates an absence of revitalization strategies by the current municipal public administration for Natal's Historical City Center, revealing a discourse marked by a positivist interpretation of tourism, which treats the use of heritage assets within the scope of marketing empiricism.
Resumo:
I study the link between capital markets and sources of macroeconomic risk. In chapter 1 I show that expected inflation risk is priced in the cross section of stock returns even after controlling for cash flow growth and volatility risks. Motivated by this evidence, I study a long-run risk model with a built-in inflation non-neutrality channel that allows me to decompose the real stochastic discount factor into news about current and expected cash flow growth, news about expected inflation, and news about volatility. The model can successfully price a broad menu of assets and provides a setting for analyzing cross-sectional variation in the expected inflation risk premium. For industries like retail and durable goods, inflation risk can account for nearly a third of the overall risk premium, while the energy industry and a broad commodity index act like inflation hedges. Nominal bonds are exposed to expected inflation risk and have inflation premiums that increase with bond maturity. The price of expected inflation risk was very high during the 1970s and 1980s but has since come down, remaining very close to zero over the past decade. On average, the expected inflation price of risk is negative, consistent with the view that periods of high inflation represent a "bad" state of the world and are associated with low economic growth and poor stock market performance. In chapter 2 I look at the way capital markets react to predetermined macroeconomic announcements. I document significantly higher excess returns on the US stock market on macro release dates than on days when no macroeconomic news hits the market. Almost the entire equity premium since 1997 has been realized on days when macroeconomic news is released. At high frequency, there is a pattern of returns increasing in the hours prior to the predetermined announcement time, peaking around the time of the announcement, and dropping thereafter.
Resumo:
I explore and analyze the problem of finding socially optimal capital requirements for financial institutions, considering two distinct channels of contagion: direct exposures among the institutions, represented by a network, and fire-sale externalities, which reflect the negative price impact of massive liquidation of assets. These two channels amplify shocks from individual financial institutions to the financial system as a whole and thus increase the risk of joint defaults among interconnected financial institutions; this is often referred to as systemic risk. In the model, there is a trade-off between reducing systemic risk and raising the capital requirements of the financial institutions. The policymaker weighs this trade-off and determines the optimal capital requirements for individual financial institutions. I provide a method for finding and analyzing the optimal capital requirements that can be applied to arbitrary network structures and arbitrary distributions of investment returns.
In particular, I first consider a network model consisting only of direct exposures and show that the optimal capital requirements can be found by solving a stochastic linear programming problem. I then extend the analysis to financial networks with default costs and show that the optimal capital requirements can be found by solving a stochastic mixed-integer programming problem. The computational complexity of this problem poses a challenge, and I develop an iterative algorithm that can be executed efficiently. I show that the iterative algorithm leads to solutions that are nearly optimal by comparing it with lower bounds based on a dual approach, and that it converges to the optimal solution.
Finally, I incorporate fire-sale externalities into the model. In particular, I extend the analysis of systemic risk and the optimal capital requirements from a single illiquid asset to a model with multiple illiquid assets, which incorporates the liquidation rules used by the banks. I provide an optimization formulation whose solution yields the equilibrium payments for a given liquidation rule.
I further show that the socially optimal capital problem using the "socially optimal liquidation" and prioritized liquidation rules can be formulated as a convex problem and a convex mixed-integer problem, respectively. Finally, I illustrate the methodology on numerical examples and discuss some implications for capital regulation policy and stress testing.
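The equilibrium payments that such network models rest on can be sketched with a standard Eisenberg-Noe-style clearing vector, found here by fixed-point iteration. This is a toy illustration of the clearing mechanism only, not the dissertation's stochastic programming formulation, and the balance sheets below are made up.

```python
def clearing_payments(liabilities, external_assets, tol=1e-10, max_iter=1000):
    """Eisenberg-Noe-style clearing vector via fixed-point iteration.
    liabilities[i][j] = amount bank i owes bank j (toy figures).
    Each bank pays min(its total obligations, cash available),
    where cash = external assets + payments received from others."""
    n = len(liabilities)
    p_bar = [sum(row) for row in liabilities]  # total obligations per bank
    # relative liabilities: share of bank i's payments that go to bank j
    pi = [[liabilities[i][j] / p_bar[i] if p_bar[i] > 0 else 0.0
           for j in range(n)] for i in range(n)]
    p = p_bar[:]  # start from full payment and iterate downward
    for _ in range(max_iter):
        cash = [external_assets[i] + sum(p[k] * pi[k][i] for k in range(n))
                for i in range(n)]
        new_p = [min(p_bar[i], cash[i]) for i in range(n)]
        if max(abs(a - b) for a, b in zip(new_p, p)) < tol:
            return new_p
        p = new_p
    return p

# A owes B 10, B owes C 10; A's shortfall propagates down the chain:
payments = clearing_payments(
    liabilities=[[0, 10, 0], [0, 0, 10], [0, 0, 0]],
    external_assets=[4, 3, 0],
)
```

In the example, bank A can pay only 4 of its 10 obligation, so bank B receives 4, can pay only 7, and the shock is transmitted through the network exactly as the direct-exposure channel describes.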
Resumo:
Software assets are a key output of the RAGE project; applied game developers can use them to enhance the pedagogical and educational value of their games. These software assets cover a broad spectrum of functionalities, from player analytics (including emotion detection) to intelligent adaptation and social gamification. In order to facilitate integration and interoperability, all of these assets adhere to a common model, which describes their properties through a set of metadata. In this paper the RAGE asset model and asset metadata model are presented, capturing the detail of assets and their potential usage within three distinct dimensions: technological, gaming and pedagogical. The paper highlights key issues and challenges in constructing the RAGE asset and asset metadata models and details the process and design of a flexible metadata editor that facilitates both adaptation and improvement of the asset metadata model.
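As a rough illustration of such a metadata model, an asset record separated along the three dimensions might look like the sketch below. The field names and values are hypothetical, chosen for illustration; they are not the actual RAGE schema.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class AssetMetadata:
    """Hypothetical asset record split along the three dimensions
    the RAGE model distinguishes (illustrative field names only)."""
    name: str
    version: str
    technological: dict = field(default_factory=dict)  # e.g. platform, dependencies
    gaming: dict = field(default_factory=dict)         # e.g. genre, integration points
    pedagogical: dict = field(default_factory=dict)    # e.g. learning objectives

    def to_record(self):
        """Flatten to a plain dict, e.g. for a metadata editor or registry."""
        return asdict(self)

record = AssetMetadata(
    name="EmotionDetector",          # hypothetical asset
    version="0.1",
    technological={"platform": "any", "interface": "REST"},
    gaming={"genre": "any", "hook": "per-frame analysis"},
    pedagogical={"objective": "engagement measurement"},
).to_record()
```

Keeping the three dimensions as separate sub-structures is one way a common model can stay uniform across very different assets while a metadata editor adds or refines fields within each dimension independently.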
Resumo:
The definition of the boundaries of the firm is a subject that has long occupied organizational theorists, with the seminal work of Coase (1937) pointed to as the trigger for a theoretical evolution, with emphasis on governance structures, that led to a modern theory of incomplete contracts. Transaction Cost Economics (TCE) and Agency Theory arise within this evolution and are widely used in studies related to the theme. Empirically, data envelopment analysis (DEA) has established itself as a suitable tool for efficiency analysis. Although TCE argues that specific assets must be internalized, recent studies outside the theory's mainstream show that firms often decide, for various reasons, to contract them on the market. Research on transaction costs faces the unavailability of information and methodological difficulties in measuring its critical variables; further methodological deepening is still needed. The theoretical framework includes classic works of TCE and Agency Theory, but also more recent works, outside the TCE mainstream, which warn of strategies in the use of specific assets that are not necessarily aligned with the classical ideas of TCE. The Brazilian oil industry is the focus of this thesis, which aimed to evaluate the efficiency of contracts involving highly specific services outsourced by Petrobras. To this end, we categorized the outsourced services in terms of specificity and described the services of highest specificity. We then verified the existence of a relationship between service specificity and a number of variables, finding results divergent from those preached by the TCE mainstream. Next, we designed a DEA model to analyze efficiency in the use of onshore drilling rigs, identified as among the services of highest specificity. The next step was applying the model to evaluate the performance of drilling-rig contracts. 
Finally, we verified the existence of a relationship between contract efficiency and a number of variables, again finding results not consistent with the theoretical mainstream. Regarding the efficiency analysis of drilling-rig contracts, the model developed is compatible with the academic literature on drilling-rig efficiency. The efficiency results show a wide range of scores, from 31.79% to 100%, with a low sample average, and the model's results are consonant with the practices adopted by Petrobras. The results strengthen DEA as an important tool in efficiency studies, with potential for analyzing other types of contracts. In terms of theoretical findings, the results reinforce the argument that there are situations in which organizations' strategies, in the use of assets and services of high specificity, do not necessarily follow what the TCE mainstream recommends.
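DEA efficiency scores like those reported above can be illustrated in the simplest possible setting: with one input and one output under constant returns to scale (the CCR model), each unit's efficiency reduces to its output/input ratio relative to the best ratio in the sample. The data below are made up; the thesis's actual model uses multiple inputs and outputs, which requires solving a linear program per unit.

```python
def dea_ccr_efficiency(inputs, outputs):
    """CCR (constant-returns) DEA efficiency for the single-input,
    single-output case: each decision-making unit's productivity
    ratio scaled by the best ratio observed in the sample."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical drilling-rig contracts: (input cost, output drilled)
efficiency = dea_ccr_efficiency(inputs=[10, 20, 10], outputs=[5, 5, 10])
# Unit 3 defines the frontier; the others are measured against it.
```

With multiple inputs and outputs the same idea holds, but the "best ratio" becomes a frontier and each unit's score comes from an LP that searches for the most favorable input/output weights, which is what makes DEA suitable for contracts with heterogeneous resource profiles.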
Resumo:
The current paper aims to analyze customer retention in Internet provider services. For this study, we sought to understand clients' expectations regarding the available services and to compare them with management's perception of the use of those services. By identifying the level of coherence between the two points of view, management's and the client's, it is possible to pinpoint how the service is assessed under real conditions. From that point on, a new vision of the available services can be implemented, and new customer service strategies, aimed at best serving clients' expectations and needs, can be rethought. Exploratory research was used, based on a case study combining quantitative and qualitative methods. The quantitative stage applied the cluster technique with six control variables derived from the six main services, which were defined through a qualitative survey of the internal management team; structured interviews were then conducted with 443 clients from a probabilistic sample of 800 customers, out of the provider's 10,677 active clients. Client perception of the services varied: compared with the four services under the managerial metric, the evaluation was more positive than the actual use of the service. Thus, it was observed that the value of each service available to the client depends on his or her perception of it, regardless of whether the offered service is used. As a result, it is possible to understand which services offered by the company under study effectively contribute to a good client-company relationship and to the retention of those clients.
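The clustering step described above can be sketched with a plain k-means over six-dimensional score vectors, one per respondent. The routine and the scores below are illustrative only, not the survey's actual variables or clustering procedure.

```python
def kmeans(points, k, iters=50):
    """Plain k-means with deterministic initialization from the first
    k points: alternate assigning each respondent to the nearest
    center and moving each center to the mean of its members."""
    centers = [tuple(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        for i, p in enumerate(points):
            assign[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
        # update step: each center becomes the mean of its members
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centers[c] = tuple(sum(col) / len(members) for col in zip(*members))
    return assign, centers

# Hypothetical respondents: six service scores each, two clear profiles.
points = [(1, 1, 1, 1, 1, 1), (9, 9, 9, 9, 9, 9),
          (1, 2, 1, 2, 1, 2), (9, 8, 9, 8, 9, 8),
          (2, 1, 2, 1, 2, 1), (8, 9, 8, 9, 8, 9)]
labels, centers = kmeans(points, k=2)
```

Grouping respondents this way is what lets the study contrast, per cluster, the clients' assessment of each service with management's perception of its use.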
Resumo:
When a company wishes to invest in a project, it must obtain the resources needed to make the investment. The alternatives are to use the firm's internal resources or to obtain external resources through debt contracts and share issuance. Decisions involving the composition of internal resources, debt and shares in the total resources used to finance a company's activities relate to the choice of its capital structure. Although there are finance studies on the debt determinants of firms, the issue of capital structure remains controversial. This work sought to identify the predominant factors that determine the capital structure of Brazilian publicly traded, non-financial firms. A quantitative approach was used, applying the statistical technique of multiple linear regression on panel data. Estimates were made by ordinary least squares with a fixed-effects model. A total of 116 companies were selected for the research, covering the period from 2003 to 2007. The variables and hypotheses tested in this study were built on capital structure theories and empirical research. Results indicate that variables such as risk, size, asset composition and firm growth influence indebtedness. The profitability variable was not relevant to the composition of the indebtedness of the companies analyzed. However, analyzing only long-term debt, the conclusion is that the relevant variables are firm size and, especially, asset composition (tangibility): the smaller the firm, or the greater the share of fixed assets in total assets, the greater its propensity for long-term debt. Furthermore, this research could not identify a predominant theory to explain the capital structure of Brazilian firms.