555 results for leverage
Abstract:
Terrestrial planets produce crusts as they differentiate. The Earth's bimodal crust, with a high-standing granitic continental crust and a low-standing basaltic oceanic crust, is unique in our solar system and links the evolution of the interior and exterior of this planet. Here I present geochemical observations that constrain processes accompanying crustal formation and evolution. My approach includes geochemical analyses, quantitative modeling, and experimental studies. The Archean crustal evolution project represents my perspective on when Earth's continental crust began forming. In this project, I utilized critical element ratios in sedimentary records to track the evolution of the MgO content of the upper continental crust as a function of time. The early Archean subaerial crust had >11 wt.% MgO, whereas by the end of the Archean its composition had evolved to about 4 wt.% MgO, suggesting a transition of the upper crust from a basalt-like to a more granite-like bulk composition. Driving this fundamental change in upper crustal composition is the widespread operation of subduction processes, suggesting the onset of global plate tectonics at ~3 Ga (Abstract figure). Three of the chapters in this dissertation leverage Eu anomalies to track the recycling of crustal materials back into the mantle; the Eu anomaly is a sensitive measure of the element's behavior relative to its neighboring lanthanoids (Sm and Gd) during crustal differentiation. My compilation of Sm-Eu-Gd data for the continental crust shows that the average crust has a net negative Eu anomaly. This result requires recycling of Eu-enriched lower continental crust to the mantle. Mass balance calculations require that about three times the mass of the modern continental crust was returned to the mantle over Earth history, possibly via density-driven recycling. High-precision measurements of Eu/Eu* in selected primitive glasses of mid-ocean ridge basalt (MORB) from global mid-ocean ridges, combined with numerical modeling, suggest that the recycled lower crustal materials are not found within the MORB source and may have at least partially sunk into the lower mantle, where they can be sampled by hotspot volcanoes. The Lesser Antilles Li isotope project provides insights into the Li systematics of this young island arc, a representative section of proto-continental crust. Martinique Island lavas, to my knowledge, represent the only clear case in which crustal Li is recycled back into their mantle source, as documented by the isotopically light Li in the Lesser Antilles sediments that feed the subduction trench. As a corollary, the mantle-like Li signal in global arc lavas is likely the result of broadly similar Li isotopic compositions between the upper mantle and bulk subducting sediments in most arcs. My PhD project on the mechanism of Li diffusion in zircon is being carried out in extensive collaboration with multiple institutes and employs analytical, experimental, and modeling studies. This ongoing project finds that REE and Y play an important role in controlling Li diffusion in natural zircons, with Li partially coupling to REE and Y to maintain charge balance. Access to state-of-the-art instrumentation presented critical opportunities to identify the mechanisms that cause elemental fractionation during laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) analysis.
My work here elucidates the elemental fractionation associated with plasma plume condensation during laser ablation and particle-ion conversion in the ICP.
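For reference, the Eu anomaly discussed above is conventionally computed as Eu/Eu* = Eu_N / sqrt(Sm_N x Gd_N), where the subscript N denotes chondrite-normalized concentrations. The sketch below illustrates this calculation in Python; the chondrite reference values and sample concentrations are illustrative assumptions, not the dissertation's data or exact normalization scheme.

```python
# A minimal sketch of the standard Eu-anomaly calculation, assuming
# chondrite-normalized concentrations. Reference and sample values here
# are illustrative only.
import math

# Commonly cited CI-chondrite reference concentrations (ppm).
CHONDRITE = {"Sm": 0.148, "Eu": 0.0563, "Gd": 0.199}

def eu_anomaly(sm_ppm: float, eu_ppm: float, gd_ppm: float) -> float:
    """Eu/Eu* = Eu_N / sqrt(Sm_N * Gd_N), X_N = chondrite-normalized X.

    Values < 1 indicate a negative Eu anomaly (Eu depletion), as reported
    for the average continental crust; values > 1 indicate Eu enrichment,
    as expected for plagioclase-rich lower crust.
    """
    sm_n = sm_ppm / CHONDRITE["Sm"]
    eu_n = eu_ppm / CHONDRITE["Eu"]
    gd_n = gd_ppm / CHONDRITE["Gd"]
    return eu_n / math.sqrt(sm_n * gd_n)

# Hypothetical upper-crust-like concentrations: gives ~0.7, a negative anomaly.
print(round(eu_anomaly(sm_ppm=4.7, eu_ppm=1.0, gd_ppm=4.0), 2))
```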
Abstract:
With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The growing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseen software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, the same is no longer true of software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared by a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select, from the set of all possible common subsequences, those with a high likelihood of containing the root cause. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both running time and output subsequence length.
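To make the core idea concrete, here is a toy sketch in Python: it folds a classic pairwise longest-common-subsequence computation over a set of hypothetical failing-test traces to obtain a candidate faulty subsequence. This is only an illustration under simplifying assumptions; the dissertation's algorithm and its optimizations are more efficient and more selective.

```python
# Toy illustration: intersect the execution traces of several failing tests
# to narrow down a candidate faulty subsequence, by folding a pairwise LCS
# over the set of traces. Trace contents are hypothetical.
from functools import reduce

def lcs(a: list, b: list) -> list:
    """Longest common subsequence of two traces via dynamic programming."""
    m, n = len(a), len(b)
    # dp[i][j] = length of LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Backtrack to recover one LCS.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical statement-level traces of three failing test cases.
traces = [
    ["open", "read", "parse", "validate", "close"],
    ["open", "seek", "read", "parse", "validate"],
    ["init", "open", "read", "parse", "validate", "log"],
]
print(reduce(lcs, traces))  # -> ['open', 'read', 'parse', 'validate']
```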
Abstract:
Peer-to-peer information sharing has fundamentally changed the customer decision-making process. Recent developments in information technologies have enabled digital sharing platforms to influence various granular aspects of the information sharing process. Despite the growing importance of digital information sharing, little research has examined the optimal design choices for a platform seeking to maximize returns from information sharing. My dissertation seeks to fill this gap. Specifically, I study novel interventions that can be implemented by the platform at different stages of the information sharing process. In collaboration with a leading for-profit platform and a non-profit platform, I conduct three large-scale field experiments to causally identify the impact of these interventions on customers' sharing behaviors as well as on sharing outcomes. The first essay examines whether and how a firm can enhance social contagion by simply varying the message customers share with their friends. Using a large randomized field experiment, I find that (i) adding only information about the sender's purchase status increases the likelihood of recipients' purchase; (ii) adding only information about the referral reward increases recipients' follow-up referrals; and (iii) adding information about both the sender's purchase and the referral reward increases neither the likelihood of purchase nor follow-up referrals. I then discuss the underlying mechanisms. The second essay studies whether and how a firm can design unconditional incentives to engage customers who have already revealed a willingness to share. I conduct a field experiment to examine the impact of incentive design on senders' purchases as well as their further referral behavior. I find evidence that the incentive structure has a significant, but interestingly opposing, impact on the two outcomes. The results also provide insights into senders' motives for sharing. The third essay examines whether and how a non-profit platform can use mobile messaging to leverage recipients' social ties to encourage blood donation. I design a large field experiment to causally identify the impact of different types of information and incentives on donors' self-donation and group donation behavior. My results show that non-profits can stimulate a group effect and increase blood donation, but only with a group reward. Such a group reward works by motivating a different donor population. In summary, the findings from the three studies offer valuable insights for platforms and social enterprises on how to engineer digital platforms to create social contagion. The rich data from the randomized experiments and complementary sources (archival and survey) also allow me to test the underlying mechanisms at work. In this way, my dissertation provides both managerial implications and theoretical contributions to the phenomenon of peer-to-peer information sharing.
Abstract:
SQL injection is a common attack method used to leverage information out of a database or to compromise a company's network. This paper investigates four injection attacks that can be conducted against the PL/SQL engine of Oracle databases, comparing two recent releases (10g, 11g) of Oracle. The results of the experiments showed that both releases of Oracle were vulnerable to injection, but that the injection technique often differed in the packages in which it could be conducted.
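For readers unfamiliar with the attack class, the sketch below shows the basic injection pattern in Python, using sqlite3 so the example is self-contained and runnable; the table, query, and input are hypothetical, and the paper's actual attacks target specific Oracle PL/SQL packages rather than this generic case.

```python
# Generic illustration of SQL injection via string-concatenated dynamic SQL,
# and the parameterized alternative. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

attacker_input = "x' OR '1'='1"

# Vulnerable: user input is concatenated into the SQL text. The injected
# OR clause makes the predicate always true, leaking every row.
leaked = conn.execute(
    "SELECT name, role FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(leaked)  # [('alice', 'admin'), ('bob', 'user')]

# Safe: bind variables keep the input as data, never as SQL.
safe = conn.execute(
    "SELECT name, role FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)  # []
```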
Abstract:
International research shows that low-volatility stocks have beaten high-volatility stocks in terms of returns for decades on multiple markets. This deviation from the traditional risk-return framework is known as the low-volatility anomaly. This study focuses on explaining the anomaly and determining how strongly it appears on the NASDAQ OMX Helsinki stock exchange. The data consist of all listed companies, starting from 2001 and ending close to 2015. The methodology closely follows Baker and Haugen (2012): companies are sorted into deciles according to 3-month volatility, and monthly returns are then calculated for these volatility groups. The annualized return for the lowest-volatility decile is 8.85%, while the highest-volatility decile destroys wealth at a rate of -19.96% per annum. Results are parallel for quintiles, which contain a larger number of companies and thus dilute outliers. The observation period captures the financial crisis of 2007-2008 and the European debt crisis, which show up as a low main-index annual return of 1%, but which at the same time prove the success of the low-volatility strategy. The low-volatility anomaly is driven by multiple factors, such as leverage-constrained trading and managerial incentives, both of which prompt investment in risky assets, but behavioral factors also carry major weight in maintaining the anomaly.
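Below is a minimal sketch of the Baker and Haugen (2012)-style sort described above, in Python with pandas; the simulated returns, the 50-stock universe, and the equal weighting are illustrative assumptions, not the study's data or exact procedure.

```python
# Sketch: sort stocks into deciles by trailing 3-month volatility, then
# record each decile's equal-weighted return in the following month.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
months = pd.period_range("2001-01", periods=168, freq="M")
returns = pd.DataFrame(rng.normal(0.005, 0.06, (168, 50)),
                       index=months, columns=[f"stock{i}" for i in range(50)])

monthly = []
for t in range(3, len(returns)):
    # Trailing 3-month volatility, the study's sorting variable.
    vol = returns.iloc[t - 3:t].std()
    deciles = pd.qcut(vol, 10, labels=False)  # 0 = lowest volatility
    # Equal-weighted return of each decile in month t.
    monthly.append(returns.iloc[t].groupby(deciles).mean())

decile_returns = pd.DataFrame(monthly, index=returns.index[3:])
annualized = (1 + decile_returns.mean()) ** 12 - 1
print(annualized)  # one annualized return per volatility decile
```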
Abstract:
This thesis studies the field of asset price bubbles. It comprises three independent chapters, each of which directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubble assumed in each of these chapters is consistent with rational expectations; thus, the price bubbles investigated here are known in the literature as rational bubbles. The following describes the three chapters. Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous-agent endowment-economy asset pricing model with risky housing, endogenous collateral, and defaults. Investment in housing is subject to an idiosyncratic risk, and some mortgages are defaulted on in equilibrium. We analytically derive the leverage, or endogenous loan-to-value, ratio. This variable comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit-easing effect, encouraging excess leverage and generating credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage, and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence. Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market, accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use semi-parametric (Whittle) and parametric (ARFIMA) procedures that remain consistent under a variety of residual biases to estimate the long-memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, which are robust to non-normality and heteroskedastic errors, identify far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series, which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels. In the third and final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income, and time-varying predictable returns, where the agents are constrained in the level of their borrowing. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return-predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation. High realizations of the factor (the dividend-price ratio) and high persistence of the factor process, both indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being severe for investors in the later years of the lifecycle. Furthermore, investors facing time-varying returns (return predictability) hedged background risks significantly better than those facing IID returns.
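As a concrete illustration of the long-memory estimation step in Chapter 2, the sketch below implements a Geweke-Porter-Hudak log-periodogram regression in Python. Note this is a related semi-parametric estimator standing in for the chapter's Whittle and ARFIMA procedures, and it is applied here to simulated series rather than the log rent-price ratio.

```python
# Semi-parametric long-memory estimation via GPH log-periodogram regression:
# regress log I(freq_j) on -2*log(2*sin(freq_j/2)) over the first m Fourier
# frequencies; the slope estimates d.
import numpy as np

def gph_d(x: np.ndarray, alpha: float = 0.5) -> float:
    """Estimate the long-memory parameter d of series x.

    d = 0: short memory; 0 < d < 0.5: stationary long memory; d near 1 is
    often read as non-stationarity consistent with bubble-like persistence.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    m = int(n ** alpha)                       # bandwidth: m = n**alpha
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -2 * np.log(2 * np.sin(freqs / 2))
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope

# Illustration: a random walk (d = 1, "bubble-like") vs. white noise (d = 0).
rng = np.random.default_rng(1)
noise = rng.normal(size=2000)
# Expect roughly 1.0 and 0.0, up to sampling noise.
print(round(gph_d(np.cumsum(noise)), 2), round(gph_d(noise), 2))
```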
Abstract:
Master's in Accounting and Financial Analysis
Abstract:
Organized interests do not have direct control over the fate of their policy agendas in Congress. They cannot introduce bills, vote on legislation, or serve on House committees. If organized interests want to achieve virtually any of their legislative goals, they must rely on and work through members of Congress. As an interest group seeks to move its policy agenda forward in Congress, then, one of the most important challenges it faces is the recruitment of effective legislative allies. Legislative allies are members of Congress who “share the same policy objective as the group” and who use their limited time and resources to advocate for the group’s policy needs (Hall and Deardorff 2006, 76). For all the financial resources that a group can bring to bear as it competes with other interests to win policy outcomes, it will be ineffective without the help of members of Congress who are willing to expend their time and effort to advocate for its policy positions (Bauer, Pool, and Dexter 1965; Baumgartner and Leech 1998b; Hall and Wayman 1990; Hall and Deardorff 2006; Hojnacki and Kimball 1998, 1999). Given the importance of legislative allies to interest group success, are some organized interests better able to recruit legislative allies than others? This question has received little attention in the literature. This dissertation offers an original theoretical framework describing both when we should expect some types of interests to generate more legislative allies than others and how interests vary in their effectiveness at mobilizing these allies toward effective legislative advocacy. It then tests these theoretical expectations on variation in group representation during the stage of the legislative process that many scholars have argued is crucial to policy influence: interest representation on legislative committees. The dissertation uncovers pervasive evidence that interests with a presence across more congressional districts stand a better chance of having legislative allies on their key committees. It also reveals that interests with greater leverage over jobs and economic investment are better positioned to win more allies on key committees. In addition, interests with a policy agenda that closely overlaps with the jurisdiction of just one committee in Congress are more likely to have legislative allies on their key committees than are interests whose policy agenda is divided across many committee jurisdictions. In short, how groups are distributed across districts, the leverage that interests have over local jobs and economic investment, and how committee jurisdictions align with their policy goals all affect their influence in Congress.
Abstract:
Master's Dissertation, Business Management (MBA), 23 May 2016, Universidade dos Açores.
Abstract:
When a dominant undertaking holding a standard-essential patent uses its exclusive right to the IP to seek injunctions against those wishing to produce either de jure or de facto standard-compliant products, it creates a conflict between the exclusive right to the use of the IP on the one hand and the possible abuse of dominance due to the exclusionary conduct on the other. The aim of the thesis is to focus on the issues concerning abuse of dominance in violation of Article 102 TFEU when the holder of a standard-essential patent seeks an injunction against a would-be licensee. The thesis is mainly based on the most recent ECJ case law in Huawei and the Commission’s recent decisions in Samsung and Motorola. The case law in Europe prior to those decisions was mainly based on the German Orange Book Standard case law, which gave IP holders great leverage due to the almost automatic granting of injunctions against infringers. The ECJ in Huawei set out the requirements under which a de jure standard-essential patent holder would not violate Article 102 TFEU when seeking an injunction, requiring that negotiations in good faith take place prior to the seeking of the injunction and that all offers comply with FRAND terms, thus limiting the scope of the case law derived from Orange Book Standard in Germany. The ECJ chose not to follow all of the reasoning the Commission had laid out in Samsung and Motorola, which provided a more licensee-friendly approach to the matter, but rather chose a compromise between the IP-holder-friendly German case law and the Commission’s decisions. However, the ECJ did not disclose how FRAND terms themselves should be interpreted, leaving that for the national courts to decide. Furthermore, the thesis strongly argues that Huawei did not change the fact that only vertically integrated IP holders who have made a FRAND declaration are subject to the terms laid out in Huawei, thus leaving non-practicing entities such as patent trolls, and entities that have not made a FRAND declaration, outside its scope. The resulting conclusion of the thesis is that while the ECJ in Huawei presented new exceptional circumstances under which an IP holder could be abusing its dominant position by seeking an injunction, it still left many questions unanswered, such as the meaning of FRAND and whether deception in giving a FRAND declaration is prohibited under Article 102 TFEU.
Abstract:
This report aims to describe all the activities carried out during the internship period at Miranda & Irmão, Lda, and to present a detailed study of the strategies implemented for customer retention, thus contributing to value creation through Relationship Marketing. Relationship Marketing has been gaining prominence in the design of business strategies, contributing to the development of stable and lasting relationships between companies and their customers. Within this strategy, “Infinium” is a premium product created to leverage the “Miranda” brand through technologically transcendent products and to respond to a new market niche.
Abstract:
Financial constraints influence the corporate policies of firms, including both investment decisions and external financing policies. The relevance of this phenomenon became more pronounced during and after the recent financial crisis of 2007/2008. In addition to raising the costs of external financing, the financial crisis limited the availability of external financing, with implications for employment, investment, asset sales, and technology spending. This thesis provides a comprehensive analysis of the effects of financial constraints on share issuance and repurchase decisions. Financial constraints comprise both internal constraints, reflecting the demand for external financing, and external financial constraints, which relate to the supply of external financing. The study also examines both the operating performance and the stock market reactions associated with equity issuance methods. The first empirical chapter explores the simultaneous effects of financial constraints and market timing on share issuance decisions. Internal financing constraints limit firms’ ability to issue overvalued equity. On the other hand, financial crises and low market liquidity (external financial constraints) restrict the availability of equity financing and consequently increase the costs of external financing. Therefore, the study explores the extent to which internal and external financing constraints limit the market timing of equity issues. This study finds that financial constraints play a significant role in whether firms time their equity issues when the shares are overvalued. The conclusion is that financially constrained firms issue overvalued equity when the external equity market or the general economic conditions are favourable. During recessionary periods, costs of external finance increase such that financially constrained firms are less likely to issue overvalued equity; only unconstrained firms remain likely to issue overvalued equity even during a crisis. Similarly, small firms that need cash flows to finance growth projects are less likely to access external equity financing during periods of significant economic recession. Moreover, constrained firms have low average stock returns compared to unconstrained firms, especially when they issue overvalued equity. The second chapter examines the operating performance and stock returns associated with equity issuance methods. Firms in the UK can issue equity through rights issues, open offers, and private placements. This study argues that alternative equity issuance methods are associated with different levels of operating performance and long-term stock returns. Firms using private placements are associated with poor operating performance. However, rights issues are found empirically to be associated with higher operating performance and less negative long-term stock returns after issuance in comparison to counterpart firms that use private placements and open offers. Thus, rights-issuing firms perform better than open-offer and private-placement firms because the favourable operating performance at the time of issuance generates a subsequent positive long-run stock price response. Rights-issuing firms are of better quality and outperform firms that adopt open offers and private placements. In the third empirical chapter, the study explores levered share repurchases by internally financially unconstrained firms. Unconstrained firms are expected to repurchase their shares using internal funds rather than external borrowing. However, evidence shows that levered share repurchases are common among unconstrained firms. These firms display this repurchase behaviour when they have bond ratings or investment-grade ratings that allow them to obtain cheap external debt financing. It is found that internally financially unconstrained firms borrow to finance their share repurchases when they invest more. Levered repurchase firms are associated with less positive abnormal returns than unlevered repurchase firms. Within the levered repurchase sample, high-investing firms are associated with more positive long-run abnormal stock returns than low-investing firms. The market appears to underreact to levered repurchases in the short run regardless of the level of investment. These findings indicate that market reactions reflect both the undervaluation and the signaling hypotheses of positive information associated with share repurchases. As firms undertake capital investments, they generate future cash flows, limit the effects of leverage on financial distress, and ultimately reduce the risk of their equity capital.
Abstract:
Master's Dissertation, Economics and Business Sciences, 13 July 2016, Universidade dos Açores.
Abstract:
Master's Dissertation, Economics and Business Sciences, 19 July 2016, Universidade dos Açores.
Abstract:
The decision-making process in university libraries is extremely important; however, it faces complications such as the large number of data sources and the large volumes of data to be analyzed. University libraries are accustomed to producing and collecting a great deal of information about their data and services. Common data sources are the output of internal systems, portals and online catalogues, quality assessments, and surveys. Unfortunately, these data sources are only partially used for decision-making, owing to the wide variety of formats and standards as well as the lack of efficient methods and integration tools. This thesis project presents the analysis, design, and implementation of a Data Warehouse, an integrated decision-making system for the Centro de Documentación Juan Bautista Vázquez. First, the requirements and the data analysis are presented on the basis of a methodology that incorporates the key elements influencing a library decision, including process analysis, estimated quality, relevant information, and user interaction. Next, the architecture and design of the Data Warehouse are proposed, together with an implementation that supports data integration, processing, and storage. Finally, the stored data are analyzed through analytical processing tools and the application of Bibliomining techniques, helping the documentation centre's managers make optimal decisions about their resources and services.
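To make the warehouse idea concrete, here is a minimal star-schema sketch in Python using sqlite3; every table name, column, and row is a hypothetical illustration, not the thesis's actual Data Warehouse design for the Centro de Documentación Juan Bautista Vázquez.

```python
# Sketch of a library star schema: a fact table of loan events joined to
# date and item dimensions, queried with an OLAP-style rollup.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INT, month INT);
CREATE TABLE dim_item (item_id INTEGER PRIMARY KEY, title TEXT, subject TEXT);
CREATE TABLE fact_loan (                -- one row per checkout event batch
    date_id INT REFERENCES dim_date(date_id),
    item_id INT REFERENCES dim_item(item_id),
    loans   INT);
INSERT INTO dim_date VALUES (1, 2016, 5), (2, 2016, 6);
INSERT INTO dim_item VALUES (1, 'Don Quijote', 'Literature'),
                            (2, 'Calculus I', 'Mathematics');
INSERT INTO fact_loan VALUES (1, 1, 3), (1, 2, 7), (2, 2, 5);
""")

# Rollup: loans per subject per month -- the kind of aggregate that supports
# collection-management decisions.
for row in conn.execute("""
    SELECT d.year, d.month, i.subject, SUM(f.loans)
    FROM fact_loan f
    JOIN dim_date d USING (date_id)
    JOIN dim_item i USING (item_id)
    GROUP BY d.year, d.month, i.subject
"""):
    print(row)
```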