641 results for LEVERAGE


Relevance:

10.00%

Publisher:

Abstract:

Organized interests do not have direct control over the fate of their policy agendas in Congress. They cannot introduce bills, vote on legislation, or serve on House committees. If organized interests want to achieve virtually any of their legislative goals, they must rely on and work through members of Congress. As an interest group seeks to move its policy agenda forward in Congress, then, one of the most important challenges it faces is the recruitment of effective legislative allies. Legislative allies are members of Congress who “share the same policy objective as the group” and who use their limited time and resources to advocate for the group’s policy needs (Hall and Deardorff 2006, 76). For all the financial resources that a group can bring to bear as it competes with other interests to win policy outcomes, it will be ineffective without the help of members of Congress who are willing to expend their time and effort to advocate for its policy positions (Bauer, Pool, and Dexter 1965; Baumgartner and Leech 1998b; Hall and Wayman 1990; Hall and Deardorff 2006; Hojnacki and Kimball 1998, 1999). Given the importance of legislative allies to interest group success, are some organized interests better able to recruit legislative allies than others? This question has received little attention in the literature. This dissertation offers an original theoretical framework describing both when we should expect some types of interests to generate more legislative allies than others and how interests vary in their effectiveness at mobilizing these allies toward effective legislative advocacy. It then tests these theoretical expectations on variation in group representation during a stage of the legislative process that many scholars have argued is crucial to policy influence: interest representation on legislative committees.
The dissertation uncovers pervasive evidence that interests with a presence across more congressional districts stand a better chance of having legislative allies on their key committees. It also reveals that interests with greater leverage over jobs and economic investment are better positioned to win more allies on key committees. In addition, interests with a policy agenda that closely overlaps with the jurisdiction of just one committee in Congress are more likely to have legislative allies on their key committees than are interests whose policy agenda is divided across many committee jurisdictions. In short, how groups are distributed across districts, the leverage that interests have over local jobs and economic investment, and how committee jurisdictions align with their policy goals all affect their influence in Congress.

Relevance:

10.00%

Publisher:

Abstract:

Master’s dissertation, Business Management (MBA), 23 May 2016, Universidade dos Açores.

Relevance:

10.00%

Publisher:

Abstract:

When a dominant undertaking holding a standard-essential patent uses its exclusive right to the IP to seek injunctions against those wishing to produce either de jure or de facto standard-compliant products, it creates a conflict between the exclusive right to the use of the IP on the one hand and the possible abuse of dominance due to the exclusionary conduct on the other. The aim of the thesis is to focus on the issues concerning abuse of dominance in violation of Article 102 TFEU when the holder of a standard-essential patent seeks an injunction against a would-be licensee. The thesis is mainly based on the most recent ECJ case law in Huawei and the Commission’s recent decisions in Samsung and Motorola. The case law in Europe prior to those decisions was mainly focused on the German Orange Book Standard case law, which gave IP holders great leverage due to the almost automatic granting of injunctions against infringers. In Huawei, the ECJ set out the requirements under which a de jure standard-essential patent holder would not violate Article 102 TFEU when seeking an injunction, requiring that negotiations in good faith take place prior to seeking the injunction and that all offers comply with FRAND terms, thus limiting the scope of the case law derived from Orange Book Standard in Germany. The ECJ chose not to follow all of the reasoning the Commission had laid out in Samsung and Motorola, which provided a more licensee-friendly approach on the matter, but rather chose a compromise between the IP-holder-friendly German case law and the Commission’s decisions. However, the ECJ did not disclose how FRAND terms themselves should be interpreted, but rather left that for the national courts to decide. 
Furthermore, the thesis argues strongly that Huawei did not change the fact that only vertically integrated IP holders who have made a FRAND declaration are subject to the terms laid out in Huawei, thus leaving non-practicing entities such as patent trolls, as well as entities that have not made a FRAND declaration, outside its scope. The resulting conclusion of the thesis is that while the ECJ in Huawei presented new exceptional circumstances under which an IP holder could be abusing its dominant position when seeking an injunction, it still left many more questions unanswered, such as the meaning of FRAND and whether deception in giving a FRAND declaration is prohibited under Article 102 TFEU.

Relevance:

10.00%

Publisher:

Abstract:

This report aims to describe all the activities carried out during the internship at Miranda & Irmão, Lda, and to present a detailed study of the strategies implemented for customer retention, thereby contributing to value creation through Relationship Marketing. Relationship Marketing has been gaining prominence in the design of business strategies, helping to foster stable and lasting relationships between companies and their customers. Within this strategy, “Infinium” is a premium product created to leverage the “Miranda” brand through technologically outstanding products and to respond to a new market niche.

Relevance:

10.00%

Publisher:

Abstract:

Financial constraints influence the corporate policies of firms, including both investment decisions and external financing policies. The relevance of this phenomenon became more pronounced during and after the financial crisis of 2007/2008. In addition to raising the costs of external financing, the crisis limited the availability of external financing, which had implications for employment, investment, asset sales, and tech spending. This thesis provides a comprehensive analysis of the effects of financial constraints on share issuance and repurchase decisions. Financial constraints comprise both internal constraints, reflecting the demand for external financing, and external financial constraints, which relate to the supply of external financing. The study also examines both the operating performance and the stock market reactions associated with equity issuance methods. The first empirical chapter explores the simultaneous effects of financial constraints and market timing on share issuance decisions. Internal financing constraints limit firms’ ability to issue overvalued equity. On the other hand, financial crises and low market liquidity (external financial constraints) restrict the availability of equity financing and consequently increase the costs of external financing. Therefore, the study explores the extent to which internal and external financing constraints limit the market timing of equity issues. The study finds that financial constraints play a significant role in whether firms time their equity issues when their shares are overvalued. The conclusion is that financially constrained firms issue overvalued equity when the external equity market or general economic conditions are favourable. During recessionary periods, the costs of external finance increase, so financially constrained firms are less likely to issue overvalued equity; only unconstrained firms remain likely to issue overvalued equity even during a crisis. 
Similarly, small firms that need cash flows to finance growth projects are less likely to access external equity financing during periods of significant economic recession. Moreover, constrained firms have low average stock returns compared to unconstrained firms, especially when they issue overvalued equity. The second chapter examines the operating performance and stock returns associated with equity issuance methods. Firms in the UK can issue equity through rights issues, open offers, and private placements. This study argues that the alternative equity issuance methods are associated with different levels of operating performance and long-term stock returns. Firms using private placements are associated with poor operating performance, whereas rights issues are found empirically to be associated with higher operating performance and less negative long-term stock returns after issuance, in comparison to counterpart firms that issue private placements and open offers. Thus, rights-issuing firms perform better than firms using open offers and private placements because the favourable operating performance at the time of issuance generates a subsequent positive long-run stock price response. Rights-issuing firms are of better quality and outperform firms that adopt open offers and private placements. In the third empirical chapter, the study explores levered share repurchases by internally financially unconstrained firms. Unconstrained firms would be expected to repurchase their shares using internal funds rather than external borrowing. However, the evidence shows that levered share repurchases are common among unconstrained firms. These firms display this repurchase behaviour when they have bond ratings or investment-grade ratings that allow them to obtain cheap external debt financing. It is found that internally financially unconstrained firms borrow to finance their share repurchases when they invest more. 
Levered repurchase firms are associated with less positive abnormal returns than unlevered repurchase firms. Within the levered repurchase sample, high-investing firms are associated with more positive long-run abnormal stock returns than low-investing firms. It appears the market underreacts to levered repurchases in the short run, regardless of the level of investment. These findings indicate that market reactions reflect both the undervaluation and the signaling hypotheses of positive information associated with share repurchases. As the firms undertake capital investments, they generate future cash flows, limit the effects of leverage on financial distress and ultimately reduce the risk of their equity capital.

Relevance:

10.00%

Publisher:

Abstract:

Master’s dissertation, Economics and Business Sciences, 13 July 2016, Universidade dos Açores.

Relevance:

10.00%

Publisher:

Abstract:

Master’s dissertation, Economics and Business Sciences, 19 July 2016, Universidade dos Açores.

Relevance:

10.00%

Publisher:

Abstract:

Decision-making in university libraries is extremely important, yet it faces complications such as the large number of data sources and the large volumes of data to be analyzed. University libraries routinely produce and collect a great deal of information about their data and services. Common data sources are internal systems, portals and online catalogs, quality assessments, and surveys. Unfortunately, these data sources are only partially used for decision-making, owing to the wide variety of formats and standards, as well as the lack of efficient integration methods and tools. This thesis project presents the analysis, design and implementation of a Data Warehouse: an integrated decision-support system for the Centro de Documentación Juan Bautista Vázquez. First, the requirements and the data analysis are presented on the basis of a methodology that incorporates key elements influencing a library decision, including process analysis, estimated quality, relevant information, and user interaction. Next, the architecture and design of the Data Warehouse are proposed, together with an implementation that supports data integration, processing and storage. Finally, the stored data are analyzed with online analytical processing tools and Bibliomining techniques, helping the documentation center’s administrators make optimal decisions about their resources and services.
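The warehouse described above can be sketched in miniature as a star schema: one fact table of events ringed by dimension tables, queried with OLAP-style roll-ups. This is a hypothetical illustration only; the table and column names are invented, not taken from the Juan Bautista Vázquez system, and SQLite stands in for whatever platform the project used:

```python
import sqlite3

# Minimal star schema: a loan-event fact table plus two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_item (item_id INTEGER PRIMARY KEY, title TEXT, subject TEXT);
CREATE TABLE fact_loan (date_id INTEGER REFERENCES dim_date,
                        item_id INTEGER REFERENCES dim_item,
                        loans   INTEGER);
INSERT INTO dim_date VALUES (1, 2016, 5), (2, 2016, 6);
INSERT INTO dim_item VALUES (10, 'Calculus I', 'Math'), (11, 'Don Quijote', 'Literature');
INSERT INTO fact_loan VALUES (1, 10, 42), (1, 11, 7), (2, 10, 35), (2, 11, 12);
""")

# OLAP roll-up: total loans per subject, the kind of aggregate that feeds
# collection-management decisions.
rows = conn.execute("""
    SELECT i.subject, SUM(f.loans)
    FROM fact_loan f JOIN dim_item i ON f.item_id = i.item_id
    GROUP BY i.subject ORDER BY i.subject
""").fetchall()
print(rows)  # [('Literature', 19), ('Math', 77)]
```

Slicing by other dimensions (year, month) is the same pattern with a different `GROUP BY`, which is what makes the single integrated schema useful for many decisions at once.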

Relevance:

10.00%

Publisher:

Abstract:

The solvency rate of banks differs from that of other corporations. The equity ratio of a bank is lower than in firms in other fields of business. However, a functioning banking industry has a huge impact on society as a whole. The equity ratio of banks needs to be higher, because that makes the banking industry more stable as the probability of bank failures decreases. If a bank fails, the government compensates the deposits, since it has granted the bank’s depositors deposit insurance. This means that, in the last resort, the payment comes from taxpayers. The economic debate has long concentrated on the costs of raising the equity ratio. It has been a common belief that raising the equity ratio increases banks’ funding costs at the same pace, and that these costs are passed on to banks’ customers as higher service charges. Regardless of this common belief, the actual reaction of funding costs to a higher equity ratio has been studied only a little in Europe, and no study has been conducted in Finland. Before it can be determined whether the greater stability of the banking industry brought about by higher equity levels compensates for the extra funding costs, it must be calculated how large the actual increase in funding costs is. Currently the banking industry is governed by complex and heavy regulation, and maintaining such a complex system inflicts major costs in itself. This research builds on the Modigliani-Miller theorem, which shows that a firm’s financing structure is irrelevant to its funding costs. In addition, this research follows the calculations of Miller, Yang and Marcheggiano (2012) and Vale (2011), who calculate funding costs after doubling specific banks’ equity ratios. 
The Finnish banks studied in this research are Nordea and Danske Bank, because they are the two largest banks operating in Finland and both have a company form that enables the calculations. To calculate the costs of halving their leverage, this study used the Capital Asset Pricing Model. Halving the leverage of Danske Bank raised its funding costs by 16–257 basis points, depending on the method of assessment. For Nordea, the increase in funding costs was 11–186 basis points when its leverage was halved. Based on the results of this study, it can be said that doubling the equity ratio does not increase a bank’s funding costs one-for-one; in fact, the increase is quite modest. More solvent banks would increase the stability of the banking industry enormously, while the increase in funding costs is low. If the costs of bank regulation exceed the increase in funding costs from a higher equity ratio, this can be seen as a better way of stabilizing the banking industry than heavy regulation.
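The CAPM comparison behind such a calculation can be sketched numerically. All figures below are invented for illustration (they are not the actual Nordea or Danske Bank inputs), and the beta adjustment is the textbook no-tax Modigliani-Miller version; the point is only to contrast the naive view (cost of equity fixed) with the MM-consistent view (equity beta falls as leverage falls):

```python
def capm_cost_of_equity(risk_free, beta, market_premium):
    """CAPM: r_e = r_f + beta * (E[r_m] - r_f)."""
    return risk_free + beta * market_premium

def avg_funding_cost(equity_ratio, cost_of_equity, cost_of_debt):
    """Weighted average cost of the bank's funding mix (equity + debt)."""
    return equity_ratio * cost_of_equity + (1 - equity_ratio) * cost_of_debt

# Illustrative, assumed inputs: 5% equity ratio, equity beta of 1.4,
# 1% risk-free rate, 5% market risk premium, 2% cost of debt.
rf, mrp, rd = 0.01, 0.05, 0.02
e, beta = 0.05, 1.4

before = avg_funding_cost(e, capm_cost_of_equity(rf, beta, mrp), rd)

# Doubling the equity ratio with beta unchanged ("equity is expensive" view):
# funding costs rise noticeably.
naive = avg_funding_cost(2 * e, capm_cost_of_equity(rf, beta, mrp), rd)

# Modigliani-Miller view: with (near) riskless debt, the equity beta scales
# roughly with leverage, so doubling the equity ratio halves beta.
mm = avg_funding_cost(2 * e, capm_cost_of_equity(rf, beta / 2, mrp), rd)

print(round((naive - before) * 1e4))  # naive increase, in basis points: 30
print(round((mm - before) * 1e4))     # MM-adjusted change is near zero: -5
```

The modest 11–257 basis-point estimates reported above sit between these two poles, which is exactly what one expects when Modigliani-Miller holds only partially in practice.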

Relevance:

10.00%

Publisher:

Abstract:

The concepts of the smart city and social innovation, in combination with the increasing use of ICT by citizens and public authorities, could enhance the involvement of people in the decisions that directly affect their daily lives. A case study approach was adopted to illustrate the potential of civic crowdfunding for increasing participation and collaboration between citizens, firms and government. The analysis of two exemplary cases shows that civic crowdfunding platforms could be used by public administrations to engage communities in the search for solutions to local problems. Likewise, they could be used to reinforce community ties and to leverage the bonds among the stakeholders and partners of the community ecosystem.

Relevance:

10.00%

Publisher:

Abstract:

Manufacturing companies have moved from selling purely tangible products to adopting a service-oriented approach that generates steady, continuous revenue streams. Nowadays, equipment and machine manufacturers possess technologies to track and analyze product-related data, obtaining relevant information about how customers use the product after it is sold. The Internet of Things in industrial environments will allow manufacturers to leverage lifecycle product traceability to innovate towards an information-driven services approach, commonly referred to as “Smart Services”, achieving improvements in support, maintenance and usage processes. The aim of this study is to conduct a literature review and empirical analysis, and to present a framework that describes a customer-oriented approach for developing information-driven services leveraged by the Internet of Things in manufacturing companies. The empirical study employed customer-needs-assessment tools to analyze the case company in terms of information requirements and digital needs. The literature review supported the empirical analysis with deep research on product lifecycle traceability and the digitalization of product-related services within manufacturing value chains, as well as the role of simulation-based technologies in supporting the “Smart Service” development process. The results of the case-company analysis show that customers mainly demand information that allows them to monitor machine conditions, machine behavior under different geographical conditions, machine-implement interactions, and resource and energy consumption. Put simply, they demand information outputs that allow them to increase machine productivity to maximize yields, save time and optimize resources in the most sustainable way. 
Based on the customer needs assessment, this study presents a framework describing the initial phases of a “Smart Service” development process, considering the requirements of Smart Engineering methodologies.

Relevance:

10.00%

Publisher:

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to the electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved. To further uncover why these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize either a purely statistical or a purely visual approach to comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies they were not expecting, but are limited by uncertainty in their findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question in mind and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. 
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and on the backend (e.g., scalability challenges in running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies that demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. 
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
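The core of high-volume hypothesis testing (running many tests at once and filtering for significance) can be sketched as follows. This is a hedged illustration, not the dissertation's HVHT implementation: it runs one simple prevalence test per event type and controls the false discovery rate with the Benjamini-Hochberg procedure; the event names and counts are invented:

```python
import math

def two_proportion_p(k1, n1, k2, n2):
    """Two-sided p-value for H0: an event type is equally prevalent in both cohorts."""
    p_pool = (k1 + k2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0
    z = (k1 / n1 - k2 / n2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

def benjamini_hochberg(pvals, alpha=0.05):
    """Names of the hypotheses judged significant at false-discovery rate alpha."""
    ranked = sorted(pvals.items(), key=lambda kv: kv[1])
    m = len(ranked)
    cutoff = 0
    for i, (_, p) in enumerate(ranked, start=1):
        if p <= alpha * i / m:
            cutoff = i  # largest rank passing the BH threshold
    return {name for name, _ in ranked[:cutoff]}

# Invented occurrence counts per event type, in two cohorts of 200 records each.
cohort_a = {"login": 180, "error": 60, "purchase": 90, "refund": 12}
cohort_b = {"login": 175, "error": 20, "purchase": 95, "refund": 10}
pvals = {e: two_proportion_p(cohort_a[e], 200, cohort_b[e], 200) for e in cohort_a}
print(sorted(benjamini_hochberg(pvals)))  # only "error" (30% vs 10%) survives
```

In a full HVHT setting the same pattern is repeated over order, co-occurrence, attribute, and duration metrics, which is why the multiple-comparison correction matters: without it, testing hundreds of metrics would flag spurious differences.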

Relevance:

10.00%

Publisher:

Abstract:

This thesis analyses the performance and efficiency of companies and identifies the key factors that may explain them. A comprehensive analysis based on a set of economic and financial ratios was used as an instrument providing information on enterprise performance and efficiency. A sample of 15 enterprises was selected: 7 Portuguese and 8 Ukrainian, belonging to several industries. Financial and non-financial data were collected for 6 years, covering the period from 2009 to 2014. The research questions that guided this work were: Are the enterprises efficient/profitable? What factors influence the enterprises' efficiency/performance? Is there any difference between Ukrainian and Portuguese enterprises' efficiency/performance, and which factors have more influence? Which industrial sector is represented by the most efficient/profitable enterprises? The main results showed that, on average, the enterprises were efficient; comparing by country, Ukrainian enterprises are more efficient, and the industries have similar levels of efficiency. Among the factors that positively influence ATR are the fixed and current asset turnover ratios and ROA; the EBITDA margin and the liquidity ratio have a negative influence. There is no significant difference between the models by country. Concerning profitability, the enterprises have a low performance level, but comparing the countries, Ukrainian enterprises have better profitability on average. Regarding the industry sector, the paper industry is the most profitable. Among the factors influencing ROA are the profit margin, fixed asset turnover ratio, EBITDA margin, debt-to-equity ratio and the country. For profitability, the two countries have different models. For Ukrainian enterprises, it is suggested to pay attention to short-term debt to total debt, ROA and the interest coverage ratio in order to become more efficient, and to the profit margin and EBITDA margin to improve performance. 
For Portuguese enterprises, observing and improving the fixed asset turnover ratio, current asset turnover ratio, short-term financial debt to total debt, leverage ratio and EBITDA margin is suggested for improving efficiency; for higher profitability, tracking the fixed asset turnover ratio, current asset turnover ratio, debt-to-equity ratio, profit margin and interest coverage ratio is suggested.

Relevance:

10.00%

Publisher:

Abstract:

This study aims to investigate factors that may affect return on equity (ROE). ROE is a gauge of profit-generating efficiency and a strong measure of how well the management of a firm creates value for its shareholders. Firms with higher ROE typically have competitive advantages over their competitors, which translates into superior returns for investors. Therefore, it seems imperative to study the drivers of ROE, particularly ratios and indicators that may have a considerable impact. The analysis is done on a sample of the 90 largest non-financial companies that are components of the NASDAQ-100 index, and also on industry-sector samples. The ordinary least squares method is used to find the most impactful drivers of ROE. The extended DuPont model's components are considered the primary factors affecting ROE. In addition, other ratios and indicators, such as price to earnings, price to book and the current ratio, are also incorporated. Consequently, the study uses eight ratios that are believed to have an impact on ROE. According to our findings, the most relevant ratios that determine ROE are tax burden, interest burden, operating margin, asset turnover and financial leverage (the extended DuPont components), regardless of industry sector.
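The extended (five-factor) DuPont identity the study builds on can be shown in a few lines: ROE decomposes exactly into tax burden × interest burden × operating margin × asset turnover × financial leverage. The line items below are invented illustrative figures, not data from the NASDAQ-100 sample:

```python
def dupont_roe(net_income, pretax_income, ebit, revenue, assets, equity):
    tax_burden = net_income / pretax_income   # share of pre-tax profit kept after tax
    interest_burden = pretax_income / ebit    # share of EBIT kept after interest
    operating_margin = ebit / revenue
    asset_turnover = revenue / assets
    leverage = assets / equity                # financial leverage (equity multiplier)
    return (tax_burden * interest_burden * operating_margin
            * asset_turnover * leverage)

# Illustrative firm: the product of the five factors telescopes to
# net income / equity, so the decomposition is exact.
roe = dupont_roe(net_income=80, pretax_income=100, ebit=120,
                 revenue=1000, assets=800, equity=400)
print(round(roe, 4))  # 0.2, identical to 80 / 400
```

Because the identity is exact, a regression of ROE on these five components (as in the study) is really asking which factor's cross-sectional variation drives the variation in ROE.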

Relevance:

10.00%

Publisher:

Abstract:

Dinoflagellates possess large genomes in which most genes are present in many copies. This has made studies of their genomic organization and phylogenetics challenging. Recent advances in sequencing technology have made deep sequencing of dinoflagellate transcriptomes feasible. This dissertation investigates the genomic organization of dinoflagellates to better understand the challenges of assembling dinoflagellate transcriptomic and genomic data from short-read sequencing methods, and develops new techniques that utilize deep sequencing data to identify orthologous genes across a diverse set of taxa. To better understand the genomic organization of dinoflagellates, a genomic cosmid clone of the tandemly repeated gene Alcohol Dehydrogenase (AHD) was sequenced and analyzed. The organization of this clone was found to run counter to prevailing hypotheses of genomic organization in dinoflagellates. Further, a new non-canonical splicing motif was described that could greatly improve the automated modeling and annotation of genomic data. A custom phylogenetic-marker discovery pipeline was written, incorporating methods that leverage the statistical power of large data sets. A case study on Stramenopiles was undertaken to test its utility in resolving relationships between known groups, as well as the phylogenetic affinity of seven unknown taxa. The pipeline generated a set of 373 genes useful as phylogenetic markers that successfully resolved relationships among the major groups of Stramenopiles and placed all unknown taxa on the tree with strong bootstrap support. This pipeline was then used to discover 668 genes useful as phylogenetic markers in dinoflagellates. Phylogenetic analysis of 58 dinoflagellates, using this set of markers, produced a phylogeny with good support for all branches. The Suessiales were found to be sister to the Peridiniales. The Prorocentrales formed a monophyletic group with the Dinophysiales that was sister to the Gonyaulacales. 
The Gymnodiniales were found to be paraphyletic, forming three monophyletic groups. While this pipeline was used to find phylogenetic markers, it will likely also be useful for finding orthologs of interest for other purposes, for the discovery of horizontally transferred genes, and for the separation of sequences in metagenomic data sets.
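One common filtering step in marker-discovery pipelines of this kind can be sketched in miniature. This is a hypothetical illustration, not the dissertation's actual pipeline: the criteria shown (single-copy within each taxon, present in a minimum number of taxa) are standard choices for selecting phylogenetic markers, and the ortholog-group names, taxa and threshold are invented:

```python
from collections import Counter

def select_markers(ortholog_groups, min_taxa):
    """ortholog_groups maps a group id to the list of taxa its sequences came
    from (a taxon repeated => paralogs present). Keep groups that are
    single-copy in every taxon and cover at least min_taxa taxa."""
    markers = []
    for gid, taxa in ortholog_groups.items():
        counts = Counter(taxa)
        single_copy = all(c == 1 for c in counts.values())  # no paralogs
        if single_copy and len(counts) >= min_taxa:
            markers.append(gid)
    return sorted(markers)

groups = {
    "OG1": ["taxA", "taxB", "taxC", "taxD"],  # single-copy, broad coverage: kept
    "OG2": ["taxA", "taxA", "taxB", "taxC"],  # paralogs in taxA: rejected
    "OG3": ["taxA", "taxB"],                  # too few taxa: rejected
}
print(select_markers(groups, min_taxa=3))  # ['OG1']
```

The surviving groups would then be aligned, concatenated and handed to a tree-inference program, with bootstrap replicates providing the branch support values reported above.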