Resumo:
This thesis evaluates the impact of the main recurring actors in the IPO process, in particular the venture capitalist, the underwriter, and the auditor, on the trading conditions of the firm's shares, captured by the bid-ask spread, the fraction of institutional investors in the firm, ownership dispersion, and other measures. In addition, this study analyzes some of the benefits that Venture Capital (VC) funds provide to their portfolio companies. It investigates the role of VCs in curbing earnings management around IPOs and quantifies their effect on firms' operating performance after the initial public offering. In the first chapter, the results indicate that firms inflate earnings mainly in the pre-IPO and IPO periods. When we control for the four distinct phases of the IPO, we observe that VC-backed IPOs present significantly less earnings management in the IPO period and in the periods following the offering, exactly when firms tend to inflate earnings the most. This result is robust to different statistical methods and to different methodologies for measuring earnings management. Moreover, splitting the sample into VC-backed and non-VC-backed IPOs shows that both groups engage in earnings management, but each subsample does so most intensely in different phases around the IPO. Finally, we also observe that top underwriters are associated with lower levels of earnings management within the subsample of VC-backed firms. In the second chapter, we find that the choice of auditor, VC, and underwriter can signal the firm's long-term choices. We present evidence that underwriter, auditor, and VC characteristics have an impact on firm characteristics and market performance.
Moreover, these effects persist for nearly a decade. Firms with a top underwriter and a big-N auditor at the time of the IPO retain market characteristics over the following 8 years: a larger number of analysts following the firm, greater ownership dispersion across institutional investors, and higher liquidity through a narrower bid-ask spread. They are also less likely to delist and more likely to issue a seasoned equity offering. Finally, VC-backed firms are positively affected on all market liquidity measures, from the IPO until almost a decade later. These effects are not driven by survivorship bias, nor do they depend on the dot-com bubble: our results are qualitatively similar once we exclude the 1999-2000 bubble period. The last chapter shows that VC-backed firms hold higher cash balances than non-VC-backed firms, an effect that persists for at least 8 years after the IPO. We also show that VC-backed firms are associated with lower leverage and lower interest coverage over the first eight years after the IPO. Finally, we find no statistically significant relation between VC backing and the dividend payout ratio. These results are also robust to several statistical methods and different methodologies.
Resumo:
This thesis analyses the performance and efficiency of companies and identifies the key factors that may explain them. A comprehensive set of economic and financial ratios was used as an instrument providing information on enterprise performance and efficiency. A sample of 15 enterprises was selected: 7 Portuguese and 8 Ukrainian, belonging to several industries. Financial and non-financial data were collected for 6 years, covering the period from 2009 to 2014. The research questions that guided this work were: Are the enterprises efficient/profitable? What factors influence enterprises' efficiency/performance? Is there any difference between Ukrainian and Portuguese enterprises' efficiency/performance, and which factors have more influence? Which industrial sector is represented by the most efficient/profitable enterprises? The main results showed that, on average, the enterprises were efficient; comparing by country, Ukrainian enterprises are more efficient, and the industries have similar levels of efficiency. Factors that influence the asset turnover ratio (ATR) positively are the fixed and current asset turnover ratios and ROA; the EBITDA margin and the liquidity ratio influence it negatively. There is no significant difference between the models by country. Concerning profitability, the enterprises have a low performance level, but comparing countries, Ukrainian enterprises have better profitability on average. Regarding industry sector, the paper industry is the most profitable. Factors influencing ROA include the profit margin, fixed asset turnover ratio, EBITDA margin, debt-to-equity ratio, and the country. For profitability, the two countries have different models. Ukrainian enterprises are advised to pay attention to short-term debt to total debt, ROA, and the interest coverage ratio in order to become more efficient, and to the profit margin and EBITDA margin to improve their performance.
For Portuguese enterprises, monitoring and improving the fixed asset turnover ratio, current asset turnover ratio, short-term financial debt to total debt, leverage ratio, and EBITDA margin is suggested to improve efficiency; to achieve higher profitability, tracking the fixed asset turnover ratio, current asset turnover ratio, debt-to-equity ratio, profit margin, and interest coverage ratio is suggested.
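The ratios named in this abstract have conventional textbook definitions; the sketch below computes a few of them under those standard definitions (the thesis may define or scale them slightly differently, and the figures are invented):

```python
# Conventional definitions of ratios used in the analysis above.
# The input figures are made-up examples, not data from the thesis.

def asset_turnover(sales, total_assets):
    # ATR: how efficiently assets generate revenue
    return sales / total_assets

def roa(net_income, total_assets):
    # Return on assets
    return net_income / total_assets

def ebitda_margin(ebitda, sales):
    return ebitda / sales

def interest_coverage(ebit, interest_expense):
    # Ability to service debt from operating earnings
    return ebit / interest_expense

def debt_to_equity(total_debt, equity):
    return total_debt / equity

# Example with invented figures (thousands of EUR)
print(interest_coverage(ebit=420.0, interest_expense=60.0))  # 7.0
print(debt_to_equity(total_debt=50.0, equity=25.0))          # 2.0
```

A ratio panel like this, computed per firm and per year, is the typical input to the efficiency and profitability regressions the abstract describes.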
Resumo:
Due to the rapid changes governing the Swedish financial sector, such as financial deregulation and technological innovation, it is imperative to examine the extent to which Swedish financial institutions have performed amid these changes. To accomplish this, the work investigates the determinants of performance for Swedish monetary financial institutions. Assumptions were derived from the theoretical and empirical literature to investigate this research question using seven explanatory variables. Two models were specified using Return on Assets (ROA) and Return on Equity (ROE) as the main performance indicators, and for the sake of reliability and validity, three different estimators, Ordinary Least Squares (OLS), Generalized Least Squares (GLS) and Feasible Generalized Least Squares (FGLS), were employed. The Akaike Information Criterion (AIC) was used to verify which specification explains performance better, while a robustness check of the parameter estimates was performed by correcting the standard errors. Based on the findings, the ROA specification has the lowest AIC and standard errors compared to the ROE specification. Under ROA, two variables, the profit margin and the interest coverage ratio (ICR), prove statistically significant, while under ROE only the ICR is significant for all estimators. The results also show that FGLS is the most efficient estimator, followed by GLS and then OLS. When corrected for robust standard errors, the gearing ratio, which measures capital structure, becomes significant under ROA and its estimate becomes positive under robust ROE. We conclude that, within the period of study, three variables (ICR, profit margin and gearing) were significant and four variables were insignificant.
The overall findings show that the institutions strive to maximize returns, but these returns were just enough to cover their costs of operation. More should be done, as per the ASC theory, to avoid liquidity and credit risk problems. Moreover, the estimated values of the ICR and profit margin show that considerable effort and sound financial policies are required to increase performance by one percentage point. Further research could examine how individual stochastic factors, such as the DuPont model, repo rates, inflation, and GDP, influence performance.
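The estimation and model-selection workflow described above, fitting a specification by least squares and ranking specifications by AIC, can be sketched with NumPy alone. Everything here is illustrative: the regressors, coefficients and data are simulated, and the AIC is computed in one common form, n·ln(RSS/n) + 2k, which may differ by an additive constant from the thesis's software:

```python
import numpy as np

# Simulated data standing in for the institution panel; the real study used
# seven explanatory variables and OLS/GLS/FGLS estimators.
rng = np.random.default_rng(0)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 regressors
beta_true = np.array([0.5, 1.2, -0.3])
roa_y = X @ beta_true + rng.normal(scale=0.5, size=n)       # "ROA" outcome

def ols_aic(X, y):
    # OLS via least squares; AIC in the common form n*ln(RSS/n) + 2k
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1]
    aic = len(y) * np.log(rss / len(y)) + 2 * k
    return beta, aic

beta_hat, aic = ols_aic(X, roa_y)
print(beta_hat.round(2), round(aic, 1))
```

Fitting the ROA and ROE outcomes with the same function and comparing the two AIC values reproduces, in miniature, the specification comparison the abstract reports.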
Resumo:
Outside lobbying is a key strategy for social movements, interest groups and political parties for mobilising public opinion through the media in order to pressure policymakers and influence the policymaking process. Relying on semi-structured interviews and newspaper content analysis in six Western European countries, this article examines the use of four outside lobbying strategies – media-related activities, informing (about) the public, mobilisation and protest – and the amount of media coverage they attract. While some strategies are systematically less pursued than others, we find variation in their relative share across institutional contexts and actor types. Given that most of these differences are not accurately mirrored in the media, we conclude that media coverage is only loosely connected to outside lobbying behaviour, and that the media respond differently to a given strategy when used by different actors. Thus, the ability of different outside lobbying strategies to generate media coverage critically depends on who makes use of them.
Resumo:
Coverage Path Planning (CPP) is the task of determining a path that passes over all points of an area or volume of interest while avoiding obstacles. This task is integral to many robotic applications, such as vacuum cleaning robots, painter robots, autonomous underwater vehicles creating image mosaics, demining robots, lawn mowers, automated harvesters, window cleaners and inspection of complex structures, to name a few. A considerable body of research has addressed the CPP problem; however, no updated survey on CPP reflecting recent advances in the field has been presented in the past ten years. In this paper, we present a review of the most successful CPP methods, focusing on the achievements made in the past decade. Furthermore, we discuss reported field applications of the described CPP methods. This work aims to become a starting point for researchers who are initiating their endeavors in CPP. Likewise, it aims to present a comprehensive review of the recent breakthroughs in the field, providing links to the most interesting and successful works.
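The simplest classical pattern covered by this literature is the boustrophedon ("lawnmower") sweep. A minimal sketch on an obstacle-free grid, omitting the cell decomposition and obstacle handling that real CPP methods add:

```python
# Boustrophedon sweep: visit every cell of a rows x cols grid in serpentine
# order, so consecutive cells are always adjacent. Obstacle handling and
# cell decomposition are deliberately omitted from this toy version.

def boustrophedon(rows, cols):
    path = []
    for r in range(rows):
        # Alternate sweep direction on each row
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cs:
            path.append((r, c))
    return path

path = boustrophedon(3, 4)
print(path[:6])  # [(0, 0), (0, 1), (0, 2), (0, 3), (1, 3), (1, 2)]
```

Each cell appears exactly once, which is the defining property of a coverage path; the surveyed methods differ mainly in how they decompose non-trivial environments into regions where such a sweep is possible.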
Resumo:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Resumo:
This work presents a new method for sensor deployment on 3D surfaces. The method is structured in several steps. The first discretizes the relief of interest with the Delaunay algorithm. The tetrahedra and their associated values (the spatial coordinates of each vertex and the faces) were the input for constructing the 3D Voronoi diagram. Each circumcenter was taken as a candidate position for a sensor node, and the corresponding circular coverage area was calculated based on a radius r. The value of r can be adjusted to simulate different kinds of sensors. The Dijkstra algorithm and a selection method were applied to eliminate candidate positions with overlapping coverage areas or lying beyond the surface of interest. Performance evaluation measures were defined using coverage area and communication as criteria. The results were relevant, since the mean coverage rates achieved on three different surfaces were between 91% and 100%.
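The elimination step described above can be sketched as a greedy filter: keep a candidate only if its coverage disc of radius r does not overlap a disc already kept. This is only the selection idea; the Delaunay/Voronoi candidate generation and the Dijkstra-based surface test are omitted, and the candidate coordinates are invented:

```python
import numpy as np

def select_positions(candidates, r):
    # Greedy selection: two coverage discs of radius r overlap exactly when
    # their centers are closer than 2*r, so such candidates are discarded.
    kept = []
    for p in candidates:
        if all(np.linalg.norm(p - q) >= 2 * r for q in kept):
            kept.append(p)
    return np.array(kept)

# Invented candidate positions (e.g. Voronoi circumcenters projected to 2D)
cands = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [3.5, 0.2]])
print(select_positions(cands, r=1.0))  # keeps [0, 0] and [3, 0]
```

The radius r plays the same role as in the paper: enlarging it simulates longer-range sensors and prunes more candidates.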
Resumo:
Background Access to health care can be described along four dimensions: geographic accessibility, availability, financial accessibility and acceptability. Geographic accessibility measures how physically accessible resources are for the population, while availability reflects what resources are available and in what amount. Combining these two types of measure into a single index provides a measure of geographic (or spatial) coverage, which is an important measure for assessing the degree of accessibility of a health care network. Results This paper describes the latest version of AccessMod, an extension to the Geographical Information System ArcView 3.x, and provides an example of application of this tool. AccessMod 3 allows one to compute geographic coverage to health care using terrain information and population distribution. Four major types of analysis are available in AccessMod: (1) modeling the coverage of catchment areas linked to an existing health facility network based on travel time, to provide a measure of physical accessibility to health care; (2) modeling geographic coverage according to the availability of services; (3) projecting the coverage of a scaling-up of an existing network; (4) providing information for cost effectiveness analysis when little information about the existing network is available. In addition to integrating travelling time, population distribution and the population coverage capacity specific to each health facility in the network, AccessMod can incorporate the influence of landscape components (e.g. topography, river and road networks, vegetation) that impact travelling time to and from facilities. Topographical constraints can be taken into account through an anisotropic analysis that considers the direction of movement. We provide an example of the application of AccessMod in the southern part of Malawi that shows the influences of the landscape constraints and of the modes of transportation on geographic coverage. 
Conclusion By incorporating the demand (population) and the supply (capacities of health care centers), AccessMod provides a unifying tool to efficiently assess the geographic coverage of a network of health care facilities. This tool should be of particular interest to developing countries that have relatively good geographic information on population distribution, terrain, and health facility locations.
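The travel-time catchment analysis at the heart of this approach amounts to a shortest-path search over a cost raster. The sketch below is a rough, isotropic stand-in: Dijkstra over a grid whose cell values are crossing times, keeping cells reachable from a facility within a time budget. AccessMod itself adds anisotropy (slope and direction of movement), landscape layers and facility capacity, none of which are modeled here, and the grid values are invented:

```python
import heapq

def catchment(cost_grid, start, budget):
    # Dijkstra over a raster: cost_grid[r][c] is the time (e.g. minutes)
    # to enter cell (r, c); returns all cells reachable within `budget`.
    rows, cols = len(cost_grid), len(cost_grid[0])
    best = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > best.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + cost_grid[nr][nc]
                if nt <= budget and nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return best  # cell -> travel time from the facility

# Invented cost raster: 9 could be a river cell, 5 rough terrain
grid = [[1, 1, 5],
        [1, 9, 5],
        [1, 1, 1]]
reach = catchment(grid, start=(0, 0), budget=3)
print(sorted(reach))
```

Overlaying such a reachable set with a population raster gives the covered population, which is the geographic coverage measure the abstract describes.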
Resumo:
The EU’s Common Foreign and Security Policy (CFSP) and its accompanying Common Security and Defence Policy (CSDP) missions can be tools used to increase the international profile of the European Union. Nevertheless, CSDP missions garner little news coverage. This article argues that the very nature of the missions themselves makes them poor vehicles for EU promotion for political, institutional, and logistical reasons. By definition, they are conducted in the middle of crises, making news coverage politically sensitive. The very act of reporting could undermine the mission. Institutionally, all CSDP missions are intergovernmental, making press statements slow, overly bureaucratic, and of little interest to journalists. Logistically, the missions are often located in remote, undeveloped parts of the world, making it difficult and expensive for European and international journalists to cover. Moreover, these regions in crisis seldom have a thriving, local free press. Using the Aceh Monitoring Mission (AMM) as a case study, the author concludes that although a mission may do good, CSDP missions cannot fulfil the political function of raising the profile of the EU.
Resumo:
Sensor networks consist of a set of devices that can individually take measurements of a particular environment and exchange information in order to obtain a high-level representation of the activities taking place in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in domains such as surveillance, agriculture, environmental observation, and industrial monitoring. In this thesis we propose several approaches to optimize the spatio-temporal operation of these devices, determining where to place them in the environment and how to control them over time in order to detect the mobile targets of interest. The first novelty is a realistic detection model representing the coverage of a sensor network in its environment. To this end, we propose a probabilistic 3D model of a sensor's detection capability over its surroundings. This model also integrates information about the environment through a visibility evaluation based on the field of view. From this detection model, spatial optimization is performed by searching for the best location and orientation of each sensor in the network. For this purpose, we propose a new algorithm based on gradient descent that compares favorably with other generic black-box optimization methods in terms of terrain coverage, while being computationally more efficient. Once the sensors are placed in the environment, temporal optimization consists in properly covering a group of mobile targets in the environment. First, the future positions of the mobile targets detected by the sensors are predicted.
The prediction is made either using the history of other targets that have crossed the same environment (long-term prediction), or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets are properly covered for some time, according to the predictions. To this end, we propose a heuristic sensor-control method based on the probabilistic trajectory predictions of the targets and on the sensors' probabilistic coverage of the targets. Finally, the proposed spatial and temporal optimization methods were integrated and successfully applied, demonstrating a complete and effective approach for the spatio-temporal optimization of sensor networks.
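The gradient-based placement idea can be illustrated in a toy setting: move one sensor so as to maximize the summed detection probability over a few points of interest, with a Gaussian detection model p(d) = exp(-d²/(2σ²)). The thesis's model is far richer (3D, visibility, orientation, multiple sensors); this sketch, with invented point coordinates, only shows the core gradient-ascent step:

```python
import numpy as np

def coverage(sensor, points, sigma=1.0):
    # Summed probability of detecting each point of interest
    d2 = np.sum((points - sensor) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2 * sigma**2)))

def grad(sensor, points, sigma=1.0):
    # Analytic gradient of `coverage` with respect to the sensor position
    d2 = np.sum((points - sensor) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma**2)) / sigma**2
    return np.sum(w[:, None] * (points - sensor), axis=0)

points = np.array([[1.0, 1.0], [1.5, 0.5], [0.5, 1.5]])  # invented targets
s = np.array([2.0, 2.0])                                 # initial placement
for _ in range(300):
    s = s + 0.2 * grad(s, points)  # gradient ascent on coverage

print(s.round(2))  # converges to the symmetric optimum [1. 1.]
```

With many sensors and a terrain-aware detection model, the same ascent step is applied jointly to all positions and orientations, which is what makes the approach cheaper than generic black-box optimizers.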
Resumo:
The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data shows that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. Reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability by a limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations.
Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g. in residential or office buildings) it is not too costly to over-design; this observation is in agreement with observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g. a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
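The risk-optimization objective described above can be written as C_total(λ) = C_construction(λ) + P_f(λ)·C_failure and minimized over the extra partial safety factor λ. The sketch below is a toy version: the linear construction-cost model and the failure-probability curve are invented for illustration, whereas the paper derives both from structural reliability analysis of the tower:

```python
import math

def total_expected_cost(lam, c0=100.0, growth=8.0, beta0=2.0, c_fail=1e5):
    # C_total(lam) = construction cost + failure probability * failure cost.
    # The models below are invented: heavier design costs linearly more,
    # and P_f falls off like the standard normal tail Phi(-beta0 * lam).
    c_construction = c0 + growth * lam
    p_f = 1.0 - 0.5 * (1.0 + math.erf((beta0 * lam) / math.sqrt(2)))
    return c_construction + p_f * c_fail

# Scan the safety factor over [1.0, 3.0] and pick the cost-minimizing value
lams = [1.0 + 0.05 * i for i in range(41)]
best = min(lams, key=total_expected_cost)
print(round(best, 2))
```

The qualitative behavior matches the paper's conclusion: raising the failure cost c_fail pushes the optimum λ (and hence the optimum reliability) upward, which is why towers at different locations warrant different designs.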
Resumo:
Background: Genome-wide association studies (GWAS) are becoming the approach of choice to identify genetic determinants of complex phenotypes and common diseases. The astonishing amount of generated data and the use of distinct genotyping platforms with variable genomic coverage are still analytical challenges. Imputation algorithms combine information from directly genotyped markers with the haplotypic structure of the population of interest to infer badly genotyped or missing markers, and are considered a near-zero-cost approach to allow the comparison and combination of data generated in different studies. Several reports have stated that imputed markers have an overall acceptable accuracy, but no published report has performed a pairwise comparison of imputed and empirical association statistics for a complete set of GWAS markers. Results: In this report we identified a total of 73 imputed markers that yielded a nominally statistically significant association at P < 10^-5 for type 2 Diabetes Mellitus and compared them with results obtained from empirical allelic frequencies. Interestingly, despite their overall high correlation, association statistics based on imputed frequencies were discordant for 35 of the 73 (47%) associated markers, considerably inflating the type I error rate of imputed markers. We comprehensively tested several quality thresholds, the haplotypic structure underlying imputed markers, and the use of flanking markers as predictors of inaccurate association statistics derived from imputed markers. Conclusions: Our results suggest that association statistics from imputed markers in a specific MAF (Minor Allele Frequency) range, located in weak linkage disequilibrium blocks, or strongly deviating from local patterns of association are prone to inflated false positive association signals.
The present study highlights the potential of imputation procedures and proposes simple procedures for selecting the best imputed markers for follow-up genotyping studies.
Resumo:
Seven food-grade, commercially available lipases were immobilized by covalent binding on a polysiloxane-polyvinyl alcohol (POS-PVA) hybrid composite and screened to mediate reactions of industrial interest. The synthesis of butyl butyrate and the interesterification of tripalmitin with triolein were chosen as model reactions. The highest esterification activity (240.63 µM/g·min) was achieved by Candida rugosa lipase, while the highest interesterification yield (31%, in 72 h) was achieved by the lipase from Rhizopus oryzae, with the production of about 15 mM of the triglycerides C50 and C52. This lipase also showed good performance in butyl butyrate synthesis, with an esterification activity of 171.14 µM/g·min. The results demonstrated the feasibility of using C. rugosa lipase for esterification and R. oryzae lipase for both esterification and interesterification reactions.
Resumo:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug. However, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not exclusive. This work presents a constrained-random simulation-based functional verification methodology where, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we have developed a second tool to generate functional coverage models that fit exactly to the PD-based input space. Both the input stimuli and coverage model enhancements, resulted in a notable testbench efficiency increase, if compared to testbenches with traditional stimulation and coverage scenarios: 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
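The constrained-random flow described above can be caricatured in a few lines: enumerate only the legal part of the input space (the Parameter-Domain idea of removing invalid scenarios up front), sample stimuli from it, and track a coverage model built from the same domains. The packet fields, constraint and bins below are invented; the actual tools generate HDL testbench stimuli, not Python tuples:

```python
import random

random.seed(1)  # deterministic sampling for the example

SIZES = range(1, 65)          # parameter domain: payload size
KINDS = ("read", "write")     # parameter domain: operation kind

def legal(size, kind):
    # Invented constraint: writes larger than 32 are invalid test cases,
    # so they are excluded from the input space rather than generated
    # and discarded later (the PD-based pruning idea).
    return not (kind == "write" and size > 32)

# Pruned input space and the coverage model derived from the same domains
domain = [(s, k) for s in SIZES for k in KINDS if legal(s, k)]
coverage_bins = {("small" if s <= 32 else "large", k) for s, k in domain}

hit = set()
for _ in range(500):          # constrained-random stimulus generation
    s, k = random.choice(domain)
    hit.add(("small" if s <= 32 else "large", k))

print(f"{len(hit)}/{len(coverage_bins)} coverage bins hit")  # 3/3
```

Because the coverage model is derived from the pruned domains, no bin corresponds to an unreachable scenario, which is the source of the simulation-time savings the abstract reports.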
Resumo:
Product Description: An engaging, comprehensive and colourful introduction, Social Psychology is now fully revised and updated in its 4th edition. It remains accessible, involving and clearly structured, exploring key aspects of social psychology. Through its many features and lively approach, Social Psychology will inform and challenge students everywhere and will prove invaluable to anyone with an interest in the field. Social Psychology effectively consolidates European and North American perspectives to provide coverage with a unique global flavour.