731 results for LEVERAGE


Relevance: 10.00%

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to the electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved. To further uncover why these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., the gender of a patient), or metrics about the timestamps themselves (e.g., the duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize either a purely statistical or a purely visual approach to comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies they were not expecting, but are limited by uncertainty in their findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery.
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. 
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
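The high-volume hypothesis testing (HVHT) idea described above, running many statistical tests across cohort metrics and then deciding which results survive correction for multiple comparisons, can be sketched in a few lines. This is a minimal illustration, not the dissertation's implementation: the metric names and cohort data are hypothetical, a permutation test on mean differences stands in for the battery of tests, and Benjamini-Hochberg FDR control is one common way to handle the multiple-comparison problem the abstract raises.

```python
import random

def permutation_p_value(a, b, n_perm=2000, rng=None):
    """Two-sided permutation test on the difference in means."""
    rng = rng or random.Random(0)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # smoothed p-value

def benjamini_hochberg(p_values, alpha=0.05):
    """Names of hypotheses significant under FDR control at level alpha."""
    ranked = sorted(p_values.items(), key=lambda kv: kv[1])
    m = len(ranked)
    cutoff = 0
    for i, (_, p) in enumerate(ranked, start=1):
        if p <= alpha * i / m:
            cutoff = i  # largest rank passing the BH threshold
    return {name for name, _ in ranked[:cutoff]}

# Hypothetical per-record metrics for two cohorts (e.g., event durations).
rng = random.Random(42)
cohort_a = {"duration": [rng.gauss(5, 1) for _ in range(40)],
            "event_count": [rng.gauss(10, 2) for _ in range(40)]}
cohort_b = {"duration": [rng.gauss(7, 1) for _ in range(40)],  # genuinely shifted
            "event_count": [rng.gauss(10, 2) for _ in range(40)]}

p_values = {m: permutation_p_value(cohort_a[m], cohort_b[m]) for m in cohort_a}
significant = benjamini_hochberg(p_values)
print(significant)
```

In a real HVHT run the dictionary would hold hundreds of metric/attribute combinations, which is precisely why the correction step matters.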

Abstract:

This thesis analyses the performance and efficiency of companies and identifies the key factors that may explain them. A comprehensive analysis based on a set of economic and financial ratios was used as an instrument providing information on enterprise performance and efficiency. A sample of 15 enterprises was selected: 7 Portuguese and 8 Ukrainian, belonging to several industries. Financial and non-financial data were collected for six years, covering the period 2009 to 2014. The research questions that guided this work were: Are the enterprises efficient/profitable? What factors influence the enterprises' efficiency/performance? Is there any difference between Ukrainian and Portuguese enterprises' efficiency/performance, and which factors have more influence? Which industrial sector is represented by the most efficient/profitable enterprises? The main results showed that, on average, the enterprises were efficient; comparing by country, Ukrainian enterprises are more efficient, and industries have similar levels of efficiency. Among the factors influencing ATR, the fixed and current assets turnover ratios and ROA act positively, while EBITDA margin and the liquidity ratio act negatively. There is no significant difference between the models by country. Concerning profitability, the enterprises have a low performance level, but comparing the countries, Ukrainian enterprises have better profitability on average. Regarding industry sectors, the paper industry is the most profitable. Among the factors influencing ROA are profit margin, fixed asset turnover ratio, EBITDA margin, debt-to-equity ratio, and the country. For profitability, the two countries have different models. Ukrainian enterprises are advised to pay attention to short-term debt to total debt, ROA, and the interest coverage ratio in order to become more efficient, and to profit margin and EBITDA margin to improve their performance.
For Portuguese enterprises, improving efficiency calls for observing and improving the fixed assets turnover ratio, current assets turnover ratio, short-term financial debt to total debt, leverage ratio, and EBITDA margin; improving profitability calls for tracking the fixed assets turnover ratio, current assets turnover ratio, debt-to-equity ratio, profit margin, and interest coverage ratio.
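The efficiency and profitability measures the thesis works with are straightforward functions of financial-statement items. A minimal sketch follows; the figures are hypothetical and the exact ratio definitions used in the thesis may differ from these common textbook forms.

```python
def ratios(revenue, ebitda, net_income, total_assets, fixed_assets,
           current_assets, total_debt, short_term_debt, equity,
           interest_expense):
    """Efficiency and profitability ratios of the kind used in the analysis."""
    return {
        "asset_turnover": revenue / total_assets,        # efficiency (ATR)
        "fixed_asset_turnover": revenue / fixed_assets,
        "current_asset_turnover": revenue / current_assets,
        "roa": net_income / total_assets,                # profitability
        "profit_margin": net_income / revenue,
        "ebitda_margin": ebitda / revenue,
        "debt_to_equity": total_debt / equity,
        "short_term_debt_share": short_term_debt / total_debt,
        "interest_coverage": ebitda / interest_expense,
    }

# Hypothetical enterprise (figures in thousands of euros).
r = ratios(revenue=1200, ebitda=240, net_income=90, total_assets=1000,
           fixed_assets=600, current_assets=400, total_debt=500,
           short_term_debt=200, equity=500, interest_expense=30)
print(round(r["asset_turnover"], 2), round(r["roa"], 2))
```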

Abstract:

This study aims to investigate factors that may affect return on equity (ROE). ROE is a gauge of profit-generating efficiency and a strong measure of how well the management of a firm creates value for its shareholders. Firms with higher ROE typically have competitive advantages over their competitors, which translate into superior returns for investors. Therefore, it seems imperative to study the drivers of ROE, particularly the ratios and indicators that may have a considerable impact. The analysis is done on a sample of the 90 largest non-financial companies that are components of the NASDAQ-100 index, and also on industry-sector samples. The ordinary least squares method is used to find the most impactful drivers of ROE. The extended DuPont model's components are considered the primary factors affecting ROE. In addition, other ratios and indicators, such as price-to-earnings, price-to-book, and the current ratio, are also incorporated. Consequently, the study uses eight ratios that are believed to have an impact on ROE. According to our findings, the most relevant ratios determining ROE are tax burden, interest burden, operating margin, asset turnover, and financial leverage (the extended DuPont components), regardless of industry sector.
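The extended (five-factor) DuPont model used as the primary set of regressors decomposes ROE into tax burden, interest burden, operating margin, asset turnover, and financial leverage; their product is an accounting identity equal to net income over equity. A small sketch of the decomposition with hypothetical figures:

```python
def dupont_roe(net_income, pretax_income, ebit, sales, assets, equity):
    """Extended (five-factor) DuPont decomposition of ROE."""
    tax_burden = net_income / pretax_income      # 1 - effective tax rate
    interest_burden = pretax_income / ebit       # cost of debt financing
    operating_margin = ebit / sales
    asset_turnover = sales / assets
    leverage = assets / equity                   # equity multiplier
    components = (tax_burden, interest_burden, operating_margin,
                  asset_turnover, leverage)
    roe = 1.0
    for c in components:
        roe *= c
    return roe, components

# Hypothetical firm: the product of the five components equals NI / equity.
roe, comps = dupont_roe(net_income=80, pretax_income=100, ebit=120,
                        sales=1000, assets=800, equity=400)
assert abs(roe - 80 / 400) < 1e-12   # identity check: ROE = NI / E
print(round(roe, 3))
```

In the study's regression setting, each of the five components becomes an explanatory variable for observed ROE rather than a term in an exact identity.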

Abstract:

Dinoflagellates possess large genomes in which most genes are present in many copies, which has made studies of their genomic organization and phylogenetics challenging. Recent advances in sequencing technology have made deep sequencing of dinoflagellate transcriptomes feasible. This dissertation investigates the genomic organization of dinoflagellates to better understand the challenges of assembling dinoflagellate transcriptomic and genomic data from short-read sequencing methods, and develops new techniques that utilize deep sequencing data to identify orthologous genes across a diverse set of taxa. To better understand the genomic organization of dinoflagellates, a genomic cosmid clone of the tandemly repeated gene Alcohol Dehydrogenase (AHD) was sequenced and analyzed. The organization of this clone was found to run counter to prevailing hypotheses of genomic organization in dinoflagellates. Further, a new non-canonical splicing motif was described that could greatly improve the automated modeling and annotation of genomic data. A custom phylogenetic marker discovery pipeline was written, incorporating methods that leverage the statistical power of large data sets. A case study on Stramenopiles was undertaken to test its utility in resolving relationships among known groups, as well as the phylogenetic affinity of seven unknown taxa. The pipeline generated a set of 373 genes useful as phylogenetic markers that successfully resolved relationships among the major groups of Stramenopiles and placed all unknown taxa on the tree with strong bootstrap support. The pipeline was then used to discover 668 genes useful as phylogenetic markers in dinoflagellates. Phylogenetic analysis of 58 dinoflagellates using this set of markers produced a phylogeny with good support on all branches. The Suessiales were found to be sister to the Peridiniales. The Prorocentrales formed a monophyletic group with the Dinophysiales that was sister to the Gonyaulacales.
The Gymnodiniales were found to be paraphyletic, forming three monophyletic groups. While this pipeline was used to find phylogenetic markers, it will likely also be useful for finding orthologs of interest for other purposes, for the discovery of horizontally transferred genes, and for the separation of sequences in metagenomic data sets.

Abstract:

The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g., photos, identities, genomes) at risk. To protect users' data privacy, there is growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden even from the computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g., secure multiparty computation) and trusted hardware (e.g., secure processors) to instantiate a "secure" abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information either through the computation within the CPU or through the data in the memory. Unfortunately, evidence has shown that side channels (e.g., memory accesses, timing, and termination) in such a "secure" abstract machine may leak highly sensitive information, including the cryptographic keys that form the root of trust for these secure systems. This thesis broadly expands the investigation of a research direction called trace-oblivious computation, in which programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace-oblivious computation by formalizing and building several systems: GhostRider, a hardware-software co-design that provides a hardware-based trace-oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop oblivious applications. All of these systems enjoy formal security guarantees while demonstrating better performance than prior systems, by one to several orders of magnitude.
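The core requirement of trace-oblivious computation, that the observable memory-access trace must not depend on secret inputs, can be illustrated with the simplest possible oblivious read: a full linear scan. This toy sketch is not how GhostRider, SCVM, or ObliVM work (they rely on far more efficient techniques such as ORAM), and Python gives no real constant-time guarantees; it only demonstrates the access-pattern idea.

```python
def oblivious_read(memory, secret_index):
    """Read memory[secret_index] while touching every cell in a fixed
    order, so the access trace reveals nothing about secret_index."""
    result = 0
    for i, value in enumerate(memory):   # trace is always 0, 1, ..., n-1
        match = int(i == secret_index)   # equals 1 exactly once
        # branch-free select: result = value if match else result
        result = match * value + (1 - match) * result
    return result

data = [10, 20, 30, 40]
print(oblivious_read(data, 2))
```

A data-dependent lookup like `memory[secret_index]` leaks the index through the address trace; the scan trades an O(1) access for an O(n) one that leaks nothing, which is exactly the kind of cost that motivates the more sophisticated schemes in the thesis.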

Abstract:

Internationalization and moving to new markets can create many opportunities for small businesses across the globe, but they also present a number of new challenges that may influence a firm's competitive advantage in the global market. The present paper aims to provide an internationalization guide for SMEs from Curaçao. The determinants that can impact internationalization are also discussed. Three widely researched internationalization models form the basis of the paper's theoretical perspective: the traditional Uppsala model, the Network model, and the Linkage, Leverage and Learning model.

Abstract:

This dissertation explores the effect of innovative knowledge transfer across supply chain partners. My research seeks to understand the manner by which a firm is able to benefit from the innovative capabilities of its supply chain partners and utilize the external knowledge they hold to increase its own levels of innovation. Specifically, I make use of patent data as a proxy for firm-level innovation and develop both independent and dependent variables from the data contained within the patent filings. I further examine the means by which key dyadic and portfolio supply chain relationship characteristics moderate the relationship between supplier innovation and buyer innovation. I investigate factors such as the degree of transactional reciprocity between the buyer and supplier, the similarity of the firms’ knowledge bases, and specific chain characteristics (e.g., geographic propinquity) to provide greater understanding of the means by which the transfer of innovative knowledge across firms in a supply chain can be enhanced or inhibited. This dissertation spans three essays to provide insights into the role that supply chain relationships play in affecting a focal firm’s level of innovation. While innovation has been at the core of a wide body of research, very little empirical work exists that considers the role of vertical buyer-supplier relationships on a firm’s ability to develop new and novel innovations. I begin by considering the fundamental unit of analysis within a supply chain, the buyer-supplier dyad. After developing initial insights based on the interactions between singular buyers and suppliers, essay two extends the analysis to consider the full spectrum of a buyer’s supply base by aggregating the individual buyer-supplier dyad level data into firm-supply network level data. 
Through this broader level of analysis, I am able to examine how the relational characteristics between a buyer firm and its supply base affect its ability to leverage the full portfolio of its suppliers’ innovative knowledge. Finally, in essay three I further extend the analysis to explore the means by which a buyer firm can use its suppliers to enhance its ability to access distant knowledge held by other organizations that the buyer is only connected to indirectly through its suppliers.

Abstract:

This paper applies two measures to assess spillovers across markets: the Diebold and Yilmaz (2012) spillover index and the Hafner and Herwartz (2006) analysis of multivariate GARCH models using volatility impulse response analysis. We use two data sets: daily realized volatility estimates for the S&P 500 and the FTSE, taken from the Oxford-Man realized volatility library and running from the beginning of 2000 to October 2016, plus ten years of daily return series for the New York Stock Exchange index and the FTSE 100 index, from 3 January 2005 to 31 January 2015. Both data sets cover the Global Financial Crisis (GFC) and the subsequent European Sovereign Debt Crisis (ESDC). The spillover index captures the transmission of volatility to and from markets, plus net spillovers. The key difference between the measures is that the spillover index captures an average of spillovers over a period, whilst volatility impulse responses (VIRFs) have to be calibrated to conditional volatility estimated at a particular point in time. The VIRFs provide information about the impact of independent shocks on volatility. In the latter analysis, we explore the impact of three different shocks: the onset of the GFC, which we date as 9 August 2007 (GFC1); the point at which the crisis came to a head a year later, on 15 September 2008 (GFC2); and 9 May 2010, associated with the ESDC. Our modelling includes leverage and asymmetric effects in the context of a multivariate GARCH model, analysed using both BEKK and diagonal BEKK (DBEKK) specifications. A key result is that the impact of negative shocks is larger, in terms of the effects on variances and covariances, but shorter in duration, in this case a difference between three and six months.
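The Diebold and Yilmaz (2012) total spillover index is computed from a forecast error variance decomposition (FEVD): after row-normalization, it is the share of forecast error variance attributable to cross-market shocks, averaged over markets. A minimal sketch with a hypothetical two-market FEVD follows; the paper estimates the FEVD from a VAR on volatilities, a step omitted here.

```python
def total_spillover_index(fevd):
    """Diebold-Yilmaz (2012) total spillover index. fevd[i][j] is the
    share of market i's H-step forecast error variance attributable to
    shocks in market j."""
    n = len(fevd)
    # Normalize each row so the variance shares sum to one.
    normalized = [[x / sum(row) for x in row] for row in fevd]
    cross = sum(normalized[i][j]
                for i in range(n) for j in range(n) if i != j)
    return 100.0 * cross / n   # percentage of variance from cross-market shocks

# Hypothetical 2-market decomposition (e.g., S&P 500 and FTSE 100):
# each market explains 80% of its own variance, 20% comes from the other.
fevd = [[0.8, 0.2],
        [0.2, 0.8]]
print(total_spillover_index(fevd))
```

Directional ("to" and "from") and net spillovers mentioned in the abstract are row and column sums of the same normalized off-diagonal entries.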

Abstract:

This study investigates the influence of asset classes and the breakdown of tangibility as determinants of the capital structure of companies listed on the BM&FBOVESPA over the period 2008-2012. Two classes of current assets were composed, grouped by the liquidity considered by financial institutions when granting credit: current resources (cash, banks, and financial investments) and operating resources (inventories and receivables). The breakdown of tangible assets was based on their main components offered as collateral for loans, such as machinery & equipment and land & buildings. To extend the analysis, three leverage metrics (accounting, financial, and market) were applied, and the sample was divided into the economic sectors adopted by the BM&FBOVESPA. A dynamic panel data model estimated by two-step system GMM was used, given its robustness to endogeneity problems and omitted-variable bias. The results suggest that current resources are determinants of capital structure, possibly because they act as proxies for financial solvency, with a positive relationship to debt. The sectoral analysis confirmed the results for current resources. The tangibility of assets is inversely related to leverage. When tangibility is disaggregated into its main components, the significant negative influence of machinery & equipment is most marked in the Industrial Goods sector. This result shows that, on average, the assets most specific to a company's operating activities are associated with less use of third-party resources. As complementary results, leverage was found to be persistent, which is linked with the static trade-off theory. Specifically for financial leverage, persistence is relevant when the model controls for the lagged current asset class variables.
The proxy for growth opportunities, measured by market-to-book, shows a coefficient whose sign contradicts expectations. Company size has a positive relationship with debt, in favor of the static trade-off theory. Profitability is the most consistent variable across all the estimations performed, showing a strong, significant negative relationship with leverage, as the pecking order theory predicts.
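The study's three leverage metrics (accounting, financial, and market) are ratios of balance-sheet and market values. The abstract does not state the exact definitions, so the ones below are common textbook choices, assumed purely for illustration.

```python
def leverage_metrics(total_debt, financial_debt, total_assets, market_equity):
    """Three leverage measures; definitions are assumptions, not the
    study's own:
    accounting: total debt over total assets (book leverage),
    financial:  interest-bearing debt over total assets,
    market:     interest-bearing debt over debt plus market equity."""
    return {
        "accounting": total_debt / total_assets,
        "financial": financial_debt / total_assets,
        "market": financial_debt / (financial_debt + market_equity),
    }

# Hypothetical firm (figures in millions of reais).
m = leverage_metrics(total_debt=600, financial_debt=400,
                     total_assets=1000, market_equity=600)
print(m)
```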

Abstract:

This thesis is a research study, intended for later publication, in the form of a practical guide for those who have an idea and plan to go one step further in creating a brand and/or a business. The main questions addressed concern the principal worries of an entrepreneur, identifying the main topics that an entrepreneur's practical guide must cover. This work aims to provide relevant and important insights for those who want to start a business, taking into consideration best practices and advice from entrepreneurs who have already started their own businesses and shared their experience. It is meant to provide a strong contribution to the Portuguese ecosystem, more specifically to startups, small companies, projects or ideas at the seed or startup stage, brands, clubs, and every initiative starting from nothing, or almost nothing. Apart from books and online research, primary information and testimonials were collected through an online survey of a target audience of entrepreneurs, leading to the main findings of this study. The conclusion of this thesis is the gross index of the future startuper's practical guide, which will be divided into four stages: Preparation, Implementation, Leverage, and Closure.

Abstract:

Part 12: Collaboration Platforms

Abstract:

While fault-tolerant quantum computation might still be years away, analog quantum simulators offer a way to leverage current quantum technologies to study classically intractable quantum systems. Cutting-edge quantum simulators, such as those using ultracold atoms, are beginning to study physics that surpasses what is classically tractable. As the system sizes of these quantum simulators increase, there are concurrent gains in the complexity and types of Hamiltonians that can be simulated. In this work, I describe advances toward the realization of an adaptable, tunable quantum simulator capable of surpassing classical computation. We simulate long-range Ising and XY spin models, which can have globally arbitrary transverse and longitudinal fields in addition to individual transverse fields, using a linear chain of up to 24 171Yb+ ions confined in a linear rf Paul trap. Each qubit is encoded in the ground-state hyperfine levels of an ion. Spin-spin interactions are engineered by the application of spin-dependent forces from laser fields, coupling spin to motion. Each spin can be read out independently using state-dependent fluorescence. The results here add yet more tools to an ever-growing quantum simulation toolbox. One of many challenges has been the coherent manipulation of individual qubits. By using a surprisingly large fourth-order Stark shift in a clock-state qubit, we demonstrate the ability to individually manipulate spins and apply independent Hamiltonian terms, greatly increasing the range of quantum simulations that can be implemented. As quantum systems grow beyond the capability of classical numerics, a constant question is how to verify a quantum simulation. Here, I present measurements which may provide useful metrics for large system sizes and demonstrate them in a system of up to 24 ions during a classically intractable simulation.
The observed values are consistent with extremely large entangled states, as much as ~95% of the system entangled. Finally, we use many of these techniques in order to generate a spin Hamiltonian which fails to thermalize during experimental time scales due to a meta-stable state which is often called prethermal. The observed prethermal state is a new form of prethermalization which arises due to long-range interactions and open boundary conditions, even in the thermodynamic limit. This prethermalization is observed in a system of up to 22 spins. We expect that system sizes can be extended up to 30 spins with only minor upgrades to the current apparatus. These results emphasize that as the technology improves, the techniques and tools developed here can potentially be used to perform simulations which will surpass the capability of even the most sophisticated classical techniques, enabling the study of a whole new regime of quantum many-body physics.
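The long-range Ising interactions engineered by spin-dependent forces in such ion chains are commonly approximated by a power-law coupling, J_ij ~ J0 / |i - j|^alpha with 0 < alpha < 3. A minimal sketch of the coupling matrix and the corresponding classical Ising energy follows; J0, alpha, and the field value are illustrative, not the experiment's parameters.

```python
def ising_couplings(n_ions, j0=1.0, alpha=1.0):
    """Power-law approximation of the long-range couplings generated
    by spin-dependent forces in a trapped-ion chain."""
    return {(i, j): j0 / abs(i - j) ** alpha
            for i in range(n_ions) for j in range(i + 1, n_ions)}

def ising_energy(spins, couplings, b_z=0.0):
    """Classical energy of H = sum_{i<j} J_ij s_i s_j + B_z sum_i s_i,
    with s_i in {+1, -1}."""
    pair = sum(j * spins[i] * spins[k] for (i, k), j in couplings.items())
    field = b_z * sum(spins)
    return pair + field

J = ising_couplings(4, alpha=1.0)
print(ising_energy([1, 1, 1, 1], J))
```

Tuning alpha (via the laser detuning in the experiment) interpolates between nearly all-to-all and short-range interactions, which is one source of the tunability the abstract describes.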

Abstract:

Drawing on a careful literature review and the results of an exploratory study, this article aims to address, discuss, and propose a conception that establishes connections among the different definitions of experiential learning and its role in competence development in the organizational context. The study investigated aspects related to learning and competence development as perceived by university students who were working while completing the final stage of a business administration program. The results reveal the importance of the context in which individuals are embedded for constructing meaning in the learning process, and show how everyday situations can become vehicles in this process. They also indicate that establishing a learning culture that enables competence development requires a clear understanding of the new directions of an educational effort oriented toward learning. As a contribution, the study concludes by proposing a relatively integrated frame of reference for defining learning-in-action and its role in competence development.

Abstract:

Part 9: Innovation Networks

Abstract:

Part 5: Service Orientation in Collaborative Networks