555 results for leverage


Relevance:

10.00%

Publisher:

Abstract:

This project resulted in the development of a proof of concept for a features inventory process to be used by field staff. The resulting concept is adaptable to different asset classes (e.g., culverts, guardrail) and able to leverage existing DOT resources, such as the videolog and LRS, and our current technology platforms, including Oracle and our GIS web infrastructure. The concept examined the feasibility of newly available technologies, such as mobile devices, while balancing ease of use in the field. Implementation and deployment costs were also important considerations in evaluating the success of the project. These project funds allowed the pilot to address the needs of two DOT districts. A report of findings was prepared, including recommendations for a full deployment of field data collection.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this Master's thesis was to examine what kinds of changes a new SAP-based system causes in the procurement processes of a forest industry company and in the work of its buyers. The situation was also examined as a business process re-engineering project. The study was a qualitative case study whose sources were interviews and process descriptions. The procurement processes have been standardized and described in detail, because the system is to be rolled out gradually at all of the company's sites. The theoretical part covered global sourcing, different types of orders, electronic business, and the implementation of redesigned business processes together with the challenges involved. The company aims to develop its procurement function and to exploit the economies of scale its size affords; the new system is a significant part of this development work. Based on the interviews, the new system is welcome and much is expected of it. Implementing the system will be a challenging task, because there are many users and end users will enter more transactions into the system than before, in the form of purchase requisitions and call-offs. The role of mill buyers will change: as routine ordering decreases, they will act as links between the centralized procurement organization and the mill, passing information in both directions.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this thesis is to identify, using linear regression analysis on panel data, the factors affecting the capital structures of Finnish listed companies in 1999-2004. Based on these factors, it is inferred which capital structure theory or theories these companies follow. Capital structure theories can be divided into two classes according to whether or not they aim at an optimal capital structure. The trade-off theory and the related agency theory aim at an optimal capital structure. In the trade-off theory, the capital structure is chosen by weighing the benefits and costs of debt. The agency theory is otherwise similar to the trade-off theory, but it additionally takes into account the agency costs of debt. The pecking order and market timing theories do not aim at an optimal capital structure. In the pecking order theory, financing is chosen according to a hierarchy (internal financing, debt, mezzanine financing, equity). In the market timing theory, the form of financing that is most advantageous to obtain under the prevailing market conditions is chosen. According to the empirical results, leverage depends positively on risk, collateral and intangible assets. Leverage depends negatively on liquidity, stock returns and profitability. Dividends have no effect on leverage. Among industries, industrial goods and services and basic industry exhibit higher leverage ratios than other industries. The results mainly support the pecking order theory and, to some extent, the market timing theory. The other theories receive only minor support.
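
The abstract does not include the estimation itself; purely as an illustration of the kind of panel regression described, the sketch below fits a firm fixed-effects model on synthetic data using the `linearmodels` package. Every variable name, coefficient and data point is invented for the example.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Synthetic panel shaped like the study: 50 listed firms over 1999-2004.
# Variable names, coefficients and data are invented for illustration.
rng = np.random.default_rng(0)
firms = [f"firm{i:02d}" for i in range(50)]
idx = pd.MultiIndex.from_product([firms, range(1999, 2005)],
                                 names=["firm", "year"])
n = len(idx)
df = pd.DataFrame({
    "risk": rng.normal(size=n),
    "collateral": rng.normal(size=n),
    "liquidity": rng.normal(size=n),
    "profitability": rng.normal(size=n),
}, index=idx)
# Leverage built with the signs the thesis reports: positive in risk and
# collateral, negative in liquidity and profitability.
df["leverage"] = (0.30 + 0.10 * df.risk + 0.20 * df.collateral
                  - 0.15 * df.liquidity - 0.25 * df.profitability
                  + rng.normal(scale=0.05, size=n))

# Firm fixed-effects regression with standard errors clustered by firm.
mod = PanelOLS.from_formula(
    "leverage ~ risk + collateral + liquidity + profitability"
    " + EntityEffects", data=df)
print(mod.fit(cov_type="clustered", cluster_entity=True))
```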

Relevance:

10.00%

Publisher:

Abstract:

The aim of this thesis is to identify the factors that affect the yield spread between corporate and government bonds. According to structural credit risk pricing models, the factors affecting credit risk are the firm's leverage, volatility and the risk-free interest rate. The particular aim is to examine how well these theoretical factors explain yield spreads and whether other important explanatory factors exist. Credit default swap quotes are used to measure the yield spreads. The explanatory factors consist of both firm-specific and market-wide variables. Data on credit default swaps and firm-specific variables were collected for a total of 50 firms from euro area countries. The data consist of monthly observations from 01.01.2003 to 31.12.2006. The empirical results show that the factors implied by structural models explain only a small part of the changes in the yield spread over time. On the other hand, these theoretical factors explain the cross-sectional variation of the yield spread considerably better. Factors other than the theoretical ones are able to explain a large part of the variation in the yield spread. The general risk premium in the bond market turned out to be a particularly important explanatory factor. The results indicate that credit risk pricing models need to be developed further so that they take into account not only firm-specific but also market-wide factors.
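
As a hedged illustration of the regression design the abstract describes (changes in the CDS spread regressed on structural factors, then augmented with a market-wide factor), here is a minimal Python sketch on synthetic monthly data; the variable names and effect sizes are assumptions, not the thesis's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic monthly data for one firm, 2003-2006 (48 observations), standing
# in for the CDS and balance-sheet data the thesis collects.
rng = np.random.default_rng(1)
n = 48
df = pd.DataFrame({
    "d_leverage": rng.normal(size=n),     # monthly change in leverage
    "d_volatility": rng.normal(size=n),   # monthly change in equity volatility
    "d_riskfree": rng.normal(size=n),     # monthly change in the risk-free rate
    "bond_premium": rng.normal(size=n),   # market-wide bond risk premium proxy
})
df["d_spread"] = (0.4 * df.d_leverage + 0.3 * df.d_volatility
                  - 0.2 * df.d_riskfree + 0.5 * df.bond_premium
                  + rng.normal(scale=0.3, size=n))

# Structural factors alone versus structural factors plus a market-wide
# factor; comparing the R^2 values mirrors the thesis's main comparison.
narrow = smf.ols("d_spread ~ d_leverage + d_volatility + d_riskfree", df).fit()
wide = smf.ols("d_spread ~ d_leverage + d_volatility + d_riskfree"
               " + bond_premium", df).fit()
print(f"R2, structural only: {narrow.rsquared:.2f}; "
      f"with market factor: {wide.rsquared:.2f}")
```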

Relevance:

10.00%

Publisher:

Abstract:

The three essays constituting this thesis focus on financing and cash management policy. The first essay aims to shed light on why firms issue debt so conservatively. In particular, it examines the effects of shareholder and creditor protection on capital structure choices. It starts by building a contingent claims model where financing policy results from a trade-off between tax benefits, contracting costs and agency costs. In this setup, controlling shareholders can divert part of the firms' cash flows as private benefits at the expense of minority shareholders. In addition, shareholders as a class can behave strategically at the time of default, leading to deviations from the absolute priority rule. The analysis demonstrates that investor protection is a first-order determinant of firms' financing choices and that conflicts of interest between firm claimholders may help explain the level and cross-sectional variation of observed leverage ratios. The second essay focuses on the practical relevance of agency conflicts. Despite the theoretical development of the literature on agency conflicts and firm policy choices, the magnitude of manager-shareholder conflicts is still an open question. This essay proposes a methodology for quantifying these agency conflicts. To do so, it examines the impact of managerial entrenchment on corporate financing decisions. It builds a dynamic contingent claims model in which managers do not act in the best interest of shareholders, but rather pursue private benefits at the expense of shareholders. Managers have discretion over financing and dividend policies. However, shareholders can remove the manager at a cost. The analysis demonstrates that entrenched managers restructure less frequently and issue less debt than is optimal for shareholders. I take the model to the data and use observed financing choices to provide firm-specific estimates of the degree of managerial entrenchment. Using structural econometrics, I find costs of control challenges of 2-7% on average (0.8-5% at the median). The estimates of the agency costs vary with variables that one expects to determine managerial incentives. In addition, these costs are sufficient to resolve the low- and zero-leverage puzzles and explain the time series of observed leverage ratios. Finally, the analysis shows that governance mechanisms significantly affect the value of control and firms' financing decisions. The third essay is concerned with the time trend in corporate cash holdings documented by Bates, Kahle and Stulz (BKS, 2003). BKS find that firms' cash holdings doubled from 10% to 20% over the 1980 to 2005 period. This essay provides an explanation of this phenomenon by examining the effects of product market competition on firms' cash holdings in the presence of financial constraints. It develops a real options model in which cash holdings may be used to cover unexpected operating losses and avoid inefficient closure. The model generates new predictions relating cash holdings to firm and industry characteristics such as the intensity of competition, cash flow volatility, or financing constraints. The empirical examination of the model shows strong support for the model's predictions. In addition, it shows that the time trend in cash holdings documented by BKS can be at least partly attributed to a competition effect.
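
To make the first essay's trade-off intuition concrete, here is a deliberately crude static toy calculation, far simpler than the essay's contingent-claims model: firm value as unlevered value plus a tax shield minus expected default costs, maximized over a leverage grid. All numbers are invented.

```python
import numpy as np

# Toy static trade-off: firm value = unlevered value + tax shield on debt
# - expected default costs, maximized over a leverage grid.
tax_rate, default_cost, V = 0.30, 0.25, 100.0
lev = np.linspace(0.0, 0.9, 91)
p_default = lev ** 3                      # assumed convex default probability
firm_value = V + tax_rate * lev * V - default_cost * p_default * V
print(f"value-maximizing leverage: {lev[np.argmax(firm_value)]:.2f}")
```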

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the determinants of corporate debt in order to empirically test the Pecking Order hypothesis. Corporate debt is measured together with its maturity and across different firm sizes, given the importance of distinguishing their possibly opposing or offsetting effects. The models used for hypothesis testing were estimated with a sample of 1,320 Spanish manufacturing firms provided by the Survey on Business Strategies (Encuesta sobre Estrategias Empresariales, ESEE) for the period 1993-2001. The empirical analysis applies a multivariate logistic regression model, which leads to the conclusion that the Pecking Order theory is the best supported, and also shows that smaller firms face greater difficulties in accessing long-term debt financing.
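
The paper's estimation is not reproduced here; the following is a minimal sketch of a logistic regression of the kind described, run on synthetic firm data with `statsmodels`. The variable names, signs and sample are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm sample standing in for the ESEE data.
rng = np.random.default_rng(2)
n = 1320
df = pd.DataFrame({
    "size": rng.normal(size=n),            # standardized log assets
    "profitability": rng.normal(size=n),
    "growth": rng.normal(size=n),
})
# Pecking-order-style pattern: profitable firms use less debt, and smaller
# firms are less likely to obtain long-term debt.
latent = 0.7 * df["size"] - 0.8 * df.profitability + 0.6 * df.growth
df["uses_lt_debt"] = (latent + rng.logistic(size=n) > 0).astype(int)

res = smf.logit("uses_lt_debt ~ size + profitability + growth", df).fit(disp=0)
print(res.summary())
```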

Relevance:

10.00%

Publisher:

Abstract:

Near-infrared spectroscopy (NIRS) was used to analyse the crude protein content of dried and milled samples of wheat and to discriminate samples according to their stage of growth. A calibration set of 72 samples from three growth stages of wheat (tillering, heading and harvest) and a validation set of 28 samples were collected for this purpose. Principal components analysis (PCA) of the calibration set discriminated groups of samples according to the growth stage of the wheat. Based on these differences, a classification procedure (SIMCA) achieved a very accurate classification of the validation set samples: all of them were correctly classified when both the residual and the leverage were used in the classification criteria. Using only the residuals, all the samples were also correctly classified, except one tillering-stage sample that was assigned to both the tillering and heading stages. Finally, the determination of the crude protein content of these samples was approached in two ways: building a global model for all the growth stages, and building local models for each stage separately. The best prediction results for crude protein were obtained using a global model for samples in the first two growth stages (tillering and heading), and using a local model for the harvest-stage samples.
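
SIMCA implementations vary; as a rough sketch of the idea used here (a per-class PCA model that accepts a sample only if both its residual and its leverage fall within class limits), consider the following Python fragment on synthetic "spectra". The 3-sigma acceptance limits are a simplification of the F-test-based limits proper SIMCA uses.

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy stand-in for the NIR data: three "growth stages", 24 calibration
# spectra each over 100 wavelengths; every number here is synthetic.
rng = np.random.default_rng(3)
wl = np.linspace(0, 4 * np.pi, 100)
classes = {k: np.sin(wl) + shift + rng.normal(scale=0.05, size=(24, 100))
           for k, shift in enumerate([0.0, 0.5, 1.0])}

def leverage(t_new, T_cal):
    # Leverage of one sample relative to the calibration scores T_cal.
    return 1 / len(T_cal) + np.sum(t_new ** 2 / (T_cal ** 2).sum(axis=0))

models = {}
for k, X in classes.items():
    pca = PCA(n_components=3).fit(X)
    T = pca.transform(X)
    q = ((X - pca.inverse_transform(T)) ** 2).sum(axis=1)   # Q residuals
    h = np.array([leverage(t, T) for t in T])
    # Crude 3-sigma limits; proper SIMCA derives limits from an F-test.
    models[k] = (pca, T, q.mean() + 3 * q.std(), h.mean() + 3 * h.std())

def classify(x):
    # A sample may be accepted by more than one class model, as in the abstract.
    hits = []
    for k, (pca, T, q_lim, h_lim) in models.items():
        t = pca.transform(x[None, :])[0]
        q = ((x - pca.inverse_transform(t[None, :])[0]) ** 2).sum()
        if q <= q_lim and leverage(t, T) <= h_lim:
            hits.append(k)
    return hits

print(classify(classes[1][0]))   # expected: [1]
```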

Relevance:

10.00%

Publisher:

Abstract:

Rapid technological development and the competitive pressure brought by internationalization force companies into continuous business process development. Changes in organizational structures and company processes have become common measures. One of the most visible operational reforms has been the implementation of an enterprise resource planning (ERP) system. The structure and development of the ERP system usually cause the greatest difficulties when attempting to build an information system environment that makes business processes transparent. In this study, the alignment of business processes with ERP system processes has been carried out using so-called ERP change tools. These change tools are set up in the companies' information system environment, and they can be used both to correct technical problems and to change the processes themselves. The empirical part of the study uses the case study method at the process development department of Kone Oyj. The aim of the study was to improve the processes of the ERP change tools in order to align and harmonize the business processes with the ERP system. To meet the objectives of the study, process management concepts were used to find improvement proposals for the change tool processes. These concepts involve examining and exploiting the process map, the activities of the process, and the costs of the process; they also include the description of models for continuous improvement and re-engineering of business processes. The description of the ERP system environment, as the second part of the theory, provides a basis for the use of the change tool processes. As a result of the study, it can be stated that the research area is very complex and difficult. Little theory has been written about ERP systems, apart from studies conducted by companies themselves. Nevertheless, improvement proposals and characteristics of a so-called optimal process model were found for the processes examined in the study.

Relevance:

10.00%

Publisher:

Abstract:

Imaging in neuroscience, clinical research and pharmaceutical trials often employs the 3D magnetisation-prepared rapid gradient-echo (MPRAGE) sequence to obtain structural T1-weighted images of the human brain with high spatial resolution. Typical research and clinical routine MPRAGE protocols with ~1 mm isotropic resolution require data acquisition times in the range of 5-10 min and often use only a moderate two-fold acceleration factor for parallel imaging. Recent advances in MRI hardware and acquisition methodology promise improved leverage of the MR signal and more benign artefact properties, in particular when employing increased acceleration factors in clinical routine and research. In this study, we examined four variants of a four-fold-accelerated MPRAGE protocol (2D-GRAPPA, CAIPIRINHA, CAIPIRINHA elliptical, and segmented MPRAGE) and compared clinical readings, basic image quality metrics (SNR, CNR), and automated brain tissue segmentation for morphological assessments of brain structures. The results were benchmarked against a widely used two-fold-accelerated 3T ADNI MPRAGE protocol that served as the reference in this study. 22 healthy subjects (age 20-44 yrs) were imaged with all MPRAGE variants in a single session. An experienced reader rated all images as being of clinically useful image quality. CAIPIRINHA MPRAGE scans were perceived, on average, to be of identical value for reading as those from the reference ADNI-2 protocol. SNR and CNR measurements exhibited the theoretically expected performance at four-fold acceleration. The results of this study demonstrate that the four-fold accelerated protocols introduce systematic biases in the segmentation results of some brain structures compared to the reference ADNI-2 protocol. Furthermore, the results suggest that the increased noise levels in the accelerated protocols play an important role in introducing these biases, at least under the present study conditions.
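
The study does not spell out its SNR/CNR formulas; a minimal sketch using one common ROI-based definition (tissue mean over noise standard deviation, and tissue contrast over noise standard deviation) might look as follows, with all intensities synthetic.

```python
import numpy as np

def snr(tissue, noise):
    # SNR as mean tissue signal over the standard deviation in a noise ROI
    # (one common definition; the study does not state its exact formula).
    return tissue.mean() / noise.std()

def cnr(tissue_a, tissue_b, noise):
    # CNR as the contrast between two tissues (e.g. grey vs. white matter)
    # normalized by the noise level.
    return abs(tissue_a.mean() - tissue_b.mean()) / noise.std()

# Synthetic voxel intensities standing in for ROI samples from an MPRAGE image.
rng = np.random.default_rng(4)
gm = rng.normal(100, 5, 500)   # grey-matter ROI
wm = rng.normal(140, 5, 500)   # white-matter ROI
bg = rng.normal(0, 4, 500)     # background (noise) ROI
print(f"SNR(WM) = {snr(wm, bg):.1f}, CNR(GM/WM) = {cnr(gm, wm, bg):.1f}")
```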

Relevance:

10.00%

Publisher:

Abstract:

The impact of transnational private regulation on labour standards remains in dispute. While studies have provided some limited evidence of positive effects on 'outcome standards' such as wages or occupational health and safety, the literature gives little reason to believe that there has been any significant effect on 'process rights' relating primarily to collective workers' voice and social dialogue. This paper probes this assumption by bringing local contexts and worker agency more fully into the picture. It outlines an analytical framework that emphasizes workers' potential to act collectively for change in the regulatory space surrounding the employment relationship. It argues that while transnational private regulation of labour standards may marginally improve workers' access to regulatory spaces and their capacity to require the inclusion of enterprises in them, it does little to increase union leverage. The findings are based on empirical research conducted in Sub-Saharan Africa.

Relevance:

10.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriad databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterizing the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on the study of deep web sites in English. One can therefore expect that the findings of these surveys are biased, especially given the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Automating the querying and retrieval of data behind search interfaces is therefore desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
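
As a toy illustration of one small step in this pipeline, pairing visible labels with form fields, consider the sketch below. It is not the I-Crawler's code: real search interfaces are far messier, and the thesis's system also handles JavaScript-rich and non-HTML forms, which this fragment does not.

```python
from bs4 import BeautifulSoup

# Minimal, hypothetical search interface used only for this example.
html = """
<form action="/search" method="get">
  <label for="q">Title keywords</label><input type="text" name="q" id="q">
  <label for="yr">Year</label>
  <select name="yr" id="yr"><option>2006</option><option>2007</option></select>
</form>
"""

soup = BeautifulSoup(html, "html.parser")
form = soup.find("form")
fields = {}
for label in form.find_all("label"):
    # Pair each <label> with the field whose id matches its 'for' attribute.
    target = form.find(id=label.get("for"))
    if target is not None:
        fields[label.get_text(strip=True)] = (target.name, target.get("name"))

print(fields)
# {'Title keywords': ('input', 'q'), 'Year': ('select', 'yr')}
```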

Relevance:

10.00%

Publisher:

Abstract:

Determining the appropriate level of integration is crucial to realizing value from acquisitions. Most prior research assumes that higher integration implies the removal of autonomy from target managers, which in turn undermines the functioning of the target firm if it entails elements unfamiliar to the acquirer. Using a survey of 86 acquisitions to obtain the richness of detail necessary to distinguish integration from autonomy, the authors argue and find that integration and autonomy are not opposite ends of a single continuum. Certain conditions (e.g., when complementarity rather than similarity is the primary source of synergy) lead to high levels of both integration and autonomy. In addition, similarity negatively moderates the relationship between complementarity and autonomy when the target offers both synergy sources. In contrast, similarity does not moderate the link between complementarity and integration. The authors' findings advance scholarly understanding of the drivers of implementation strategy and, in particular, of the different implementation strategies acquiring managers deploy when they attempt to leverage complementarities, similarities, or both.
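
The paper's moderation finding is the kind of result typically tested with an interaction term; as a hedged sketch (not the authors' actual model or data), a regression of autonomy on complementarity, similarity, and their product might look as follows.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the 86-acquisition survey: similarity dampens the
# complementarity-autonomy link, as the paper reports; numbers are invented.
rng = np.random.default_rng(6)
n = 86
df = pd.DataFrame({"complementarity": rng.normal(size=n),
                   "similarity": rng.normal(size=n)})
df["autonomy"] = (0.5 * df.complementarity
                  - 0.3 * df.complementarity * df.similarity
                  + rng.normal(scale=0.5, size=n))

# The '*' expands to both main effects plus their interaction; the negative
# interaction coefficient carries the moderation effect.
res = smf.ols("autonomy ~ complementarity * similarity", df).fit()
print(res.params)
```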

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work is to present a tutorial on multivariate calibration, a tool that is nowadays necessary in most laboratories but very often misused. The basic concepts of preprocessing, principal component analysis (PCA), principal component regression (PCR) and partial least squares (PLS) are given. The two basic steps of any calibration procedure, model building and validation, are fully discussed. The concepts of cross-validation (to determine the number of factors to be used in the model), leverage and studentized residuals (to detect outliers) for the validation step are given. The whole calibration procedure is illustrated using spectra recorded for ternary mixtures of 2,4,6-trinitrophenolate, 2,4-dinitrophenolate and 2,5-dinitrophenolate, followed by the prediction of the concentrations of these three chemical species during a diffusion experiment through a hydrophobic liquid membrane. MATLAB software is used for the numerical calculations. Most of the commands for the analysis are provided in order to allow a non-specialist to follow the analysis step by step.
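
The tutorial itself provides MATLAB commands; for readers without MATLAB, here is a rough Python/scikit-learn sketch of the same workflow, cross-validated selection of the number of factors followed by leverage-based outlier screening, on synthetic ternary-mixture spectra (Beer's-law-style data, invented for the example).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for the ternary-mixture spectra: 30 samples,
# 50 wavelengths, 3 analyte concentrations (Beer's-law-style mixing).
rng = np.random.default_rng(5)
C = rng.uniform(0, 1, size=(30, 3))                  # concentrations
S = rng.uniform(0, 1, size=(3, 50))                  # pure-component "spectra"
X = C @ S + rng.normal(scale=0.01, size=(30, 50))    # mixture spectra

# Step 1: choose the number of factors by cross-validation (RMSECV).
for a in range(1, 8):
    pred = cross_val_predict(PLSRegression(n_components=a), X, C, cv=10)
    print(f"{a} factors: RMSECV = {np.sqrt(((pred - C) ** 2).mean()):.4f}")

# Step 2: fit the final model and use leverage to flag potential outliers.
pls = PLSRegression(n_components=3).fit(X, C)
T = pls.transform(X)                                 # X-scores
lev = 1 / len(X) + np.sum(T ** 2 / (T ** 2).sum(axis=0), axis=1)
print("High-leverage samples:", np.where(lev > 3 * lev.mean())[0])
```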

Relevance:

10.00%

Publisher:

Abstract:

The question of why some firms perform better in managing their alliances has raised interest among scholars and managers. Whereas inter-firm factors influencing alliance performance, such as strategic fit between partners and the existence of complementarities, have been studied extensively, research on firm-level antecedents is rather scarce. This study therefore investigates the role of a firm's alliance capability in the alliance success equation. In particular, it analyses the specialized mechanisms and processes a firm sets up to facilitate the leverage of alliance-related know-how across the organization. Evidence from a cross-industry sample of R&D-intensive Finnish companies supports the view that firms which have invested in institutionalizing alliance capabilities outperform their counterparts in alliance portfolio management. The results also suggest that firms need to adjust their alliance management tools depending on the alliance portfolio size, prior experience with inter-firm partnerships and the strategic importance of alliances. Furthermore, absorptive capacity is found to be crucial for successful alliance management, its role being complementary to that of alliance capability. Finally, firms that have successful alliances also enjoy higher financial, market and innovation performance.