969 results for Asset Management Contracts
Abstract:
The research described in this thesis examines the characteristics, benefits, and challenges associated with the implementation of management accounting systems in the field of Corporate Social Responsibility (CSR). Applied to the CSR context, management accounting relates to the identification, elaboration and communication of information about an organization's interactions with society and the environment. Based on this information, firms are able to make decisions to achieve social and environmental objectives and to provide evidence justifying the benefits and costs of such actions. The study begins by focusing on green management and exploring the characteristics of Environmental Management Accounting (EMA) systems within firms. The first chapter reviews the growing body of EMA research and reveals relevant unexplored aspects that need further investigation. The work also emphasizes the importance of developing new theoretical hypotheses and appropriate research designs to tackle new aspects of EMA empirically and to gain understanding of the use of these practices. Subsequently, given the acknowledged importance of control systems in influencing the behaviour of individuals within organizations, the remaining two chapters of the dissertation focus on the functioning of CSR-linked incentives assigned to employees in the form of compensation plans. The second chapter examines the determinants influencing corporate provision of incentives for the attainment of environmental targets. Empirical analysis of a sample of international firms reveals that companies are likely to use green incentives as mechanisms to increase the efficacy of contracting with their employees as well as to respond to social influences. Finally, the third chapter investigates the effectiveness of contracting associated with the use of CSR-linked executive compensation.
Empirical analysis of a sample of US-based companies shows that corporate choice to tie senior executives' pay to CSR targets promotes the firm's CSR performance.
Abstract:
The objective of this study was to carry out a profitability analysis of service contracts for a company in Finland. The purpose was to see how profitable the contracts are and whether anything in them could be changed or developed. The theory part covers the allocation rules of cost accounting, service costs, and both the profitability and the management of services. All service contracts that had been valid for at least the three last accounting periods were included in the study. All direct costs relating to the contracts were collected, and indirect costs were assigned to the contracts. The profitability of the contracts was calculated over three years, and the results were analysed against the key figures the company monitors. Some suggestions for development are given at the end of the study. The study revealed differences between the contracts: some met the company's profitability aims, while others proved less profitable. Many factors were shown to affect the profitability of the service contracts.
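The cost-allocation step described above can be sketched in a few lines. The contract figures and the allocation driver (direct-cost share) below are invented for illustration; the thesis's actual data and allocation rules are not reproduced.

```python
# Hypothetical contracts: direct costs are traced to each contract,
# indirect (overhead) costs are allocated in proportion to a chosen
# driver -- here, each contract's share of total direct cost.
contracts = {
    "A": {"revenue": 120.0, "direct": 70.0},
    "B": {"revenue": 80.0, "direct": 30.0},
}
indirect_pool = 40.0  # overhead to spread across the contracts

total_direct = sum(c["direct"] for c in contracts.values())
for c in contracts.values():
    c["indirect"] = indirect_pool * c["direct"] / total_direct
    c["profit"] = c["revenue"] - c["direct"] - c["indirect"]
    c["margin"] = c["profit"] / c["revenue"]
```

With these assumed numbers, contract B carries less overhead and ends up with the higher margin, which is exactly the kind of difference between contracts the study reports.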
Abstract:
Recent theoretical developments on concession contracts for long-term infrastructure projects under uncertain demand show the benefits of allowing flexible-term contracts rather than fixing a rigid term. This study presents a simulation comparing both alternatives using real data from the oldest Spanish toll motorway. For this purpose, we analyze how well a flexible term would have performed instead of the fixed length actually established. Our results show a huge reduction in the concession term, which would have dramatically decreased both the firm's benefits and the users' overpayment, owing to the internalization of an unexpected traffic increase.
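The fixed-versus-flexible comparison can be illustrated with a toy simulation. All figures below (revenue path, discount rate, revenue target) are hypothetical assumptions for demonstration, not the motorway data used in the study.

```python
# Illustrative sketch: a rigid concession term versus a flexible term
# that ends once an agreed discounted-revenue target has been collected.

def fixed_term_revenue(annual_revenue, term, rate):
    """Discounted revenue collected over a rigid concession term."""
    return sum(annual_revenue[t] / (1 + rate) ** t for t in range(term))

def flexible_term(annual_revenue, target, rate):
    """Years needed until discounted revenue reaches the agreed target."""
    total = 0.0
    for t, rev in enumerate(annual_revenue):
        total += rev / (1 + rate) ** t
        if total >= target:
            return t + 1, total
    return len(annual_revenue), total

# Hypothetical traffic boom: revenue grows faster than forecast.
revenue = [100 * 1.08 ** t for t in range(50)]  # 8% yearly growth
rate = 0.05

rigid = fixed_term_revenue(revenue, 50, rate)          # 50-year rigid term
years, collected = flexible_term(revenue, 2000, rate)  # stop at target 2000
```

Under the unexpectedly high traffic the flexible contract terminates early, capping both the firm's windfall and the users' overpayment, while the rigid 50-year term keeps collecting far beyond the target.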
Abstract:
A study of price-risk management for high-grade steel alloys and their components was conducted. The study focused on metal commodities, with nickel, chromium and molybdenum in a central role; possible hedging instruments and strategies for these metals were also examined. The literature part covers the price formation of Ni, Cr and Mo, the functioning of metal exchanges, and the main hedging instruments for metal commodities. It also covers how micro and macro variables may affect metal prices over both short and longer time periods. The experimental part consists of three sections. In the first, a multiple regression model with seven explanatory variables was constructed to describe the price behaviour of nickel, and its results were compared with information produced by a comparable simple regression model; the long-run mean reversion of the nickel price was also studied. In the second, the theoretical price of the CF8M alloy was studied using nickel, ferro-chrome and ferro-molybdenum as explanatory variables. In the last section, cross-hedging possibilities for the illiquid FeCr metal were studied with five LME futures; this section also covers new information concerning possible forthcoming molybdenum futures contracts. The results confirm that linear regression models based on the assumption of market rationality are not able to reliably describe the price development of the metals at issue. Models that fulfil the assumptions of linear regression may nevertheless contain useful information about statistically significant variables that affect metal prices. According to the experimental part, short-dated futures were found to incorporate the most accurate information about future price movements; however, not even 3M futures were able to predict the market's turning point before the slump. Cross hedging appeared to be a very doubtful risk-management strategy for illiquid metals, because the correlation coefficients were found to be very sensitive to the chosen time span.
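The two techniques named above, a linear regression of a metal price on explanatory variables and the time-span sensitivity of a cross-hedge correlation, can be sketched on synthetic data. Everything below is illustrative; the thesis's seven variables and LME price series are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic explanatory variables (stand-ins for macro/micro drivers).
X = rng.normal(size=(n, 3))
noise = rng.normal(scale=0.5, size=n)
price = 2.0 + X @ np.array([1.5, -0.8, 0.3]) + noise  # toy "nickel price"

# Ordinary least squares with an intercept column (numpy only).
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, price, rcond=None)

# Cross-hedge check: correlation between the illiquid metal's price and a
# liquid future, recomputed over two different time spans.
future = price + rng.normal(scale=0.3, size=n)
corr_full = np.corrcoef(price, future)[0, 1]
corr_short = np.corrcoef(price[:30], future[:30])[0, 1]
# A hedge that looks tight over one span can look different over another,
# which is the sensitivity the study warns about.
```

On this clean synthetic data the fitted coefficients recover the true ones and the correlation is high on both spans; with real, noisy market data the short-span correlation is what becomes unstable.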
Abstract:
The aim of this thesis is to study and understand the theoretical concept of the Metanational corporation and how Web 2.0 technologies can be used to support the theory. The empirical part of the study compares the theory to the case company's current situation. The goal of the theoretical framework is to show how Web 2.0 technologies can be used on the three levels of the Metanational corporation. To do this, knowledge management, and more precisely knowledge transfer, is studied to understand what is needed from Web 2.0 technologies in the different functions and operations of the Metanational corporation. The final synthesis of the theoretical framework is a model in which the Web 2.0 technologies are placed on the levels of the Metanational corporation. The empirical part of the study is based on interviews conducted in the case company, aimed at understanding the current state of the company relative to the theoretical framework. Based on the interviews, the differences between the theoretical concept and the case company are presented and studied. Finally, the study presents the problem areas found, and where the adoption of Web 2.0 tools is seen as beneficial, based on the interviews and the theoretical framework.
Abstract:
The ability of the supplier firm to generate and utilise customer-specific knowledge has attracted increasing attention in the academic literature during the last decade. It has been argued that customer knowledge should be treated as a strategic asset, like any other intangible asset. Yet it has also been shown that managing customer-specific knowledge is challenging in practice, and that many firms are better at acquiring customer knowledge than at making use of it. This study examines customer knowledge processing in the context of key account management in large industrial firms. This focus was chosen because key accounts are demanding and complex: it is not unusual for a single key account relationship to constitute a complex web of relationships between the supplier and the key account, easily leading to the dispersion of customer-specific knowledge within the supplier firm. Although the importance of customer-specific knowledge generation has been widely acknowledged in the literature, surprisingly little attention has been paid to the processes through which firms generate, disseminate and use such knowledge internally to enhance relationships with their major, strategically important key account customers. This thesis consists of two parts. The first comprises a theoretical overview and draws together the main findings of the study, whereas the second consists of five complementary empirical research papers based on survey data gathered from large industrial firms in Finland. The findings suggest that the management of customer knowledge generated about and from key accounts is a three-dimensional process consisting of acquisition, dissemination and utilisation. The results support the conclusion that customer-specific knowledge is a strategic asset, because the supplier's customer knowledge processing activities have a positive effect on its key account performance. Moreover, in examining the determinants of each phase separately, the study identifies a number of intra-organisational factors that facilitate the process in supplier firms. The main contribution of the thesis lies in linking the concept of customer knowledge processing to the previous literature on key account management. Given that this literature is mainly conceptual or case-based, a further contribution is to examine its consequences and determinants using quantitative empirical data.
Abstract:
Intellectual assets have attracted continuous attention in the academic field, as they are vital sources of competitive advantage and organisational performance in the contemporary knowledge-intensive business environment. Intellectual capital measurement is quite thoroughly addressed in the accounting literature. However, the purpose of measurement is to support the management of intellectual assets, and the reciprocal relationship between measurement and management has not been comprehensively considered in the literature. The theoretical motivation for this study arose from this paradox: to maximise the effectiveness of knowledge management, the two initiatives need to be closely integrated. The research approach of this interventionist case study is constructive. The objective is to develop the case organisation's knowledge management and intellectual capital measurement so that they are closely integrated and the measurement supports the management of intellectual assets. The case analysis provides valuable practical considerations about this integration and related issues, as the case company is a knowledge-intensive organisation in which the know-how of the employees is the central competitive asset; the management and measurement of knowledge are therefore essential for its future success. The results suggest that the case organisation is confronting challenges in managing knowledge. To manage knowledge processes appropriately and control the related risks, support from intellectual capital measurement is required. However, challenges in measuring intellectual capital, especially knowledge, could be recognised in the organisation. By reflecting on the knowledge management situation and the constructed strategy map, a new intellectual capital measurement system was developed for the case organisation. The construction of the system and of its indicators contributes to the literature by emphasising the importance of properly considering the organisation's knowledge situation when developing an intellectual capital measurement system.
Abstract:
Data is the most important asset of a company in the information age. Other assets, such as technology, facilities or products, can be copied or reverse-engineered, and employees can be hired away, but data remains unique to every company. As data management topics slowly move from unknown unknowns to known unknowns, tools to evaluate and manage data properly are being developed and refined. Many projects are under way to develop maturity models for evaluating information and data management practices. These maturity models come in many shapes and sizes: from short and concise ones meant for a quick assessment, to complex ones that call for an expert assessment by experienced consultants. In this paper several of them, made not only by external inter-organizational groups and authors but also developed internally at a Major Energy Provider Company (MEPC), are juxtaposed and thoroughly analyzed. Apart from analyzing the available maturity models related to Data Management, this paper also selects the one with the most merit and describes its use in performing a maturity assessment at MEPC. The utility of maturity models is two-fold: descriptive and prescriptive. Besides recording the current maturity of Data Management practices through the assessments, the maturity model is also used to chart the way forward: after the current situation is presented, analysis and recommendations on how to improve it, based on the definitions of the higher maturity levels, are given. Generally, the main trend observed was the widening of the Data Management field to include more business and "soft" areas (as opposed to technical ones) and a shift of focus towards the business value of data, while assuming that the underlying IT systems for managing data are "ideal", that is, left to the purely technical disciplines to design and maintain. This trend is present not only in Data Management but in other technological areas as well, where more and more attention is given to innovative use of technology, while acknowledging that the strategic importance of IT as such is diminishing.
Abstract:
Few people see both the opportunities and the threats that IT legacy presents today. On the one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as business users require. This thesis offers one perspective on dealing with IT legacy: through effective contract management, as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units, and suppliers. It develops a model for assessing the impact of improvements on the contract management process, together with a set of tools and advice for analysis and improvement actions. A case study is used to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved successful, and the thesis presents and discusses all the necessary steps, and the pitfalls to avoid, to achieve breakthrough improvement in IT contract management process performance. Two improvements to the IT legacy contract management process deserve special attention and can easily be copied to any organization. The first addresses diluted contract ownership, which stalls all improvement because people do not know who is responsible for performing the actions. The second is a contract management performance evaluation tool, which can be used for monitoring and for identifying outlying contracts and opportunities for process improvement. The study yields valuable insight into the benefits of applying Lean Six Sigma to improve IT legacy contract management, and into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed. It is concluded that using the data-driven Lean Six Sigma methodology to improve existing IT contract management processes is a significant addition to the existing best practices in contract management.
Abstract:
This thesis considers a set of methods that allow statistical learning algorithms to better handle the sequential nature of financial portfolio management problems. We begin with the general problem of composing learning algorithms that must handle sequential tasks, in particular that of efficiently updating training sets within a sequential-validation framework. We enumerate the desiderata that composition primitives should satisfy and highlight the difficulty of meeting them rigorously and efficiently. We then present a set of algorithms that achieve these objectives, together with a case study of a complex financial decision-making system that uses these techniques. Next, we describe a general method for transforming a non-Markovian sequential decision problem into a supervised learning problem, using a search algorithm based on the K best paths. We address a portfolio-management application in which a learning algorithm is trained to directly optimize a Sharpe ratio (or another non-additive criterion incorporating risk aversion). We illustrate the approach with an extensive experimental study, proposing a neural-network architecture specialized for portfolio management and comparing it with several alternatives. Finally, we introduce a functional representation of time series that allows forecasts to be made over a variable horizon while using an information set revealed progressively. The approach is based on Gaussian processes, which provide a full covariance matrix between all points for which a forecast is requested. This information is put to good use by an algorithm that actively trades price spreads between commodity futures contracts. The proposed approach yields a significant out-of-sample risk-adjusted return, after transaction costs, on a portfolio of 30 assets.
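The idea of optimizing a non-additive criterion such as the Sharpe ratio directly can be sketched with a toy example. The synthetic data and the grid-search stand-in below are assumptions for illustration only; the thesis trains a specialized neural network rather than searching a single weight.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
asset_a = rng.normal(0.05, 1.0, T)  # higher mean, higher volatility
asset_b = rng.normal(0.02, 0.3, T)  # lower mean, lower volatility

def sharpe(returns):
    """Mean over standard deviation: a risk-adjusted, non-additive score."""
    return returns.mean() / returns.std()

# Non-additivity: the Sharpe ratio of a full series is not the sum of the
# Sharpe ratios of its halves, so it cannot be maximized period by period
# the way an additive reward could -- hence the need for direct training.
halves = sharpe(asset_a[:250]) + sharpe(asset_a[250:])
whole = sharpe(asset_a)

# Direct optimization of the mixing weight w in w*a + (1-w)*b.
grid = np.linspace(0.0, 1.0, 101)
scores = [sharpe(w * asset_a + (1 - w) * asset_b) for w in grid]
best_w = grid[int(np.argmax(scores))]
```

A gradient-based learner replaces the grid search in the thesis, but the objective it ascends is this same risk-adjusted ratio rather than a sum of per-period rewards.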
Abstract:
This thesis proposes to introduce a global perspective into the legal treatment of international intermodal transport, one rooted in firms' logistics strategies. The prevailing legal conception clashes with the operational and organizational evolution of transport and results in legal uncertainty. Carriers have had to adapt to the flow-optimization requirements of shippers whose production and distribution methods rest on supply chain management (SCM). This concept is the product of globalization and information technology. The competition induced by globalization and the optimal steering of flows have prompted new strategies from firms seeking a competitive advantage in the market. These strategies rest on cross-functional and inter-organizational integration. Within this global supply chain (or SCM), intermodal transport is crucial. It links and coordinates firms' spatially disaggregated production and distribution networks and meets the requirements of mastering space and time at the lowest cost. The carrier must therefore, on the one hand, integrate transport operations by optimizing movements and, on the other, integrate into the customer's supply chain by offering value-added services that strengthen the competitiveness of the value chain. The result is a technical and economic unity of the intermodal chain that is nonetheless legally fragmented. The international conventions in force were drawn up for each mode of transport separately, ignoring the interaction between modes and between operators. Intermodal transport is treated as a juxtaposition of modes and legal regimes. This legal carving-up contrasts with the management of the intermodal chain, whose individual components fade behind the overall objective to be achieved. The thesis first sets out the extent of the legal uncertainty stemming from the difficulty of delimiting the range of operations covered by the conventions in force. Attention is paid to the divergent interpretations that lead to the "de-unification" of transport law. It then turns to the interactions between transport and shippers' supply chains, retracing the evolution of their production and distribution methods, for it is from logistics strategy that the design of the intermodal chain follows. Starting from this system, the fundamental characteristics of intermodal transport are identified. The thesis dispels the confusion surrounding the qualification of intermodal transport, confusion that underlies the divergent interpretations and the legal uncertainty. It further highlights the economic unity of the intermodal transport contract, which should guide the establishment of a liability regime dedicated to this integrated transport system. Finally, it initiates an approach that legal debates have so far ignored.
Abstract:
In the absence of entry barriers or regulatory restrictions, Non-Banking Financial Companies (NBFCs) grew frantically and accessed public deposits without any regulatory control. The deposits of NBFCs grew from Rs. 41.9 crore in 1971 to Rs. 53,116.0 crore in 1997, the combined effect of an increase in the number of NBFCs and an increase in the amount of deposits. The deposits amassed in this way were invested in various assets, especially motor vehicles, by these asset-financing NBFCs, and various tactics were adopted by the NBFCs and their agents for recovering the receivables outstanding from such assets. Both the central government and the RBI were concerned about protecting depositors' interests, and various committees were set up to frame a comprehensive regulation for the functioning of these NBFCs.
Abstract:
To cut costs, many companies outsource services that do not belong to their core competencies to external service providers. This process is known as outsourcing. The resulting dependencies on the external providers are contractually governed by Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Since the market has produced a multitude of different SLM tools, proprietary SLA formats and missing specification methods cause problems in practice, resulting in tool dependence and limited reusability of already specified SLAs. This thesis develops an approach for platform-independent service level management. The goal is to unify modelling so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. Platform independence also gives the created models a high degree of stability over time. Further goals of the work are to guarantee the reusability of modelled SLAs and to provide a process-oriented modelling methodology; automated establishment of modelled SLAs is of decisive relevance for practical use. To achieve these goals, the principles of Model Driven Architecture (MDA) are applied to the problem domain of service level management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of Service Level Agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of the MDA. A suitable model transformation generates, from an SLA pattern, an SLA instance that contains all necessary configuration information and is already in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of the MDA. Establishing the SLA instances and the resulting configuration of the management system corresponds to the Platform Specific Code (PSC) of the MDA. After this step, the management system is able to monitor the quality-of-service parameters agreed in the SLA on its own. As part of this work, a UML extension was defined that enables SLA patterns to be modelled with a UML tool, either purely graphically or with the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated in a case study.
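The pattern-to-instance transformation described above can be sketched in miniature. The slot names and the XML-like target format below are illustrative assumptions, not the thesis's actual metamodel or tooling.

```python
# An SLA pattern: a configuration-independent template (the MDA "PIM").
# Slots set to None must be bound when the pattern is instantiated.
SLA_PATTERN = {
    "service": None,           # bound at instantiation time
    "metric": "availability",  # quality-of-service parameter to monitor
    "threshold": None,
    "unit": "percent",
}

def instantiate(pattern, **bindings):
    """Model transformation: bind open slots, reject incomplete configs."""
    instance = {**pattern, **bindings}
    missing = [k for k, v in instance.items() if v is None]
    if missing:
        raise ValueError(f"unbound pattern slots: {missing}")
    return instance

def to_platform_format(instance):
    """Render the instance in a (hypothetical) target-platform format,
    playing the role of the platform-specific model (PSM)."""
    return (f'<sla service="{instance["service"]}" '
            f'metric="{instance["metric"]}" '
            f'threshold="{instance["threshold"]}{instance["unit"]}"/>')

sla = instantiate(SLA_PATTERN, service="mail-hosting", threshold=99.5)
rendered = to_platform_format(sla)
```

One pattern can thus be reused across platforms by swapping only the rendering step, which is the tool-independence the thesis aims for.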
Abstract:
This dissertation studies the transmission mechanisms that link the behaviour of agents and firms to the asymmetries present in business cycles. To this end, three DSGE models were built. In the first chapter, the assumption of a symmetric quadratic investment-adjustment cost function is removed, and the canonical RBC model is reformulated under the assumption that disinvesting a unit of physical capital is more costly than investing one. The second chapter presents the most important contribution of this dissertation: the construction of a general utility function that nests loss aversion, risk aversion, and habit formation by means of a smooth transition function. The rationale is that individuals are loss-averse in recessions and risk-averse in booms. In the third chapter, business-cycle asymmetries are analysed together with asymmetric price and wage adjustment in a New Keynesian setting, in order to find a theoretical explanation for the well-documented asymmetry in the Phillips Curve.
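The smooth-transition nesting of loss aversion and risk aversion can be sketched with assumed functional forms (a logistic transition between a kinked value function and CRRA utility). The dissertation's exact specification, including habit formation, is not reproduced here.

```python
import math

def crra(c, gamma=2.0):
    """Standard constant-relative-risk-aversion utility."""
    return c ** (1 - gamma) / (1 - gamma)

def loss_averse(c, reference=1.0, lam=2.25):
    """Kinked value function: losses below the reference weigh lam times."""
    gain = c - reference
    return gain if gain >= 0 else lam * gain

def transition(x, speed=5.0):
    """Logistic weight in [0, 1]: near 0 in recessions (x < 0),
    near 1 in booms (x > 0)."""
    return 1.0 / (1.0 + math.exp(-speed * x))

def utility(c, x):
    """Smoothly blend loss-averse and risk-averse behaviour by the
    cycle indicator x."""
    w = transition(x)
    return (1 - w) * loss_averse(c) + w * crra(c)

# Deep recession (x = -2): the loss-averse component dominates, so a
# shortfall below the reference is penalized more than an equal-sized
# gain is rewarded.
recession_loss = utility(0.9, x=-2.0)
recession_gain = utility(1.1, x=-2.0)
```

The asymmetry between `recession_loss` and `recession_gain` is the kind of agent-level behaviour the dissertation propagates into aggregate business-cycle asymmetries.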