868 results for ostoprosessi, ajankäyttö, time-based management
Abstract:
This Master's thesis illustrates how growing a business ties up the company's working capital and what the cost of the committed capital is. To manage a company's working capital during a phase of rapid business growth, the thesis suggests that by monitoring and managing the operating and cash conversion cycles of customers' projects, a company can find ways to secure the required amount of capital. The research method of this thesis was based on literature reviews and case study research. The theoretical review presents the concepts of working capital and provides the background for understanding how to improve working capital management. The company in question is a global small and medium-sized enterprise that manufactures pumps and valves for demanding process conditions. The company is expanding, which creates many challenges. This thesis concentrates on the company's working capital management and its efficiency from the supply chain and value chain perspective. The main elements of working capital management are inventory management, accounts receivable management and accounts payable management. Prepayments also play a significant role, particularly in project-based businesses. Developing a company's working capital management requires knowledge of different key operations in the company, such as purchasing, production, sales, logistics and financing. The perspective taken to develop and describe working capital management is operational. After the literature review, the thesis presents pilot projects that formed the basis of a model to monitor working capital in the case company. Based on the analysis and pilot projects, the thesis introduces a rough model for monitoring capital commitments over a short time period. With the model, the company can monitor and manage its customer projects more efficiently.
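The cash conversion cycle mentioned above is commonly computed from days inventory outstanding, days sales outstanding and days payables outstanding. The abstract does not give the case company's own model, so the sketch below uses the standard textbook definition with purely hypothetical figures.

    # Minimal sketch of a cash conversion cycle (CCC) calculation using the standard
    # textbook definition; all input figures below are hypothetical, not the case company's.
    def days_inventory_outstanding(inventory, cogs, days=365):
        return inventory / cogs * days

    def days_sales_outstanding(receivables, revenue, days=365):
        return receivables / revenue * days

    def days_payables_outstanding(payables, cogs, days=365):
        return payables / cogs * days

    def cash_conversion_cycle(inventory, receivables, payables, revenue, cogs):
        # CCC = DIO + DSO - DPO; a longer cycle ties up more working capital.
        return (days_inventory_outstanding(inventory, cogs)
                + days_sales_outstanding(receivables, revenue)
                - days_payables_outstanding(payables, cogs))

    # Example with hypothetical project-level figures (in EUR):
    print(cash_conversion_cycle(inventory=1_200_000, receivables=900_000,
                                payables=600_000, revenue=5_000_000, cogs=3_500_000))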
Abstract:
This thesis examines the innovation development needs of firms in a remote rural region. The perspective of the study is strategic innovation management and three dimensions of innovation development: innovation environment, value delivery and innovation capability. The framework is studied with a theoretical and methodological approach in the context of developing a regional innovation system and defining innovation development needs. The thesis is based on the existing innovation management literature, expanding it by examining the features of the three dimensions. The empirical data of the study comprise 50 purposefully selected firms within the region of Pielinen Karelia, located in Eastern Finland. Most of the firms (70%) included in the study are manufacturing firms, and over 90% are small and medium-sized enterprises. The research data consist of two questionnaires and an interview, carried out during 2011 in connection with a regional development project. The point of view of the research is regional development and harnessing the innovation capability of the firms within the region. The principal research approach applies soft systems methodology. The study explores the means to foster the innovativeness of firms from the viewpoints of innovation environment, innovation capability and value delivery. In closer detail, the study examines relations between the innovation capability factors and differences in innovation development needs within the value delivery system, between sectors and between firm size categories. The thesis offers three major contributions. First, the study extends earlier research on strategic innovation management by connecting the frameworks of innovation capability, innovation environment and the value delivery process to the definition of innovation development needs at the regional level. The results deepen knowledge especially concerning practice-based innovation, peripheral regions and smaller firms. Second, the empirical work, based on a case study, confirms the existence of a structural connection integrating five factors of innovation capability. Statistical evidence is provided especially for the positive impacts of improving absorption capability, marketing capability and networking capability, which are the main weaknesses of the firms according to the study. Third, the research provides a methodological contribution by applying the innovation matrix in defining the innovation development needs of firms. The study demonstrates how the matrix improves the ability to target policy instruments and innovation services more efficiently by indicating significant differences between innovation support needs across time horizons and phases of the innovation process.
Abstract:
The aim of this study is to test the accrual-based model suggested by Dechow et al. (1995) in order to detect and compare earnings management practices in Finnish and French companies. The impact of the 2008 financial crisis on earnings management behaviour in these countries is also tested by dividing the whole period of 2003-2012 into two sub-periods: pre-crisis (2003-2008) and post-crisis (2009-2012). The results support the idea that companies in both countries engage in significant earnings management. During the post-crisis period, companies in Finland show income-inflating practices, while in France the opposite tendency (income deflating) is observed during the same period. Results for the assumption that managers in highly concentrated companies engage in income-enhancing practices differ between the two countries: while in Finland managers try to show better performance for bonuses or other contractual compensation motivations, in France they avoid paying dividends or high taxes.
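Dechow et al. (1995) propose the modified Jones model, in which non-discretionary accruals are estimated by regressing scaled total accruals on the inverse of lagged assets, the change in revenue adjusted for receivables, and gross property, plant and equipment; the residual is the discretionary accrual used as the earnings management proxy. The sketch below, with invented input arrays, shows one way such an estimation could be set up; it is an illustration of the general approach, not the study's actual code.

    import numpy as np

    # Minimal sketch of a modified Jones (Dechow et al., 1995) estimation.
    # Inputs are hypothetical arrays, each already scaled by lagged total assets.
    def discretionary_accruals(total_accruals, inv_lagged_assets,
                               d_revenue_less_receivables, ppe):
        # Regress TA/A on 1/A, (dREV - dREC)/A and PPE/A (no intercept).
        X = np.column_stack([inv_lagged_assets, d_revenue_less_receivables, ppe])
        coefs, *_ = np.linalg.lstsq(X, total_accruals, rcond=None)
        non_discretionary = X @ coefs
        return total_accruals - non_discretionary   # residual = discretionary accruals

    # Tiny synthetic example (all values made up):
    rng = np.random.default_rng(0)
    n = 100
    inv_a = rng.uniform(1e-4, 1e-3, n)
    d_rev = rng.normal(0.05, 0.02, n)
    ppe = rng.uniform(0.2, 0.6, n)
    ta = -0.03 * d_rev + 0.02 * ppe + rng.normal(0, 0.01, n)
    print(discretionary_accruals(ta, inv_a, d_rev, ppe)[:5])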
Abstract:
To meet the federal government's requirements regarding wait times for knee and hip replacement surgery, Canadian institutions have adopted wait-list management strategies with varying levels of success. Our research question sought to understand which factors made it possible to sustain, over time, a wait time meeting the federal government's requirements for at least 6-12 months. We developed a model with four factors, inspired by Parsons' (1977) model, to analyse governance, culture, resources and tools. Three case studies were conducted. In short, the first case was able to meet the requirements for six months but unable to sustain them, the second case was able to sustain the requirements for more than 18 months, and the third case was unable to reach the targets. Documents were collected and interviews were conducted with the people involved in the strategy. The results indicate that the hospital that was able to sustain the wait time has certain characteristics: it performs hip and knee replacement surgery exclusively, and it has motivated staff who are not distracted by other concerns and who show a strong team spirit. The other two cases had to deal with a medical culture that was less homogeneous and less focused on reaching the targets, with dispersed resources and an imprecise intra-institutional policy. The "factory hospital" model is of interest for highly specialised surgery. However, patients are selected for simple surgeries with a low risk of complications; it therefore cannot be retained as the sustainable model par excellence.
Abstract:
This study is concerned with Autoregressive Moving Average (ARMA) models of time series. ARMA models form a subclass of the class of general linear models that represent stationary time series, a phenomenon encountered most often in practice by engineers, scientists and economists. It is always desirable to employ models that use parameters parsimoniously. Parsimony is achieved by ARMA models because they have only a finite number of parameters. Even though the discussion is primarily concerned with stationary time series, we later take up the case of homogeneous non-stationary time series, which can be transformed into stationary time series. Time series models, obtained with the help of present and past data, are used for forecasting future values. The physical sciences as well as the social sciences benefit from forecasting models. The role of forecasting cuts across all fields of management (finance, marketing, production, business economics) as well as signal processing, communications engineering, chemical processes, electronics, etc. This wide applicability of time series is the motivation for this study.
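As a minimal illustration of the kind of model discussed above, the sketch below fits an ARMA(1, 1) model to a synthetic stationary series and forecasts a few steps ahead. The series, the chosen orders and the use of the statsmodels library are assumptions for illustration only, not taken from the study.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic stationary AR(1)-like series, for illustration only.
    rng = np.random.default_rng(0)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.6 * y[t - 1] + rng.normal()

    # ARMA(1, 1) is ARIMA with d = 0 (no differencing needed for a stationary series).
    model = ARIMA(y, order=(1, 0, 1)).fit()
    print(model.params)             # estimated AR and MA coefficients
    print(model.forecast(steps=5))  # forecasts of future values from present and past data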
Abstract:
Soil fertility constraints to crop production have been recognized widely as a major obstacle to food security and agro-ecosystem sustainability in sub-Saharan West Africa. As such, they have led to a multitude of research projects and policy debates on how best they should be overcome. Conclusions, based on long-term multi-site experiments, are lacking with respect to a regional assessment of phosphorus and nitrogen fertilizer effects, surface mulched crop residues, and legume rotations on total dry matter of cereals in this region. A mixed model time-trend analysis was used to investigate the effects of four nitrogen and phosphorus rates, annually applied crop residue dry matter at 500 and 2000 kg ha^-1, and cereal-legume rotation versus continuous cereal cropping on the total dry matter of cereals and legumes. The multi-factorial experiment was conducted over four years at eight locations, with annual rainfall ranging from 510 to 1300 mm, in Niger, Burkina Faso, and Togo. With the exception of phosphorus, treatment effects on legume growth were marginal. At most locations, except for typical Sudanian sites with very low base saturation and high rainfall, phosphorus effects on cereal total dry matter were much lower with rock phosphate than with soluble phosphorus, unless the rock phosphate was combined with an annual seed-placement of 4 kg ha^-1 phosphorus. Across all other treatments, nitrogen effects were negligible at 500 mm annual rainfall but at 900 mm, the highest nitrogen rate led to total dry matter increases of up to 77% and, at 1300 mm, to 183%. Mulch-induced increases in cereal total dry matter were larger with lower base saturation, reaching 45% on typical acid sandy Sahelian soils. Legume rotation effects tended to increase over time but were strongly species-dependent.
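The mixed-model time-trend analysis referred to above is not specified in detail in the abstract. Purely as an illustration of the general approach for a multi-site factorial trial, the sketch below fits a linear mixed model with a random intercept per location and a treatment-by-year interaction on synthetic data; the variable names, rates and formula are hypothetical and do not reproduce the study's analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical multi-site factorial data: 8 locations, 4 years, N and P rate treatments.
    rng = np.random.default_rng(1)
    rows = []
    for site in [f"site{i}" for i in range(8)]:
        site_effect = rng.normal(0, 200)              # random intercept for each location
        for year in [1, 2, 3, 4]:
            for n_rate in [0, 15, 30, 45]:            # hypothetical N rates, kg ha^-1
                for p_rate in [0, 13]:                # hypothetical P rates, kg ha^-1
                    tdm = (2000 + site_effect + 25 * n_rate + 40 * p_rate
                           + 5 * year * n_rate + rng.normal(0, 150))
                    rows.append(dict(site=site, year=year, n_rate=n_rate,
                                     p_rate=p_rate, tdm=tdm))
    df = pd.DataFrame(rows)

    # Random intercept per site; the n_rate:year interaction captures a treatment time trend.
    result = smf.mixedlm("tdm ~ n_rate * year + p_rate", data=df, groups=df["site"]).fit()
    print(result.summary())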
Abstract:
The increasing interconnection of information and communication systems leads to a further increase in complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusion attempts into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyse information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognise new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behaviour (NNB) and an update model. In OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analysed and transformed into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased through novel approaches for initialising the weight vectors and through strengthening the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously updated. In addition, the main task of the NNB model is the further examination of the unknown connections detected by the EGHSOM and the verification of whether they are normal. However, network traffic data change constantly because of the concept drift phenomenon, which leads in real time to the generation of non-stationary network data. This phenomenon is better controlled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts optimally to changes in the network data. In the experimental investigations the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was evaluated with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully transformed the enormous amount of network data into structured connection vectors, and the adaptive classifier classified them precisely. The comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all other approaches. This can be attributed to the following key points: processing of the collected network data, achievement of the best performance (such as overall accuracy), detection of unknown connections, and development of a real-time intrusion detection model.
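The classification-confidence margin threshold described above flags connections whose best-matching map unit lies too far away as unknown. The dissertation's EGHSOM implementation is not reproduced here; the minimal numpy sketch below, with hypothetical map weights, labels and threshold, only illustrates the underlying idea of thresholding the distance to the best-matching unit.

    import numpy as np

    # Minimal sketch: flag a connection vector as "unknown" when its distance to the
    # best-matching unit (BMU) of a trained SOM exceeds a confidence-margin threshold.
    # The map weights, labels and threshold below are hypothetical placeholders.
    def classify_connection(x, som_weights, unit_labels, margin_threshold):
        distances = np.linalg.norm(som_weights - x, axis=1)   # distance to every map unit
        bmu = int(np.argmin(distances))
        if distances[bmu] > margin_threshold:
            return "unknown"            # candidate anomaly, passed on for further checking
        return unit_labels[bmu]         # label of the best-matching unit (e.g. normal/attack)

    # Hypothetical 4-unit map over 3-dimensional connection vectors:
    weights = np.array([[0.1, 0.2, 0.0], [0.9, 0.8, 1.0],
                        [0.5, 0.5, 0.5], [0.0, 1.0, 0.3]])
    labels = ["normal", "attack", "normal", "attack"]
    print(classify_connection(np.array([0.12, 0.18, 0.05]), weights, labels,
                              margin_threshold=0.3))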
Abstract:
The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks by avoiding interface problems among tools, data flow and management. The approach is intended to be useful to both control and process engineers by assisting them in their tasks. The use of AI technologies to diagnose and supervise control loops and, of course, to assist process supervisory tasks such as fault detection and diagnosis is within the scope of this work. Special effort has been put into the integration of tools for assisting expert supervisory system design. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems (CASSD) framework. In this sense, some basic facilities are required to be available in this proposed framework:
Abstract:
Planning a project with proper consideration of all necessary factors, and managing it to ensure its successful implementation, faces many challenges. The initial stage of planning a project for bidding is costly and time consuming, and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information from previous projects may be buried in piles of archived documents, which makes it increasingly difficult to learn from previous experience. Project portfolio management has been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper, we report a recently developed software system, COBRA, which automatically generates a project plan with effort estimation of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System - COBRA) Keywords: project management, product-based planning, best practice, PRINCE2
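COBRA's actual estimation procedure is not detailed in the abstract. Purely as a loosely related illustration of estimating effort from previously completed projects, the sketch below (with invented project data) averages the effort of the most similar past projects; it is not COBRA's algorithm.

    # Hypothetical sketch of analogy-based effort estimation from completed projects.
    from math import dist

    past_projects = [
        # (feature vector: [size, duration_months, team_size], effort in person-days)
        ([10.0, 6.0, 4.0], 520.0),
        ([25.0, 12.0, 8.0], 1900.0),
        ([8.0, 5.0, 3.0], 430.0),
    ]

    def estimate_effort(new_features, k=2):
        # Average the effort of the k most similar past projects (Euclidean distance).
        ranked = sorted(past_projects, key=lambda p: dist(p[0], new_features))
        nearest = ranked[:k]
        return sum(effort for _, effort in nearest) / len(nearest)

    print(estimate_effort([9.0, 6.0, 4.0]))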
Abstract:
The term commercial management has been used for some time, as has the job title commercial manager. However, as yet, little emphasis has been placed on defining either. This paper presents the findings from a two-year research initiative that has compared and contrasted the role of commercial managers across a range of organisations and industry sectors, as a first step in developing a body of knowledge for commercial management. It is argued that there are compelling reasons for considering commercial management not solely as a task undertaken by commercial managers, but as a discipline in itself: a discipline that, arguably, bridges traditional project management and organisational theories. While the study has established differences in approach and application both between and within industry sectors, it has found sufficient similarity and synergy in practice to identify a specific role for commercial management in project-based organisations. The similarities encompass contract management and dispute resolution; the divergences include a greater involvement in financial and value management in construction and in bid management in defence/aerospace.
Abstract:
Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention, compared with simple feedback, in reducing the proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) computer-generated feedback; or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation of the cost per error avoided will be carried out from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or, conversely, prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm, compared with an 11% reduction in the simple feedback arm. Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
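The sample size statement above depends on baseline error rates and clustering assumptions that the abstract does not report. The sketch below shows the general shape of such a calculation, with a clearly hypothetical baseline rate, cluster size and intra-cluster correlation; it is an illustration of the method, not the trial's actual computation.

    # Hypothetical sketch of a cluster-trial sample size check; baseline error rate,
    # cluster size and ICC are illustrative assumptions, not the trial's values.
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    baseline = 0.10                     # assumed baseline proportion of at-risk patients
    p_intervention = baseline * 0.50    # 50% relative reduction
    p_control = baseline * 0.89         # 11% relative reduction

    effect = proportion_effectsize(p_intervention, p_control)
    n_individual = NormalIndPower().solve_power(effect_size=effect, alpha=0.05, power=0.80)

    cluster_size, icc = 50, 0.05        # assumed patients per practice, intra-cluster correlation
    design_effect = 1 + (cluster_size - 1) * icc
    practices_per_arm = (n_individual * design_effect) / cluster_size
    print(round(practices_per_arm))     # practices needed per arm under these assumptions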
Abstract:
Construction planning plays a fundamental role in construction project management and requires teamwork among planners from a diverse range of disciplines, often in geographically dispersed working situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork owing to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution to enable a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning and can yield a robust construction plan through collaborative teamwork. The realization of this solution triggers further considerations about its enhancement for wider groupware applications.
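The paper's actual data exchange protocol is not described in the abstract. Purely to illustrate the kind of message that real-time collaborative 4D planning groupware might exchange, the sketch below defines a hypothetical plan-update message serialised as JSON; the field names and structure are assumptions, not the protocol proposed in the paper.

    # Hypothetical plan-update message for real-time collaborative 4D planning.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class PlanUpdate:
        planner_id: str         # who made the change
        activity_id: str        # scheduled activity being edited
        linked_model_ids: list  # 3D model components linked to the activity (the "4D" link)
        start: str              # new planned start date (ISO 8601)
        finish: str             # new planned finish date (ISO 8601)

    def encode(update: PlanUpdate) -> str:
        # Serialise to JSON so a groupware server could broadcast it to all connected planners.
        return json.dumps(asdict(update))

    msg = PlanUpdate("planner-A", "ACT-120", ["wall-03", "slab-07"],
                     "2024-05-01", "2024-05-14")
    print(encode(msg))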
Abstract:
A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
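The merging step described above amounts to redistributing each coarse 0.25° rainfall value over the 1-km cells it contains, in proportion to the fine-scale climatology. The numpy sketch below illustrates such a proportional, mean-preserving disaggregation for a single coarse cell with hypothetical numbers; it is an illustration of the idea, not the authors' exact procedure.

    import numpy as np

    # Minimal sketch: disaggregate one coarse-cell 3-hourly rainfall value (e.g. TRMM 3B42)
    # over the fine 1-km cells it covers, weighted by a fine-scale climatology (e.g. TRMM 2B31).
    # All numbers are hypothetical.
    def disaggregate(coarse_value, fine_climatology):
        weights = fine_climatology / fine_climatology.mean()  # sub-grid weights, mean-preserving
        return coarse_value * weights                         # fine-scale rainfall field

    climatology_1km = np.array([[0.8, 1.0, 1.2],
                                [0.9, 1.1, 1.3],
                                [0.7, 1.0, 1.0]])  # hypothetical 1-km climatology in one 0.25° cell
    print(disaggregate(6.0, climatology_1km))       # 6 mm in the interval, redistributed sub-grid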
Abstract:
REDD (reduced emissions from deforestation and degradation) aims to slow carbon releases caused by forest disturbance by making payments conditional on forest quality over time. Like earlier policies to slow deforestation, REDD must change the behaviour of forest degrading actors. Broadly, it can be implemented with payments to forest users in exchange for improved forest management, thus creating incentives; through payments for enforcement, thus creating disincentives; or through addressing external drivers such as urban charcoal demand. In Tanzania, community-based forest management (CBFM), a form of participatory forest management, was chosen by the Tanzania Forest Conservation Group, a local NGO, as a model for implementing REDD pilot programmes. Payments are made to villages that have the rights to forest carbon. In exchange, the villages must demonstrably reduce deforestation at the village level. In this paper, using this pilot programme as a case study, combined with a review of the literature, we provide insights for REDD implementation in sub-Saharan Africa. We pay particular attention to leakage, monitoring and enforcement. We suggest that implementing REDD through CBFM-type structures can create appropriate incentives and behaviour change when the recipients of the REDD funds are also the key drivers of forest change. When external forces drive forest change, however, REDD through CBFM-type structures becomes an enforcement programme with local communities rather than government agencies being responsible for the enforcement. That structure imposes costs on local communities, whose local authority limits the ability to address leakage outside the particular REDD village.
Abstract:
In Mediterranean areas, conventional tillage increases soil organic matter losses, reduces soil quality, and contributes to climate change through increased CO2 emissions. CO2 sequestration rates in soil may be enhanced by appropriate agricultural soil management and by increasing soil organic matter content. This study analyzes the stratification ratio (SR) index of soil organic carbon (SOC), nitrogen (N) and the C:N ratio under different management practices in an olive grove (OG) in a Mediterranean area (Andalusia, southern Spain). The management practices considered in this study are conventional tillage (CT) and no tillage (NT). In the first case, CT treatments included the addition of alperujo (A) and olive leaves (L); a control plot with no addition of olive mill waste was also considered (CP). In the second case, NT treatments included the addition of chipped pruned branches (NT1) and of chipped pruned branches and weeds (NT2). The SRs of SOC increased with depth for all treatments. The SR of SOC was always higher in NT than in CT treatments, with the highest SR of SOC observed under NT2. The SR of N increased with depth in all cases, ranging between 0.89 (L-SR1) and 39.11 (L-SR3 and L-SR4). The SR of the C:N ratio was characterized by low values, ranging from 0.08 (L-SR3) to 1.58 (NT1-SR2), and generally showed higher values in SR1 and SR2 than in SR3 and SR4. This study has also evaluated several limitations of the SR index, such as the fact that it is descriptive but does not analyze the behavior of the variable over time. In addition, basing the assessment of soil quality on a single variable could lead to an oversimplification of the assessment. Some of these limitations were experienced in the assessment of L, where SR1 of SOC was the lowest of the studied soils. In this case, the higher content in the second depth interval compared to the first was caused by the intrinsic characteristics of this soil's formation process rather than by degradation. Despite these limitations, the obtained SRs demonstrate that NT with the addition of organic material improves soil quality.
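The stratification ratio used above is commonly defined as the concentration of a soil property in the surface layer divided by its concentration at a deeper layer. The short sketch below, with invented SOC concentrations and depth intervals, illustrates how SR1-SR4 style indices could be computed under that common definition; it does not reproduce the paper's exact depth intervals or data.

    # Hypothetical sketch of stratification ratio (SR) indices: surface-layer value of a soil
    # property divided by its value at successively deeper layers. Concentrations are invented.
    def stratification_ratios(profile):
        surface = profile[0]
        return [surface / deeper for deeper in profile[1:]]  # SR1, SR2, ... per deeper layer

    soc_g_per_kg = [18.2, 12.5, 9.1, 6.4, 4.0]   # hypothetical SOC down the profile (surface first)
    print(stratification_ratios(soc_g_per_kg))   # SR relative to each deeper depth interval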