803 results for production planning information systems
Abstract:
The objective of this thesis is to examine the harmonisation and implementation of accounting information systems in a global company, using a project of the UPM-Kymmene Group as an example. The thesis applies a hermeneutic research approach. Theoretically, the topic is examined from the perspective of globalisation and the requirements placed on accounting information systems, as well as the variables to be taken into account in the different phases of the system change process. The benefits that a unified accounting information system brings to a global company are evident. Before developing a unified project, it is essential to examine the preconditions for its success and to plan the different phases of the project carefully. The thesis also finds that a global company must take into account different corporate cultures and the differing working practices of its business units. In conclusion, the commitment of both corporate management and the business units to the project, together with shared objectives, is essential for the project's success.
Abstract:
The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool has been developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate a potential for savings of around 1-5%. All the supporting data is already available today, coming from distributed control systems, data historians and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impact and timing feasibility of external actions such as planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit while achieving the required customer service level. Research effort has been put both into understanding the minimum features needed to satisfy the scheduling requirements in the industry and into assessing whether a market exists at all. A qualitative study was conducted to identify both the competitive situation and the requirements and gaps in the market. It becomes clear that no such system is on the marketplace today, and that there is room to improve overall process efficiency in the target market through such a planning tool. This thesis also gives the case company a better overall understanding of the different processes in this particular industry.
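The kind of what-if question such a scheduler answers can be illustrated with a toy feasibility check. The sketch below is not the thesis tool; capacities, shipment dates and tonnages are invented for illustration.

```python
# A toy feasibility check (invented numbers, not the thesis tool) for the kind
# of what-if question the scheduler answers: can a planned maintenance stop be
# placed on a given day without missing any committed shipment?
def what_if_maintenance(daily_capacity_t, shipments, stop_day, stop_days):
    """shipments: list of (due_day, tonnes). Returns True if every shipment can be met."""
    stock = 0.0
    horizon = max(day for day, _ in shipments) + 1
    due = dict(shipments)
    for day in range(horizon):
        if not (stop_day <= day < stop_day + stop_days):
            stock += daily_capacity_t                  # mill runs at nominal rate
        stock -= due.get(day, 0)                       # ship what is due today
        if stock < 0:
            return False                               # a shipment would be missed
    return True

shipments = [(6, 5_500), (13, 6_000)]                  # tonnes due at the end of days 6 and 13
for start in (2, 10):                                  # compare two candidate stop dates
    ok = what_if_maintenance(1_000, shipments, stop_day=start, stop_days=2)
    print(f"maintenance starting day {start}: {'feasible' if ok else 'misses a shipment'}")
```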
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups, which are typically present in lot sizing problems, are relaxed together with the integer frequencies of cutting patterns in the cutting problem. A large-scale linear optimization problem therefore arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still captures the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared with the usual industrial practice, which solves them in sequence.
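To make the coupling concrete, the sketch below builds the relaxed combined model on a toy instance: production and inventory of final products are linked to the use of cutting patterns through part flow balances, with setups and pattern integrality relaxed as described. The paper generates patterns by column generation; here a handful of patterns is simply enumerated up front, the PuLP package is assumed to be available, and all data are invented.

```python
# A toy version (invented data, PuLP assumed available) of the relaxed combined
# lot-sizing / cutting-stock model: product and part flow balances couple the
# two problems, setups are dropped and pattern frequencies are continuous. The
# paper prices out cutting patterns by column generation; here a small set of
# patterns is enumerated up front to keep the sketch short.
import pulp

T = [0, 1]                                   # planning periods
products = ["P1", "P2"]                      # final products
parts = ["a", "b"]                           # parts cut from raw objects
demand = {("P1", 0): 40, ("P1", 1): 60, ("P2", 0): 30, ("P2", 1): 50}
bom = {"P1": {"a": 2}, "P2": {"a": 1, "b": 1}}          # parts needed per product
W, part_len = 100, {"a": 30, "b": 45}                   # object length, part lengths
patterns = [{"a": 3}, {"a": 1, "b": 1}, {"b": 2}]       # parts yielded per object
J = range(len(patterns))
trim = [W - sum(part_len[p] * n for p, n in pat.items()) for pat in patterns]
h_prod, h_part, c_trim = 1.0, 0.2, 0.05                 # holding and trim-loss costs

m = pulp.LpProblem("combined_lotsizing_cutting_LP", pulp.LpMinimize)
x = pulp.LpVariable.dicts("prod", (products, T), lowBound=0)        # production
Ip = pulp.LpVariable.dicts("inv_prod", (products, T), lowBound=0)   # product inventory
y = pulp.LpVariable.dicts("cut", (J, T), lowBound=0)                # pattern use (relaxed)
Iq = pulp.LpVariable.dicts("inv_part", (parts, T), lowBound=0)      # part inventory

m += (pulp.lpSum(h_prod * Ip[i][t] for i in products for t in T)
      + pulp.lpSum(h_part * Iq[p][t] for p in parts for t in T)
      + pulp.lpSum(c_trim * trim[j] * y[j][t] for j in J for t in T))

for t in T:
    for i in products:                                   # product flow balance
        prev = Ip[i][t - 1] if t > 0 else 0
        m += prev + x[i][t] == demand[(i, t)] + Ip[i][t]
    for p in parts:                                      # part flow balance (the coupling)
        prev = Iq[p][t - 1] if t > 0 else 0
        cut = pulp.lpSum(patterns[j].get(p, 0) * y[j][t] for j in J)
        used = pulp.lpSum(bom[i].get(p, 0) * x[i][t] for i in products)
        m += prev + cut == used + Iq[p][t]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("objective:", pulp.value(m.objective))
print({(j, t): round(y[j][t].value(), 2) for j in J for t in T})
```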
Abstract:
Developing software is still a risky business. After 60 years of experience, this community is still not able to consistently build Information Systems (IS) for organizations with predictable quality, within previously agreed budget and time constraints. Although software is changeable, we are still unable to cope with the amount and complexity of change that organizations demand for their IS. To improve results, developers have followed two alternatives: frameworks, which increase productivity but constrain the flexibility of possible solutions, and agile ways of developing software, which keep flexibility with fewer upfront commitments. With strict frameworks, specific hacks have to be put in place to get around the framework's construction options. In time this leads to inconsistent architectures that are harder to maintain due to incomplete documentation and staff turnover. The main goal of this work is to create a new way to develop flexible IS for organizations, using web technologies, in a faster, better and cheaper way that is better suited to handling organizational change. To do so we propose an adaptive object model that uses a new ontology for data and action with strict normalizing rules. These rules should bound the effects of changes, which can then be better tested and therefore corrected. Interfaces are built from templates of resources that can be reused and extended in a flexible way. The "state of the world" for each IS is determined by all the production and coordination acts that agents performed over time, including those performed by external systems. When bugs are found during maintenance, their past cascading effects can be checked through simulation, by re-running the log of transaction acts over time and comparing the results with previous records. This work implements a prototype covering part of the proposed system in order to make a preliminary assessment of its feasibility and limitations.
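The replay idea can be illustrated with a minimal sketch: world state is recomputed by folding the logged acts in order, and the result is compared against previously recorded state, for example after a bug fix. All names and the transition rule below are illustrative, not the thesis prototype.

```python
# A minimal sketch (illustrative names, not the thesis prototype) of the replay
# idea: the "state of the world" is recomputed by folding the logged production
# and coordination acts in order, and the result is compared with previously
# recorded state, e.g. to spot the cascading effects of a bug that was fixed.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass(frozen=True)
class Act:
    timestamp: int
    kind: str        # e.g. "request", "promise", "produce", "accept"
    entity: str      # the business entity the act refers to
    payload: dict

def replay(log: List[Act], apply_act: Callable[[Dict, Act], Dict]) -> Dict:
    """Rebuild world state by applying every logged act, in time order, to an empty state."""
    state: Dict = {}
    for act in sorted(log, key=lambda a: a.timestamp):
        state = apply_act(state, act)
    return state

def apply_act(state: Dict, act: Act) -> Dict:
    # Toy transition rule: only "produce" acts change the recorded facts.
    if act.kind == "produce":
        return {**state, act.entity: act.payload}
    return state

log = [Act(1, "request", "order-17", {"qty": 5}),
       Act(2, "produce", "order-17", {"qty": 5, "status": "done"})]
recorded = {"order-17": {"qty": 5, "status": "done"}}      # state stored earlier
assert replay(log, apply_act) == recorded                  # regression check via simulation
```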
Abstract:
Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge and varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for the information processing systems of research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) process correctness, achieved through process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through simple end-user interfaces.
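As an illustration of the kind of control-flow specification involved, the toy sketch below declares a testing workflow with sequential and parallel composition in an ACP-like style and enumerates its possible execution traces. It is not the CEGH implementation, and the step names are invented.

```python
# A toy, ACP-inspired sketch: workflows are terms built from atomic steps with
# sequential ("seq") and parallel ("par") composition, and their possible
# execution traces are enumerated. Step names are invented; this is not the
# CEGH system.
from itertools import permutations

def atom(name):
    return ("atom", name)

def seq(*ps):   # sequential composition: p . q
    return ("seq", ps)

def par(*ps):   # parallel composition: p || q (toy version, see note in traces)
    return ("par", ps)

def traces(p):
    """Enumerate the execution traces of a process term."""
    kind, body = p
    if kind == "atom":
        return [[body]]
    if kind == "seq":
        out = [[]]
        for sub in body:
            out = [t + s for t in out for s in traces(sub)]
        return out
    if kind == "par":
        # Simplification: orderings of whole sub-processes, not step-level interleavings.
        return [t for order in permutations(body) for t in traces(("seq", order))]
    raise ValueError(f"unknown term kind: {kind}")

workflow = seq(atom("extract_DNA"),
               par(atom("run_PCR"), atom("prepare_controls")),
               atom("sequence"),
               atom("report_to_patient"))

for trace in traces(workflow):
    print(" -> ".join(trace))
```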
Abstract:
Over the last three decades, remote sensing and GIS have become increasingly important in the geosciences as means of improving conventional methods of data collection and map production. The present study deals with the application of remote sensing and geographic information systems (GIS) to geomorphological investigations. Above all, the combination of both techniques makes it possible to capture geomorphological features both in overview and in detail. Topographic and geological maps, satellite images and climate data are used as basic data. The study consists of six chapters. The first chapter gives a general overview of the study area, describing its morphological units, the climatic conditions, in particular the aridity indices of the coastal and mountain landscapes, and the settlement pattern. Chapter 2 deals with the regional geology and stratigraphy of the study area and attempts to identify the main formations using ETM satellite imagery, applying the following methods: colour band composites, image rationing, and supervised classification. Chapter 3 describes the structurally controlled landforms in order to clarify the interaction between tectonics and geomorphological processes. It discusses the various methods, for example image processing, used to interpret the lineaments present in the mountain body reliably; special filtering methods are applied to map the most important lineaments. Chapter 4 presents an attempt to derive the drainage network automatically from processed SRTM satellite data. It discusses in detail the extent to which the quality of small-scale SRTM data is comparable to large-scale topographic maps in these processing steps. In addition, hydrological parameters are derived through a qualitative and quantitative analysis of the discharge regime of individual wadis, and the origin of the drainage systems is interpreted on the basis of geomorphological and geological evidence. Chapter 5 deals with the assessment of the hazard posed by episodic wadi floods. The probability of their annual occurrence, and of major floods at intervals of several years, is traced back historically to 1921. The significance of rain-bearing depressions that develop over the Red Sea and can generate runoff is examined using the IDW (Inverse Distance Weighted) method. Further rain-bearing weather situations are also examined with the help of Meteosat infrared images. The period 1990-1997, in which heavy rainfall triggered wadi floods, is examined more closely. Flood events and flood levels are determined from hydrographic data (gauge measurements), and the land use and settlement structure in a wadi's catchment area are also taken into account. Chapter 6 deals with the different coastal landforms on the western side of the Red Sea, for example erosional, depositional and submerged forms. The final part deals with the stratigraphy and chronological classification of submarine terraces on coral reefs, and compares them with similar terraces on the Egyptian Red Sea coast west and east of the Sinai Peninsula.
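For reference, inverse distance weighting, which the study uses to interpolate rainfall between stations, can be sketched in a few lines. The station coordinates and values below are invented; the code is a generic illustration of the method, not the author's GIS workflow.

```python
# A minimal sketch of Inverse Distance Weighted (IDW) interpolation, the method
# named above. Station coordinates and values are invented for illustration.
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Interpolate values at query points as distance-weighted means of known points."""
    xy_known = np.asarray(xy_known, dtype=float)
    values = np.asarray(values, dtype=float)
    xy_query = np.asarray(xy_query, dtype=float)
    # Pairwise distances between query points and known stations.
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # e.g. rain gauges (km)
rain_mm = [12.0, 30.0, 18.0]                        # observed rainfall
grid = [(2.0, 2.0), (8.0, 1.0)]                     # points to estimate
print(idw(stations, rain_mm, grid))
```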
Abstract:
The need to effectively manage the documentation covering the entire production process, from the concept phase right through to market release, is a key issue in the creation of a successful and highly competitive product. For almost forty years the most commonly used strategies to achieve this have followed Product Lifecycle Management (PLM) guidelines. Translated into information management systems at the end of the '90s, this methodology is now widely used by companies operating all over the world in many different sectors. PLM systems and editor programs are the two principal types of software applications used by companies for their process automation. Editor programs allow the information related to the production chain to be stored in documents, while the PLM system stores and shares this information so that it can be used within the company and made available to partners. Various software tools that capture documents and information and store them automatically in the PLM system have been developed in recent years. One of them is the "DirectPLM" application, developed by the Italian company "Focus PLM" and designed to ensure interoperability between many editors and the Aras Innovator PLM system. In this dissertation we present "DirectPLM2", a new version of the previous DirectPLM application, designed and developed as a prototype during an internship at Focus PLM. Its new implementation separates the abstract business logic from the concrete command implementation, which was previously strongly dependent on Aras Innovator. Thanks to this new design, Focus PLM can easily develop different versions of DirectPLM2, each one devised for a specific PLM system: the company can focus its development effort on the specific set of software components that provides the specialized functions interacting with that particular PLM system. This allows a shorter time-to-market and gives the company a significant competitive advantage.
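The separation described here is essentially an adapter pattern: the business logic depends only on an abstract connector interface, and each PLM system gets its own concrete connector. The sketch below illustrates the idea with invented class and method names; it is not the DirectPLM2 code.

```python
# A minimal sketch of the decoupling described above: backend-independent
# business logic talks to an abstract connector interface, and each PLM system
# gets its own concrete connector. Names are illustrative, not the DirectPLM2 API.
from abc import ABC, abstractmethod

class PLMConnector(ABC):
    """Abstract command layer: all backend-specific calls go through here."""
    @abstractmethod
    def upload_document(self, name: str, content: bytes) -> str: ...
    @abstractmethod
    def link_to_item(self, document_id: str, item_number: str) -> None: ...

class ArasInnovatorConnector(PLMConnector):
    def upload_document(self, name, content):
        # A real implementation would call the Aras Innovator API here.
        print(f"[Aras] uploading {name} ({len(content)} bytes)")
        return "DOC-0001"
    def link_to_item(self, document_id, item_number):
        print(f"[Aras] linking {document_id} to {item_number}")

def publish_drawing(connector: PLMConnector, name: str, content: bytes, item: str):
    """Backend-independent business logic: capture a document and attach it to an item."""
    doc_id = connector.upload_document(name, content)
    connector.link_to_item(doc_id, item)

publish_drawing(ArasInnovatorConnector(), "bracket.pdf", b"%PDF-...", "ITEM-42")
```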
Abstract:
Most commercial project management software packages include planning methods to devise schedules for resource-constrained projects. As the planning methods that are implemented are proprietary information of the software vendors, the question arises of how the software packages differ in quality with respect to their resource-allocation capabilities. We experimentally evaluate the resource-allocation capabilities of eight recent software packages using 1,560 instances with 30, 60, and 120 activities from the well-known PSPLIB library. In some of the analyzed packages, the user may influence the resource allocation by means of multi-level priority rules, whereas in other packages only a few options can be chosen. We study the impact of various complexity parameters and priority rules on the project duration obtained by the software packages. The results indicate that the resource-allocation capabilities of these packages differ significantly. In general, the relative gap between the packages grows with increasing resource scarcity and with an increasing number of activities. Moreover, the selection of the priority rule has a considerable impact on the project duration. Surprisingly, when a priority rule is selected in the packages where this is possible, both the mean and the variance of the project duration are in general worse than for the packages which do not offer the selection of a priority rule.
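Although the vendors' algorithms are proprietary, priority-rule based resource allocation of the kind evaluated here is typically a serial schedule-generation scheme: activities are scheduled one by one in priority order, at the earliest time allowed by precedence and resource capacity. The sketch below shows this on a tiny invented instance with a single renewable resource and a shortest-duration-first rule; it is not taken from PSPLIB or from any of the evaluated packages.

```python
# A minimal sketch of priority-rule based resource allocation: a serial
# schedule-generation scheme for a project with one renewable resource.
# The instance and the rule (shortest duration first) are invented.
from itertools import count

# activity: (duration, resource demand, predecessors)
acts = {
    "A": (3, 2, []),
    "B": (4, 3, ["A"]),
    "C": (2, 2, ["A"]),
    "D": (3, 1, ["B", "C"]),
}
CAPACITY = 4

def serial_sgs(acts, capacity, priority):
    start, finish = {}, {}
    usage = {}                                  # resource usage per time unit
    order = sorted(acts, key=priority)          # priority list (stable sort breaks ties)
    scheduled = []
    while len(scheduled) < len(acts):
        # pick the highest-priority eligible activity (all predecessors scheduled)
        a = next(x for x in order
                 if x not in scheduled and all(p in scheduled for p in acts[x][2]))
        dur, dem, preds = acts[a]
        est = max([finish[p] for p in preds], default=0)
        for t in count(est):                    # earliest start with enough capacity
            if all(usage.get(u, 0) + dem <= capacity for u in range(t, t + dur)):
                start[a], finish[a] = t, t + dur
                for u in range(t, t + dur):
                    usage[u] = usage.get(u, 0) + dem
                break
        scheduled.append(a)
    return start, max(finish.values())

start, makespan = serial_sgs(acts, CAPACITY, priority=lambda a: acts[a][0])
print("start times:", start, "makespan:", makespan)
```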
Abstract:
In recent years, Geographic Information Systems (GIS) have increasingly been used in a wide array of application contexts for development cooperation in lowlands and mountain areas. When used for planning, implementation, and monitoring, GIS is a versatile and highly efficient tool, particularly in mountain areas characterized by great spatial diversity and inaccessibility. However, the establishment and application of GIS in mountain regions generally presents considerable technical challenges. Moreover, it is necessary to address specific institutional and organizational issues regarding implementation.
Abstract:
This work highlights two critical taboos in organizations: 1) taking for granted the quality of certain capabilities and attitudes of the end-user representatives (EUR) in information systems development projects (ISDP), and 2) the EUR's inherent accountability for losses in IS investments. These issues are addressed by neither theory nor research when assessing success or failure. A triangulation approach was applied to combine quantitative and qualitative methods; the results converge and show that in problematic cases, paradoxically, the origin of IS rejection by end users (EU) points towards the EUR themselves. The study evaluated to what extent certain EUR factors impacted a large ISDP involving an enterprise resource planning (ERP) package, ranking 'knowledge of the EUR' as the main latent variable. The results validate issues observed throughout decades of practice, confirming that, when not properly managed, the EUR role by itself has a direct relationship with IS rejection and with significant losses in IS investments.
Abstract:
This article is drawn from the results of a thesis entitled “Potential bioelectricity production of the Madrid Community agricultural regions based on rye and triticale biomass.” The aim was, first, to quantify the potential rye (Secale cereale L.) and triticale (Triticosecale aestivum L.) biomass in each of the Madrid Community agricultural regions, and second, to locate the most suitable areas for the installation of power plants using that biomass. At least 17,339.9 t d.m. of rye and triticale would be required to satisfy the biomass needs of a 2.2 MW power plant (considering an efficiency of 21.5%, 8,000 expected operating hours per year, and a biomass lower calorific value of 4,060 kcal/kg for both crops), and 2,577 ha would be used (which represents 2.79% of the Madrid Community fallow dry land surface). Biomass yields that could be achieved in the Madrid Community using 50% of the fallow dry land surface (46,150 ha, representing 5.75% of the Community area), based on rye and triticale crops, are estimated at 84,855, 74,906, 70,109, 50,791, 13,481, and 943 t annually for the Campiña, Vegas, Sur Occidental, Área Metropolitana, Lozoya-Somosierra, and Guadarrama regions. These figures represent a bioelectricity potential of 10.77, 9.5, 8.9, 6.44, 1.71, and 0.12 MW, respectively.
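The 17,339.9 t figure can be reproduced from the stated assumptions with a short back-of-the-envelope calculation (using 1 kWh ≈ 860 kcal); the sketch below is only a check of the arithmetic, not part of the article.

```python
# A back-of-the-envelope check (not part of the article) of the 17,339.9 t
# figure, using the stated assumptions: 2.2 MW output, 21.5% efficiency,
# 8,000 operating hours per year, and a calorific value of 4,060 kcal/kg.
KCAL_PER_KWH = 860.0               # 1 kWh is approximately 860 kcal

power_mw = 2.2
efficiency = 0.215
hours_per_year = 8_000
lcv_kcal_per_kg = 4_060

electricity_kwh = power_mw * 1_000 * hours_per_year       # annual electricity output
fuel_kcal = electricity_kwh * KCAL_PER_KWH / efficiency    # primary fuel energy needed
biomass_t = fuel_kcal / lcv_kcal_per_kg / 1_000            # kg -> t
print(f"biomass required: {biomass_t:,.1f} t/year")        # ~17,339.9 t, as quoted above
```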
Abstract:
Analysing investments in ISs in order to maximise benefits has become a prime concern, especially for private corporations. No equilibrium formula exists that could link the amounts injected to the returns accrued; the relationship is simply not straightforward. This thesis is based upon empirical work which involved sketching organisational ethnographies (four organographies and a sectography) of the role and value of information systems in Jordanian financial organisations (JFOs). Besides deciphering the map of impacts, it explains the variations in the impacts of ISs, which were found to be related to internal organisational processes: culturally and politically specific considerations, economically or technically rooted factors, and environmental factors. The research also serves as an empirical attempt to test the applicability of the interpretive paradigm to researching organisations in a developing country. The fieldwork comprised an exploratory stage, a detailed investigation of four case studies and a survey stage encompassing 16 organisations. Primary and secondary data were collected from multiple sources using a range of instruments. The evidence highlights that little long-term strategic planning was pursued; the emphasis was on short-term planning. There was no noticeable adoption of any strategic-fit principle linking IS strategy to corporate strategy. In addition, the benefits obtained were mostly intangible. Although ISs were central to the work of the organisations surveyed as their core technology, they were considered as tools or work enablers rather than as weapons of competitive rivalry. The cultural specificity of IS impacts was evident, and cultural and political considerations were key factors in explaining the variations in the impacts of ISs in JFOs. The thesis confirms that measuring the benefits of ISs is problematic. To gain more insight, the phenomenon of "the use of ISs" has to be studied within its context.
Abstract:
Information systems are corporate resources, and therefore information systems development must be aligned with corporate strategy. This thesis proposes that effective strategic alignment of information systems requires information systems development, information systems planning and strategic management to be united. Literature in these areas is examined, crossing the academic boundaries which separate them, to contribute a synthesised approach to the strategic alignment of information systems development. Previous work in information systems planning has extended information systems development techniques, such as data modelling, into strategic planning activities, while neglecting the techniques of strategic management. Examination of strategic management in this thesis identifies parallel trends in strategic management and information systems development: the premises of the learning school of strategic management are similar to those of soft systems approaches to information systems development. It is therefore proposed that strategic management can be supported by a soft systems approach. Strategic management tools and techniques frame individual views of a strategic situation; soft systems approaches can integrate these diverse views to explore the internal and external environments of an organisation. The information derived from strategic analysis justifies the need for an information system and provides a starting point for information systems development. This is demonstrated by a composite framework which enables each information system to be justified according to its direct contribution to corporate strategy. The proposed framework was developed through action research conducted in a number of organisations of varying types, which suggests that it can be widely used to support the strategic alignment of information systems development, thereby contributing to organisational success.