942 results for product data management system
Abstract:
Service-oriented architecture (SOA) is nowadays applied especially in the design and implementation of information systems in large enterprises. In SOA, functions are designed as services, which in particular increases the reusability of those services and makes it possible to leverage system investments that have already been made. Product lifecycle management (PLM) aims to make information that is scattered across the lifecycle available in the right place at the right time, and to improve the reliability and timeliness of product data. This is one of the most important factors in pursuing competitive advantage in networked business. The goal of this Master's thesis was to determine how product lifecycle data management can be implemented using a service-oriented architecture, and what challenges and benefits this brings to the organization. The study was conducted as a literature review. The thesis provides information on integrating PLM and SOA and on the challenges and benefits of that integration. The results show that, when properly designed and implemented, service-oriented PLM brings significant benefits to companies operating in a networked environment. In addition, the thesis gives an idea of how large a project the implementation of service-oriented PLM is.
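To illustrate what designing PLM functions as reusable services can look like in practice, here is a minimal sketch; the service name, operations and data fields are hypothetical illustrations and are not taken from the thesis.

```python
# Minimal sketch of a PLM function exposed behind a reusable service contract.
# The names and fields are invented for illustration.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class ProductRecord:
    item_id: str
    revision: str
    lifecycle_state: str  # e.g. "design", "production", "service"


class ProductDataService(Protocol):
    """Service contract that any backing system (PDM, ERP, legacy) can implement."""
    def get_product(self, item_id: str) -> ProductRecord: ...
    def update_state(self, item_id: str, new_state: str) -> None: ...


class InMemoryProductDataService:
    """Toy implementation standing in for an existing system investment."""
    def __init__(self) -> None:
        self._store: dict[str, ProductRecord] = {}

    def add_product(self, record: ProductRecord) -> None:
        self._store[record.item_id] = record

    def get_product(self, item_id: str) -> ProductRecord:
        return self._store[item_id]

    def update_state(self, item_id: str, new_state: str) -> None:
        self._store[item_id].lifecycle_state = new_state


if __name__ == "__main__":
    backend = InMemoryProductDataService()
    backend.add_product(ProductRecord("P-100", "A", "design"))
    service: ProductDataService = backend  # callers depend only on the contract
    service.update_state("P-100", "production")
    print(service.get_product("P-100"))
```

The point of the service contract is that an existing system can be wrapped behind the same interface, which is what allows earlier investments to be reused by new consumers.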
Abstract:
This Master's thesis explores how the after-sales service department of a global industrial corporation should organize its installed base management practices in order to maintain and utilize installed base information effectively. The case company has product-related records, such as product lifecycle information, service history and information about product performance. Because this information is often collected and organized case by case, the systematic and effective use of installed base information is difficult, and an overview of the installed base is missing. The goal of the thesis was to find out how the case company can improve its installed base maintenance and management practices and increase the availability and reliability of installed base information. Installed base information management practices were first examined through the literature. The empirical research was conducted through interviews and a questionnaire survey targeted at the case company's service department. The purpose of the research was to identify the challenges related to the service department's information management practices. The study also identified installed base information needs and the potential for improving information availability. Based on the empirical findings, recommendations for improving installed base management practices and information availability were created. On the basis of these recommendations, the following actions are proposed to the case company: developing the service report, improving the change management process, ensuring the quality of product documentation in the early stages of the product life cycle, and making a clear decision to improve installed base management practices.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
With the rapid development of society and changing lifestyles, the generation of commercial waste is becoming more difficult to control. The situation of packaging waste and food waste, the main fractions of commercial waste, in different countries in Europe and Asia is analyzed in order to evaluate and suggest necessary improvements to the existing waste management system of the city of Hanoi, Vietnam. From all waste generation sources of the city, a total of approximately 4,000 tons of mixed waste is transported to the composting facility and the disposal site, which results in roughly 1.6 Mt of greenhouse gas emissions to the environment. Recycling takes place spontaneously through informal pickers, which makes the whole system difficult to manage and leaves the overall data uncertain. Based on a relative calculation yielding only approximately 0.17 Mt of CO2-equivalent emissions, incineration is suggested as the solution to the problems of an overloaded landfill and the rising energy demand of the inhabitants.
Abstract:
This Master's thesis was carried out as an assignment for Company X. The work originated from Company X's need to develop the product data process shared by the company and its customer-owners. The goal was to produce a description of the current state of the product data process and, based on it, to create development proposals for making the process more efficient. A qualitative research method was used. Information about the current state of the product data process was gathered mainly through numerous interviews with the personnel of Company X and its customer-owners. In addition, literature on supply chain collaboration, the importance of information transparency, information management and the product data process was reviewed extensively. As a result, the thesis describes how Company X and its customer-owners handle product data and what the biggest challenges in the product data process are. Based on this, development proposals were created for making the process more efficient. The biggest challenges in the product data process were found to be that the common interest of the whole group was not considered when the process was created, and that the process involves overlapping and manual work that could be avoided through information sharing and electronic data transfer. The most important development proposals were to examine and develop the product data process as a whole in the future, to increase collaboration and information sharing, to make broader use at the customer-owners of the product data provided by Company X, to significantly increase electronic data transfer, and to create a common quality management system for product data and thereby measure the product data process and its efficiency.
Abstract:
In a market where companies of similar size and resources compete, it is challenging to gain any advantage over the others. In order to stay afloat, a company needs the capability to perform with fewer resources and yet provide better service. Hence the development of efficient processes that can cut costs and improve performance is crucial. As business expands, processes become complicated and large amounts of data need to be managed and made available on request. Companies use different tools to store and manage data and thereby facilitate better production and transactions. In the modern business world the most widely used tool for this purpose is the ERP (Enterprise Resource Planning) system. The focus of this research is to study how competitive advantage can be achieved by implementing a proprietary ERP system, that is, an ERP system created in-house and tailor-made to match and align with the company's business needs and processes. The market is full of ERP software, but choosing the right one is a big challenge. Identifying the key features that need improvement in processes and data management, choosing the right ERP, implementing it, and following up is a long and expensive journey for companies. Some companies prefer to invest in a ready-made package bought from a vendor and adjust it to their own business needs, while others focus on creating their own system with in-house IT capabilities. This research uses a case company, and the author tries to identify and analyze why the organization in question decided to pursue the development of a proprietary ERP system, how it has been implemented and whether it has been successful. The main conclusion and recommendation of this research is that companies should know their core capabilities and constraints before choosing and implementing an ERP system. Knowledge of the factors that affect the outcome of a system change is important for making the right decisions on the strategic level and implementing them on the operational level. The project in the case company has lasted longer than anticipated, but it has been reported that projects based on buying a ready product from a vendor are also delayed and completed over budget. In general, the implementation of the proprietary ERP in the case company has been successful, both in terms of business performance figures and in terms of the usability of the system by employees. For future research, a study that statistically calculates the ROI of both approaches, buying a ready product and creating one's own ERP, would be beneficial.
Abstract:
This thesis introduces heat demand forecasting models generated with data mining algorithms. The forecast spans one full day, and it can be used to regulate the heat consumption of buildings. For training the data mining models, two years of heat consumption data from a case building and weather measurement data from the Finnish Meteorological Institute are used. The thesis uses Microsoft SQL Server Analysis Services data mining tools to generate the models and the CRISP-DM process framework to carry out the research. The results show that the built models can predict heat demand at best with mean absolute percentage errors of 3.8% for the 24-hour profile and 5.9% for the full day. A deployment model for integrating the generated data mining models into an existing building energy management system is also discussed.
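As a concrete illustration of the error measure reported above, the following sketch fits a simple one-variable least-squares forecaster to synthetic hourly heat-consumption data and scores a 24-hour forecast with the mean absolute percentage error; the data and the linear model are invented placeholders, not the SQL Server Analysis Services models used in the thesis.

```python
# Minimal sketch of a 24-hour heat-demand forecast and its MAPE score.
# Synthetic data and a plain least-squares model stand in for the real case data.
import numpy as np

rng = np.random.default_rng(0)

# Two years of hourly observations: outdoor temperature (C) and heat demand (kW).
hours = 2 * 365 * 24
temperature = 5 + 15 * np.sin(2 * np.pi * np.arange(hours) / (365 * 24)) + rng.normal(0, 2, hours)
heat_demand = np.clip(80 - 3 * temperature + rng.normal(0, 5, hours), 5, None)

# Fit heat_demand ~ a + b * temperature on all but the last day.
X = np.column_stack([np.ones(hours - 24), temperature[:-24]])
coef, *_ = np.linalg.lstsq(X, heat_demand[:-24], rcond=None)

# Forecast the final 24-hour profile from (assumed known) temperature forecasts.
X_next = np.column_stack([np.ones(24), temperature[-24:]])
forecast = X_next @ coef
actual = heat_demand[-24:]

# Mean absolute percentage error over the 24-hour horizon.
mape = np.mean(np.abs((actual - forecast) / actual)) * 100
print(f"MAPE over the next 24 h: {mape:.1f} %")
```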
Abstract:
A company's successful performance in the market relates to the quality of its human capital management, which aims to improve the company's internal performance and the external implementation of its core business strategy. Companies with a matrix structure that focus on realizing and developing innovations and technologies for an uncertain market need to select their approach to the HR management system carefully. Human resource management has a significant impact on the organization and uses a variety of instruments, such as corporate information systems, to fulfil its functions and objectives. There are three approaches to strategic control management, depending on the degree of interference in employee decision-making, the development of employees' skills, and their integration into the business strategy. Mainstream research has focused only on the framework of strategic HR planning and the general productivity of the firm, not on the features of the organizational structure or the capabilities of corporate software for human capital. This study tackles the aforementioned challenges, typical of a matrix organization, by using HR control management tools and a corporate information system. The detailed analysis of the electromotor and heating equipment industry in this master's thesis provides an opportunity to improve the system for HR control and shows its application in ERP software. The results emphasize the sustained role of matrix HR input control in creating independent project teams for the matrix structure, teams that are able to respond to various market uncertainties and use their skills to improve performance. Corporate information systems can be integrated into the input control system by means of output monitoring in order to regulate and evaluate the teams' processes, using key performance indicators and reporting systems.
Abstract:
A Traffic Management System (TMS) comprises four major subsystems: the Network Database Management System, which provides information to passengers; the Transit Facility Management System, for service planning and for scheduling vehicles and crews; the Congestion Management System, for traffic forecasting and planning; and the Safety Management System, concerned with the safety of passengers and the environment. This work has opened a rather wide framework of model structures for application to traffic. The facets of these theories are so wide that it seems impossible to present all the necessary models in this work. However, it can be deduced from the study that the best Traffic Management System is one that is realistic in all aspects, easy to understand and easy to apply. As it is practically difficult to devise an ideal foolproof model, the attempt here has been to make some progress in that direction.
Abstract:
In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied by using the management system TOSCANA in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin.
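As an informal illustration of the Formal Concept Analysis underlying such conceptual information systems, the following sketch enumerates the formal concepts of a tiny invented object/attribute context by closing every subset of objects; it only demonstrates the closure idea, not the conceptual scaling a TOSCANA system performs on a real database.

```python
# Minimal sketch: enumerate the formal concepts of a small binary context.
# The context (objects, attributes, incidence) is an invented toy example.
from itertools import combinations

objects = ["cust1", "cust2", "cust3", "cust4"]
attributes = ["urban", "high_income", "repeat_buyer"]
incidence = {
    "cust1": {"urban", "high_income"},
    "cust2": {"urban", "repeat_buyer"},
    "cust3": {"urban", "high_income", "repeat_buyer"},
    "cust4": {"high_income"},
}

def common_attributes(objs):
    """A': attributes shared by every object in objs."""
    objs = list(objs)
    if not objs:
        return set(attributes)
    return set.intersection(*(incidence[o] for o in objs))

def common_objects(attrs):
    """B': objects having every attribute in attrs."""
    return {o for o in objects if attrs <= incidence[o]}

# A formal concept is a pair (A, B) with A' = B and B' = A; every extent is the
# closure of some subset of objects, so closing all subsets finds all concepts.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        intent = common_attributes(objs)
        extent = common_objects(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[1]))):
    print(sorted(extent), "<->", sorted(intent))
```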
Abstract:
In recent years, progress in mobile telecommunications has changed our way of life, in the private as well as the business domain. Mobile and wireless networks offer ever increasing bit rates, mobile network operators provide more and more services, and at the same time the costs of mobile services and bit rates are decreasing. However, mobile services today still lack functions that integrate seamlessly into users' everyday life: service attributes such as context-awareness and personalisation are often proprietary, limited or not available at all. In order to overcome this deficiency, telecommunications companies are heavily engaged in the research and development of service platforms for networks beyond 3G for the provisioning of innovative mobile services, and these platforms are meant to support such service attributes. Service platforms provide basic service-independent functions such as billing, identity management, context management and user profile management. Instead of developing their own solutions, developers of end-user services such as innovative messaging services or location-based services can use these platform-side functions for their own purposes, which removes complexity, development time and development costs from service development. Context-awareness and personalisation are two of the most important aspects of service platforms in telecommunications environments; their combination can be described as situation-dependent personalisation of services. Supporting this feature requires several processing steps. The focus of this doctoral thesis is on the processing step in which the user's current context is matched against situation-dependent user preferences in order to find the preferences that apply to the user's current situation. Achieving this also requires a user profile management system and corresponding functionality, which are likewise covered by this thesis. Altogether, this thesis provides the following contributions. The first part of the contribution is mainly architecture-oriented: we provide a user profile management system that addresses the specific requirements of service platforms in telecommunications environments. In particular, the user profile management system has to deal with situation-specific user preferences and with user information for various services. In order to structure the user information, we also propose a user profile structure and the corresponding user profile ontology as part of an ontology infrastructure in a service platform. The second part of the contribution is the selection mechanism for finding matching situation-dependent user preferences for the personalisation of services. This functionality is provided as a sub-module of the user profile management system. In contrast to existing solutions, our selection mechanism is based on ontology reasoning. The mechanism is evaluated in terms of runtime performance and supported functionality compared to other approaches, and the results of the evaluation show the benefits and the drawbacks of ontology modelling and ontology reasoning in practical applications.
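The selection step described above, matching the current context against the conditions attached to situation-dependent preferences, can be sketched in plain code as a rule match; the context keys, preference structure and ranking below are hypothetical simplifications, whereas the thesis performs this selection with ontology reasoning.

```python
# Minimal sketch of selecting situation-dependent user preferences for the
# current context. All keys, preferences and the ranking rule are invented.
from dataclasses import dataclass


@dataclass
class SituationDependentPreference:
    name: str
    conditions: dict   # context values under which the preference applies
    settings: dict     # the preference payload handed to the service
    priority: int = 0  # tie-breaker when several situations match


def matching_preferences(context: dict, preferences: list[SituationDependentPreference]):
    """Return preferences whose conditions are all satisfied by the context,
    most specific (more conditions, higher priority) first."""
    hits = [p for p in preferences
            if all(context.get(k) == v for k, v in p.conditions.items())]
    return sorted(hits, key=lambda p: (len(p.conditions), p.priority), reverse=True)


if __name__ == "__main__":
    prefs = [
        SituationDependentPreference("default", {}, {"notifications": "all"}),
        SituationDependentPreference("in_meeting", {"activity": "meeting"},
                                     {"notifications": "silent"}, priority=1),
        SituationDependentPreference("abroad", {"location": "abroad"},
                                     {"data_roaming": "off"}),
    ]
    context = {"activity": "meeting", "location": "office"}
    for p in matching_preferences(context, prefs):
        print(p.name, p.settings)
```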
Abstract:
Against the background of integrating the knowledge-based management system precision farming into organic farming, the implementation of existing strategies as well as strategies yet to be developed was evaluated and discussed. With a view to the cost-efficient yield measurement, decisive in precision farming, of the legume-grass mixtures that occupy large areas in organic farming, two further contributions analysed the estimation accuracy of ultrasonic and spectral sensors used individually and in combination. The goal of precision farming, to implement management adapted to the within-field variability of sites and thereby reduce inputs, energy, labour and environmental effects while increasing effectiveness and achieving economic optimisation, coincides with essential aims of organic farming. It is primarily measures for recording the variability of site factors, such as terrain relief, soil sampling and apparent electrical conductivity, as well as yield measurement by combine harvesters, that can be applied directly in organic farming. In contrast, dynamically adapted applications for fertilisation, plant protection and weed removal can, owing to complex interactions and the rather passive character of these measures in organic farming, only be implemented if the application models are modified and further dynamic data are included. Examples are mineralisation processes in the soil and in organic fertiliser that must be taken into account when calculating fertiliser amounts, preventive plant protection measures that are difficult to assign site-specifically, and influences on soil-microbiological processes during hoeing or harrowing passes. The indirect regulation mechanisms of organic farming therefore limit the dynamically adapted applications of conventional precision farming, which so far have been designed for a rather direct effect. In addition, innovative new strategies are conceivable, of which quality-based harvesting, the use of highly sensitive sensors for the early detection of plant diseases, and targeted site-specific and nature-conservation-oriented management are presented as examples in this work. For the legume-grass mixtures, which often cover large proportions of the farmed area, ultrasonic distance measurement for characterising sward height and various spectral vegetation indices were analysed as estimation indicators for a low-cost and flexible yield measurement. The vegetation indices were calculated from hyperspectral data according to published equations and were also determined stepwise as a "Normalized Difference Spectral Index" (NDSI) from all possible wavelength combinations. Ultrasound and vegetation indices were analysed both individually and in combination in order to exploit possible compensatory effects. Used alone, ultrasonic sward height consistently achieved better estimation accuracy than any individual vegetation index. Among the latter, vegetation indices based on water absorption bands in particular achieved higher estimation accuracy than traditional red/infrared indices. The combination of both sensor data showed a further increase in estimation accuracy, especially with sward-specific calibration. Here, the vegetation indices compensate for misestimations of the height measurement caused by discontinuous changes in sward density along the height gradient, as occur during ear emergence or through individual tall-growing species. The combination of ultrasonic sward height with vegetation indices thus shows potential for the development of low-cost yield sensors for legume-grass mixtures. Further investigations with hyperspectral vegetation indices of other calculation structures, as well as the inclusion of more than two wavelengths, are necessary to achieve higher estimation accuracies. Likewise, calibrations and validations of the sensor combination need to be carried out in species-rich grassland. Yield measurement in legume-grass mixtures makes an important contribution to building a yield history in the diverse crop rotations of organic farming and enables an improved assessment of production potentials and deficit areas for site-adapted management.
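The stepwise NDSI screening mentioned above amounts to computing the normalised difference (R_i - R_j) / (R_i + R_j) for every wavelength pair and checking how well each pair predicts measured yield; the sketch below does this with synthetic reflectance and yield data and a simple squared-correlation criterion standing in for the actual calibration procedure.

```python
# Minimal sketch: screen all two-band Normalized Difference Spectral Indices
# (NDSI = (R_i - R_j) / (R_i + R_j)) against measured yield.
# Reflectance spectra and yields are synthetic placeholders for real field data.
import numpy as np

rng = np.random.default_rng(1)

wavelengths = np.arange(400, 1001, 10)            # nm, hypothetical band centres
n_plots, n_bands = 60, wavelengths.size
reflectance = rng.uniform(0.05, 0.6, size=(n_plots, n_bands))
yield_t_ha = rng.uniform(2.0, 9.0, size=n_plots)  # legume-grass dry-matter yield

best = (0.0, None, None)
for i in range(n_bands):
    for j in range(i + 1, n_bands):
        ndsi = (reflectance[:, i] - reflectance[:, j]) / (reflectance[:, i] + reflectance[:, j])
        r2 = np.corrcoef(ndsi, yield_t_ha)[0, 1] ** 2  # simple goodness-of-fit proxy
        if r2 > best[0]:
            best = (r2, wavelengths[i], wavelengths[j])

print(f"Best NDSI pair: {best[1]} nm / {best[2]} nm, R^2 = {best[0]:.2f}")
```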