19 results for soil data requirements
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
After sales business is an effective way to create profit and increase customer satisfaction in manufacturing companies. Despite this, some special business characteristics linked to these functions make it exceptionally challenging in its own way. This Master's Thesis examines the current state of data and inventory management in the case company with regard to the possibilities and challenges related to consolidating the current business operations. The research examines the process steps, procedures, data requirements, data mining practices and data storage management of the spare part sales process, whereas the part focusing on inventory management reviews the current stock value and examines current practices and operational principles. There are two global after sales units which supply spare parts, and the issues reviewed in this study are examined from both units' perspectives. The analysis focuses on the operations of the unit where the functions would be centralized by default if the change decisions are carried out. It was discovered that both data and inventory management include clear shortcomings, which result from a lack of internal instructions and established processes as well as a lack of cooperation with other stakeholders involved in the product's lifecycle. The main product of the data management part was a guideline for consolidating the functions, tailored to the company's needs. Additionally, potentially scrappable spare parts were listed and a proposal for inventory management instructions was drafted. If the suggested spare part materials are scrapped, the stock value will decrease by 46 percent. A guideline which was reviewed and commented on in this thesis was chosen as the basis of the inventory management instructions.
Abstract:
Summary: Regionalization of the clay content of cultivated soils by means of geostatistics and point data
Abstract:
This study was done for the ABB Ltd. Motors and Generators business unit in Helsinki. In this study, global data movement in large businesses is examined from a product data management (PDM) and enterprise resource planning (ERP) point of view. The purpose of this study was to understand and map out how a large global business handles its data in a multiple-site structure and how this can be applied in practice. This was done by conducting an empirical interview study of five different global businesses with design locations in multiple countries. Their master data management (MDM) solutions were inspected and analyzed to understand which solution would best benefit a large global architecture with many design locations. One working solution is a transactional hub, which negates the effects of multisite transfers and reduces lead times. Also, the requirements and limitations of the current MDM architecture were analyzed and possible reform ideas were given.
Abstract:
Software development is a complex process. One of its key factors is the set of requirements placed on the software. These requirements come in many kinds and at many levels, from desired functionality to highly detailed specifications. Managing these requirements is likewise very multifaceted, even though the literature presents it as a clear process consisting of a series of distinct phases. The focus of this work was on managing changes to these requirements and the feedback directed at the finished software, and on how a requirements management tool could support these processes. Using a requirements management tool does not in itself solve any problems, but it provides a framework for improving requirements management. The benefits of using such a tool include centralized storage of requirements; definition of access rights governing which users may view or modify data; control of the change management process; analysis of the impact of changes; traceability; and access to the data with a web browser.
Abstract:
In the software industry, long and difficult development cycles can be eased by making use of software frameworks. A framework is a collection of classes that provide general solutions to the needs of a particular problem domain, freeing software developers to concentrate on application-specific requirements. Using well-designed frameworks increases the reusability of design solutions and source code more than any other design approach. Knowledge of a particular domain can be captured in a framework, from which finished software products can then be specialized. This Master's thesis describes the design and implementation of a framework based on software agents. The main emphasis of the work is on describing the design, corresponding to the requirements specification, and the implementation of a framework from which software capable of various kinds of data collection in the Internet environment can be specialized. The experimental part of the work also presents an example application based on the framework developed in the thesis.
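The abstract does not name the framework's classes or interfaces, but the specialization idea it describes can be sketched briefly: the framework base class fixes the data-collection control flow, and an application specializes only the domain-specific hooks. The class and method names below are hypothetical, for illustration only.

```python
from abc import ABC, abstractmethod

class CollectorAgent(ABC):
    """Hypothetical framework base class: the framework fixes the
    fetch -> parse -> store control flow; applications specialize the hooks."""

    def run(self, source: str) -> None:
        raw = self.fetch(source)     # application-specific acquisition
        records = self.parse(raw)    # application-specific interpretation
        self.store(records)          # behaviour shared by all specializations

    @abstractmethod
    def fetch(self, source: str) -> str: ...

    @abstractmethod
    def parse(self, raw: str) -> list: ...

    def store(self, records: list) -> None:
        for record in records:
            print("stored:", record)

class PriceAgent(CollectorAgent):
    """One specialization: an agent collecting price data."""
    def fetch(self, source: str) -> str:
        return f"price feed read from {source}"
    def parse(self, raw: str) -> list:
        return [{"payload": raw}]

PriceAgent().run("http://example.com/feed")  # hypothetical URL
```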
Abstract:
In a networked business environment, the visibility requirements towards supply operations and the customer interface have become tighter. In order to meet those requirements, the master data of the case company is seen as an enabler. However, the current state of the master data and its quality are not considered good enough to meet those requirements. In this thesis, the target of the research was to develop a process for managing master data quality as a continuous process and to find solutions for cleansing the current customer and supplier data to meet the quality requirements defined in that process. Based on the theory of Master Data Management and data cleansing, a small amount of master data was analyzed and cleansed using one commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept. The proof of concept demonstrated the cleansing solution's applicability to improving the quality of the current master data. Based on those findings and the theory of data management, recommendations and proposals for improving the quality of the data were given. The results also showed that the biggest reasons for poor data quality are the lack of data governance in the company and the restrictions of the current master data solutions.
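The commercial cleansing solution and its rules are not identified in the abstract; as a rough illustration of the kind of normalization and duplicate detection such tools perform on customer and supplier records, a minimal sketch (with invented field names and example records) might look like this:

```python
import re

def normalize(name: str) -> str:
    """Canonicalize a customer/supplier name: case, punctuation, whitespace."""
    name = name.strip().lower()
    name = re.sub(r"[.,]", "", name)
    return re.sub(r"\s+", " ", name)

def duplicate_candidates(records):
    """Group records whose normalized names collide - likely duplicates
    to be merged or reviewed in the cleansing process."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec["name"]), []).append(rec)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

suppliers = [                      # invented example records
    {"id": 1, "name": "ACME Oy"},
    {"id": 2, "name": " acme oy."},
    {"id": 3, "name": "Other Ltd"},
]
print(duplicate_candidates(suppliers))  # ids 1 and 2 collide on "acme oy"
```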
Abstract:
Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems, and thus the data is inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept which creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model, which enables analyzing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on the literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM, followed by the phases of a system planning and implementation project. The case part consists of the background, a definition of the as-is situation, the definition of the project and its evaluation criteria, and concludes with the key results of the thesis. The concluding chapter combines common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To justify the project and find funding for it, the business benefits have to be defined and their realization has to be monitored. The thesis identified six success factors for the MDM system: a well-defined business case; data management and monitoring; data models and structures that are defined and maintained; customer and supplier data governance, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The conclusions chapter goes through these factors on a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. This chapter could be used as guidance in a master data management project.
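As a hedged illustration of the cross-referencing idea described above (the thesis abstract does not specify a data model), a master data hub typically maps one enterprise-wide "golden" identity to the local keys used by each ERP system. The names below are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MasterRecord:
    """Hypothetical 'golden record': one enterprise-wide identity plus
    cross-references to the local keys used by each ERP system."""
    master_id: str
    name: str
    xrefs: dict = field(default_factory=dict)  # system name -> local key

def resolve(record: MasterRecord, system: str) -> Optional[str]:
    """Translate the enterprise-wide identity back to a system-local key."""
    return record.xrefs.get(system)

# Two ERPs know the same supplier under different local keys.
supplier = MasterRecord("M-001", "ACME Oy")
supplier.xrefs["erp_europe"] = "SUP-4711"
supplier.xrefs["erp_asia"] = "V000982"

print(resolve(supplier, "erp_asia"))   # -> V000982
```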
Abstract:
This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data, and, in this thesis, master data means reference data about customers, products etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. This should bring benefits to the whole organization through improved information quality, especially in cross-system data consistency, and through more efficient and effective data management processes. As a result of this thesis, the requirements for the metadata management solution in the case were compiled, and the success of the new information system and the implementation project was evaluated.
Abstract:
The overall purpose of this thesis was to increase the knowledge of the biogeochemistry of rural acid sulphate (AS) soil environments and urban forest ecosystems near small towns in Western Finland. In addition, the potential causal relationship between the distribution of AS soils and the geographical occurrence of multiple sclerosis (MS) was assessed based on a review of existing literature and data. Acid sulphate soils, which occupy an area of approximately 17–24 million hectares worldwide, are regarded as the nastiest soils in the world. Independent of their geographical locality, these soils pose a great threat to the surrounding environment if disturbed. The abundant metal-rich acid drainage from Finnish AS soils, which is a result of sulphide oxidation due to artificial farmland drainage, has significant but spatially and temporally variable ecotoxicological impacts on the biodiversity and community structure of fish, benthic invertebrates and macrophytes. This has resulted in mass fish kills and even the eradication of sensitive fish species in affected waters. Moreover, previous investigations demonstrated significantly enriched concentrations of Co, Ni, Mn and Al (metals which are abundantly mobilised in AS soils) in agricultural crops (timothy grass and oats), and approximately 50 times higher concentrations of Al in cow milk originating from AS soils in Western Finland. Nevertheless, the results presented here demonstrate, in general, relatively moderate metal concentrations in oats and cabbage grown on AS soils in Western Finland, although some of the studied fields showed anomalous values of metals (e.g. Co and Ni) in both the soil and the target plants (especially oats), similar to the previous investigations. The results indicated that the concentrations of Co, Ni, Mn and Zn in oats and of Co and Zn in cabbage were governed by soil geochemistry, as these metals were correlated with the corresponding concentrations extracted from the soil by NH4Ac-EDTA and NH4Ac, respectively. The concentrations of Cu and Fe in oats and cabbage were uncorrelated with the easily soluble concentrations in the soils, suggesting that biological processes (e.g. plant-root processes) overshadow geochemical variation. The concentrations of K and Mg in cabbage, which showed a low spread and were strongly correlated with the NH4Ac-extractable contents in the soil, were governed by both the bioavailable fractions in the topsoil and plant-uptake mechanisms. The plant's ability to regulate its uptake of Ca and P (e.g. through root exudates) seemed to be more important than the influence of soil geochemistry. The distribution of P, K, Ca, Mg, Mn and S within humus, moss and needles in and around small towns was to a high degree controlled by biological cycling, which was indicated by the low correlation coefficients for P, K, Ca, Mg and S between humus and moss, and by the low spread of these nutrients in moss and needles. The concentration variations of elements in till are mainly due to natural processes (e.g. intrusions, weathering, and mineralogical variations in the bedrock). There was a strong spatial pattern for B in humus, moss and needles, which was suggested to be associated with anthropogenic emissions from nearby town centres. Geogenic dust affected the spatial distribution of Fe and Cr in moss, while natural processes governed the Fe anomaly found in the needles.
The spatial accumulation patterns of Zn, Cd, Cu, Ni and Pb in humus and moss were strong and diverse, and related to current industry, the former steel industry, coal combustion and natural geochemical processes. An intriguing Cu anomaly was found in moss. Since it was located close to a main railway line, and because the railway line's electric cables are made of Cu, it was suggested that the anomaly results from corrosion of these cables. In Western Finland, where AS soils are particularly abundant and enrich the metal concentrations of stream waters, cow milk and, to some extent, crops, an environmental risk assessment would be motivated to elucidate whether the metal dispersion affects human health. Within this context, a topic of concern is the distribution of multiple sclerosis, as high MS prevalence rates are found in the main area of AS soils. Regionally, the AS soil type in the Seinäjoki area has been demonstrated to be very severe in terms of metal leaching; this area also shows one of the highest MS rates reported worldwide. On a local scale, these severe AS soil types coincide well with the corresponding MS clustering along the Kyrönjoki River in Seinäjoki. There are reasons to suspect that these spatial correlations are causal, as multiple sclerosis has been suggested to result from a combination of genetic and environmental factors.
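The statistical evidence cited in this abstract rests on correlations between plant concentrations and soil-extractable concentrations. As a minimal sketch of that kind of analysis (with invented example values, not the thesis's measurements):

```python
# Requires Python 3.10+ for statistics.correlation (Pearson's r).
from statistics import correlation

# Invented example values for illustration only.
soil_ni_edta = [2.1, 3.4, 5.0, 4.2, 6.1]  # NH4Ac-EDTA-extractable Ni in soil, mg/kg
oat_ni = [0.8, 1.1, 1.9, 1.5, 2.3]        # Ni in oat grain, mg/kg

# A high Pearson r between soil-extractable and plant concentrations is the
# kind of evidence behind the conclusion that soil geochemistry governs uptake.
print(f"r = {correlation(soil_ni_edta, oat_ni):.2f}")
```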
Abstract:
One of the most crucial tasks for a company offering a software product is deciding which new features should be implemented in the product's forthcoming versions. Yet existing studies show that this is also a task with which many companies struggle. The problem has been characterized as ambiguous and changing: there are better or worse solutions to it, but no optimal one, and the criteria determining the success of a solution keep changing due to continuously changing competition, technologies and market needs. This thesis seeks to gain a deeper understanding of the challenges that companies have reportedly faced in determining the requirements for their forthcoming product versions. To this end, product management related activities are explored in seven companies. Following a grounded theory approach, the thesis conducts four iterations of data analysis, where each iteration goes beyond the previous one. The thesis results in a theory proposal intended to 1) describe the essential characteristics of organizations' product management challenges, 2) explain the origins of the perceived challenges and 3) suggest strategies to alleviate the perceived challenges. The thesis concludes that current product management approaches are becoming inadequate for dealing with challenges that involve multiple and conflicting interpretations, different value orientations, unclear goals, contradictions and paradoxes. This inadequacy will continue to increase until current beliefs and assumptions about the product management challenges are questioned and a new paradigm for dealing with them is adopted.
Abstract:
Communications play a key role in modern smart grids. The new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between the intelligent electric devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM) and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution applied to implement the communication medium in power distribution grids is power line communication (PLC): there are power cables in the distribution grids, and hence they may be applied as a communication channel for the distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
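The abstract does not reproduce the capacity formulas, but channel capacity analyses of this kind conventionally build on the Shannon capacity of a frequency-selective channel, integrated over the usable band (here 100 kHz–30 MHz):

C = \int_{f_1}^{f_2} \log_2\!\left( 1 + \frac{|H(f)|^2 \, P(f)}{N(f)} \right) \mathrm{d}f

where H(f) is the measured channel transfer function, P(f) the transmit power spectral density, and N(f) the noise power spectral density; whether the thesis uses exactly this formulation is an assumption here.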
Abstract:
Because of the increased availability of different kinds of business intelligence (BI) technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management is not only about managing technology but also about managing processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as requirements for long-lasting development. Some operative BI solutions are also considered as part of the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and how these affect its efficiency. The research is executed as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and by active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automated data collection, integration of operative databases, the use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and the efficient management of processes, data and roles.
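The abstract names ETL, DW and OLAP as the key technologies; a minimal, self-contained sketch of an ETL pass into a warehouse fact table (hypothetical table and field names, with SQLite standing in for the warehouse) could look like this:

```python
import sqlite3

def extract(rows):
    """Extract: read raw measurements from an operative source (a plain
    list here stands in for a production database or log file)."""
    yield from rows

def transform(rows):
    """Transform: drop invalid measurements and unify units (kg -> tonnes)."""
    for line_id, kilograms in rows:
        if kilograms is not None and kilograms >= 0:
            yield line_id, kilograms / 1000.0

def load(rows, connection):
    """Load: append the cleaned rows to a data warehouse fact table."""
    connection.execute(
        "CREATE TABLE IF NOT EXISTS production_fact (line_id TEXT, tonnes REAL)")
    connection.executemany("INSERT INTO production_fact VALUES (?, ?)", rows)

raw = [("L1", 1250.0), ("L2", None), ("L1", 980.0)]  # invented sample data
warehouse = sqlite3.connect(":memory:")
load(transform(extract(raw)), warehouse)
print(warehouse.execute(
    "SELECT line_id, SUM(tonnes) FROM production_fact GROUP BY line_id").fetchall())
# -> [('L1', 2.23)]
```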
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014