130 results for Election Counting and Reporting Software


Relevance:

100.00%

Publisher:

Abstract:

The objective of this work was to determine the current state of material control and reporting for the raw material and supply inventories of OMG Kokkola Chemicals Oy, to analyse the results obtained, and to consider ways of improving the methods related to procurement and reporting. Regarding the development of material control, the main emphasis was on releasing the capital tied up in inventories and on improving the service level. Regarding reporting, the work concentrated on developing inventory monitoring. ABC analyses obtained from the Matek warehouse system were used to identify focus areas when examining the current state of the item catalogue, and the items with the highest annual consumption were examined in detail. Reporting features of Matek outside its regular use were examined through test runs, and the further reporting possibilities offered by the Cognos Impromptu software were also investigated. The work presented methods for determining the parameters of a reorder point system. Safety stock levels, reorder points and replenishment lot sizes were calculated for the selected items, and calculations of the achievable capital cost savings compared with the parameters of the old system were presented. The inventory cost savings achievable through cooperation between production and purchasing were examined with material requirements planning. A purchasing portfolio analysis was used as a tool for reviewing and developing the purchasing strategy. In addition to the regular reports, new reports were proposed for monitoring price changes in the item catalogue and the share of obsolete stock. The starting points for building the reports with Cognos Impromptu were reviewed, and the benefits achieved with the method and the problems encountered were presented.
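
The abstract mentions calculating safety stock levels, reorder points and replenishment lot sizes for selected items. The thesis's own formulas are not reproduced here, so the sketch below only illustrates the standard textbook versions of those parameters in Python; all item figures are hypothetical.

# Minimal sketch of textbook reorder-point parameters; the figures below are
# hypothetical, not the item data analysed in the thesis.
import math

def safety_stock(z: float, demand_std: float, lead_time: float) -> float:
    """Safety stock for normally distributed demand over the lead time."""
    return z * demand_std * math.sqrt(lead_time)

def reorder_point(avg_demand: float, lead_time: float, ss: float) -> float:
    """Reorder (alert) level: expected lead-time demand plus safety stock."""
    return avg_demand * lead_time + ss

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Economic order quantity used as the replenishment lot size."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

ss = safety_stock(z=1.65, demand_std=12.0, lead_time=2.0)   # ~95 % service level
print(round(reorder_point(avg_demand=40.0, lead_time=2.0, ss=ss), 1))
print(round(eoq(annual_demand=2000, order_cost=50.0, holding_cost=4.0), 1))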

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies the evaluation of software development practices through error analysis. The work presents the software development process, software testing, software errors, error classification and software process improvement methods. The practical part of the work presents the results of an error analysis of one software process and gives improvement ideas for the project. It was noticed that the classification of the error data in the project was inadequate, which made it impossible to use the error data effectively. With the error analysis we were able to show that there were deficiencies in the design and analysis phases, the implementation phase and the testing phase. The work gives ideas for improving error classification and software development practices.
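
An error classification only pays off if the recorded data can be aggregated. As a generic illustration, not taken from the thesis, the sketch below tallies error reports by the lifecycle phase in which each defect was introduced; the categories and records are hypothetical.

# Generic illustration of phase-based error analysis; categories and records
# below are hypothetical, not data from the thesis.
from collections import Counter

errors = [
    {"id": 1, "phase": "design"},
    {"id": 2, "phase": "implementation"},
    {"id": 3, "phase": "design"},
    {"id": 4, "phase": "testing"},
]

by_phase = Counter(e["phase"] for e in errors)
total = sum(by_phase.values())
for phase, count in by_phase.most_common():
    print(f"{phase}: {count} ({100 * count / total:.0f} %)")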

Relevance:

100.00%

Publisher:

Abstract:

After the restructuring of the power supply industry, which in Finland took place in the mid-1990s, free competition was introduced for the production and sale of electricity. Natural monopolies are nevertheless considered the most efficient form of production in the transmission and distribution of electricity, and therefore such companies remained franchised monopolies. To prevent misuse of the monopoly position and to guarantee the rights of the customers, regulation of these monopoly companies is required. One of the main objectives of the restructuring process has been to increase the cost efficiency of the industry. At the same time, demands for service quality are increasing. Therefore, many regulatory frameworks are being, or have been, reshaped so that companies are given stronger incentives for efficiency and quality improvements. Performance benchmarking often has a central role in the practical implementation of such incentive schemes. Economic regulation with performance benchmarking attached to it gives companies directing signals that tend to affect their investment and maintenance strategies. Since asset lifetimes in electricity distribution are typically many decades, investment decisions have far-reaching technical and economic effects. This doctoral thesis addresses the directing signals of incentive regulation and performance benchmarking in the field of electricity distribution. The theory of efficiency measurement and the most common regulation models are presented. The chief contributions of this work are (1) a new kind of analysis of the regulatory framework, in which the actual directing signals of regulation and benchmarking for electricity distribution companies are evaluated, (2) a methodology and a software tool for analysing the directing signals of regulation and benchmarking in the electricity distribution sector, and (3) an analysis of real-life regulatory frameworks with the developed methodology and further development of the regulation model from the viewpoint of the directing signals. The results of this study have played a key role in the development of the Finnish regulatory model.
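
The abstract refers to incentive regulation models without spelling one out. As a hedged illustration only, the sketch below computes allowed revenue under a generic RPI-X style revenue cap, where the efficiency requirement X could be derived from benchmarking; the formula and figures are assumptions, not the Finnish model analysed in the thesis.

# Hedged illustration of a generic RPI-X revenue cap; formula and figures are
# assumptions, not the regulation model analysed in the thesis.
def allowed_revenue(previous_revenue: float, inflation: float, x_factor: float) -> float:
    """Next period's revenue cap: last cap adjusted by inflation minus an
    efficiency requirement (one possible 'directing signal' from benchmarking)."""
    return previous_revenue * (1.0 + inflation - x_factor)

cap = 100.0  # million EUR, hypothetical starting cap
for year in range(1, 5):
    cap = allowed_revenue(cap, inflation=0.02, x_factor=0.03)
    print(year, round(cap, 2))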

Relevance:

100.00%

Publisher:

Abstract:

Agile software development methods are currently in vogue, and many software development organizations have already implemented agile methods or are planning to do so. The objective of this thesis is to define how agile software development methods can be implemented in a small organization. The agile methods covered in this thesis are Scrum and XP. The key practices of both methods are analysed and compared with the waterfall method. The thesis also defines an implementation strategy and the actions needed to implement agile methods in a small organization. In practice, the organization must prepare well, and all needed metrics must be defined before the implementation starts. Three sample projects in which agile methods were implemented are introduced. The experiences from these projects were encouraging, although the set of sample projects was too small to yield reliable results.

Relevance:

100.00%

Publisher:

Abstract:

The process computer (PTK) of the Mertaniemi power plants was renewed in the spring of 2005. The purpose of this work has been to help correct errors in the PTK and to map its deficiencies, concentrating in particular on building the process reporting. The beginning of the work presents the technical data of the Mertaniemi power plant and background information on the procurement of the PTK. For the new PTK system, the hardware, the application and the basic software are described, as is the data transfer between the PTK and other systems. The naming of PTK variables is presented to make it easier to understand the meanings of the tags used in the work. The development of process reporting covers the need for the reports, their content and how the reports were built. Emission reporting is presented as its own sub-area, because power plants are required to monitor their emissions in accordance with regulatory requirements and EU directives. In addition to the reports, shared trend and workspace displays were created to make it easier to follow process values. The problem areas of the PTK that are addressed are errors in variable identifiers and names and the verification of PTK calculations. The variable names and calculations were checked while building the process reporting and in cooperation with Metso Automation Oy, the supplier of the PTK system. Correcting the emission calculation was particularly important.
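
The thesis does not reproduce its emission calculation in the abstract, so the sketch below only shows the common pattern of multiplying fuel energy by a fuel-specific emission factor; the factors and consumption figures are hypothetical.

# Generic emission-reporting calculation: fuel energy times an emission
# factor. Factors and figures below are hypothetical, not the corrected PTK
# calculation described in the thesis.
EMISSION_FACTORS_T_CO2_PER_MWH = {  # hypothetical values
    "heavy_fuel_oil": 0.279,
    "natural_gas": 0.202,
}

def co2_tonnes(fuel: str, fuel_energy_mwh: float) -> float:
    """CO2 emissions for the reporting period from one fuel."""
    return EMISSION_FACTORS_T_CO2_PER_MWH[fuel] * fuel_energy_mwh

print(round(co2_tonnes("heavy_fuel_oil", 1500.0), 1))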

Relevance:

100.00%

Publisher:

Abstract:

Keeping track of software assets and managing software installations in IT environments can be hard, especially as the size and diversity of the environment grow. How can software be installed and uninstalled efficiently and cost-effectively? Have too few or too many software licenses been purchased? If installed, is the software actually in use? Software Asset Management (SAM) is a process that involves managing and optimizing the purchase, deployment, maintenance, utilization, and disposal of software applications within an organization. This master’s thesis describes a Software Lifecycle Management Framework designed to provide solutions to the multitude of challenges within SAM. The main objectives in designing the framework were to provide a set of tools for controlling software assets during their entire lifecycle while minimizing the costs related to owning and managing them.
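
The questions posed in the abstract (whether enough licenses were purchased and whether installed software is actually used) reduce to comparing a few counts per product. The sketch below is a generic illustration of that comparison; the data model, product names and figures are assumptions, not the framework built in the thesis.

# Generic license-position check; product names and counts are hypothetical,
# not data from the framework described in the thesis.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    licenses_purchased: int
    installations: int
    active_users: int  # e.g. installations with recent usage metering

    def license_gap(self) -> int:
        """Positive: over-licensed (candidates for reharvesting); negative: under-licensed."""
        return self.licenses_purchased - self.installations

    def unused_installations(self) -> int:
        return self.installations - self.active_users

for p in [Product("CAD Suite", 100, 80, 55), Product("DB Server", 10, 14, 14)]:
    print(p.name, "gap:", p.license_gap(), "unused:", p.unused_installations())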

Relevance:

100.00%

Publisher:

Abstract:

Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems. As a result, the data is inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept that creates cross-references between customers, suppliers and business units and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model that enables analysing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM and then the phases of system planning and implementation. The case part describes the background, the as-is situation, the project definition and the evaluation criteria, and summarizes the key results of the thesis. The final chapter, Conclusions, combines the common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To justify the project and find funding for it, the business benefits have to be defined and their realization has to be monitored. The thesis identified six success factors for an MDM system: a well-defined business case; data management and monitoring; data models and structures defined and maintained; governance of customer and supplier data, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter goes through these factors on a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. The chapter can be used as guidance in a master data management project.
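
The cross-referencing the abstract mentions is, at its core, a mapping from each ERP system's local identifier to one master record. The sketch below is a minimal illustration of such a cross-reference table; the system names, identifiers and master record are hypothetical, not taken from the case.

# Minimal master-data cross-reference: local ERP customer IDs mapped to one
# golden record. System names, IDs and records below are hypothetical.
master_customers = {
    "MDM-001": {"name": "ACME OY", "country": "FI"},
}

# (source system, local id) -> master id
cross_reference = {
    ("ERP_EU", "C-1041"): "MDM-001",
    ("ERP_US", "38271"): "MDM-001",
}

def resolve(source: str, local_id: str) -> dict:
    """Return the enterprise-wide master record for a local ERP customer."""
    master_id = cross_reference[(source, local_id)]
    return {"master_id": master_id, **master_customers[master_id]}

print(resolve("ERP_US", "38271"))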

Relevance:

100.00%

Publisher:

Abstract:

Dirt counting and dirt particle characterisation of pulp samples is an important part of quality control in pulp and paper production, and an automatic image analysis system for characterising dirt particles in various pulp samples is therefore needed. Existing image analysis systems, however, use a single threshold to segment the dirt particles in different pulp samples, which limits their precision. Designing an automatic image analysis system that overcomes this deficiency is therefore useful. In this study, a developed version of the Niblack thresholding method is proposed, which selects the threshold based on the number of segmented particles. In addition, Kittler thresholding is utilised. Both thresholding methods determine the dirt count of the different pulp samples accurately when compared with visual inspection and the Digital Optical Measuring and Analysis System (DOMAS). The minimum resolution needed for acquiring a scanner image is also defined. Among the dirt particle features considered, curl shows a sufficient difference to discriminate between bark and fibre bundles in different pulp samples. Three classifiers, k-Nearest Neighbour, Linear Discriminant Analysis and Multi-layer Perceptron, are utilised to categorise the dirt particles. Linear Discriminant Analysis and Multi-layer Perceptron are the most accurate in classifying the dirt particles segmented by Kittler thresholding with morphological processing. The results show that the dirt particles are successfully categorised as bark or fibre bundles.
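
The standard Niblack method, on which the proposed method builds, computes a local threshold from the mean and standard deviation inside a sliding window. The sketch below shows only that baseline formula; the particle-count-based threshold selection developed in the thesis is not reproduced, and the window size and k value are common defaults rather than the thesis's settings.

# Baseline Niblack thresholding: T = local_mean + k * local_std. Window size
# and k are common defaults, not the parameters used in the thesis.
import numpy as np
from scipy.ndimage import uniform_filter

def niblack_threshold(image: np.ndarray, window: int = 25, k: float = -0.2) -> np.ndarray:
    """Return a boolean mask of pixels darker than the local Niblack threshold."""
    img = image.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
    threshold = mean + k * std
    return img < threshold  # dark dirt particles on a lighter background

mask = niblack_threshold(np.random.randint(0, 256, (100, 100)))
print(mask.sum(), "pixels flagged as dirt")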

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing enables on-demand network access to shared resources (e.g., computation, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing refers both to the applications delivered as services over the Internet and to the hardware and system software in the data centers. Software as a service (SaaS) is part of cloud computing and one of its service models. SaaS is software deployed as a hosted service and accessed over the Internet: the consumer uses the provider's applications running in the cloud. SaaS separates the possession and ownership of software from its use. The applications can be accessed from any device through a thin client interface; a typical SaaS application is used with a web browser and priced monthly. In this thesis, the characteristics of cloud computing and SaaS are presented, and a few implementation platforms for SaaS are discussed. Four different SaaS implementation cases and one transformation case are then examined. The pros and cons of SaaS are studied based on literature references and on an analysis of the SaaS implementations and the transformation case. The analysis is done from both the customer's and the service provider's point of view. In addition, the pros and cons of on-premises software are listed. The purpose of this thesis is to find out when SaaS should be utilized and when it is better to choose traditional on-premises software. The qualities of SaaS bring many benefits to both the customer and the provider. A customer should utilize SaaS when it provides cost savings, ease of use, and scalability over on-premises software. SaaS is reasonable when the customer does not need tailoring but only a simple, general-purpose service, and the application supports the customer's core business. A provider should utilize SaaS when it offers cost savings, scalability, faster development, and a wider customer base compared with on-premises software. It is wise to choose SaaS when the application is cheap, aimed at the mass market, needs frequent updating, needs high-performance computing, needs to store large amounts of data, or there is some other direct value from the cloud infrastructure.
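
The customer-side cost argument can be made concrete with a simple break-even comparison between a monthly SaaS subscription and an up-front on-premises purchase. The figures below are hypothetical and only illustrate the kind of calculation meant, not numbers from the thesis cases.

# Hypothetical break-even comparison between a SaaS subscription and an
# on-premises license; all figures are illustrative assumptions.
def saas_cost(months: int, users: int, fee_per_user_month: float) -> float:
    return months * users * fee_per_user_month

def on_premises_cost(months: int, license_price: float, yearly_maintenance: float) -> float:
    return license_price + yearly_maintenance * (months / 12)

for months in (12, 36, 60):
    saas = saas_cost(months, users=50, fee_per_user_month=20.0)
    onprem = on_premises_cost(months, license_price=40_000.0, yearly_maintenance=8_000.0)
    print(months, "months:", "SaaS", saas, "vs on-premises", onprem)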

Relevance:

100.00%

Publisher:

Abstract:

Formal software development processes and well-defined development methodologies are nowadays seen as the definite way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety and the fact that practically every software development organization has its own unique set of development processes and methods have created a profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. The process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles, which represent the dissertation research on process modeling carried out over a period of approximately five years. The research follows the classical engineering research discipline, in which the current situation is analyzed, a potentially better solution is developed and finally its implications are analyzed. The research applies a variety of research techniques ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as light-weight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a light-weight modeling technique, which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to more easily compare different software development situations and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. The work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies in which the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
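
SPEM describes development processes in terms of roles, tasks and work products. As a rough illustration of what a light-weight, machine-readable process model can look like (not the notation used in the dissertation), the sketch below encodes one task together with its performer and its input and output work products; all element names are invented.

# Rough illustration of a SPEM-style process element: a task performed by a
# role, consuming and producing work products. Element names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    performer: str                                     # role responsible for the task
    inputs: List[str] = field(default_factory=list)    # work products consumed
    outputs: List[str] = field(default_factory=list)   # work products produced

review = Task(
    name="Code review",
    performer="Developer",
    inputs=["Change set"],
    outputs=["Review record"],
)
print(f"{review.performer} performs '{review.name}' producing {review.outputs}")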

Relevance:

100.00%

Publisher:

Abstract:

Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction, created by R.-J. Back, is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion in the thesis covers various aspects of software development that relate to Stepwise Feature Introduction. More specifically, we evaluate the paradigm based on the common practices of object-oriented programming and design and on agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
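
In Stepwise Feature Introduction, each layer extends the previous one with a single new feature while preserving the behaviour already built. The sketch below is a toy illustration of that layering using plain subclassing; the class names are invented, and the actual paradigm also involves correctness reasoning not shown here.

# Toy illustration of layered feature introduction: each layer extends the
# previous one with one feature. Class names are invented examples.
class TextBuffer:                      # layer 1: the base feature
    def __init__(self) -> None:
        self.text = ""

    def insert(self, s: str) -> None:
        self.text += s

class UndoableTextBuffer(TextBuffer):  # layer 2: adds undo on top of layer 1
    def __init__(self) -> None:
        super().__init__()
        self._history: list = []

    def insert(self, s: str) -> None:
        self._history.append(self.text)  # remember the previous state
        super().insert(s)                # reuse the lower layer unchanged

    def undo(self) -> None:
        if self._history:
            self.text = self._history.pop()

buf = UndoableTextBuffer()
buf.insert("hello")
buf.undo()
print(repr(buf.text))  # ''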

Relevance:

100.00%

Publisher:

Abstract:

Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management, however, is not only about managing technology but also about managing processes and people. This thesis focuses on traditional data management and on performance management of production processes, both of which can be seen as prerequisites for long-lasting development. Some operative BI solutions are also considered as part of the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and how these requirements affect its efficiency. The research is carried out as a theoretical literature study on the subject and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through the theoretical frameworks and through active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
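
The ETL pattern named in the results can be illustrated with a minimal extract-transform-load pipeline. The sketch below uses an in-memory SQLite database purely for illustration; the table names, fields and figures are hypothetical, not Finnsugar's systems.

# Minimal ETL sketch: extract rows from an operative source, transform units,
# load into a reporting table. Tables, fields and values are hypothetical.
import sqlite3

def extract(conn: sqlite3.Connection) -> list:
    return conn.execute("SELECT batch_id, produced_kg, line FROM production").fetchall()

def transform(rows: list) -> list:
    # convert kilograms to tonnes and normalise the production-line code
    return [(batch, produced_kg / 1000.0, line.upper()) for batch, produced_kg, line in rows]

def load(conn: sqlite3.Connection, rows: list) -> None:
    conn.executemany("INSERT INTO dw_production(batch_id, produced_t, line) VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE production(batch_id TEXT, produced_kg REAL, line TEXT)")
conn.execute("CREATE TABLE dw_production(batch_id TEXT, produced_t REAL, line TEXT)")
conn.execute("INSERT INTO production VALUES ('B1', 2500.0, 'l1')")
load(conn, transform(extract(conn)))
print(conn.execute("SELECT * FROM dw_production").fetchall())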

Relevance:

100.00%

Publisher:

Abstract:

An empirical study was conducted in the area of software engineering to study the relationships between development, testing and intended software quality. International standards served as the starting point of the study. For the analysis, a round of interviews was conducted and transcribed. It was found that interaction between people is critical, especially in transferring knowledge and the processes required by the standards. The standards are communicated through interaction, and learning processes are involved before compliance is reached. One of the results was that testing is the key to sufficient quality. The overall outcome was that successful interaction, sufficient testing and compliance with the standards, combined with good motivation, may provide the most repeatable intended quality.

Relevance:

100.00%

Publisher:

Abstract:

The primary objective of this master's thesis is to study the development of electronic financial administration and how it has been reflected in Finnish and international professional journal articles in the field between 1997 and 2013. The aim is to identify, based on the articles, factors that advance and factors that slow down the development of electronic financial administration, as well as the kinds of future prospects the articles create for it. A further aim is to find similarities and differences between the national and the international writing. The thesis is a qualitative study, and the research methods used are content analysis, content itemisation, thematic analysis and comparative research. The empirical material of the thesis consists of articles on electronic financial administration published in Tilisanomat, Balanssi and Accountancy between 1997 and 2013. According to the results, the development of electronic financial administration is also visible in the professional journals. Information systems and the legislation that enables their use are the basic prerequisites of electronic financial administration. The greatest drivers of development are the public authorities, electronic invoicing and standardisation, while the lack of uniform standards and prevailing attitudes are seen as slowing development down. Future prospects of electronic financial administration include standardised processes and concepts across the whole field of financial administration and regulatory reporting using the XBRL language. The largest differences between national and international development are observed in electronic invoicing and XBRL reporting. The conclusion is that factors slowing down and advancing the development of electronic financial administration can be identified from the articles, and that the writing gives indications of future development. Electronic financial administration develops according to the will of the government and the legislation of each country. As further research, the thesis could be broadened internationally by including more international journals and international legislation in the analysis.