915 results for Building Information Model
Abstract:
To describe the shift of purchasing from an administrative to a strategic function, academics have put forward maturity models that help practitioners compare their purchasing activities to industry top performers and best practices. However, none of the models aims to assess purchasing maturity from the after-sales point of view, even though after-sales activities are acknowledged as a relevant source of revenue, profit and competitive advantage in most manufacturing firms. The maturity of purchasing and supply management practices has a large impact on the overall performance of the spare parts supply chain and ultimately on value creation and relationship building for the end customer. The research was done as a case study for a European after-sales organization that is part of a globally operating industrial firm specialized in heavy machinery. The study mapped the current state of purchasing practices in the case organization and distinguished the relevant areas for future development. The study was based on the purchasing maturity model developed by Schiele (2007) and also investigated how applicable the maturity model is in the spare parts supply chain context. Data for the assessment was gathered through five expert interviews within the case organization and with other parties involved in the company's spare parts supply chain. An inventory management dimension was added to the original maturity model in order to better capture the important areas of a spare parts supply chain. The five added questions were derived from the spare parts management literature and verified as relevant by the case organization's personnel.
Results indicate that the largest needs for development in the case organization are: better collaboration between the sourcing and operative procurement functions, use of installed base information in spare parts management, development of a training plan for new buyers, assessment of aligned KPIs between the supply chain parties, and better definition of the role of after-sales sourcing. The purchasing maturity model used in this research worked well in the HR & Leading, Controlling and Inventory Management dimensions. The assessment was more difficult to conduct in the Supplier-related processes, Process integration and Organizational structure dimensions, mainly because the assessment in these sections would in part require a more company-wide assessment. The results also indicate that the purchasing maturity model developed by Schiele (2007) captures the relevant areas in the spare parts supply chain as well.
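As an illustration of how such a dimension-based maturity assessment can be tallied, the sketch below averages interview scores per dimension and flags dimensions below a threshold as development areas. The dimension names follow the abstract, but the 1–4 scale, the scores and the threshold are invented for the example and are not Schiele's actual instrument:

```python
# Illustrative sketch only (simplified scoring, not Schiele's actual scale):
# averaging interview answers per dimension to profile purchasing maturity.
from statistics import mean

# Hypothetical answers on a 1-4 maturity scale, keyed by model dimension
# (dimension names taken from the abstract; the scores are invented).
answers = {
    "HR & Leading": [3, 3, 4],
    "Controlling": [2, 3],
    "Inventory Management": [1, 2, 2, 1],
    "Process Integration": [2, 2],
}

def maturity_profile(answers):
    """Average the interview scores within each dimension."""
    return {dim: round(mean(scores), 2) for dim, scores in answers.items()}

def development_areas(profile, threshold=2.0):
    """Dimensions at or below the threshold are flagged for development."""
    return sorted(d for d, s in profile.items() if s <= threshold)

profile = maturity_profile(answers)
print(development_areas(profile))  # ['Inventory Management', 'Process Integration']
```

With these invented scores, the low-scoring dimensions surface immediately, which mirrors how the case assessment distinguished areas for future development.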
Abstract:
This Master's thesis was carried out for a publicly owned limited company that provides information and communication technology and medical engineering (ICMT) services for public healthcare and municipal operations. The goal of the work was to build both a traditional cost accounting model and an activity-based costing model for the company and, by comparing them, to find the most suitable cost accounting solution for the ICT sector. This goal was supported by an interview study of Finnish ICT companies concerning the cost accounting and pricing methods they use. The theoretical part presents cost accounting and pricing methods, their principles, and their applicability to the ICT sector. The empirical part describes the construction of the traditional cost accounting model and the progress of the activity-based costing project in the target company. The interview study maps the cost accounting methods and practices, as well as the pricing principles, used by leading Finnish ICT companies. The traditional cost accounting method proved to be the most suitable for the target company, and the results of the interview study supported this choice. Most of the interviewed companies used a cost-based accounting system. The most common accounting method in Finnish ICT companies was contribution margin accounting, and pricing was likewise primarily based on product cost calculation.
Abstract:
Vantaan Energia is building a waste-to-energy plant that meets environmental requirements in Långmossebergen in eastern Vantaa. The plant will use source-separated municipal waste unfit for recycling, together with natural gas, as fuel. The Helsinki Region Environmental Services Authority HSY will supply about 80% of the annual waste fuel. This thesis presents an operating model for managing HSY's waste incineration material flows. The purpose of the model is to provide guidance on the treatment of waste material flows before the incineration plant; in addition, the model aims to reduce the amount of bottom ash. The model covers the treatment of mixed household waste, mixed waste from small waste stations, mixed construction and demolition waste, and commercial and industrial waste. The composition of the waste streams was determined from data found in the literature and supplemented with data from a sorting study conducted in summer 2013. The results showed that by improving the sorting of mixed waste at the small waste stations, HSY can achieve significant cost savings. The study found that plasterboard would be most profitable to collect in its own container at the small waste stations. Mixed construction and demolition waste was found to be unsuitable for direct incineration at the plant, and it cannot be landfilled untreated after 2020; the thesis therefore proposes that mixed construction and demolition waste be directed to a sorting facility for treatment before final disposal. Based on the results of the study, following the operating model can reduce the amount of bottom ash by almost half of the original estimate.
Abstract:
The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. As an answer to these pressures, modern systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction.
An emulator is introduced to allow performance estimation of the solution at high abstraction levels that is as accurate as possible. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
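The correct-by-design idea behind the OCL constraints can be illustrated with a small sketch: a checker that validates a platform model and reports well-formedness violations before any code generation takes place. This is plain Python with invented names (Segment, Platform, the specific rules), not the actual SegBus DSL or its UML profile:

```python
# Minimal sketch (assumed names, not the SegBus toolchain): an OCL-style
# structural check over a platform model, in the spirit of
# correct-by-design platform construction.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Segment:
    name: str
    devices: List[str] = field(default_factory=list)

@dataclass
class Platform:
    segments: List[Segment]

def check_platform(p: Platform) -> List[str]:
    """Return a list of constraint violations (empty if the model is valid)."""
    violations = []
    if not p.segments:
        violations.append("platform must contain at least one segment")
    for seg in p.segments:
        if not seg.devices:
            violations.append(f"segment {seg.name} has no devices")
    # Device names must be unique across the whole platform.
    all_devices = [d for s in p.segments for d in s.devices]
    if len(all_devices) != len(set(all_devices)):
        violations.append("device names must be unique across segments")
    return violations

good = Platform([Segment("seg0", ["dsp0", "mem0"])])
bad = Platform([Segment("seg0", []), Segment("seg1", ["dsp0", "dsp0"])])
print(check_platform(good))  # []
```

Rejecting malformed models at construction time, rather than after synthesis, is what makes the resulting platforms correct by design.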
Abstract:
This study is qualitative action research by nature, with elements of personal design in the form of the construction of a tangible model implementation framework. The empirical data was gathered via two questionnaires in connection with four workshop events with twelve individual participants: five represented maintenance customers, three maintenance service providers and four equipment providers. There are two main research objectives, corresponding to the two complementary focus areas of this thesis. Firstly, the value-based life-cycle model, whose first version had already been developed prior to this thesis, requires updating in order to increase its real-life applicability as an inter-firm decision-making tool in industrial maintenance. This first research objective is fulfilled by improving the appearance, intelligibility and usability of the above-mentioned model; certain new features are also added. The workshop participants from the collaborating companies were reasonably pleased with the changes made, although the model's intelligibility will require further attention in the future, as the main results, charts and values were all considered slightly hard to understand. The upgraded model's appearance and the added new features satisfied them the most. Secondly, and more importantly, the premises of the model's possible inter-firm implementation process need to be considered. This second research objective is delivered in two consecutive steps. First, a bipartite open-books-supported implementation framework is created and its characteristics are discussed in theory. Afterwards, the prerequisites and pitfalls of increasing inter-organizational information transparency are studied in an empirical context. One of the main findings was that the organizations are not yet prepared for network-wide information disclosure, as dyadic collaboration was favored instead.
However, they would at least be willing to share information bilaterally. Another major result was that the present state of the companies' cost accounting systems will need enhancement before implementation, since accurate and sufficiently detailed maintenance data is not available. It will also be crucial to create a supporting and mutually agreed network infrastructure, as there are hardly any collaborative models, methods or tools currently in use. Lastly, the essential questions of mutual trust and the predominant purchasing strategies are important for cooperation: if inter-organizational activities are expanded, a more relational approach should be favored. Mutual trust was also recognized as a significant cooperation factor, but it is hard to measure in practice.
Abstract:
The goal of this thesis is to build a viral marketing management framework for a Finnish medium-sized gaming company. This is achieved by first building a theoretical five-step management process framework based on the literature, then analyzing the case company's current model and giving recommendations for developing its own management process. In addition, viral marketing research is still at an early stage, so this study proposes its own take on the definition in the theory part. The empirical part is based on qualitative interviews, campaign material and secondary sources, and aims to establish and analyze the case company's current viral marketing state and to give recommendations. The final outcome of the study is a general, theoretical management framework for viral marketing campaigns and specific recommendations for the case company.
Abstract:
This pro gradu thesis discusses generating competitive advantage through competitor information systems. The structure of the thesis follows the WCA model by Alter (1996), in which a business process is influenced by three separate but connected elements: information, technology, and process participants. The main research question is how competitor information can be incorporated into, or made into, a tool creating competitive advantage. The research subquestions are: How does competitor information act as part of the business process creating competitive advantage? How is a good competitor information system situated and structured in an organisation? How can management help information generate competitive advantage in the business process with participants, information, and technology? The thesis discusses each of the elements separately, but the elements are connected to each other and to competitive advantage. Information is discussed by delving into competitor information and competitor analysis. Competitive intelligence and competitor analysis require commitment throughout the organisation, including top management, the desire to perform competitive intelligence and the desire to use its end products. To be successful, systematic competitive intelligence and competitor analysis require vision, willingness to strive for the goals set, and clear strategies to proceed. Technology is discussed by examining the function competitor information systems play and the place they occupy within an organisation. In addition, the basic infrastructure of competitor information systems, and the problems that can plague them, are discussed.
For competitor information systems to be useful and worthy of the resources it takes to develop and maintain them, they require ongoing resource allocation and high-quality information, and to justify their existence, business process participants need to maintain and utilize them at all levels. Business process participants are discussed through management practices. The thesis discusses ways to manage information, technology, and process participants when the goal is to generate competitive advantage through competitor information systems. This is possible when information is treated as a resource with value, technology is deployed with a strategy in order to succeed within the organisation, and process participants are treated as an important resource. Generating competitive advantage through competitor information systems is possible when the elements of information, technology, and business process participants all align advantageously.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit, and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, RVC-CAL, is a very expressive language that in the general case requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion; this kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
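The firing rule described above (a node fires independently once enough tokens are available on its input queues, consuming inputs and producing outputs) can be sketched in a few lines. This is plain Python, not RVC-CAL; the two-node graph and the trivial scheduler are invented for the example:

```python
# Minimal sketch (plain Python, not RVC-CAL): nodes communicate only through
# FIFO queues and fire independently once enough inputs are available.
from collections import deque

class Node:
    def __init__(self, func, arity):
        self.func = func          # computation performed on each firing
        self.inputs = [deque() for _ in range(arity)]
        self.output = deque()     # queue read by downstream nodes

    def can_fire(self):
        # Firing rule: at least one token available on every input queue.
        return all(q for q in self.inputs)

    def fire(self):
        tokens = [q.popleft() for q in self.inputs]  # consume inputs
        self.output.append(self.func(*tokens))       # produce output

# A two-node graph: double each value, then sum pairs of doubled values.
double = Node(lambda x: 2 * x, arity=1)
add = Node(lambda a, b: a + b, arity=2)

for v in (1, 2, 3, 4):
    double.inputs[0].append(v)

# A trivial dynamic scheduler: fire any node whose firing rule is satisfied,
# routing the tokens of one edge alternately onto add's two input queues.
while double.can_fire():
    double.fire()
while double.output:
    add.inputs[0].append(double.output.popleft())
    if double.output:
        add.inputs[1].append(double.output.popleft())
while add.can_fire():
    add.fire()

print(list(add.output))  # [6, 14]
```

Because the queues are the only communication between the nodes, the two `while` loops over `double` and `add` could run on separate cores without any further synchronization of shared state, which is exactly the property the scheduling work exploits.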
Abstract:
Even though e-commerce systems are expected to have many advantages compared to traditional ways of doing business, this is not always the reality. Lack of trust is still said to be one of the most important barriers to online shopping. In traditional stores, trust has usually been established through direct contact between the customer and the company or its personnel; in online stores there is no direct interaction. The purpose of this thesis is to identify the key antecedents of online trust and to distinguish between effective and ineffective practices. A model of how consumers establish initial trust towards an unknown online vendor was proposed based on previous theories. The model was tested empirically through an online survey targeted at higher-degree students in Finland and in Germany. The data confirmed the proposed view that trusting intentions are affected by individual characteristics, characteristics of the company, and characteristics of the website. Additionally, national differences were found between the Finnish and German respondents. The data suggested that online vendors can convey a message of trustworthiness by improving the information quality and overall usefulness of the website. The perceived risk of online shopping was found to depend especially on general trust in the Internet, service quality and ease of use. A trustworthy online store should include several payment methods as well as means to access and modify given data. Vendors should also make sure that inquiries are addressed quickly, transactions are confirmed automatically, and customers can track their orders. A model that includes three different sources of trust should contribute to the theoretical understanding of trust formation in online stores. The resulting list of trust antecedents can also be used as a checklist when e-commerce practitioners wish to optimize trust building.
Abstract:
BCM (Business Continuity Management) is a holistic management process aiming at ensuring business continuity and building organizational resilience. Maturity models offer organizations a tool for evaluating their current maturity in a certain process. In recent years BCM has been subject to international ISO standardization, while organizations' interest in benchmarking their state of BCM against standards, and the use of maturity models for these assessments, has increased. However, although new standards have been introduced, very little research attention has been paid to reviewing the existing BCM maturity models, especially in the light of the new ISO 22301 standard for BCM. In this thesis the existing BCM maturity models are carefully evaluated to determine whether they could be improved. To accomplish this, the compliance of the existing models with the ISO 22301 standard is measured and a framework for assessing a maturity model's quality is defined. After carefully evaluating the existing frameworks for maturity model development and evaluation, the approach suggested by Becker et al. (2009) was chosen as the basis for the research. In addition to the procedural model, a set of seven research guidelines proposed by the same authors was applied, drawing on the design-science research guidelines suggested by Hevner et al. (2004). Furthermore, the existing models' form and function were evaluated to address their usability. Based on this evaluation, the existing BCM maturity models were found to have shortcomings in each dimension of the evaluation. Utilizing the best of the existing models, a draft version of an enhanced model was developed. This draft model was then iteratively developed by conducting six semi-structured interviews with BCM professionals in Finland, with the aim of validating and improving it.
As a result, a final version of the enhanced BCM maturity model was developed, conforming to the seven key clauses of the ISO 22301 standard and the maturity model development guidelines suggested by Becker et al. (2009).
Abstract:
The purpose of this exploratory research is to study the role of emotional branding in building brand personality. The research is conducted from the perspective of the consumer, more specifically Finnish Generation Y females. The aim of the thesis is to gain insights into and understanding of the key concepts and to contribute to the Generation Y literature. In addition, the research examines the effect of certain cultural factors on the process of building brand personality. The research was conducted as an embedded single-case study, in which qualitative data was collected through semi-structured interviews with a sample of six consumers and personal observation within one of the case company's concept stores. In order to triangulate the data, secondary sources were utilized to gain more information about the case company. The results indicated a connection between emotional branding and the formation of brand personality, which can be shaped according to the brand personality drivers. Congruence with consumers' self-conceptualization and set of values was discovered to strengthen the emotional bonding. As the end result, the research was able to clarify the process thinking behind emotional branding.
Abstract:
This thesis was carried out as a case study of the company YIT in order to clarify the severest risks for the company and to build a method for project portfolio evaluation. The target organization creates new living environments by constructing residential buildings, business premises, infrastructure and entire areas, worth EUR 1.9 billion in 2013. The company has noted that project portfolio management needs more information about the structure of the project portfolio and the possible influence of a market shock situation. The major risks for the company, and the most appropriate metrics to examine, were evaluated by interviewing the executive staff. At the moment, sales risk was estimated to have the biggest impact on the company's business. Therefore, a project portfolio evaluation model was created, and three different scenarios for the company's future were constructed in order to identify the scale of a possible market shock situation. The created model was tested with public and descriptive figures of YIT in a one-year-long market shock, and the impact on the different metrics was evaluated. The study was conducted using a constructive research methodology. The results indicate that the company has a notable sales risk in certain sections of its business portfolio.