6 results for conventional model

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

30.00%

Publisher:

Abstract:

This research report illustrates and examines new operation models for decreasing fixed costs and transforming them into variable costs in the paper industry. The report presents two cases: a new operation model for material logistics in maintenance and an examination of forklift truck fleet outsourcing solutions. Conventional material logistics in maintenance operations is illustrated and some problems related to the conventional operation are identified. A new operation model that solves some of these problems is presented, including descriptions of procurement and service contracts and sources of added value. Forklift truck fleet outsourcing solutions are examined by illustrating the responsibilities of a host company and a service provider both before and after outsourcing. The customer buys outsourcing services in order to improve its investment productivity. The mechanism by which these services affect the customer company's investment productivity is illustrated.

Relevance:

30.00%

Publisher:

Abstract:

This thesis concentrates on developing a practical local approach methodology, based on micromechanical models, for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach, namely the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation, have been studied in detail. Firstly, considerable effort was devoted to the numerical integration and computer implementation of the non-trivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration, and thus greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and of other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true mid-point algorithm (α = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure. Secondly, an assessment has been made of the consistency of current local failure criteria for ductile fracture: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit load failure criterion. Significant differences in the predictions of ductility by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void-matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model; hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant, and the initial void volume fraction and/or the void nucleation parameters essentially control the material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound. Thirdly, a local approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. By using void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be well predicted with the present methodology.
This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local approach methodology is in the analysis of fracture behaviour and crack development, as well as in the structural integrity assessment of practical problems where non-homogeneous materials are involved. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
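The thesis implements this model as an ABAQUS UMAT rather than as published code, but a minimal sketch of the Gurson-Tvergaard yield function that the integration algorithms operate on may help fix ideas. The function name is invented here, and the default q-parameters are Tvergaard's commonly cited calibration, an assumption rather than values taken from the thesis:

```python
import numpy as np

def gt_yield(sigma, sigma_y, f, q1=1.5, q2=1.0, q3=2.25):
    """Gurson-Tvergaard yield function for a 3x3 Cauchy stress tensor.

    Returns Phi; plastic flow occurs when Phi >= 0. The defaults are
    Tvergaard's usual calibration, with q3 = q1**2.
    """
    sigma_m = np.trace(sigma) / 3.0            # hydrostatic (mean) stress
    s = sigma - sigma_m * np.eye(3)            # deviatoric part of the stress
    q = np.sqrt(1.5 * np.tensordot(s, s))      # von Mises equivalent stress
    return ((q / sigma_y) ** 2
            + 2.0 * q1 * f * np.cosh(1.5 * q2 * sigma_m / sigma_y)
            - (1.0 + q3 * f ** 2))

# Uniaxial tension at the matrix yield stress: a small void volume fraction
# pushes Phi above zero, i.e. the porous material yields slightly earlier.
sigma = np.diag([250.0, 0.0, 0.0])             # MPa
print(gt_yield(sigma, sigma_y=250.0, f=0.01))  # small positive value
```

In the generalized mid-point algorithms the thesis proposes, the plastic flow direction is evaluated at an intermediate point of the strain increment controlled by a parameter α between 0 (Euler forward) and 1 (Euler backward); the abstract reports α = 0.5 as the most accurate choice when the deviatoric strain increment is radial to the yield surface.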

Relevance:

30.00%

Publisher:

Abstract:

This research report illustrates and examines new operation models for decreasing fixed costs and transforming them into variable costs in the paper industry. The report presents two cases: a new operation model for material logistics in maintenance and an examination of forklift truck fleet outsourcing solutions. Conventional material logistics in maintenance operations is illustrated and some problems related to the conventional operation are identified. A new operation model that solves some of these problems is presented, including descriptions of procurement and service contracts and sources of added value. Forklift truck fleet outsourcing solutions are examined by illustrating the responsibilities of a host company and a service provider both before and after outsourcing. The customer buys outsourcing services in order to improve its investment productivity. The mechanism by which these services affect the customer company's investment productivity is illustrated.

Relevance:

30.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications are, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking.
The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion; this kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
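RVC-CAL is a dedicated dataflow language, but the core mechanics described above, actors that fire only when the tokens queued on their inputs satisfy a firing rule, can be sketched in a few lines of Python. The class, the actor names, and the fully dynamic scheduler below are illustrative assumptions, not the thesis's implementation:

```python
from collections import deque

class Actor:
    """A dataflow node: it may fire only when each input queue holds
    at least the number of tokens given by its firing rule."""
    def __init__(self, name, inputs, outputs, consumption, action):
        self.name = name
        self.inputs = inputs            # input FIFO queues (deques)
        self.outputs = outputs          # output FIFO queues (deques)
        self.consumption = consumption  # tokens consumed per input queue
        self.action = action            # maps consumed tokens to produced tokens

    def can_fire(self):
        return all(len(q) >= n for q, n in zip(self.inputs, self.consumption))

    def fire(self):
        consumed = [[q.popleft() for _ in range(n)]
                    for q, n in zip(self.inputs, self.consumption)]
        for queue, produced in zip(self.outputs, self.action(consumed)):
            queue.extend(produced)

# A two-actor pipeline: 'scale' doubles each token, 'pair_sum' adds pairs.
src, mid, sink = deque([1, 2, 3, 4]), deque(), deque()
scale = Actor("scale", [src], [mid], [1], lambda t: [[2 * t[0][0]]])
pair_sum = Actor("pair_sum", [mid], [sink], [2], lambda t: [[sum(t[0])]])

# A naive fully dynamic scheduler: evaluate every firing rule each round.
actors = [scale, pair_sum]
while any(a.can_fire() for a in actors):
    for a in actors:
        if a.can_fire():
            a.fire()

print(list(sink))  # [6, 14]
```

A quasi-static scheduler of the kind the thesis derives would replace most of the can_fire evaluations in this loop with pre-computed static firing sequences, leaving only the few genuinely data-dependent decisions to run-time.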

Relevance:

30.00%

Publisher:

Abstract:

Technological innovations and the advent of digitalization have led retail business into one of its biggest transformations of all time. Consumer behaviour has changed rapidly, and customers are ever more powerful, demanding, tech-savvy and moving across various platforms. These attributes will continue to drive development and robustly restructure the architecture of value creation in the retail business. The largest retail category, grocery, still awaits a real disruption, but the signals for major change are already on the horizon. The first wave of online grocery retail was introduced in the mid-1990s and it thrived until the millennium. Many overreactions, heavy investments and the burst IT bubble almost stagnated the whole industry for a long period of time. The second wave started with a vengeance around 2010. Some research was carried out during the first wave from a single viewpoint of online grocery retail, but without a comprehensive approach to online-offline business model integration. Now the accelerating growth of e-business has initiated an increased interest in examining the transformation from traditional business models towards e-business models, and the integration of the latter into companies' traditional business models. This research strove to examine how digitalization and online channels are affecting the business models of grocery retail, using the business model canvas as an analysis tool. Furthermore, business model innovation and omnichannel retail were presented and suggested as potential solutions for these changes. 21 experts in the online grocery industry were interviewed, and the thoughts of the informants were qualitatively analysed using the business model canvas. The aim of this research was to portray a holistic view of the omnichannel grocery retail business model, and of the value chain in which the case company Arina, along with its partners, operates. The key conclusions showed that the online grocery retail business model is neither an alternative model nor a substitute for the traditional grocery retail business model, though all of the business model elements are to some extent affected by it, but rather a complementary business model that should be integrated into the prevailing, conventional grocery retail business model. A set of business model elements, such as value proposition and distribution channels, were recognized as the most important ones, and sources of innovation within these components were illustrated. Segments for online grocery retail were empirically established as polarized niche markets, in contrast to the segmented mass market of conventional grocery retail. Business model innovation was shown to be a considerable method and conceptual framework by which to arrive at new value propositions that create competitive advantage for the company in the contemporary, changing business environment. Arina as a retailer can be considered an industry model innovator, since it has initiated an entire industry in its market area, in which other players have later embarked, and on which contributors to the value chain, such as Posti, depend to a great extent. Consumer behaviour clearly affects and appears everywhere in the digitalized grocery trade, and it drives customers to multiple platforms where retailers need to be present.
The omnichannel retail business model was suggested as the solution, in which new technologies are utilized, contemporary consumer behaviour is embedded in decision-making, and all segments and their value propositions are served seamlessly across the channels.

Relevance:

30.00%

Publisher:

Abstract:

Phenomena in the cyber domain, especially threats to security and privacy, have proven an increasingly heated topic addressed by different writers and scholars at an increasing pace, both nationally and internationally. However, little public research has been done on the subject of cyber intelligence. The main research question of the thesis was: to what extent is the applicability of cyber intelligence acquisition methods circumstantial? The study was conducted in a sequential manner, starting with defining the concept of intelligence in the cyber domain and identifying its key attributes, followed by identifying the range of intelligence methods in the cyber domain, the criteria influencing their applicability, and the types of operatives utilizing cyber intelligence. The methods and criteria were refined into a hierarchical model. The existing conceptions of cyber intelligence were mapped through an extensive literature study of a wide variety of sources. The established understanding was further developed through 15 semi-structured interviews with experts of different backgrounds, whose wide range of points of view substantially enhanced the perspective on the subject. Four of the interviewed experts participated in a relatively extensive survey based on the constructed hierarchical model of cyber intelligence, which was formulated into an AHP hierarchy and executed in the Expert Choice Comparion online application. It was concluded that intelligence in the cyber domain is an endorsing, cross-cutting intelligence discipline that adds value to all aspects of conventional intelligence; that it bears a substantial amount of characteristic traits, both advantageous and disadvantageous; and that the applicability of cyber intelligence methods is partly circumstantially limited.
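The AHP (analytic hierarchy process) step derives priority weights for the applicability criteria from pairwise comparison judgments, which the Expert Choice Comparion application collects and evaluates. A minimal sketch of the underlying computation is given below; the three-criterion comparison matrix and its values are invented for illustration and are not taken from the thesis:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Derive AHP priority weights from a reciprocal pairwise comparison
    matrix via its principal eigenvector, plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty's random index
    return w, ci / ri                              # weights, consistency ratio

# Hypothetical comparison of three applicability criteria on Saaty's 1-9
# scale, e.g. criterion 1 judged moderately more important (3) than 2.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(A)
print(weights, cr)  # weights ~ [0.65, 0.23, 0.12]; CR < 0.1 is acceptable
```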