1000 results for 1456


Relevance:

10.00%

Publisher:

Abstract:

A study of old regulatory systems in a new digital environment. The work extends the legal questions to their cultural, economic and communal backgrounds, clarifying both the past and the challenges of the future as well as the multinational regulatory framework. The work is addressed primarily to students, but it also contributes to the ongoing discussion on the means of regulating information.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation studies the technical quality assessment of printing paper. The theoretical framework is based on the customer value hierarchy concept and the product integrity concept. The experimental part of the research was divided into two phases: interviews of publishers and printers, and testing of the technical quality of selected consumer magazines and a set of selected white paper samples. Advertising revenue and copy-sales revenue are the most important factors of efficiency for the publisher; they form the highest level in the publisher's customer value hierarchy. A printed product is profiled according to its target group and product segment. There is no absolute level of good printed quality; it can be studied only in the context of the requirements set on the printed product. Publishers' quality expectations are basic elements of external product integrity. The most important elements of efficiency for the printer can be summarised as reaching high production efficiency in order to attain good profitability and competitive delivery times. Printers' factors of efficiency are based on customers' expectations of the consequences in the use situation in the customer value hierarchy; they form the basis of internal product integrity. The use of purely technical testing to classify printed products according to customers' expectations proved to be indicative at best. The information gathered from the interviews was documented and sorted with the help of the QFD technique. The technical quality of two different coated paper grades was assessed based on customer expectations and on the best achievable quality. When customer requirements are used as the basis for assessing the technical quality of printing papers, the ranking from best to worst differs from that obtained when comparing the papers based only on the best achievable quality.
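
As a purely illustrative sketch of the kind of customer-weighted comparison described above (hypothetical attributes, weights and scores, not the dissertation's data), a QFD-style weighting can reorder papers relative to a plain "best achievable quality" comparison:

```python
# Hypothetical technical scores (0-10 scale) and customer-requirement weights.
technical_scores = {
    "paper_A": {"gloss": 9.0, "print_through": 6.0, "mottling": 5.0},
    "paper_B": {"gloss": 7.0, "print_through": 8.0, "mottling": 8.0},
}
customer_weights = {"gloss": 0.70, "print_through": 0.15, "mottling": 0.15}

def best_achievable_rank(scores):
    """Rank papers by the unweighted mean of their technical attributes."""
    return sorted(scores, key=lambda p: -sum(scores[p].values()) / len(scores[p]))

def customer_weighted_rank(scores, weights):
    """Rank papers by a customer-requirement-weighted score."""
    return sorted(scores, key=lambda p: -sum(weights[a] * v for a, v in scores[p].items()))

print(best_achievable_rank(technical_scores))                      # ['paper_B', 'paper_A']
print(customer_weighted_rank(technical_scores, customer_weights))  # ['paper_A', 'paper_B']
```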

Relevance:

10.00%

Publisher:

Abstract:

Active magnetic bearings have recently been intensively developed because noncontact support offers several advantages over conventional bearings. Due to improved materials, control strategies, and electrical components, the performance and reliability of active magnetic bearings are improving. However, additional bearings, so-called retainer bearings, still have a vital role in active magnetic bearing applications. The most crucial moment when the retainer bearings are needed is when the rotor drops from the active magnetic bearings onto the retainer bearings due to component or power failure. Without appropriate knowledge of the retainer bearings, a drop-down may prove fatal to an active magnetic bearing supported rotor system. This study introduces a detailed simulation model of a rotor system in order to describe a rotor drop-down situation on the retainer bearings. The simulation model couples a finite element model with component mode synthesis and detailed bearing models. Electrical components and electromechanical forces are not in the focus of this study. The research reviews the theoretical background of the finite element method with component mode synthesis, which can be used in the dynamic analysis of flexible rotors. The retainer bearings are described using two ball bearing models, which include damping and stiffness properties, the oil film, the inertia of the rolling elements, and the friction between the races and the rolling elements. The first bearing model assumes that the cage of the bearing is ideal and holds the balls precisely in their predefined positions. The second bearing model extends the first and describes the behavior of a cageless bearing; in this model, each ball is described using two degrees of freedom. The models introduced in this study are verified against a corresponding actual structure. Using the verified bearing models, the effects of the parameters of the rotor system on its dynamics during emergency stops are examined. As shown in this study, the misalignment of the retainer bearings has a significant influence on the behavior of the rotor system in a drop-down situation. A stability map of the rotor system as a function of the rotational speed of the rotor and the misalignment of the retainer bearings is presented. In addition, the effects of the parameters of the simulation procedure and the rotor system on the dynamics of the system are studied.
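
The bearing models themselves are not given in the abstract; the following minimal sketch only illustrates the kind of ball-contact force law such retainer bearing models typically build on (a Hertzian point-contact assumption with illustrative parameter values, not necessarily the formulation used in the thesis):

```python
import numpy as np

def ball_contact_force(delta, k_hertz=1.0e9, n=1.5):
    """Hertzian point-contact force for one ball: F = k * delta^n when delta > 0.
    k_hertz is an illustrative contact stiffness [N/m^1.5]; n = 1.5 for ball contact."""
    return k_hertz * max(delta, 0.0) ** n

def radial_deflections(x, y, clearance, angles):
    """Contact deflection at each ball position for an ideal cage that keeps the
    balls evenly spaced (as in the first bearing model described above)."""
    return x * np.cos(angles) + y * np.sin(angles) - clearance

# Example: rotor centre displaced 0.3 mm in x inside a retainer bearing with
# 0.2 mm radial clearance and 8 evenly spaced balls (all values illustrative).
angles = 2.0 * np.pi * np.arange(8) / 8
deltas = radial_deflections(x=0.3e-3, y=0.0, clearance=0.2e-3, angles=angles)
forces = np.array([ball_contact_force(d) for d in deltas])
print("net bearing force on the rotor in x [N]:", -float(np.sum(forces * np.cos(angles))))
```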

Relevance:

10.00%

Publisher:

Abstract:

This doctoral study investigated the effect of physico-chemical conditions and operating parameters on the fractionation of cheese whey. The literature part discusses the environmental impact of whey, the utilisation of whey, and the treatment of whey with membrane technology. The experimental part is divided into two parts, the first dealing with ultrafiltration and the second with nanofiltration in the fractionation of cheese whey. The ultrafiltration membrane was selected based on its cut-off value, which was determined with polyethylene glycol solutions under conditions in which concentration polarisation does not disturb the measurement. The critical flux concept was used to find a suitable protein concentration for the ultrafiltration experiments, because whey proteins are known to be membrane-fouling substances. In the ultrafiltration experiments, the permeation of the different whey components through the membrane and the properties affecting it were studied. The peptide fractions of the whey permeates were analysed by size exclusion chromatography and MALDI-TOF mass spectrometry. The mean pore sizes of the nanofiltration membranes used in the experiments were determined with neutral solutes, and the zeta potentials were determined by streaming potential measurements. Amino acids were used as model substances to study the significance of pore size and charge in the separation. The retention of the amino acids was affected by the pH and the ionic strength of the solution as well as by intermolecular interactions. The permeate produced in the ultrafiltration of whey, which contained small peptides, lactose and salts, was nanofiltered at acidic and alkaline pH. In nanofiltration under alkaline conditions, less fouling occurred and the permeate flux was higher. Under alkaline conditions, the selectivity of the separation of lactose from the peptides was also better than under acidic conditions.
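
As a minimal illustration of the quantities used to compare such filtration experiments (with made-up numbers, not results from the thesis), the observed retention and the permeate flux can be computed as:

```python
def observed_retention(c_permeate, c_feed):
    """Observed retention R = 1 - c_p / c_f (dimensionless)."""
    return 1.0 - c_permeate / c_feed

def permeate_flux(volume_l, area_m2, time_h):
    """Permeate flux in L/(m^2 h)."""
    return volume_l / (area_m2 * time_h)

# Hypothetical example: lactose passes the membrane more freely than the peptides.
print(observed_retention(c_permeate=38.0, c_feed=45.0))       # lactose:  R ~ 0.16
print(observed_retention(c_permeate=0.4, c_feed=2.0))         # peptides: R = 0.80
print(permeate_flux(volume_l=12.0, area_m2=0.5, time_h=2.0))  # 12 L/(m^2 h)
```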

Relevance:

10.00%

Publisher:

Abstract:

Despite the rapid change in today's business environment, there are relatively few studies about corporate renewal. This study aims, for its part, to fill that research gap by studying the concepts of strategy, corporate renewal, innovation and corporate venturing. Its purpose is to enhance our understanding of how established companies operating in a dynamic and global environment can benefit from their corporate venturing activities. The theoretical part approaches the research problem at the corporate and venture levels. Firstly, it focuses on mapping the determinants of strategy and suggests using industry, location, resources, knowledge, structure and culture, market, technology and business model to assess the environment, and using these determinants to optimize the speed and magnitude of change. Secondly, it concludes that the choice of innovation strategy depends on the type and dimensions of innovation and suggests assessing the market, technology and business model, as well as the novelty and complexity related to each of them, for choosing an optimal context for developing innovations further. Thirdly, it directs attention to the processes through which corporate renewal takes place. At the corporate level these processes are identified as strategy formulation, strategy formation and strategy implementation. At the venture level the renewal processes are identified as learning, leveraging and nesting. The theoretical contribution of this study, the framework of strategic corporate venturing, joins corporate- and venture-level management issues together and concludes that strategy processes and linking processes are the mechanisms through which continuous corporate renewal takes place. The framework of strategic corporate venturing proposed by this study is a new way to illustrate the role of corporate venturing as a purposefully built, different view of a company's business environment. The empirical part extended the framework by enhancing our understanding of the link between corporate renewal and corporate venturing in its real-life environment in three Finnish companies: Metso, Nokia and TeliaSonera. Characterizing the companies' environments with the determinants of strategy identified in this study provided a structured way to analyze their competitive position and the renewal challenges that they are facing. More importantly, the case studies confirmed that a link between corporate renewal and corporate venturing exists, and found that the link is not as straightforward as indicated by the theory. Furthermore, the case studies enhanced the framework by indicating a sequence according to which the processes work. Firstly, the induced strategy processes, strategy formulation and strategy implementation, set the scene for the corporate venturing context and management processes and leave strategy formation to the venture. Only after that can strategies formed by ventures come back to the corporate level, and, if found viable at the corporate level, be formalized through formulation and implementation. With the help of the framework of strategic corporate venturing, the link between corporate renewal and corporate venturing can be found and managed. The suggested response to the continuous need for change is continuous renewal, i.e. institutionalizing corporate renewal in the strategy processes of the company. As far as benefiting from venturing is concerned, the answer lies in deliberately managing venturing in a context different from the mainstream businesses and establishing efficient linking processes to exploit the renewal potential of individual ventures.

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, the sorption and elastic properties of cation-exchange resins were studied to explain the liquid chromatographic separation of carbohydrates. The Na+, Ca2+ and La3+ forms of strong poly(styrene-co-divinylbenzene) (SCE) and the Na+ and Ca2+ forms of weak acrylic (WCE) cation-exchange resins at different cross-link densities were treated in this work. The focus was on the effects of water-alcohol mixtures, mostly aqueous ethanol, and of the carbohydrates. The carbohydrates examined were rhamnose, xylose, glucose, fructose, arabinose, sucrose, xylitol and sorbitol. In addition to linear chromatographic conditions, non-linear conditions more typical of industrial applications were studied. Both experimental and modeling aspects were covered. The sorption of aqueous alcohols on the cation-exchangers was experimentally determined and theoretically calculated. The sorption model includes elastic parameters, which were obtained from sorption data combined with elasticity measurements. As hydrophilic materials, cation-exchangers are water selective and shrink when an organic solvent is added. At a certain degree of deswelling the elastic resins go through a glass transition and become glass-like. Increasing cross-link level and counterion valence decrease the sorption of solvent components in water-rich solutions. The cross-linkage or the counterions have less effect on the water selectivity than the resin type or the alcohol used. The amount of water sorbed is higher in the WCE resin and, moreover, the WCE resin is more water selective than the corresponding SCE resin. The increasing aliphatic part of the lower alcohols tends to increase the water selectivity, i.e. the resins are more water selective in 2-propanol than in ethanol solutions. Both the sorption behavior of carbohydrates and the sorption differences between carbohydrates are considerably affected by the eluent composition and the resin characteristics. The carbohydrate sorption was experimentally examined and modeled. In all cases, the sorption and, moreover, the separation of carbohydrates are dominated by three phenomena: partition, ligand exchange and size exclusion. The sorption of hydrophilic carbohydrates increases when alcohol is added to the eluent or when the carbohydrate is able to form coordination complexes with the counterions, especially with multivalent counterions. Decreasing polarity of the eluent enhances complex stability. The size exclusion effect becomes more prominent when the resin becomes tighter or the carbohydrate size increases. On the other hand, the differences in elution volumes between carbohydrates of different sizes decrease with decreasing polarity of the eluent. The chromatographic separation of carbohydrates was modeled using rhamnose and xylose as target molecules. The thermodynamic sorption model was successfully implemented in a rate-based column model. The experimental chromatographic data were fitted using only one adjustable parameter. In addition to the fitted data, simulated data were generated and utilized in explaining the effect of the eluent composition and of the resin characteristics on the carbohydrate separation.
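
As a generic illustration of how elution data of this kind are commonly reduced (not the thermodynamic sorption model of the thesis; the solute labels and column volumes are hypothetical), the distribution coefficient follows directly from the elution volume:

```python
def distribution_coefficient(v_elution, v_void, v_pore):
    """K_d = (V_e - V_0) / V_i: ~0 for fully excluded solutes, ~1 for fully permeating ones."""
    return (v_elution - v_void) / v_pore

# Hypothetical column: 40 mL interstitial (void) volume, 60 mL pore volume in the resin bed.
v0, vi = 40.0, 60.0
print(distribution_coefficient(85.0, v0, vi))   # small, xylose-like solute:  K_d = 0.75
print(distribution_coefficient(70.0, v0, vi))   # larger, sucrose-like solute: K_d = 0.50
```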

Relevance:

10.00%

Publisher:

Abstract:

Woven monofilament, multifilament, and spun yarn filter media have long been the standard media in liquid filtration equipment. While the energy for a solid-liquid separation process is determined by the engineering work, it is the interface between the slurry and the equipment, the filter medium, that greatly affects the performance characteristics of the unit operation. Those skilled in the art are well aware that a poorly designed filter medium may endanger the whole operation, whereas a well-performing filter medium can make the operation smooth and economical. As mineral and pulp producers seek to produce ever finer and more refined fractions of their products, it is becoming increasingly important to be able to dewater slurries with average particle sizes around 1 µm using conventional, high-capacity filtration equipment. Furthermore, the surface properties of the media must not allow sticky and adhesive particles to adhere to the media. The aim of this thesis was to test how the dirt-repellency, electrical resistance and high-pressure filtration performance of selected woven filter media can be improved by modifying the fabric or yarn with coating, chemical treatment and calendering. The results achieved by chemical surface treatments clearly show that the surface properties of woven media can be modified to achieve lower electrical resistance and improved dirt-repellency. The main challenge with the chemical treatments is abrasion resistance and, while the experimental results indicate that the treatment is sufficiently permanent to resist standard weathering conditions, it may still prove inadequately durable in actual use. From the pressure filtration studies in this work, it seems obvious that conventional woven multifilament fabrics still perform surprisingly well against the coated media in terms of filtrate clarity and cake build-up. Especially in cases where the feed slurry concentration was low and the pressures moderate, the conventional media seemed to outperform the coated media. In cases where the feed slurry concentration was high, the tightly woven media performed well against the monofilament reference fabrics, but seemed to do worse than some of the coated media. This result is somewhat surprising in that the high initial specific resistance of the coated media would suggest that these media blind more easily than plain woven media. The results indicate, however, that it is actually the woven media that gradually clog during the course of filtration. In conclusion, it seems obvious that there is a pressure limit above which a woven medium loses its capacity to keep the solid particles from penetrating the structure. This finding suggests that for extreme pressures the only foreseeable solution is a coated fabric supported by a woven fabric strong enough to hold the structure together. That said, the high-pressure filtration process seems to follow somewhat different laws than the more conventional processes. Based on the results, it may well be that the role of the cloth is above all to support the cake, and the main performance-determining factor is a long lifetime. Measuring the pore size distribution with a commercially available porometer gives a fairly accurate picture of the pore size distribution of a fabric, but fails to give insight into which of the pore sizes is the most important in determining the flow through the fabric.
Historically, air and sometimes water permeability measurements have been the standard in evaluating the filtration performance of media, including particle retention. Permeability, however, is a function of a multitude of variables and does not directly allow the estimation of the effective pore size. In this study, a new method for estimating the effective pore size and the open pore area of a densely woven multifilament fabric was developed. The method combines a simplified equation for the electrical resistance of the fabric with the Hagen-Poiseuille flow equation to estimate the effective pore size of a fabric and the total open area of its pores. The results are validated by comparison with the measured values of the largest pore size (bubble point) and the average pore size. The results show good correlation with the measured values. However, the measured and estimated values tend to diverge in high weft density fabrics. This phenomenon is thought to be a result of the more tortuous flow path of denser fabrics, and could most probably be remedied by using another value for the tortuosity factor.
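
The combined electrical-resistance and flow equations of the method are not reproduced in the abstract; the sketch below only shows the Hagen-Poiseuille side of such an estimate, assuming N identical cylindrical pores of length equal to the fabric thickness and using illustrative measurement values:

```python
import math

def effective_pore_radius(q, dp, mu, thickness, n_pores_per_m2, area):
    """Solve the Hagen-Poiseuille relation Q = N*pi*r^4*dp / (8*mu*L) for r,
    assuming N = n_pores_per_m2 * area identical cylindrical pores of length L."""
    n_pores = n_pores_per_m2 * area
    return (8.0 * mu * thickness * q / (n_pores * math.pi * dp)) ** 0.25

# Illustrative water-permeability measurement (not data from the thesis).
r = effective_pore_radius(q=2.0e-5,          # m^3/s through the sample
                          dp=2.0e4,          # Pa pressure drop
                          mu=1.0e-3,         # Pa*s (water)
                          thickness=0.5e-3,  # m fabric thickness
                          n_pores_per_m2=4.0e8,
                          area=0.01)         # m^2 sample area
open_area_fraction = 4.0e8 * math.pi * r**2
print(f"effective pore radius ~ {r*1e6:.1f} um, open area fraction ~ {open_area_fraction:.2%}")
```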

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a topological approach to studying fuzzy sets by means of modifier operators. Modifier operators are mathematical models, e.g., for hedges, and we briefly present different approaches to studying them. We are interested in compositional modifier operators, modifiers for short, and these modifiers depend on binary relations. We show that if a modifier depends on a reflexive and transitive binary relation on U, then there exists a unique topology on U such that this modifier is the closure operator in that topology. Also, if U is finite, then there exists a lattice isomorphism between the class of all reflexive and transitive relations on U and the class of all topologies on U. We define a topological similarity relation "≈" between L-fuzzy sets in a universe U, and show that the class L^U/≈ is isomorphic to the class of all topologies on U, if U is finite and L is suitable. We consider finite bitopological spaces as approximation spaces, and we show that lower and upper approximations can be computed by means of α-level sets also in the case of equivalence relations. This means that approximations in the sense of rough set theory can be computed by means of α-level sets. Finally, we present an application to data analysis: we study an approach to detecting dependencies of attributes in database-like systems, called information systems.
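
The α-level-set construction itself is not shown in the abstract; as a minimal reference point, the classical rough-set lower and upper approximations for a finite universe and an equivalence relation can be computed as follows (a standard textbook formulation, not the thesis's bitopological one):

```python
def equivalence_classes(universe, related):
    """Partition the universe into equivalence classes of an equivalence relation,
    given as a predicate related(a, b) -> bool."""
    classes = []
    for x in universe:
        for cls in classes:
            if related(x, next(iter(cls))):
                cls.add(x)
                break
        else:
            classes.append({x})
    return classes

def rough_approximations(target, classes):
    """Classical rough-set lower and upper approximations of a crisp set."""
    lower = set().union(*(c for c in classes if c <= target))
    upper = set().union(*(c for c in classes if c & target))
    return lower, upper

# Toy information system: objects grouped by the value of a single attribute.
universe = {1, 2, 3, 4, 5, 6}
attribute = {1: "a", 2: "a", 3: "b", 4: "b", 5: "c", 6: "c"}
classes = equivalence_classes(universe, lambda x, y: attribute[x] == attribute[y])
print(rough_approximations({1, 2, 3}, classes))   # ({1, 2}, {1, 2, 3, 4})
```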

Relevance:

10.00%

Publisher:

Abstract:

The building industry has a particular interest in using clinching as a joining method for the frame constructions of light-frame housing. Normally, many clinch joints are required in the joining of frames. In order to maximise the strength of the complete assembly, each clinch joint must be as sound as possible. Experimental testing is the main means of optimising a particular clinch joint. This includes shear strength testing and visual observation of joint cross-sections. The manufacturers of clinching equipment normally perform such experimental trials. Finite element analysis can also be used to optimise the tool geometry and the process parameter X, which represents the thickness of the base of the joint. However, such procedures require dedicated software, a skilled operator, and test specimens in order to verify the finite element model. In addition, with current technology several hours of computing time may be necessary. The objective of the study was to develop a simple calculation procedure for rapidly establishing an optimum value of the parameter X for a given tool combination. It should be possible to use the procedure on a daily basis, without stringent demands on the skill of the operator or on the equipment. It is also desirable that the procedure significantly decrease the number of shear strength tests required for verification. The experimental work involved tests aimed at understanding the behaviour of the sheets during clinching. The most notable observation concerned the stage of the process in which the upper sheet was initially bent, after which the deformation mechanism changed to shearing and elongation. The amount of deformation was measured relative to the original location of the upper sheet and characterised as the C-measure. By understanding in detail the behaviour of the upper sheet, it was possible to estimate a bending line function for the surface of the upper sheet. A procedure was developed which makes it possible to estimate the process parameter X for each tool combination with a fixed die. The procedure is based on equating the volume of material on the punch side with the volume of the die. Detailed information concerning the behaviour of the material on the punch side is required, assuming that the volume of the die does not change during the process. The procedure was applied to shear strength testing of a sample material. The sample material was continuously hot-dip zinc-coated high-strength constructional steel with a nominal thickness of 1.0 mm. The minimum Rp0.2 proof stress was 637 N/mm2. Such material has not yet been used extensively in light-frame housing, and little has been published on clinching of the material. The performance of the material is therefore of particular interest. Companies that use clinching on a daily basis stand to gain the greatest benefit from the procedure. By understanding the behaviour of sheets in different cases, it is possible to use the data at an early stage for adjusting and optimising the process. In particular, the functionality of common tools can be increased, since it is possible to characterise the complete range of existing tools. The study increases and broadens the amount of basic information concerning the clinching process. New approaches and points of view are presented and used for generating new knowledge.
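
The bending line function and the tool geometries are not given in the abstract; the sketch below only illustrates the stated idea of solving for X by equating the punch-side material volume with the die volume, using a purely hypothetical volume function:

```python
def solve_process_parameter_x(punch_side_volume, die_volume, x_low, x_high, tol=1e-6):
    """Find X such that punch_side_volume(X) == die_volume by bisection.
    punch_side_volume must be monotonic in X over [x_low, x_high]."""
    f = lambda x: punch_side_volume(x) - die_volume
    while x_high - x_low > tol:
        x_mid = 0.5 * (x_low + x_high)
        if f(x_low) * f(x_mid) <= 0.0:
            x_high = x_mid
        else:
            x_low = x_mid
    return 0.5 * (x_low + x_high)

# Hypothetical example: punch-side material volume (mm^3) decreases linearly as the
# joint base thins; the die volume is fixed by the tool geometry.
punch_volume = lambda x_mm: 12.0 - 8.0 * x_mm     # illustrative only
x_opt = solve_process_parameter_x(punch_volume, die_volume=8.0, x_low=0.1, x_high=1.0)
print(f"estimated X ~ {x_opt:.2f} mm")            # ~0.50 mm
```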

Relevance:

10.00%

Publisher:

Abstract:

The changing business environment demands that chemical industrial processes be designed such that they enable the attainment of multi-objective requirements and the enhancement of innovative design activities. The requirements and key issues of conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. This thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It deals with the multi-objective requirements and the combinatorially complex nature of process synthesis. The work is carried out based on a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ). TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolutionarily directed nature. The conflict concept, when applied to process synthesis, throws new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information. Design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. A general design paradigm is formulated to handle conflicts in both the early and the detailed design stages. The methodology developed reflects the conflict nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization. Handling the various levels of conflicts evolves possible design alternatives in a systematic procedure, which consists of establishing an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Enhancement of creativity is realized through the better understanding of design problems gained from the conflict concept and through the improvement in engineering design practice brought by the systematic nature of the approach.
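
The abstract does not specify the structure of the conflict table; the following is only a hypothetical illustration of how design tasks could be recorded as groups of improving/worsening conflicts in the TRIZ spirit:

```python
# Each conflict pairs an objective to improve with an objective that worsens;
# a design task is described by its group of conflicts (entries are invented examples).
conflict_table = {
    "distillation system design": [
        {"improve": "product purity", "worsens": "energy consumption"},
        {"improve": "throughput", "worsens": "column diameter / capital cost"},
    ],
    "reactor/separator network design": [
        {"improve": "conversion", "worsens": "reactor volume"},
        {"improve": "selectivity", "worsens": "recycle load on separation"},
    ],
}

def tasks_involving(table, objective):
    """List the design tasks whose conflict group touches a given objective."""
    return [task for task, conflicts in table.items()
            if any(objective in (c["improve"], c["worsens"]) for c in conflicts)]

print(tasks_involving(conflict_table, "energy consumption"))
# ['distillation system design']
```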

Relevance:

10.00%

Publisher:

Abstract:

The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of the environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best environment for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, the networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm incorporates the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread; thus, the scheduling does not affect the execution of the parallel application. Performance results achieved with MPIT show considerable improvements over conventional MPI applications.
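
MPIT itself is not shown in the abstract; the sketch below only illustrates the general overlap idea with a dedicated communication thread, using mpi4py and Python threads rather than the thesis's implementation (and assuming an MPI library built with full thread support):

```python
# Overlapping communication and computation with a dedicated communication thread.
# Requires an MPI installation providing MPI_THREAD_MULTIPLE.
import threading
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

send_buf = np.full(1_000_000, rank, dtype=np.float64)
recv_buf = np.empty_like(send_buf)

def exchange():
    """Communication thread: exchange a block of data with the neighbouring ranks."""
    comm.Sendrecv(send_buf, dest=(rank + 1) % size,
                  recvbuf=recv_buf, source=(rank - 1) % size)

comm_thread = threading.Thread(target=exchange)
comm_thread.start()

# Main thread: computation that does not depend on the data being received.
local_result = np.sum(send_buf ** 2)

comm_thread.join()                      # wait for the exchange to complete
total = comm.allreduce(local_result)    # only then combine results across ranks
if rank == 0:
    print("sum of squares over all ranks:", total)
```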

Relevance:

10.00%

Publisher:

Abstract:

The aim of this thesis was to produce information for the estimation of the flow balance of wood resin in mechanical pulping and to demonstrate the possibilities for improving the efficiency of deresination in practice. It was observed that chemical changes in wood resin take place only during peroxide bleaching, that a significant amount of water-dispersed wood resin is retained in the pulp mat during dewatering, and that the amount of wood resin in the solid phase of the process filtrates is very small. On the basis of this information, there are three parameters related to the behaviour of wood resin that determine the flow balance in the process: (1) the liberation of wood resin into the water phase of the pulp, (2) the retention of water-dispersed wood resin in dewatering, and (3) the proportion of wood resin degraded in peroxide bleaching. The effect of different factors on these parameters was evaluated with the help of laboratory studies and a literature survey. In addition, information on the values of these parameters in existing processes was obtained in mill measurements. With the help of this information, it was possible to evaluate the deresination efficiency, and the effect of different factors on this efficiency, in a pulping plant producing low-freeness mechanical pulp. This evaluation showed that the wood resin content of mechanical pulp can be significantly decreased if the process includes a peroxide bleaching stage and a subsequent washing stage. With an optimal process configuration, a deresination efficiency as high as 85 percent seems to be possible at a water usage level of 8 m3/o.d.t.
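
The flow-balance model itself is not given in the abstract; the following single-pass simplification (with invented parameter values) only illustrates how the three parameters above could combine into a removal efficiency:

```python
def deresination_efficiency(liberated, retained_in_mat, degraded_in_bleaching):
    """Single-pass simplification: resin is removed either by degradation in
    peroxide bleaching or by leaving with the filtrate in the washing stage
    (the liberated, non-retained fraction of the resin that survives bleaching)."""
    survives_bleaching = 1.0 - degraded_in_bleaching
    washed_out = survives_bleaching * liberated * (1.0 - retained_in_mat)
    return degraded_in_bleaching + washed_out

# Illustrative parameter values (not measured values from the thesis).
print(deresination_efficiency(liberated=0.8,
                              retained_in_mat=0.3,
                              degraded_in_bleaching=0.4))   # ~0.74
```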

Relevance:

10.00%

Publisher:

Abstract:

Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high performance computing ever since they were started in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors, and by the early 2000s this number is often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation but the amount of communication necessary between the processors, and the communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors for a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied, so that currently most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend the parallelization opportunities to other parts of the weather forecasting environment, in particular to the data assimilation of satellite observations.

Relevance:

10.00%

Publisher:

Abstract:

The main objective of this thesis was to generate better filtration technologies for the effective production of pure starch products, and thereby to optimise filtration sequences using the models created, as well as to synthesise the theories of the different filtration stages that are applicable to starches. First, the structure and the characteristics of the different starch grades are introduced, and each starch grade is shown to have special characteristics. These are taken as the basis for understanding the differences in the behaviour of the different native starch grades and their modifications in pressure filtration. Next, the pressure filtration process is divided into stages: filtration, cake washing, compression dewatering and displacement dewatering. Each stage is considered individually in its own chapter. The order of the different suitable combinations of the process stages is studied, as well as the proper durations and pressures of the stages. The principles of the theory of each stage are reviewed, the methods for monitoring the progress of each stage are presented, and finally, the modelling of the stages is introduced. The experimental results obtained from the different stages of the starch filtration tests are given, and the suitability of the theories and models to starch filtration is shown. Finally, the theories and the models are gathered together, and it is shown that the analysis of the whole starch pressure filtration process can be performed with the software developed.
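
The models developed in the thesis are not reproduced in the abstract; as a generic reference point for the filtration stage, the classical constant-pressure cake filtration equation with illustrative parameter values looks like this:

```python
def filtration_time(v, dp, mu, alpha, c, r_m, area):
    """Constant-pressure cake filtration:
    t = (mu*alpha*c / (2*A^2*dp)) * V^2 + (mu*R_m / (A*dp)) * V
    v     : filtrate volume [m^3]
    dp    : pressure difference [Pa]
    mu    : filtrate viscosity [Pa*s]
    alpha : specific cake resistance [m/kg]
    c     : dry cake mass per unit filtrate volume [kg/m^3]
    r_m   : filter medium resistance [1/m]
    area  : filtration area [m^2]
    """
    cake_term = mu * alpha * c / (2.0 * area**2 * dp) * v**2
    medium_term = mu * r_m / (area * dp) * v
    return cake_term + medium_term

# Illustrative values (not starch data from the thesis).
t = filtration_time(v=0.02, dp=6.0e5, mu=1.0e-3,
                    alpha=1.0e11, c=150.0, r_m=1.0e10, area=0.1)
print(f"filtration time ~ {t:.0f} s")
```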

Relevance:

10.00%

Publisher:

Abstract:

Over the past few decades, turbulent change has characterized the media industry. It has been noted that digitalization and new media are strongly influencing the industry: they are changing the existing market dynamics and require new strategies. Prior research on the impact of digitalization and the Internet has emphasized news-focused media such as newspaper publishing and broadcasting, yet magazine publishing is very seldom the focus of the research. This study examines how the Internet impacts magazine publishing. The work presents a multi-level analysis of the role and impact of the Internet on magazine products, companies and the industry. The study is founded on the strategic management, technology management and media economics literature. It consists of two parts. The first part introduces the research topic and discusses the overall results of the study. The second part comprises five research publications. Qualitative research methods are used throughout. The results of the study indicate that the Internet has not had a disruptive effect on magazine publishing, and that its strategic implications should rather be considered complementary to the print magazine and the business as a whole. It seems that co-specialized assets, together with market-related competencies and an unchanged core competence, have protected established firms from the disruptive effect of the new technology in magazine publishing. In addition, the Internet appears to offer a valuable possibility to build and nourish customer relationships. The study contributes to media management and economics research by moving from product- or industry-level investigations towards a strategic-management perspective.