60 results for scaffold architectures



The political environment of security and defence has changed radically in the Western industrialised world since the Cold War. In response to these changes, since the beginning of the twenty-first century most Western countries have adopted a ‘capabilities-based approach’ to developing and operating their armed forces. More responsive and versatile military capabilities must be developed to meet contemporary challenges. The systems approach is seen as a beneficial means of overcoming the traps of conventional thinking when resolving complex real-world issues. The main objectives of this dissertation are to explore and assess the means to enhance the development of military capabilities both in concept development and experimentation (CD&E) and in national defence materiel collaboration. This research provides a unique perspective, a systems approach, on the development areas of concern in resolving complex real-world issues. The dissertation seeks to increase the understanding of the military capability concept both as a whole and within its life cycle. It follows the generic functionalist systems methodology by Jackson, which applies a comprehensive set of constitutive rules to examine the research objectives. This dissertation makes a contribution to current studies of military capability. It presents two interdependent conceptual capability models: the comprehensive capability meta-model (CCMM) and the holistic capability life cycle model (HCLCM). These models holistically and systematically complement the existing, but still evolving, understanding of military capability and its life cycle. In addition, this dissertation contributes to the scientific discussion of defence procurement in its broad meaning by introducing a holistic model of national defence materiel collaboration between the defence forces, the defence industry and academia. The model connects the key collaborative mechanisms, which currently work in isolation from each other, and takes into consideration the unique needs of each partner. This dissertation also contributes empirical evidence regarding the benefits of enterprise architectures (EA) for CD&E. The EA approach may add value to traditional concept development by increasing the clarity, consistency and completeness of the concept. The most important use identified for EA in CD&E is that it enables further utilisation of the concept created in the case project.


This doctoral dissertation investigates the adult education policy of the European Union (EU) in the framework of the Lisbon agenda 2000–2010, with a particular focus on the changes in policy orientation that occurred during this reference decade. The year 2006 can be considered, in fact, a turning point for EU policy-making in the adult learning sector: a radical shift from a wide-ranging and comprehensive conception of educating adults towards a vocationally oriented understanding of this field and policy area has been observed, in particular in the second half of the so-called ‘Lisbon decade’. In this light, one of the principal objectives of the mainstream policy set by the Lisbon Strategy, that of fostering all forms of participation of adults in lifelong learning paths, appears to have muted its political background and vision in a very short period of time, reflecting an underlying polarisation and progressive transformation of European policy orientations. Hence, by means of content analysis and process tracing, it is shown that the target of EU adult education policy, in this framework, has shifted from citizens to workers, and that the competence development model, borrowed from the corporate sector, has been established as the reference for the new policy road maps. This study draws on the theory of governance architectures and applies a post-ontological perspective to discuss whether the above trends are intrinsically due to the nature of the Lisbon Strategy, which encompasses education policies, and to what extent supranational actors and phenomena such as globalisation influence European governance and decision-making. Moreover, it is shown that the way in which the EU is shaping the upgrading of skills and competences of adult learners is modelled around the needs of the ‘knowledge economy’, thus according a great deal of importance to ‘new skills for new jobs’ and perhaps not enough to life skills in the broader sense, which include, for example, social and civic competences: these are often promoted but rarely implemented in depth in EU policy documents. In this framework, it is conveyed how different EU policy areas are intertwined and interrelated with global phenomena, and it is emphasised how far the building of EU education systems should play a crucial role in the formation of critical thinking, civic competences and skills for a sustainable democratic citizenship, on which a truly cohesive and inclusive society fundamentally depends; a model of environmental and cosmopolitan adult education is proposed in order to address the challenges of the new millennium. In conclusion, an appraisal of the EU’s public policy, along with some personal thoughts on how progress might be pursued and actualised, is outlined.


The necessity of integrating EC (Electronic Commerce) and enterprise systems follows from the integrated nature of enterprise systems. The proven ability of EC to provide competitive advantage forces enterprises to adopt it and integrate it with their enterprise systems. Integration is a complex task: it must facilitate a seamless flow of information and data between different systems within and across enterprises. Different systems run on different platforms, so integrating systems with different platforms and infrastructures requires integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, offer various solutions to EC and enterprise systems integration problems. There is only a limited body of literature covering the integration of EC and enterprise systems in detail: most studies in this area focus on the factors that influence the adoption of EC by enterprises, or provide limited information about a specific platform or integration methodology in general. This thesis therefore covers the technical details of EC and enterprise systems integration, addressing both adoption factors and integration solutions. A wide range of literature was reviewed and different solutions were investigated, including different enterprise integration approaches and the most popular integration technologies. Moreover, various methodologies for integrating EC and enterprise systems were studied in detail and different solutions were examined. The factors that influence the adoption of EC in enterprises were identified from previous literature and categorized into technical, social, managerial, financial, and human resource factors. Integration technologies were categorized according to three levels of integration: data, application, and process. In addition, different integration approaches were identified and categorized based on their communication model and platform, and different EC integration solutions were investigated and categorized based on the identified integration approaches. By considering these different aspects of integration, this study is a valuable asset to architects, developers, and system integrators seeking to adopt EC and integrate it with enterprise systems.
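
As a rough illustration of the three integration levels named in the abstract (data, application and process), the sketch below shows how an EC application might touch an ERP system at each level. The table, service facade and function names are hypothetical stand-ins, not taken from the thesis or from any vendor platform.

```python
# Purely illustrative sketch of the three integration levels (data, application, process).
# All names are hypothetical; real integrations would go through a database driver,
# a service/ESB endpoint, and a process engine respectively.

import sqlite3

def data_level(db_path):
    """Data level: the EC application reads the ERP's shared tables directly."""
    with sqlite3.connect(db_path) as db:
        db.execute("CREATE TABLE IF NOT EXISTS inventory (sku TEXT, stock INT)")
        return db.execute("SELECT sku, stock FROM inventory").fetchall()

def application_level(order, erp_api):
    """Application level: the EC front end calls an ERP service (e.g. exposed on an ESB)."""
    return erp_api.create_sales_order(order)        # hypothetical service facade

def process_level(order, db_path, erp_api):
    """Process level: a cross-system business process orchestrates both systems."""
    if data_level(db_path):                         # step 1: stock check against ERP data
        return application_level(order, erp_api)    # step 2: order entry via ERP service
    return None
```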


The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore’s law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern systems have moved towards on-chip multiprocessing, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution for addressing these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. The existing design methodologies pertaining to the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge with existing design methodologies. This thesis addresses the aforementioned challenges and discusses the creation of a development framework for platform-based system design, in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling, based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated, in the form of “snippets” to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
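
The following sketch gives a flavour of the model-based flow described above: a toy platform model is checked against an OCL-style structural rule and then used to emit a simplified VHDL arbiter “snippet”. The class names, the constraint and the generated code are illustrative assumptions, not the SegBus tooling or its actual templates.

```python
# Hypothetical sketch of a model-based generation flow: validate a toy platform
# model with a structural constraint, then emit a simplified VHDL snippet.

from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    devices: int          # processing elements attached to this bus segment

def check_structure(segments, max_devices=4):
    """Structural rule in the spirit of an OCL constraint:
    every segment must host between one and max_devices devices."""
    return all(1 <= s.devices <= max_devices for s in segments)

def emit_arbiter_snippet(segment):
    """Generate a (simplified, illustrative) VHDL priority-arbiter snippet for one segment."""
    grants = "\n".join(
        f"      when \"{i:02b}\" => grant <= req({i});" for i in range(segment.devices)
    )
    return (f"-- arbiter for {segment.name}\n"
            f"  process(clk) begin\n    case sel is\n{grants}\n"
            f"      when others => grant <= '0';\n    end case;\n  end process;")

platform = [Segment("seg0", devices=2), Segment("seg1", devices=3)]
assert check_structure(platform)
print(emit_arbiter_snippet(platform[0]))
```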


Advancements in IC processing technology have driven the innovation and growth in the consumer electronics sector and the evolution of the IT infrastructure supporting this exponential growth. One of the most difficult obstacles to this growth is the removal of the large amount of heat generated by the processing and communicating nodes of the system. Technology scaling and the increase in power density have a direct and consequential effect on the rise in temperature. This has resulted in increased cooling budgets and affects both the lifetime reliability and the performance of the system. Hence, reducing on-chip temperatures has become a major design concern for modern microprocessors. This dissertation addresses the thermal challenges at different levels for both 2D planar and 3D stacked systems. It proposes a self-timed thermal monitoring strategy based on the liberal use of on-chip thermal sensors, making use of noise-variation-tolerant, leakage-current-based thermal sensing for monitoring purposes. In order to study thermal management issues from the early design stages, accurate thermal modeling and analysis at design time is essential. In this regard, the spatial temperature profile of global Cu nanowires for on-chip interconnects has been analyzed. The dissertation presents a 3D thermal model of a multicore system in order to investigate the effects of hotspots and of the placement of silicon die layers on the thermal performance of a modern flip-chip package. For a 3D stacked system, the primary design goal is to maximise the performance within the given power and thermal envelopes. Hence, a thermally efficient routing strategy for 3D NoC-Bus hybrid architectures has been proposed to mitigate on-chip temperatures by herding most of the switching activity to the die closest to the heat sink. Finally, an exploration of various thermal-aware placement approaches for both 2D and 3D stacked systems is presented. Various thermal models have been developed and thermal control metrics have been extracted, and an efficient thermal-aware application mapping algorithm for a 2D NoC is presented. It is shown that the proposed mapping algorithm reduces the effective area subjected to high temperatures when compared to the state of the art.
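
As a hedged illustration of what a thermal-aware application mapping heuristic can look like, the sketch below greedily places the most active tasks on the mesh tile with the lowest estimated steady-state temperature. The thermal model, coupling coefficient and task set are invented for the example and are not the algorithm evaluated in the dissertation.

```python
# Minimal, hypothetical thermal-aware mapping heuristic for a small NoC mesh:
# hottest tasks are placed first, each on the tile whose estimated temperature
# (own power plus a fraction of neighbour power) is currently lowest.

import itertools

MESH = 4                          # 4x4 NoC mesh
ALPHA = 0.3                       # neighbour heat-coupling coefficient (assumed)

def neighbours(x, y):
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < MESH and 0 <= y + dy < MESH:
            yield x + dx, y + dy

def estimated_temp(power, x, y):
    return power[(x, y)] + ALPHA * sum(power[n] for n in neighbours(x, y))

def map_tasks(task_power):
    """Greedy thermal-aware mapping: highest-power tasks first, coolest free tile wins."""
    power = {pos: 0.0 for pos in itertools.product(range(MESH), repeat=2)}
    mapping = {}
    for task, p in sorted(task_power.items(), key=lambda kv: -kv[1]):
        free = [pos for pos in power if pos not in mapping.values()]
        best = min(free, key=lambda pos: estimated_temp(power, *pos))
        mapping[task] = best
        power[best] += p
    return mapping

print(map_tasks({"fft": 2.0, "dct": 1.5, "dma": 0.6, "io": 0.2}))
```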


With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel: it describes an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes, and thereby also the parallelism, explicit. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, RVC-CAL, is a very expressive language and in the general case requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion; this kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
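
The sketch below illustrates the dataflow model described above: actors communicate only through FIFO queues and fire whenever their firing rule (enough input tokens) is satisfied, driven here by a naive dynamic scheduling loop of the kind that quasi-static scheduling tries to minimise. It is a generic illustration, not RVC-CAL or the thesis’ compiler infrastructure.

```python
# Tiny dataflow network: actors, FIFO queues, firing rules, and a naive
# dynamic scheduler that fires any actor whose inputs hold enough tokens.

from collections import deque

class Actor:
    def __init__(self, name, inputs, outputs, consume, fire):
        self.name, self.inputs, self.outputs = name, inputs, outputs
        self.consume, self.fire = consume, fire      # tokens needed per input, action

    def can_fire(self):
        return all(len(q) >= n for q, n in zip(self.inputs, self.consume))

    def step(self):
        tokens = [[q.popleft() for _ in range(n)] for q, n in zip(self.inputs, self.consume)]
        for q, value in zip(self.outputs, self.fire(tokens)):
            q.append(value)

# network: source queue -> double -> sink
q1, q2 = deque([1, 2, 3, 4]), deque()
results = []
double = Actor("double", [q1], [q2], [1], lambda t: [t[0][0] * 2])
sink   = Actor("sink",   [q2], [],   [1], lambda t: results.append(t[0][0]) or [])

schedule = [double, sink]
while any(a.can_fire() for a in schedule):           # dynamic scheduling loop
    for actor in schedule:
        if actor.can_fire():
            actor.step()

print(results)   # [2, 4, 6, 8]
```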


Data management consists of collecting, storing, and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Consequently, not only are advanced systems designed for reporting purposes, but operational systems also allow reporting and data analysis. The research method used in the theoretical part is qualitative research, and the empirical part is a case study. The objective of this paper is to examine database management system requirements from the reporting management and data management perspectives. In the theoretical part these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study reveals that appropriate operational key performance indicators of production take into account time, quality, flexibility and cost aspects; manufacturing efficiency in particular is highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability, and data privacy aspects in order to fulfill reporting management’s demands. A framework is created for the system development phase based on these requirements and is used in the empirical part of the thesis, where such a system is designed and created for reporting management purposes for a company that operates in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
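
As one concrete, purely illustrative example of the manufacturing-efficiency type of KPI discussed above, Overall Equipment Effectiveness (OEE) combines time, speed and quality aspects into a single figure. The formula is the standard OEE definition; the numbers below are invented and do not come from the case company.

```python
# Illustrative sketch: OEE = availability x performance x quality,
# computed from planned production time, downtime, ideal cycle time and unit counts.

def oee(planned_min, downtime_min, ideal_cycle_s, total_units, good_units):
    availability = (planned_min - downtime_min) / planned_min
    performance = (ideal_cycle_s * total_units) / ((planned_min - downtime_min) * 60)
    quality = good_units / total_units
    return availability * performance * quality

# one shift: 480 min planned, 45 min down, 30 s ideal cycle, 760 units made, 730 good
print(f"OEE = {oee(480, 45, 30, 760, 730):.1%}")
```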


This thesis focuses on the development of sustainable industrial architectures for bioenergy, based on the metaphors of industrial symbiosis and industrial ecosystems, which imply the exchange of material and energy side-flows between various industries in order to improve the sustainability of those industries at the system level. Studies on industrial symbiosis have been criticised for remaining at the level of incremental change, striving to cycle the waste and by-flows of the industries ‘as is’ and leaving the underlying industry structures intact. Moreover, a need has been articulated for interdisciplinary research on industrial ecosystems as well as for extending the management and business perspectives on industrial ecology. This thesis addresses this call by applying a business ecosystem and business model perspective to industrial symbiosis in order to produce knowledge on how industrial ecosystems can be developed that are sustainable both environmentally and economically. A case from the biogas business is explored and described in four research papers and an extended summary that form this thesis. Since the aim of the research was to produce a normative model for developing sustainable industrial ecosystems, the methodology applied can be characterised as constructive and collaborative. A constructive research mode was required in order to expand the historical knowledge on industrial symbiosis development and business ecosystem development into knowledge of what should be done, which is crucial for sustainability and the social change it requires. A collaborative research mode was employed through participation in a series of projects devoted to the development of a biogas-for-traffic industrial ecosystem. The results of the study show that the development of material flow interconnections within industrial symbiosis is inseparable from larger business ecosystem restructuring. This included a shift in the logic of the biogas and traffic fuel industry and the subsequent development of a business ecosystem that would entail the principles of industrial symbiosis and localised energy production and consumption. Since a company perspective is taken in this thesis, the role of an ecosystem integrator emerged as a crucial means of achieving the required industry restructuring. This, in turn, required the development of a modular and boundary-spanning business model with a strong focus on establishing collaboration among ecosystem stakeholders and on developing multiple local industrial ecosystems as part of business growth. As a result, the designed business model of the ecosystem integrator acquired the necessary flexibility to adjust to local conditions, which is crucial for establishing industrial symbiosis. This thesis presents a normative model for developing the business model required for creating sustainable industrial ecosystems, which complements the policy-level approaches proposed earlier. The study thus addresses the call for more research on the business level of industrial ecosystem formation and on the implications for the business models of the involved actors. Moreover, the thesis increases the understanding of system innovation and innovation in business ecosystems by explicating how business model innovation can be the trigger for achieving more sustainable industry structures, such as those relying on industrial symbiosis.


The molecular functions of the non-cell-cycle-related Cyclin-dependent kinase 5 (Cdk5) have been of primary interest within the neuroscience field, but novel roles are constantly emerging for the kinase in tissue homeostasis, as well as in diseases such as diabetes and cancer. Although Cdk5 activation is predominantly regulated by the binding of specific non-cyclin activator proteins, additional mechanisms have proved to orchestrate Cdk5 signaling in cells. For example, the interaction between the intermediate filament protein nestin and Cdk5 has been proposed to determine cellular fate during neuronal apoptosis through nestin-dependent adjustment of the sensitive balance and turnover of Cdk5 activators. While nestin constitutes a crucial regulatory scaffold for appropriate Cdk5 activation in apoptosis, Cdk5 itself phosphorylates nestin, with the consequence of filament reorganization in both neuronal progenitors and differentiating muscle cells. Interestingly, the two proteins are often found coexpressed in various tissues and cell types, suggesting that nestin-mediated scaffolding of Cdk5 and its activators may be applicable to other tissue systems as well. In the literature, the molecular functions of nestin have remained in the shade, as it is mostly exploited as a marker protein for progenitor cells. In light of these studies, the aim of this thesis was to assess the importance of the nestin scaffold in the regulation of Cdk5 actions in cell fate decisions. The thesis can be subdivided into two major projects: one that studied the nature of the Cdk5-nestin interplay in muscle, and one that assessed their role in prostate cancer. During differentiation of a myoblast cell line, the filament formation properties of nestin were found to be crucial in directing Cdk5 activity, with direct consequences for the process of differentiation. The genetic knockout of nestin was also found to influence Cdk5 activity, although differentiation per se was not affected; instead, the genetic ablation of nestin had broad consequences for muscle homeostasis and regeneration. While the nestin-mediated regulation of Cdk5 in muscle was found to act in multiple ways, the connection remained more elusive in cancer models. Cdk5 was, however, established as a significant determinant of prostate cancer proliferation, a behavior uncharacteristic of this differentiation-associated kinase. Through complex and simultaneous regulation of two major prostate cancer pathways, Cdk5 was placed upstream of both the Akt kinase and the androgen receptor. Its action on proliferation was nonetheless mainly exerted through the Akt signaling pathway in various cancer models. In summary, this thesis contributes to the knowledge of Cdk5 regulation and function in two atypical settings: proliferation (in a cancer framework) and muscle differentiation, a poorly understood model system in the Cdk5 field. This balance between proliferation and differentiation implemented by Cdk5 is ultimately regulated, where present, by the dynamics of the cytoskeletal nestin scaffold.


Poly-L-lactide (PLLA) is a widely used sustainable and biodegradable alternative to synthetic non-degradable plastic materials in the packaging industry. However, its processing properties are not always optimal, e.g. insufficient melt strength at higher temperatures (necessary in extrusion coating processes). This thesis reports on research to improve the properties of a commercial PLLA grade (3051D from NatureWorks) by blending it with modified PLLA, in order to satisfy and extend end-use applications such as food packaging. Adjustment of the processability of commercial poly-L-lactide by peroxide-initiated chain branching was evaluated. Several well-defined branched structures with four arms (sPLLA) were synthesized using pentaerythritol as a tetra-functional initiator. Finally, several block copolymers consisting of polyethylene glycol and PLLA (i.e. PEGLA) were produced to obtain a well-extruded material with improved heat-sealing properties. Reactive extrusion of poly-L-lactide was carried out in the presence of 0.1, 0.3 and 0.5 wt% of various peroxides [tert-butyl-peroxybenzoate (TBPB), 2,5-dimethyl-2,5-(tert-butylperoxy)-hexane (Lupersol 101; LOL1) and benzoyl peroxide (BPO)] at 190 °C. The peroxide-treated PLLAs showed increased complex viscosity and storage modulus at lower frequencies, indicating the formation of branched/cross-linked architectures. The changes in material properties depended on the peroxide and on the peroxide concentration used. Gel fraction analysis showed that the peroxides afforded different gel contents; in particular, 0.5 wt% peroxide produced both an extremely high molar mass and a cross-linked structure, perhaps not well suited for further use, e.g. in a blending step. The thermal behavior was somewhat unexpected, as the materials prepared with 0.5 wt% peroxide showed the highest ability for crystallization and cold crystallization, despite substantial cross-linking. The peroxide-modified PLLA, i.e. PLLA melt-extruded with 0.3 wt% of TBPB or LOL1 or with 0.5 wt% of BPO, was added to linear PLLA in ratios of 5, 15 and 30 wt%. All blends showed increased zero shear viscosity, elastic nature (storage modulus) and shear sensitivity. All blends remained amorphous, though their ability to anneal was improved slightly. Extrusion coating on paperboard was conducted with PLLA and with peroxide-modified PLLA blends (90:10). All blends were processable, but only PLLA with 0.3 wt% of LOL1 afforded a smooth, high-quality surface with improved line speed. Adhesion levels between fiber and plastic, as well as heat-seal performance, were marginally reduced compared with pure 3051D. The water vapor transmission rate (WVTR) measurements of the blends containing LOL1 showed acceptable levels, only slightly lower than for comparable PLLA 3051D. A series of four-arm star-shaped poly-L-lactides (sPLLA) with different branch lengths was synthesized by ring opening polymerization (ROP) of L-lactide using pentaerythritol as initiator and stannous octoate as catalyst. The star-shaped polymers were further blended with the linear resin and studied for their melt flow and thermal properties. Blends containing 30 wt% of low molecular weight sPLLA (Mw,total 2500 g mol-1 and 15000 g mol-1) showed lower zero shear viscosity and significantly increased shear thinning, while at the same time slightly increasing the crystallization of the blend. However, the amount of crystallization increased significantly with the higher molecular weight sPLLA; the star-shaped structure may therefore play a role as a nucleating agent. PLLA-polyethylene glycol-PLLA triblock copolymers (PEGLA) with different PLLA block lengths were synthesized, and their applicability in blends with linear PLLA (3051D from NatureWorks) was investigated with the intention of improving the heat-seal and adhesion properties of extrusion-coated paperboard. PLLA-PEG-PLLA was obtained by ring opening polymerization (ROP) of L-lactide using PEG (molecular weight 6000 g mol-1) as initiator and stannous octoate as catalyst. The structures of the PEGLAs were characterized by proton nuclear magnetic resonance spectroscopy (1H-NMR). The melt flow and thermal properties of all PEGLAs and their blends were evaluated using dynamic rheology and differential scanning calorimetry (DSC). All blends containing 30 wt% of PEGLA showed slightly higher zero shear viscosity, higher shear thinning and increased melt elasticity (based on tan delta). Nevertheless, no significant changes in thermal properties were distinguished. High molecular weight PEGLAs were used on the extrusion coating line with 3051D without problems.


Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature: to ensure the system’s functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
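
Since the thesis names discrete-event simulation in SimPy as one of its quantitative techniques, the following minimal sketch shows the kind of model that can be used: a component fails at random, becomes available again after a reconfiguration delay, and the simulation estimates the resulting availability. The failure and reconfiguration parameters are assumed values, not those of the thesis case studies.

```python
# Minimal SimPy sketch: estimate availability when failures trigger a
# detection-and-reconfiguration delay. Parameters below are assumptions.

import random
import simpy

MTBF, RECONF_TIME, HORIZON = 100.0, 5.0, 10_000.0   # assumed values
downtime = 0.0

def component(env):
    global downtime
    while True:
        yield env.timeout(random.expovariate(1.0 / MTBF))  # time to next failure
        start = env.now
        yield env.timeout(RECONF_TIME)                     # detect + reconfigure
        downtime += env.now - start

random.seed(1)
env = simpy.Environment()
env.process(component(env))
env.run(until=HORIZON)
print(f"estimated availability: {1 - downtime / HORIZON:.3f}")
```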


Corporations practice company acquisitions in order to create shareholder value. During the last few decades, companies in emerging markets have become active in the acquisition business, and during the last decade large and significant acquisitions have occurred especially in the automotive industry. As domestic markets have become too competitive and companies lack the required capabilities, they seek to expand into Western markets by attaining valuable assets through acquisitions of developed-country corporations. This study discusses the issues and characteristics of these acquisitions through case studies. The purpose of this study was to identify the acquisition motives and the strategies for post-transaction brand and product integration, as well as to analyze the effect of the motives on the integration strategy. The cases chosen for the research were Chinese Geely acquiring Swedish Volvo in 2010 and Indian Tata Motors buying British Jaguar Land Rover in 2008. The main topics were chosen according to their significance for companies in the automotive industry, and because they are the most visible parts to consumers. The study is based on qualitative case study methods, analyzing secondary data from academic papers and news articles as well as the companies’ own announcements, e.g. stock exchange and press releases. The study finds that the companies in the cases mainly possessed asset-seeking and market-seeking motives. In addition, the findings point to rather minimal post-acquisition brand and product integration strategies: the parent companies mainly left the target company autonomous to make its own business strategies and decisions. The most noticeable integration took place in the product development and production processes. By restructuring the product architectures, the companies were able to share components and technology between product families and brands, which cuts costs and increases profitability and efficiency. In the Geely-Volvo case, the strategy focused more on component sharing and product development know-how, whereas in the Tata Motors-Jaguar Land Rover case the main actions were to cut costs through component sharing and to combine production and distribution networks, especially in Asian markets. However, it was evident that in both cases the integration and technology sharing were executed cautiously to avoid harming the valuable image of the luxury brands. This study concludes that asset-seeking motives have a significant influence on the post-transaction brand and model line-up integration strategies. By taking a cautious approach in acquiring assets such as a luxury brand, the companies in the cases have implemented a successful post-acquisition strategy and managed to create value for their shareholders, at least in the short term.


The case company utilizes a multi-branding strategy (or house-of-brands strategy) in its product portfolio. In practice the company has multiple brands – one main brand and four acquired brands – which all utilize one single product platform. The objective of this research is to analyze the case company’s multi-branding strategy and its benefits and challenges. Moreover, the purpose is to clarify how a company in B2B markets could utilize a multi-branding strategy more efficiently and profitably. The theoretical part of this thesis covers aspects of branding strategies: different brand name architectures, the benefits and challenges of different strategies, and different ways of utilizing branding strategies in mergers and acquisitions. The empirical part, in turn, includes a description of the case company’s branding strategy and the employees’ perspective on the benefits and challenges of the multi-branding strategy and on how to utilize it more efficiently and profitably. This study shows that the major benefits of multi-branding are lower production costs, the ability to reach wider market coverage, the possibility to utilize common sales tools, synergies in R&D, and shared resources. On the other hand, the major challenges are a lack of product differentiation, internal competition, branding issues in production and deliveries, pricing issues and conflicts, and compromises in product compatibility and suitability. Based on the results, several ways to utilize the multi-branding strategy more efficiently and profitably were found: putting more effort into brand image and product differentiation, increasing co-operation among the brands, and focusing on more precise customer and market segmentation.


Business-to-business (B2B) is a relatively new business concept, as is the enterprise resource planning (ERP) system in information technology. This paper looks at the research, implementation and integration of these two concepts over the last two decades. One of the major success factors for growth in the business-to-business environment is the availability of internal and partner data. An enterprise resource planning system facilitates the storage and analysis of such data and enables business process automation, forecasting and numerous other value-creating activities. In order to achieve such functionality for B2B customers, integrating them within the ERP system is very useful. This paper aims at understanding and suggesting such integration by investigating related documentation of similar integration scenarios, infrastructures, models and architectures. The topic was investigated using a systematic mapping study of related papers, and the paper lists and suggests the necessary ingredients that enable such integration. Furthermore, it suggests possibilities for overcoming the challenges integration experts might face during the integration phase and opens doors to future research in the related fields.


Convolutional Neural Networks (CNNs) have become the state-of-the-art method for many large-scale visual recognition tasks. For many practical applications, however, CNN architectures have a restrictive requirement: a huge amount of labeled data is needed for training. The idea of generative pretraining is to obtain the initial weights of the network by training it in a completely unsupervised way, and then to fine-tune the weights for the task at hand using supervised learning. In this thesis, a general introduction to Deep Neural Networks and the associated algorithms is given, and these methods are applied to classification tasks on handwritten digits and natural images in order to develop unsupervised feature learning. The goal of the thesis is to find out whether the effect of pretraining is damped by recent practical advances in the optimization and regularization of CNNs. The experimental results show that pretraining is still a substantial regularizer, but not a necessary step, in training Convolutional Neural Networks with rectified activations. On handwritten digits, the proposed pretraining model achieved a classification accuracy comparable to the state-of-the-art methods.
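
A compact sketch of the pretrain-then-fine-tune scheme described above is given below (PyTorch, toy dimensions and random stand-in data): an encoder is first trained unsupervised as part of an autoencoder, and its weights then initialise a supervised classifier. It illustrates the idea only and is not the experimental setup used in the thesis.

```python
# Generative pretraining sketch: (1) unsupervised autoencoder training,
# (2) reuse the pretrained encoder to initialise a classifier, then fine-tune.

import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
decoder = nn.Linear(128, 28 * 28)
autoencoder = nn.Sequential(encoder, decoder)

x_unlabeled = torch.rand(256, 1, 28, 28)             # stand-in for unlabeled digits

# 1) unsupervised pretraining: reconstruct the input
opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
for _ in range(20):
    loss = nn.functional.mse_loss(autoencoder(x_unlabeled), x_unlabeled.flatten(1))
    opt.zero_grad(); loss.backward(); opt.step()

# 2) supervised fine-tuning: the pretrained encoder provides the initial weights
classifier = nn.Sequential(encoder, nn.Linear(128, 10))
x_labeled, y = torch.rand(64, 1, 28, 28), torch.randint(0, 10, (64,))
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
for _ in range(20):
    loss = nn.functional.cross_entropy(classifier(x_labeled), y)
    opt.zero_grad(); loss.backward(); opt.step()
```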