18 results for modularisation
Abstract:
Background: Essential hypertension, a major risk factor for the development of cardiovascular disease, is a complex multigenic trait whose genetic determinism remains incompletely understood. Numerous quantitative trait loci (QTLs), that is, genes responsible for varying blood pressure (BP), have been identified in humans and in animal models. However, how these genes work together to regulate BP remains a mystery. Hypothesis and objective: Rather than an addition of QTLs each having an infinitesimal effect on BP, epistatic interactions between genes would be responsible for the hypertensive phenotype. Studying this epistasis among the genes involved, directly or indirectly, in BP homeostasis should therefore allow us to explore new molecular regulatory pathways implicated in this disease. Methods: QTLs can be revealed by constructing congenic rat strains, in which a chromosomal segment from a hypertensive recipient strain (Dahl Salt Sensitive, SS/Jr) is replaced by its homologue from a normotensive donor strain (Lewis, LEW). In this context, combining QTLs by creating double or multiple congenics constitutes the first functional demonstration of intergenic interactions. Results: In total, twenty-seven combinations led us to recognise a modularisation of the QTLs. They were categorised into two principal epistatic modules (EMs), in which QTLs belonging to the same EM are epistatic to one another and participate in the same regulatory pathway. The EMs/cascades then act in parallel to regulate BP. Thanks to the existence of QTLs with opposing effects on BP, we were able to establish the hierarchical order between three pairs of QTLs. When this regulatory sequence cannot be determined, however, other approaches are needed. Our work led to the identification of a QTL located on rat chromosome 16 (C16QTL), belonging to EM1, which may reveal a new pathway of BP homeostasis. The gene retinoblastoma-associated protein 140 (Rap140)/family with sequence similarity 208 member A (Fam208a), which carries a non-synonymous mutation between SS/Jr and LEW, is the most plausible candidate gene for C16QTL. It encodes a transcription factor and appears to influence the expression of Solute carrier family 7 (cationic amino acid transporter, y+ system) member 12 (Slc7a12), which is specifically and significantly under-expressed in the kidneys of the congenic strain carrying C16QTL compared with the SS/Jr strain. Rap140/Fam208a may act as an inhibitor of Slc7a12 transcription, leading to a decrease in blood pressure in Lewis. Conclusions: The complex architecture of BP regulation is being unveiled, bringing into play new actors, most of them previously unknown for any involvement in BP. Studying the new Rap140/Fam208a - Slc7a12 signalling pathway will deepen our knowledge of blood pressure homeostasis and of hypertension in SS/Jr. In the long term, new antihypertensive treatments targeting more than one regulatory pathway at a time could emerge.
Abstract:
In this paper we define the notion of an axiom dependency hypergraph, which explicitly represents how axioms are included into a module by the algorithm for computing locality-based modules. A locality-based module of an ontology corresponds to a set of connected nodes in the hypergraph, and atoms of an ontology to strongly connected components. Collapsing the strongly connected components into single nodes yields a condensed hypergraph that comprises a representation of the atomic decomposition of the ontology. To speed up the condensation of the hypergraph, we first reduce its size by collapsing the strongly connected components of its graph fragment employing a linear time graph algorithm. This approach helps to significantly reduce the time needed for computing the atomic decomposition of an ontology. We provide an experimental evaluation for computing the atomic decomposition of large biomedical ontologies. We also demonstrate a significant improvement in the time needed to extract locality-based modules from an axiom dependency hypergraph and its condensed version.
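The condensation step described above amounts to the classical strongly-connected-component collapse applied to the graph fragment of the hypergraph. As a rough illustration only (not the authors' implementation), the following Python sketch computes SCCs with Tarjan's linear-time algorithm and collapses them into a condensed graph; the dictionary representation and axiom names are hypothetical. In the paper's setting the nodes would be axioms, each SCC would correspond to an atom, and the condensed structure would underlie the atomic decomposition.

```python
from collections import defaultdict

def tarjan_scc(graph):
    """Tarjan's linear-time algorithm: return the strongly connected
    components of a directed graph given as {node: successor nodes}."""
    nodes = set(graph) | {w for succs in graph.values() for w in succs}
    index, lowlink, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:        # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in nodes:
        if v not in index:
            strongconnect(v)
    return sccs

def condense(graph):
    """Collapse each SCC into a single node; return the condensed graph
    and the node -> component-id map (components play the role of atoms)."""
    sccs = tarjan_scc(graph)
    comp_of = {v: i for i, comp in enumerate(sccs) for v in comp}
    condensed = defaultdict(set)
    for v, succs in graph.items():
        for w in succs:
            if comp_of[v] != comp_of[w]:
                condensed[comp_of[v]].add(comp_of[w])
    return dict(condensed), comp_of

# Toy dependency graph: axioms a1 and a2 depend on each other (one
# component), and both pull in a3.
print(condense({"a1": ["a2", "a3"], "a2": ["a1"], "a3": []}))
```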
Abstract:
Enterprise systems sit within an antinomy: they appear as generic products, yet through configuration and customisation they become the means of multiple integrations for the user. Technological and organisational integrations are defined by architectures and standardised interfaces. Until recently, technological integration of enterprise systems has been supported largely by monolithic architectures that were designed and maintained by the respective developers. From a technical perspective, this approach has been challenged by the suggestion of component-based enterprise systems that would allow for a more user-focused system through strict modularisation. Lately, the product nature of software as a proprietary item has been questioned through the rapid increase of open source programs used in business computing in general, and also within the overall portfolio that makes up enterprise systems. This suggests the potential for altered technological and commercial constellations for the design of enterprise systems, which are presented in different scenarios. The technological and commercial decomposition of enterprise software and systems may also address some concerns emerging from the users’ experience of those systems, concerns which may have arisen from their proprietary or product nature.
Abstract:
BACKGROUND OR CONTEXT Laboratories provide the physical spaces for engineering students to connect with theory and have a personal, hands-on learning experience. Learning space design and development is well established in many universities; however, laboratories are often not part of that movement. While active, collaborative and group learning pedagogies are key words in relation to these new spaces, the concepts have always been central to laboratory-based learning. The opportunity to build on and strengthen good practice in laboratories is immense. The 2001 review “Universities in Crisis” makes many references to the decline of laboratories. One such comment in the review was made by Professor Ian Chubb (AVCC), who in 2013, as Chief Scientist for Australia, identified the national concern about STEM education and presented a strategic plan to address the challenges ahead. What has been achieved and changed in engineering teaching and research laboratories in this time? PURPOSE OR GOAL A large number of universities in Australia and New Zealand own laboratory and other infrastructure that was well designed for the era in which it was built but is now showing signs of age: unable to meet the needs of today’s students, limiting the effectiveness of learning outcomes and presenting very low utilisation rates. This paper presents a model for new learning space design that improves student experience and engagement, supports academic aims and significantly raises the space utilisation rate. APPROACH A new approach to laboratory teaching and research, including new management, has been adopted by the engineering disciplines at QUT. Flexibility is an underpinning principle, along with the modularisation of fixed teaching and learning equipment, high utilisation of spaces and dynamic pedagogical approaches. The revitalised laboratories and workshop facilities are used primarily by the engineering disciplines and increasingly for integrated use across many disciplines in the STEM context. The new approach was built upon a base of an integrated faculty structure from 2005 and realised in 2010 as an associated development with the new Science and Engineering Centre (SEC). Evaluation through student feedback surveys for practical activities, utilisation rate statistics and uptake by academic and technical staff indicates a very positive outcome. DISCUSSION Resulting from this implementation have been increased satisfaction among students, the creation of social learning and connecting space, and an environment that meets the needs and challenges of active, collaborative and group learning pedagogies. Academic staff are supported, technical operations are efficient and laboratories are effectively utilised. RECOMMENDATIONS/IMPLICATIONS/CONCLUSION Future opportunities for continuous improvement are evident in using the student feedback to rectify faults and improve equipment, environment and process. The model is easily articulated and visible to other interested parties and can contribute to sector-wide development of learning spaces.
Abstract:
The goal of this work is to fabricate robust, highly miniaturised, wireless sensor modules that incorporate ion-selective electrodes (ISEs). pH is one of the main parameters in the assessment of the quality of our environment (water, soil), and these ISE/pH sensors will be deployed in a miniaturised, programmable modular system. The simplicity of ISEs (low cost and low power requirements) allows for the preparation of sensors that are all very similar in construction but can, at the same time, easily be made for a variety of different environmentally important ions (e.g. heavy metals). This is important because of the increasing focus on the impact of the quality of the environment on society, both locally and globally. The work described will contribute to a widely distributed sensor network for monitoring the quality of our environment, focused mainly on soil and water quality.
Abstract:
Use of structuring mechanisms (such as modularisation) is widely believed to be one of the key ways to improve software quality. Structuring is considered to be at least as important for specification documents as for source code, since it is assumed to improve comprehensibility. Yet, as with most widely held assumptions in software engineering, there is little empirical evidence to support this hypothesis. Also, even if structuring can be shown to be a good thing, we do not know how much structuring is optimal. One of the more popular formal specification languages, Z, encourages structuring through its schema calculus. A controlled experiment is described in which two hypotheses about the effects of structure on the comprehensibility of Z specifications are tested. Evidence was found that structuring a specification into schemas of about 20 lines significantly improved comprehensibility over a monolithic specification. However, there seems to be no perceived advantage in breaking down the schemas into much smaller components. The experiment can be fully replicated.
Abstract:
The Parker Morris report of 1961 attempted, through the application of scientific principles, to define the minimum living space standards needed to accommodate household activities. But while early modernist research into ideas of existenzminimum was the work of avant-garde architects and thinkers, this report was commissioned by the British State. This normalization of scientific enquiry into space can be considered not only a response to new conditions in the mass production of housing – economies of scale, prefabrication, system-building and modular coordination – but also to the post-war boom in consumer goods. The domestic interior was assigned a key role as a privileged site of mass consumption as the production and micro-management of space in Britain became integral to the development of a planned national economy underpinned by Fordist principles. The apparently placeless and scale-less diagrams executed by Gordon Cullen to illustrate Parker Morris emblematize these relationships. Walls dissolve as space flows from inside to outside in a homogenized and ephemeral landscape whose limits are perhaps only the boundaries of the nation state and the circuits of capital.
Abstract:
6.00 pm. If people like watching T.V. while they are eating their evening meal, space for a low table is needed (Ministry of Housing and Local Government, Space in the Home, 1963, p. 4).
This paper re-examines the 1961 Parker Morris report on housing standards in Britain. It explores the origins, scope, text and iconography of the report and suggests that these not only express a particularly modernist conception of space but one which presupposed very specific economic conditions and geographies.
Also known as Homes for Today and Tomorrow, Parker Morris attempted, through the application of scientific principles, to define the minimum living space standards needed to accommodate household activities. But while early modernist research into notions of existenzminimum was the work of avant-garde architects and thinkers, Homes for Today and Tomorrow and its sister design manual Space in the Home were commissioned by the British State. This normalization of scientific enquiry into space can be considered a response not only to new conditions in the mass production of housing – economies of scale, prefabrication, system-building and modular coordination – but also to the post-war boom in consumer goods. In this, it is suggested that the domestic interior was assigned a key role as a privileged site of mass consumption as the production and micro-management of space in Britain became integral to the development of a planned national economy underpinned by Fordist principles. Parker Morris, therefore, sought to accommodate activities which were pre-determined not so much by traditional social or familial ties but rather by recently introduced commodities such as the television set, white goods, table tennis tables and train sets. This relationship between the domestic interior and the national economy is emblematized by the series of placeless and scale-less diagrams executed by Gordon Cullen in Space in the Home. Here, walls dissolve as space flows from inside to outside in a homogenized and ephemeral landscape whose limits are perhaps only the boundaries of the nation state and the circuits of capital.
In Britain, Parker Morris was the last explicit State-sponsored attempt to prescribe a normative spatial programme for national living. The calm neutral efficiency of family-life expressed in its diagrams was almost immediately problematised by the rise of 1960s counter-culture, the feminist movement and the oil crisis of 1972 which altered perhaps forever the spatial, temporal and economic conditions it had taken for granted. The debate on space-standards, however, continues.
Abstract:
Traditionally, legacy object-oriented applications integrate different functional aspects. These aspects may be scattered throughout the code. There are different kinds of aspects: aspects that represent business functionalities, and aspects that address non-functional requirements or other design considerations such as robustness, distribution, security, etc. Generally, the code implementing these aspects cuts across several class hierarchies. Several researchers have addressed the problem of modularising these aspects in the code: subject-oriented programming, aspect-oriented programming and view-oriented programming. All of these methods propose techniques and tools for designing object-oriented applications as a composition of code fragments that address different aspects. Separating aspects in the code has benefits for reuse and maintenance. It is therefore important to identify and locate these aspects in legacy object-oriented code. We are particularly interested in functional aspects. Assuming that the code implementing a functional aspect, or functionality, exhibits a certain functional cohesion (dependencies between its elements), we propose to identify such functionalities from the code. The idea is to identify, in the absence of aspect-oriented programming paradigms, the techniques that allow the different functional aspects to be implemented in object-oriented code. Our approach consists of identifying the techniques used by developers to integrate a functionality in the absence of aspect-oriented techniques, characterising the footprint these techniques leave on the code, and developing tools to detect these footprints. We thus present two approaches for identifying the functionalities present in object-oriented code. The first identifies the different design patterns that allow these functionalities to be integrated into the code. The second uses formal concept analysis to identify recurring functionalities in the code. We experiment with both approaches on open-source object-oriented systems to identify the different functionalities in their code. The results obtained show the effectiveness of our approaches in identifying the different functionalities in legacy object-oriented code and make it possible to suggest refactoring opportunities.
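As a hedged illustration of the second approach, the sketch below computes the formal concepts of a small, invented context in which classes are the objects and the features they implement are the attributes; intents that recur across several classes are the kind of grouping that can hint at a shared functionality. The class names, attributes and representation are placeholders, not the thesis’s actual tooling.

```python
def formal_concepts(context):
    """All formal concepts of a context given as {object: set of attributes}.
    A concept is a pair (extent, intent): the extent is exactly the set of
    objects having every attribute of the intent, and the intent is exactly
    the set of attributes shared by every object of the extent."""
    objects = list(context)
    all_attrs = set().union(*context.values()) if context else set()

    # Every intent is an intersection of object intents (plus the intent of
    # the empty extent, i.e. all attributes), so close the object intents
    # under intersection incrementally.
    intents = {frozenset(all_attrs)}
    for obj in objects:
        obj_intent = frozenset(context[obj])
        intents |= {obj_intent & i for i in intents}
        intents.add(obj_intent)

    concepts = []
    for intent in sorted(intents, key=len):
        extent = frozenset(o for o in objects if intent <= context[o])
        concepts.append((extent, intent))
    return concepts

# Invented context: classes as objects, the crosscutting features they
# implement (persistence, logging, notification) as attributes.
ctx = {
    "Order":    {"persist", "log"},
    "Customer": {"persist", "log", "notify"},
    "Invoice":  {"persist"},
}
for extent, intent in formal_concepts(ctx):
    print(sorted(extent), "->", sorted(intent))
```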
Abstract:
Since essential hypertension is a major factor of morbidity, understanding its etiology is paramount. The discovery of new components or mechanisms of BP regulation through the identification of QTLs and the study of their interactions is therefore a promising approach. Using congenic rat strains to study hypertension is a rewarding strategy, since it masks environmental effects while preserving the polygenic character of BP. Long conceived as a trait arising from the accumulation of minute QTL effects, BP is in fact regulated by an architecture based on epistatic interactions. Pairwise analysis of individual QTLs established a modularity in the organisation of QTLs in the Dahl Salt-sensitive rat according to the presence or absence of an epistatic interaction between them. Two epistatic modules were thus established, EM1 and EM2, where all QTLs belonging to EM1 are epistatic to one another and act additively with the members of EM2. Regulatory hierarchies can then be revealed when QTLs of the same EM have opposing effects. Identifying the molecular nature of the candidates C18QTL4/Hdhd2 and C18QTL3/Tcof1, both members of EM1, and of the epistatic interaction between these two QTLs further made it possible to elucidate a sequential regulation within the module. Hdhd2 could act upstream of Tcof1 and regulate the latter through a post-translational modification. This interaction is the first experimental evidence for the prediction of relationships between QTLs, a phenomenon established by their modularisation. Unravelling how the genetic architecture underlying BP control works, and discovering the genes responsible for the QTLs, would broaden the range of therapeutic targets and thus allow more effective antihypertensive treatments to be developed.
Abstract:
In this session we look at how to think systematically about a problem and create a solution. We look at the definition and characteristics of an algorithm, and see how, through modularisation and decomposition, we can then choose a set of methods to create. We also compare this somewhat procedural approach with the way that design works in Object Oriented Systems.
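As a purely illustrative sketch (an example chosen here, not taken from the session material), the same small problem is first decomposed into procedural functions and then expressed as a class, which is the comparison the session draws.

```python
# Procedural decomposition: the task "report the average of some readings"
# broken into small single-purpose functions.
def read_values(raw):
    return [float(x) for x in raw.split(",")]

def average(values):
    return sum(values) / len(values) if values else 0.0

def report(raw):
    return f"average = {average(read_values(raw)):.2f}"

# The same responsibilities grouped as an object: the data and the
# operations on it live together, which is the object-oriented view.
class ReadingSet:
    def __init__(self, raw):
        self.values = [float(x) for x in raw.split(",")]

    def average(self):
        return sum(self.values) / len(self.values) if self.values else 0.0

    def report(self):
        return f"average = {self.average():.2f}"

print(report("1.0, 2.0, 3.0"))               # procedural style
print(ReadingSet("1.0, 2.0, 3.0").report())  # object-oriented style
```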
Abstract:
The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks by avoiding interface problems among tools, data flow and management. The approach is thought to be useful to both control and process engineers in assisting their tasks. The use of AI technologies to diagnose and perform control loops and, of course, to assist process supervisory tasks such as fault detection and diagnosis, is within the scope of this work. Special effort has been put into the integration of tools for assisting the design of expert supervisory systems. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems (CASSD) framework. In this sense, some basic facilities are required to be available in this proposed framework.
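As a loose illustration of the kind of supervisory check such a framework is meant to host (not the thesis’s CASSD components, and with all names hypothetical), a minimal rule-based fault-detection sketch might look as follows.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    """Flag a fault when a monitored signal stays outside its allowed band
    for more than `persistence` consecutive samples."""
    signal: str
    low: float
    high: float
    persistence: int

class Supervisor:
    def __init__(self, rules):
        self.rules = rules
        self.counters = {r.signal: 0 for r in rules}

    def step(self, sample):
        """Feed one sample ({signal: value}) and return any raised alarms."""
        alarms = []
        for r in self.rules:
            value = sample.get(r.signal)
            if value is None:
                continue
            if value < r.low or value > r.high:
                self.counters[r.signal] += 1
            else:
                self.counters[r.signal] = 0
            if self.counters[r.signal] > r.persistence:
                alarms.append(f"{r.signal} out of range ({value})")
        return alarms

sup = Supervisor([Rule("tank_level", low=0.2, high=0.9, persistence=2)])
for level in (0.5, 0.95, 0.97, 0.99, 0.98):
    print(sup.step({"tank_level": level}))
```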
Abstract:
Companies are focusing their efforts on increasing overall efficiency at the same time as the ability to meet customer needs becomes ever more important. There is a need to improve the organisation and the product design at the same time, through the visualisation of how a product family design should be performed in order to adapt to customers, company-internal issues and long-term strategy. Therefore, there is a need in today’s companies for qualified personnel with knowledge of product development and modularity. The graduate course Development of Modular Products at Högskolan Dalarna has the objective of providing such knowledge. As a part of the course, each student individually performs extensive research within a chosen area with respect to Product Development and Modularity. This proceeding is the result of the students’ own work and was presented during a two-day seminar at Dalarna University. The contents of the papers cover many areas, from the identification of customer needs to cost-effective manufacturing and the benefits of modularisation. The reader of this proceeding will benefit not only from many areas within Product Development and Modularity but also from the colour of many cultures. In this proceeding, students from nine countries are represented (Bangladesh, China, Costa Rica, Germany, Holland, India, Luxembourg, Nigeria and Sweden). Enjoy the reading.
Abstract:
In recent years there has been a rapid growth of international trade in semi-finished products designed, produced and assembled in different locations across different countries, mainly due to the following reasons: development of information technologies, reduction of transportation costs, liberalisation of capital markets, harmonisation of institutional factors, regional economic integration, which involves the reduction and elimination of trade barriers, economic development of emerging countries, use of economies of scale and deregulation of international trade. All these factors have increased competition in markets at a global level and have allowed companies to gain easier access to potential markets and to the acquisition of skills and knowledge in other countries, as well as to the completion of international strategic alliances with third parties, thus creating a more demanding and uncertain environment for the companies constituting an industry, which has a direct impact on the companies’ operations and the organization of their production. In order to adapt, be competitive and benefit from this new and more competitive global scenario, companies have outsourced some parts of their production process to specialist suppliers, generating a new intermediate market which divides the production process, previously integrated in the companies that made up the industry, into two sets of companies specialized in that industry. This process often occurs while preserving the industry where it takes place, its same services and products, the technology used and the original companies that formed it prior to vertical disintegration. This is because it is beneficial for both the industry’s original companies and the companies belonging to this new intermediate market, for various reasons. Vertical disintegration has consequences which completely transform the industry where it takes place as well as the modus operandi of the companies that are part of it, even of those who remain vertically integrated.
One of the most important features of vertical disintegration of an industry is the possibility for a company to acquire from a third one the first part of the production process or a semi-finished product, which will then be finished by the acquiring company through the practice of outsourcing; also, a company can perform the first part of the production process or a semi-finished product, which will then be completed by a third company through the practice of fragmentation. The main objective of this research is to study the motives, facilitators, effects, consequences and major significant microeconomic and macroeconomic factors that trigger or increase the practice of vertical disintegration in a certain industry; in order to do so, research is divided into two completely differentiated lines: on the one hand, the study of the practice of outsourcing and, on the other, the study of fragmentation by companies constituting the automotive industry in Spain, since this is one of the most vertically disintegrated and fragmented industries and this particular sector is of major significance in this country’s economy. First, a review is made of the existing literature on the following aspects: vertical disintegration, outsourcing, fragmentation, international trade theory, history of the automobile industry in Spain and the use of geographical agglomeration and information technologies in the automotive sector. The methodology used for each of these aspects has been different depending on the availability of data and the research approach: the microeconomic factors, using outsourcing, and the macroeconomic factors, using fragmentation. In the study on outsourcing, an index is used based on external purchases in relation to the total value of production. Likewise, its significance and correlation with the major economic variables that define an automotive company are studied, using the statistical technique of linear regression. Variables related to market competition, outsourcing of lowest value-added activities and increased modularisation of the activities of the value chain have turned out to be significantly associated with the practice of outsourcing. In the study of fragmentation, a set of macroeconomic factors commonly used in this type of research is selected, related to the main economic indicators of a country, together with a set of macroeconomic factors not commonly used in this type of research, related to economic freedom and the international trade of a certain country. A logistic regression model is used to identify which factors are significant in the practice of fragmentation. Amongst all factors used in the model, those related to economies of scale and service costs have turned out to be significant. The results obtained from the statistical tests performed on the logistic regression model have been successful; hence, the suggested logistic regression model can be considered solid, reliable and versatile, and in line with reality. From the results obtained in the study of outsourcing and fragmentation, combined with the state of the art, it is concluded that the main factor that triggers vertical disintegration in the automotive industry is competition within the vehicle market. The greater the vehicle demand, the lower the earnings and profitability for manufacturers.
These, in order to be competitive, differentiate their products from the competition by focusing on those activities that contribute the highest added value to the final product, outsourcing the lower value-added activities to specialist suppliers, and increasing the modularity of the activities of the value chain. Companies in the automotive industry specialize in one or more of these modularised activities which, combined with the use of enabling factors such as economies of scale, information technologies, the advantages of economic globalisation and the geographical agglomeration of an industry, increase and encourage vertical disintegration in the automotive industry, triggering co-specialization in two clearly distinct sectors: the sector of vehicle manufacturers and the specialist suppliers sector. Each of them specializes in certain activities and specific products or services of the value chain, generating the following consequences in the automotive industry: reduction of transaction costs of the goods or services exchanged; growth of the relationship of dependency between vehicle manufacturers and specialist suppliers, which causes an increase in cooperation and coordination, accelerates the learning process, enables both to acquire new skills, knowledge and resources, and creates new competitive advantages for both; finally, barriers to entry into the automotive industry and the number of companies are altered, changing the industry’s structure. As a future line of research, vehicle manufacturers will tend to focus on researching, designing and marketing the product or service, delegating assembly into the hands of new specialists in the field, the contract manufacturer; for this reason, it would be useful to investigate what motivating or facilitating factors exist in this respect and what consequences the implementation of contract manufacturers would have in the automotive industry.
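To make the two methodological lines concrete, the sketch below fits an ordinary least squares model for a firm-level outsourcing index and a logistic regression for a binary fragmentation indicator, using statsmodels; the variable names and the synthetic data are placeholders, since the abstract only summarises the actual index and factor set used in the thesis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Firm-level line: an outsourcing index (external purchases over total
# production value) regressed on firm variables by ordinary least squares.
competition    = rng.normal(size=n)
modularisation = rng.normal(size=n)
X_firm = sm.add_constant(np.column_stack([competition, modularisation]))
outsourcing_index = (0.4 * competition + 0.3 * modularisation
                     + rng.normal(scale=0.5, size=n))
ols = sm.OLS(outsourcing_index, X_firm).fit()
print(ols.params, ols.pvalues)      # which factors come out significant

# Macro line: a binary fragmentation indicator modelled with logistic
# regression on macroeconomic factors.
scale_econ   = rng.normal(size=n)
service_cost = rng.normal(size=n)
X_macro = sm.add_constant(np.column_stack([scale_econ, service_cost]))
p_frag = 1 / (1 + np.exp(-(1.2 * scale_econ - 1.0 * service_cost)))
fragmented = (rng.random(n) < p_frag).astype(float)
logit = sm.Logit(fragmented, X_macro).fit(disp=0)
print(logit.params, logit.pvalues)
```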