17 results for MIXED-MODEL
at Universidad Politécnica de Madrid
Abstract:
The influence of climate on forest stand composition, development and growth is undeniable. Many studies have tried to quantify the effect of climatic variables on forest growth and yield. Such work is especially important given the need to predict the effects of climate change on the development of forest ecosystems. One way of approaching this problem is to include climatic variables in classic empirical growth models. This work has a twofold objective: (i) to identify the indicators which best describe the effect of climate on Pinus halepensis growth and (ii) to quantify that effect under several scenarios of rainfall decrease which are likely to occur in the Mediterranean area. A growth mixed model for P. halepensis including climatic variables is presented. Growth estimates are based on data from the Spanish National Forest Inventory (SNFI). The best results are obtained for the indices including rainfall, or rainfall and temperature together, with annual precipitation, precipitation effectiveness, Emberger's index and free bioclimatic intensity standing out among them. The final model includes Emberger's index, free bioclimatic intensity and interactions between competition and climate indices. The results show that a rainfall decrease of about 5% leads to a decrease in volume growth of 5.5–7.5%, depending on site quality.
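As a rough illustration of how such a rainfall sensitivity propagates through a log-linear growth model, the sketch below computes the relative change in predicted volume growth for a 5% rainfall decrease. The elasticity values are hypothetical, chosen only to reproduce the order of magnitude reported above; they are not the fitted coefficients of the model.

```python
# Hypothetical log-linear growth model: ln(iv) = b0 + b_q * ln(Q) + ...
# where iv is volume increment and Q is a rainfall-driven climate index
# (e.g. Emberger's index at fixed temperatures). b_q is illustrative only.
def relative_growth_change(b_q, rainfall_factor):
    """Relative change in predicted growth when rainfall is scaled by
    `rainfall_factor` (e.g. 0.95 for a 5% decrease), assuming the climate
    index scales linearly with rainfall."""
    return rainfall_factor ** b_q - 1.0

# With an illustrative elasticity b_q between ~1.1 and ~1.5, a 5% rainfall
# decrease yields roughly a 5.5-7.5% drop in predicted volume growth,
# the order of magnitude reported in the abstract.
for b_q in (1.1, 1.5):
    print(f"b_q={b_q}: {100 * relative_growth_change(b_q, 0.95):.1f}%")
```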
Abstract:
Natural regeneration in Pinus pinea stands commonly fails throughout the Spanish Northern Plateau under current intensive regeneration treatments. As a result, extensive direct seeding is commonly conducted to guarantee regeneration occurrence. In a period of rationalization of the resources devoted to forest management, such techniques may become unaffordable. Given that the climatic and stand factors driving germination remain unknown, tools are required to understand the process and temper the use of direct seeding. In this study, the spatio-temporal pattern of germination of P. pinea was modelled with those purposes. The resulting findings will allow us to (1) determine the main ecological variables involved in germination in the species and (2) infer adequate silvicultural alternatives. The modelling approach focuses on covariates which are readily available to forest managers. A two-step nonlinear mixed model was fitted to predict germination occurrence and abundance in P. pinea under varying climatic, environmental and stand conditions, based on a germination data set covering a 5-year period. The results reveal that the process is primarily driven by climate variables. Favourable conditions for germination commonly occur in fall, although the optimum window is often narrow and may not occur at all in some years. At the spatial level, it would appear that germination is facilitated by high stand densities, suggesting that current felling intensity should be reduced. In accordance with other studies on P. pinea dispersal, it seems that denser stands during the regeneration period will reduce the present dependence on direct seeding.
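The occurrence/abundance structure of a two-step model like the one described can be sketched as a hurdle model: a logistic submodel for whether germination occurs at all, and a log-linked submodel for how many germinants appear given that it does. All covariates and coefficients below are hypothetical placeholders, not the fitted values.

```python
import math

# Illustrative two-step (hurdle-type) germination model mirroring the
# occurrence/abundance structure described in the abstract.
def occurrence_prob(rainfall_mm, mean_temp_c):
    """Step 1: logistic probability that any germination occurs."""
    eta = -4.0 + 0.03 * rainfall_mm + 0.15 * mean_temp_c
    return 1.0 / (1.0 + math.exp(-eta))

def expected_abundance(stand_density, rainfall_mm):
    """Step 2: expected germinant count given occurrence (log link;
    denser stands facilitate germination, as the results suggest)."""
    return math.exp(0.5 + 0.002 * stand_density + 0.004 * rainfall_mm)

def expected_germination(rainfall_mm, mean_temp_c, stand_density):
    """Unconditional expectation = P(occurrence) * E[count | occurrence]."""
    return (occurrence_prob(rainfall_mm, mean_temp_c)
            * expected_abundance(stand_density, rainfall_mm))

# A wet, mild autumn in a dense stand vs a dry autumn in an open stand:
wet_dense = expected_germination(120, 15, 600)
dry_open = expected_germination(20, 15, 150)
print(wet_dense > dry_open)
```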
Abstract:
Short-run forecasting of electricity prices has become necessary for power generation unit scheduling, since it is the basis of every profit maximization strategy. In this article a new and straightforward method to compute accurate forecasts for electricity prices using mixed models is proposed. The main idea is to develop an efficient tool for one-step-ahead forecasting, combining several prediction methods whose forecasting performance has been checked and compared over a span of several years. Also as a novelty, the 24 hourly time series have been modelled separately, instead of the complete time series of the prices. This allows one to take advantage of the homogeneity of these 24 time series. The purpose of this paper is to select the model that leads to smaller prediction errors and to obtain the appropriate length of time to use for forecasting. These results have been obtained by means of a computational experiment. A mixed model which combines the advantages of the two new models discussed is proposed. Some numerical results for the Spanish market are shown, but this new methodology can be applied to other electricity markets as well.
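A minimal sketch of the underlying idea (modelling each of the 24 hourly series separately and, for each hour, keeping the predictor with the smallest recent one-step-ahead error) might look as follows. The two simple predictors used here are placeholders, not the models compared in the paper.

```python
import statistics

# Two placeholder one-step-ahead predictors for a single hourly series.
def naive(history):
    return history[-1]

def moving_avg(history, window=7):
    return statistics.mean(history[-window:])

def forecast_next_day(hourly_histories, eval_days=5):
    """hourly_histories: list of 24 price series (one per hour of day).
    For each hour, backtest both predictors over the last `eval_days`
    observations and forecast with the one that did better."""
    forecasts = []
    for series in hourly_histories:
        errors = {naive: 0.0, moving_avg: 0.0}
        for d in range(-eval_days, 0):
            past, actual = series[:d], series[d]
            for f in errors:
                errors[f] += abs(f(past) - actual)
        best = min(errors, key=errors.get)
        forecasts.append(best(series))
    return forecasts

# Example: 24 hourly series of 30 days of synthetic, gently trending prices.
histories = [[50 + h + 0.1 * d for d in range(30)] for h in range(24)]
print(forecast_next_day(histories)[:3])
```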
Abstract:
Objectives The study sought to evaluate the ability of cardiac magnetic resonance (CMR) to monitor acute and long-term changes in pulmonary vascular resistance (PVR) noninvasively. Background PVR monitoring during the follow-up of patients with pulmonary hypertension (PH) and the response to vasodilator testing require invasive right heart catheterization. Methods An experimental study in pigs was designed to evaluate the ability of CMR to monitor: 1) an acute increase in PVR generated by acute pulmonary embolization (n = 10); 2) serial changes in PVR in chronic PH (n = 22); and 3) changes in PVR during vasodilator testing in chronic PH (n = 10). CMR studies were performed with simultaneous hemodynamic assessment using a CMR-compatible Swan-Ganz catheter. Average flow velocity in the main pulmonary artery (PA) was quantified with phase contrast imaging. Pearson correlation and mixed model analysis were used to correlate changes in PVR with changes in CMR-quantified PA velocity. Additionally, PVR was estimated from CMR data (PA velocity and right ventricular ejection fraction) using a previously validated formula. Results Changes in PA velocity strongly and inversely correlated with acute increases in PVR induced by pulmonary embolization (r = –0.92), serial PVR fluctuations in chronic PH (r = –0.89), and acute reductions during vasodilator testing (r = –0.89, p ≤ 0.01 for all). CMR-estimated PVR showed adequate agreement with invasive PVR (mean bias –1.1 Wood units; 95% confidence interval: –5.9 to 3.7) and changes in both indices correlated strongly (r = 0.86, p < 0.01). Conclusions CMR allows for noninvasive monitoring of acute and chronic changes in PVR in PH. This capability may be valuable in the evaluation and follow-up of patients with PH.
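Agreement figures of the kind quoted above (a mean bias with 95% limits) come from a Bland-Altman-style analysis of paired measurements, which can be sketched as follows. The paired values below are synthetic, invented purely for illustration.

```python
import statistics

# Bland-Altman agreement between an estimated and a reference measurement:
# mean bias of the differences and 1.96*SD limits of agreement.
def bland_altman(estimated, reference):
    diffs = [e - r for e, r in zip(estimated, reference)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

cmr_pvr = [3.1, 5.0, 7.8, 10.2, 4.4]   # synthetic CMR-estimated PVR (Wood units)
rhc_pvr = [3.5, 5.6, 8.1, 11.0, 5.2]   # synthetic invasive (catheter) PVR
bias, (lo, hi) = bland_altman(cmr_pvr, rhc_pvr)
print(f"mean bias {bias:.2f} WU, 95% limits [{lo:.2f}, {hi:.2f}]")
```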
Abstract:
- Context: Pinus pinea L. presents serious problems of natural regeneration in managed forests of Central Spain. The species exhibits specific traits linked to frugivore activity. Therefore, information on plant–animal interactions may be crucial to understand regeneration failure. - Aims: Determining the spatio-temporal pattern of P. pinea seed predation by Apodemus sylvaticus L. and the factors involved. Exploring the importance of A. sylvaticus as a disperser of P. pinea. Identifying other frugivores and their seasonal patterns. - Methods: An intensive 24-month seed predation trial was carried out. The probability of seeds escaping predation was modelled through a zero-inflated binomial mixed model. Experiments on seed dispersal by A. sylvaticus were conducted. Cameras were set up to identify other potential frugivores. - Results: A declining rodent population in summer, together with masting, enhances seed survival. Seeds were exploited more rapidly near parent trees and shelters. A. sylvaticus dispersal activity was found to be scarce. Corvids marginally preyed upon P. pinea seeds. - Conclusions: Survival of P. pinea seeds is climate-controlled through the timing of the dry period together with masting occurrence. Should germination not take place during the survival period, establishment may be limited. A. sylvaticus-mediated dispersal does not modify the seed shadow. The seasonality of corvid activity points to a role of corvids in dispersal.
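The zero-inflated binomial model named in the Methods can be sketched as a mixture: with probability pi a seed station yields a structural zero (e.g. total exploitation), and otherwise escape counts follow a binomial law. The parameter values below are hypothetical, not the fitted ones.

```python
from math import comb

# Zero-inflated binomial (ZIB) probability mass function. pi is the
# probability of a structural zero; p is the per-seed escape probability;
# n is the number of seeds per station. Values are illustrative only.
def zib_pmf(k, n, p, pi):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    if k == 0:
        return pi + (1 - pi) * binom   # zeros inflated by the mixture
    return (1 - pi) * binom

n, p, pi = 10, 0.3, 0.4
pmf = [zib_pmf(k, n, p, pi) for k in range(n + 1)]
print(f"P(0 escapes) = {pmf[0]:.3f}")   # larger than the plain binomial P(0)
print(f"total mass   = {sum(pmf):.6f}")  # a valid distribution
```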
Abstract:
The authors, all from UPM, have each worked on different academic or real cases on this subject at different times. Building on the precedent of E. Torroja and A. Páez in Madrid, Spain (probabilistic safety models for concrete, around 1957, a line now continued in the ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for reductions under the superposition of independent loads; using it, he proposed a load-coefficient pattern for codes in Rome in February 1969, which was in practice adopted for European construction, and at JCSS Lisbon in February 1974 he suggested unifying the treatment of concrete, steel and aluminium. That model represents each type of load with a Gumbel Type I distribution over 50 years, reduces it to 1 year so it can be added to the other independent loads, and sets the sum, within Gumbel theory, back to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current Construction Eurocodes derived from the Model Codes. The author also considered the system within CEB in the presence of hydraulic actions from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage norm for MOPU in Spain, the authors developed an optimization model giving a way to determine the return period, from 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in south-east Spain fitted with a Gumbel Type I model and a paper by Ven Te Chow on the Mississippi at Keokuk using Gumbel Type II; the model can be modernized with a wider variety of extreme-value laws. In fact, in the MOPU drainage norm the drafting commission also acted as an expert panel to set a table of return periods for the elements of road drainage, effectively a complex multi-criteria decision system. These earlier ideas were used, for example,
in wide-ranging codes and presented at symposia and meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and offer modest hints of intended applications in agricultural and environmental planning, as a selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to changes in climate, in production and commercial systems, and in other social and financial factors.
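The return-period calculations underlying such a drainage norm rest on the Gumbel (Type I) quantile formula: for a return period T, the design value is x_T = u - beta * ln(-ln(1 - 1/T)). A sketch follows, with purely illustrative location and scale parameters rather than values from the norm.

```python
import math

# Gumbel (Type I) design value for return period T (years):
#   F(x) = exp(-exp(-(x - u) / beta)),  x_T solves F(x_T) = 1 - 1/T
# u (location) and beta (scale) below are hypothetical flood-flow
# parameters in m^3/s, not fitted to any real stream.
def gumbel_design_flow(u, beta, T):
    return u - beta * math.log(-math.log(1.0 - 1.0 / T))

u, beta = 120.0, 35.0
for T in (10, 25, 50):   # the 10- to 50-year range discussed in the norm
    print(f"T={T} yr: {gumbel_design_flow(u, beta, T):.1f} m^3/s")
```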
Abstract:
Detecting user affect automatically during real-time conversation is the main challenge towards our greater aim of infusing social intelligence into a natural-language mixed-initiative High-Fidelity (Hi-Fi) audio control spoken dialog agent. In recent years, studies on affect detection from voice have moved on to using realistic, non-acted data, which is subtler. However, it is more challenging to perceive subtler emotions, and this is demonstrated in tasks such as labelling and machine prediction. This paper attempts to address part of this challenge by considering the role of user satisfaction ratings and of conversational/dialog features in discriminating contentment and frustration, two types of emotions known to be prevalent in spoken human-computer interaction. However, given the laboratory constraints, users might be positively biased when rating the system, indirectly making the reliability of the satisfaction data questionable. Machine learning experiments were conducted on two datasets, from users and from annotators, which were then compared in order to assess the reliability of these datasets. Our results indicated that standard classifiers were significantly more successful in discriminating the abovementioned emotions and their intensities (reflected by user satisfaction ratings) from annotator data than from user data. These results corroborate, first, that satisfaction data could be used directly as an alternative target variable to model affect, and that it could be predicted exclusively from dialog features. Second, this held only when predicting the abovementioned emotions from annotator data, suggesting that user bias does exist in laboratory-led evaluations.
Abstract:
Mixed-criticality systems emerge as a suitable solution for dealing with the complexity, performance and costs of future embedded and dependable systems. However, this paradigm adds complexity to their development. This paper proposes an approach for dealing with this scenario that relies on hardware virtualization and Model-Driven Engineering (MDE). Hardware virtualization ensures isolation between subsystems with different criticality levels. MDE is intended to bridge the gap between design issues and partitioning concerns. MDE tooling will enhance the functional models by annotating partitioning and extra-functional properties. System partitioning and subsystem allocation will be generated with a high degree of automation. The system configuration will be validated to ensure that the resources assigned to a partition are sufficient for executing the allocated software components and that timing requirements are met.
Finite element simulation of sandwich panels of plasterboard and rock wool under mixed mode fracture
Abstract:
This paper presents the results of research on mixed mode fracture of sandwich panels of plasterboard and rock wool. The experimental data of the performed tests are supplied. The specimens were made from commercial panels. Asymmetrical three-point bending tests were performed on notched specimens. Three sizes of geometrically similar specimens were tested in order to study the size effect. The paper also includes the numerical simulation of the experimental results using an embedded cohesive crack model. The parameters involved in the modelling were previously measured by standardised tests.
Abstract:
The development of mixed-criticality virtualized multicore systems poses new challenges that are the subject of active research. There is an additional complexity: it is now necessary to identify a set of partitions and allocate applications to them. In this task, a number of issues have to be considered, such as the criticality level of the application, security and dependability requirements, the operating system used by the application, the granularity of time requirements, specific hardware needs, etc. The MultiPARTES [6] toolset relies on Model-Driven Engineering (MDE) [12], which is a suitable approach in this setting. This paper describes the support provided for automatic generation of system partitionings and for toolset extensibility.
Abstract:
The development of mixed-criticality virtualized multi-core systems poses new challenges that are the subject of active research. There is an additional complexity: it is now necessary to identify a set of partitions and allocate applications to them. In this task, a number of issues have to be considered, such as the criticality level of the application, security and dependability requirements, the granularity of time requirements, etc. The MultiPARTES [11] toolset relies on Model-Driven Engineering (MDE), a suitable approach in this setting, as it helps to bridge the gap between design issues and partitioning concerns. MDE is changing the way systems are developed nowadays, reducing development time. In general, modelling approaches have shown their benefits when applied to embedded systems. These benefits have been achieved by fostering reuse through an intensive use of abstractions and by automating the generation of boiler-plate code.
Abstract:
This PhD dissertation is framed in the emergent fields of Reverse Logistics and Closed-Loop Supply Chain (CLSC) management. This subarea of supply chain management has gained researchers' and practitioners' attention over the last 15 years to become a fully recognized subdiscipline of the Operations Management field. More specifically, among all the activities included within the CLSC area, this dissertation centres on direct reuse aspects. The main contribution of this dissertation to current knowledge is twofold. First, a framework for the so-called reuse CLSC is developed. This conceptual model is grounded in a set of six case studies conducted by the author in real industrial settings. The model has also been contrasted with the existing literature and with academic and professional experts on the topic. The framework encompasses four building blocks. In the first block, a typology for reusable articles is put forward, distinguishing between Returnable Transport Items (RTI), Reusable Packaging Materials (RPM), and Reusable Products (RP). In the second block, the common characteristics that render reuse CLSCs difficult to manage from a logistical standpoint are identified, namely: fleet shrinkage, significant investment and limited visibility. In the third block, the main problems arising in the management of reuse CLSCs are analyzed, namely: (1) defining fleet size, (2) controlling cycle time and promoting article rotation, (3) controlling the return rate and preventing shrinkage, (4) defining purchase policies for new articles, (5) planning and controlling reconditioning activities, and (6) balancing inventory between depots. Finally, in the fourth block some solutions to those issues are developed. Firstly, problems (2) and (3) are addressed through the comparative analysis of alternative strategies for controlling cycle time and return rate. Secondly, a methodology for calculating the required fleet size is elaborated (problem (1)).
This methodology is valid for different configurations of the physical flows in the reuse CLSC. Likewise, some directions are pointed out for the further development of a similar method for defining purchase policies for new articles (problem (4)). The second main contribution of this dissertation is embedded in the solutions part (block 4) of the conceptual framework and comprises a two-level decision problem integrating two mixed-integer linear programming (MILP) models, formulated and solved to optimality using AIMMS as the modelling language, CPLEX as the solver, and Excel spreadsheets for data input and output presentation. The results are analysed to measure, in a client-supplier system, the economic impact of two alternative control strategies (recovery policies) in the context of reuse. In addition, the models support decision-making regarding the selection of the appropriate recovery policy given the characteristics of the demand pattern and the structure of the relevant costs in the system. The triangulation of methods used in this thesis has made it possible to address the same research topic with different approaches, thus strengthening the robustness of the results obtained.
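As a hedged illustration of the fleet-sizing question (problem (1)), a back-of-the-envelope calculation relates daily demand, cycle time and return rate. The formula and figures below are an assumption chosen for illustration, not the dissertation's methodology.

```python
import math

# Rough fleet-sizing sketch for a reuse CLSC: the fleet must cover the
# articles in circulation during one cycle, inflated for imperfect
# returns (shrinkage) and a safety margin. All figures are illustrative.
def required_fleet(demand_per_day, cycle_days, return_rate, safety=0.1):
    """Articles issued per day * days in circulation, corrected for the
    fraction that actually comes back, plus a safety margin."""
    in_circulation = demand_per_day * cycle_days
    return math.ceil(in_circulation / return_rate * (1 + safety))

# 200 RTIs issued daily, a 12-day cycle, and 95% returned each cycle:
print(required_fleet(200, 12, 0.95))
```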
Abstract:
Background: This study examined the daily surgical scheduling problem in a teaching hospital. The problem concerns the use of multiple operating rooms and different types of surgeons in a typical surgical day with deterministic operation durations (pre-incision, incision, and post-incision times). Teaching hospitals play a key role in the health-care system; however, existing models assume that the duration of surgery is independent of the surgeon's skills. This problem has not been properly addressed in other studies. We analyze the case of a Spanish public hospital, in which continuous pressure and budget cuts demand a more efficient use of resources. Methods: To obtain an optimal solution for this problem, we developed a mixed-integer programming model and a user-friendly interface that facilitate the scheduling of planned operations for the following surgical day. We also implemented a simulation model to assist the evaluation of different dispatching policies for surgeries and surgeons. The typical aspects we took into account were the type of surgeon, potential overtime, surgeon idle time, and the use of operating rooms. Results: It is necessary to consider the expertise of a given surgeon when formulating a schedule: such skill can decrease the probability of delays that could affect subsequent surgeries or cause cancellation of the final surgery. We obtained optimal solutions for a set of instances built from surgical timing data collected at a Spanish public hospital. Conclusions: We developed a computer-aided framework with a user-friendly interface, for use by a surgical manager, that presents a 3-D simulation of the problem. Additionally, we obtained an efficient formulation for this complex problem.
However, the spread of this kind of operations research in Spanish public hospitals will take a long time, since there is little awareness of the beneficial techniques and possibilities that operational research can offer the health-care system.
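The core decision the mixed-integer model optimises (assigning surgeries to operating rooms so that the day finishes as early as possible) can be illustrated with a toy exhaustive search. Durations and the room count below are invented, and a real instance of any size would need the MILP solver rather than enumeration.

```python
# Toy exhaustive version of the room-assignment decision: place surgeries
# (total durations including pre- and post-incision time, in minutes)
# into operating rooms so the latest room finishes as early as possible.
def best_schedule(durations, n_rooms):
    best_makespan, best_assign = float("inf"), None

    def assign(i, loads, current):
        nonlocal best_makespan, best_assign
        if i == len(durations):
            if max(loads) < best_makespan:
                best_makespan, best_assign = max(loads), list(current)
            return
        for r in range(n_rooms):   # try every room for surgery i
            loads[r] += durations[i]
            current.append(r)
            assign(i + 1, loads, current)
            current.pop()
            loads[r] -= durations[i]

    assign(0, [0] * n_rooms, [])
    return best_makespan, best_assign

makespan, rooms = best_schedule([90, 120, 60, 45, 150], 2)
print(makespan, rooms)   # optimal split: {150, 90} vs {120, 60, 45}
```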
Abstract:
Mediterranean forests have undergone multiple changes in recent decades (in both climate and land use), leading to shifts in species distributions. The predicted rise in mean temperatures, together with greater intra- and inter-annual variability in the occurrence of extreme events and natural disturbances (such as prolonged droughts, cold or heat waves, forest fires or windstorms), can significantly damage regeneration, even killing it, and can play a decisive role in species composition and forest dynamics. The ecological amplitude of many forest species may be affected, so changes in their current regeneration niches are expected. However, the latitudinal migration of species in search of better conditions could be an oversimplified explanation of a much more complex process of interaction between temperature and precipitation that would affect each species differently. In this respect, both a species' capacity to adapt to environmental stress and its ability to compete for limited resources could lead to variations within a community. The physiological and morphological characteristics of each species are strongly related to where each one can emerge, which species can coexist, and how they respond to environmental conditions. Knowledge of the different ecophysiological responses observed under environmental change can therefore be fundamental for predicting variations in species distribution, community composition and forest productivity under global change.
In this thesis we investigate the degree of tolerance and sensitivity that each of the three study species coexisting in the interior of the Iberian Peninsula (Pinus pinea, Quercus ilex and Juniperus oxycedrus) shows towards the abiotic stress factors typical of the Mediterranean region. Our work is based on defining the optimal physiological niche for the regeneration of each species through in-depth investigation of the effects of drought, temperature and the light environment. To this end, we developed a model to predict the carbon assimilation rate, which allowed us to identify the optimal environmental conditions in which the regeneration of each species could establish most easily. To support this work, and with the aim of studying the effect of drought at the whole-plant level, we carried out a parallel greenhouse experiment in which two watering regimes were applied to study the physiological and morphological characteristics of each species, above all root and stem growth, and to relate them to the species' different water-use strategies. Finally, we studied the patterns of cold acclimation and deacclimation of each species, identifying the periods of frost sensitivity as well as the bottlenecks where competition between species could arise. Although stone pine has been the target species of management in these stands for centuries, it is currently in the most unfavourable position to face global change, presenting the narrowest physiological niche of the three species. Holm oak, however, turned out to be the species best qualified to face this change, followed closely by prickly juniper. Our results suggest a possible expansion of the distribution range of holm oak, an increase in the presence of prickly juniper and a progressive decline of stone pine in these stands in the medium term.
ABSTRACT Mediterranean forests have undergone multiple changes over the last decades (in both climate and land use), which have led to variations in the distribution of species. The expected increase in mean annual temperature, together with the greater inter- and intra-annual variability in the occurrence of extreme events and disturbances (such as prolonged drought periods, cold or heat waves, wildfires or strong winds), can significantly damage natural regeneration, even causing death, playing a decisive role in species composition and forest dynamics. The ecological amplitude of many species can be affected in such a way that changes in their current regeneration niches are expected. However, the forecasted poleward migration of species seeking better conditions could be an oversimplification of what is a more complex phenomenon of interactions between temperature and precipitation that would affect different species in different ways. In this regard, either the ability of a single species in a mixed forest to adapt to environmental stresses or its ability to compete for limited resources could lead to variations within a community. The ecophysiological and morphological traits specific to each species are strongly related to the place where each species can emerge, which species can coexist, and how they respond to environmental conditions. In this regard, an understanding of the ecophysiological responses observed under changing environmental conditions can be essential for predicting variations in species distribution, community composition, and forest productivity in the context of global change. In this thesis we investigated the degree of tolerance and sensitivity that each of the three studied species, co-occurring in the central Iberian Peninsula (Pinus pinea, Quercus ilex and Juniperus oxycedrus), shows towards the typical abiotic stress factors of the Mediterranean region.
Our work is based on defining the optimal physiological niche for the regeneration of each species through in-depth research on the effects of drought, temperature and the light environment. For this purpose, we developed a model to predict the carbon assimilation rate, which allowed us to identify the optimal environmental conditions in which regeneration of each species could establish itself most easily. To obtain a better understanding of the effect of low temperature on regeneration, we studied the cold acclimation and deacclimation patterns of each species, identifying periods of frost sensitivity as well as bottlenecks where competition between species can arise. Finally, to support our results on the effect of water availability, we conducted a greenhouse experiment with a view to studying the drought effect at the whole-plant level. Here, two watering regimes were applied in order to study the physiological and morphological traits of each species, mainly at the level of the root system and stem growth, and so relate them to the different water-use strategies of the species. Despite the fact that stone pine has been the target species for centuries, nowadays this species is in the most unfavourable position to cope with climate change. Holm oak, however, proved to be the species best adapted to tolerate the predicted changes, followed closely by prickly juniper. Our results suggest a feasible expansion of the distribution range of holm oak, an increase in the presence of prickly juniper and a progressive decrease of stone pine in the medium term in these stone pine-holm oak-prickly juniper mixed forests.
Abstract:
Embedded systems are increasingly common and complex, so finding safe, effective and inexpensive software development processes aimed specifically at this class of systems is more necessary than ever. Unlike until recently, technological advances in microprocessors now allow the development of hardware with more than enough performance to run several software systems on a single machine. Moreover, there are embedded systems with safety requirements on whose correct operation many lives and/or large economic investments depend. These software systems are designed and implemented in accordance with very strict and demanding software development standards; in some cases, certification of the software may also be required. For such cases, mixed-criticality systems can be a very valuable alternative. In this class of systems, applications with different criticality levels run on the same computer. However, it is often necessary to certify the entire system at the criticality level of the most critical application, which makes costs soar. Virtualization has been put forward as a very promising technology for containing those costs. This technology allows a set of virtual machines, or partitions, to execute applications with very high levels of both temporal and spatial isolation, which in turn allows each partition to be certified independently. Developing partitioned mixed-criticality systems requires updating traditional software development models, since these cover neither the new activities nor the new roles required in the development of such systems.
For example, the system integrator must define the partitions, and the application developer must take into account the characteristics of the partition in which the application will run. Traditionally, the V-model has been particularly relevant in embedded systems development; it has therefore been adapted to cover scenarios such as the parallel development of applications or the incorporation of a new partition into an existing system. The goal of this PhD thesis is to improve the current technology for developing partitioned mixed-criticality systems. To this end, a framework specifically aimed at facilitating and improving the development processes of this class of systems has been designed and implemented. In particular, an algorithm that generates the system partitioning automatically has been created. The proposed development framework integrates all the activities needed to develop a partitioned system, including the new roles and activities mentioned above. Furthermore, the design of the framework is based on Model-Driven Engineering, which promotes the use of models as fundamental elements in the development process. The framework thus provides the tools needed to model and partition the system, as well as to validate the results and generate the artifacts needed to compile, build and deploy it. In addition, extensibility and integration with validation tools were key factors in the design of the framework: support for new non-functional requirements, for the generation of new artifacts such as documentation, or for different programming languages, among others, can be incorporated. A key part of the framework is the partitioning algorithm.
This algorithm has been designed to be independent of the applications' requirements and to allow the system integrator to implement new system requirements. To achieve this independence, partitioning constraints have been defined; the algorithm guarantees that these constraints will hold in the partitioned system that results from its execution. The partitioning constraints have been designed with sufficient expressive power so that a small set of them can express most common non-functional requirements. Constraints can be defined manually by the system integrator or generated automatically by a tool from the functional and non-functional requirements of an application. The partitioning algorithm takes the system models and the partitioning constraints as its inputs. As a result of its execution, it generates a deployment model defining the partitions needed for the system partitioning; each partition in turn defines which applications must run in it and the resources the partition needs to run correctly. The partitioning problem and the partitioning constraints are modelled mathematically as coloured graphs, in which a proper vertex colouring represents a correct system partitioning. The algorithm has also been designed so that, if necessary, partitionings alternative to the one initially proposed can be obtained. The development framework, including the partitioning algorithm, has been successfully tested on two industrial use cases: the UPMSat-2 satellite and a demonstrator of a wind turbine control system. In addition, the algorithm has been validated by running numerous synthetic scenarios, including some very complex ones with more than 500 applications.
ABSTRACT The importance of embedded software is growing, as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements; their failure may result in personal injury or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards. In some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that can be executed with temporal and spatial isolation. In this way, applications can be developed and certified independently. The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development. It can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a new partition to an existing system.
The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitionings, validating the system and generating final artifacts. The framework has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for new final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm. It has been designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. They have sufficient expressive capacity to state the most common constraints, and they can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm uses system models and partitioning constraints as its inputs. It generates a deployment model that is composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph. A valid partitioning is a proper vertex coloring.
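The graph-coloring formulation above can be illustrated with a minimal sketch: applications are vertices, a separation constraint between two applications is an edge, and a partition corresponds to a color. The greedy ordering, the constraint set and all names below are illustrative assumptions, not the thesis algorithm itself.

```python
from collections import defaultdict

def partition(apps, separation_constraints):
    """Assign each application the lowest partition index not used by
    any application it must be separated from (greedy proper coloring).
    Illustrative sketch only; the actual algorithm supports richer
    constraint types and resource assignment."""
    neighbours = defaultdict(set)
    for a, b in separation_constraints:
        neighbours[a].add(b)
        neighbours[b].add(a)
    colour = {}
    for app in apps:
        used = {colour[n] for n in neighbours[app] if n in colour}
        c = 0
        while c in used:
            c += 1
        colour[app] = c
    # Deployment model: each partition lists its allocated applications.
    partitions = defaultdict(list)
    for app, c in colour.items():
        partitions[c].append(app)
    return dict(partitions)

# Hypothetical example: a logger that must be isolated from two
# critical applications ends up in its own partition.
deployment = partition(
    ["att_ctrl", "telemetry", "logger"],
    [("att_ctrl", "logger"), ("telemetry", "logger")],
)
```

Because no edge joins `att_ctrl` and `telemetry`, both can share partition 0, while `logger` is forced into partition 1; the result is a proper coloring, i.e. a valid partitioning in the sense of the abstract.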
A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has been successfully validated by using a large number of synthetic loads, including complex scenarios with more than 500 applications.
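The ability to produce alternative partitionings can also be pictured in graph-coloring terms: every proper coloring of the constraint graph is a valid alternative. The brute-force enumerator below is a sketch under that assumption (the thesis algorithm is certainly more efficient) and uses made-up application names.

```python
from itertools import product

def proper_colourings(apps, edges, n_partitions):
    """Enumerate every assignment of applications to at most
    n_partitions partitions that respects the separation constraints
    (edges). Each yielded dict is one alternative valid partitioning.
    Exhaustive search, for illustration only."""
    for assignment in product(range(n_partitions), repeat=len(apps)):
        colour = dict(zip(apps, assignment))
        if all(colour[a] != colour[b] for a, b in edges):
            yield colour

# Two applications that must be separated, two partitions available:
# the two alternatives are the two ways of assigning them apart.
alternatives = list(proper_colourings(["a", "b"], [("a", "b")], 2))
```

If the integrator rejects the first partitioning, the next proper coloring in the sequence can be offered instead, which is the behaviour the abstract describes.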