943 results for Model-driven design


Relevance:

80.00%

Publisher:

Abstract:

Usability plays an important role in satisfying users' needs. There are many recommendations in the HCI literature on how to improve software usability. Our research focuses on those recommendations that affect the system architecture rather than just the interface. However, improving software usability in aspects that affect the architecture increases the analyst's workload and the complexity of development. This paper proposes a solution based on model-driven development. We propose representing functional usability mechanisms abstractly by means of conceptual primitives. The analyst uses these primitives to incorporate functional usability features at the early stages of the development process. Following the model-driven development paradigm, these features are then automatically transformed through the subsequent development steps, a process that is hidden from the analyst.
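
A minimal sketch of the idea in Python rather than a real MDD toolchain (names such as UsabilityPrimitive and SystemStatusFeedback are illustrative, not the paper's actual primitives): the analyst attaches an abstract usability primitive to an analysis-level element, and a transformation expands it into design-level artifacts without the analyst's intervention.

from dataclasses import dataclass

@dataclass
class UsabilityPrimitive:
    name: str      # e.g. a system-status-feedback mechanism
    params: dict   # analyst-level configuration only

@dataclass
class UseCase:
    name: str
    primitives: list

def expand(use_case):
    # Stand-in for the automatic model transformation: each primitive
    # yields design artifacts that the analyst never edits by hand.
    artifacts = []
    for p in use_case.primitives:
        artifacts.append(f"component {p.name}Handler({p.params})")
        artifacts.append(f"wire {use_case.name} -> {p.name}Handler")
    return artifacts

uc = UseCase("TransferFunds",
             [UsabilityPrimitive("SystemStatusFeedback", {"delay_ms": 500})])
print("\n".join(expand(uc)))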

Relevance:

80.00%

Publisher:

Abstract:

The characterisation of mineral texture has been a major concern for process mineralogists, as the liberation characteristics of ores are intimately related to mineralogical texture. While a great effort has been made to characterise texture automatically in unbroken ores, the characterisation of textural attributes in mineral particles is usually descriptive. However, the quantitative characterisation of texture in mineral particles is essential to improve and predict the performance of minerallurgical processes (i.e. all the processes involved in the liberation and separation of the mineral of interest) and to achieve a more accurate geometallurgical model. Driven by this need for a more complete characterisation of textural attributes in mineral particles, a methodology has recently been developed to automatically characterise the type of intergrowth between mineral phases within particles by means of digital image analysis. In this methodology, a set of minerallurgical indices has been developed to quantify different mineralogical features and to identify the intergrowth pattern by discriminant analysis. The paper shows the application of the methodology to the textural characterisation of chalcopyrite in the rougher concentrate of the Kansanshi copper mine (Zambia). In this sample, the variety of intergrowth patterns of chalcopyrite with the other minerals has been used to illustrate the methodology. The results obtained show that the method identifies the intergrowth type and provides quantitative information for a complete and detailed mineralogical characterisation. Therefore, the use of this methodology as a routine tool in automated mineralogy would contribute to a better understanding of ore behaviour during liberation and separation processes.
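
The pipeline can be pictured with a hedged sketch (the two features below are illustrative stand-ins, not the published minerallurgical indices): quantitative features are computed per particle from segmented phase masks, and a discriminant classifier trained on particles of known intergrowth type identifies the pattern.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def texture_indices(phase, particle):
    # phase/particle: boolean masks from a classified particle image
    area_fraction = phase.sum() / particle.sum()
    pad = np.pad(phase, 1)
    interior = pad[:-2, 1:-1] & pad[2:, 1:-1] & pad[1:-1, :-2] & pad[1:-1, 2:]
    boundary = phase & ~interior              # crude interface/perimeter proxy
    return [area_fraction, boundary.sum() / max(phase.sum(), 1)]

# Synthetic training data standing in for particles of known intergrowth type
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.7, 0.2], 0.05, (20, 2)),    # "simple" intergrowth
               rng.normal([0.3, 0.6], 0.05, (20, 2))])   # "stockwork-like"
y = ["simple"] * 20 + ["stockwork"] * 20
clf = LinearDiscriminantAnalysis().fit(X, y)

particle = np.ones((10, 10), bool)
phase = np.zeros((10, 10), bool)
phase[2:8, 2:8] = True
print(clf.predict([texture_indices(phase, particle)]))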

Relevance:

80.00%

Publisher:

Abstract:

Reducing the energy consumption for computation and cooling in servers is a major challenge considering the data center energy costs today. To ensure energy-efficient operation of servers in data centers, the relationship among computational power, temperature, leakage, and cooling power needs to be analyzed. By means of an innovative setup that enables monitoring and controlling the computing and cooling power consumption separately on a commercial enterprise server, this paper studies temperature-leakage-energy tradeoffs, obtaining an empirical model for the leakage component. Using this model, we design a controller that continuously seeks and settles at the optimal fan speed to minimize the energy consumption for a given workload. We run a customized dynamic load-synthesis tool to stress the system. Our proposed cooling controller achieves up to 9% energy savings and a 30 W reduction in peak power in comparison to the default cooling control scheme.
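
The trade-off the controller exploits can be reproduced with a toy model (all coefficients below are invented for illustration; the paper fits its leakage model empirically): leakage grows roughly exponentially with temperature, fan power grows roughly cubically with speed, so total power has an interior minimum at some fan speed.

import numpy as np

P_COMPUTE = 120.0    # dynamic power of the workload, W (assumed)
T_AMBIENT = 25.0     # inlet temperature, degC

def steady_temp(rpm):
    # Convective thermal resistance drops as the fan speeds up
    r_th = 0.45 / (rpm / 1000.0) ** 0.8
    return T_AMBIENT + r_th * P_COMPUTE

def leakage(temp):
    # Exponential temperature dependence, the shape of an empirical fit
    return 8.0 * np.exp(0.02 * (temp - 40.0))

def fan_power(rpm):
    # Fan affinity law: power scales with the cube of speed
    return 6.0 * (rpm / 5000.0) ** 3

def total_power(rpm):
    return P_COMPUTE + leakage(steady_temp(rpm)) + fan_power(rpm)

speeds = np.linspace(1500, 6000, 200)
best = speeds[np.argmin([total_power(s) for s in speeds])]
print(f"optimal fan speed ~ {best:.0f} rpm, total power {total_power(best):.1f} W")

A controller like the one proposed would seek this minimum online, nudging the fan speed and observing measured power rather than evaluating a closed-form model.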

Relevance:

80.00%

Publisher:

Abstract:

There is currently great expectation surrounding the introduction of new tools and methods for software product development, which in the near future should allow an engineering approach to the software production process. The new methodologies now emerging take an integral view of the problem, covering every stage of the production scheme. However, the degree of automation achieved in the system construction process is still very low and concentrated on the last phases of the software life cycle, so the cost reduction obtained is insignificant and, more importantly, the quality of the resulting software products is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology follows the CASE development cycle model, comprising the analysis, design and testing phases, and its field of application is information systems. The basic principles on which the CASE methodology rests are established first. Since the methodology starts by fixing the objectives of the company that requires an information system, techniques are then employed to gather and validate information, which also provide an easy communication language between end users and developers. These same techniques specify all the system requirements completely, consistently and unambiguously. A set of techniques and algorithms is then presented to automate, starting from the system requirements specification, the logical design of both the Process Model and the Data Model, each validated against the prior requirements specification. Finally, formal procedures are defined that state which activities make up the construction process and how to carry them out, thus achieving integrity across the different stages of the development process.
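
As a hedged illustration of the kind of automation the thesis describes (the derivation rules are reduced to a single 1:N rule, and the entity names are invented), a validated requirements specification listing entities, attributes and relationships can be mapped mechanically onto a logical data model:

spec = {
    "entities": {"Customer": ["id", "name"], "Order": ["id", "date"]},
    "relations": [("Customer", "Order", "1:N")],   # one customer, many orders
}

def to_relational(spec):
    tables = {e: list(attrs) for e, attrs in spec["entities"].items()}
    for parent, child, kind in spec["relations"]:
        if kind == "1:N":   # rule: the foreign key goes on the "many" side
            tables[child].append(f"{parent.lower()}_id (FK -> {parent}.id)")
    return tables

for name, cols in to_relational(spec).items():
    print(name, cols)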

Relevance:

80.00%

Publisher:

Abstract:

Nowadays, organizations store huge amounts of data in databases that contain invaluable information. Decision Support Systems (DSS) provide the support needed to manage this information and to plan the medium- and long-term modus operandi of these organizations. Despite the growing importance of these systems, most proposals do not cover their complete development, limiting themselves mostly to isolated parts, which often leads to serious integration problems. Hence, methodologies comprising models and processes that consider every component are needed. This paper aims to fill this void by proposing an approach for developing spatial DSS driven by the development of their associated Data Warehouse (DW), without neglecting the other components. To frame the proposal, different software engineering approaches (the Software Engineering Process and Model-Driven Architecture) are combined with a database development methodology, both adapted to the peculiarities of DWs. Finally, an example illustrates the proposal.
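
A hedged sketch of the model-driven flavour of the proposal (the schema and naming are invented): the DW is described as a platform-independent star-schema model, one dimension of which is spatial, and platform-specific DDL is generated from it.

from dataclasses import dataclass

@dataclass
class Dimension:
    name: str
    attributes: list
    spatial: bool = False   # spatial dimensions carry a geometry column

fact = ("sales", ["amount", "units"])
dims = [Dimension("time", ["day", "month", "year"]),
        Dimension("store", ["store_name"], spatial=True)]

def ddl(fact, dims):
    out = []
    for d in dims:
        cols = [f"{d.name}_id INTEGER PRIMARY KEY"]
        cols += [f"{a} TEXT" for a in d.attributes]
        if d.spatial:
            cols.append("geom GEOMETRY")   # the spatial attribute
        out.append(f"CREATE TABLE dim_{d.name} ({', '.join(cols)});")
    fcols = [f"{d.name}_id INTEGER REFERENCES dim_{d.name}" for d in dims]
    fcols += [f"{m} REAL" for m in fact[1]]
    out.append(f"CREATE TABLE fact_{fact[0]} ({', '.join(fcols)});")
    return "\n".join(out)

print(ddl(fact, dims))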

Relevance:

80.00%

Publisher:

Abstract:

The development of mixed-criticality virtualized multicore systems poses new challenges that are the subject of active research. There is an additional complexity: a set of partitions must now be identified, and applications allocated to them. A number of issues have to be considered in this task, such as the criticality level of the application, security and dependability requirements, the operating system used by the application, the granularity of the timing requirements, specific hardware needs, etc. The MultiPARTES [6] toolset relies on Model-Driven Engineering (MDE) [12], which is a suitable approach in this setting. This paper describes the support provided for automatic generation of system partitionings and for toolset extensibility.
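
A first-cut illustration of the allocation problem (criteria reduced to two of those listed; application names invented): applications that differ in criticality level or required operating system cannot share a partition, so grouping by those attributes yields an initial partitioning.

from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class App:
    name: str
    criticality: str   # e.g. a DO-178-style level
    os: str            # operating system required by the application

apps = [App("flight_mgmt", "A", "rtems"), App("logger", "D", "linux"),
        App("health_mon", "A", "rtems"), App("gui", "D", "linux")]

partitions = defaultdict(list)
for app in apps:
    partitions[(app.criticality, app.os)].append(app.name)

for i, ((crit, os_), names) in enumerate(partitions.items()):
    print(f"partition {i}: criticality={crit} os={os_} apps={names}")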

Relevance:

80.00%

Publisher:

Abstract:

Over the last few decades, ecological practice has come to acknowledge the need to study the multiple systems of interaction between the human being, as a living species, and its environment. Spatial entities such as the geographic notion of landscape have been used to delimit the territorial systems operated by society and to describe in detail specific fields of its physical, biological and cultural action. Ecology has thus addressed the scientific knowledge of the territory as a human settlement, tracking its spatial patterns and analysing its complex functional structure. In this context, the transfer of tools and instruments from the field of ecology to that of design has a tradition going back more than five decades. Increasingly often, design makes use of parameters, inventories, formulas, indicators and technologies to give an environmentally sound response to boundary conditions: for instance, taking advantage of the local climate to optimise energy consumption, or proposing land uses that avoid disturbing valuable ecosystems. Yet in the present day some voices have arisen that, against the uncontested domination of purely deterministic approaches, are trying to draw attention to the fact that the principles of ecological thought go beyond the mere quantitative control of biophysical processes. They point out that, in the early 20th century, ethology proved that the human being, as a conscious entity, invests itself in a relationship of intimacy with its environment that surpasses such perspectives: through the correspondences between perception and signification, between the physical and the psychological, between inside and outside, people embrace the entirety of their surroundings in an act of deep affective conciliation. It is on this link of intimacy that human acceptance of the built environment fully and unquestionably depends. Through the notion of environment [Umwelt] it can be shown that the relationship between the human being and its environment is inseparable, bidirectional and coordinated; therefore, from a coherent position, the experience of space can be examined through the reciprocity constituted, in continuity, by person and place. Thus, the main goal of this thesis is to explore and acknowledge, from the standpoint of design, the meaning and influence of the environmental experience of built space in human life. It is extremely likely that many of the issues behind mankind's alienation from the transformed landscapes of the present day arise from the fact that both the intensity of human perception and experience and the interpretive capacity of its cultural products, including architecture, have been greatly reduced. Facing this issue, the research takes as its hypothesis the opportunity offered by ecological thought to reformulate aesthetic experience as an act of knowledge: an event where physical encounter takes place and meanings are constructed, where social values are sanctioned and the path towards the future is drawn.
It should be noted that this doctoral thesis began in the Landscape Laboratory of the Technical University of Madrid Cultural Landscape Research Group (GIPC-UPM), led by Concha Lapayese and Darío Gazapo, and it has therefore adopted for its state of the art the main concepts and ideas that have been orienting the group's theoretical and practical work for years: the understanding of landscape as an event; the oscillation of interpretation between a specific and a generic landscape in a globalised world; the acknowledgement of the aesthetic experience of landscape as a way of acquiring social awareness; and, all in all, a vindication of interiority in contemporary design. The research delves into a line of opportunity that opens up when promoting what has been termed knowledge through the senses as an environmental strategy that makes it possible to counter myths deeply rooted in social structures. The first step along this path is an ecological exploration of the contribution of the aesthetic experience; that is, its consideration as a form of specific knowledge. It is pertinent to further the idea of immersion in the landscape as an experiential, sensual and corporeal phenomenon and, from that point, to face the issue of the social acceptance of what is new and transformed according to the present moment. The exploration of affectivity in the environment is not, in any case, a new topic. Without aspiring to historiography, two moments of the 20th century concentrate the interest of this research. The first corresponds essentially to the second decade of the century, in relation to a series of influences that, arising from scientific progress, determined singular adventures of the most experimental art. The second is centred around 1970, a period in which the interest drawn by environmental matters is well known. In both cases, contributions have been studied that reveal crucial concepts in defining the aesthetic experience as an event of acquisition of knowledge through the senses. It is worth stressing in advance the central role that the concept of energy plays in this research, as the title itself underlines. Energy as a material, sensitive reality is the substrate that makes it possible to navigate the principle of epistemological unity underlying ecological thought. The continuous symbolic, physical and metaphorical references to it among the artists studied are not a mere iconographic resource: they keep inherent the principle of environmental continuity within which the human being and the immensity of the cosmos travel indissociably. A unified, consistent discourse on the contributions of the aesthetic experience, addressed as knowledge through the senses, weaves together the historical, conceptual and practical reading of the whole research. With it, a conceptual diagram is constructed, a model of design analysis, gathering together scientific, philosophical and design ideas. In some way, the diagram tries to draw, from the principles of ecological thought, the correlation of continuity that, vacillating, tense, subtle and fragile, shifts incessantly and unresolved between interiority and exteriority.

Relevance:

80.00%

Publisher:

Abstract:

The importance of embedded software is growing, as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing and, as a result, it is now possible to integrate complex systems on a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements, and their failure may result in personal injury or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards; in some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that can be executed with temporal and spatial isolation. In this way, applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a new partition to an existing system. The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitions, validating the system and generating the final artifacts. It has been designed to facilitate its extension and its integration with external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for new final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm, designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. These constraints have sufficient expressive capacity to state the most common requirements, and they can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm takes the system models and the partitioning constraints as its inputs and generates a deployment model composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph, and a valid partitioning is a proper vertex coloring. A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has also been validated by running a large number of synthetic loads, including complex scenarios with more than 500 applications.
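
The graph formulation can be sketched as follows (a greedy colouring is used for brevity; the thesis algorithm is more elaborate and can enumerate alternatives): applications are vertices, each "must not share a partition" constraint is an edge, and every colour class of a proper vertex colouring becomes one partition.

def greedy_partitioning(apps, separated):
    # separated: pairs of applications that a constraint keeps apart
    adj = {a: set() for a in apps}
    for a, b in separated:
        adj[a].add(b)
        adj[b].add(a)
    colour = {}
    for a in apps:   # visiting order could encode integrator priorities
        used = {colour[n] for n in adj[a] if n in colour}
        colour[a] = next(c for c in range(len(apps)) if c not in used)
    partitions = {}
    for a, c in colour.items():
        partitions.setdefault(c, []).append(a)
    return partitions

apps = ["aocs", "payload", "telemetry", "housekeeping"]
separated = [("aocs", "payload"), ("aocs", "telemetry"),
             ("payload", "housekeeping")]
print(greedy_partitioning(apps, separated))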

Relevance:

80.00%

Publisher:

Abstract:

Traditional architectural design and construction procedures have proven historically deficient in process optimization when compared to other typically industrial activities. The constant aspiration to effective industrialization, both in order to reach higher quality levels and to save resources, now has an unbeatable opportunity from the realm of information technology: Building Information Modelling, or BIM. What may initially seem to be merely a certain type of computer program is in reality a "process" concept that subverts many routines common today in architectural design and construction. Including and working with project data from the very beginning to the end of the full life cycle makes it possible to create a dynamic and updatable virtual reality, which in turn enables testing and optimizing the design in all its aspects: before and during execution, as well as throughout its service life. Added to this is the opportunity to transmit the complete project data efficiently to the manufacturing chain, with hardly any loss or rework, which facilitates a truly significant industrialization of building construction. Given a worldwide call for the optimization of resources, along with an undeniable interest in increasing economic benefits by reducing the uncertainty factor of the processes, BIM undoubtedly offers a chance for improvement, as acknowledged by its imminent mandatory implementation on the part of governments (for example, the United Kingdom in 2016 and Spain in 2018). The changes in professional roles and procedures brought about by incorporating BIM are highly significant and will set the course for future graduates of the Architecture, Engineering and Construction (AEC) disciplines. Higher education must respond swiftly to these needs by incorporating this methodology into formal teaching and providing a synergetic vision that draws out the educational benefits underlying the BIM framework itself. In this respect BIM, by gathering the whole data set in a single virtual model, offers a uniquely interesting potential. The three-dimensional reality of the model, continuously developed and updated, provides students with a radically different handling of graphic representation, in which the partial views of sections and plans, so difficult to assimilate at the beginning of university studies, become mere post hoc requests, extracted from the virtual model as needed. Design is always carried out on the single model itself, independently of the working view chosen at any particular moment, with all data and their construction relationships permanently updated and fully coherent. This condensed description of BIM's features already outlines a large part of the educational benefits offered by BIM processes, particularly with regard to integrated design development and information management (including ICT). Equally notable is the ease with which visual understanding is achieved of architectural elements, technical systems, their intrinsic relationships, and construction processes. To this is added the experimental development the BIM platform offers through its collaborative software: simulation of the structural, energy and economic behaviour, among many others, of the virtual model based on the data inherent to the project.
This doctoral dissertation describes a comprehensive study intended to make explicit both the qualities of and possible reservations about the use of BIM processes within the framework of a specific discipline: the teaching of Architecture. To this end, a general literature review on BIM and a specific one on teaching in Architecture have been carried out, and the experiences of different interest groups have been analysed in the specific context of the teaching of Architecture at Universidad Europea de Madrid. The analysis of the benefits of, and reservations about, the use of BIM has been approached through student surveys and interviews with AEC professionals, whether or not they work with BIM. The conclusions of the study allow the synthesis of an implantation of BIM methodology which, for greater clarity and ease of communication and handling, has been set out in an eminently graphic Implementation Framework. It offers guidance on teaching actions for the development of specific competencies, taking advantage of the conceptual flexibility of the curricula within the European Higher Education Area (Bologna Declaration) to incorporate the new teaching tool naturally in the service of the legally established educational objectives. The global approach of the proposed Implementation Framework facilitates the planning of educational actions from an overall perspective: combining one-off and vehicular BIM formats, establishing cross-disciplinary synergies, and harmonizing resources, so that the methodology can benefit both the assimilation of the knowledge and skills established for the degree and the BIM learning flow itself. It likewise reserves, even visually, those areas of knowledge in which, at least in current planning, the inclusion of BIM processes is not considered advantageous over other methodologies, or is even inadequate for the established teaching objectives. It is this last categorization that characterizes the conclusions of this research as a whole, centred on: 1. the unquestionable need to teach BIM concepts and processes from the very early stages of university education in Architecture; 2. the additional educational benefits that BIM brings to the development of the very diverse competencies contemplated in the academic curriculum; and 3. the specific nature of the professional role of the architect, which will demand a careful and balanced implementation of BIM that respects the traditionally effective methodologies of creative development and adds value through a symbiotic reorientation with parametric design and digital fabrication, finally enabling a generative design.

Relevance:

80.00%

Publisher:

Abstract:

Several languages have been proposed for describing networks of systems, whether to help manage or simulate them or to deploy testbeds for testing purposes. However, none is specifically designed to describe honeynets, covering the specific characteristics, in terms of applications and tools, of the honeypot systems that make up the honeynet. In this paper, the requirements for honeynet description are studied and a survey of existing description languages is presented, concluding that CIM (Common Information Model) matches the basic requirements. Thus, a CIM-like technology-independent honeynet description language (TIHDL) is proposed. The language is defined independently of the platform where the honeynet will later be deployed, and it can be translated, using model-driven techniques or other translation mechanisms, into the description languages of honeynet deployment platforms and tools. This approach gives the flexibility to use a combination of heterogeneous deployment platforms. In addition, a flexible virtual honeynet generation tool (HoneyGen), based on the proposed approach and description language and capable of deploying honeynets over the VNX (Virtual Networks over LinuX) and Honeyd platforms, is presented for validation purposes.
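
A hedged sketch of the translation step (the description fields are invented and are not TIHDL syntax; the output approximates a Honeyd-style template): a platform-independent honeynet description is turned into a platform-specific artifact by a simple model-to-text transformation.

honeynet = {
    "honeypots": [
        {"name": "web", "ip": "10.0.0.10", "personality": "Linux 2.6",
         "services": [{"port": 80, "proto": "tcp"}]},
    ],
}

def to_honeyd(desc):
    # Model-to-text transformation targeting one concrete platform
    lines = []
    for hp in desc["honeypots"]:
        lines.append(f'create {hp["name"]}')
        lines.append(f'set {hp["name"]} personality "{hp["personality"]}"')
        for s in hp["services"]:
            lines.append(f'add {hp["name"]} {s["proto"]} port {s["port"]} open')
        lines.append(f'bind {hp["ip"]} {hp["name"]}')
    return "\n".join(lines)

print(to_honeyd(honeynet))

A second generator targeting VNX would consume the same description unchanged, which is what makes the language platform-independent.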

Relevance:

80.00%

Publisher:

Abstract:

The number of traffic accidents in Brazil has been growing over recent decades. One of the main causes of accidents on Brazilian highways is speeding, which increases the likelihood of accident occurrence. The speeds adopted by drivers are also a function of the geometric elements that make up the road (radius, grade, lane width, etc.). A consistent alignment does not violate drivers' expectations and ensures safe operation. Most drivers can perceive coordination failures but, technically, are unaware of their origin. This research analyses the design consistency of a stretch of a multilane Brazilian highway with a high accident rate and a heavy flow of commercial vehicles. The locations with the highest accident occurrence were identified, and speed measurements were carried out to build a model for predicting the operating speed (V85) of the study stretch. With this model, a consistency analysis was performed using the safety-criteria method, which identified two sections with consistency problems. Finally, it was checked whether these sections corresponded to the locations with the most accidents: tangent T5 precedes a curve with a high accident rate (km 511+000), and the location with the highest concentration of accidents (km 514) was rated FAIR.
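
The safety-criteria check can be sketched as follows (the V85 model below is a placeholder, not the regression fitted in the study; the thresholds follow the commonly used 10/20 km/h bands, e.g. Lamm's criterion II): successive elements whose predicted operating speeds differ too much are flagged as consistency problems.

def v85(radius_m):
    # Hypothetical operating-speed model: tighter curves -> lower V85 (km/h)
    return min(110.0, 60.0 + 0.06 * radius_m)

def rate(delta_kmh):
    # Consistency rating from the speed difference between successive elements
    return "GOOD" if delta_kmh <= 10 else "FAIR" if delta_kmh <= 20 else "POOR"

# (element, curve radius in m); None marks a tangent at the desired speed
elements = [("tangent T5", None), ("curve km 511", 350), ("curve km 514", 500)]
speeds = [110.0 if r is None else v85(r) for _, r in elements]

for (name, _), prev, cur in zip(elements[1:], speeds, speeds[1:]):
    print(name, rate(abs(cur - prev)))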

Relevance:

80.00%

Publisher:

Abstract:

Question answering (QA) systems can be regarded as potential successors to traditional Web search engines. To be precise, they must be adapted to specific domains through the use of suitable semantic resources. This adaptation is not a trivial task, since several heterogeneous resources related to a restricted domain must be integrated and incorporated into existing QA systems. This paper presents the Maraqa tool, whose novelty lies in the use of software engineering techniques, such as model-driven development, to automate this process of adaptation to restricted domains. Maraqa has been evaluated through a series of experiments (in the agricultural domain) that demonstrate its viability, improving the precision of the adapted system by 29.5%.

Relevance:

80.00%

Publisher:

Abstract:

Context: Today’s project managers have a myriad of methods to choose from for the development of software applications. However, they lack empirical data about the character of these methods in terms of usefulness, ease of use or compatibility, all of these being relevant variables to assess the developer’s intention to use them. Objective: To compare three methods, each following a different paradigm (Model-Driven, Model-Based and Code-Centric) with respect to their adoption potential by junior software developers engaged in the development of the business layer of a Web 2.0 application. Method: We have conducted a quasi-experiment with 26 graduate students of the University of Alicante. The application developed was a Social Network, which was organized around a fixed set of modules. Three of them, similar in complexity, were used for the experiment. Subjects were asked to use a different method for each module, and then to answer a questionnaire that gathered their perceptions during such use. Results: The results show that the Model-Driven method is regarded as the most useful, although it is also considered the least compatible with previous developers’ experiences. They also show that junior software developers feel comfortable with the use of models, and that they are likely to use them if the models are accompanied by a Model-Driven development environment. Conclusions: Despite their relatively low level of compatibility, Model-Driven development methods seem to show a great potential for adoption. That said, however, further experimentation is needed to make it possible to generalize the results to a different population, different methods, other languages and tools, different domains or different application sizes.

Relevance:

80.00%

Publisher:

Abstract:

In this paper, a new control design method is proposed for stable processes which can be described using Hammerstein-Wiener models. The internal model control (IMC) framework is extended to accommodate multiple IMC controllers, one for each subsystem. The concept of passive systems is used to construct the IMC controllers which approximate the inverses of the subsystems to achieve dynamic control performance. The Passivity Theorem is used to ensure the closed-loop stability.
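
A minimal sketch of the Hammerstein-Wiener structure itself (all coefficients invented): a static input nonlinearity feeds a linear dynamic block whose output passes through a static output nonlinearity. An IMC design in the spirit of the paper would invert the two static maps and approximate the inverse of the linear block, with a separate controller for each subsystem.

import numpy as np

f = lambda u: np.tanh(u)          # input nonlinearity (Hammerstein part)
g = lambda x: x + 0.1 * x ** 3    # output nonlinearity (Wiener part)

def simulate(u_seq, a=0.9, b=0.1):
    # First-order linear block between the two static maps: x+ = a*x + b*f(u)
    x, y = 0.0, []
    for u in u_seq:
        x = a * x + b * f(u)
        y.append(g(x))
    return np.array(y)

print(simulate(np.ones(20)).round(3))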

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to explore clinician reactions to (i) the introduction of routine outcome measures and (ii) the utility of outcomes data in clinical practice. Focus group discussions (n = 34) were conducted with mental health staff (n = 324) at approximately 8 months after the implementation of routine outcome measures. A semi-structured interview schedule was used to collect data on two key issues: reactions to the introduction of outcome measures, and factors influencing the utility of outcomes data in clinical practice. Data from the discussion groups were analysed using content analysis to isolate emerging themes. While the majority of participants endorsed the collection and utilization of outcomes data, many raised questions about the merits of the initiative. Ambivalence, competing work demands, lack of support from senior medical staff, questionable evidence to support the use of outcome measures, and fear of how outcomes data might be used emerged as key issues. At 8 months post-implementation, a significant number of clinical staff remained ambivalent about the benefits of outcome measurement and had not engaged in the process. The shift to a service model driven by outcomes and case-mix data will take time and resources to achieve. Implications for nursing staff are discussed.