12 results for Plant architecture model
at Universidad de Alicante
Abstract:
We begin with an analysis of stop-motion working processes because they help in understanding the different scales in architecture at which physical models become future prototypes of building or landscape infrastructures. Stop motion is a frame-by-frame animation technique for static objects, based on manipulating plasticine figures in fixed settings with changes of light, colour and sound. Just as this technique brings together the best of traditional filmmaking (storyboard, set design, photography, characters, lighting), the animation of interior models synthesises micro-processes of wider significance (rooms with changes in humidity, temperature, ventilation and lighting), incorporating special effects that are digitally processed in post-production. Several room prototypes are built with fixed parameters, such as the size and position of the camera, and variable ones, such as materials, characters and lighting. They represent a miniature world that seeks to offer a sensory and atmospheric approach, analysing the magic and fantasy that Junichirô Tanizaki describes in the half-light of traditional Japanese buildings, and studying the imperfections of the sets that Tim Burton manipulates in his animated films, with a texture that digital technologies cannot match. The aim is to use a micro scale to build interior models in which atmospheric conditions are controlled and reduced, and to collect data that could be applied to a modelling process at an intermediate scale to test building prototypes, as in a wind tunnel; or, finally, at a macro scale, with models of a stretch of coastline or of a river where meteorological phenomena play the leading role, to simulate floods and design future prevention and safety measures.
Abstract:
Nowadays, data mining is based on low-level specifications of the employed techniques, typically bounded to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Here, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on the analysis problem via conceptual data-mining models instead of low-level programming tasks related to the underlying platform's technical details. These tasks are now entrusted to the model-transformations scaffolding.
Abstract:
Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bounded to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Bearing this situation in mind, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the underlying platform's technical details. These time-consuming tasks are now entrusted to the model-transformations scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time series analysis is required.
Abstract:
In this paper we describe Fénix, a data model for exchanging information between Natural Language Processing applications. The proposed format is intended to be flexible enough to cover both current and future data structures employed in the field of Computational Linguistics. The Fénix architecture is divided into four separate layers: conceptual, logical, persistence and physical. This division provides a simple interface that abstracts users from low-level implementation details, such as the programming languages and data storage employed, allowing them to focus on the concepts and processes to be modelled. The Fénix architecture is accompanied by a set of programming libraries that facilitate access to and manipulation of the structures created in this framework. We also show how this architecture has already been successfully applied in different research projects.
Abstract:
With advances in the synthesis and design of chemical processes, there is an increasing need for more complex mathematical models, accurate and reliable, with which to screen the design alternatives. Despite the wide availability of sophisticated tools for the simulation, optimization and synthesis of chemical processes, the user is frequently interested in using the 'best available model'. However, in practice, these models are usually little more than a black box with a rigid input–output structure. In this paper we propose to tackle all these models using generalized disjunctive programming, capturing the numerical characteristics of each model (in equation form, modular, noisy, etc.) and dealing with each of them according to its individual characteristics. The result is a hybrid modular–equation-based approach that allows complex processes to be synthesized using different models in a robust and reliable way. The capabilities of the proposed approach are discussed with a case study: the design of a utility-system power plant that has been decomposed into its constitutive elements, each treated numerically in a different way. Finally, numerical results and conclusions are presented.
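The core idea of a disjunctive formulation (choose among alternative unit models, each handled by the solver best suited to it) can be illustrated with a toy sketch. This is not the authors' formulation: the two model functions, the bounds and the logic-based enumeration of the disjuncts are all hypothetical, and real GDP solvers handle far richer logic.

```python
from scipy.optimize import minimize_scalar

# Two disjuncts for the same unit: an explicit equation-oriented model
# and a "black-box" modular model (here just another function).
def equation_model(x):      # hypothetical equation-oriented cost model
    return (x - 2.0) ** 2 + 1.0

def modular_model(x):       # hypothetical black-box simulator cost
    return abs(x - 3.0) + 0.5

# Solve the disjunction by enumerating its terms: optimise each branch
# separately over the shared bounds, then keep the cheapest branch.
best = min(
    (minimize_scalar(m, bounds=(0.0, 5.0), method="bounded")
     for m in (equation_model, modular_model)),
    key=lambda r: r.fun,
)
print("selected optimum:", best.x, best.fun)
```

Enumerating disjuncts scales exponentially, which is precisely why logic-based and big-M reformulations matter in practice; the sketch only shows the selection semantics.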
Abstract:
Conceptual frameworks of dryland degradation commonly include ecohydrological feedbacks between landscape spatial organization and resource loss, so that a decreasing cover and size of vegetation patches result in higher water and soil losses, which lead to further vegetation loss. However, the impacts of these feedbacks on dryland dynamics in response to external stress have barely been tested. Using a spatially explicit model, we represented feedbacks between vegetation pattern and landscape resource loss by establishing a negative dependence of plant establishment on the connectivity of runoff-source areas (e.g., bare soils). We assessed the impact of various feedback strengths on the response of dryland ecosystems to changing external conditions. In general, for a given external pressure, these connectivity-mediated feedbacks decrease vegetation cover at equilibrium, which indicates a decrease in ecosystem resistance. Along a gradient of gradually increasing environmental pressure (e.g., aridity), the connectivity-mediated feedbacks decrease the amount of pressure required to cause a critical shift to a degraded state (ecosystem resilience). If environmental conditions improve, these feedbacks increase the pressure release needed to achieve ecosystem recovery (restoration potential). The impact of these feedbacks on dryland response to external stress is markedly non-linear, which stems from the non-linear negative relationship between bare-soil connectivity and vegetation cover. Modelling studies on dryland vegetation dynamics that do not account for the connectivity-mediated feedbacks studied here may overestimate the resistance, resilience and restoration potential of drylands in response to environmental and human pressures. Our results also suggest that changes in vegetation pattern and associated hydrological connectivity may be more informative early-warning indicators of dryland degradation than changes in vegetation cover.
Abstract:
Integration is currently a key factor in intelligent transportation systems (ITS), especially because of the ever-increasing service demands originating from the ITS industry and ITS users. The current ITS landscape is made up of multiple technologies that are tightly coupled, and their interoperability is extremely low, which limits the generation of ITS services. Given this fact, novel information technologies (IT) based on the service-oriented architecture (SOA) paradigm have begun to introduce new ways to address this problem. The SOA paradigm allows the construction of loosely coupled distributed systems that can help to integrate the heterogeneous systems that are part of ITS. In this paper, we focus on developing an SOA-based model for integrating information technologies into ITS to achieve ITS service delivery. To develop our model, the ITS technologies and services involved were identified, catalogued, and decoupled. In doing so, we applied our SOA-based model to integrate all of the ITS technologies and services, ranging from the lowest-level technical components, such as roadside unit as a service (RSUaaS), to the most abstract ITS services that will be offered to ITS users (value-added services). To validate our model, a functionality case study that included all of the components of our model was designed.
Abstract:
The explosive growth of traffic in computer systems has made it clear that traditional control techniques are not adequate to provide system users with fast access to network resources and to prevent unfair use. In this paper, we present a reconfigurable digital hardware implementation of a specific neural model for intrusion detection. It uses a characterization vector for network packets (the intrusion vector), built from information obtained during the access attempt, which is then processed by the system. Our approach is adaptive and detects these intrusions by using a complex artificial-intelligence method known as the multilayer perceptron. The implementation has been developed and tested on reconfigurable hardware (an FPGA) for embedded systems. Finally, the intrusion detection system was tested in a real-world simulation to gauge its effectiveness and real-time response.
Abstract:
In this work, we propose a new methodology for the large-scale optimization and process integration of complex chemical processes that have been simulated using modular chemical process simulators. Units with significant numerical noise or large CPU times are substituted by surrogate models based on Kriging interpolation. Using a degree-of-freedom analysis, some of those units can be aggregated into a single unit to reduce the complexity of the resulting model. As a result, we solve a hybrid simulation–optimization model formed by units from the original flowsheet, Kriging models, and explicit equations. We present a case study of the optimization of a sour water stripping plant in which we simultaneously consider economics, heat integration and environmental impact using the ReCiPe indicator, which incorporates the recent advances made in Life Cycle Assessment (LCA). The optimization strategy guarantees convergence to a local optimum within the tolerance of the numerical noise.
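The surrogate step (replacing a noisy simulator unit by a Kriging interpolant) can be sketched minimally. This is a simple-Kriging-style sketch under assumptions of our own, not the paper's formulation: the Gaussian correlation parameter `theta`, the nugget, and the `unit` function standing in for the expensive simulator call are all illustrative.

```python
import numpy as np

def fit_kriging(X, y, theta=30.0, nugget=1e-6):
    """Simple-Kriging-style surrogate: Gaussian correlation kernel,
    weights obtained by solving the correlation system. The small
    nugget regularizes the solve and absorbs simulator noise."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    R = np.exp(-theta * d2) + nugget * np.eye(len(X))
    w = np.linalg.solve(R, y)
    def predict(Xq):
        d2q = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-theta * d2q) @ w
    return predict

rng = np.random.default_rng(2)
def unit(x):                      # hypothetical noisy flowsheet unit
    return np.sin(3 * x[:, 0]) + 0.01 * rng.normal(size=len(x))

X = rng.random((30, 1))           # sampled simulator evaluations
surrogate = fit_kriging(X, unit(X))
err = np.abs(surrogate(X) - np.sin(3 * X[:, 0]))
print("max error at samples:", err.max())
```

In the methodology above, the optimizer would then call `surrogate` in place of the noisy unit, refitting it as new simulator samples accumulate; production Kriging codes also estimate `theta` by maximum likelihood rather than fixing it.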
Abstract:
In this study, we investigated the cellular and molecular mechanisms that regulate salt acclimation. The main objective was to obtain new insights into the molecular mechanisms that control salt acclimation. Therefore, we carried out a multidisciplinary study using proteomic, transcriptomic, subcellular and physiological techniques. We obtained a Nicotiana tabacum BY-2 cell line acclimated to growth at 258 mM NaCl as a model for this study. The proteomic and transcriptomic data indicate that the molecular response to stress (chaperones, defence proteins, etc.) is highly induced in these salt-acclimated cells. The subcellular results show that salt induces sodium compartmentalization in the cell vacuoles, which seems to be mediated by vesicle trafficking in tobacco salt-acclimated cells. Our results demonstrate that abscisic acid (ABA) and proline metabolism are crucial in the cellular signalling of salt acclimation, probably regulating reactive oxygen species (ROS) production in the mitochondria. ROS may act as a retrograde signal, regulating the cell response. The network of the endoplasmic reticulum and Golgi apparatus is highly altered in salt-acclimated cells. The molecular and subcellular analyses suggest that the unfolded protein response is induced in salt-acclimated cells. Finally, we propose that this mechanism may mediate cell death in salt-acclimated cells.
Abstract:
We present and evaluate a novel supervised recurrent neural network architecture, the SARASOM, based on the associative self-organizing map. The performance of the SARASOM is evaluated and compared with the Elman network as well as with a hidden Markov model (HMM) in a number of prediction tasks using sequences of letters, including some experiments with a reduced lexicon of 15 words. The results were very encouraging, with the SARASOM learning better and achieving higher accuracy than both the Elman network and the HMM.
Abstract:
Most organizations store their historical business information in data warehouses, which are queried through online analytical processing (OLAP) tools to support strategic decisions. This information has to be properly protected against unauthorized access; nevertheless, there is a great number of legacy OLAP applications that were developed without considering security aspects, or in which security was incorporated only after the system was implemented. This work defines a reverse engineering process that allows us to obtain the conceptual model corresponding to a legacy OLAP application, and also analyses and represents any security aspects that may have been established. This process has been aligned with a model-driven architecture for developing secure OLAP applications by defining the transformations needed to apply it automatically. Once the conceptual model has been extracted, it can easily be modified and improved with security, and automatically transformed to generate the new implementation.