904 results for experimental approach


Relevance:

30.00%

Publisher:

Abstract:

El objetivo principal que este trabajo ha perseguido tiene que ver, primero, con reconocer la arquitectura como una disciplina intensamente ligada con la realidad (espacial, constructiva, económica) para después reivindicar una docencia de la misma que busque trabar lazos más estrechos con la experiencia directa del aprendizaje. A lo largo de la primera parte del trabajo se propone un enfoque que identifica como próximas las bases del llamado “aprendizaje experimental” con la forma de pensar que trata de transmitirse respecto al concepto de “proyectar” en nuestras escuelas. Acordada esta relación de proximidad entre el aprendizaje experimental y el aprendizaje de proyectos se da un paso más allá: la bibliografía propia de las ciencias del aprendizaje señala los rasgos que caracterizan a los espacios donde tiene lugar este aprendizaje experimental. Esta información se ha confrontado con seis ejemplos en el panorama contemporáneo de las escuelas de arquitectura que, pensábamos, podían responder a la definición de “espacios de aprendizaje experimental”. Una vez descritos se ha hecho una lectura crítica de cada uno respecto a las características descritas por las ciencias del aprendizaje. Como resultado de estas lecturas críticas podemos concluir una serie de puntos que pueden caracterizar al espacio de aprendizaje experimental en la escuela de arquitectura de cara al futuro. The main objective of this work has been, first, to recognize architecture as a discipline closely tied to reality (spatial, constructive, economic) and then to advocate a way of teaching it that builds closer links with the direct experience of learning. The first part of the work proposes an approach that identifies the foundations of so-called “experiential learning” as close to the way of thinking we try to convey about the concept of “proyectar” (designing) in our schools. Once this proximity between experiential learning and design-project learning is established, the work goes a step further: the learning-sciences literature identifies the features that characterize the spaces where experiential learning takes place. This information has been confronted with six contemporary examples of architecture schools that, we thought, could meet the definition of “experiential learning spaces”. After describing them, each has been read critically against the characteristics described by the learning sciences. As a result of these critical readings, we can conclude a series of points that may characterize the experiential learning space in the architecture school of the future.

Relevance:

30.00%

Publisher:

Abstract:

The current research aims to theoretically analyse and evaluate a simple, self-manufactured subsurface drip irrigation (SDI) emitter designed to avoid root and soil intrusion. It was composed of three concentric cylindrical elements: an elastic silicone membrane; a polyethylene tube with two holes drilled in its wall for water discharge; and a polyvinyl chloride protective sleeve wrapping the other elements. The discharge of the emitter depends on the change in the membrane diameter when it is deformed by the water pressure. The study of the operation of this emitter is a new approach that combines mechanical and hydraulic principles. The estimation of the membrane deformation was based on classical mechanical stress theories for composite cylinders, while the hydraulic analysis considered the deformation of the solid under the water-pressure force and the general Darcy–Weisbach head-loss equation. Twenty emitter units with the selected design were handcrafted on a lathe and used in this study. The measured pressure/discharge relationship of the emitters showed good agreement with that calculated by the theoretical approach. The coefficient of variation of the handcrafted emitters was high compared with commercial emitters. Results from field evaluations showed variable values for the relative flow variation, water emission uniformity and relative flow rate coefficients, but no emitter was obstructed. Therefore, the current emitter design could be suitable for SDI following further studies to develop a final prototype.
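For readers who want to experiment with the kind of pressure/discharge relation described above, the sketch below estimates flow through a small passage from the Darcy–Weisbach head-loss equation alone; the geometry, the friction-factor correlations and the omission of the membrane mechanics are illustrative assumptions, not values or methods taken from the paper.

```python
import math

# Illustrative pressure/discharge estimate using the Darcy-Weisbach head-loss equation
# h_f = f * (L/D) * v^2 / (2*g). The passage geometry and flow-regime correlations are
# hypothetical; the membrane deformation that regulates the real emitter is not modelled.

G = 9.81          # gravity, m/s^2
NU = 1.0e-6       # kinematic viscosity of water, m^2/s

def head_loss(q, length, diameter):
    """Darcy-Weisbach head loss (m) for volumetric flow q (m^3/s) in a circular passage."""
    area = math.pi * diameter**2 / 4.0
    v = q / area
    re = v * diameter / NU
    f = 64.0 / re if re < 2300 else 0.316 * re**-0.25   # laminar / Blasius estimate
    return f * (length / diameter) * v**2 / (2.0 * G)

def discharge_for_pressure(head, length, diameter, q_max=1e-5, tol=1e-12):
    """Bisect for the flow whose head loss matches the available pressure head (m)."""
    lo, hi = 1e-12, q_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if head_loss(mid, length, diameter) < head:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    for p_kpa in (50, 100, 150, 200):
        head = p_kpa * 1000.0 / (1000.0 * G)          # pressure head in metres
        q = discharge_for_pressure(head, length=0.05, diameter=0.5e-3)
        print(f"{p_kpa:4d} kPa -> {q * 3.6e6:5.2f} L/h")
```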

Relevance:

30.00%

Publisher:

Abstract:

Two important characteristics of science are reproducibility and clarity. Through rigorous practices, scientists explore aspects of the world that they can reproduce under carefully controlled experimental conditions. Clarity, complementing reproducibility, provides unambiguous descriptions of results in a mechanical or mathematical form. Both pillars depend on well-structured and accurate descriptions of scientific practices, which are normally recorded in experimental protocols, scientific workflows, etc. Here we present SMART Protocols (SP), our ontology-based approach for representing experimental protocols and our contribution to clarity and reproducibility. SP delivers an unambiguous description of the processes by means of which data are produced; by doing so, we argue, it facilitates reproducibility. Moreover, SP is conceived as part of e-science infrastructures. SP results from the analysis of 175 protocols; from this dataset, we extracted common elements. From our analysis, we identified document, workflow and domain-specific aspects in the representation of experimental protocols. The ontology is available at http://purl.org/net/SMARTprotocol
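As an illustration of what an ontology-based protocol description can look like in practice, the sketch below builds a tiny RDF graph with rdflib; the class and property names (sp:ExperimentalProtocol, sp:Step, sp:usesReagent, sp:hasInstruction) are hypothetical placeholders, not the actual SMART Protocols terms, which should be taken from the published ontology.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical class/property names; consult the published ontology at
# http://purl.org/net/SMARTprotocol for the real vocabulary.
SP = Namespace("http://example.org/smartprotocols#")
EX = Namespace("http://example.org/lab/")

g = Graph()
g.bind("sp", SP)

protocol = EX["rna-extraction-v1"]
step1 = EX["rna-extraction-v1/step1"]

g.add((protocol, RDF.type, SP.ExperimentalProtocol))
g.add((protocol, RDFS.label, Literal("RNA extraction protocol, version 1")))
g.add((step1, RDF.type, SP.Step))
g.add((step1, SP.partOf, protocol))
g.add((step1, SP.usesReagent, EX["trizol"]))
g.add((step1, SP.hasInstruction, Literal("Homogenize 50 mg of tissue in 1 mL of reagent.")))

print(g.serialize(format="turtle"))
```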

Relevance:

30.00%

Publisher:

Abstract:

Reproducible research in scientific workflows is often addressed by tracking the provenance of the produced results. While this approach allows inspecting intermediate and final results, improves understanding, and permits replaying a workflow execution, it does not ensure that the computational environment is available for subsequent executions to reproduce the experiment. In this work, we propose describing the resources involved in the execution of an experiment using a set of semantic vocabularies, so as to conserve the computational environment. We define a process for documenting the workflow application, management system, and their dependencies based on 4 domain ontologies. We then conduct an experimental evaluation using a real workflow application on an academic and a public Cloud platform. Results show that our approach can reproduce an equivalent execution environment of a predefined virtual machine image on both computing platforms.
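A minimal, vocabulary-agnostic sketch of the underlying idea follows: capturing a workflow run's computational environment in a machine-readable document so it can later be re-provisioned. The field names and example values are illustrative only and do not correspond to the four domain ontologies used in the paper.

```python
import json
import platform
import sys

# Sketch of recording the execution environment of a workflow run so that an equivalent
# environment can be re-provisioned later. Field names are illustrative, not the paper's
# ontology terms; the workflow and WMS entries are hypothetical examples.
environment = {
    "workflow_application": {"name": "example-workflow", "version": "1.0"},   # hypothetical
    "workflow_management_system": {"name": "example-wms", "version": "4.2"},  # hypothetical
    "operating_system": {"name": platform.system(), "release": platform.release()},
    "python": {"version": platform.python_version(), "executable": sys.executable},
    "software_dependencies": [
        {"name": "numpy", "version": "1.26"},   # illustrative pinned dependency
    ],
    "hardware": {"architecture": platform.machine(), "vcpus": 4, "memory_gb": 8},
}

print(json.dumps(environment, indent=2))
```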

Relevance:

30.00%

Publisher:

Abstract:

Hoy en día, el proceso de un proyecto sostenible persigue realizar edificios de elevadas prestaciones que son energéticamente eficientes, saludables y económicamente viables, utilizando sabiamente recursos renovables para minimizar el impacto sobre el medio ambiente y reduciendo, en lo posible, la demanda de energía, lo que se ha convertido, en la última década, en una prioridad. La Directiva 2002/91/CE "Eficiencia Energética de los Edificios" (y actualizaciones posteriores) ha establecido el marco regulatorio general para el cálculo de los requerimientos energéticos mínimos. Desde esa fecha, el objetivo de cumplir con las nuevas directivas y protocolos ha conducido las políticas energéticas de los distintos países en la misma dirección, centrándose en la necesidad de aumentar la eficiencia energética en los edificios, la adopción de medidas para reducir el consumo y el fomento de la generación de energía a través de fuentes renovables. Los edificios de energía nula o casi nula (ZEB, Zero Energy Buildings, o NZEB, Net Zero Energy Buildings) deberán convertirse en un estándar de la construcción en Europa y, con el fin de equilibrar el consumo de energía, además de reducirlo al mínimo, los edificios necesariamente deberán ser autoproductores de energía. Por esta razón, la envolvente del edificio y en particular las fachadas son importantes para el logro de estos objetivos, y la tecnología fotovoltaica puede tener un papel preponderante en este reto. Para promover el uso de la tecnología fotovoltaica, diferentes programas de investigación internacionales fomentan y apoyan soluciones para favorecer la integración completa de estos sistemas como elementos arquitectónicos y constructivos, los sistemas BIPV (Building Integrated Photovoltaic), sobre todo considerando el próximo futuro hacia edificios NZEB. Se ha constatado en este estudio que todavía hay una falta de información útil disponible sobre los sistemas BIPV, a pesar de que el mercado ofrece una interesante gama de soluciones, en algunos aspectos comparables a los sistemas tradicionales de construcción. Pero, por el momento, la falta de estandarización y de una regulación armonizada, además de la falta de información en las hojas de datos técnicos (todavía no comparables con las que están disponibles para los materiales de construcción), hacen difícil evaluar adecuadamente la conveniencia y factibilidad de utilizar los componentes BIPV como parte integrante de la envolvente del edificio. Organizaciones internacionales están trabajando para establecer las normas adecuadas y los procedimientos de prueba y ensayo para comprobar la seguridad, viabilidad y fiabilidad de estos sistemas. Sin embargo, hoy en día, no hay reglas específicas para la evaluación y caracterización completa de un componente fotovoltaico de integración arquitectónica de acuerdo con el Reglamento Europeo de Productos de la Construcción, CPR 305/2011. Los productos BIPV, como elementos de construcción, deben cumplir con diferentes aspectos prácticos como resistencia mecánica y estabilidad; integridad estructural; seguridad de utilización; protección contra el clima (lluvia, nieve, viento, granizo), el fuego y el ruido, aspectos que se han convertido en requisitos esenciales en la perspectiva de obtener productos ambientalmente sostenibles, saludables, eficientes energéticamente y económicamente asequibles. 
Por lo tanto, el módulo / sistema BIPV se convierte en una parte multifuncional del edificio, no sólo para ser física y técnicamente "integrado", sino también para ser una oportunidad innovadora de diseño. Las normas IEC, de uso común en Europa para certificar módulos fotovoltaicos -IEC 61215 e IEC 61646, cualificación de diseño y homologación del tipo para módulos fotovoltaicos de uso terrestre, respectivamente para módulos fotovoltaicos de silicio cristalino y de lámina delgada- atestan únicamente la potencia del módulo fotovoltaico y dan fe de su fiabilidad por un período de tiempo definido, certificando una disminución de potencia dentro de unos límites. Existe también un estándar, en parte en desarrollo, el IEC 61853 (“Ensayos de rendimiento de módulos fotovoltaicos y evaluación energética”) cuyo objetivo es la búsqueda de procedimientos y metodologías de prueba apropiados para calcular el rendimiento energético de los módulos fotovoltaicos en diferentes condiciones climáticas. Sin embargo, no existen ensayos normalizados en las condiciones específicas de la instalación (p. ej. sistemas BIPV de fachada). Eso significa que es imposible conocer las prestaciones efectivas de estos sistemas y las condiciones ambientales que se generan en el interior del edificio. La potencia nominal de pico, Wp, de un módulo fotovoltaico identifica la máxima potencia eléctrica que éste puede generar bajo condiciones estándares de medida (STC: irradiancia 1000 W/m2, 25 °C de temperatura del módulo y distribución espectral AM 1,5), caracterizando eléctricamente el módulo PV en condiciones específicas con el fin de poder comparar los diferentes módulos y tecnologías. El vatio pico (Wp por su abreviatura en inglés) es la medida de la potencia nominal del módulo PV y no es suficiente para evaluar el comportamiento y producción del panel en términos de vatios hora en las diferentes condiciones de operación, y tampoco permite predecir con convicción la eficiencia y el comportamiento energético de un determinado módulo en condiciones ambientales y de instalación reales. Un adecuado elemento de integración arquitectónica de fachada, por ejemplo, debería tener en cuenta propiedades térmicas y de aislamiento, factores como la transparencia para permitir ganancias solares o un buen control solar si es necesario, aspectos vinculados y dependientes en gran medida de las condiciones climáticas y del nivel de confort requerido en el edificio, lo que implica una necesidad de adaptación a cada contexto específico para obtener el mejor resultado. Sin embargo, la influencia en condiciones reales de operación de las diferentes soluciones fotovoltaicas de integración en el consumo de energía del edificio no es fácil de evaluar. Los aspectos térmicos del interior del ambiente o de iluminación, al utilizar módulos BIPV semitransparentes por ejemplo, son aún desconocidos. Como se dijo antes, la utilización de componentes de integración arquitectónica fotovoltaicos y el uso de energía renovable ya es un hecho para producir energía limpia, pero también sería importante conocer su posible contribución para mejorar el confort y la salud de los ocupantes del edificio. Aspectos como el confort, la protección o transmisión de luz natural, el aislamiento térmico, el consumo energético o la generación de energía son aspectos que suelen considerarse independientemente, mientras que todos juntos contribuyen, sin embargo, al balance energético global del edificio. 
Además, la necesidad de dar prioridad a una orientación determinada del edificio, para alcanzar el mayor beneficio de la producción de energía eléctrica o térmica, en el caso de sistemas activos y pasivos, respectivamente, podría hacer estos últimos incompatibles, pero no necesariamente. Se necesita un enfoque holístico que permita a arquitectos e ingenieros implementar sistemas tecnológicos que trabajen en sinergia. Se ha planteado por ello un nuevo concepto: "C-BIPV, elemento fotovoltaico consciente integrado", lo que significa necesariamente conocer los efectos positivos o negativos (en términos de confort y de energía) en condiciones reales de funcionamiento e instalación. Propósito de la tesis, método y resultados. Los sistemas fotovoltaicos integrados en fachada son a menudo soluciones de vidrio fácilmente integrables, ya que por lo general están hechos a medida. Estos componentes BIPV semitransparentes, integrados en el cerramiento, proporcionan iluminación natural y también sombra, lo que evita el sobrecalentamiento en los momentos de excesivo calor, aunque como componente estático, asimismo evitan las posibles contribuciones pasivas de ganancias solares en los meses fríos. Además, la temperatura del módulo varía considerablemente en ciertas circunstancias, influenciada por la tecnología fotovoltaica instalada, la radiación solar, el sistema de montaje, la tipología de instalación, la falta de ventilación, etc. Este factor puede suponer un aumento adicional de la carga térmica en el edificio, altamente variable y difícil de cuantificar. Se necesitan, en relación con esto, más conocimientos sobre el confort ambiental interior en los edificios que utilizan tecnologías fotovoltaicas integradas, para abrir, de ese modo, una nueva perspectiva de la investigación. Con este fin, se ha diseñado, proyectado y construido una instalación de pruebas al aire libre, el BIPV Env-lab "BIPV Test Laboratory", para la caracterización integral de los diferentes módulos semitransparentes BIPV. Se han definido también el método y el protocolo de ensayos de caracterización en el contexto de un edificio y en condiciones climáticas y de funcionamiento reales. Esto ha sido posible una vez evaluado el estado de la técnica y la investigación, los aspectos que influyen en la integración arquitectónica y los diferentes tipos de integración, y después de haber examinado los métodos de ensayo para los componentes de construcción y fotovoltaicos en las condiciones de operación utilizadas hasta ahora. El laboratorio de pruebas experimentales, que consiste en dos habitaciones idénticas a escala real, 1:1, ha sido equipado con sensores y todos los sistemas de monitorización gracias a los cuales es posible obtener datos fiables para evaluar las prestaciones térmicas, de iluminación y el rendimiento eléctrico de los módulos fotovoltaicos. Este laboratorio permite el estudio de tres diferentes aspectos que influencian el confort y el consumo de energía del edificio: el confort térmico, el lumínico y el rendimiento energético global (demanda/producción de energía) de los módulos BIPV. Conociendo el balance de energía para cada tecnología solar fotovoltaica experimentada, es posible determinar cuál funciona mejor en cada caso específico. Se ha propuesto una metodología teórica para la evaluación de estos parámetros, definidos en esta tesis como índices o indicadores que consideran cuestiones relacionadas con el bienestar, la energía y el rendimiento energético global de los componentes BIPV. 
Esta metodología considera y tiene en cuenta las normas reglamentarias y estándares existentes para cada aspecto, relacionándolos entre sí. Diferentes módulos BIPV de doble vidrio aislante, semitransparentes, representativos de diferentes tecnologías fotovoltaicas (tecnología de silicio monocristalino, m-Si; de capa fina en silicio amorfo de unión simple, a-Si, y de capa fina en diseleniuro de cobre e indio, CIS) fueron seleccionados para llevar a cabo una serie de pruebas experimentales al objeto de demostrar la validez del método de caracterización propuesto. Como resultado final, se ha desarrollado y generado el Diagrama de Caracterización Integral, DCI, un sistema gráfico y visual para representar los resultados y gestionar la información, una herramienta operativa útil para la toma de decisiones con respecto a las instalaciones fotovoltaicas. Este diagrama muestra todos los conceptos y parámetros estudiados en relación con los demás y ofrece visualmente toda la información cualitativa y cuantitativa sobre la eficiencia energética de los componentes BIPV, al caracterizarlos de manera integral. ABSTRACT A sustainable design process today is intended to produce high-performance buildings that are energy-efficient, healthy and economically feasible, by wisely using renewable resources to minimize the impact on the environment and to reduce, as much as possible, the energy demand. In the last decade, the reduction of energy needs in buildings has become a top priority. The Directive 2002/91/EC “Energy Performance of Buildings” (and its subsequent updates) established the general regulatory framework and methodology for the calculation of minimum energy requirements. Since then, the aim of fulfilling new directives and protocols has led the energy policies of several countries in a similar direction, that is, focusing on the need to increase energy efficiency in buildings, taking measures to reduce energy consumption, and fostering the use of renewable sources. Zero Energy Buildings or Net Zero Energy Buildings will become a standard in the European building industry and, in order to balance energy consumption, buildings, in addition to reducing end-use consumption, should necessarily become self-energy producers. For this reason, the façade system plays an important role in achieving these energy and environmental goals, and photovoltaics can play a leading role in this challenge. To promote the use of photovoltaic technology in buildings, international research programs encourage and support solutions which favor the complete integration of photovoltaic devices as architectural elements, the so-called BIPV (Building Integrated Photovoltaics), especially in the transition towards net-zero energy buildings. Therefore, the BIPV module/system becomes a multifunctional building layer, not only physically and functionally “integrated” in the building, but also an innovative opportunity for building envelope design. It has been found in this study that there is still a lack of useful information about BIPV for architects and designers, even though the market is providing more and more interesting solutions, sometimes comparable to the existing traditional building systems. 
However, at the moment, the lack of a harmonized regulation and standardization, together with the inaccuracy of the technical BIPV datasheets (not yet comparable with those available for building materials), makes it difficult for a designer to properly evaluate the feasibility of these BIPV components when used as a technological system of the building skin. International organizations are working to establish the most suitable standards and test procedures to check the safety, feasibility and reliability of BIPV systems. However, nowadays there are no specific rules for a complete characterization and evaluation of a BIPV component according to the European Construction Product Regulation, CPR 305/2011. BIPV products, as building components, must comply with different practical aspects such as mechanical resistance and stability; structural integrity; safety in use; protection against weather (rain, snow, wind, hail); fire and noise: aspects that have become essential requirements in the perspective of more and more environmentally sustainable, healthy, energy-efficient and economically affordable products. IEC standards, commonly used in Europe to certify PV modules (IEC 61215 and IEC 61646, ‘Terrestrial PV Modules - Design Qualification and Type Approval’ for crystalline and thin-film modules, respectively), attest only the power rating of the PV module and certify its reliability for a defined period of time, with a power decrease within certain limits. There is also a standard (IEC 61853, ‘Performance Testing and Energy Rating of Terrestrial PV Modules’), still under preparation, whose aim is finding appropriate test procedures and methodologies to calculate the energy yield of PV modules under different climate conditions. Furthermore, the lack of tests in specific conditions of installation (e.g. façade BIPV devices) means that it is difficult to know the exact effective performance of these systems and the environmental conditions generated inside the building. The nominal PV power at Standard Test Conditions, STC (1,000 W/m2, 25 °C module temperature and AM 1.5 spectral distribution), is usually measured in indoor laboratories, and it characterizes the PV module at specific conditions in order to be able to compare different modules and technologies as a first step. The “Watt-peak” is not enough to evaluate the panel performance in terms of Watt-hours of various modules under different operating conditions, and it gives no assurance of being able to predict the energy performance of a certain module at given environmental conditions. A proper BIPV element for a façade should take into account thermal and insulation properties, and factors such as transparency to allow solar gains if possible or a good solar control if necessary, aspects that are linked and highly dependent on climate conditions and on the level of comfort to be reached. However, the influence of different façade-integrated photovoltaic solutions on the building energy consumption is not easy to assess under real operating conditions. Thermal aspects, indoor temperatures or the luminance levels that can be expected using building-integrated PV (BIPV) modules are not well known. As said before, integrated photovoltaic BIPV components and the use of renewable energy are already a standard for green energy production, but it would also be important to know their possible contribution to improving the comfort and health of building occupants. 
Comfort, light transmission or protection, thermal insulation or thermal/electric power production are aspects that are usually considered separately, while all of them together contribute to the global energy balance of the building. Besides, the need to prioritize a particular building envelope orientation to harvest the greatest benefit from the electrical or thermal energy production, in the case of active and passive systems respectively, might make them incompatible, although not necessarily. A holistic approach is needed to enable architects and engineers to implement technological systems working in synergy. A new concept has been suggested: “C-BIPV, conscious integrated BIPV”. BIPV systems have to be “consciously integrated”, which means that it is essential to know their positive and negative effects in terms of comfort and energy under real operating conditions. Purpose of the work, method and results. Façade-integrated photovoltaic systems are often glass solutions that are easily integrable, as they are usually custom-made. These semi-transparent BIPV components, integrated as window elements, provide natural lighting and shade that prevents overheating at times of excessive heat but, as static components, they likewise prevent possible solar gain contributions in the cold months. In addition, the temperature of the module varies considerably in certain circumstances, influenced by the PV technology installed, solar radiation, mounting system, lack of ventilation, etc. This factor may result in an additional heat input to the building that is highly variable and difficult to quantify. Further insights into the indoor environmental comfort of buildings using integrated photovoltaic technologies are therefore needed, opening up a new research perspective. This research aims to study their behaviour through a series of experiments in order to define the real influence on comfort aspects and on the global energy consumption of the building, as well as the electrical and thermal characteristics of these devices. The final objective was to analyse a whole set of issues that influence the global energy consumption/production in a building using BIPV modules by quantifying the global energy balance and the real performance of the BIPV system. Other qualitative issues to be studied were comfort aspects (thermal and lighting) and the electrical behaviour of different BIPV technologies for vertical integration, aspects that influence both energy consumption and electricity production. Thus, it will be possible to obtain a comprehensive global characterization of BIPV systems. A specific outdoor test facility, the BIPV Env-lab “BIPV Test Laboratory”, for the integral characterization of different semi-transparent BIPV modules was designed and built. The method and test protocol for the BIPV characterization were also defined in a real building context and under real weather conditions. This has been possible once the state of the art and research, the aspects that influence architectural integration, and the different possibilities and types of integration for PV had been assessed, and after having examined the test methods for building and photovoltaic components under the operating conditions used heretofore. The test laboratory, which consists of two equivalent full-scale (1:1) test rooms, has a monitoring system from which reliable data on thermal, daylighting and electrical performance can be obtained for the evaluation of the PV modules. 
The experimental set-up facility (testing rooms) allows studying three different aspects that affect building energy consumption and comfort: the indoor thermal comfort, the lighting comfort and the energy performance of BIPV modules tested under real environmental conditions. Knowing the energy balance for each solar technology tested, it is possible to determine which one performs best in each specific case. A theoretical methodology has been proposed for evaluating these parameters, defined in this thesis as indices or indicators, which address comfort issues, energy and the overall performance of BIPV components. This methodology considers the existing regulatory standards for each aspect, relating them to one another. A set of see-through and light-through insulated-glass BIPV modules, representative of different PV technologies (mono-crystalline silicon technology, mc-Si; amorphous silicon thin-film single junction, a-Si; and copper indium diselenide thin-film technology, CIS), was selected for a series of experimental tests in order to demonstrate the validity of the proposed characterization method. As a result, the Integral Characterization Diagram (ICD) has been developed: a graphic and visual system to represent the results and manage the information, and a useful operational tool for decision-making regarding photovoltaic installations. This diagram shows all the concepts and parameters studied in relation to each other and visually provides access to all the results obtained during the experimental phase, making available all the qualitative and quantitative information on the energy performance of the BIPV components by characterizing them in a comprehensive way.
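To make the distinction between the Watt-peak rating at STC and the energy yield under real operating conditions concrete, the sketch below applies the common NOCT cell-temperature model and a linear power-temperature coefficient to a notional façade module; all module parameters and the hourly profile are generic assumptions, not data from the thesis.

```python
# Illustrative estimate of a BIPV module's electrical output away from Standard Test
# Conditions (STC: 1000 W/m2, 25 degC cell temperature, AM 1.5), using the common NOCT
# cell-temperature model and a linear power-temperature coefficient. The parameters
# below are generic examples, not values measured in the BIPV Env-lab.

P_STC_W = 120.0          # rated peak power of the semi-transparent module, Wp
GAMMA = -0.0040          # power temperature coefficient, 1/K (typical for c-Si)
NOCT_C = 47.0            # nominal operating cell temperature, degC

def cell_temperature(irradiance_w_m2, ambient_c):
    """NOCT approximation of cell temperature for a given irradiance and ambient."""
    return ambient_c + (NOCT_C - 20.0) / 800.0 * irradiance_w_m2

def module_power(irradiance_w_m2, ambient_c):
    """Power output scaled from STC by irradiance and corrected for cell temperature."""
    t_cell = cell_temperature(irradiance_w_m2, ambient_c)
    return P_STC_W * (irradiance_w_m2 / 1000.0) * (1.0 + GAMMA * (t_cell - 25.0))

if __name__ == "__main__":
    # Simple hourly balance for one facade module on a notional day.
    hours = [(200, 12), (450, 16), (700, 20), (550, 22), (250, 18)]  # (W/m2, ambient degC)
    produced_wh = sum(module_power(g, t) for g, t in hours)
    demand_wh = 600.0                                                # hypothetical load share
    print(f"produced {produced_wh:.0f} Wh, demand {demand_wh:.0f} Wh, "
          f"balance {produced_wh - demand_wh:+.0f} Wh")
```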

Relevance:

30.00%

Publisher:

Abstract:

El cálculo de cargas de aerogeneradores flotantes requiere herramientas de simulación en el dominio del tiempo que consideren todos los fenómenos que afectan al sistema, como la aerodinámica, la dinámica estructural, la hidrodinámica, las estrategias de control y la dinámica de las líneas de fondeo. Todos estos efectos están acoplados entre sí y se influyen mutuamente. Las herramientas integradas se utilizan para calcular las cargas extremas y de fatiga que son empleadas para dimensionar estructuralmente los diferentes componentes del aerogenerador. Por esta razón, un cálculo preciso de las cargas influye de manera importante en la optimización de los componentes y en el coste final del aerogenerador flotante. En particular, el sistema de fondeo tiene gran impacto en la dinámica global del sistema. Muchos códigos integrados para la simulación de aerogeneradores flotantes utilizan modelos simplificados que no consideran los efectos dinámicos de las líneas de fondeo. Una simulación precisa de las líneas de fondeo dentro de los modelos integrados puede resultar fundamental para obtener resultados fiables de la dinámica del sistema y de los niveles de cargas en los diferentes componentes. Sin embargo, el impacto que incluir la dinámica de los fondeos tiene en la simulación integrada y en las cargas todavía no ha sido cuantificado rigurosamente. El objetivo principal de esta investigación es el desarrollo de un modelo dinámico para la simulación precisa de líneas de fondeo, su validación con medidas en un tanque de ensayos y su integración en un código de simulación para aerogeneradores flotantes. Finalmente, esta herramienta, experimentalmente validada, es utilizada para cuantificar el impacto que un modelo dinámico de líneas de fondeo tiene en la computación de las cargas de fatiga y extremas de aerogeneradores flotantes en comparación con un modelo cuasi-estático. Esta es una información muy útil para los futuros diseñadores a la hora de decidir qué modelo de líneas de fondeo es el adecuado, dependiendo del tipo de plataforma y de los resultados esperados. El código dinámico de líneas de fondeo desarrollado en esta investigación se basa en el método de los Elementos Finitos, utilizando en concreto un modelo “Lumped Mass” para aumentar su eficiencia de computación. Los experimentos realizados para la validación del código se realizaron en el tanque de la École Centrale de Nantes (ECN), en Francia, y consistieron en sumergir una cadena con uno de sus extremos anclados en el fondo del tanque y excitar el extremo suspendido con movimientos armónicos de diferentes periodos. El código demostró su capacidad para predecir la tensión y los movimientos en diferentes posiciones a lo largo de la longitud de la línea con gran precisión. Los resultados indicaron la importancia de capturar la dinámica de las líneas de fondeo para la predicción de la tensión, especialmente en movimientos de alta frecuencia. Finalmente, el código se utilizó en una exhaustiva evaluación del efecto que la dinámica de las líneas de fondeo tiene sobre las cargas extremas y de fatiga de diferentes conceptos de aerogeneradores flotantes. Las cargas se calcularon para tres tipologías de aerogenerador flotante (semisumergible, “spar-buoy” y “tension leg platform”) y se compararon con las cargas obtenidas utilizando un modelo cuasi-estático de líneas de fondeo. 
Se lanzaron y postprocesaron más de 20.000 casos de carga definidos por la norma IEC 61400-3, siguiendo todos los requerimientos que una entidad certificadora exigiría a un diseñador industrial de aerogeneradores flotantes. Los resultados mostraron que el impacto de la dinámica de las líneas de fondeo, tanto en las cargas de fatiga como en las extremas, se incrementa conforme se consideran elementos situados más cerca de la plataforma: las cargas en la pala y en el eje sólo son ligeramente modificadas por la dinámica de las líneas; las cargas en la base de la torre pueden cambiar significativamente dependiendo del tipo de plataforma y, finalmente, la tensión en las líneas de fondeo depende fuertemente de la dinámica de las líneas, tanto en fatiga como en extremas, en todos los conceptos de plataforma que se han evaluado. ABSTRACT The load calculation of floating offshore wind turbines requires time-domain simulation tools that take into account all the phenomena affecting the system, such as aerodynamics, structural dynamics, hydrodynamics, control actions and mooring line dynamics. These effects are coupled and mutually influence one another. The results provided by integrated simulation tools are used to compute the fatigue and ultimate loads needed for the structural design of the different components of the wind turbine. For this reason, their accuracy has an important influence on the optimization of the components and the final cost of the floating wind turbine. In particular, the mooring system greatly affects the global dynamics of the floater. Many integrated codes for the simulation of floating wind turbines use simplified approaches that do not consider the mooring line dynamics. An accurate simulation of the mooring system within the integrated codes can be fundamental to obtain reliable results of the system dynamics and the loads. The impact of taking into account the mooring line dynamics in the integrated simulation still has not been thoroughly quantified. The main objective of this research consists of developing an accurate dynamic model for the simulation of mooring lines, validating it against wave tank tests and then integrating it into a simulation code for floating wind turbines. This experimentally validated tool is finally used to quantify the impact that dynamic mooring models have on the computation of fatigue and ultimate loads of floating wind turbines in comparison with quasi-static tools. This information will be very useful for future designers to decide which mooring model is adequate depending on the platform type and the expected results. The dynamic mooring line code developed in this research is based on the Finite Element Method and, in order to achieve computational efficiency, adopts a Lumped Mass approach. The experimental tests performed for the validation of the code were carried out at the École Centrale de Nantes (ECN) wave tank in France and consisted of a chain submerged in a water basin, anchored at the bottom of the basin, with the suspension point of the chain excited by harmonic motions of different periods. The code showed its ability to predict the tension and the motions at several positions along the length of the line with high accuracy. The results demonstrated the importance of capturing the evolution of the mooring dynamics for the prediction of the line tension, especially for high-frequency motions. 
Finally, the code was used for an extensive assessment of the effect of mooring dynamics on the computation of fatigue and ultimate loads for different floating wind turbines. The loads were computed for three platform topologies (semisubmersible, spar-buoy and tension leg platform) and compared with the loads obtained using a quasi-static mooring model. More than 20,000 load cases were launched and post-processed following the IEC 61400-3 guideline and fulfilling the conditions that a certification entity would require of an offshore wind turbine designer. The results showed that the impact of mooring dynamics on both fatigue and ultimate loads increases as elements located closer to the platform are evaluated: the blade and shaft loads are only slightly modified by the mooring dynamics in all the platform designs; the tower base loads can be significantly affected depending on the platform concept; and the mooring line tension strongly depends on the line dynamics, both in fatigue and extreme loads, in all the platform concepts evaluated.
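The sketch below illustrates the lumped-mass idea behind dynamic mooring-line codes: the line is discretized into point masses joined by axial springs and stepped in time under gravity, with the anchor and fairlead positions imposed. It is a toy model with illustrative parameters, no hydrodynamic drag or seabed contact, and it is not the code developed in the thesis.

```python
import numpy as np

# Toy lumped-mass mooring line: point masses connected by linear axial springs, stepped
# with semi-implicit Euler under gravity only. Parameters are illustrative; a real code
# adds hydrodynamic drag, added mass, seabed contact and bending/torsion effects.

N = 20                      # number of segments
LENGTH = 100.0              # unstretched line length, m
EA = 5.0e7                  # axial stiffness, N
MASS_PER_M = 50.0           # line mass per metre (buoyancy neglected), kg/m
G = 9.81
DT = 1.0e-3

l0 = LENGTH / N                              # unstretched segment length
m = MASS_PER_M * l0                          # lumped mass per node
pos = np.zeros((N + 1, 2))                   # (x, z) of each node
pos[:, 0] = np.linspace(0.0, 60.0, N + 1)    # straight initial guess, anchor to fairlead
pos[:, 1] = np.linspace(-70.0, 0.0, N + 1)   # anchor at -70 m depth, fairlead at surface
vel = np.zeros_like(pos)

def axial_forces(p):
    """Spring force contribution of every segment on its two end nodes."""
    f = np.zeros_like(p)
    for i in range(N):
        d = p[i + 1] - p[i]
        length = np.linalg.norm(d)
        tension = EA * max(length - l0, 0.0) / l0   # slack segments carry no compression
        t_vec = tension * d / length
        f[i] += t_vec
        f[i + 1] -= t_vec
    return f

def step(fairlead_xy):
    """One semi-implicit Euler step with anchor and fairlead positions imposed."""
    global pos, vel
    f = axial_forces(pos)
    f[:, 1] -= m * G                     # gravity on every lumped mass
    vel += (f / m) * DT
    vel *= 0.999                         # crude numerical damping instead of drag
    pos += vel * DT
    pos[0] = (0.0, -70.0)                # anchor fixed on the seabed
    pos[-1] = fairlead_xy                # fairlead position prescribed by the floater
    vel[0] = vel[-1] = 0.0

for k in range(20000):                   # let the line settle under a fixed fairlead
    step((60.0, 0.0))

top_tension = EA * max(np.linalg.norm(pos[-1] - pos[-2]) - l0, 0.0) / l0
print(f"fairlead tension after settling: {top_tension / 1e3:.1f} kN")
```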

Relevance:

30.00%

Publisher:

Abstract:

Las prestaciones de un velero de regatas se estiman por medio de los Programas de Predicción de Velocidad (VPP), que incluyen las características de estabilidad y los modelos aero e hidrodinámico del barco. Por esta razón, es importante tener una evaluación adecuada de las fuerzas en apéndices y de su variación en diferentes condiciones de navegación, escora y deriva. Además, para el cálculo de las fuerzas en los apéndices es importante conocer sus características hidrodinámicas cuando trabajan conjuntamente en un campo fluido fuertemente modificado por la carena. Por esta razón, se ha utilizado una serie de ensayos realizados en el Canal de Ensayos de la ETSIN con el objetivo de validar códigos numéricos que permiten una evaluación más rápida y focalizada en los distintos fenómenos que se producen. Dichos ensayos se han realizado de forma que pudieran medirse independientemente las fuerzas hidrodinámicas en cada apéndice, lo que permitirá evaluar el reparto de fuerzas en diferentes condiciones de navegación para poder profundizar en las interacciones entre carena, quilla y timón. Las técnicas numéricas permiten capturar detalles que difícilmente se pueden visualizar en ensayos experimentales. En este sentido, se han probado las últimas técnicas utilizadas en los últimos workshops y se ha enfocado el estudio a un nuevo método con el objetivo de mostrar una metodología más rápida que pueda servir a la industria para este tipo de aproximación al problema. ABSTRACT The performance of a racing sailboat is estimated by means of Velocity Prediction Programs (VPP), which include the stability characteristics and the aero- and hydrodynamic models of the boat. For this reason, it is important to have an adequate evaluation of the forces on the appendages and of their variation under different sailing conditions, heel and leeway. Moreover, for the analysis of the forces on the appendages, it is important to know their hydrodynamic characteristics when they work together in a fluid field strongly modified by the canoe body. For this reason, several tests have been carried out in the ETSIN towing tank with the aim of validating numerical codes that allow a faster analysis and make it possible to focus on the different phenomena that occur there. Such tests have been performed in a way that the hydrodynamic forces on each appendage could be measured independently, allowing the distribution of forces in different sailing conditions to be assessed in order to deepen the understanding of the interactions between the canoe body, the keel and the rudder. Numerical techniques allow capturing details that can hardly be visualized in experimental tests. In this sense, the latest techniques used in recent workshops have been reviewed and the study has been focused on a new method, with the aim of showing a faster methodology which can serve the industry for this type of approach to the problem.
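As a rough illustration of how appendage side forces enter a VPP-style force balance, the sketch below uses the classical finite-wing lift-slope approximation; the dimensions, angles and the formula itself are illustrative simplifications, not the towing-tank or CFD methodology of the work.

```python
import math

# Back-of-the-envelope estimate of the side force produced by a keel and a rudder at a
# given leeway angle, using the classical finite-wing lift-slope approximation
# CL = 2*pi*AR/(AR + 2) * alpha. All dimensions and angles are illustrative only; the
# work itself relies on towing-tank measurements and numerical codes, not this formula.

RHO = 1025.0   # sea water density, kg/m^3

def side_force(speed_ms, area_m2, aspect_ratio, angle_deg):
    alpha = math.radians(angle_deg)
    cl = 2.0 * math.pi * aspect_ratio / (aspect_ratio + 2.0) * alpha
    return 0.5 * RHO * speed_ms**2 * area_m2 * cl

if __name__ == "__main__":
    v = 4.0                                                               # boat speed, m/s
    keel = side_force(v, area_m2=1.8, aspect_ratio=3.0, angle_deg=4.0)    # leeway only
    rudder = side_force(v, area_m2=0.6, aspect_ratio=4.0, angle_deg=7.0)  # leeway + helm
    total = keel + rudder
    print(f"keel {keel:.0f} N, rudder {rudder:.0f} N, "
          f"rudder share {100 * rudder / total:.0f}% of side force")
```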

Relevance:

30.00%

Publisher:

Abstract:

La cuestión del asentamiento socialista en la URSS durante la década de 1920 estuvo caracterizada por el objetivo de definir y establecer un estado socialista en términos políticos, sociales y económicos. En este contexto de inestabilidad y cambio, un grupo de arquitectos pertenecientes a la Asociación de Arquitectos Contemporáneos, OSA, y liderado por Moisey Ginzburg, abordó el tema de la vivienda obrera asumiendo la responsabilidad y el compromiso por alcanzar un nuevo orden social. Su misión no consistió únicamente en solucionar el problema del alojamiento para los trabajadores en las grandes ciudades soviéticas, sino en redefinirlo como el marco adecuado para una sociedad sometida a un cambio sin precedentes que, al mismo tiempo y en un proceso dialéctico, debía contribuir a la construcción de esa nueva sociedad. La respuesta dada por la OSA trascendió el diseño inmediato bajo los estándares modernos establecidos en Occidente y tomó forma en un proceso de investigación que habría de prolongarse durante cinco años. Este trabajo, que culminó con la construcción y puesta en crisis de la Casa Narkomfin, se desarrolló en tres aproximaciones sucesivas. La primera, de carácter conceptual, consideró la participación ciudadana, así como de especialistas independientes, formalizándose en el Concurso entre Camaradas convocado por la OSA en 1926. La segunda aproximación al problema de la vivienda obrera se articuló a través de la investigación llevada a cabo por la Sección de Tipificación del Stroykom, esta vez desde premisas científicas y metodológicas. Finalmente, las conclusiones alcanzadas fueron transferidas a la práctica arquitectónica por medio de la construcción de seis Casas Experimentales de Transición, entre las que destacó la Casa Narkomfin. Este último acercamiento, de carácter empírico, ha sido tradicionalmente examinado por los expertos como un hecho aislado. Sin embargo, su estudio debe trascender necesariamente el genio del autor-creador en favor del proceso de investigación al que pertenece. En esta tesis, la Casa Narkomfin no se presenta sólo como el paradigma de vivienda soviética de vanguardia al que estamos acostumbrados, sino como un prototipo que recoge los principios y conclusiones alcanzados en las aproximaciones conceptuales y científicas precedentes. Únicamente desde este punto de vista cobra sentido la consideración de Ginzburg sobre su propio edificio como un medio propositivo y no impositivo: un proyecto concebido como una herramienta de transición hacia una sociedad más avanzada. ABSTRACT The question of mass housing in the USSR during the Twenties was marked by the drive to define and establish a socialist state in political, social and economic terms. In this context of instability and change, a group of architects gathered together under the Association of Contemporary Architects, OSA, led by Moisey Ginzburg, to address the issue of mass housing, thus taking on the responsibility and being committed to creating a new social order. Their quest not only involved solving the problem of housing for workers in large Soviet cities, but also redefining this solution as an appropriate framework for a society undergoing dramatic changes which, at the same time and in a dialectical process, would contribute to the creation of this new society. The solution provided by OSA transcended Modern standards of immediate design set by the West and was the result of a research process that would last five years. 
This work culminated in the construction of Narkomfin House and its self-criticism, developed in three successive approaches. The first was conceptual, being formalized in the Comradely Competition held by the OSA in 1926 and taking into account the participation of citizens and independent experts. The second approach to the problem of mass housing involved research developed by the Typification Section of the Stroykom, this time under scientific and methodological premises. Finally, the conclusions reached were put in practice with the construction of six Experimental Transitional Houses of which the most notable is Narkomfin House. This third empirical approach has traditionally been examined by scholars in isolation. However, its study must necessarily transcend the genius of the author-creator and involve the research process of which it is part. In this thesis, Narkomfin House is presented not only as the paradigm in Soviet housing avant-garde we are used to, but also as a prototype reflecting the principles and conclusions reached in the preceding conceptual and scientific approaches. Only from this point of view does Ginzburg’s understanding of his own building as a proactive and non-imposed environment make sense: a project conceived as a transition tool towards a more advanced society.

Relevance:

30.00%

Publisher:

Abstract:

La obra fílmica del director francés Jacques Tati podría considerarse como el perfecto reflejo del paradigma edificatorio de mediados del siglo XX en plena posguerra europea, una época ávida de transformaciones de las que el cine supo hacerse eco. Particularmente, el cine de Tati refleja las preocupaciones del ciudadano europeo de posguerra sobre las consecuencias de las masivas construcciones erigidas en sus devastados núcleos urbanos y la puesta en práctica de la ciudad funcional propuesta por la Carta de Atenas (1931). Pero, además, el análisis del cine de Jacques Tati permite un acercamiento a la modernidad desde diversos puntos de vista como la movilidad, el diseño urbano, las nuevas construcciones, los espacios de trabajo en los nuevos distritos terciarios, la vivienda -tradicional, moderna y experimental- o el diseño mobiliario en la posguerra. A través de su alter ego -Monsieur Hulot- Tati interacciona curioso con las nuevas construcciones geométricas de paños neutros y cuestiona su ruptura con la tradición edificatoria francesa, enfatizando la oposición entre el pasado nostálgico y la modernidad de las décadas de los 50 y 60, salpicadas por el consumismo feroz del recién estrenado estado de bienestar. La confrontación funcional, volumétrica, estética e incluso cromática entre ambos mundos construidos –el tradicional y el moderno- invita al espectador a un ejercicio de reflexión y crítica sobre la arquitectura moderna de este período en Europa. En particular, la mirada cinematográfica de Tati se centra en dos conceptos fundamentales. Por una parte, su atención se dirige a la famosa casa mecanicista Le Corbuseriana materializada en la ultra-moderna casa Arpel (Mon Oncle, 1958) y proyectada en la misma época en la que se desarrollaban importantes prototipos de vivienda experimental como la Casa de Futuro de Alison y Peter Smithson o las viviendas de Jean Prouvé. Debe ponerse de manifiesto que la crítica de Jacques Tati no se centraba en la arquitectura moderna en sí misma sino en el empleo erróneo que los usuarios pudieran hacer de ella. Por otro lado, Tati centra su atención en el prisma miesiano a través de los bloques de oficinas que conforman la ciudad de Tativille en Playtime (1967). Se trataba de una gran ciudad moderna construida explícitamente para el rodaje de la película y basada en los casi idénticos tejidos urbanos residenciales y terciarios ya en funcionamiento en las principales capitales europeas y norteamericanas en aquellos años. Tativille funcionaría como una ciudad autónoma disponiendo de diversas instalaciones y con el objetivo de integrarse y consolidarse en la trama urbana parisina. Lamentablemente, su destino al final del rodaje fue bien distinto. En definitiva, el análisis de la producción fílmica de Jacques Tati permite un acercamiento a la arquitectura y al urbanismo modernos de posguerra y al contexto socio-económico que favoreció su crecimiento y expansión. Por ello, su obra constituye una herramienta visual muy útil que aún hoy es consultada y mostrada por su claridad y humor y que invita a los ciudadanos –telespectadores- a participar en un ejercicio crítico arquitectónico hasta entonces reservado a los arquitectos. ABSTRACT The film work of French director Jacques Tati could be considered as the perfect reflection of the mid-20th century European post-war building paradigm, a period of time plenty of transformations perfectly echoed by cinema. 
In particular, Tati’s film work reflects the European post-war citizen’s concerns about the consequences of the massive constructions built in their devastated urban centres, as well as the development of the functional city proposed by the Athens Charter (1931). But, on top of that, an analysis of Jacques Tati’s cinematography allows for an approach to modernity from different perspectives, such as mobility, urban design, new buildings, working spaces in the new tertiary districts, housing -traditional, modern, and experimental-, or furniture design during the post-war period. Embodied by his alter ego -Monsieur Hulot- Tati curiously interacts with the new geometric constructions of neutral facades and questions the break with the French building tradition, highlighting the opposition between the nostalgic past and the modernity of the 50s and 60s, affected by the fierce consumerism of the new welfare state. The functional, volumetric, aesthetic and even chromatic confrontation between both built worlds -traditional vs modern- invites the viewer to an exercise of meditation and criticism on the European modern architecture of that period. Tati’s film look is particularly focused on two basic concepts: on the one hand, his attention addresses Le Corbusier’s famous mechanistic house, which is materialized in the ultra-modern Arpel house (Mon Oncle, 1958) and designed, in turn, when the development of other important experimental dwelling prototypes like Alison and Peter Smithson’s House of the Future or Jean Prouvé’s houses was taking place. It must be highlighted that Jacques Tati’s criticism was not addressed to modern architecture itself but to the wrong use that citizens could make of it. On the other hand, Tati focuses on the Miesian prism through the office buildings that shape the city of Tativille in Playtime (1967). It was a big, modern city built specifically for the film shooting, and based on the almost identical residential and tertiary urban fabrics already active in the main European and American capitals in those years. Tativille would work as an autonomous city, having several facilities at its disposal and with the goal of getting integrated and consolidated into the Parisian urban weave. However, its final use was, unfortunately, quite different. In conclusion, an analysis of Jacques Tati’s film production allows for an approach to modern post-war architecture and urbanism, as well as to the socio-economic context that favoured its growth and expansion. As a result of this, Jacques Tati’s film production constitutes a suitable visual tool which, even nowadays, is consulted and shown due to its clarity and humour, and at the same time invites citizens -viewers- to participate in an architectural criticism exercise that, so far, had been reserved to architects.

Relevance:

30.00%

Publisher:

Abstract:

A computational system for the prediction of polymorphic loci directly and efficiently from human genomic sequence was developed and verified. A suite of programs, collectively called pompous (polymorphic marker prediction of ubiquitous simple sequences), detects tandem repeats ranging from dinucleotides up to 250 mers, scores them according to predicted level of polymorphism, and designs appropriate flanking primers for PCR amplification. This approach was validated on an approximately 750-kilobase region of human chromosome 3p21.3, involved in lung and breast carcinoma homozygous deletions. Target DNA from 36 paired B lymphoblastoid and lung cancer lines was amplified and allelotyped for 33 loci predicted by pompous to be variable in repeat size. We found that among those 36 predominantly Caucasian individuals 22 of the 33 (67%) predicted loci were polymorphic, with an average heterozygosity of 0.42. Allele loss in this region was found in 27/36 (75%) of the tumor lines using these markers. pompous provides the genetic researcher with an additional tool for the rapid and efficient identification of polymorphic markers, and through a World Wide Web site, investigators can use pompous to identify polymorphic markers for their research. A catalog of 13,261 potential polymorphic markers and associated primer sets has been created from the analysis of 141,779,504 base pairs of human genomic sequence in GenBank. These data are available on our Web site (pompous.swmed.edu) and will be updated periodically as GenBank is expanded and algorithm accuracy is improved.
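Two small fragments illustrate the kind of computation involved: a naive regex scan for short tandem repeats and the expected-heterozygosity statistic used to summarize marker informativeness. Neither reproduces the pompous scoring model or its primer design; the motif lengths, thresholds and example alleles are arbitrary.

```python
from collections import Counter
import re

# Naive tandem-repeat scan and expected heterozygosity. Thresholds and example data are
# arbitrary illustrations, not the pompous algorithm or its validation data.

def find_tandem_repeats(seq, min_unit=2, max_unit=6, min_copies=4):
    """Yield (start, unit, copies) for perfect tandem repeats found by a regex scan."""
    for unit_len in range(min_unit, max_unit + 1):
        pattern = re.compile(r"((?:[ACGT]{%d}))\1{%d,}" % (unit_len, min_copies - 1))
        for m in pattern.finditer(seq):
            unit = m.group(1)
            yield m.start(), unit, len(m.group(0)) // unit_len

def expected_heterozygosity(alleles):
    """H = 1 - sum(p_i^2) over observed allele frequencies."""
    counts = Counter(alleles)
    total = sum(counts.values())
    return 1.0 - sum((c / total) ** 2 for c in counts.values())

if __name__ == "__main__":
    seq = "GGATCACACACACACACATTGC" + "AGTAGTAGTAGTAGTAGT" + "CCGTA"
    for start, unit, copies in find_tandem_repeats(seq):
        print(f"repeat {unit} x{copies} at position {start}")
    # Genotyped allele sizes (in repeat units) for one hypothetical locus across samples:
    print(f"expected heterozygosity: {expected_heterozygosity([8, 8, 9, 10, 8, 9, 11, 10]):.2f}")
```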

Relevance:

30.00%

Publisher:

Abstract:

The thermodynamic stability and oligomerization status of the tumor suppressor p53 tetramerization domain have been studied experimentally and theoretically. A series of hydrophilic mutations at Met-340 and Leu-344 of human p53 were designed to disrupt the hydrophobic dimer–dimer interface of the tetrameric oligomerization domain of p53 (residues 325–355). Mean-field calculations of the free energy of the solvated mutants as a function of interdimer distance were compared with experimental data on the thermal stability and oligomeric state (tetramer, dimer, or equilibrium mixture of both) of each mutant. The calculations predicted a decreasing stability and oligomeric state for the following amino acids at residue 340: Met (tetramer) > Ser, Asp, His, Gln > Glu, Lys (dimer), whereas the experimental results showed the following order: Met (tetramer) > Ser > Gln > His, Lys > Asp, Glu (dimers). For residue 344, the calculated trend was Leu (tetramer) > Ala > Arg, Gln, Lys (dimer), and the experimental trend was Leu (tetramer) > Ala, Arg, Gln, Lys (dimer). The discrepancy for the lysine side chain at residue 340 is attributed to the dual nature of lysine, both hydrophobic and charged. The incorrect prediction of stability of the mutant with Asp at residue 340 is attributed to the fact that, within the mean-field approach, we use the wild-type backbone configuration for all mutants, but low melting temperatures suggest a softening of the α-helices at the dimer–dimer interface. Overall, this initial application of mean-field theory to a protein-solvent system is encouraging for the application of the theoretical model to more complex systems.
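One way to summarize how well the predicted ordering of mutant stabilities matches the experimental one is a rank correlation; the sketch below encodes the two residue-340 orderings quoted above as tied ranks (our own reading of the text, not data from the paper) and computes Kendall's tau-b with SciPy.

```python
from scipy.stats import kendalltau

# Rank agreement between the mean-field and experimental orderings of residue-340
# mutants. The integer encoding of the two orderings (ties share a rank) is our own
# reading of the abstract, not values reported in the paper.

mutants = ["Met", "Ser", "Asp", "His", "Gln", "Glu", "Lys"]
predicted_rank = [1, 2, 2, 2, 2, 3, 3]       # Met > {Ser, Asp, His, Gln} > {Glu, Lys}
experimental_rank = [1, 2, 5, 4, 3, 5, 4]    # Met > Ser > Gln > {His, Lys} > {Asp, Glu}

tau, p_value = kendalltau(predicted_rank, experimental_rank)
print(f"Kendall tau-b between predicted and experimental order: {tau:.2f} (p = {p_value:.2f})")
```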

Relevance:

30.00%

Publisher:

Abstract:

Independent transgene insertions differ in expression based on their location in the genome; these position effects are of interest because they reflect the influence of genome organization on gene regulation. Position effects also represent potentially insurmountable obstacles to the rigorous functional comparison of homologous genes from different species because (i) quantitative variation in expression of each gene across genomic positions (generalized position effects, or GPEs) may overwhelm differences between the genes of interest, or (ii) divergent genes may be differentially sensitive to position effects, reflecting unique interactions between each gene and its genomic milieu (lineage-specific position effects, or LSPEs). We have investigated both types of position-effect variation by applying our method of transgene coplacement, which allows comparisons of transgenes in the same position in the genome of Drosophila melanogaster. Here we report an experimental test for LSPE in Drosophila. The alcohol dehydrogenase (Adh) genes of D. melanogaster and Drosophila affinidisjuncta differ in both tissue distribution and amounts of ADH activity. Despite this striking regulatory divergence, we found a very high correlation in overall ADH activity between the genes of the two species when placed in the same genomic position as assayed in otherwise Adh-null adults and larvae. These results argue against the influence of LSPE for these sequences, although the effects of GPE are significant. Our new findings validate the coplacement approach and show that it greatly magnifies the power to detect differences in expression between transgenes. Transgene coplacement thus dramatically extends the range of functional and evolutionary questions that can be addressed by transgenic technology.

Relevance:

30.00%

Publisher:

Abstract:

We review the study of flower color polymorphisms in the morning glory as a model for the analysis of adaptation. The pathway involved in the determination of flower color phenotype is traced from the molecular and genetic levels to the phenotypic level. Many of the genes that determine the enzymatic components of flavonoid biosynthesis are redundant, but, despite this complexity, it is possible to associate discrete floral phenotypes with individual genes. An important finding is that almost all of the mutations that determine phenotypic differences are the result of transposon insertions. Thus, the flower color diversity seized on by early human domesticators of this plant is a consequence of the rich variety of mobile elements that reside in the morning glory genome. We then consider a long history of research aimed at uncovering the ecological fate of these various flower phenotypes in the southeastern U.S. A large body of work has shown that insect pollinators discriminate against white phenotypes when white flowers are rare in populations. Because the plant is self-compatible, pollinator bias causes an increase in self-fertilization in white maternal plants, which should lead to an increase in the frequency of white genes, according to modifier gene theory. Studies of geographical distributions indicate other, as yet undiscovered, disadvantages associated with the white phenotype. The ultimate goal of connecting ecology to molecular genetics through the medium of phenotype is yet to be attained, but this approach may represent a model for analyzing the translation between these two levels of biological organization.

Relevance:

30.00%

Publisher:

Abstract:

Blocking CD28-B7 T-cell costimulation by systemic administration of CTLA4Ig, a fusion protein which binds B7 molecules on the surface of antigen-presenting cells, prevents rejection and induces tolerance in experimental acute allograft rejection models. We tested the effect of CTLA4Ig therapy on the process of chronic renal allograft rejection using an established experimental transplantation model. F344 kidneys were transplanted orthotopically into bilaterally nephrectomized LEW recipients. Control animals received low dose cyclosporine for 10 days posttransplantation. Administration of a single injection of CTLA4Ig on day 2 posttransplant alone or in addition to the low dose cyclosporine protocol resulted in improvement of long-term graft survival as compared with controls. More importantly, control recipients which received cyclosporine only developed progressive proteinuria by 8-12 weeks, and morphological evidence of chronic rejection by 16-24 weeks, including widespread transplant arteriosclerosis and focal and segmental glomerulosclerosis, while animals treated with CTLA4Ig alone or in addition to cyclosporine did not. Competitive reverse transcriptase-PCR and immunohistological analysis of allografts at 8, 16, and 24 weeks showed attenuation of lymphocyte and macrophage infiltration and activation in the CTLA4Ig-treated animals, as compared with cyclosporine-alone treated controls. These data confirm that early blockade of the CD28-B7 T-cell costimulatory pathway prevents later development and evolution of chronic renal allograft rejection. Our results indicate that T-cell recognition of alloantigen is a central event in initiating the process of chronic rejection, and that strategies targeted at blocking T-cell costimulation may prove to be a valuable clinical approach to preventing development of the process.

Relevance:

30.00%

Publisher:

Abstract:

Rapid progress in effective methods to image brain functions has revolutionized neuroscience. It is now possible to study noninvasively in humans neural processes that were previously only accessible in experimental animals and in brain-injured patients. In this endeavor, positron emission tomography has been the leader, but the superconducting quantum interference device-based magnetoencephalography (MEG) is gaining a firm role, too. With the advent of instruments covering the whole scalp, MEG, typically with 5-mm spatial and 1-ms temporal resolution, allows neuroscientists to track cortical functions accurately in time and space. We present five representative examples of recent MEG studies in our laboratory that demonstrate the usefulness of whole-head magnetoencephalography in investigations of spatiotemporal dynamics of cortical signal processing.