864 results for point of interest


Relevance:

100.00%

Publisher:

Abstract:

In the context of the present conference paper, culverts are defined as openings or conduits passing through an embankment, usually for the purpose of conveying water or providing safe pedestrian and animal crossings under rail infrastructure. The clear opening of culverts may reach values of up to 12 m; however, values around 3 m are encountered much more frequently. Depending on the topography, the number of culverts is about ten times that of bridges. In spite of this, their dynamic behaviour has received far less attention than that of bridges. The fundamental frequency of culverts is considerably higher than that of bridges, even in the case of short-span bridges. As the operational speed of modern high-speed passenger rail systems rises, higher frequencies are excited, and thus more energy is encountered in the frequency bands where the fundamental frequency of box culverts is located. Many research efforts have been devoted to ballast instability due to bridge resonance, since it was first observed when high-speed trains were introduced on the Paris-Lyon rail line. To prevent this phenomenon from occurring, design codes establish a limit value for the vertical deck acceleration. Obviously, some sort of numerical model is needed to estimate this acceleration level, and at that point things get quite complicated. Not only acceleration but also displacement values are of interest, e.g. to estimate the impact factor. According to design manuals, the structural design should consider the depth of cover, trench width and condition, bedding type, backfill material, and compaction. The same applies to the numerical model; the question, however, is: what type of model is appropriate for this job? A 3D model including the embankment and a significant part of the soil underneath the culvert is computationally very expensive and hard to justify given the associated costs.
Consequently, there is a clear need for simplified models and design rules in order to achieve reasonable costs. This paper describes the results obtained from a 2D finite element model that has been calibrated by means of a 3D model and experimental data obtained at culverts belonging to the high-speed railway line linking the towns of Segovia and Valladolid in Spain.
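The resonance mechanism sketched above can be illustrated numerically. The following is a minimal sketch, not the paper's calibrated model: all values (span, bending stiffness, mass per unit length, axle spacing) are assumed for illustration, using the classical simply supported beam formula and the usual resonance-speed relation v = f1*d/n:

```python
import math

# Hedged illustration: why a short, stiff box-culvert-like span has a
# fundamental frequency far above typical bridge frequencies, and which
# train speeds would (in theory) excite resonance. Values are assumptions.
def fundamental_frequency(L, EI, m):
    """First bending frequency [Hz] of a simply supported span.
    L: span [m], EI: bending stiffness [N*m^2], m: mass per length [kg/m]."""
    return (math.pi / (2.0 * L**2)) * math.sqrt(EI / m)

def resonance_speeds(f1, d, n_max=4):
    """Speeds [m/s] for which regularly spaced loads excite resonance:
    v = f1 * d / n, with d the characteristic axle/coach spacing [m]."""
    return [f1 * d / n for n in range(1, n_max + 1)]

f1 = fundamental_frequency(L=3.0, EI=2.0e9, m=8000.0)  # culvert-like span
speeds = resonance_speeds(f1, d=26.0)                  # coach length ~26 m
```

With these assumed values the fundamental frequency lands near 87 Hz, an order of magnitude above short-span bridges, so the first resonance speeds lie far beyond operational train speeds; this is the frequency-band argument made in the abstract.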

Relevance:

100.00%

Publisher:

Abstract:

A proper assessment of the safety margins of a nuclear facility, for example a nuclear power plant, takes into account all the uncertainties affecting the calculations of its design, operation and accident response. One source of uncertainty is nuclear data, which affect neutronics, fuel burn-up and material activation calculations. These calculations allow the evaluation of the response functions essential for correct performance during operation, and also during accidents. Examples of such responses are the neutron multiplication factor or the decay heat after reactor trip. It is therefore necessary to evaluate the impact of these uncertainties on such calculations. In order to perform uncertainty propagation calculations, methodologies capable of evaluating the impact of nuclear data uncertainties must be implemented. It is also necessary to know the available uncertainty data in order to be able to handle them. At present, great efforts are being invested in improving the capability to analyse, handle and produce uncertainty data, especially for isotopes that are important in advanced reactors. In turn, new programs and codes are being developed and implemented to use these data and analyse their impact. All these points are among the objectives of the European ANDES project, which has provided the framework for this doctoral thesis. Therefore, a review of the state of the art of nuclear data and their uncertainties has first been carried out, focusing on the three types of data: decay data, fission yields and cross sections. A review of the state of the art of the methodologies for propagating the uncertainties of these nuclear data has also been performed.
Within the Nuclear Engineering Department (DIN), a methodology for propagating uncertainties in isotopic depletion calculations, the Hybrid Method, was proposed. This methodology has been taken as the starting point of this thesis; it has been implemented and developed, and its capabilities have been extended. Its advantages, drawbacks and limitations have been analysed. The Hybrid Method is used in conjunction with the depletion code ACAB and is based on Monte Carlo sampling of the nuclear data with uncertainties. Within this methodology, different approaches are presented according to the energy-group structure of the cross sections: one-group, one-group with correlated sampling, and multi-group. Different sequences have been developed to use different nuclear data libraries stored in different formats: ENDF-6 (for the evaluated libraries), COVERX (for the SCALE multi-group libraries) and EAF (for the activation libraries). The review of the state of the art of fission yield data identified a lack of uncertainty information, specifically of complete covariance matrices. Furthermore, given the renewed interest of the international community, expressed through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), devoted to assessing needs for nuclear data improvement, a review of the methodologies for generating covariance data has been carried out. The Bayesian/GLS updating approach has been selected for implementation, thereby providing an answer to this lack of complete covariance matrices for fission yields.
Once the Hybrid Method had been implemented, developed and extended, together with the capability to generate complete covariance matrices for fission yields, different nuclear applications were studied. First, the decay heat after a fission pulse is studied, owing to its importance for any event following reactor shutdown or trip. It is also a clean exercise for showing the importance of decay data and fission yield uncertainties together with the new complete covariance matrices. Two advanced-reactor fuel cycles have been studied: that of the European Facility for Industrial Transmutation (EFIT) and that of the European Sodium Fast Reactor (ESFR), in which the impact of nuclear data uncertainties on the isotopic composition, decay heat and radiotoxicity has been analysed. Different nuclear data libraries have been used in these studies, thereby comparing the impact of their uncertainties. Through these studies, the different approaches of the Hybrid Method have also been compared with other methodologies for propagating nuclear data uncertainties: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons demonstrate the advantages of the Hybrid Method, while also revealing its limitations and range of application. ABSTRACT For an adequate assessment of safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations can predict critical response functions during operation and in the event of an accident, such as decay heat and the neutron multiplication factor.
Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. It is also necessary to understand the current status of nuclear data and their uncertainties in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using these data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for the development of this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on the three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed. The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of variables with uncertainties. Different approaches are presented depending on the cross-section energy structure: one-group, one-group with correlated sampling, and multi-group. Differences and applicability criteria are presented.
Sequences have been developed for using different nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries) and COVERX (for the SCALE multi-group libraries), as well as the EAF format (for activation libraries). A revision of the state of the art of fission yield data shows inconsistencies in uncertainty data, specifically with regard to complete covariance matrices. Furthermore, the international community has expressed a renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), which is dedicated to assessing needs for improved nuclear data. This prompted a review of the state of the art of methodologies for generating covariance data for fission yields. A Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to answer this need. Once the Hybrid Method had been implemented, developed and extended, along with the fission yield covariance generation capability, different applications are studied. The fission pulse decay heat problem is tackled first, because of its importance during events after shutdown and because it is a clean exercise for showing the impact and importance of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and response function uncertainties such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparison with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer.
These comparisons reveal the advantages, limitations and the range of application of the Hybrid Method.
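The Monte Carlo sampling idea underlying the Hybrid Method can be sketched in a few lines. The toy example below is an assumption-laden illustration, not the ACAB/Hybrid Method implementation: two nuclide yields with an invented 2x2 covariance matrix are sampled (via a hand-rolled Cholesky factor) and propagated through a placeholder linear response standing in for decay heat:

```python
import math
import random

# Toy sketch of Monte Carlo propagation of correlated nuclear-data
# uncertainties. All numbers are illustrative, not evaluated nuclear data.
random.seed(0)

mean = [0.06, 0.03]                      # nominal yields of two nuclides
cov = [[1e-6, -4e-7], [-4e-7, 4e-7]]     # toy covariance (anti-correlated)

# Cholesky factor of the 2x2 covariance: cov = L @ L.T
l11 = math.sqrt(cov[0][0])
l21 = cov[1][0] / l11
l22 = math.sqrt(cov[1][1] - l21**2)

def sample_yields():
    """Draw one correlated sample of the two yields."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return mean[0] + l11 * z1, mean[1] + l21 * z1 + l22 * z2

def response(y1, y2):
    # Placeholder linear response standing in for decay heat; the
    # weights 100 and 250 are arbitrary illustration values.
    return 100.0 * y1 + 250.0 * y2

samples = [response(*sample_yields()) for _ in range(20000)]
mu = sum(samples) / len(samples)
sigma = math.sqrt(sum((s - mu)**2 for s in samples) / (len(samples) - 1))
```

The sample mean converges to the nominal response and the sample standard deviation to the linear-propagation result sqrt(a^T C a); distributions of response functions of this kind are what the Hybrid Method, TMC and NUDUNA compare.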

Relevance:

100.00%

Publisher:

Abstract:

Interest in modelling human actions on structures has been recurrent since the first accidents on suspension bridges in the nineteenth century, such as Broughton (1831) in the U.K. or Angers (1850) in France. Stadiums and gymnasiums are other types of structures where human-induced vibration is very important. Particular phenomena appear in these structures, such as person-structure interaction (lock-in), person-to-person synchronization, and the influence of the mass and damping of the people on the behaviour of the structure. This work focuses on the latter topic. The dynamic characteristics of a structure can be changed by the presence of people on it. In order to evaluate these property modifications, several tests have been carried out on a structure designed to be a gymnasium. For the tests, an electro-dynamic shaker was installed at a fixed point of the gym slab and different groups of people were located around the shaker. In each test the number of people was changed, as well as their posture (standing or sitting). Test data were analysed and processed to verify modifications in the behaviour of the structure.
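The mass-and-damping influence studied here is often idealized as a crowd-structure coupled system. The sketch below is a generic two-degree-of-freedom illustration, not the model used in this work; all values (slab modal mass, empty-slab frequency, crowd mass, human body-mode frequency) are assumptions. It shows how attaching an occupant "spring-mass" splits the empty-structure frequency into two coupled modes:

```python
import math

# Hedged 2-DOF sketch: crowd modelled as a spring-mass attached to an
# SDOF structure. All parameter values below are illustrative assumptions.
M = 40000.0                                 # slab modal mass [kg]
K = M * (2 * math.pi * 6.0)**2              # stiffness for a ~6 Hz empty slab
m_h = 1500.0                                # crowd mass, ~20 standing people
f_h = 5.5                                   # assumed human body mode [Hz]
k_h = m_h * (2 * math.pi * f_h)**2

# Undamped coupled frequencies from
# det([[K + k_h - w2*M, -k_h], [-k_h, k_h - w2*m_h]]) = 0,
# i.e. a*w2^2 + b*w2 + c = 0 in w2 = omega^2:
a = M * m_h
b = -(M * k_h + m_h * (K + k_h))
c = K * k_h
disc = math.sqrt(b * b - 4 * a * c)
w2_lo = (-b - disc) / (2 * a)
w2_hi = (-b + disc) / (2 * a)
f_lo = math.sqrt(w2_lo) / (2 * math.pi)     # coupled mode below f_h
f_hi = math.sqrt(w2_hi) / (2 * math.pi)     # coupled mode above empty-slab f
```

The two coupled frequencies bracket the human and empty-structure frequencies, which is the qualitative shift the tests described above set out to measure.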

Relevance:

100.00%

Publisher:

Abstract:

Interest in modelling human actions on structures has been recurrent since the first accidents on suspension bridges in the nineteenth century, such as Broughton (1831) in the U.K. or Angers (1850) in France. Stadiums and gymnasiums are other types of structures where human-induced vibration is very important. Particular phenomena appear in these structures, such as person-structure interaction (lock-in), person-to-person synchronization, and the influence of the mass and damping of the people on the structural behaviour. This paper focuses on the latter topic. In order to evaluate these property modifications, several tests have been carried out on a stand-alone building. For the tests, an electro-dynamic shaker was installed at a fixed point of the gym slab and different groups of people were located around the shaker. The dynamic characteristics of the structure without people inside have been calculated by two methods: using a three-dimensional finite element model of the building, and by operational modal analysis. These numerical and experimental values are the reference values used to evaluate the modifications in the dynamic properties of the structure.
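Shaker tests like the one described are commonly reduced to modal frequency and damping estimates from the measured frequency response function. As a hedged illustration (synthetic SDOF data, not the building's measurements), the half-power bandwidth method can be sketched as:

```python
import math

# Half-power (-3 dB) estimate of modal frequency and damping from an FRF.
# The FRF here is synthetic: an SDOF mode at 6 Hz with 2% damping.
fn, zeta = 6.0, 0.02
freqs = [4.0 + 0.001 * i for i in range(4001)]     # 4-8 Hz sweep

def frf_mag(f):
    """Dimensionless displacement FRF magnitude of an SDOF oscillator."""
    r = f / fn
    return 1.0 / math.sqrt((1 - r * r)**2 + (2 * zeta * r)**2)

mags = [frf_mag(f) for f in freqs]
i_pk = max(range(len(mags)), key=mags.__getitem__)  # resonance peak index
half = mags[i_pk] / math.sqrt(2.0)                  # half-power level

f1 = next(f for f, m in zip(freqs, mags) if m >= half)               # lower
f2 = next(f for f, m in reversed(list(zip(freqs, mags))) if m >= half)  # upper
zeta_est = (f2 - f1) / (2.0 * freqs[i_pk])          # damping ratio estimate
```

Running the same reduction on FRFs measured with and without occupants would quantify the frequency and damping shifts the paper evaluates.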

Relevance:

100.00%

Publisher:

Abstract:

The increasing use of very light structures in aerospace applications is giving rise to the need to take into account the effects of the surrounding medium on the motion of a structure (for instance, in modal testing of solar panels or antennae), as is usually done for the motion of bodies submerged in water in marine applications. New methods are under development that aim to determine rigid-body properties (the centre-of-mass position and inertia properties) from the results of oscillation tests (at low frequencies during modal testing, by exciting the rigid-body modes only) by using the equations of rigid-body dynamics. As shown in this paper, the effect of the surrounding medium significantly modifies the oscillation dynamics in the case of light structures, and this effect should therefore be taken into account in the development of the above-mentioned methods. The aim of the paper is to show that, if a central point exists for the aerodynamic forces acting on the body, the motion equations for small-amplitude rotational and translational oscillations can be expressed in a form that generalizes the motion equations for a body in vacuum, thus making it possible to obtain a physical idea of the motion and the aerodynamic effects while significantly simplifying the calculation of the solutions and the interpretation of the results. In the formulation developed here, the translational oscillations and the rotational motion around the centre of mass are decoupled, as is the case for rigid-body motion in vacuum, whereas in the classical added-mass formulation the six motion equations are coupled. The unsteady, small-amplitude motion of a rigid body submerged in an ideal, incompressible fluid is also considered, in order to define the conditions for the existence of the central point in the case of a three-dimensional body. The results presented here are also of interest in marine applications.
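The frequency shift that motivates the paper can be sketched with the classical added-mass relation. This is an illustrative example, not the paper's formulation: the potential-flow added-mass coefficient 0.5 for a sphere is classical, while the body mass, volume and vacuum frequency are assumed values:

```python
import math

# Hedged sketch: the surrounding medium adds an effective mass, lowering
# the oscillation frequency of a light structure on an elastic support.
def freq_in_fluid(f_vacuum, m_body, rho_fluid, volume, c_added=0.5):
    """f = f_vac * sqrt(m / (m + m_a)), with added mass m_a = c_a*rho*V.
    c_added=0.5 is the classical potential-flow value for a sphere."""
    m_added = c_added * rho_fluid * volume
    return f_vacuum * math.sqrt(m_body / (m_body + m_added))

# Light body oscillating in air: small but measurable downward shift
f_air = freq_in_fluid(f_vacuum=2.00, m_body=5.0, rho_fluid=1.225, volume=0.4)
# The same body in water shifts far more (rho ~ 1000 kg/m^3), which is why
# the effect has long been standard practice in marine applications.
f_water = freq_in_fluid(f_vacuum=2.00, m_body=5.0, rho_fluid=1000.0, volume=0.4)
```

For a very light structure the air term alone already biases rigid-body property identification, which is the paper's point about extending marine-style added-mass corrections to aerospace modal testing.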

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on the study, description and analysis of the book published by the British architect George Edmund Street in 1865 under the title Some Account of Gothic Architecture in Spain. The book presented one of the most important collections of drawings on Spanish Gothic architecture made in the nineteenth century, and was therefore a first reference for its study until well into the twentieth century. The volume included 107 engravings of various types of buildings, with perspectives and details, and 25 plates with 45 ground plans of churches and cloisters, many of them published for the first time. Thanks to several summer campaigns through the northern half of the country, Spanish medieval architecture, almost unexplored from an academic point of view, was finally brought to light. This fact leads to an intriguing question that is at the origin of this research: how could Street, in a few trips, lay the foundations of the history of Spanish Gothic that Spanish scholars had until then been unable to define? This thesis begins by describing Street's work in its cultural context, with a brief review of his biography and of his professional and theoretical positions. His relationship with the most representative figures involved in the study of Gothic architecture, such as Robert Willis, William Whewell, Augustus Pugin, or George G. Scott, is also discussed. Special attention has been paid to explaining his relevant role in the Gothic Revival, in order to understand the significance of his interest in continental Gothic architecture. These preliminary chapters are followed by a review of the role of drawing as a tool for architectural travellers on their routes in search of Gothic architecture. The influence of the Royal Academy and its academicians (among whom was Street) and his academic training are also addressed.
Finally, the thesis turns to the study of the architectural drawings Street made during his continental architectural journeys, followed by a detailed description of his drawings of Spain, analysing his method, his technique, and the new features they contributed, which were a novelty in the Spanish context. Several comparative studies of the Spanish drawings are also carried out, thanks to an exhaustive compilation of Street's sketches and original drawings, most of which are preserved in the RIBA archives, collating them with their final versions, with contemporary drawings of the same buildings by other authors, and with recent photographs. The thesis makes clear why and how Street, thanks to his theoretical background and drawing skills, was able to accomplish something that had gone unnoticed by the Spanish scholars of the time (construction, history of styles, identification of building periods), which allowed him to find the proper place of Spanish Gothic architecture in the history and on the map of European Gothic architecture. ABSTRACT This thesis focuses on the study, description and analysis of the book published by the British architect George Edmund Street in 1865, under the title Some Account of Gothic Architecture in Spain. The book displayed one of the most significant collections of drawings on Spanish Gothic architecture made in the nineteenth century, and was therefore a first reference for its study until well into the twentieth century. The book included 107 engravings, surveying various types of buildings with perspectives and details, and 25 sheets containing 45 ground plans of churches and cloisters, many of them new and published for the first time. Thanks to several summer campaigns in the northern half of the country, the almost unexplored (from a scholarly point of view) medieval Spanish architecture was eventually revealed.
This fact leads to an intriguing question that is at the origin of this research: how could Street, in a few trips, lay the foundations of the history of Spanish Gothic that until then Spanish scholars had not been able to define? This thesis begins by placing Street's work in its cultural context. A brief review of his biography and of his professional and theoretical positions has been deemed necessary. His debts and relationship to the most representative people involved in the study of Gothic architecture, like Robert Willis, William Whewell, Augustus Pugin, or George G. Scott, are also discussed. Special attention has been paid, taking into account his relevant role in the Gothic Revival, to understanding the significance of his interest in continental Gothic architecture. These preliminary chapters are followed by a review of the role of drawing as a tool for architectural travellers on their tours in search of Gothic architecture. The influence of the Royal Academy and its academicians (among whom was Street) and his educational background are tackled here. Eventually this thesis enters into the study of the architectural drawings Street made during his continental architectural journeys, followed by a detailed description and analysis of the Spanish ones: his methods, his technique, and the new features that were a novelty in the Spanish context are explored. Some comparative studies are also carried out in this thesis, thanks to a prior exhaustive gathering of Street's sketches and original drawings, most of which are preserved in the RIBA archives. Their final versions, drawings of the same buildings by other contemporary draughtsmen, and pictures of their current state are compared with them.
This thesis makes clear why and how Street, thanks to his theoretical background and portraying skills, could accomplish what had passed unnoticed by contemporary Spanish scholars (construction, genealogy of forms, dating of periods), allowing him to find the proper place of Spanish architecture in the history and on the map of European Gothic architecture.

Relevance:

100.00%

Publisher:

Abstract:

In the case of Rem Koolhaas, his mirror does not reflect a single image but multiple ones; it is a polyhedral prism. His mirror gives us back the media Rem, the intellectual, the conceptualizer, the builder, the analyst, the journalist, the actor... This research sets its sights on Rem the COMMUNICATOR. "Rem on both sides of the mirror" is framed within a research project on architectural media, their reflection in architectural production and vice versa. The aim is to discern whether communication and architectural production collide and converge in the case of great communicators such as Rem Koolhaas, and whether the message and the transmitting medium acquire the same qualities. Focusing on the figure of Rem Koolhaas, the thesis addresses the evolution of his communicative facet and the successive transformations in the field of architectural communication, in parallel with his conceptual evolution throughout his career. The research therefore does not focus so much on his theoretical component or on OMA's architectural practice, but on the exhibition of his production to the world, especially through his essays and books. "Delirious New York" and "SMLXL" are a reflection of the conceptual moment in which they were written, and contain a great deal of information about the graphic references that inevitably influenced their composition. In particular, the appearance of "SMLXL" was a shake-up for the world of architectural communication, because it put the focus on the importance of leaving behind a linear, unifocal narrative discourse in order to approach communication by juggling multiple variables and approaches, in a process similar to the development of an architectural project. It presents a highly innovative design and an extremely careful edition, which attends to parameters far more ambitious than the merely narrative.
It delves into the need for a global theme, proposing the most appropriate approach for each of the projects it describes, and conveying to the reader a perception beyond the strictly visual, closer to the sensory. Moreover, its enormous international impact and the great interest it aroused (not only among architects, but also among graphic designers, advertisers, people from all kinds of artistic tendencies, and the general public) triggered the globalization of the architectural publication phenomenon and highlighted the importance of communication as a discipline in itself within architectural production in the current era. Despite the importance of "SMLXL" at all levels, this thesis argues that this communicative experience really culminates in "Content", which incorporates new parameters related to the conceptual fusion of container and content. It is in this publication that the object of communication and its expression become a single element, governed by similar laws. In this case, the fundamental law is the application, to its ultimate consequences, of the "culture of congestion", in both the message and the medium, generating what we have agreed to call "congestive communication". This conception necessarily materializes as an ephemeral, disposable, almost virtual product, because it responds to the conditions of a very concrete and specific moment, and outside that context it loses its meaning. The "culture of congestion" began to emerge in Koolhaas's approaches at the Architectural Association School of Architecture in London, under the tutelage of Elia Zenghelis.
It was later developed in his retroactive manifesto on Manhattan, "Delirious New York", where he declares open war on the urbanism of the modern movement and asserts that the truly contemporary city is one that is the result of unplanned development, hyperdense and made possible by the technological advances of its era. It finally began to materialize in Diploma Unit 9 at the AA, where he started teaching in 1975, leaving an indelible mark on the subsequent generations of architects who passed through that unit. Rem Koolhaas is above all an intellectual, and for this reason the whole theoretical construct on the metropolis began to be reflected in his work through OMA from the very beginning of his production. Broadly speaking, his career is marked by two fundamental historical milestones that define three distinct stages in his production. In his early professional years, Koolhaas remained fascinated by the urban metropolis and by the application of the paranoiac-critical method to his architectural production. He is a profoundly surrealist architect. He understands this method as a strategy of knowledge and of approaching the world around him: "letting the unconscious out, but supporting it with the crutches of rationality". What really interests him, however, is its application at the large scale, "Bigness", and for this reason he took part in very ambitious projects from which concepts emerged that, whether awarded or not, have left an ideological mark on the course of architecture. Among these projects, his proposal for the Parc de la Villette and the Très Grande Bibliothèque in Paris stand out. His projects of this period distil a great conceptual charge, resulting in surprising interiors but a sober, even "povera", exterior appearance, owing to the use of ephemeral materials that were unusual in macro-architecture up to that time.
Suddenly, in 1997, the so-called "Bilbao Effect" exploded, at the hands of Frank Gehry (1). The Guggenheim Museum Bilbao, with its spectacular, striking, impossible forms, made an impact on the world. The era of the "Architecture of Spectacle" was born: the transformation of the city through ICONS acting as nodes of attraction and concentration around which the economic, cultural and socio-political activity of the city is supposedly revitalized, as if a single gesture could regenerate all the internal tissues of the metropolis. Rem Koolhaas quickly understood that the approach to the city had changed and, above all, so had the market. In the world of globalization, the only way to materialize "Bigness" is to enclose his intellectual exercises in striking, beautiful, iconic, spectacular forms. Koolhaas found his personal brand in the "Stealth" aesthetic, derived from the faceted combat aircraft of the 1980s designed to evade radar. From this period come projects such as the Casa da Música in Porto and the Seattle Public Library; both buildings are faceted icons of striking beauty that leave an indelible mark on the city and, like the Guggenheim, produce a certain effect of recovery and revitalization in their surroundings, at least temporarily. In any case, Koolhaas never abandoned purely theoretical exercises, but split his activity in two: OMA produces what is intended to be built and is governed by the parameters of the global market, while AMO, the other side of Rem's mirror, applies architectural thinking to unexplored fields, free from dependence on external agents, allowing itself to be a purely experimental laboratory.
Against this background came 11 September 2001, and the attack on the Twin Towers in New York had devastating effects at all levels, signifying, in a surprisingly short period of time, a change in the world order. Rem Koolhaas then made a 180-degree turn, directing his gaze towards China, where he understood that his contributions would have a more direct social benefit than in the West. (2) To present his new change of course and the creation of the AMO "think tank" to the world, he staged a large exhibition in Berlin under the title "Content", an experience parallel to the publication of the book of the same title, which was initially conceived as the exhibition catalogue but was always regarded internally as the most significant document produced by the office since "SMLXL". In many respects, however, it is its opposite: a magazine-format, soft-cover publication with very thin pages, a "supermarket leaflet" format and hyperdense content. It is an ephemeral, fleeting, light, cheap, "disposable" experiment. In fact, it is out of stock and no longer published. Rem Koolhaas would probably disapprove of a piece of research that put the spotlight on it, because ten years after its publication he surely considers it outdated. Nevertheless, it shows with crystal clarity the conceptual and vital state of OMA at the time of its publication, and it also represents a true milestone in architectural communication, a point of no return, the maximum exponent of what we have called "congestive communication". This thesis argues that "Content" contains the essence of Rem Koolhaas's greatest contribution to the world of architecture: the profound and definitive transformation of architectural communication through the convergence of the conceptual state and its transmission. His architectural and conceptual legacy has indelibly marked all subsequent generations.
His essays, his theories, his projects and his buildings already belong to the history of architecture, without any doubt. But it is his revision of the concept of communication in architecture that has had, and will have, an immediate reflection in future generations, not only in their communication but also in their architecture, through a bijective exchange. A future line of inquiry would be to determine what comes after "Content", after maximum hyperdensity, after the culture of visual congestion: what Koolhaas proposes, and what will also be proposed in the world of architectural communication. To that end, we will study in depth his latest communication-related projects, such as his proposal for the 2014 Venice Architecture Biennale, his intensive research on "Metabolism" in "Project Japan: Metabolism Talks...", and the direction of his latest territorial approaches. In recent times Rem Koolhaas speaks of "Preservation", "Sobriety", "Essentialism", "Performance"... The intellectual author of the culture of congestion now speaks of "low density", as could not be otherwise on the other side of the mirror. In short, the colour white as the sum of all colours, all the wavelengths of the visible spectrum received at once. ABSTRACT When talking about Rem Koolhaas, the mirror does not reflect only one but numerous images: it is nothing but a polyhedral prism. His mirror gives us the image of Rem the media celebrity, the intellectual, the conceptualizer, the builder, the analyst, the journalist, the actor... This research sets the spotlight on Rem the COMMUNICATOR. "Rem on both sides of the mirror" belongs to a research project on architectural media, its influence on architectural production and vice versa.
It aims to discern whether communication and architectural production collide and converge in the case of great communicators such as Rem Koolhaas, and whether the message and the transmission media acquire the same features. Focusing on the figure of Rem Koolhaas, this thesis addresses the evolution of his communicative facet and the successive transformations in the field of architectural communication, parallel to the conceptual evolution he underwent throughout his career. Therefore, this research is not so much focused on his theoretical component or on OMA's architectural practice as on the exhibition of his production to the world, especially through his essays and books. "Delirious New York" and "SMLXL" hold up a mirror to the conceptual moment they are part of, and contain a great deal of information about the graphic references that have inevitably influenced his work. In particular, the launch of "SMLXL" was a salutary shock for the world of architectural communication, since it set the spotlight on the importance of leaving a linear, unifocal narrative behind in order to approach communication through multiple variables and viewpoints, based on a process similar to the development of an architectural project. It offers a very innovative design and extremely careful editing, governed by parameters far more ambitious than the merely narrative. It explores the need for a global subject and suggests the most appropriate approach for each of the projects described, giving the reader a closer, sensory insight that goes beyond the strictly visual.
In addition, its huge international impact and the great interest it aroused, not only among architects but also among graphic designers, publishers, people from all kinds of artistic trends and the general public, led to the globalisation of the architectural-publication phenomenon and highlighted the importance of communication as a discipline in itself within the architectural production of the age at hand. Despite the importance of "SMLXL" at all levels, this thesis suggests that the communication experience really culminates in "Content", for it includes new conceptual parameters associated with the container-content fusion. It is in this book that the purpose of communication and its expression become a single element, ruled by similar laws. In this particular case, the fundamental law is to take the "culture of congestion" to its extreme consequences in both the message and the media, leading to what we have agreed to call "congestive communication". This concept leads to its inevitable materialisation as an ephemeral, disposable, almost virtual product, because it meets the conditions of a very concrete and specific time, and outside that context it loses its significance. The "culture of congestion" emerged in Koolhaas' approaches under the guidance of Elia Zenghelis at the Architectural Association School of Architecture in London. Subsequently, his retroactive manifesto on Manhattan, "Delirious New York", developed it, waging an all-out war against Modern Movement urbanism and maintaining that the truly contemporary cities are the hyperdense ones that rise as a result of unplanned development, thanks to the technological advances typical of their time. Finally, it began to materialise in Diploma Unit 9 at the AA, where he started teaching in 1975, leaving an indelible mark on the subsequent generations of architects who passed through that unit.
First and foremost, Rem Koolhaas is an intellectual and, therefore, all his theoretical constructs on the metropolis began to be reflected in his work through OMA from the beginning of his production. Broadly speaking, we can say that his career is shaped by two essential historic events, which define three different stages in his production. In the early years of his career, Koolhaas was still fascinated by the urban metropolis and by applying the paranoiac-critical method to his architectural production. He was then a deeply surrealist architect. He understood this method as a knowledge strategy and an approach to the world around him: "let the subconscious out but hold it with the crutches of reasonableness". However, he was really interested in its application on a grand scale, the "Bigness", and he therefore took part in ambitious projects that led to an accumulation of concepts which, beyond winning prizes, left an ideological imprint on the evolution of architecture. These projects included his proposal for the Parc de la Villette and the Très Grande Bibliothèque in Paris. The projects he carried out during this period showed a great conceptual background, which evolved into surprising interiors behind a sober, even "povera", exterior appearance, thanks to the use of ephemeral materials that had until then been atypical in the field of macro-architecture. Suddenly, in 1997, the so-called "Bilbao effect" boomed thanks to Frank Gehry (1). The Guggenheim Museum Bilbao amazed the world with its spectacular nature and its striking, impossible shapes. It was the beginning of the era of "the architecture of spectacle": the transformation of the city through ICONS that would act as nodes of attraction and gathering, around which the economic, cultural and socio-political activity of the city was supposed to be revitalized, as if through a single gesture all the internal tissues of the city could be rebuilt.
Rem Koolhaas quickly realized that the approach to the city, and especially to the global market, had changed. In the world of globalisation, the only way to materialise such "Bigness" was to clothe his intellectual exercises in striking, beautiful, iconic and spectacular shapes. Koolhaas found his personal brand in the Stealth aesthetic, derived from the 1980s American combat aircraft whose faceted shapes were designed to evade radar. Projects such as the Casa da Música in Porto and the Seattle Library date from this period; both buildings are faceted icons of striking beauty that left an indelible mark on their cities and caused, like the Guggenheim, some degree of recovery and revitalization of the environment in which they stood, at least temporarily. In any case, Koolhaas never gave up purely theoretical exercises; instead he split his work in two: OMA produced what was destined to be built and ruled by the parameters of the global market, while AMO, Rem's other side of the mirror, applied architectural thought to unexplored fields, free from external agents and able to work as a purely experimental laboratory. Against this backdrop came September 11th, 2001, and the attacks on the Twin Towers in New York had devastating effects at all levels, leading to a change in the world order in a surprisingly short period of time. Rem Koolhaas made a 180° turn, directing his vision towards China, where he believed his contributions would have a more direct social benefit than in the Western world. (2) In order to introduce his new course of direction and the creation of the AMO "Think Tank", he planned a major exhibition at the Neue Nationalgalerie in Berlin under the title "Content", in parallel with the edition of the book of the same title, which was at first the "exhibition catalog" but, deep down, was always conceived as the most important document of the office since "SMLXL".
However, in many ways it was just the opposite: a publication characterised by its magazine format, soft cover, very fine paging, "supermarket brochure" form and hyperdense content. It was an ephemeral, brief, light, cheap and "disposable" experiment. In fact, it is currently out of stock and out of print. Rem Koolhaas would probably disapprove of a piece of research that sets the spotlight on it, for he would probably say that its validity has expired given that it has been ten years since its publication. However, it shows OMA's conceptual and vital status at the time of its publication with crystalline clarity, and it is also a true milestone in architectural communication. A point of no return. The epitome of the so-called "congestive communication". This thesis suggests that "Content" contains the essence of Rem Koolhaas' greatest contribution to the world of architecture: the deep and definitive transformation of architectural communication through the convergence of the conceptual state and the transmission thereof. His architectural and conceptual legacy has left an indelible mark on all subsequent generations. There is no doubt that his essays, theories, projects and buildings already belong to the history of architecture. But it is his review of the concept of communication in architecture that has had, and shall have, an immediate influence on future generations, not only in their communication but also in their architecture, through a bijective exchange. Future approaches should try to determine what happens after "Content", after the maximum hyperdensity, after the culture of visual congestion: what Koolhaas will propose and what will happen in the world of architectural communication.
To this end, we shall study in depth his latest communication-related projects, such as the design of the 2014 Venice Architecture Biennale, his intensive research on "Metabolism" in "Project Japan: Metabolism Talks...", and the course of his latest territorial approaches. Most recently, Rem Koolhaas has talked about "Preservation", "Sobriety", "Essentialism", "Performance", etc. The mastermind of the culture of congestion now speaks of "low density"... as it could not be otherwise, on the other side of the mirror. In short, the color white as the sum of all colors: all wavelengths of the visible spectrum received at the same time.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Wave energy conversion differs essentially from other renewable energies in that the dependence between the device design and the energy resource is stronger. Dimensioning is therefore considered a key stage when a design project for Wave Energy Converters (WEC) is undertaken. Location, WEC concept, Power Take-Off (PTO) type, control strategy and hydrodynamic resonance considerations are some of the critical aspects to take into account to achieve good performance. This paper proposes an automatic dimensioning methodology to be carried out in the initial stages of the design project, and the following elements are described to carry out the study: an optimization design algorithm, its objective functions and restrictions, a PTO model, and a procedure to evaluate the WEC's energy production. A parametric analysis is then included, considering different combinations of the key parameters previously introduced. A variety of study cases are analysed from the point of view of energy production for different design parameters, and all of them are compared with a reference case. Finally, a discussion is presented based on the results obtained, and some recommendations for facing the WEC design stage are given.
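The dimensioning loop described above — sweep candidate design parameters, evaluate each with a PTO/energy-production model, and keep the best feasible combination — can be sketched as follows. Everything here (the surrogate energy function, the parameter names and the size restriction) is hypothetical; in a real study each evaluation would come from the hydrodynamic and PTO models the paper describes.

```python
import itertools

def annual_energy(diameter_m, pto_damping, wave_height_m=2.0, wave_period_s=8.0):
    """Toy surrogate for a WEC's mean annual energy production (arbitrary units).

    Hypothetical stand-in for the hydrodynamic + PTO model: captured power
    grows with device size, while a PTO damping far from the resonance-matched
    value penalises the output.
    """
    resonance_mismatch = abs(pto_damping - wave_period_s) / wave_period_s
    return diameter_m ** 1.5 * wave_height_m ** 2 / (1.0 + 5.0 * resonance_mismatch)

def dimension_wec(diameters, dampings, max_diameter_m=20.0):
    """Exhaustive parametric sweep: maximise energy subject to a size restriction."""
    feasible = ((d, c) for d, c in itertools.product(diameters, dampings)
                if d <= max_diameter_m)
    return max(feasible, key=lambda dc: annual_energy(*dc))

# The largest feasible device with resonance-matched damping wins the sweep.
best = dimension_wec(diameters=[5, 10, 15, 20, 25], dampings=[4, 6, 8, 10])
```

In practice the exhaustive sweep would be replaced by the paper's optimization algorithm, and the restriction set would include cost and structural limits as well as size.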

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This paper discusses the target localization problem in wireless visual sensor networks. Additive noise and measurement errors affect the accuracy of target localization when the visual nodes are equipped with low-resolution cameras. With the goal of improving localization accuracy without prior knowledge of the target, each node extracts multiple feature points from its images to represent the target at the sensor-node level. A statistical method is presented to match the most correlated feature-point pair, so that the position information of different sensor nodes can be merged at the base station. In addition, for the case where more than one target exists in the field of interest, a scheme for locating multiple targets is provided. Simulation results show that the proposed method performs well in improving the accuracy of locating a single target or multiple targets. Results also show that the proposed method achieves a better trade-off between camera-node usage and localization accuracy.
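The matching step — pick the pair of feature points, one from each node, whose descriptors are most strongly correlated, and merge their position estimates at the base station — might be sketched like this. The descriptor vectors and the use of plain Pearson correlation are illustrative assumptions, not the paper's exact statistic.

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation between two equal-length descriptor vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def best_match(descs_a, descs_b):
    """Return ((i, j), r): indices of the most correlated feature-point pair
    between the descriptor lists of two camera nodes, and their correlation."""
    best, best_r = None, -2.0
    for i, a in enumerate(descs_a):
        for j, b in enumerate(descs_b):
            r = pearson(a, b)
            if r > best_r:
                best, best_r = (i, j), r
    return best, best_r

# Two nodes, each observing the target plus a clutter point (toy descriptors):
node_a = [[1.0, 2.0, 3.0, 4.0], [4.0, 3.0, 2.0, 1.0]]
node_b = [[2.0, 4.1, 6.0, 8.0], [0.0, 0.0, 1.0, 0.0]]
pair, r = best_match(node_a, node_b)
```

Once the pair is found, the base station would fuse only the positions of those two feature points, discarding the weakly correlated clutter.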

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The aim of this study is to explain changes in real estate prices as well as in real estate stock market prices, using macro-economic explanatory variables such as the gross domestic product (GDP), the real interest rate (IR) and the unemployment rate (UE). Several regressions have been carried out in order to express various incremental and absolute deflated real estate stock market indexes in terms of the macro-economic variables. The analyses are applied to the Swedish economy. The period under study is 1984-1994. Monthly time-series data are used, i.e. the number of data points is 132. If time leads/lags are introduced in the regressions, significant improvements in the already high correlations are achieved. The signs of the coefficients for IR, UE and GDP are all what one would expect from an economic point of view: those for GDP are all positive, while those for both IR and UE are negative. All the regressions have high R2 values. Both markets anticipate changes in the unemployment rate by 6 to 9 months, which seems reasonable because such changes can be forecast quite reliably. On the contrary, there is no reason why they should anticipate by 3-6 months changes in the interest rate, which can hardly be forecast reliably so far in advance.
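The lead/lag search mentioned above — shift a macro series forward by k months and find the shift that best explains the index — can be sketched as below. The two series are synthetic, with a 6-month lead built in by construction; real inputs would be the deflated Swedish indexes and macro series.

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation between two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def best_lead(index, macro, max_lead=12):
    """Lead (in months) at which index_t correlates most strongly, in absolute
    value, with macro_{t+lead}. A positive lead means the market anticipates
    the macro variable by that many months."""
    def score(lead):
        pairs = list(zip(index, macro[lead:]))   # zip trims the overhang
        return abs(pearson([p[0] for p in pairs], [p[1] for p in pairs]))
    return max(range(max_lead + 1), key=score)

# 132 monthly observations; the index is built to anticipate the macro
# series by exactly 6 months.
macro = [(7 * t) % 13 for t in range(132)]
index = macro[6:]        # index_t = macro_{t+6}
```

A full replication would regress the index on several shifted macro variables at once rather than scoring one correlation at a time, but the shifting logic is the same.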

Relevância:

100.00% 100.00%

Publicador:

Resumo:

We have designed a p53 DNA binding domain that has virtually the same binding affinity for the gadd45 promoter as the wild-type protein but is considerably more stable. The design strategy was based on molecular evolution of the protein domain. Naturally occurring amino acid substitutions were identified by comparing the sequences of p53 homologues from 23 species, introducing them into wild-type human p53, and measuring the changes in stability. The most stabilizing substitutions were combined in a multiple mutant. The advantage of this strategy is that, by substituting with naturally occurring residues, the function is likely to be unimpaired. All point mutants bind the consensus DNA sequence. The changes in stability ranged from +1.27 kcal mol⁻¹ (less stable, Q165K) to −1.49 kcal mol⁻¹ (more stable, N239Y). The changes in free energy of unfolding on mutation are additive. Interestingly, the two most stabilizing mutants (N239Y and N268D) are known to act as suppressors and restored the activity of two of the most common tumorigenic mutants. Of the 20 single mutants, 10 are cancer-associated, though their frequency of occurrence is extremely low: A129D, Q165K, Q167E and D148E are less stable, and M133L, V203A and N239Y are more stable, whereas the rest are neutral. The quadruple mutant (M133L/V203A/N239Y/N268D), which is stabilized by 2.65 kcal mol⁻¹ and whose Tm is raised by 5.6°C, is of potential interest for trials in vivo.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

A method was developed to detect 5' ends of bacterial RNAs expressed at low levels and to differentiate newly initiated transcripts from processed transcripts produced in vivo. The procedure involves use of RNA ligase to link a specific oligoribonucleotide to the 5' ends of cellular RNAs, followed by production of cDNA and amplification of the gene of interest by PCR. The method was used to identify the precise sites of transcription initiation within a 10-kb region of the pheromone-inducible conjugative plasmid pCF10 of Enterococcus faecalis. Results confirmed the 5' end of a very abundant, constitutively produced transcript (from prgQ) that had been mapped previously by primer extension and defined the initiation point of a less abundant, divergently transcribed message (from prgX). The method also showed that the 5' end of a pheromone-inducible transcript (prgB) that had been mapped by primer extension was generated by processing rather than new initiation. In addition, the results provided evidence for two promoters, 3 and 5 kb upstream of prgB, and indicated that only the transcripts originating 5 kb upstream may be capable of extending to prgB.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Although it may sound reasonable to assume that American education continues to be effective at sending high school students to college, in a study conducted in 2009 the Council of the Great City Schools states that "slightly more than half of entering ninth grade students arrive performing below grade level in reading and math, while one in five entering ninth grade students is more than two years behind grade level...[and] 25% received support in the form of remedial literacy instruction or interventions" (Council of the Great City Schools, 2009). Students are distracted by technology (Lei & Zhao, 2005), family (Xu & Corno, 2003), medical illnesses (Nielson, 2009) and learning disabilities, and, perhaps most detrimental to academic success, by a sheer lack of interest in school (Ruch, 1963). In a Johns Hopkins research study, Building a Graduation Nation - Colorado (Balfanz, 2008), warning signs were apparent years before students dropped out of high school. The ninth grade was often referenced as a critical point indicating success or failure to graduate from high school. The research conducted by Johns Hopkins illustrates the problem: students who become disengaged from school have a much greater chance of dropping out of high school and not graduating. The first purpose of this study was to compare different measurement models of the Student School Engagement (SSE) instrument using factor analysis, to verify model fit with student engagement. The second purpose was to determine the extent to which the SSE instrument measures student school engagement by investigating convergent validity (via the SSE and Appleton, Christenson, Kim and Reschly's instrument and Fredricks, Blumenfeld, Friedel and Paris's instrument), discriminant validity (via Huebner's Student Life Satisfaction Survey) and criterion-related validity (via the sub-latent variables of Aspirations, Belonging and Productivity, and student outcome measures such as achievement, attendance and discipline).
Convergent validity was established between the SSE and both Appleton, Christenson, Kim and Reschly's model and Fredricks, Blumenfeld, Friedel and Paris's (2005) Student Engagement Instrument (SEI). When testing discriminant validity, the SSE's correlations with the SLSS were weak and not statistically significant, thus establishing discriminant validity. Criterion-related validity was established through structural equation modeling: the SSE was found to be a significant predictor of student outcome measures when both risk scores and CSAP scores were used. The third purpose of this study was to assess the factorial invariance of the SSE instrument across gender, to ensure the instrument measures the intended construct across different groups. In conclusion, configural, weak and metric invariance were established for the SSE, as a non-significant change in chi-square indicated that all parameters, including the error variances, were invariant across gender groups. Engagement is not a clearly defined psychological construct; it requires more research in order to fully comprehend its complexity. Hopefully, with parental and teacher involvement and a sense of community, student engagement can be nurtured into a meaningful attachment to school and academic success.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The current tendency to undertake more trips, but of shorter duration, throughout the year has meant that the tourist industry has started to show greater interest in attracting those market segments that opt for more prolonged stays, as they are especially profitable. One of these segments is seniors. Given the aging of the population worldwide, which is particularly noticeable in Spain, the object of this study is to identify the variables that determine the length of stay of Spanish seniors at their destination. The Negative Binomial model was adapted to the context of length of stay by Spanish seniors, and the determinant factors identified were: age, travel purpose, climate, type of accommodation, group size, trip type and the activities carried out at the destination. This study is a contribution to the field from an empirical point of view, given the scarcity of studies of this type and their eminently descriptive character, as well as at a practical level, with interesting implications for the sector.
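As a sketch of why a Negative Binomial specification suits counts of nights (non-negative integers with overdispersion), the snippet below implements the NB2 probability mass function and a log-link mean for length of stay. The covariates and coefficient values are hypothetical, not the paper's estimates.

```python
from math import exp, lgamma, log

def nb_pmf(y, mu, alpha):
    """NB2 probability of a stay of y nights, given mean mu and
    overdispersion alpha (variance = mu + alpha * mu**2)."""
    r = 1.0 / alpha                       # NB "size" parameter
    log_p = (lgamma(y + r) - lgamma(r) - lgamma(y + 1)
             + r * log(r / (r + mu)) + y * log(mu / (r + mu)))
    return exp(log_p)

def expected_stay(age, warm_climate, coefs=(1.2, 0.01, 0.35)):
    """Mean length of stay under a log link. The intercept, age slope and
    climate effect are made-up illustrative values."""
    b0, b_age, b_climate = coefs
    return exp(b0 + b_age * age + b_climate * warm_climate)
```

With alpha → 0 the NB2 collapses to the Poisson; the extra alpha * mu² term in the variance is what lets the model absorb the long right tail of prolonged stays.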

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The reprise evidential conditional (REC) is nowadays not very usual in Catalan: it is restricted to journalistic language and to some very formal genres (such as academic or legal language), and it is not present in spontaneous discourse. On the one hand, it has been described among the rather new modality values of the conditional. On the other, the normative tradition tended to reject it as a Gallicism, or to describe it as an unsuitable neologism. Thanks to extraction from text corpora, we surprisingly find this REC in Catalan from the beginning of the fourteenth century to the contemporary age, with semantic and pragmatic nuances and different evidence of grammaticalization. Owing to the current interest in evidentiality, the REC has been widely studied in French, Italian and Portuguese, focusing mainly on its contemporary uses and less intensively on the diachronic process that could explain the origin of this value. In line with this research, which we initiated by studying the epistemic and evidential future in Catalan, our aim is to describe: a) the pragmatic context that could have been the starting point of the REC in the thirteenth century, before we find indisputable attestations of this use; b) the path of semantic change followed by the conditional from a ‘future in the past’ tense to the acquisition of epistemic and evidential values; and c) the role played by invited inferences, subjectification and intersubjectification in this change.