14 results for 200406 Language in Time and Space (incl. Historical Linguistics, Dialectology)

at Universidad Politécnica de Madrid


Relevance: 100.00%

Abstract:

There are a number of research and development activities exploring Time and Space Partitioning (TSP) to implement safe and secure flight software. This approach makes it possible to execute real-time applications with different levels of criticality on the same computer board. To do so, flight applications must be isolated from each other in the temporal and spatial domains. This paper presents the first results of a partitioning platform based on the Open Ravenscar Kernel (ORK+) and the XtratuM hypervisor. ORK+ is a small, reliable real-time kernel supporting the Ada Ravenscar computational model that is central to the ASSERT development process. XtratuM supports multiple virtual machines, i.e. partitions, on a single computer and is being used in the Integrated Modular Avionics for Space study. ORK+ executes in an XtratuM partition, enabling Ada applications to share the computer board with other applications.
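The platform described above rests on temporal isolation: each partition owns a fixed slot of CPU time. As a purely illustrative sketch (not ORK+ or XtratuM code; all partition names and durations below are invented), a static cyclic schedule of the kind used in TSP systems can be modeled as:

```python
# Minimal sketch of time partitioning: a static cyclic schedule gives each
# partition a fixed slot within a repeating major frame, so one partition
# cannot overrun into another's CPU time. Names and durations are invented.

MAJOR_FRAME_MS = 100

# (partition name, slot length in ms); list order defines the schedule
SCHEDULE = [
    ("ORK+_ada_app", 40),   # Ada Ravenscar application running under ORK+
    ("payload_ctrl", 35),
    ("housekeeping", 25),
]

# the slots must exactly tile the major frame
assert sum(ms for _, ms in SCHEDULE) == MAJOR_FRAME_MS

def partition_at(t_ms: int) -> str:
    """Return which partition owns the CPU at absolute time t_ms."""
    offset = t_ms % MAJOR_FRAME_MS
    for name, length in SCHEDULE:
        if offset < length:
            return name
        offset -= length
    raise AssertionError("unreachable: schedule covers the major frame")

print(partition_at(10))    # ORK+_ada_app
print(partition_at(150))   # payload_ctrl (150 % 100 = 50, second slot)
```

In a real TSP kernel such a table is fixed at configuration time, which is what makes the temporal behavior of each partition analyzable in isolation.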

Relevance: 100.00%

Abstract:

We discuss experiences gained by porting a Software Validation Facility (SVF) and a satellite Central Software (CSW) to a platform with support for Time and Space Partitioning (TSP). The SVF and CSW are part of the EagleEye reference mission of the European Space Agency (ESA). As a reference mission, EagleEye is a perfect candidate for evaluating the practical aspects of developing satellite CSW for and on TSP platforms. The specific TSP platform we used consists of a simulated LEON3 CPU controlled by the XtratuM separation micro-kernel. On top of this, we run five separate partitions. Each partition runs its own real-time operating system or Ada run-time kernel, which in turn runs the application software of the CSW. We describe issues related to partitioning; inter-partition communication; scheduling; I/O; and fault detection, isolation, and recovery (FDIR).
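The abstract lists inter-partition communication and FDIR among the ported system's concerns. As a hedged illustration (a toy model, not the actual XtratuM or ARINC 653 API), queuing-port semantics, where a full queue rejects a send and the sender must treat the rejection as a fault-handling event, can be sketched as:

```python
from collections import deque

# Toy sketch of queuing-port semantics between partitions. The kernel
# copies messages from sender to receiver; a full queue rejects the write,
# and the sending partition must handle that as an FDIR-relevant event.
# This API is hypothetical, invented for illustration only.

class QueuingPort:
    def __init__(self, depth: int):
        self._q = deque()
        self._depth = depth

    def send(self, msg: bytes) -> bool:
        if len(self._q) >= self._depth:
            return False            # queue full: sender must react
        self._q.append(bytes(msg))  # kernel copies the message
        return True

    def receive(self):
        """Return the oldest message, or None if the queue is empty."""
        return self._q.popleft() if self._q else None

port = QueuingPort(depth=2)
assert port.send(b"tm-frame-1")
assert port.send(b"tm-frame-2")
assert not port.send(b"tm-frame-3")   # rejected: would exceed the depth
assert port.receive() == b"tm-frame-1"
```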

Relevance: 100.00%

Abstract:

A gene expression atlas is an essential resource for quantifying and understanding the multiscale processes of embryogenesis in time and space. The automated reconstruction of a prototypic 4D atlas for vertebrate early embryos, using multicolor fluorescence in situ hybridization with nuclear counterstain, requires dedicated computational strategies. To this end, we designed an original methodological framework implemented in a software tool called Match-IT. With only minimal human supervision, our system is able to gather gene expression patterns observed in different analyzed embryos with phenotypic variability and map them onto a series of common 3D templates over time, creating a 4D atlas. This framework was used to construct an atlas composed of 6 gene expression templates from a cohort of zebrafish early embryos spanning 6 developmental stages, from 4 to 6.3 hpf (hours post fertilization). It includes 53 specimens, 181,415 detected cell nuclei and the segmentation of 98 gene expression patterns observed in 3D for 9 different genes. In addition, an interactive visualization tool, Atlas-IT, was developed to inspect, supervise and analyze the atlas. Match-IT and Atlas-IT, including user manuals, representative datasets and video tutorials, are publicly and freely available online. We also propose computational methods and tools for the quantitative assessment of the gene expression templates at the cellular scale, with the identification, visualization and analysis of coexpression patterns, synexpression groups and their dynamics through developmental stages.
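As a toy illustration of the kind of pooling step such a framework involves (this is not Match-IT's actual algorithm; the template cells, coordinates and gene labels below are invented for the example), detections already registered into a common template can be accumulated per template cell:

```python
import math

# Toy illustration only (NOT Match-IT's algorithm): once embryos are
# registered to a common template, each detected nucleus carrying a gene
# label is assigned to its nearest template cell, accumulating an
# expression count per (template cell, gene) pair across specimens.

template_cells = {"c1": (0.0, 0.0, 0.0), "c2": (1.0, 0.0, 0.0)}
detections = [((0.1, 0.0, 0.1), "gsc"),
              ((0.9, 0.1, 0.0), "gsc"),
              ((1.1, 0.0, 0.0), "ntl")]

def nearest(p):
    """Template cell whose center is closest to point p."""
    return min(template_cells, key=lambda c: math.dist(p, template_cells[c]))

counts = {}
for pos, gene in detections:
    key = (nearest(pos), gene)
    counts[key] = counts.get(key, 0) + 1

print(counts)  # {('c1', 'gsc'): 1, ('c2', 'gsc'): 1, ('c2', 'ntl'): 1}
```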

Relevance: 100.00%

Abstract:

To properly understand and model animal embryogenesis, it is crucial to obtain detailed measurements, in both time and space, of gene expression domains and cell dynamics. This challenge has been addressed in recent years by a surge of atlases that integrate a statistically relevant number of individuals to obtain robust, complete information about the spatiotemporal locations of gene patterns. This paper discusses the fundamental image analysis strategies required to build such models and the most common problems found along the way. We also discuss the main challenges and future goals in the field.

Relevance: 100.00%

Abstract:

To properly understand and model animal embryogenesis, it is crucial to obtain detailed measurements, in both time and space, of gene expression domains and cell dynamics. This challenge has been addressed in recent years by a surge of atlases that integrate a statistically relevant number of individuals to obtain robust, complete information about the spatiotemporal locations of gene patterns. This paper discusses the fundamental image analysis strategies required to build such models and the most common problems found along the way. We also discuss the main challenges and future goals in the field.

Relevance: 100.00%

Abstract:

In order to properly understand and model the gene regulatory networks at work in animal development, it is crucial to obtain detailed measurements, in both time and space, of their gene expression domains. In this paper, we propose a complete computational framework to fulfill this task and create a 3D atlas of early zebrafish embryogenesis annotated with both the cellular localizations and the expression levels of different genes at different developmental stages. The strategy used to construct such an atlas is described here with the expression patterns of 5 different genes at 6 hours of development post fertilization.

Relevance: 100.00%

Abstract:

Digital atlases of animal development provide a quantitative description of morphogenesis, opening the path toward process modeling. Prototypic atlases offer a data integration framework in which to gather information from cohorts of individuals with phenotypic variability. Relevant information for further theoretical reconstruction includes measurements in time and space of cell behaviors and gene expression. The latter, as well as data integration into a prototypic model, relies on image processing strategies. Developing the tools to integrate and analyze biological multidimensional data is highly relevant for assessing chemical toxicity or performing preclinical drug testing. This article surveys some of the most prominent efforts to assemble these prototypes, categorizes them according to salient criteria, and discusses the key questions in the field and the future challenges toward the reconstruction of multiscale dynamics in model organisms.

Relevance: 100.00%

Abstract:

The terahertz region of the electromagnetic spectrum (100 GHz-10 THz) presents a wide range of applications, such as radio astronomy, molecular spectroscopy, medicine, security and radar, among others. The main obstacles to the development of these applications are the high production cost of systems working at these frequencies, their high maintenance cost, large volume and low reliability. Among the different THz technologies, Schottky technology plays an important role due to its maturity and the inherent simplicity of these devices. Moreover, Schottky diodes can operate at both room and cryogenic temperatures, with high efficiency in multipliers and moderate noise temperature in mixers. This PhD thesis is mainly concerned with the analysis of the physical processes responsible for the electrical response and noise characteristics of Schottky diodes, as well as the analysis and design of frequency multipliers and mixers at millimeter and submillimeter wavelengths.

The first part of the thesis deals with the analysis of the physical phenomena limiting the electrical performance of GaAs and GaN Schottky diodes and of their noise characteristics. To carry out this analysis, a Monte Carlo model of the diode has been used as a reference, due to the high accuracy and reliability of this model at millimeter and submillimeter wavelengths. The Monte Carlo model also provides a direct description of the noise spectra of the devices, without the need for any additional analytical or empirical model. Physical phenomena such as velocity saturation, carrier inertia, the dependence of electron mobility on epilayer length, plasma resonance, and nonlocal and nonstationary effects in time and space have been analysed. A complete analysis of the current noise spectra of GaAs and GaN Schottky diodes operating under both static and time-varying conditions is also presented in this part of the thesis. The results obtained provide a better understanding of the electrical and noise responses of Schottky diodes under high-frequency and/or high-electric-field conditions, and have helped to determine the limitations of the numerical and analytical models used in the analysis of the electrical and noise responses of these devices.

The second part of the thesis is devoted to the analysis of frequency multipliers and mixers by means of an in-house circuit simulation tool based on the harmonic balance technique. Lumped equivalent circuit, drift-diffusion and Monte Carlo models of the device have been considered in this analysis. The Monte Carlo model coupled to the harmonic balance technique has been used as a reference to evaluate the limitations and range of validity of the lumped equivalent circuit and drift-diffusion models for the design of frequency multipliers and mixers. A remarkable feature of this simulation tool is that it enables Schottky circuits to be designed from both electrical and noise considerations. The simulation results presented in this part of the thesis, for both multipliers and mixers, have been compared with measured results available in the literature. The simulator integrating the Monte Carlo model with the harmonic balance technique allows the analysis and design of circuits at frequencies above 1 THz.
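For orientation, the lumped equivalent-circuit models that the thesis benchmarks against Monte Carlo typically start from the textbook thermionic-emission I-V law of a Schottky junction. A minimal sketch with illustrative parameter values (the saturation current and ideality factor below are invented, not taken from the thesis):

```python
import math

# Textbook Schottky-diode I-V law, I = Is * (exp(qV / (n k T)) - 1),
# the starting point of lumped equivalent-circuit models. Far simpler
# than the Monte Carlo / drift-diffusion models discussed above; Is and
# n here are illustrative values only.

def schottky_current(v: float, i_s: float = 1e-12, n: float = 1.1,
                     t: float = 300.0) -> float:
    """Diode current (A) at bias v (V), temperature t (K)."""
    k_b = 1.380649e-23   # Boltzmann constant, J/K
    q = 1.602176634e-19  # elementary charge, C
    vt = k_b * t / q     # thermal voltage, ~25.85 mV at 300 K
    return i_s * (math.exp(v / (n * vt)) - 1.0)

print(schottky_current(0.3))    # forward bias: exponentially large vs Is
print(schottky_current(-1.0))   # reverse bias: saturates near -Is
```

The strong nonlinearity of this characteristic is precisely what frequency multipliers and mixers exploit to generate and translate harmonics.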

Relevance: 100.00%

Abstract:

Electric propulsion is today a very competitive technology with great future prospects. Among the various existing plasma thrusters, the Hall effect thruster has acquired considerable maturity and constitutes an ideal means of propulsion for a wide range of missions. In this thesis, Hall thrusters with conventional geometry and dielectric walls are studied. The complex interaction between the many physical phenomena involved makes plasma simulation in these thrusters difficult. Hybrid models represent the best compromise between precision and computational cost: they use a fluid model for the electrons and Particle-In-Cell (PIC) algorithms for the ions and neutrals. They invoke the hypothesis of plasma quasineutrality, at the cost of solving separately the boundary layers (or sheaths) that form around the chamber walls. Building on an existing hybrid code, called HPHall-2, the aim of this doctoral thesis has been to develop an advanced hybrid code that better simulates the plasma discharge in a Hall effect thruster. The updates and improvements made to the different parts of the code cover both theoretical and numerical issues. The extensive revision of the HPHall-2 algorithms has reduced the accuracy errors by one order of magnitude and notably increased the consistency and robustness of the code, allowing the thruster to be simulated over a wide range of conditions.

The most relevant achievements in the particle subcode are: the implementation of a new weighting algorithm that determines the plasma flux magnitudes more accurately; the implementation of a new population-control algorithm that ensures a sufficient number of particles near the chamber walls, where the gradients are strongest and the computational conditions most critical; improvements in the mass and energy balances; and a better computation of the electric field on a non-uniform mesh. Special attention is given to the fulfilment of the Bohm condition at the sheath edge, which in hybrid codes is a boundary condition necessary to obtain a solution consistent with the plasma-wall interaction model, and which had not been satisfactorily resolved in HPHall-2. In this thesis, the kinetic Bohm criterion has been implemented for an ion population with different electric charges and a large dispersion of velocities. In the code, fulfilment of the kinetic Bohm condition is achieved by an algorithm that introduces a thin collisionless acceleration layer adjacent to the sheath and properly measures the flux of particles in time and space. The improvements made in the electron subcode increase the simulation capabilities of the code, especially in the region downstream of the thruster, where the neutralization of the plasma jet is simulated by means of a volumetric cathode model. Without addressing a detailed study of plasma turbulence, simple models for the parametric adjustment of the anomalous Bohm diffusion are implemented; they make it possible to reproduce the experimental values of the plasma potential and electron temperature, as well as the discharge current of the thruster.

Regarding the theoretical issues, special emphasis is placed on the plasma-wall interaction and on the dynamics of free secondary electrons within the plasma, questions that remain open problems in the simulation of Hall thrusters. The newly developed models seek a picture closer to reality. Thus, a partial-thermalization sheath model is implemented, which considers a non-Maxwellian distribution function for the primary electrons and accounts for energy losses more realistically. Regarding the secondary electrons, a simplified kinetic study is carried out to evaluate their degree of confinement in the plasma, and, using a fluid model in the collisionless limit, the densities and energies of the free secondary electrons are determined, as well as their possible effect on ionization. The results show that secondary electrons are quickly lost at the walls, so their effect in the bulk of the plasma is negligible; not so in the sheaths, where they determine the potential fall. Finally, the theoretical and numerical simulation work is complemented by experimental work carried out at the Princeton Plasma Physics Laboratory, analyzing the interesting initial transient that the thruster experiences during the startup process. The study shows that residual gases adhering to the walls play a relevant role, and a complete purge of the thruster before normal operation is recommended in general. The final result of the research shows that the hybrid code developed is a good tool for simulating a Hall thruster: it properly reproduces the physics of the thruster, provides results similar to the experimental ones, and proves to be a good numerical laboratory for studying the plasma inside the thruster.
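For reference, the single-species textbook form of the Bohm sheath criterion, which the thesis generalizes to a kinetic criterion for multiple ion charge states, can be sketched as follows (the electron temperature is an illustrative value; the xenon ion mass is from standard tables):

```python
import math

# Textbook single-species Bohm condition at the sheath edge: ions must
# enter the sheath at least at the Bohm speed sqrt(e*Te / m_i). This is
# only the basic idea; the thesis implements the more general kinetic
# criterion for an ion population with several charge states.

def bohm_speed(te_ev: float, m_ion_kg: float) -> float:
    """Minimum ion speed (m/s) entering the sheath, Te in eV."""
    e = 1.602176634e-19  # J per eV
    return math.sqrt(te_ev * e / m_ion_kg)

M_XE = 2.18e-25  # xenon ion mass, kg (a common Hall-thruster propellant)

u_b = bohm_speed(20.0, M_XE)        # Te = 20 eV, an illustrative value
print(f"Bohm speed: {u_b:.0f} m/s")  # roughly 3.8 km/s

def satisfies_bohm(u_ion: float, te_ev: float, m_ion_kg: float) -> bool:
    """Check whether an ion reaches the sheath edge fast enough."""
    return u_ion >= bohm_speed(te_ev, m_ion_kg)

print(satisfies_bohm(5000.0, 20.0, M_XE))  # True
```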

Relevance: 100.00%

Abstract:

This paper presents a finite element model that has been used for forecasting the diffusion of innovations in time and space. Unlike conventional models in the diffusion literature, this model accounts for spatial heterogeneity. The implementation steps of the model are explained by applying it to the diffusion of photovoltaic systems in a local region of southern Germany. The model is based on a parabolic partial differential equation that describes the diffusion ratio of photovoltaic systems in a given region over time. The results of the application show that the finite element model constitutes a powerful tool for better understanding the diffusion of an innovation as a simultaneous space-time process. Model limitations and possible extensions for future research are also discussed.
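The paper's own model and coefficients are not reproduced here. As a hedged sketch of a space-time diffusion process of this general kind, a Fisher-type reaction-diffusion equation, solved with explicit finite differences in 1D rather than the paper's finite elements, with invented coefficients:

```python
# Hedged sketch: a Fisher-type reaction-diffusion equation,
#   dp/dt = D * d2p/dx2 + G * p * (1 - p),
# where p is the local adoption ratio, solved by explicit finite
# differences in 1D. The paper uses a finite element discretization of a
# parabolic PDE; all coefficients here are invented for illustration.

D, G = 0.1, 0.5          # diffusivity and internal growth rate
DX, DT = 1.0, 0.1        # grid spacing and time step (DT*D/DX**2 < 0.5)
N = 50

p = [0.0] * N
p[0] = 0.5               # innovation seeded at one edge of the region

def step(p):
    """Advance the adoption ratio by one explicit time step."""
    q = p[:]
    for i in range(1, N - 1):
        lap = (p[i - 1] - 2 * p[i] + p[i + 1]) / DX**2
        q[i] = p[i] + DT * (D * lap + G * p[i] * (1 - p[i]))
    return q

for _ in range(400):     # integrate to t = 40: the front moves rightward
    p = step(p)

print(round(p[1], 3), round(p[45], 3))  # high near the seed, ~0 far away
```

The traveling adoption front that emerges is the one-dimensional analogue of the simultaneous space-time diffusion process the paper studies.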

Relevance: 100.00%

Abstract:

The design of a shop concentrates a little of everything that interests the architect: it forms a kind of crossroads bringing together, alongside disciplinary questions that make it very attractive and test one's ability as a designer, the need to combine compositional resources from fields peripheral to architecture in order to give an adequate response to the specific requirements of image and persuasion that commercial activity entails. The difficult starting conditions, usually spatial configurations that are not overly favourable, where pre-existing elements and boundaries constrain and enclose a space that must be given a new order; the reduced size and the possibilities of total control over the work that derive from it; the short time frame and rapid response imposed by commercial strategy; its strategic position, in direct relation to the street, and the greater prominence therefore assigned to the façade and the display window as 'first advertisements' of the activity; and the integration and reciprocal commentary between architecture and object that takes place within it: these are some of the reasons that explain this interest and justify the claim that the shop, that bounded space so well suited to invention and innovation, constitutes a test bed for trying out new concepts of architecture, since it combines in space and time the ideal conditions required for experimentation and the verification of findings. Albeit not numerous, there are shops in contemporary architecture which have managed on their own merits to occupy a prominent place within the work of their authors.

Among those deserving such recognition one should mention the 'mythical' Kniže tailoring establishment designed by Adolf Loos at the beginning of the 20th century in Vienna, the Negozio Vitrum designed by Giuseppe Terragni in Como in the thirties, and the London offices of Iraqi Airways built in the sixties by Alison and Peter Smithson. This selection, far from arbitrary, is based on well-founded reasons. Within the circumstances of the time frame in which they were conceived (the three were designed and built over the course of the last century in an arc spanning slightly more than fifty years, 1907-1961), the chosen examples bring together a whole series of 'coincidences' among which it is not difficult to draw certain parallels. The three attest to the stature of creators who were capable of devoting the same creative intensity to these 'minor' themes, corroborating that the character of architecture can also become great in the small. The three, owing to the moment of maturity at which their design was undertaken, reflect their condition as experimental laboratories at the service of the design interests that occupied the minds of their architects at the time. The three corroborate hypotheses already suggested in earlier architectures, test lines of work not previously materialized for lack of opportunity, and try out, in a less compromised manner, solutions that would later be transferred to works with a greater vocation for permanence; works (and this is especially surprising) with which they maintained a close relationship in space and time, even becoming parallel platforms for experimentation. In this regard, it suffices to note the physical proximity (metres away) and temporal proximity (built in the same years) between the Kniže tailor's shop (1907-1913; 1928) and the House on the Michaelerplatz (1910-11); the Vitrum shop (1930) and the Casa del Fascio (1929; 1932-36); and the Iraqi Airways sales offices (1960-61) and the headquarters of The Economist (1959-64). This potential of the shop to set itself up as a laboratory for architectural experimentation and testing constitutes the key to the research proposed by this thesis.

Resumo:

Madrid’s urban structure began to take shape with the first fortified settlements of the 9th century. Its subsequent expansions, driven by the capital’s constant growth, were bounded by successive enclosures, built originally for defence but later serving fiscal and sanitary purposes as well. These structures were articulated around the gates that gave access to the Villa, establishing a general system of enclosure and communication that allowed the control of people and goods on their way into the city.
The gates and walls built in the time of Felipe IV initially performed their functions with a modesty turned away from the outskirts of the Villa; this was later replaced by a growing symbolic and ornamental prominence that transcended their own architecture to inspire some of the important urban transformations carried out around them. Related chiefly to the Enlightenment ideal of embellishing the city, the renovation of the main gates was completed by the rebuilding and regularisation of the walls and the laying out of new avenues on the outskirts of the capital, whose routes would largely structure the occupation of the periphery and the consequent definition of the urban fabric of present-day Madrid.
This research explores the urban significance of the Royal Gates of the Villa of Madrid through a review of their establishment in the successive enclosures of the capital, with special attention to the urban transformations carried out from the definition of the last enclosure through to the future projection of an enlarged Madrid in Carlos Mª de Castro’s preliminary plan. The architectural and urban facets of the gates are examined together in a chronological account of events, grounded in documentary evidence and in the visual sequence recorded in the city’s historic cartography. New graphic contributions, of diverse character and scope, provide a spatio-temporal superposition that allows a comparative reading of the architecture of the Royal Gates of Madrid and of the urban transformations that stemmed from them, which largely determined the city’s current configuration.

Resumo:

We present an analysis of the space-time dynamics of oceanic sea states exploiting stereo imaging techniques. In particular, a novel Wave Acquisition Stereo System (WASS) has been developed and deployed at the oceanographic tower Acqua Alta in the Northern Adriatic Sea, off the Venice coast in Italy. The analysis of WASS video measurements yields accurate estimates of the oceanic sea state dynamics, the associated directional spectra and wave surface statistics that agree well with theoretical models. Finally, we show that a space-time extreme, defined as the expected largest surface wave height over an area, is considerably larger than the maximum crest observed in time at a point, in agreement with theoretical predictions.
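The statistical point in the last sentence (the expected largest wave height over an area exceeds the maximum crest seen in time at one point) can be illustrated with a toy computation. The sketch below uses a synthetic Gaussian surface rather than WASS measurements, so the grid sizes and elevation statistics are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a measured surface eta(t, x, y): zero-mean
# Gaussian elevations on a small space-time grid. Illustrative only;
# WASS estimates eta from stereo video with a realistic wave spectrum.
nt, nx, ny = 512, 24, 24
eta = rng.standard_normal((nt, nx, ny))

# Maximum crest observed in time at a single point of the area.
point_max = eta[:, 0, 0].max()

# Space-time extreme: the maximum over the whole area and the same
# observation window is taken over many more samples, so it is
# systematically larger than the single-point maximum.
space_time_max = eta.max()

print(f"point max      = {point_max:.2f}")
print(f"space-time max = {space_time_max:.2f}")
```

Because the single point belongs to the area, the space-time maximum can never be smaller than the point maximum, and for a large area it is typically much larger, in line with the theoretical predictions cited above.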

Resumo:

The project arises from the need to develop improved teaching methodologies in the field of continuum mechanics. The objective is to offer students a learning process through which they acquire the necessary theoretical knowledge, cognitive skills, and the responsibility and autonomy needed for professional development in this area. Traditionally, the concepts of these subjects were taught through lectures and laboratory practice; during these lessons the students’ attitude was usually passive, and their effectiveness was therefore poor. The proposed methodology, already employed successfully at universities such as the University of Bochum (Germany) and the University of South Australia, aims to improve the effectiveness of knowledge acquisition through the student’s use of a virtual laboratory. This laboratory makes it possible to adapt curricula and learning techniques to the European Higher Education Area and to improve current learning processes in the University School of Public Works Engineers (EUITOP) of the Technical University of Madrid (UPM), since no physical laboratories exist for this specialisation. The virtual space is created on a software platform built on OpenSim, which manages 3D virtual worlds, and scripted in LSL (Linden Scripting Language), which gives specific behaviours to objects. Students access this virtual world through their avatar (their character in the virtual world) and can carry out practical work within the space created for this purpose at any time, needing only a computer with internet access and a viewer. The virtual laboratory has three areas: the virtual meeting rooms, where the avatar can interact with peers, solve problems and exchange the documentation held in the virtual library; the interactive game room, where the avatar has to resolve a number of problems against the clock; and the video room, where students can watch instructional videos and receive group lessons.
Each interactive audiovisual element is accompanied by explanations framing it within the area of knowledge and enables students to begin to acquire the vocabulary and practice of the profession for which they are being trained. Plane elasticity concepts are introduced through tension and compression testing of steel and concrete specimens. The behaviour of pin-jointed articulated structures is reinforced by interactive games, and the concepts of tension, compression, and local and global buckling are illustrated by tests that load articulated structures to failure. Pure bending, and simple and combined torsion, are studied by observing a flexible specimen, and the earthquake-resistant design of buildings is examined through a laboratory test video.
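The quantities these virtual tests illustrate can be sketched numerically. The following minimal Python example uses illustrative values (not taken from the project) to relate axial stress and strain in a tension/compression test to Euler’s critical load for global buckling of a slender column:

```python
import math

# Illustrative numbers only (not from the project): a steel bar under
# axial load, as in the virtual tension/compression experiments.
E = 210e9                 # Young's modulus of steel, Pa
d = 0.02                  # bar diameter, m
A = math.pi * d**2 / 4    # cross-sectional area, m^2
F = 50e3                  # axial force, N

sigma = F / A             # normal stress, Pa
eps = sigma / E           # elastic strain (Hooke's law)

# Euler critical load for global buckling of a pinned-pinned column
# of the same section: P_cr = pi^2 * E * I / L^2.
L = 1.5                   # column length, m
I = math.pi * d**4 / 64   # second moment of area, m^4
P_cr = math.pi**2 * E * I / L**2

print(f"stress = {sigma/1e6:.1f} MPa, strain = {eps:.2e}")
print(f"Euler buckling load = {P_cr/1e3:.1f} kN")
```

With these values the critical load (about 7 kN) is far below the 50 kN test force, which is exactly the distinction between global buckling of a slender member and material failure that the virtual break tests are meant to convey.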