952 results for Facade, Buildings, Earthquake, Time Histories, Inner-Story Lift


Relevance:

30.00%

Publisher:

Abstract:

Nervous Kitchens intervenes in the story of soul food by treating the kitchen as a central site of instability. These kitchens reveal and critique their importance to constructions of Black womanhood. Utilizing close readings of Black women’s culinary practices in popular televisual kitchens and archival analysis of USDA domestic reforms, the project locates sites that challenge how we oversimplify soul food as a Black cultural product. These oversimplifications come through what I term the soul food imaginary. This term underscores how the cuisine is tangible (i.e., how dishes are made) but also the ways that histories of enslavement, migration, and domesticity are disseminated through fictionalized representations of Black women in the kitchen offering comfort through food. The project explores how images of these kitchens adhere to and diverge from the imaginary's four conventions: (1) Soul food originates in enslavement where master’s scraps became mama’s meal time; (2) Soul food is not healthy food; (3) Soul food moves South to North uninterrupted during the Great Migration and is evidence of and fuel for struggle, survival, and transformation; and (4) Black women cook it the best, naturally, and alone in the kitchen.

Relevance:

30.00%

Publisher:

Abstract:

The present work concerns the use of shading devices as architectural elements to block sunlight in public buildings. In a city like Natal, Brazil (5° S), the incidence of direct sunlight should be a constant design concern for architects; nevertheless, avoiding insolation is not yet common practice. Within this context, this work aims to deepen the knowledge of solar control by studying several cases and verifying how the devices function according to the orientation and original design of each building, checking whether the shading elements commonly used in the region achieve their purpose of protecting against direct solar radiation. The study considers the position of the shading elements (horizontal and vertical), the angle formed between them and the respective facades, and the orientation of the buildings at the summer and winter solstices and at the equinoxes. The city's solar chart and a shadow-angle protractor were used as supporting instruments. It was concluded that none of the cases studied obtained the maximum benefit from the elements. The most efficient type of shading device for Natal was found to be the mixed type (horizontal and vertical); vertical elements are more efficient in the early morning and late afternoon, while horizontal elements are more effective around midday. We intend to present these results to architects in the region to show the correct use of shading elements for each possible facade orientation, both as a design-support tool and as input for further discussion on the elaboration of new urban standards for the city of Natal/RN.
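
As a worked illustration of the underlying shading geometry (my own sketch with assumed overhang depth, window height, and solar angles, not data from the study), the following Python snippet computes the vertical shadow angle (profile angle) produced by the sun on a facade and checks whether a horizontal overhang fully shades a window:

```python
import math

def profile_angle(solar_altitude_deg, solar_azimuth_deg, facade_azimuth_deg):
    """Vertical shadow angle (profile angle): the solar altitude projected
    onto the vertical plane perpendicular to the facade, in degrees."""
    gamma = math.radians(solar_azimuth_deg - facade_azimuth_deg)  # wall-solar azimuth
    alt = math.radians(solar_altitude_deg)
    return math.degrees(math.atan(math.tan(alt) / math.cos(gamma)))

def overhang_shades_window(overhang_depth, window_height, profile_deg):
    """A horizontal overhang of depth d fully shades a window of height h
    when d * tan(profile angle) >= h."""
    return overhang_depth * math.tan(math.radians(profile_deg)) >= window_height

# Assumed example: sun at 60 deg altitude, 20 deg off a south-facing facade
vsa = profile_angle(60.0, 200.0, 180.0)
print(f"profile angle = {vsa:.1f} deg")
print("fully shaded:", overhang_shades_window(0.8, 1.2, vsa))
```

A mixed device adds vertical fins, which would be checked analogously using the horizontal shadow angle.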

Relevance:

30.00%

Publisher:

Abstract:

Two trends are emerging from modern electric power systems: the growth of renewable (e.g., solar and wind) generation, and the integration of information technologies and advanced power electronics. The former introduces large, rapid, and random fluctuations in power supply, demand, frequency, and voltage, which become a major challenge for real-time operation of power systems. The latter creates a tremendous number of controllable intelligent endpoints such as smart buildings and appliances, electric vehicles, energy storage devices, and power electronic devices that can sense, compute, communicate, and actuate. Most of these endpoints are distributed on the load side of power systems, in contrast to traditional control resources such as centralized bulk generators. This thesis focuses on controlling power systems in real time, using these load side resources. Specifically, it studies two problems.

(1) Distributed load-side frequency control: We establish a mathematical framework to design distributed frequency control algorithms for flexible electric loads. In this framework, we formulate a category of optimization problems, called optimal load control (OLC), to incorporate the goals of frequency control, such as balancing power supply and demand, restoring frequency to its nominal value, restoring inter-area power flows, etc., in a way that minimizes total disutility for the loads to participate in frequency control by deviating from their nominal power usage. By exploiting distributed algorithms to solve OLC and analyzing convergence of these algorithms, we design distributed load-side controllers and prove stability of closed-loop power systems governed by these controllers. This general framework is adapted and applied to different types of power systems described by different models, or to achieve different levels of control goals under different operation scenarios. We first consider a dynamically coherent power system which can be equivalently modeled with a single synchronous machine. We then extend our framework to a multi-machine power network, where we consider primary and secondary frequency controls, linear and nonlinear power flow models, and the interactions between generator dynamics and load control.
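
To convey the flavor of the OLC idea in miniature (a toy sketch under simplifying assumptions of my own, not the thesis formulation), the snippet below gives each load a quadratic disutility and lets a primal-dual iteration, whose dual variable plays the role of the frequency deviation, drive the total load adjustment to match a step power imbalance:

```python
import numpy as np

# Assumed toy setup: quadratic disutility c_i(d) = a_i * d**2 for each load,
# and a step power imbalance P that the loads must collectively absorb.
a = np.array([1.0, 2.0, 0.5, 1.5])   # disutility coefficients (assumed)
P = 1.0                              # power imbalance to rebalance (p.u.)

lam = 0.0                            # dual variable ~ frequency deviation
step = 0.2
for _ in range(200):
    d = lam / (2 * a)                # primal update: d_i = argmin_d c_i(d) - lam*d
    lam += step * (P - d.sum())      # dual update: integrate the power mismatch
print("load adjustments:", np.round(d, 3), "sum =", round(d.sum(), 3))
```

At convergence the marginal disutilities equalize and total adjustment meets the imbalance, which is the sense in which frequency can act as the coordinating signal.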

(2) Two-timescale voltage control: The voltage of a power distribution system must be maintained closely around its nominal value in real time, even in the presence of highly volatile power supply or demand. For this purpose, we jointly control two types of reactive power sources: a capacitor operating at a slow timescale, and a power electronic device, such as a smart inverter or a D-STATCOM, operating at a fast timescale. Their control actions are solved from optimal power flow problems at two timescales. Specifically, the slow-timescale problem is a chance-constrained optimization, which minimizes power loss and regulates the voltage at the current time instant while limiting the probability of future voltage violations due to stochastic changes in power supply or demand. This control framework forms the basis of an optimal sizing problem, which determines the installation capacities of the control devices by minimizing the sum of power loss and capital cost. We develop computationally efficient heuristics to solve the optimal sizing problem and implement real-time control. Numerical experiments show that the proposed sizing and control schemes significantly improve the reliability of voltage control with a moderate increase in cost.
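
As a rough illustration of the chance-constrained step (a single-bus sketch of my own with assumed coefficients, not the thesis model): if voltage responds approximately linearly to a Gaussian net-demand forecast error, the probabilistic voltage limits reduce to deterministic limits tightened by a normal-quantile margin:

```python
from scipy.stats import norm

# Assumed single-bus linearization: V = v0 + s * q_cap + k * w,
# where q_cap is the slow capacitor setting and w ~ N(0, sigma^2) is the
# net-demand forecast error over the slow-timescale horizon.
v0, s, k, sigma = 0.97, 0.02, -0.015, 0.3   # assumed coefficients
v_min, v_max, alpha = 0.95, 1.05, 0.05      # allow 5% violation probability

z = norm.ppf(1 - alpha / 2)                 # two-sided normal quantile
margin = z * abs(k) * sigma                 # deterministic voltage margin

# The chance constraint P(v_min <= V <= v_max) >= 1 - alpha becomes:
#   v_min + margin <= v0 + s * q_cap <= v_max - margin
q_lo = (v_min + margin - v0) / s
q_hi = (v_max - margin - v0) / s
print(f"feasible capacitor settings: [{q_lo:.2f}, {q_hi:.2f}]")
```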

Relevance:

30.00%

Publisher:

Abstract:

One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; and even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not know of the planets orbiting the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions, through two instrumentation projects.

The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
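
For scale, the signals such a survey must detect can be estimated from the standard radial-velocity semi-amplitude relation; the snippet below evaluates it for an assumed example (an Earth-mass planet on a 30-day orbit around a 0.3 solar-mass M dwarf), not for any target from the thesis:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_sun, M_earth = 1.989e30, 5.972e24

def rv_semi_amplitude(m_planet_kg, m_star_kg, period_s, e=0.0, sin_i=1.0):
    """Doppler semi-amplitude K (m/s): K = (2*pi*G/P)**(1/3) * m_p*sin(i) /
    (M_star**(2/3) * sqrt(1 - e**2)), valid when m_p << M_star."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet_kg * sin_i
            / (m_star_kg ** (2 / 3) * math.sqrt(1 - e ** 2)))

# Assumed example: Earth-mass planet, 30-day orbit, 0.3 M_sun M dwarf
K = rv_semi_amplitude(M_earth, 0.3 * M_sun, 30 * 86400)
print(f"K = {K:.2f} m/s")   # ~0.5 m/s, near current instrumental noise floors
```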

We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system étendue, while keeping costs and time to deployment down. We present calculations of the expected planet yield, along with data showing system performance from our testing and development at Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational at Mt. Hopkins Observatory in Arizona.

The second part of this thesis concerns a very different method of planet detection, direct imaging, which involves discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study their atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high performance adaptive optics system to unblur the point-spread function of the parent star through the atmosphere, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common path "speckle" aberrations that can overwhelm any planetary companions.

To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and two pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.

A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
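
Schematically, a single speckle-nulling iteration proceeds as follows (a simplified sketch of my own, with the hardware hidden behind two assumed callables, not the instrument's actual code): probe the deformable mirror with a sine wave stepped through several phases, recover the speckle's phase from the intensity modulation, and apply the anti-phased sine:

```python
import numpy as np

def null_one_speckle(measure_intensity, apply_dm_sine, amp, n_probes=4):
    """One nulling iteration for a single focal-plane speckle.

    measure_intensity(): returns the speckle intensity (assumed camera hook).
    apply_dm_sine(amplitude, phase): adds a DM sine wave at the speckle's
    spatial frequency (assumed additive DM interface).
    """
    phases = 2 * np.pi * np.arange(n_probes) / n_probes
    meas = []
    for ph in phases:
        apply_dm_sine(amp, ph)            # inject probe
        meas.append(measure_intensity())
        apply_dm_sine(-amp, ph)           # remove probe
    meas = np.array(meas)
    # Probe-speckle interference: I(ph) ~ I0 + C*cos(ph - ph_spk).
    # A discrete Fourier coefficient recovers the speckle phase.
    coeff = np.sum(meas * np.exp(-1j * phases))
    ph_spk = -np.angle(coeff)
    # Apply the anti-speckle: same spatial frequency, opposite phase.
    apply_dm_sine(amp, ph_spk + np.pi)
```

In practice the amplitude is also estimated and several speckles are nulled per iteration, which is how contrast gains of a few in under ten iterations become possible.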

One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.

Relevance:

30.00%

Publisher:

Abstract:

The Late Cretaceous to modern tectonic evolution of central and eastern California has been studied for many decades, with published work generally focusing on specific geographic areas and time periods. The resulting literature leaves the reader, whether graduate student, faculty member, or layperson, wondering what a coherently integrated tectonic evolution might look like, or whether it would be at all possible to undertake such a task. This question is the common thread weaving together the four studies presented in this work. Each chapter targets a specific location and time period that I have identified as a critical yet missing link in piecing together a coherent regional tectonic story. In the first chapter, we re-discover a set of major west-down normal faults running along the western slope of the southern Sierra, the western Sierra fault system (WSFS). We show that one of these faults was offset by roughly a kilometer in Eocene time, and that this activity directly resulted in the incision of much of the relief present in modern Kings Canyon. The second chapter is a basement landscape and thermochronometric study of the hanging wall of the WSFS. New data from this study area provide a significant westward expansion of basement thermochronometric coverage of the southern Sierra Nevada batholith. Thermal modeling of these data provides critical new constraints on the early exhumation of the batholith and, in the context of the results from Chapter I, allows us to piece together a coherent chronology of tectonic forcings and landscape evolution for the southern Sierra Nevada. In the third chapter, I present a study of the surface rupture of the 1999 Hector Mine earthquake, a dextral strike-slip event on a fault in the Eastern California Shear Zone (ECSZ). New constraints on the active tectonics of the ECSZ will help future studies better resolve the enigmatic mismatch between geologic slip rates and geodetically determined regional rates. Chapter IV is a magnetostratigraphic pilot study of the Paleocene Goler Formation, providing strong evidence that continued investigation will yield new constraints on the depositional age of the only fossil-bearing Paleocene terrestrial deposit on the west coast of North America. Each of these studies aims to provide important new data at critical missing links in the tectonic evolution of central and eastern California.

Relevance:

30.00%

Publisher:

Abstract:

Several deterministic and probabilistic methods are used to evaluate the probability of seismically induced soil liquefaction. Probabilistic models usually carry uncertainty both in the model itself and in the parameters used to develop it, and these model uncertainties vary from one statistical model to another. Most of them are epistemic and can be addressed through appropriate knowledge of the statistical model. One such epistemic model uncertainty in evaluating liquefaction potential with a probabilistic model such as logistic regression is sampling bias: the difference between the class distribution in the sample used to develop the statistical model and the true population distribution of liquefaction and non-liquefaction instances. Recent studies have shown that sampling bias can significantly affect the probability predicted by a statistical model. To address this epistemic uncertainty, a new approach was developed for evaluating the probability of seismically induced soil liquefaction, in which a logistic regression model was used in combination with the Hosmer-Lemeshow statistic. This approach was used to estimate the population (true) ratio of liquefaction to non-liquefaction instances in the most up-to-date standard penetration test (SPT) and cone penetration test (CPT) case histories. Other model uncertainties, such as the distribution and significance of the explanatory variables, were addressed using the Kolmogorov-Smirnov (KS) test and the Wald statistic, respectively. Based on the estimated population distribution, logistic regression equations were proposed to calculate the probability of liquefaction for both the SPT- and CPT-based case histories, and the proposed probability curves were compared with existing SPT- and CPT-based curves.
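
The sampling-bias correction can be illustrated with the standard prior correction for biased class distributions (a textbook device shown on synthetic data, not the SPT/CPT case histories): when the sample event fraction differs from the assumed population fraction, the fitted logistic intercept is shifted accordingly:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in data (assumed, not the SPT/CPT case histories):
# one explanatory variable x, with liquefaction more likely at low x.
n = 400
x = rng.normal(0.0, 1.0, size=(n, 1))
p_true = 1 / (1 + np.exp(2.0 * x[:, 0] - 0.5))
y = rng.random(n) < p_true

model = LogisticRegression().fit(x, y)
b0, b1 = model.intercept_[0], model.coef_[0, 0]

# Prior correction: shift the intercept so predictions reflect an assumed
# population event rate tau instead of the (biased) sample rate y_bar.
y_bar = y.mean()           # sample fraction of liquefaction cases
tau = 0.10                 # assumed population fraction
b0_corr = b0 - np.log(((1 - tau) / tau) * (y_bar / (1 - y_bar)))

def prob_liq(xv, b0=b0_corr, b1=b1):
    return 1 / (1 + np.exp(-(b0 + b1 * xv)))

print(f"sample rate = {y_bar:.2f}, corrected P(liq | x=0) = {prob_liq(0.0):.3f}")
```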

Relevance:

30.00%

Publisher:

Abstract:

Sustainable development has only recently begun to examine existing infrastructure, and a key aspect of this is hazard mitigation. Examining buildings from a sustainability perspective requires an understanding of a building's life-cycle environmental costs, including the environmental impacts induced by earthquake damage: damage repair entails additional material and energy consumption, and thus additional harmful impacts. Merging results from a seismic evaluation with a life-cycle analysis of a building gives a novel outlook on sustainable design decisions. To evaluate the environmental impacts caused by buildings, both the long-term impacts accrued throughout a building's lifetime and the impacts associated with damage repair must be quantified. A method and literature review for this examination has been developed and is discussed. Using the Athena software and HAZUS-MH, this study evaluated the performance of steel and concrete buildings, considering their life-cycle assessments and earthquake resistance. It was determined that the code design level greatly affects estimates of building damage and repair. The study presents two case-study buildings, whose specific results were obtained under several simplifying assumptions; recommendations for future research are provided to make the methodology more useful in real-world applications. Examining the costs and environmental impacts of a building through a cradle-to-grave analysis and a seismic damage assessment will help reduce the material consumption and construction activity that take place before and after an earthquake.
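
The coupling of the two analyses reduces to an expected-value computation: damage-state probabilities (as produced by HAZUS-MH-style fragility results) weight the environmental impacts of the corresponding repairs (as produced by an Athena-style inventory). The sketch below uses assumed illustrative numbers only:

```python
# Expected earthquake-induced environmental impact (all values assumed):
# E[impact] = sum over damage states of P(state) * repair_impact(state).
damage_states = ["slight", "moderate", "extensive", "complete"]
p_state = [0.20, 0.10, 0.04, 0.01]            # assumed annual probabilities
repair_co2_t = [5.0, 40.0, 180.0, 600.0]      # assumed repair impacts, t CO2e

expected_co2 = sum(p * c for p, c in zip(p_state, repair_co2_t))
embodied_co2 = 900.0                           # assumed initial construction, t CO2e
lifetime = 50                                  # years of service (assumed)

total = embodied_co2 + lifetime * expected_co2
print(f"expected seismic impact: {expected_co2:.1f} t CO2e/yr")
print(f"cradle-to-grave total over {lifetime} yr: {total:.0f} t CO2e")
```

The same structure shows why the code design level matters: a stronger design shifts probability mass toward lighter damage states, lowering the annual expected repair impact.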

Relevance:

30.00%

Publisher:

Abstract:

Buildings and other infrastructure located in the coastal regions of the US have a higher level of wind vulnerability. Reducing the growing property losses and casualties associated with severe windstorms has been the central research focus of the wind engineering community. The present wind engineering toolbox consists of building codes and standards, laboratory experiments, and field measurements. The American Society of Civil Engineers (ASCE) 7 standard provides wind loads only for buildings with common shapes; for complex cases it refers to physical modeling. Although that option can be economically viable for large projects, it is not cost-effective for low-rise residential houses. To circumvent these limitations, a numerical approach based on the techniques of Computational Fluid Dynamics (CFD) has been developed. Recent advances in computing technology and significant developments in turbulence modeling are making numerical evaluation of wind effects a more affordable approach. The present study targeted cases not addressed by the standards: wind loads on complex roofs of low-rise buildings, the aerodynamics of tall buildings, and the effects of complex surrounding buildings. Among all the turbulence models investigated, the large eddy simulation (LES) model performed best in predicting wind loads; applying a spatially evolving, time-dependent wind velocity field with the relevant turbulence structures at the inlet boundaries was found to be essential. All results were compared with and validated against experimental data. The study also revealed CFD's unique flow visualization and aerodynamic data generation capabilities, along with a better understanding of the complex three-dimensional aerodynamics of wind-structure interaction. With proper modeling that realistically represents the actual turbulent atmospheric boundary layer flow, CFD can offer an economical alternative to the existing wind engineering tools. CFD's easy accessibility is expected to transform the practice of structural design for wind, resulting in more wind-resilient and sustainable systems by encouraging optimal aerodynamic and sustainable structural/building design, helping to ensure public safety and reduce economic losses due to wind perils.
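
Whatever turbulence model is used, CFD surface pressures are typically reduced to pressure coefficients before validation against wind-tunnel data; a minimal reduction step is sketched below (tap values and reference conditions are assumed, not the study's data):

```python
import numpy as np

def pressure_coefficient(p, p_ref, rho, u_ref):
    """Cp = (p - p_ref) / (0.5 * rho * u_ref**2), the standard normalization
    used to compare CFD surface pressures with wind-tunnel measurements."""
    return (p - p_ref) / (0.5 * rho * u_ref ** 2)

# Assumed sample: short LES pressure histories at two roof taps (Pa)
p_taps = np.array([[-310.0, -290.0, -350.0],
                   [-120.0, -140.0, -110.0]])
rho, u_ref, p_ref = 1.225, 20.0, 0.0   # air density, eave-height speed, static ref

cp = pressure_coefficient(p_taps, p_ref, rho, u_ref)
print("mean Cp per tap:", cp.mean(axis=1).round(2))
print("peak (min) Cp per tap:", cp.min(axis=1).round(2))
```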

Relevance:

30.00%

Publisher:

Abstract:

Acoustic measurements were performed in eight schools at different levels of education (from kindergarten to college) located in Viseu, Portugal. The acoustic evaluation was made in order to analyze the most common problems that may condition the acoustic environment inside school buildings. It comprised measurements of reverberation time in classrooms; sound insulation between classrooms and between classrooms and corridors; impact sound insulation of floors; and airborne sound insulation of façades. The façade sound insulation was measured both with all elements closed and under natural ventilation conditions (transom lights open or windows in tilt mode). It was found that most of the cases studied revealed constructive deficiencies with respect to the acoustic requirements of school buildings, compromising the quality of education.
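
For reference, measured classroom reverberation times are commonly checked against simple diffuse-field predictions such as Sabine's formula; the sketch below uses assumed room dimensions and absorption coefficients, not data from the surveyed schools:

```python
# Sabine reverberation time: RT60 = 0.161 * V / A, with V in m^3 and
# A the total equivalent absorption area in m^2.
def sabine_rt60(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    A = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / A

# Assumed bare classroom: 7 x 9 x 3 m, hard finishes throughout
room = [(2 * (7 + 9) * 3, 0.03),   # walls, painted plaster
        (7 * 9, 0.03),             # ceiling, plaster
        (7 * 9, 0.05)]             # floor, vinyl
print(f"RT60 = {sabine_rt60(7 * 9 * 3, room):.1f} s")  # far above ~0.6 s targets
```

Such a calculation makes the typical deficiency obvious: without added absorption, a hard-finished classroom reverberates for seconds rather than the fraction of a second required for speech intelligibility.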

Relevance:

30.00%

Publisher:

Abstract:

Palaces, as an architectural typology, can be found in the recreational Quintas that surrounded the main cities of Portugal, which preserved their rural character from the 16th century until the middle of the 19th century. Consisting of cultivated land and farm buildings, the palace of the Quinta was the owner's temporary residence, used for summer holidays and festive events, with gardens, pavilions, fountains, and lakes for recreation and leisure. The focus on palaces, as historic buildings in need of new uses, clearly shows how current the debate on contemporary interventions in this heritage typology is. Interventions in architectural heritage require multidisciplinary teams to identify conservation strategies that enable a qualified use of the spaces, such as the experience of security and well-being, which can contribute to a better quality of life and simultaneously to the quality of the urban environment. This paper presents the Palace of Quinta Alegre and its rehabilitation project for contemporary use and public esteem, both of which are considered fundamental prerequisites for its sustainable maintenance in space, in time, and in memory.

Relevance:

30.00%

Publisher:

Abstract:

Building Information Modelling (BIM) has been changing the design and construction field ever since it entered the market. It took some time to show its capabilities, and it takes time to master before all of its best features can be exploited. Because it was conceived to be adopted from the earliest design stages, to get the maximum from project decision-making, it still struggles to adapt to existing buildings. In fact, the branch of the methodology dedicated to what has already been built is called Historic BIM, or HBIM. This study aims to clarify what BIM and HBIM are, both from a theoretical point of view and in practice, by applying the state of the art from scratch to a case study. The fortress of San Felice sul Panaro was chosen: a marvellous building with a thousand years of history in its bricks, which has suffered violent earthquakes but is still standing. By means of this example, the study shows the limits that can be encountered when applying the BIM methodology to existing heritage, and points out the new capabilities that a simple 2D design could not achieve.

Relevance:

30.00%

Publisher:

Abstract:

The surface of the Earth is subjected to vertical deformations caused by geophysical and geological processes, which can be monitored by Global Positioning System (GPS) observations. The purpose of this work is to investigate GPS height time series to identify interannual signals affecting the Earth's surface over the European and Mediterranean area during the period 2001-2019. Thirty-six homogeneously distributed GPS stations were selected from the online dataset made available by the Nevada Geodetic Laboratory (NGL) on the basis of the length and quality of their data series. Principal Component Analysis (PCA) was applied to extract the main patterns of the spatial and temporal variability of the GPS Up coordinate. The time series were then studied by means of a frequency analysis using a periodogram and the real-valued Morlet wavelet: the periodogram identifies the dominant frequencies and the spectral density of the investigated signals, while the wavelet identifies the signals in the time domain and their periodicities. This study identified, over the European and Mediterranean area, interannual non-linear signals with periods of 2 to 4 years, possibly related to atmospheric and hydrological loading displacements and to climate phenomena such as the El Niño Southern Oscillation (ENSO). A clear signal with a period of about six years is present in the vertical component of the GPS time series, likely explained by the gravitational coupling between the Earth's mantle and the inner core. Moreover, signals with periods on the order of 8-9 years, which might be explained by mantle-inner core gravitational coupling and the cycle of the lunar perigee, and a signal of 18.6 years, likely associated with the lunar nodal cycle, were identified through the wavelet spectrum. These last two signals need further confirmation, however, because the present length of the GPS time series is still short compared to the periods involved.
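
Both spectral tools mentioned are standard; the sketch below applies a periodogram and a Morlet continuous wavelet transform to a synthetic daily height series with assumed annual and roughly six-year components (not NGL data), using SciPy's cwt/morlet2, available in SciPy releases before 1.15:

```python
import numpy as np
from scipy import signal

fs = 365.25                         # daily sampling, in samples per year
t = np.arange(int(19 * fs)) / fs    # ~19 years, time in years
rng = np.random.default_rng(1)

# Synthetic GPS Up series (assumed): annual cycle + ~6-yr signal + noise (mm)
up = (3.0 * np.sin(2 * np.pi * t)           # 1 cycle/yr
      + 1.5 * np.sin(2 * np.pi * t / 6.0)   # ~6-yr period
      + rng.normal(0, 1.0, t.size))

# Periodogram: spectral density vs frequency in cycles per year
f, pxx = signal.periodogram(up, fs=fs)
print("dominant period ~ %.1f yr" % (1 / f[1:][pxx[1:].argmax()]))

# Morlet continuous wavelet transform: time-frequency power map
w = 6.0
freqs = np.linspace(0.05, 2.0, 100)          # cycles per year
widths = w * fs / (2 * np.pi * freqs)        # convert frequencies to scales
cwtm = signal.cwt(up, signal.morlet2, widths, w=w)
power = np.abs(cwtm) ** 2                    # shape: (n_freqs, n_samples)
```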

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we focus on the analysis and interpretation of time-dependent deformations recorded by different geodetic methods. First, we apply a variational Bayesian Independent Component Analysis (vbICA) technique to GPS daily displacement solutions to separate the postseismic deformation that followed the mainshocks of the 2016-2017 Central Italy seismic sequence from the other, hydrological, deformation sources. By interpreting the signal associated with the postseismic relaxation, we model an afterslip distribution on the faults involved in the mainshocks that is consistent with the co-seismic models available in the literature. We find evidence of aseismic slip on the Paganica fault, responsible for the Mw 6.1 2009 L'Aquila earthquake, highlighting the importance of aseismic slip and static stress transfer for properly modeling the recurrence of earthquakes on nearby fault segments. We infer a possible viscoelastic relaxation of the lower crust as a contributing mechanism to the postseismic displacements. We highlight the importance of a proper separation of the hydrological signals for an accurate assessment of the tectonic processes, especially for mm-scale deformations, and we provide a physical explanation for the independent components associated with the observed hydrological processes. In the second part of the thesis, we focus on strain data from Gladwin Tensor Strainmeters, working on the instruments deployed in Taiwan. We develop a novel, completely data-driven approach to calibrate these strainmeters. We carry out a joint analysis of geodetic (strainmeter, GPS, and GRACE products) and hydrological (rain gauge and piezometer) data sets to characterize the hydrological signals in southern Taiwan. Lastly, we apply the proposed calibration approach to the strainmeters recently installed in Central Italy and provide, as an example, the detection of a storm that hit the Umbria-Marche regions (Italy), demonstrating the potential of strainmeters for following the dynamics of deformation processes with limited spatio-temporal signatures.
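
As a conceptual stand-in for the vbICA decomposition (the sketch below uses scikit-learn's FastICA on synthetic mixtures, not the variational Bayesian implementation or real GPS data), the idea is to unmix daily displacement series into statistically independent sources such as a postseismic transient and a seasonal hydrological cycle:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
t = np.linspace(0, 3, 1095)                     # 3 years, daily (assumed)

# Synthetic sources: afterslip-like logarithmic decay + annual hydrology
s1 = np.log1p(t / 0.1)                          # postseismic transient
s2 = np.sin(2 * np.pi * t)                      # seasonal loading cycle
S = np.c_[s1, s2]

# Mix the sources onto 10 "stations" with random gains, then add noise
A = rng.normal(size=(2, 10))
X = S @ A + 0.05 * rng.normal(size=(1095, 10))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)                # columns ~ independent components
print("recovered ICs shape:", recovered.shape)  # (1095, 2)
```

Each recovered component comes with spatial weights (the mixing matrix), which is what allows a physical interpretation, tectonic versus hydrological, station by station.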

Relevance:

30.00%

Publisher:

Abstract:

This thesis aims to understand the behavior of a low-rise unreinforced masonry (URM) building, the typical residential house in the Netherlands, when subjected to low-intensity earthquakes. In recent decades, the Groningen region has been hit by several shallow earthquakes caused by the extraction of natural gas. The focus is on the internal non-structural walls and on their interaction with the structural parts of the building. A simple and cost-efficient 2D FEM model is developed, centered on the interfaces representing the mortar layers present between the non-structural walls and the rest of the structure. As a reference for geometries and materials, a full-scale prototype built at the EUCENTRE laboratory in Pavia (Italy) was taken into consideration. First, a quasi-static analysis is performed by gradually applying a prescribed displacement at the roof floor of the structure. Sensitivity analyses are conducted on some key parameters characterizing the mortar; they allow the calibration of the parameter values and an evaluation of the reliability of the model. Subsequently, a transient analysis is performed to subject the model to a seismic action and thereby evaluate the mechanical response of the building over time. By creating a model representing the entire structure under consideration, it was also possible to compare the results of this analysis with the displacements recorded in the experimental tests. As a result, some conditions for the model calibration are defined. The reliability of the model is then confirmed both by the reasonable results of the sensitivity analyses and by the agreement between the roof-floor displacement recorded in the experimental test and the corresponding value obtained from the structural model.
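
The transient step can be illustrated with the classic Newmark average-acceleration scheme for a single degree of freedom (a generic sketch with assumed properties, not the thesis's FEM model): it integrates m*u'' + c*u' + k*u = -m*a_g(t) under a base-acceleration history:

```python
import numpy as np

def newmark_sdof(m, c, k, a_g, dt, beta=0.25, gamma=0.5):
    """Average-acceleration Newmark integration of m*u'' + c*u' + k*u = -m*a_g."""
    n = len(a_g)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (-m * a_g[0] - c * v[0] - k * u[0]) / m
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    for i in range(n - 1):
        # Effective load from the next ground acceleration and current state
        p = (-m * a_g[i + 1]
             + m * (u[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                    + (1 / (2 * beta) - 1) * a[i])
             + c * (gamma * u[i] / (beta * dt)
                    + (gamma / beta - 1) * v[i]
                    + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = p / k_eff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u

# Assumed SDOF stand-in for the wall-structure system and a toy pulse record
dt, T = 0.01, 0.3                          # time step (s), natural period (s)
m = 1000.0                                 # mass (kg)
k = (2 * np.pi / T) ** 2 * m               # stiffness matching the period
c = 2 * 0.05 * np.sqrt(k * m)              # 5% viscous damping
t = np.arange(0, 5, dt)
a_g = 0.5 * 9.81 * np.sin(2 * np.pi * t) * np.exp(-t)   # decaying pulse (m/s^2)
print("peak displacement: %.4f m" % np.abs(newmark_sdof(m, c, k, a_g, dt)).max())
```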