3 results for Missions California.
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The work for the present thesis started in California, during my semester as an exchange student overseas. California is known worldwide for its seismicity and for its effort in the earthquake engineering research field. For this reason, I immediately found interesting the proposal of Professor Maria Q. Feng (Structural Dynamics) to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California, and for the most part it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California.

The purpose of this work is to perform a pushover analysis of an existing bridge located in Southern California and to compare the results with those of a nonlinear time-history analysis. The analyses have been executed with the software OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing (JRO) is classified as a Standard Ordinary Bridge: a typical three-span continuous cast-in-place prestressed (post-tensioned) box-girder bridge. The total length of the bridge is 366 ft, and the heights of the two bents are 26.41 ft and 28.41 ft, respectively.

Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system; in order to execute nonlinear analyses of highway bridges it is essential to incorporate an accurate model of the material behavior. It has been observed, after destructive earthquakes, that the columns are among the most damaged elements of highway bridges. To evaluate the performance of bridge columns during seismic events, an adequate model of the column must be incorporated. Part of the work of the present thesis is therefore dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on the determination of the plasticity-zone length and location. Furthermore, different models for the concrete and steel materials have been considered, and the parameters defining the constitutive laws of the different materials have been selected with care.

The work is structured into four chapters; a brief overview of their content follows. The first chapter introduces the concepts related to capacity design, the current philosophy of seismic design. Nonlinear analyses, both static (pushover) and dynamic (time-history), are then presented. The final paragraph gives a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and with the issues regarding the objectivity of the response in the nonlinear range. Both concentrated- and distributed-plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used (OpenSees), and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work.
Nonlinear material constitutive laws for concrete and reinforcing steel are discussed in detail, as are the different scenarios employed in modeling the columns. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios used, and the failure modes of concrete and steel are discussed. The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared to those of the pushover analysis.
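The thesis model itself is not reproduced in the abstract, but a minimal OpenSeesPy sketch of the kind of analysis described above (a single reinforced-concrete column with a fiber section, a force-based distributed-plasticity element, and a displacement-controlled pushover) may help illustrate the approach. All section dimensions, material parameters, and tags below are illustrative assumptions, not the values used in the thesis.

# Minimal sketch: one RC column, fiber section (Concrete01 core/cover +
# Steel01 bars), force-based element, displacement-controlled pushover.
# All numbers are illustrative, not the JRO design values.
import openseespy.opensees as ops

ops.wipe()
ops.model('basic', '-ndm', 2, '-ndf', 3)

H = 26.41 * 12.0                      # column height in inches (assumed bent height)
ops.node(1, 0.0, 0.0)
ops.node(2, 0.0, H)
ops.fix(1, 1, 1, 1)

# Uniaxial materials (ksi): confined core, unconfined cover, reinforcing steel
ops.uniaxialMaterial('Concrete01', 1, -5.0, -0.004, -4.0, -0.014)   # core
ops.uniaxialMaterial('Concrete01', 2, -4.0, -0.002,  0.0, -0.006)   # cover
ops.uniaxialMaterial('Steel01',    3, 68.0, 29000.0, 0.01)          # bars

# Circular fiber section: core patch, cover patch, ring of longitudinal bars
D, cover, nBars, Abar = 66.0, 2.0, 26, 1.56
rCol, rCore = D / 2.0, D / 2.0 - cover
ops.section('Fiber', 1)
ops.patch('circ', 1, 24, 8, 0.0, 0.0, 0.0, rCore, 0.0, 360.0)
ops.patch('circ', 2, 24, 2, 0.0, 0.0, rCore, rCol, 0.0, 360.0)
ops.layer('circ', 3, nBars, Abar, 0.0, 0.0, rCore)

# Force-based element with 5 Gauss-Lobatto points (distributed plasticity)
ops.geomTransf('PDelta', 1)
ops.beamIntegration('Lobatto', 1, 1, 5)
ops.element('forceBeamColumn', 1, 1, 2, 1, 1)

# Pushover: unit lateral load at the top, driven in displacement control
ops.timeSeries('Linear', 1)
ops.pattern('Plain', 1, 1)
ops.load(2, 1.0, 0.0, 0.0)
ops.constraints('Plain')
ops.numberer('RCM')
ops.system('BandGeneral')
ops.test('NormDispIncr', 1.0e-8, 25)
ops.algorithm('Newton')
ops.integrator('DisplacementControl', 2, 1, 0.05)   # 0.05 in per step
ops.analysis('Static')

curve = []                                          # (top displacement, base shear)
for _ in range(200):
    if ops.analyze(1) != 0:
        break                                       # stop at convergence failure
    ops.reactions()
    curve.append((ops.nodeDisp(2, 1), -ops.nodeReaction(1, 1)))

The recorded (displacement, base shear) pairs form the capacity curve, which, as in the thesis, can then be converted into a capacity spectrum and compared against a design spectrum.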
Abstract:
Passive acoustic data collected using HARPs (High-frequency Acoustic Recording Packages) were used to assess (1) the seasonality of blue whale D calls in the Southern California Bight, (2) their interannual abundance during 2007-2012, and (3) their diel variation. This was achieved by running the GPL (Generalized Power-Law) automated detector. (1) Blue whale D calls were detected in the Southern California Bight from May through November, with a peak in July, although a few detections also occurred from December to April. Zooplankton aggregations have been identified as a key predictor of blue whale distribution and movement in the California Current region, with particular attention to euphausiid species such as E. pacifica and T. spinifera, the blue whales' preferred krill. The Southern California Bight experiences seasonal upwelling, which increases productivity and prey availability; summer and early fall have been marked as the most favorable periods. This supports the presence of blue whales in the area at that time, on the assumption that these marine mammals exploit the region as a feeding ground. (2) As to the interannual abundance during 2007-2012, I found large variability. Vocalizations increased markedly in 2007 and 2010, whereas the other years showed a decrease, most pronounced in 2009. It is my belief that these fluctuations in the abundance of D call detections over the deployment period are due to the alternation of the El Niño and La Niña events that occurred in those years. (3) The assessment of the daily timing of D call production shows that D calls are more abundant during the day than during the night, with a peak between 12:00 and 13:00. Assuming that D calling is associated with feeding, the daily pattern of D calls may be linked to prey availability. E. pacifica and T. spinifera are among the krill species that undertake daily vertical migrations, remaining at depth during the day and slowly coming up towards the surface at night. Owing to their anatomy, these euphausiids are very sensitive to light. Given that D calls are believed to have a social function, I hypothesize that blue whales may recognize the hours of highest solar incidence as the best time of day in terms of prey availability, exploiting this time window to alert their conspecifics.
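The diel assessment described in point (3) amounts to binning detection times by hour of day. A short Python sketch of that step follows; the file name and column name are hypothetical, and the real GPL detector output format may differ.

# Illustrative diel-pattern summary: count GPL-detected D calls per hour of day.
# 'd_call_detections.csv' and the 'detection_time' column are assumed names;
# timestamps are assumed to already be in local time.
import pandas as pd

det = pd.read_csv('d_call_detections.csv', parse_dates=['detection_time'])
hourly = det['detection_time'].dt.hour.value_counts().sort_index()

print(hourly)                       # a midday peak would show up around hours 12-13
print('Peak hour:', hourly.idxmax())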
Abstract:
The objective of the work presented in this thesis concerned the study and simulation of bistatic radar experiments for planetary exploration missions. In particular, the work focused on the use and improvement of a software simulator already developed by a consortium of companies and research institutions within a European Space Agency (ESA) study funded in 2008 and carried out between 2009 and 2010. The Spanish company GMV coordinated the study, which also involved research groups from the Università di Roma "Sapienza" and the Università di Bologna. The work concentrated on determining the cause of some inconsistencies in the outputs of the part of the simulator, developed in MATLAB, devoted to estimating the characteristics of Titan's surface, in particular the dielectric constant and the mean surface roughness, by means of a downlink bistatic radar experiment performed by the Cassini-Huygens probe in orbit around Titan. Bistatic radar experiments for the study of celestial bodies have been part of the history of space exploration since the 1960s, even though the instruments used and the mission phases during which these experiments were carried out were never specifically designed for the purpose. Hence the need for a simulator to study the various possible configurations of bistatic radar experiments in different types of mission. In a first phase of approaching the simulator, the work focused on studying the documentation accompanying the code, so as to gain a general idea of its structure and operation. A phase of detailed study followed, determining the purpose of every line of code used and verifying against the literature the formulas and models used to determine the various parameters. In a second phase the work involved direct intervention on the code, with a series of investigations aimed at assessing the consistency and reliability of its results. Each investigation reduced the simplifying assumptions imposed on the model, so as to identify with greater confidence the part of the code responsible for the inaccuracy of the simulator's outputs. The results obtained allowed the correction of some parts of the code and the identification of the main source of error in the outputs, narrowing the object of study for future targeted investigations.
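The simulator's MATLAB code is not reproduced here, but the estimation of a surface dielectric constant from bistatic radar echoes ultimately rests on the Fresnel reflection coefficients of the surface. The following generic Python sketch (not the ESA simulator's implementation) illustrates those coefficients for an assumed dielectric constant; roughness effects are ignored and the value of 2.2 is a purely hypothetical figure for an icy surface.

# Generic illustration: smooth-surface Fresnel reflection coefficients for a
# homogeneous dielectric half-space, as used when inverting bistatic-radar
# echoes for the surface dielectric constant. Not the simulator's code.
import numpy as np

def fresnel_coefficients(eps_r, theta_inc_deg):
    """Horizontal (r_h) and vertical (r_v) amplitude reflection coefficients
    for incidence from vacuum at angle theta measured from the surface normal."""
    theta = np.radians(theta_inc_deg)
    root = np.sqrt(eps_r - np.sin(theta) ** 2 + 0j)
    r_h = (np.cos(theta) - root) / (np.cos(theta) + root)
    r_v = (eps_r * np.cos(theta) - root) / (eps_r * np.cos(theta) + root)
    return r_h, r_v

eps_assumed = 2.2                                   # hypothetical dielectric constant
# The Brewster angle, where |r_v| vanishes, directly constrains eps_r:
print('Brewster angle [deg]:', np.degrees(np.arctan(np.sqrt(eps_assumed))))
for angle in (20.0, 40.0, 60.0):
    r_h, r_v = fresnel_coefficients(eps_assumed, angle)
    print(f'theta = {angle:4.1f} deg  |r_h| = {abs(r_h):.3f}  |r_v| = {abs(r_v):.3f}')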