6 results for Ground analysis
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The work for the present thesis started in California, during my semester as an exchange student overseas. California is known worldwide for its seismicity and its effort in the earthquake engineering research field. For this reason, I was immediately interested in the proposal of the Structural Dynamics professor, Maria Q. Feng, to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California, and for the most part it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California. The purpose of this work is to perform a pushover analysis and compare the results with those of a nonlinear time-history analysis of an existing bridge located in Southern California. The analyses have been carried out with the software OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing (JRO) is classified as a Standard Ordinary Bridge: a typical three-span continuous cast-in-place prestressed post-tensioned box girder. The total length of the bridge is 366 ft, and the heights of the two bents are 26.41 ft and 28.41 ft, respectively. Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system; in order to execute nonlinear analyses of highway bridges, it is essential to incorporate an accurate model of the material behavior. It has been observed, after the occurrence of destructive earthquakes, that the columns are among the most damaged elements of highway bridges. To evaluate the performance of bridge columns during seismic events, an adequate model of the column must be incorporated. Part of the work of the present thesis is therefore dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on the determination of the length and location of the plasticity zone. Furthermore, different models for the concrete and steel materials have been considered, and the parameters that define the constitutive laws of the different materials have been selected with care. The work is structured into four chapters; a brief overview of their content follows. The first chapter introduces the concepts related to capacity design, as the current philosophy of seismic design. Nonlinear analyses, both static (pushover) and dynamic (time-history), are then presented. The final paragraph concludes with a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and the issues regarding the objectivity of the response in the nonlinear range. Both concentrated and distributed plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used, OpenSees, and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work. The nonlinear material constitutive laws for concrete and reinforcing steel are discussed in detail, as are the different scenarios employed in the modeling of the columns. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios used, and the failure modes of concrete and steel are discussed. The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared with those of the pushover analysis.
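To make the modeling approach described in this abstract more concrete, the sketch below shows a minimal OpenSeesPy model of a single cantilever column with a force-based, distributed-plasticity element and a fiber section using standard Concrete01 and Steel02 laws, followed by a displacement-controlled pushover. It is not the thesis model of the JRO: the geometry, section dimensions, and material parameters are placeholder assumptions.

```python
# Minimal pushover sketch with OpenSeesPy: one cantilever column modeled with a
# force-based (distributed plasticity) element and a fiber section.
# All numbers below are illustrative placeholders, not the JRO bridge values.
import openseespy.opensees as ops

ops.wipe()
ops.model('basic', '-ndm', 2, '-ndf', 3)           # 2D frame: ux, uy, rz

H = 8.0                                            # assumed column height [m]
ops.node(1, 0.0, 0.0)
ops.node(2, 0.0, H)
ops.fix(1, 1, 1, 1)                                # fixed base

# Uniaxial material laws (placeholder parameters, kN-m units)
ops.uniaxialMaterial('Concrete01', 1, -35.0e3, -0.002, -28.0e3, -0.006)  # confined core
ops.uniaxialMaterial('Concrete01', 2, -30.0e3, -0.002,   0.0,  -0.004)   # unconfined cover
ops.uniaxialMaterial('Steel02',    3, 450.0e3, 200.0e6, 0.01, 18, 0.925, 0.15)

# Circular fiber section: core patch, cover patch, longitudinal bar layer
D, cover, Abar, nBars = 1.2, 0.05, 8.0e-4, 20
ops.section('Fiber', 1)
ops.patch('circ', 1, 16, 8, 0.0, 0.0, 0.0, D/2 - cover, 0.0, 360.0)
ops.patch('circ', 2, 16, 2, 0.0, 0.0, D/2 - cover, D/2, 0.0, 360.0)
ops.layer('circ', 3, nBars, Abar, 0.0, 0.0, D/2 - cover)

# Force-based element with 5 Gauss-Lobatto integration points
ops.geomTransf('Linear', 1)
ops.beamIntegration('Lobatto', 1, 1, 5)
ops.element('forceBeamColumn', 1, 1, 2, 1, 1)

# Lateral pushover under displacement control at the top node
ops.timeSeries('Linear', 1)
ops.pattern('Plain', 1, 1)
ops.load(2, 1.0, 0.0, 0.0)                         # unit reference lateral load

ops.constraints('Plain')
ops.numberer('RCM')
ops.system('BandGeneral')
ops.test('NormDispIncr', 1.0e-8, 25)
ops.algorithm('Newton')
ops.integrator('DisplacementControl', 2, 1, 0.001) # 1 mm per step
ops.analysis('Static')

capacity = []                                      # (top displacement, base shear) pairs
for _ in range(300):
    if ops.analyze(1) != 0:
        break                                      # stop at convergence failure
    capacity.append((ops.nodeDisp(2, 1), ops.getLoadFactor(1)))
```

The pairs collected in `capacity` trace the pushover capacity curve; in the actual thesis workflow this curve would then be converted into a capacity spectrum and intersected with the design spectrum.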
Abstract:
All structures designed by engineers are vulnerable to natural disasters, including floods and earthquakes. The energy released during strong ground motions should be dissipated by structural elements. Before the 1990s, this energy was expected to be dissipated by the beams and columns, which at the same time were part of the gravity-load-resisting system. The main disadvantage of this approach, however, was that the gravity-resisting frame was not repairable. Hence, during the 1990s, the idea of designing passive energy dissipation systems, including dampers, emerged. At the beginning, the main problem was the lack of guidelines for passive energy dissipation systems. Although many guidelines and procedures had been published by 2000, most of them were based on complicated analyses that were not convenient for engineers and practitioners. To address this problem, several alternative design methods have recently been proposed, including: 1. the simple procedure for optimal damper configuration in MDOF structures by Lopez Garcia (2001); 2. the trial-and-error procedure by Christopoulos and Filiatrault (2006); 3. the Five-Step Method by Silvestri et al. (2010); 4. the Direct Five-Step Method by Palermo et al. (2015); 5. the Simplified Equivalent Static Analysis (ESA) by Palermo et al. (2016). In this study, the effectiveness of and differences between the last three of these methods have been evaluated.
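As a rough illustration of the principle that the damper design methods above exploit, the short sketch below applies the Eurocode 8 damping correction factor to show how supplemental viscous damping scales down the elastic spectral demand. It is a generic illustration under an assumed 5%-damped spectral ordinate, not a reproduction of the Five-Step or ESA procedures.

```python
# Illustrative only: effect of supplemental viscous damping on spectral demand,
# using the Eurocode 8 damping correction factor eta = sqrt(10 / (5 + xi)) >= 0.55.
# This is not the Five-Step Method or the ESA procedure, just the underlying idea
# that added damping reduces the elastic response spectrum.
from math import sqrt

def damping_correction(xi_percent: float) -> float:
    """EC8 correction factor for an equivalent viscous damping ratio xi (in %)."""
    return max(sqrt(10.0 / (5.0 + xi_percent)), 0.55)

Sa_5 = 0.60                      # assumed 5%-damped spectral acceleration [g]
for xi in (5, 10, 20, 30):       # total damping: inherent 5% plus supplemental
    print(f"xi = {xi:2d}%  ->  Sa = {Sa_5 * damping_correction(xi):.3f} g")
```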
Abstract:
The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The choice of the groin vault is due to the widespread presence of this vulnerable roofing system in the historical heritage. Experimental tests on the shaking table are carried out to explore the vault response under two support boundary conditions, involving four lateral confinement modes. The processing of the marker displacement data has made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then follows, to provide the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the last objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study aims to continue the previous work and to take another step forward in the research on ground motion effects on masonry structures.
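To clarify how an elongation between two diagonal blocks can be extracted from tracked marker data and checked against a critical value, here is a minimal sketch; the array names, coordinates, and threshold are hypothetical stand-ins for the shaking-table records.

```python
# Minimal sketch: elongation history between two tracked markers on the vault.
# 'marker_a' and 'marker_b' are hypothetical (n_frames, 3) arrays of x, y, z
# coordinates from motion-capture data; the critical value is a placeholder.
import numpy as np

def elongation_history(marker_a: np.ndarray, marker_b: np.ndarray) -> np.ndarray:
    """Change in distance between two markers relative to the undeformed frame."""
    distance = np.linalg.norm(marker_a - marker_b, axis=1)
    return distance - distance[0]          # > 0 elongation, < 0 shortening

# Example with synthetic data standing in for the shaking-table records
rng = np.random.default_rng(0)
marker_a = np.array([0.0, 0.0, 1.0]) + 0.001 * rng.standard_normal((500, 3))
marker_b = np.array([1.0, 1.0, 1.2]) + 0.001 * rng.standard_normal((500, 3))

delta = elongation_history(marker_a, marker_b)
critical_elongation = 0.005                # placeholder limit [m]
exceeded = np.flatnonzero(delta > critical_elongation)
print(f"frames above the critical elongation: {exceeded.size}")
```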
Abstract:
The Venice Lagoon is a complex, heterogeneous and highly dynamic system, subject to anthropogenic and natural pressures that deeply affect the functioning of this ecosystem. Thanks to the development of acoustic technologies, it is possible to obtain high-resolution maps that describe the characteristics of the seabed. With this aim, a high-resolution Multibeam Echosounder (MBES) bathymetry and backscatter survey was carried out in 2021 within the Research Programme Venezia 2021. Ground-truthing samples were collected at 24 sampling sites to characterize the seafloor and validate the maps produced with the MBES acoustic data. Ground-truthing included the collection of sediment samples for particle size analysis and video footage of the seabed to describe the biological component. The backscatter data were analysed using the unsupervised Jenks classification. We created a habitat map by integrating morphological, granulometric and biological data in a GIS environment. The results obtained in this study were compared with those collected in 2015 as part of the National Flagship Project RITMARE. Through the comparison of the repeated morpho-bathymetric surveys over time, we highlighted the changes in seafloor geomorphology, sediment, and habitat distribution. We observed different types of habitats and the presence of areas characterized by erosive processes and others in which deposition occurred. These effects led to changes in the benthic communities and in the type of sediment. The combination of the MBES surveys, the ground-truth data and the GIS methodology made it possible to construct high-resolution maps of the seafloor and proved to be an effective tool for monitoring an extremely dynamic area. This work can contribute not only to broadening the knowledge of transitional environments, but also to their monitoring and protection.
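As an indication of how an unsupervised Jenks (natural breaks) classification of a backscatter mosaic can be applied in practice, here is a minimal sketch assuming the jenkspy package and a synthetic placeholder array of backscatter intensities; the class count and values are illustrative, not those of the survey.

```python
# Minimal sketch of an unsupervised Jenks (natural breaks) classification of
# backscatter values. The 'backscatter' array and the number of classes are
# placeholders; the real mosaic would come from the processed MBES data.
import numpy as np
import jenkspy                      # assumed dependency providing jenks_breaks()

rng = np.random.default_rng(42)
backscatter = np.concatenate([      # synthetic dB values mimicking two seabed types
    rng.normal(-32.0, 2.0, 5000),
    rng.normal(-22.0, 2.5, 5000),
])

n_classes = 4
breaks = jenkspy.jenks_breaks(backscatter.tolist(), n_classes)    # class boundaries
labels = np.digitize(backscatter, breaks[1:-1])                   # 0 .. n_classes-1

print("break values (dB):", [round(b, 1) for b in breaks])
print("pixels per class:", np.bincount(labels, minlength=n_classes))
```

The resulting class labels would then be mapped back onto the mosaic grid and combined with the ground-truth and morphological layers in the GIS environment.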
Abstract:
Artificial Intelligence (AI) is gaining ever more ground in every sphere of human life, to the point that it is now even used to pass sentences in courts. The use of AI in the field of law is, however, deemed quite controversial: it could provide more objectivity, yet it could also entail an abuse of power, given that bias in the algorithms behind AI may cause a lack of accuracy. As a product of AI, machine translation is being increasingly used in the field of law too, in order to translate laws, judgements, contracts, etc. between different languages and different legal systems. In the legal setting of Company Law, accuracy of content and suitability of terminology play a crucial role in a translation task, as any addition or omission of content or mistranslation of terms could entail legal consequences for companies. The purpose of the present study is first to assess which neural machine translation system, DeepL or ModernMT, produces a more suitable translation from Italian into German of the atto costitutivo of an Italian s.r.l. in terms of accuracy of content and correctness of terminology, and then to assess which translation proves to be closer to a human reference translation. In order to achieve these aims, two evaluations, one human and one automatic, are carried out, based on the MQM taxonomy and the BLEU metric respectively. The results of both evaluations show an overall better performance by ModernMT in terms of content accuracy, suitability of terminology, and closeness to a human translation. As emerged from the MQM-based evaluation, its accuracy and terminology errors account for just 8.43% (as opposed to DeepL’s 9.22%), while it obtains an overall BLEU score of 29.14 (against DeepL’s 27.02). The overall performances, however, show that machines still face barriers in overcoming semantic complexity, tackling polysemy, and choosing domain-specific terminology, which suggests that the discrepancy with human translation may still be considerable.
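For readers unfamiliar with the automatic part of such an evaluation, the sketch below shows how a corpus-level BLEU score can be computed for two MT outputs against a single human reference using the sacrebleu library; the German segments are invented placeholders, not the atto costitutivo corpus or the actual system outputs.

```python
# Minimal sketch of the automatic evaluation step: corpus-level BLEU for two MT
# systems against one human reference, using sacrebleu. The German segments are
# invented placeholders, not the actual atto costitutivo data.
import sacrebleu

reference = [
    "Die Gesellschaft hat ihren Sitz in Bologna.",
    "Das Stammkapital beträgt zehntausend Euro.",
]
system_a = [   # e.g. output of the first MT system
    "Die Gesellschaft hat ihren Sitz in Bologna.",
    "Das Gesellschaftskapital beträgt zehntausend Euro.",
]
system_b = [   # e.g. output of the second MT system
    "Der Sitz der Gesellschaft ist Bologna.",
    "Das Stammkapital beläuft sich auf zehntausend Euro.",
]

for name, hypothesis in (("system A", system_a), ("system B", system_b)):
    bleu = sacrebleu.corpus_bleu(hypothesis, [reference])   # references as list of lists
    print(f"{name}: BLEU = {bleu.score:.2f}")
```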