916 results for "failure time analysis"


Relevance:

40.00%

Publisher:

Abstract:

The work for the present thesis started in California, during my semester overseas as an exchange student. California is known worldwide for its seismicity and for its effort in the earthquake engineering research field. For this reason, I immediately found interesting the proposal of the Structural Dynamics professor, Maria Q. Feng, to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California and, for the most part, it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California.

The purpose of this work is to perform a pushover analysis of an existing bridge located in Southern California and to compare the results with those of a nonlinear time-history analysis. The analyses have been executed with OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing (JRO) is classified as a Standard Ordinary Bridge: it is a typical three-span continuous cast-in-place prestressed post-tensioned box girder. The total length of the bridge is 366 ft, and the heights of the two bents are 26.41 ft and 28.41 ft, respectively. Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system: in order to execute nonlinear analyses of highway bridges it is essential to incorporate an accurate model of the material behavior.

It has been observed, after destructive earthquakes, that the columns are among the most damaged elements of highway bridges. To evaluate the performance of bridge columns during seismic events, an adequate model of the column must be incorporated; part of the work of the present thesis is therefore dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on the determination and location of the plasticity-zone length. Furthermore, different models for the concrete and steel materials have been considered, and the parameters that define their constitutive laws have been selected with care.

The work is structured into four chapters; a brief overview of the content follows. The first chapter introduces the concepts related to capacity design, the current philosophy of seismic design, and presents the nonlinear analyses, both static (pushover) and dynamic (time-history). The final paragraph concludes with a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and the issues regarding the objectivity of the response in the nonlinear field; both concentrated- and distributed-plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used (OpenSees), and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work: the nonlinear constitutive laws for concrete and reinforcing steel are discussed in detail, as are the different scenarios employed in modeling the columns. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios, and the failure modes of concrete and steel are discussed. The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared with those of the pushover analysis.
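The capacity-curve conversion mentioned above can be sketched in a few lines. The snippet below is a minimal illustration, not the thesis code: it assumes an ATC-40-style equivalent-SDOF conversion, and the modal quantities (`alpha1`, `gamma1`, `phi_roof`) are hypothetical inputs that would come from a modal analysis of the bridge.

```python
def capacity_spectrum(base_shear, roof_disp, weight, alpha1, gamma1, phi_roof=1.0):
    """Convert a pushover capacity curve (V, d_roof) into ADRS format:
    Sa = V / (alpha1 * W) and Sd = d_roof / (gamma1 * phi_roof),
    where alpha1 is the modal mass coefficient and gamma1 the modal
    participation factor of the equivalent SDOF system (ATC-40 style)."""
    sa = [v / (alpha1 * weight) for v in base_shear]
    sd = [d / (gamma1 * phi_roof) for d in roof_disp]
    return sa, sd
```

Plotting the resulting (Sd, Sa) pairs in the same ADRS plane as the design spectrum makes their intersection, the performance point, directly readable.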


The hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that flooding, and water in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided; their damages, however, can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis.

In operational problems, the ultimate aim of a forecasting system is not to reproduce the river behavior: that is only a means of reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on an evaluation of the model prediction, i.e. of its ability to represent reality, or on an evaluation of what will actually happen, on the basis of the information given by the model forecast?

Once the previous idea has been made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide as accurate an uncertainty assessment as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
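As an illustration of the last two requirements, a minimal sketch (hypothetical, not the thesis tool): given an ensemble of forecast water-level trajectories, the flooding probability within a horizon and the empirical distribution of the flooding time can be estimated by counting threshold exceedances across the ensemble members.

```python
def flood_probability(ensemble, threshold, horizon_steps):
    """Fraction of ensemble members whose water level exceeds `threshold`
    at least once within the first `horizon_steps` time steps."""
    hits = sum(1 for member in ensemble
               if any(level > threshold for level in member[:horizon_steps]))
    return hits / len(ensemble)

def flooding_time_distribution(ensemble, threshold):
    """Empirical distribution of the first exceedance time, one entry per
    ensemble member (None means no flooding in that member's forecast)."""
    return [next((i for i, level in enumerate(member) if level > threshold), None)
            for member in ensemble]
```

For example, with four members and a horizon of two steps, `flood_probability` simply reports the fraction of members that flood early enough for the intervention to matter.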


In the present work we perform an econometric analysis of the Tribal art market. To this aim, we use a unique and original database that includes information on Tribal art auctions worldwide from 1998 to 2011. In the literature, art prices are modelled through the hedonic regression model, a classic fixed-effects model. The main drawback of the hedonic approach is the large number of parameters, since art data generally include many categorical variables. In this work, we propose a multilevel model for the analysis of Tribal art prices that takes into account the influence of time on artwork prices. In fact, it is natural to assume that time influences the price dynamics in various ways. Nevertheless, since the set of objects changes at every auction date, we do not have repeated measurements of the same items over time. Hence, the dataset does not constitute a proper panel; rather, it has a two-level structure in which items (level-1 units) are grouped in time points (level-2 units). The main theoretical contribution is the extension of classical multilevel models to cope with this case. In particular, we introduce a model with time-dependent random effects at the second level. We propose a novel specification of the model, derive the maximum likelihood estimators and implement them through the E-M algorithm. We test the finite-sample properties of the estimators, and the validity of our own R code, by means of a simulation study. Finally, we show that the new model considerably improves the fit of the Tribal art data with respect to both the hedonic regression model and the classic multilevel model.
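The shrinkage idea behind random intercepts at the auction-date level can be conveyed with a toy empirical-Bayes estimator (a hypothetical sketch, not the model or the R code of the thesis): each date's mean deviation from the grand mean is pulled toward zero by a weight that depends on the group size and on assumed variance components `sigma2_e` (level 1) and `sigma2_u` (level 2).

```python
import statistics

def shrunken_intercepts(groups, sigma2_e, sigma2_u):
    """Empirical-Bayes estimates of level-2 random intercepts u_j for the
    random-intercept model y_ij = mu + u_j + e_ij. Each group-mean
    deviation is shrunk toward 0 by a reliability weight that grows with
    the group size and with the ratio sigma2_u / sigma2_e."""
    mu = statistics.fmean([y for g in groups for y in g])  # grand mean
    estimates = []
    for g in groups:
        w = sigma2_u / (sigma2_u + sigma2_e / len(g))      # shrinkage weight
        estimates.append(w * (statistics.fmean(g) - mu))
    return mu, estimates
```

In the full model of the thesis the variance components are themselves estimated (via the E-M algorithm) and the level-2 effects are time dependent; the sketch only shows why sparse auction dates receive more heavily shrunk intercepts.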


The assessment of safety in existing bridges and viaducts led the Ministry of Public Works of the Netherlands to finance a specific campaign aimed at studying the response of the elements of these infrastructures. This activity therefore focuses on the behaviour of reinforced concrete slabs under concentrated loads, adopting finite element modeling and comparison with experimental results. These elements are governed by shear behaviour and shear failure, whose modeling is a hard challenge from a computational point of view, due to the brittle behavior combined with three-dimensional effects. The numerical modeling of the failure is studied through Sequentially Linear Analysis (SLA), an alternative finite element method with respect to traditional incremental and iterative approaches. The comparison between the two numerical techniques represents one of the first such studies in a three-dimensional setting, and it is carried out on one of the experimental tests executed on reinforced concrete slabs. The advantage of SLA is that it avoids the well-known convergence problems of typical nonlinear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of increasing the load or displacement on the whole structure. For the first time, particular attention has been paid to specific aspects of the slabs, such as an accurate modeling of the constraints and the sensitivity of the solution to the mesh density. This detailed parametric analysis proved a strong influence of the tensile fracture energy, the mesh density and the chosen model on the solution, in terms of the force-displacement diagram, the distribution of the crack patterns and the shear failure mode.

The SLA showed great potential, but it requires further development in two aspects of the modeling: load conditions (constant and proportional loads) and the softening behaviour of brittle materials (such as concrete) in the three-dimensional field, in order to widen its horizons in these new contexts of study.
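The saw-tooth logic of SLA can be conveyed with a toy example (a hypothetical sketch on parallel brittle springs, not the slab model): at every event a linear solve identifies the most critical element, and that single element's stiffness and strength are reduced by a fixed factor, instead of iterating on the whole structure.

```python
def sequentially_linear_analysis(k, f_t, reduction=0.5, n_events=3):
    """Toy SLA for parallel springs sharing one displacement u.
    Each event: find the spring that reaches its current strength first,
    record the total load carried at that instant, then apply a saw-tooth
    damage increment (stiffness and strength scaled by `reduction`) to
    that one spring only. Returns the (load, displacement) history."""
    k, f_t = list(k), list(f_t)
    history = []
    for _ in range(n_events):
        i = min(range(len(k)), key=lambda j: f_t[j] / k[j])  # critical spring
        u = f_t[i] / k[i]                  # displacement at the event
        history.append((u * sum(k), u))    # total load = u * sum of stiffnesses
        k[i] *= reduction                  # saw-tooth: reduce stiffness...
        f_t[i] *= reduction                # ...and strength of one element
    return history
```

Because every step is a linear analysis, there is no iteration to diverge; the price is that the load path is traced event by event rather than under a prescribed load increment.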


This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of the aspects analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are discussed first. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. The limitations of the standard approaches for large events emerge in this chapter: the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation are proposed to justify the observations. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested: the first is a threshold-based method which uses traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
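A threshold-based alert of the kind mentioned in the third part can be sketched as follows (all names and numbers are hypothetical placeholders, not the thesis calibration): the peak absolute displacement measured over the first seconds of recorded signal is compared against an empirical alert threshold.

```python
def early_warning_alert(displacement, dt, window_s=3.0, pd_threshold=0.5):
    """Compare the peak absolute displacement (a Pd-like Early Warning
    parameter) over the first `window_s` seconds of signal against an
    alert threshold. `dt` is the sampling interval in seconds; the
    threshold value here is a placeholder, not a calibrated one."""
    n = int(window_s / dt)                       # samples in the early window
    pd_peak = max(abs(x) for x in displacement[:n])
    return pd_peak, pd_peak >= pd_threshold
```

In a real system the threshold would be calibrated regionally against observed ground-motion intensities, and the estimate would be updated as more signal arrives.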


The research activity of this doctoral thesis concerned complex tribological systems of industrial interest, for which the dominant wear mechanisms were identified by means of failure analysis. For each of them, improved solutions were studied on the basis of laboratory tribological tests. Traditional quenched-and-tempered steels are widely used in the manufacture of track links for earth-moving machines. The possibility of using the new medium-carbon microalloyed steels would allow a considerable simplification of the production cycle and cost benefits; part of the thesis concerned the study of the tribological behaviour of these steels. The tribological study of hydraulic motors was also addressed, with the aim of improving their wear resistance and hence their service life. Bench tests were carried out to evaluate the main wear mechanisms, together with laboratory tests designed to reproduce the actual operating conditions, assessing surface-modification techniques capable of reducing component wear. Different types of Thermal Spray coatings were analysed in terms of deposition technique (AFS-APS) and deposited metal alloys (Ni, Mo, Cu/Al). Finally, tribological contacts in the packaging sector were characterised, where the use of austenitic stainless steels is in some cases mandatory. AISI 316L stainless steel is widely used in applications requiring high corrosion resistance; however, its low wear resistance limits its use in tribological applications. In this context, a tribological problem concerning automatic machines for the dosing of pharmaceutical powders was analysed.

Alternative solutions were studied, involving both the complete replacement of the materials of the tribological pair and the identification of innovative surface-modification techniques such as low-temperature carburising, also followed by the deposition of a hydrogenated amorphous carbon (a-C:H) coating.


This work analyses the collapse of a precast reinforced-concrete industrial building during the 2012 Emilia earthquake, focusing on the failure mechanisms and in particular on the flexure-shear interaction. The analysis is performed by means of a time-history analysis on a FEM model built in the software SAP2000. Finally, the collapse is reconstructed on the basis of the numerical data, starting from the strength capacity of the failed elements and using formulations for lightly reinforced columns subjected to high shear and bending moment.


In many industrial sectors, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called clearance analysis. Engineers determine for specific components whether, both in their rest position and during a motion, they maintain a prescribed safety distance from the surrounding components. If components fall below the safety distance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety distance.

In this work we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g. triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance and call these the tolerance-violating primitives. We present a holistic solution that can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches to triangle-triangle tolerance tests and show that specialized tolerance tests are significantly faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the fastest.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient run times, it is above all important to account for the required safety distance in the design of the data structures and of the query algorithms. We present solutions that quickly determine the set of pairs of primitives to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before, which we call Shrubs. Previous approaches to the memory optimization of uniform grids rely mainly on hashing methods, but these do not reduce the memory consumption of the cell contents. In our use case, neighbouring cells often have similar contents. Our approach is able to losslessly compress the memory footprint of the cell contents of a uniform grid, exploiting the redundancy between cells, to one fifth of its previous size, and to decompress it at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications to various path-planning problems.
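The broad-phase role of the uniform grids described in the second part can be illustrated with a simplified sketch (points instead of triangles; the function names are hypothetical, not the thesis API): choosing the grid cell size equal to the safety distance means that only the 27 cells neighbouring a query point's cell ever need to be inspected.

```python
from collections import defaultdict

def build_grid(points, cell):
    """Hash 3D points into a uniform grid with cell size `cell`."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)
    return grid

def tolerance_pairs(points_a, points_b, tol):
    """All index pairs (i, j) with |a_i - b_j| < tol. With the cell size
    set to `tol`, any violating partner of a point must lie in one of the
    27 cells around that point's cell."""
    grid = build_grid(points_b, tol)
    pairs = []
    for i, (x, y, z) in enumerate(points_a):
        cx, cy, cz = int(x // tol), int(y // tol), int(z // tol)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
                        bx, by, bz = points_b[j]
                        if (x - bx) ** 2 + (y - by) ** 2 + (z - bz) ** 2 < tol * tol:
                            pairs.append((i, j))
    return pairs
```

The actual work replaces the final distance check with specialized triangle-triangle tolerance tests and adds hierarchical structures on top; the sketch only shows why the safety distance must enter the grid design itself.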


Objectives: To assess the biological and technical complication rates of single crowns on vital teeth (SC-V), on endodontically treated teeth without post and core (SC-E), with a cast post and core (SC-PC), and on implants (SC-I). Material and methods: From 392 patients with chronic periodontitis treated and documented by graduate students during the period from 1978 to 2002, 199 were reexamined during 2005 for this retrospective cohort study, and 64 of these patients were treated with SCs. Statistical analysis included Kaplan-Meier survival functions and event rates per 100 years of object-time. Poisson regression was used to compare the four groups of crowns with respect to the incidence rate ratio of failures, and of failures and complications combined, over 10 years and over the entire observation period. Results: Forty-one (64%) female and 23 (36%) male patients participated in the reexamination. At the time of seating the crowns, the mean patient age was 46.8 (range 24-66.3) years. One hundred and sixty-eight single-unit crowns were incorporated. Their mean follow-up time was 11.8 (range 0.8-26.4) years. During the time of observation, 22 biological and 11 technical complications occurred; 19 SCs were lost. The chance of remaining free of any failure or complication after 10 years was 89.3% (95% confidence interval [CI] 76.1-95.4) for SC-V (n = 56), 85.8% (95% CI 66-94.5) for SC-E (n = 34), 75.9% (95% CI 58.8-86.7) for SC-PC (n = 39), and 66.2% (95% CI 45.1-80.7) for SC-I (n = 39). Over 10 years, 95% of SC-I remained free of failure, but they demonstrated a cumulative incidence of failure or complication of 34%. Compared with SC-E, SC-I were 3.5 times more likely to yield failures or complications, and SC-PC failed 1.7 times more frequently than did SC-E. SC-V had the lowest rate of failures or complications over the 10 years. Conclusions: While SCs on vital teeth have the best prognosis, those on endodontically treated teeth have a slightly poorer prognosis over 10 years.
Crowns on teeth with post and cores and implant-supported SCs displayed the highest incidence of failures and complications.
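The survival machinery used above is standard; as an illustration (a generic sketch, not the study's statistical code), a product-limit Kaplan-Meier estimate and an event rate per 100 object-years can be computed as follows.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. `events[i]` is True for a failure,
    False for censoring. Returns (time, S(t)) pairs at each failure time:
    S is multiplied by (n_at_risk - d) / n_at_risk at every failure."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)   # failures at t
        c = sum(1 for tt, e in data if tt == t)         # all leaving at t
        if d:
            s *= (n_at_risk - d) / n_at_risk
            curve.append((t, s))
        n_at_risk -= c
        i += c
    return curve

def rate_per_100_years(n_events, total_years):
    """Event rate per 100 object-years of observation."""
    return 100.0 * n_events / total_years
```

For example, four crowns observed for 1, 2, 3 and 4 years with failures at years 1 and 3 give survival steps at exactly those two failure times.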


To (1) establish the feasibility of texture analysis for the in vivo assessment of biochemical changes in meniscal tissue on delayed gadolinium-enhanced magnetic resonance imaging of cartilage (dGEMRIC), and (2) compare textural with conventional T1 relaxation time measurements calculated from dGEMRIC data ("T1(Gd) relaxation times").


Objective: To assess the outcome of patients who experienced treatment failure on antiretrovirals in sub-Saharan Africa. Methods: Analysis of 11 antiretroviral therapy (ART) programmes in sub-Saharan Africa. World Health Organization (WHO) criteria were used to define treatment failure. All ART-naive patients aged ≥16 who started with a non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimen and had at least 6 months of follow-up were eligible. For each patient who switched to a second-line regimen, 10 matched patients who remained on a non-failing first-line regimen were selected. Time was measured from the time of switching, from the corresponding time point in matched patients, or from the time of treatment failure in patients who remained on a failing regimen. Mortality was analysed using Kaplan-Meier curves and random-effects Cox models. Results: Of 16 591 adult patients starting ART, 382 patients (2.3%) switched to a second-line regimen. Another 323 patients (1.9%) did not switch despite developing immunological or virological failure. Cumulative mortality at 1 year was 4.2% (95% CI 2.2-7.8%) in patients who switched to a second-line regimen and 11.7% (7.3-18.5%) in patients who remained on a failing first-line regimen, compared with 2.2% (1.6-3.0%) in patients on a non-failing first-line regimen (P < 0.0001). The differences in mortality were not explained by nadir CD4 cell count, age or differential loss to follow-up. Conclusions: Many patients who meet the criteria for treatment failure do not switch to a second-line regimen and die. There is an urgent need to clarify why, in sub-Saharan Africa, many patients remain on failing first-line ART.
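The mortality contrast between such groups is often summarized as an incidence rate ratio; the sketch below is a generic normal-approximation computation on the log scale (illustrative only, not the random-effects Cox model used in the study).

```python
import math

def incidence_rate_ratio(events_a, time_a, events_b, time_b):
    """Incidence rate ratio between two groups, with an approximate 95% CI
    (normal approximation on the log scale: SE = sqrt(1/a + 1/b))."""
    irr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    return irr, (irr * math.exp(-1.96 * se), irr * math.exp(1.96 * se))
```

For instance, 10 deaths over 100 person-years versus 5 deaths over 100 person-years gives a rate ratio of 2.0; the CI reflects only the small event counts, not matching or covariates.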


ROTEM(®) is considered a helpful point-of-care device for monitoring blood coagulation. Centrally performed analysis is desirable, but rapid transport of blood samples and real-time transmission of the graphic results are important prerequisites. The effect of sample transport through a pneumatic tube system on ROTEM(®) results is unknown. The aims of the present work were (i) to determine the influence of blood sample transport through a pneumatic tube system on ROTEM(®) parameters, compared with manual transport, and (ii) to verify whether the graphic results can be transmitted online to the physician in charge of the patient via virtual network computing (VNC) over a local area network.
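Agreement between paired measurements from the two transport modes in a design of this kind is commonly assessed with a Bland-Altman analysis; the following is a generic sketch (not the study's statistical analysis), computing the mean bias and the 95% limits of agreement.

```python
import math

def bland_altman(manual, pneumatic):
    """Bland-Altman agreement for paired measurements (e.g. a ROTEM
    parameter after manual vs. pneumatic-tube transport): returns the
    mean bias and the 95% limits of agreement, bias +/- 1.96 * SD of
    the paired differences."""
    diffs = [p - m for m, p in zip(manual, pneumatic)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))  # sample SD
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

If the limits of agreement lie within a clinically acceptable margin for each ROTEM parameter, pneumatic-tube transport can be considered interchangeable with manual transport for that parameter.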