942 results for Linear Static Analysis
Abstract:
With increasing attention to the assessment of the safety of existing Dutch bridges and viaducts, the aim of this thesis is to study, through finite element modeling and continuous comparison with experimental results, the in-service response of elements that compose these infrastructures, i.e. reinforced concrete slabs subjected to concentrated loads. These elements are characterized by shear-dominated behavior and failure, whose modeling is computationally challenging because of their brittle behavior combined with various three-dimensional effects. The thesis focuses on Sequentially Linear Analysis (SLA), a finite element solution technique that is an alternative to classical nonlinear analyses based on incremental and iterative approaches. The advantage of SLA is that it avoids the well-known convergence problems of nonlinear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of a load or displacement increment. The comparison between the results of two laboratory tests on reinforced concrete slabs and those obtained by SLA demonstrated in both cases the robustness of the method, in terms of the accuracy of the load-displacement diagrams, the distribution of stresses and strains, and the representation of the cracking pattern and the shear failure mechanisms. Variations of the most important model parameters were performed, highlighting the strong influence on the solutions of the fracture energy and of the chosen shear retention model. Finally, a comparison between SLA and the nonlinear Newton-Raphson method was carried out, showing the greater reliability of SLA in the evaluation of ultimate loads and displacements, together with a significant reduction in computational time.
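The event-by-event logic of SLA is simple enough to sketch. Below is a minimal, illustrative 1D version, assuming a bundle of parallel springs, a saw-tooth reduction factor of 0.5, and a small scatter in tensile strength; none of these values come from the thesis, whose models are 3D continuum finite elements.

```python
# Minimal 1D sketch of the Sequentially Linear Analysis event loop, assuming a
# bundle of parallel springs under a common displacement. The saw-tooth factor,
# number of teeth, and strength scatter are illustrative assumptions, not
# values from the thesis; a real SLA run applies the same logic per
# integration point of a continuum FE model.
import numpy as np

rng = np.random.default_rng(0)
n = 20
E = np.full(n, 30e9)                     # spring stiffnesses (unit area/length)
ft = rng.uniform(2.8e6, 3.2e6, n)        # tensile strengths, small scatter
teeth = np.zeros(n, dtype=int)           # saw-tooth events already applied
MAX_TEETH, A = 6, 0.5                    # teeth per element, reduction factor

curve = []                               # (displacement, load) at each event
for _ in range(n * MAX_TEETH):
    k = E.sum()                          # linear analysis: system stiffness
    sigma = E / k                        # element stress under a unit load
    ratio = np.where(teeth < MAX_TEETH, ft / sigma, np.inf)
    crit = int(np.argmin(ratio))         # next critical element
    lam = ratio[crit]                    # load multiplier of the damage event
    if not np.isfinite(lam):
        break                            # every element fully softened
    curve.append((lam / k, lam))         # displacement and load at the event
    E[crit] *= A                         # damage increment: reduce stiffness
    ft[crit] *= A                        # ...and strength (saw-tooth law)
    teeth[crit] += 1
```

Each pass performs one linear analysis, scales the unit load to the first damage event, then degrades only the critical element: exactly the damage increment, rather than a load or displacement increment, described above.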
Abstract:
The assessment of the safety of existing bridges and viaducts led the Ministry of Public Works of the Netherlands to finance a specific campaign aimed at studying the response of the elements of these infrastructures. This activity therefore focuses on the behaviour of reinforced concrete slabs under concentrated loads, adopting finite element modeling and comparison with experimental results. These elements are characterized by shear behaviour and failure, whose modeling is, from a computational point of view, a hard challenge, due to the brittle behavior combined with three-dimensional effects. The numerical modeling of the failure is studied through Sequentially Linear Analysis (SLA), an alternative finite element method with respect to traditional incremental and iterative approaches. The comparison between the two numerical techniques represents one of the first such works in a three-dimensional setting, and is carried out using one of the experimental tests performed on reinforced concrete slabs. The advantage of SLA is to avoid the well-known convergence problems of typical nonlinear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of a load or displacement increment on the whole structure. For the first time, particular attention has been paid to specific aspects of the slabs, such as accurate modeling of the constraints and the sensitivity of the solution to mesh density. This detailed analysis of the main parameters proved a strong influence of the tensile fracture energy, the mesh density and the chosen model on the solution in terms of the force-displacement diagram, the distribution of crack patterns and the shear failure mode. SLA showed great potential, but it requires further development in two aspects of modeling: load conditions (constant and proportional loads) and the softening behaviour of brittle materials (such as concrete) in three dimensions, in order to widen its applicability in these new contexts of study.
Abstract:
Shell structures are widely used in engineering. The purpose of this dissertation is to describe the behavior of a thin shell under external load, especially a long cylindrical shell under compressive load. I analyzed both the linear elastic problem and the buckling problem, and finite element analysis shows that imperfections of a cylinder affect the critical load, i.e. its buckling capacity. For the linear elastic problem, I compared the theoretical results with those obtained from Straus7 and Abaqus, and the results agree closely. For the buckling problem I did the same: compared with the theoretical results, the error of the Abaqus results is less than 1%. In reality, however, the theoretical buckling capacity cannot be reached because of imperfections in the cylinder, so I applied imperfections of different magnitudes to the cylinder in Abaqus and found that as the imperfection percentage increases, the buckling capacity decreases; for example, a 10% imperfection can reduce the buckling capacity by about 40%. This outcome matches buckling behavior observed in practice.
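For context, the theoretical buckling capacity referred to above is the classical axial buckling load of a thin cylinder, and imperfection sensitivity is often summarized with an empirical knockdown factor. A hedged sketch with placeholder geometry and material values, using the NASA SP-8007 knockdown formula (an assumption on my part; the dissertation imposes explicit geometric imperfections in Abaqus instead):

```python
# Classical axial buckling load of a thin cylinder plus an empirical
# imperfection knockdown (NASA SP-8007). Geometry and material values are
# placeholders, not those of the dissertation's models.
import math

E, nu = 210e9, 0.3          # steel properties, assumed
R, t = 1.0, 0.01            # radius and wall thickness (m), assumed

sigma_cr = E * t / (R * math.sqrt(3.0 * (1.0 - nu**2)))  # classical stress
P_cr = sigma_cr * 2.0 * math.pi * R * t                   # perfect-shell load

phi = math.sqrt(R / t) / 16.0
gamma = 1.0 - 0.901 * (1.0 - math.exp(-phi))              # knockdown factor
print(f"perfect P_cr = {P_cr/1e6:.2f} MN, knocked down = {gamma*P_cr/1e6:.2f} MN")
```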
Abstract:
Fuel cells are a promising alternative energy technology. One of the biggest problems in fuel cells is water management, and a better understanding of wettability characteristics in fuel cells is needed to alleviate it. Contact angle data on the gas diffusion layers (GDL) of fuel cells can be used to characterize the wettability of the GDL. A contact angle measurement program has been developed to measure the contact angle of sessile drops from drop images. Digitization of drop images induces pixel errors in the contact angle measurement process, and the resulting uncertainty in the measured contact angle has been analyzed. An experimental apparatus has been developed for contact angle measurements at different temperatures, with the ability to measure advancing and receding contact angles on the gas diffusion layers of fuel cells.
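As an illustration of the pixel-error analysis described above, the contact angle of a sessile drop can be estimated under the spherical-cap approximation from the drop height and base radius, and a ±1 pixel perturbation propagated through the formula. The pixel values below are assumptions, not data from the thesis apparatus:

```python
# Sessile-drop contact angle from drop height and base radius under the
# spherical-cap approximation (h/r = tan(theta/2)), plus the worst-case
# +/-1 pixel digitization uncertainty. Pixel values are illustrative.
import math

def contact_angle_deg(h_px, r_px):
    """theta = 2*atan(h/r) for a spherical cap of height h and base radius r."""
    return math.degrees(2.0 * math.atan2(h_px, r_px))

h, r = 120.0, 200.0                         # drop height / base radius (pixels)
theta = contact_angle_deg(h, r)
# Perturb both measurements by +/-1 pixel and record the spread:
spread = [contact_angle_deg(h + dh, r + dr) for dh in (-1, 1) for dr in (-1, 1)]
print(f"theta = {theta:.2f} deg, "
      f"pixel uncertainty ~ +/-{(max(spread) - min(spread)) / 2:.2f} deg")
```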
Abstract:
In this thesis, we consider Bayesian inference for the detection of variance change points in models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed, and includes the Gaussian, Student-t, contaminated normal, and slash distributions as special cases. The proposed models provide greater flexibility for analyzing practical data, which often exhibit heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters of the variance change-point models with SMN distributions. Owing to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed in simulation studies, and a real application to closing-price data from the U.S. stock market is presented for illustrative purposes.
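A minimal sketch of such a sampler for the single change-point case, restricted to the Gaussian member of the SMN family (so the mixing variables and Metropolis-Hastings steps of the full algorithm are omitted); the priors and the synthetic data are illustrative assumptions:

```python
# Gibbs sampler for a single variance change point k in Gaussian data:
# y_t ~ N(0, v1) for t <= k and y_t ~ N(0, v2) for t > k, with conjugate
# inverse-gamma priors on both variances and a uniform prior on k.
import numpy as np

rng = np.random.default_rng(1)
n, k_true = 200, 120
y = np.concatenate([rng.normal(0, 1.0, k_true), rng.normal(0, 3.0, n - k_true)])

a, b = 2.0, 1.0                          # IG(a, b) prior on both variances
s1 = np.cumsum(y**2)                     # S1(k) = sum_{t<=k} y_t^2
k, draws = n // 2, []
for _ in range(2000):
    # Conjugate updates: sigma^2 | y, k ~ IG(a + m/2, b + S/2).
    v1 = 1.0 / rng.gamma(a + k / 2.0, 1.0 / (b + s1[k - 1] / 2.0))
    v2 = 1.0 / rng.gamma(a + (n - k) / 2.0,
                         1.0 / (b + (s1[-1] - s1[k - 1]) / 2.0))
    # Discrete full conditional of the change point.
    ks = np.arange(1, n)
    logp = (-ks / 2.0) * np.log(v1) - s1[ks - 1] / (2.0 * v1) \
           - ((n - ks) / 2.0) * np.log(v2) - (s1[-1] - s1[ks - 1]) / (2.0 * v2)
    p = np.exp(logp - logp.max()); p /= p.sum()
    k = int(rng.choice(ks, p=p))
    draws.append(k)
print("posterior mode of k:", np.bincount(draws).argmax())
```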
Abstract:
OBJECTIVES We sought to analyze the time course of atrial fibrillation (AF) episodes before and after circular plus linear left atrial ablation, and the percentage of patients with complete freedom from AF after ablation, using serial seven-day electrocardiograms (ECGs). BACKGROUND Curative treatment of AF targets the pathophysiological cornerstones of AF (i.e., the initiating triggers and/or the perpetuation of AF). The pathophysiological complexity of both may not produce an "all-or-nothing" response but may instead modify the number and duration of AF episodes. METHODS In patients with highly symptomatic AF, circular plus linear ablation lesions were placed around the left and right pulmonary veins, between the two circles, and from the left circle to the mitral annulus, using an electroanatomic mapping system. Repeated continuous 7-day ECGs recorded before and after catheter ablation were used for rhythm follow-up. RESULTS In 100 patients with paroxysmal (n = 80) and persistent (n = 20) AF, the relative duration of time spent in AF decreased significantly over time (35 ± 37% before ablation, 26 ± 41% directly after ablation, and 10 ± 22% after 12 months). Freedom from AF increased stepwise in patients with paroxysmal AF and after 12 months was 88% or 74%, depending on whether the 24-h ECG or the 7-day ECG was used. Complete pulmonary vein isolation was demonstrated in <20% of the circular lesions. CONCLUSIONS The results obtained in patients with AF treated with circular plus linear left atrial lesions strongly indicate that substrate modification is the main underlying pathophysiological mechanism and that it results in a delayed rather than an immediate cure.
Abstract:
We assessed the efficacy and toxicity of fractionated stereotactic radiotherapy (FSRT) for pediatric craniopharyngioma patients. Between May 2000 and May 2009, 9 patients (male-to-female ratio, 5:4) with craniopharyngiomas underwent FSRT (median dose, 54 Gy). Among the 9 patients, 6 received radiation therapy (RT) for recurrent tumors and 3 for residual disease as adjuvant therapy after incomplete surgery. Median tumor volume was 2.3 cm³ (range, 0.1-5.8 cm³). The median target coverage was 93.7% (range, 79.3-99.8%). The median conformity index was 0.94 (range, 0.6-1.4). The dose to the hippocampal region was assessed for all patients. After a median follow-up of 62.5 months (range, 32-127), the treated volume had decreased in size in four of eight patients (50%); one patient was lost to follow-up. Local control and survival rates at 3 years were 100%, and there were no marginal relapses. In one patient with chronic bilateral papillary oedema after surgery, the visual defect deteriorated after FSRT to a complete hemianopsia. One male patient with normal pituitary function before FSRT presented with precocious puberty at the age of 7.4 years, 24 months after FSRT. Four patients (50%) were severely obese at their last visit. FSRT is a safe treatment option for craniopharyngioma after incomplete resection.
Abstract:
Background and Aims: Ongoing global warming has been implicated in shifting phenological patterns, such as the timing and duration of the growing season, across a wide variety of ecosystems. Linear models are routinely used to extrapolate these observed shifts in phenology into the future and to estimate changes in associated ecosystem properties such as net primary productivity. Yet in nature linear relationships may be special cases: biological processes frequently follow more complex, non-linear patterns according to limiting factors that generate shifts and discontinuities, or contain thresholds beyond which responses change abruptly. This study investigates to what extent cambium phenology is associated with xylem growth and differentiation across conifer species of the northern hemisphere. Methods: Xylem cell production is compared with the periods of cambial activity and cell differentiation, assessed on a weekly time scale on histological sections of cambium and wood tissue collected from the stems of nine species in Canada and Europe over 1–9 years per site from 1998 to 2011. Key Results: The dynamics of xylogenesis were surprisingly homogeneous among conifer species, although deviations from the average were observed. Within the range analysed, the relationships between the phenological timings were linear, with several slopes close to or not statistically different from 1. The relationships between the phenological timings and cell production were distinctly non-linear, following an exponential pattern. Conclusions: Trees adjust their phenological timings according to linear patterns, so shifts of one phenological phase are associated with synchronous and comparable shifts of the successive phases. However, small increases in the duration of xylogenesis can correspond to a substantial increase in cell production. The findings suggest that the length of the growing season and the resulting amount of growth could respond differently to changes in environmental conditions.
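The contrast between the two fitted shapes can be illustrated with a toy log-linear fit; the synthetic data below are assumptions purely for demonstration, not the study's measurements:

```python
# Illustrative fit of the exponential duration-vs-production relationship:
# regress log(cell count) on duration, so the slope gives the exponential rate.
import numpy as np

rng = np.random.default_rng(2)
duration = rng.uniform(60, 140, 50)                           # days of xylogenesis
cells = np.exp(0.025 * duration) * rng.lognormal(0, 0.1, 50)  # cell counts

slope, intercept = np.polyfit(duration, np.log(cells), 1)     # log-linear fit
print(f"cells ~ exp({slope:.3f} * duration + {intercept:.3f})")
```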
Abstract:
Software developers are often unsure of the exact name of the method they need to invoke the desired behavior in a given context. This results in a search for the correct method name in documentation, which can be lengthy and distracting. We can decrease the method search time by enhancing the documentation of a class with its most frequently used methods. Usage frequency data for methods is gathered by analyzing other projects from the same ecosystem, i.e. projects written in the same language that share dependencies. We implemented a proof of concept of the approach for Pharo Smalltalk and Java. In Pharo Smalltalk, methods are commonly searched for using a code browser tool called "Nautilus"; in Java, using a web browser displaying HTML-based documentation, Javadoc. We developed plugins for both browsers and gathered method usage data from open source projects in order to increase developer productivity by reducing method search time. A small initial evaluation has been conducted, showing promising results in improving developer productivity.
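A sketch of the frequency-gathering step: count method invocations across an ecosystem corpus and rank them for documentation. The regex-based call matcher and the corpus path are simplifying assumptions of this sketch; the actual tools hook into Nautilus and Javadoc.

```python
# Count how often each method name is invoked across a corpus of projects,
# then rank methods by that count. A real analysis would parse ASTs; the
# regex here is a crude stand-in for illustration.
import re
from collections import Counter
from pathlib import Path

CALL = re.compile(r"\.([A-Za-z_]\w*)\s*\(")   # crude "receiver.method(" matcher

def method_usage(corpus_dir: str, suffix: str = ".java") -> Counter:
    counts: Counter = Counter()
    for src in Path(corpus_dir).rglob(f"*{suffix}"):
        counts.update(CALL.findall(src.read_text(errors="ignore")))
    return counts

# Rank methods by ecosystem-wide usage (hypothetical corpus directory):
usage = method_usage("corpus/")
for name, cnt in usage.most_common(10):
    print(f"{cnt:6d}  {name}")
```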
Abstract:
Osteoporotic proximal femur fractures are caused by low-energy trauma, typically a fall onto the hip from standing height. Finite element simulations, widely used to predict the fracture load of femora in a fall, usually include neither mass-related inertial effects nor the viscous part of bone's material behavior. The aim of this study was to elucidate whether quasi-static nonlinear homogenized finite element analyses can predict the in vitro mechanical properties of proximal femora assessed in dynamic drop-tower experiments. The case-specific numerical models of thirteen femora predicted strength (R²=0.84, SEE=540 N, 16.2%), stiffness (R²=0.82, SEE=233 N/mm, 18.0%) and fracture energy (R²=0.72, SEE=3.85 J, 39.6%), and provided fair qualitative matches with the fracture patterns. The influence of material anisotropy was negligible for all predictions. These results suggest that quasi-static homogenized finite element analysis may be used to predict the mechanical properties of proximal femora in the dynamic sideways-fall situation.
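For readers unfamiliar with the reported metrics, R² and the standard error of the estimate (SEE) can be computed from paired experimental and predicted values as follows; the numbers are placeholders, not the study's data:

```python
# R^2 and SEE from a regression of measured on FE-predicted values, with SEE
# also expressed as a percentage of the mean (as in the abstract's figures).
import numpy as np

measured = np.array([3.1e3, 4.2e3, 2.8e3, 5.0e3, 3.6e3])    # e.g. strength (N)
predicted = np.array([3.3e3, 4.0e3, 2.6e3, 5.3e3, 3.5e3])

slope, intercept = np.polyfit(predicted, measured, 1)        # regression line
resid = measured - (slope * predicted + intercept)
r2 = np.corrcoef(predicted, measured)[0, 1] ** 2             # squared Pearson r
see = np.sqrt(np.sum(resid**2) / (len(measured) - 2))        # n-2 dof
print(f"R^2 = {r2:.2f}, SEE = {see:.0f} N ({100 * see / measured.mean():.1f}%)")
```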
Abstract:
With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods that provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a bivariate binomial model, analyze pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variances of sensitivity and specificity and for the correlation; we also applied an inverse-Wishart prior to check the sensitivity of the results. The third model is a multinomial model in which the test results are modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance; vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference using Gibbs sampling' (BUGS) implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies, and applied them to a previously published meta-analysis dataset on cervical cancer as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent between the Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from the Bayesian bivariate models were not as good as those obtained from frequentist estimation, regardless of which prior distribution was used for the covariance matrix. The Bayesian multinomial model consistently underestimated sensitivity and specificity regardless of sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously, as well as the intercorrelation between the two; and (3) it can be directly applied to sparse data without ad hoc correction.
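A common formulation of such a bivariate binomial model, written here as a sketch with our own notation (the published model may parameterize the covariates differently):

```latex
% Study-level true positives/negatives with correlated logit-scale parameters:
\begin{align*}
  y_i^{\mathrm{se}} &\sim \operatorname{Binomial}\!\bigl(n_i^{\mathrm{se}},\ \operatorname{logit}^{-1}(\mu_i)\bigr), \\
  y_i^{\mathrm{sp}} &\sim \operatorname{Binomial}\!\bigl(n_i^{\mathrm{sp}},\ \operatorname{logit}^{-1}(\nu_i)\bigr), \\
  (\mu_i,\ \nu_i)^{\top} &\sim \mathcal{N}_2\!\bigl(\boldsymbol{\theta} + \mathbf{B}\,\mathbf{x}_i,\ \boldsymbol{\Sigma}\bigr),
\end{align*}
```

where $y_i^{\mathrm{se}}$ is the number of true positives among $n_i^{\mathrm{se}}$ diseased subjects in study $i$, $y_i^{\mathrm{sp}}$ the number of true negatives among $n_i^{\mathrm{sp}}$ disease-free subjects, $\mathbf{x}_i$ holds covariates such as the imaging technique, and the off-diagonal element of $\boldsymbol{\Sigma}$ (given a vague prior, e.g. independent uniforms or an inverse-Wishart, as in the abstract) carries the intercorrelation between sensitivity and specificity referred to in strength (2).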
Abstract:
Life expectancy has consistently increased over the last 150 years due to improvements in nutrition, medicine, and public health. Several studies found that in many developed countries life expectancy continued to rise following a nearly linear trend, contrary to the common belief that the rate of improvement would decelerate and follow an S-shaped curve. Using samples of countries spanning a wide range of economic development levels, we explored the change in life expectancy over time with both nonlinear and linear models, and then examined whether there were significant differences in estimates between linear models assuming an autocorrelated error structure. When the data did not have a sigmoidal shape, nonlinear growth models sometimes failed to provide meaningful parameter estimates; the existence of an inflection point and asymptotes in the growth models made them inflexible for life expectancy data. In the linear models, there was no significant difference in the life expectancy growth rate or in future estimates between ordinary least squares (OLS) and generalized least squares (GLS). However, the generalized least squares model was more robust because the data involved time series and the residuals were positively correlated.
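The OLS-versus-GLS comparison hinges on that autocorrelated error structure. A compact sketch, fitting a linear trend to synthetic life-expectancy-like data with AR(1) errors and refitting after a Cochrane-Orcutt quasi-differencing transform (the AR coefficient and data are assumptions for illustration):

```python
# OLS vs AR(1)-GLS on a linear trend. Quasi-differencing y*_t = y_t - rho*y_{t-1}
# (same for X) whitens AR(1) errors, so OLS on the transformed data is GLS.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(60, dtype=float)                 # years
rho = 0.7
e = np.zeros_like(t)
for i in range(1, len(t)):                     # AR(1) error process
    e[i] = rho * e[i - 1] + rng.normal(0, 0.3)
y = 70.0 + 0.2 * t + e                         # near-linear "life expectancy"

X = np.column_stack([np.ones_like(t), t])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

ys = y[1:] - rho * y[:-1]                      # quasi-differenced response
Xs = X[1:] - rho * X[:-1]                      # quasi-differenced design
beta_gls = np.linalg.lstsq(Xs, ys, rcond=None)[0]
print("OLS slope %.4f vs GLS slope %.4f" % (beta_ols[1], beta_gls[1]))
```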
Abstract:
We present a novel approach for detecting severe obstructive sleep apnea (OSA) cases by introducing non-linear analysis into sustained speech characterization. The proposed scheme was designed to provide additional information to our baseline system, built on top of state-of-the-art cepstral-domain modeling techniques, with the aim of improving accuracy. This new information is only lightly correlated with our previous MFCC modeling of sustained speech and uncorrelated with the information in our continuous speech modeling scheme. Tests were performed to evaluate the improvement for our detection task, based on sustained speech alone as well as combined with a continuous speech classifier, resulting in a 10% relative reduction in classification error for the former and a 33% relative reduction for the fused scheme. The results encourage us to consider the existence of non-linear effects in OSA patients' voices, and to consider tools that could improve short-time analysis.
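The fused scheme admits a simple late-fusion reading: combine the two classifiers' scores before thresholding. The weights, scores, and threshold below are illustrative assumptions; the abstract does not spell out the fusion rule used.

```python
# Minimal late-fusion sketch: convex combination of two classifier scores.
def fuse(score_sustained: float, score_continuous: float, w: float = 0.5) -> float:
    """Weighted sum of sustained- and continuous-speech scores in [0, 1]."""
    return w * score_sustained + (1.0 - w) * score_continuous

# Flag severe OSA when the fused score crosses an (assumed) threshold:
severe = fuse(0.62, 0.71, w=0.4) > 0.5
print("severe OSA:", severe)
```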