931 results for "success models comparison"


Relevance: 30.00%

Abstract:

Background: The development of protocols for RNA extraction from paraffin-embedded samples facilitates gene expression studies on archival samples with known clinical outcome. Older samples are particularly valuable because they are associated with longer clinical follow-up. RNA extracted from formalin-fixed paraffin-embedded (FFPE) tissue is problematic due to chemical modifications and continued degradation over time. We compared the quantity and quality of RNA extracted by four different protocols from 14 ten-year-old and 14 recently archived (three- to ten-month-old) FFPE breast cancer tissues. Using three spin-column purification-based protocols and one magnetic bead-based protocol, total RNA was extracted in triplicate, generating 336 RNA extraction experiments. RNA fragment size was assayed by reverse transcription-polymerase chain reaction (RT-PCR) for the housekeeping gene glucose-6-phosphate dehydrogenase (G6PD), testing primer sets designed to target RNA fragment sizes of 67 bp, 151 bp, and 242 bp.

Results: Biologically useful RNA (minimum RNA integrity number, RIN, 1.4) was extracted in at least one of three attempts of each protocol in 86–100% of older and 100% of recently archived ("months-old") samples. Short RNA fragments up to 151 bp were assayable by RT-PCR for G6PD in all ten-year-old and months-old tissues tested, but none of the ten-year-old and only 43% of months-old samples showed amplification when the targeted fragment was 242 bp.

Conclusion: All protocols extracted RNA from ten-year-old FFPE samples with a minimum RIN of 1.4. Gene expression of G6PD could be measured in all samples, old and recent, using RT-PCR primers designed for RNA fragments up to 151 bp. RNA quality from ten-year-old FFPE samples was similar to that extracted from months-old samples, but quantity and success rate were generally higher for the months-old group. We preferred the magnetic bead-based protocol because of its speed and higher RNA yield, although its RNA quality was similar to that of the other protocols. If a chosen protocol fails to extract biologically useful RNA from a given sample on a first attempt, a second attempt and then another protocol should be tried before excluding the case from molecular analysis.

Relevance: 30.00%

Abstract:

The great challenges for researchers working in the field of vaccinology are optimizing DNA vaccines for use in humans or large animals and creating effective single-dose vaccines using appropriate controlled delivery systems. Plasmid DNA encoding heat-shock protein 65 (hsp65), referred to as DNAhsp65, has been shown to induce protective and therapeutic immune responses in a murine model of tuberculosis (TB). Despite the success of the naked DNAhsp65-based vaccine in protecting mice against TB, it requires multiple doses of high amounts of DNA for effective immunization. To optimize this DNA vaccine and simplify the vaccination schedule, we co-encapsulated DNAhsp65 and the adjuvant trehalose dimycolate (TDM) into biodegradable poly(DL-lactide-co-glycolide) (PLGA) microspheres for single-dose administration. Moreover, a single-shot prime-boost vaccine formulation was also developed, based on a mixture of two different PLGA microspheres presenting faster release of DNAhsp65 and slower release of the recombinant hsp65 protein. These formulations were tested in mice as well as in guinea pigs, comparing their efficacy and toxicity with those induced by the naked DNA preparation or BCG. The single-shot prime-boost formulation clearly presented good efficacy and diminished lung pathology in both mice and guinea pigs.

Relevance: 30.00%

Abstract:

This paper presents the results of a simulation using physical objects. This concept integrates the physical dimensions of an entity, such as length, width, and weight, with the usual process-flow paradigm recurrent in discrete event simulation models. Based on a naval logistics system, we applied this technique to the access channel of the largest port in Latin America. The system is composed of vessel movements constrained by the access channel dimensions: vessel length and width dictate whether it is safe to have one or two ships in the channel simultaneously. The proposed methodology delivered an accurate validation of the model, with a deviation of approximately 0.45% when compared to real data. Additionally, the model supported the design of new terminal operations for Santos, delivering KPIs such as channel utilization, queue time, berth utilization, and throughput capability.
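The one-way versus two-way traffic rule described above reduces to a simple decision function over vessel beams. A minimal sketch, where the channel width, clearance margin, and vessel beams are all hypothetical placeholders, not the actual Santos channel rules:

```python
# Decide whether two vessels may meet in the access channel.
# Channel width and clearance values are illustrative, not the real Santos limits.

CHANNEL_WIDTH_M = 220.0      # hypothetical navigable width of the channel
LATERAL_CLEARANCE_M = 30.0   # hypothetical safety gap (bank-ship, ship-ship, ship-bank)

def can_cross(beam_a_m: float, beam_b_m: float) -> bool:
    """True if two vessels of the given beams may transit simultaneously."""
    required = beam_a_m + beam_b_m + 3 * LATERAL_CLEARANCE_M
    return required <= CHANNEL_WIDTH_M

print(can_cross(32.2, 32.2))   # two Panamax-beam ships -> two-way traffic allowed
print(can_cross(60.0, 75.0))   # two very wide ships -> one-way traffic only
```

In a discrete event model, this predicate would gate the "enter channel" event: a vessel whose beam fails the check against any vessel already in transit waits in the queue.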

Relevance: 30.00%

Abstract:

This dissertation takes a step towards providing a better understanding of post-socialist welfare state development from a theoretical as well as an empirical perspective. The overall analytical goal of this thesis has been to critically assess the development of social policies in Estonia, Latvia and Lithuania using them as illustrative examples of post-socialist welfare state development in the light of the theories, approaches and typologies that have been developed to study affluent capitalist democracies. The four studies included in this dissertation aspire to a common aim in a number of specific ways. The first study tries to place the ideal-typical welfare state models of the Baltic States within the well-known welfare state typologies. At the same time, it provides a rich overview of the main social security institutions in the three countries by comparing them with each other and with the previous structures of the Soviet period. It examines the social insurance institutions of the Baltic States (old-age pensions, unemployment insurance, short-term benefits, sickness, maternity and parental insurance and family benefits) with respect to conditions of eligibility, replacement rates, financing and contributions. The findings of this study indicate that the Latvian social security system can generally be labelled as a mix of the basic security and corporatist models. The Estonian social security system can generally also be characterised as a mix of the basic security and corporatist models, even if there are some weak elements of the targeted model in it. It appears that the institutional changes developing in the social security system of Lithuania have led to a combination of the basic security and targeted models of the welfare state. Nevertheless, as the example of the three Baltic States shows, there is diversity in how these countries solve problems within the field of social policy. 
In studying the social security schemes in detail, some common features were found that could be attributed to all three countries. Therefore, the critical analysis of the main social security institutions of the Baltic States in this study gave strong supporting evidence in favour of identifying the post-socialist regime type that is already gaining acceptance within comparative welfare state research. Study Two compares the system of social maintenance and insurance in the Soviet Union, which was in force in the three Baltic countries before their independence, with the currently existing social security systems. The aim of the essay is to highlight the forces that have influenced the transformation of the social policy from its former highly universal, albeit authoritarian, form, to the less universal, social insurance-based systems of present-day Estonia, Latvia and Lithuania. This study demonstrates that the welfare–economy nexus is not the only important factor in the development of social programs. The results of this analysis revealed that people's attitudes towards distributive justice and the developmental level of civil society also play an important part in shaping social policies. The shift to individualism in people’s mentality and the decline of the labour movement, or, to be more precise, the decline in trade union membership and influence, does nothing to promote the development of social rights in the Baltic countries and hinders the expansion of social policies. The legacy of the past has been another important factor in shaping social programs. It can be concluded that social policy should be studied as if embedded not only in the welfare-economy nexus, but also in the societal, historical and cultural nexus of a given society. 
Study Three discusses the views of the state elites on family policy within a wider theoretical setting covering family policy and social policy in a broader sense, and attempts to expand this analytical framework to include other post-socialist countries. The aim of this essay is to explore the various views of the state elites in the Baltics concerning family policy and, in particular, family benefits, as one of the possible explanations for the observed policy differences. The qualitative analyses indicate that the Baltic States differ significantly with regard to the motives behind their family policies. Lithuanian decision-makers seek to reduce poverty among families with children and enhance the parents' responsibility for bringing up their children. Latvian policy-makers act so as to increase the birth rate and create equal opportunities for children from all families. Estonian policy-makers seek to create equal opportunities for all children, and the desire to enhance gender equality is more visible in Estonia than in the other two countries. It is strongly arguable that there is a link between the underlying motives and the kinds of family benefits in a given country. This study thus indicates how intimately the attitudes of state bureaucrats, policy-makers, the political elite and researchers shape social policy. It confirms that family policy is a product of the prevailing ideology within a country, while the potential influence of globalisation and Europeanisation is detectable too. The final essay takes into account the opinions of welfare users and examines the performance of institutionalised family benefits by relying on the recipients' opinions regarding these benefits. The opinions of the population as a whole regarding government efforts to help families are compared with those of the welfare users.
Various family benefits are evaluated according to the recipients' satisfaction with them, as well as by relating the contemporaneous levels of subjective satisfaction with the welfare programs to the absolute level of expenditure on each program. The findings of this paper indicate that, in Latvia, people report a lower level of success regarding state-run family insurance institutions than in Lithuania and Estonia. This appears to be because the cash benefits for families and children in Latvia are, on average, seen as only marginally influencing the overall financial situation of the families concerned. In Lithuania and Estonia, the overwhelming majority think that the family benefit systems improve the financial situation of families. It appears that recipients evaluated universal family benefits less positively than targeted benefits; some universal benefits negatively influenced the level of general satisfaction with the family benefit system in the countries researched. This study opens a discussion about whether universalism is always more legitimate than targeting: in transitional economies, in which resources are highly constrained, some forms of universal benefits can turn out to be very expensive in relative terms without being seen as useful or legitimate forms of help to families. In sum, by closely examining different aspects of social policy, this dissertation goes beyond over-generalisation about Eastern European welfare state development and instead takes a more detailed look at what is really going on in these countries through the examples of Lithuania, Latvia and Estonia. Another important contribution of this study is that it revives 'western' theoretical knowledge through 'eastern' empirical evidence and provides the opportunity to expand the theoretical framework for post-socialist societies.

Relevance: 30.00%

Abstract:

Advances in stem cell biology have challenged the notion that infarcted myocardium is irreparable. The pluripotent ability of stem cells to differentiate into specialized cell lines began to garner intense interest within cardiology when it was shown in animal models that intramyocardial injection of bone marrow stem cells, or the mobilization of bone marrow stem cells with spontaneous homing to myocardium, could improve cardiac function and survival after induced myocardial infarction (MI) [1, 2]. Furthermore, the existence of stem cells in myocardium has been identified in the animal heart [3, 4], and intense research is under way in an attempt to clarify their potential clinical application for patients with myocardial infarction. To date, in order to identify the best candidate, different kinds of stem cells have been studied, derived from embryonic or adult tissues (i.e. bone marrow, heart, peripheral blood, etc.). Currently, three different biologic therapies for cardiovascular diseases are under investigation: cell therapy, gene therapy and the more recent "tissue-engineering" therapy. During my Ph.D. course, I first focused my study on the isolation and characterization of Cardiac Stem Cells (CSCs) in wild-type and transgenic mice; for this purpose I attended, for more than one year, the Cardiovascular Research Institute of the New York Medical College in Valhalla (NY, USA), under the direction of Doctor Piero Anversa. During this period I learnt different immunohistochemical and biomolecular techniques useful for investigating the regenerative potential of stem cells. Then, during the next two years, I studied the new approach of cardiac regenerative medicine based on tissue engineering, in order to investigate a new strategy to regenerate the infarcted myocardium. Tissue engineering is a promising approach that makes possible the creation of new functional tissue to replace lost or failing tissue.
This new discipline combines isolated functioning cells and biodegradable 3-dimensional (3D) polymeric scaffolds. The scaffold temporarily provides the biomechanical support for the cells until they produce their own extracellular matrix. Because tissue-engineering constructs contain living cells, they may have the potential for growth and cellular self-repair and remodeling. In the present study, I examined whether the tissue-engineering strategy with hyaluronan-based scaffolds would result in the formation of alternative cardiac tissue that could replace the scar and improve cardiac function after MI in syngeneic heterotopic rat hearts. Rat hearts were explanted, subjected to left descending coronary artery occlusion, and then grafted into the abdomen (aorta-aorta anastomosis) of a receiving syngeneic rat. After 2 weeks, a pouch of 3 mm² was made in the thickness of the ventricular wall at the level of the post-infarction scar. The hyaluronan scaffold, previously engineered for 3 weeks with rat mesenchymal stem cells (MSCs), was introduced into the pouch and the myocardial edges sutured with a few stitches. Two weeks later we evaluated cardiac function by M-mode echocardiography and myocardial morphology by microscope analysis. We chose bone marrow-derived MSCs because they have shown great signaling and regenerative properties when delivered to heart tissue following a myocardial infarction. However, while the object of cell transplantation is to improve ventricular function, cardiac cell transplantation has had limited success because of poor graft viability and low cell retention, which is why we decided to combine MSCs with a biopolymeric scaffold. At the end of the experiments we observed that the hyaluronan fibres had not been substantially degraded 2 weeks after heart transplantation. Most MSCs had migrated to the surrounding infarcted area, where they were especially found close to small-sized vessels.
Scar tissue was reduced in the engrafted region and the thickness of the corresponding ventricular wall was comparable to that of the non-infarcted remote area. In addition, the left ventricular shortening fraction, evaluated by M-mode echocardiography, was slightly increased compared with that measured just before construct transplantation. Therefore, this study suggests that post-infarction myocardial remodelling can be favourably affected by the grafting of MSCs delivered through a hyaluronan-based scaffold.

Relevance: 30.00%

Abstract:

Background. The surgical treatment of dysfunctional hips is a severe condition for the patient and a costly therapy for public health systems. Hip resurfacing techniques seem to hold the promise of various advantages over traditional total hip replacement (THR), particularly for young and active patients. Although the lesson provided in the past by many branches of engineering is that success in designing competitive products can be achieved only by predicting the possible scenarios of failure, to date implant quality is poorly addressed pre-clinically; thus revision remains the only, delayed, reliable end point for assessment. The aim of the present work was to model the musculoskeletal system so as to develop a protocol for predicting failure of hip resurfacing prostheses. Methods. Preliminary studies validated the technique for the generation of subject-specific finite element (FE) models of long bones from Computed Tomography data. The proposed protocol consisted in the numerical analysis of the prosthesis biomechanics through deterministic and statistical studies, so as to assess the risk of biomechanical failure under the different operative conditions the implant might face in a population of interest during various activities of daily living. Physiological conditions were defined, including the variability of the anatomy, bone densitometry, surgical uncertainties and published boundary conditions at the hip. The protocol was tested by analysing a successful design on the market and a new prototype of a resurfacing prosthesis. Results. The intrinsic accuracy of the models' bone stress predictions (RMSE < 10%) was aligned with the current state of the art in this field. The accuracy of prediction of the bone-prosthesis contact mechanics was also excellent (< 0.001 mm). The sensitivity of the models' predictions to uncertainties in modelling parameters was below 8.4%.
The analysis of the successful design resulted in very good agreement with published retrospective studies. The geometry optimisation of the new prototype led to a final design with a low risk of failure. The statistical analysis confirmed the minimal risk of the optimised design over the entire population of interest. The performance of the optimised design showed a significant improvement with respect to the first prototype (+35%). Limitations. In the authors' opinion the major limitation of this study lies in the boundary conditions. The muscular forces and the hip joint reaction were derived from the few data available in the literature, which can be considered significant but hardly representative of the entire variability of boundary conditions the implant might face over the patient population. This moved the focus of the research to modelling the musculoskeletal system; the ongoing activity is to develop subject-specific musculoskeletal models of the lower limb from medical images. Conclusions. The developed protocol was able to accurately predict known clinical outcomes when applied to a well-established device and to support the design optimisation phase, providing important information on critical characteristics of the patients when applied to a new prosthesis. The presented approach has a generality that would allow the extension of the protocol to a large set of orthopaedic scenarios with minor changes. Hence, a failure mode analysis criterion can be considered a suitable tool in developing new orthopaedic devices.
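The statistical part of such a protocol, propagating population variability into a probability of failure, can be illustrated with a toy Monte Carlo loop. The load and strength distributions and the failure criterion below are hypothetical stand-ins, not the study's actual FE outputs:

```python
import random

random.seed(42)  # reproducible illustration

def failure_risk(n_trials: int = 100_000) -> float:
    """Toy Monte Carlo: sample a peak bone stress and a local bone strength
    per virtual patient/activity, and count how often stress exceeds strength.
    Both normal distributions are illustrative placeholders (values in MPa)."""
    failures = 0
    for _ in range(n_trials):
        stress = random.gauss(60.0, 10.0)      # peak stress over daily activities
        strength = random.gauss(110.0, 15.0)   # bone strength over the population
        if stress > strength:
            failures += 1
    return failures / n_trials

risk = failure_risk()
print(f"estimated probability of failure: {risk:.4f}")
```

The real protocol would replace the two `gauss` draws with FE-computed stress fields over sampled anatomies, densitometries, and surgical placements; the counting logic is the same.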

Relevance: 30.00%

Abstract:

This thesis investigates combinatorial and robust optimisation models for solving railway problems. Railway applications represent a challenging area for operations research: most problems in this context can be modelled as combinatorial optimisation problems, in which the number of feasible solutions is finite. Yet, despite the astonishing success in the field of combinatorial optimisation, the current state of algorithmic research faces severe difficulties with highly complex and data-intensive applications such as those dealing with optimisation issues in large-scale transportation networks. One of the main issues concerns imperfect information. The idea of Robust Optimisation, as a way to represent and mathematically handle systems with imprecisely known data, dates back to the 1970s. Unfortunately, none of those techniques proved to be successfully applicable to one of the most complex and largest-scale transportation settings: that of railway systems. Railway optimisation deals with planning and scheduling problems over several time horizons. Disturbances are inevitable and severely affect the planning process. Here we focus on two compelling aspects of planning: robust planning and online (real-time) planning.

Relevance: 30.00%

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
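To give a flavour of what checking compliance against a declarative model means, consider ConDec's response constraint: every occurrence of activity a must eventually be followed by an occurrence of b. A minimal trace checker, written here in plain Python rather than in CLIMB, with an invented order-handling trace as the example:

```python
def check_response(trace, a, b):
    """ConDec response(a, b): every occurrence of a is eventually followed by a b.
    Traces are finite sequences of activity names."""
    pending = False
    for event in trace:
        if event == a:
            pending = True       # an obligation is raised
        elif event == b:
            pending = False      # the most recent obligation is discharged
    return not pending           # compliant iff no obligation is left open

print(check_response(["register", "pay", "ship"], "pay", "ship"))  # compliant
print(check_response(["register", "pay"], "pay", "ship"))          # violated
```

Monitoring at run time follows the same pattern: the `pending` flag is exactly the "open obligation" state that a reactive verifier tracks event by event.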

Relevance: 30.00%

Abstract:

High-spectral-resolution radiative transfer (RT) codes are essential tools in the study of radiative energy transfer in the Earth's atmosphere and a support for the development of parameterizations for the fast RT codes used in climate and weather prediction models. Cirrus clouds permanently cover about 30% of the Earth's surface, representing an important contribution to the Earth-atmosphere radiation balance. This work has focused on the development of the RT model LBLMS. The model, widely tested in the infrared spectral range, has been extended to the shortwave spectrum and has been used in comparison with airborne and satellite measurements to study the optical properties of cirrus clouds. A new database of single scattering properties has been developed for mid-latitude cirrus clouds, in which ice clouds are treated as a mixture of ice crystals with various habits. The optical properties of the mixture are tested against radiometric measurements in selected case studies. Finally, a parameterization of the mixture for application to weather prediction and global circulation models has been developed: the bulk optical properties of ice crystals are parameterized as functions of the effective dimension of measured particle size distributions that are representative of mid-latitude cirrus clouds. Tests with the limited-area weather prediction model COSMO have shown the impact of the new parameterization with respect to cirrus cloud optical properties based on ice spheres.
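Parameterizations of this kind typically express a bulk property, such as the volume extinction coefficient, as a simple function of ice water content and the effective dimension of the size distribution (Fu-type schemes use the form beta = IWC * (a + b / D_eff)). A sketch of evaluating such a fit, where the coefficients a and b are purely illustrative and not the values fitted in this work:

```python
def extinction_coefficient(iwc_g_m3: float, d_eff_um: float,
                           a: float = -6.7e-3, b: float = 3.33) -> float:
    """Bulk extinction as beta = IWC * (a + b / D_eff).

    Functional form follows Fu-type ice-cloud schemes; the coefficients a, b
    and the unit conventions here are illustrative placeholders, valid only
    for the mid-range effective dimensions used in this example.
    """
    return 1e3 * iwc_g_m3 * (a + b / d_eff_um)

# For a fixed ice water content, extinction decreases as crystals grow.
for d_eff in (20.0, 50.0, 100.0):
    print(d_eff, extinction_coefficient(0.02, d_eff))
```

A weather model evaluates such a polynomial per grid box instead of calling the full scattering database, which is what makes the parameterization cheap enough for operational use.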

Relevance: 30.00%

Abstract:

The aim of this work was to quantify a set of life-history traits of the two tropical warbler species Sylvia boehmi and S. lugens (Aves: Sylviidae; formerly genus Parisoma). Thirteen breeding pairs of both species were observed in Kenya from 2000 to 2002. The data were analysed with multivariate statistics and multistate mark-recapture models. Compared with the temperate Sylvia species, the life-history traits of the two studied Sylvia species are characterised by small clutches of two eggs, long incubation periods (S. boehmi (b.) 15.0 days, S. lugens (l.) 14.5 days), long nestling periods (b. 12.9 days, l. 16.0 days), and low nest success rates (b. 19.4%, l. 33.2%). The period from fledging to independence was comparatively long, 58.5 days in S. boehmi and 37.5 days in S. lugens, and the survival rate of fledged young during this period was relatively high (b. 69.2%, l. 55.4%). The annual survival rate of breeding adults was 71.2% in S. boehmi and 57.2% in S. lugens. The seasonality of the habitat, driven by rainy and dry seasons, had no influence on monthly survival rates over the course of a year. Despite high nest predation rates, there was no clear relationship between predation and feeding rate, nest guarding or nest site.

Relevance: 30.00%

Abstract:

The "sustainability" concept relates to prolonging human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained for the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe, as well as in the U.S., are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, improves conditions for workers, and supports sustainable development. The crumb-rubber modifier (CRM), obtained from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only from an environmental standpoint but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design, but it cannot be the only step.
The eco-compatible approach should be extended to the design method and material characterization as well, because only with these phases is it possible to exploit the maximum potential of the materials used. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was the design method of choice. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain, related to surface cracking and rutting respectively. It works in increments of time and, using the output from one increment recursively as input to the next, predicts pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness. The performance of the pavement was compared to the performance of the same pavement structure with different kinds of asphalt concrete as the surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials were analyzed: two warm mix asphalts and a rubberized asphalt concrete.
The first two chapters summarize the steps necessary to satisfy the sustainable pavement design procedure. Chapter I introduces the problem of eco-compatible asphalt pavement design; the low-environmental-impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate material selection and performance evaluation. In Chapter III, CalME is introduced through an explanation of its different design approaches, with a specific focus on the I-R procedure. In Chapter IV, the experimental program is presented, with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the simulations of asphalt pavement structures with different surface layers are reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rut depth in each bound layer were analyzed.
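The incremental-recursive idea, feeding each increment's damaged state back in as the next increment's input, can be reduced to a few lines. The damage law and all constants below are illustrative placeholders, not CalME's calibrated fatigue model:

```python
def simulate_incremental_recursive(n_increments: int = 40,
                                   load_apps_per_increment: float = 1e5):
    """Toy I-R loop: the layer modulus degrades as fatigue damage accumulates,
    and the degraded modulus is the recursive input to the next increment.
    E0, the damage rate, and the modulus-reduction law are all hypothetical."""
    E0 = 5000.0          # undamaged stiffness (MPa), illustrative
    damage, E = 0.0, E0
    history = []
    for _ in range(n_increments):
        # Stand-in damage model: softer layer -> higher strain -> faster damage.
        strain_factor = E0 / E
        damage = min(1.0, damage + 1e-3 * strain_factor * load_apps_per_increment / 1e5)
        E = E0 * (1.0 - 0.8 * damage)   # output of this increment, input to the next
        history.append((damage, E))
    return history

hist = simulate_incremental_recursive()
print(f"final damage {hist[-1][0]:.3f}, final modulus {hist[-1][1]:.0f} MPa")
```

CalME applies the same feedback structure, but with calibrated master-curve, fatigue, and permanent shear strain models per layer, and with traffic and temperature varying between increments.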

Relevance: 30.00%

Abstract:

This research has focused on the study of the behavior and collapse of masonry arch bridges. Recent decades have seen an increasing interest in this structural type, which is still present and in use despite the passage of time and changes in the means of transport. Several strategies have been developed over time to simulate the response of this type of structure, although even today there is no generally accepted standard method for the assessment of masonry arch bridges. The aim of this thesis is to compare, on case studies, the principal analytical and numerical methods existing in the literature, highlighting their strengths and weaknesses. The methods examined are mainly three: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of Plastic Analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Every method is applied to the case studies through computer-based implementations that allow a user-friendly application of the principles explained. A particular closed-form approach based on an elasto-plastic material model, developed by some Belgian researchers, is also studied. To compare the three methods, two different case studies have been analyzed: i) a generic masonry arch bridge with a single span; ii) a real masonry arch bridge, the Clemente Bridge, built over the Savio River in Cesena. In the analyses performed, all the models are two-dimensional in order to obtain results comparable between the different methods examined. The methods have been compared with each other in terms of collapse load and hinge positions.
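Thrust line analysis rests on the safe ("lower bound") theorem of plastic analysis: if any line of thrust in equilibrium with the loads fits within the arch thickness, the arch is safe. For a parabolic arch under a uniform load this check takes a few lines; the span, rise, thickness and load below are hypothetical, not the Clemente Bridge geometry:

```python
# Safe-theorem check for a parabolic arch under uniform load (illustrative geometry).
L, f, t = 20.0, 4.0, 0.6   # span (m), rise (m), ring thickness (m) -- hypothetical
w = 50.0                   # uniform load (kN/m) -- hypothetical

H = w * L**2 / (8 * f)     # horizontal thrust making the funicular pass
                           # through crown and springings

def centerline(x):
    """Parabolic arch axis, zero at the springings, rise f at midspan."""
    return 4 * f * x * (L - x) / L**2

def thrust_line(x):
    """Funicular polygon of a uniform load with horizontal thrust H."""
    return w * x * (L - x) / (2 * H)

# Safe theorem: the thrust line must stay within half a thickness of the axis.
xs = [i * L / 100 for i in range(101)]
safe = all(abs(thrust_line(x) - centerline(x)) <= t / 2 for x in xs)
print("arch is safe:", safe)
```

Here the funicular of a uniform load is itself a parabola, so it coincides with the axis and the check passes trivially; for point loads or a non-parabolic axis the same containment test becomes the real assessment criterion.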

Relevância:

30.00%

Publicador:

Resumo:

The advances that have characterized spatial econometrics in recent years are mostly theoretical and have not yet found extensive empirical application. In this work we aim to supply a review of the main tools of spatial econometrics and to show an empirical application of one of the most recently introduced estimators. Despite the numerous alternatives that econometric theory provides for the treatment of spatial (and spatiotemporal) data, empirical analyses are still limited by the lack of the corresponding routines in statistical and econometric software. Spatiotemporal modeling represents one of the most recent developments in spatial econometric theory, and the finite-sample properties of the proposed estimators are currently being tested in the literature. We provide a comparison between some estimators (a quasi-maximum likelihood, QML, estimator and some GMM-type estimators) for a fixed-effects dynamic panel data model under certain conditions, by means of a Monte Carlo simulation analysis. We focus on different settings, characterized by either fully stable or quasi-unit-root series. We also investigate the extent of the bias caused by a non-spatial estimation of a model when the data are characterized by different degrees of spatial dependence. Finally, we provide an empirical application of a QML estimator for a time-space dynamic model that includes a temporal, a spatial, and a spatiotemporal lag of the dependent variable. This is done in a relevant and prolific field of analysis in which spatial econometrics has so far found only limited space, in order to explore the value added of considering the spatial dimension of the data. In particular, we study the determinants of cropland values in the Midwestern U.S.A. over the years 1971-2009, taking the present value model (PVM) as the theoretical framework of analysis.
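The cross-sectional core of the models discussed above is the spatial autoregressive (SAR) process y = ρWy + Xβ + ε, whose reduced form y = (I − ρW)⁻¹(Xβ + ε) is also how Monte Carlo data are typically generated. A minimal sketch follows, assuming a hypothetical ring-contiguity weights matrix and made-up coefficients; the thesis's time-space dynamic model adds temporal and spatiotemporal lags on top of this core.

```python
# Minimal sketch: simulate a spatial autoregressive (SAR) process
#   y = rho * W y + x * beta + eps
# via its reduced form y = (I - rho W)^(-1) (x beta + eps).
# W, rho and beta are hypothetical choices for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 50
rho, beta = 0.4, 1.5

# hypothetical row-standardized contiguity matrix: two neighbors on a ring
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

x = rng.normal(size=n)
eps = rng.normal(scale=0.5, size=n)
y = np.linalg.solve(np.eye(n) - rho * W, beta * x + eps)

# a naive regression that ignores the spatial lag will generally not
# recover beta exactly, illustrating the non-spatial-estimation bias
beta_naive = (x @ y) / (x @ x)
print(f"true beta = {beta}, naive estimate = {beta_naive:.3f}")
```

Repeating the draw many times and tabulating the naive estimate against β is exactly the kind of Monte Carlo comparison the abstract describes, here reduced to a single cross-section.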

Relevância:

30.00%

Publicador:

Resumo:

In this work the turbulence of the atmospheric boundary layer under convective conditions is modeled. To this aim, the equations that describe the atmospheric motion are expressed through Reynolds averages, which then require closures. The work consists of modifying the TKE-l closure used in the BOLAM (Bologna Limited Area Model) forecast model. In particular, the single-column model extracted from BOLAM is used and modified to obtain three further closure schemes: a non-local term is added to the flux-gradient relations used to close the second-order moments in the evolution equation of the turbulent kinetic energy, so that the flux-gradient relations become more suitable for simulating an unstable boundary layer. Finally, the results obtained from the original single-column model and from the three new schemes are compared with each other and with the observations of the well-known "GABLS2" case from the literature.
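The effect of adding a non-local term to a flux-gradient relation can be sketched numerically: with a countergradient correction γ, the modeled flux is −K(∂θ/∂z − γ) instead of −K ∂θ/∂z, so it can stay upward even where the local gradient is slightly stable, as in the upper part of a convective boundary layer. The profile, K, and γ below are hypothetical, not BOLAM's closure constants.

```python
# Sketch of a non-local (countergradient) correction to a flux-gradient
# closure: heat flux modeled as -K * (dtheta/dz - gamma). The profile and
# the constants K, gamma are hypothetical illustrations.
import numpy as np

z = np.linspace(50.0, 1000.0, 20)                  # heights in the mixed layer [m]
theta = 300.0 + 1e-4 * (z - 500.0) ** 2 / 500.0    # nearly well-mixed potential temperature [K]

K = 50.0        # eddy diffusivity [m^2/s], hypothetical constant
gamma = 7e-4    # countergradient term [K/m], hypothetical constant

dtheta_dz = np.gradient(theta, z)
flux_local = -K * dtheta_dz                  # purely local closure
flux_nonlocal = -K * (dtheta_dz - gamma)     # closure with the non-local term

# near the top of the layer the local closure gives a downward (negative)
# flux, while the non-local term shifts every flux value upward by K*gamma
print(flux_local[-1] < 0, flux_nonlocal[-1] > flux_local[-1])  # -> True True
```

The constant upward shift K·γ is the simplest form of the correction; schemes differ mainly in how γ is made to depend on the convective forcing.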

Relevância:

30.00%

Publicador:

Resumo:

The atmosphere is a global influence on the movement of heat and humidity between the continents and thus significantly affects climate variability. Information about atmospheric circulation is of major importance for the understanding of different climatic conditions. Dust deposits from maar lakes and dry maars of the Eifel Volcanic Field (Germany) are therefore used as proxy data for the reconstruction of past aeolian dynamics.

In this thesis, two sediment cores from the Eifel region are examined: core SM3 from Lake Schalkenmehren and core DE3 from the Dehner dry maar. Both cores contain the tephra of the Laacher See eruption, dated to 12,900 years before present. Taken together the cores cover the last 60,000 years: SM3 the Holocene, and DE3 the marine isotope stages MIS-3 and MIS-2. The frequency of glacial dust storm events and their paleo wind directions are detected by high-resolution grain-size and provenance analysis of the lake sediments. Two different methods are applied: geochemical measurements of the sediment using µXRF scanning, and the particle analysis method RADIUS (rapid particle analysis of digital images by ultra-high-resolution scanning of thin sections).

It is shown that single dust layers in the lake sediment are characterized by an increased content of aeolian-transported carbonate particles. The limestone-bearing Eifel North-South zone is the most likely source of the carbonate-rich aeolian dust in the lake sediments of the Dehner dry maar. Since the dry maar is located on the western side of the Eifel North-South zone, carbonate-rich aeolian sediment is most likely to have been transported towards it by easterly winds. A methodology is developed which limits the detection to the aeolian-transported carbonate particles in the sediment: the RADIUS-carbonate module.

In summary, during marine isotope stage MIS-3 both the storm frequency and the east wind frequency were increased in comparison to MIS-2. These results lead to the suggestion that the atmospheric circulation during MIS-3 was affected by more turbulent conditions, in comparison to the more stable atmospheric circulation during the full glacial conditions of MIS-2.

The results of the investigations of the dust records are finally evaluated in relation to a study of atmospheric general circulation models (AGCMs) for a comprehensive interpretation. Here, AGCM experiments (ECHAM3 and ECHAM4) with different prescribed SST patterns are used to develop a synoptic interpretation of long-persisting east wind conditions and of east wind storm events, which are suggested to lead to an enhanced accumulation of sediment transported by easterly winds to the proxy site of the Dehner dry maar.

The basic observations made on the proxy record are also reflected in the 10 m wind vectors of the different model experiments under glacial conditions with different prescribed sea surface temperature patterns. Furthermore, the analysis of long-persisting east wind conditions in the AGCM data shows a stronger seasonality under glacial conditions: all the experiments are characterized by an increase in the relative importance of the LEWIC during spring and summer. The different glacial experiments consistently show a shift of a long-lasting high from over the Baltic Sea towards the northwest, directly above the Scandinavian Ice Sheet, together with a contemporary enhanced westerly circulation over the North Atlantic.

This thesis is a comprehensive analysis of atmospheric circulation patterns during the last glacial period. It has been possible to reconstruct important elements of the glacial paleoclimate in Central Europe. While the proxy data from the sediment cores yield only a binary signal of wind direction changes (east versus west wind), a synoptic interpretation using atmospheric circulation models succeeds in suggesting a possible distribution of high- and low-pressure areas, and thus the direction and strength of the wind fields able to transport dust. In conclusion, the combination of numerical models, which enhance the understanding of processes in the climate system, with proxy data from the environmental record is the key to a comprehensive paleoclimatic reconstruction.
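The detection step described above, flagging dust layers by their elevated carbonate content in the µXRF record, can be sketched as a simple threshold rule: depths where the Ca signal (normalized by a conservative element such as Ti) clearly exceeds the background are marked as dust-layer candidates. The counts and the threshold rule below are hypothetical, not the RADIUS-carbonate algorithm.

```python
# Toy sketch of flagging candidate aeolian dust layers in a core by their
# carbonate signal: depths where the Ca/Ti count ratio exceeds a multiple
# of the background (median) ratio are flagged. Counts are hypothetical.

def flag_dust_layers(ca_counts, ti_counts, factor=2.0):
    ratios = [ca / ti for ca, ti in zip(ca_counts, ti_counts)]
    background = sorted(ratios)[len(ratios) // 2]   # median as background level
    return [i for i, r in enumerate(ratios) if r > factor * background]

ca = [10, 12, 11, 55, 13, 12, 60, 11]   # hypothetical Ca counts per depth step
ti = [5, 5, 6, 5, 6, 5, 5, 6]           # hypothetical Ti counts per depth step
print(flag_dust_layers(ca, ti))  # -> [3, 6]
```

Normalizing by a conservative element rather than using raw Ca counts guards against dilution effects, which is why element ratios are the usual working quantity in XRF core-scanning studies.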