927 results for Marginal structural model
Abstract:
Epoxy adhesives are nowadays extensively used in Civil Engineering applications, mostly in the rehabilitation of reinforced concrete (RC) structures. In this context, epoxy adhesives are used to provide adequate stress transfer from fibre reinforced polymers (FRP) to the surrounding concrete substrate. More recently, the possibility of using prestressed FRPs bonded with these epoxy adhesives has also been explored in order to maximize the potential of this strengthening approach. Understanding the long-term behaviour of the materials involved therefore becomes essential. Even when non-prestressed FRPs are used, a certain amount of stress is permanently applied to the adhesive interface under the serviceability conditions of the strengthened structure, and creep of the adhesive may cause a continuous variation in the deformational response of the element. This paper therefore presents a study aiming to experimentally characterize the tensile creep behaviour of an epoxy-based adhesive currently used in the strengthening of concrete structures with carbon FRP (CFRP) systems. To analytically describe the tensile creep behaviour, the modified Burgers model was fitted to the experimental creep curves, and the results revealed that this model is capable of predicting the long-term behaviour of this material with very good accuracy up to a sustained stress level of 60% of the adhesive's tensile strength.
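As an illustration of the fitting step described above, the following minimal sketch fits a Burgers-type creep law to a tensile creep curve with SciPy. The exact form of the paper's modified Burgers model is not reproduced here, so the classical four-parameter Burgers compliance is assumed, and the data arrays, stress level and parameter values are placeholders.

```python
# Hedged sketch: fitting a Burgers-type creep model to tensile creep data.
# The classical four-parameter Burgers creep law is assumed; data are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def burgers_strain(t, E1, E2, eta1, eta2, sigma0=20.0):
    """Creep strain of the Burgers model under constant stress sigma0 (MPa).
    E1, E2 in MPa; eta1, eta2 in MPa*h; t in hours."""
    return sigma0 * (1.0 / E1 + t / eta1 + (1.0 / E2) * (1.0 - np.exp(-E2 * t / eta2)))

# Placeholder "experimental" creep curve (time in hours, strain dimensionless)
t_exp = np.linspace(0.0, 1000.0, 200)
eps_exp = burgers_strain(t_exp, 5000.0, 12000.0, 8.0e6, 2.0e5) \
          + np.random.normal(0.0, 2e-5, t_exp.size)   # synthetic noise

popt, _ = curve_fit(burgers_strain, t_exp, eps_exp,
                    p0=[4000.0, 10000.0, 1.0e7, 1.0e5], maxfev=20000)
print("Fitted E1, E2, eta1, eta2:", popt)
```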
Abstract:
Construction of hydroelectric dams in tropical regions has contributed significantly to forest fragmentation. Alterations at the edges of forest fragments impact plant communities, which suffer increases in tree damage and mortality and decreases in seedling recruitment. This study aimed to test the core-area model in a landscape fragmented by the construction of a hydroelectric power plant in the Brazilian Amazon. We studied variations in forest structure between the margins and interiors of 17 islands of 8-100 hectares in the Tucuruí dam reservoir, using two plots (30 m and >100 m from the margin) per island. Mean tree density, basal area, seedling density and forest cover did not differ significantly between marginal and interior island plots. Likewise, no significant differences were found in liana density, dead trees or tree damage between margin and interior plots. The peculiar topographic conditions, together with the matrix habitat and island shapes, seem to extend edge effects to the islands' centers regardless of island size, giving the interior physical and microclimatic conditions similar to those at the edges. We propose a protocol for assessing the ecological impacts of edge effects in fragments of natural habitat surrounded by induced (artificial) edges. The protocol involves three steps: (1) identification of focal taxa of particular conservation or management interest, (2) measurement of an "edge function" that describes the response of these taxa to induced edges, and (3) use of a "Core-Area Model" to extrapolate edge function parameters to existing or novel situations.
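The core-area idea can be illustrated with a simple geometric sketch: the core of a fragment is whatever remains after removing an edge-affected buffer of a given penetration depth. A circular fragment is assumed purely for illustration; the published Core-Area Model works with real fragment shapes and fitted edge functions.

```python
# Hedged sketch of the core-area idea: core = fragment area minus an
# edge-affected buffer of depth edge_depth_m, for an idealised circular fragment.
import math

def core_area_circle(area_ha, edge_depth_m):
    """Core area (ha) of an idealised circular fragment after removing a
    buffer of width edge_depth_m from its margin."""
    radius_m = math.sqrt(area_ha * 10_000.0 / math.pi)
    core_radius = max(radius_m - edge_depth_m, 0.0)
    return math.pi * core_radius ** 2 / 10_000.0

# Example: with a 100 m edge-effect depth, an 8 ha island retains only a small
# core, while a 100 ha island keeps roughly two thirds of its area as core.
for a in (8, 50, 100):
    print(a, "ha ->", round(core_area_circle(a, 100.0), 1), "ha core")
```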
Abstract:
Master's dissertation in Structural Analysis of Monuments and Historical Constructions
Abstract:
Master's dissertation in Structural Analysis of Monuments and Historical Constructions
Abstract:
A novel framework for probabilistic-based structural assessment of existing structures, which combines model identification and reliability assessment procedures while treating different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are taken into account in the algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model for the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is subsequently updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams which were loaded up to failure in the laboratory.
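The Bayesian updating step mentioned above can be illustrated with a minimal sketch in which a single model parameter with a normal prior is updated as each new measurement arrives. Conjugate normal-normal updating with known measurement variance is assumed here purely for illustration; it is not the authors' algorithm.

```python
# Hedged sketch: sequential Bayesian updating of one model parameter
# (e.g. a stiffness) with a normal prior and normal measurement noise.
def bayes_update_normal(prior_mean, prior_var, obs, obs_var):
    """Posterior mean/variance for a normal prior and a normal likelihood."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Example: prior from the identified numerical model, then three new test results
mean, var = 30.0, 4.0          # illustrative E-modulus prior (GPa) from model identification
for measurement in (31.2, 29.5, 30.8):
    mean, var = bayes_update_normal(mean, var, measurement, obs_var=1.0)
    print(f"posterior: mean={mean:.2f}, var={var:.3f}")
```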
Abstract:
Auxetic materials are a class of materials that behave in an unusual way compared to regular materials, i.e. they possess a negative Poisson's ratio. This paper reports the development of auxetic structures based on a re-entrant hexagon design made from braided composite materials, and the testing of their mechanical properties (tensile properties, auxetic behaviour and work of rupture). The structures were produced from glass and basalt braided composite rods and their properties were compared. The basic re-entrant hexagon design was then modified with vertical straight rods to improve the mechanical behaviour, and the auxetic property of the modified structures was studied. The auxetic behaviour was characterized in a tensile testing machine while recording video with a digital camera; the video was later converted into images, from which strain values were measured with the ImageJ software. Along with the experimental work, an analytical model was used to calculate the Poisson's ratio of the basic structure and the results were compared.
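A minimal sketch of the strain-based Poisson's ratio calculation implied above: point displacements read from the test video frames (e.g. with ImageJ) are converted to engineering strains, and the ratio of transverse to axial strain gives the Poisson's ratio, with a negative value indicating auxetic behaviour. The dimensions below are placeholders.

```python
# Hedged sketch: Poisson's ratio from lengths measured on test images.
def engineering_strain(l0, l):
    """Engineering strain from initial length l0 and current length l."""
    return (l - l0) / l0

def poissons_ratio(eps_axial, eps_transverse):
    """nu = -eps_transverse / eps_axial; a negative result indicates auxetic behaviour."""
    return -eps_transverse / eps_axial

# Example: the structure widens while being stretched (auxetic response)
eps_ax = engineering_strain(100.0, 104.0)    # +4% axial strain
eps_tr = engineering_strain(50.0, 51.0)      # +2% transverse strain (expansion)
print("Poisson's ratio:", poissons_ratio(eps_ax, eps_tr))   # -> -0.5
```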
Abstract:
Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of core metabolic model; (7) generation of biomass composition reaction; (8) completion of draft metabolic model; (9) curation of metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
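As a minimal illustration of what a reconstructed model enables, the sketch below runs flux balance analysis (FBA) as a linear programme, maximising a "biomass" flux subject to the steady-state mass balance S·v = 0. The three-reaction toy network is invented for illustration; genome-scale models contain thousands of reactions and are normally handled with dedicated tools such as COBRApy.

```python
# Hedged sketch: flux balance analysis on a toy network with SciPy's LP solver.
import numpy as np
from scipy.optimize import linprog

# Columns: uptake (-> A), conversion (A -> B), biomass drain (B ->)
S = np.array([[ 1, -1,  0],    # metabolite A
              [ 0,  1, -1]])   # metabolite B
bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds (uptake capped at 10)
c = [0, 0, -1]                              # maximise biomass flux = minimise -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("Optimal fluxes:", res.x)             # expected: [10, 10, 10]
```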
Abstract:
The primary purpose of this exploratory empirical study is to examine the structural stability of a limited number of alternative explanatory factors of strategic change. On the basis of theoretical arguments and prior empirical evidence from two traditional perspectives, we propose an original empirical framework to analyse whether these potential explanatory factors have remained stable over time in a highly turbulent environment. This question is explored in a particular setting: the population of Spanish private banks. The firms in this industry have experienced a high level of strategic mobility as a consequence of fundamental changes in their environmental conditions over the last two decades (mainly changes related to the new banking and financial regulation process). Our results consistently indicate that the effect of most of the explanatory factors of strategic mobility considered did not remain stable over the whole period of analysis. From this point of view, the study sheds new light on major debates and dilemmas in the field of strategy regarding why firms change their competitive patterns over time and, hence, to what extent the "context-dependency" of alternative views of strategic change, as well as their relative validity, can vary over time for a given population. Methodologically, this research makes two major contributions to the study of potential determinants of strategic change. First, the definition and measurement of strategic change employ a new grouping method, the Model-based Cluster Method (MCLUST). Second, in order to assess the possible effect of determinants of strategic mobility, we have controlled for non-observable heterogeneity using logistic regression models for panel data.
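Since MCLUST is an R package, the sketch below uses a Gaussian-mixture analogue from scikit-learn purely to illustrate the idea of model-based clustering: firms are grouped into strategic groups and the number of groups is chosen by BIC. The two strategy indicators and the group structure are invented placeholders.

```python
# Hedged sketch: model-based clustering of firms via Gaussian mixtures with BIC selection.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Placeholder data: two strategy dimensions for 60 banks drawn from two latent groups
X = np.vstack([rng.normal([0.2, 0.8], 0.1, (30, 2)),
               rng.normal([0.7, 0.3], 0.1, (30, 2))])

best = min((GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)),
           key=lambda m: m.bic(X))
print("Chosen number of strategic groups:", best.n_components)
print("First group assignments:", best.predict(X)[:10])
```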
Abstract:
The public perception of the EU in Spain varies greatly. The most positive aspects of Spanish membership are associated with the consolidation of democracy, economic growth, the introduction of the euro, the growth in employment and the structural and cohesion funds, the increase in the female participation rate, and the equal opportunities policies. Analysts are in favour of common objectives in employment policy and multi-level government. The less positive aspects of the EU are the risks of losing social protection and the loss of employment in some sectors due to mergers of multinationals and the delocalization of companies towards Eastern Europe. The continuous demands for reform of the welfare state, the toughening of the conditions of access to social benefits and the reform of the labour market are also seen as problematic issues, as are the risks of competitive cuts and social dumping.
Abstract:
Besley (1988) uses a scaling approach to model merit good arguments in commodity tax policy. In this paper, I question this approach on the grounds that it produces 'wrong' recommendations--taxation (subsidisation) of merit (demerit) goods--whenever the demand for the (de)merit good is inelastic. I propose an alternative approach that does not suffer from this deficiency, and derive the ensuing first and second best tax rules, as well as the marginal cost expressions to perform tax reform analysis.
Abstract:
Actual tax systems do not follow the normative recommendations of the theory of optimal taxation. There are two reasons for this. Firstly, the informational difficulties of knowing or estimating all the relevant elasticities and parameters. Secondly, the political complexities that would arise if a new tax implementation departed too much from current systems, which are perceived as somewhat egalitarian. Hence an ex-novo overhaul of the tax system might simply be non-viable. In contrast, a small marginal tax reform could be politically more palatable and economically simpler to implement. The goal of this paper is to evaluate, as a step prior to any tax reform, the marginal welfare cost of the current tax system in Spain. We do this by using a computational general equilibrium model calibrated to a point-in-time micro database. The simulation results show that the Spanish tax system gives rise to a considerable marginal excess burden, of the order of 0.50 money units for each additional money unit collected through taxes.
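A minimal sketch of what a marginal excess burden of about 0.50 means in practice: for each extra money unit of revenue raised, welfare falls by roughly 1.50 units. The numbers below are illustrative and are not taken from the paper's CGE results.

```python
# Hedged sketch: marginal excess burden as the welfare loss beyond the revenue raised,
# per unit of additional revenue.
def marginal_excess_burden(welfare_loss, extra_revenue):
    """MEB = (welfare loss - revenue raised) / revenue raised."""
    return (welfare_loss - extra_revenue) / extra_revenue

print(marginal_excess_burden(welfare_loss=1.50, extra_revenue=1.00))  # -> 0.5
```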
Abstract:
This paper analyzes the growth and employment effects of the 1994-99 Community Support Framework (CSF) for the Objective 1 Spanish regions using a simple supply-side model estimated with a panel of regional data. The results suggest that the impact of the Structural Funds in Spain has been quite sizable, adding around a percentage point to annual output growth in the average Objective 1 region and 0.4 points to employment growth. Over the period 1994-2000, the Framework has resulted in the creation of over 300,000 new jobs and has eliminated 20% of the initial gap in income per capita between the assisted regions and the rest of the country.
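A hedged sketch of a panel regression in the spirit of the estimates discussed above: regional output growth regressed on a Structural Funds variable with region and year fixed effects and clustered standard errors. The data file and column names are hypothetical, and the paper's actual supply-side specification is not reproduced here.

```python
# Hedged sketch: two-way fixed-effects panel regression with statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("regional_panel.csv")   # hypothetical file with columns:
                                         # region, year, growth, funds_share

model = smf.ols("growth ~ funds_share + C(region) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["region"]})
print(model.summary().tables[1])
```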
Abstract:
We use structural methods to assess equilibrium models of bidding with data from first-price auction experiments. We identify conditions to test the Nash equilibrium models for homogeneous and for heterogeneous constant relative risk aversion when bidders' private valuations are independently and uniformly drawn. The outcomes of our study indicate that behavior may have been affected by the procedure used to conduct the experiments and that the usual Nash equilibrium model for heterogeneous constant relative risk averse bidders does not consistently explain the observed overbidding. From an empirical standpoint, our analysis shows the possible drawbacks of overlooking the homogeneity hypothesis when testing symmetric equilibrium models of bidding, and it puts in perspective the sensitivity of structural inferences to the available information.
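The benchmark usually invoked in this literature can be sketched as follows: with n bidders, i.i.d. uniform valuations and constant relative risk aversion utility u(y) = y^r, the symmetric equilibrium bid is often written as b(v) = (n-1)v/(n-1+r), which nests the risk-neutral case r = 1 and implies overbidding for r < 1. Treat this as an illustration of the overbidding benchmark, not as the paper's exact model.

```python
# Hedged sketch: CRRA bid function for a first-price auction with uniform valuations.
def crra_bid(v, n, r):
    """Equilibrium bid for valuation v, n bidders, CRRA parameter r (r=1: risk neutral)."""
    return (n - 1) * v / (n - 1 + r)

v, n = 0.8, 4
print("risk-neutral bid:", crra_bid(v, n, r=1.0))   # 0.60
print("risk-averse bid: ", crra_bid(v, n, r=0.5))   # ~0.686 -> overbidding
```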
Abstract:
This paper tests the Entrepreneurial Intention Model, which is adapted from the Theory of Planned Behavior, on a sample of 533 individuals from two quite different countries: one European (Spain) and the other East Asian (Taiwan). A newly developed Entrepreneurial Intention Questionnaire (EIQ) has been used, which tries to overcome some of the limitations of previous instruments. Structural equation techniques were used in the empirical analysis. Results are generally satisfactory, indicating that the model is probably adequate for studying entrepreneurship. Support for the model was found not only in the combined sample, but also in each of the national ones. However, some differences arose that may indicate that demographic variables contribute differently to the formation of perceptions in each culture.
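A hedged sketch of a structural-equation specification in the spirit of the Theory of Planned Behavior model described above, written for the Python package semopy as an illustration (the paper does not state which SEM software was used). The questionnaire item names and the data file are placeholders.

```python
# Hedged sketch: a TPB-style SEM specified in lavaan-like syntax for semopy.
import pandas as pd
from semopy import Model

spec = """
# measurement model (placeholder item names)
attitude  =~ att1 + att2 + att3
norm      =~ sn1 + sn2 + sn3
pbc       =~ pbc1 + pbc2 + pbc3
intention =~ int1 + int2 + int3
# structural model: intention driven by the three TPB antecedents
intention ~ attitude + norm + pbc
"""

data = pd.read_csv("eiq_responses.csv")   # hypothetical item-level responses
model = Model(spec)
model.fit(data)
print(model.inspect())                    # parameter estimates and standard errors
```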
Abstract:
This paper contributes to the on-going empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
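The error-model comparison described above can be illustrated with a small sketch: a bivariate VARMA(1,1) versus a VAR(1) fitted to the same simulated series with statsmodels and compared by log likelihood. The simulated data stand in for the RBC model's unexplained errors, and this classical likelihood comparison is only a stand-in for the paper's marginal likelihood ratio tests computed via MCMC.

```python
# Hedged sketch: VAR(1) vs VARMA(1,1) fitted to simulated bivariate "error" series.
import numpy as np
from statsmodels.tsa.statespace.varmax import VARMAX

rng = np.random.default_rng(1)
T = 300
e = rng.normal(size=(T, 2))
y = np.zeros((T, 2))
for t in range(1, T):                      # simple VARMA(1,1)-type data generator
    y[t] = 0.5 * y[t - 1] + e[t] + 0.4 * e[t - 1]

var_fit = VARMAX(y, order=(1, 0)).fit(disp=False)     # VAR(1)
varma_fit = VARMAX(y, order=(1, 1)).fit(disp=False)   # VARMA(1,1)
print("log-likelihoods:", var_fit.llf, varma_fit.llf)
print("LR statistic:", 2 * (varma_fit.llf - var_fit.llf))
```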