920 results for Performance-based design
Abstract:
Bridges with the deck supported on either sliding or elastomeric bearings are very common in regions of moderate seismicity. Their main seismic vulnerabilities are related to the pounding of the deck against the abutments or between the different deck elements. A simplified model of the longitudinal behavior of these bridges makes it possible to characterize the reaction forces developed during pounding using the Pacific Earthquake Engineering Research Center framework formula. In order to ensure the general applicability of the results obtained, a large number of system parameter combinations is considered. The heart of the formula is the identification of suitable intermediate variables. First, the pseudo-acceleration spectral value at the fundamental period of the system, Sa(Ts), is used as an intensity measure (IM). This IM results in a very large unexplained variability of the engineering demand parameter. A portion of this variability is shown to be related to the relative content of high-frequency energy in the input motion. Two vector-valued IMs, each including a second parameter that accounts for this energy content, are then considered. For both of them, a suitable form for the conditional intensity dependence of the response is obtained. The question of which one to choose is also analyzed. Finally, additional issues related to the IM are studied: its applicability to pulse-type records, the validity of scaling records, and the sufficiency of the IM.
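For reference, the demand-hazard form of the PEER framework formula referred to above can be written in standard notation (not reproduced from the abstract), first for the scalar IM Sa(Ts) and then for a vector-valued IM with a second parameter:

```latex
% Mean annual frequency of exceedance of an EDP level z, scalar IM = Sa(Ts).
\lambda_{EDP}(z) = \int P\!\left[\, EDP > z \mid Sa(T_s) = x \,\right]\,
\left| \mathrm{d}\lambda_{Sa(T_s)}(x) \right|

% Vector-valued IM: conditioning on Sa(Ts) and a second parameter IM_2
% (e.g., a measure of high-frequency energy content), integrated over the
% joint hazard of the two components.
\lambda_{EDP}(z) = \iint P\!\left[\, EDP > z \mid IM_1 = x,\, IM_2 = y \,\right]\,
\left| \mathrm{d}\lambda_{IM_1, IM_2}(x, y) \right|
```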
Abstract:
Performance-based design (PBD), in a deterministic approach, characterizes performance objectives with respect to the desired performance levels. The performance objectives are then associated with the established damage state and seismic hazard level. Despite this rational approach, its application remains difficult. Reliable tools for capturing the evolution, distribution, and quantification of damage are therefore needed. In addition, all phenomena related to nonlinearity (materials and deformations) must also be taken into account. This research thus shows how damage mechanics can contribute to solving this problem through an adaptation of the modified compression field theory and other complementary theories. The proposed formulation, adapted for monotonic, cyclic, and pushover-type loading, makes it possible to consider nonlinear shear effects coupled with flexural and axial load mechanisms. This formulation is specifically applied to the nonlinear analysis of concrete structural elements subjected to non-negligible shear effects. This new approach, implemented in EfiCoS (a finite element program based on damage mechanics), including the modelling criteria, is also presented here. Calibrations of this new approach, comparing predictions with experimental data, were carried out for reinforced concrete shear walls as well as for bridge beams and piers where shear effects must be taken into account. This new, improved version of the EfiCoS software proved capable of accurately evaluating the parameters associated with global performance, such as displacements, system strength, effects related to the cyclic response, and the quantification, evolution, and distribution of damage. Remarkable results were also obtained regarding the appropriate detection of engineering limit states such as cracking, unit strains, cover spalling, core crushing, local yielding of reinforcing bars, and system degradation, among others. As a practical tool for applying PBD, relationships between the predicted damage indices and the performance levels were obtained and expressed in the form of graphs and tables. These graphs were developed as a function of the relative displacement and the displacement ductility. A specific table was developed to relate engineering limit states, damage, relative displacement, and traditional performance levels. The results showed excellent agreement with the experimental data, making the proposed formulation and the new version of EfiCoS powerful tools for applying the PBD methodology in a deterministic approach.
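As background only (a simplified scalar form written in standard notation, not taken from the thesis), continuum damage mechanics degrades the elastic stiffness through a damage variable d:

```latex
% Scalar (isotropic) damage relation: d = 0 undamaged, d = 1 fully damaged.
\sigma = (1 - d)\, E_0\, \varepsilon, \qquad 0 \le d \le 1
```

Formulations of the kind implemented in EfiCoS typically employ separate damage variables in tension and compression and couple them with shear, flexural, and axial mechanisms; the scalar relation above is only the simplest illustrative case.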
Abstract:
A conventional method for the seismic strengthening of masonry walls is the external application of a reinforced concrete layer (shotcrete). However, due to the lack of analytical and experimental information on the behavior of strengthened walls, design procedures are usually based on empirical relations. Using these design procedures has resulted in massive strengthening details in retrofitting projects. This paper presents a computational framework for the nonlinear analysis of strengthened masonry walls, and its versatility is verified by comparing numerical and experimental results. Based on the developed numerical model and the available experimental information, design relations and failure modes are proposed for strengthened walls in accordance with the ASCE 41 standard. Finally, a sample masonry structure is strengthened using both the proposed and the available conventional methods. It is shown that using the proposed method results in lighter strengthening details and appropriate (ductile) failure modes.
Abstract:
The design of retaining walls subjected to seismic loads has traditionally been carried out with methods based on pseudo-analytical procedures such as the Mononobe-Okabe method, which has on certain occasions led to unsafe designs and to the failure of many retaining walls under earthquake action. A method is proposed in this paper for the design of retaining walls in different soils subjected to earthquake action, based on Performance-Based Seismic Design.
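For reference, a commonly quoted form of the Mononobe-Okabe active seismic earth thrust and pressure coefficient is given below in standard notation (not reproduced from the paper; sign conventions vary between texts):

```latex
% P_AE: active seismic thrust; gamma: backfill unit weight; H: wall height;
% phi: soil friction angle; delta: wall-soil friction angle; beta: inclination
% of the wall back from vertical; i: backfill slope; k_h, k_v: horizontal and
% vertical seismic coefficients.
P_{AE} = \tfrac{1}{2}\, K_{AE}\, \gamma\, H^{2}\, (1 - k_v), \qquad
\psi = \arctan\!\left(\frac{k_h}{1 - k_v}\right)

K_{AE} = \frac{\cos^{2}(\phi - \psi - \beta)}
{\cos\psi\, \cos^{2}\beta\, \cos(\delta + \beta + \psi)
\left[ 1 + \sqrt{\dfrac{\sin(\phi + \delta)\, \sin(\phi - \psi - i)}
{\cos(\delta + \beta + \psi)\, \cos(i - \beta)}} \right]^{2}}
```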
Abstract:
The design of retaining walls subjected to seismic loads has traditionally been carried out with methods based on pseudo-analytical procedures such as the Mononobe-Okabe method, which has on certain occasions led to unsafe designs and to the failure of many retaining walls under earthquake action. The recommendations derived from the Mononobe-Okabe theory have been included in numerous seismic design codes, and it is clear that these recommendations must be revised. An important review of the design methods for earthquake-resistant structures such as retaining walls located in highly seismic areas is currently under way, driven by the introduction in the early 1990s of the Displacement Response Spectrum (DRS) and the Capacity Demand Diagram (CDD), which represent an important change in the way the Elastic Response Spectrum (ERS) is presented. On the other hand, under earthquake action the dynamic characteristics of a soil have traditionally been described by the shear wave velocity that can develop at a site, together with the plasticity and damping characteristics of the soil. The principle of energy conservation explains why an upward-propagating shear wave can be amplified when travelling from a medium with a high shear wave velocity (rock) into a medium with a lower velocity (soil deposit), as occurred in the 1985 Mexico earthquake. This amplification is a function of the velocity gradient, or of the impedance contrast, at the boundary between the two media. A method is proposed in this paper for the design of retaining walls in different soils subjected to earthquake action, based on Performance-Based Seismic Design.
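A minimal illustration of the amplification mentioned above (a standard one-dimensional wave propagation result, not taken from the paper): for a vertically propagating shear wave crossing the rock-soil interface at normal incidence, the transmitted displacement amplitude is

```latex
% Displacement transmission coefficient at normal incidence (rock -> soil).
% Z_1: impedance of rock, Z_2: impedance of the soil deposit, beta: shear wave velocity.
\frac{u_t}{u_i} = \frac{2\, Z_1}{Z_1 + Z_2}, \qquad
Z_1 = \rho_{rock}\, \beta_{rock}, \quad Z_2 = \rho_{soil}\, \beta_{soil}
```

so a wave entering a low-impedance soil deposit from high-impedance rock arrives amplified, with further (resonant) amplification controlled by the deposit thickness and damping.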
Abstract:
Emergency management is one of the key aspects of day-to-day operating procedures on a highway. Efficiency of the overall response to an incident is paramount in reducing its consequences. However, highway operators' approach to incident management is still usually far from systematic and standardized. This paper addresses the issue, offers several hints on why this happens, and proposes how the situation could be overcome. A performance-based approach to general system specification is introduced and then applied to a particular road emergency management task. A real testbed was implemented to show the validity of the proposed approach. Ad-hoc sensors (one camera and one laser scanner) were deployed to acquire data, and advanced fusion techniques were applied at the processing stage to meet the specific user requirements in terms of functionality, flexibility, and accuracy.
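A minimal sketch of one basic step in fusing the two sensors mentioned above: projecting laser scanner points into the camera image with a pinhole model. The intrinsic and extrinsic parameters below are placeholders for illustration, not calibration values from the testbed.

```python
"""Illustrative camera-laser fusion step: project scanner points into the image.

Intrinsics K and extrinsics (R, t) are placeholder values; a real deployment
would obtain them from calibration.
"""
import numpy as np

K = np.array([[800.0, 0.0, 320.0],    # assumed focal lengths / principal point
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # assumed scanner-to-camera rotation
t = np.array([0.1, 0.0, 0.0])          # assumed translation (metres)

def project(points_scanner):
    """Project Nx3 scanner points (metres) into pixel coordinates."""
    cam = points_scanner @ R.T + t     # transform into the camera frame
    cam = cam[cam[:, 2] > 0]           # keep only points in front of the camera
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]    # perspective division

points = np.array([[1.0, 0.5, 8.0], [-0.5, 0.2, 12.0], [0.0, 0.0, -1.0]])
print(project(points))
```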
Abstract:
This thesis proposes a methodology for modelling business interoperability in the context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability, and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms, and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence of application of the two modelling tools depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the cooperative industrial network platform does not exist, the methodology suggests first applying the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then using Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and, based on the results achieved, decide whether it is necessary to redesign it or not. If a redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how these two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model, and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and/or to validate the proposed theoretical models, a case study of a Portuguese reverse logistics cooperative network (the Valorpneu network) and a case study of a Portuguese construction project (the Baixo Sabor Dam network) are presented. The findings of applying the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed for designing interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, in the design of configurations of interoperable cooperative industrial network platforms and/or in the analysis of the impact of business interoperability on the performance of their companies and of the networks in which their companies operate.
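For context, the core of the Axiomatic Design Theory referred to above can be summarized in standard textbook notation (not taken from the thesis) by the design equation mapping functional requirements (FRs) to design parameters (DPs) and by the information axiom:

```latex
% Design equation: the independence axiom requires [A] to be diagonal
% (uncoupled design) or triangular (decoupled design).
\{FR\} = [A]\,\{DP\}, \qquad A_{ij} = \frac{\partial FR_i}{\partial DP_j}

% Information axiom: among acceptable designs, minimize the information
% content, where p_i is the probability that DP_i satisfies FR_i.
I = \sum_i \log_2 \frac{1}{p_i}
```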
Abstract:
The goal in highway construction and operation has shifted from method-based specifications to specifications relating desired performance attributes to materials, mix designs, and construction methods. Shifting from method specifications to performance-based specifications can work as an incentive or disincentive for the contractor to improve performance or extend pavement life. This literature search was directed at a review of existing Portland cement concrete performance specification development and of the criteria that can effectively measure pavement performance. The criteria identified in the literature include concrete strength, slab thickness, air content, initial smoothness, water-cement ratio, unit weight, and slump. A description of each criterion, along with its advantages, disadvantages, and test methods, is provided. Also included are the results from a survey that was sent out to various state, federal, and trade agencies. The responses indicated that 53% currently use or are developing a performance-based specification program. Of the 47% of agencies that do not use a performance-based specification program, over 34% indicated that they would consider a similar program. The most commonly measured characteristics include thickness, strength, smoothness, and air content. Lastly, recommendations and conclusions are made regarding other factors that affect pavement performance, and a second phase of the research is proposed. The research team suggests that a regional expert task group be formed to identify performance levels and criteria. The results of that effort will guide the research team in the development of new or revised specifications.
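As an illustrative sketch only, an acceptance scheme of the kind discussed above can be prototyped around a percent-within-limits (PWL) quality measure and a pay factor. The specification limit, the normal-approximation of PWL, and the linear pay equation below are assumptions for illustration, not values from the reviewed specifications.

```python
"""Illustrative percent-within-limits (PWL) pay-factor sketch.

All limits and the pay equation are assumed values for illustration only;
real specifications define their own limits, lot sizes, and pay schedules.
"""
from statistics import mean, stdev
from math import erf, sqrt

def pwl_normal(samples, lower=None, upper=None):
    """Estimate percent within limits using a normal approximation."""
    m, s = mean(samples), stdev(samples)
    cdf = lambda x: 0.5 * (1.0 + erf((x - m) / (s * sqrt(2.0))))
    below_upper = cdf(upper) if upper is not None else 1.0
    below_lower = cdf(lower) if lower is not None else 0.0
    return 100.0 * (below_upper - below_lower)

def pay_factor(pwl):
    """Assumed linear pay equation (percent of bid price), for illustration."""
    return 55.0 + 0.5 * pwl

# Example lot: 28-day compressive strengths (MPa) with an assumed 30 MPa lower limit.
strengths = [31.2, 33.5, 29.8, 34.1, 32.0]
pwl = pwl_normal(strengths, lower=30.0)
print(f"PWL = {pwl:.1f}%, pay factor = {pay_factor(pwl):.1f}%")
```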
Abstract:
A ligand-based drug design study was performed on acetaminophen regioisomers as analgesic candidates, employing quantum chemical calculations at the DFT/B3LYP level of theory with the 6-31G* basis set. To do so, several molecular descriptors were used, such as the highest occupied molecular orbital energy, ionization potential, O-H bond dissociation energies, and spin densities, which may be related to quenching of the tyrosyl radical to give N-acetyl-p-benzosemiquinone-imine through an initial electron withdrawal or hydrogen atom abstraction. Based on this in silico work, the most promising molecule, orthobenzamol, was synthesized and tested. The results expected from the theoretical prediction were confirmed in vivo using mouse models of nociception such as the writhing, paw licking, and hot plate tests. All biological results suggested an antinociceptive activity mediated by opioid receptors. Furthermore, at 90 and 120 min, this new compound had an effect comparable to that of morphine, the standard drug for this test. Finally, the pharmacophore model is discussed according to the electronic properties derived from the quantum chemistry calculations.
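Two of the descriptors mentioned above can be written explicitly using standard definitions (not taken from the paper): the ionization potential estimated from the HOMO energy via Koopmans' theorem, and the homolytic O-H bond dissociation energy evaluated from the energies of the parent molecule and its radical at the chosen level of theory:

```latex
% Koopmans' theorem estimate of the ionization potential.
IP \approx -\,\varepsilon_{HOMO}

% Homolytic O-H bond dissociation energy (energies computed at, e.g., DFT/B3LYP/6-31G*).
BDE(\mathrm{O{-}H}) = E(\mathrm{ArO^{\bullet}}) + E(\mathrm{H^{\bullet}}) - E(\mathrm{ArO{-}H})
```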
Abstract:
In deterministic optimization, the uncertainties of the structural system (i.e., dimensions, model, material, loads, etc.) are not explicitly taken into account. Hence, the resulting optimal solutions may lead to reduced reliability levels. The objective of reliability-based design optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the First Order Reliability Method (FORM) is itself an optimization procedure, RBDO (in its classical version) is a double-loop strategy: the reliability analysis (inner loop) and the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of RBDO based on FORM, several authors propose decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially, until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to obtain a single optimization loop that solves the RBDO problem. This paper presents a review of such RBDO strategies. A comparison of the performance (computational cost) of the main strategies is presented for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
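In standard notation (not taken from this paper), the classical double-loop RBDO problem and the FORM inner loop discussed above can be written as:

```latex
% Outer loop: structural optimization with reliability constraints on the
% design variables d.
\min_{\mathbf{d}} \; f(\mathbf{d})
\quad \text{s.t.} \quad
\beta_i(\mathbf{d}) \ge \beta_i^{\,target}, \qquad i = 1, \dots, m

% Inner loop (FORM): the reliability index is the distance from the origin to
% the limit state surface G_i = 0 in the standard normal space of the random
% variables U.
\beta_i(\mathbf{d}) = \min_{\mathbf{u}} \; \lVert \mathbf{u} \rVert
\quad \text{s.t.} \quad
G_i(\mathbf{d}, \mathbf{u}) = 0
```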
Abstract:
The performance-based approach in earthquake engineering is a design methodology that explicitly takes the performance of the building into account among the design criteria. Among the second-generation PBEE (Performance-Based Earthquake Engineering) methods, the one proposed by PEER (Pacific Earthquake Engineering Research Center) is the most widespread. In it, the performance of the building under study is evaluated in quantitative terms according to the 3D's (dollars, deaths, downtime), quantities of great interest to the end user. The method consists of four steps, independent of each other until the final synthesis: hazard analysis, structural analysis, damage analysis, and loss analysis. The final result is the loss curve, which assigns to every possible economic loss resulting from the seismic event a probability of exceedance within the reference time frame. After presenting the PEER method, it was applied to a case study, namely a four-bay, multi-storey reinforced concrete plane frame designed according to the 1992 code provisions. For the hazard analysis, the hazard maps available on the INGV website were used, while the structural analysis was carried out with the open-source software OpenSees. The fragility and loss functions were developed with reference to the scientific literature, in particular fib Bulletin 68, "Probabilistic performance-based seismic design". Here the focus was solely on the estimation of economic losses, leaving aside the other two decision variables. At the end of the procedure, a sensitivity analysis was carried out to investigate which parameters most influence the loss curve. Given the hazard curve, the EDP(IM) relationship and the ultimate deformation at collapse turn out to be the most relevant to the result of the analysis.
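For reference, the loss curve mentioned above follows from the full PEER framing equation, which chains the four analysis steps; it is written here in standard notation (not reproduced from the thesis, and up to the usual sign and complementary-distribution conventions):

```latex
% Mean annual frequency of exceedance of a decision variable DV (here, the
% economic loss), obtained by chaining hazard (IM), structural response (EDP),
% damage (DM) and loss (DV) analyses.
\lambda(DV > v) = \iiint G(v \mid dm)\,
\left| \mathrm{d}G(dm \mid edp) \right|\,
\left| \mathrm{d}G(edp \mid im) \right|\,
\left| \mathrm{d}\lambda_{IM}(im) \right|
```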
Abstract:
This study investigates the degree to which textual complexity indices applied to students' online contributions, corroborated with a longitudinal analysis performed on their weekly posts, predict academic performance. The source of student writing consists of blog and microblog posts created in the context of a project-based learning scenario run on our eMUSE platform. Data are collected from six student cohorts, from six consecutive installments of the Web Applications Design course, comprising 343 students. A significant model was obtained by relying on the textual complexity and longitudinal analysis indices, applied to the English contributions of the 148 students who were actively involved in the undertaken projects.
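A minimal sketch of the kind of predictive model described above (feature names and data are placeholders, not the study's actual indices, platform exports, or model selection procedure):

```python
"""Illustrative sketch: predicting a course grade from textual complexity indices."""
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows: students; columns: hypothetical textual complexity indices computed on
# their blog/microblog posts (e.g., lexical diversity, mean sentence length,
# cohesion score) plus a simple longitudinal trend of weekly posting activity.
X = np.array([
    [0.62, 14.1, 0.55, 1.2],
    [0.48, 11.3, 0.40, 0.4],
    [0.71, 16.8, 0.63, 1.9],
    [0.55, 12.9, 0.47, 0.8],
    [0.66, 15.2, 0.58, 1.5],
    [0.51, 10.7, 0.42, 0.3],
])
y = np.array([8.5, 6.0, 9.2, 7.1, 8.8, 5.5])  # final grades (placeholder scale)

model = LinearRegression().fit(X, y)
print("R^2 on the toy data:", round(model.score(X, y), 3))
print("Index weights:", model.coef_)
```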
Abstract:
Much management accounting research focuses on design of incentive compensation contracts. A basic assumption in these contracts is that performance-based incentives improve employee performance. This paper reports on a field test of the multi-period incentive effects of a performance-based compensation plan on the sales of a retail establishment. Analysis of panel data for 15 retail outlets over 66 months indicates a sales increase when the plan is implemented, an effect that persists and increases over time. Sales gains are significantly lower in the peak selling season when more temporary workers are employed.
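A minimal sketch of how such a multi-period panel analysis might be set up (column names, the staggered adoption timing, and the generated data are placeholders, not the study's data or specification):

```python
"""Illustrative two-way fixed-effects sketch for outlet-month panel data."""
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for outlet in range(15):
    for month in range(66):
        plan = int(month >= 20 + outlet)  # assumed staggered adoption, illustration only
        sales = 100 + 2 * outlet + 0.3 * month + 8 * plan + rng.normal(0, 5)
        rows.append({"outlet": outlet, "month": month, "plan": plan, "sales": sales})
panel = pd.DataFrame(rows)

# OLS with outlet and month fixed effects; 'plan' captures the incentive-plan effect.
fit = smf.ols("sales ~ plan + C(outlet) + C(month)", data=panel).fit()
print("Estimated plan effect on sales:", round(fit.params["plan"], 2))
```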
Stability and simulation-based design of steel scaffolding without using the effective length method
Abstract:
A new approach is proposed, based on a methodology assisted by a tool, to create new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is centered on a component-based architecture; and it is based on generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy, with some specific characteristics. We propose a repository that keeps related information, such as models, applications, design information, generated artifacts, and even information concerning the development process itself (e.g., generation steps, tests, and integration milestones). Generically, this methodology receives the users' requirements for a new product (e.g., functional, non-functional, product specification) as its main inputs and produces a set of artifacts (e.g., design parts, process validation output) as its main output, which will be integrated into the engineering design tool (e.g., a CAD system), facilitating the work.
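A minimal sketch of the generative idea described above, in which a high-level product specification drives the generation of a design artifact (the specification schema and the generated output format are invented for illustration; they are not the methodology's actual models, repository layout, or CAD artifacts):

```python
"""Illustrative generative-programming sketch: emit a design artifact stub
from a high-level product specification."""
from dataclasses import dataclass
from typing import List

@dataclass
class ComponentSpec:
    name: str
    material: str
    mass_kg: float

def generate_artifact(product: str, components: List[ComponentSpec]) -> str:
    """Generate a simple textual design artifact from the specification."""
    lines = [f"product: {product}", "components:"]
    for c in components:
        lines.append(f"  - name: {c.name}")
        lines.append(f"    material: {c.material}")
        lines.append(f"    mass_kg: {c.mass_kg}")
    lines.append(f"total_mass_kg: {sum(c.mass_kg for c in components)}")
    return "\n".join(lines)

spec = [ComponentSpec("bracket", "aluminium", 0.42),
        ComponentSpec("housing", "ABS", 0.18)]
print(generate_artifact("door-module", spec))
```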