918 results for Trial and error
Abstract:
In bipolar disorders, there are unclear diagnostic boundaries with unipolar depression and schizophrenia, inconsistency of treatment guidelines, relatively long trial-and-error phases of treatment optimization, and increasing use of complex combination therapies lacking empirical evidence. These issues suggest that the current definition of bipolar disorders based on clinical symptoms reflects a clinically and etiologically heterogeneous entity. Stratification of treatments for bipolar disorders based on biomarkers and improved clinical markers is greatly needed to increase the efficacy of currently available treatments and improve the chances of developing novel therapeutic approaches. This review provides a theoretical framework to identify biomarkers and summarizes the most promising markers for stratification regarding beneficial and adverse treatment effects. State and stage specifiers, neuropsychological tests, neuroimaging, and genetic and epigenetic biomarkers will be discussed with respect to their ability to predict the response to specific pharmacological and psychosocial therapies for bipolar disorders. To date, the most reliable markers are derived from psychopathology and history-taking, while no biomarker has been found that reliably predicts individual treatment responses. This review underlines both the importance of clinical diagnostic skills and the need for biological research to identify markers that will allow the targeting of treatment specifically to sub-populations of bipolar patients who are more likely to benefit from a specific treatment and less likely to develop adverse reactions.
Abstract:
Background. Healthcare providers in pediatrics are faced with parents making medical decisions for their children. Refusal to consent to interventions can have life-threatening sequelae, yet healthcare workers receive little training in handling refusals. The healthcare provider's experience of parental refusal has not been well described, yet describing it is an important first step in addressing this problem. Specific aims. Describe: (1) the decision-making processes of healthcare providers when parents refuse medical interventions for their children, (2) the source of healthcare workers' skills in handling situations of refusal, and (3) the perspectives of healthcare workers on parental refusals in the inpatient setting. Methods. Nurses, physicians and respiratory therapists (RT) were recruited via e-mail at Texas Children's Hospital (TCH). Interview questions were developed using Social Cognitive Theory constructs and validated. One-on-one, in-depth, one-hour semi-structured interviews were held at TCH, audio recorded and transcribed. Coding and analysis were done using ATLAS.ti. The constant comparative method was applied to describe emergent themes, which were reviewed by an independent expert. Results. Interviews were conducted with nurses (n=6), physicians and practitioners (n=6), social workers (n=3) and RTs (n=3), comprising 13 females and 5 males with 3–25 years of experience. Decision-making processes relate to the experience of the caregiver, familiarity with the family, and the acuity of the patient. Healthcare workers' skills were obtained through orientation processes or by trial and error. Themes emerged relating to the importance of: (1) communication, where the initial discussion about a medical procedure should be conducted with clarity and an understanding of the parents' views; (2) perceived loss of control by parents, a key factor in their refusal of interventions; and (3) training, the need for skill development to handle refusals. Conclusions. Effective training involving clarity in communication and preservation of parents' perceived control is needed to replace the current trial-and-error experience of healthcare workers in negotiating refusal situations. Such training could lessen the more serious outcomes of parental refusal.
Abstract:
Many macroscopic properties (hardness, corrosion, catalytic activity, etc.) are directly related to the surface structure, that is, to the positions and chemical identities of the outermost atoms of the material. Current experimental techniques for its determination produce a “signature” from which the structure must be inferred by solving an inverse problem: a solution is proposed, its corresponding signature is computed and then compared to the experiment. This is a challenging optimization problem where the search space and the number of local minima grow exponentially with the number of atoms, hence its solution cannot be achieved for arbitrarily large structures. Nowadays it is solved using a mixture of human knowledge and local search techniques: an expert proposes a solution that is refined using a local minimizer. If the outcome does not fit the experiment, a new solution must be proposed. Solving a small surface can take from days to weeks with this trial-and-error method. Here we describe our ongoing work on its solution. We use a hybrid algorithm that mixes evolutionary techniques with trust-region methods and reuses knowledge gained during the execution to avoid repeated searches of the same structures. Its parallelization produces good results even without requiring gathering of the full population, hence it can be used in loosely coupled environments such as grids. With this algorithm, the solution of test cases that previously took weeks of expert time can be obtained automatically in a day or two of uniprocessor time.
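The abstract above outlines a hybrid scheme: an evolutionary search whose candidates are refined with trust-region local minimization, plus reuse of previously refined candidates. Below is a minimal Python sketch of that general idea, not the authors' code; the misfit function, the rounding-based cache key and all parameter values are illustrative stand-ins for the real surface-signature comparison.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def misfit(x):
    # Toy stand-in for the discrepancy between the computed and the
    # experimental surface "signature"; the real objective is far costlier.
    return np.sum((x**2 - 1.0)**2) + 0.1 * np.sum(np.sin(5.0 * x)**2)

def local_refine(x0):
    # Trust-region local refinement of a candidate proposed by the
    # evolutionary layer (here scipy's generic trust-constr minimizer).
    res = minimize(misfit, x0, method="trust-constr")
    return res.x, res.fun

def hybrid_search(dim=6, pop_size=20, generations=15, sigma=0.3):
    cache = {}                                   # reuse already-refined candidates
    pop = rng.uniform(-2.0, 2.0, size=(pop_size, dim))
    best_x, best_f = None, np.inf
    for _ in range(generations):
        scored = []
        for x in pop:
            key = tuple(np.round(x, 2))          # skip near-duplicate starting points
            if key not in cache:
                cache[key] = local_refine(x)
            xr, fr = cache[key]
            scored.append((fr, xr))
            if fr < best_f:
                best_f, best_x = fr, xr
        scored.sort(key=lambda t: t[0])          # keep the fittest refined candidates
        parents = np.array([x for _, x in scored[:pop_size // 2]])
        children = parents + rng.normal(0.0, sigma, size=parents.shape)  # mutation
        pop = np.vstack([parents, children])
    return best_x, best_f

best_x, best_f = hybrid_search()
print("best misfit found:", round(best_f, 6))
```

Caching refined candidates under a rounded key is one simple way to avoid repeating searches of structures already visited, in the spirit described above; a real implementation would key on the surface structures themselves.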
Abstract:
The arrangement of atoms at the surface of a solid accounts for many of its properties: hardness, chemical activity, corrosion, etc. are dictated by the precise surface structure. Hence, finding it has a broad range of technical and industrial applications. The ability to solve this problem opens the possibility of designing, by computer, materials with properties tailored to specific applications. Since the search space grows exponentially with the number of atoms, the solution cannot be achieved for arbitrarily large structures. Presently, a trial-and-error procedure is used: an expert proposes a structure as a candidate solution and runs a local optimization procedure on it. The solution relaxes to the local minimum in the attractor basin of the initial point, which may or may not be the basin of the global minimum. This procedure is very time consuming and, for reasonably sized surfaces, can take many iterations and much effort from the expert. Here we report on a visualization environment designed to steer this process in an attempt to solve bigger structures and reduce the time needed. The idea is to use an immersive environment to interact with the computation. It provides immediate feedback to assess the quality of the proposed structure in order to let the expert explore the space of candidate solutions. The visualization environment is also able to communicate with the de facto local solver used for this problem. The user is then able to send trial structures to the local minimizer and track their progress as they approach the minimum. This allows for simultaneous testing of candidate structures. The system has also proved very useful as an educational tool for the field.
Abstract:
In engineering design and development, before beginning the construction and implementation of a project's objectives, a series of preliminary analyses and simulations must be carried out to corroborate the expectations of the initial hypothesis, in order to obtain an empirical reference that satisfies the working or operating conditions of the project's objectives. Often, the results that satisfy the desired characteristics are obtained by iterating trial-and-error methods. Generally, these methods use the same analysis procedure while varying a set of parameters that allow a technology to be adapted to the desired purpose. Nowadays, powerful computers and mathematical solution algorithms are available that can solve different types of computational problems quickly and efficiently. It is therefore worthwhile to develop applications that solve these problems rapidly and accurately for the analysis and synthesis of engineering solutions, especially when dealing with similar expressions that differ only in their constants, since solution routines can be written that accept the parameters defining each problem. Moreover, by implementing code based on the theoretical foundations of a technology, a program valid for studying any problem related to that technology can be obtained. This project implements the first phase of the optical-device simulator Slabsim, which represents the energy distribution of an electromagnetic wave at optical frequencies guided through a planar dielectric waveguide, also known as a slab. The simulator is built around a graphical interface created with Matlab GUIDE, the graphical user interface development environment from Mathworks©, so that it is simple and intuitive to run simulations even for users with little knowledge of the underlying theory of these structures. In this way, engineers need less time to find a solution that satisfies the requirements of a project involving planar dielectric waveguides, and the tool can be used for a wide variety of purposes based on this technology. One of the main objectives of this project is to solve the theoretical equations of slab waveguides using computational numerical methods, whose procedures can be extrapolated to other mathematical problems and give the author a solid conceptual grounding in them. For this reason, the differential and characteristic equations that constitute the problems of these structures are solved by these numerical means in the core of the application, since in some cases no useful analytical expressions exist. ABSTRACT. The first step in engineering design and development is an analysis and simulation process that corroborates the initial hypothesis and finds solutions for a particular aim. In this way, it is possible to obtain empirical evidence that suitably substantiates the purposes of the project. Commonly, the characteristics needed to reach a particular target are found through iterative trial-and-error methods.
These methods are based on the same theoretical analysis but vary some parameters, with the objective of adapting the results to a particular aim. At present, powerful computers and mathematical algorithms are available to solve different kinds of calculation problems in a fast and efficient way. Developing computing applications is useful, as it gives highly accurate results for engineering analysis and synthesis in short periods of time. This is most notable in cases where the mathematical expressions of a theoretical base are similar except for small variations in constant values, because the program code can easily be adapted into a parameter-request system that defines a particular solution on each execution. Additionally, it is possible to code an application suitable for simulating any problem related to the technology under study. The aim of the present project is the construction of the first stage of an optoelectronics simulator named Slabsim. Slabsim is capable of representing the energy distribution of a light wave guided in the volume of a slab waveguide. The simulator is built with the graphical user interface development environment Matlab GUIDE, property of Mathworks©. It is designed for easy and intuitive use, allowing simulations to be run with little knowledge of the theoretical bases of the technology. With this software the user can achieve a variety of aims related to slab waveguides in a short time. One of the main purposes of this project is to solve the theoretical equations of slab structures through computational numerical analysis, since these methods can be adapted to other mathematical problems and provide a strong understanding of the process. Based on these advantages, numerical solution methods are used in the core of the simulator to obtain the results of the differential and characteristic equations, which are then represented in it.
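As an illustration of the kind of characteristic-equation solving described above, here is a minimal Python sketch (not Slabsim, which is a Matlab GUIDE application) that finds the effective index of the fundamental even TE mode of a symmetric slab by numerical root finding; the refractive indices, thickness and wavelength are assumed, illustrative values.

```python
import numpy as np
from scipy.optimize import brentq

def te0_effective_index(n_core, n_clad, thickness, wavelength):
    """Fundamental even TE mode of a symmetric slab via numerical root finding.

    Normalized variables: u = kappa*d/2 (transverse phase in the core),
    w = gamma*d/2 (decay in the cladding), with u**2 + w**2 = V**2 and the
    even-mode characteristic equation w = u * tan(u).
    """
    k0 = 2.0 * np.pi / wavelength
    V = 0.5 * k0 * thickness * np.sqrt(n_core**2 - n_clad**2)

    def dispersion(u):
        return u * np.tan(u) - np.sqrt(V**2 - u**2)

    # The TE0 root always lies below both V and pi/2, so this bracket is safe.
    upper = min(V, np.pi / 2.0) - 1e-9
    u = brentq(dispersion, 1e-9, upper)
    kappa = 2.0 * u / thickness
    return float(np.sqrt(n_core**2 - (kappa / k0)**2))

# Illustrative values only: a 1 um high-index core in silica-like cladding at 1550 nm.
print(te0_effective_index(n_core=2.0, n_clad=1.45, thickness=1.0e-6, wavelength=1.55e-6))
```

Higher-order and TM modes lead to analogous transcendental equations, which is why a numerical root finder rather than a closed-form expression is the natural core of such a simulator.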
Abstract:
The aim of this project is to define the design and execution of typical blasts, both contour and production blasts, carried out at the Aguablanca mine (Badajoz). The theoretical design of the various parameters required to execute the blasts (drilling pattern, powder factor and explosive quantity) was carried out following the methodology of several drilling-and-blasting manuals and was then adjusted to the real needs of the project in order to improve fragmentation, displacement, swelling and flyrock results. The values initially obtained from the theoretical design calculations for the different types of blast did not lead to optimal results, so some of the above parameters had to be modified. The blasting technician must be able to apply day-to-day variations that improve on the results initially obtained from the theoretical calculation. ABSTRACT. The project presents the design and implementation of contour and production blasts at the Aguablanca mine (Badajoz, Spain). The initial design of the blasts, including drilling pattern, powder factor and explosive charging pattern, was done following well-known drilling-and-blasting calculation methods. As the initial theoretical values can lead to non-optimal results, the blast design was modified by trial-and-error tests to achieve the desired rock fragmentation and swelling and to minimize flyrock. A good blasting technician must be able to adapt and modify, every day if needed, theoretical methodologies in order to meet mining production necessities.
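As a hedged illustration of the blast-design bookkeeping mentioned above (burden-and-spacing pattern and powder factor), the short Python sketch below computes a powder factor from an assumed charge per hole and pattern geometry; the figures are invented and are not Aguablanca data.

```python
def powder_factor(charge_per_hole_kg, burden_m, spacing_m, bench_height_m):
    """Powder factor in kg of explosive per cubic metre of rock broken per hole."""
    rock_volume_m3 = burden_m * spacing_m * bench_height_m
    return charge_per_hole_kg / rock_volume_m3

# Invented example: 90 kg per hole on a 3.5 m x 4.0 m pattern with a 10 m bench.
print(round(powder_factor(90.0, 3.5, 4.0, 10.0), 3), "kg/m^3")
```

Adjusting the pattern or the charge to hit a target powder factor is exactly the kind of parameter change the trial-and-error adjustments described above revolve around.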
Abstract:
Matter presents itself to us in a multiplicity of forms or "appearances." Throughout history, the relationship between matter and form has been reflected upon in different fields, from philosophy to science, by way of art and architecture, moving among disciplines from the most practical to the most artistic and generating opposing positions such as materialism versus idealism. The concept of matter has in turn changed as human consciousness and science have evolved, from regarding matter as an entity with "mass" to empty matter, where "mass" is an illusion that "appears" to us depending on the frequency at which its energy system vibrates. From the concept of "matière," Josef Albers developed his teaching methodology. Matière is more than the look, the "appearance" that goes beyond the crystallized form; it is the changing form that matter can take when it is transformed by human beings, who leave their mark on it. The three qualities of "matière" that Professor Albers proposed in his Preliminary Course exercises to develop "vision" with matière, from the Bauhaus to Yale University, are: structural, factural and textural. By developing observation with these three references in mind, the separation between the material and its appearance is discovered through honesty: "the discrepancy between physical fact and psychic effect." In a constant process of trial and error, individual sensitivity toward the material develops, along with evaluation and critique, thanks to the dynamics of the workshop, which allows one, by comparison, to learn and evolve as an individual within a society. This inductive methodology, regulated by the economy of resources, promotes creative thinking, fundamental for producing, through articulation, a new language that expresses, through visual formulation, our relationship with the world and with life. Life constantly flows and oscillates between two opposite poles, generating interrelationships that weave the world; these interactions are what give life to Albers' artistic work. PALABRAS CLAVE (translated): matter and matière; structural, factural and textural; vision; physical fact and psychic effect; creative thinking; life. The matter stands before us in multiple ways or "appearances." Throughout history the relationship between matter and form has been thought through in different fields, from philosophy to science, including art and architecture, moving between disciplines from the most practical to the most artistic and generating opposing positions, such as materialism versus idealism. The concept of matter has in turn changed as human consciousness and science have evolved, from matter being considered an entity with "mass" to empty matter, where "mass" is an illusion that "appears" to us depending on the frequency at which its energy system vibrates. Using the concept of "matière," Josef Albers developed his teaching methodology. The matière is more than the look, the "appearance" that goes beyond the crystallized form. It is the changing form that matter may take when it is transformed by humans, who leave their mark on it. The three qualities of "matière" that Professor Albers proposed in his Preliminary Course exercises to develop "vision" with the "matière," from the Bauhaus to Yale, are: structural, factural and textural.
By developing observation with these three references in mind, the separation between the material and its appearance is discovered through honesty. "The discrepancy between physical fact and psychic effect." In an ongoing process of trial and error, individual sensitivity toward the material and critical evaluation develop through the dynamics of the workshop, which allows one, by comparison, to learn and evolve as an individual within a society. This inductive methodology, regulated by the economy of resources, promotes creative thinking, essential for producing through articulation a new language that, through visual formulation, expresses our relationship with the world, with life. Life, constantly flowing, oscillates between two opposite poles, creating relationships that weave the world. These interactions are what give life to the artistic work of Albers. KEYWORDS: matter and matière; structural, factural and textural; vision; physical fact and psychic effect; creative thinking; life.
Abstract:
Surface-wave methods, with emphasis on Rayleigh waves, form the core of this doctoral work. Initially, Rayleigh waves were modeled, allowing a sensitivity study of their dispersion curves under different configurations of physical parameters representing various layered models, in which parameters with greater and lesser sensitivity could be observed, as well as some effects caused by low Poisson's ratios. In addition, in the data inversion phase, the Rayleigh-wave modeling was used to build the objective function, which, combined with the least-squares method through the Levenberg-Marquardt algorithm, allowed the implementation of a local-search algorithm responsible for inverting the surface-wave data. Because this is a local-search procedure, the inversion algorithm was complemented by a pre-inversion stage that generates an initial model so that the inversion procedure would be faster and more efficient. Aiming at even greater efficiency of the inversion procedure, especially for layered models with velocity inversions, a post-inversion algorithm was implemented, based on a trial-and-error procedure that minimizes the relative root-mean-square error (rRMSE) of the data inversion. More than 50 layered models were used to test the modeling, pre-inversion, inversion and post-inversion of the data, allowing precise adjustment of the mathematical and physical parameters present in the various scripts implemented in Matlab. Before inverting the field data, they had to be treated in the data-processing stage, whose main objective is to extract the dispersion curve produced by the surface waves. To this end, three processing methodologies with distinct mathematical approaches were implemented, also in Matlab. These methodologies were tested and evaluated with synthetic and real data, making it possible to verify the strengths and weaknesses of each methodology studied, as well as the limitations caused by the discretization of the field data. Finally, the processing, pre-inversion, inversion and post-inversion stages were unified into a single program for treating surface-wave (Rayleigh) data. It was applied to real data from the study of a geological problem in the Taubaté Basin, where it was possible to map the geological contacts along the seismic acquisition points and compare them to an existing initial model based on geomorphological observations of the study area, the geological map of the region, and global and local geological information on tectonic movements in the region. The geophysical information, combined with the geological information, allowed the generation of an analytical profile of the study region with two geological interpretations, confirming the suspicion of neotectonics in the region, where the geological contacts between the Tertiary and Quaternary deposits were identified and fit the initial model of a half-graben dipping to the southeast.
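To make the inversion step concrete, here is a minimal Python sketch of a Levenberg-Marquardt (damped least-squares) fit of model parameters to a dispersion curve, in the spirit of the local-search inversion described above. The forward model and all numbers are toy stand-ins, not the thesis' Rayleigh-wave modeling (which was implemented in Matlab); the RMSE printed at the end is the kind of misfit a post-inversion trial-and-error stage could minimize.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
freqs = np.linspace(5.0, 50.0, 30)                  # Hz

def forward(params, f):
    # Toy phase-velocity curve: fast half-space, slow shallow layer, crossover f0.
    v_deep, v_shallow, f0 = params
    return v_shallow + (v_deep - v_shallow) / (1.0 + (f / f0)**2)

true_params = np.array([800.0, 250.0, 12.0])        # m/s, m/s, Hz (invented)
observed = forward(true_params, freqs) + rng.normal(0.0, 5.0, freqs.size)

def residuals(params):
    return forward(params, freqs) - observed

initial_model = np.array([600.0, 300.0, 20.0])      # the "pre-inversion" starting model
fit = least_squares(residuals, initial_model, method="lm")   # Levenberg-Marquardt

rmse = np.sqrt(np.mean(fit.fun**2))                 # misfit to be tracked after inversion
print("estimated parameters:", np.round(fit.x, 1), "RMSE:", round(float(rmse), 2))
```

Because this is a local search, the quality of the starting model matters, which is exactly why the thesis adds pre-inversion and post-inversion stages around the core fit.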
Abstract:
Introduction: The prevalence of chronic diseases, especially among the elderly population, confronts us with the need for longitudinal models of care. Individuals are currently being given ever more responsibility for managing their own health through the use of monitoring devices such as the glucometer and the blood pressure monitor. This new reality culminates in decision-making at home. Objectives: To identify older adults' decision-making in the home monitoring of chronic conditions; to identify whether the variables sex, education and income influence decision-making; to identify older adults' perceptions of home care actions; and to identify the difficulties and strategies in handling the monitoring devices. Materials and methods: A quantitative, exploratory, cross-sectional study. Sample: 150 subjects aged 60 years or older, without cognitive impairment or depression, who use a glucometer and/or a blood pressure monitor at home. Instruments for participant selection: (1) the Mini-Mental State Examination; (2) the Geriatric Depression Scale; and (3) the Lawton and Brody Instrumental Activities of Daily Living Scale. Data collection: carried out in the city of Ribeirão Preto - SP between September 2014 and October 2015. Instruments: (1) a socioeconomic questionnaire; (2) a questionnaire on decision-making in home health monitoring; and (3) a classification of the use of electronic devices for health care. Data analysis: Descriptive statistics and absolute and percentage quantifications were used to identify the relationship between decision-making and sex, education and income. Results: 150 older adults participated, 117 women and 33 men, with a mean age of 72 years. Of these, 113 are hypertensive and 62 are diabetic. Regarding immediate decision-making, most users of both the blood pressure monitor (n=128) and the glucometer (n=62) report seeking medical help, followed by administering the prescribed medication and alternative treatment options. In the medium term, seeking professional help stands out for most of the older adults in both groups. A small difference in decision-making was noted with respect to sex. Regarding education, older adults with more years of schooling tend to seek health services more than those with less schooling. Income showed no influence among glucometer users, whereas among blood pressure monitor users, higher-income older adults tend to seek health services more. Most participants view home health monitoring positively, mainly because of the convenience of not leaving home, the quick results and the possibility of continuous disease control. The main difficulties in handling the glucometer relate to the use of the lancet and test strip, followed by checking the stored results. Difficulties in using the blood pressure monitor relate to checking the result after each measurement and to correct body positioning during monitoring. In both groups, the strategies used are asking for help from others and trial and error. Conclusion: Older adults have shown themselves to be in favor of home health monitoring.
In general, they immediately decide on actions within the home to control symptoms, which reinforces the need to invest in quality information and health education so that home management can become a component of comprehensive care in the treatment of chronic conditions.
Abstract:
Evacuation route planning is a fundamental task for building engineering projects. Safety regulations are established so that all occupants are led out of a building to a secure place in time when faced with an emergency situation. For example, the Spanish building code requires evacuation routes to be planned for large and, usually, public buildings. Engineers often plan these routes on single building projects, repeatedly assigning clusters of rooms to each emergency exit in a trial-and-error process. But problems may arise in a building complex, where changes in layout and use make visual analysis cumbersome and sometimes unfeasible. This problem could be solved by using well-known spatial analysis techniques, implemented as specialized software able to partially emulate the engineer's reasoning. In this paper we propose and test an easily reproducible methodology that makes use of free and open source software components for solving a case study. We ran a complete test on a building floor at the University of Alicante (Spain). This institution offers a web service (WFS) that allows retrieval of 2D geometries from any building within its campus. We demonstrate how geospatial technologies and computational geometry algorithms can be used for automating the creation and optimization of evacuation routes. In our case study, the engineers' task is to verify that the occupant load assigned to each emergency exit does not exceed the capacity specified by Spain's current regulations. Using Dijkstra's algorithm, we obtain the shortest paths from every room to the most appropriate emergency exit. Once these paths are calculated, engineers can run simulations and validate, based on path statistics, different cluster configurations. Techniques and tools applied in this research would be helpful in the design and risk management phases of any complex building project.
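The routing step described above can be illustrated with a minimal Python sketch: Dijkstra's algorithm assigns each room to its nearest emergency exit, and the occupant load per exit is summed for comparison against regulatory capacities. The floor graph, room occupancies and exits below are invented and are not the University of Alicante data.

```python
import heapq
from collections import defaultdict

edges = [                      # (node_a, node_b, corridor length in metres), invented layout
    ("room1", "hall", 8), ("room2", "hall", 6), ("room3", "hallB", 7),
    ("hall", "hallB", 10), ("hall", "exitA", 12), ("hallB", "exitB", 5),
]
occupants = {"room1": 30, "room2": 25, "room3": 40}   # people per room (invented)
exits = {"exitA", "exitB"}

graph = defaultdict(list)
for a, b, w in edges:
    graph[a].append((b, w))
    graph[b].append((a, w))

def dijkstra(source):
    """Shortest walking distance from `source` to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist

# Assign each room to its nearest exit and accumulate the load per exit.
load = defaultdict(int)
for room, people in occupants.items():
    dist = dijkstra(room)
    nearest_exit = min(exits, key=lambda e: dist.get(e, float("inf")))
    load[nearest_exit] += people

print(dict(load))   # to be checked against the exit capacities set by the regulations
```

With these made-up numbers the two hall-side rooms route to exitA (55 people) and the third room to exitB (40 people); an engineer would then compare such per-exit loads against the regulatory limits and, if needed, try a different room-to-exit clustering.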
Abstract:
The term gamification is in fashion. Gurus position it as an emerging, disruptive technology that will change many of our experiences in fields as far removed from games as business, marketing and customer relations. And the educational environment will not escape it. In this article we present the experience of a group of teachers concerned with teaching who have spent years experimenting with video games and playful experiences, and who have suddenly come across the term gamification. These are the lessons we have learned, which can be framed within the field of gamification in education but which derive from practical experience, detailed analysis and conscientious reflection. We aim to show what really matters and which points we teachers should take into account before launching into a gamified design of our teaching proposals.
Abstract:
In recent years much has been accomplished to make the EMU more resilient to banking crises, sovereign-debt crises or balance-of-payment crises. Several ‘backstops’ or financial safety nets were progressively put in place to absorb the shocks that could have otherwise broken the EMU as a system. These substantial advances reflected a gradual, trial-and-error approach rather than a grand design that would have completely overhauled the EMU architecture. While flexibility and realism have advantages, complacency is a clear risk. With no roadmap to follow, efforts to complete the architecture of the EMU may fade with time. Maintaining a sense of direction is crucial while potential vulnerabilities remain.
Abstract:
Chromosome bi-orientation at the metaphase spindle is essential for precise segregation of the genetic material. The process is error-prone, and error-correction mechanisms exist to switch misaligned chromosomes to the correct, bi-oriented configuration. Here, we analyze several possible dynamical scenarios to explore how cells might achieve correct bi-orientation in an efficient and robust manner. We first illustrate that tension-mediated feedback between the sister kinetochores can give rise to a bistable switch, which allows robust distinction between a loose attachment with low tension and a strong attachment with high tension. However, this mechanism has difficulties in explaining how bi-orientation is initiated starting from unattached kinetochores. We propose four possible mechanisms to overcome this problem (exploiting molecular noise; allowing an efficient attachment of kinetochores already in the absence of tension; a trial-and-error oscillation; and a stochastic bistable switch), and assess their impact on the bi-orientation process. Based on our results and supported by experimental data, we put forward a trial-and-error oscillation and a stochastic bistable switch as two elegant mechanisms with the potential to promote bi-orientation both efficiently and robustly.
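As a toy illustration of the tension-mediated bistable switch discussed above, the Python sketch below integrates a one-variable model in which attachment strength feeds back positively on itself (a generic stand-in for tension-dependent stabilization), so loose and strong initial attachments settle into two distinct stable states; the equations and parameters are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def attachment_dynamics(t, a, basal=0.05, vmax=1.0, K=0.5, n=4, decay=1.0):
    # Positive feedback: stronger attachment -> more tension -> further stabilization.
    feedback = vmax * a**n / (K**n + a**n)
    return basal + feedback - decay * a

for a0 in (0.1, 0.6):                     # a loose vs. a fairly strong initial attachment
    sol = solve_ivp(attachment_dynamics, (0.0, 50.0), [a0])
    print(f"initial attachment {a0:.1f} -> steady state {sol.y[0, -1]:.2f}")
```

With these invented parameters the two runs settle at clearly separated steady states (roughly 0.05 and 0.99), the qualitative behavior of a bistable switch distinguishing loose from strong attachments; it also makes visible the initiation problem raised above, since a system starting near zero attachment stays in the low state unless noise, tension-independent attachment, or an oscillatory mechanism pushes it over the threshold.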
Abstract:
All the structures designed by engineers are vulnerable to natural disasters, including floods and earthquakes. The energy released during strong ground motions should be dissipated by structural elements. Before the 1990s, this energy was expected to be dissipated through the beams and columns, which were at the same time part of the gravity-load-resisting system. However, the main disadvantage of this approach was that the gravity-resisting frame was not repairable. Hence, during the 1990s, the idea of designing passive energy dissipation systems, including dampers, emerged. At the beginning, the main problem was the lack of guidelines for passive energy dissipation systems. Although many guidelines and procedures were published by 2000, most of them were based on complicated analyses that were not convenient for engineers and practitioners. To address this problem, several alternative design methods have recently been proposed, including: (1) the simple procedure for optimal damper configuration in MDOF structures of Lopez Garcia (2001); (2) the trial-and-error procedure of Christopoulos and Filiatrault (2006); (3) the Five-Step Method of Silvestri et al. (2010); (4) the Direct Five-Step Method of Palermo et al. (2015); and (5) the Simplified Equivalent Static Analysis (ESA) of Palermo et al. (2016). In this study, the effectiveness of and differences between the last three of these methods are evaluated.
Abstract:
Neutrality and nationalism.- The new order and the old.- Trial and error.- The changing League.