966 results for global behavior


Relevance:

60.00%

Publisher:

Abstract:

Seismic data is difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 up to 2011 is analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, clustering analysis is used to generate visualization maps, providing an intuitive and useful representation of the complex relationships present in the seismic data. Such relationships might not be perceived on classical geographic maps. Therefore, the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
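
A minimal sketch (not the authors' code) of the two similarity measures described in this abstract: a maximum-likelihood Gutenberg-Richter b-value per region and the mutual information between the binned activity of two regions. The data, the binning and the completeness magnitude are illustrative assumptions.

```python
# Hedged sketch: Gutenberg-Richter b-value per region and mutual information
# between the monthly activity of two regions. `mags_a`, `counts_a`, `counts_b`
# are synthetic, hypothetical data.
import numpy as np

def gr_b_value(magnitudes, completeness_mag):
    """Maximum-likelihood (Aki) estimate of the Gutenberg-Richter b-value."""
    m = np.asarray(magnitudes)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (m.mean() - completeness_mag)

def mutual_information(x, y, bins=10):
    """Mutual information (in bits) between two discretized series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
mags_a = 4.0 + rng.exponential(1 / np.log(10), size=5000)   # synthetic region, b ~ 1.0
counts_a = rng.poisson(20, size=600)             # monthly event counts, region A
counts_b = counts_a + rng.poisson(5, size=600)   # correlated activity, region B
print(gr_b_value(mags_a, completeness_mag=4.0))  # ~1.0
print(mutual_information(counts_a, counts_b))    # similarity measure between A and B
```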

Relevance:

60.00%

Publisher:

Abstract:

Earthquakes are associated with negative events, such as a large number of casualties, destruction of buildings and infrastructure, or the emergence of tsunamis. In this paper, we apply Multidimensional Scaling (MDS) analysis to earthquake data. MDS is a set of techniques that produce spatial or geometric representations of complex objects, such that objects perceived as similar/distinct in some sense are placed close to/far from each other on the MDS maps. The interpretation of the charts is based on the resulting clusters, since MDS produces a different locus for each similarity measure. In this study, over three million seismic occurrences, covering the period from January 1, 1904 up to March 14, 2012, are analyzed. The events, characterized by their magnitude and spatiotemporal distributions, are divided into groups, either according to the Flinn-Engdahl seismic regions of Earth or using a rectangular grid based on latitude and longitude coordinates. Space-time and space-frequency correlation indices are proposed to quantify the similarities among events. MDS has the advantage of avoiding sensitivity to the non-uniform spatial distribution of seismic data, resulting from poorly instrumented areas, and is well suited for assessing the dynamics of complex systems. MDS maps prove to be an intuitive and useful visual representation of the complex relationships that are present among seismic events, which may not be perceived on traditional geographic maps. Therefore, MDS constitutes a valid alternative to classic visualization tools for understanding the global behavior of earthquakes.
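
A minimal sketch, not taken from the paper, of how a dissimilarity matrix between seismic regions can be projected onto a 2-D MDS map with scikit-learn; the random matrix stands in for the space-time/space-frequency correlation indices proposed in the study.

```python
# Hedged sketch: 2-D MDS map from a precomputed dissimilarity matrix between
# regions. `D` is random and purely illustrative.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
n_regions = 50
sim = rng.uniform(0.0, 1.0, size=(n_regions, n_regions))
sim = (sim + sim.T) / 2            # symmetric similarity in [0, 1]
np.fill_diagonal(sim, 1.0)
D = 1.0 - sim                      # dissimilarity: similar regions -> small distance

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)      # one 2-D point per region
print(coords.shape)                # (50, 2); plotting these points gives the MDS map
```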

Relevance:

60.00%

Publisher:

Abstract:

Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During the recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed in respect of its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process where the solution candidates are distributed programs. The objective functions rate how close these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. This way, a description of the global behavior of a distributed system is translated automatically to programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways for representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations called Rule-based Genetic Programming (RBGP, eRBGP) designed by us. We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, the distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches have been developed especially in order to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and in most cases, was superior to the other representations.
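
A minimal, heavily simplified sketch (not the thesis' framework) of the evolutionary loop described above: tournament selection, crossover and mutation over abstract "programs", with a placeholder fitness function standing in for the randomized network simulations. All names and the token representation are illustrative assumptions.

```python
# Hedged sketch of the evolutionary loop only. A "program" is abstracted as a
# fixed-length list of integer tokens; `simulate_network` is a hypothetical
# stand-in for the randomized network simulations that score how closely the
# evolved behavior matches the specified global behavior.
import random

PROGRAM_LEN, POP_SIZE, GENERATIONS = 16, 60, 40
TARGET = [7] * PROGRAM_LEN          # dummy stand-in for the specified behavior

def simulate_network(program):
    """Placeholder objective: lower is better (distance to the target behavior)."""
    return sum(abs(g - t) for g, t in zip(program, TARGET))

def tournament(pop, k=3):
    return min(random.sample(pop, k), key=simulate_network)

def crossover(a, b):
    cut = random.randrange(1, PROGRAM_LEN)
    return a[:cut] + b[cut:]

def mutate(program, rate=0.1):
    return [random.randrange(16) if random.random() < rate else g for g in program]

pop = [[random.randrange(16) for _ in range(PROGRAM_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP_SIZE)]
best = min(pop, key=simulate_network)
print("best objective value:", simulate_network(best))
```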

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

Currently, large-scale systems with a high degree of complexity attract great interest and have been much discussed in the scientific community across various areas of knowledge; examples include the Internet, protein interaction networks and the collaboration network of film actors, among others. To better understand the behavior of interconnected systems, several models in the area of complex networks have been proposed. Barabási and Albert proposed a model in which connections between the constituents of the system are created dynamically and preferentially attach to older sites, reproducing a characteristic behavior observed in some real systems: a scale-invariant connectivity distribution. However, this model neglects two factors, among others, observed in real systems: homophily and metric. Given the importance of these two terms in the global behavior of networks, in this dissertation we propose and study a dynamic model of preferential attachment based on three essential factors that compete for links: (i) connectivity (more connected sites are privileged in the choice of links), (ii) homophily (connections between similar sites are more attractive), and (iii) metric (a link is favored by the proximity of the sites). Within this proposal, we analyze how the connectivity distribution and the dynamic evolution of the network are affected by the metric (through the parameter A, which controls the importance of distance in the preferential attachment) and by homophily (an intrinsic characteristic of each site). We find that, as the importance of distance in the preferential attachment increases, the connections between sites become local and the connectivity distribution is characterized by a typical scale. In parallel, we fit the connectivity distribution curves, for different values of A, to the equation P(k) = P0e
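
A minimal sketch, not the dissertation's code, of a growth model in which the attachment probability combines degree, homophily and a distance penalty; the multiplicative functional form and the parameter names are assumptions made for illustration only.

```python
# Hedged sketch: growth by preferential attachment where the weight of an
# existing node combines its degree k, a homophily term (similarity of an
# intrinsic attribute eta) and a distance penalty d**(-A). The multiplicative
# form is an assumption made only for illustration.
import numpy as np

rng = np.random.default_rng(2)
N, M_LINKS, A = 500, 2, 1.0        # nodes, links per new node, distance-importance parameter

pos = [rng.uniform(0, 1, size=2)]  # node coordinates (metric term)
eta = [rng.uniform()]              # intrinsic attribute (homophily term)
deg = [0]
edges = []

for j in range(1, N):
    pj, ej = rng.uniform(0, 1, size=2), rng.uniform()
    d = np.linalg.norm(np.array(pos) - pj, axis=1) + 1e-6
    k = np.array(deg) + 1                     # +1 so isolated nodes can still be chosen
    homo = 1.0 - np.abs(np.array(eta) - ej)   # similar eta -> more attractive
    w = k * homo * d ** (-A)                  # competition between the three factors
    targets = rng.choice(j, size=min(M_LINKS, j), replace=False, p=w / w.sum())
    for t in targets:
        edges.append((j, int(t)))
        deg[t] += 1
    pos.append(pj); eta.append(ej); deg.append(len(targets))

print(len(edges), max(deg))  # connections become more local as A grows
```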

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Particle fluidization is widely used in industry, mainly because of the high heat and mass transfer rates between the phases. The coupling of Computational Fluid Dynamics (CFD) with the Discrete Element Method (DEM) has become attractive for the simulation of fluidization, since in this case the motion of the particles is analyzed more directly than in other types of approach. The main drawback of CFD-DEM coupling is the high computational demand of tracking every particle in the system, which leads to the use of strategies to reduce simulation time that, if applied incorrectly, can compromise the results. The present work deals with the application of CFD-DEM coupling to the analysis of the fluidization of alumina, an important problem for the mineral sector. Several parameters capable of influencing the results and the simulation time were analyzed, such as the time steps, the drag models, the particle size distribution, the stiffness constant, the use of representative particles larger than the real particles, etc. The DEM interaction force model used was the Linear Spring Dashpot (LSD) model. All simulations were performed with the ANSYS FLUENT 14.5 software, and the results were compared with experimental data and with the literature. These results confirmed the ability of the linear LSD model to predict the global behavior of alumina beds and to reduce the simulation time, provided that the model parameters are defined appropriately.

Relevance:

60.00%

Publisher:

Abstract:

Self-stabilization is a property of a distributed system such that, regardless of the legitimacy of its current state, the system behavior shall eventually reach a legitimate state and shall remain legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force, or adopt manual approaches non-amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics equally exploit the computational power of a single workstation and the available parallelism on computer clusters. We obtain existing and new stabilizing solutions for classical protocols like maximal matching, ring coloring, mutual exclusion, leader election and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions – verifiable in the local state space of every process – for global deadlock- and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply on rings of arbitrary sizes without worrying about state explosion.
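
For illustration of the self-stabilization property defined above, a minimal sketch of a classic (textbook) self-stabilizing mutual-exclusion protocol, Dijkstra's K-state token ring; it is not one of the protocols synthesized in the dissertation.

```python
# Hedged illustration: Dijkstra's K-state token ring, a classic self-stabilizing
# mutual-exclusion protocol. From an arbitrary initial state the ring converges
# to legitimate states in which exactly one process holds the privilege.
import random

N, K = 6, 7                                   # K > N guarantees convergence
x = [random.randrange(K) for _ in range(N)]   # arbitrary, possibly illegitimate, state

def privileged(i):
    # for i == 0, x[i - 1] is x[-1], i.e. the last process, closing the ring
    return (x[i] == x[i - 1]) if i == 0 else (x[i] != x[i - 1])

for step in range(200):
    holders = [i for i in range(N) if privileged(i)]
    if len(holders) == 1:
        print(f"legitimate state reached after {step} steps: {x}")
        break
    i = random.choice(holders)                # a central daemon picks one enabled process
    x[i] = (x[i] + 1) % K if i == 0 else x[i - 1]
```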

Relevance:

60.00%

Publisher:

Abstract:

In this paper, several computational schemes are presented for the optimal tuning of the global behavior of nonlinear dynamical systems. Specifically, the maximization of the size of domains of attraction associated with invariants in parametrized dynamical systems is addressed. Cell Mapping (CM) techniques are used to estimate the size of the domains, and such size is then maximized via different optimization tools. First, a genetic algorithm is tested whose performance shows to be good for determining global maxima at the expense of high computational cost. Secondly, an iterative scheme based on a Stochastic Approximation procedure (the Kiefer-Wolfowitz algorithm) is evaluated showing acceptable performance at low cost. Finally, several schemes combining neural network based estimations and optimization procedures are addressed with promising results. The performance of the methods is illustrated with two applications: first on the well-known van der Pol equation with standard parametrization, and second the tuning of a controller for saturated systems.
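
A minimal sketch, not the paper's implementation, of the idea described above: a crude cell-mapping-style estimate of the domain of attraction obtained by integrating from a grid of initial conditions, followed by a brute-force scan of a controller gain for a saturated system. The example system, gain grid and convergence thresholds are illustrative assumptions.

```python
# Hedged sketch: domain-of-attraction size estimated on a grid of cells for the
# saturated feedback system x1' = x2, x2' = x1 + sat(-k*(x1 + x2)), then a
# brute-force scan over the gain k.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x, k):
    u = np.clip(-k * (x[0] + x[1]), -1.0, 1.0)       # saturated control input
    return [x[1], x[0] + u]

def basin_size(k, lim=2.0, cells=21, horizon=25.0):
    """Fraction of grid cells whose trajectories end near the origin."""
    grid = np.linspace(-lim, lim, cells)
    hits = 0
    for x1 in grid:
        for x2 in grid:
            sol = solve_ivp(rhs, (0.0, horizon), [x1, x2], args=(k,), rtol=1e-6)
            hits += np.linalg.norm(sol.y[:, -1]) < 0.05
    return hits / cells**2

gains = [1.5, 2.0, 3.0, 5.0]
sizes = [basin_size(k) for k in gains]
print(dict(zip(gains, sizes)), "best gain:", gains[int(np.argmax(sizes))])
```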

Relevance:

60.00%

Publisher:

Abstract:

A simplified model is proposed to show the importance that dynamic soil-abutment interaction can have in the global behavior of bridges subjected to seismic loading. The modification of the natural frequency and damping properties is shown in graphic form for typical short-span bridges, of the integral deck-abutment type for longitudinal vibrations or of general type for transverse vibrations.

Relevance:

60.00%

Publisher:

Abstract:

The effects produced by the impact of objects on structures have been studied for a long time. Initially, the vast majority of studies focused on the impact of ballistic projectiles, given the interest in designing structures capable of withstanding the impact of such projectiles. Owing to the lack of computing capacity to solve a problem that takes into account the global behavior of the structure together with the local behavior, those studies focused mainly on the impact zone. The moment at which calculations requiring multiple iterations to reach a satisfactory solution to this complex problem could be performed did not arrive until the introduction of modern computers. The present study establishes a System of Multiple Degrees of Freedom (SMDF), which allows the study of the impact of a rock on a reinforced concrete beam, taking into account factors that affect both the local and the global behavior of the structure analyzed. The system is solved using an implicit method, the Newmark method, which allows us, without resorting to a finite element program, to obtain a sufficiently approximate solution to the problem at a relatively low computational cost. The paper checks the proposed model against existing results from full-scale tests, and several hypotheses are considered by analyzing the response of the system to variations in the starting conditions.
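
A minimal sketch, not the study's model, of the implicit Newmark method mentioned above, applied to a small linear multi-degree-of-freedom system under a short impact-like pulse; the 2-DOF matrices and the half-sine load are illustrative assumptions.

```python
# Hedged sketch: implicit Newmark-beta scheme (average acceleration,
# gamma = 1/2, beta = 1/4) for a small linear system M a + C v + K u = F(t).
# The 2-DOF matrices and the half-sine "impact" load are illustrative only.
import numpy as np

M = np.diag([200.0, 800.0])                        # masses [kg]
K = np.array([[5e6, -5e6], [-5e6, 9e6]])           # stiffness [N/m]
C = 0.02 * K                                       # light proportional damping
gamma, beta, dt, steps = 0.5, 0.25, 1e-4, 2000

def load(t, duration=5e-3, peak=5e4):
    """Half-sine pulse on DOF 0, a crude stand-in for the rock impact."""
    f = np.zeros(2)
    if t < duration:
        f[0] = peak * np.sin(np.pi * t / duration)
    return f

u = np.zeros(2); v = np.zeros(2)
a = np.linalg.solve(M, load(0.0) - C @ v - K @ u)
K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
hist = []
for n in range(steps):
    t = (n + 1) * dt
    rhs = (load(t)
           + M @ (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
           + C @ (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                  + dt * (gamma / (2 * beta) - 1) * a))
    u_new = np.linalg.solve(K_eff, rhs)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    v = v + dt * ((1 - gamma) * a + gamma * a_new)
    u, a = u_new, a_new
    hist.append(u[0])
print(max(hist), min(hist))   # peak displacement of DOF 0 under the pulse
```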

Relevance:

60.00%

Publisher:

Abstract:

Any structure vibrates according to its vibration modes, which are defined by its modal parameters (natural frequencies, damping ratios and mode shapes). Through measurements of the vibration at key points of the structure, the modal parameters can be estimated. In civil engineering structures it is difficult to excite a structure in a controlled manner, so techniques that estimate the modal parameters from the recorded response alone are of vital importance for this type of structure. This technique is known as Operational Modal Analysis (OMA). OMA does not need to excite the structure artificially; it considers only its behavior in service. The motivation for carrying out OMA tests arises in the field of Civil Engineering, because successfully exciting large structures artificially is not only difficult and expensive but may even damage the structure. Its importance lies in the fact that the global behavior of a structure is directly related to its modal parameters, and any variation of stiffness, mass or support conditions, even if local, is reflected in the modal parameters. Therefore, this identification can be integrated into a Structural Health Monitoring system.
The main difficulty in using the modal parameters estimated by OMA is the uncertainty associated with the estimation process. There are uncertainties in the value of the modal parameters associated with the computation process (internal) and with the influence of environmental factors (external), such as temperature. This Master's Thesis analyzes these two sources of uncertainty. Firstly, for a laboratory structure, the uncertainties associated with the OMA program used are studied and quantified. Secondly, for an in-service structure (a stress-ribbon footbridge), both the effect of the OMA program and the influence of the environmental factor on the estimation of the modal parameters are studied. More specifically, a method to track the natural frequencies of the same mode has been proposed. This method includes a multiple linear regression model that allows the influence of these external agents to be removed.
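
A minimal sketch, not the thesis' code, of removing an environmental influence (temperature) from tracked natural frequencies with a multiple linear regression; the synthetic data and the chosen regressors are illustrative assumptions.

```python
# Hedged sketch: multiple linear regression used to remove the temperature
# influence from tracked natural frequencies. Data and regressors (temperature
# and its square) are synthetic, illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
temp = rng.uniform(-5, 35, size=365)                        # daily temperature [degC]
freq = 3.80 - 0.004 * temp - 0.0001 * temp**2 + rng.normal(0, 0.005, 365)   # [Hz]

X = np.column_stack([np.ones_like(temp), temp, temp**2])    # regressors
coef, *_ = np.linalg.lstsq(X, freq, rcond=None)             # multiple linear regression
freq_corrected = freq - (X @ coef - coef[0])                 # subtract environmental part

print(coef)                               # intercept ~ frequency at 0 degC
print(freq.std(), freq_corrected.std())   # scatter drops after the correction
```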

Relevance:

40.00%

Publisher:

Abstract:

We tested a model that children's tendency to attribute hostile intent to others in response to provocation is a key psychological process that statistically accounts for individual differences in reactive aggressive behavior and that this mechanism contributes to global group differences in children's chronic aggressive behavior problems. Participants were 1,299 children (mean age at year 1 = 8.3 y; 51% girls) from 12 diverse ecological-context groups in nine countries worldwide, followed across 4 y. In year 3, each child was presented with each of 10 hypothetical vignettes depicting an ambiguous provocation toward the child and was asked to attribute the likely intent of the provocateur (coded as benign or hostile) and to predict his or her own behavioral response (coded as nonaggression or reactive aggression). Mothers and children independently rated the child's chronic aggressive behavior problems in years 2, 3, and 4. In every ecological group, in those situations in which a child attributed hostile intent to a peer, that child was more likely to report that he or she would respond with reactive aggression than in situations when that same child attributed benign intent. Across children, hostile attributional bias scores predicted higher mother- and child-rated chronic aggressive behavior problems, even controlling for prior aggression. Ecological group differences in the tendency for children to attribute hostile intent statistically accounted for a significant portion of group differences in chronic aggressive behavior problems. The findings suggest a psychological mechanism for group differences in aggressive behavior and point to potential interventions to reduce aggressive behavior.

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this study was to analyze the salivary alpha-amylase (sAA) and cortisol levels in children with global developmental delay (GDD) before and after dental treatment and their association with the children's behavior during treatment. The morning salivary cortisol levels and sAA activity of 33 children with GDD were evaluated before and after dental treatment and were compared with those of 19 healthy children. The behavior of the children with GDD during dental care was assessed by the Frankl scale. Children with GDD showed lower sAA activity than healthy children, but this result was not significant. The salivary cortisol levels were similar between GDD and healthy children. GDD children showed increased levels of sAA (but not cortisol) prior to the dental treatment as compared with the post-treatment phase. GDD children who showed less favorable behavior during dental care had higher levels of sAA and salivary cortisol than GDD children with more favorable behavior, but only the sAA results were significant. In conclusion, GDD children show hyperactivity of the SNS axis in anticipation of dental treatment, which indicates the need for strategies to reduce their anxiety levels before and during dental care.

Relevance:

30.00%

Publisher:

Abstract:

With the accelerated trend of global warming, the thermal behavior of existing buildings, which were typically designed on the basis of current weather data, may not be able to cope with the future climate. This paper quantifies, through computer simulations, the increased cooling loads imposed by potential global warming and the probable indoor temperature increases due to a possibly undersized air-conditioning system. It is found from the sample office building examined that existing buildings would generally be able to adapt to the increasing warmth of the 2030 Low and High scenario projections and the 2070 Low scenario projection. However, for the 2070 High scenario, the study indicates that existing office buildings in all capital cities except Hobart will suffer from overheating problems. When the annual average temperature increase exceeds 2°C, the risk of current office buildings being subjected to overheating will increase significantly. For existing buildings, designed for current climate conditions, it is shown that there is a nearly linear correlation between the increase of average external air temperature and the increase of building cooling load. For new buildings, in which possible global warming is taken into account in the design, a 28-59% increase of cooling capacity under the 2070 High scenario would be required to bring the building thermal comfort level to an acceptable standard.