919 results for Network dynamics
Abstract:
The complex systems approach offers an opportunity to replace the extant predominant mechanistic view of sport-related phenomena. The emphasis on the environment-system relationship, the application of complexity principles, and the use of nonlinear dynamics mathematical tools propose a deep change in sport science. Coordination dynamics, ecological dynamics, and network approaches have been successfully applied to the study of different sport-related behaviors, from movement patterns that emerge at different scales constrained by specific sport contexts to game dynamics. Sport benefits from the use of such approaches in the understanding of technical, tactical, and physical conditioning aspects, which change their meaning and dilute their frontiers. The creation of new learning and training strategies for teams and individual athletes is a main practical consequence. Challenges for the future include investigating the influence of key control parameters on the nonlinear behavior of athlete-environment systems and the possible relatedness of the dynamics and constraints acting at different spatio-temporal scales in team sports. Modelling sport-related phenomena can make useful contributions to a better understanding of complex systems, and vice versa.
Abstract:
Quantitative analysis is increasingly being used in team sports to better understand performance in these stylized, delineated, complex social systems. Here we provide a first step toward understanding the pattern-forming dynamics that emerge from collective offensive and defensive behavior in team sports. We propose a novel method of analysis that captures how teams occupy sub-areas of the field as the ball changes location. We used the method to analyze a game of association football (soccer) based upon the hypothesis that local player numerical dominance is key to defensive stability and offensive opportunity. We found that the teams consistently allocated more players than their opponents in sub-areas of play closer to their own goal. This is consistent with a predominantly defensive strategy intended to prevent yielding even a single goal. We also found differences between the two teams' strategies: while both adopted the same distribution of defensive, midfield, and attacking players (a 4:3:3 system of play), one team was significantly more effective in maintaining both defensive and offensive numerical dominance. That team indeed won the match by a single goal (2 to 1), but the analysis shows that its advantage in play was more pervasive than the one-goal victory would indicate. Our focus on the local dynamics of team collective behavior is distinct from the traditional focus on individual player capability. It supports a broader view in which specific player abilities contribute within the context of the dynamics of multiplayer team coordination and coaching strategy. By applying this complex systems analysis to association football, we can understand how players' and teams' strategies result in successful and unsuccessful relationships between teammates and opponents in the area of play.
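The sub-area occupancy idea can be illustrated with a minimal sketch (not the authors' code): given one frame of player coordinates for both teams, count how many players each side has in each cell of a coarse grid over the pitch and report the local numerical difference. The grid size, pitch dimensions, and all variable names are illustrative assumptions.

    import numpy as np

    def numerical_dominance(home_xy, away_xy, pitch=(105.0, 68.0), grid=(3, 3)):
        """Count players per team in each sub-area of the pitch and return the
        home-minus-away difference per cell (illustrative sketch only)."""
        counts = []
        for team_xy in (home_xy, away_xy):
            h, _, _ = np.histogram2d(team_xy[:, 0], team_xy[:, 1],
                                     bins=grid, range=[[0, pitch[0]], [0, pitch[1]]])
            counts.append(h)
        return counts[0] - counts[1]  # positive cells: home team outnumbers the opponent

    # toy frame: 10 outfield players per team at random positions
    rng = np.random.default_rng(0)
    home = rng.uniform([0, 0], [105, 68], size=(10, 2))
    away = rng.uniform([0, 0], [105, 68], size=(10, 2))
    print(numerical_dominance(home, away))

Repeating the count as the ball changes location would give the time series of local dominance that the study relates to defensive stability and offensive opportunity.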
Abstract:
Capacity to produce data for performance analysis in sports has been enhanced in the last decade by substantial technological advances. However, current performance analysis methods have been criticised for the lack of a viable theoretical framework to assist in the development of fundamental principles that regulate performance achievement. Our aim in this paper is to discuss ecological dynamics as an explanatory framework for improving the analysis and understanding of competitive performance behaviours. We argue that integration of ideas from ecological dynamics into previous approaches to performance analysis advances current understanding of how sport performance emerges from continuous interactions between individual players and teams. Exemplar data from previous studies in association football are presented to illustrate this novel perspective on performance analysis. Limitations of current ecological dynamics research and challenges for future research are discussed in order to improve the meaningfulness of information presented to coaches and managers.
Abstract:
This study investigated movement synchronization of players within and between teams during competitive association football performance. Cluster phase analysis was introduced as a method to assess synchronies between whole teams and between individual players and their team as a function of time, ball possession and field direction. Measures of dispersion (SD) and regularity (sample entropy – SampEn – and cross sample entropy – Cross-SampEn) were used to quantify the magnitude and structure of synchrony. Large synergistic relations within each professional team sport collective were observed, particularly in the longitudinal direction of the field (0.89 ± 0.12) compared to the lateral direction (0.73 ± 0.16, p < .01). The coupling between the group measures of the two teams also revealed that changes in the synchrony of each team were intimately related (Cross-SampEn values of 0.02 ± 0.01). Interestingly, ball possession did not influence team synchronization levels. In player–team synchronization, individuals tended to be coordinated under near in-phase modes with team behavior (mean ranges between −7 and 5° of relative phase). The magnitudes of variation were low, but more irregular in time, for the longitudinal direction (SD: 18 ± 3°; SampEn: 0.07 ± 0.01) compared to the lateral direction (SD: 28 ± 5°; SampEn: 0.06 ± 0.01, p < .05) of the field. Increases in regularity were also observed between the first (SampEn: 0.07 ± 0.01) and second halves (SampEn: 0.06 ± 0.01, p < .05) of the observed competitive game. Findings suggest that the method of analysis introduced in the current study may offer a suitable tool for examining teams' synchronization behaviors and the mutual influence of each team's cohesiveness in competing social collectives.
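Cluster phase analysis of this kind can be sketched in a few lines: take the Hilbert-transform phase of each player's displacement series, define the cluster (team) phase as the argument of the mean phasor across players, and measure synchrony as the consistency of each player's phase relative to the cluster phase. This is a minimal illustration in the spirit of the method, not the authors' implementation; the toy signals and parameter values are invented.

    import numpy as np
    from scipy.signal import hilbert

    def cluster_phase_synchrony(signals):
        """`signals` is (n_players, n_samples), e.g. each player's longitudinal
        displacement over time; returns team-level and per-player synchrony."""
        centred = signals - signals.mean(axis=1, keepdims=True)
        phases = np.angle(hilbert(centred, axis=1))               # instantaneous phases
        cluster = np.angle(np.exp(1j * phases).mean(axis=0))      # team (cluster) phase
        rel = phases - cluster                                    # player-to-team relative phase
        rho_players = np.abs(np.exp(1j * rel).mean(axis=1))       # per-player synchrony
        rho_team = np.abs(np.exp(1j * rel).mean(axis=0)).mean()   # whole-team synchrony
        return rho_team, rho_players

    # toy data: 11 noisy, phase-shifted copies of a common oscillation
    t = np.linspace(0, 60, 3000)
    team = np.array([np.sin(2 * np.pi * 0.2 * t + d) + 0.2 * np.random.randn(t.size)
                     for d in np.random.uniform(-0.5, 0.5, 11)])
    print(cluster_phase_synchrony(team)[0])

Values near 1 indicate tight in-phase coordination of the kind reported for the longitudinal direction of the field.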
Abstract:
An Artificial Neural Network (ANN) is a computational modeling tool which has found extensive acceptance in many disciplines for modeling complex real-world problems. An ANN can model problems through learning by example, rather than by fully understanding the detailed characteristics and physics of the system. In the present study, the accuracy and predictive power of an ANN were evaluated in predicting the kinematic viscosity of biodiesels over the wide range of temperatures typically encountered in diesel engine operation. In this model, temperature and chemical composition of biodiesel were used as input variables. In order to obtain the necessary data for model development, the chemical composition and temperature-dependent fuel properties of ten different types of biodiesel were measured experimentally using laboratory standard testing equipment following internationally recognized testing procedures. The Neural Networks Toolbox of MatLab R2012a software was used to train, validate and simulate the ANN model on a personal computer. The network architecture was optimised following a trial-and-error method to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the absolute fraction of variance (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found that the ANN is highly accurate in predicting the viscosity of biodiesel and demonstrates the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties at different temperature levels. Therefore the model developed in this study can be a useful tool for accurately predicting biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
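The abstract describes a MATLAB Neural Networks Toolbox workflow; as a rough, swapped-in analogue, the sketch below fits a small feed-forward network in Python with scikit-learn's MLPRegressor on placeholder data. The feature layout (temperature plus composition fractions), network size, and the synthetic data are assumptions for illustration only, not the study's configuration.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # placeholder inputs: column 0 = temperature, remaining columns = composition fractions;
    # y = kinematic viscosity (mm^2/s). Real values would come from the lab measurements.
    rng = np.random.default_rng(1)
    X = rng.uniform(size=(200, 6))
    y = 2.0 + 3.0 * X[:, 0] + X[:, 1:].sum(axis=1) + 0.1 * rng.normal(size=200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1))
    model.fit(X_train, y_train)
    print("held-out R^2:", model.score(X_test, y_test))

Architecture selection by trial and error, as in the study, would amount to looping over hidden_layer_sizes and keeping the configuration with the best held-out score.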
Abstract:
During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s; the Compact Cassette during the 1970s; and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of these developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and the sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industrial ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research initiatives. Supported by the model, the study is able to show how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic, and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation and have reluctantly recognised the realities of a virtualised environment.
Abstract:
A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors involved in this complex clinical process. Outputs of the BN will provide decision support for radiation therapists, assisting them to make correct inferences about the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific sub-regions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN consists of a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationships between the many interacting factors, such as the tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation for developing better decision-support strategies and automated correction algorithms for IGRT.
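The belief updating that such a BN performs can be illustrated with a single hand-rolled node pair; the states and probabilities below are invented placeholders, not values from the framework, but the calculation is plain Bayes' rule of the kind the quantified network would chain across many factors.

    # Hypothetical two-node fragment: a set-up error influences whether a large
    # image-guided couch shift is observed; observing the shift updates the belief.
    p_setup_error = 0.10           # assumed prior P(set-up error present)
    p_shift_given_error = 0.90     # assumed P(large shift observed | error)
    p_shift_given_no_error = 0.15  # assumed P(large shift observed | no error)

    def posterior_error_given_shift(prior, p_e, p_ne):
        """Bayes' rule: P(set-up error | a large image-guided shift is observed)."""
        evidence = p_e * prior + p_ne * (1.0 - prior)
        return p_e * prior / evidence

    print(posterior_error_given_shift(p_setup_error, p_shift_given_error,
                                      p_shift_given_no_error))

In the full DAG the same updating propagates through nodes for tumour site, critical organs, technique, and inter-user variability.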
Abstract:
Synaptic changes at sensory inputs to the dorsal nucleus of the lateral amygdala (LAd) play a key role in the acquisition and storage of associative fear memory. However, neither the temporal nor spatial architecture of the LAd network response to sensory signals is understood. We developed a method for the elucidation of network behavior. Using this approach, temporally patterned polysynaptic recurrent network responses were found in LAd (intra-LA), both in vitro and in vivo, in response to activation of thalamic sensory afferents. Potentiation of thalamic afferents resulted in a depression of intra-LA synaptic activity, indicating a homeostatic response to changes in synaptic strength within the LAd network. Additionally, the latencies of thalamic afferent triggered recurrent network activity within the LAd overlap with known later occurring cortical afferent latencies. Thus, this recurrent network may facilitate temporal coincidence of sensory afferents within LAd during associative learning.
Abstract:
Mosquito-borne diseases pose some of the greatest challenges in public health, especially in tropical and sub-tropical regions of the world. Efforts to control these diseases have been underpinned by a theoretical framework developed for malaria by Ross and Macdonald, including models, metrics for measuring transmission, and theory of control that identifies key vulnerabilities in the transmission cycle. That framework, especially Macdonald’s formula for R0 and its entomological derivative, vectorial capacity, is now used to study dynamics and design interventions for many mosquito-borne diseases. A systematic review of 388 models published between 1970 and 2010 found that the vast majority adopted the Ross–Macdonald assumption of homogeneous transmission in a well-mixed population. Studies comparing models and data question these assumptions and point to the capacity to model heterogeneous, focal transmission as the most important but relatively unexplored component in current theory. Fine-scale heterogeneity causes transmission dynamics to be nonlinear, and poses problems for modeling, epidemiology and measurement. Novel mathematical approaches show how heterogeneity arises from the biology and the landscape on which the processes of mosquito biting and pathogen transmission unfold. Emerging theory focuses attention on the ecological and social context for mosquito blood feeding, the movement of both hosts and mosquitoes, and the relevant spatial scales for measuring transmission and for modeling dynamics and control.
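For reference, Macdonald's formula for R0 and the vectorial capacity mentioned above are usually written (in standard Ross–Macdonald notation) as:

    % m: mosquitoes per host, a: human-biting rate per mosquito, b and c:
    % mosquito-to-human and human-to-mosquito transmission efficiencies,
    % p: daily mosquito survival probability, n: extrinsic incubation period
    % in days, r: human recovery rate.
    R_0 = \frac{m a^2 b c \, p^n}{r \, (-\ln p)},
    \qquad
    C = \frac{m a^2 p^n}{-\ln p}

The nonlinearity introduced by fine-scale heterogeneity is precisely what these homogeneous, well-mixed expressions do not capture.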
Abstract:
Biodiesel, produced from renewable feedstock, represents a more sustainable source of energy and will therefore play a significant role in providing the energy requirements for transportation in the near future. Chemically, all biodiesels are fatty acid methyl esters (FAME), produced from raw vegetable oil and animal fat. However, clear differences in chemical structure are apparent from one feedstock to the next in terms of chain length, degree of unsaturation, number of double bonds and double bond configuration, which all determine the fuel properties of biodiesel. In this study, prediction models were developed to estimate the kinematic viscosity of biodiesel using an Artificial Neural Network (ANN) modelling technique. In developing the model, 27 parameters based on the chemical composition commonly found in biodiesel were used as input variables, and the kinematic viscosity of biodiesel was used as the output variable. The data necessary to develop and simulate the network were collected from more than 120 published peer-reviewed papers. The Neural Networks Toolbox of MatLab R2012a software was used to train, validate and simulate the ANN model on a personal computer. The network architecture and learning algorithm were optimised following a trial-and-error method to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the coefficient of determination (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found high predictive accuracy of the ANN in predicting the fuel properties of biodiesel and has demonstrated the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties. Therefore the model developed in this study can be a useful tool to accurately predict biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
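The three goodness-of-fit measures named above can be computed directly from paired predicted and experimental values, as in the sketch below. The exact MAEP definition is not spelled out in the abstract, so treating it as the maximum absolute percentage error is an assumption.

    import numpy as np

    def fit_metrics(y_pred, y_exp):
        """R^2, RMS error, and a maximum-percentage-error measure for predicted
        vs. experimental viscosities (MAEP formula assumed, see note above)."""
        y_pred, y_exp = np.asarray(y_pred, float), np.asarray(y_exp, float)
        ss_res = np.sum((y_exp - y_pred) ** 2)
        ss_tot = np.sum((y_exp - y_exp.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        rms = np.sqrt(np.mean((y_exp - y_pred) ** 2))
        maep = np.max(np.abs((y_exp - y_pred) / y_exp)) * 100.0
        return r2, rms, maep

    print(fit_metrics([4.1, 4.9, 6.2], [4.0, 5.0, 6.0]))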
Abstract:
The term “new media” has been in play for decades now, and one might be forgiven for wondering how much longer digital forms and platforms can really be called “new,” or even what the scholarship of new media contributes to knowledge. Is it possible to say new things about new media? We think so. This Companion not only demonstrates the variety, salience, and importance of new media studies but also proposes a distinctive approach to the topic: an approach we call “new media dynamics.” In this view, what’s interesting about “new media” is not novelty as such but dynamism. Capitalism, technology, social networks, and media all evolve and change, sometimes to our delight, sometimes to our dismay. This incessant process of disruption, renewal, and eventual (if often partial) replacement is now one of humanity’s central experiences. This cutting-edge collection brings together a stellar array of the world’s top researchers, cultural entrepreneurs, and emerging scholars to give the dynamics of new media their first full-length, multidisciplinary, historical, and critical treatment. Across 34 chapters, an international line-up of the very best authors reflects on the historical, technical, cultural, and political changes that underlie the emergence of new media, as existing patterns and assumptions are challenged by the forces of “creative destruction” and innovation, both economic and cultural. At the same time they show that familiar themes and problems carry through from “old” media – questions of identity, sexuality, politics, relationships, and meaning.
Abstract:
This paper proposes a new distributed coordination approach to achieve load leveling using Energy Storage Units (ESUs) in an LV network. The proposed distributed control strategy is based on a consensus algorithm which shares the required active power among the ESUs equally with respect to their ratings. To show the effectiveness of the proposed approach, a typical radial LV network is simulated as a case study.
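A minimal discrete-time consensus sketch shows the flavour of such a scheme: each ESU exchanges a local estimate only with its neighbours, the estimates converge to a common per-unit level, and each unit then contributes that level times its rating. Reading "equally with respect to their ratings" as rating-proportional sharing is an interpretation, and the communication graph, gain, and numbers are invented.

    import numpy as np

    def consensus_load_share(demand, ratings, neighbours, steps=300, eps=0.2):
        """Average-consensus sketch: share `demand` (kW) among ESUs in proportion
        to their ratings using only neighbour-to-neighbour exchanges."""
        n = len(ratings)
        y = np.zeros(n); y[0] = float(demand)   # assume only ESU 0 measures the mismatch
        z = np.array(ratings, dtype=float)
        for _ in range(steps):
            dy = [eps * sum(y[j] - y[i] for j in neighbours[i]) for i in range(n)]
            dz = [eps * sum(z[j] - z[i] for j in neighbours[i]) for i in range(n)]
            y, z = y + np.array(dy), z + np.array(dz)
        utilisation = y / z                     # common per-unit level at consensus
        return utilisation * np.array(ratings)  # each ESU's active power set-point

    # four ESUs on a line graph sharing a 20 kW levelling demand
    neigh = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(consensus_load_share(20.0, [5.0, 10.0, 10.0, 15.0], neigh))

With the toy numbers every unit settles at 50% utilisation, so the set-points (2.5, 5, 5, 7.5 kW) sum to the required 20 kW.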
Abstract:
Voltage rise and drop are the main power quality challenges in Low Voltage (LV) networks with Renewable Energy (RE) generators. This paper proposes a new voltage support strategy based on the coordination of multiple Distribution Static Synchronous Compensators (DSTATCOMs) using a consensus algorithm. The study focuses on an LV network with PV as the RE source for customers. The proposed approach is applied to a typical residential LV network, and its advantages are shown in comparison with other voltage control strategies.
Abstract:
The operation of Autonomous Underwater Vehicles (AUVs) within underwater sensor network fields provides an opportunity to reuse the network infrastructure for long baseline localisation of the AUV. Computationally efficient localisation can be accomplished using off-the-shelf hardware that is comparatively inexpensive and that may already be deployed in the environment for monitoring purposes. This paper describes the development of a particle filter based localisation system which is implemented onboard an AUV in real time using ranging information obtained from an ad-hoc underwater sensor network. An experimental demonstration of this approach was conducted in a lake, with results presented that illustrate network communication and localisation performance.
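A range-only particle filter of this kind fits in a few lines; the sketch below shows one predict/update/resample cycle against two network nodes at known positions. Noise levels, node positions, and the dead-reckoned control input are illustrative assumptions, not values from the field trial.

    import numpy as np

    def particle_filter_step(particles, weights, control, ranges, anchors,
                             motion_noise=0.5, range_noise=1.0):
        """One cycle of a 2-D range-only particle filter: dead-reckoned predict,
        weight by acoustic range likelihoods, then multinomial resample."""
        rng = np.random.default_rng()
        particles = particles + control + rng.normal(0, motion_noise, particles.shape)
        for anchor, r in zip(anchors, ranges):
            d = np.linalg.norm(particles - anchor, axis=1)
            weights = weights * np.exp(-0.5 * ((r - d) / range_noise) ** 2)
        weights = weights / weights.sum()
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

    # toy use: 1000 particles, two sensor-network nodes acting as acoustic anchors
    particles = np.random.default_rng(0).uniform(-50, 50, size=(1000, 2))
    weights = np.full(1000, 1.0 / 1000)
    anchors = [np.array([0.0, 0.0]), np.array([30.0, 10.0])]
    particles, weights = particle_filter_step(particles, weights,
                                              control=np.array([1.0, 0.0]),
                                              ranges=[20.0, 25.0], anchors=anchors)
    print(particles.mean(axis=0))  # rough position estimate after one update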
Abstract:
The safety of passengers is a major concern for airports. In the event of a crisis, having an effective and efficient evacuation process in place can significantly aid in enhancing passenger safety. Hence, it is necessary for airport operators to have an in-depth understanding of the evacuation process of their airport terminal. Although evacuation models have been used to study pedestrian behaviour for decades, little research has considered evacuees' group dynamics and the complexity of the environment. In this paper, an agent-based model is presented to simulate the passenger evacuation process. Different exits were allocated to passengers based on their location and security level. The simulation results show that evacuation time can be influenced by passenger group dynamics. This model also provides a convenient way to design airport evacuation strategies and examine their efficiency. The model was created using AnyLogic software and its parameters were initialised using recent research data published in the literature.
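As a toy illustration of how group dynamics can stretch evacuation time, the sketch below assigns each passenger the nearest usable exit and treats a travelling group as clear only when its slowest member has reached an exit. All numbers, the straight-line movement model, and the exit layout are invented; the published model, built in AnyLogic, is far richer (including security-level exit restrictions).

    import numpy as np

    def evacuation_times(positions, group_id, exits, speeds):
        """Per-passenger clearance time: walk straight to the nearest exit, but a
        group is only clear once its slowest member arrives (toy group dynamics)."""
        positions = np.asarray(positions, float)
        speeds = np.asarray(speeds, float)
        dists = np.min(np.linalg.norm(positions[:, None, :] - exits[None, :, :], axis=2), axis=1)
        individual_t = dists / speeds
        return np.array([individual_t[group_id == g].max() for g in group_id])

    rng = np.random.default_rng(2)
    pos = rng.uniform(0, 100, size=(50, 2))            # 50 passengers in a 100 m x 100 m hall
    groups = rng.integers(0, 15, size=50)              # travelling parties
    exit_pts = np.array([[0.0, 50.0], [100.0, 50.0]])  # two usable exits
    speed = rng.uniform(1.0, 1.5, size=50)             # walking speeds, m/s
    print(evacuation_times(pos, groups, exit_pts, speed).max(), "s to clear the hall")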