620 results for Network Dynamics
Abstract:
Mosquito-borne diseases pose some of the greatest challenges in public health, especially in tropical and sub-tropical regions of the world. Efforts to control these diseases have been underpinned by a theoretical framework developed for malaria by Ross and Macdonald, including models, metrics for measuring transmission, and theory of control that identifies key vulnerabilities in the transmission cycle. That framework, especially Macdonald’s formula for R0 and its entomological derivative, vectorial capacity, are now used to study dynamics and design interventions for many mosquito-borne diseases. A systematic review of 388 models published between 1970 and 2010 found that the vast majority adopted the Ross–Macdonald assumption of homogeneous transmission in a well-mixed population. Studies comparing models and data question these assumptions and point to the capacity to model heterogeneous, focal transmission as the most important but relatively unexplored component in current theory. Fine-scale heterogeneity causes transmission dynamics to be nonlinear, and poses problems for modeling, epidemiology and measurement. Novel mathematical approaches show how heterogeneity arises from the biology and the landscape on which the processes of mosquito biting and pathogen transmission unfold. Emerging theory focuses attention on the ecological and social context for mosquito blood feeding, the movement of both hosts and mosquitoes, and the relevant spatial scales for measuring transmission and for modeling dynamics and control.
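For orientation, the two quantities named in this abstract are usually written as follows. This is the standard textbook form of Macdonald's formula and of vectorial capacity, stated here for reference only; it is not an equation taken from the reviewed models, and the symbol definitions follow the usual Ross–Macdonald convention.

```latex
% Standard Ross–Macdonald quantities (textbook notation, for reference):
%   m   : mosquitoes per human          a : human-biting rate per mosquito per day
%   b,c : transmission efficiencies (mosquito->human, human->mosquito)
%   p   : daily mosquito survival       n : extrinsic incubation period (days)
%   r   : human recovery rate
\[
  V = \frac{m\,a^{2}\,p^{n}}{-\ln p},
  \qquad
  R_{0} = \frac{m\,a^{2}\,b\,c\,p^{n}}{r\,(-\ln p)} = \frac{b\,c}{r}\,V .
\]
```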
Abstract:
Biodiesel, produced from renewable feedstock, represents a more sustainable source of energy and will therefore play a significant role in providing the energy requirements for transportation in the near future. Chemically, all biodiesels are fatty acid methyl esters (FAME), produced from raw vegetable oil and animal fat. However, clear differences in chemical structure are apparent from one feedstock to the next in terms of chain length, degree of unsaturation, number of double bonds and double bond configuration, which all determine the fuel properties of biodiesel. In this study, prediction models were developed to estimate the kinematic viscosity of biodiesel using an Artificial Neural Network (ANN) modelling technique. While developing the model, 27 parameters based on the chemical composition commonly found in biodiesel were used as input variables, and the kinematic viscosity of biodiesel was used as the output variable. The data needed to develop and simulate the network were collected from more than 120 published peer-reviewed papers. The Neural Networks Toolbox of MATLAB R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture and learning algorithm were optimised by trial and error to obtain the best prediction of kinematic viscosity. The predictive performance of the model was assessed by calculating the coefficient of determination (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found high predictive accuracy of the ANN in predicting the fuel properties of biodiesel and has demonstrated the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties. The model developed in this study can therefore be a useful tool for accurately predicting biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
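The abstract does not give the network architecture, so the following is only a minimal sketch of the general approach: a small feed-forward ANN mapping a 27-dimensional composition vector to kinematic viscosity, using scikit-learn's MLPRegressor in place of the MATLAB toolbox. The layer size, synthetic data and hyper-parameters are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' model): a feed-forward ANN mapping a
# 27-dimensional chemical-composition vector to kinematic viscosity.
# Architecture, data and hyper-parameters here are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((120, 27))                  # stand-in for 27 composition inputs
y = 2.0 + 3.0 * X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(120)  # synthetic viscosity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2 :", r2_score(y_te, pred))
print("RMS:", mean_squared_error(y_te, pred) ** 0.5)
```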
Abstract:
The term “new media” has been in play for decades now, and one might be forgiven for wondering how much longer digital forms and platforms can really be called “new,” or even what the scholarship of new media contributes to knowledge. Is it possible to say new things about new media? We think so. This Companion not only demonstrates the variety, salience, and importance of new media studies but also proposes a distinctive approach to the topic: an approach we call “new media dynamics.” In this view, what’s interesting about “new media” is not novelty as such but dynamism. Capitalism, technology, social networks, and media all evolve and change, sometimes to our delight, sometimes our dismay. This incessant process of disruption, renewal, and eventual (if often partial) replacement is now one of humanity’s central experiences. This cutting-edge collection brings together a stellar array of the world’s top researchers, cultural entrepreneurs, and emerging scholars to give the dynamics of new media their first full-length, multidisciplinary, historical, and critical treatment. Across 34 chapters, an international line-up of the very best authors reflects on the historical, technical, cultural, and political changes that underlie the emergence of new media, as existing patterns and assumptions are challenged by the forces of “creative destruction” and innovation, both economic and cultural. At the same time they show that familiar themes and problems carry through from “old” media – questions of identity, sexuality, politics, relationships, and meaning.
Abstract:
This paper proposes a new distributed coordination approach to achieve load leveling using Energy Storage Units (ESUs) in an LV network. The proposed distributed control strategy is based on a consensus algorithm that shares the required active power among the ESUs in proportion to their ratings. To show the effectiveness of the proposed approach, a typical radial LV network is simulated as a case study.
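The paper's controller is not specified in the abstract, so the sketch below only illustrates the underlying idea: a ratio-consensus iteration in which each ESU exchanges two local states (demand and rating) with its neighbours, so every unit converges to the network-wide per-unit utilisation and then supplies power in proportion to its own rating. The topology, ratings and demand values are made-up example numbers.

```python
# Minimal sketch (illustrative, not the paper's controller): ratio consensus
# that lets every ESU discover sum(demand)/sum(rating) from local exchanges,
# then contribute active power in proportion to its own rating.
import numpy as np

ratings = np.array([5.0, 10.0, 15.0, 20.0])     # kW ratings of 4 ESUs (assumed)
demand  = np.array([10.0, 5.0, 10.0, 5.0])      # kW of levelling demand seen locally

d, r = demand.copy(), ratings.copy()            # each ESU's two consensus states

# Undirected line graph: ESU i exchanges only with its immediate neighbours.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
eps = 0.3                                        # step size, must be < 1/max(deg)

for _ in range(300):
    d = d + eps * (A @ d - deg * d)              # d_i += eps * sum_j (d_j - d_i)
    r = r + eps * (A @ r - deg * r)

# Both states converge to network averages, so d/r -> sum(demand)/sum(rating).
u_star = d / r                                   # common per-unit utilisation
p_share = u_star * ratings                       # each ESU's power contribution
print("utilisation (should all match):", u_star)
print("shares (kW):", p_share, "total:", p_share.sum())
```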
Abstract:
Voltage rise and drop are the main power quality challenges in Low Voltage (LV) networks with Renewable Energy (RE) generators. This paper proposes a new voltage support strategy based on the coordination of multiple Distribution Static Synchronous Compensators (DSTATCOMs) using a consensus algorithm. The study focuses on LV networks with PV as the RE source for customers. The proposed approach is applied to a typical residential LV network, and its advantages are shown by comparison with other voltage control strategies.
Abstract:
The operation of Autonomous Underwater Vehicles (AUVs) within underwater sensor network fields provides an opportunity to reuse the network infrastructure for long baseline localisation of the AUV. Computationally efficient localisation can be accomplished using off-the-shelf hardware that is comparatively inexpensive and which could already be deployed in the environment for monitoring purposes. This paper describes the development of a particle filter based localisation system which is implemented onboard an AUV in real time using ranging information obtained from an ad-hoc underwater sensor network. An experimental demonstration of this approach was conducted in a lake, with results illustrating network communication and localisation performance.
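Since the abstract does not detail the filter, the following is only a minimal sketch of a generic range-only particle filter: particles are diffused with a random-walk motion model, weighted by the likelihood of acoustic ranges to fixed network nodes, and resampled when the effective sample size drops. Node positions, noise levels, the stationary "vehicle" and all constants are assumptions for illustration.

```python
# Minimal sketch (illustrative, not the paper's implementation): 2-D range-only
# particle filter localising a vehicle from ranges to fixed sensor nodes.
import numpy as np

rng = np.random.default_rng(1)
nodes = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])  # anchors (m)
sigma_range, sigma_motion = 2.0, 1.0
true_pos = np.array([35.0, 60.0])                # vehicle held stationary for simplicity

N = 2000
particles = rng.uniform(0, 100, size=(N, 2))     # uniform prior over the field
weights = np.full(N, 1.0 / N)

for _ in range(20):                              # one cycle per received range set
    # Predict: diffuse particles with a simple random-walk motion model.
    particles += rng.normal(0, sigma_motion, size=particles.shape)

    # Update: weight particles by the likelihood of the measured ranges.
    ranges = np.linalg.norm(nodes - true_pos, axis=1) + rng.normal(0, sigma_range, 4)
    pred = np.linalg.norm(particles[:, None, :] - nodes[None, :, :], axis=2)
    weights *= np.exp(-0.5 * ((pred - ranges) ** 2 / sigma_range**2).sum(axis=1))
    weights /= weights.sum()

    # Resample (systematic) when the effective sample size collapses.
    if 1.0 / (weights**2).sum() < N / 2:
        idx = np.searchsorted(np.cumsum(weights), (rng.random() + np.arange(N)) / N)
        idx = np.minimum(idx, N - 1)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print("estimate:", (weights[:, None] * particles).sum(axis=0), "truth:", true_pos)
```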
Abstract:
The safety of passengers is a major concern for airports. In the event of a crisis, having an effective and efficient evacuation process in place can significantly enhance passenger safety. Hence, it is necessary for airport operators to have an in-depth understanding of the evacuation process of their airport terminal. Although evacuation models have been used to study pedestrian behaviour for decades, little research has considered evacuees’ group dynamics and the complexity of the environment. In this paper, an agent-based model is presented to simulate the passenger evacuation process. Different exits were allocated to passengers based on their location and security level. The simulation results show that the evacuation time can be influenced by passenger group dynamics. The model also provides a convenient way to design an airport evacuation strategy and examine its efficiency. The model was created using AnyLogic software, and its parameters were initialised using recent research data published in the literature.
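As a rough illustration of the exit-allocation rule described here (not the AnyLogic model itself), the sketch below assigns each passenger agent to the nearest permitted exit based on position and security status, then derives a crude evacuation-time estimate from walking distance and exit throughput. The terminal layout, speeds and flow capacities are entirely assumed values.

```python
# Minimal sketch (illustrative only): exit assignment by location and security
# level, with a crude evacuation-time estimate. All numbers are assumptions.
import random

EXITS = {"landside_main": (0.0, 0.0), "landside_side": (80.0, 0.0),
         "airside_gate":  (40.0, 60.0)}
WALK_SPEED = 1.3          # m/s, typical unimpeded walking speed
EXIT_FLOW = 1.5           # passengers per second per exit (assumed capacity)

def allowed_exits(cleared_security: bool):
    # Screened passengers may use airside exits; others must go landside.
    return list(EXITS) if cleared_security else ["landside_main", "landside_side"]

def assign_exit(pos, cleared_security):
    # Nearest permitted exit: a simple stand-in for the security-level rule.
    return min(allowed_exits(cleared_security),
               key=lambda e: ((pos[0] - EXITS[e][0])**2 + (pos[1] - EXITS[e][1])**2) ** 0.5)

random.seed(0)
passengers = [((random.uniform(0, 80), random.uniform(0, 60)), random.random() < 0.6)
              for _ in range(500)]
loads = {e: [] for e in EXITS}
for pos, cleared in passengers:
    e = assign_exit(pos, cleared)
    dist = ((pos[0] - EXITS[e][0])**2 + (pos[1] - EXITS[e][1])**2) ** 0.5
    loads[e].append(dist / WALK_SPEED)

# Evacuation time per exit: last arrival or queue drain time, whichever dominates.
for e, times in loads.items():
    t = max(max(times, default=0.0), len(times) / EXIT_FLOW)
    print(f"{e}: {len(times)} pax, ~{t:.0f} s")
```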
Abstract:
Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using the ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, leading to improved reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing the scalar ambiguity, a rounding method with a controllable error probability is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter. It can either be reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network. The results show that the new widelane AR scheme achieves a 99.4 % successful fixing rate with a 0.6 % failure rate, while the new rounding method for narrowlane AR achieves a fixing rate of 89 % with a failure rate of 0.8 %. In summary, AR reliability can be efficiently improved with a rigorously controllable probability of incorrectly fixed ambiguities.
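The abstract does not specify the paper's rounding criterion, so the sketch below only illustrates the general idea of error-probability-controlled rounding of a scalar ambiguity: under a zero-bias Gaussian assumption on the float ambiguity, the success probability of simple rounding is 2Φ(1/(2σ)) − 1, and the integer is accepted only when the implied failure probability stays below a user-set tolerance. The function names and threshold are my own, for illustration.

```python
# Minimal sketch (illustrative, not the paper's algorithm): fix a scalar
# narrowlane ambiguity by rounding only when the probability of an incorrect
# fix, derived from the float ambiguity's standard deviation under a
# zero-bias Gaussian assumption, is below a chosen tolerance.
import math

def rounding_success_prob(sigma: float) -> float:
    """P(correct integer) when rounding a zero-bias Gaussian float ambiguity."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return 2.0 * phi(0.5 / sigma) - 1.0

def try_fix(float_amb: float, sigma: float, max_fail_prob: float = 0.01):
    """Return the integer ambiguity if the failure probability is acceptable."""
    p_fail = 1.0 - rounding_success_prob(sigma)
    if p_fail <= max_fail_prob:
        return round(float_amb), p_fail
    return None, p_fail          # leave the ambiguity float

print(try_fix(12.08, sigma=0.12))   # small sigma -> fixed to 12
print(try_fix(12.45, sigma=0.40))   # large sigma -> left unresolved
```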
Abstract:
Detecting anomalies in online social networks is a significant task, as it assists in revealing useful and interesting information about user behavior on the network. This paper proposes a rule-based hybrid method using graph theory, fuzzy clustering and fuzzy rules for modeling the user relationships inherent in online social networks and for identifying anomalies. Fuzzy C-Means clustering is used to cluster the data, and a fuzzy inference engine is used to generate rules based on the cluster behavior. The proposed method achieves improved accuracy in identifying anomalies in comparison with existing methods.
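To make the clustering step concrete, here is a minimal Fuzzy C-Means sketch on toy per-user features, followed by a crude flagging rule standing in for the fuzzy inference engine. The features, cluster count, fuzzifier and the anomaly rule are all assumptions; the paper's actual pipeline is not described in the abstract.

```python
# Minimal sketch (illustrative, not the paper's pipeline): Fuzzy C-Means
# membership/centroid updates, then a crude rule flagging users that fit no
# cluster well. Features, parameters and the rule are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
# Toy per-user features, e.g. [posts per day, new links per day].
X = np.vstack([rng.normal([2, 1], 0.3, (50, 2)),
               rng.normal([8, 6], 0.5, (50, 2)),
               [[25.0, 0.2]]])                     # one obvious outlier
c, m, n = 2, 2.0, len(X)

U = rng.random((n, c)); U /= U.sum(axis=1, keepdims=True)   # fuzzy memberships
for _ in range(100):
    Um = U ** m
    centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
    U = 1.0 / (d ** (2 / (m - 1)) * (1.0 / d ** (2 / (m - 1))).sum(axis=1, keepdims=True))

# Crude rule: users far from every centroid are candidates for anomaly
# inspection (a fuzzy inference engine would refine this decision).
far = d.min(axis=1) > 3 * np.median(d.min(axis=1))
print("flagged user indices:", np.where(far)[0])
```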
Abstract:
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed in situations where the mission must be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist with vegetation classification in a vision-based landing site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator. Its performance is compared across OpenCL kernels designed for CPU, GPU, and FPGA platforms. This comparison examines the compute times required for network convergence under a variety of images to determine the plausibility of real-time feature detection.
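For readers unfamiliar with PCNNs, the sketch below runs a few iterations of a standard pulse-coupled neural network on a toy grey-scale patch; the per-iteration firing counts are the kind of signature such a feature generator can produce. The decay constants, gains and kernel are typical textbook values, not the paper's OpenCL kernels or parameters.

```python
# Minimal sketch (illustrative, not the paper's OpenCL kernels): a standard
# PCNN iterated on a toy image patch; the global firing count per step is a
# simple texture signature. Constants are typical textbook values (assumed).
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(0)
S = rng.random((32, 32))                          # stand-in for an image patch
K = np.array([[0.5, 1.0, 0.5],                    # local linking kernel
              [1.0, 0.0, 1.0],
              [0.5, 1.0, 0.5]])

aF, aL, aT = 0.1, 0.3, 0.2                        # decay constants
VF, VL, VT, beta = 0.5, 0.2, 20.0, 0.1            # gains and linking strength

F = np.zeros_like(S); L = np.zeros_like(S)
Y = np.zeros_like(S); T = np.ones_like(S)

signatures = []
for n in range(10):
    F = np.exp(-aF) * F + VF * convolve(Y, K, mode="constant") + S   # feeding
    L = np.exp(-aL) * L + VL * convolve(Y, K, mode="constant")       # linking
    U = F * (1.0 + beta * L)                                         # modulation
    Y = (U > T).astype(float)                                        # pulse output
    T = np.exp(-aT) * T + VT * Y                                     # dynamic threshold
    signatures.append(Y.sum())                    # firing count per iteration

print("firing signature:", signatures)
```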
Abstract:
Network coding is a method for achieving the multicast capacity of networks. The key idea is to allow network routers to linearly mix packets as they traverse the network, so that recipients receive linear combinations of packets. Network coded systems are vulnerable to pollution attacks, in which a single malicious node floods the network with bad packets and prevents the receiver from decoding correctly. Cryptographic defenses against these attacks are based on homomorphic signatures and MACs. These proposals, however, cannot handle the mixing of packets from multiple sources, which is needed to achieve the full benefits of network coding. In this paper we address the integrity of multi-source mixing. We propose a security model for this setting and provide a generic construction.
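To make the "linear mixing" concrete, here is a minimal sketch of random linear network coding over a small prime field, with the cryptographic protections the paper is about deliberately omitted: relays forward random linear combinations tagged with their coefficient vectors, and the receiver recovers the source packets by Gaussian elimination mod p. The field size, packet sizes and single-hop topology are assumptions for illustration.

```python
# Minimal sketch (illustrative; no integrity protection): random linear
# network coding over GF(P). Relays emit random combinations with coefficient
# headers; the receiver decodes by Gaussian elimination mod P.
import random

P = 257                                            # prime field size (assumed)
random.seed(0)
packets = [[random.randrange(P) for _ in range(8)] for _ in range(3)]   # 3 source packets

def mix(pkts):
    """A relay's output: one random linear combination plus its coefficients."""
    coeffs = [random.randrange(1, P) for _ in pkts]
    payload = [sum(c * p[i] for c, p in zip(coeffs, pkts)) % P
               for i in range(len(pkts[0]))]
    return coeffs, payload

received = [mix(packets) for _ in range(5)]        # a few extra for robustness
A = [c + y for c, y in received]                   # augmented rows [coeffs | payload]
n, rows = len(packets), len(A)

for col in range(n):                               # Gaussian elimination over GF(P)
    piv = next(r for r in range(col, rows) if A[r][col] % P)
    A[col], A[piv] = A[piv], A[col]
    inv = pow(A[col][col], P - 2, P)               # Fermat inverse of the pivot
    A[col] = [x * inv % P for x in A[col]]
    for r in range(rows):
        if r != col and A[r][col]:
            A[r] = [(a - A[r][col] * b) % P for a, b in zip(A[r], A[col])]

decoded = [row[n:] for row in A[:n]]
print("decoded correctly:", decoded == packets)
```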
Abstract:
For TREC Crowdsourcing 2011 (Stage 2) we propose a network-based approach for assigning an indicative measure of worker trustworthiness in crowdsourced labelling tasks. Workers, the gold standard, and worker/gold-standard agreements are modelled as a network. For the purpose of assigning worker trustworthiness, a variant of the PageRank algorithm, named TurkRank, is used to adaptively combine evidence that suggests worker trustworthiness, i.e., agreement with other trustworthy co-workers and agreement with the gold standard. A single parameter controls the importance of co-worker agreement versus gold-standard agreement. The TurkRank score calculated for each worker is then incorporated into a worker-weighted mean label aggregation.
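The sketch below is my reading of the general idea rather than the exact TurkRank formulation: a personalised-PageRank-style iteration over a worker agreement graph, in which a single parameter `lam` trades off agreement with the gold standard against agreement with other (already trusted) workers. The agreement values and the three-worker graph are toy data.

```python
# Minimal sketch (an interpretation, not the exact TurkRank algorithm):
# PageRank-style trust propagation over a worker agreement graph, mixed with
# gold-standard agreement through a single parameter lam. Toy numbers only.
import numpy as np

# Pairwise co-worker agreement rates on overlapping judgements (toy values).
W = np.array([[0.0, 0.9, 0.2],
              [0.9, 0.0, 0.3],
              [0.2, 0.3, 0.0]])
gold = np.array([0.8, 0.7, 0.1])          # each worker's agreement with gold labels

lam = 0.5                                  # gold-standard vs co-worker importance
M = W / W.sum(axis=1, keepdims=True)       # row-normalised co-worker agreement
g = gold / gold.sum()

t = np.full(3, 1.0 / 3)                    # initial trust scores
for _ in range(100):
    t = lam * g + (1.0 - lam) * (M.T @ t)  # trust flows from trusted, agreeing peers
    t /= t.sum()

print("TurkRank-style trust:", t)          # the disagreeing worker scores lowest

# Labels can then be aggregated with a trust-weighted vote, e.g.
# label = argmax over l of sum(t[w] for workers w voting l).
```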
Abstract:
Despite board meetings representing the main arena where directors discharge their duties and make critical corporate decisions, we know little about what occurs in the boardroom. Consequently, there is increasing academic interest in understanding how meetings are run and how directors participate. This study contributes to this emerging literature by exploring the impact of board meeting arrangements on directors’ interactions and perceptions of meeting effectiveness. We video-taped board meetings at two Australian corporations operating in the same industry and used an in-depth analysis of interactions and board processes to reveal that a rather small difference in meeting arrangements (i.e. the timing and length of meetings) had a significant influence on interaction patterns. Specifically, given the significant environmental turbulence in the sector, shorter meetings increased time pressure and reduced director inclusiveness and participation, lowering directors’ perceptions of meeting effectiveness.
Abstract:
Affordance is an important concept in HCI. There are various interpretations of affordances, but it has been difficult to use this concept for design purposes. The treatment of affordances in the current HCI literature has often been as a one-to-one relationship between a user and an artefact. In our view, affordance is a dynamic, always emerging relationship between a human and his environment. We believe that the social and cultural contexts within which an artefact is situated affect the way in which the artefact is used. Using a Structuration Theory approach, we argue that affordances also need to be treated at a much broader level, encompassing social and cultural aspects. We suggest that affordances should be seen at three levels: single user, organizational (or work group) and societal. Focusing on organizational-level affordances, we provide details of several important factors that affect the emergence of affordances.
Abstract:
While both the restoration of the blood supply and an appropriate local mechanical environment are critical for uneventful bone healing, their influence on each other remains unclear. Human bone fracture haematomas (< 72 h post-trauma) were cultivated for 3 days in fibrin matrices, with or without cyclic compression. Conditioned medium from these cultures enhanced the formation of vessel-like networks by HMEC-1 cells, and mechanical loading further enhanced this effect, without affecting the cells’ metabolic activity. While haematomas released the angiogenesis regulators VEGF and TGF-β1, their concentrations were not affected by mechanical loading. However, direct cyclic stretching of the HMEC-1 cells decreased network formation. The appearance of the networks and a trend towards elevated VEGF under strain suggested physical disruption rather than biochemical modulation as the responsible mechanism. Thus, early fracture haematomas and their mechanical loading increase the paracrine stimulation of endothelial organisation in vitro, but direct periodic strains may disrupt or impair vessel assembly in otherwise favourable conditions.