825 results for network traffic analysis
Abstract:
Based on an original and comprehensive database of all fiction feature films produced in Mercosur between 2004 and 2012, the paper analyses whether the Mercosur film industry has evolved towards an integrated and culturally more diverse market. It provides a summary of policy opportunities in terms of integration and diversity, emphasizing the limited role played by regional policies. It then shows that although the Mercosur film industry remains rather disintegrated, it is tending to become more integrated and culturally more diverse. From a methodological point of view, the combination of Social Network Analysis and the Stirling model opens up interesting research tracks for analysing creative industries in terms of their market integration and cultural diversity.
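The Stirling model mentioned above combines variety, balance and disparity into a single diversity score. A minimal sketch, with purely illustrative market shares and disparity values (not the paper's data):

```python
# Hedged sketch of the Stirling diversity measure:
#   D = sum over pairs i != j of d_ij * p_i * p_j
# where p_i is a category's share and d_ij the disparity between categories.
# All country shares and distances below are made up for illustration.

def stirling_diversity(proportions, distances):
    """proportions: dict category -> share (sums to 1).
    distances: dict (cat_a, cat_b) -> disparity in [0, 1]."""
    cats = list(proportions)
    total = 0.0
    for i, a in enumerate(cats):
        for b in cats[i + 1:]:
            d = distances.get((a, b), distances.get((b, a), 1.0))
            total += 2 * d * proportions[a] * proportions[b]  # count both (a,b) and (b,a)
    return total

shares = {"Argentina": 0.45, "Brazil": 0.40, "Uruguay": 0.10, "Paraguay": 0.05}
dist = {("Argentina", "Brazil"): 0.6, ("Argentina", "Uruguay"): 0.3,
        ("Argentina", "Paraguay"): 0.4, ("Brazil", "Uruguay"): 0.5,
        ("Brazil", "Paraguay"): 0.5, ("Uruguay", "Paraguay"): 0.2}
print(round(stirling_diversity(shares, dist), 4))
```

With all disparities equal to 1 the measure reduces to the Gini–Simpson index, which is why the disparity matrix is the model's distinctive ingredient.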
Abstract:
Marine protected areas (MPAs) are commonly employed to protect ecosystems from threats like overfishing. Ideally, MPA design should incorporate movement data from multiple target species to ensure sufficient habitat is protected. We used long-term acoustic telemetry and network analysis to determine the fine-scale space use of five shark and one turtle species at a remote atoll in the Seychelles, Indian Ocean, and evaluate the efficacy of a proposed MPA. Results revealed strong, species-specific habitat use in both sharks and turtles, with corresponding variation in MPA use. Defining the MPA's boundary from the edge of the reef flat at low tide instead of the beach at high tide (the current best practice in Seychelles) significantly increased the MPA's coverage of predator movements by an average of 34%. Informed by these results, the larger MPA was adopted by the Seychelles government, demonstrating how telemetry data can improve shark spatial conservation by affecting policy directly.
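The network-analysis step described here can be illustrated with a toy sketch (not the authors' code): acoustic receivers become nodes, and consecutive detections of the same animal at two stations become weighted, directed edges. All station names and detections below are hypothetical:

```python
# Illustrative sketch of building a movement network from acoustic telemetry.
# Nodes are receiver stations; an edge (a, b) counts how often an animal was
# detected at station a and next at station b. Toy data only.
from collections import Counter, defaultdict

detections = [  # (animal_id, timestamp, station) -- hypothetical
    ("shark1", 1, "R1"), ("shark1", 2, "R2"), ("shark1", 3, "R2"),
    ("shark1", 4, "R3"), ("turtle1", 1, "R2"), ("turtle1", 5, "R1"),
]

def movement_edges(detections):
    by_animal = defaultdict(list)
    for animal, t, station in sorted(detections, key=lambda d: (d[0], d[1])):
        by_animal[animal].append(station)
    edges = Counter()
    for stations in by_animal.values():
        for a, b in zip(stations, stations[1:]):
            if a != b:                 # skip repeat detections at one station
                edges[(a, b)] += 1
    return edges

print(movement_edges(detections))
```

Standard graph metrics (e.g., node strength as a proxy for station use) can then be computed on the resulting edge list to compare space use across species.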
Abstract:
Network security monitoring remains a challenge. As global networks scale up in terms of traffic volume and speed, effective attribution of cyber attacks is increasingly difficult. The problem is compounded by a combination of other factors, including the architecture of the Internet, multi-stage attacks and increasing volumes of non-productive traffic. This paper proposes to shift the focus of security monitoring from the source to the target. Simply put, resources devoted to detection and attribution should be redeployed to monitor efficiently for targeting and to prevent attacks. The effort of detection should aim to determine whether a node is under attack and, if so, to prevent the attack effectively. This paper contributes by systematically reviewing the structural, operational and legal reasons underlying this argument, and presents empirical evidence to support a shift away from attribution in favour of a target-centric monitoring approach. A carefully designed set of experiments is presented and a detailed analysis of the results is provided.
Abstract:
Safety on public transport is a major concern for the relevant authorities. We address this issue by proposing an automated surveillance platform which combines data from video, infrared and pressure sensors. Data homogenisation and integration are achieved by a distributed architecture based on communication middleware that resolves interconnection issues, thereby enabling data modelling. A common-sense knowledge base models and encodes knowledge about public-transport platforms and the actions and activities of passengers. Trajectory data from passengers are modelled as a time series of human activities. Common-sense knowledge and rules are then applied to detect inconsistencies or errors in the data interpretation. Lastly, the rationality that characterises human behaviour is also captured here through a bottom-up Hierarchical Task Network planner that, along with common sense, corrects misinterpretations to explain passenger behaviour. The system is validated using a simulated bus saloon scenario as a case study. Eighteen video sequences were recorded with up to six passengers. Four metrics were used to evaluate performance. The system, with an accuracy greater than 90% for each of the four metrics, was found to outperform a rule-based system and a system containing planning alone.
Abstract:
Data mining can be defined as the extraction of implicit, previously unknown, and potentially useful information from data. Numerous researchers have been developing security technology and exploring new methods to detect cyber-attacks using the DARPA 1998 dataset for intrusion detection and its modified versions, KDDCup99 and NSL-KDD, but until now no one has examined the performance of the Top 10 data mining algorithms selected by experts in data mining. The classification learning algorithms compared in this thesis are C4.5, CART, k-NN and Naïve Bayes. Their performance is compared in terms of accuracy, error rate and average cost on modified versions of the NSL-KDD train and test datasets, where the instances are classified into normal and four cyber-attack categories: DoS, Probing, R2L and U2R. Additionally, the most important features for detecting cyber-attacks, overall and per category, are evaluated with Weka's Attribute Evaluator and ranked according to Information Gain. The results show that the classification algorithm with the best performance on the dataset is the k-NN algorithm. The most important features for detecting cyber-attacks are basic features such as the duration in seconds of a network connection, the protocol used for the connection, the network service used, the normal or error status of the connection, and the number of data bytes sent. For DoS, Probing and R2L attacks, the most important features are basic features and the least important are content features; for U2R attacks, by contrast, content features are the most important.
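The Information Gain ranking described above can be sketched as follows; the connection records are hypothetical stand-ins for NSL-KDD instances, and the function mirrors the idea behind Weka's InfoGainAttributeEval rather than its implementation:

```python
# Hedged sketch: Information Gain of a categorical feature with respect to
# the class label, i.e. IG(F) = H(class) - H(class | F).
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(records, feature, label="class"):
    base = entropy([r[label] for r in records])
    n = len(records)
    cond = 0.0
    for value in {r[feature] for r in records}:           # H(class | F)
        subset = [r[label] for r in records if r[feature] == value]
        cond += len(subset) / n * entropy(subset)
    return base - cond

data = [  # hypothetical connection records, not real NSL-KDD rows
    {"protocol": "tcp", "service": "http", "class": "normal"},
    {"protocol": "tcp", "service": "http", "class": "normal"},
    {"protocol": "icmp", "service": "ecr_i", "class": "dos"},
    {"protocol": "icmp", "service": "ecr_i", "class": "dos"},
]
print(information_gain(data, "protocol"))  # 1.0: protocol fully separates the classes
```

Features are then sorted by this score, highest gain first, to obtain the ranking.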
Abstract:
A cluster fosters stronger commercial relationships between the companies that comprise it, encouraging them to adopt competitive structures that allow them to solve problems they could hardly solve alone (Lubeck et al., 2011). This paper aims to describe the coopetition between companies operating in a planned commercial cluster, from the point of view of retailers, based on the theoretical models proposed by Bengtsson and Kock (1999) and Leon (2005) and operationalised by means of Social Network Analysis (SNA). Data collection consisted of two phases: the first, exploratory, served to identify the actors; the second was descriptive, as it aimed to describe the coopetition among the enterprises. As a result, we identified companies that cooperate and compete simultaneously (coopetition), firms that only compete, companies that only cooperate, and businesses that neither compete nor cooperate (coexistence).
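The four-way classification described in the abstract can be sketched as a simple function over two tie sets; firm names and ties below are hypothetical:

```python
# Illustrative sketch (not the authors' code): each pair of firms is placed
# into one of four relations depending on whether a cooperation tie and/or a
# competition tie exists between them in the network data.

def classify(pairs_cooperate, pairs_compete, firms):
    out = {}
    for i, a in enumerate(firms):
        for b in firms[i + 1:]:
            coop = {a, b} in pairs_cooperate
            comp = {a, b} in pairs_compete
            if coop and comp:
                out[(a, b)] = "coopetition"
            elif comp:
                out[(a, b)] = "competition"
            elif coop:
                out[(a, b)] = "cooperation"
            else:
                out[(a, b)] = "coexistence"
    return out

firms = ["A", "B", "C"]
coop = [{"A", "B"}, {"B", "C"}]   # hypothetical cooperation ties
comp = [{"A", "B"}, {"A", "C"}]   # hypothetical competition ties
print(classify(coop, comp, firms))
```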
Abstract:
The World Bank proposes good governance as the strategy for correcting the ills of bad governance and facilitating development in developing countries (Carayannis, Pirzadeh & Popescu, 2012; Hilyard & Wilks, 1998; Leftwich, 1993; World Bank, 1989). In this perspective, institutional reform and a more inclusive public-policy arena are two critical strategies for establishing good governance, according to the Bank and the other Bretton Woods institutions. The problem is that many of these developing countries lack the institutional architecture these new measures presuppose. This thesis studies and explains how a developing state, the Commonwealth of Dominica, embarked on a bill aimed at integrity in the public service. This law, the Integrity in Public Office Act (IPO), was passed in 2003 and implemented in 2008. The thesis analyses the power relations among the dominant actors surrounding the evolution of the law and accordingly employs a combination of social network analysis and qualitative research to answer the main question: why did the state develop and implement the current design of the IPO (2003)? This question is all the more significant when we consider that, contrary to the existing research on the subject, Dominica's IPO diverges considerably in structure from the ideal-type IPO. We argue that "rational" actors, aware of their structural position in a network of actors, used their power resources to shape the institution so that it would serve their interests and those of their allies.
We further hypothesise, first, that the choice of a specialised anti-corruption agency and the subsequent design of that institution reflect the preferences of the dominant actors who took part in its creation and, second (our rival hypothesis), that the features of alternative models of public-integrity institutions reflect those of non-dominant actors. Our results are mixed. The power play was confined to a small group of dominant actors who sought to use the creation of the law to secure their legitimacy and political survival. Unsurprisingly, no actor advanced an alternative model. We therefore conclude that the law is the outcome of a partisan power play. This research responds to the scarcity of research on the design of public-integrity institutions, which seems largely to favour an organisational and structural bias. Moreover, by approaching the subject through power relations, the thesis brings conceptual, methodological and analytical rigour to the discourse on the creation of these institutions by studying their genesis from both agential and structural perspectives. In addition, the results strengthen our capacity to predict when, and with what intensity, an actor will deploy its power resources.
Abstract:
In 2013 the European Commission launched its new green infrastructure strategy in another attempt to stop, and possibly reverse, the loss of biodiversity by 2020 by connecting habitats in the wider landscape. This means that conservation would go beyond current practices to include landscapes dominated by conventional agriculture, where biodiversity conservation plays a minor role at best. The green infrastructure strategy aims at bottom-up rather than top-down implementation, and suggests including local and regional stakeholders. It is therefore important to know which stakeholders influence land-use decisions concerning green infrastructure at the local and regional level. The research presented in this paper served to select stakeholders in preparation for a participatory scenario development process to analyze the consequences of different options for implementing the European green infrastructure strategy. We used a mix of qualitative and quantitative social network analysis (SNA) methods to combine actors' attributes, especially their perceived influence, with structural and relational measures. Further, our analysis provides information on institutional backgrounds and governance settings for green infrastructure and agricultural policy. The investigation started with key informant interviews at the regional level, in administrative units responsible for relevant policies and procedures (e.g., regional planners and representatives of federal ministries), and continued at the local level with farmers and other members of the community. The analysis revealed the importance of information flows and regulations, but also of social pressure, all of which considerably influence biodiversity governance with respect to green infrastructure.
Abstract:
The Internet of Things (IoT) is still in its infancy and has attracted much interest in many industrial sectors, including medical fields, logistics tracking, smart cities and automobiles. As a paradigm, however, it is susceptible to a range of significant intrusion threats. This paper presents a threat analysis of the IoT and uses an Artificial Neural Network (ANN) to combat these threats. A multi-layer perceptron, a type of supervised ANN, is trained using internet packet traces and is then assessed on its ability to thwart Distributed Denial of Service (DDoS/DoS) attacks. The paper focuses on the classification of normal and threat patterns on an IoT network. The ANN procedure is validated against a simulated IoT network. The experimental results demonstrate 99.4% accuracy in successfully detecting various DDoS/DoS attacks.
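As a drastically simplified sketch of the approach (not the paper's model), a one-hidden-layer perceptron can be trained on two toy traffic features to separate normal from DoS-like patterns; real work would use packet traces and a proper ML library:

```python
# Hypothetical sketch: tiny one-hidden-layer perceptron trained by plain
# backpropagation on made-up (packet_rate, payload_size) features.
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# toy samples: (packet_rate, payload_size) -> label (1 = DoS-like)
data = [((0.1, 0.8), 0), ((0.2, 0.7), 0), ((0.9, 0.1), 1), ((0.8, 0.2), 1)]

H = 3                                   # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    return h, sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)

lr = 1.0
for _ in range(2000):                   # stochastic gradient descent on MSE
    for x, y in data:
        h, out = forward(x)
        d_out = (out - y) * out * (1 - out)
        for j in range(H):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])   # use pre-update w2
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

print([round(forward(x)[1]) for x, _ in data])  # predicted labels
```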
Abstract:
Travel demand models are important tools in the analysis of transportation plans, projects, and policies. The modeling results are useful for transportation planners making transportation decisions and for policy makers developing transportation policies. Defining the level of detail (i.e., the number of roads) of the transport network consistently with the travel demand model's zone system is crucial to the accuracy of modeling results. However, travel demand modelers have not had tools to determine how much detail is needed in a transport network for a travel demand model. This dissertation seeks to fill this knowledge gap by (1) providing a methodology to define an appropriate level of detail for the transport network in a given travel demand model; (2) implementing this methodology in a travel demand model for the Baltimore area; and (3) identifying how this methodology improves modeling accuracy. All analyses show that the spatial resolution of the transport network has a great impact on the modeling results. For example, when compared to observed traffic data, a very detailed network underestimates traffic congestion in the Baltimore area, while the network developed in this dissertation models traffic conditions more accurately. An evaluation of the impacts a new transportation project has on both networks shows differences in the analysis results that underline the importance of an appropriate level of network detail for making improved planning decisions. The results corroborate a suggested guideline for developing a transport network consistent with the travel demand model's zone system. To conclude the dissertation, limitations in the data sources and methodology are identified, on the basis of which a plan for future studies is laid out.
Abstract:
I examine determinants of refugee return after conflicts. I argue that institutional constraints placed on the executive provide a credible commitment signalling to refugees that the conditions required for durable return will be created, which increases return flows. Further, when credible commitments are stronger in the country of origin than in the country of asylum, the level of return increases. Finally, I find that specific commitments made to refugees in the peace agreement do not lead to increased return, because they are not credible without institutional constraints. Using recently released data on returnees, along with network analysis and an original coding of the provisions in refugee agreements, I find statistical support for this theory. An examination of cases in Djibouti, Sierra Leone, and Liberia provides additional support for the argument.
Abstract:
A large gap still exists between the clinical and genetic diagnosis of dyslipidemic disorders: almost 60% of patients with a clinical diagnosis of familial hypercholesterolemia (FH) still lack a genetic diagnosis. Here we present the preliminary results of an integrative approach intended to identify new candidate genes and to dissect pathways that may be dysregulated in the disease. Interesting hits will subsequently be knocked down in vitro to evaluate their functional role in the uptake of fluorescently labelled LDL and free cell cholesterol using automated microscopy.