934 results for Research networks


Relevance: 30.00%

Publisher:

Abstract:

As introduced by Bentley et al. (2005), artificial immune systems (AIS) lack tissue, which is present in one form or another in all living multi-cellular organisms. Some have argued that, in the context of AIS, this concept brings little novelty to the already saturated field of immune-inspired computational research. This article aims to show that such a component of an AIS has the potential to give a data-processing algorithm an advantage in terms of data pre-processing, clustering and extraction of the features desired by the immune-inspired system. The proposed tissue algorithm is based on self-organizing networks, such as the self-organizing maps (SOM) developed by Kohonen (1996), and on an analogy of the so-called Toll-Like Receptors (TLR) affecting the activation function of the clusters developed by the SOM.
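
For readers less familiar with the self-organizing component that the tissue algorithm builds on, the following minimal sketch shows a generic one-dimensional SOM update written in Python. It is an illustration under stated assumptions only: the toy data, grid size and learning schedule are made up, and the tissue/TLR layer described in the article is not reproduced here.

import numpy as np

# Generic 1-D SOM sketch: units on a line are pulled towards presented samples,
# with a learning rate and neighbourhood width that decay over time.
rng = np.random.default_rng(0)
data = rng.random((500, 2))        # toy 2-D input data (assumed)
units = rng.random((10, 2))        # 10 SOM units arranged on a 1-D grid

for t in range(2000):
    x = data[rng.integers(len(data))]                      # pick a random sample
    bmu = np.argmin(np.linalg.norm(units - x, axis=1))     # best-matching unit
    lr = 0.5 * (1 - t / 2000)                              # decaying learning rate
    sigma = max(2.0 * (1 - t / 2000), 0.5)                 # shrinking neighbourhood
    grid_dist = np.abs(np.arange(10) - bmu)                # distance on the unit grid
    h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))       # neighbourhood weights
    units += lr * h[:, None] * (x - units)                 # pull units towards x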

Relevance: 30.00%

Publisher:

Abstract:

The next generation of vehicles will be equipped with automated Accident Warning Systems (AWSs) capable of warning neighbouring vehicles about hazards that might lead to accidents. The key enabling technology for these systems is the Vehicular Ad-hoc Network (VANET), but the dynamics of such networks make the crucial timely delivery of warning messages challenging. While most previously attempted implementations have used broadcast-based data dissemination schemes, these do not cope well as data traffic load or network density increases. This thesis addresses the problem of sending warning messages in a timely manner by employing a network coding technique. The proposed NETwork COded DissEmination (NETCODE) scheme is a VANET-based AWS responsible for generating and sending warnings to the vehicles on the road. NETCODE offers an XOR-based data dissemination scheme that sends multiple warnings in a single transmission and therefore reduces the total number of transmissions required to send the same number of warnings that broadcast schemes send. Hence, it reduces contention and collisions in the network, improving the delivery time of the warnings. The first part of this research (Chapters 3 and 4) asserts that, in order to build a warning system, it is necessary to ascertain the system requirements, the information to be exchanged, and the protocols best suited for communication between vehicles. Therefore, a study of these factors is carried out, along with a review of existing proposals identifying their strengths and weaknesses. An analysis of existing broadcast-based warning schemes is then conducted, which concludes that although broadcasting is the most straightforward approach, increasing load can cause an effective collapse, resulting in unacceptably long transmission delays. The second part of this research (Chapter 5) proposes the NETCODE design, including the main contribution of this thesis: a pair of encoding and decoding algorithms that make use of an XOR-based technique to reduce transmission overheads and thus allow warnings to be delivered in time. The final part of this research (Chapters 6--8) evaluates how well the proposed scheme reduces the number of transmissions in the network as data traffic load and network density grow, and investigates its capacity to detect potential accidents. The evaluations use a custom-built simulator to model real-world scenarios such as city areas, junctions, roundabouts and motorways. The study shows that the reduction in the number of transmissions significantly reduces contention in the network, which allows vehicles to deliver warning messages more rapidly to their neighbours. It also examines the relative performance of NETCODE when handling both sudden event-driven and longer-term periodic messages in diverse scenarios under stress caused by increasing numbers of vehicles and transmissions per vehicle. This work confirms the thesis' primary contention that XOR-based network coding provides a potential foundation on which a more efficient AWS data dissemination scheme can be built.
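
As a hedged illustration of the XOR idea such a scheme relies on (a generic toy example, not the actual NETCODE encoding and decoding algorithms), two equal-length warning payloads can be combined into one coded packet, and a neighbour that already holds either original can recover the other, so one transmission serves two receivers:

def xor_encode(w1: bytes, w2: bytes) -> bytes:
    # pad warnings to a common length before coding (assumed done by the caller)
    assert len(w1) == len(w2)
    return bytes(a ^ b for a, b in zip(w1, w2))

def xor_decode(coded: bytes, known: bytes) -> bytes:
    # XOR is its own inverse, so decoding reuses the encoder
    return xor_encode(coded, known)

coded = xor_encode(b"BRAKE_AHEAD_", b"ICY_ROAD_KM7")          # one coded broadcast
assert xor_decode(coded, b"BRAKE_AHEAD_") == b"ICY_ROAD_KM7"  # receiver holding warning 1
assert xor_decode(coded, b"ICY_ROAD_KM7") == b"BRAKE_AHEAD_"  # receiver holding warning 2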

Relevance: 30.00%

Publisher:

Abstract:

The focus of this research is to explore the applications of the finite-difference formulation based on the latency insertion method (LIM) to the analysis of circuit interconnects. Special attention is devoted to the issues that arise in very large networks, such as on-chip signal and power distribution networks. We demonstrate that the LIM has the power and flexibility to handle the various types of analysis required at different stages of circuit design. The LIM is particularly suitable for simulations of very large-scale linear networks and can significantly outperform conventional circuit solvers (such as SPICE).
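
To make the leapfrog character of the latency insertion method concrete, here is a minimal Python sketch for a uniform RLC ladder; the element values, topology, source and time step are assumptions for illustration and are not taken from the paper. Node voltages and branch currents are advanced alternately, so no matrix factorization is required, which is what makes this style of update attractive for very large networks.

import numpy as np

n, steps = 60, 3000
R, L = 0.05, 1e-9            # series resistance/inductance per branch (assumed)
C, G = 1e-12, 1e-6           # shunt capacitance/conductance per node (assumed)
dt = 0.2 * np.sqrt(L * C)    # time step kept below the LC stability limit
drive = 1.0                  # unit-step current injected at node 0 (assumed source)

v = np.zeros(n)              # node voltages at integer time steps
i = np.zeros(n - 1)          # branch currents at half time steps (staggered)

for _ in range(steps):
    # branch update: currents respond to the voltage drop across each R-L branch
    i += (dt / L) * (v[:-1] - v[1:] - R * i)
    # node update: voltages respond to the net current entering each node
    net = np.empty(n)
    net[0] = drive - i[0]
    net[1:-1] = i[:-1] - i[1:]
    net[-1] = i[-1]
    v += (dt / C) * (net - G * v)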

Relevance: 30.00%

Publisher:

Abstract:

In this thesis, connections between messages on the public wall of the Russian social network Vkontakte are analysed and classified. A total of 1818 messages from three different Vkontakte groups were collected and analysed according to a new framework based on Halliday and Hasan's (1976) research into cohesion and Simmons's (1981) adaptation of their classification for Russian. The two categories of textuality, cohesion and coherence, describe the linguistic connections between messages. The main aim was to find out how far the traditional categories of cohesion are applicable to an online social network that includes written text as well as multimedia files. In addition to linguistic cohesion, the pragmatic and topic coherence between Vkontakte messages were also analysed. The analysis of pragmatic coherence classifies the messages into acts according to their pragmatic function in relation to the surrounding messages. The analysis of topic coherence examines the content of the messages, describing where a topic begins, changes or is abandoned. Linguistic cohesion, topic coherence and pragmatic coherence constitute three different types of connections between messages, and together these form coherent communication on the message wall. The cohesion devices identified by Halliday and Hasan and by Simmons were found to occur in these texts, but additional devices were also identified: multimodal, graphical and grammatical cohesion.
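
The three analytical layers described above can be pictured as one annotation record per connection between messages. The Python sketch below is purely illustrative: the field names and example values are assumptions, not the thesis' actual coding scheme.

from dataclasses import dataclass

@dataclass
class MessageLink:
    source_id: int        # earlier wall message that the connection points back to
    target_id: int        # later message carrying the connecting element
    cohesion_device: str  # linguistic cohesion device, e.g. multimodal, graphical, grammatical
    pragmatic_act: str    # pragmatic function relative to surrounding messages (hypothetical label)
    topic_status: str     # whether the topic begins, continues, changes or is abandoned

link = MessageLink(source_id=17, target_id=18,
                   cohesion_device="multimodal",
                   pragmatic_act="answer",
                   topic_status="continues")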

Relevance: 30.00%

Publisher:

Abstract:

This thesis presents security issues and vulnerabilities in home and small-office local area networks that can be exploited in cyber-attacks. Previous research exists on individual vulnerabilities and attack vectors, but few papers present full-scale attack examples against a LAN. First, this thesis categorizes different security threats; methods to launch the attacks are then demonstrated by example. Offensive security and penetration testing are used as the research methods in this thesis. As a result, an attack is conducted using vulnerabilities in WLAN, the ARP protocol and the browser, as well as methods of social engineering. In the end, reverse shell access is gained to the target machine. Ready-made tools are used in the attack and their inner workings are described. Prevention methods against these attacks are presented at the end of the thesis.

Relevance: 30.00%

Publisher:

Abstract:

Virtual Screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. Most VS methods assume a unique binding site for the target, but it has been demonstrated that diverse ligands interact with unrelated parts of the target, and many VS methods do not take this relevant fact into account. This problem is circumvented by a novel VS methodology named BINDSURF, which scans the whole protein surface to find new hotspots where ligands might potentially interact, and which is implemented on massively parallel Graphics Processing Units, allowing fast processing of large ligand databases. BINDSURF can thus be used in drug discovery, drug design and drug repurposing, and therefore helps considerably in clinical research. However, the accuracy of most VS methods is constrained by limitations in the scoring function that describes biomolecular interactions, and even today these uncertainties are not completely understood. In order to address this problem, we propose a novel approach in which neural networks are trained on databases of known active compounds (drugs) and inactive compounds, and are later used to improve VS predictions.
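
A minimal sketch of the final idea, under stated assumptions: compounds are represented as fixed-length fingerprint vectors, a small neural network is trained on known actives and inactives, and its output probabilities are then used to re-rank candidate ligands. The data, feature length and network size below are made up, and this is not the authors' actual model.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.random((400, 128))          # toy fingerprint vectors for known compounds
y = rng.integers(0, 2, 400)         # 1 = known active (drug), 0 = known inactive

# small feed-forward network trained to separate actives from inactives
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=1)
clf.fit(X, y)

candidates = rng.random((10, 128))              # fingerprints of docked candidates
scores = clf.predict_proba(candidates)[:, 1]    # predicted probability of activity
ranking = np.argsort(-scores)                   # re-rank the VS hits by this score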

Relevance: 30.00%

Publisher:

Abstract:

During the last two decades there have been only a handful of recorded cases of electoral fraud in Latin America. However, survey research consistently shows that citizens often do not trust the integrity of the electoral process. This dissertation addresses this puzzle by explaining the mismatch between how elections are conducted and how the process is perceived. My theoretical contribution is a two-fold argument. First, voters' trust in their community members ("the local experience") affects their level of confidence in the electoral process. Since voters often find their peers working at polling stations, negative opinions about them translate into negative opinions about the election. Second, perceptions of unfairness of the system ("the global effect") negatively affect the way people perceive the transparency of the electoral process. When the political system fails to account for social injustice, citizens lose faith in the mechanism designed to elect representatives, and ultimately a set of policies. The fact that certain groups are systematically disregarded by the system triggers the notion that the electoral process is flawed. This is motivated by either egotropic or sociotropic considerations. To test these hypotheses, I employ a survey conducted in Costa Rica, El Salvador, Honduras and Guatemala during May and June 2014, which includes a population-based experiment. I show that voters who trust their peers consistently have higher confidence in the electoral process, whereas respondents who were primed about social unfairness (the treatment) expressed less confidence in the quality of the election. I also find that the local experience predominates over the global effect: the treatment has a statistically significant effect only for respondents who trust their community. Attribution of responsibility for voters who are skeptical of their peers is clear and simple, leaving no room for a more diffuse mechanism, the unfairness of the political system. Finally, I extend the analysis to the Latin American region. Using data from LAPOP comprising four waves of surveys in 22 countries, I confirm the influence of the "local experience" and the "global effect" as determinants of the level of confidence in the electoral process.

Relevance: 30.00%

Publisher:

Abstract:

Part 15: Performance Management Frameworks

Relevance: 30.00%

Publisher:

Abstract:

The TOMO-ETNA experiment was devised to image the crust underlying the volcanic edifice and, possibly, its plumbing system by using passive and active refraction/reflection seismic methods. The experiment included activities both on-land and offshore, with the main objective of obtaining a new high-resolution seismic tomography to improve knowledge of the crustal structures beneath Etna volcano and northeast Sicily up to the Aeolian Islands. The TOMO-ETNA experiment was divided into two phases. The first phase started on June 15, 2014 and ended on July 24, 2014, with the withdrawal of two removable seismic networks (a short-period network and a broadband network composed of 80 and 20 stations, respectively) deployed at Etna volcano and surrounding areas. During this first phase the oceanographic research vessel “Sarmiento de Gamboa” and the hydro-oceanographic vessel “Galatea” performed the offshore activities, which included the deployment of ocean bottom seismometers (OBS), air-gun shooting for wide-angle seismic refraction (WAS), multi-channel seismic (MCS) reflection surveys, magnetic surveys and ROV (remotely operated vehicle) dives. This phase finished with the recovery of the short-period seismic network. In the second phase the broadband seismic network remained operative until October 28, 2014, and the R/V “Aegaeo” performed additional MCS surveys during November 19-27, 2014. Overall, the information deriving from the TOMO-ETNA experiment could help resolve many uncertainties that have arisen while exploiting the large amount of data provided by the cutting-edge monitoring systems of Etna volcano and the seismogenic area of eastern Sicily.

Relevance: 30.00%

Publisher:

Abstract:

In this dissertation, we apply mathematical programming techniques (i.e., integer programming and polyhedral combinatorics) to develop exact approaches for influence maximization on social networks. We study four combinatorial optimization problems that deal with maximizing influence at minimum cost over a social network. To our knowledge, all previous work to date on influence maximization problems has focused on heuristics and approximation. We start with the following viral marketing problem, which has attracted a significant amount of interest in the computer science literature. Given a social network, find a target set of customers to seed with a product. A cascade is then triggered by these initial adopters, and other people start to adopt the product due to the influence they receive from earlier adopters. The idea is to find the minimum cost that results in the entire network adopting the product. We first study a problem called the Weighted Target Set Selection (WTSS) problem. In the WTSS problem, the diffusion can take place over as many time periods as needed and a free product is given to the individuals in the target set. Restricting the diffusion to a single time period, we obtain a problem called the Positive Influence Dominating Set (PIDS) problem. Next, incorporating partial incentives, we consider a problem called the Least Cost Influence Problem (LCIP). The fourth problem studied is the One Time Period Least Cost Influence Problem (1TPLCIP), which is identical to the LCIP except that the diffusion is restricted to a single time period. We apply a common research paradigm to each of these four problems. First, we work on special graphs: trees and cycles. Based on the insights obtained from special graphs, we develop efficient methods for general graphs. On trees, we first propose a polynomial-time algorithm. More importantly, we present a tight and compact extended formulation. We also project the extended formulation onto the space of the natural variables, which gives the polytope on trees. Next, building upon the result for trees, we derive the polytope on cycles for the WTSS problem, as well as a polynomial-time algorithm on cycles. This leads to our contribution on general graphs. For the WTSS problem and the LCIP, using the observation that the influence propagation network must be a directed acyclic graph (DAG), the strong formulation for trees can be embedded into a formulation on general graphs. We use this to design and implement a branch-and-cut approach for the WTSS problem and the LCIP. In our computational study, we are able to obtain high-quality solutions for random graph instances with up to 10,000 nodes and 20,000 edges (40,000 arcs) within a reasonable amount of time.
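
As a small illustration of the one-time-period variant mentioned above, the following PuLP sketch states a PIDS-style covering model in which every node must either be seeded or have at least half of its neighbours seeded. The majority threshold, unit costs and solver choice are assumptions for illustration; this is not one of the formulations developed in the dissertation.

import math
import pulp

def pids_sketch(adj, cost):
    # adj: node -> list of neighbours; cost: node -> cost of seeding that node
    prob = pulp.LpProblem("pids_sketch", pulp.LpMinimize)
    x = {v: pulp.LpVariable(f"x_{v}", cat="Binary") for v in adj}
    prob += pulp.lpSum(cost[v] * x[v] for v in adj)      # minimise total seeding cost
    for v, nbrs in adj.items():
        k = math.ceil(len(nbrs) / 2)                     # assumed majority threshold
        # either v is seeded (which satisfies the constraint on its own)
        # or at least k of its neighbours are seeded
        prob += pulp.lpSum(x[u] for u in nbrs) + k * x[v] >= k
    return prob, x

adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}       # toy 4-cycle
prob, x = pids_sketch(adj, {v: 1 for v in adj})
prob.solve(pulp.PULP_CBC_CMD(msg=False))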

Relevance: 30.00%

Publisher:

Abstract:

Part 9: Innovation Networks

Relevance: 30.00%

Publisher:

Abstract:

The rolling stock circulation depends on two different problems, the rolling stock assignment problem and the train routing problem, which up to now have been solved sequentially. We propose a new approach to obtain better and more robust circulations of the rolling stock train units, solving the rolling stock assignment while accounting for the train routing problem. Here, robustness means that difficult shunting operations are selectively penalized, and that propagated delays, together with the need for human resources, are minimized. This new integrated approach yields a very large model, which we solve using Benders decomposition, where the main decision is the rolling stock assignment and the train routing is handled at the second level. For computational reasons we propose a heuristic based on Benders decomposition. Computational experiments show how the current solution operated by RENFE (the main Spanish train operator) can be improved: more robust and efficient solutions are obtained.
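
To make the decomposition structure concrete, the following is a generic textbook Benders layout written in LaTeX, an illustration only and not the exact model of the paper, where x stands for the rolling stock assignment decisions kept in the master problem and y for the train routing decisions handled in the subproblem:

\begin{aligned}
\text{Master: } & \min_{x \in X,\ \theta}\; c^{\top}x + \theta
  \quad \text{s.t. } \theta \ge \pi_k^{\top}\,(b - Ax) \ \text{ for each stored dual solution } \pi_k,\\
\text{Subproblem (fixed } \bar{x}\text{): } & \min_{y \ge 0}\; d^{\top}y
  \quad \text{s.t. } By \ge b - A\bar{x}.
\end{aligned}

Each subproblem solve either returns a dual solution that adds a new optimality cut on \theta or, if the subproblem is infeasible, a feasibility cut on x; the loop stops when the master's \theta matches the subproblem value.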

Relevance: 30.00%

Publisher:

Abstract:

Part 5: Service Orientation in Collaborative Networks

Relevance: 30.00%

Publisher:

Abstract:

The use of virtual social networks (VSNs) has become prevalent among consumers worldwide. Numerous studies have investigated various aspects of VSNs; however, these studies have mainly focused on students and young adults, as they were the early adopters of these innovative networks. A search of the literature revealed a paucity of research on adult consumers' use of VSNs. This study addressed that gap by examining the determinants of engagement in VSNs among adult consumers in Singapore. The objectives of this study are to empirically investigate the determinants of engagement in VSNs and to offer theoretical insights into consumers' preference for and usage of VSNs. This study drew upon several theories developed in the discipline of technology and innovation adoption: Rogers' Diffusion of Innovation, the Theory of Reasoned Action (TRA), the Theory of Planned Behavior (TPB), the Technology Acceptance Model (TAM), the Conceptual Framework of Individual Innovation Adoption by Frambach and Schillewaert (2002), the Enhanced Model of Innovation Adoption by Talukder (2011), the Extended Unified Theory of Acceptance and Use of Technology (UTAUT2) and the Information Systems (IS) Success Model. The proposed research model, named the Media Usage Model (MUM), is a framework rooted in innovation diffusion and IS theories. The MUM distills the essence of these established models and thus provides an updated, lucid explanation of engagement in VSNs. A cross-sectional online social survey was conducted to collect quantitative data with which to examine the validity of the proposed research model. Multivariate data analysis was carried out on a data set comprising 806 usable responses using SPSS, with AMOS and SmartPLS used for structural equation modeling. The results indicate that consumer attitude towards VSNs is significantly and positively influenced by three individual factors (hedonic motivation, incentives and experience), two system characteristics (system quality and information quality) and one social factor (social bonding). Consumer demographics were found to influence people's attitudes towards VSNs. In addition, consumer experience and attitude towards VSNs significantly and positively influence their usage of VSNs. The empirical data supported the proposed research model, explaining 80% of the variance in attitude towards VSNs and 45% of the variance in usage of VSNs. The MUM therefore makes a definite contribution to theoretical knowledge of consumer engagement in VSNs by deepening and broadening our understanding of the intricacies related to the use of VSNs in Singapore. The findings have implications for customer service management, services marketing and consumer behavior, as well as strategic implications for maximizing the efficient utilization and effective management of VSNs by businesses and operators. The contributions of this research are, firstly, shifting the boundaries of technology and innovation adoption theories from research on employees to consumers, and the boundaries of Internet usage and adoption research from students to adults (empirical generalization); secondly, highlighting the issues associated with the lack of significance of social factors in adoption research; and thirdly, augmenting information systems research by integrating important antecedents of success in information systems.