999 results for Queueing Networks
Abstract:
A remarkable growth in the number and popularity of online social networks has been observed in recent years. A good number of online social networks now have over 100 million registered users, and many of these popular networks offer automated recommendations to their users. These automated recommendations are normally generated by collaborative filtering systems based on the past ratings or opinions of similar users. Alternatively, trust among users in the network can also be used to find neighbors when making recommendations. For this to yield optimal results, there must be a positive correlation between trust and interest similarity. Although a positive relationship between trust and interest similarity is assumed and adopted by many researchers, no survey of real people's opinions supporting this hypothesis has been reported. In this paper, we review state-of-the-art research on trust in online social networks and present the results of a survey on the relationship between trust and interest similarity. Our results support the hypothesized positive relationship between users' trust and interest similarity.
Abstract:
Recommender systems are a recent invention for dealing with ever-growing information overload, and collaborative filtering is the most popular technique among them. With sufficient background information on item ratings, its performance is promising; however, research shows that it performs very poorly in cold-start situations where previous rating data is sparse. As an alternative, trust can be used for neighbor formation to generate automated recommendations. User-assigned explicit trust ratings, indicating how much users trust each other, are typically used for this purpose; however, reliable explicit trust data is not always available. In this paper we propose a new method of developing trust networks based on users' interest similarity in the absence of explicit trust data. To identify interest similarity, we use users' personalized tagging information. The resulting trust network can then be used to find neighbors for making automated recommendations. Our experimental results show that the proposed trust-based method outperforms the traditional collaborative filtering approach that relies on users' rating data, and its performance improves even further when trust propagation techniques are used to broaden the neighborhood.
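The abstract does not give the exact similarity measure or linking rule, so the following is only a minimal sketch of the tag-based idea, assuming Jaccard similarity over each user's tag set and an illustrative threshold for creating a trust link; the user names, tags, and threshold are hypothetical.

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap of two tag sets, used here as a proxy for interest similarity."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def build_trust_network(user_tags, threshold=0.3):
    """Link two users when their tag-based similarity exceeds a threshold.

    user_tags: dict mapping user id -> set of tags that user has applied.
    Returns a dict mapping (u, v) pairs -> similarity score (the trust weight).
    """
    edges = {}
    for u, v in combinations(user_tags, 2):
        sim = jaccard(user_tags[u], user_tags[v])
        if sim >= threshold:
            edges[(u, v)] = sim
    return edges

# Illustrative usage with hypothetical users and tags
tags = {
    "alice": {"python", "ml", "graphs"},
    "bob":   {"python", "graphs", "databases"},
    "carol": {"cooking", "travel"},
}
print(build_trust_network(tags))  # alice-bob linked; carol remains isolated
```

A trust-propagation step, as mentioned in the abstract, could then extend such a network by inferring weights along short paths between users who share no tags directly.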
Abstract:
In recent years, there has been a dramatic growth in the number and popularity of online social networks. Many networks, such as Facebook, MySpace, QZone, and Windows Live Spaces, have more than 100 million registered users. People may connect, discover, and share by using these online social networks. The exponential growth of these online communities has drawn researchers' attention to the importance of managing trust in online environments. Users of online social networks may share, within the network, their experiences of and opinions about an item, which may be a product or a service. The user faces the problem of evaluating trust in a service or service provider before making a choice. Recommendations may be received through a chain of friends, so the user must be able to evaluate various kinds of trust opinions and recommendations. Such opinions and recommendations strongly influence whether other members of the community choose the item. Collaborative filtering is the most popular method in recommender systems. The task in collaborative filtering is to predict the utility of items to a particular user based on a database of ratings from a sample or population of other users. Because people have different subjective tastes, they rate items differently; if two people rate a set of items similarly, they are assumed to share similar tastes. In a recommender system, this information is used to recommend items that one participant likes to other people in the same cluster. However, collaborative filtering performs poorly when there are insufficient common ratings between users, a limitation commonly known as the cold-start problem. To overcome the cold-start problem, and alongside the dramatic growth of online social networks, trust-based approaches to recommendation have emerged. Such an approach assumes a trust network among users and makes recommendations based on the ratings of users who are directly or indirectly trusted by the target user.
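As an illustration of the prediction task described above, here is a minimal sketch of user-based collaborative filtering that predicts a rating as a Pearson-weighted average of other users' ratings; this is the common textbook formulation rather than any specific system from the works surveyed, and the rating data is hypothetical.

```python
import math

def pearson(ratings_u, ratings_v):
    """Pearson correlation over the items both users have rated."""
    common = set(ratings_u) & set(ratings_v)
    if len(common) < 2:
        return 0.0
    mu_u = sum(ratings_u[i] for i in common) / len(common)
    mu_v = sum(ratings_v[i] for i in common) / len(common)
    num = sum((ratings_u[i] - mu_u) * (ratings_v[i] - mu_v) for i in common)
    den_u = math.sqrt(sum((ratings_u[i] - mu_u) ** 2 for i in common))
    den_v = math.sqrt(sum((ratings_v[i] - mu_v) ** 2 for i in common))
    return num / (den_u * den_v) if den_u and den_v else 0.0

def predict(target, item, all_ratings):
    """Similarity-weighted average of other users' ratings for the item."""
    num = den = 0.0
    for other, ratings in all_ratings.items():
        if other == target or item not in ratings:
            continue
        w = pearson(all_ratings[target], ratings)
        if w > 0:
            num += w * ratings[item]
            den += w
    return num / den if den else None  # None when no usable neighbours exist

# Hypothetical rating data: predict u3's rating for item1
ratings = {
    "u1": {"item1": 5, "item2": 3, "item3": 4},
    "u2": {"item1": 4, "item2": 2, "item3": 5},
    "u3": {"item2": 1, "item3": 2},
}
print(predict("u3", "item1", ratings))  # 4.5
```

When the target user shares no co-rated items with anyone, `predict` returns None, which is exactly the cold-start situation that motivates the trust-based alternatives described above.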
Abstract:
This article applies social network analysis techniques to a case study of police corruption in order to produce findings that will assist in corruption prevention and investigation. Police corruption is commonly studied, but sophisticated analytical tools are rarely engaged to add rigour to the field. This article analyses the 'First Joke', a systemic and long-lasting corruption network in the Queensland Police Force, a state police agency in Australia. It uses data obtained from a commission of inquiry that exposed the network and develops hypotheses about the network's structure based on the existing literature on dark networks and criminal networks. These hypotheses are tested by entering the data into UCINET and analysing the outcomes through the social network analysis measures of average path distance, centrality, and density. The conclusions reached show that the network has characteristics not predicted by the literature.
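The study itself uses UCINET; purely to illustrate the three measures named (average path distance, centrality, density), the sketch below computes them with Python's networkx on a small made-up graph.

```python
import networkx as nx

# Toy undirected network standing in for the corruption-network data
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

density = nx.density(G)                        # fraction of possible ties present
avg_path = nx.average_shortest_path_length(G)  # average path distance (graph is connected)
degree_centrality = nx.degree_centrality(G)    # normalised degree per node
betweenness = nx.betweenness_centrality(G)     # brokerage positions between other nodes

print(density, avg_path)
print(degree_centrality)
print(betweenness)
```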
Abstract:
This paper describes the characterisation, for airborne use, of the public mobile data communication systems known broadly as 3G. The motivation for this study was to explore how these mature public communication systems could be used for aviation purposes. An experimental system was fitted to a light aircraft to record communication latency, line speed, RF level, packet loss, and cell tower identifier. Communications were established using internet protocols, and a connection was made to a local server. The aircraft was flown in both remote and populous areas at altitudes up to 8,500 ft in a region located in South East Queensland, Australia. Results show that average airborne RF levels are 21% better than those on the ground and of the order of -77 dBm. Latencies were of the order of 500 ms (half the latency of Iridium), with an average download speed of 0.48 Mb/s, an average uplink speed of 0.85 Mb/s, and a packet loss of 6.5%. The maximum communication range observed was 70 km from a single cell station. The paper also describes the possible limitations and utility of using such a communications architecture for both manned and unmanned aircraft systems.
Abstract:
Attempts to map online networks, representing relationships between people and sites, have covered sites including Facebook, Twitter, and blogs. However, the predominant approach of static network visualization, treating months of data as a single case rather than depicting changes over time or between topics, remains flawed. As different events and themes provoke varying interactions and conversations, it is proposed that case-by-case analysis would aid studies of online social networks by further examining the dynamics of links and information flows. This study uses hyperlink analysis of a population of French political blogs to compare connections between sites from January to August 2009. Themes discussed in this period were identified for subsequent analysis of topic-oriented networks. By comparing static blogrolls with topical citations within posts, this research addresses challenges and methods in mapping online networks, providing new information on temporal aspects of linking behaviors and information flows within these systems.
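To make the case-by-case comparison concrete, here is a minimal, illustrative sketch that builds one directed link network per topic from (source blog, target blog, topic) citation records and compares their sizes and densities, which is the kind of topic-oriented contrast the study describes. The record structure and blog names are hypothetical.

```python
from collections import defaultdict
import networkx as nx

def topic_networks(citations):
    """citations: iterable of (source_blog, target_blog, topic) tuples.
    Returns one directed graph per topic."""
    nets = defaultdict(nx.DiGraph)
    for src, dst, topic in citations:
        nets[topic].add_edge(src, dst)
    return nets

# Hypothetical citation records extracted from blog posts
records = [
    ("blogA", "blogB", "european elections"),
    ("blogB", "blogC", "european elections"),
    ("blogA", "blogC", "pension reform"),
]
for topic, g in topic_networks(records).items():
    print(topic, g.number_of_nodes(), g.number_of_edges(), nx.density(g))
```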
Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further research in biological fields. These biological sequences, which may be considered multiscale sequences, have specific features that require more refined methods to characterise. This project aims to study some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster unknown proteins and classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study unknown proteins from specific families and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis aims to explore the relationships between proteins based on a layered comparison of their components. Many methods are based on protein homology, because resemblance at the protein sequence level normally indicates similarity of functions and structures; however, some proteins may have similar functions despite low sequence identity. We consider protein sequences at a detailed level to investigate the problem of protein comparison. The comparison is based on the empirical mode decomposition (EMD), and protein sequences are analysed through their intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships between proteins. The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus in genomic analysis, researchers have tried for many years to understand the mechanisms of the yeast genome, yet how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes of the yeast Saccharomyces cerevisiae. The constructed transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
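The thesis's "new cross-correlation formula" is not given in the abstract; as a stand-in, the sketch below computes the standard zero-lag normalised cross-correlation between two equal-length numeric profiles (for example, intrinsic mode functions extracted from two protein property sequences). The input values are hypothetical.

```python
import numpy as np

def normalized_cross_correlation(x, y):
    """Zero-lag normalised cross-correlation between two equal-length signals,
    e.g. intrinsic mode functions derived from two protein property sequences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = x - x.mean()
    y = y - y.mean()
    denom = np.linalg.norm(x) * np.linalg.norm(y)
    return float(np.dot(x, y) / denom) if denom else 0.0

# Hypothetical numeric profiles (e.g. hydrophobicity along two sequences)
a = [0.1, 0.4, 0.3, 0.9, 0.2]
b = [0.2, 0.5, 0.2, 0.8, 0.1]
print(normalized_cross_correlation(a, b))  # value close to 1 indicates similar profiles
```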
Abstract:
Network-based Intrusion Detection Systems (NIDSs) monitor network traffic for signs of malicious activities that have the potential to disrupt entire network infrastructures and services. A NIDS can only operate when network traffic is available and can be extracted for analysis. However, with the growing use of encrypted networks such as Virtual Private Networks (VPNs), which encrypt and conceal network traffic, a traditional NIDS can no longer access network traffic for analysis. The goal of this research is to address this problem by proposing a detection framework that allows a commercial off-the-shelf NIDS to function normally in a VPN without any modification. One of the features of the proposed framework is that it does not compromise the confidentiality afforded by the VPN. Our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to securely route network traffic to the NIDS for analysis. The detection framework is effective against two general classes of attacks: attacks targeted at the network hosts and attacks targeted at the framework itself. We implement the detection framework as a prototype program and evaluate it. Our evaluation shows that the framework does indeed detect these classes of attacks and does not introduce any additional false positives. Despite the resulting increase in network overhead, the proposed detection framework is able to consistently detect intrusions through encrypted networks.
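The framework's proxy routing is not detailed in the abstract; the sketch below illustrates only the Shamir secret-sharing building block it names, using a toy 3-of-5 split over a large prime field. The prime, share counts, and secret value are illustrative choices, not parameters from the paper, and Python's `random` module is used only for the demo (a real system would use a cryptographically secure source).

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for this demo

def split_secret(secret, n_shares, threshold):
    """Split `secret` into n_shares points on a random degree-(threshold-1)
    polynomial; any `threshold` shares suffice to reconstruct the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split_secret(123456789, n_shares=5, threshold=3)
print(reconstruct(shares[:3]) == 123456789)  # True with any 3 of the 5 shares
```

In the framework's setting, each share could travel through a different randomised proxy, so no single intermediate node holds enough material to reconstruct the protected traffic.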
Abstract:
A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size and capable of sensing physical phenomena and processing them. Because of their short radio range, they communicate in a multihop manner to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node such as the MICA2 uses a 16-bit, 8 MHz Texas Instruments MSP430 microcontroller with only 10 KB of RAM, 128 KB of program space, and 512 KB of external flash memory for storing measurement data, and is powered by two AA batteries. Due to these constrained specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation, and data aggregation can greatly reduce this consumption by eliminating redundant data. However, aggregators are under threat from various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful depending on the security architecture of the network; for example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. Firstly, we define security for data aggregation in WSNs; in contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics of WSNs. Secondly, we analyze the relationship between the security services and adversarial models considered in existing secure data aggregation schemes, in order to provide a general framework of required security services. Thirdly, we analyze existing cryptography-based and reputation-based secure data aggregation schemes; this analysis covers the security services they provide and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs that minimizes the use of heavy cryptographic mechanisms. The security advantages of this scheme are realized by integrating aggregation functionality with (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We show that this combination helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme to distribute essential pairwise and group keys among the sensor nodes. The design combines Lamport's reverse hash chain with the usual hash chain to provide both past and future key secrecy. The proposal avoids delivering the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes, so the compromise of a pairwise key alone does not lead to the compromise of the group key. New pairwise keys in our scheme are established by Diffie-Hellman based key agreement.
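The thesis's exact key update (splitting the group key value, plus Diffie-Hellman pairwise keys) cannot be reproduced from the abstract alone; the sketch below is only a generic illustration of combining a forward hash chain with a Lamport-style reverse hash chain so that compromising one epoch's chain values exposes neither earlier nor later group keys. The seeds, epoch count, and key derivation are illustrative assumptions.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def reverse_chain(seed: bytes, length: int):
    """Lamport-style chain: generated by repeated hashing, disclosed in
    reverse generation order, so values for future epochs cannot be
    predicted from already-disclosed ones."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(h(chain[-1]))
    return chain[::-1]

def group_keys(forward_seed: bytes, reverse_seed: bytes, length: int):
    """Derive each epoch's group key from one element of a forward chain and
    one element of a reverse chain; an attacker who learns only one epoch's
    chain values is missing the forward element for past epochs and the
    reverse element for future epochs."""
    keys = []
    fwd = forward_seed
    rev = reverse_chain(reverse_seed, length)
    for i in range(length):
        keys.append(h(fwd + rev[i]))
        fwd = h(fwd)  # forward chain advances one step per epoch
    return keys

keys = group_keys(b"forward-seed", b"reverse-seed", 5)
print([k.hex()[:8] for k in keys])
```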
Abstract:
Online social networks have become some of the most popular Internet applications of the modern era. They give Internet users access to information that other Internet-based applications cannot. Although many of the popular online social networking websites are focused on entertainment, sharing information can benefit the healthcare industry in terms of both efficiency and effectiveness. However, the capability to share personal information, the very factor that has made online social networks so popular, is itself a major obstacle when information security and privacy are considered. Healthcare can benefit from online social networking if it is implemented so that sensitive patient information is safeguarded from improper exposure. But in an industry such as healthcare, where the availability of information is crucial for better decision making, information must also be made available to the appropriate parties when they require it. Hence, traditional mechanisms for information security and privacy protection may not be suitable for healthcare. In this paper we propose a solution for privacy enhancement in online healthcare social networks through the use of an information accountability mechanism.
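The paper's accountability mechanism is not specified in the abstract; purely as a generic illustration of the information-accountability idea (access is permitted but auditable after the fact, rather than blocked up front), here is a minimal sketch of an append-only access log with an audit query. The class name, fields, and example values are all hypothetical.

```python
import datetime

class AccountabilityLog:
    """Append-only record of who accessed which health record, when, and for
    what stated purpose, so that use can be audited rather than prevented."""
    def __init__(self):
        self.entries = []

    def record_access(self, accessor, patient_id, purpose):
        self.entries.append({
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "accessor": accessor,
            "patient": patient_id,
            "purpose": purpose,
        })

    def audit(self, patient_id):
        """Return every recorded access to a given patient's information."""
        return [e for e in self.entries if e["patient"] == patient_id]

log = AccountabilityLog()
log.record_access("dr_smith", "patient-42", "treatment planning")
print(log.audit("patient-42"))
```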
Abstract:
Public awareness of large infrastructure projects, many of which are delivered through networked arrangements, is high for several reasons. These projects often involve significant public investment; they may involve multiple and conflicting stakeholders and can potentially have significant environmental impacts (Lim and Yang, 2008). To produce positive outcomes from infrastructure delivery, it is imperative that stakeholder "buy-in" be obtained, particularly for decisions relating to the scale and location of infrastructure. Given the likelihood that stakeholders will have different levels of interest and investment in project outcomes, failure to manage this dynamic could jeopardise project delivery by delaying or halting the construction of essential infrastructure. Consequently, stakeholder engagement has come to constitute a critical activity in infrastructure development delivered through networks. This paper draws on stakeholder theory and governance network theory and provides insights into how three multi-level networks within the Roads Alliance in Queensland engage with stakeholders in the delivery of road infrastructure. New knowledge about stakeholders has been obtained by testing a model of Stakeholder Salience and Engagement, which combines and extends stakeholder identification and salience theory and the ladder of stakeholder management and engagement. By applying this model, the broad research question "How do governance networks engage with stakeholders?" has been addressed. A multiple embedded case study design was selected as the overall approach to explore, describe, explain, and evaluate how stakeholder engagement occurred in three governance networks delivering road infrastructure in Queensland. The outcomes of this research contribute to and extend stakeholder theory by showing how stakeholder salience affects decisions about the types of engagement processes implemented. Governance network theory is extended by showing how governance networks interact with stakeholders. From a practical perspective, this research provides governance networks with an indication of how to engage more effectively with different types of stakeholders.
Abstract:
As regional and continental carbon balances of terrestrial ecosystems become available, it becomes clear that soils are the largest source of uncertainty. Repeated inventories of soil organic carbon (SOC) organized in soil monitoring networks (SMNs) are being implemented in a number of countries. This paper reviews the concepts and design of SMNs in ten countries and discusses the contribution of such networks to reducing the uncertainty of soil carbon balances. Some SMNs are designed to estimate country-specific land-use or management effects on SOC stocks, while others collect soil carbon and ancillary data to provide a nationally consistent assessment of soil carbon condition across the major land-use/soil-type combinations. The former use a single sampling campaign of paired sites, while the latter use both systematic (usually grid-based) and stratified repeated sampling campaigns (5–10 year intervals) with densities of one site per 10–1,040 km². For paired sites, multiple samples are taken at each site to allow statistical analysis, while for single sites, composite samples are taken. In both cases, fixed-depth increments together with samples for bulk density and stone content are recommended. Samples should be archived to allow re-measurement with updated techniques. Information on land management and, where possible, land-use history should be systematically recorded for each site. A case study of the agricultural frontier in Brazil is presented in which land-use effect factors are calculated in order to quantify the CO2 fluxes from national land-use/management conversion matrices. Process-based SOC models can be run for the individual points of an SMN, provided detailed land management records are available; such studies are still rare, as most SMNs have been implemented recently or are in progress. Examples from the USA and Belgium show that uncertainties in SOC change range from 1.6–6.5 Mg C ha⁻¹ for the prediction of SOC stock changes on individual sites to 11.72 Mg C ha⁻¹, or 34% of the median SOC change, for soil/land-use/climate units. For national SOC monitoring, stratified sampling appears to offer the most straightforward attribution of SOC values to units with similar soil/land-use/climate conditions (i.e. a spatially implicit upscaling approach). Keywords: Soil monitoring networks; Soil organic carbon; Modeling; Sampling design
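To make the recommended measurements concrete, here is a minimal sketch of the standard fixed-depth SOC stock calculation from carbon concentration, bulk density, depth increment, and stone content; individual SMNs may apply different correction variants, and the example numbers are purely illustrative.

```python
def soc_stock(carbon_g_per_kg, bulk_density_mg_m3, depth_m, stone_fraction=0.0):
    """Soil organic carbon stock of one fixed-depth increment, in Mg C per ha.

    carbon_g_per_kg    : SOC concentration of the fine earth (g C / kg soil)
    bulk_density_mg_m3 : dry bulk density (Mg soil / m^3, numerically g/cm^3)
    depth_m            : thickness of the depth increment (m)
    stone_fraction     : volumetric fraction of coarse fragments (0-1)
    """
    fine_earth_mass = bulk_density_mg_m3 * depth_m * (1.0 - stone_fraction)  # Mg soil / m^2
    return carbon_g_per_kg * fine_earth_mass * 10.0  # (g/kg) * (Mg/m^2) * 10 -> Mg C / ha

# Example: 0-30 cm layer with 15 g C/kg, bulk density 1.3, 5 % stones
print(round(soc_stock(15.0, 1.3, 0.30, 0.05), 1))  # ~55.6 Mg C/ha
```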
Abstract:
This paper describes and evaluates the novel utility of network methods for understanding human interpersonal interactions within social neurobiological systems such as sports teams. We show how collective system networks are supported by the sum of interpersonal interactions that emerge from the activity of system agents (such as players in a sports team). To test this idea, we trialled the methodology in analyses of intra-team collective behaviours in the team sport of water polo. We observed that the number of interactions between team members resulted in varied intra-team coordination patterns of play, differentiating between successful and unsuccessful performance outcomes. Future research on small-world network methodologies needs to formalize measures of node connections in analyses of collective behaviour in sports teams, to verify whether a high frequency of interactions between players is needed to achieve competitive performance outcomes.
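As a simple illustration of counting node connections from interaction events (the abstract does not specify the authors' exact measures), the sketch below builds a weighted interaction network from hypothetical passing events and reports the network density and each player's interaction volume.

```python
from collections import Counter
import networkx as nx

def interaction_network(events):
    """events: iterable of (player_a, player_b) interaction events (e.g. passes).
    Returns an undirected graph whose edge weights are interaction counts."""
    counts = Counter((a, b) for a, b in events)
    g = nx.Graph()
    for (a, b), w in counts.items():
        existing = g.get_edge_data(a, b, {}).get("weight", 0)
        g.add_edge(a, b, weight=existing + w)
    return g

# Hypothetical passing events from one period of play
events = [("P1", "P2"), ("P2", "P3"), ("P1", "P2"), ("P3", "P4"), ("P2", "P1")]
g = interaction_network(events)
print(nx.density(g))                                   # how connected the team network is
print({n: d for n, d in g.degree(weight="weight")})    # interaction volume per player
```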