279 results for Packet Filtering
Abstract:
The striking color patterns of butterflies and birds have long interested biologists. But how these animals see color is less well understood. Opsins are the protein components of the visual pigments of the eye. Color vision has evolved in butterflies through opsin gene duplications, through positive selection at individual opsin loci, and by the use of filtering pigments. By contrast, birds have retained the same opsin complement present in early jawed vertebrates, and their visual system has diversified primarily through tuning of the short-wavelength-sensitive photoreceptors, rather than by opsin duplication or the use of filtering elements. Butterflies and birds have evolved photoreceptors that might use some of the same amino acid sites for generating similar spectral phenotypes across approximately 540 million years of evolution, since rhabdomeric and ciliary-type opsins radiated during the early Cambrian period. Considering the similarities between the two taxa, it is surprising that the eyes of birds are not more diverse. Additional taxonomic sampling of birds may help clarify this mystery.
Abstract:
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
Abstract:
Vehicular Ad-hoc Networks (VANETs) have different characteristics compared to other mobile ad-hoc networks. The vehicles, which act as both routers and clients, are highly dynamic and connected by unreliable radio links, making routing a complex problem. First, we propose CO-GPSR (Cooperative GPSR), an extension of traditional GPSR (Greedy Perimeter Stateless Routing) that uses relay nodes to exploit radio path diversity in a vehicular network and increase routing performance. Next, we formulate a multi-objective decision-making problem to select optimal packet-relaying nodes and increase routing performance further, using cross-layer information in the optimization process. We evaluate routing performance comprehensively using realistic vehicular traces and a Nakagami fading propagation model optimized for highway scenarios in VANETs. Our results show that when multi-objective decision making is used for cross-layer optimization of routing, a 70% performance increase can be obtained on average for low vehicle densities, a twofold improvement over the single-criterion maximization approach.
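GPSR, which CO-GPSR extends, forwards packets in greedy mode to the neighbor geographically closest to the destination. A minimal sketch of that base rule follows (illustrative only; the paper's cooperative relay selection and multi-objective optimization are not shown):

```python
import math

def greedy_next_hop(current, neighbors, destination):
    """GPSR greedy mode: pick the neighbor strictly closer to the
    destination than the current node, minimizing remaining distance.
    Returns None at a local maximum (no closer neighbor), where GPSR
    would fall back to perimeter mode."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    best, best_d = None, dist(current, destination)
    for n in neighbors:
        d = dist(n, destination)
        if d < best_d:
            best, best_d = n, d
    return best

# Example: node at (0, 0) forwarding toward (10, 0); the neighbor at
# (5, -1) makes the most geographic progress.
hop = greedy_next_hop((0, 0), [(3, 1), (5, -1), (-2, 0)], (10, 0))
```

CO-GPSR, as described in the abstract, replaces this single-criterion choice with a multi-objective relay selection using cross-layer information.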
Abstract:
This paper proposes a new approach for state estimation of the angles and frequencies of equivalent areas in large power systems with synchronized phasor measurement units. After defining coherent generators and their corresponding areas, generators are aggregated and system reduction is performed in each area of the interconnected power system. The structure of the reduced system is obtained from the characteristics of the reduced linear model and measurement data, which together form the non-linear model of the reduced system. A Kalman estimator is then designed for the reduced system to provide equivalent dynamic system state estimation using the synchronized phasor measurement data. The method is simulated on two test systems to evaluate its feasibility.
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear algebra centered algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure as well as readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
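The benchmark in this abstract is implemented in R and C++ via Rcpp/Armadillo; as a language-neutral illustration of the recursion being timed, here is a minimal scalar Kalman filter in plain Python (toy model and noise values, not the benchmark code):

```python
def kalman_1d(zs, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant-state model
    x_k = x_{k-1} + w (process noise variance q) with measurements
    z_k = x_k + v (measurement noise variance r). Returns the
    sequence of filtered state estimates."""
    x, p = x0, p0
    out = []
    for z in zs:
        # Predict: state unchanged, uncertainty grows by q
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        out.append(x)
    return out

# Noisy measurements of a true state near 1.0
estimates = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0])
```

In the matrix-valued case each step becomes a handful of matrix multiplies and a solve, which is exactly the linear algebra workload that Armadillo's expression templates can pool and accelerate.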
Abstract:
Deploying networked control systems (NCSs) over wireless networks is becoming more and more popular. However, the widely used transport layer protocols, the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP), are not designed for real-time applications. They may therefore be unsuitable for many NCS application scenarios because of their limitations in reliability and/or delay performance, both key concerns for real-time control systems. Considering a typical type of NCS with periodic and sporadic real-time traffic, this paper proposes a highly reliable transport layer protocol featuring a packet loss-sensitive retransmission mechanism and a prioritized transmission mechanism. The packet loss-sensitive retransmission mechanism improves the reliability of all traffic flows, while the prioritized transmission mechanism offers differentiated services for periodic and sporadic flows. Simulation results show that the proposed protocol achieves better reliability than UDP and better delay performance than TCP over wireless networks, particularly when channel errors and congestion occur.
Abstract:
Collaborative methods are promising tools for solving complex security tasks. In this context, the authors present CIMD (Collaborative Intrusion and Malware Detection), a security overlay framework that enables participants to state objectives and interests for joint intrusion detection and to find groups for the exchange of security-related data such as monitoring or detection results; the authors refer to these groups as detection groups. First, the authors present and discuss a tree-oriented taxonomy for the representation of nodes within the collaboration model. Second, they introduce and evaluate an algorithm for the formation of detection groups. After conducting a vulnerability analysis of the system, the authors demonstrate the validity of CIMD by examining two different scenarios inspired by sociology in which collaboration is advantageous compared to the non-collaborative approach. They evaluate the benefit of CIMD by simulation in a novel packet-level simulation environment called NeSSi (Network Security Simulator) and give a probabilistic analysis for the scenarios.
Abstract:
Topic recommendation can help users deal with the information overload issue in micro-blogging communities. This paper proposes to use the implicit information network formed by the multiple relationships among users, topics and micro-blogs, together with the temporal information of micro-blogs, to find semantically and temporally relevant topics for each topic, and to profile users' time-drifting topic interests. Content-based, nearest-neighborhood-based and matrix factorization models are used to make personalized recommendations. The effectiveness of the proposed approaches is demonstrated in experiments conducted on a real-world dataset collected from Twitter.com.
Abstract:
Relevance features and ontologies are two core components for learning personalized ontologies for concept-based retrieval. However, how to associate users' local information with common knowledge remains an open issue. This paper proposes a sound solution that matches relevance features mined from local instances against concepts in a global knowledge base. The matched concepts and their relations are used to learn personalized ontologies. The proposed method is evaluated thoroughly against three benchmark models. The evaluation demonstrates that the matching is successful, achieving remarkable improvements in information filtering measures.
Abstract:
With the overwhelming increase in the amount of text on the web, it is almost impossible for people to keep abreast of up-to-date information. Text mining is a process by which interesting information is derived from text through the discovery of patterns and trends, and text mining algorithms are used to guarantee the quality of the extracted knowledge. However, the patterns extracted by text or data mining methods are often noisy and inconsistent. This raises several challenges: how to interpret these patterns, whether the model used is suitable, whether all extracted patterns are relevant, and how to assign a correct weight to the extracted knowledge. To address these issues, this paper presents a text post-processing method that uses a pattern co-occurrence matrix to find the relations between extracted patterns and thereby reduce noisy patterns. The main objective is not only to reduce the number of closed sequential patterns but also to improve the performance of pattern mining. Experimental results on the Reuters Corpus Volume 1 data collection and TREC filtering topics show that the proposed method is promising.
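As a toy illustration of the central data structure, a pattern co-occurrence matrix can be built by counting, for each pair of extracted patterns, how many documents contain both; patterns that rarely co-occur with others are candidates for pruning as noise. The data and scoring below are hypothetical, not the paper's exact method:

```python
from itertools import combinations
from collections import Counter

def cooccurrence(docs_patterns):
    """Count, for each unordered pair of patterns, how many documents
    contain both. docs_patterns maps doc id -> set of patterns."""
    co = Counter()
    for patterns in docs_patterns.values():
        for a, b in combinations(sorted(patterns), 2):
            co[(a, b)] += 1
    return co

# Hypothetical extracted patterns per document
docs = {
    "d1": {"p1", "p2", "p3"},
    "d2": {"p1", "p2"},
    "d3": {"p1", "p4"},
}
co = cooccurrence(docs)
# ("p1", "p2") co-occurs in d1 and d2, so its count is 2
```

A pattern whose co-occurrence counts with every other pattern fall below some threshold could then be pruned before weighting.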
Abstract:
Deploying wireless networks in networked control systems (NCSs) has become more and more popular during the last few years. As a typical type of real-time control system, an NCS is sensitive to long and nondeterministic time delays and packet losses. However, the nature of the wireless channel has the potential to degrade the performance of NCS networks in many aspects, particularly in time delay and packet losses. Transport layer protocols could play an important role in providing both reliable and fast transmission service to fulfill an NCS's real-time transmission requirements. Unfortunately, none of the existing transport protocols, including the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP), was designed for real-time control applications. Moreover, periodic data and sporadic data are two types of real-time data traffic with different priorities in an NCS. Due to the lack of support for prioritized transmission service, the real-time performance for periodic and sporadic data in an NCS network is often degraded significantly, particularly under congested network conditions. To address these problems, a new transport layer protocol called Reliable Real-Time Transport Protocol (RRTTP) is proposed in this thesis. As a UDP-based protocol, RRTTP inherits UDP's simplicity and fast transmission features. To improve reliability, a retransmission mechanism and an acknowledgement mechanism are designed in RRTTP to compensate for packet losses. These mechanisms avoid unnecessary retransmission of out-of-date packets in NCSs, make collisions unlikely, and achieve small transmission delays. Moreover, a prioritized transmission mechanism is also designed in RRTTP to improve the real-time performance of NCS networks under congested traffic conditions. Furthermore, the proposed RRTTP is implemented in the Network Simulator 2 for comprehensive simulations.
The simulation results demonstrate that RRTTP outperforms TCP and UDP in terms of real-time transmissions in an NCS over wireless networks.
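One idea in the abstract above, avoiding unnecessary retransmission of out-of-date packets, can be sketched with a hypothetical freshness window over sequence numbers (an illustrative policy only, not the RRTTP specification):

```python
def packets_to_retransmit(lost_seqs, latest_seq, window):
    """Toy loss-sensitive retransmission policy for periodic control
    traffic: a lost packet is worth retransmitting only while no much
    newer sample has superseded it, approximated here by keeping only
    losses within `window` sequence numbers of the latest sample.
    Hypothetical policy, not the RRTTP specification."""
    return [s for s in lost_seqs if latest_seq - s < window]

# With latest sample 10 and a 3-sample freshness window, only losses
# 8 and 9 are still worth resending; 2 and 5 are out of date.
fresh = packets_to_retransmit([2, 5, 8, 9], latest_seq=10, window=3)
```

Skipping stale retransmissions both saves bandwidth and avoids delivering control samples the plant has already moved past.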
Abstract:
Wireless networked control systems (WNCSs) have been widely used in the areas of manufacturing and industrial processing over the last few years. They provide real-time control with a unique characteristic: periodic traffic. These systems have a time-critical requirement. Due to current wireless mechanisms, WNCS performance suffers from long time-varying delays, packet dropout, and inefficient channel utilization. Current wirelessly networked applications like WNCSs are designed on a layered-architecture basis, and the features of this layered architecture constrain the performance of these demanding applications. Numerous efforts have attempted to use cross-layer design (CLD) approaches to improve the performance of various networked applications. However, the existing research rarely considers large-scale networks and congested network conditions in WNCSs, and there is a lack of discussion on how to apply CLD approaches in WNCSs. This thesis proposes a cross-layer design methodology to address the issues of periodic traffic timeliness, as well as to promote the efficiency of channel utilization in WNCSs. The design of the proposed CLD is highlighted by the measurement of the underlying network condition, the classification of the network state, and the adjustment of the sampling period between sensors and controllers. This period adjustment maintains the minimum allowable sampling period while maximizing the control performance. Extensive simulations are conducted using the network simulator NS-2 to evaluate the performance of the proposed CLD. The comparative studies involve two aspects of communication, with and without the proposed CLD, respectively. The results show that the proposed CLD is capable of fulfilling the timeliness requirement under congested network conditions, and is also able to improve the channel utilization efficiency and the proportion of effective data in WNCSs.
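The measure-classify-adjust loop described above can be sketched as follows; the three network states, thresholds and period bounds are hypothetical, not the thesis's actual parameters:

```python
def adjust_sampling_period(loss_rate, delay_ms,
                           t_min_ms=10, t_max_ms=100):
    """Toy sketch of the CLD loop: classify the measured network
    condition and adjust the sensor sampling period between a minimum
    (best control performance) and a maximum (least network load).
    Thresholds and the three-state classification are hypothetical."""
    if loss_rate < 0.01 and delay_ms < 20:
        state = "good"        # sample as fast as allowed
        period = t_min_ms
    elif loss_rate < 0.05 and delay_ms < 50:
        state = "loaded"      # back off to an intermediate period
        period = (t_min_ms + t_max_ms) // 2
    else:
        state = "congested"   # slow to the maximum tolerable period
        period = t_max_ms
    return state, period

state, period = adjust_sampling_period(loss_rate=0.03, delay_ms=40)
```

The key design choice is that the sampling period, a control-layer parameter, is driven by network-layer measurements, which is what makes the design cross-layer.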
Abstract:
Focal segmental glomerulosclerosis (FSGS) is the consequence of a disease process that attacks the kidney's filtering system, causing serious scarring. More than half of FSGS patients develop chronic kidney failure within 10 years, ultimately requiring dialysis or renal transplantation. Several genes are currently known to cause the hereditary forms of FSGS (ACTN4, TRPC6, CD2AP, INF2, MYO1E and NPHS2). This study involves a large, unique, multigenerational Australian pedigree in which FSGS co-segregates with progressive heart block with apparent X-linked recessive inheritance. Through a classical combined approach of linkage and haplotype analysis, we identified a 21.19 cM implicated interval on the X chromosome. We then used a whole exome sequencing approach to identify two mutated genes, NXF5 and ALG13, which are located within this linkage interval. The two mutations, NXF5-R113W and ALG13-T141L, segregated perfectly with the disease phenotype in the pedigree and were not found in a large healthy control cohort. Analysis using bioinformatics tools predicted the R113W mutation in the NXF5 gene to be deleterious, and cellular studies support a role in the stability and localization of the protein, suggesting a causative role for this mutation in these co-morbid disorders. Further studies are now required to determine the functional consequences of these novel mutations for the development of FSGS and heart block in this pedigree, and to determine whether these mutations have implications for more common forms of these diseases in the general population.
Abstract:
[Extract] For just $5.29 Australians can now purchase "Skins" from local, independent grocers to cover their cigarette packet with the Aboriginal or Torres Strait Islander flag. We argue that this use of cultural content and copyrighted imagery on cigarette packets negates health promotion efforts, such as Australia's recent introduction of plain packaging laws and the subsequent dismissal of a legal challenge from the tobacco industry.
Abstract:
Tag recommendation is a specific recommendation task for recommending metadata (tags) for a web resource (item) during the user annotation process. In this context, the sparsity problem refers to the situation where tags need to be produced for items with few annotations or for users who tag few items. Most state-of-the-art approaches in tag recommendation are rarely evaluated or perform poorly in this situation. This paper presents a combined method for mitigating the sparsity problem in tag recommendation, mainly by expanding and ranking candidate tags based on similar items' tags and an existing tag ontology. We evaluated the approach on two public social bookmarking datasets. The experimental results show better recommendation accuracy in sparse situations than several state-of-the-art methods.
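The candidate-tag expansion step can be illustrated with a toy similarity-weighted ranking over similar items' tags (hypothetical data and scoring; the paper additionally exploits a tag ontology):

```python
def jaccard(a, b):
    """Jaccard similarity between two tag sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def expand_candidate_tags(target_tags, item_tags):
    """Toy sketch of sparsity mitigation by tag expansion: score each
    other item's tags by that item's Jaccard similarity to the sparse
    target item, then rank candidate tags not already on the target.
    Illustrative only, not the paper's full method."""
    scores = {}
    for tags in item_tags.values():
        sim = jaccard(target_tags, tags)
        for t in tags - target_tags:
            scores[t] = scores.get(t, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical repository: the sparse target item has only one tag
items = {
    "i1": {"python", "web", "flask"},
    "i2": {"python", "data"},
    "i3": {"cooking"},
}
ranked = expand_candidate_tags({"python"}, items)
```

Tags from items most similar to the sparse target ("data" from i2 here) rank highest, while tags from dissimilar items fall to the bottom.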