985 results for Graph G


Relevance: 20.00%

Abstract:

The challenge of persistent appearance-based navigation and mapping is to develop an autonomous robotic vision system that can simultaneously localize, map and navigate over the lifetime of the robot. However, the computation time and memory requirements of current appearance-based methods typically scale not only with the size of the environment but also with the operation time of the platform; also, repeated revisits to locations will develop multiple competing representations which reduce recall performance. In this paper we present a solution to the persistent localization, mapping and global path planning problem in the context of a delivery robot in an office environment over a one-week period. Using a graphical appearance-based SLAM algorithm, CAT-Graph, we demonstrate constant time and memory loop closure detection with minimal degradation during repeated revisits to locations, along with topological path planning that improves over time without using a global metric representation. We compare the localization performance of CAT-Graph to openFABMAP, an appearance-only SLAM algorithm, and the path planning performance to occupancy-grid based metric SLAM. We discuss the limitations of the algorithm with regard to environment change over time and illustrate how the topological graph representation can be coupled with local movement behaviors for persistent autonomous robot navigation.
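
As a rough illustration of topological path planning over an appearance graph of the kind described above, the following sketch (not the CAT-Graph implementation) assumes places are graph nodes and edge weights are traversal times learned during operation; the place names and times are hypothetical:

    # Minimal sketch: Dijkstra over a topological appearance graph.
    # No global metric map is required; only place-to-place traversal costs.
    import networkx as nx

    G = nx.Graph()
    # Hypothetical places and learned traversal times (seconds).
    G.add_weighted_edges_from([
        ("lobby", "corridor_a", 12.0),
        ("corridor_a", "office_3", 8.5),
        ("lobby", "corridor_b", 20.0),
        ("corridor_b", "office_3", 6.0),
    ])

    route = nx.shortest_path(G, "lobby", "office_3", weight="weight")
    print(route)  # ['lobby', 'corridor_a', 'office_3']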

Relevance: 20.00%

Abstract:

Changing environments present a number of challenges to mobile robots, one of the most significant being mapping and localisation. This problem is particularly significant in vision-based systems, where illumination and weather changes can cause feature-based techniques to fail. In many applications only sections of an environment undergo extreme perceptual change. Some range-based sensor mapping approaches exploit this property by combining occasional place recognition with the assumption that odometry is accurate over short periods of time. In this paper, we develop this idea in the visual domain by using occasional vision-driven loop closures to infer loop closures in nearby locations where visual recognition is difficult due to extreme change. We demonstrate successful map creation in an environment in which change is significant but constrained to one area, where both the vanilla CAT-Graph and a Sum of Absolute Differences matcher fail; we use the described techniques to link dissimilar images from matching locations and test the robustness of the system against false inferences.
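
A minimal sketch of the loop-closure inference idea, assuming a visual match between frames i and j and locally accurate odometry so that nearby frame pairs (i+k, j+k) can be hypothesised as closures; the function name, window size and frame indices are illustrative, not taken from the paper:

    # Given one confirmed visual loop closure, hypothesise closures for
    # neighbouring frames under the local-odometry-accuracy assumption.
    def infer_loop_closures(visual_match, window=5, frame_count=1000):
        i, j = visual_match
        inferred = []
        for k in range(-window, window + 1):
            if k == 0:
                continue  # the anchor match itself came from visual recognition
            a, b = i + k, j + k
            if 0 <= a < frame_count and 0 <= b < frame_count:
                inferred.append((a, b))
        return inferred

    print(infer_loop_closures((120, 870), window=2))
    # [(118, 868), (119, 869), (121, 871), (122, 872)]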

Relevance: 20.00%

Abstract:

Online social networks can be modelled as graphs; in this paper, we analyze the use of graph metrics for identifying users with anomalous relationships to other users. A framework is proposed for analyzing the effectiveness of various graph-theoretic properties, such as the number of neighbouring nodes and edges, betweenness centrality, and community cohesiveness, in detecting anomalous users. Experimental results on real-world data collected from online social networks show that the majority of users typically have friends who are friends themselves, whereas the graphs of anomalous users typically do not follow this common rule. Empirical analysis also shows that the relationship between average betweenness centrality and the number of edges identifies anomalies more accurately than other approaches.
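
The graph metrics mentioned above can be computed with standard tools; the sketch below uses networkx on a stand-in graph, and the degree and clustering thresholds are illustrative assumptions rather than values from the paper:

    import networkx as nx

    G = nx.karate_club_graph()  # stand-in for a real social network

    clustering = nx.clustering(G)               # "friends who are friends themselves"
    betweenness = nx.betweenness_centrality(G)  # brokerage between communities
    degree = dict(G.degree())

    # Flag users whose neighbourhoods are unusually loosely knit for their degree.
    suspects = [v for v in G if degree[v] >= 5 and clustering[v] < 0.1]
    print(suspects)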

Relevance: 20.00%

Abstract:

We consider a Cooperative Intrusion Detection System (CIDS), a distributed Artificial Immune System (AIS)-based IDS in which nodes collaborate over a peer-to-peer overlay network. The AIS uses the negative selection algorithm for the selection of detectors (e.g., vectors of features such as CPU utilization, memory usage and network activity). For better detection performance, selecting all possible detectors for a node is desirable, but it may not be feasible due to storage and computational overheads. Limiting the number of detectors, on the other hand, comes with the danger of missing attacks. We present a scheme for the controlled and decentralized division of detector sets in which each IDS is assigned a region of the feature space. We investigate the trade-off between scalability and robustness of detector sets. We address the problem of self-organization in CIDS so that each node generates a distinct set of detectors to maximize coverage of the feature space, while pairs of nodes exchange their detector sets to provide a controlled level of redundancy. Our contribution is twofold. First, we use deterministic techniques from combinatorial design theory and graph theory, based on Symmetric Balanced Incomplete Block Designs, Generalized Quadrangles and Ramanujan Expander Graphs, to decide how many and which detectors are exchanged between which pairs of IDS nodes. Second, we use a classical epidemic model (the SIR model) to show how properties of these deterministic techniques can help reduce the attack spread rate.
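
As one concrete example of a symmetric design of the kind cited above, the Fano plane is the (7, 3, 1) symmetric BIBD; the sketch below uses it to assign detector regions to seven hypothetical IDS nodes so that any two nodes share exactly one region. It illustrates the combinatorial idea only, not the paper's full scheme:

    # Fano plane: 7 points (detector regions), 7 blocks of size 3,
    # every pair of blocks meets in exactly one point.
    FANO_BLOCKS = [
        {1, 2, 3}, {1, 4, 5}, {1, 6, 7},
        {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6},
    ]

    # Node i holds the detector regions listed in block i.
    assignment = {node: block for node, block in enumerate(FANO_BLOCKS, start=1)}

    # Any pair of nodes overlaps in exactly one region (controlled redundancy).
    for a in assignment:
        for b in assignment:
            if a < b:
                assert len(assignment[a] & assignment[b]) == 1
    print(assignment)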

Relevance: 20.00%

Abstract:

Traffic congestion has a significant impact on the economy and the environment. Encouraging the use of multi-modal transport (public transport, bicycle, park'n'ride, etc.) has been identified by traffic operators as a good strategy to tackle congestion and its detrimental environmental impacts. A multi-modal, multi-objective trip planner provides users with various multi-modal options optimised on the objectives they prefer (cheapest, fastest, safest, etc.) and has the potential to reduce congestion on both a temporal and a spatial scale. The computation of multi-modal, multi-objective trips is a complicated mathematical problem, as it must integrate and utilize a diverse range of large data sets, including both road network information and public transport schedules, while optimising a number of competing objectives, where fully optimising one objective, such as travel time, can adversely affect other objectives, such as cost. The relationship between these objectives can also be quite subjective, as their priorities vary from user to user. This paper first outlines the various data requirements and formats needed for the multi-modal multi-objective trip planner to operate, including static information about the physical infrastructure within Brisbane as well as real-time and historical data used to predict traffic flow on the road network and the status of public transport. It then presents the graph data structures representing the road and public transport networks within Brisbane that are used in the trip planner to calculate optimal routes. This allows for an investigation into the various shortest path algorithms researched over the last few decades, and provides a foundation for the construction of the Multi-modal Multi-objective Trip Planner through the development of innovative new algorithms that can operate on the large, diverse data sets and competing objectives.
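
A minimal sketch of routing over a multi-modal graph, assuming each edge carries a travel time and a fare and that the competing objectives are blended into a single weight; the network, weights and scalarisation are hypothetical, and the actual trip planner would use far richer data and algorithms:

    import networkx as nx

    G = nx.DiGraph()
    # Hypothetical multi-modal edges: travel time (minutes) and fare (dollars).
    G.add_edge("home", "station", mode="walk", time=10, cost=0.0)
    G.add_edge("station", "city", mode="train", time=20, cost=4.5)
    G.add_edge("home", "city", mode="car", time=25, cost=8.0)

    def blended(u, v, data, w_time=1.0, w_cost=2.0):
        # Simple scalarisation of two competing objectives.
        return w_time * data["time"] + w_cost * data["cost"]

    route = nx.shortest_path(G, "home", "city", weight=blended)
    print(route)  # ['home', 'station', 'city'] under these weights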

Relevance: 20.00%

Abstract:

The risk of vitamin D insufficiency is increased in persons having limited sunlight exposure and dietary vitamin D. Supplementation compliance might be improved with larger doses taken less often, but this may increase the potential for side effects. The objective of the present study was to determine whether a weekly or weekly/monthly regimen of vitamin D supplementation is as effective as daily supplementation without increasing the risk of side effects. Participants were forty-eight healthy adults who were randomly assigned for 3 months to placebo or one of three supplementation regimens: 50 μg/d (2000 IU/d, analysed dose 70 μg/d), 250 μg/week (10 000 IU/week, analysed dose 331 μg/week) or 1250 μg/week (50 000 IU/week, analysed dose 1544 μg/week) for 4 weeks and then 1250 μg/month for 2 months. Daily and weekly doses were equally effective at increasing serum 25-hydroxyvitamin D, which was significantly greater than baseline in all the supplemented groups after 30 d of treatment. Subjects in the 1250 μg treatment group, who had a BMI >26 kg/m², had a steady increase in urinary Ca in the first 3 weeks of supplementation, and, overall, the relative risk of hypercalciuria was higher in the 1250 μg group than in the placebo group (P = 0.01). Although vitamin D supplementation remains a controversial issue, these data document that supplementing with ≤250 μg/week (≤10 000 IU/week) can improve or maintain vitamin D status in healthy populations without the risk of hypercalciuria, but 24 h urinary Ca excretion should be evaluated in healthy persons receiving vitamin D3 supplementation in weekly single doses of 1250 μg (50 000 IU).

Relevance: 20.00%

Abstract:

In this paper, we propose a semi-supervised approach to anomaly detection in online social networks. The social network is modeled as a graph, and its features are extracted to detect anomalies. A clustering algorithm is then used to group users based on these features, and fuzzy logic is applied to assign a degree of anomalous behavior to the users in these clusters. Empirical analysis shows the effectiveness of this method.
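
A minimal sketch of the clustering step with a fuzzy anomaly degree, assuming each user is described by graph-derived features; the features, the number of clusters and the membership function are illustrative assumptions, not the paper's exact method:

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical per-user features: [degree, clustering coefficient].
    X = np.array([[12, 0.45], [10, 0.50], [11, 0.48], [30, 0.02], [2, 0.0]])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    centres = kmeans.cluster_centers_

    # Fuzzy degree of anomaly: distance to the user's own cluster centre,
    # squashed into [0, 1).
    dist = np.linalg.norm(X - centres[kmeans.labels_], axis=1)
    anomaly_degree = dist / (1.0 + dist)
    print(np.round(anomaly_degree, 2))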

Relevance: 20.00%

Abstract:

Background: Hyperhomocysteinemia as a consequence of the MTHFR 677 C > T variant is associated with cardiovascular disease and stroke. Another factor that can potentially contribute to these disorders is a depleted nitric oxide level, which can be due to the presence of the eNOS +894 G > T and eNOS −786 T > C variants that make an individual more susceptible to endothelial dysfunction. A number of genotyping methods have been developed to investigate these variants. However, simultaneous detection methods using polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) analysis are still lacking. In this study, a novel multiplex PCR-RFLP method for the simultaneous detection of the MTHFR 677 C > T, eNOS +894 G > T and eNOS −786 T > C variants was developed. A total of 114 healthy Malay subjects were recruited. The MTHFR 677 C > T, eNOS +894 G > T and eNOS −786 T > C variants were genotyped using the novel multiplex PCR-RFLP and confirmed by DNA sequencing as well as snpBLAST. Allele frequencies of MTHFR 677 C > T, eNOS +894 G > T and eNOS −786 T > C were calculated using the Hardy-Weinberg equation. Methods: The 114 healthy volunteers were recruited for this study, and their DNA was extracted. Primer pairs were designed using Primer 3 Software version 0.4.0 and validated against the BLAST database. Primer specificity, functionality and annealing temperature were tested using uniplex PCR methods that were later combined into a single multiplex PCR. Restriction fragment length polymorphism (RFLP) analysis was performed in three separate tubes, followed by agarose gel electrophoresis. The PCR product residual was purified and sent for DNA sequencing. Results: The allele frequencies for MTHFR 677 C > T were 0.89 (C allele) and 0.11 (T allele); for eNOS +894 G > T, the allele frequencies were 0.58 (G allele) and 0.43 (T allele); and for eNOS −786 T > C, the allele frequencies were 0.87 (T allele) and 0.13 (C allele). Conclusions: Our PCR-RFLP method is a simple, cost-effective and time-saving method. It can be used to successfully genotype subjects for the MTHFR 677 C > T, eNOS +894 G > T and eNOS −786 T > C variants simultaneously, with 100% concordance with the DNA sequencing data. This method can be routinely used for rapid investigation of the MTHFR 677 C > T, eNOS +894 G > T and eNOS −786 T > C variants.
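
For reference, the standard allele-counting calculation behind frequencies such as those reported above looks as follows; the genotype counts in the example are hypothetical, chosen only to give frequencies of roughly 0.89/0.11 for 114 subjects, and are not the study's raw data:

    def allele_frequencies(n_cc, n_ct, n_tt):
        # Each person carries two alleles; heterozygotes contribute one of each.
        n = n_cc + n_ct + n_tt
        p_c = (2 * n_cc + n_ct) / (2 * n)   # frequency of the C allele
        p_t = 1.0 - p_c                     # frequency of the T allele
        return p_c, p_t

    # e.g. MTHFR 677 C > T with made-up genotype counts for 114 subjects:
    print(allele_frequencies(n_cc=90, n_ct=23, n_tt=1))  # ~ (0.89, 0.11)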

Relevance: 20.00%

Abstract:

NCOA3 is a known low-to-moderate-risk breast cancer susceptibility gene, amplified in 5–10% and overexpressed in about 60% of breast tumours. Additionally, this overexpression is associated with Tamoxifen resistance and poor prognosis. Previously, two variants of NCOA3, 1758G > C and 2880A > G, have been associated with breast cancer in two independent populations. Here we assessed the influence of the two NCOA3 variants on breast cancer risk by genotyping an Australian case–control study population. 172 cases and 178 controls were successfully genotyped for the 1758G > C variant, and 186 cases and 182 controls were successfully genotyped for the 2880A > G variant, using high-resolution melt (HRM) analysis. The genotypes of the 1758G > C variant were validated by sequencing. χ2 tests were performed to determine whether significant differences exist in the genotype and allele frequencies between the cases and controls. χ2 analysis returned no statistically significant difference (p > 0.05) in genotype frequencies between cases and controls for 1758G > C (χ2 = 0.97, p = 0.6158) or 2880A > G (χ2 = 2.09, p = 0.3516). Similarly, no statistical difference was observed for allele frequencies for 1758G > C (χ2 = 0.07, p = 0.7867) or 2880A > G (χ2 = 0.04, p = 0.8365). Haplotype analysis of the two SNPs also showed no difference between the cases and the controls (p = 0.9585). Our findings in an Australian Caucasian population, composed of breast cancer sufferers and an age-matched control population, did not support the findings of previous studies demonstrating that these markers play a significant role in breast cancer susceptibility. Here, no significant difference was detected between breast cancer patients and healthy matched controls in either the genotype or allele frequencies for the investigated variants (all p ≥ 0.05). While an association between the two variants and breast cancer was not detected in our case–control study population, exploring these variants in a larger population of the same kind may obtain results in concordance with previous studies. Given the importance of NCOA3 and its involvement in biological processes implicated in breast cancer, and the possible implications variants of the gene could have on the response to Tamoxifen therapy, NCOA3 remains a candidate for further investigation.
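
A χ2 test of the kind reported above can be reproduced with scipy; the genotype table below is hypothetical (only the group sizes of 172 cases and 178 controls are taken from the abstract), so the resulting statistic will not match the published values:

    from scipy.stats import chi2_contingency

    # Hypothetical genotype counts:  GG   GC   CC   (NCOA3 1758G > C)
    cases    = [ 98,  62,  12]
    controls = [105,  60,  13]

    chi2, p, dof, expected = chi2_contingency([cases, controls])
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")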

Relevance: 20.00%

Abstract:

Essential hypertensives display enhanced signal transduction through pertussis toxin-sensitive G proteins. The T allele of a C825T variant in exon 10 of the G protein β3 subunit gene (GNB3) induces formation of a splice variant (Gβ3-s) with enhanced activity. The T allele of GNB3 was recently shown to be associated with hypertension in unselected German patients (frequency = 0.31 versus 0.25 in controls). To confirm and extend this finding in a different setting, we performed an association study in Australian white hypertensives. This involved an extensively examined cohort of 110 hypertensives, each of whom was the offspring of two hypertensive parents, and 189 normotensives whose parents were both normotensive beyond age 50 years. Genotyping was performed by polymerase chain reaction and digestion with BseDI, which either cut (C allele) or did not cut (T allele) the 268-bp polymerase chain reaction product. The T allele frequency in the hypertensive group was 0.43, compared with 0.25 in the normotensive group (χ2=22; P=0.00002; odds ratio=2.3; 95% CI=1.7 to 3.3). The T allele tracked with higher pretreatment blood pressure: diastolic=105±7, 109±16, and 128±28 mm Hg (mean±SD) for CC, CT, and TT, respectively (P=0.001 by 1-way ANOVA). Blood pressures were higher in female hypertensives with a T allele (P=0.006 for systolic and P=0.0003 for diastolic by ANOVA) than in male hypertensives. In conclusion, the present study of a group with a strong family history supports a role for a genetically determined, physiologically active splice variant of the G protein β3 subunit gene in the causation of essential hypertension.
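
A quick arithmetic check, assuming the reported odds ratio was computed from the T-allele frequencies of 0.43 (hypertensives) and 0.25 (normotensives):

    # Odds ratio from the two reported allele frequencies.
    p_case, p_ctrl = 0.43, 0.25
    odds_ratio = (p_case / (1 - p_case)) / (p_ctrl / (1 - p_ctrl))
    print(round(odds_ratio, 1))  # 2.3, consistent with the value reported above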

Relevance: 20.00%

Abstract:

In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, and a detailed complexity and accuracy analysis is also given. In the case of time complexity, it is shown that the average case time complexity of the algorithm is Θ(n²), while the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
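
For orientation, the Eden question asks whether a configuration has any pre-image under the automaton's update rule. The brute-force sketch below illustrates the problem on a hypothetical four-node graph with a majority rule; it is exponential in the number of cells and is not the polynomial-time neighborhood elimination algorithm described above:

    from itertools import product

    # Hypothetical graph: adjacency lists over nodes 0..3 (a 4-cycle).
    adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}

    def step(config):
        """One synchronous update: a cell becomes 1 iff a majority of its
        closed neighbourhood (itself plus neighbours) is 1."""
        new = {}
        for v, nbrs in adj.items():
            ones = config[v] + sum(config[u] for u in nbrs)
            new[v] = 1 if 2 * ones > len(nbrs) + 1 else 0
        return new

    def is_garden_of_eden(target):
        """True if no configuration maps to `target` (i.e. it has no pre-image)."""
        for bits in product([0, 1], repeat=len(adj)):
            pre = dict(zip(adj, bits))
            if step(pre) == target:
                return False
        return True

    print(is_garden_of_eden({0: 1, 1: 0, 2: 1, 3: 0}))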

Relevance: 20.00%

Abstract:

A people-to-people matching system (or match-making system) refers to a system in which users join with the objective of meeting other users with a common need. Some real-world examples of these systems are employer-employee (in job search networks), mentor-student (in university social networks), consumer-to-consumer (in marketplaces) and male-female (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured to develop an efficient match-making system. Most existing studies utilize information either about each of the users in isolation or about their interactions separately, and develop recommender systems using one form of information only. It is imperative to understand the linkages among the users in the network and use them in developing a match-making system. This study utilizes several social network analysis methods, such as graph theory, the small-world phenomenon, centrality analysis and density analysis, to gain insight into the entities and the relationships present in this network. This paper also proposes a new type of graph called an “attributed bipartite graph”. Using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed which generates recommendations for new users and shows improvement in accuracy over the baseline methods.
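
A minimal sketch of what an attributed bipartite graph could look like in networkx, with node attributes on each side and weighted interaction edges; the entities, attributes and the density computation are illustrative assumptions, not the paper's construction:

    import networkx as nx
    from networkx.algorithms import bipartite

    B = nx.Graph()
    # One side: job seekers (bipartite=0); other side: employers (bipartite=1).
    B.add_node("alice", bipartite=0, skills={"java", "sql"}, location="bne")
    B.add_node("bob",   bipartite=0, skills={"python"},      location="syd")
    B.add_node("acme",  bipartite=1, requires={"java"},      location="bne")

    # Edges record interactions (e.g. profile views), weighted by frequency.
    B.add_edge("alice", "acme", weight=3)

    seekers = {n for n, d in B.nodes(data=True) if d["bipartite"] == 0}
    print(bipartite.density(B, seekers))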

Relevance: 20.00%

Abstract:

The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such “small” schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of “small” schemes is polynomial in the number of participants, which in turn gives rise to a polynomial time construction. We also show that if we apply the Stinson construction to the “small” schemes arising from our new construction, both constructions achieve the same information rate.
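
For reference, the comparison above is in terms of the information rate of a scheme; a standard definition (the conventional one, not a formula taken from the paper) is:

    % Information rate of a secret sharing scheme over participant set \mathcal{P},
    % with secret space K and share space S(P) for participant P:
    \rho = \min_{P \in \mathcal{P}} \frac{\log_2 |K|}{\log_2 |S(P)|}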

Relevance: 20.00%

Abstract:

Businesses document their operational processes as process models. The common practice is to represent process models as directed graphs: the nodes of a process graph represent activities, and directed edges constitute activity ordering constraints. The flexible process graph modeling approach proposes to generalize the process graph structure to a hypergraph. The obtained process structure aims at formalizing ad-hoc process control flow. In this paper we discuss aspects relevant to the concurrent, collaborative execution of process activities organized as a flexible process graph. We provide a real-world flexible process scenario to illustrate the approach.
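
A minimal sketch of a process structure generalised to a hypergraph, assuming activities are nodes and each hyperedge groups the activities involved in one ordering constraint; the class, its methods and the example activities are hypothetical illustrations, not the paper's formalism:

    from dataclasses import dataclass, field

    @dataclass
    class ProcessHypergraph:
        activities: set = field(default_factory=set)
        hyperedges: list = field(default_factory=list)  # each edge is a set of activities

        def add_constraint(self, *activities):
            # A hyperedge relates an arbitrary group of activities.
            edge = set(activities)
            self.activities |= edge
            self.hyperedges.append(edge)

        def concurrent_candidates(self, activity):
            """Activities never grouped with `activity` by any hyperedge;
            in this toy model they are candidates for concurrent execution."""
            linked = set().union(*(e for e in self.hyperedges if activity in e), set())
            return self.activities - linked

    p = ProcessHypergraph()
    p.add_constraint("receive_order", "check_stock")
    p.add_constraint("check_stock", "pack", "ship")
    print(p.concurrent_candidates("receive_order"))  # {'pack', 'ship'}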