804 results for Distance-balanced graph
Abstract:
Normally, vehicles queued at an intersection reach the maximum flow rate only after the fourth vehicle, resulting in start-up lost time. This research demonstrated that the Enlarged Stopping Distance (ESD) concept can help reduce start-up lost time and therefore increase traffic flow capacity at signalised intersections. In essence, ESD gives a queuing vehicle sufficient space to accelerate simultaneously, without having to wait for the vehicle in front to depart, hence reducing start-up lost time. In practice, the ESD concept would be most effective when the stopping distance between the first and second vehicles is enlarged, allowing faster clearance of the intersection.
Abstract:
This practice-led research examines the generative function of loss in fiction that explores themes of grief and longing. This research considers how loss may be understood as a structuring mechanism through which characters evaluate time, resolve loss and effect future change. The creative work is a work of literary fiction titled A Distance Too Far Away. Aubrey, the story’s protagonist, is a woman in her twenties living in Brisbane in the early 1980s, carving out an independent life for herself away from her family. Through a flashback narrative sequence, told from the perspective of the twelve-year-old narrator, Aubrey retraces a significant point of rupture in her life following a series of family tragedies. A Distance Too Far Away explores the tension between belonging and freedom, and considers how the past provides a malleable space for illuminating desire in order to traverse the gap between the world as it is and the world as we want it to be. The exegetical component of this research considers an alternative critical frame for interpreting the work of American author Anne Tyler, a writer who has had a significant influence on my own practice. Tyler is frequently criticised for creating sentimental and inert characters, and many critics observe that nothing happens in her circular plots. This research challenges these assertions and, through a contextual analysis of Tyler’s Ladder of Years (1995), investigates how Tyler engages with memory and nostalgia in order to move across time and resolve loss.
Abstract:
In this paper, we propose a semi-supervised approach to anomaly detection in Online Social Networks. The social network is modeled as a graph and its features are extracted to detect anomalies. A clustering algorithm is then used to group users based on these features, and fuzzy logic is applied to assign a degree of anomalous behavior to the users in these clusters. Empirical analysis shows the effectiveness of this method.
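As a toy illustration of the cluster-then-score idea described above, the sketch below runs a tiny two-means clustering on one-dimensional user features and assigns each user a fuzzy anomaly degree in [0, 1] based on distance to the nearest cluster centre. The one-dimensional feature, the two-cluster assumption, and the normalisation scheme are all hypothetical stand-ins; the paper's actual algorithm may differ.

```python
def kmeans_2(points, iters=10):
    """Tiny 2-means on 1-D feature values (a hypothetical stand-in
    for the paper's clustering step)."""
    c = [min(points), max(points)]  # initialise centres at the extremes
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            groups[0 if abs(p - c[0]) <= abs(p - c[1]) else 1].append(p)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

def anomaly_degrees(points):
    """Fuzzy degree in [0, 1]: distance to the nearest cluster centre,
    normalised by the largest such distance across all users."""
    c = kmeans_2(points)
    d = [min(abs(p - c[0]), abs(p - c[1])) for p in points]
    m = max(d) or 1.0
    return [x / m for x in d]
```

A user whose feature value sits far from both cluster centres (here, the value 5.0 between two tight clusters) receives the highest anomaly degree.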
Abstract:
The count-min sketch is a useful data structure for recording and estimating the frequency of string occurrences, such as passwords, in sub-linear space with high accuracy. However, it cannot be used to draw conclusions on groups of strings that are similar, for example close in Hamming distance. This paper introduces a variant of the count-min sketch which allows for estimating counts within a specified Hamming distance of the queried string. This variant can be used to prevent users from choosing popular passwords, like the original sketch, but it also allows for a more efficient method of analysing password statistics.
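For concreteness, here is a standard count-min sketch together with a naive Hamming-ball query that simply sums estimates over every string within Hamming distance 1 of the query. The brute-force neighbour enumeration only illustrates what such a query computes; the paper's variant achieves this more efficiently, and the hash construction and parameters below are arbitrary choices, not the paper's.

```python
import hashlib

class CountMinSketch:
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _cells(self, s):
        # one independent hash per row, derived from a salted SHA-256
        for i in range(self.depth):
            h = hashlib.sha256(f"{i}:{s}".encode()).digest()
            yield i, int.from_bytes(h[:8], "big") % self.width

    def add(self, s):
        for i, j in self._cells(s):
            self.table[i][j] += 1

    def estimate(self, s):
        # never underestimates; min over rows limits overcounting
        return min(self.table[i][j] for i, j in self._cells(s))

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def hamming_ball_estimate(cms, s):
    """Naive count within Hamming distance 1: query the string itself
    plus every single-character substitution."""
    total = cms.estimate(s)
    for pos in range(len(s)):
        for c in ALPHABET:
            if c != s[pos]:
                total += cms.estimate(s[:pos] + c + s[pos + 1:])
    return total
```

Because each neighbour query can itself overestimate, the summed Hamming-ball count inherits the sketch's one-sided error: it never undercounts the true ball total.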
Abstract:
This paper discusses computer mediated distance learning on a Master's level course in the UK and student perceptions of this as a quality learning environment.
Abstract:
This research has successfully applied super-resolution and multiple modality fusion techniques to address the major challenges of human identification at a distance using face and iris. The outcome of the research is useful for security applications.
Abstract:
A people-to-people matching system (or match-making system) is a system that users join with the objective of meeting other users with a common need. Some real-world examples of these systems are employer-employee (in job search networks), mentor-student (in university social networks), consumer-to-consumer (in marketplaces) and male-female (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured to develop an efficient match-making system. Most existing studies utilize information either about each of the users in isolation or about their interactions separately, and develop recommender systems using only one form of information. It is imperative to understand the linkages among the users in the network and use them in developing a match-making system. This study utilizes several social network analysis methods, such as graph theory, the small world phenomenon, centrality analysis and density analysis, to gain insight into the entities and relationships present in this network. This paper also proposes a new type of graph called the “attributed bipartite graph”. Using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed which generates recommendations for new users and shows improved accuracy over the baseline methods.
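The abstract names but does not define the proposed “attributed bipartite graph”; the sketch below shows one plausible minimal realisation, in which two disjoint user groups each carry attribute dictionaries and weighted edges record cross-group interactions. The class name, methods, and weight scheme are my assumptions for illustration, not the paper's definition.

```python
class AttributedBipartiteGraph:
    """Two disjoint user groups; every node carries an attribute dict,
    and edges (which may only cross between the groups) accumulate
    interaction weights. A hypothetical sketch of the paper's idea."""

    def __init__(self):
        self.left = {}   # e.g. employers, mentors
        self.right = {}  # e.g. job seekers, students
        self.edges = {}  # (left_id, right_id) -> interaction weight

    def add_left(self, node_id, **attrs):
        self.left[node_id] = attrs

    def add_right(self, node_id, **attrs):
        self.right[node_id] = attrs

    def add_edge(self, l, r, weight=1.0):
        assert l in self.left and r in self.right, "edges must cross groups"
        self.edges[(l, r)] = self.edges.get((l, r), 0.0) + weight

    def degree(self, node_id):
        # number of distinct partners a node interacts with
        return sum(1 for (l, r) in self.edges if node_id in (l, r))
```

Storing node attributes alongside cross-group interaction weights is what lets a hybrid recommender combine profile similarity with network structure in one model.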
Abstract:
The education sector has changed dramatically in the past half decade. In a time of globalisation of education and tightening budgets, various paradigm shifts and challenges have rapidly changed learning and teaching. These include: meeting student expectations for more engaging, more interactive learning experiences; an increased focus on delivering content online; and the complexities of fast-changing technologies. Rising to these challenges and responding to them is a complex and multi-faceted task. This paper discusses educational theories and issues and explores current educational practices in the context of teaching undergraduate students via distance education at university. A case study applies a framework drawn from engineering education using the learner-centric concept of academagogy. Results showed that academagogy actively empowers students to build effective learning, and engages facilitators in meaningful teaching and delivery methods.
Abstract:
In plants, silencing of mRNA can be transmitted from cell to cell and also over longer distances from roots to shoots. To investigate the long-distance mechanism, WT and mutant shoots were grafted onto roots silenced for an mRNA. We show that three genes involved in a chromatin silencing pathway, NRPD1a encoding RNA polymerase IVa, RNA-dependent RNA polymerase 2 (RDR2), and DICER-like 3 (DCL3), are required for reception of long-distance mRNA silencing in the shoot. A mutant representing a fourth gene in the pathway, argonaute4 (ago4), was also partially compromised in the reception of silencing. This pathway produces 24-nt siRNAs and results in decapped RNA, a known substrate for amplification of dsRNA by RDR6. Activation of silencing in grafted shoots depended on RDR6, but no 24-nt siRNAs were detected in mutant rdr6 shoots, indicating that RDR6 also plays a role in initial signal perception. After amplification of decapped transcripts, DCL4 and DCL2 act hierarchically as they do in antiviral resistance to produce 21- and 22-nt siRNAs, respectively, and these guide mRNA degradation. Several dcl genotypes were also tested for their capacity to transmit the mobile silencing signal from the rootstock. dcl1-8 and a dcl2 dcl3 dcl4 triple mutant are compromised in micro-RNA and siRNA biogenesis, respectively, but were unaffected in signal transmission. © 2007 by The National Academy of Sciences of the USA.
Abstract:
This paper investigates the correlation between transit route passenger loading and travel distance, and its implications for quality of service (QoS) and resource productivity. It uses Automatic Fare Collection (AFC) data across a weekday on a premium bus line in Brisbane, Australia. A composite load-distance factor is proposed as a new measure for profiling transit route on-board passenger comfort QoS. Understanding these measures and their correlation is important for planning, design, and operational activities.
Abstract:
This paper investigates the quality of service and resource productivity implications of transit route passenger loading and travel distance. Weekday Automatic Fare Collection data for a premium radial bus route in Brisbane, Australia, is used to investigate the correlation between load factor and distance factor. Relationships between boardings and transit work indicate that distance factor generally increases with load factor. Time series analysis is then presented by examining each direction on an hour-by-hour basis. Inbound correlation is medium to strong across the entire span of service and strong for daytime services up to 19:30, while outbound correlation is strong across the entire span. Passengers tend to make longer-distance, peak-direction commuter trips under the least comfortable conditions, with stretched peak schedules, compared with off-peak periods. Therefore, productivity gains may be possible by adjusting fleet utilization during off-peak times. Weekday profiles by direction are established for a composite load-distance factor. A threshold corresponding to standing passengers on the Maximum Load Segment reveals that on-board loading and travel distance combined are more severe during the morning inbound peak than the evening outbound peak, although the sharpness of the former suggests that encouraging shoulder-peak travel would be more effective in the morning than in the evening. Further research suggested includes: consideration of a travel duration factor, relating within-hour noise to the Peak Hour Factor, profiling the load-distance factor across a range of case studies, and relating the load-distance factor threshold to line length.
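As a worked illustration of how a composite load-distance factor might be computed from AFC-style data, the sketch below multiplies a load factor (peak on-board load over seated capacity) by a distance factor (mean passenger trip length over route length). This exact formulation is an assumption made for illustration; the paper's definition may differ.

```python
def load_factor(max_load, seated_capacity):
    # on-board passengers on the Maximum Load Segment per seat;
    # values above 1.0 imply standing passengers
    return max_load / seated_capacity

def distance_factor(trip_distances_km, route_length_km):
    # mean passenger trip length relative to route length
    return (sum(trip_distances_km) / len(trip_distances_km)) / route_length_km

def load_distance_factor(max_load, seated_capacity,
                         trip_distances_km, route_length_km):
    # hypothetical composite: product of the two component factors
    return (load_factor(max_load, seated_capacity)
            * distance_factor(trip_distances_km, route_length_km))
```

For example, 60 passengers at the Maximum Load Segment on a 50-seat bus (load factor 1.2), with trips averaging half the 20 km route (distance factor 0.5), gives a composite factor of 0.6.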
Abstract:
The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort, enabling the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method is rather strict in its input requirements, as it assumes the training data to be multivariate normal, which is not always available, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based upon the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setup for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via applications to benchmark structural data from the field.
Abstract:
The producer has for many years been a central agent in recording studio sessions; the validation of this role was, in many ways, related to the producer’s physical presence in the studio, to a greater or lesser extent. However, improvements in the speed of digital networks have allowed studio sessions to be produced long-distance, in real time, through communication programs such as Skype or REDIS. How does this impact the role of the producer, a “nexus between the creative inspiration of the artist, the technology of the recording studio, and the commercial aspirations of the record company” (Howlett 2012)? Drawing on observations of a studio recording session in Lisbon produced through Skype from New York, this article focuses on the role of the producer in these relatively new recording contexts involving long-distance media networks. The methodology involved participant observation carried out at Estúdios Namouche in Lisbon (where the session took place), as part of doctoral research. This ethnographic approach also included a number of semi-directed interviews with the different actors in this scenario—musicians, recording engineers, composers and producers. As a theoretical framework, the research of De Zutter and Sawyer on Distributed Creativity is used, as the recording studio is an example of “a cognitive system where […] tasks are not accomplished by separate individuals, but rather through the interactions of those individuals” (DeZutter 2009:4). Therefore, creativity often emerges as a result of this interaction.
Abstract:
We study the natural problem of secure n-party computation (in the computationally unbounded attack model) of circuits over an arbitrary finite non-Abelian group (G,⋅), which we call G-circuits. Besides its intrinsic interest, this problem is also motivated by a completeness result of Barrington, which states that such protocols can be applied to the general secure computation of arbitrary functions. For flexibility, we are interested in protocols which only require black-box access to the group G (i.e. the only computations performed by players in the protocol are the group operation, the group inverse, and sampling a uniformly random group element). Our investigations focus on the passive adversarial model, where up to t of the n participating parties are corrupted.
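The black-box group access described above can be pictured as an interface exposing exactly three operations: the group operation, inversion, and uniform sampling. The sketch below instantiates it with S₃, the smallest non-Abelian group, purely for illustration; the paper's protocols work over any finite non-Abelian group, and this class is not drawn from the paper itself.

```python
import random

class BlackBoxGroup:
    """Black-box access to a finite group: only the three permitted
    operations are exposed. The backing group here is S3, permutations
    of {0, 1, 2}, represented as tuples (chosen only for illustration)."""

    ELEMENTS = [(0, 1, 2), (0, 2, 1), (1, 0, 2),
                (1, 2, 0), (2, 0, 1), (2, 1, 0)]

    def op(self, a, b):
        # group operation: composition (a . b)(i) = a(b(i))
        return tuple(a[b[i]] for i in range(3))

    def inv(self, a):
        # group inverse: the permutation undoing a
        r = [0, 0, 0]
        for i, v in enumerate(a):
            r[v] = i
        return tuple(r)

    def sample(self):
        # uniformly random group element
        return random.choice(self.ELEMENTS)
```

Because S₃ is non-Abelian, `op(a, b)` and `op(b, a)` generally differ, which is exactly the setting the G-circuit protocols must handle.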
Abstract:
The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such “small” schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of “small” schemes is polynomial in the number of participants, which in turn gives rise to a polynomial time construction. We also show that if we apply the Stinson construction to the “small” schemes arising from our new construction, both yield the same information rate.