665 results for Constrained network mapping
Abstract:
A whole-genome scan was conducted to map quantitative trait loci (QTL) for BSE resistance or susceptibility. Cows from four half-sib families were included and 173 microsatellite markers were used to construct a 2835-cM (Kosambi) linkage map covering 29 autosomes and the pseudoautosomal region of the sex chromosome. Interval mapping by linear regression was applied and extended to a multiple-QTL analysis approach that used identified QTL on other chromosomes as cofactors to increase mapping power. In the multiple-QTL analysis, two genome-wide significant QTL (BTA17 and X/Y ps) and four genome-wide suggestive QTL (BTA1, 6, 13, and 19) were revealed. The QTL identified here using linkage analysis do not overlap with regions previously identified using TDT analysis. One factor that may explain the disparity between the results is that a more extensive data set was used in the present study. Furthermore, methodological differences between TDT and linkage analyses may affect the power of these approaches.
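As a rough illustration of the regression-based interval mapping described above, the sketch below regresses a toy phenotype on the probability that an offspring inherited one sire haplotype at a single tested position. All data, the effect size, and the single-position simplification are illustrative assumptions, not the study's actual analysis (which scans positions along the linkage map and adds QTL cofactors).

```python
import numpy as np

# Toy half-sib data: in real interval mapping, the inheritance
# probabilities would be computed from flanking-marker genotypes.
rng = np.random.default_rng(1)
n = 200
p_allele = rng.uniform(0, 1, n)          # P(inherit sire haplotype 1)
qtl_effect = 0.8                          # assumed allele substitution effect
phenotype = qtl_effect * p_allele + rng.standard_normal(n)

# Full model: phenotype ~ intercept + inheritance probability
X = np.column_stack([np.ones(n), p_allele])
beta, rss_full = np.linalg.lstsq(X, phenotype, rcond=None)[:2]

# F-type test statistic against the no-QTL (mean-only) model
rss_null = np.sum((phenotype - phenotype.mean()) ** 2)
f_stat = (rss_null - rss_full[0]) / (rss_full[0] / (n - 2))
print(beta[1], f_stat)
```

In a genome scan this regression is repeated at each map position, and the position with the largest test statistic within a chromosome is the putative QTL location.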
Abstract:
One measure considered for reducing road traffic noise in a city is the control of traffic flow and physical distribution. To apply this measure effectively, a model for predicting traffic flow across the citywide road network is necessary. In this study, the existing model AVENUE was used as the traffic flow prediction model. The traffic flow model was integrated with a road-vehicle sound power model and a sound propagation model to establish a new road traffic noise prediction model. As a case study, the prediction model was applied to the road network of Tsukuba city in Japan, and a noise map of the city was produced. To examine the calculation accuracy of the noise map, the calculated noise values on the main roads were compared with measured values. The comparison indicated that a high-accuracy noise map of the city could potentially be produced using the noise prediction model developed in this study.
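The chaining of a vehicle sound power model into a propagation model can be sketched as below. The coefficients and the free-field spherical-spreading formula are textbook-style assumptions for illustration only, not the actual AVENUE-based model or its parameters.

```python
import math

def sound_power_level(speed_kmh: float, base_db: float = 46.0) -> float:
    """Illustrative vehicle sound power model: the power level grows with
    the logarithm of speed (coefficients are assumptions)."""
    return base_db + 30.0 * math.log10(speed_kmh)

def propagate(lw_db: float, distance_m: float) -> float:
    """Free-field spherical spreading from a point source:
    Lp = Lw - 20*log10(r) - 11 (dB)."""
    return lw_db - 20.0 * math.log10(distance_m) - 11.0

def energy_sum(levels_db):
    """Combine incoherent sources by energy addition."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

# Receiver level from two vehicles at 60 km/h, 10 m and 25 m away
levels = [propagate(sound_power_level(60.0), d) for d in (10.0, 25.0)]
print(round(energy_sum(levels), 1))
```

A citywide noise map repeats this receiver calculation on a grid, with traffic volumes and speeds on each link supplied by the traffic flow model.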
Abstract:
The existence of the Macroscopic Fundamental Diagram (MFD), which relates network space-mean density and flow, has been shown in urban networks under homogeneous traffic conditions. Because the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation that exploit the MFD concept have been reported. The key requirement for a well-defined MFD is homogeneity of the area-wide traffic condition, which cannot be universally expected in the real world. For practical application of the MFD concept, several researchers have identified factors influencing network homogeneity. However, they did not explicitly take into account drivers' behaviour under real-time information provision, which has a significant impact on the shape of the MFD. This research aims to demonstrate the impact of drivers' route choice behaviour on network performance, using the MFD as the measurement. A microscopic simulation is chosen as the experimental platform. By varying the ratio of en-route informed drivers to pre-trip informed drivers, and by adopting different route choice parameters, various scenarios are simulated to investigate how drivers' adaptation to traffic congestion influences network performance and the shape of the MFD. This study confirmed the impact of information provision on the MFD shape and highlighted the significance of route choice parameter settings as an influencing factor in MFD analysis.
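A single point of an MFD can be computed from link-level measurements roughly as follows. The lane-length weighting is one common definition of network space-mean density and flow; the link values are invented for illustration.

```python
def network_mfd_point(links):
    """Aggregate link measurements into one MFD point.
    Each link: (length_km, lanes, density_veh_per_km_lane, flow_veh_per_h_lane).
    Network values are lane-length-weighted means."""
    total = sum(length * lanes for length, lanes, _, _ in links)
    density = sum(length * lanes * k for length, lanes, k, _ in links) / total
    flow = sum(length * lanes * q for length, lanes, _, q in links) / total
    return density, flow

links = [
    (1.2, 2, 25.0, 900.0),   # uncongested arterial
    (0.8, 1, 60.0, 650.0),   # congested side street
]
k, q = network_mfd_point(links)
print(k, q)   # one (density, flow) point on the MFD
```

Repeating this aggregation over successive simulation intervals traces out the density-flow curve whose shape the scenarios above compare.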
Abstract:
The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome many of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. This paper first presents a brief review of the most significant uncertainties inherent in SHM-oriented WSN platforms, and then investigates their effects on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when employing merged data from multiple tests. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and Data-driven Stochastic Subspace Identification (SSI-data), both of which have been widely applied over the past decade. Experimental accelerations collected by a wired sensory system on a large-scale laboratory bridge model are first used as clean data, and are then contaminated with different data pollutants in a sequential manner to simulate practical SHM-oriented WSN uncertainties. The results show the robustness of FDD and the precautions needed for the SSI-data family when dealing with SHM-WSN uncertainties. Finally, the use of measurement channel projection for the time-domain OMA techniques, and the preferred combination of OMA techniques to cope with SHM-WSN uncertainties, are recommended.
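The core of the FDD family mentioned above can be sketched in a few lines: estimate the cross-spectral density matrix between all channel pairs and track its first singular value over frequency; peaks suggest modal frequencies. The synthetic two-channel data and the spectral-estimation parameters below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(acc, fs, nperseg=1024):
    """FDD sketch: build the cross-spectral density matrix G(f) between
    all channel pairs, then take the first singular value at each
    frequency line."""
    n_ch = acc.shape[0]
    f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    s1 = np.linalg.svd(G, compute_uv=False)[:, 0]
    return f, s1

# Two noisy channels sharing a 5 Hz component
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
acc = np.vstack([np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)
                 for _ in range(2)])
f, s1 = fdd_first_singular_values(acc, fs)
print(f[np.argmax(s1)])   # peak near the 5 Hz mode
```

Data pollutants such as synchronization offsets or dropouts would be injected into `acc` before this step to study their effect on the singular-value spectrum.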
Abstract:
Next generation screens of diverse dimensions, such as the Pebble e-paper watch, Google's Project Glass, Microsoft's Kinect and IllumiRoom, and large-scale multi-touch screen surfaces, increasingly saturate and diversify the urban mediascape. This paper seeks to contribute to media architecture and interaction design theory by critically examining how these different screen formats are creating a ubiquitous screen mediascape across the city. We introduce next generation personal, domestic, and public screens. The paper critically challenges conventional dichotomies such as local / global, online / offline, private / public, large / small, and mobile / static that were created in the past to describe the qualities and characteristics of interfaces and their usage. More and more scholars recognise that the black-and-white nature of these dichotomies does not adequately represent the fluid and agile capabilities of many new screen interfaces. With this paper, we hope to illustrate the more nuanced 'trans-scalar' qualities of these new urban interactions, that is, ways in which they provide a range of functionality without being locked into either end of a scale.
Abstract:
Extracting and aggregating the relevant event records relating to an identified security incident from the multitude of heterogeneous logs in an enterprise network is a difficult challenge. Presenting the information in a meaningful way is an additional challenge. This paper looks at solutions to this problem by first identifying three main transforms: log collection, correlation, and visual transformation. Having identified that the CEE project will address the first transform, this paper focuses on the second, while the third is left for future work. To aggregate by correlating event records, we demonstrate the use of two correlation methods, simple and composite. These make use of a defined mapping schema and confidence values to dynamically query the normalised dataset and to constrain result events to within a time window. Doing so improves the quality of the results required by the iterative re-querying process. Final results of the process are output as nodes and edges suitable for presentation as a network graph.
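The simple correlation method described, querying for events that share a field value with a seed event within a time window and emitting node/edge output, might look like the sketch below. The record layout and field names are hypothetical, not the CEE schema or the paper's mapping schema.

```python
from datetime import datetime, timedelta

# Hypothetical normalised event records (field names are illustrative)
events = [
    {"ts": datetime(2013, 5, 1, 10, 0, 5), "src": "10.0.0.5",
     "dst": "10.0.0.9", "type": "login_fail"},
    {"ts": datetime(2013, 5, 1, 10, 0, 20), "src": "10.0.0.5",
     "dst": "10.0.0.9", "type": "login_ok"},
    {"ts": datetime(2013, 5, 1, 11, 30, 0), "src": "10.0.0.7",
     "dst": "10.0.0.9", "type": "login_ok"},
]

def simple_correlate(events, seed, field, window):
    """Simple correlation: events sharing `field` with the seed event
    and falling inside the time window around it."""
    return [e for e in events
            if e is not seed
            and e[field] == seed[field]
            and abs(e["ts"] - seed["ts"]) <= window]

seed = events[0]
hits = simple_correlate(events, seed, "src", timedelta(minutes=5))
# Emit (node, node, label) edges suitable for a network graph
edges = [(seed["src"], e["dst"], e["type"]) for e in hits]
print(edges)
```

Composite correlation would chain such queries, using the hits of one pass as seeds for the next, which is where the iterative re-querying comes in.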
Abstract:
Effective wayfinding is the successful interplay of human and environmental factors that results in a person moving from their current position to a desired location in a timely manner. To date, this process has not been modelled in a way that reflects this interplay. This paper proposes a complex-systems approach to modelling wayfinding using Bayesian Networks, and applies the model to airports. The model suggests that human factors have a greater impact on effective wayfinding in airports than environmental factors. The greatest influences on human factors are found to be the level of spatial anxiety experienced by travellers and their cognitive and spatial skills. The model also predicts that the navigation pathway a traveller must traverse has a larger impact on the effectiveness of an airport's environment in promoting effective wayfinding than the terminal design.
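A minimal flavour of the Bayesian Network approach is marginalising a success node over its parent factors. The two-parent structure and all probabilities below are toy assumptions, not the paper's fitted model.

```python
# Toy BN: Human factors (H) and Environment (E) -> Wayfinding success (W)
p_h = 0.7                      # P(good human factors) -- assumed prior
p_e = 0.6                      # P(supportive environment) -- assumed prior
p_w = {                        # P(success | H, E) -- assumed CPT
    (True, True): 0.95,
    (True, False): 0.75,
    (False, True): 0.40,
    (False, False): 0.10,
}

def p_success():
    """Marginal P(W = success) by enumerating the parent states."""
    total = 0.0
    for h in (True, False):
        for e in (True, False):
            total += ((p_h if h else 1 - p_h)
                      * (p_e if e else 1 - p_e)
                      * p_w[(h, e)])
    return total

print(p_success())
```

The full airport model has many more nodes (spatial anxiety, cognitive and spatial skills, pathway, terminal design), but inference follows the same conditional-probability mechanics.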
Abstract:
Strike-slip faults commonly display structurally complex areas of positive or negative topography. Understanding the development of such areas has important implications for earthquake studies and hydrocarbon exploration. Previous workers identified the key factors controlling the occurrence of both topographic modes and the related structural styles. Kinematic and stress boundary conditions are of first-order relevance. Surface mass transport and material properties affect fault network structure. Experiments demonstrate that dilatancy can generate positive topography even under simple-shear boundary conditions. Here, we use physical models with sand to show that the degree of compaction of the deformed rocks alone can determine the type of topography and related surface fault network structure in simple-shear settings. In our experiments, volume changes of ∼5% are sufficient to generate localized uplift or subsidence. We discuss scalability of model volume changes and fault network structure and show that our model fault zones satisfy geometrical similarity with natural flower structures. Our results imply that compaction may be an important factor in the development of topography and fault network structure along strike-slip faults in sedimentary basins.
Abstract:
The building sector is the dominant consumer of energy and therefore a major contributor to anthropogenic climate change. The rapid generation of photorealistic, 3D environment models with incorporated surface temperature data has the potential to improve thermographic monitoring of building energy efficiency. In pursuit of this goal, we propose a system which combines a range sensor with a thermal-infrared camera. Our proposed system can generate dense 3D models of environments with both appearance and temperature information, and is the first such system to be developed using a low-cost RGB-D camera. The proposed pipeline processes depth maps successively, forming an ongoing pose estimate of the depth camera and optimizing a voxel occupancy map. Voxels are assigned four channels representing estimates of their true RGB and thermal-infrared intensity values. Poses corresponding to each RGB and thermal-infrared image are estimated through a combination of timestamp-based interpolation and pre-determined knowledge of the extrinsic calibration of the system. Raycasting is then used to color the voxels to represent both visual appearance using RGB and an estimate of the surface temperature. The output of the system is a dense 3D model which can simultaneously represent both RGB and thermal-infrared data using one of two alternative representation schemes. Experimental results demonstrate that the system is capable of accurately mapping difficult environments, even in complete darkness.
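The four-channel voxel representation can be sketched as below. The running-average fusion used here is a simplified stand-in for the paper's raycast colouring step, and the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    """Four channels per voxel, as described in the pipeline:
    RGB appearance plus a thermal-infrared intensity estimate."""
    r: float = 0.0
    g: float = 0.0
    b: float = 0.0
    thermal: float = 0.0
    n: int = 0  # number of observations fused so far

    def fuse(self, r, g, b, thermal):
        """Running-average fusion of a new observation (a simple
        stand-in for raycast-based voxel colouring)."""
        self.n += 1
        w = 1.0 / self.n
        self.r += w * (r - self.r)
        self.g += w * (g - self.g)
        self.b += w * (b - self.b)
        self.thermal += w * (thermal - self.thermal)

v = Voxel()
v.fuse(100, 120, 90, 21.5)   # observation from one camera pose
v.fuse(110, 118, 94, 22.5)   # observation from a later pose
print(v.r, v.thermal)
```

Averaging observations from multiple poses is one way such a map suppresses per-frame sensor noise while keeping RGB and thermal estimates aligned in the same voxel grid.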
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy, Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames the process of becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
Purpose – The purpose of this paper is to contribute to the sociology-of-science type of accounting literature, addressing how accounting knowledge is established, advanced and extended. Design/methodology/approach – The research question is answered through the example of research into linkages between accounting and religion. Adopting an actor-network theory (ANT) approach, the paper follows the actors involved in the construction of accounting as an academic discipline through the controversies in which they engage to develop knowledge. Findings – The paper reveals that accounting knowledge is established, advanced and developed through the ongoing mobilisation of nonhumans (journals) that can enrol other humans and nonhumans. It shows that knowledge establishment, advancement and development is more contingent on network breadth than on research paradigms, which appear as side-effects of positioning vis-à-vis a community. Originality/value – The originality of this paper is twofold. First, ANT is applied to accounting knowledge itself, whereas the accounting literature has applied it to the spread of management accounting ideas, methods and practices. Second, an original methodology for data collection is developed by inviting authors from the network to give a reflexive account of their writings at the time they joined the network. Although well diffused in sociology and philosophy, such an approach is original in accounting research.
Abstract:
Understanding network traffic behaviour is crucial for managing and securing computer networks. One important technique is to mine frequent patterns or association rules from analysed traffic data. On the one hand, association rule mining usually generates a huge number of patterns and rules, many of them meaningless or unwanted by users; on the other hand, it can miss necessary knowledge if it does not consider the hierarchy relationships in the network traffic data. To address these issues, this paper proposes a hybrid association rule mining method for characterizing network traffic behaviour. Rather than frequent patterns, the proposed method generates non-similar closed frequent patterns from network traffic data, which can significantly reduce the number of patterns. The method also derives new attributes from the original data to discover novel knowledge according to hierarchy relationships in network traffic data and user interests. Experiments performed on real network traffic data show that the proposed method is promising and can be used in real applications. Copyright 2013 John Wiley & Sons, Ltd.
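The difference between frequent and closed frequent patterns can be illustrated on toy flow records: a pattern is closed when no frequent superset has the same support, so reporting only closed patterns shrinks the output without losing support information. The records and the brute-force enumeration below are illustrative, not the paper's method or data.

```python
from itertools import combinations

transactions = [  # toy flow records: sets of traffic attributes
    {"tcp", "web", "out"},
    {"tcp", "web", "out"},
    {"udp", "dns", "out"},
    {"tcp", "web", "in"},
]

def frequent_itemsets(transactions, min_support):
    """Brute-force enumeration of itemsets meeting min_support
    (fine for a toy example; real miners prune the search space)."""
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            sup = sum(1 for t in transactions if set(cand) <= t)
            if sup >= min_support:
                freq[frozenset(cand)] = sup
    return freq

def closed_only(freq):
    """Keep only closed patterns: no frequent proper superset
    has equal support."""
    return {p: s for p, s in freq.items()
            if not any(p < q and s == sq for q, sq in freq.items())}

freq = frequent_itemsets(transactions, min_support=2)
closed = closed_only(freq)
print(len(freq), len(closed))   # closed set is strictly smaller
```

Here 7 frequent patterns collapse to 3 closed ones, which is the kind of reduction the abstract refers to, before its additional non-similarity filtering.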
Abstract:
This thesis explores how governance networks prioritise and engage with their stakeholders, by studying three exemplars of “Regional Road Group” governance networks in Queensland, Australia. In the context of managing regionally significant road works programs, stakeholder prioritisation is a complex activity which is unlikely to influence interactions with stakeholders outside of the network. However, stakeholder priority is more likely to influence stakeholder interactions within the networks themselves. Both stakeholder prioritisation and engagement are strongly influenced by the way that the networks are managed, and in particular network operating rules and continuing access to resources.
Abstract:
Objective: Effective management of multi-resistant organisms is an important issue for hospitals both in Australia and overseas. This study investigates the utility of Bayesian Network (BN) analysis for examining relationships between risk factors and colonisation with Vancomycin-Resistant Enterococcus (VRE).
Design: Bayesian Network analysis was performed using infection control data collected over a period of 36 months (2008-2010).
Setting: Princess Alexandra Hospital (PAH), Brisbane.
Outcome of interest: Number of new VRE isolates.
Methods: A BN is a probabilistic graphical model that represents a set of random variables and their conditional dependencies via a directed acyclic graph (DAG). A BN enables multiple interacting agents to be studied simultaneously. The initial BN model was constructed based on the infectious disease physician's expert knowledge and current literature. Continuous variables were dichotomised using third-quartile values of year 2008 data. The BN was used to examine the probabilistic relationships between VRE isolates and risk factors, and to establish which factors were associated with an increased probability of a high number of VRE isolates.
Software: Netica (version 4.16).
Results: Preliminary analysis revealed that VRE transmission and VRE prevalence were the most influential factors in predicting a high number of VRE isolates. Interestingly, several factors (hand hygiene and cleaning) known from the literature to be associated with VRE prevalence did not appear to be as influential as expected in this BN model.
Conclusions: This preliminary work has shown that Bayesian Network analysis is a useful tool for examining clinical infection prevention issues, where a web of factors often influences outcomes. This BN model can be restructured easily, enabling various combinations of agents to be studied.
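The third-quartile dichotomisation step mentioned under Methods can be sketched as follows. The linear-interpolation quantile convention and the monthly counts are assumptions for illustration; the study does not state which convention was used.

```python
def third_quartile(values):
    """Q3 by linear interpolation between order statistics
    (one common convention)."""
    s = sorted(values)
    pos = 0.75 * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    return s[lo] + frac * (s[min(lo + 1, len(s) - 1)] - s[lo])

def dichotomise(values, threshold):
    """Map each continuous value to 'high' (> threshold) or 'low',
    as done before entering variables into the BN."""
    return ["high" if v > threshold else "low" for v in values]

monthly_isolates = [2, 5, 1, 7, 3, 9, 4, 6]   # hypothetical counts
q3 = third_quartile(monthly_isolates)
print(q3, dichotomise(monthly_isolates, q3))
```

Fixing the threshold from the 2008 baseline, as the study describes, lets later months be classified against a stable reference rather than a moving one.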