872 results for Network cost allocation
Abstract:
The availability of bridges is crucial to people’s daily life and the national economy. Bridge health prediction plays an important role in bridge management because maintenance optimization is implemented based on prediction results of bridge deterioration. Conventional bridge deterioration models can be categorised into two groups, namely condition-state models and structural reliability models. An optimal maintenance strategy should be based on both the condition states and the structural reliability of a bridge; however, none of the existing deterioration models considers both. This study thus proposes a Dynamic Objective Oriented Bayesian Network (DOOBN) based method to overcome the limitations of the existing methods. The methodology can act as a flexible unifying tool that integrates a variety of approaches and information for better bridge deterioration prediction. Two demonstrative case studies are conducted to preliminarily justify the feasibility of the methodology.
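As a minimal illustration of the condition-state side of such a model, the sketch below propagates a belief over discrete condition states through a simple discrete-time transition structure; the state labels, transition probabilities and function names are placeholders rather than values from the study, and the full DOOBN would additionally couple this with structural reliability variables.

```python
import numpy as np

# Illustrative condition states and a yearly transition matrix (hypothetical values);
# rows = current state, columns = next state, and deterioration only moves downward.
STATES = ["very good", "good", "fair", "poor"]
TRANSITION = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

def predict_condition(belief, years):
    """Propagate a belief over condition states forward by `years` time slices."""
    belief = np.asarray(belief, dtype=float)
    for _ in range(years):
        belief = belief @ TRANSITION
    return belief

# Example: a bridge currently known to be in 'good' condition, predicted 10 years ahead.
print(dict(zip(STATES, predict_condition([0.0, 1.0, 0.0, 0.0], years=10).round(3))))
```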
Abstract:
Networked control systems (NCSs) offer many advantages over conventional control; however, they also introduce challenging problems such as network-induced delay and packet losses. This paper proposes a predictive compensation approach for simultaneous network-induced delays and packet losses. Unlike the majority of existing NCS control methods, the proposed approach addresses co-design of both the network and the controller. It also alleviates the requirements for precise process models and a full understanding of NCS network dynamics. For a series of possible sensor-to-actuator delays, the controller computes a series of corresponding redundant control values and sends them in a single packet to the actuator. Upon receiving a control packet, the actuator measures the actual sensor-to-actuator delay and computes the control signal from the packet accordingly. When packet dropout occurs, the actuator utilizes past control packets to generate an appropriate control signal. The effectiveness of the approach is demonstrated through examples.
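The following is a minimal sketch of the actuator-side logic described above, assuming the controller packs one precomputed control value per candidate delay step and the actuator indexes into the newest packet by the measured delay; the class, buffering policy and numbers are illustrative rather than taken from the paper.

```python
from collections import deque

class CompensatingActuator:
    """Picks a control value from the newest packet based on the measured delay;
    falls back to an older packet when the current one is lost."""

    def __init__(self, history_len=5):
        # each stored packet is a list of controls, index k = control for a k-step delay
        self.packets = deque(maxlen=history_len)

    def receive(self, packet):
        self.packets.appendleft(packet)

    def control(self, measured_delay_steps, packet_lost=False):
        if packet_lost and len(self.packets) > 1:
            packet, extra = self.packets[1], 1   # older packet, one extra step of delay
        elif self.packets:
            packet, extra = self.packets[0], 0
        else:
            return 0.0                           # nothing received yet: neutral control
        idx = min(measured_delay_steps + extra, len(packet) - 1)
        return packet[idx]

actuator = CompensatingActuator()
actuator.receive([1.0, 0.9, 0.8, 0.7])                 # redundant controls for delays of 0..3 steps
print(actuator.control(measured_delay_steps=2))        # -> 0.8
print(actuator.control(measured_delay_steps=1, packet_lost=True))  # only one packet buffered, reuses it
```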
Abstract:
Online social networks can be found everywhere, from chat services such as MSN and blogging platforms such as MySpace to social media such as YouTube and Second Life. Among them, one interesting type of online social network, the online dating network, is growing fast. This paper analyzes an online dating network from a social network analysis point of view. Observations are made and results are obtained in order to suggest a better recommendation system for people-to-people networks.
Abstract:
The head direction (HD) system in mammals contains neurons that fire to represent the direction the animal is facing in its environment. The ability of these cells to reliably track head direction even after the removal of external sensory cues implies that the HD system is calibrated to function effectively using just internal (proprioceptive and vestibular) inputs. Rat pups and other infant mammals display stereotypical warm-up movements prior to locomotion in novel environments, and similar warm-up movements are seen in adult mammals with certain brain lesion-induced motor impairments. In this study we propose that synaptic learning mechanisms, in conjunction with appropriate movement strategies based on warm-up movements, can calibrate the HD system so that it functions effectively even in darkness. To examine the link between physical embodiment and neural control, and to determine that the system is robust to real-world phenomena, we implemented the synaptic mechanisms in a spiking neural network and tested it on a mobile robot platform. Results show that the combination of the synaptic learning mechanisms and warm-up movements is able to reliably calibrate the HD system so that it accurately tracks real-world head direction, and that calibration breaks down in systematic ways if certain movements are omitted. This work confirms that targeted, embodied behaviour can be used to calibrate neural systems, demonstrates that ‘grounding’ of modeled biological processes in the real world can reveal underlying functional principles (supporting the importance of robotics to biology), and proposes a functional role for the stereotypical behaviours seen in infant mammals and in animals with certain motor deficits. We conjecture that these calibration principles may extend to the calibration of other neural systems involved in motion tracking and the representation of space, such as grid cells in entorhinal cortex.
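As a highly simplified, rate-based illustration of the calibration idea (not the spiking implementation used in the study), the sketch below adapts a vestibular-to-HD gain with a delta rule while a visual cue is available during warm-up turns, and then tracks head direction in darkness; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
gain = 0.5       # initial, mis-calibrated vestibular-to-HD gain; the correct value here is 1.0
lr = 0.3         # learning rate
true_hd = 0.0    # actual head direction (radians)
estimate = 0.0   # internal head-direction estimate (radians)

# Warm-up phase: alternating head turns while a visual cue still anchors the estimate.
for step in range(300):
    omega = 0.3 * np.sin(0.1 * step)                 # angular velocity (vestibular signal)
    true_hd += omega
    predicted_change = gain * omega                  # what path integration would add
    actual_change = omega                            # change reported by the visual cue
    gain += lr * (actual_change - predicted_change) * omega   # correlate error with input
    estimate = true_hd                               # cue keeps the estimate anchored

print(f"calibrated gain ~ {gain:.2f} (target 1.0)")

# Darkness phase: no cue, so tracking relies entirely on the calibrated gain.
for step in range(200):
    omega = 0.3 * rng.standard_normal()
    true_hd += omega
    estimate += gain * omega
err = np.angle(np.exp(1j * (true_hd - estimate)))    # wrapped tracking error
print(f"tracking error after 200 steps in darkness: {err:.3f} rad")
```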
Abstract:
This paper presents a novel technique for performing SLAM along a continuous trajectory of appearance. Derived from components of FastSLAM and FAB-MAP, the new system, dubbed Continuous Appearance-based Trajectory SLAM (CAT-SLAM), augments appearance-based place recognition with particle-filter-based ‘pose filtering’ within a probabilistic framework, without calculating global feature geometry or performing 3D map construction. For loop closure detection, CAT-SLAM updates in constant time regardless of map size. We evaluate the effectiveness of CAT-SLAM on a 16 km outdoor road network and determine its loop closure performance relative to FAB-MAP. CAT-SLAM recognizes three times as many loop closures as FAB-MAP at the point where no false positives occur, demonstrating its potential for robust loop closure detection in large environments.
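A toy sketch of the ‘pose filtering’ idea follows, assuming each node along a previously learned trajectory stores an appearance descriptor and particles representing trajectory positions are weighted by appearance likelihood; the descriptor, noise models and resampling scheme are illustrative and much simpler than the actual CAT-SLAM formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

class TrajectoryFilter:
    """Toy particle filter over positions along a previously learned trajectory:
    particles advance with motion noise and are weighted by how well the current
    appearance observation matches the appearance stored at their node."""

    def __init__(self, stored_appearance, n_particles=200):
        self.stored = np.asarray(stored_appearance)          # one descriptor per node
        self.particles = rng.uniform(0, len(self.stored), n_particles)
        self.n = n_particles

    def update(self, observation, motion=1.0, motion_noise=0.5, obs_noise=0.3):
        # propagate particles along the trajectory
        self.particles = (self.particles + motion +
                          motion_noise * rng.standard_normal(self.n)) % len(self.stored)
        # weight by appearance likelihood at each particle's node, then resample
        expected = self.stored[self.particles.astype(int)]
        weights = np.exp(-0.5 * ((observation - expected) / obs_noise) ** 2)
        weights /= weights.sum()
        self.particles = self.particles[rng.choice(self.n, self.n, p=weights)]

    def estimate(self):
        return float(np.median(self.particles))

# Example: a 100-node trajectory with a scalar "appearance" per node (purely illustrative).
appearance = np.linspace(0.0, 10.0, 100)
pf = TrajectoryFilter(appearance)
for true_node in range(20, 40):
    pf.update(appearance[true_node] + 0.05 * rng.standard_normal())
print("estimated trajectory position:", round(pf.estimate(), 1))   # should land near node 39
```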
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it would be advantageous to have the users’ answers ranked according to their quality. This paper proposes a novel approach for evaluating and ranking the users’ answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its author’s reputation level in the network. The approach is evaluated on a dataset collected from a live cQA portal, namely Yahoo! Answers. To compare against the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content-based and content-based scores were also compared. Empirical analysis shows that the proposed method is able to rank the users’ answers and recommend the top-n answers with good accuracy, outperforming the content-based methods, their various combinations, and the results obtained by the popular link analysis method HITS.
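A minimal sketch of reputation-based ranking in this spirit is given below, assuming a reputation score derived from an author's history of best answers; the smoothing formula and data are illustrative and do not reproduce the paper's reputation model.

```python
from collections import defaultdict

# Toy answer history: (question_id, author, was_chosen_as_best_answer)
history = [
    ("q1", "alice", True), ("q1", "bob", False),
    ("q2", "alice", True), ("q2", "carol", False),
    ("q3", "bob", True),   ("q3", "carol", False),
]

# Simple reputation: smoothed fraction of an author's past answers chosen as best.
answered, best = defaultdict(int), defaultdict(int)
for _, author, was_best in history:
    answered[author] += 1
    best[author] += was_best
reputation = {a: (best[a] + 1) / (answered[a] + 2) for a in answered}   # Laplace smoothing

def rank_answers(answers):
    """Rank (author, answer_text) pairs of a new question by author reputation."""
    return sorted(answers, key=lambda pair: reputation.get(pair[0], 0.5), reverse=True)

new_question = [("carol", "try X"), ("alice", "use Y"), ("dave", "maybe Z")]
for author, text in rank_answers(new_question):
    print(f"{author} (reputation {reputation.get(author, 0.5):.2f}): {text}")
```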
Abstract:
While the phrase “six degrees of separation” is widely used to characterize a variety of human-derived networks, in this study we show that in a patent citation network related patents are connected with an average distance of 6, whereas the average distance for a random pair of nodes in the graph is approximately 15. We use this information to improve the recall level in prior-art retrieval in the setting of blind relevance feedback, without any textual knowledge.
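A small sketch of how citation-graph proximity could be used to expand a result set for recall, assuming an undirected citation graph and a hop-distance cutoff; the graph, cutoff and expansion rule are illustrative, not the procedure evaluated in the paper.

```python
import networkx as nx

# Toy patent citation graph (citations treated as undirected edges for distance purposes).
G = nx.Graph()
G.add_edges_from([
    ("P1", "P2"), ("P2", "P3"), ("P3", "P4"),
    ("P2", "P5"), ("P5", "P6"), ("P6", "P7"),
])

def expand_by_proximity(seed_patents, max_distance=2):
    """Add to the result set every patent within `max_distance` hops of a seed result."""
    expanded = set(seed_patents)
    for seed in seed_patents:
        lengths = nx.single_source_shortest_path_length(G, seed, cutoff=max_distance)
        expanded.update(lengths)
    return expanded

initial_results = {"P1"}                                # e.g. top-ranked patents from a text query
print(sorted(expand_by_proximity(initial_results)))     # nearby patents pulled in to boost recall
```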
Abstract:
This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was undertaken based on selected urban catchments in Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses.

Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three different types, namely, high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This provides an innovative approach compared to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics. Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for generating a major fraction of the annual pollutant load compared to the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered for treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in terms of cost-effectiveness, efficiency in treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate.

The investigations into the relationship between catchment characteristics and urban stormwater quality found that, other than conventional catchment characteristics such as land use and impervious area percentage, other catchment characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage, would be inadequate.
It was also noted that small, uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volumes, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach. The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use has relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses, whereas commercial and residential land uses had relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities associated with the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality in relation to pollutants adsorbing to different particle sizes.
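A minimal sketch of the three-way event classification described above; the intensity and duration thresholds are placeholders, as the thesis's actual threshold values are not reproduced here.

```python
def classify_rainfall_event(avg_intensity_mm_per_h, duration_h,
                            intensity_threshold=20.0, duration_threshold=2.0):
    """Classify an event into the three types described above.
    The numeric thresholds are illustrative, not values from the thesis."""
    high_intensity = avg_intensity_mm_per_h >= intensity_threshold
    long_duration = duration_h >= duration_threshold
    if high_intensity and not long_duration:
        return "Type 1: high average intensity, short duration"
    if high_intensity and long_duration:
        return "Type 2: high average intensity, long duration"
    if not high_intensity and long_duration:
        return "Type 3: low average intensity, long duration"
    return "unclassified (low intensity, short duration)"

print(classify_rainfall_event(35.0, 0.75))   # Type 1
print(classify_rainfall_event(8.0, 6.0))     # Type 3
```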
Abstract:
This PhD study examines whether water allocation becomes more productive when it is re-allocated from 'low' to 'high' efficiency alternative uses in village irrigation systems (VISs) in Sri Lanka. Reservoir-based agriculture is a collective farming economic activity in which inter-sectoral allocation of water is assumed to be inefficient due to market imperfections and weak user rights. Furthermore, the available literature shows that a 'head-tail syndrome' is the most common issue for intra-sectoral water management in 'irrigation' agriculture. This research analyses the issue of water allocation by using primary data collected from two surveys of 460 rice farmers and 325 fish farming groups in two administrative districts in Sri Lanka. Technical efficiency estimates are undertaken for both rice farming and culture-based fisheries (CBF) production. The equi-marginal principle is applied for inter- and intra-sectoral allocation of water. Welfare benefits of water re-allocation are measured through consumer surplus estimation. Based on these analyses, the overall findings of the thesis can be summarised as follows. The estimated mean technical efficiency (MTE) for rice farming is 73%. For CBF production, the estimated MTE is 33%. The technical efficiency distribution is skewed to the left for rice farming, while it is skewed to the right for CBF production. The results show that technical efficiency of rice farming can be improved by formalising transferability of land ownership and, therefore, water user rights, by enhancing the institutional capacity of Farmer Organisations (FOs). Effective tools for improving technical efficiency of CBF production are strengthening group stability of CBF farmers, improving the accessibility of official consultation, and attracting independent investments. Inter-sectoral optimal allocation shows that the estimated inefficient volume of water in rice farming, which can be re-allocated to CBF production, is 32%. With the application of successive policy instruments (e.g., a community transferable quota system and promoting CBF activities), there is potential for a threefold increase in the marginal value product (MVP) of total reservoir water in VISs. The existing intra-sectoral inefficient volume of water use in tail-end fields and head-end fields can potentially be removed by reducing water use by 10% and 23% respectively and re-allocating this to middle fields. This re-allocation may enable a twofold increase in the MVP of water used in rice farming without reducing the existing rice output, but will require developing irrigation practices to facilitate this re-allocation. Finally, the total productivity of reservoir water can be increased by responsible village-level institutions and primary-level stakeholders sharing responsibility for water management (i.e., co-management), while allowing market forces to guide efficient re-allocation decisions. This PhD has demonstrated that, instead of allocating water between uses haphazardly, farmers can now base their decisions on efficient water use with a view to increasing water productivity. Such an approach will no doubt enhance farmer incomes and community welfare.
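The equi-marginal principle applied here can be stated generically as follows: water is re-allocated between rice farming and CBF until the marginal value product of the last unit of water is the same in both uses. The notation below is illustrative, not the thesis's.

```latex
% Allocate total reservoir water W between rice farming (w_r) and
% culture-based fisheries (w_c); V_r and V_c are the value functions of each use.
\max_{w_r,\, w_c} \; V_r(w_r) + V_c(w_c)
\quad \text{subject to} \quad w_r + w_c = W, \; w_r, w_c \ge 0.
% At an interior optimum the marginal value products of water are equalised:
\frac{\partial V_r}{\partial w_r} \;=\; \frac{\partial V_c}{\partial w_c} \;=\; \lambda .
```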
Using cost-effective multimedia to create engaging learning experiences in law and other disciplines
Abstract:
This is the final report of an Australian Learning and Teaching Council Teaching Fellowship which addressed the needs of two separate groups of learners: (1) final year law students studying ethics and (2) law academics and other interested educators in higher education wishing to use information and communication technologies (ICT) to create engaging learning environments for their students but lacking the capacity to do so. The Fellowship resulted in final year law students gaining a better appreciation of ethical practice than they would receive through traditional lecture/tutorial means, via the development of an integrated program of blended learning including an online program entitled "Entry into Valhalla". This "ethics capstone" utilises multimedia produced using cost-effective resources (including the "Second Life" virtual environment) to create engaging, contextualised learning experiences. The Fellowship also built knowledge of producing cost-effective multimedia projects among other law academics and educators in higher education through staff development activities comprising workshops, conference presentations and an interactive website, using the "Entry into Valhalla" program as a case study exemplar.
Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects. These ranking methods have relied on less formal qualitative assessments based on engineers’ subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. The framework can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) an economic analysis. A rational pavement management procedure and a project ranking method accepted by districts and the TxDOT administration will maximize efficiency in budget allocations and will potentially help improve pavement condition. As part of the implementation of the 4-Year Pavement Management Plan, the Network-Level Project Screening (NLPS) tool, including a candidate project identification algorithm and a preliminary project ranking matrix, was developed. The NLPS has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.
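As an illustration of a preliminary ranking-matrix step, the sketch below scores candidate projects with a weighted sum of normalised criteria; the criteria, weights and project data are hypothetical and are not TxDOT's actual ranking matrix.

```python
# Hypothetical ranking-matrix sketch: each candidate project receives a weighted score
# from normalised criteria (the criteria, weights and projects are illustrative, not TxDOT's).
WEIGHTS = {"condition": 0.4, "traffic": 0.3, "ride_quality": 0.2, "cost_effectiveness": 0.1}

projects = [
    {"id": "Project A", "condition": 0.35, "traffic": 0.9, "ride_quality": 0.5, "cost_effectiveness": 0.7},
    {"id": "Project B", "condition": 0.60, "traffic": 0.4, "ride_quality": 0.7, "cost_effectiveness": 0.9},
    {"id": "Project C", "condition": 0.20, "traffic": 0.8, "ride_quality": 0.3, "cost_effectiveness": 0.5},
]

def priority(project):
    # Lower condition and ride-quality scores mean worse pavement, so invert them before weighting.
    return (WEIGHTS["condition"] * (1 - project["condition"])
            + WEIGHTS["traffic"] * project["traffic"]
            + WEIGHTS["ride_quality"] * (1 - project["ride_quality"])
            + WEIGHTS["cost_effectiveness"] * project["cost_effectiveness"])

for p in sorted(projects, key=priority, reverse=True):
    print(f"{p['id']}: priority score {priority(p):.2f}")
```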
Abstract:
A tunable decoupling and matching network (DMN) for a closely spaced two-element antenna array is presented. The DMN achieves perfect matching for the eigenmodes of the array and thus simultaneously isolates and matches the system ports while keeping the circuit small. Arrays of closely spaced wire and microstrip monopole pairs are used to demonstrate the proposed DMN. It is found that monopoles of different lengths can be used at the design frequency with this DMN, which increases design flexibility. This property also enables frequency tuning using the DMN alone, without having to change the length of the antennas. The proposed DMN uses only one varactor to achieve a tuning range of 18.8% with both return loss and isolation better than 10 dB when the spacing between the antennas is 0.05λ. When the spacing increases to 0.1λ, the simulated tuning range is more than 60%.
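For a symmetric, reciprocal two-element array, the eigenmodes referred to above are the even and odd excitations, and simultaneous matching and isolation corresponds to driving both eigenmode reflection coefficients to zero; this is generic textbook background rather than a result from the paper, and the notation is mine.

```latex
% Eigenmode (even/odd) reflection coefficients of a symmetric, reciprocal two-port
% antenna pair with S_{11} = S_{22} and S_{21} = S_{12}; notation is generic.
\Gamma_{\mathrm{even}} = S_{11} + S_{21}, \qquad \Gamma_{\mathrm{odd}} = S_{11} - S_{21}.
% Driving both eigenmode reflections to zero forces S_{11} = S_{21} = 0,
% i.e. the two ports are simultaneously matched and isolated.
```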
Abstract:
This research deals with an innovative methodology for optimising the coal train scheduling problem. Based on our previously published work, generic solution techniques are developed by utilising a “toolbox” of well-solved standard scheduling problems. According to our analysis, the coal train scheduling problem can basically be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem with some minor constraints. To construct feasible train schedules, an innovative constructive algorithm called the SLEK algorithm is proposed. To optimise the train schedule, a three-stage hybrid algorithm called the SLEK-BIH-TS algorithm is developed, based on the definition of a sophisticated neighbourhood structure under the mechanism of the Best-Insertion-Heuristic (BIH) algorithm and the Tabu Search (TS) metaheuristic. A case study is performed to optimise a complex real-world coal rail system in Australia. A method to calculate a lower bound on the makespan is proposed to evaluate the results. The results indicate that the proposed methodology is promising for finding optimal or near-optimal feasible train timetables for a coal rail system under network and terminal capacity constraints.
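One standard way to bound the makespan from below in a job-shop setting is sketched here; it is not necessarily the bound proposed in the study, and the train and resource data are toy values. Since blocking constraints can only lengthen a schedule, such a bound remains valid for the blocking variant.

```python
def makespan_lower_bound(jobs):
    """Standard job-shop lower bound (not necessarily the one proposed in the study):
    the makespan is at least the longest job's total processing time and at least
    the busiest machine's total workload."""
    job_bound = max(sum(duration for _, duration in ops) for ops in jobs)
    machine_load = {}
    for ops in jobs:
        for machine, duration in ops:
            machine_load[machine] = machine_load.get(machine, 0) + duration
    return max(job_bound, max(machine_load.values()))

# Each job (train) is a list of (resource, duration) operations (toy data).
trains = [
    [("load_point_A", 3), ("track_1", 5), ("terminal", 2)],
    [("load_point_B", 4), ("track_1", 6), ("terminal", 3)],
    [("load_point_A", 2), ("track_1", 4), ("terminal", 2)],
]
print("makespan lower bound:", makespan_lower_bound(trains))   # track_1 workload dominates: 15
```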