842 results for Graph DBMS, BenchMarking, OLAP, NoSQL
Abstract:
This paper presents a mapping and navigation system for a mobile robot which uses vision as its sole sensor modality. The system enables the robot to navigate autonomously, plan paths and avoid obstacles using a vision-based topometric map of its environment. The map consists of a globally consistent pose graph with a local 3D point cloud attached to each of its nodes. These point clouds are used for direction-independent loop closure and to dynamically generate 2D metric maps for locally optimal path planning. Using this locally semi-continuous metric space, the robot performs shortest-path planning rather than following the nodes of the graph, as is done in most other vision-only navigation approaches. The system exploits the local accuracy of visual odometry in creating local metric maps, and uses pose graph SLAM, visual appearance-based place recognition and point cloud registration to create the topometric map. The ability of the framework to sustain vision-only navigation is validated experimentally, and the system is provided as open-source software.
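For illustration, a minimal sketch of the topometric map structure described in this abstract: a pose graph whose nodes each carry a local 3D point cloud, from which a 2D occupancy grid can be rasterised for local path planning. All class and parameter names here are hypothetical, not the authors' code.

```python
import numpy as np

class TopometricNode:
    """One pose-graph node: a global 2D pose plus a local 3D point cloud."""
    def __init__(self, pose_xy_theta, points_xyz):
        self.pose = np.asarray(pose_xy_theta, dtype=float)   # (x, y, theta)
        self.points = np.asarray(points_xyz, dtype=float)    # N x 3 local cloud

class TopometricMap:
    """Globally consistent pose graph with local metric data at each node."""
    def __init__(self):
        self.nodes = []     # TopometricNode instances
        self.edges = set()  # undirected (i, j) links between node poses

    def add_node(self, pose, cloud, link_to=None):
        self.nodes.append(TopometricNode(pose, cloud))
        idx = len(self.nodes) - 1
        if link_to is not None:
            self.edges.add((min(link_to, idx), max(link_to, idx)))
        return idx

    def local_occupancy_grid(self, idx, cell=0.1, extent=5.0, max_height=1.5):
        """Rasterise a node's point cloud into a 2D grid for local planning."""
        n = int(2 * extent / cell)
        grid = np.zeros((n, n), dtype=bool)
        for x, y, z in self.nodes[idx].points:
            if abs(x) < extent and abs(y) < extent and 0.1 < z < max_height:
                grid[int((y + extent) / cell), int((x + extent) / cell)] = True
        return grid  # True = occupied cell
```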
Abstract:
Constructing train schedules is vital in railways. This complex and time-consuming task is, however, made more difficult by additional requirements to make train schedules robust to delays and other disruptions. For a timetable to be regarded as robust, it should be insensitive to delays of a specified level, and its performance with respect to a given metric should be within given tolerances. In other words, the effect of delays should be identifiable and should be shown to be minimal. To this end, a sensitivity analysis is proposed that identifies affected operations. More specifically, a sensitivity analysis is proposed for determining which operation delays cause each operation to be affected. The information provided by this analysis gives another measure of timetable robustness and also provides control information that can be used when delays occur in practice. Several algorithms are proposed to identify this information; they utilise a disjunctive graph model of train operations. Upon completion, the sets of affected operations can also be used to define the impact of all delays without further disjunctive graph evaluations.
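A minimal sketch of the kind of delay propagation such a sensitivity analysis builds on, assuming the disjunctive graph has already been resolved into an acyclic set of precedence arcs with minimum separation times; the function and argument names are illustrative, not taken from the paper.

```python
from collections import deque

def affected_operations(edges, delay_source, delay, start_times):
    """Propagate a delay through a resolved disjunctive graph and return
    the operations whose start times are pushed back, and by how much.

    edges: dict mapping operation u -> list of (v, min_separation) arcs,
           i.e. v cannot start earlier than start(u) + min_separation.
    delay_source: operation receiving the initial delay.
    delay: size of the initial delay (same time unit as start_times).
    start_times: dict of scheduled start time for every operation.
    """
    pushed = {op: 0.0 for op in start_times}
    pushed[delay_source] = delay
    queue = deque([delay_source])
    while queue:
        u = queue.popleft()
        for v, sep in edges.get(u, []):
            # Earliest feasible start of v given u's delayed start.
            earliest_v = start_times[u] + pushed[u] + sep
            extra = earliest_v - (start_times[v] + pushed[v])
            if extra > 1e-9:          # v is (further) affected
                pushed[v] += extra
                queue.append(v)
    return {op: d for op, d in pushed.items() if d > 0}
```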
Abstract:
The internationalization of construction companies has become of significant interest as the global construction market continues to be integrated into a more competitive and turbulent business environment. However, due to the complicated and multifaceted nature of international business and performance, there is as yet no consensus on how to evaluate the performance of international construction firms (ICFs). The purpose of this paper, therefore, is to develop a practical framework for measuring the performance of ICFs. Based on the balanced scorecard (BSC), a framework with detailed measures is developed, investigated, and tested using a three-step research design. In the first step, 27 measures under six dimensions (financial, market, customer, internal business processes, stakeholders, and learning and growth) are determined by literature review, interviews with academics, and seminar discussions. Subsequently, a questionnaire survey is conducted to investigate weights of these 27 performance measures. The questionnaire survey also supports the importance of measuring intangible aspects of international construction performance from the practitioner’s viewpoint. Additionally, a case study is described to test the framework’s robustness and usefulness. This is achieved by benchmarking the performance of a Chinese ICF with nine other counterparts worldwide. It is found that the framework provides an effective basis for benchmarking ICFs to effectively monitor their performance and support the development of strategies for improved competitiveness in the international arena. This paper is the first attempt to present a balanced and practically tested framework for evaluating the performance of ICFs. It contributes to the practice of performance measurement and related internationalization in the construction industry in general.
Abstract:
Cross-Lingual Link Discovery (CLLD) is a new problem in Information Retrieval. The aim is to automatically identify meaningful and relevant hypertext links between documents in different languages. This is particularly helpful in knowledge discovery if a multi-lingual knowledge base is sparse in one language or another, or the topical coverage in each language is different; such is the case with Wikipedia. Techniques for identifying new and topically relevant cross-lingual links are a current topic of interest at NTCIR, where the CrossLink task has been running since NTCIR-9 in 2011. This paper presents the evaluation framework for benchmarking cross-lingual link discovery algorithms in the context of NTCIR-9. This framework includes topics, document collections, assessments, metrics, and a toolkit for pooling, assessment, and evaluation. The assessments are further divided into two separate sets: manual assessments performed by human assessors, and automatic assessments based on links extracted from Wikipedia itself. Using this framework, we show that manual assessment is more robust than automatic assessment in the context of cross-lingual link discovery.
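As a toy illustration of the set-based scoring such an evaluation framework enables (the actual CrossLink metrics and pooling procedure are more involved), the following computes precision and recall of a system's discovered links against an assessment set; all names are illustrative.

```python
def link_precision_recall(system_links, assessed_relevant):
    """Set-based precision/recall for cross-lingual link discovery.

    system_links: set of (source_doc, target_doc) pairs returned by a system.
    assessed_relevant: set of pairs judged relevant (manually or via Wikipedia).
    """
    hits = system_links & assessed_relevant
    precision = len(hits) / len(system_links) if system_links else 0.0
    recall = len(hits) / len(assessed_relevant) if assessed_relevant else 0.0
    return precision, recall
```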
Abstract:
In the real world, there are many problems in networks of networks (NoNs) that can be abstracted to a so-called minimum interconnection cut problem, which is fundamentally different from the classical minimum cut problems in graph theory. Thus, it is desirable to propose an efficient and effective algorithm for the minimum interconnection cut problem. In this paper we formulate the problem in graph theory, transform it into a multi-objective and multi-constraint combinatorial optimization problem, and propose a hybrid genetic algorithm (HGA) for the problem. The HGA is a penalty-based genetic algorithm (GA) that incorporates an effective heuristic procedure to locally optimize the individuals in the population of the GA. The HGA has been implemented and evaluated experimentally. Experimental results show that the HGA is effective and efficient.
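A generic sketch of a penalty-based memetic GA of the kind described: constraint violations are penalised in the fitness function and each individual is locally improved before selection. The encoding, cost, violation and local-improvement functions are caller-supplied placeholders, not the paper's specific formulation.

```python
import random

def hybrid_ga(num_genes, cost, violation, local_improve,
              pop_size=40, generations=200, penalty=1000.0,
              crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Penalty-based GA with a local-improvement (memetic) step.

    cost(ind), violation(ind): user-supplied; ind is a list of 0/1 genes.
    local_improve(ind): returns a (possibly) improved copy of ind.
    """
    rng = random.Random(seed)

    def fitness(ind):
        return cost(ind) + penalty * violation(ind)   # lower is better

    pop = [[rng.randint(0, 1) for _ in range(num_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [local_improve(ind) for ind in pop]      # heuristic local optimisation
        pop.sort(key=fitness)
        next_pop = pop[:2]                             # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)  # truncation selection
            cut = rng.randrange(1, num_genes) if rng.random() < crossover_rate else 0
            child = a[:cut] + b[cut:]                  # one-point crossover
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=fitness)
```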
Abstract:
The structures of the anhydrous products from the interaction of 2-amino-5-(4-bromophenyl)-1,3,4-thiadiazole with (2-naphthoxy)acetic acid, the 1:1 adduct C8H6BrN3S . C12H10O3 (I), and 3,5-dinitrobenzoic acid, the salt C8H7BrN3S+ C7H3N2O6- (II), have been determined. In the adduct (I), a heterodimer is formed through a cyclic hydrogen-bonding motif [graph set R2/2(8)], involving carboxylic acid O-H...N(hetero) and amine N-H...O(carboxyl) interactions. The heterodimers are essentially planar, with a thiadiazole to naphthyl ring dihedral angle of 15.9(2) deg. and an intramolecular thiadiazole to phenyl ring angle of 4.7(2) deg. An amine N-H...N(hetero) hydrogen bond between the heterodimers generates a one-dimensional chain structure extending down [001]. Also present are weak benzene-benzene and naphthalene-naphthalene pi-pi stacking interactions down the b axis [minimum ring centroid separation, 3.936(3) Ang.]. With the salt (II), the cation-anion association is also through a cyclic R2/2(8) motif, but involving duplex N-H...O(carboxyl) hydrogen bonds, giving a heterodimer which is close to planar [dihedral angles between the thiadiazole ring and the two benzene rings, 5.00(16) deg. (intra) and 7.23(15) deg. (inter)]. A secondary centrosymmetric cyclic N-H...O(carboxyl) hydrogen-bonding association involving the second amino H-atom generates a heterotetramer. Also present in the crystal are weak pi-pi interactions between thiadiazolium rings [minimum ring centroid separation, 3.936(3) Ang.], as well as a short Br...O(nitro) interaction [3.314(4) Ang.]. The two structures reported here now provide a total of three crystallographically characterized examples of co-crystalline products from the interaction of 2-amino-5-(4-bromophenyl)-1,3,4-thiadiazole with carboxylic acids, of which only one involves proton transfer.
Abstract:
Lean strategies have been developed to eliminate or reduce manufacturing waste and thus improve operational efficiency in manufacturing processes. However, implementing lean strategies requires a large amount of resources and, in practice, manufacturers encounter difficulties in selecting appropriate lean strategies within their resource constraints. There is currently no systematic methodology available for selecting appropriate lean strategies within a manufacturer's resource constraints. In the lean transformation process, it is also critical to measure the current and desired leanness levels in order to clearly evaluate lean implementation efforts. Although many lean strategies are utilized to reduce or eliminate manufacturing waste, little effort has been directed towards properly assessing the leanness of manufacturing organizations. In practice, a single metric or a specific group of metrics (either qualitative or quantitative) will only partially measure the overall leanness. Existing leanness assessment methodologies do not offer a comprehensive evaluation method that integrates both quantitative and qualitative lean measures into a single quantitative value for measuring the overall leanness of an organization. This research aims to develop mathematical models and a systematic methodology for selecting appropriate lean strategies and evaluating leanness levels in manufacturing organizations. Mathematical models were formulated and a methodology was developed for selecting appropriate lean strategies within manufacturers' limited available resources to reduce their identified wastes. A leanness assessment model was developed using the fuzzy concept to assess the leanness level and to recommend an optimum leanness value for a manufacturing organization. In the proposed leanness assessment model, both quantitative and qualitative input factors are taken into account. Based on programs developed in MATLAB and C#, a decision support tool (DST) was developed for decision makers to select lean strategies and evaluate the leanness value based on the proposed models and methodology, and hence sustain lean implementation efforts. A case study was conducted to demonstrate the effectiveness of the proposed models and methodology. Case study results suggested that, out of the 10 wastes identified, the case organization (ABC Limited) is able to improve a maximum of six wastes at the selected workstation within its resource limitations. The selected wastes are: unnecessary motion, setup time, unnecessary transportation, inappropriate processing, and work-in-process and raw material inventory; the suggested lean strategies are: 5S, Just-In-Time, the Kanban System, the Visual Management System (VMS), Cellular Manufacturing, a Standard Work Process using method-time measurement (MTM), and Single Minute Exchange of Die (SMED). From the suggested lean strategies, the impact of 5S was demonstrated by measuring the leanness level in two different situations at ABC. After that, MTM was suggested as a standard work process for further improvement of the current leanness value. The initial status of the organization showed a leanness value of 0.12. By applying 5S, the leanness level improved significantly to 0.19, and simulation of MTM as a standard work method showed that the leanness value could be improved to 0.31. The optimum leanness value of ABC was calculated to be 0.64.
These leanness values provided a quantitative indication of the impact of improvement initiatives on the overall leanness level of the case organization. Sensitivity analysis and a t-test were also performed to validate the proposed model. This research advances the current knowledge base by developing mathematical models and methodologies to overcome lean strategy selection and leanness assessment problems. By selecting appropriate lean strategies, a manufacturer can better prioritize implementation efforts and resources to maximize the benefits of implementing lean strategies in their organization. The leanness index is used to evaluate an organization's current (before lean implementation) leanness state against the state after lean implementation and to establish a benchmark (the optimum leanness state). Hence, this research provides a continuous improvement tool for a lean manufacturing organization.
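For illustration only, a simple weighted aggregation of normalised lean measures into a single index in [0, 1]; the thesis itself uses a fuzzy-logic assessment model, which this sketch does not reproduce, and the measure names below are invented.

```python
def leanness_index(measures, weights):
    """Weighted average of lean measures already normalised to [0, 1].

    measures: dict name -> normalised score (qualitative or quantitative).
    weights:  dict name -> relative importance weight.
    """
    total = sum(weights[name] for name in measures)
    return sum(score * weights[name] for name, score in measures.items()) / total

# e.g. leanness_index({"setup_time": 0.3, "wip_level": 0.2},
#                     {"setup_time": 0.6, "wip_level": 0.4})  -> 0.26
```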
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, and a detailed complexity and accuracy analysis is also given. In the case of time complexity, it is shown that the average case time complexity of the algorithm is Θ(n²), and the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
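For context, a naive exponential-time check for the Eden property (whether a configuration has any pre-image), which the paper's polynomial-time neighborhood elimination algorithm improves upon; the rule interface and state set below are illustrative, not the paper's.

```python
from itertools import product

def has_preimage(neighbors, rule, config, states=(0, 1)):
    """Brute-force check: does `config` have any pre-image under `rule`?

    neighbors: dict node -> ordered list of neighbouring nodes.
    rule(own_state, neighbour_states): returns the node's next state.
    config: dict node -> target state of the configuration.
    """
    nodes = list(config)
    for assignment in product(states, repeat=len(nodes)):
        x = dict(zip(nodes, assignment))
        if all(rule(x[v], [x[u] for u in neighbors[v]]) == config[v]
               for v in nodes):
            return True   # a pre-image exists, so config is not a Garden of Eden
    return False
```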
Abstract:
A graph-theoretic approach is developed for accurately computing haulage costs in earthwork projects. This is vital, as haulage is a predominant factor in the real cost of earthworks. A variety of metrics can be used in our approach, but a fuel consumption proxy is recommended. This approach is novel as it considers the constantly changing terrain that results from cutting and filling activities, and it replaces the inaccurate “static” calculations that have been used previously. The approach is also capable of efficiently correcting violations of the top-down cutting and bottom-up filling conditions that can be found in existing earthwork assignments and sequences. This approach assumes that the project site is partitioned into uniform blocks. A directed graph is then utilised to describe the terrain surface. This digraph is altered after each cut and fill in order to reflect the true state of the terrain. A shortest path algorithm is successively applied to calculate the cost of each haul, and these costs are summed to provide a total cost of haulage.
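A minimal sketch of the shortest-path pricing step: Dijkstra's algorithm over the block digraph, where edge weights carry whatever haulage metric is in use (e.g. a fuel-consumption proxy). After each cut or fill, the edges incident to the altered blocks would be updated before the next haul is priced. Names are illustrative.

```python
import heapq

def haul_cost(graph, source, target):
    """Dijkstra shortest-path haul cost between two blocks.

    graph: dict block -> list of (neighbour_block, edge_cost) arcs, where
    edge_cost is the chosen haulage metric for moving material across blocks.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")                   # target unreachable
```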
Abstract:
The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for reliable, objective, automated image segmentation methods to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues, including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) and obtain the choroid thickness profile from OCT images. Before segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which were previously manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool for extracting clinical data.
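A generic sketch of the dynamic-programming boundary search underlying this kind of graph-based layer segmentation: the boundary takes one row per image column, and the path of minimum accumulated cost is recovered by backtracking. The construction of the cost image (edge filters, directional weights, brightness-probability gradients) is the part specific to the paper and is not reproduced here; all names are illustrative.

```python
import numpy as np

def trace_boundary(cost, max_step=2):
    """Dynamic-programming boundary search through a 2D cost image.

    cost: array (rows x cols); low values where the boundary is likely.
    max_step: largest vertical jump allowed between adjacent columns.
    Returns one row index per column.
    """
    rows, cols = cost.shape
    acc = cost.astype(float).copy()          # accumulated path cost
    back = np.zeros((rows, cols), dtype=int)  # backtracking pointers
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - max_step), min(rows, r + max_step + 1)
            prev = int(np.argmin(acc[lo:hi, c - 1])) + lo
            back[r, c] = prev
            acc[r, c] += acc[prev, c - 1]
    boundary = np.zeros(cols, dtype=int)
    boundary[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):          # follow pointers right to left
        boundary[c - 1] = back[boundary[c], c]
    return boundary
```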
Abstract:
Non-communicable diseases (NCDs) dominate disease burdens globally, and poor nutrition increasingly contributes to this global burden. Comprehensive monitoring of food environments, and evaluation of the impact of public and private sector policies on food environments, is needed to strengthen accountability systems to reduce NCDs. The International Network for Food and Obesity/NCDs Research, Monitoring and Action Support (INFORMAS) is a global network of public-interest organizations and researchers that aims to monitor, benchmark and support public and private sector actions to create healthy food environments and reduce obesity, NCDs and their related inequalities. The INFORMAS framework includes two ‘process’ modules that monitor the policies and actions of the public and private sectors, seven ‘impact’ modules that monitor the key characteristics of food environments, and three ‘outcome’ modules that monitor dietary quality, risk factors and NCD morbidity and mortality. Monitoring frameworks and indicators have been developed for 10 modules to provide consistency while allowing for stepwise approaches (‘minimal’, ‘expanded’, ‘optimal’) to data collection and analysis. INFORMAS data will enable benchmarking of food environments between countries and monitoring of progress over time within countries. Through monitoring and benchmarking, INFORMAS will strengthen the accountability systems needed to help reduce the burden of obesity, NCDs and their related inequalities.
Abstract:
The International Network for Food and Obesity/non-communicable diseases Research, Monitoring and Action Support (INFORMAS) proposes to collect performance indicators on food policies, actions and environments related to obesity and non-communicable diseases. This paper reviews existing communications strategies used for performance indicators and proposes the approach to be taken for INFORMAS. Twenty-seven scoring and rating tools were identified in various fields of public health including alcohol, tobacco, physical activity, infant feeding and food environments. These were compared based on the types of indicators used and how they were quantified, scoring methods, presentation and the communication and reporting strategies used. There are several implications of these analyses for INFORMAS: the ratings/benchmarking approach is very commonly used, presumably because it is an effective way to communicate progress and stimulate action, although this has not been formally evaluated; the tools used must be trustworthy, pragmatic and policy-relevant; multiple channels of communication will be needed; communications need to be tailored and targeted to decision-makers; data and methods should be freely accessible. The proposed communications strategy for INFORMAS has been built around these lessons to ensure that INFORMAS's outputs have the greatest chance of being used to improve food environments.
Abstract:
Private-sector organizations play a critical role in shaping the food environments of individuals and populations. However, there is currently very limited independent monitoring of private-sector actions related to food environments. This paper reviews previous efforts to monitor the private sector in this area, and outlines a proposed approach to monitor private-sector policies and practices related to food environments, and their influence on obesity and non-communicable disease (NCD) prevention. A step-wise approach to data collection is recommended, in which the first (‘minimal’) step is the collation of publicly available food and nutrition-related policies of selected private-sector organizations. The second (‘expanded’) step assesses the nutritional composition of each organization's products, their promotions to children, their labelling practices, and the accessibility, availability and affordability of their products. The third (‘optimal’) step includes data on other commercial activities that may influence food environments, such as political lobbying and corporate philanthropy. The proposed approach will be further developed and piloted in countries of varying size and income levels. There is potential for this approach to enable national and international benchmarking of private-sector policies and practices, and to inform efforts to hold the private sector to account for their role in obesity and NCD prevention.
Abstract:
This thesis presents a novel approach to mobile robot navigation using visual information towards the goal of long-term autonomy. A novel concept of a continuous appearance-based trajectory is proposed in order to solve the limitations of previous robot navigation systems, and two new algorithms for mobile robots, CAT-SLAM and CAT-Graph, are presented and evaluated. These algorithms yield performance exceeding state-of-the-art methods on public benchmark datasets and large-scale real-world environments, and will help enable widespread use of mobile robots in everyday applications.
Abstract:
This thesis introduces improved techniques for automatically estimating the pose of humans from video. It examines a complete pose estimation workflow, from segmenting the raw video stream to extract silhouettes, to using those silhouettes to determine the relative orientation of parts of the human body. The proposed segmentation algorithms offer improved performance and reduced complexity, while the pose estimation shows superior accuracy in difficult cases of self-occlusion.