879 results for Scene graph


Relevance:

10.00%

Publisher:

Abstract:

To obtain minimum-time or minimum-energy trajectories for robots, it is necessary to employ planning methods which adequately consider the platform's dynamic properties. A variety of sampling-based, graph-based and local receding-horizon optimisation methods have previously been proposed. These typically use simplified kinodynamic models to avoid the significant computational burden of solving this problem in a high-dimensional state space. In this paper we investigate solutions from the class of pseudospectral optimisation methods, which have grown in favour amongst the optimal control community in recent years. These methods have high computational efficiency and rapid convergence properties. We present a practical application of such an approach to the robot path planning problem to provide a trajectory that considers the robot's dynamic properties. We extend the existing literature by augmenting the path constraints with sensed obstacles rather than predefined analytical functions, enabling real-world application.
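
As an illustration of the collocation idea behind pseudospectral methods, the sketch below sets up a Chebyshev-Gauss-Lobatto transcription of a fixed-horizon, minimum-effort manoeuvre for a 1D double integrator and solves it with SciPy. The node count, horizon, boundary conditions and solver are illustrative assumptions, not the formulation used in the paper, and the sensed-obstacle path constraints are omitted.

```python
# Minimal pseudospectral (Chebyshev) collocation sketch for a trajectory problem,
# assuming a 1D double-integrator robot and a fixed horizon T.
import numpy as np
from scipy.optimize import minimize

def cheb(N):
    """Chebyshev-Gauss-Lobatto nodes and differentiation matrix (Trefethen's cheb)."""
    k = np.arange(N + 1)
    x = np.cos(np.pi * k / N)                         # nodes on [-1, 1], decreasing
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** k
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

N, T = 16, 4.0                                        # collocation order, horizon [s]
D, tau = cheb(N)
D, tau = D[::-1, ::-1], tau[::-1]                     # reorder nodes so time increases
t = T * (tau + 1.0) / 2.0                             # map [-1, 1] onto [0, T]
n = N + 1

def unpack(z):                                        # decision vector = [position, velocity, control]
    return z[:n], z[n:2 * n], z[2 * n:]

def dynamics_residual(z):
    p, v, u = unpack(z)
    # d/dtau = (T/2) d/dt, so the collocated dynamics read D p = (T/2) v and D v = (T/2) u
    return np.hstack([D @ p - 0.5 * T * v, D @ v - 0.5 * T * u])

def boundary_residual(z):
    p, v, _ = unpack(z)
    return np.array([p[0], v[0], p[-1] - 1.0, v[-1]])  # rest-to-rest move of 1 m

def effort(z):
    _, _, u = unpack(z)
    dt = np.diff(t)
    return float(np.dot(dt, 0.5 * (u[:-1] ** 2 + u[1:] ** 2)))  # trapezoidal integral of u^2

res = minimize(effort, np.zeros(3 * n), method="SLSQP",
               constraints=[{"type": "eq", "fun": dynamics_residual},
                            {"type": "eq", "fun": boundary_residual}])
p_opt, v_opt, u_opt = unpack(res.x)
print("converged:", res.success, " control effort:", round(effort(res.x), 3))
```

In this kind of formulation, sensed obstacles would enter as additional inequality constraints on the position states evaluated at the collocation nodes, which is the role the paper's sensed path constraints play in place of predefined analytical functions.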

Relevance:

10.00%

Publisher:

Abstract:

The XML Document Mining track was launched to explore two main ideas: (1) identifying key problems and new challenges of the emerging field of mining semi-structured documents, and (2) studying and assessing the potential of Machine Learning (ML) techniques for dealing with generic ML tasks in the structured domain, i.e., classification and clustering of semi-structured documents. The track ran for six editions, during INEX 2005, 2006, 2007, 2008, 2009 and 2010. The first five editions have been summarized in previous reports, and we focus here on the 2010 edition. INEX 2010 included two tasks in the XML Mining track: (1) an unsupervised clustering task and (2) a semi-supervised classification task in which documents are organized in a graph. The clustering task requires participants to group the documents into clusters, without any knowledge of category labels, using an unsupervised learning algorithm. The classification task, on the other hand, requires participants to assign the documents in the dataset to known categories using a supervised learning algorithm and a training set. This report gives the details of the clustering and classification tasks.
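
As a concrete, highly simplified illustration of the two task types, the sketch below clusters a handful of toy documents without labels and then classifies them from a small labelled subset using scikit-learn. The documents, features and learners are stand-ins chosen for brevity, not the INEX 2010 corpus or any participant's actual method.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Toy stand-ins for (flattened) XML documents from the collection.
docs = [
    "graph mining of structured xml documents",
    "frequent subtree mining in xml collections",
    "wikipedia category graphs and link structure",
    "semi supervised learning over document graphs",
]
X = TfidfVectorizer().fit_transform(docs)

# (1) Unsupervised clustering task: group documents with no knowledge of labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# (2) Classification task: learn categories from a labelled training subset
#     and predict labels for all documents.
train_idx, train_labels = [0, 2], ["mining", "wikipedia"]
clf = LogisticRegression().fit(X[train_idx], train_labels)
predicted = clf.predict(X)

print("clusters:   ", clusters)
print("predictions:", list(predicted))
```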

Relevance:

10.00%

Publisher:

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance, and capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances.

The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. They needed to be calibrated using data acquired at these locations, and their output needed to be validated with data acquired at the same sites, so that the outputs were truly descriptive of the performance of the facility. A theoretical basis, rather than the empiricism underlying the macroscopic models currently used, needed to underlie the form of these models. The models also needed to be adaptable to variable operating conditions, so that they could be applied, where possible, to other similar systems and facilities.

It was not possible to produce, in this single study, a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for their calibration and validation under a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all manoeuvres evident were modelled, as some unusual manoeuvres were considered unwarranted to model. The models developed nevertheless contain the principal processes of freeway operations: merging and lane changing.

Gap acceptance theory was applied to these critical operations to assess freeway performance. It was found to be applicable to merging; however, the major stream (the kerb-lane traffic) exercises only limited priority over the minor stream (the on-ramp traffic), and theory was established to account for this behaviour. Kerb-lane drivers were also found to change to the median lane where possible to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major-stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams, with on-ramp and total upstream flow required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.
Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated, and critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor-stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor- and major-stream delays across all minor- and major-stream flows, and pseudo-empirical relationships were established to predict average delays. Major-stream average delays are limited to 0.5 s, insignificant compared with minor-stream delays, which reach infinity at capacity. Minor-stream delays were shown to be smaller when unsignalised rather than signalised intersections are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge-area performance.

A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds; this distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure, and the model was also shown to be applicable to lane changing. Tolerable limits on merging probabilities require calibration, from which practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
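
For readers unfamiliar with the headway model mentioned above, the sketch below evaluates the standard Troutbeck gap-acceptance capacity formula under Cowan's M3 headway distribution. It uses the conventional absolute-priority form rather than the limited-priority extension developed in this study, and the parameter values are illustrative only.

```python
# Gap-acceptance capacity under Cowan's M3 headway model (standard Troutbeck form).
import math

def cowan_m3_capacity(q_major, alpha, delta, t_c, t_f):
    """Minor-stream (on-ramp) capacity in veh/s for a major-stream flow q_major in veh/s.

    alpha : proportion of free (unbunched) major-stream vehicles
    delta : minimum major-stream headway in seconds (1 s in the study above)
    t_c   : critical gap in seconds
    t_f   : follow-on time in seconds
    """
    lam = alpha * q_major / (1.0 - delta * q_major)   # decay rate of free headways
    return q_major * alpha * math.exp(-lam * (t_c - delta)) / (1.0 - math.exp(-lam * t_f))

# Illustrative numbers: a kerb-lane flow of 1200 veh/h, a 1.1 s follow-on time and a
# critical gap midway between t_f and t_f + 1 s, consistent with the ranges quoted above.
q = 1200.0 / 3600.0
capacity = cowan_m3_capacity(q, alpha=0.7, delta=1.0, t_c=1.6, t_f=1.1)
print(f"on-ramp capacity ~ {capacity * 3600:.0f} veh/h")
```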

Relevance:

10.00%

Publisher:

Abstract:

On obstacle-cluttered construction sites, understanding the motion characteristics of objects is important for anticipating collisions and preventing accidents. This study investigates algorithms for object identification applications that can be used by heavy equipment operators to effectively monitor a congested local environment. The proposed framework contains algorithms for three-dimensional spatial modeling and image matching that are based on 3D images scanned by a high-frame-rate range sensor. The preliminary results show that an occupancy grid spatial modeling algorithm can successfully build the most pertinent spatial information, and that an image matching algorithm is best able to identify which objects are in the scanned scene.
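
By way of illustration, the sketch below shows one simple way to build an occupancy grid from scanned 3D points by binning returns into cells and marking cells with enough hits as occupied. The grid extent, resolution and hit threshold are assumptions for the example, not parameters from the study, and the image-matching step is not shown.

```python
# Simple 2D occupancy grid built from 3D range-sensor returns.
import numpy as np

def occupancy_grid(points, cell=0.25, extent=20.0, min_hits=3):
    """points: (N, 3) array of x, y, z returns, assumed already in site coordinates."""
    half = extent / 2.0
    n = int(extent / cell)
    grid = np.zeros((n, n), dtype=int)
    xy = points[:, :2]
    inside = np.all(np.abs(xy) < half, axis=1)        # discard returns outside the grid
    idx = ((xy[inside] + half) / cell).astype(int)
    np.add.at(grid, (idx[:, 0], idx[:, 1]), 1)        # count hits per cell
    return grid >= min_hits                           # occupied where enough returns land

# Example: a synthetic scan with a dense cluster (an "object") plus sparse noise.
rng = np.random.default_rng(0)
obj = rng.normal([3.0, -2.0, 1.0], 0.2, size=(200, 3))
noise = rng.uniform(-10, 10, size=(50, 3))
occ = occupancy_grid(np.vstack([obj, noise]))
print("occupied cells:", int(occ.sum()))
```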

Relevance:

10.00%

Publisher:

Abstract:

"How do you film a punch?" This question can be posed by actors, make-up artists, directors and cameramen. Though they can all ask the same question, they are not all seeking the same answer. Within a given domain, based on the roles they play, agents of the domain have different perspectives and they want the answers to their question from their perspective. In this example, an actor wants to know how to act when filming a scene involving a punch. A make-up artist is interested in how to do the make-up of the actor to show bruises that may result from the punch. Likewise, a director wants to know how to direct such a scene and a cameraman is seeking guidance on how best to film such a scene. This role-based difference in perspective is the underpinning of the Loculus framework for information management for the Motion Picture Industry. The Loculus framework exploits the perspective of agent for information extraction and classification within a given domain. The framework uses the positioning of the agent’s role within the domain ontology and its relatedness to other concepts in the ontology to determine the perspective of the agent. Domain ontology had to be developed for the motion picture industry as the domain lacked one. A rule-based relatedness score was developed to calculate the relative relatedness of concepts with the ontology, which were then used in the Loculus system for information exploitation and classification. The evaluation undertaken to date have yielded promising results and have indicated that exploiting perspective can lead to novel methods of information extraction and classifications.

Relevance:

10.00%

Publisher:

Abstract:

The world we live in is well labeled for the benefit of humans, but to date robots have made little use of this resource. In this paper we describe a system that allows robots to read and interpret visible text and use it to understand the content of the scene. We use a generative probabilistic model that explains spotted text in terms of arbitrary search terms. This allows the robot to understand the underlying function of the scene it is looking at, such as whether it is a bank or a restaurant. We describe the text-spotting engine at the heart of our system, which is able to detect and parse wild text in images, together with the generative model, and present results from images obtained with a robot in a busy city setting.
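
As a toy illustration of this kind of generative scoring, the sketch below explains a set of spotted words with candidate place categories and reports the most probable one. The categories, word likelihoods and smoothing constant are invented for the example, whereas the paper derives its statistics from web search terms.

```python
# Naive generative scoring of place categories given spotted words.
import math

likelihood = {                      # P(word | category), hypothetical values
    "bank":       {"deposit": 0.20, "teller": 0.15, "menu": 0.01, "espresso": 0.01},
    "restaurant": {"deposit": 0.01, "teller": 0.01, "menu": 0.25, "espresso": 0.10},
}
prior = {"bank": 0.5, "restaurant": 0.5}
smoothing = 1e-3                    # for spotted words missing from the tables

def classify(spotted_words):
    scores = {}
    for cat in prior:
        logp = math.log(prior[cat])
        for w in spotted_words:
            logp += math.log(likelihood[cat].get(w, smoothing))
        scores[cat] = logp
    return max(scores, key=scores.get), scores

print(classify(["menu", "espresso"]))    # -> ('restaurant', {...})
```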

Relevance:

10.00%

Publisher:

Abstract:

Sonic Loom is a purpose-built classroom tool for teachers and students of drama that enables them to explore the use of music in live performance, in theory and in practice. It is intended as a resource for drama classrooms, to encourage communication and exchange about the way music works on us, so that we can find new ways to make it work for us. Working to consciously attend to music and how it is used, particularly in cinema (as a popular way into styles of western theatre and live performance), will allow students and teachers to use music in more subtle and complex ways as an aid to narrative in performance. Sonic Loom encourages active listening (aided but not encumbered by traditional musicology) so that students and teachers can develop a 'critical ear' in the transformation and adaptation of music for their own artistic purposes, whether it is soundtracking existing scene work or acting as a pre-text for scenes which have yet to be created.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a method for measuring the in-bucket payload volume on a dragline excavator for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting and therefore produce a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range-bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
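
The final volumetric step lends itself to a short illustration: the sketch below bins segmented payload points (assumed already expressed in the bucket frame after ICP) into a height grid and sums the cells to obtain a volume. The cell size, reference floor and synthetic data are assumptions for the example, and the segmentation and pose-estimation stages are not shown.

```python
# Height-grid volume estimation from segmented payload points.
import numpy as np

def payload_volume(points, cell=0.05, floor_z=0.0):
    """points: (N, 3) payload points in the bucket frame; returns volume in m^3."""
    xy = points[:, :2]
    mins = xy.min(axis=0)
    idx = ((xy - mins) / cell).astype(int)
    shape = idx.max(axis=0) + 1
    height = np.zeros(shape)
    # keep the highest return per cell as the surface height above the reference floor
    np.maximum.at(height, (idx[:, 0], idx[:, 1]), points[:, 2] - floor_z)
    return float(height.clip(min=0.0).sum() * cell * cell)

# Example: a synthetic 1 m x 1 m block of points whose top surface sits near z = 0.35 m.
rng = np.random.default_rng(1)
pts = rng.uniform([0, 0, 0.25], [1, 1, 0.35], size=(5000, 3))
print(f"estimated volume ~ {payload_volume(pts):.2f} m^3")
```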

Relevance:

10.00%

Publisher:

Abstract:

The structures of the 1:1 proton-transfer compounds of isonipecotamide (4-piperidinecarboxamide) with 4-nitrophthalic acid [4-carbamoylpiperidinium 2-carboxy-4-nitrobenzoate, C6H13N2O+ C8H4NO6- (I)], 4,5-dichlorophthalic acid [4-carbamoylpiperidinium 2-carboxy-4,5-dichlorobenzoate, C6H13N2O+ C8H3Cl2O4- (II)] and 5-nitroisophthalic acid [4-carbamoylpiperidinium 3-carboxy-5-nitrobenzoate, C6H13N2O+ C8H4NO6- (III)], as well as the 2:1 compound with terephthalic acid [bis(4-carbamoylpiperidinium) benzene-1,4-dicarboxylate dihydrate, 2(C6H13N2O+) C8H4O4(2-) . 2H2O (IV)], have been determined at 200 K. All salts form hydrogen-bonded structures, one-dimensional in (II) and three-dimensional in (I), (III) and (IV). In (I) and (III) the centrosymmetric R2/2(8) cyclic amide-amide association is found, while in (IV) several different types of water-bridged cyclic associations are present [graph sets R2/4(8), R3/4(10), R4/4(12), R3/3(18) and R4/6(22)]. The one-dimensional structure of (II) features the common 'planar' hydrogen 4,5-dichlorophthalate anion together with enlarged cyclic R3/3(13) and R3/4(17) associations. In the structures of (I) and (III), head-to-tail hydrogen phthalate chain substructures are present. In (IV), head-to-tail primary cation-anion associations are extended longitudinally into chains through the water-bridged cation associations, and laterally by piperidinium N-H...O(carboxyl) and water O-H...O(carboxyl) hydrogen bonds. The structures reported here further demonstrate the utility of the isonipecotamide cation as a synthon for the generation of stable hydrogen-bonded structures. An additional example of cation-anion association with this cation is seen in the asymmetric three-centre piperidinium N-H...O,O'(carboxyl) interaction in the first reported structure of a 2:1 isonipecotamide-carboxylate salt.

Relevance:

10.00%

Publisher:

Abstract:

Former Sex Pistols manager Malcolm McLaren has said that punk fashion truly began in New York. In the 1970s, New York was home to the burgeoning punk scene, Fluxus artists and Andy Warhol’s ‘Factory’. Trace the connections between designers, artists and the musicians who became fashion icons such as Robert Mapplethorpe, Patti Smith, Malcolm McLaren, Richard Hell, Lou Reed, and Andy Warhol with Alice Payne (PhD candidate).

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an explanation of why the reuse of building components after demolition or deconstruction is critical to the future of the construction industry. An examination of the historical causes of and responses to climate change sets the scene as to why governance is becoming increasingly focused on the built environment as a mechanism for controlling the waste generation associated with the processes of demolition, construction and operation. Through an annotated description of the evolving design and construction methodology of a range of timber dwellings (typically 'Queenslanders' of the 1880-1900, 1900-1920 and 1920-1940 eras), the paper offers an evaluation of the variety of materials which can be used advantageously by those wishing to 'regenerate' a Queenslander. This analysis of 'regeneration' details the constraints when considering relocation and/or reuse by adaptation, including deconstruction of building components, against the legislative framework requirements of the Queensland Building Act 1975 and the Queensland Sustainable Planning Act 2009, with a specific examination of those of the Building Codes of Australia. The paper concludes with a discussion of these constraints, their impacts on 'regeneration', and the need for further research to seek greater understanding of the practicalities and drivers of relocation, adaptation, and the suitability of building components for reuse after deconstruction.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the historical and contextual background of road construction by state and local government in Queensland. It also highlights some key events that have shaped stakeholder participation in road infrastructure planning and delivery in Queensland. This synthesis was developed from a review of publications, organisational documents and interviews. To set the scene, the factors that shaped road delivery will be discussed.

Relevance:

10.00%

Publisher:

Abstract:

In the structure of the title compound, C5H7N2+ C8H11O4-, the cis-anions associate through head-to-tail carboxylic acid-carboxyl O-H...O hydrogen bonds [graph set C(7)], forming chains which extend along c and are interlinked through the carboxyl groups, forming cyclic R2/2(8) associations with the pyridinium and an amine H donor of the cation. Further amine-carboxyl N-H...O interactions form enlarged centrosymmetric rings [graph set R4/4(18)] and extensions down b, giving a three-dimensional structure.

Relevance:

10.00%

Publisher:

Abstract:

Analysis of footprints or footwear impressions recovered from a crime scene is a well-known and well-accepted part of forensic investigation. When this evidence is obtained by investigating officers, comparative analysis against a suspect's evidence may be undertaken. This can be done either by detectives or, in some cases, by podiatrists with experience in forensic analysis. Frequently asked questions of a podiatrist include: "What additional information should be collected from a suspect (for the purposes of comparison), and how should it be collected?" This paper explores the answers to these and related questions based on 20 years of practical experience in the field of crime scene analysis as it relates to podiatry and forensics. Elements of normal and abnormal foot function are explored and used to explain the high degree of variability in wear patterns produced by the interaction of the foot and footwear. Based on this understanding, the potential for identifying unique features of the wearer and correlating these with footwear evidence becomes apparent. Standard protocols adopted by podiatrists allow more precise, reliable and valid results to be obtained from their analysis. Complex data sets are now being obtained by investigating officers and, in collaboration with the podiatrist, higher-quality conclusions are being achieved. This presentation details the results of investigations which have used standard protocols to collect and analyse footwear and suspect evidence from recent major crimes.