886 results for Localization real-world challenges
Abstract:
This presentation will provide an overview of the load applied to the residuum of transfemoral amputees fitted with an osseointegrated fixation during (A) rehabilitation, including static and dynamic load-bearing exercises (e.g., rowing, adduction, abduction, squat, cycling, walking with aids), and (B) activities of daily living, including standardized activities (e.g., level walking in a straight line and around a circle, ascending and descending slopes and stairs) and activities in real-world environments.
Abstract:
Discounted Cumulative Gain (DCG) is a well-known ranking evaluation measure for models built from data with multiple graded relevance levels. By treating the tagging data used in recommendation systems as an ordinal relevance set of {negative, null, positive}, we propose to build a DCG-based recommendation model. We present an efficient and novel learning-to-rank method that optimizes DCG for a recommendation model using this tagging-data interpretation scheme. Evaluating the proposed method on real-world datasets, we demonstrate that it is scalable and outperforms the benchmark methods by generating a higher-quality top-N item recommendation list.
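As a concrete illustration of the measure being optimized, the following minimal Python sketch computes DCG and normalised DCG for a ranked list of tagged items, assuming the ordinal levels {negative, null, positive} map to graded relevance scores 0, 1 and 2; the exponential gain and log2 discount used here are common conventions, not necessarily the exact formulation in the paper.

```python
import math

# Hedged sketch: one common DCG formulation, assuming the ordinal tag levels
# map to graded relevance scores as negative=0, null=1, positive=2. The
# paper's exact gain and discount choices are not reproduced here.
GRADE = {"negative": 0, "null": 1, "positive": 2}

def dcg_at_k(ranked_tags, k):
    """DCG@k with exponential gain and log2 rank discount."""
    score = 0.0
    for i, tag in enumerate(ranked_tags[:k], start=1):
        rel = GRADE[tag]
        score += (2 ** rel - 1) / math.log2(i + 1)
    return score

def ndcg_at_k(ranked_tags, k):
    """Normalised DCG: divide by the DCG of the ideal (best possible) ordering."""
    ideal = sorted(ranked_tags, key=lambda t: GRADE[t], reverse=True)
    ideal_dcg = dcg_at_k(ideal, k)
    return dcg_at_k(ranked_tags, k) / ideal_dcg if ideal_dcg > 0 else 0.0

# Example: score a top-5 recommendation list against a user's tags.
print(ndcg_at_k(["positive", "null", "negative", "positive", "null"], k=5))
```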
Abstract:
3D printing (3Dp) has long been used in the manufacturing sector as a way to automate and accelerate production and reduce waste materials. It is able to build a wide variety of objects, provided the necessary specifications are supplied to the printer and the limited range of available materials poses no problems. With 3Dp becoming cheaper, more reliable and, as a result, more prevalent in the world at large, it may soon make inroads into the construction industry. Little is known, however, of the current use of 3Dp in the construction industry and its potential for the future, and this paper seeks to rectify this situation by providing a review of the relevant literature. In doing this, the three main 3Dp methods of contour crafting, concrete printing and D-shape 3Dp are described which, as opposed to the traditional construction method of cutting materials down to size, deliver only what is needed for completion, vastly reducing waste. Also identified is 3Dp’s potential to enable buildings to be constructed many times faster and with significantly reduced labour costs. In addition, it is clear that construction 3Dp can allow the further inclusion of Building Information Modelling into the construction process, streamlining and improving the scheduling requirements of a project. However, current 3Dp processes are known to be costly, unsuited to large-scale products and conventional design approaches, and have a very limited range of usable materials. Moreover, the only successful examples of construction 3Dp in action to date have occurred in controlled laboratory environments and, as real-world trials have yet to be completed, it remains to be seen whether it can be equally proficient in practical situations. Key Words: 3D Printing; Contour Crafting; Concrete Printing; D-shape; Building Automation.
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high-frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low-frequency data (such as data from a fleet of private cars, buses or light duty vehicles, or from smartphones), their performance falls to around 70% in terms of correct link identification, especially in urban and sub-urban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low-frequency GPS data. Therefore, this paper develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low-frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and vehicle trajectory are considered: one shortest path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for 30 s GPS data. When the shortest-path and vehicle-trajectory information is omitted, the accuracy of the algorithm reduces to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
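To make the two additional weights more concrete, the sketch below scores a candidate link from (a) the agreement between the distance along the shortest path and the distance along the vehicle trajectory, and (b) the heading difference. The functional forms and the equal combination weights are illustrative assumptions, not the stMM algorithm's actual formulation.

```python
import math

# Hedged sketch of the two extra weights described above; the specific scoring
# functions and coefficients below are illustrative assumptions only.

def shortest_path_weight(d_shortest_path, d_trajectory):
    """Reward candidate links for which the shortest-path distance between two
    fixes is close to the distance travelled along the vehicle trajectory."""
    longest = max(d_shortest_path, d_trajectory)
    if longest == 0:
        return 1.0
    return min(d_shortest_path, d_trajectory) / longest

def heading_weight(link_heading_deg, trajectory_heading_deg):
    """Reward candidate links whose bearing agrees with the vehicle trajectory."""
    diff = abs(link_heading_deg - trajectory_heading_deg) % 360
    diff = min(diff, 360 - diff)                 # smallest angular difference
    return math.cos(math.radians(diff)) if diff < 90 else 0.0

# Combine with the conventional proximity and connectivity weights when scoring
# a candidate link (the equal weighting here is purely illustrative).
score = 0.5 * shortest_path_weight(420.0, 455.0) + 0.5 * heading_weight(87.0, 92.0)
print(score)
```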
Abstract:
In the structural health monitoring (SHM) field, long-term continuous vibration-based monitoring is becoming increasingly popular, as it can keep track of the health status of structures during their service lives. However, implementing such a system is not always feasible due to the ongoing conflict between budget constraints and the need for sophisticated systems to monitor real-world structures under their demanding in-service conditions. To address this problem, this paper presents a comprehensive development of a cost-effective and flexible vibration DAQ system for long-term continuous SHM of a newly constructed institutional complex, with a special focus on the main building. First, selections of sensor type and sensor positions are scrutinized to overcome adversities such as low-frequency and low-level vibration measurements. To economically tackle the sparse measurement problem, a cost-optimized Ethernet-based peripheral DAQ model is adopted to form the system skeleton. A combination of a high-resolution timing coordination method based on the TCP/IP command communication medium and a periodic system resynchronization strategy is then proposed to synchronize data from multiple distributed DAQ units. The results of both experimental evaluations and experimental–numerical verifications show that the proposed DAQ system in general, and the data synchronization solution in particular, work well and can provide a promising cost-effective and flexible alternative for use in real-world SHM projects. Finally, the paper demonstrates simple but effective ways to use the developed monitoring system for long-term continuous structural health evaluation, as well as to use the instrumented building as a multi-purpose benchmark structure for studying not only practical SHM problems but also synchronization-related issues.
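As a rough illustration of command-channel timing coordination of the kind described, the sketch below estimates each DAQ unit's clock offset with an NTP-style round trip over TCP and refreshes the offsets periodically; the TIME? command, the port number and the symmetric-latency assumption are hypothetical placeholders, not the system's actual protocol.

```python
import socket
import time

def estimate_clock_offset(host, port=5025, command=b"TIME?\n"):
    """Estimate remote clock offset as remote_time minus the local round-trip
    midpoint, assuming symmetric network latency (hypothetical protocol)."""
    t_send = time.time()
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(command)                     # hypothetical time query
        remote_time = float(sock.recv(64).decode().strip())
    t_recv = time.time()
    midpoint = (t_send + t_recv) / 2.0
    return remote_time - midpoint

def refresh_offsets(units):
    """One resynchronisation round; call periodically (e.g. every few minutes)
    and subtract each offset from that unit's raw sample timestamps."""
    return {host: estimate_clock_offset(host) for host in units}

# offsets = refresh_offsets(["daq-unit-1.local", "daq-unit-2.local"])
# corrected_timestamp = raw_timestamp - offsets["daq-unit-1.local"]
```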
Abstract:
The most important aspect of modelling a geological variable, such as metal grade, is the spatial correlation. Spatial correlation describes the relationship between realisations of a geological variable sampled at different locations. Any method for spatially modelling such a variable should be capable of accurately estimating the true spatial correlation. Conventional kriged models are the most commonly used in mining for estimating grade or other variables at unsampled locations, and these models use the variogram or covariance function to model the spatial correlations in the process of estimation. However, this usage assumes that the relationships between observations of the variable of interest at nearby locations are influenced only by the vector distance between the locations. This means that these models assume linear spatial correlation of grade. In reality, the relationship with an observation of grade at a nearby location may be influenced by both the distance between the locations and the value of the observations (i.e. non-linear spatial correlation, such as may exist for variables of interest in geometallurgy). Hence, using a kriged model to estimate grade at unsampled locations may lead to inaccurate estimation of the ore reserve when non-linear spatial correlation is present. Copula-based methods, which are widely used in financial and actuarial modelling to quantify non-linear dependence structures, may offer a solution. This approach was introduced to geostatistical modelling by Bárdossy and Li (2008) to quantify the non-linear spatial dependence structure in a groundwater quality measurement network. Their copula-based spatial modelling is applied in this research paper to estimate the grade of 3D blocks. Furthermore, real-world mining data are used to validate this model. The copula-based grade estimates are compared with the results of conventional ordinary and lognormal kriging to demonstrate the reliability of this method.
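For readers unfamiliar with the general workflow, the simplified sketch below estimates grade at an unsampled 3D block centroid by rank-transforming the observations to normal scores, applying an assumed exponential spatial correlation model and back-transforming the conditional estimate. It uses a Gaussian copula for brevity, whereas Bárdossy and Li (2008) employ non-Gaussian copulas, so it illustrates the workflow only, not the paper's actual model.

```python
import numpy as np
from scipy.stats import norm
from scipy.spatial.distance import cdist

# Illustrative Gaussian-copula sketch only; the correlation model, its range
# parameter and the sample data below are assumptions.

def empirical_normal_scores(z):
    """Map grades to normal scores via their empirical ranks."""
    ranks = np.argsort(np.argsort(z)) + 1
    u = ranks / (len(z) + 1.0)
    return norm.ppf(u), np.sort(z), np.sort(u)

def copula_estimate(coords, z, target, range_param=150.0):
    """Conditional grade estimate at an unsampled 3D block centroid."""
    y, z_sorted, u_sorted = empirical_normal_scores(z)

    def corr(h):
        return np.exp(-3.0 * h / range_param)    # exponential correlation model

    C = corr(cdist(coords, coords))              # sample-to-sample correlation
    c0 = corr(cdist(coords, target[None, :])).ravel()
    w = np.linalg.solve(C, c0)                   # conditional weights
    y0 = w @ y                                   # conditional mean in Gaussian space
    return np.interp(norm.cdf(y0), u_sorted, z_sorted)  # back to grade units

coords = np.array([[0.0, 0.0, 0.0], [50.0, 10.0, 5.0], [20.0, 60.0, -10.0], [80.0, 80.0, 0.0]])
grades = np.array([1.2, 0.8, 2.5, 1.6])
print(copula_estimate(coords, grades, target=np.array([30.0, 30.0, 0.0])))
```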
Abstract:
Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed to impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces leads to changes in stormwater runoff characteristics, whilst a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems is closely dependent on the state of knowledge in relation to the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which significantly underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics in the areas of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme in relation to pollutant build-up, an urban catchment monitoring programme in relation to stormwater quality and the outcomes from advanced statistical analyses provided the platform for the knowledge creation. Two case studies and two real-world applications are discussed to illustrate the translation of the knowledge created to practical use in relation to the role of rainfall and catchment characteristics on urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach where stormwater treatment systems are designed based solely on stormwater quantity data. Additionally, the study investigated how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters, such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential factors that should be incorporated into modelling in relation to catchment characteristics also include urban form and the distribution of impervious surface area.
The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice such as hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement. Furthermore, this monograph also demonstrates how fundamental knowledge of stormwater quality processes can be translated to provide guidance on engineering practice, the comprehensive application of multivariate data analysis techniques and a paradigm for the integrative use of computer models and mathematical models to derive practical outcomes.
Abstract:
Semantic perception and object labeling are key requirements for robots interacting with objects on a higher level. Symbolic annotation of objects allows the use of planning algorithms for object interaction, for instance in a typical fetch-and-carry scenario. In current research, perception is usually based on 3D scene reconstruction and geometric model matching, where trained features are matched with a 3D sample point cloud. In this work we propose a semantic perception method based on spatio-semantic features. These features are defined in a natural, symbolic way, such as geometry and spatial relation. In contrast to point-based model matching methods, a spatial ontology is used in which objects are described in terms of how they "look", similar to how a human would describe unknown objects to another person. A fuzzy-based reasoning approach matches perceivable features with a spatial ontology of the objects. The approach is able to deal with sensor noise and occlusions. Another advantage is that no training phase is needed in order to learn object features. The use case of the proposed method is the detection of soil sample containers in an outdoor environment, which have to be collected by a mobile robot. The approach is verified using real-world experiments.
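A minimal sketch of the kind of fuzzy matching involved is shown below: trapezoidal membership functions over simple geometric features are combined with a fuzzy AND (minimum) to score how well a perceived point-cloud cluster fits an ontology entry. The feature names, value ranges and the container description are illustrative assumptions, not the ontology used in the paper.

```python
# Hedged sketch: trapezoidal fuzzy memberships over assumed geometric features,
# combined with min() as the fuzzy AND; tolerant of moderately noisy measurements.

def trapezoid(x, a, b, c, d):
    """Membership rising on [a, b], flat at 1 on [b, c], falling on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical ontology entry: "a soil sample container looks like a small
# box-shaped object roughly 0.2-0.3 m wide, resting near the ground".
CONTAINER = {
    "width_m": (0.10, 0.20, 0.30, 0.40),
    "height_m": (0.05, 0.15, 0.25, 0.35),
    "height_above_ground_m": (-0.05, 0.00, 0.05, 0.15),
}

def match_degree(perceived, description):
    """Fuzzy AND over all described features of a perceived cluster."""
    return min(trapezoid(perceived[f], *params) for f, params in description.items())

perceived_cluster = {"width_m": 0.27, "height_m": 0.18, "height_above_ground_m": 0.02}
print(match_degree(perceived_cluster, CONTAINER))
```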
Abstract:
This paper is based on a study examining the impact of young people’s backgrounds and educational experiences on career choice capability, with the aim of informing education policy. A total of 706 students from secondary schools (Years 9-12) in New South Wales, Australia, took part in an online survey. This paper focuses on the differences found between groups on the basis of their educational experiences. Participants who were uncertain of their future career plans were more likely to attend non-selective, non-metropolitan schools and were more likely to hold negative attitudes towards school. Career ‘uncertain’ students were also less likely to be satisfied with the elective subjects offered at their school and reported less access to career education sessions. It is concluded that timely career information and guidance should be provided to students and their families in order to allow them to make more meaningful use of the resources and opportunities available to them, with a view toward converting these into real-world benefits.
Abstract:
Higher education institutions across the world are experiencing a new generation of students, known as millennial learners. They are more technologically literate and digitally connected than previous generations of learners. To meet the teaching and learning needs of these learners, we must offer more deliberate and meaningful learning experiences and opportunities, where students can see the connections between new material, their own experiences and real-world applications – an academagogic approach. This study compares the implementation of academagogy for two different groups of millennial learners: one taking a traditional face-to-face undergraduate Engineering unit, and the other a mixed-mode (online and face-to-face) undergraduate Design unit. The units are discussed in terms of their student evaluation results, both qualitative and quantitative, and in terms of their academic outcomes for students. Conclusions are drawn about the applicability of academagogy as a heuristic for improving teaching and learning across disciplines, as well as its strengths and limitations in terms of student results.
Abstract:
In this paper, we assess whether quality survives the test of time in academia by comparing up to 80 years of academic journal article citations from two top journals, Econometrica and the American Economic Review. The research setting under analysis is analogous to a controlled real-world experiment in that it involves a homogeneous task (trying to publish in top journals) by individuals with a homogeneous job profile (academics) in a specific research environment (economics and econometrics). Comparing articles published concurrently in the same outlet at the same time (same issue) indicates that symbolic capital or power due to institutional affiliation or connection does seem to boost citation success at the beginning, giving those educated at or affiliated with leading universities an initial comparative advantage. Such an advantage, however, does not hold in the long run: at a later stage, the publications of other researchers become as or even more successful.
Abstract:
This thesis introduces a new way of using prior information in a spatial model and develops scalable algorithms for fitting this model to large imaging datasets. These methods are employed for image-guided radiation therapy and satellite-based classification of land use and water quality. The study utilized a pre-computation step to achieve a hundredfold improvement in the elapsed runtime for model fitting. This makes it much more feasible to apply these models to real-world problems, and enables full Bayesian inference for images with a million or more pixels.
Abstract:
Provides an accessible foundation to Bayesian analysis using real-world models. This book aims to present an introduction to Bayesian modelling and computation by considering real case studies drawn from diverse fields spanning ecology, health, genetics and finance. Each chapter comprises a description of the problem, the corresponding model, the computational method, results and inferences, as well as the issues that arise in the implementation of these approaches. Case Studies in Bayesian Statistical Modelling and Analysis:
• Illustrates how to do Bayesian analysis in a clear and concise manner using real-world problems.
• Focuses each chapter on a real-world problem and describes the way in which the problem may be analysed using Bayesian methods.
• Features approaches that can be used in a wide range of application areas, such as health, the environment, genetics, information science, medicine, biology, industry and remote sensing.
Case Studies in Bayesian Statistical Modelling and Analysis is aimed at statisticians, researchers and practitioners who have some expertise in statistical modelling and analysis, and some understanding of the basics of Bayesian statistics, but little experience in its application. Graduate students of statistics and biostatistics will also find this book beneficial.
Abstract:
Public buildings and large infrastructure are typically monitored by tens or hundreds of cameras, all capturing different physical spaces and observing different types of interactions and behaviours. However, to date, in large part due to limited data availability, crowd monitoring and operational surveillance research has focused on single-camera scenarios, which are not representative of real-world applications. In this paper we present a new, publicly available database for large-scale crowd surveillance. Footage from 12 cameras for a full work day covering the main floor of a busy university campus building, including an internal and external foyer, elevator foyers, and the main external approach, is provided, alongside annotation for crowd counting (single or multi-camera) and pedestrian flow analysis for 10 and 6 sites respectively. We describe how this large dataset can be used to perform distributed monitoring of building utilisation, and demonstrate its potential for understanding and learning the relationship between different areas of a building.
Abstract:
Games and activities, often involving aspects of pretence and fantasy play, are an essential aspect of everyday preschool life for many young children. Young children’s spontaneous play activities can be understood as social life in action. Increasingly, young children’s games and activities involve their engagement in pretence, using play props to represent computers, laptops and other pieces of technology equipment. In this way, pretend play becomes a context for engaging with matters from the real world. There are a number of studies investigating school-aged children engaging in gaming and other online activities, but less is known about what young children are doing with online technologies. Drawing on Australian Research Council funded research of children engaging with technologies at home and school, this chapter investigates how young children use technologies in everyday life by showing how they draw on props, both real and imaginary, to support their play activities. An ethnomethodological approach using conversation analysis is used to explore how children’s gestures, gaze and talk work to introduce ideas and activities. This chapter contributes to understandings of how children’s play intersects with technologies and pretend play.