76 results for weighting


Relevance: 10.00%

Publisher:

Abstract:

This thesis addresses the problem of detecting and describing the same scene points in different wide-angle images taken by the same camera at different viewpoints. This is a core competency of many vision-based localisation tasks including visual odometry and visual place recognition. Wide-angle cameras have a large field of view that can exceed a full hemisphere, and the images they produce contain severe radial distortion. When compared to traditional narrow field of view perspective cameras, more accurate estimates of camera egomotion can be found using the images obtained with wide-angle cameras. The ability to accurately estimate camera egomotion is a fundamental primitive of visual odometry, and this is one of the reasons for the increasing popularity of wide-angle cameras for this task. Their large field of view also enables them to capture images of the same regions in a scene taken at very different viewpoints, and this makes them suited to visual place recognition. However, the ability to estimate the camera egomotion and recognise the same scene in two different images depends on the ability to reliably detect and describe the same scene points, or ‘keypoints’, in the images. Most algorithms used for this purpose are designed almost exclusively for perspective images. Applying algorithms designed for perspective images directly to wide-angle images is problematic, as no account is made for the image distortion. The primary contribution of this thesis is the development of two novel keypoint detectors, and a method of keypoint description, designed for wide-angle images. Both reformulate the Scale-Invariant Feature Transform (SIFT) as an image processing operation on the sphere. As the image captured by any central projection wide-angle camera can be mapped to the sphere, applying these variants to an image on the sphere enables keypoints to be detected in a manner that is invariant to image distortion. Each of the variants is required to find the scale-space representation of an image on the sphere, and they differ in the approaches they use to do this. Extensive experiments using real and synthetically generated wide-angle images are used to validate the two new keypoint detectors and the method of keypoint description. The better of the two new keypoint detectors is applied to vision-based localisation tasks including visual odometry and visual place recognition using outdoor wide-angle image sequences. As part of this work, the effect of keypoint coordinate selection on the accuracy of egomotion estimates using the Direct Linear Transform (DLT) is investigated, and a simple weighting scheme is proposed which attempts to account for the uncertainty of keypoint positions during detection. A word reliability metric is also developed for use within a visual ‘bag of words’ approach to place recognition.
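
The thesis's actual DLT weighting scheme is not specified in this abstract, so the following is only a minimal sketch of the general idea of a weighted DLT, shown here for homography estimation, in which each correspondence's equations are scaled by a weight derived from its detection uncertainty. The 1/scale weighting, the homography setting and all names are illustrative assumptions rather than the method developed in the thesis.

```python
import numpy as np

def weighted_dlt_homography(pts_src, pts_dst, scales):
    """Weighted DLT sketch: each correspondence contributes two rows to the
    design matrix, scaled by w = 1/scale as a crude proxy for the positional
    uncertainty of the detected keypoint (an assumption, not the thesis's
    scheme). Requires at least four correspondences."""
    rows = []
    for (x, y), (u, v), s in zip(pts_src, pts_dst, scales):
        w = 1.0 / s
        rows.append(w * np.array([-x, -y, -1, 0, 0, 0, u * x, u * y, u]))
        rows.append(w * np.array([0, 0, 0, -x, -y, -1, v * x, v * y, v]))
    A = np.vstack(rows)
    _, _, vt = np.linalg.svd(A)          # null vector = smallest singular vector
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```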

Relevance: 10.00%

Publisher:

Abstract:

Safety interventions (e.g., median barriers, photo enforcement) and road features (e.g., median type and width) can influence crash severity, crash frequency, or both. Both dimensions—crash frequency and crash severity—are needed to obtain a full accounting of road safety. Extensive literature and common sense both dictate that crashes are not created equal, with fatalities costing society more than 1,000 times the cost of property damage crashes on average. Despite this glaring disparity, the profession has not unanimously embraced or successfully defended a nonarbitrary severity weighting approach for analyzing safety data and conducting safety analyses. It is argued here that the two dimensions (frequency and severity) can be captured jointly by intelligently and reliably weighting crash frequencies, converting all crashes to property-damage-only crash equivalents (PDOEs) by using comprehensive societal unit crash costs. This approach is analogous to calculating axle load equivalents in the prediction of pavement damage: for instance, a 40,000-lb truck causes 4,025 times more stress than does a 4,000-lb car, so simply counting axles is not sufficient. Calculating PDOEs using unit crash costs is the most defensible and nonarbitrary weighting scheme, allows for the simple incorporation of severity and frequency, and leads to crash models that are sensitive to factors that affect crash severity. Moreover, using PDOEs diminishes the errors introduced by underreporting of less severe crashes—an added benefit of the PDOE analysis approach. The method is illustrated with rural road segment data from South Korea (which in practice would develop PDOEs with Korean crash cost data).
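
As a concrete illustration of the weighting described above, the sketch below converts a road segment's crash counts into PDO equivalents by weighting each severity level by its comprehensive unit crash cost relative to a PDO crash. The cost figures and severity levels are placeholders, not the Korean values used in the paper.

```python
# Hypothetical comprehensive societal unit crash costs (USD per crash).
UNIT_COSTS = {"fatal": 4_000_000, "injury": 100_000, "pdo": 4_000}

def pdoe(crash_counts):
    """crash_counts: dict mapping severity level -> crash frequency.
    Each crash is weighted by its unit cost relative to a PDO crash."""
    return sum(count * UNIT_COSTS[sev] / UNIT_COSTS["pdo"]
               for sev, count in crash_counts.items())

# Example: 1 fatal + 5 injury + 20 PDO crashes -> 1*1000 + 5*25 + 20*1 = 1145 PDOEs
print(pdoe({"fatal": 1, "injury": 5, "pdo": 20}))
```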

Relevance: 10.00%

Publisher:

Abstract:

One of the research focuses in the integer least squares problem is the decorrelation technique used to reduce the number of integer parameter search candidates and improve the efficiency of the integer parameter search method. It remains a challenging issue for determining carrier phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, there are three main decorrelation techniques being employed: the integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proven and demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be directly utilized as a performance measure, as it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always represent a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves the decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated based on the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of decorrelated ambiguities was calculated for all the different methods to investigate the performance of ambiguity validation. The performance of the different decorrelation methods was tested and compared using both simulated and real data. The simulation experiment scenarios employ the isotropic probabilistic model using a predetermined eigenvalue and without any geometry or weighting system constraints. The MIICD method outperformed the other three methods, with conditioning improvements over the LAMBDA method of 78.33% and 81.67% without and with the eigenvalue constraint, respectively. The real data experiment scenarios involve both a single-constellation case and a dual-constellation case. Experimental results demonstrate that, compared with LAMBDA, the MIICD method can significantly improve the efficiency of reducing the condition number, by 78.65% and 97.78% in the single-constellation and dual-constellation cases respectively. It also shows improvements in the number of search candidate points of 98.92% and 100% in the single-constellation and dual-constellation cases.
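
The MIICD algorithm itself is not described in the abstract; the sketch below only illustrates the shared evaluation criterion, applying a single integer Gaussian (LAMBDA-style) decorrelation step to an invented 2x2 ambiguity covariance and comparing condition numbers before and after. It is not an implementation of MIICD.

```python
import numpy as np

def condition_number(Q):
    """Ratio of largest to smallest eigenvalue of a covariance matrix, the
    criterion used above to compare decorrelation methods."""
    w = np.linalg.eigvalsh(Q)            # eigenvalues in ascending order
    return w[-1] / w[0]

def integer_gauss_step(Q, i, j):
    """One integer Gaussian decorrelation step: subtract an integer multiple
    of ambiguity j from ambiguity i; a full method iterates such steps."""
    mu = int(round(Q[i, j] / Q[j, j]))
    Z = np.eye(Q.shape[0])
    Z[i, j] = -mu
    return Z @ Q @ Z.T, Z

Q = np.array([[6.29, 5.98],
              [5.98, 6.29]])             # toy ambiguity covariance (invented)
Qz, Z = integer_gauss_step(Q, 0, 1)
print(condition_number(Q), condition_number(Qz))   # condition number drops
```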

Relevance: 10.00%

Publisher:

Abstract:

This paper investigates the use of lip information, in conjunction with speech information, for robust speaker verification in the presence of background noise. It has been previously shown, in our own work and in the work of others, that features extracted from a speaker's moving lips hold speaker dependencies which are complementary to speech features. We demonstrate that the fusion of lip and speech information allows for a highly robust speaker verification system which outperforms either sub-system alone. We present a new technique for determining the weighting to be applied to each modality so as to optimize the performance of the fused system. Given a correct weighting, lip information is shown to be highly effective for reducing the false acceptance and false rejection error rates in the presence of background noise.
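
The paper's exact fusion rule and weight-selection technique are not given in this abstract; the sketch below shows one common form such a scheme could take: a weighted sum of per-trial lip and speech scores, with the modality weight chosen on held-out data to minimise an approximate equal error rate. The function names and the grid search are illustrative assumptions.

```python
import numpy as np

def fuse(lip_scores, speech_scores, w):
    """Weighted-sum fusion of per-trial verification scores (assumed form)."""
    return w * np.asarray(lip_scores) + (1.0 - w) * np.asarray(speech_scores)

def equal_error_rate(scores, labels):
    """Approximate EER: take the threshold where FAR and FRR are closest."""
    labels = np.asarray(labels, dtype=bool)
    best_gap, eer = np.inf, 1.0
    for t in np.unique(scores):
        far = np.mean(scores[~labels] >= t)   # false acceptance rate
        frr = np.mean(scores[labels] < t)     # false rejection rate
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

def best_weight(lip, speech, labels, grid=np.linspace(0.0, 1.0, 21)):
    """Pick the modality weight minimising EER on development data."""
    return min(grid, key=lambda w: equal_error_rate(fuse(lip, speech, w), labels))
```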

Relevance: 10.00%

Publisher:

Abstract:

Investigates the use of lip information, in conjunction with speech information, for robust speaker verification in the presence of background noise. We have previously shown (Int. Conf. on Acoustics, Speech and Signal Proc., vol. 6, pp. 3693-3696, May 1998) that features extracted from a speaker's moving lips hold speaker dependencies which are complementary to speech features. We demonstrate that the fusion of lip and speech information allows for a highly robust speaker verification system which outperforms either subsystem individually. We present a new technique for determining the weighting to be applied to each modality so as to optimize the performance of the fused system. Given a correct weighting, lip information is shown to be highly effective for reducing the false acceptance and false rejection error rates in the presence of background noise.

Relevance: 10.00%

Publisher:

Abstract:

The aim of this research is to develop an indexing model to evaluate the sustainability performance of urban settings, in order to assess the environmental impacts of urban development and to provide planning agencies with an indexing model as a decision-support tool for curbing the negative impacts of urban development. Indicator-based sustainability assessment is embraced as the method. Neighbourhood-level urban form and transport-related indicators are derived from the literature by conducting a content analysis and finalised via a focus group meeting. The model is piloted on three suburbs of Gold Coast City, Australia. The final neighbourhood-level sustainability index score was calculated by employing an equal weighting scheme. The results of the study show that index modelling is a reasonably practical method to measure and visualise local sustainability performance, which can be employed as an effective communication and decision-making tool.
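
A minimal sketch of the equal weighting step described above, assuming min-max normalised indicators (the normalisation method is not stated in the abstract); rows are neighbourhoods and columns are the urban form and transport indicators, with invented values.

```python
import numpy as np

def equal_weight_index(indicator_matrix):
    """Composite neighbourhood sustainability index with equal weights.
    Each indicator is min-max normalised (an assumption), then averaged."""
    X = np.asarray(indicator_matrix, dtype=float)
    norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    return norm.mean(axis=1)             # equal weights = simple average

# Three hypothetical suburbs scored on four indicators
print(equal_weight_index([[0.2, 35, 0.6, 12],
                          [0.5, 20, 0.8, 18],
                          [0.9, 10, 0.4, 25]]))
```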

Relevance: 10.00%

Publisher:

Abstract:

In this paper we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information from newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial or bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested to possibly extend its applicability to a wide variety of scenarios.
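
A minimal sketch of the re-weighting step mentioned above: when a new discrete observation arrives, each particle's weight is multiplied by the likelihood of that observation under the particle's parameters and the weights are renormalised. The logistic dose-response model and the (intercept, slope) parameterisation are illustrative assumptions, not necessarily the model used in the paper.

```python
import numpy as np

def reweight(particles, weights, dose, response):
    """particles: (N, 2) array of (intercept, slope) draws; weights: (N,);
    response: 0 or 1 observed at the given dose."""
    eta = particles[:, 0] + particles[:, 1] * dose
    p = 1.0 / (1.0 + np.exp(-eta))            # success probability per particle
    lik = p if response == 1 else 1.0 - p     # Bernoulli likelihood of the datum
    new_w = weights * lik
    return new_w / new_w.sum()                # renormalise
```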

Relevance: 10.00%

Publisher:

Abstract:

Ultrafine particles (UFPs, <100 nm) are produced in large quantities by vehicular combustion and are implicated in causing several adverse human health effects. Recent work has suggested that a large proportion of daily UFP exposure may occur during commuting. However, the determinants, variability and transport-mode dependence of such exposure are not well understood. The aim of this review was to address these knowledge gaps by distilling the results of ‘in-transit’ UFP exposure studies performed to date, including studies of health effects. We identified 47 exposure studies performed across six transport modes: automobile, bicycle, bus, ferry, rail and walking. These encompassed approximately 3000 individual trips in which UFP concentrations were measured. After weighting mean UFP concentrations by the number of trips in which they were collected, we found overall mean UFP concentrations of 3.4, 4.2, 4.5, 4.7, 4.9 and 5.7 × 10^4 particles cm^-3 for the bicycle, bus, automobile, rail, walking and ferry modes, respectively. The mean concentration inside automobiles travelling through tunnels was 3.0 × 10^5 particles cm^-3. While the mean concentrations were indicative of general trends, we found that the determinants of exposure (meteorology, traffic parameters, route, fuel type, exhaust treatment technologies, cabin ventilation, filtration, deposition, UFP penetration) exhibited marked variability and mode-dependence, such that it is not necessarily appropriate to rank modes in order of exposure without detailed consideration of these factors. Ten in-transit health effects studies have been conducted, and their results indicate that UFP exposure during commuting can elicit acute effects in both healthy and health-compromised individuals. We suggest that future work should focus on further defining the contribution of in-transit UFP exposure to total UFP exposure, exploring its specific health effects and investigating exposures in the developing world.
Keywords: air pollution; transport modes; acute health effects; travel; public transport
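
The trip-weighted pooling described above is simply a weighted mean; a minimal sketch with invented study means and trip counts for a single transport mode:

```python
def trip_weighted_mean(study_means, trip_counts):
    """Pool study means, each weighted by the number of trips behind it."""
    return (sum(m * n for m, n in zip(study_means, trip_counts))
            / sum(trip_counts))

# e.g. three hypothetical bicycle studies (mean particles cm^-3, number of trips)
print(trip_weighted_mean([2.8e4, 3.6e4, 4.1e4], [120, 40, 60]))
```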

Relevance: 10.00%

Publisher:

Abstract:

Selecting an appropriate design-builder is critical to the success of design-build (DB) projects. The objective of this study is to identify selection criteria for design-builders and to compare their relative importance by means of a robust content analysis of 94 Requests for Proposals (RFPs) for public DB projects. These DB projects had an aggregate contract value of over US$3.5 billion and were advertised between 2000 and 2010. This study summarized twenty-six selection criteria and classified them into ten categories, namely (in descending order of relative importance): price, experience, technical approach, management approach, qualification, schedule, past performance, financial capability, responsiveness to the RFP, and legal status. The results showed that even though price remains the most important selection category, its relative importance declined significantly over the decade studied. The categories of qualification, experience and past performance, by contrast, have become increasingly important to DB owners in selecting design-builders. Finally, it is found that the importance weighting of price in large projects is significantly higher than that in small projects. This study provides a useful reference for owners in selecting their preferred design-builders.

Relevance: 10.00%

Publisher:

Abstract:

This PhD represents my attempt to make sense of my personal experiences of depression through the form of cabaret. I first experienced depression in 2006. Previously, I had considered myself to be a happy and optimistic person. I found the experience of depression to be a shock: both in the experience itself, and also in the way it affected my own self-image. These personal experiences, together with my professional history as a songwriter and cabaret performer, have been the motivating force behind the research project. This study has explored the question: What are the implications of applying principles of Michael White’s narrative therapy to the creation of a cabaret performance about depression and bipolar disorder? There is a 50 percent weighting on the creative work, the cabaret performance Mind Games, and a 50 percent weighting on the written exegesis. This research has focussed on the illustration of therapeutic principles in order to play games of truth within a cabaret performance. The research project investigates ways of telling my own story in relation to others’ stories through three re-authoring principles articulated in Michael White’s narrative therapy: externalisation, an autonomous ethic of living and rich descriptions. The personal stories presented in the cabaret were drawn from my own experiences and from interviews with individuals with depression or bipolar disorder. The cabaret focussed on the illustration of therapeutic principles, and was not focussed on therapeutic ends for myself or the interviewees. The research question has been approached through a methodology combining autoethnographic, practice-led and action research. Autoethnographic research is characterised by close investigation of assumptions, attitudes, and beliefs. The combination of autoethnographic, practice-led and action research has allowed me to bring together personal experiences of mental illness, research into therapeutic techniques, social attitudes and public discourses about mental illness and forms of contemporary cabaret to facilitate the creation of a one-woman cabaret performance. The exegesis begins with a discussion of games of truth as informed by Michel Foucault and Michael White, and of self-stigma as informed by Michael White and Erving Goffman. These concepts form the basis for a discussion of my own personal experiences. White’s narrative therapy is focused on individuals re-authoring their stories, or telling their stories in different ways. White’s principles are influenced by Foucault’s notions of truth and power. Foucault’s term games of truth has been used to describe the effect of a ‘truth in flux’ that occurs through White’s re-authoring process. This study argues that cabaret is an appropriate form to represent this therapeutic process because it favours heightened performativity over realism, and showcases its ‘constructedness’ and artificiality. Thus cabaret is well suited to playing games of truth. A contextual review compares two major cabaret trends, personal cabaret and provocative cabaret, in reference to the performer’s relationship with the audience in terms of distance and intimacy. The study draws a parallel between principles of distance and intimacy in Michael White’s narrative therapy and performative notions of distance and intimacy.
The creative component of this study, the cabaret Mind Games, used principles of narrative therapy to present the character ‘Jo’ playing games of truth through: externalising an aspect of her personality (externalisation); exploring different life values (an autonomous ethic of living); and enacting multiple versions of her identity (rich descriptions). This constant shifting between distance and intimacy within the cabaret created the effect of a truth in ‘constant flux’, to use one of White’s terms. There are three inter-related findings in the study. The first finding is that the application of principles of White’s narrative therapy was able to successfully combine provocative and empathetic elements within the cabaret. The second finding is that the personal agenda of addressing my own self-stigma within the project limited the effective portrayal of a ‘truth in flux’ within the cabaret. The third finding presents the view that the cabaret expressed ‘Jo’ playing games of truth in order to journey towards her own "preferred identity claim" (White 2004b) through an act of "self care" (Foucault 2005). The contribution to knowledge of this research project is the application of therapeutic principles to the creation of a cabaret performance. This process has focussed on creating a self-revelatory cabaret that questions notions of a ‘fixed truth’ through combining elements of existing cabaret forms in new ways. Two major forms in contemporary cabaret, the personal cabaret and the provocative cabaret, use the performer-audience relationship in distinctive ways. Through combining elements of these two cabaret forms, I have explored ways to create a provocative cabaret focussed on the act of self-revelation.

Relevance: 10.00%

Publisher:

Abstract:

We report three developments toward resolving the challenge of the apparent basal polytomy of neoavian birds. First, we describe improved conditional down-weighting techniques to reduce noise relative to signal for deeper divergences, and find increased agreement between data sets. Second, we present formulae for calculating the probabilities of finding predefined groupings in the optimal tree. Finally, we report a significant increase in data: nine new mitochondrial (mt) genomes (the dollarbird, New Zealand kingfisher, great potoo, Australian owlet-nightjar, white-tailed trogon, barn owl, a roadrunner [a ground cuckoo], New Zealand long-tailed cuckoo, and the peach-faced lovebird), which together provide data for each of the six main groups of Neoaves proposed by Cracraft (2001). We use his six main groups of modern birds as priors for the evaluation of results. These include passerines, cuckoos, parrots, and three other groups termed “WoodKing” (woodpeckers/rollers/kingfishers), “SCA” (owls/potoos/owlet-nightjars/hummingbirds/swifts), and “Conglomerati”. In general, the support is highly significant, with just two exceptions: the owls move from the “SCA” group to the raptors, particularly the accipitrids (buzzards/eagles) and the osprey, and the shorebirds may be an independent group from the rest of the “Conglomerati”. Molecular dating of the mt genomes supports a major diversification of at least 12 neoavian lineages in the Late Cretaceous. Our results form a basis for further testing with both nuclear-coding sequences and rare genomic changes.

Relevance: 10.00%

Publisher:

Abstract:

"This letter aims to highlight the multisensory integration weighting mechanisms that may account for the results in studies investigating haptic feedback in laparoscopic surgery. The current lack of multisensory theoretical knowledge in laparoscopy is evident, and “a much better understanding of how multimodal displays in virtual environments influence human performance is required” ...publisher website

Relevance: 10.00%

Publisher:

Abstract:

In this paper, we consider a space Riesz fractional advection-dispersion equation. The equation is obtained from the standard advection-diffusion equation by replacing the first-order and second-order space derivatives by the Riesz fractional derivatives of order β1 ∈ (0, 1) and β2 ∈ (1, 2], respectively. The Riesz fractional advection and dispersion terms are approximated by using two fractional centered difference schemes, respectively. A new weighted Riesz fractional finite difference approximation scheme is proposed. When the weighting factor θ = 1/2, a second-order accurate numerical approximation scheme for the Riesz fractional advection-dispersion equation is obtained. Stability, consistency and convergence of the numerical approximation scheme are discussed. A numerical example is given to show that the numerical results are in good agreement with our theoretical analysis.
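
The paper's weighted scheme combines approximations of the kind named above, which the abstract does not spell out; as background, here is a minimal sketch of a fractional centered difference approximation of a Riesz derivative of non-integer order on a uniform grid, assuming the function vanishes off the grid. The grid, boundary treatment and names are illustrative assumptions, not the paper's scheme.

```python
import numpy as np
from math import gamma

def centered_coeffs(alpha, K):
    """Fractional centered difference coefficients g_k, k = -K..K:
    g_k = (-1)^k * Gamma(alpha+1) / (Gamma(alpha/2 - k + 1) * Gamma(alpha/2 + k + 1))."""
    ks = np.arange(-K, K + 1)
    return np.array([(-1) ** k * gamma(alpha + 1)
                     / (gamma(alpha / 2 - k + 1) * gamma(alpha / 2 + k + 1))
                     for k in ks])

def riesz_derivative(u, h, alpha):
    """Approximate the Riesz fractional derivative of non-integer order alpha
    (0 < alpha <= 2) of samples u on a uniform grid with step h, taking u = 0
    outside the grid."""
    n = len(u)
    g = centered_coeffs(alpha, n - 1)         # coefficients for k = -(n-1)..(n-1)
    out = np.zeros(n)
    for i in range(n):
        for k in range(i - (n - 1), i + 1):   # keep only terms where u[i-k] exists
            out[i] -= g[k + n - 1] * u[i - k]
    return out / h ** alpha
```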

Relevance: 10.00%

Publisher:

Abstract:

Forecasts generated by time series models traditionally place greater weight on more recent observations. This paper develops an alternative semi-parametric method for forecasting that does not rely on this convention and applies it to the problem of forecasting asset return volatility. In this approach, a forecast is a weighted average of historical volatility, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in volatility across time (as a measure of market conditions) by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are significantly more accurate than a number of competing approaches at both short and long forecast horizons.
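
A minimal sketch of the weighting idea described above, assuming a Gaussian product kernel over the most recent short-term trend in volatility; the kernel, window length and bandwidth choice are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def kernel_volatility_forecast(vol, window=5, bandwidth=1.0):
    """Forecast next-period volatility as a kernel-weighted average of past
    volatility, giving most weight to periods whose preceding short-term
    trend resembles current market conditions."""
    vol = np.asarray(vol, dtype=float)
    current = vol[-window:]                       # current market conditions
    num = den = 0.0
    for t in range(window, len(vol)):
        trend = vol[t - window:t]                 # conditions preceding period t
        dist2 = np.sum((trend - current) ** 2) / bandwidth ** 2
        w = np.exp(-0.5 * dist2)                  # Gaussian kernel weight
        num += w * vol[t]
        den += w
    return num / den
```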

Relevance: 10.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications, with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button, which is a feature of the EPrints software. At QUT, the repository (http://eprints.qut.edu.au) is managed by the Library. The repository is embedded into a number of other systems at QUT, including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required, and getting the information from the ‘back end’ of the repository was very time-consuming for the Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensibility, support available, skills required) and then gave each a weighting. After considering all the known options available, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for our requirements, as it was deemed capable of meeting 21 out of the 31 high-priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository level, organisational unit level, individual author level and individual item level. The data available includes cumulative total deposits, time series deposits, deposits by item type, % full texts, % open access, cumulative downloads, time series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal vs external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data is displayed in chart, map and table formats. The new statistics dashboards are a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information to be very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice-Chancellor (Research)) to compare the full-text deposit rates (i.e. mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.