5 results for Filters methods

in Deakin Research Online - Australia


Relevance:

30.00%

Publisher:

Abstract:

In theory, our research questions should drive our choice of method. In practice, we know this is not always the case. At various stages of the research process, different factors may restrict the choice of research method. These filters might include a series of inter-related factors such as the political context of the research, the disciplinary affiliation of the researchers, the research setting and peer review. We suggest that as researchers conduct research and encounter the various filters, they come to know the methods that are more likely to survive the filtering process. In future projects they may favour these methods. Public health problems and research questions may increasingly be framed in terms that can be addressed by a restricted array of methods. Innovative proposals, where new methods are applied to old problems or old methods to new areas of inquiry, and high-quality interdisciplinary research may be unlikely to survive the processes of filtering. This may skew the public health knowledge base, limiting public health action. We argue that we must begin to investigate the process of research itself. We need to document how and why particular methods are chosen to investigate particular sets of public health problems. This will help us understand how we know what we know in public health, and help us plan how we may more appropriately draw upon a range of research methods.

Relevance:

30.00%

Publisher:

Abstract:

Background : The aim of the ACE-Obesity study was to determine the economic credentials of interventions which aim to prevent unhealthy weight gain in children and adolescents. We have reported elsewhere on the modelled effectiveness of 13 obesity prevention interventions in children. In this paper, we report on the cost results and associated methods together with the innovative approach to priority setting that underpins the ACE-Obesity study.

Methods : The Assessing Cost-Effectiveness (ACE) approach combines technical rigour with 'due process' to facilitate evidence-based policy analysis. Technical rigour was achieved through the use of standardised evaluation methods, assembly of the best available evidence by the research team, and extensive uncertainty analysis. Cost estimates were based on pathway analysis, with resource usage estimated for the interventions and their 'current practice' comparator, as well as associated cost offsets. Due process was achieved through the involvement of stakeholders, consensus decisions informed by briefing papers, and a 2nd stage filter analysis that captures broader factors influencing policy judgements in addition to the cost-effectiveness results. The 2nd stage filters agreed by stakeholders were 'equity', 'strength of the evidence', 'feasibility of implementation', 'acceptability to stakeholders', 'sustainability' and 'potential for side-effects'.
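The pathway analysis described above can be sketched in a few lines: an intervention's gross cost is the sum of resource quantities times unit prices along the pathway, and the net cost subtracts associated cost offsets. This is a minimal illustration only; the function name and all figures are hypothetical, not taken from the ACE-Obesity study.

```python
# Hypothetical sketch of pathway-based cost estimation: sum resource
# usage times unit price along the pathway, then subtract cost offsets
# (e.g. averted treatment costs). All figures are illustrative.

def pathway_cost(resources, cost_offset=0.0):
    """resources: list of (quantity, unit_price) pairs along the pathway."""
    gross = sum(quantity * unit_price for quantity, unit_price in resources)
    return gross - cost_offset

# Illustrative intervention vs. its 'current practice' comparator
intervention = pathway_cost([(1000, 250.0), (40, 1200.0)], cost_offset=30000.0)
comparator = pathway_cost([(1000, 50.0)])
incremental_cost = intervention - comparator  # the cost difference fed into the evaluation
```

The same incremental structure (intervention minus comparator, net of offsets) underlies the per-intervention cost estimates reported in the Results.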

Results :
The intervention costs varied considerably, both in absolute terms (from cost saving [6 interventions] to in excess of AUD50m per annum) and when expressed as a 'cost per child' estimate (from <AUD1.0 [reduction of TV advertising of high fat foods/high sugar drinks] to >AUD31,000 [laparoscopic adjustable gastric banding for morbidly obese adolescents]). High costs per child reflected cost structure, target population and/or under-utilisation.

Conclusions : The use of consistent methods enables valid comparison of potential intervention costs and cost offsets for each of the interventions. ACE-Obesity informs policy-makers about the cost-effectiveness, health impact, affordability and 2nd stage filters of important options for preventing unhealthy weight gain in children. Related articles will present and analyse the cost-effectiveness results and 2nd stage filter considerations for each intervention assessed.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents two novel algorithms for blind channel equalization (BCE) and blind source separation (BSS). Besides these, a general framework for global convergence analysis is proposed. Finally, the open problem of equalising a non-irreducible system is addressed by the algorithm proposed in this thesis.

Relevance:

30.00%

Publisher:

Abstract:

Recently, many scholars have made use of filter fusion to enhance the performance of spam filtering, and in the past several years a lot of effort has been devoted to different ensemble methods to achieve better performance. In practice, however, how to select appropriate ensemble methods for spam filtering remains an unsolved problem. In this paper, we investigate this problem by designing a framework to compare the performance of various ensemble methods, which helps researchers fight spam email more effectively in applied systems. The experimental results indicate that online methods perform well on accuracy, while off-line batch methods are evidently influenced by the size of the data set. When a large data set is involved, the performance of off-line batch methods is not on par with online methods, and within the framework of online methods, parallel ensembles perform better when only complex filters are used.
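As a concrete illustration of the parallel-ensemble idea, the sketch below combines several toy keyword filters by majority vote. The filters and keywords are illustrative stand-ins, not the complex filters evaluated in the paper, and the online/batch training distinction is not modelled here.

```python
# A minimal sketch of parallel ensemble spam filtering by majority
# vote. Each toy "filter" flags a message independently; the ensemble
# decision is the majority of the individual votes.

def keyword_filter(message, keywords):
    """Flag a message as spam if it contains any of the given keywords."""
    lowered = message.lower()
    return any(word in lowered for word in keywords)

# Hypothetical member filters run in parallel over the same message
FILTERS = [
    lambda m: keyword_filter(m, {"lottery", "prize"}),
    lambda m: keyword_filter(m, {"free money", "winner"}),
    lambda m: keyword_filter(m, {"click here", "urgent"}),
]

def ensemble_is_spam(message):
    """Parallel ensemble: collect every filter's vote, take the majority."""
    votes = sum(f(message) for f in FILTERS)
    return votes > len(FILTERS) / 2
```

A single noisy filter (e.g. one triggered by "urgent" in a legitimate work email) is outvoted by the others, which is the robustness argument usually made for parallel ensembles.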

Relevance:

30.00%

Publisher:

Abstract:

We present a comparative evaluation of state-of-the-art algorithms for detecting pedestrians in low-frame-rate, low-resolution footage acquired by mobile sensors. Four approaches are compared: a) the Histogram of Oriented Gradients (HoG) approach [1]; b) a new histogram feature formed by the weighted sum of both the gradient magnitude and the filter responses from a set of elongated Gaussian filters [2] corresponding to the quantised orientation, called the Histogram of Oriented Gradient Banks (HoGB) approach; c) the codebook-based HoG feature with a branch-and-bound (efficient subwindow search) algorithm [3]; and d) the codebook-based HoGB approach. Results show that the HoG-based detector achieves the highest true positive detection rate, the HoGB approach has the lowest false positive rate whilst maintaining a true positive rate comparable to HoG, and the codebook approaches allow computationally efficient detection.
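The core building block of the HoG feature in approach (a) is a magnitude-weighted histogram of gradient orientations. A minimal numpy sketch of that one step might look as follows; it omits the cell grid and block normalisation of the full descriptor, and is not the implementation evaluated in the paper.

```python
import numpy as np

def orientation_histogram(patch, n_bins=9):
    """Magnitude-weighted histogram of unsigned gradient orientations.

    Simplified illustration of the HoG building block: compute per-pixel
    gradients, then bin orientations (in [0, 180) degrees) weighted by
    gradient magnitude. The cell/block structure of full HoG is omitted.
    """
    gy, gx = np.gradient(patch.astype(float))
    magnitude = np.hypot(gx, gy)
    angle = np.degrees(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    bin_width = 180.0 / n_bins
    idx = np.minimum((angle / bin_width).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, idx.ravel(), magnitude.ravel())  # accumulate weighted votes
    return hist

# A patch with a purely vertical edge puts all its mass in one bin
edge_patch = np.tile(np.array([0.0, 0.0, 1.0, 1.0]), (4, 1))
hist = orientation_histogram(edge_patch)
```

The HoGB variant described in (b) would additionally weight the histogram by responses from a bank of elongated Gaussian filters per quantised orientation.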