878 results for Observational techniques and algorithms


Abstract:

Efficient and effective feature detection and representation is an important consideration when processing videos, and a large number of applications such as motion analysis, 3D scene understanding and tracking depend on it. Amongst several feature description methods, local features are becoming increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational complexity, their performance is still too limited for real-world applications. Furthermore, the rapid increase in the uptake of mobile devices has increased the demand for algorithms that can run with reduced memory and computational requirements. In this paper we propose a semi-binary feature detector-descriptor based on the BRISK detector, which can detect and represent videos with significantly reduced computational requirements while achieving performance comparable to state-of-the-art spatio-temporal feature descriptors. First, the BRISK feature detector is applied on a frame-by-frame basis to detect interest points; the detected key points are then compared against consecutive frames for significant motion. Key points with significant motion are encoded with the BRISK descriptor in the spatial domain and the Motion Boundary Histogram in the temporal domain. This descriptor is not only lightweight but also has lower memory requirements because of the binary nature of the BRISK descriptor, opening the possibility of applications on hand-held devices. We evaluate the combined detector-descriptor performance in the context of action classification with a standard, popular bag-of-features and SVM framework. Experiments are carried out on two popular datasets of varying complexity, and we demonstrate performance comparable to other descriptors at reduced computational complexity.
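As a rough illustration of the detect-then-filter pipeline described above, the following Python sketch uses OpenCV's stock BRISK implementation, with dense optical flow standing in for the consecutive-frame motion check. The motion threshold and flow parameters are assumptions for illustration, and the temporal Motion Boundary Histogram component is omitted; this is a sketch of the idea, not the paper's implementation.

```python
import cv2
import numpy as np

MOTION_THRESHOLD = 1.0  # assumed minimum flow magnitude (pixels/frame)

brisk = cv2.BRISK_create()

def moving_brisk_features(prev_gray, curr_gray):
    """Detect BRISK keypoints in curr_gray, keep only those with significant
    motion relative to prev_gray, and describe the survivors with BRISK."""
    keypoints = brisk.detect(curr_gray, None)
    # Dense optical flow as a simple stand-in for the consecutive-frame check.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    moving = [kp for kp in keypoints
              if magnitude[int(kp.pt[1]), int(kp.pt[0])] > MOTION_THRESHOLD]
    # Binary spatial descriptor; a Motion Boundary Histogram computed over
    # the flow field would supply the temporal part described above.
    moving, descriptors = brisk.compute(curr_gray, moving)
    return moving, descriptors
```

The resulting binary descriptors could then be quantised into per-clip bag-of-features histograms and fed to an SVM, matching the evaluation framework the abstract describes.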

Abstract:

The railway crew scheduling problem is the process of allocating train services to crew duties based on the published train timetable while satisfying operational and contractual requirements. The problem is restricted by many constraints and belongs to the class of NP-hard problems. In this paper, we develop a mathematical model for railway crew scheduling with the aim of minimising the number of crew duties by reducing idle transition times. Duties are generated by arranging scheduled trips over a set of duties and sequentially ordering the set of trips within each duty. The optimisation model includes the time period of relief opportunities within which a train crew can be relieved at any relief point. Existing models and algorithms usually only consider relieving a crew at the beginning of the interval of relief opportunities, which may be impractical. This model involves a large number of decision variables and constraints, and therefore a hybrid of a constructive heuristic and the simulated annealing search algorithm is applied to yield an optimal or near-optimal schedule. The performance of the proposed algorithms is evaluated through computational experiments on randomly generated test instances. The results show that the proposed approaches obtain near-optimal solutions in a reasonable computational time for large-sized problems.
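A minimal sketch of the hybrid idea, under assumed simplifications: trips are plain (start, end) intervals, a constructive step puts one trip per duty, and simulated annealing then perturbs the assignment to trade duty count against idle transition time. The cost weights, neighbourhood move and cooling schedule are illustrative choices, not the paper's formulation.

```python
import math
import random

def cost(duties, idle_weight=0.1):
    """Penalise the number of duties plus idle transition time within each."""
    total = 100.0 * len(duties)
    for duty in duties:
        total += idle_weight * sum(b[0] - a[1] for a, b in zip(duty, duty[1:]))
    return total

def feasible(duty):
    """Trips within a duty must not overlap."""
    return all(a[1] <= b[0] for a, b in zip(duty, duty[1:]))

def neighbour(duties):
    """Move one random trip into another (possibly new) duty."""
    new = [list(d) for d in duties]
    src = random.randrange(len(new))
    trip = new[src].pop(random.randrange(len(new[src])))
    dst = random.randrange(len(new) + 1)
    if dst == len(new):
        new.append([trip])
    else:
        new[dst] = sorted(new[dst] + [trip])
    new = [d for d in new if d]          # drop emptied duties
    return new if all(feasible(d) for d in new) else duties

def anneal(trips, steps=20000, temp=50.0, alpha=0.9995):
    duties = [[t] for t in sorted(trips)]   # constructive start: one trip each
    for _ in range(steps):
        cand = neighbour(duties)
        delta = cost(cand) - cost(duties)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            duties = cand
        temp *= alpha
    return duties
```

For example, `anneal([(0, 60), (70, 120), (10, 90), (130, 200)])` tends to merge compatible trips into few duties. In the paper's setting, relief-opportunity windows and contractual rules would enter through `feasible` and `cost`.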

Abstract:

Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management; it has become more and more important and has attracted wide attention from researchers in different research fields. In this paper, feature selection methods, implementation algorithms and applications of text classification are introduced first. However, because there is much noise in the knowledge extracted by current data-mining techniques for text classification, much uncertainty arises in the process of text classification, produced both during knowledge extraction and during knowledge usage; therefore, more innovative techniques and methods are needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilization of the extracted knowledge is a critical step that poses a great challenge. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely the textual documents which are difficult to separate by classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric named CEI, which is very effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
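To make the Rough Set idea concrete, here is a toy sketch of the standard lower/upper approximation concepts (an illustration only, not the paper's CEI metric or framework), in which documents with identical discretised feature vectors are treated as indiscernible:

```python
from collections import defaultdict

def approximations(feature_vectors, labels, target):
    """Lower and upper approximations of the set of documents labelled
    `target`, using equality of feature vectors as indiscernibility."""
    blocks = defaultdict(set)                 # equivalence classes
    for i, fv in enumerate(feature_vectors):
        blocks[tuple(fv)].add(i)
    positives = {i for i, y in enumerate(labels) if y == target}
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= positives:                # certainly in the class
            lower |= block
        if block & positives:                 # possibly in the class
            upper |= block
    return lower, upper
```

Documents in the boundary region (upper minus lower) are exactly the hard-to-separate cases on which a rough-set decision approach can defer or apply finer rules.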

Abstract:

This paper takes a multimethod approach which combines ethnographic techniques and discourse studies to investigate two contrasting professional groups: community photographers, favela dwellers who have developed photographic projects in Brazil’s favelas, and photojournalists of the mainstream media. Its purpose is to determine how a cultural and social divide in the city of Rio de Janeiro shapes both community photographers’ and mainstream photojournalists’ practices, discourses, and identities. While community photographers strive to establish a humane and positive view of favelas and their residents by shifting the focus from poverty, shortages, violence, and criminality to images of ordinary life, mainstream photojournalists express the view that their role is of primary importance for the defence of human rights in the favelas, helping to prevent, for instance, police abuses and violations. As the data analysis indicated the existence of socio-spatial borders all over Rio de Janeiro, this study adopted the idea of a divided city without denying interconnections between favelas and the city’s political life. Through the analysis of categories which emerged from the data, the complex world of documenting favela life is explored. The major themes touched upon are: the breakdown between the mainstream media and the favela communities; the different kinds of relationships which arise in Rio’s low-income suburbs; and the gradual return of mainstream news workers to favelas.

Abstract:

Diabetic peripheral neuropathy (DPN) is one of the most common long-term complications of diabetes. The accurate detection and quantification of DPN are important for defining at-risk patients, anticipating deterioration, and assessing new therapies. Current methods of detecting and quantifying DPN, such as neurophysiology, lack sensitivity, require expert assessment and focus primarily on large nerve fibers. However, the earliest damage to nerve fibers in diabetic neuropathy is to the small nerve fibers. At present, small nerve fiber damage is assessed using skin/nerve biopsy; both are invasive techniques and are not suitable for repeated investigations.

Abstract:

There has been an intense debate about climatic impacts on the transmission of malaria. It is vitally important to accurately project future impacts of climate change on malaria in order to support effective policy-making and intervention activities concerning malaria control and prevention. This paper critically reviewed the published literature and examined both key findings and methodological issues in projecting future impacts of climate change on malaria transmission. A literature search was conducted using the electronic databases MEDLINE, Web of Science and PubMed. The projected impacts of climate change on malaria transmission were spatially heterogeneous and somewhat inconsistent. The variation in results may be explained by the interaction of climatic factors and malaria transmission cycles, variations in projection frameworks and uncertainties of future socioecological (including climate) changes. Current knowledge gaps are identified, future research directions are proposed and public health implications are assessed. Improving the understanding of the dynamic effects of climate on malaria transmission cycles, advancing modelling techniques and incorporating uncertainties in future socioecological changes are critical for projecting the impact of climate change on malaria transmission.

Abstract:

This contribution outlines synchrotron-based X-ray micro-tomography and its potential use in structural geology and rock mechanics, complementing several recent reviews of X-ray micro-tomography. We summarize the general approach to data acquisition, post-processing and analysis, and thereby aim to provide an entry point for the interested reader. The paper includes tables listing relevant beamlines, available imaging techniques, and free and commercial software packages for data visualization and quantification. We highlight potential applications in a review of relevant literature, including time-resolved experiments and digital rock physics. The paper concludes with a report on ongoing developments and upgrades at synchrotron facilities, framing the future possibilities for imaging sub-second processes in centimetre-sized samples.

Abstract:

This research investigated the response of pile foundations to blast loads and the influence of important parameters. The research techniques and results will enable safer design of pile foundations that are vulnerable to blast loads.

Abstract:

In this paper we describe the approaches adopted to generate the runs submitted to ImageCLEFPhoto 2009, with the aim of promoting document diversity in the rankings. Four of our runs are text-based approaches that employ textual statistics extracted from the captions of images: MMR [1] as a state-of-the-art method for result diversification, two approaches that combine relevance information and clustering techniques, and an instantiation of the Quantum Probability Ranking Principle. The fifth run exploits visual features of the provided images to re-rank the initial results by means of Factor Analysis. The results reveal that our methods based only on text captions consistently improve the performance of the respective baselines, while the approach that combines visual features with textual statistics shows lower levels of improvement.
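For reference, a compact sketch of the MMR re-ranking step used as the diversification baseline; `sim_query` and `sim_docs` are assumed caller-supplied similarity functions, and the trade-off value is arbitrary rather than the runs' actual setting:

```python
def mmr_rerank(doc_ids, sim_query, sim_docs, k=20, lambda_=0.5):
    """Greedily select k documents, balancing query relevance against
    redundancy with the documents already selected."""
    selected, candidates = [], list(doc_ids)
    while candidates and len(selected) < k:
        def score(d):
            redundancy = max((sim_docs(d, s) for s in selected), default=0.0)
            return lambda_ * sim_query(d) - (1 - lambda_) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

With `lambda_ = 1.0` this reduces to a plain relevance ranking; lowering it increasingly penalises documents similar to those already chosen, which is what promotes diversity in the final list.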

Abstract:

This research proposes the development of interfaces to support collaborative, community-driven inquiry into data, which we refer to as Participatory Data Analytics. Since the investigation is led by local communities, it is not possible to anticipate which data will be relevant and what questions are going to be asked. Therefore, users have to be able to construct and tailor visualisations to their own needs. The poster presents early work towards defining a suitable compositional model, which will allow users to mix, match, and manipulate data sets to obtain visual representations with little to no programming knowledge. Following a user-centred design process, we plan to identify appropriate interaction techniques and metaphors for generating such visual specifications on wall-sized, multi-touch displays.

Abstract:

Since the establishment of Australia’s earliest formal studies in landscape architecture, landscape planning has been a traditional focus within post-graduate studies at QUT. Study in this area has evolved from an earlier emphasis on applied physical geography through to traditional techniques and processes in visual assessment and management. The emphasis has since shifted to a more complex exploration of natural, economic, social and cultural landscapes. Recently, the School has explored more innovative and complex dimensions of human and natural landscapes, focusing on particular regions under pressure from local social and economic change, including the under-threat ‘picturesque’ landscapes of the Blackall Range and the Tweed Valley. Attempts to bridge the institution and the landscape have unearthed, through a studio focus, strong connections with notions of sustainable villages, roadside interpretation, wayfinding, local economic initiatives, special area creation, cultural heritage brokering and ecological enhancements. These initiatives have spanned both local practice interests and academic pursuits. Central to this exploration is problem-solving through the investigation of ‘multiple scales’. An open yet intensive program is being developed with a team of ‘futurist’ practitioners offering a range of experiences and perspectives to students. The program is increasingly linked to design studios so that landscape planning and landscape design form a fabric of inquiry that works towards reclaiming complex landscapes.

Abstract:

Computational models in physiology often integrate functional and structural information across a large range of spatio-temporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement for fulfilling these expectations and achieving the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims to inform strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole-ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations and experiments are intertwined in an assemblage that is itself a system: the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a prerequisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.

Abstract:

Effluent from sewage treatment plants has been associated with a range of pollutant effects. Depending on the influent composition and treatment processes, the effluent may contain a myriad of different chemicals, which makes monitoring very complex. In this study we aimed to monitor relatively polar organic pollutant mixtures using a combination of passive sampling techniques and a set of biochemistry-based assays covering acute bacterial toxicity (Microtox™), phytotoxicity (Max-I-PAM assay) and genotoxicity (umuC assay). The study showed that all of the assays were able to detect effects in the samples and allowed a comparison of the two plants as well as a comparison between the two sampling periods. Distinct improvements in water quality were observed in one of the plants as a result of an upgrade to a UV disinfection system: the sample enrichment required to induce a 50% response in the Microtox™ assay rose from 24× to 84×, the enrichment required to induce a 50% reduction in photosynthetic yield rose from 30× to 125×, and the genotoxicity observed in the first sampling period was eliminated. We therefore propose that biochemical assay techniques in combination with time-integrated passive sampling can substantially contribute to the monitoring of polar organic toxicants in STP effluents.

Abstract:

Human saliva harbours proteins of clinical relevance, and about 30% of blood proteins are also present in saliva. This highlights that saliva can be used for clinical applications just as urine or blood can. However, the translation of salivary biomarker discoveries into clinical settings is hampered by the dynamics and complexity of the salivary proteome. This review focuses on the current status of technological developments and achievements relating to approaches for unravelling the human salivary proteome. We discuss the dynamics of the salivary proteome; the importance of sample preparation and processing techniques and their influence on downstream protein applications; post-translational modifications of the salivary proteome; and protein-protein interactions. In addition, we describe possible enrichment strategies for discerning post-translational modifications of salivary proteins, the potential utility of selected-reaction-monitoring techniques for biomarker discovery and validation, limitations of proteomics, the biomarker challenge and future perspectives. In summary, we provide recommendations for practical saliva sampling, processing and storage conditions to increase the quality of future studies in the emerging field of saliva clinical proteomics. We propose that the advent of technologies allowing sensitive and high-throughput proteome-wide analyses, coupled with well-controlled study design, will allow saliva to enter clinical practice as an alternative to blood-based methods, owing to its simple and non-invasive sampling, ease of collection (including multiple collections by untrained personnel) and cost-effectiveness.

Abstract:

In this paper we propose and study low-complexity algorithms for on-line estimation of hidden Markov model (HMM) parameters. The estimates approach the true model parameters as the measurement noise approaches zero, but otherwise give improved estimates, albeit with bias. On a finite data set in the high-noise case, the bias may not be significantly more severe than for a higher-complexity asymptotically optimal scheme. Our algorithms require O(N^3) calculations per time instant, where N is the number of states. Previous algorithms based on earlier hidden Markov model signal processing methods, including the expectation-maximisation (EM) algorithm, require O(N^4) calculations per time instant.
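To illustrate the flavour of on-line HMM estimation, here is a generic recursive-EM-style sketch combining a normalised forward filter with running soft counts; this is not the paper's O(N^3) algorithm, whose details are not given here, and the initialisation of the counts is an assumption.

```python
import numpy as np

def online_hmm_step(alpha, A, B, y, counts):
    """One filtering step plus an on-line re-estimate of the transitions.
    alpha: filtered state posterior at time t-1 (length N)
    A: N x N transition matrix; B: N x M emission matrix; y: observation index
    counts: running expected transition counts (e.g. initialised to ones)
    """
    # Pairwise posterior xi[j, i] proportional to alpha[j] * A[j, i] * B[i, y]
    xi = alpha[:, None] * A * B[:, y][None, :]
    xi /= xi.sum()
    counts += xi                        # accumulate expected transition counts
    A_new = counts / counts.sum(axis=1, keepdims=True)
    alpha_new = xi.sum(axis=0)          # filtered state posterior at time t
    return alpha_new, A_new, counts
```

Each step of this simplified sketch costs O(N^2) in the number of states; the algorithms the abstract proposes are a different construction with O(N^3) cost per time instant.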