849 results for adaptive blind source separation method


Relevance:

30.00%

Publisher:

Abstract:

The World Health Organization recommends that data on mortality in its member countries be collected using the Medical Certificate of Cause of Death published in the instruction volume of the ICD-10. However, investment in the health information processes necessary to promote the use of this certificate and improve mortality information is lacking in many countries. An appeal for support to make improvements has been launched through the Health Metrics Network’s MOVE-IT strategy (Monitoring of Vital Events – Information Technology) [World Health Organization, 2011]. Despite this international spotlight on the need to capture mortality data and to use the ICD-10 to code the data reported on such certificates, there is little cohesion in the way that certifiers of deaths receive instruction in how to complete the death certificate, which is the main source document for mortality statistics. Complete and accurate documentation of the immediate, underlying and contributory causes of death of the decedent on the death certificate is a requirement for producing standardised statistical information and for producing cause-specific mortality statistics that can be compared between populations and across time. This paper reports on a research project conducted to determine the efficacy and accessibility of the certification module of the WHO’s newly developed web-based training tool for coders and certifiers of deaths. Involving a population of medical students from the Fiji School of Medicine and a pre- and post-intervention research design, the study entailed completion of death certificates based on vignettes before and after access to the training tool. The ability of the participants to complete the death certificates, and analysis of the completeness and specificity of the ICD-10 coding of the reported causes of death, were used to measure the effect of the students’ learning from the training tool. The quality of death certificate completion was assessed using a Quality Index before and after the participants accessed the training tool. In addition, the views of the participants about the accessibility and use of the training tool were elicited using a supplementary questionnaire. The results of the study demonstrated improvement in the ability of the participants to complete death certificates fully and accurately according to best practice. The training tool was viewed very positively and its implementation in the curriculum for medical students was encouraged. Participants also recommended that interactive discussions examining the certification exercises would be an advantage.

Relevance:

30.00%

Publisher:

Abstract:

Microbial pollution in water periodically affects human health in Australia, particularly in times of drought and flood. There is an increasing need for the control of waterborne microbial pathogens. Methods that allow the determination of the origin of faecal contamination in water are generally referred to as Microbial Source Tracking (MST). Various approaches have been evaluated as indicators of microbial pathogens in water samples, including detection of different microorganisms and various host-specific markers. However, to date there is no universal MST method that can reliably determine the source (human or animal) of faecal contamination. Therefore, the use of multiple approaches is frequently advised. MST is currently recognised as a research tool, rather than something to be included in routine practice. The main focus of this research was to develop novel and universally applicable methods to meet the demand for MST methods in routine testing of water samples. Escherichia coli was chosen initially as the target organism for our studies as, historically and globally, it is the standard indicator of microbial contamination in water. In this thesis, three approaches are described: single nucleotide polymorphism (SNP) genotyping, clustered regularly interspaced short palindromic repeats (CRISPR) screening using high resolution melt analysis (HRMA), and phage detection development based on CRISPR types. The advantage of combining SNP genotyping and CRISPR loci is discussed in this study. For the first time, a highly discriminatory single nucleotide polymorphism interrogation of an E. coli population was applied to identify host-specific clusters. Six human-specific and one animal-specific SNP profiles were revealed. SNP genotyping was successfully applied in the field investigations of the Coomera watershed, South-East Queensland, Australia. Four human-specific profiles [11], [29], [32] and [45] and the animal-specific SNP profile [7] were detected in water. Two human-specific profiles, [29] and [11], were found to be prevalent in the samples over a period of years. Rainfall (24 and 72 hours), tide height and time, general land use (rural, suburban), season, distance from the river mouth and salinity showed no relationship with the diversity of SNP profiles present in the Coomera watershed (p values > 0.05). Nevertheless, the SNP genotyping method is able to identify and distinguish between human- and non-human-specific E. coli isolates in water sources within one day. In some samples, only mixed profiles were detected. To further investigate host-specificity in these mixed profiles, a CRISPR screening protocol was developed for use on the set of E. coli isolates previously analysed for SNP profiles. CRISPR loci, which record previous attacks by DNA coliphages, were considered a promising tool for detecting host-specific markers in E. coli. Spacers in CRISPR loci could also reveal the dynamics of virulence in E. coli as well as in other pathogens in water. Although host-specificity was not observed in the set of E. coli analysed, CRISPR alleles were shown to be useful in detecting the geographical site of sources. HRMA allows determination of ‘different’ and ‘same’ CRISPR alleles and can be introduced into water monitoring as a cost-effective and rapid method.
Overall, we show that the identified human-specific SNP profiles [11], [29], [32] and [45] can be useful as marker genotypes globally for identification of human faecal contamination in water. The SNP typing approach developed in the current study can be used in water monitoring laboratories as an inexpensive, high-throughput and easily adapted protocol. A unique approach based on E. coli spacers was developed to search for unknown phages and to examine host-specificity in phage sequences. Preliminary experiments on recombinant plasmids showed the possibility of using this method to recover phage sequences. Future studies will determine the host-specificity of DNA phage genotyping as soon as the first reliable sequences can be acquired. Ultimately, only the application of multiple approaches in MST will allow the nature of microbial contamination to be identified with higher confidence and reliability.

Relevance:

30.00%

Publisher:

Abstract:

Power systems in many countries are stressed towards their stability limits. If these systems experience any unexpected serious contingency or disturbance, there is a significant risk of instability, which may lead to a widespread blackout. Frequency is a reliable indicator that such an instability condition exists on the power system; therefore, under-frequency load shedding (UFLS) is used to stabilise the power system by curtailing some load. In this paper, the SFR-UFLS model is redeveloped into an optimal load shedding method that sheds the optimal amount of load following a single particular contingency event. The proposed optimal load shedding scheme is then tested on the 39-bus New England test system to show its performance against a random load shedding scheme.
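The abstract does not spell out the optimisation, but the idea of trading off the amount of load shed against a frequency constraint can be sketched as a small linear program. Everything numerical below (the generation deficit, the aggregate frequency-response constant, the load blocks and their priority weights) is an assumed, illustrative stand-in for the paper's SFR-UFLS formulation.

```python
# Hypothetical sketch: choose how much load to shed at each bus so that the
# quasi-steady-state frequency deviation after a generation-loss contingency
# stays within a limit, while shedding as little (weighted) load as possible.
# The SFR model is reduced here to a single frequency-response constant "beta"
# (MW per Hz); the paper's full SFR-UFLS formulation is more detailed.
import numpy as np
from scipy.optimize import linprog

delta_p_loss = 600.0      # MW of generation lost in the contingency (assumed)
beta = 400.0              # MW/Hz aggregate frequency response (assumed)
max_dev = 0.5             # Hz allowable steady-state frequency deviation

# Sheddable load blocks at candidate buses (MW) and their priority weights
blocks = np.array([80.0, 120.0, 150.0, 200.0, 250.0])
weights = np.array([1.0, 1.2, 1.5, 2.0, 3.0])   # higher = more important load

# Decision variables x_i in [0, 1]: fraction of each block to shed.
# Constraint: delta_p_loss - sum(blocks * x) <= beta * max_dev
A_ub = [(-blocks).tolist()]
b_ub = [beta * max_dev - delta_p_loss]

res = linprog(c=weights * blocks, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0.0, 1.0)] * len(blocks), method="highs")

shed = res.x * blocks
print("Shed per block (MW):", np.round(shed, 1))
print("Total shed (MW):", round(shed.sum(), 1))
print("Estimated frequency deviation (Hz):",
      round((delta_p_loss - shed.sum()) / beta, 3))
```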

Relevance:

30.00%

Publisher:

Abstract:

Natural convection flow from an isothermal vertical plate with a uniform heat source, embedded in a stratified medium, is discussed in this paper. The resulting momentum and energy equations of the boundary layer approximation are made non-similar by introducing the usual non-similarity transformations. Numerical solutions of these equations are obtained by an implicit finite difference method for a wide range of the stratification parameter, X. The solutions are also obtained for different values of the pertinent parameters, namely the Prandtl number, Pr, and the heat generation or absorption parameter, λ, and are expressed in terms of the local skin friction and local heat transfer, which are shown in graphical form. The effect of heat generation or absorption on the streamlines and isotherms is also shown graphically for different values of λ.
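As a rough, hedged illustration of the implicit finite-difference marching mentioned above, the sketch below integrates a heavily simplified model energy equation with a heat generation/absorption parameter λ. The decoupling from the momentum equation, the grid and the boundary conditions are assumptions for illustration, not the paper's non-similar scheme.

```python
# Illustrative sketch only: a backward-Euler (implicit) finite-difference march
# for a model boundary-layer energy equation  d(theta)/dX = d2(theta)/dY2 + lam*theta,
# with theta = 1 at the plate (Y = 0) and theta -> 0 far from it. The paper's
# actual non-similar momentum/energy system is coupled and more involved.
import numpy as np
from scipy.linalg import solve_banded

lam = 0.1                 # heat generation (>0) or absorption (<0) parameter
ny, y_max = 200, 10.0     # grid points and domain height in Y
nx, dx = 100, 0.01        # streamwise steps and step size in X
dy = y_max / (ny - 1)

theta = np.zeros(ny)
theta[0] = 1.0            # isothermal plate

# Banded matrix for (I - dx*D2 - dx*lam*I) theta_new = theta_old
r = dx / dy**2
ab = np.zeros((3, ny))
ab[0, 2:] = -r                                  # super-diagonal (interior rows)
ab[1, 1:-1] = 1.0 + 2.0 * r - dx * lam          # main diagonal (interior rows)
ab[2, :-2] = -r                                 # sub-diagonal (interior rows)
ab[1, 0] = ab[1, -1] = 1.0                      # Dirichlet boundary rows

for _ in range(nx):
    rhs = theta.copy()
    rhs[0], rhs[-1] = 1.0, 0.0                  # boundary conditions
    theta = solve_banded((1, 1), ab, rhs)

# Local heat transfer proxy: wall temperature gradient
print("Wall gradient -d(theta)/dY:", round(-(theta[1] - theta[0]) / dy, 4))
```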

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a novel approach to video deblocking which performs perceptually adaptive bilateral filtering by considering color, intensity, and motion features in a holistic manner. The method is based on the bilateral filter, an effective smoothing filter that preserves edges. The bilateral filter parameters are adaptive, avoiding over-blurring of texture regions while eliminating blocking artefacts in smooth regions and areas of slow-motion content. This is achieved by using a saliency map to control the strength of the filter at each individual point in the image based on its perceptual importance. The experimental results demonstrate that the proposed algorithm is effective in deblocking highly compressed video sequences while avoiding over-blurring of edges and textures in salient regions of the image.
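A minimal sketch of the saliency-controlled filtering idea is given below, assuming OpenCV's standard bilateralFilter and using smoothed gradient magnitude as a crude stand-in for a perceptual saliency map; the parameter values and the per-pixel blending rule are illustrative, not the authors' implementation.

```python
# Sketch (not the authors' method) of saliency-controlled bilateral filtering:
# filter strength is reduced where the saliency map says content is perceptually
# important, and increased in non-salient, blocky smooth areas.
import cv2
import numpy as np

def deblock_frame(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Crude saliency proxy: normalised, smoothed gradient magnitude.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    sal = cv2.GaussianBlur(cv2.magnitude(gx, gy), (9, 9), 0)
    sal = cv2.normalize(sal, None, 0.0, 1.0, cv2.NORM_MINMAX)

    # Two bilateral filters: strong (for flat, blocky regions) and mild
    # (for salient, textured regions).
    strong = cv2.bilateralFilter(frame_bgr, d=9, sigmaColor=60, sigmaSpace=9)
    mild = cv2.bilateralFilter(frame_bgr, d=5, sigmaColor=15, sigmaSpace=5)

    # Per-pixel blend: high saliency keeps the mild result, low saliency the strong one.
    w = sal[..., None]
    out = w * mild.astype(np.float32) + (1.0 - w) * strong.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage (hypothetical file): deblocked = deblock_frame(cv2.imread("compressed_frame.png"))
```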

Relevance:

30.00%

Publisher:

Abstract:

House dust is a heterogeneous matrix which contains a number of biological materials and particulate matter gathered from several sources. It is the accumulation of a number of semi-volatile and non-volatile contaminants; the contaminants are trapped and preserved. Therefore, house dust can be viewed as an archive of both indoor and outdoor air pollution. There is evidence to show that, on average, people tend to stay indoors most of the time, and this increases exposure to house dust. The aims of this investigation were to: (i) assess the levels of Polycyclic Aromatic Hydrocarbons (PAHs), elements and pesticides in the indoor environment of the Brisbane area; (ii) identify and characterise the possible sources of elemental constituents (inorganic elements), PAHs and pesticides by means of Positive Matrix Factorisation (PMF); and (iii) establish the correlations between the levels of indoor air pollutants (PAHs, elements and pesticides) and the external and internal characteristics or attributes of the buildings and indoor activities by means of multivariate data analysis techniques. The dust samples were collected during the period 2005-2007 from homes located in different suburbs of Brisbane, Ipswich and Toowoomba, in South East Queensland, Australia. A vacuum cleaner fitted with a paper bag was used as the sampler for collecting the house dust. A survey questionnaire, containing information about the indoor and outdoor characteristics of the residences, was completed by the house residents. House dust samples were analysed for three different classes of pollutants: pesticides, elements and PAHs. The analyses were carried out on samples of particle size less than 250 µm. The chemical analyses for both pesticides and PAHs were performed using gas chromatography-mass spectrometry (GC-MS), while elemental analysis was carried out using inductively coupled plasma-mass spectrometry (ICP-MS). The data were subjected to multivariate data analysis techniques such as the multi-criteria decision-making procedure Preference Ranking Organisation Method for Enrichment Evaluations (PROMETHEE), coupled with Geometrical Analysis for Interactive Aid (GAIA), in order to rank the samples and to examine the data display. This study showed that the concentrations of pollutants in house dust in Brisbane and the surrounding areas were very high compared to the results from previous work carried out in Australia and overseas. The results of this work also showed significant correlations between some of the physical parameters (type of building material, floor level, distance from industrial areas and major roads, and smoking) and the concentrations of pollutants. The type of building material and the age of the house were found to be two of the primary factors that affect the concentrations of pesticides and elements in house dust. The concentrations of these two types of pollutant appear to be higher in old (timber) houses than in brick ones. In contrast, the concentrations of PAHs were higher in brick houses than in timber ones. Other factors, such as floor level and distance from the main street and industrial areas, also affected the concentrations of pollutants in the house dust samples. To apportion the sources and to understand the mechanisms of the pollutants, the Positive Matrix Factorisation (PMF) receptor model was applied.
The results showed that there were significant correlations between the concentrations of contaminants in house dust and the physical characteristics of the houses, such as the age and type of house, the distance from main roads and industrial areas, and smoking. Sources of pollutants were identified. For PAHs, the sources were cooking activities, vehicle emissions, smoking, oil fumes, natural gas combustion and traces of diesel exhaust emissions; for pesticides, the sources were the application of pesticides for controlling termites in buildings and fences, treating indoor furniture, and use in gardens for controlling pests attacking horticultural and ornamental plants; for elements, the sources were soil, cooking, smoking, paints, pesticides, combustion of motor fuels, residual fuel oil, motor vehicle emissions, wear of brake linings and industrial activities.
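To make the receptor-modelling step concrete, here is a hedged sketch that approximates PMF with scikit-learn's non-negative matrix factorisation (NMF). True PMF additionally weights each measurement by its uncertainty, which plain NMF does not; the input file, column names and number of factors below are hypothetical.

```python
# Illustrative source-apportionment sketch in the spirit of PMF, approximated
# with NMF. Rows = house dust samples, columns = measured species
# (elements, PAHs, pesticides). The CSV file name is an assumption.
import numpy as np
import pandas as pd
from sklearn.decomposition import NMF

data = pd.read_csv("house_dust_concentrations.csv", index_col=0)
X = data.clip(lower=0).to_numpy()

n_sources = 5  # number of factors, normally chosen via diagnostics
model = NMF(n_components=n_sources, init="nndsvda", max_iter=2000, random_state=0)
contributions = model.fit_transform(X)     # G: sample-by-source contributions
profiles = model.components_               # F: source-by-species profiles

# Inspect which species dominate each factor to label likely sources
for k, profile in enumerate(profiles):
    top = data.columns[np.argsort(profile)[::-1][:5]]
    print(f"Factor {k + 1}: top species -> {list(top)}")
```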

Relevance:

30.00%

Publisher:

Abstract:

Appearance-based loop closure techniques, which leverage the high information content of visual images and can be used independently of pose, are now widely used in robotic applications. The current state of the art in the field is Fast Appearance-Based Mapping (FAB-MAP), which has been demonstrated in several seminal robotic mapping experiments. In this paper, we describe OpenFABMAP, a fully open source implementation of the original FAB-MAP algorithm. Beyond the benefits of full user access to the source code, OpenFABMAP provides a number of configurable options, including rapid codebook training and interest point feature tuning. We demonstrate the performance of OpenFABMAP on a number of published datasets and demonstrate the advantages of quick algorithm customisation. We present results from OpenFABMAP’s application in a highly varied range of robotics research scenarios.
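For readers unfamiliar with the codebook-training and image-description steps, the toy sketch below shows a plain bag-of-words place matcher. It is not OpenFABMAP (a C++ library) and omits FAB-MAP's Chow-Liu-tree probabilistic observation model; the filenames, vocabulary size and cosine-similarity comparison are illustrative assumptions only.

```python
# Simplified bag-of-words place matching: train a visual vocabulary, describe
# each image as a word histogram, and propose loop-closure candidates by
# histogram similarity. This is a toy stand-in, not the FAB-MAP model.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

orb = cv2.ORB_create(nfeatures=500)

def describe(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    return desc.astype(np.float32)

image_paths = [f"frame_{i:04d}.png" for i in range(100)]   # hypothetical dataset
all_desc = [describe(p) for p in image_paths]

# Train a small visual vocabulary (codebook) by clustering local descriptors
vocab = KMeans(n_clusters=64, n_init=4, random_state=0).fit(np.vstack(all_desc))

def bow_histogram(desc):
    words = vocab.predict(desc)
    hist = np.bincount(words, minlength=64).astype(np.float32)
    return hist / (hist.sum() + 1e-9)

hists = np.array([bow_histogram(d) for d in all_desc])

# Compare the latest image against earlier ones (ignoring recent frames)
scores = cosine_similarity(hists[-1:], hists[:-10])[0]
print("Best loop-closure candidate:", image_paths[int(np.argmax(scores))])
```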

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates a mixed centralised-decentralised air traffic separation management system, which combines the best features of the centralised and decentralised systems whilst ensuring the reliability of the air traffic management system during degraded conditions. To overcome communication bandwidth limits, we propose a mixed separation manager on the basis of a robust decision (or min-max) problem that is posed over a reduced set of admissible flight avoidance manoeuvres (a FAM alphabet). We also present a design method for selecting an appropriate FAM alphabet for use in the mixed separation management system. Simulation studies are presented to illustrate the benefits of our proposed FAM-alphabet-based mixed separation manager.
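The min-max idea over a FAM alphabet can be sketched in a few lines: choose the manoeuvre whose worst-case cost over a set of uncertain intruder behaviours is smallest. The alphabet entries, intruder scenarios and cost function below are invented for illustration and are not the paper's formulation.

```python
# Hypothetical sketch of the min-max (robust) decision: from a small "alphabet"
# of flight avoidance manoeuvres (FAMs), pick the one whose worst-case cost over
# a set of uncertain intruder behaviours is smallest.
fam_alphabet = {
    "maintain":      (0.0,   0.0),     # (heading change deg, altitude change ft)
    "turn_left_15":  (-15.0, 0.0),
    "turn_right_15": (15.0,  0.0),
    "climb_500":     (0.0,   500.0),
    "descend_500":   (0.0,  -500.0),
}

intruder_scenarios = [(-10.0, 0.0), (0.0, 0.0), (10.0, 300.0)]  # assumed behaviours

def cost(own, intruder, w_sep=1.0, w_dev=0.01):
    """Toy cost: penalise small predicted separation and large own deviation."""
    dh, dz = own
    ih, iz = intruder
    separation = abs(dh - ih) + abs(dz - iz) / 100.0   # crude separation proxy
    deviation = abs(dh) + abs(dz) / 100.0
    return w_sep / (separation + 1e-3) + w_dev * deviation

# Min-max: minimise over the FAM alphabet the maximum cost over the scenarios
best = min(fam_alphabet,
           key=lambda m: max(cost(fam_alphabet[m], s) for s in intruder_scenarios))
print("Robust manoeuvre choice:", best)
```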

Relevance:

30.00%

Publisher:

Abstract:

Interleukin (IL)-18 is a pleiotropic cytokine with functions in immune modulation, angiogenesis and bone metabolism. In this study, the potential of IL-18 as an immunotherapy for prostate cancer (PCa) was examined using the murine model of prostate carcinoma, RM1, and a bone-metastatic variant, RM1(BM)/B4H7-luc. RM1 and RM1(BM)/B4H7-luc cells were stably transfected to express bioactive IL-18. These cells were implanted into syngeneic immunocompetent mice, with or without an IL-18-neutralising antibody (αIL-18, SK113AE4). IL-18 significantly inhibited the growth of both subcutaneous and orthotopic RM1 tumors, and the IL-18-neutralising antibody abrogated this tumor growth inhibition. In vivo neutralisation of interferon-gamma (IFN-γ) completely eliminated the anti-tumor effects of IL-18, confirming an essential role of IFN-γ as a downstream mediator of the anti-tumor activity of IL-18. Tumors from mice in which IL-18 and/or IFN-γ was neutralised contained significantly fewer CD4+ and CD8+ T cells than those with functional IL-18. The essential role of adaptive immunity was demonstrated as tumors grew more rapidly in RAG1−/− mice, or in mice depleted of CD4+ and/or CD8+ cells, than in normal mice. The tumors in RAG1−/− mice were also significantly smaller when IL-18 was present, indicating that innate immune mechanisms are also involved. IL-18 also induced an increase in tumor infiltration by macrophages and neutrophils, but not NK cells. In other experiments, direct injection of recombinant IL-18 into established tumors also inhibited tumor growth, which was associated with an increase in intratumoral macrophages, but not T cells. These results suggest that local IL-18 in the tumor environment can significantly potentiate anti-tumor immunity in the prostate and clearly demonstrate that this effect is mediated by both innate and adaptive immune mechanisms.

Relevance:

30.00%

Publisher:

Abstract:

Phenomenology is a term that has been described as a philosophy, a research paradigm and a methodology, and it has even been equated with qualitative research. In this paper, we first clarify phenomenology by tracing its movement both as a philosophy and as a research method. Next, we make a case for the use of phenomenology in empirical investigations of management phenomena. The paper discusses a selection of central concepts pertaining to phenomenology as a scientific research method, which include description, phenomenological reduction and free imaginative variation. In particular, the paper elucidates the efficacy of Giorgi’s descriptive phenomenological research praxis as a qualitative research method and shows how it can be applied to create a deeper and richer understanding of management practice.

Relevance:

30.00%

Publisher:

Abstract:

Deciding the appropriate population size and number of islands for distributed island-model genetic algorithms is often critical to the algorithm’s success. This paper outlines a method that automatically searches for good combinations of island population sizes and the number of islands. The method is based on a race between competing parameter sets, combined with collaborative seeding of new parameter sets. This method is applicable to any problem, and it makes distributed genetic algorithms easier to use by reducing the number of user-set parameters. The experimental results show that the proposed method robustly and reliably finds population and island settings that are comparable to those found with traditional trial-and-error approaches.
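A toy sketch of the racing-and-seeding idea is given below: candidate (population size, number of islands) settings each run a simple island GA on OneMax for a fixed budget, the worst performer is dropped, and a new setting is seeded near the current best. The GA operators, budgets and candidate values are assumptions, not the authors' algorithm.

```python
# Toy race between island-GA parameter settings on OneMax. Illustration only.
import random

GENOME_LEN, MIGRATE_EVERY = 60, 5

def onemax(bits):
    return sum(bits)

def evolve_islands(pop_size, n_islands, generations):
    islands = [[[random.randint(0, 1) for _ in range(GENOME_LEN)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(generations):
        for isl in islands:
            isl.sort(key=onemax, reverse=True)
            parents = isl[: max(2, pop_size // 2)]
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(GENOME_LEN)
                child = a[:cut] + b[cut:]          # one-point crossover
                if random.random() < 0.1:          # bit-flip mutation
                    i = random.randrange(GENOME_LEN)
                    child[i] ^= 1
                children.append(child)
            isl[:] = children
        if gen % MIGRATE_EVERY == 0 and n_islands > 1:   # ring migration of best
            best = [max(isl, key=onemax) for isl in islands]
            for i, isl in enumerate(islands):
                isl[-1] = best[(i - 1) % n_islands]
    return max(onemax(ind) for isl in islands for ind in isl)

# Race: evaluate candidate settings, drop the worst, seed a new one near the best
candidates = [(20, 2), (40, 4), (80, 8), (10, 16)]
for round_ in range(3):
    scores = {c: evolve_islands(*c, generations=20) for c in candidates}
    ranked = sorted(candidates, key=lambda c: scores[c], reverse=True)
    best_pop, best_isl = ranked[0]
    new = (max(5, best_pop + random.choice([-10, 10])),
           max(1, best_isl + random.choice([-1, 1])))
    candidates = ranked[:-1] + [new]
    print(f"Round {round_ + 1}: best setting {ranked[0]} score {scores[ranked[0]]}")
```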

Relevance:

30.00%

Publisher:

Abstract:

Linear adaptive channel equalization using the least mean square (LMS) algorithm and the recursive least-squares (RLS) algorithm is proposed for an innovative multi-user (MU) MIMO-OFDM wireless broadband communications system. The proposed equalization method adaptively compensates for the channel impairments caused by frequency selectivity in the propagation environment. Simulations of the proposed adaptive equalizer are conducted using a training sequence method to determine optimal performance through a comparative analysis. Results show an improvement of 0.15 in BER (at an SNR of 16 dB) when using adaptive equalization with the RLS algorithm compared to the case in which no equalization is employed. In general, adaptive equalization using the LMS and RLS algorithms is shown to be significantly beneficial for MU-MIMO-OFDM systems.
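The training-sequence approach can be illustrated with a minimal single-carrier LMS equalizer; the channel taps, noise level, filter length and step size below are assumed values, and the paper's system applies LMS/RLS in an MU-MIMO-OFDM setting rather than this toy scalar case.

```python
# Minimal LMS adaptive equalizer sketch trained on a known sequence.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_taps, mu = 2000, 11, 0.01

# BPSK training symbols through an assumed frequency-selective channel plus noise
symbols = rng.choice([-1.0, 1.0], size=n_train)
channel = np.array([0.8, 0.4, -0.2])
received = np.convolve(symbols, channel, mode="full")[:n_train]
received += 0.05 * rng.standard_normal(n_train)

w = np.zeros(n_taps)          # equalizer taps
delay = n_taps // 2           # decision delay to centre the combined response

for n in range(n_taps, n_train):
    x = received[n - n_taps + 1:n + 1][::-1]   # most recent samples first
    y = w @ x                                  # equalizer output
    e = symbols[n - delay] - y                 # error against the training symbol
    w += mu * e * x                            # LMS update

# Evaluate symbol decisions after training
errors = 0
for n in range(n_taps, n_train):
    x = received[n - n_taps + 1:n + 1][::-1]
    errors += (np.sign(w @ x) != symbols[n - delay])
print("Training-set symbol error rate:", errors / (n_train - n_taps))
```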

Relevance:

30.00%

Publisher:

Abstract:

Background: Recent clinical studies have demonstrated an emerging subgroup of head and neck cancers that are virally mediated. This disease appears to be a distinct clinical entity, with patients presenting younger and with more advanced nodal disease, having lower tobacco and alcohol exposure, and having highly radiosensitive tumours. This means they are living longer, often with the debilitating functional side effects of treatment. The primary objective of this study was to determine how virally mediated nasopharyngeal and oropharyngeal cancers respond to radiation therapy treatment. The aim was to determine risk categories and corresponding adaptive treatment management strategies to proactively manage these patients. Method/Results: 121 patients with virally mediated, node-positive nasopharyngeal or oropharyngeal cancer who received radiotherapy treatment with curative intent between 2005 and 2010 were studied. Relevant patient demographics, including age, gender, diagnosis, TNM stage, pre-treatment nodal size and dose delivered, were recorded. Each patient’s treatment plan was reviewed to determine whether another computed tomography (re-CT) scan was performed and at what time point (dose/fraction) this occurred. The justification for this re-CT was assigned to one of four categories: tumour and/or nodal regression, weight loss, both, or other. Patients who underwent a re-CT were further investigated to determine whether a new plan was calculated. If a re-plan was performed, the dosimetric effect was quantified by comparing dose volume histograms of planning target volumes and critical structures from the actual treatment delivered and the original treatment plan. Preliminary results demonstrated that 25/121 (20.7%) patients required a re-CT and that these re-CTs were performed between fractions 20 and 25 of treatment. The justification for these re-CTs consisted of a combination of tumour and/or nodal regression and weight loss. 16/25 (13.2%) patients had a re-plan calculated. Nine (7.4%) of these re-plans were implemented clinically owing to the resultant dosimetric effect. The data collected from this assessment were statistically analysed to identify the major determining factors for patients to undergo a re-CT and/or re-plan. Specific factors identified included nodal size and the timing of the required intervention (i.e. how and when a plan is to be adapted). These data were used to generate specific risk profiles that will form the basis of a biologically guided adaptive treatment management strategy for virally mediated head and neck cancer. Conclusion: Preliminary data indicate that virally mediated head and neck cancers respond significantly during radiation treatment (tumour and/or nodal regression and weight loss). Implications of this response are the potential underdosing or overdosing of the tumour and/or surrounding critical structures. This could lead to sub-optimal patient outcomes and compromised quality of life. Consequently, the development of adaptive treatment strategies that improve organ sparing for this patient group is important to ensure delivery of the prescribed dose to the tumour volume whilst minimising the dose received by surrounding critical structures. This could reduce side effects and improve overall patient quality of life. The risk profiles and associated adaptive treatment approaches developed in this study will be tested prospectively in the clinical setting in Phase 2 of this investigation.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Clinical investigation has revealed a subgroup of head and neck cancers that are virally mediated. The relationship between nasopharyngeal cancer and Epstein-Barr Virus (EBV) has long been established and, more recently, the association between oropharyngeal cancer and Human Papillomavirus (HPV) has been revealed1,2. These cancers often present with nodal involvement and generally respond well to radiation treatment, evidenced by tumour regression1. This results in the need for treatment plan adaptation or re-planning in a subset of patients. Adaptive techniques allow the target region of the radiotherapy treatment plan to be altered in accordance with treatment-induced changes to ensure that under- or over-dosing does not occur3. They also assist in limiting potential overdosing of surrounding critical normal tissues4. We sought to identify a high-risk group based on nodal size to be evaluated in a future prospective adaptive radiotherapy trial. Method: Between 2005 and 2010, 121 patients with virally mediated, node-positive nasopharyngeal (EBV-positive) or oropharyngeal (HPV-positive) cancers receiving curative-intent radiotherapy treatment were reviewed. Patients were analysed based on the maximum size of the dominant node at diagnosis, with a view to grouping them into varying risk categories to determine the need for re-planning. The frequency and timing of the re-planning scans were also evaluated. Results: Sixteen nasopharyngeal and 105 oropharyngeal tumours were reviewed. Twenty-five (21%) patients underwent a re-planning CT at a median of 22 (range, 0-29) fractions, with one patient requiring re-planning prior to the commencement of treatment. Based on the analysis, patients were subsequently placed into risk categories: ≤35 mm (Group 1), 36-45 mm (Group 2), ≥46 mm (Group 3). Re-planning CTs were performed in Group 1 in 8/68 (11.8%), Group 2 in 4/28 (14.3%) and Group 3 in 13/25 (52%) of patients. Conclusion: In this series, patients with virally mediated head and neck cancer and nodal size ≥46 mm appear to be a high-risk group for requiring re-planning during a course of curative radiotherapy. This finding will now be tested in a prospective adaptive radiotherapy study. ‘Real World’ Implications: This research identifies predictive factors for those patients with virally mediated head and neck cancer who will benefit most from treatment adaptation. This will assist in minimising the side effects experienced by these patients, thereby improving their quality of life after treatment.

Relevance:

30.00%

Publisher:

Abstract:

Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These numerical implementations have used fixed stepsizes, but there are many situations when a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. It has been necessary, in the deterministic case, to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional-integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers, from a digital filter theory point of view, to PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches compared with the regular stepsize change strategy.
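A compact, hedged sketch of PI stepsize control for an embedded pair is shown below. The toy scalar SDE, the Euler-Maruyama/Milstein pair and the controller gains are illustrative assumptions, not the embedded stochastic Runge-Kutta or balanced-method pairs used in the paper.

```python
# PI stepsize control sketch: an error estimate from a low/high-order pair drives
# the step size via h_new = h * (1/err)^kI * (err_prev/err)^kP (errors scaled by tol).
import numpy as np

def pi_controlled_path(x0=1.0, t_end=1.0, h0=1e-3, tol=1e-3,
                       kI=0.3, kP=0.2, order=1.0, seed=0):
    # Toy SDE: dX = -X dt + 0.1 X dW, stepped with Euler-Maruyama (low order)
    # and Milstein (higher order) sharing the same Brownian increment.
    rng = np.random.default_rng(seed)
    a = lambda x: -x            # drift
    b = lambda x: 0.1 * x       # diffusion
    db = lambda x: 0.1          # derivative of diffusion (for Milstein)

    t, x, h = 0.0, x0, h0
    err_prev = 1.0
    while t < t_end:
        h = min(h, t_end - t)
        dW = np.sqrt(h) * rng.standard_normal()
        x_low = x + a(x) * h + b(x) * dW
        x_high = x_low + 0.5 * b(x) * db(x) * (dW**2 - h)
        err = max(abs(x_high - x_low) / tol, 1e-10)

        if err <= 1.0:                       # accept the step
            t, x = t + h, x_high
            # PI controller: combine current and previous error ratios
            h *= (1.0 / err) ** (kI / order) * (err_prev / err) ** (kP / order)
            err_prev = err
        else:                                # reject and shrink (bounded) the step
            # A full implementation would reuse the Brownian increment via a
            # Brownian bridge on rejection; this sketch simply redraws it.
            h *= max(0.5, (1.0 / err) ** (kI / order))
    return x

print("X(T) with PI-controlled steps:", pi_controlled_path())
```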