906 results for automated thematic analysis of textual data


Relevance: 100.00%

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

In the past decade, the use of ambulance data to inform estimates of the prevalence of nonfatal heroin overdose has increased. These data can assist public health policymakers, law enforcement agencies, and health providers in planning and allocating resources. This study examined the 672 ambulance attendances at nonfatal heroin overdoses in Queensland, Australia, in 2000. The gender distribution showed a typical 70/30 male-to-female ratio, and equal numbers of persons with nonfatal heroin overdose were aged 15-24 years and 25-34 years. Police were present in only 1 in 6 cases, and 28.1% of patients reported using drugs alone. Ambulance data are proving to be a valuable population-based resource for describing the incidence and characteristics of nonfatal heroin overdose episodes. Future studies could focus on the differences between nonfatal and fatal heroin overdose samples.

Relevance: 100.00%

Abstract:

Objective: To assess whether trends in mortality from heart failure (HF) in Australia are due to a change in awareness of the condition or to real changes in its epidemiology. Methods: We carried out a retrospective analysis of official national mortality data for 1997 to 2003. A death was attributed to HF if the death certificate mentioned HF either as the underlying cause of death (UCD) or among the contributory factors. Findings: From a total of 907 242 deaths, heart failure was coded as the UCD for 29 341 (3.2%) and was mentioned anywhere on the death certificate for 135 268 (14.9%). Between 1997 and 2003, there were decreases in the absolute numbers of deaths and in the age-specific and age-standardized mortality rates for HF, whether as UCD or mentioned anywhere, for both sexes. HF was mentioned for 24.6% and 17.8% of deaths attributed to ischaemic heart disease and circulatory disease, respectively, and these proportions remained unchanged over the study period. In addition, HF as UCD accounted for 8.3% of deaths attributed to circulatory disease, and this did not change materially from 1997 to 2003. Conclusion: The decline in mortality from HF, measured as either the number of deaths or the rate, probably reflects a real change in the epidemiology of HF. Population-based studies are required to determine accurately the contributions of changes in incidence, survival and demographic factors to the evolving epidemiology of HF.
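
The rate comparisons in this abstract rest on direct age standardization. As a rough illustrative sketch only, none of the figures below are from the study, a directly standardized rate is the weighted sum of age-specific rates, with weights taken from a standard population:

# Illustrative direct age standardization; all numbers below are hypothetical,
# not taken from the study.
age_bands    = ["45-54", "55-64", "65-74", "75-84", "85+"]
deaths       = [120, 540, 2100, 6800, 9200]           # hypothetical HF deaths per band
person_years = [2.6e6, 1.9e6, 1.2e6, 0.6e6, 0.2e6]    # hypothetical population at risk
std_weights  = [0.30, 0.28, 0.22, 0.14, 0.06]         # hypothetical standard-population weights

# Age-specific rates per 100 000 person-years
age_specific = [d / p * 1e5 for d, p in zip(deaths, person_years)]

# Directly age-standardized rate: weighted sum of the age-specific rates
asr = sum(r * w for r, w in zip(age_specific, std_weights))
print(f"Age-standardized HF mortality rate: {asr:.1f} per 100 000")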

Relevance: 100.00%

Abstract:

Objective: To describe the workload profile in a network of Australian skin cancer clinics. Design and setting: Analysis of billing data for the first 6 months of 2005 in a primary-care skin cancer clinic network, consisting of seven clinics and staffed by 20 doctors, located in the Northern Territory, Queensland and New South Wales. Main outcome measures: Consultation to biopsy ratio (CBR); biopsy to treatment ratio (BTR); number of benign naevi excised per melanoma (number needed to treat [NNT]). Results: Of 69 780 billed activities, 34 622 (49.6%) were consultations, 19 358 (27.7%) biopsies, 8055 (11.5%) surgical excisions, 2804 (4.0%) additional surgical repairs, 1613 (2.3%) non-surgical treatments of cancers and 3328 (4.8%) treatments of premalignant or non-malignant lesions. A total of 6438 cancers were treated (116 melanomas by excision, 4709 non-melanoma skin cancers [NMSCs] by excision, and 1613 NMSCs non-surgically); 5251 (65.2%) surgical wounds were repaired by direct suture, 2651 (32.9%) by a flap (of which 44.8% were simple flaps), 42 (0.5%) by wedge excision and 111 (1.4%) by grafts. The CBR was 1.79, the BTR was 3.1 and the NNT was 28.6. Conclusions: In this network of Australian skin cancer clinics, one in three biopsies identified a skin cancer (BTR, 3.1), and about 29 benign lesions were excised per melanoma (NNT, 28.6). The estimated NNT was similar to that reported previously in general practice. More data are needed on health outcomes, including effectiveness of treatment and surgical repair.
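
For orientation, the headline ratios are simple quotients of the billing counts reported above. A minimal sketch using only the figures given in the abstract; the BTR and NNT denominators are not broken out here, so only the CBR is recomputed:

# Recomputing the consultation-to-biopsy ratio from the counts in the abstract.
consultations = 34_622
biopsies = 19_358

cbr = consultations / biopsies
print(f"CBR: {cbr:.2f}")   # ~1.79, matching the reported value

# For reference (not recomputable from the abstract alone):
# BTR (biopsy-to-treatment ratio)            -> reported as 3.1
# NNT (benign naevi excised per melanoma)    -> reported as 28.6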

Relevance: 100.00%

Abstract:

Around 50 per cent of men with diabetes experience erectile dysfunction. Much of the literature focuses on quality of life measures with heterosexual men in monogamous relationships. This study explores gay and bisexual men's experiences of sex and diabetes. Thirteen interviews were analysed and three themes identified: erectile problems; other 'physical' problems; and disclosing diabetes to sexual partners. Findings highlight a range of sexual problems experienced by non-heterosexual men and the significance of the cultural and relational context in which they are situated. The personalized care promised by the UK government should acknowledge the diversity of sexual practices which might be affected by diabetes.

Relevance: 100.00%

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG encompasses both the methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and the techniques required to process these measurements and extract useful information from them. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, its progress hindered by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta and beta bands that are commonly referred to in both academic and lay publications. Other efforts have centred on generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which give rise to the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the view that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way, as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to demonstrate that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in the MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio. Because the magnetic flux variations resulting from actual cortical processes can be extremely small, the measuring devices used in MEG are necessarily extremely sensitive; the unfortunate side effect is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings, which has a number of notable drawbacks of its own: in particular, it is difficult to synchronise high-frequency activity that might be of interest, and such signals are often cancelled out by the averaging process. Further problems are the high cost and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has hitherto been restricted to large institutions able to afford the costs associated with procuring and maintaining them.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
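
As an illustrative sketch of the dynamical-systems framing described above (not code from the thesis), a single-channel signal can be lifted into a reconstructed state space by time-delay embedding before any linear spectral assumptions are made; the signal, delay and embedding dimension below are arbitrary:

import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Time-delay embedding of a 1-D signal into a dim-dimensional trajectory."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Surrogate "single-channel recording": two incommensurate oscillations plus noise
t = np.arange(0, 20, 0.01)
x = np.sin(t) + 0.5 * np.sin(2.3 * t + 1.0) + 0.1 * np.random.randn(len(t))

trajectory = delay_embed(x, dim=3, tau=5)
print(trajectory.shape)   # (n_points, 3): points on the reconstructed attractor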

Relevance: 100.00%

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

The purpose of this paper is to investigate the technological development of electronic inventory solutions from the perspective of patent analysis. We first applied the international patent classification to identify the top categories of data processing technologies and their corresponding top patenting countries. We then identified the core technologies by calculating the patent citation strength for each patent and applying a standard-deviation criterion. To eliminate core innovations with no reference relationships to the other core patents, relevance strengths between core technologies were also evaluated. Our findings provide market intelligence not only for the research and development community, but also for decision making on advanced inventory solutions.
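
A hedged sketch of the kind of citation-strength screening the abstract describes: count forward citations per patent and flag as "core" those whose strength exceeds the mean by more than one standard deviation. The exact criterion and data used in the paper may differ; the citation pairs below are invented:

from collections import Counter
from statistics import mean, stdev

# (citing_patent, cited_patent) pairs -- invented for illustration
citations = [
    ("P1", "P7"), ("P2", "P7"), ("P3", "P7"), ("P4", "P7"),
    ("P1", "P8"), ("P2", "P8"),
    ("P5", "P9"),
]

strength = Counter(cited for _, cited in citations)   # forward-citation counts
values = list(strength.values())
threshold = mean(values) + stdev(values)              # one-standard-deviation criterion

core_patents = [p for p, s in strength.items() if s > threshold]
print(dict(strength), "-> core patents:", core_patents)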

Relevance: 100.00%

Abstract:

Background: The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. Discussion: The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Summary: Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research. © 2013 Gale et al.; licensee BioMed Central Ltd.
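
For readers coming from quantitative work, the "charting" stage of the Framework Method amounts to building a case-by-theme matrix of summarised data. A minimal, hypothetical sketch of that structure only; the cases, themes and summaries are invented, and the method itself is a manual, interpretive process rather than an automated one:

import pandas as pd

# Coded excerpt summaries -- invented for illustration
coded = [
    {"case": "Participant 01", "theme": "Access to care",  "summary": "long waits for appointments"},
    {"case": "Participant 01", "theme": "Self-management", "summary": "keeps a symptom diary"},
    {"case": "Participant 02", "theme": "Access to care",  "summary": "relies on a walk-in clinic"},
    {"case": "Participant 02", "theme": "Self-management", "summary": "unsure how to adjust medication"},
]

# Framework (charting) matrix: one row per case, one column per theme
framework_matrix = pd.DataFrame(coded).pivot(index="case", columns="theme", values="summary")
print(framework_matrix)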

Relevance: 100.00%

Abstract:

Microarray technology provides a high-throughput technique for studying gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data, so effective data processing and analysis are critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed and the best combinations of gene selection methods and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or with different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species, and improved sets of cell cycle-regulated genes were identified as a result. The last part of the dissertation explores the effectiveness of wavelet transforms for the task of clustering: discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to produce clusters that are biologically more meaningful.
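
As a worked example of two of the pooling techniques named above, Fisher's inverse chi-square method and Stouffer's Z-transform method combine independent p-values for the same gene; the p-values below are illustrative, not from the dissertation, and the Liptak-Stouffer variant simply adds study weights to the Stouffer sum:

import numpy as np
from scipy import stats

p_values = np.array([0.04, 0.10, 0.03])   # one gene, three independent studies (illustrative)

# Fisher's inverse chi-square: -2 * sum(ln p) ~ chi-square with 2k d.f. under the null
fisher_stat = -2 * np.sum(np.log(p_values))
fisher_p = stats.chi2.sf(fisher_stat, df=2 * len(p_values))

# Stouffer's Z-transform: sum the normal quantiles of (1 - p) and rescale
z_scores = stats.norm.isf(p_values)       # z_i = Phi^{-1}(1 - p_i)
stouffer_p = stats.norm.sf(z_scores.sum() / np.sqrt(len(p_values)))

print(f"Fisher pooled p:   {fisher_p:.4f}")
print(f"Stouffer pooled p: {stouffer_p:.4f}")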