25 results for Content analysis method

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

A large number of studies have been devoted to modeling the content of, and interactions between, users on Twitter. In this paper, we propose a method inspired by Social Role Theory (SRT), which assumes that a user behaves differently in different roles during the generation of Twitter content. We consider the two most distinctive social roles on Twitter: the originator, who posts original messages, and the propagator, who retweets or forwards messages from others. In addition, we consider role-specific social interactions, especially implicit interactions between users who share common interests. All of these elements are integrated into a novel regularized topic model. We evaluate the proposed method on real Twitter data. The results show that our method is more effective than existing ones that do not distinguish social roles. Copyright 2013 ACM.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we explore the idea of social role theory (SRT) and propose a novel regularized topic model which incorporates SRT into the generative process of social media content. We assume that a user can play multiple social roles, and each social role serves to fulfil different duties and is associated with a role-driven distribution over latent topics. In particular, we focus on social roles corresponding to the most common social activities on social networks. Our model is instantiated on microblogs, i.e., Twitter and community question-answering (cQA), i.e., Yahoo! Answers, where social roles on Twitter include "originators" and "propagators", and roles on cQA are "askers" and "answerers". Both explicit and implicit interactions between users are taken into account and modeled as regularization factors. To evaluate the performance of our proposed method, we have conducted extensive experiments on two Twitter datasets and two cQA datasets. Furthermore, we also consider multi-role modeling for scientific papers where an author's research expertise area is considered as a social role. A novel application of detecting users' research interests through topical keyword labeling based on the results of our multi-role model has been presented. The evaluation results have shown the feasibility and effectiveness of our model.
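The role-regularized model described above is not available in stock libraries, but the underlying topic-modelling step it builds on can be sketched with scikit-learn's standard LDA; the toy documents below are invented for illustration, and the role-driven regularization itself is not implemented here.

```python
# Minimal topic-model baseline: plain LDA over a tiny invented corpus.
# The role-specific regularization from the paper is NOT part of this sketch.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "retweet breaking news about the election",
    "original post about my machine learning project",
    "sharing this great article on deep learning",
    "election results are coming in tonight",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # per-document distribution over 2 topics

# Each row is a probability distribution over the latent topics.
print(doc_topics.shape)  # (4, 2)
```

A role-aware extension would add regularization terms tying each user's topic distribution to role-specific priors, which requires a custom inference loop rather than the stock estimator.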

Relevance:

100.00%

Publisher:

Abstract:

E-atmospherics have often been analyzed in terms of functional features, leaving the link between their characteristics and social capital co-creation as a fertile research area. Prior research has demonstrated the capacity of e-atmospherics to modify shopping habits towards deeper engagement. Little is known about how processes and cues emerging from the social aspects of lifestyle influence purchasing behavior. The anatomy of the social dimension and ICT is the focus of this research, where attention is devoted to unpacking the meanings and types of online mundane social capital creation. Taking a cross-product/services approach to better investigate the impact of social construction, our approach also involves both an emerging and a mature market, where exploratory content analyses of landing pages are carried out on Turkish and French web sites, respectively. We contend that by comprehending social capital, daily micro practices, habits and routine, a better and deeper understanding of e-atmospherics' incumbent and potential effects on multi-national e-customers will be acquired.

Relevance:

100.00%

Publisher:

Abstract:

Background - This study investigates the coverage of adherence to medicine by the UK and US newsprint media. Adherence to medicine is recognised as an important issue facing healthcare professionals, and the newsprint media is a key source of health information; however, little is known about newspaper coverage of medication adherence. Methods - A search of the newspaper database Nexis®UK from 2004–2011 was performed. Content analysis of newspaper articles which referenced medication adherence from the twelve highest-circulating UK and US daily newspapers and their Sunday equivalents was carried out. A second researcher coded a 15% sample of newspaper articles to establish the inter-rater reliability of coding. Results - Searches of newspaper coverage of medication adherence in the UK and US yielded 181 relevant articles for each country. There was a large increase in the number of scientific articles on medication adherence in PubMed® over the study period; however, this was not reflected in the frequency of newspaper articles published on medication adherence. UK newspaper articles were significantly more likely to report the benefits of adherence (p = 0.005), whereas US newspaper articles were significantly more likely to report adherence issues in the elderly population (p = 0.004) and adherence associated with diseases of the central nervous system (p = 0.046). The most commonly reported barriers to adherence were patient factors (e.g. poor memory, beliefs and age), whereas the most commonly reported facilitators were medication factors, including simplified regimens, shorter treatment duration and combination tablets. HIV/AIDS was the single most frequently cited disease (reported in 20% of newspaper articles). Poor-quality reporting of medication adherence was identified in 62% of newspaper articles. Conclusion - Adherence is not well covered in the newspaper media despite a significant presence in the medical literature. The mass media have the potential to help educate and shape the public's knowledge regarding the importance of medication adherence; this potential is not being realised at present.
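The dual-coding reliability check described above (a second researcher coding a 15% sample) is typically quantified with Cohen's kappa, a chance-corrected agreement statistic; a minimal sketch with invented coder labels:

```python
# Cohen's kappa on a hypothetical double-coded sample of articles.
# The category labels below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

coder_a = ["benefit", "barrier", "barrier", "neutral", "benefit", "barrier"]
coder_b = ["benefit", "barrier", "neutral", "neutral", "benefit", "barrier"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(round(kappa, 2))  # → 0.75
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the kind of threshold an inter-rater check like this aims to clear.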

Relevance:

100.00%

Publisher:

Abstract:

A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural networks and a standard support vector machine.
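scikit-learn ships no RVM estimator, but the key feature the abstract highlights (predictions delivered with a probabilistic confidence interval) can be sketched with BayesianRidge, another sparse-Bayesian-flavoured linear model whose predictions come with a standard deviation. The line intensities and concentrations below are synthetic; this is a stand-in illustration, not the paper's method.

```python
# Calibration-curve regression with predictive uncertainty, RVM-style.
# BayesianRidge stands in for RVM; all data here are synthetic.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)
line_intensity = rng.uniform(0.1, 1.0, size=(30, 3))   # 3 analytical lines
concentration = (line_intensity @ np.array([2.0, 1.0, 0.5])
                 + rng.normal(0, 0.02, 30))            # synthetic "certified" values

model = BayesianRidge()
model.fit(line_intensity, concentration)

# Predictive mean and standard deviation give a confidence interval,
# analogous to the probabilistic output described in the abstract.
mean, std = model.predict(line_intensity[:5], return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std
print((lower < upper).all())
```

The interval width directly reflects how uncertain the model is about each spectrum, which is the practical benefit the abstract claims over point-estimate regressors.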

Relevance:

100.00%

Publisher:

Abstract:

Data fluctuation in multiple measurements of Laser Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance the analysis accuracy is to improve the quality and consistency of the emission signal, such as by averaging the spectral signals or spectrum standardization over a number of laser shots. The proposed method focuses more on how to enhance the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation according to the statistical distribution of measured spectral data. Through the improved segmented weighting function, the information on the spectral data in the normal distribution will be retained in the regression model while the information on the outliers will be restrained or removed. Copper elemental concentration analysis experiments of 16 certified standard brass samples were carried out. The average value of relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness compared with the quantitative analysis methods based on Partial Least Squares (PLS) regression, standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function had better comprehensive performance in model robustness and convergence speed, compared with the four known weighting functions.
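The segmented-weighting idea above (keep full weight for residuals that look normally distributed, restrain or remove outliers) can be illustrated outside the LS-SVM setting with a plain iteratively reweighted least-squares fit using a Huber-style segmented weight function. This is a sketch on synthetic data, not the paper's RLS-SVM or its specific weighting function.

```python
# Iteratively reweighted least squares with a segmented (Huber-style) weight:
# weight 1 for small residuals, c/r for large ones, so outlier shots are
# down-weighted rather than dominating the fit. Synthetic data throughout.
import numpy as np

def segmented_weights(residuals, c=1.345):
    """1 inside the threshold, c/r outside (robustly scaled residuals)."""
    scale = np.median(np.abs(residuals)) / 0.6745 + 1e-12
    r = np.abs(residuals) / scale
    return np.where(r <= c, 1.0, c / r)

def robust_fit(x, y, iters=10):
    """Robust line fit: refit weighted least squares until weights settle."""
    X = np.column_stack([x, np.ones_like(x)])
    w = np.ones_like(y)
    for _ in range(iters):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        w = segmented_weights(y - X @ beta)
    return beta

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)
y[:3] += 15.0                      # simulate three outlier "shots"

slope, intercept = robust_fit(x, y)
print(slope, intercept)            # close to the true 2.0 and 1.0
```

An ordinary least-squares fit on the same data would be pulled visibly off the true line by the three contaminated points; the segmented weights shrink their influence to near zero.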

Relevance:

100.00%

Publisher:

Abstract:

Most previous studies on intellectual capital disclosures have been conducted in the context of developed countries. There is very limited empirical evidence in this area from emerging economies in general and Africa in particular. This paper is one of the early attempts in this regard. The main purpose of this study is to examine the extent and nature of intellectual capital disclosures in the 'Top 20' South African companies over a five-year period (2002–2006). The study uses the content analysis method to scrutinise the patterns of intellectual capital disclosures during the study period. The results show that intellectual capital disclosures in South Africa increased over the five-year study period, with certain firms reporting considerably more than others. Of the three broad categories of intellectual capital disclosures, human capital appears to be the most popular category. This finding stands in sharp contrast to previous studies in this area, where external capital was found to be the most popular category.

Relevance:

100.00%

Publisher:

Abstract:

Bone marrow mesenchymal stem cells (MSCs) promote nerve growth and functional recovery in animal models of spinal cord injury (SCI) to varying levels. The authors have tested high-content screening to examine the effects of MSC-conditioned medium (MSC-CM) on neurite outgrowth from the human neuroblastoma cell line SH-SY5Y and from explants of chick dorsal root ganglia (DRG). These analyses were compared to previously published methods that involved hand-tracing individual neurites. Both methods demonstrated that MSC-CM promoted neurite outgrowth. Each showed that the proportion of SH-SY5Y cells with neurites increased by ~200% in MSC-CM within 48 h, and that the number of neurites per SH-SY5Y cell was significantly increased in MSC-CM compared with control medium. For high-content screening, the analysis was performed within minutes, testing multiple samples of MSC-CM and in each case measuring >15,000 SH-SY5Y cells. In contrast, the manual measurement of neurite outgrowth from >200 SH-SY5Y cells in a single sample of MSC-CM took at least 1 h. High-content analysis provided additional measures of increased neurite branching in MSC-CM compared with control medium. MSC-CM was also found to stimulate neurite outgrowth in DRG explants using either method. The application of high-content analysis was less well optimized for measuring neurite outgrowth from DRG explants than from SH-SY5Y cells.

Relevance:

90.00%

Publisher:

Abstract:

Graphic depiction is an established method for academics to present concepts about theories of innovation. These expressions have been adopted by policy-makers, the media and businesses. However, there has been little research on the extent of their usage or effectiveness ex-academia. In addition, innovation theorists have ignored this area of study, despite the communication of information about innovation being acknowledged as a major determinant of success for corporate enterprise. The thesis explores some major themes in the theories of innovation and compares how graphics are used to represent them. The thesis examines the contribution of visual sociology and graphic theory to an investigation of a sample of graphics. The methodological focus is a modified content analysis. The following expressions are explored: check lists, matrices, maps and mapping in the management of innovation; models, flow charts, organisational charts and networks in the innovation process; and curves and cycles in the representation of performance and progress. The main conclusion is that academia is leading the way in usage as well as novelty. The graphic message is switching from prescription to description. The computerisation of graphics has created a major role for the information designer. It is recommended that use of the graphic representation of innovation should be increased in all domains, though it is conceded that its content and execution need to improve, too. Education of graphic 'producers', 'intermediaries' and 'consumers' will play a part in this, as will greater exploration of diversity, novelty and convention. Work has begun to tackle this and suggestions for future research are made.

Relevance:

90.00%

Publisher:

Abstract:

This article is aimed primarily at eye care practitioners who are undertaking advanced clinical research, and who wish to apply analysis of variance (ANOVA) to their data. ANOVA is a data analysis method of great utility and flexibility. This article describes why and how ANOVA was developed, the basic logic which underlies the method and the assumptions that the method makes for it to be validly applied to data from clinical experiments in optometry. The application of the method to the analysis of a simple data set is then described. In addition, the methods available for making planned comparisons between treatment means and for making post hoc tests are evaluated. The problem of determining the number of replicates or patients required in a given experimental situation is also discussed. Copyright (C) 2000 The College of Optometrists.
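A minimal one-way ANOVA of the kind the article describes can be run with SciPy; the three treatment groups below are invented illustration data, and a post hoc test (e.g. Tukey HSD) would follow to locate which means differ.

```python
# One-way ANOVA on three invented treatment groups.
from scipy import stats

group_a = [12.1, 11.8, 12.4, 12.0, 11.9]
group_b = [12.2, 12.5, 12.1, 12.3, 12.4]
group_c = [13.1, 13.4, 12.9, 13.2, 13.0]   # deliberately shifted upward

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)

# A small p-value says at least one group mean differs; it does not say
# which, hence the planned comparisons and post hoc tests the article covers.
print(p_value < 0.05)
```

Note that the validity of the result rests on the assumptions the article discusses (normality, homogeneity of variance, independent observations), which should be checked before interpreting the p-value.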

Relevance:

90.00%

Publisher:

Abstract:

Purpose – Previous reviews of Corporate Social Reporting (CSR) literature have tended to focus on developed economies. The aim of this study is to extend reviews of CSR literature to emerging economies. Design/methodology/approach – A desk-based research method, using a classification framework of three categories. Findings – Most CSR studies in emerging economies have concentrated on the Asia-Pacific and African regions and are descriptive in nature, have used content analysis methods, and have measured the extent and volume of disclosures contained within annual reports. Such studies provide only indirect explanations of the reasons behind CSR adoption but, of late, a handful of studies have started to probe managerial motivations behind CSR directly through in-depth interviews, finding that CSR agendas in emerging economies are largely driven by external forces, namely pressures from parent companies, the international market and international agencies.

Relevance:

90.00%

Publisher:

Abstract:

A content analysis examined the way majorities and minorities are represented in the British press. An analysis of the headlines of five British newspapers, over a period of five years, revealed that the words ‘majority’ and ‘minority’ appeared 658 times. Majority headlines were most frequent (66%), more likely to emphasize the numerical size of the majority, to link majority status with political groups, to be described with positive evaluations, and to cover political issues. By contrast, minority headlines were less frequent (34%), more likely to link minority status with ethnic groups and to other social issues, and less likely to be described with positive evaluations. The implications of examining how real-life majorities and minorities are represented for our understanding of experimental research are discussed.
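Whether the 66% vs 34% split departs from chance can be checked with a simple binomial test. The count below is reconstructed from the reported figures (66% of 658 headlines), so treat it as approximate.

```python
# Binomial test: is the majority/minority headline split different from 50/50?
# Count reconstructed from the abstract's reported 658 headlines at 66%.
from scipy.stats import binomtest

majority = round(0.66 * 658)          # ≈ 434 majority headlines
result = binomtest(majority, n=658, p=0.5)
print(result.pvalue < 0.001)
```

With a sample this large, a 66/34 split is far outside what sampling noise around 50/50 would produce, consistent with the asymmetry the study reports.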

Relevance:

90.00%

Publisher:

Abstract:

Recently, we introduced a new 'GLM-beamformer' technique for MEG analysis that enables accurate localisation of both phase-locked and non-phase-locked neuromagnetic effects, and their representation as statistical parametric maps (SPMs). This provides a useful framework for comparison of the full range of MEG responses with fMRI BOLD results. This paper reports a 'proof of principle' study using a simple visual paradigm (static checkerboard). The five subjects each underwent both MEG and fMRI paradigms. We demonstrate, for the first time, the presence of a sustained (DC) field in the visual cortex, and its co-localisation with the visual BOLD response. The GLM-beamformer analysis method is also used to investigate the main non-phase-locked oscillatory effects: an event-related desynchronisation (ERD) in the alpha band (8-13 Hz) and an event-related synchronisation (ERS) in the gamma band (55-70 Hz). We show, using SPMs and virtual electrode traces, the spatio-temporal covariance of these effects with the visual BOLD response. Comparisons between MEG and fMRI data sets generally focus on the relationship between the BOLD response and the transient evoked response. Here, we show that the stationary field and changes in oscillatory power are also important contributors to the BOLD response, and should be included in future studies on the relationship between neuronal activation and the haemodynamic response. © 2005 Elsevier Inc. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

Purpose: Most published surface wettability data are based on hydrated materials and are dominated by the air-water interface. Water-soluble species with hydrophobic domains (such as surfactants) interact directly with the hydrophobic domains in the lens polymer. Characterisation of the relative polar and non-polar fractions of the dehydrated material provides an additional approach to surface analysis. Method: Probe liquids (water and diiodomethane) were used to characterise the polar and dispersive components of the surface energies of dehydrated lenses using the method of Owens and Wendt. A range of conventional and silicone hydrogel soft lenses was studied. The polar fraction (i.e. polar/total) of surface energy was used as a basis for the study of the structural effects that influence surfactant persistence on the lens surface. Results: When plotted against the water content of the hydrated lens, polar fraction of surface energy (PFSE) values of the dehydrated lenses fell into two rectilinear bands. One of these bands covered PFSE values ranging from 0.4 to 0.8 and contained only conventional hydrogels, with two notable additions: the plasma-coated silicone hydrogels lotrafilcon A and B. The second band covered PFSE values ranging from 0.04 to 0.28 and contained only silicone hydrogels. Significantly, the silicone hydrogel lenses with the lowest PFSE values (p<0.15) are found to be prone to lipid deposition during wear. Additionally, more hydrophobic surfactants were found to be more persistent on lenses with lower PFSE values. Conclusions: Measurement of the polar fraction of surface energy provides an important mechanistic insight into the surface interactions of silicone hydrogels.
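The two-liquid Owens and Wendt calculation behind the PFSE values can be sketched as a small linear solve. The contact angles below are invented; the liquid surface-tension components are commonly used literature values (mN/m: water 72.8 total, 21.8 dispersive, 51.0 polar; diiodomethane treated as purely dispersive at 50.8), and both should be checked against the source tables before real use.

```python
# Owens-Wendt: gamma_L*(1 + cos theta) = 2*(sqrt(gd_S*gd_L) + sqrt(gp_S*gp_L)).
# Two probe liquids give a 2x2 linear system in [sqrt(gd_S), sqrt(gp_S)].
import numpy as np

liquids = {                    # (total, dispersive, polar), mN/m
    "water": (72.8, 21.8, 51.0),
    "diiodomethane": (50.8, 50.8, 0.0),
}
theta = {"water": 65.0, "diiodomethane": 40.0}   # hypothetical angles, degrees

A, b = [], []
for name, (g, gd, gp) in liquids.items():
    A.append([2 * np.sqrt(gd), 2 * np.sqrt(gp)])
    b.append(g * (1 + np.cos(np.radians(theta[name]))))

x = np.linalg.solve(np.array(A), np.array(b))
gd_s, gp_s = x[0] ** 2, x[1] ** 2                # dispersive and polar parts

pfse = gp_s / (gd_s + gp_s)                      # polar fraction of surface energy
print(round(pfse, 2))
```

For these hypothetical angles the PFSE lands around 0.2, i.e. within the 0.04-0.28 band the abstract associates with silicone hydrogels; different angles would place a material in the conventional-hydrogel band instead.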

Relevance:

90.00%

Publisher:

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications; others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
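The core data structure of Hierarchical Task Analysis, a goal decomposed into subtasks whose leaves carry the information an operator needs, can be sketched as a small tree. All task names and information needs below are invented for illustration; they are not drawn from the thesis's case studies.

```python
# Toy Hierarchical Task Analysis tree: each task decomposes into subtasks,
# and leaf tasks record the operator information requirements they imply.
from dataclasses import dataclass, field

@dataclass
class Task:
    goal: str
    info_needs: list = field(default_factory=list)   # operator display needs
    subtasks: list = field(default_factory=list)

    def leaf_info_needs(self):
        """Collect information requirements from all leaf tasks, depth-first."""
        if not self.subtasks:
            return list(self.info_needs)
        needs = []
        for sub in self.subtasks:
            needs.extend(sub.leaf_info_needs())
        return needs

plan = Task("Control reactor temperature", subtasks=[
    Task("Monitor temperature", info_needs=["current temp", "setpoint"]),
    Task("Adjust coolant flow", subtasks=[
        Task("Check valve position", info_needs=["valve %"]),
        Task("Set new flow rate", info_needs=["flow rate", "trend"]),
    ]),
])

# The collected leaf needs seed a VDU display specification.
print(plan.leaf_info_needs())
```

Walking the tree this way mirrors the thesis's aim: the decomposition itself yields the list of information items a display must present for each operator goal.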