15 results for content analysis and indexing – thesauruses general terms

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

E-atmospherics have often been analyzed in terms of functional features, leaving the link between their characteristics and social capital co-creation as a fertile research area. Prior research has demonstrated the capacity of e-atmospherics to modify shopping habits towards deeper engagement. Little is known about how processes and cues emerging from the social aspects of lifestyle influence purchasing behavior. The anatomy of the social dimension and ICT is the focus of this research, where attention is devoted to unpacking the meanings and types of mundane online social capital creation. Taking a cross-product/services approach to better investigate the impact of social construction, our approach also involves both an emerging and a mature market: exploratory content analyses of landing pages are carried out on Turkish and French web sites, respectively. We contend that by comprehending social capital, daily micro-practices, habits and routine, a better and deeper understanding of the incumbent and potential effects of e-atmospherics on multi-national e-customers will be acquired.

Relevance:

100.00%

Publisher:

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we show, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing which differs from the commonly known optimum for noise performance, corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerically modelling the Fokker-Planck equation with the MMC simulation technique. Moreover, applying the MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and the BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
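The motivation for multicanonical Monte Carlo above is that distribution tails (and hence BER) are unreachable by naive sampling. As a minimal illustrative sketch (not the thesis code), the same idea can be shown with plain importance sampling: bias the sampling density toward the rare region and reweight by the likelihood ratio. All parameters here are invented for the example.

```python
# Sketch: estimate a rare tail probability P(X > threshold) for
# X ~ N(0,1). Naive Monte Carlo would need ~1/p samples to observe an
# event of probability p; sampling from N(shift, 1) centred on the tail
# and reweighting makes the estimate tractable.
import math
import random

def tail_prob_importance(threshold, n=200_000, shift=None, seed=1):
    """Estimate P(X > threshold), X ~ N(0,1), by sampling from
    N(shift, 1) and reweighting each hit with the likelihood ratio."""
    if shift is None:
        shift = threshold          # centre the biased density on the tail
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)  # draw from the biased density
        if y > threshold:
            # likelihood ratio N(0,1)/N(shift,1) evaluated at y
            acc += math.exp(-y * y / 2.0 + (y - shift) ** 2 / 2.0)
    return acc / n

# P(X > 5) is about 2.9e-7: invisible to naive sampling at this n,
# but recovered accurately by the biased estimator.
p_hat = tail_prob_importance(5.0)
```

MMC goes further by learning the biasing weights iteratively instead of fixing a shift in advance, which is what makes it suitable for unknown signal statistics.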

Relevance:

100.00%

Publisher:

Abstract:

The optometric profession in the UK has a major role in the detection, assessment and management of ocular anomalies in children between 5 and 16 years of age. This role complements a variety of associated screening services provided across several health care sectors. The review examines the evidence base for the content, provision and efficacy of these screening services in terms of the prevalence of anomalies such as refractive error, amblyopia, binocular vision defects and colour vision deficiency, and considers the consequences of their curtailment. Vision screening must focus on pre-school children if the aim is to detect and treat conditions that may lead to amblyopia; if the aim is instead to detect and correct significant refractive errors (those not likely to lead to amblyopia), it would be expedient for the optometric profession to act as the major provider of refractive (and colour vision) screening at 5-6 years of age. Myopia is the refractive error most likely to develop during primary school, presenting typically between 8 and 12 years of age, so screening at entry to secondary school is also warranted. Given the inevitable restriction on resources for health care, establishing screening at 5 and 11 years of age, with exclusion of any subsequent screening, is the preferred option. © 2004 The College of Optometrists.

Relevance:

100.00%

Publisher:

Abstract:

The topic of my research is consumer brand equity (CBE). My thesis is that the success or otherwise of a brand is better viewed from the consumers' perspective. I specifically focus on consumers as a unique group of stakeholders whose involvement with brands is crucial to the overall success of branding strategy. To this end, this research examines the constellation of ideas on brand equity that have hitherto been offered by various scholars. Through a systematic integration of the concepts and practices identified by these scholars (such as competitiveness, consumer searching, consumer behaviour, brand image, brand relevance and consumer perceived value), this research identifies CBE as a construct that is shaped, directed and made valuable by the beliefs, attitudes and subjective preferences of consumers. This is done by examining the criteria on the basis of which consumers evaluate brands and make brand purchase decisions. Understanding these criteria is crucial for several reasons. First, as the basis upon which consumers select brands changes with consumption norms and technology, understanding the consumer choice process will help in formulating branding strategy. Secondly, an understanding of these criteria will help in formulating a creative and innovative agenda for 'new brand' propositions. Thirdly, it will influence firms' ability to stimulate and mould the plasticity of demand for existing brands. In examining these three issues, this thesis presents a comprehensive account of CBE: the first issue deals with the content of CBE; the second addresses the development of a reliable and valid measuring instrument for CBE; the third examines the structural and statistical relationships between the factors of CBE and the consequences of CBE for consumer perceived value (CPV). Using LISREL-SIMPLIS 8.30, the study finds direct and significant links between consumer brand equity and consumers' value perceptions.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this research was to investigate the integration of computer-aided drafting and finite-element analysis in a linked computer-aided design procedure, and to develop the necessary software. The Bézier surface patch was used for surface representation to bridge the gap between the rather separate fields of drafting and finite-element analysis, because such surfaces are defined by analytical functions which allow systematic and controlled variation of shape and provide continuous derivatives up to any required degree. The objectives of this research were achieved by establishing: (i) a package which interprets the engineering drawings of plate and shell structures and prepares the Bézier net necessary for surface representation; (ii) a general-purpose stand-alone meshed-surface modelling package for surface representation of plates and shells using the Bézier surface patch technique; (iii) a translator which adapts the geometric description of plate and shell structures, as given by the meshed-surface modeller, to the form needed by the finite-element analysis package. The translator was extended to suit fan impellers by taking advantage of their sectorial symmetry. The linking processes were carried out for simple test structures and for simplified and actual fan impellers to verify the flexibility and usefulness of the linking technique adopted. Finite-element results for thin plate and shell structures showed excellent agreement with those obtained by other investigators, while results for the simplified and actual fan impellers also showed good agreement with those obtained in an earlier investigation in which the finite-element analysis input data were prepared manually. Some extensions of this work are also discussed.
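The Bézier surface patch idea central to the linking scheme can be sketched briefly: a patch is defined by a net of control points and evaluated with the de Casteljau algorithm, giving an analytic surface with continuous derivatives of any order. A minimal sketch (not the thesis software):

```python
# Evaluate a tensor-product Bézier patch from its control net.
def de_casteljau(points, t):
    """Evaluate a 1-D Bézier curve (list of control points) at t
    by repeated linear interpolation."""
    pts = [list(p) for p in points]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return pts[0]

def bezier_patch(net, u, v):
    """Evaluate a tensor-product Bézier patch at (u, v).
    `net` is a rows-by-columns grid of 3-D control points."""
    row_points = [de_casteljau(row, u) for row in net]
    return de_casteljau(row_points, v)

# A flat 2x2 net: the patch interpolates its corner control points.
net = [[(0, 0, 0), (1, 0, 0)],
       [(0, 1, 0), (1, 1, 0)]]
corner = bezier_patch(net, 0.0, 0.0)   # -> [0.0, 0.0, 0.0]
centre = bezier_patch(net, 0.5, 0.5)   # -> [0.5, 0.5, 0.0]
```

The analytic form is what allows the controlled shape variation and continuous derivatives the abstract mentions, in contrast to purely discrete mesh descriptions.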

Relevance:

100.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to companies in which the materials, bought-in parts and services change substantially and in which a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added-value method in particular. Based on this concept and method, three kinds of computerised model have been developed: two deterministic, called sensitivity analysis and deterministic appraisal, and a third, stochastic, called risk simulation. These cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity and the expected accuracy and presentation of the output. The stochastic model is based on the assumptions of statistical independence between individual variables and of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. 
The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
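The added-value index and the stochastic "risk simulation" model can be sketched in a few lines. This is an illustrative sketch only; the variable names, the independence/normality assumptions (taken from the abstract) and all figures are invented for the example.

```python
# Added-value productivity: added value (sales minus materials,
# bought-in parts and services) per unit of employment cost. Sampling
# the inputs as independent normal variables yields a distribution for
# the index rather than a single point estimate.
import random
import statistics

def added_value_productivity(sales, bought_in, employment_costs):
    return (sales - bought_in) / employment_costs

def risk_simulation(n=50_000, seed=7):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        sales = rng.gauss(100.0, 5.0)      # hypothetical figures (£m)
        bought_in = rng.gauss(60.0, 4.0)
        labour = rng.gauss(25.0, 1.5)
        samples.append(added_value_productivity(sales, bought_in, labour))
    return statistics.mean(samples), statistics.stdev(samples)

mean_index, spread = risk_simulation()
```

The spread of the simulated index is the planning quantity of interest: it shows how sensitive the productivity figure is to joint variation in its components.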

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the development of a simple and accurate method for estimating the quantity and composition of household waste arisings. The method is based on the fundamental tenet that waste arisings can be predicted from information on the demographic and socio-economic characteristics of households, thus reducing the need for direct measurement of waste arisings to that necessary for the calibration of a prediction model. The aim of the research is twofold: firstly, to investigate the generation of waste arisings at the household level; and secondly, to devise a method for supplying information on waste arisings to meet the needs of waste collection and disposal authorities, policy makers at both national and European level, and the manufacturers of plant and equipment for waste sorting and treatment. The research was carried out in three phases: theoretical, empirical and analytical. In the theoretical phase, specific testable hypotheses were formulated concerning the process of waste generation at the household level. The empirical phase involved an initial questionnaire survey of 1277 households to obtain data on their socio-economic characteristics, and the subsequent sorting of waste arisings from each of the households surveyed. The analytical phase was divided among (a) the testing of the research hypotheses by matching each household's waste against its demographic/socio-economic characteristics, (b) the development of statistical models capable of predicting the waste arisings from an individual household, and (c) the development of a practical method for obtaining area-based estimates of waste arisings using readily available data from the national census. The latter method was found to represent a substantial improvement over conventional methods of waste estimation in terms of both accuracy and spatial flexibility. 
The research therefore represents a substantial contribution both to scientific knowledge of the process of household waste generation, and to the practical management of waste arisings.
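The calibration idea above (fit a prediction model on a measured sample, then apply it to census data) can be sketched with ordinary least squares. This is a toy sketch with synthetic data; the predictor, coefficients and sample size are invented, not the thesis model.

```python
# Fit a linear model predicting a household's weekly waste from a
# socio-economic predictor (here, household size), so that readily
# available census data can replace direct measurement.
import random

def fit_least_squares(X, y):
    """Ordinary least squares via the normal equations; X includes a
    column of 1s for the intercept. Gaussian elimination, stdlib only."""
    k = len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(k)]
         for r in range(k)]                      # X'X
    b = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(k)]  # X'y
    for col in range(k):                         # elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                             # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic survey: waste (kg/week) = 3 + 2.5 * household_size + noise.
rng = random.Random(0)
sizes = [rng.randint(1, 6) for _ in range(400)]
waste = [3.0 + 2.5 * s + rng.gauss(0, 0.5) for s in sizes]
X = [[1.0, float(s)] for s in sizes]
intercept, per_person = fit_least_squares(X, waste)
```

With the model calibrated, area-based estimates follow by summing predictions over the census counts of each household type.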

Relevance:

100.00%

Publisher:

Abstract:

This study considers the application of image analysis in petrography and investigates the possibilities for advancing existing techniques by introducing feature extraction and analysis capabilities of a higher level than those currently employed. The aim is to construct relevant, useful descriptions of crystal form and inter-crystal relations in polycrystalline igneous rock sections. Such descriptions cannot be derived until the 'ownership' of boundaries between adjacent crystals has been established: this is the fundamental problem of crystal boundary assignment. An analysis of this problem establishes key image features which reveal boundary ownership; a set of explicit analysis rules is presented. A petrographic image analysis scheme based on these principles is outlined and the implementation of key components of the scheme considered. An algorithm for the extraction and symbolic representation of image structural information is developed. A new multiscale analysis algorithm which produces a hierarchical description of the linear and near-linear structure on a contour is presented in detail. Novel techniques for symmetry analysis are developed. The analyses considered contribute both to the solution of the boundary assignment problem and to the construction of geologically useful descriptions of crystal form. The analysis scheme which is developed employs grouping principles such as collinearity, parallelism, symmetry and continuity, so providing a link between this study and more general work in perceptual grouping and intermediate level computer vision. Consequently, the techniques developed in this study may be expected to find wider application beyond the petrographic domain.
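One of the grouping principles named above, collinearity, can be illustrated concretely: two edge segments are candidate fragments of the same crystal boundary when their directions agree within a tolerance and one lies close to the line through the other. A minimal sketch with arbitrary thresholds (not the thesis algorithm):

```python
# Group nearly-collinear line segments, a building block of perceptual
# grouping for boundary assignment.
import math

def direction(seg):
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1) % math.pi   # undirected angle

def are_collinear(seg_a, seg_b, angle_tol=0.05, offset_tol=0.5):
    """True if the segments are nearly parallel and seg_b's midpoint
    lies close to the infinite line through seg_a."""
    da, db = direction(seg_a), direction(seg_b)
    diff = min(abs(da - db), math.pi - abs(da - db))
    if diff > angle_tol:
        return False
    (x1, y1), (x2, y2) = seg_a
    mx = (seg_b[0][0] + seg_b[1][0]) / 2
    my = (seg_b[0][1] + seg_b[1][1]) / 2
    length = math.hypot(x2 - x1, y2 - y1)
    # perpendicular distance of the midpoint from the line through seg_a
    offset = abs((x2 - x1) * (y1 - my) - (x1 - mx) * (y2 - y1)) / length
    return offset <= offset_tol

a = ((0, 0), (4, 0))
b = ((6, 0.1), (10, 0.1))         # nearly collinear continuation
c = ((0, 3), (4, 3))              # parallel but offset: not collinear
grouped_ab = are_collinear(a, b)  # True
grouped_ac = are_collinear(a, c)  # False
```

Parallelism, symmetry and continuity tests have the same flavour: a geometric predicate plus a tolerance, applied pairwise to extracted contour fragments.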

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Published data indicate that the polar lipid content of human meibomian gland secretions (MGS) could be anything between 0.5% and 13% of the total lipid. The tear film phospholipid composition has not been studied in great detail, and it has been assumed that the relative proportions of lipids in MGS are maintained in the tear film. The purpose of this work was to determine the concentration of phospholipids in the human tear film. Methods: Liquid chromatography mass spectrometry (LCMS) and thin layer chromatography (TLC) were used to determine the concentration of phospholipid in the tear film. Additionally, an Amplex Red phosphatidylcholine-specific phospholipase C (PLC) assay kit was used to determine the activity of PLC in the tear film. Results: Phospholipids were not detected in any of the tested human tear samples, the lower limits of detection being 1.3 µg/mL for TLC and 4 µg/mL for LCMS. TLC indicated that diacylglycerol (DAG) may be present in the tear film. PLC was present in the tear film, with an activity of approximately 15 mU/mL, equivalent to the removal of head groups from phosphatidylcholine at a rate of approximately 15 µM/min. Conclusions: This work shows that phospholipid was not detected in any of the tested human tear samples (above the lower limits of detection described) and suggests the presence of DAG in the tear film. DAG is known to be present at low concentrations in MGS. These observations indicate that PLC may play a role in modulating the tear film phospholipid concentration.
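The equivalence of 15 mU/mL and 15 µM/min stated above follows from the definition of an enzyme unit (1 U = 1 µmol of substrate converted per minute). A worked check:

```python
# 15 mU/mL  ->  15 nmol/min per mL  ->  15 µmol/min per litre,
# and µmol per litre is µM, so the head-group removal rate is 15 µM/min.
activity_mU_per_mL = 15.0
nmol_per_min_per_mL = activity_mU_per_mL         # 1 mU = 1 nmol/min
# scaling mL -> L multiplies by 1000; nmol -> µmol divides by 1000
umol_per_min_per_L = nmol_per_min_per_mL * 1000 / 1000
rate_uM_per_min = umol_per_min_per_L             # µmol/L == µM
```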

Relevance:

100.00%

Publisher:

Abstract:

Background - This study investigates the coverage of adherence to medicine by the UK and US newsprint media. Adherence to medicine is recognised as an important issue facing healthcare professionals, and the newsprint media are a key source of health information; however, little is known about newspaper coverage of medication adherence. Methods - A search of the newspaper database Nexis®UK from 2004 to 2011 was performed. Content analysis was carried out on newspaper articles which referenced medication adherence from the twelve highest-circulating UK and US daily newspapers and their Sunday equivalents. A second researcher coded a 15% sample of the newspaper articles to establish the inter-rater reliability of coding. Results - Searches of newspaper coverage of medication adherence in the UK and US yielded 181 relevant articles for each country. There was a large increase in the number of scientific articles on medication adherence in PubMed® over the study period; however, this was not reflected in the frequency of newspaper articles published on medication adherence. UK newspaper articles were significantly more likely to report the benefits of adherence (p = 0.005), whereas US newspaper articles were significantly more likely to report adherence issues in the elderly population (p = 0.004) and adherence associated with diseases of the central nervous system (p = 0.046). The most commonly reported barriers to adherence were patient factors (e.g. poor memory, beliefs and age), whereas the most commonly reported facilitators were medication factors, including simplified regimens, shorter treatment duration and combination tablets. HIV/AIDS was the single most frequently cited disease (reported in 20% of newspaper articles). Poor-quality reporting of medication adherence was identified in 62% of newspaper articles. Conclusion - Adherence is not well covered in the newspaper media despite a significant presence in the medical literature. 
The mass media have the potential to help educate and shape the public’s knowledge regarding the importance of medication adherence; this potential is not being realised at present.
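The double-coding of a 15% sample described in the Methods is typically summarised with Cohen's kappa, which corrects raw percentage agreement for the agreement expected by chance. A minimal sketch (the category labels and codes below are invented, not the study's data):

```python
# Cohen's kappa for two coders assigning one category per item.
from collections import Counter

def cohens_kappa(coder1, coder2):
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # chance agreement from each coder's marginal category frequencies
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders classifying 10 articles as reporting a 'barrier' or a
# 'facilitator' to adherence; they disagree on one article.
coder_a = ['barrier'] * 6 + ['facilitator'] * 4
coder_b = ['barrier'] * 5 + ['facilitator'] * 5
kappa = cohens_kappa(coder_a, coder_b)   # 0.9 observed, 0.5 chance -> 0.8
```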

Relevance:

100.00%

Publisher:

Abstract:

A large number of studies have been devoted to modeling the content of, and the interactions between, users on Twitter. In this paper, we propose a method inspired by Social Role Theory (SRT), which assumes that a user behaves differently in different roles during the generation of Twitter content. We consider the two most distinctive social roles on Twitter: the originator, who posts original messages, and the propagator, who retweets or forwards messages from others. In addition, we also consider role-specific social interactions, especially implicit interactions between users who share common interests. All of the above elements are integrated into a novel regularized topic model. We evaluate the proposed method on real Twitter data. The results show that our method is more effective than existing methods which do not distinguish social roles. Copyright 2013 ACM.
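The originator/propagator distinction can be made concrete with a small preprocessing sketch: before topic modelling, each message is attributed to the author's originator role (original post) or propagator role (retweet). The data format and the retweet test are assumptions for the example; the paper's regularized topic model itself is not reproduced here.

```python
# Split a user's messages by social role prior to role-aware modelling.
def split_by_role(tweets):
    """tweets: list of (user, text) pairs. Returns two dicts mapping
    each user to the texts produced under each role."""
    originator, propagator = {}, {}
    for user, text in tweets:
        # crude retweet convention, assumed for illustration
        role = propagator if text.startswith('RT @') else originator
        role.setdefault(user, []).append(text)
    return originator, propagator

tweets = [('alice', 'new paper on topic models'),
          ('bob', 'RT @alice: new paper on topic models'),
          ('alice', 'RT @carol: great SRT overview')]
orig, prop = split_by_role(tweets)
# alice appears in both roles; bob only as a propagator
```

In the paper's model each role then gets its own topic distribution, rather than pooling all of a user's messages into one.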

Relevance:

100.00%

Publisher:

Abstract:

Neuroimaging (NI) technologies are having increasing impact in the study of complex cognitive and social processes. In this emerging field of social cognitive neuroscience, a central goal should be to increase the understanding of the interaction between the neurobiology of the individual and the environment in which humans develop and function. The study of sex/gender is often a focus for NI research, and may be motivated by a desire to better understand general developmental principles, mental health problems that show female-male disparities, and gendered differences in society. In order to ensure the maximum possible contribution of NI research to these goals, we draw attention to four key principles—overlap, mosaicism, contingency and entanglement—that have emerged from sex/gender research and that should inform NI research design, analysis and interpretation. We discuss the implications of these principles in the form of constructive guidelines and suggestions for researchers, editors, reviewers and science communicators.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we explore the idea of Social Role Theory (SRT) and propose a novel regularized topic model which incorporates SRT into the generative process of social media content. We assume that a user can play multiple social roles, each of which serves different duties and is associated with a role-driven distribution over latent topics. In particular, we focus on social roles corresponding to the most common social activities on social networks. Our model is instantiated on microblogs, i.e. Twitter, and on community question-answering (cQA), i.e. Yahoo! Answers; social roles on Twitter include "originators" and "propagators", while roles on cQA are "askers" and "answerers". Both explicit and implicit interactions between users are taken into account and modeled as regularization factors. To evaluate the performance of our proposed method, we have conducted extensive experiments on two Twitter datasets and two cQA datasets. Furthermore, we also consider multi-role modeling for scientific papers, where an author's research expertise area is treated as a social role. A novel application of detecting users' research interests through topical keyword labeling, based on the results of our multi-role model, is presented. The evaluation results show the feasibility and effectiveness of our model.

Relevance:

100.00%

Publisher:

Abstract:

Studies of framing in the EU political system are still a rarity and they suffer from a lack of systematic empirical analysis. Addressing this gap, we ask if institutional and policy contexts intertwined with the strategic side of framing can explain the number and types of frames employed by different stakeholders. We use a computer-assisted manual content analysis and develop a fourfold typology of frames to study the frames that were prevalent in the debates on four EU policy proposals within financial market regulation and environmental policy at the EU level and in Germany, Sweden, the Netherlands and the United Kingdom. The main empirical finding is that both contexts and strategies exert a significant impact on the number and types of frames in EU policy debates. In conceptual terms, the article contributes to developing more fine-grained tools for studying frames and their underlying dimensions.
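The computer-assisted side of a manual content analysis like the one above is often a dictionary pass that flags candidate frames in a text for a human coder to confirm. A toy sketch; the frame names and keyword lists below are placeholders, not the authors' fourfold typology or codebook.

```python
# Flag candidate frames in a policy text via keyword dictionaries.
FRAME_KEYWORDS = {
    'economic':    ['cost', 'growth', 'competitiveness', 'jobs'],
    'risk':        ['crisis', 'instability', 'hazard'],
    'fairness':    ['burden', 'level playing field', 'equitable'],
    'sovereignty': ['national', 'subsidiarity', 'brussels'],
}

def flag_frames(text):
    """Return the sorted list of frames whose keywords occur in text."""
    text = text.lower()
    return sorted(frame for frame, words in FRAME_KEYWORDS.items()
                  if any(w in text for w in words))

example = ("The proposed regulation would raise compliance costs "
           "and place an unfair burden on smaller member states.")
frames = flag_frames(example)   # ['economic', 'fairness']
```

A human coding step then resolves false hits and assigns the frame types the automated pass cannot distinguish, which is what "computer-assisted manual" denotes.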

Relevance:

100.00%

Publisher:

Abstract:

Background and objective: Safe prescribing requires accurate and practical information about drugs. Our objective was to measure the utility of current sources of prescribing guidance when used to inform practical prescribing decisions, and to compare current sources of prescribing guidance in the UK with an idealized prescribing guidance. Methods: We developed 25 clinical scenarios. Two independent assessors rated and ranked the performance of five common sources of prescribing guidance in the UK when used to answer the clinical scenarios. A third adjudicator facilitated review of any disparities. An idealized list of contents for prescribing guidance was developed and sent for comment to academics and users of prescribing guidance. Following consultation, an operational check was used to assess compliance with the idealized criteria. The main outcome measures were relative utility in answering the clinical scenarios and compliance with the idealized prescribing guidance. Results: Current sources of prescribing guidance used in the UK differ in their utility when measured using clinical scenarios. The British National Formulary (BNF) and EMIS LV were the best-performing sources in terms of both ranking (mean rank 1.24 and 2.20, respectively) and rating (100% and 72% of answers rated excellent or adequate). Current sources differed in the extent to which they fulfilled the criteria for ideal prescribing guidance, but the BNF, and to a lesser extent EMIS LV, closely matched the criteria. Discussion: We have demonstrated how clinical scenarios can be used to assess prescribing guidance resources. Producers of prescribing guidance documents should consider our idealized template. Prescribers require high-quality information to support their practice. Conclusion: Our test was helpful in distinguishing between prescribing resources. Existing UK prescribing guidance resources differ in their ability to assist prescribers. Producers of prescribing guidance should consider the utility of their products to end-users, particularly in the more complex areas where prescribers may need most support. © 2010 Blackwell Publishing Ltd.
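The scenario-based comparison above reduces to aggregating per-scenario ranks into a mean rank per source. A minimal sketch; the source names and scores below are invented, not the study's data:

```python
# Compare guidance sources by mean rank across clinical scenarios.
def mean_ranks(rankings):
    """rankings: list of dicts {source: rank} (1 = best), one per
    scenario/assessor judgement. Returns {source: mean rank}."""
    totals, counts = {}, {}
    for ranking in rankings:
        for source, rank in ranking.items():
            totals[source] = totals.get(source, 0) + rank
            counts[source] = counts.get(source, 0) + 1
    return {s: totals[s] / counts[s] for s in totals}

rankings = [
    {'Source A': 1, 'Source B': 2, 'Source C': 3},
    {'Source A': 1, 'Source B': 3, 'Source C': 2},
    {'Source A': 2, 'Source B': 1, 'Source C': 3},
]
result = mean_ranks(rankings)
best = min(result, key=result.get)   # 'Source A'
```

In the study itself a third adjudicator resolved disagreements before aggregation; that reconciliation step is omitted here.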