871 results for Content analysis approach
Abstract:
Purpose – Previous reviews of Corporate Social Reporting (CSR) literature have tended to focus on developed economies. The aim of this study is to extend reviews of CSR literature to emerging economies. Design/methodology/approach – A desk-based research method is used, applying a classification framework of three categories. Findings – Most CSR studies in emerging economies have concentrated on the Asia-Pacific and African regions; they are descriptive in nature, use content analysis methods, and measure the extent and volume of disclosures contained within annual reports. Such studies provide an indirect explanation of the reasons behind CSR adoption, but of late a handful of studies have started to probe managerial motivations behind CSR directly through in-depth interviews, finding that CSR agendas in emerging economies are largely driven by external forces, namely pressures from parent companies, international markets and international agencies. Originality/value – This is the first review and analysis of CSR studies from the emerging economy perspective. Following this analysis, the authors identify some important future research questions.
Abstract:
PURPOSE: Multifocal visual evoked potential (mfVEP) is a newly introduced method used for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA, p < 0.001 with a 95% confidence interval). The differences between superior and inferior hemifields were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in the glaucoma suspect group in 5/11 sectors (t-test, p < 0.01), and not significant in most sectors of the normal group (1/11 sectors was significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. CONCLUSIONS: The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
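To make the per-sector statistics described above concrete, here is a minimal Python sketch, not the study's code, of the kind of analysis the HSA protocol implies: a paired t-test of superior versus inferior hemifield SNR in each of the 11 sectors, plus a one-way ANOVA comparing mean SNR across the three groups. Group sizes follow the abstract, but all SNR values are simulated and the variable names are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_eyes, n_sectors = 36, 11                      # e.g. the glaucoma patient group
snr_superior = rng.normal(1.8, 0.4, (n_eyes, n_sectors))   # simulated SNR values
snr_inferior = rng.normal(1.4, 0.4, (n_eyes, n_sectors))

# Paired t-test per sector: superior vs inferior hemifield SNR within one group.
for sector in range(n_sectors):
    t, p = stats.ttest_rel(snr_superior[:, sector], snr_inferior[:, sector])
    print(f"sector {sector + 1:2d}: t = {t:5.2f}, p = {p:.4f}")

# One-way ANOVA on mean SNR across the three groups (normal, suspect, glaucoma).
mean_snr_normal = rng.normal(2.0, 0.3, 38)
mean_snr_suspect = rng.normal(1.7, 0.3, 38)
mean_snr_glaucoma = rng.normal(1.3, 0.3, 36)
f, p = stats.f_oneway(mean_snr_normal, mean_snr_suspect, mean_snr_glaucoma)
print(f"ANOVA across groups: F = {f:.2f}, p = {p:.4f}")
```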
Abstract:
Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield Intersector analysis protocol to the standard HFA test when there is suspicious glaucomatous visual field loss. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA) 24-2 tests, optical coherence tomography of the optic nerve head, and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the Hemifield Sector Analysis (HSA) protocol. The retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield Intersector analysis of the mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (ANOVA, p < 0.001 with a 95% CI). The differences between superior and inferior hemifields were statistically significant in the glaucoma patient group in 11/11 sectors (t-test, p < 0.001), partially significant in the glaucoma suspect group in 5/11 sectors (t-test, p < 0.01), and not significant in most sectors of the normal group (only 1/11 was significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, while for glaucoma suspect patients they were 89% and 79%. The use of SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol in addition to SAP analysis can provide information about focal visual field differences across the horizontal midline and confirm suspicious field defects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. The Intersector analysis protocol can detect early field changes not detected by the standard HFA test.
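As a small companion illustration, the sensitivity and specificity figures quoted above reduce to simple ratios of classification counts. The sketch below, with hypothetical confusion counts that are not taken from the study, shows the arithmetic.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for glaucoma patients vs normal controls under the HSA protocol.
sens, spec = sensitivity_specificity(tp=35, fn=1, tn=33, fp=5)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```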
Abstract:
The morphology of an asphalt mixture can be defined as a set of parameters describing the geometrical characteristics of its constituent materials, their relative proportions and their spatial arrangement in the mixture. The present study investigates the effect of morphology on the meso- and macro-mechanical response of the mixture. An analysis approach based on X-ray computed tomography (CT) data is used for the meso-structural characterisation. Image processing techniques are used to systematically vary the internal structure and obtain different morphology structures. A morphology framework is used to characterise the average mastic coating thickness around the main load-carrying structure in these structures. The uniaxial tension simulation shows that the mixtures with the lowest coating thickness exhibit better inter-particle interaction, with more continuous load distribution chains between adjacent aggregate particles, fewer stress concentrations and less strain localisation in the mastic phase.
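One plausible way to compute such an average coating thickness from segmented CT data is sketched below: the Euclidean distance from each mastic voxel to the nearest aggregate surface is averaged. This is an illustrative proxy only; the segmentation labels, voxel size and the distance-transform measure are assumptions, not the morphology framework used in the study.

```python
import numpy as np
from scipy import ndimage

# Segmented CT slice: 0 = air void, 1 = mastic, 2 = aggregate (labels are assumed).
labels = np.random.default_rng(1).integers(0, 3, size=(256, 256))
aggregate = labels == 2
mastic = labels == 1

# Distance (in voxels) from every non-aggregate voxel to the nearest aggregate voxel.
dist_to_aggregate = ndimage.distance_transform_edt(~aggregate)

# Mean distance over the mastic phase as a simple coating-thickness proxy.
voxel_size_mm = 0.05                            # assumed CT resolution
mean_coating_mm = dist_to_aggregate[mastic].mean() * voxel_size_mm
print(f"mean mastic coating thickness ~ {mean_coating_mm:.3f} mm")
```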
Abstract:
Ten years ago, Bowen and Ostroff (2004) criticized the one-sided focus on the content-based approach, whereby researchers take into account the inherent virtues (or vices) associated with the content of HR practices to explain performance. They explicitly highlighted the role of the psychological processes through which employees attach meaning to HRM. In this first article of the special section entitled “Is the HRM Process Important?” we present an overview of past, current, and future challenges. For past challenges, we attempt to categorize the various research streams that originated from the seminal piece. To outline current challenges, we present the results of a content analysis of the original 15 articles put forward for the special section. In addition, we provide an overview of a caucus focused on this theme that was held at the Academy of Management annual meeting in Boston in 2012. In conclusion, we discuss future challenges relating to the HRM process approach and review the contributions that have been selected, against a competitive field, for this special issue.
Abstract:
Purpose – The objective of this paper is to address the question of whether and how firms can follow a standard management process to cope with emerging corporate social responsibility (CSR) challenges. Both researchers and practitioners have paid increasing attention to this question because of the rapidly evolving CSR expectations of stakeholders and the limited diffusion of CSR standardization. The question was addressed by developing a theoretical framework to explain how dynamic capabilities can contribute to effective CSR management. Design/methodology/approach – Based on 64 world-leading companies’ contemporary CSR reports, we carried out a large-scale content analysis to identify and examine the common organizational processes involved in CSR management and the dynamic capabilities underpinning those management processes. Findings – Drawing on the dynamic capabilities perspective, we demonstrate how the deployment of three dynamic capabilities for CSR management, namely scanning, sensing and reconfiguration capabilities, can help firms to meet emerging CSR requirements by following a set of common management processes. The findings demonstrate that what matters most in CSR standardization is the identification and development of the underlying dynamic capabilities and the related organizational processes and routines, rather than the detailed operational activities. Originality/value – Our study is an early attempt to examine the fundamental organizational capabilities and processes involved in CSR management from the dynamic capabilities perspective. Our research findings contribute to the CSR standardization literature by providing a new theoretical perspective to better understand the capabilities enabling common CSR management processes.
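For readers unfamiliar with how a large-scale content analysis of report text can be partly automated, here is a minimal keyword-coding sketch. The category dictionary, keywords and sample sentence are hypothetical and do not reproduce the coding scheme used in the study; in practice such counts would feed into manual verification.

```python
import re
from collections import Counter

# Hypothetical coding scheme: keywords per dynamic-capability category.
coding_scheme = {
    "scanning":        ["stakeholder dialogue", "materiality assessment", "survey"],
    "sensing":         ["emerging risk", "trend analysis", "regulatory change"],
    "reconfiguration": ["restructur", "new policy", "process redesign"],
}

def code_report(text: str) -> Counter:
    """Count keyword hits per category in one CSR report."""
    text = text.lower()
    counts = Counter()
    for category, keywords in coding_scheme.items():
        counts[category] = sum(len(re.findall(kw, text)) for kw in keywords)
    return counts

sample = "Our materiality assessment and stakeholder dialogue informed a process redesign."
print(code_report(sample))
```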
Abstract:
Purpose – Differences in corporate commitments to sustainability have attracted increasing attention from both researchers and practitioners. However, the reasons behind such differences still lack a generic theorization. We propose that one source of these differences lies in the development and application of what we refer to as dynamic capabilities for corporate sustainability within the firm. Drawing on the dynamic capabilities view, the objective of this paper is to examine the fundamental role of dynamic capabilities in corporate sustainable development. Design/methodology/approach – The research developed a framework of dynamic capabilities for corporate sustainability and used a content analysis approach to verify the framework based on the CSR reports of leading UK companies. Findings – The research demonstrates that dynamic capabilities for corporate sustainability enable firms to monitor the emerging sustainability needs of various stakeholders, seize sustainable development opportunities arising from rapidly changing stakeholder expectations, and reconfigure existing functional capabilities for corporate sustainability. Practical implications – The framework of dynamic capabilities for corporate sustainability developed in this paper may be used by practitioners to better understand a firm’s status in corporate sustainable development, identify areas of improvement, and more effectively overcome emerging sustainability challenges. Originality/value – This study makes an early attempt to extend the dynamic capabilities perspective to the area of corporate sustainable development.
Abstract:
In the new social media context, it is increasingly common to say that each party, firms included (through their brand pages), can itself be considered a media content provider. This tendency is reflected in a rising professional field called “content marketing”. This study incorporates the perspective of small and medium-sized enterprises (SMEs) into the scope of social media (SM) as a marketing communications and media content distribution system. In an exploratory content analysis of 20 official SM brand pages with 1,281 analyzed posts, the authors study how SMEs respond to the advent of a new paradigm of marketing communications, with special attention to their usage of media-specific content. SM impels companies to rethink the traditional one-way communication flow of their marketing messages and to incorporate a new, two-way communication into their marketing strategy, in which (their engaged and involved) users can create, modify, share and discuss content related to the firm’s activity. This study’s preliminary results show that diffusing content generally acts for SMEs as a facilitator to involve fans by offering a thematized space for them to express themselves on company-related topics. Content therefore adds to the firms’ possibilities of brand positioning by offering a reflection of fans’ company- and content-related behavior, which is a supplementary source of information.
Abstract:
New digital trends are transforming the media industry landscape, modifying elemental characteristics and attitudes of companies as well as of consumers. Firms often claim that their presence in social media (SM) is a key element of success. SM helps companies rethink the traditional one-way flow of their marketing messages and incorporate a new, interactive pattern into their communications. Nevertheless, these tendencies involve problems of strategic myopia for firms that do not structurally integrate these tools. One main problem is that institutions can rarely differentiate between the various types of SM and their attributes, while the literature also reveals a number of contradictions on the subject. The present conceptual paper lays the foundations of a strategic approach to SM and discusses its theoretical implications. Following an overview of the concept of SM, and through a content analysis of the specialized management literature (n = 14), we present various best practices and reflect on the apparent lack of strategic thinking in the use of SM as a marketing application. We then compare these practical examples with general marketing strategy theory. By merging theory and practice, we aim to provide insight into a well-founded application of SM as a genuinely strategic marketing tool.
Abstract:
An assessment tool designed to measure a customer service orientation among RNs and LPNs was developed using a content-oriented approach. Critical incidents were first developed by asking two samples of healthcare managers (n = 52 and 25) to identify various customer-contact situations. The critical incidents were then used to formulate a 121-item instrument. Patient-contact workers from three hospitals (n = 102) completed the instrument along with the NEO-FFI, a measure of the Big Five personality factors. Concurrently, managers completed a performance evaluation scale on the employees participating in the study in order to determine the predictive validity of the instrument. Through a criterion-keying approach, the instrument was scaled down to 38 items. The correlation between HealthServe and the supervisory performance evaluation ratings supported the instrument's criterion-related validity (r = .66, p < .0001). Incremental validity of HealthServe over the Big Five was found, with HealthServe accounting for 46% of the variance. The NEO-FFI was used to assess the correlation between personality traits and HealthServe. A factor analysis of HealthServe suggested four factors, which were correlated with the NEO-FFI scores. Results indicated that HealthServe was related to Extraversion, Openness to Experience, Agreeableness and Conscientiousness, and negatively related to Neuroticism. The benefits of the test construction procedure used here over broad-based measures of personality were discussed, as were the limitations of using a concurrent validation strategy. Recommendations for future studies were provided.
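The incremental-validity result above corresponds to a hierarchical regression: performance ratings are regressed on the Big Five alone and then on the Big Five plus HealthServe, and the change in R² is the variance added by the new instrument. The sketch below uses simulated data and assumed variable names, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 102
big_five = rng.normal(size=(n, 5))                   # simulated NEO-FFI factor scores
healthserve = rng.normal(size=n) + big_five[:, 0]    # partly overlaps with Extraversion
performance = (0.5 * healthserve
               + big_five @ np.array([0.2, 0.1, 0.1, 0.2, -0.2])
               + rng.normal(scale=0.5, size=n))      # simulated supervisory ratings

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(big_five, performance)
r2_full = r_squared(np.column_stack([big_five, healthserve]), performance)
print(f"R^2, Big Five only: {r2_base:.2f}")
print(f"Delta R^2 added by HealthServe: {r2_full - r2_base:.2f}")
```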
Abstract:
The purpose of this research was to compare the delivery methods practiced by higher education faculty teaching distance courses with recommended or emerging standard instructional delivery methods for distance education. Previous research shows that traditional instructional strategies have been used in distance education and that there has been no training for distance teaching. Secondary data, however, appear to suggest emerging practices which could be pooled toward the development of standards. This is a qualitative study based on the constant comparative analysis approach of grounded theory. Participants (N = 5) were full-time faculty teaching distance education courses. The observation method used was unobtrusive content analysis of videotaped instruction. Triangulation of data was accomplished through one-on-one in-depth interviews and through the literature review. Because non-media content was also being analyzed, a special time-sampling technique was designed by the researcher, influenced by content-analysis theories of media-related data, to sample the portions of the videotaped instruction that were observed and counted. A standardized interview guide was used to collect data from the in-depth interviews. Coding was based on categories drawn from the literature review and from Cranton and Weston's (1989) typology of instructional strategies. The data were observed, counted, tabulated, analyzed, and interpreted solely by the researcher. It should be noted, however, that systematic and rigorous data collection and analysis led to credible data. The findings of this study supported the proposition that there are no standard instructional practices for distance teaching. Further, the findings revealed that of the emerging practices suggested by proponents and by faculty who teach distance education courses, few were practiced even minimally. A noted example was the use of lecture and questioning. Questioning, as a teaching tool, was used a great deal with students at the originating site but not with distance students. Lectures were given, but were mostly conducted in traditional fashion: long in duration and with no interactive component. It can be concluded from the findings that while there are no standard practices for instructional delivery in distance education, there appears to be sufficient information from secondary and empirical data to initiate some standard instructional practices. Therefore, grounded in these research data is the theory that the way to arrive at instructional delivery standards for televised distance education is a pooling of the tacitly agreed-upon emerging practices of proponents and practicing instructors. Implicit in this theory is a need for experimental research so that these emerging practices can be tested, tried, and proven, ultimately resulting in formal standards for instructional delivery in televised education.
Abstract:
The purpose of this study was to examine the perspectives of three graduates of a problem-based learning (PBL) physical therapy (PT) program on their clinical practice. The study used the qualitative methods of observation, interviews, and journaling to gather the data. Three sessions of audiotaped interviews and two observation sessions were conducted with three exemplars from the Nova Southeastern University PBL PT program. Each participant also maintained a reflective journal. The data were analyzed using content analysis. A systematic filing system was used, employing a mechanical means of maintaining and indexing coded data and sorting data into coded classifications of subtopics or themes. All interview transcripts, field notes from observations, and journal accounts were read, and index sheets were appropriately annotated. The findings indicated that, from the participants' perspectives, they were practicing at typically expected levels as clinicians. The attributes that governed the participants' perspectives on their physical therapy clinical practice included flexibility, reflection, analysis, decision-making, self-reliance, problem-solving, independent thinking, and critical thinking. Further, the findings indicated that the factors that influenced those attributes included the PBL process, parents' value systems, a self-reliant personality, innate personality traits, and deliberate choice. Finally, the findings indicated that the participants' perspectives, for the most part, appeared to support the espoused efficacy of the PBL educational approach. In conclusion, there is evidence that the physical therapy clinical practice of the participants was positively impacted by the PBL curriculum. Among the many attributes noted as governing these perspectives, problem-solving, as postulated by Barrows, was one of the most frequently mentioned benefits gained from their PBL PT training. With more schools adopting the PBL approach, this research will hopefully add to the knowledge base regarding the efficacy of embracing a problem-based learning instructional approach in physical therapy programs.
Abstract:
This qualitative case study was limited to an eighteen-hour workshop on “Constructing a Reflective Teacher Portfolio.” The study was conducted at the Nova Center, a research and development school in the Broward County Public School System. Six participants took part in the study. The study examined the process used by the participants as they constructed their portfolios, explored the reflective aspect of their construction, and investigated the impact that constructing a portfolio had on them and their work. Data were gathered using interviews, observations, and artifacts. Content analysis and the combined frameworks of Van Manen (1977), Smyth (1989), and Pugach and Johnson (1990) were used to examine the data. The data indicate that the portfolios and workshop were not as effective as anticipated in encouraging the participants to examine their work. The following themes emerged from this study: (a) teachers begin constructing their portfolios by gathering material that represents past successes; (b) examining philosophies of education, writing a personal narrative, and sharing with colleagues stimulate reflective practice; (c) teachers have difficulty expressing their personal beliefs about education; (d) creating a reflective portfolio is a constructivist process that encourages divergent products; (e) teachers initially do not recognize a strong connection between constructing a portfolio and improving their work; and (f) constructing a portfolio may be an inside-out approach to educational reform. Recommendations were presented to improve the workshop, specifically focusing on teachers examining their practices and learning from students' work. Additional study is needed to evaluate the influence of these changes to the workshop.
Abstract:
Historical accuracy is only one of the components of a scholarly college textbook used to teach the history of jazz music. Textbooks in this field should include accurate ethnic representation of the most important musical figures, as jazz is considered the only original American art form. As colleges and universities celebrate diversity, it is important that jazz history be accurate and complete. The purpose of this study was to examine the content of the most commonly used jazz history textbooks currently in use at American colleges and universities. This qualitative study utilized grounded and textual analysis to explore ethnic representation in these texts. The methods were modeled after the work of Kane and Selden, each of whom conducted a content analysis focused on a limited field of study. This study focuses on key jazz artists and composers whose work was created in the periods of early jazz (1915-1930), swing (1930-1945) and modern jazz (1945-1960). The study considered jazz notables within the texts in terms of ethnic representation, the authors' use of language, contributions to the jazz canon, and place in the standard jazz repertoire. Appropriate historical sections of the selected texts were reviewed and coded using predetermined rubrics. Data were then aggregated into categories and analyzed according to the character assigned to the key jazz personalities noted in the text as well as the comparative standing afforded each personality. The results of this study demonstrate that particular key African-American jazz artists and composers occupy a significant place in these texts, while other significant individuals representing other ethnic groups are consistently overlooked. This finding suggests that while America and the world celebrate the quality of the product of American jazz as musically great and socially significant, many ethnic contributors are not mentioned, resulting in a less than complete picture of the evolution of this American art form.
Abstract:
The need for elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision and discrimination; however, femtosecond LA-ICP-MS did provide lower detection limits. In addition, it was determined that, even for femtosecond LA-ICP-MS, an internal standard should be utilized to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to two of the leading elemental analysis techniques, μXRF and LA-ICP-MS, and the results were similar; all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, en route to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this discrimination study. In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
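The pairwise comparison logic behind the discrimination percentages and Type I/II error rates described above can be sketched as follows: replicate measurements of elemental ratios from two fragments are compared ratio by ratio with t-tests, and a pair is called distinguishable if any ratio differs significantly. The three ratios, the significance threshold and all measurement values are hypothetical placeholders, not the ten ratios recommended in the study.

```python
import numpy as np
from scipy import stats

RATIOS = ["Sr/Zr", "Ca/K", "Ba/Fe"]              # illustrative placeholders only

def distinguishable(frag_a: np.ndarray, frag_b: np.ndarray, alpha: float = 0.01) -> bool:
    """frag_a, frag_b: (n_replicates, n_ratios) arrays of ratio measurements."""
    for i in range(frag_a.shape[1]):
        _, p = stats.ttest_ind(frag_a[:, i], frag_b[:, i], equal_var=False)
        if p < alpha:                             # any significantly different ratio
            return True
    return False

rng = np.random.default_rng(3)
fragment_1 = rng.normal([1.00, 2.00, 0.50], 0.02, size=(5, 3))
fragment_2 = rng.normal([1.10, 2.00, 0.50], 0.02, size=(5, 3))   # different source
print("distinguishable:", distinguishable(fragment_1, fragment_2))
```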