15 results for Content analysis (Communication) -- Data processing

in Digital Commons at Florida International University


Relevance: 100.00%

Abstract:

The purpose of this research study was to demonstrate the practical linguistic study and evaluation of dissertations by using two examples of the latest technology, the microcomputer and the optical scanner. That involved developing efficient methods for data entry and creating computer algorithms appropriate for personal linguistic studies. The goal was to develop a prototype investigation that demonstrated practical solutions for maximizing the linguistic potential of the dissertation database. The mode of text entry was a Dest PC Scan 1000 optical scanner, whose function was to copy the complete stack of education dissertations from the Florida Atlantic University Library into an IBM XT microcomputer. The optical scanner demonstrated its practical value by copying 15,900 pages of dissertation text directly into the microcomputer. A total of 199 dissertations, or 72% of the entire stack of 277 education dissertations, were successfully copied into the microcomputer's word processor, where each dissertation was analyzed for a variety of syntax frequencies. The results demonstrated the practical use of the optical scanner for data entry, the microcomputer for data and statistical analysis, and the availability of the college library as a natural setting for text studies. A supplemental benefit was the establishment of a computerized dissertation corpus for future research and study. The final step was to build a linguistic model of the differences in dissertation writing styles by creating 7 factors from 55 dependent variables through principal components factor analysis. The 7 factors (textual components) were then named and described on a hypothetical construct defined as a continuum from a conversational, interactional style to a formal, academic writing style. The 7 factors were then entered into discriminant analysis to create discriminant functions for each of the 7 independent variables.
The results indicated that a conversational, interactional writing style was associated with more recent dissertations (1972-1987), increasing author age, female authors, and the department of Curriculum and Instruction. A formal, academic writing style was associated with older dissertations (1972-1987), younger authors, male authors, and the department of Administration and Supervision. It was concluded that there were no significant differences in writing style due to subject matter (community college studies compared to other subject matter), nor due to the location of dissertation origin (Florida Atlantic University, University of Central Florida, Florida International University).
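The factor-extraction step described above can be sketched in a few lines. The sketch below is purely illustrative: it fabricates a small synthetic data set (the study's 55 syntax-frequency variables and 199 dissertations are not reproduced) and performs principal components analysis via an eigendecomposition of the correlation matrix, the same family of technique the abstract names.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the study's syntax-frequency variables:
# 40 "dissertations" measured on 6 variables driven by 2 latent styles.
base = rng.normal(size=(40, 2))            # two latent style dimensions
loadings = rng.normal(size=(2, 6))
X = base @ loadings + 0.1 * rng.normal(size=(40, 6))

# Principal components via eigendecomposition of the correlation matrix;
# keep the components that explain most of the variance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)    # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
factors = Z @ eigvecs[:, :2]               # scores on the retained factors

print(f"variance explained by 2 factors: {explained[:2].sum():.2f}")
```

In the actual study the retained factor scores, not the raw variables, would then feed the discriminant analysis.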

Relevance: 100.00%

Abstract:

This paper presents an analysis of articles involving children and youth in the last 9 years (1990–1998) of professional literature in recreational therapy. A total of 539 articles were analyzed to examine the authors, subjects, methods, and outcomes of therapeutic recreation studies published in three selected journals: Therapeutic Recreation Journal, Leisure Studies, and Leisure Sciences. A central finding was that literature involving children and youth was markedly under-represented in the journals: only 6.5% of the articles targeted children and youth, and of this group approximately two thirds were research-based, with the remainder being conceptual papers. The findings are discussed in terms of the need for future scholarship in recreational therapy to target children and youth, including those with disabilities.

Relevance: 100.00%

Abstract:

This dissertation is a study of customer relationship management theory and practice. Customer Relationship Management (CRM) is a business strategy whereby companies build strong relationships with existing and prospective customers with the goal of increasing organizational profitability. It is also a learning process involving managing change in processes, people, and technology. CRM implementation and its ramifications are not yet completely understood, as evidenced by the high number of failed CRM implementations in organizations and the resulting disappointments.

The goal of this dissertation is to study emerging issues and trends in CRM, including the effect of computer software and the accompanying new management processes on organizations, and the dynamics of aligning marketing, sales, services, and all other functions responsible for delivering customers a satisfying experience.

To understand CRM better, a content analysis of more than one hundred articles and documents from academic and industry sources was undertaken using a new methodological twist on the traditional method. An Internet domain (http://crm.fiu.edu) was created for this research by uploading an initial one hundred plus abstracts of articles and documents to form a knowledge database. Once the database was formed, a search engine was developed to enable searching the abstracts with relevant CRM keywords and thereby reveal the emergent dominant CRM topics. The ultimate aim of this website is to serve as an information hub for CRM research as well as a search engine, where interested parties can enter CRM-relevant keywords or phrases to access abstracts and can submit abstracts to enrich the knowledge hub.

Research questions were investigated and answered by content analyzing the interpretation and discussion of the dominant CRM topics and then amalgamating the findings. This was supported by comparisons within and across individual, paired, and sets-of-three occurrences of CRM keywords in the article abstracts.

Results show a lack of the holistic thinking and discussion of CRM, in both academia and industry, that is required to understand how the people, process, and technology dimensions of CRM affect one another and determine successful implementation. Industry must come to grips with CRM holistically and understand how these important dimensions interact. Only then will organizational learning occur and, over time, result in superior processes leading to strong, profitable customer relationships and a hard-to-imitate competitive advantage.
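The keyword co-occurrence comparison mentioned above (individual, paired, and sets-of-three occurrences) can be illustrated with a short sketch. The keywords and abstract texts below are invented stand-ins, not the study's actual corpus.

```python
from itertools import combinations
from collections import Counter

# Invented CRM keywords and toy "abstracts" (lowercased, pre-tokenized).
keywords = {"people", "process", "technology", "implementation"}
abstracts = [
    "crm implementation requires people process and technology alignment",
    "technology alone does not make crm succeed without process change",
    "people and process issues dominate failed implementation projects",
]

# For each abstract, record which keywords appear, then tally single,
# paired, and triple co-occurrences across the corpus.
single, pairs, triples = Counter(), Counter(), Counter()
for text in abstracts:
    present = sorted(keywords & set(text.split()))
    single.update(present)
    pairs.update(combinations(present, 2))
    triples.update(combinations(present, 3))

print(single.most_common(2))
print(pairs.most_common(1))
```

The most frequent pairs and triples are what would surface as "dominant topics" under this kind of counting.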

Relevance: 100.00%

Abstract:

As massive data sets become increasingly available, people face the problem of how to process and understand them effectively. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in line with other research based on MapReduce, develops effective techniques and applications using MapReduce to help people solve large-scale problems. Three different problems are tackled. The first deals with processing terabytes of raster data in a spatial data management system; aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
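The MapReduce pattern the abstract relies on, for example in its matrix multiplication implementations, can be sketched on a single machine. This is an illustrative simulation of the map and reduce phases for a sparse matrix-vector product, not the dissertation's actual implementation; real jobs shard these phases across many workers.

```python
from collections import defaultdict

# Sparse matrix as {(row, col): value}; toy data for illustration.
A = {(0, 0): 2.0, (0, 2): 1.0, (1, 1): 3.0}
x = [1.0, 2.0, 4.0]

def map_phase(matrix, vector):
    # Each mapper sees one nonzero entry and emits a (row, partial) pair.
    for (i, j), a in matrix.items():
        yield i, a * vector[j]

def reduce_phase(pairs):
    # Each reducer sums the partial products that share a row key.
    sums = defaultdict(float)
    for key, value in pairs:
        sums[key] += value
    return dict(sums)

y = reduce_phase(map_phase(A, x))
print(y)   # row 0: 2*1 + 1*4 = 6.0; row 1: 3*2 = 6.0
```

The same emit-by-key/sum-by-key structure underlies the matrix multiplications used to scale up nonnegative matrix factorization.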

Relevance: 100.00%

Abstract:

This study used content analysis to interpret and evaluate outcome evaluation matrices of undergraduate Global Learning foundations courses. The findings revealed a lack of uniformity in the faculty members’ interpretation and implementation of global learning components in the coursework. Successful teaching practices and challenges were identified and classified.

Relevance: 100.00%

Abstract:

This study examines the triple bottom line of sustainability in the context of both profit-oriented and non-profit organizations. Sustainability is a compound result of the interaction between economic, environmental, and social dimensions, and it cannot be achieved without balance among all three, which has implications for measuring sustainability and prioritizing goals. This study demonstrates a method for measuring organizational achievement in these three dimensions of sustainability. Content analysis of the annual reports of corporations from the United States, Continental Europe (including Scandinavia), and Asia reveals that the economic dimension remains preeminent and that corporations still have a long way to go to reach comprehensive sustainability by balancing the three dimensions. The analysis also shows a high level of isomorphism in corporate sustainability practices, suggesting that even the most sustainable corporations take a somewhat passive role in prioritizing sustainability goals. A list of 25 terms for each dimension of sustainability (economic, environmental, and social) was developed, which corporations can use to develop and communicate their sustainability practices effectively to the broadest range of stakeholders. In contrast to the corporations, botanical gardens demonstrate more balance among the three dimensions of sustainability.
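The keyword-based scoring of the three dimensions can be illustrated with a minimal sketch. The term lists and report text below are invented examples; the study's actual 25-term lists are not reproduced here.

```python
# Invented, abbreviated term lists standing in for the study's 25-term
# lists per dimension.
dimension_terms = {
    "economic": {"profit", "revenue", "growth"},
    "environmental": {"emissions", "recycling", "energy"},
    "social": {"community", "employees", "diversity"},
}
report = ("revenue growth offset higher energy costs while community "
          "programs and employee recycling initiatives expanded")

# Score each dimension by how many of its terms appear in the report,
# then measure (im)balance as the spread between dimension scores.
words = report.split()
scores = {dim: sum(w in terms for w in words)
          for dim, terms in dimension_terms.items()}
balance = max(scores.values()) - min(scores.values())
print(scores, balance)
```

A balance value of zero would indicate equal emphasis across the three dimensions; larger values flag the kind of economic dominance the study reports.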

Relevance: 100.00%

Abstract:

Television (TV) reaches more people than any other medium, which makes it an important source of health information. Since TV ads often convey information obliquely, this study investigated the implied health messages found in food and nutrition TV ads. The goals were to determine the proportion of food and nutrition ads among all TV advertising and to use content analysis to identify their implied messages and health claims. A randomly selected sample of TV ads was collected over a 28-day period beginning May 8, 1987. The sample contained 3547 ads, of which 725 (20%) were food-related; all were analyzed. About 10% of the food-related TV ads contained a health claim. Twenty-five representative ads from the 725 food ads were also reviewed by 10 dietitians to test the reliability of the instrument. Although the dietitians agreed on whether a health claim existed in a televised food ad, their agreement was poor when evaluating the accuracy of the claim. The number of food-related ads dropped significantly on Saturday, but the number of alcohol ads rose sharply on Saturday and Sunday. Snack ads were shown most often on Thursday, but snack commercials were also numerous on Saturday morning and afternoon, as were cereal ads. Ads for snack foods accounted for the greatest proportion of food and nutrition ads (20%), while fast food accounted for only 7% and alcohol for about 9%.

Relevance: 100.00%

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as “histogram binning” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining the accuracy of the analysis and the eventual interpretation of the data.

Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and have inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data.

The mathematical approach introduced in this dissertation is appealing enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided.

These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves statistical content.

In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows an implementation procedure that lends itself to real-time implementation using lookup tables, a task that is also introduced in this dissertation.
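A rough, assumption-laden illustration of the binning artifact itself: mapping integer linear channels through a log transform occupies the log-domain bins unevenly, with gaps at the low end and pileup at the high end. The channel count and bin resolution below are arbitrary choices, not the dissertation's parameters.

```python
import math
from collections import Counter

# Invented parameters: 10-bit linear ADC channels remapped onto a
# 256-bin log-scale display, as is typical for cytometry histograms.
channels = range(1, 1024)
log_bins = 256
scale = log_bins / math.log10(1024)

# Each linear channel lands in one log-domain bin; count the occupancy.
occupancy = Counter(int(scale * math.log10(c)) for c in channels)

# Low end: most log bins receive no channel at all (picket-fence gaps).
empty_low = [b for b in range(50) if b not in occupancy]
# High end: many adjacent channels pile into the same log bin.
pileup = occupancy[log_bins - 1]

print(f"empty bins among the lowest 50: {len(empty_low)}; "
      f"channels merged into the top bin: {pileup}")
```

Both effects distort the histogram's statistics, which is the skew the dissertation's method is designed to remove.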

Relevance: 100.00%

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and have inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is appealing enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows an implementation procedure that lends itself to real-time implementation using lookup tables, a task that is also introduced in this dissertation.

Relevance: 100.00%

Abstract:

Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data, so effective data processing and analysis are critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify the set of genes that best differentiates between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or with different lab protocols or samples. In such situations, it is important to combine the results from these efforts. The second part of the dissertation addresses the problem of pooling the results from independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species, and as a result, improved sets of cell cycle-regulated genes were identified.

The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
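Of the four pooling techniques, Fisher's inverse chi-square method is simple enough to sketch directly. The implementation below uses the closed-form chi-square survival function for even degrees of freedom; the p-values fed to it are hypothetical.

```python
import math

def fisher_pooled_p(pvalues):
    """Fisher's inverse chi-square method for pooling independent p-values.

    The statistic -2 * sum(ln p_i) follows a chi-square distribution with
    2k degrees of freedom; for even df the survival function has the
    closed form exp(-x/2) * sum_{j<k} (x/2)^j / j!.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    return math.exp(-half) * sum(half**j / math.factorial(j)
                                 for j in range(k))

# Hypothetical p-values for one gene across three independent experiments:
# individually marginal evidence pools into a clearly significant result.
pooled = fisher_pooled_p([0.04, 0.10, 0.07])
print(f"pooled p-value: {pooled:.4f}")
```

The other three methods differ mainly in how they transform and weight the individual p-values before combining them.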

Relevance: 100.00%

Abstract:

Historical accuracy is only one component of a scholarly college textbook used to teach the history of jazz music. Textbooks in this field should include accurate ethnic representation of the most important musical figures, as jazz is considered the only original American art form. As colleges and universities celebrate diversity, it is important that jazz history be accurate and complete.

The purpose of this study was to examine the content of the jazz history textbooks most commonly used at American colleges and universities. This qualitative study utilized grounded and textual analysis to explore ethnic representation in these texts. The methods were modeled after the work of Kane and Selden, each of whom conducted a content analysis focused on a limited field of study. This study focused on key jazz artists and composers whose work was created in the periods of early jazz (1915-1930), swing (1930-1945), and modern jazz (1945-1960).

This study considered jazz notables within the texts in terms of ethnic representation, the authors' use of language, contributions to the jazz canon, and place in the standard jazz repertoire. The relevant historical sections of the selected texts were reviewed and coded using predetermined rubrics. Data were aggregated into categories and then analyzed according to the character assigned to the key jazz personalities noted in the text, as well as the comparative standing afforded each personality.

The results of this study demonstrate that particular key African-American jazz artists and composers occupy a significant place in these texts, while other significant individuals representing other ethnic groups are consistently overlooked. This finding suggests that while America and the world celebrate American jazz as musically great and socially significant, many ethnic contributors go unmentioned, leaving a less than complete picture of the evolution of this American art form.

Relevance: 100.00%

Abstract:

This study examined how the themes of environmental sustainability are evident in the national, state, and local standards that guide K-12 science curriculum. The study applied the principles of content analysis within the framework of an ecological paradigm. In education, an ecological paradigm focuses on students' use of a holistic lens to view and understand material. The intent of this study was to analyze the seventh-grade science content standards at the national, state, and local (textbook) levels to determine how, and to what extent, each of five themes of environmental sustainability is presented in the language of each text. The themes are: (a) Climate Change Indicators, (b) Biodiversity, (c) Human Population Density, (d) Impact and Presence of Environmental Pollution, and (e) Earth as a Closed System. The study offers practical insight into using content analysis to locate keywords of environmental sustainability in the three texts and to determine whether the context of each term relates to the ecological paradigm. Using a concordance program, the researcher identified the frequency and context of each vocabulary item associated with these themes. Nine chi-square tests were run to determine whether there were differences in content between the national standards, the state standards, and the textbook. Within each level, chi-square tests were also run to determine whether there were differences between the appearance of content-knowledge and skill words. Results indicate a lack of agreement between levels that is significant at p < .01. A discussion of these results in relation to curriculum development and standardized assessments follows. The study found that at the national and state levels there is a lack of articulation of the goals of environmental sustainability or an ecological paradigm.

With respect to the science textbook, a greater number of keywords were present; however, the context of many of these keywords did not align with the discourse of an ecological paradigm. Further, the environmental sustainability themes present in the textbook were limited to its last four chapters. Additional research is recommended to determine whether this situation also exists in other settings.
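The chi-square comparison of keyword counts can be sketched as follows. The counts below are invented, and the 2x2 layout (level by word type) is an assumption about how the study's contingency tables were arranged.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    observed = [[a, b], [c, d]]
    # Expected count = (row total * column total) / grand total.
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# Hypothetical keyword counts: rows = national vs state standards,
# columns = content-knowledge vs skill words.
stat = chi_square_2x2([[40, 10], [20, 30]])
CRITICAL_P01_DF1 = 6.635          # chi-square critical value, df = 1, p = .01
print(stat, stat > CRITICAL_P01_DF1)
```

A statistic above the critical value indicates a lack of agreement between levels at p < .01, the kind of result the study reports.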

Relevance: 100.00%

Abstract:

This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This innovative, fully operational web application allows users to upload and retrieve information through a unique human-computer graphical interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and PHP (Personal Home Page) scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI). What was unique in the approach was that all pertinent clinical information about patients was synthesized, with input from clinical experts, into 4 different data entry forms: Clinical, fMRI scoring, Image information, and Neuropsychological. A first contribution of this dissertation was in proposing an integrated processing platform, independent of site and scanner, to uniformly process the varied fMRI datasets and generate comparative brain activation patterns. The data collection from the consortium complied with IRB requirements and provides all the safeguards for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. Lateralization Indices (LIs) of healthy control (HC) subjects were evaluated in contrast to those of localization-related epilepsy (LRE) subjects.

Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%), (b) right lateralization (HC = 2%, LRE = 10%), (c) bilateral (HC = 20%, LRE = 15%), (d) left lateralization (HC = 42%, LRE = 26%), and (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of the brain activations. The intent was not to seek the highest output measures, given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
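The lateralization index is commonly computed as LI = (L - R) / (L + R) over left- and right-hemisphere activation counts; the sketch below assumes that convention, and the five-group cutoffs are hypothetical since the abstract does not state the study's thresholds.

```python
def lateralization_index(left_count, right_count):
    """LI = (L - R) / (L + R): +1 is fully left, -1 is fully right."""
    return (left_count - right_count) / (left_count + right_count)

def categorize(li):
    # Hypothetical cutoffs for the five groups named in the abstract;
    # the study's actual thresholds are not given there.
    if li >= 0.5:
        return "strong left"
    if li >= 0.1:
        return "left"
    if li > -0.1:
        return "bilateral"
    if li > -0.5:
        return "right"
    return "strong right"

# Hypothetical suprathreshold voxel counts per hemisphere for one subject.
li = lateralization_index(820, 310)
print(f"LI = {li:.2f} -> {categorize(li)}")
```

Applying such a categorization across all subjects' activation maps yields group percentages of the kind reported above.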

Relevance: 100.00%

Abstract:

This study investigates the effects of content-based ESOL instruction on the overall English proficiency of foreign-born college students. Based on various psychological and social factors that affect second language acquisition, it is suggested that the techniques of content-based instruction, by focusing on subject matter, allow learners to overcome the language barrier by neutralizing their subconscious defense mechanisms, thus attaining greater proficiency.

Two groups of Miami-Dade Community College ESOL students were chosen as subjects for this study: a control group composed of students from the North and Wolfson campuses, where the ESOL program is based predominantly on a structural or structural-functional approach, and an experimental group of Medical Center campus students, where content-based instruction is incorporated into the curriculum. Ethnicity, gender, age, and other differences in the population are discussed in the study.

The students' English Placement Test (EPT) scores were used as a covariate, and their scores on the Multiple Assessment Programs and Services (MAPS) test as dependent variables. Multivariate analysis of variance (MANOVA) was applied to test for significant differences between the means. The results indicate a consistent difference in the mean performance of the Medical Center campus ESOL students, demonstrated by their scores on MAPS. Although neither ethnicity nor gender of the subjects affected the outcome, age had a contributing effect. These findings suggest that content-based instruction facilitates greater overall English proficiency in foreign-born college students.

Relevance: 100.00%

Abstract:

Unequal rates of improvement in processor and I/O speeds are making many applications, such as databases and operating systems, increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem; in this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors was proposed to address the write problem, and interleaved declustering to address the load balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
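The failure-mode load-balancing contrast described above can be made concrete with a toy model. Assuming uniform read load and one failed disk, traditional mirroring doubles the load on the surviving partner, while interleaved declustering spreads the failed disk's load over the remaining n - 1 disks. This is an illustrative sketch, not the thesis's simulation.

```python
# Toy failure-mode model: 'base' is each disk's normal read load (uniform
# access assumed); we ask what a surviving disk carries after one failure.
def failure_load(n_disks, scheme):
    base = 1.0
    if scheme == "mirroring":
        # Traditional mirroring: the mirror partner absorbs the failed
        # disk's entire load on top of its own.
        return base + base
    if scheme == "interleaved":
        # Interleaved declustering: the failed disk's data is spread
        # across the n - 1 survivors, so each absorbs only a fraction.
        return base + base / (n_disks - 1)
    raise ValueError(f"unknown scheme: {scheme}")

print(failure_load(8, "mirroring"))     # partner carries double load
print(failure_load(8, "interleaved"))   # survivors carry ~1.14x each
```

The gap widens with more disks: the declustered penalty shrinks toward zero while the mirrored hot-spot stays at 2x.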