906 results for Content analysis (Communication) -- Data processing
Abstract:
Whereas variation in whole first-milking colostrum IgG1 is well documented, the IgG1 difference between the quarter mammary glands of dairy animals is unknown. First colostrum was collected by quarter from the healthy udders of 8 multiparous dairy cows, all within 3 h of parturition. The weight of colostrum produced by each quarter was determined and a sample of each was frozen for subsequent analysis. Immunoglobulin G1 concentration (mg/mL) was measured by ELISA and total mass (g) was calculated. The standard addition method was used to overcome colostrum matrix effects and validate the standard ELISA measures. Analysis of the data showed that cow and quarter (within cow) differed significantly in both concentration and total IgG1 mass per quarter. The mean IgG1 concentration of the front and rear quarters did not differ, but the large variation among individual quarters confounds that comparison. The quarter difference indicates that each mammary gland develops a different capacity to accumulate precolostrum IgG1, even though the circulating hormone concentrations that induce colostrogenesis reach the 4 glands similarly. It also shows that variation in quarter colostrum production contributes to the wide variation in first-milking colostrum IgG1 content. Finally, the data suggest that other factors, such as locally acting autocrine or paracrine signals, epigenetic mechanisms, or stochasticity in gene regulation, may impinge on colostrogenesis capacity.
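For orientation, a minimal sketch of the standard addition idea mentioned above, assuming a linear ELISA response and entirely hypothetical spike levels and signals; the unspiked concentration is recovered from the x-intercept of the regression, which is what makes the approach robust to matrix effects.

```python
import numpy as np

# Hypothetical standard addition data for one diluted colostrum sample:
# known IgG1 spikes (mg/mL) added to equal aliquots, and the assay response.
spike = np.array([0.0, 0.5, 1.0, 1.5, 2.0])        # added IgG1, mg/mL
signal = np.array([0.42, 0.61, 0.79, 0.98, 1.17])  # absorbance units (invented)

# Fit signal = slope * spike + intercept; the unspiked concentration is the
# magnitude of the x-intercept. Because the calibration is built inside the
# sample matrix itself, matrix effects cancel out of the estimate.
slope, intercept = np.polyfit(spike, signal, 1)
concentration = intercept / slope
print(f"Estimated IgG1 in the diluted sample: {concentration:.2f} mg/mL")
```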
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 fields to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess their perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support provided by a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
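As a toy illustration of how per-study accuracy results of the kind pooled above can be combined into a common errors-per-10,000-fields rate, the study names and counts below are invented:

```python
# Toy pooling of data-accuracy results: each tuple is (errors found, fields inspected).
# Study names and counts are invented for illustration only.
studies = {
    "study_A": (14, 52_000),
    "study_B": (390, 18_500),
    "study_C": (2_210, 9_800),
}

total_errors = sum(errors for errors, _ in studies.values())
total_fields = sum(fields for _, fields in studies.values())

for name, (errors, fields) in studies.items():
    print(f"{name}: {10_000 * errors / fields:.0f} errors per 10,000 fields")

# Pooled rate weights each study by the number of fields it inspected.
print(f"pooled: {10_000 * total_errors / total_fields:.0f} errors per 10,000 fields")
```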
Abstract:
Mode of access: Internet.
Abstract:
"Research was supported by the United States Air Force through the Air Force Office of Scientific Research, Air Research and Development Command."
Abstract:
Fluorescence spectroscopy has recently become more common in clinical medicine. However, many issues related to the methodology and instrumentation of this technology remain unresolved. In this study, we aimed to assess the individual variability of fluorescence parameters of endogenous markers (NADH, FAD, etc.) measured by fluorescence spectroscopy (FS) in situ and to analyse the factors that lead to a significant scatter of results. Most of the studied fluorophores have a scatter of values acceptable for diagnostic purposes (mostly up to 30%). We provide evidence that the blood volume in tissue affects FS data, with a significant inverse correlation. The distributions of the fluorescence intensity and of the fluorescent contrast coefficient follow a normal distribution for most of the studied fluorophores and for the redox ratio. The effects of various physiological (different skin melanin content) and technical (characteristics of optical filters) factors on the measurement results were also studied. The variability of FS measurement results should be considered when interpreting diagnostic parameters, as well as when developing new data processing algorithms and FS devices.
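For context, one common operationalisation of the optical redox ratio referred to above is sketched below; the exact definition (and whether NADH or FAD appears in the numerator) varies between research groups, so this is an assumption rather than the formula used in the study.

```python
def redox_ratio(i_fad: float, i_nadh: float) -> float:
    """One common convention: FAD fluorescence over total NADH + FAD fluorescence.

    i_fad and i_nadh are background-corrected fluorescence intensities measured
    at the respective excitation/emission bands (arbitrary units).
    """
    return i_fad / (i_fad + i_nadh)

# Example with arbitrary intensities.
print(redox_ratio(i_fad=120.0, i_nadh=310.0))
```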
Abstract:
Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data, so effective data processing and analysis are critical for making reliable inferences from the data. The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior. The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species and, as a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet transforms of the data for clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
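The pooling techniques named above have standard closed forms; a brief sketch of two of them (Fisher's inverse chi-square and Stouffer's Z, which becomes the Liptak-Stouffer weighted Z when weights are supplied), assuming independent one-sided p-values from the separate experiments:

```python
import numpy as np
from scipy import stats

def fisher_pool(pvals):
    """Fisher's inverse chi-square method: -2*sum(log p) ~ chi^2 with 2k d.f."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

def stouffer_pool(pvals, weights=None):
    """Stouffer's Z-transform method; with weights it is the Liptak-Stouffer
    weighted Z-method."""
    z = stats.norm.isf(pvals)  # convert one-sided p-values to z-scores
    if weights is None:
        weights = np.ones_like(z)
    z_combined = np.dot(weights, z) / np.sqrt(np.sum(weights ** 2))
    return stats.norm.sf(z_combined)

p = np.array([0.03, 0.20, 0.004])  # p-values from three independent experiments (invented)
print(fisher_pool(p), stouffer_pool(p), stouffer_pool(p, weights=np.array([3.0, 1.0, 2.0])))
```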
Abstract:
Climate change is thought to be one of the most pressing environmental problems facing humanity. However, due in part to failures in political communication and to how the issue has been historically defined in American politics, discussions of climate change remain gridlocked and polarized. In this dissertation, I explore how climate change has been historically constructed as a political issue, how conflicts between climate advocates and skeptics have been communicated, and what effects polarization has had on political communication, particularly the communication of climate change to skeptical audiences. I use a variety of methodological tools to consider these questions, including evolutionary frame analysis, which uses textual data to show how issues are framed and constructed over time; Kullback-Leibler divergence content analysis, which allows comparison of advocate and skeptic framing over time; and experimental framing methods to test how audiences react to and process different presentations of climate change. I identify six major portrayals of climate change from 1988 to 2012, but find that no single construction of the issue has dominated the public discourse defining the problem. In addition, the construction of climate change may be associated with changes in public political sentiment, such as greater pessimism about climate action when the electorate becomes more conservative. As the issue of climate change has become more polarized in American politics, one proposed causal pathway for the observed polarization is that advocate and skeptic framing of climate change focuses on different facets of the issue and ignores rival arguments, a practice known as “talking past.” However, I find no evidence of increased talking past in 25 years of popular news media reporting on the issue, suggesting either that talking past has not driven public polarization or that polarization is occurring in venues outside of the mainstream public discourse, such as blogs. To examine how polarization affects political communication on climate change, I test the cognitive processing of a variety of messages and sources that promote action against climate change among Republican individuals. Rather than identifying frames powerful enough to overcome polarization, I find that Republicans exhibit telltale signs of motivated skepticism on the issue; that is, they reject framing that runs counter to their party line and political identity. This result suggests that polarization constrains political communication on polarized issues, overshadowing traditional message and source effects of framing and increasing the difficulty communicators experience in reaching skeptical audiences.
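Kullback-Leibler divergence content analysis, as used above, compares word (or frame) frequency distributions from two corpora; a minimal sketch with invented counts and simple additive smoothing follows.

```python
import numpy as np

def kl_divergence(p_counts, q_counts, alpha=1.0):
    """D_KL(P || Q) over a shared vocabulary, with additive smoothing so that
    zero counts do not produce infinite divergence."""
    p = np.asarray(p_counts, dtype=float) + alpha
    q = np.asarray(q_counts, dtype=float) + alpha
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Invented frame-word counts for advocate vs. skeptic coverage in one year,
# e.g. over the vocabulary ["science", "policy", "economy", "hoax"].
advocate = [120, 45, 10, 3]
skeptic = [15, 20, 60, 80]
print(kl_divergence(advocate, skeptic))  # larger values indicate more divergent framing
```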
Abstract:
Canadian young people are increasingly connected through technological devices. This computer-mediated communication (CMC) can result in heightened connection and social support, but can also lead to inadequate personal and physical connections. As technology evolves, its influence on health and well-being is important to investigate, especially among youth. This study investigates the potential influences of CMC on the health of Canadian youth, using both quantitative and qualitative research approaches. This mixed-methods study utilized data from the 2013-2014 Health Behaviour in School-aged Children survey for Canada (n=30,117) and focus group data involving Ontario youth (7 groups involving 40 youth). In the quantitative component, a random-effects multilevel Poisson regression was employed to identify the effects of CMC on loneliness, stratified to explore interaction with family communication quality. A qualitative, inductive content analysis was applied to the focus group transcripts using a grounded theory-inspired methodology. Through open line-by-line coding followed by axial coding, main categories and themes were identified. The quality of family communication modified the association between CMC use and loneliness: among youth in the highest quartile of family communication, daily use of verbal and social media CMC was significantly associated with reports of loneliness. The qualitative analysis revealed two overarching concepts: (1) the health impacts of CMC are multidimensional, and (2) there exists a duality of both positive and negative influences of CMC on health. Four themes were identified within this framework: (1) physical activity, (2) mental and emotional disturbance, (3) mindfulness, and (4) relationships. Overall, there is a high proportion of loneliness among Canadian youth, but this is not uniform for all. The associations between CMC and health are influenced by external and contextual factors, including family communication quality. Further, the technologically rich world in which young people live has a diverse impact on their health. For youth, their relationships with others and the context of CMC use shape the overall influences on their health.
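A minimal sketch of a Poisson model for a loneliness outcome with a CMC-by-family-communication interaction is given below; all variable and file names are hypothetical, and cluster-robust standard errors by school are used here as a simple stand-in for the random-effects multilevel structure described above, not as the study's actual model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey extract: one row per student, with columns
# loneliness (count score), cmc_daily (0/1), family_comm (quality score), school_id.
df = pd.read_csv("hbsc_extract.csv")

# Poisson regression of loneliness on daily CMC use, family communication quality,
# and their interaction; clustering by school approximates the multilevel design.
model = smf.poisson("loneliness ~ cmc_daily * family_comm", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]}
)
print(model.summary())
```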
Abstract:
Purpose: To qualitatively explore the communication between healthcare professionals and oncology patients based on the perception of patients undergoing chemotherapy. Method: Qualitative and exploratory design. Participants were 14 adult patients undergoing chemotherapy at different stages of the disease. A socio-demographic and clinical data form was utilized along with semi-structured interviews. The interviews were audio-recorded and transcribed, and content analysis was performed. Two independent judges evaluated the interview content with respect to the emerging categories and obtained a Kappa index of 0.834. Results: Three categories emerged from the data: 1) Technical communication without emotional support, in which the information provided is strictly technical information regarding the diagnosis, treatment and/or prognosis; 2) Technical communication with emotional support, in which the information provided is oriented towards the technical aspects of the patient's physical condition while also providing psychological support for the patient's subjective needs; and 3) Insufficient technical communication, in which there are gaps in the information provided, causing confusion and suffering to the patient. Conclusions: Communication with emotional support contributes to greater satisfaction of chemotherapy patients. Practical implications: The results provide elements for the training of healthcare professionals regarding the importance of the emotional support that can be offered to cancer patients during their treatment.
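The abstract does not say which kappa statistic was used for the 0.834 agreement figure; assuming the common choice of Cohen's kappa for two raters, a minimal sketch of its computation over two judges' category codes (invented labels) is:

```python
from sklearn.metrics import cohen_kappa_score

# Invented category codes assigned by two independent judges to the same excerpts:
# 1 = technical without support, 2 = technical with support, 3 = insufficient.
judge_1 = [1, 2, 2, 3, 1, 2, 3, 1, 2, 2, 1, 3, 2, 2]
judge_2 = [1, 2, 2, 3, 1, 2, 3, 2, 2, 2, 1, 3, 2, 1]

print(f"Cohen's kappa: {cohen_kappa_score(judge_1, judge_2):.3f}")
```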
Abstract:
This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. This study outlines the application of compositional data analysis (CoDa) to the calibration of geochemical data and to the multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences has required the use of core-scanning X-ray fluorescence (XRF) spectrometry, for both terrestrial and marine sedimentary sequences. Initial XRF data are generally unusable in 'raw' format, requiring data processing in order to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as by matrix effects. Log-ratio based calibration schemes have been developed and applied to clastic sedimentary sequences, focusing mainly on energy dispersive XRF (ED-XRF) core-scanning. This study applied high-resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a subset of core-scan and conventional ED-XRF data to quantify elemental composition, providing a robust calibration scheme based on reduced major axis regression of log-ratio transformed geochemical data. Through partial least squares (PLS) modelling of the geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
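A compact sketch of the log-ratio calibration idea described above: log-ratios of core-scanner counts are regressed against log-ratios of conventional ED-XRF concentrations using reduced major axis (RMA) regression. The element choice, reference element and arrays below are placeholders, not values from the study.

```python
import numpy as np

def rma_fit(x, y):
    """Reduced major axis regression: slope = sign(r) * sd(y) / sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

# Placeholder data: scanner counts and conventional ED-XRF concentrations for an
# element of interest and a reference element (here Ti) measured on the same samples.
scanner_el = np.array([850.0, 1200.0, 640.0, 990.0])
scanner_ti = np.array([300.0, 420.0, 250.0, 380.0])
edxrf_el = np.array([1.8, 2.6, 1.3, 2.1])
edxrf_ti = np.array([0.52, 0.70, 0.44, 0.63])

x = np.log(scanner_el / scanner_ti)  # log-ratio of scanner counts
y = np.log(edxrf_el / edxrf_ti)      # log-ratio of measured concentrations
slope, intercept = rma_fit(x, y)
print(f"calibration: log-ratio_conc = {slope:.2f} * log-ratio_counts + {intercept:.2f}")
```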
Abstract:
The advancement of GPS technology has made it possible to use GPS devices not only as orientation and navigation tools, but also as tools to track spatiotemporal information. GPS tracking data can be broadly applied in location-based services, such as studying the spatial distribution of economic activity, transportation routing and planning, traffic management and environmental control. Knowledge of how to process the data from a standard GPS device is therefore crucial for further use. Previous studies have considered various data-processing issues individually. This paper, however, aims to outline a general procedure for processing GPS tracking data. The procedure is illustrated step by step using real-world GPS data of car movements in Borlänge, in central Sweden.
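As an illustration of the kind of step-by-step processing such a procedure involves, a minimal sketch follows: parse timestamped fixes, compute segment distances with the haversine formula, derive speeds, and drop fixes implying implausible jumps. The file name, column names and speed threshold are assumptions, not the paper's specification.

```python
import math
import pandas as pd

def haversine_m(lat1, lon1, lat2, lon2, r=6_371_000.0):
    """Great-circle distance in metres between two WGS84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Assumed columns: timestamp, lat, lon (one row per GPS fix).
track = pd.read_csv("track.csv", parse_dates=["timestamp"]).sort_values("timestamp")

prev = track.shift(1)
track["dist_m"] = [
    0.0 if pd.isna(p_lat) else haversine_m(p_lat, p_lon, lat, lon)
    for p_lat, p_lon, lat, lon in zip(prev["lat"], prev["lon"], track["lat"], track["lon"])
]
track["dt_s"] = track["timestamp"].diff().dt.total_seconds()
track["speed_ms"] = track["dist_m"] / track["dt_s"]

# Remove fixes implying implausible speeds (> 50 m/s for a car) before summarising.
clean = track[track["speed_ms"].isna() | (track["speed_ms"] <= 50)]
print(f"travelled distance: {clean['dist_m'].sum() / 1000:.1f} km")
```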
Abstract:
The production and perception of music is a multimodal activity involving auditory, visual and conceptual processing, integrating these with prior knowledge and environmental experience. Musicians use expressive physical nuances to highlight salient features of the score. The question arises within the literature as to whether performers' non-technical, non-sound-producing movements may be communicatively meaningful and convey important structural information to audience members and co-performers. In the light of previous performance research (Vines et al., 2006; Wanderley, 2002; Davidson, 1993), and considering findings within co-speech gestural research and auditory and audio-visual neuroscience, this thesis examines the nature of those movements not directly necessary for the production of sound, and their particular influence on audience perception. In the current research, 3D performance analysis is conducted using the Vicon 12-camera system and Nexus data-processing software. Performance gestures are identified as repeated patterns of motion relating to musical structure, which not only express phrasing and structural hierarchy but are consistently and accurately interpreted as such by a perceiving audience. Gestural characteristics are analysed across performers and performance styles using two Chopin preludes selected for their diverse yet comparable structures (Opus 28, Nos. 7 and 6). Effects on perceptual judgements of presentation modes (visual-only, auditory-only, audiovisual, full- and point-light) and viewing conditions are explored. This thesis argues that while performance style is highly idiosyncratic, piano performers reliably generate structural gestures through repeated patterns of upper-body movement. The shapes and locations of phrasing motions particular to the sample of performers investigated are identified. Findings demonstrate that, despite the personalised nature of the gestures, performers use increased velocity of movement to emphasise musical structure, and that observers accurately and consistently locate phrasing junctures where these patterns and variations in motion magnitude, shape and velocity occur. By viewing performance motions in polar (spherical) rather than Cartesian coordinate space it is possible to get mathematically closer to the movement generated by each of the nine performers, revealing distinct patterns of motion relating to phrasing structures, regardless of intended performance style. These patterns are highly individualised to each performer and performed piece. Instantaneous velocity analysis indicates a right-directed bias of performance motion variation at salient structural features within individual performances. Perceptual analyses demonstrate that audience members are able to accurately and effectively detect phrasing structure from performance motion alone. This ability persists even for degraded point-light performances, where all extraneous environmental information has been removed. The relative contributions of audio, visual and audiovisual judgements demonstrate that the visual component of a performance positively impacts the overall accuracy of phrasing judgements, indicating that receivers are most effective in their recognition of structural segmentations when they can both see and hear a performance. Observers appear to make use of rapid online judgement heuristics, adjusting response processes quickly to adapt and perform accurately across multiple modes of presentation and performance style.
In line with existing theories in the literature, it is proposed that this processing ability may be related to the cognitive and perceptual interpretation of syntax within gestural communication during social interaction and speech. The findings of this research may have future impact on performance pedagogy, computational analysis and performance research, as well as potentially influencing future investigations of the cognitive aspects of musical and gestural understanding.
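A small sketch of two processing steps the thesis describes for motion-capture trajectories: converting marker positions from Cartesian to spherical (polar) coordinates, and estimating instantaneous speed by frame-to-frame differencing. The example trajectory and the 100 Hz sampling rate are assumptions for illustration only.

```python
import numpy as np

def to_spherical(xyz):
    """Convert an (n, 3) array of marker positions to (r, theta, phi):
    radius, polar angle from the vertical axis, and azimuth."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    theta = np.arccos(np.clip(z / np.where(r == 0, 1, r), -1, 1))
    phi = np.arctan2(y, x)
    return np.column_stack([r, theta, phi])

def instantaneous_speed(xyz, fs=100.0):
    """Frame-to-frame speed (units per second) from positions sampled at fs Hz."""
    return np.linalg.norm(np.diff(xyz, axis=0), axis=1) * fs

# Example: a fabricated wrist-marker trajectory sampled at 100 Hz.
t = np.linspace(0, 2, 200)
xyz = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])
print(to_spherical(xyz)[:3])
print(instantaneous_speed(xyz, fs=100.0).max())
```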
Abstract:
Project work presented to the Escola Superior de Educação of the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Master in Special Education – Cognitive and Motor Domain.
Abstract:
The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because this vulnerable roofing system is widespread in the historical built heritage. Shaking-table tests are carried out to explore the vault response under two support boundary conditions, involving four lateral confinement modes. Processing of the marker displacement data made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then provides the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the final objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study aims to continue the previous work and to take a further step in the research on ground-motion effects on masonry structures.
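A brief sketch of the diagonal-elongation check described above: given tracked positions of two markers on opposite diagonal bricks, the elongation is the change in their distance relative to the initial (undeformed) configuration. The marker trajectories and the critical threshold below are placeholders, not values from the study.

```python
import numpy as np

def diagonal_elongation(marker_a, marker_b):
    """Relative elongation over time of the chord between two tracked markers.

    marker_a, marker_b: (n_frames, 3) arrays of positions in metres.
    Returns (length(t) - length(0)) / length(0); positive values indicate
    elongation, negative values shortening.
    """
    length = np.linalg.norm(marker_a - marker_b, axis=1)
    return (length - length[0]) / length[0]

# Placeholder trajectories for two diagonal bricks (metres, three frames).
a = np.array([[0.00, 0.00, 1.20], [0.01, 0.00, 1.19], [0.02, 0.00, 1.18]])
b = np.array([[1.40, 1.40, 1.20], [1.41, 1.41, 1.21], [1.43, 1.42, 1.21]])

critical = 0.02  # placeholder critical elongation (2%)
print(diagonal_elongation(a, b) > critical)  # frames exceeding the threshold
```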