968 results for "Analysis process"
Abstract:
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice.
Abstract:
Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although some programmes are available for interpreting bacterial transcriptomics data and CGH microarray data for assessing genetic stability in oncogenes, none is designed specifically to capture the mosaic nature of bacterial genomes. Consequently a bottleneck persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, for understanding bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture modelling technique. We have also shown that methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
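To illustrate the kernel-density-estimator step named in this abstract, the following is a minimal sketch, not the authors' implementation: it assumes normalised log2(test/reference) ratios in which conserved genes cluster near zero and absent or divergent genes form a lower mode, and it places the presence/absence cut-off at the deepest valley of the estimated density.

```python
# Minimal sketch of a KDE-based cut-off between "present" and
# "absent/divergent" genes, assuming normalised log2(test/reference) ratios.
import numpy as np
from scipy.stats import gaussian_kde

def presence_cutoff(log_ratios, grid_size=512):
    """Place the cut-off at the deepest interior valley of the KDE."""
    kde = gaussian_kde(log_ratios)
    grid = np.linspace(log_ratios.min(), log_ratios.max(), grid_size)
    density = kde(grid)
    # Interior local minima of the estimated density
    interior = (density[1:-1] < density[:-2]) & (density[1:-1] < density[2:])
    if not interior.any():
        return None
    # Choose the valley where the density is lowest
    return grid[1:-1][interior][np.argmin(density[1:-1][interior])]

# Usage: genes with a log ratio below the cut-off are called absent/divergent.
ratios = np.concatenate([np.random.normal(-2.0, 0.5, 300),   # divergent genes
                         np.random.normal(0.0, 0.3, 2700)])  # conserved genes
cut = presence_cutoff(ratios)
calls = np.where(ratios < cut, "absent/divergent", "present")
```

Because the cut-off is recomputed from each array's own density, the approach adapts dynamically to per-experiment signal distributions, which is the property the abstract credits for its good performance.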
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters and thereby increase knowledge of the formations of interest. These more realistic parameters can be used in real time to adapt the design to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purposes of inverse analysis. In fact, there is a lack of knowledge about which types and quantities of measurements are needed for successful identification of the parameters of interest. The optimisation algorithm chosen for the identification procedure may also matter here. In this work, this problem is addressed using a theoretical case with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms: a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters in order to identify several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
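The inverse-analysis loop described here can be sketched as a misfit minimisation. The snippet below is illustrative only: `simulate` is a hypothetical placeholder for the real forward (e.g. finite element) model, the parameters E and K0 are assumed examples, and SciPy's differential evolution stands in for the paper's evolution strategy algorithm, which is a related but distinct evolutionary method.

```python
# Minimal sketch of inverse analysis from tunnel monitoring data: find the
# geomechanical parameters whose predicted displacements best match the
# measurements, with a gradient-based and an evolutionary optimiser.
import numpy as np
from scipy.optimize import minimize, differential_evolution

measured = np.array([12.1, 8.4, 5.2])  # e.g. convergences/settlements (mm)

def simulate(params):
    # Hypothetical forward model: replace with a call to the FE/FD solver.
    E, K0 = params
    return np.array([600.0 / E, 420.0 / E, 150.0 * K0 / E]) * 1e3

def misfit(params):
    # Least-squares error between predicted and measured quantities
    return np.sum((simulate(params) - measured) ** 2)

bounds = [(20.0, 200.0), (0.3, 1.5)]  # assumed ranges for E (MPa) and K0
grad = minimize(misfit, x0=[100.0, 0.8], bounds=bounds)  # gradient-based
evo = differential_evolution(misfit, bounds, seed=0)     # evolutionary
print(grad.x, evo.x)
```

The contrast the abstract draws maps directly onto these two calls: the gradient-based run is fast but can stall in local minima when monitoring data are scarce, while the evolutionary run explores the parameter space more robustly at higher computational cost.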
Abstract:
The aim of this research was to develop a piping stress analysis guideline to be widely used in Neste Jacobs Oy’s domestic and foreign projects. The company’s former guideline for performing stress analysis was partial and lacked important features, which this research set out to fix. The development of the guideline was based on a literature review and on gathering existing knowledge from experts in piping engineering. A case study approach was employed by performing stress analysis on an existing project with the help of the new guideline. Piping components, piping engineering in the process industry, and piping stress analysis were studied in the theory section of this research. The existing piping standards were also studied and compared with one another. By combining the theory found in the literature with the extensive experience and know-how collected from the company’s employees, a new guideline for stress analysis was developed, intended for wide use across projects. The purpose of the guideline was to clarify issues such as which piping has to be analyzed, how different material values are determined, and how the results are to be reported. As a result, an extensive and comprehensive guideline for stress analysis was created. The new guideline defines formerly unclear points more clearly and sets clear parameters for performing calculations. It is meant to be used by both new and experienced analysts, and with its aid the calculation process was unified throughout the company's organization. The case study demonstrates how the guideline is utilized in practice and how it benefits the calculation process.
Abstract:
Sugarcane monosaccharides are reducing sugars, and the classical analytical methodologies (Lane-Eynon, Benedict, complexometric-EDTA, Luff-Schoorl, Musson-Walker, Somogyi-Nelson) are based on the reduction of copper ions in alkaline solutions. In Brazil, some factories use Lane-Eynon, others use the equipment referred to as “REDUTEC”, and still others analyze reducing sugars with a mathematical model. The objective of this paper is to understand the relationship between variations in millivolts, mass and reducing-sugar content during the analysis process, and to generate an automatic model for this process. The work herein uses the “REDUTEC” equipment, a digital balance, a peristaltic pump, a digital camcorder, and mathematical and graphics software. We conclude that millivolts, mass and reducing-sugar contents exhibit a good mathematical correlation, and the model generated was benchmarked against low concentrations of reducing sugars (<0.3%). Using the model created herein, reducing-sugar analyses can be automated with the new equipment.
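The correlation the paper reports can be pictured with a simple least-squares fit. The sketch below uses hypothetical readings and an assumed linear model form; the paper's actual model is not reproduced here.

```python
# Minimal sketch of fitting a model relating the millivolt signal and the
# titrated mass to the reducing-sugar content. Data values are illustrative.
import numpy as np

millivolts = np.array([310.0, 295.0, 280.0, 262.0, 240.0])
mass_g     = np.array([1.10, 1.35, 1.62, 1.94, 2.30])
rs_percent = np.array([0.05, 0.10, 0.15, 0.22, 0.29])  # reducing sugars (<0.3%)

# Assumed multiple linear model: RS% = a*mV + b*mass + c (least squares)
A = np.column_stack([millivolts, mass_g, np.ones_like(mass_g)])
coef, *_ = np.linalg.lstsq(A, rs_percent, rcond=None)

def predict_rs(mv, m):
    """Automated estimate of reducing-sugar content from new readings."""
    return coef @ np.array([mv, m, 1.0])

print(predict_rs(270.0, 1.8))
```

Once calibrated, such a model lets the equipment convert live millivolt and balance readings into a reducing-sugar estimate without manual titration endpoints, which is the automation the paper aims at.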
Abstract:
This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of microarray data quality and the incorporation of quality information into subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the considerable noise and hence random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is essential.
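Of the steps listed, intensity normalization is easy to make concrete. The sketch below shows quantile normalization, one common between-array choice; the article surveys several methods (e.g. loess-based ones), so this is an illustrative example rather than the article's prescribed procedure.

```python
# Minimal sketch of quantile normalisation of log2 fluorescence intensities:
# force every array to share the same empirical intensity distribution.
import numpy as np

def quantile_normalize(X):
    """X: genes x arrays matrix of log2 intensities (ties ignored)."""
    order = np.argsort(X, axis=0)                  # per-array ranking
    ranked_mean = np.sort(X, axis=0).mean(axis=1)  # mean value at each rank
    Xn = np.empty_like(X)
    for j in range(X.shape[1]):
        Xn[order[:, j], j] = ranked_mean           # put rank means back
    return Xn

X = np.log2(np.random.lognormal(8, 1, size=(1000, 4)))  # simulated arrays
print(quantile_normalize(X).mean(axis=0))  # arrays now directly comparable
```

After this step, differences between arrays reflect biology rather than dye or scanner effects, which is what makes the downstream pattern discovery and significance assessment meaningful.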
Abstract:
Current understanding of the synaptic organization of the brain depends to a large extent on knowledge about the synaptic inputs to neurons. The dendritic surfaces of pyramidal cells (the most common neuron in the cerebral cortex) are covered by thin protrusions named dendritic spines. These are the targets of most excitatory synapses in the cerebral cortex, and dendritic spines are therefore critical in learning, memory and cognition. This paper presents a new method that facilitates the analysis of the 3D structure of spine insertions in dendrites, providing insight into spine distribution patterns. The method is based both on straightening and unrolling transformations that move the analysis process to a planar, unfolded arrangement, and on DISPINE, an interactive environment that supports the visual analysis of 3D patterns.
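The core of the "straighten and unroll" idea can be sketched for the simplest case: a locally straight, roughly cylindrical dendrite segment. The paper's transformations handle curved, reconstructed dendrites and are more elaborate; the code below only illustrates the geometric principle of mapping 3D insertion points to planar (arc length, unrolled circumference) coordinates.

```python
# Minimal sketch: map spine insertion points on a cylindrical dendrite
# segment to a planar, unfolded arrangement (assumed straight local axis).
import numpy as np

def unroll(points, axis_origin, axis_dir, radius):
    """points: (N,3) spine insertion coordinates.
    Returns (N,2): position along the axis and unrolled circumference."""
    d = axis_dir / np.linalg.norm(axis_dir)
    rel = points - axis_origin
    s = rel @ d                        # arc length along the dendrite axis
    radial = rel - np.outer(s, d)      # component perpendicular to the axis
    # Build a fixed orthonormal frame (u, v) around the axis
    u = np.cross(d, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-8:       # axis parallel to z: pick another ref
        u = np.cross(d, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    theta = np.arctan2(radial @ v, radial @ u)
    return np.column_stack([s, radius * theta])  # planar, unfolded layout

pts = np.random.normal(size=(50, 3)) + [0.0, 0.0, 5.0]
flat = unroll(pts, np.zeros(3), np.array([0.0, 0.0, 1.0]), radius=0.5)
```

In the flattened layout, clustering or regular spacing of spine insertions becomes a 2D point-pattern question, which is what makes distribution analysis and interactive visual inspection tractable.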
Abstract:
The goal of my study is to investigate the relationship between selected deictic shields on the pronoun ‘I’ and the involvement/detachment dichotomy in a sample of television news interviews, focusing on the use of personal pronouns in political discourse. Drawing upon Caffi’s (2007) classification of mitigating devices into bushes, hedges and shields, I examine how a selection of ‘I’-related deictic shields is employed in a collection of news interviews broadcast during the electoral campaign prior to the UK 2015 General Election. My purpose is to uncover the frequency of each of the selected linguistic items and their pragmatic functions in the involvement/detachment dichotomy. The research is structured as follows. Chapter 1 provides an account of previous studies in the three main areas of research: speech event analysis, institutional interaction and the news interview, and the UK 2015 General Election television programmes. Chapter 2 is centred on the involvement/detachment dichotomy: I provide an overview of nonlinguistic and linguistic features of involvement and detachment at all levels of sentence structure. Chapter 3 contains a detailed account of the data collection and data analysis process. Chapter 4 describes the results in three steps: quantitative analysis, qualitative analysis and discussion of the pragmatic functions of the selected linguistic features of involvement and detachment. Chapter 5 includes a brief summary of the investigation, reviews the main findings, and indicates limitations of the study and possible directions for further research. The results of the analysis confirm that, while some of the linguistic items examined point toward involvement, others have a detaching effect. I therefore conclude that deictic shields on the pronoun ‘I’ permit the realisation of the involvement/detachment dichotomy in the speech genre of the news interview.
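The quantitative step (uncovering per-item frequencies) amounts to counting matches of each expression in the transcripts. A minimal sketch follows; the expressions listed are illustrative stand-ins, not the study's actual inventory of deictic shields.

```python
# Minimal sketch of per-item frequency counts over interview transcripts.
# The shield list is hypothetical; substitute the study's selected items.
import re
from collections import Counter

shields = ["i think", "i believe", "i mean", "i suppose", "i guess"]

def shield_frequencies(transcript):
    """Raw frequency of each shield expression, case-insensitive."""
    text = transcript.lower()
    return Counter({s: len(re.findall(r"\b" + re.escape(s) + r"\b", text))
                    for s in shields})

sample = "I think we were right. I mean, I believe voters agree. I think so."
print(shield_frequencies(sample))
```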
Abstract:
Background: Failing a high-stakes assessment at medical school is a major event for those who go through the experience. Students who fail at medical school may be more likely to struggle in professional practice, so helping individuals overcome problems and respond appropriately is important. Little is understood about what factors influence how individuals experience failure or make sense of the failing experience in remediation. The aim of this study was to investigate the complexity surrounding the failure experience from the student’s perspective using interpretative phenomenological analysis (IPA). Methods: The accounts of three medical students who had failed final re-sit exams were subjected to in-depth analysis using IPA methodology. IPA was used to analyse each transcript case by case, allowing the researcher to make sense of the participant’s subjective world. The analysis process allowed the complexity surrounding the failure to be highlighted, alongside a narrative describing how students made sense of the experience. Results: The circumstances surrounding students as they approached assessment and experienced failure at finals were a complex interaction between academic problems, personal problems (specifically finance and relationships), strained relationships with friends, family or faculty, and various mental health problems. Each student experienced multi-dimensional issues, each with their own individual combination of problems, but experienced remediation as a one-dimensional intervention focused only on improving performance in written exams. What these students needed included help with clinical skills, plus social and emotional support. Fear of termination of their course was a barrier to open communication with staff. Conclusions: These students’ experience of failure was complex. The experience of remediation is influenced by the way in which students make sense of failing. Generic remediation programmes may fail to meet the needs of students for whom personal, social and mental health issues are part of the picture.
Abstract:
Thermodynamic stability measurements on proteins and protein-ligand complexes can offer insights not only into the fundamental properties of protein folding reactions and protein functions, but also into the development of protein-directed therapeutic agents to combat disease. Conventional calorimetric or spectroscopic approaches for measuring protein stability typically require large amounts of purified protein. This requirement has precluded their use in proteomic applications. Stability of Proteins from Rates of Oxidation (SPROX) is a recently developed mass spectrometry-based approach for proteome-wide thermodynamic stability analysis. Since the proteomic coverage of SPROX is fundamentally limited by the detection of methionine-containing peptides, the use of tryptophan-containing peptides was investigated in this dissertation. A new SPROX-like protocol was developed that measured protein folding free energies using the denaturant dependence of the rate at which globally protected tryptophan and methionine residues are modified with dimethyl(2-hydroxy-5-nitrobenzyl)sulfonium bromide and hydrogen peroxide, respectively. This so-called Hybrid protocol was applied to proteins in yeast and MCF-7 cell lysates and achieved a ~50% increase in proteomic coverage compared to probing only methionine-containing peptides. Subsequently, the Hybrid protocol was successfully utilized to identify and quantify both known and novel protein-ligand interactions in cell lysates. The ligands under study included the well-known Hsp90 inhibitor geldanamycin and the less well-understood omeprazole sulfide that inhibits liver-stage malaria. In addition to protein-small molecule interactions, protein-protein interactions involving Puf6 were investigated using the SPROX technique in comparative thermodynamic analyses performed on wild-type and Puf6-deletion yeast strains. A total of 39 proteins were detected as Puf6 targets, 36 of which were previously unknown to interact with Puf6. Finally, to facilitate the SPROX/Hybrid data analysis process and minimize human error, a Bayesian algorithm was developed for transition midpoint assignment. In summary, the work in this dissertation expanded the scope of SPROX and evaluated the use of SPROX/Hybrid protocols for characterizing protein-ligand interactions in complex biological mixtures.
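The transition-midpoint assignment that the Bayesian algorithm automates can be illustrated with a plain nonlinear fit: the denaturant dependence of a peptide's readout follows a sigmoidal two-state transition whose midpoint reports on stability. The sketch below uses simulated data and ordinary least-squares fitting; it is not the dissertation's Bayesian algorithm.

```python
# Minimal sketch: fit a peptide's denaturant-dependence curve to a sigmoid
# and extract the transition midpoint C1/2. Data are simulated.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(c, c_half, m, lo, hi):
    """Two-state transition versus denaturant concentration c (M)."""
    return lo + (hi - lo) / (1.0 + np.exp(m * (c - c_half)))

denaturant = np.linspace(0.0, 3.0, 10)               # e.g. GdmCl (M)
signal = sigmoid(denaturant, 1.4, 4.0, 0.05, 0.95)   # simulated readout
signal += np.random.normal(0, 0.02, signal.size)     # measurement noise

popt, _ = curve_fit(sigmoid, denaturant, signal, p0=[1.5, 3.0, 0.0, 1.0])
print(f"transition midpoint C1/2 = {popt[0]:.2f} M")
```

A ligand-induced shift of this midpoint toward higher denaturant concentrations is the stabilisation signature used to detect binding, which is how the protocol identifies protein-ligand interactions proteome-wide.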
Abstract:
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. Although CAQDAS has existed for decades, very few qualitative health researchers report using it. This may be due to the difficulty of mastering the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but to aid the analysis process, of which the researcher must always remain in control. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are basically data management tools that support the researcher during analysis.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Purpose. This study was designed to explore the cultural meaning and dimensions of quality of life from the perspective of Brazilian burn patients. Method. A qualitative research approach was used. Nineteen burn patients and their close relatives participated in this ethnographic study. Data were collected by means of direct observation and semi-structured interviews, conducted in a hospital outpatient clinic and during visits to patients’ homes. The following inter-related phases guided the analysis process: reading of the material and data reduction, data display, and conclusion outlining and verification. Results. Participants reported that quality of life is related to autonomy and the ability to work. The dimensions of quality of life included resuming work and functional ability, body image, having leisure, and interpersonal relationships. Their descriptions revealed their feelings and attitudes about resuming their previous activities and social lives, particularly concerning work. Conclusion. For burn patients, quality of life is associated with the concept of normality and the satisfactory performance of social roles in the context of family life and the social world. The results show the importance of the sociocultural dimension in the concept of quality of life for persons undergoing burn rehabilitation.
Abstract:
COORDINSPECTOR is a software tool for extracting the coordination layer of a software system. Such a reverse engineering process provides a clear view of the services actually invoked, as well as the logic behind those invocations. The analysis process is based on program slicing techniques and the generation of System Dependence Graphs and Coordination Dependence Graphs. The tool analyzes Common Intermediate Language (CIL), the native language of the Microsoft .Net Framework, making it suitable for processing systems developed in any .Net Framework compilable language. COORDINSPECTOR generates graphical representations of the coordination layer, together with business process orchestrations specified in WSBPEL 2.0.
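The slicing idea underlying the tool reduces, at its core, to reachability over a dependence graph. The sketch below shows a backward slice as reverse reachability from a slicing criterion; real SDG-based slicing (as used in COORDINSPECTOR) also distinguishes call and parameter edges and runs in two phases, and the node names here are hypothetical.

```python
# Minimal sketch of a backward slice over a dependence graph: collect every
# node the criterion transitively depends on. Node names are illustrative.
from collections import deque

# edges[n] = nodes that n depends on (data/control dependences)
edges = {
    "call:service.Invoke": ["recv:channel", "cfg:endpoint"],
    "recv:channel": ["init:channel"],
    "cfg:endpoint": [],
    "init:channel": [],
    "log:write": ["call:service.Invoke"],
}

def backward_slice(criterion):
    """All statements that the criterion (transitively) depends on."""
    seen, queue = {criterion}, deque([criterion])
    while queue:
        for dep in edges.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(backward_slice("call:service.Invoke"))  # coordination-relevant nodes
```

Slicing on the statements that perform service invocations is what isolates the coordination layer from the rest of the program's business logic.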
Abstract:
The study of motor behaviour, namely the areas of motor development and motor control, has provided a foundation for occupational therapy practice, offering a broader understanding of aspects related to movement analysis. However, the activity analysis process is usually carried out empirically, mainly because of the lack of methods that assess motor behaviour, and consequently the movements performed during activities, in an objective and precise way. In this context, this study sought to identify motor patterns in typically developing children between nine and ten years of age that characterise the performance of a functional motor task, using the BioStage® real-time motion capture and parametrization system. It also sought to determine whether the system could contribute to occupational therapy practice by providing data that can be used clinically. The tasks selected for analysis were the five throws proposed by the Bruininks-Oseretsky Test of Motor Proficiency: the unilateral and bilateral underhand throws, the unilateral and bilateral throws to the floor, and the (unilateral) throw at a target. The results indicate that at nine and ten years of age children share similar motor patterns, although slight variability in behaviour is still noticeable. It was also found that age, sex and physical exercise practice can influence the patterns used, in agreement with the literature. The BioStage® system proved to be an effective tool for movement analysis, providing detailed information on the children's motor behaviour during the tasks. It can therefore be an asset for occupational therapy practice, contributing to a more precise, objective and evidence-based activity analysis.
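One simple way to quantify the kind of pattern similarity reported here is to time-normalise the captured movement series and measure inter-child variability around the mean pattern. The sketch below uses simulated joint-angle data and does not reproduce the BioStage® parametrization.

```python
# Minimal sketch: time-normalise joint-angle trajectories from repeated
# throws and quantify the shared pattern and its variability. Simulated data.
import numpy as np

def normalize_trial(angles, n_points=101):
    """Resample one trial's joint-angle series to 0-100% of the movement."""
    t = np.linspace(0, 1, len(angles))
    return np.interp(np.linspace(0, 1, n_points), t, angles)

# One hypothetical shoulder-flexion series per child (varying durations)
trials = [np.sin(np.linspace(0, np.pi, n)) * 120 + np.random.normal(0, 3, n)
          for n in (87, 95, 102, 110)]
stack = np.vstack([normalize_trial(a) for a in trials])

mean_pattern = stack.mean(axis=0)   # the shared motor pattern
variability = stack.std(axis=0)     # residual inter-child variability
print(variability.mean())           # low value -> similar patterns
```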