41 results for video as a research tool
in Aston University Research Archive
Abstract:
Problem-structuring group workshops can be used in organizations as a consulting tool and as a research tool. One example of the latter is using a problem-structuring method (PSM) to help a group tackle an organizational issue; meanwhile, researchers collect the participants' initial views, discussion of divergent views, the negotiated agreement, and the reasoning for outcomes emerging. Technology can help by supporting participants in freely sharing their opinions and by logging data for post-workshop analyses. For example, computers let participants share views anonymously and without being influenced by others (as well as logging those views), and video-cameras can record discussions and intra-group dynamics. This paper evaluates whether technology-supported Journey Making workshops can be effective research tools that can capture quality research data when compared against theoretical performance benchmarks and other qualitative research tools. © 2006 Operational Research Society Ltd. All rights reserved.
Abstract:
This paper examines the role of creative resources in the emergence of the Japanese video game industry. We argue that creative resources nurtured by the popular cartoon and animation sector, combined with technological knowledge accumulated in the consumer electronics industry, facilitated the emergence of a successful video game industry in Japan. First we trace the development of the industry from its origins to the rise of platform developers and software publishers. Then the knowledge and creative foundations that influenced the developmental trajectory of this industry are analyzed, with reference to its links with the consumer electronics industry and the cartoon and animation industry.
Abstract:
This thesis criticises many psychological experiments on 'pornography' which attempt to demonstrate how 'pornography' causes and/or equals rape. It challenges simplistic definitions of 'pornography', arguing that sexually explicit materials (SEM) are constructed and interpreted in a number of different ways, and demonstrates that how, when and where materials are depicted or viewed will influence perceptions and reactions. In addition, it opposes the overreliance on male undergraduates as participants in 'porn' research. Theories of feminist psychology and reflexivity are used throughout the thesis, and provide a detailed contextual framework in a complex area. Results from a number of interlinking studies which use a variety of methodological approaches (focus groups, questionnaires and content analysis) indicate how contextual issues are omitted in much existing research on SEM. These include the views and experiences participants hold prior to completing SEM studies; their opinions about those who 'use' 'pornography'; their understanding of key terms linked with SEM (e.g. pornography and erotica); and discussions of sexual magazines aimed at male and female audiences. Participants' reactions to images and texts associated with SEM presented in different contexts are discussed. Three main conclusions are drawn from this thesis. Firstly, images deemed 'pornographic' differ across historical and cultural periods and political, economic and social climates, so 'experimental' approaches may not always be the most appropriate research tool. Secondly, there is not one definition, source, or factor which may be named 'pornography'; and thirdly, the context and presentation of materials influence how images are perceived and reacted to. The thesis argues that a number of factors influence views of 'pornography', suggesting SEM may be 'in the eye of the beholder'.
Abstract:
Purpose: A clinical evaluation of the Grand Seiko Auto Ref/Keratometer WAM-5500 (Japan) was performed to evaluate validity and repeatability compared with non-cycloplegic subjective refraction and Javal–Schiotz keratometry. An investigation into the dynamic recording capabilities of the instrument was also conducted. Methods: Refractive error measurements were obtained from 150 eyes of 75 subjects (aged 25.12 ± 9.03 years), subjectively by a masked optometrist, and objectively with the WAM-5500 at a second session. Keratometry measurements from the WAM-5500 were compared to Javal–Schiotz readings. Intratest variability was examined on all subjects, whilst intertest variability was assessed on a subgroup of 44 eyes 7–14 days after the initial objective measures. The accuracy of the dynamic recording mode of the instrument and its tolerance to longitudinal movement was evaluated using a model eye. An additional evaluation of the dynamic mode was performed using a human eye in relaxed and accommodated states. Results: Refractive error determined by the WAM-5500 was found to be very similar (p = 0.77) to subjective refraction (difference, -0.01 ± 0.38 D). The instrument was accurate and reliable over a wide range of refractive errors (-6.38 to +4.88 D). WAM-5500 keratometry values were steeper by approximately 0.05 mm in both the vertical and horizontal meridians. High intertest repeatability was demonstrated for all parameters measured: for sphere, cylinder power and MSE, over 90% of retest values fell within ±0.50 D of initial testing. In dynamic (high-speed) mode, the root-mean-square of the fluctuations was 0.005 ± 0.0005 D and a high level of recording accuracy was maintained when the measurement ring was significantly blurred by longitudinal movement of the instrument head. Conclusion: The WAM-5500 Auto Ref/Keratometer represents a reliable and valid objective refraction tool for general optometric practice, with important additional features allowing pupil size determination and easy conversion into high-speed mode, increasing its usefulness post-surgically following accommodating intra-ocular lens implantation, and as a research tool in the study of accommodation.
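The agreement and repeatability figures quoted above (mean difference ± SD against subjective refraction, and the proportion of retest values within ±0.50 D) can be computed with very little code. The sketch below is illustrative only and is not taken from the paper; the function names, array names and sample values are hypothetical, and refractions are assumed to have already been converted to mean spherical equivalent (MSE) in dioptres.

```python
import numpy as np

def agreement_summary(objective_d, subjective_d):
    """Mean difference and sample SD (Bland-Altman style) between two sets of MSE refractions, in dioptres."""
    diff = np.asarray(objective_d, dtype=float) - np.asarray(subjective_d, dtype=float)
    return diff.mean(), diff.std(ddof=1)

def percent_within(test_d, retest_d, tolerance=0.50):
    """Percentage of test-retest differences that fall within +/- tolerance dioptres."""
    diff = np.abs(np.asarray(retest_d, dtype=float) - np.asarray(test_d, dtype=float))
    return 100.0 * np.mean(diff <= tolerance)

# Illustrative call with made-up values (not data from the study)
objective = [-1.25, -0.50, 2.00, -6.00]
subjective = [-1.00, -0.75, 2.25, -6.25]
print(agreement_summary(objective, subjective))
print(percent_within(objective, subjective))
```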
Abstract:
Following a scene-setting introduction are detailed reviews of the relevant scientific principles, thermal analysis as a research tool and the development of the zinc-aluminium family of alloys. A recently introduced simultaneous thermal analyser, the STA 1500, its use for differential thermal analysis (DTA) being central to the investigation, is described, together with the sources of support information: chemical analysis, scanning electron microscopy, ingot cooling curves and fluidity spiral castings. The compositions of alloys tested were from the binary zinc-aluminium system, the ternary zinc-aluminium-silicon system at 30%, 50% and 70% aluminium levels, binary and ternary alloys with additions of copper and magnesium to simulate commercial alloys, and five widely used commercial alloys. Each alloy was shotted to provide the smaller, 100 mg, representative sample required for DTA. The STA 1500 was characterised and calibrated with commercially pure zinc, and an experimental procedure established for the determination of DTA heating curves at 10°C per minute and cooling curves at 2°C per minute. Phase change temperatures were taken from DTA traces, most importantly, liquidus from a cooling curve and solidus from both heating and cooling curves. The accepted zinc-aluminium binary phase diagram was endorsed with the added detail that the eutectic is at 5.2% aluminium rather than 5.0%. The ternary eutectic trough was found to run through the points 70% Al, 7.1% Si, 545°C; 50% Al, 3.9% Si, 520°C; 30% Al, 1.4% Si, 482°C. The dendrite arm spacing in samples after DTA increased with increasing aluminium content from 130 µm at 30% to 220 µm at 70%. The smallest dendrite arm spacing of 60 µm was in the 30% aluminium 2% silicon alloy. A 1 kg ingot of the 10% aluminium binary alloy, insulated with Kaowool, solidified at the same 2°C per minute rate as the DTA samples. A similar-sized sand casting was solidified at 3°C per minute and a chill casting at 27°C per minute. During metallographic examination the following features were observed: heavily cored phase which decomposed into ' and '' on cooling; needles of the intermetallic phase FeAl4; copper-containing ternary eutectic and copper-rich T phase.
Abstract:
Aims: To establish the sensitivity and reliability of objective image analysis in direct comparison with subjective grading of bulbar hyperaemia. Methods: Images of the same eyes were captured with a range of bulbar hyperaemia caused by vasodilation. The progression was recorded and 45 images extracted. The images were objectively analysed on 14 occasions using previously validated edge-detection and colour-extraction techniques. They were also graded by 14 eye-care practitioners (ECPs) and 14 non-clinicians (NCLs) using the Efron scale. Six ECPs repeated the grading on three separate occasions. Results: Subjective grading was only able to differentiate images with differences in grade of 0.70-1.03 Efron units (sensitivity of 0.30-0.53), compared to 0.02-0.09 Efron units with the objective techniques (sensitivity of 0.94-0.99). Significant differences were found between ECPs, and individual repeats were also inconsistent (p < 0.001). Objective analysis was 16 times more reliable than subjective analysis. The NCLs used wider ranges of the scale but were more variable than the ECPs, implying that training may have an effect on grading. Conclusions: Objective analysis may offer a new gold standard in anterior ocular examination, and should be developed further as a clinical research tool to allow more highly powered analysis and to enhance the clinical monitoring of anterior eye disease.
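For readers unfamiliar with how objective image analysis of bulbar hyperaemia works in practice, the sketch below illustrates the two families of measures named in the abstract: colour extraction (relative redness) and edge detection (vessel-edge density). It is a minimal, hedged illustration in NumPy, not the validated technique used in the study; the gradient threshold and the assumption of an RGB image scaled 0-1 are arbitrary choices for the example.

```python
import numpy as np

def relative_redness(rgb):
    """Colour-extraction metric: mean share of the red channel in total pixel intensity."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r / (r + g + b + 1e-9)))

def edge_density(rgb, threshold=0.05):
    """Edge-detection metric: fraction of pixels whose intensity gradient exceeds a threshold."""
    grey = rgb.mean(axis=-1)
    gy, gx = np.gradient(grey)
    return float(np.mean(np.hypot(gx, gy) > threshold))

# Illustrative call on a random 'image'; a real analysis would use a cropped conjunctival photograph
image = np.random.rand(480, 640, 3)
print(relative_redness(image), edge_density(image))
```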
Abstract:
This article uses a research project into the online conversations of sex offenders and the children they abuse to further the arguments for the acceptability of experimental work as a research tool for linguists. The research reported here contributes to the growing body of work within linguistics that has found experimental methods to be useful in answering questions about representation and constraints on linguistic expression (Hemforth 2013). The wider project examines online identity assumption in online paedophile activity and the policing of such activity, and involves dealing with the linguistic analysis of highly sensitive sexual grooming transcripts. Within the linguistics portion of the project, we examine theories of idiolect and identity through analysis of the ‘talk’ of perpetrators of online sexual abuse, and of the undercover officers that must assume alternative identities in order to investigate such crimes. The essential linguistic question in this article is methodological and concerns the applicability of experimental work to exploration of online identity and identity disguise. Although we touch on empirical questions, such as the sufficiency of linguistic description that will enable convincing identity disguise, we do not explore the experimental results in detail. In spite of the preference within a range of discourse analytical paradigms for ‘naturally occurring’ data, we argue that not only does the term prove conceptually problematic, but in certain contexts, and particularly in the applied forensic context described, a rejection of experimentally elicited data would limit the possible types and extent of analyses. Thus, it would restrict the contribution that academic linguistics can make in addressing a serious social problem.
Abstract:
The yeast Saccharomyces cerevisiae is an important model organism for the study of cell biology. The similarity between yeast and human genes and the conservation of fundamental pathways mean it can be used to investigate characteristics of healthy and diseased cells throughout the lifespan. Yeast is an equally important biotechnological tool that has long been the organism of choice for the production of alcoholic beverages, bread and a large variety of industrial products. For example, yeast is used to manufacture biofuels, lubricants, detergents, industrial enzymes, food additives and pharmaceuticals such as anti-parasitics, anti-cancer compounds, hormones (including insulin), vaccines and nutraceuticals. Its function as a cell factory is possible because of the speed with which it can be grown to high cell yields, the knowledge that it is generally recognized as safe (GRAS) and the ease with which metabolism and cellular pathways, such as translation, can be manipulated. In this thesis, these two pathways are explored in the context of their biotechnological application to ageing research: (i) understanding translational processes during the high-yielding production of membrane protein drug targets and (ii) the manipulation of yeast metabolism to study the molecule L-carnosine, which has been proposed to have anti-ageing properties. In the first of these themes, the yeast strains spt3Δ, srb5Δ, gcn5Δ and yTHCBMS1 were examined, since they have previously been demonstrated to dramatically increase the yields of a target membrane protein (the aquaporin, Fps1) compared with wild-type cells. The mechanisms underlying this discovery were therefore investigated. All high-yielding strains were shown to have an altered translational state (mostly characterised by an initiation block) and constitutive phosphorylation of the translational initiation factor eIF2α. The relevance of the initiation block was further supported by the finding that other strains with known initiation blocks are also high yielding for Fps1. A correlation in all strains between increased Fps1 yields and increased production of the transcriptional activator protein Gcn4 suggested that yields are subject to translational control. Analysis of the 5′ untranslated region (UTR) of FPS1 revealed two upstream open reading frames (uORFs). Mutagenesis data suggest that high-yielding strains may circumvent these control elements through either a leaky-scanning or a re-initiation mechanism. In the second theme, the dipeptide L-carnosine (β-alanyl-L-histidine) was investigated: it has previously been shown to inhibit the growth of cancer cells but to delay senescence in cultured human fibroblasts and extend the lifespan of male fruit flies. To understand these apparently contradictory properties, the effects of L-carnosine on yeast were studied. S. cerevisiae can respire aerobically when grown on a non-fermentable carbon source as a substrate but has a respiro-fermentative metabolism when grown on a fermentable carbon source; these metabolisms mimic normal-cell and cancerous-cell metabolisms, respectively. When yeast were grown on fermentable carbon sources in the presence of L-carnosine, a reduction in cell growth and viability was observed, which was not apparent for cells grown on a non-fermentable carbon source. The metabolism-dependent mechanism was confirmed in the respiratory yeast species Pichia pastoris. Further analysis of S. cerevisiae strains with deletions in their nutrient-sensing pathway, which result in an increase in respiratory metabolism, confirmed the metabolism-dependent effects of L-carnosine.
Abstract:
Engineering education in the United Kingdom is at the point of embarking upon an interesting journey into uncharted waters. At no point in the past have there been so many drivers for change and so many opportunities for the development of engineering pedagogy. This paper will look at how Engineering Education Research (EER) has developed within the UK and what differentiates it from the many small-scale practitioner interventions, perhaps without a clear research question or with little evaluation, which are presented at numerous staff development sessions, workshops and conferences. From this position some examples of current projects will be described, outcomes of funding opportunities will be summarised and the benefits of collaboration with other disciplines illustrated.

In this study, I will account for how the design of the task structure according to variation theory, as well as the probe-ware technology, makes the laws of force and motion visible and learnable and, especially, in the lab studied, makes Newton's third law visible and learnable. I will also, as a comparison, include data from a mechanics lab that uses the same probe-ware technology and deals with the same topics in mechanics, but uses a differently designed task structure. I will argue that the lower achievements on the FMCE test in this latter case can be attributed to these differences in the task structure of the lab instructions: according to my analysis, the necessary pattern of variation is not included in the design. I will also present a microanalysis of 15 hours of video recordings of engineering students' activities in a lab about impulse and collisions. The important object of learning in this lab is the development of an understanding of Newton's third law. The approach to analysing students' interaction using video data is inspired by ethnomethodology and conversation analysis, i.e. I focus on students' practical, contingent and embodied inquiry in the setting of the lab. I argue that my results corroborate variation theory and show that this theory can be used as a 'tool' for designing labs as well as for analysing labs and lab instructions. Thus my results have implications outside the domain of this study, and for understanding critical features of student learning in labs.

Engineering higher education is well used to change. As technology develops, the abilities employers expect of graduates expand, yet our understanding of how to make informed decisions about learning and teaching strategies does not expand without a conscious effort to do so. With the numerous demands of academic life, we often fail to acknowledge our incomplete understanding of how our students learn within our discipline. The journey facing engineering education in the UK is being driven by two classes of driver: firstly, those which we have been working to expand our understanding of, such as retention and employability; and secondly, new challenges such as substantial changes to funding systems allied with an increase in student expectations. Only through continued research can priorities be identified and addressed, and a coherent and strong voice for informed change be heard within the wider engineering education community.
This new position makes it even more important that through EER we acquire the knowledge and understanding needed to make informed decisions regarding approaches to teaching, curriculum design and measures to promote effective student learning. This then raises the question 'how does EER function within a diverse academic community?' Within an existing community of academics interested in taking meaningful steps towards understanding the ongoing challenges of engineering education a Special Interest Group (SIG) has formed in the UK. The formation of this group has itself been part of the rapidly changing environment through its facilitation by the Higher Education Academy's Engineering Subject Centre, an entity which through the Academy's current restructuring will no longer exist as a discrete Centre dedicated to supporting engineering academics. The aims of this group, the activities it is currently undertaking and how it expects to network and collaborate with the global EER community will be reported in this paper. This will include explanation of how the group has identified barriers to the progress of EER and how it is seeking, through a series of activities, to facilitate recognition and growth of EER both within the UK and with our valued international colleagues.
Abstract:
The point of departure for this study was a recognition of the differences in suppliers' and acquirers' judgements of the value of technology when transferred between the two, and the significant impacts of technology valuation on the establishment of technology partnerships and the effectiveness of technology collaborations. The perceptions, transfer strategies and objectives, perceived benefits and assessed technology contributions, as well as the associated costs and risks of both suppliers and acquirers, were seen to be at the core of these differences. This study hypothesised that the capability embodied in technology to yield future returns makes technology valuation distinct from the process of valuing manufacturing products. The study has hence gone beyond the dimensions of cost calculation and price determination discussed in the existing literature, by taking a broader view of how to achieve and share future added value from transferred technology. The core of technology valuation was argued to be the evaluation of the 'quality' of the capability (technology) in generating future value and the effectiveness of the transfer arrangement for the best use of such a capability. A dynamic approach comprising future value generation and realisation within the context of specific forms of collaboration was therefore adopted. The research investigations focused on the UK and Chinese machine tool industries, where there are many technology transfer activities and the value issue has already been recognised in practice. Data were gathered from three groups: machine tool manufacturing technology suppliers in the UK, acquirers in China, and machine tool users in China. Data collection methods included questionnaire surveys and case studies within all three groups. The study focused on identifying and examining the major factors affecting value, as well as their interactive effects on technology valuation, from both the supplier's and the acquirer's point of view. The survey results showed the perceptions and assessments of the owner's value and the transfer value from the supplier's and acquirer's points of view respectively. Benefits, costs and risks related to the technology transfer were the major factors affecting the value of technology. The impacts of transfer payment on the value of technology, through the sharing of financial benefits, costs and risks between partners, were assessed. A close relationship between technology valuation and transfer arrangements was established, through which technical requirements and strategic implications were considered. The case studies reflected the research propositions and revealed that benefits, costs and risks in the financial, technical and strategic dimensions interacted in the process of technology valuation within the context of technology collaboration. Further to the assessment of factors affecting value, a technology valuation framework was developed which suggests that technology attributes for the enhancement of contributory factors, and their contributions to the realisation of transfer objectives, need to be measured and compared with the associated costs and risks. The study concluded that technology valuation is a dynamic process involving the generation and sharing of future value and the interactions between financial, technical and strategic achievements.
Abstract:
Knowledge elicitation is a well-known bottleneck in the production of knowledge-based systems (KBS). Past research has shown that visual interactive simulation (VIS) could effectively be used to elicit episodic knowledge that is appropriate for machine learning purposes, with a view to building a KBS. Nonetheless, the VIS-based elicitation process still has much room for improvement. Based in the Ford Dagenham Engine Assembly Plant, a research project is being undertaken to investigate the individual/joint effects of visual display level and mode of problem case generation on the elicitation process. This paper looks at the methodology employed and some issues that have been encountered to date. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
Purpose - To introduce the contents of the special issue, and provide an integrative overview of the development of observational methodologies in marketing research, as well as some directions for the future. Design/methodology/approach - A historical review of the development of observational methods, beginning with philosophical foundations, is provided. Key philosophical debates are summarized, and trends in observational methods are described and analyzed, with particular reference to the impact of technology. Following this, the contributions to the special issue are summarized and brought together. Findings - Observational research in marketing is more than the well-known method of "participant-observation." In fact, technology has the potential to revolutionize observational research, and move it beyond a solely "qualitative" method. The internet, video, scanner-tracking, and neuroimaging methods are all likely to have a big impact on the development of traditional and innovative observation methods in the future. The articles in the special issue provide a good overview of these developments. Research limitations/implications - The views of the authors may differ from those of others. Practical implications - Observation is a far more wide-ranging strategy than many perceive. There is a need for more expertise in all types of observational methodologies within marketing research schools and departments, in order to take account of the vast opportunities which are currently emerging. Originality/value - Provides an original perspective on observational methods, and serves as a useful overview of trends and developments in the field.
Abstract:
This accessible, practice-oriented and compact text provides a hands-on introduction to the principles of market research. Using the market research process as a framework, the authors explain how to collect and describe the necessary data and present the most important and frequently used quantitative analysis techniques, such as ANOVA, regression analysis, factor analysis, and cluster analysis. An explanation is provided of the theoretical choices a market researcher has to make with regard to each technique, as well as how these are translated into actions in IBM SPSS Statistics. This includes a discussion of what the outputs mean and how they should be interpreted from a market research perspective. Each chapter concludes with a case study that illustrates the process based on real-world data. A comprehensive web appendix includes additional analysis techniques, datasets, video files and case studies. Several mobile tags in the text allow readers to quickly browse related web content using a mobile device.
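The book's worked examples are carried out in IBM SPSS Statistics; as a rough analogue for readers without SPSS, the sketch below runs one of the techniques it covers, a simple linear regression, on a small made-up dataset in Python. The variable names and numbers are purely illustrative and do not come from the book.

```python
import numpy as np

# Hypothetical data: advertising spend (x, in thousands) vs. sales (y, in units)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix with an intercept column, solved by ordinary least squares
X = np.column_stack([np.ones_like(x), x])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Coefficient of determination (R squared) for the fitted line
fitted = X @ coeffs
r_squared = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"sales ~ {coeffs[0]:.2f} + {coeffs[1]:.2f} * spend, R^2 = {r_squared:.3f}")
```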
Abstract:
In this article we describe and evaluate the process of conducting online survey research about the legal recognition of same-sex relationships (key findings from which we have reported elsewhere, see Harding and Peel, 2006). Our aim in so doing is to contribute to the growing generic literature on internet-based research methods (Nosek et al., 2002; Rhodes et al., 2003; Stern, 2003; Strickland et al., 2003; Thomas et al., 2000) to the research methods literature within lesbian, gay, bisexual, trans and queer (LGBTQ) psychologies (Fish, 2000; Morris and Rothblum, 1999; Meezan and Martin, 2003; Mustanski, 2001) and also to extend the germinal literature focusing on internet research with non-heterosexual groups (Elford et al., 2004; Ellis et al., 2003; Ross et al., 2000). We begin by discussing the process of developing the online survey tool, before outlining the experience of the survey ‘going live’ and providing details of who completed the survey. We conclude by exploring some of the positives and pitfalls of this type of research methodology.