Abstract:
A series of experiments is described, evaluating user recall of visualisations of historical chronology. Such visualisations are widely created but have not hitherto been evaluated. Users were tested on their ability to learn a sequence of historical events presented in a virtual environment (VE) fly-through visualisation, compared with the learning of equivalent material in other formats that are sequential but lack the 3D spatial aspect. Memorability is a particularly important function of visualisation in education. The measures used during evaluation are enumerated and discussed. The majority of the experiments reported compared three conditions: one using a virtual environment visualisation with a significant spatial element, one using a serial on-screen presentation in PowerPoint, and one using serial presentation on paper. Some aspects were trialled with groups having contrasting prior experience of computers, in the UK and Ukraine. Evidence suggests that a more complex environment including animations and sounds or music, intended to engage users and reinforce memorability, was in fact distracting. Findings are reported in relation to the age of the participants, suggesting that children at 11–14 years benefit less from, or are even disadvantaged by, VE visualisations when compared with 7–9 year olds or undergraduates. Finally, results suggest that VE visualisations offering a ‘landscape’ of information are more memorable than those based on a linear model. Keywords: timeline, chronographics
Abstract:
Our paper is concerned with the visualisation of historical events and artefacts in the context of time. It arises from a project bringing together expertise in visualisation, historiography and software engineering. The work is the result of an extended enquiry over several years which has included investigation of the prior history of such chronographics and their grounding in the temporal ontology of the Enlightenment. Timelines - visual, spatial presentations of chronology - are generally regarded as being too simple, perhaps too childish, to be worthy of academic attention, yet such chronographics should be capable of supporting sophisticated thinking about history and historiography, especially if they take full advantage of the capabilities of digital technologies. They should enable even professional academic historians to 'make sense' of history in new ways, allowing them insights they would not otherwise have achieved. In our paper we highlight key findings from the history of such representations, principally from the eighteenth and nineteenth centuries, and show how, in a project to develop new digital chronographics for collections of cultural objects and events, we have explored new implementations of the important ideas we have extracted about timewise presentation and interaction. This includes the representation of uncertainty, of relations between events, and the epistemology of time as a 'space' for history. We present developed examples, in particular a chronographic presentation of a large database of works by a single author, a composer, and discuss the extent to which our ambitions for chronographics have been realised in practice. Keywords: timeline, chronographics
Abstract:
This chapter focuses on the visualisation of historical time, illustrated by key examples from the eighteenth century when the modern timeline was invented. We are fortunate in having not only surviving examples of printed timelines from the period but also explanations written by their makers, revealing the ambitions they had for visualisation. An important divergence is evident, between those who want to use rhetorical visual metaphors to tell a graphical story, and those who prefer to let the data ‘speak for itself’, allowing patterns to emerge from the distribution of data points across a surface. Keywords: timeline, chronographics
Abstract:
The paper is concerned with the role of art and design in the history and philosophy of computing. It offers insights arising from research into a period in the 1960s and 70s, particularly in the UK, when computing became more available to artists and designers, focusing on John Lansdown (1929-1999) and Bruce Archer (1922-2005) in London. Models of computing interacted with conceptualisations of art, design and related creative activities in important ways.
Abstract:
The paper centres on a single document, the 1968 doctoral thesis of L Bruce Archer. It traces the author’s earlier publications and the sources that informed and inspired his thinking, as a way of understanding the trajectory of his ideas and the motivations for his work at the Royal College of Art from 1962. Analysis of the thesis suggests that Archer’s ambition for a rigorous ‘science of design’ inspired by algorithmic approaches was increasingly threatened with disruption by his experience of large, complex design projects. His attempts to deal with this problem are shown to involve a particular interpretation of cybernetics. The paper ends with Archer’s own retrospective view and a brief account of his dramatically changed opinions. Archer is located as both a theorist and someone intensely interested in the commercial world of industrial design.
Abstract:
Master's dissertation, Archaeology (Theory and Methods of Archaeology), Faculdade de Ciências Humanas e Sociais, Univ. do Algarve, 2011
Abstract:
This research examines the interrelation between the acceptance of information and communication technologies (ICT) and the conceptions of teaching and learning held by primary school teachers in Québec. Drawing on models of educational intervention (Lenoir, 1991) as a framework for analysing conceptions of teaching and learning, and on the technology acceptance model (Davis, 1986) as a framework for analysing ICT acceptance, the research develops a conceptual model with the aim of exploring the relationship between teachers' practices related to ICT integration and their conceptions of teaching and learning. The model also takes into account other variables, such as age, teaching experience, and initial and continuing training, identified in the research literature as associated with the integration of ICT in education. To test the model, we conducted a postal survey of a convenience sample of 137 primary school teachers in Québec. The data collected were processed according to the nature of the variables involved (analysis of variance, chi-square tests and correlation analysis). The results of our research show that the teachers studied do not accept ICT sufficiently to make "habitual and sufficiently regular use" of it (Depover and Strebelle, 1996, p. 35). Moreover, these teachers do not take full advantage of ICT, since they exploit only certain technologies (common software). Other ICT (communication environments and software for creating and managing websites or web pages) are rarely used in class by the respondents and, even more rarely, by their pupils. The results of our research highlight a significant relationship between ICT acceptance and teachers' conceptions of teaching and learning.
ICT acceptance also appears to depend on teachers' perceptions of the usefulness and ease of use of ICT. The contextual variables that most significantly influence ICT acceptance are the teacher's age, their teaching experience, the number of workstations available in the classroom, and having taken initial or continuing training on the pedagogical use of ICT and on the use of certain types of software.
Abstract:
This work describes the electrochemical methodology for the determination of the Donnan potential from diffusion-limited steady-state voltammograms of acrylamide gels. The technique is based upon the measurement of gel–sol systems that have reached Donnan equilibrium and contain Cd2+ as a probe ion. Au-amalgam microelectrodes are used to measure the Cd concentration in the gel phase relative to the solution phase, thus permitting comparison of the Cd voltammograms obtained in both phases. This approach yields two independent measures of the Donnan potential resulting from (i) the potential shift relative to the reference electrode, and (ii) the enhancement of the Cd2+ wave. Two suites of acrylamide gels containing 0.2% and 0.5% Na-acrylate were studied as a function of ionic strength by varying [NaNO3] and maintaining a constant concentration of the electroactive probe ion, [Cd2+] = 1 · 10⁻⁵ mol/L, in the equilibrating solutions. Independent model predictions of the Donnan potential as a function of ionic strength, which consider the effects of differential swelling on the charge density, the influence of a mixed electrolyte on the potential developed in the gel at the limit of low ionic strength, and the effects of incomplete dissociation of the carboxylic functional groups, were in agreement with the Donnan potentials independently measured by the twofold steady-state voltammetric approach.
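The two independent estimates of the Donnan potential mentioned in this abstract can be sketched numerically. The fragment below is an illustrative assumption rather than the paper's actual procedure: the function names and numerical inputs are ours, and it uses only the standard Boltzmann partitioning of a divalent probe ion and the proportionality of the diffusion-limited current to concentration.

```python
import math

# Physical constants (SI units).
R = 8.314      # gas constant, J/(mol K)
F = 96485.0    # Faraday constant, C/mol
T = 298.15     # temperature, K

def donnan_from_enhancement(i_gel, i_sol, z=2):
    """Donnan potential (V) from the enhancement of the probe-ion wave.

    Boltzmann partitioning gives c_gel/c_sol = exp(-z*F*psi/(R*T)),
    and the diffusion-limited current is proportional to concentration,
    so psi = -(R*T)/(z*F) * ln(i_gel/i_sol). For Cd2+, z = 2.
    """
    return -(R * T) / (z * F) * math.log(i_gel / i_sol)

def donnan_from_shift(e_half_gel, e_half_sol):
    """Donnan potential (V) from the shift of the half-wave potential
    in the gel phase relative to that measured in the solution phase."""
    return e_half_gel - e_half_sol

# A negatively charged gel accumulates the Cd2+ probe (c_gel > c_sol),
# so an enhanced wave implies a negative Donnan potential. With a
# hypothetical twofold current enhancement:
psi = donnan_from_enhancement(i_gel=2.0, i_sol=1.0)  # about -8.9 mV
```

Agreement between the two functions for the same gel–sol system is what the twofold voltammetric approach in the abstract would check.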
Abstract:
Induced pluripotent stem cells (iPSc) have great potential for applications in regenerative medicine, disease modeling and basic research. Several methods have been developed for their derivation. The original method of Takahashi and Yamanaka involved the use of retroviral vectors, which result in insertional mutagenesis, the presence of potential oncogenes in the genome, and effects of residual transgene expression on the differentiation bias of each particular iPSc line. Other methods have been developed, using different viral vectors (adenovirus and Sendai virus), transient plasmid transfection, mRNA transduction, protein transduction and small molecules. However, these methods suffer from low efficiencies, can be extremely labor intensive, or both. An additional method makes use of the piggyBac transposon, which has the advantage of inserting its payload into the host genome and being perfectly excised upon re-expression of the transposon transposase. Briefly, a polycistronic cassette expressing Oct4, Sox2, Klf4 and c-Myc, flanked by piggyBac terminal repeats, is delivered to the cells along with a plasmid transiently expressing the piggyBac transposase. Once reprogramming occurs, the cells are re-transfected with transposase and subclones free of transposon integrations are screened for. The procedure is therefore very labor intensive, requiring multiple manipulations and successive rounds of cloning and screening. The original method for reprogramming with the piggyBac transposon was created by Woltjen et al. in 2009 and describes a process with which it is possible to obtain insert-free iPSc. Insert-free iPSc enable the establishment of better iPSc-based cellular models and add a new level of safety to the use of these cells in regenerative medicine. Because it was based on several low-efficiency steps, the overall efficiency of the method is very low (<1%).
Moreover, the stochastic transfection, integration and excision, and the absence of an active means of selection, leave this method in need of extensive characterization and screening of the final clones. In this work we aimed to develop a non-integrative iPSc derivation system in which integration and excision of the transgenes can be controlled by simple media manipulations, avoiding labor-intensive and potentially mutagenic procedures. To reach our goal we developed a two-vector system that is delivered simultaneously to the original population of fibroblasts. The first vector, Remo I, carries the reprogramming cassette and GFP under the regulation of a constitutive promoter (CAG). The second vector, Eneas, carries the piggyBac transposase associated with an estrogen receptor fragment (ERT2), regulated in a TET-OFF fashion, together with its equivalent reverse trans-activator associated with a positive-negative selection cassette under a constitutive promoter. We tested its functionality in HEK 293T cells. The protocol is divided into the following steps: 1) obtaining an acceptable transfection efficiency in human fibroblasts; 2) testing the functionality of the construct; 3) determining the ideal concentration of DOX for repressing mPB-ERT2 expression; 4) determining the ideal concentration of TM for transposition into the genome; 5) determining the ideal window of the no-DOX/TM pulse for transposition into the genome; 6) repeating steps 3, 4 and 5 for transposition out of the genome; and 7) determining the ideal concentration of GCV for negative selection. We successfully demonstrated that Eneas behaved as expected in terms of DOX regulation of mPB-ERT2 expression. We also demonstrated that, by delivering the plasmid into HEK 293T cells and manipulating the levels of DOX and TM in the medium, we could obtain puromycin-resistant lines.
The number of puromycin-resistant colonies obtained was significantly higher when DOX was absent, suggesting that the colonies resulted from transposition events. The presence of TM added an extra layer of regulation, albeit a weaker one. Our PCR analysis, while not as clean as would be desired, suggested that transposition was indeed occurring, although a background level of random integration could not be ruled out. Finally, our attempt to determine whether we could use GCV to select clones that had successfully mobilized PB out of the genome was unsuccessful. Unexpectedly, HEK 293T cells that had been transfected with Eneas and selected for puromycin resistance were insensitive to GCV.