833 results for Learning from one Example
Abstract:
Master's thesis—Universidade de Brasília, Instituto de Psicologia, Departamento de Processos Psicológicos Básicos, Programa de Pós-Graduação em Ciências do Comportamento, 2015.
Abstract:
Master's thesis—Universidade de Brasília, Instituto de Letras, Departamento de Línguas Estrangeiras e Tradução, Programa de Pós-Graduação em Estudos da Tradução, 2016.
Abstract:
Seclusion, with or without restraint (IC, from the French isolement avec ou sans contention), in psychiatric settings affects nearly one in four patients in Quebec (Dumais, Larue, Drapeau, Ménard, & Giguère-Allard, 2011). Yet it is widely documented that this practice harms patients, nurses, and the organization (Stewart, Van der Merwe, Bowers, Simpson, & Jones, 2010). Because it poses an ethical problem, this measure is the subject of policies aimed at restricting or even eliminating it. Studies of patients' experience of seclusion, as well as of nurses' perceptions of it, identify the need for a debriefing of the event. Several research teams have proposed a post-seclusion review (REPI, from retour post-isolement), involving both the treating team, particularly the nurses, and the patient, as an intervention to reduce the incidence of IC. The REPI aims at emotional exchange, analysis of the steps that led to the decision to use IC, and planning of future interventions. The goal of this study was to develop, implement, and evaluate the REPI with the staff and patients of an acute psychiatric care unit in order to improve their care experience. The research questions were: 1) What is the implementation context of the REPI? 2) What are the facilitators of and obstacles to the implementation of the REPI, according to patients and staff? 3) How do patients and staff perceive the modalities and outcomes of the REPI? and 4) Is the implementation of the REPI associated with a decrease in the prevalence and duration of IC episodes? This instrumental case study (Stake, 1995, 2008) was grounded in a participatory approach. The case was the acute psychiatric care unit for first-episode psychosis where the REPI was implemented. The development of the REPI was first informed by documenting the context through immersion in the setting (n = 56 hours) and individual interviews with a convenience sample (n = 3 patients, n = 14 staff members). A committee of experts (the student researcher, six nurses from the unit, and one patient partner) then developed the REPI, which has two components: one with the patient and one with the team. Outcomes were evaluated through individual interviews (n = 3 patients, n = 12 staff members) and by examining the prevalence and duration of IC episodes during the six months before and after the implementation of the REPI. Qualitative data were examined through thematic analysis (Miles, Huberman, & Saldana, 2014), while quantitative data were analyzed with descriptive and non-parametric tests. The results suggest that the implementation context is defined by implicit and explicit norms in which the use of IC can generate a vicious circle of aggressive behaviors fed by a deep sense of injustice on the part of patients, who feel they must conform to staff expectations and unit rules. Participants expressed the need to create opportunities for the kind of authentic communication that could take place during the REPI, although its practice varied from one staff member to another. The results suggest that the main facilitator of the implementation of the REPI was the study's participatory approach, while the obstacles encountered stemmed mostly from the complexity of carrying out the team component of the REPI.
During the REPI with the patient, nurses were able to explore the patient's feelings and point of view, which helped rebuild the therapeutic relationship. The REPI with the care team was perceived as a learning opportunity and made it possible to adjust patients' intervention plans. Following the implementation of the REPI, the results also showed a significant reduction in the use of seclusion and in the time spent in seclusion. The findings of this thesis highlight the possibility of moving past the initial discomfort felt by both patient and nurse by making the REPI systematic. Moreover, this study underscores the need for an authentic presence in order to achieve meaningful sharing in the therapeutic relationship, the cornerstone of mental health nursing practice. This study contributes to knowledge on the prevention of aggressive behavior in psychiatric settings by documenting the context in which IC takes place, by proposing a two-component REPI, and by exploring its outcomes. Our results support the potential of developing a form of tertiary prevention that integrates the perspectives of both patients and staff.
Abstract:
Master's thesis—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, Programa de Pós-Graduação em Informática, 2016.
Abstract:
Master's thesis submitted to the Instituto Superior de Psicologia Aplicada for the degree of Master in the specialty of Educational Psychology.
Abstract:
This article explores different types of appropriation of media technologies at the margins and proposes a shift in the research approach at several levels: 1) rather than focusing on individual technologies, research on media at the margins should examine how local communicators navigate media ecologies that offer specific resources and challenges in each historical situation; 2) rather than trying to determine whether the media technologies used at the margins are new or obsolete, digital or not, it is urgent to understand how locally rooted communicators detect specific information and communication needs and use the available technologies to address those needs; 3) research on media at the margins should clarify how the protagonists of this kind of citizen and community communication reinvent, hybridize, recycle, and build bridges between technological platforms. In short, to understand media technologies at the margins, research must embrace high levels of complexity, retain the notion of media ecologies, and understand how, at the local level, community communicators deeply immersed in everyday life and history adjust media technologies to the needs of their communities.
Abstract:
Polylysogeny is frequently considered to be the result of an adaptive evolutionary process in which prophages confer fitness and/or virulence factors, making them important for the evolution of both bacterial populations and infectious diseases. The Enterococcus faecalis V583 isolate belongs to the high-risk clonal complex 2, which is particularly well adapted to the hospital environment. Its genome carries 7 prophage-like elements (V583-pp1 to -pp7), one of which is ubiquitous in the species. In this study, we investigated the activity of the V583 prophages and their contribution to E. faecalis biological traits. We systematically analyzed the ability of each prophage to excise from the bacterial chromosome, to replicate, and to package its DNA. We also created a set of E. faecalis isogenic strains lacking from one to all six non-ubiquitous prophages by mimicking natural excision. Our work reveals that prophages of E. faecalis V583 excise from the bacterial chromosome in the presence of a fluoroquinolone and are able to produce active phage progeny. Intricate interactions between V583 prophages were also unveiled: i) pp7, coined EfCIV583 for E. faecalis chromosomal island of V583, hijacks capsids from helper phage 1, leading to the formation of distinct virions, and ii) pp1, pp3 and pp5 inhibit excision of pp4 and pp6. The hijacking exerted by EfCIV583 on helper phage 1 capsids is the first example of molecular piracy in Gram-positive bacteria other than staphylococci. Furthermore, prophages encoding platelet-binding-like proteins were found to be involved in adhesion to human platelets, considered a first step towards the development of infective endocarditis. Our findings not only reveal a role of E. faecalis V583 prophages in pathogenicity but also provide an explanation for the correlation between antibiotic usage and E. faecalis success as a nosocomial pathogen, as fluoroquinolones may provoke the release of prophages and promote gene dissemination among isolates.
Abstract:
What lies behind the mistakes and difficulties that students face when trying to understand and study mathematics? Are they related only to the cognitive complexity of the content, or also to the possible ways of accessing the different mathematical objects? Mathematical activity generates learning difficulties in many students that do not appear in cognitive processes related to other areas of knowledge. If anything characterizes the teaching and learning of mathematics, it is that, unlike the objects of study of the experimental sciences, the only way to access mathematical objects is through their different semiotic representations. Coordinating the different systems of representation that refer to the same mathematical concept requires moving from one register to another (for instance, from the algebraic expression of a function to its graph) (D’Amore, 1998, 2001, 2003, 2004, 2006; Duval, 1993, 1994, 1995, 1996, 2000, 2003, 2004, 2005, 2007, 2008, 2011, 2012; Godino, 2002, 2003, 2012, 2014; Kaput, 1989a, 1989b, 1992, 1998; Radford, 1998, 2004a, 2004b, 2004c, 2006a, 2008, 2009, 2011, 2013, 2014a). Therefore, the treatments that can be carried out within a given register, and the conversion of one register into another, play an essential role in grasping mathematical objects and concepts. Through this work with representations, students give meaning to the objects of study and become able to understand the underlying mathematical structures, which is the main educational interest of this issue...
Abstract:
Security defects are common in large software systems because of their size and complexity. Even with efficient development processes, testing, and maintenance policies, a large number of vulnerabilities can remain in a software system. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective at identifying the types and locations of vulnerabilities at an earlier stage and at improving the security of the software in subsequent versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible versus hard to reproduce) to develop a classification framework for differentiating between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories and classical software metrics, as well as other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in subsequent versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed on collected software security defects of Mozilla Firefox.
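As a rough illustration of the kind of classifier this abstract describes, the sketch below trains one of the four listed learners (Random Forest) on TF-IDF features extracted from textual vulnerability reports. The reports, labels, and feature choice are illustrative assumptions, not the thesis's actual pipeline or the Mozilla Firefox dataset; in the full framework, features derived from code fixes, software metrics, and runtime environmental factors would sit alongside the text features.

```python
# Minimal sketch of a reproducibility classifier: TF-IDF features from
# textual vulnerability reports feed a Random Forest. All reports and
# labels below are toy placeholders.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

reports = [
    "crash reproduced on every load of the malformed PDF",
    "intermittent use-after-free, only under heavy GC pressure",
    "deterministic buffer overflow in the URL parser",
    "race condition, fails roughly one run in fifty",
]
labels = [1, 0, 1, 0]  # 1 = easily reproducible, 0 = hard to reproduce

X = TfidfVectorizer().fit_transform(reports)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.5, stratify=labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```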
Abstract:
Theoretical models of social learning predict that individuals can benefit from using strategies that specify when and whom to copy. Here the interaction of two social learning strategies, model age-based biased copying and copy-when-uncertain, was investigated. Uncertainty was created via a systematic manipulation of demonstration efficacy (completeness) and efficiency (causal relevance of some actions). The participants, 4- to 6-year-old children (N = 140), viewed both an adult model and a child model, each of whom used a different tool on a novel task. They did so in a complete condition, a near-complete condition, a partial-demonstration condition, or a no-demonstration condition. Half of the demonstrations in each condition incorporated causally irrelevant actions by the models. Social transmission was assessed by first responses but also through children's continued fidelity, the hallmark of social traditions. Results revealed a bias to copy the child model both on first response and in continued interactions. Demonstration efficacy and efficiency did not affect choice of model at first response but did influence solution exploration across trials, with demonstrations containing causally irrelevant actions decreasing exploration of alternative methods. These results imply that uncertain environments can result in canalized social learning from specific classes of model.
Abstract:
This study examines the role of visual literacy in learning biology. Biology teachers promote the use of digital images as a learning tool for two reasons: because biology is the most visual of the sciences and the use of imagery is becoming increasingly important with the advent of bioinformatics, and because studies indicate that the current generation of teenagers has a cognitive structure formed through exposure to digital media. On the other hand, there is concern that students are not being sufficiently exposed to the traditional methods of processing biological information, which are thought to encourage left-brain sequential thinking patterns. Theories of embodied cognition point to the importance of hand-drawing for proper assimilation of knowledge, and theories of multiple intelligences suggest that some students may learn more easily using traditional pedagogical tools. To test the claim that digital learning tools enhance the acquisition of visual literacy in this generation of biology students, a learning intervention was carried out with 33 students enrolled in an introductory college biology course. The study compared learning outcomes for two types of learning tools: a traditional drawing activity and an interactive digital activity carried out on a computer. The sample was divided into two random groups, and a crossover design was implemented with two separate interventions. In the first intervention, students learned how to draw and label a cell; Group 1 learned the material by computer and Group 2 learned the material by hand-drawing. In the second intervention, students learned how to draw the phases of mitosis, and the two groups were inverted. After each learning activity, students were given a quiz on the material they had learned. Students were also asked to self-evaluate their performance on each quiz, in an attempt to measure their level of metacognition. At the end of the study, they were asked to fill out a questionnaire measuring the level of task engagement they felt towards the two types of learning activities. Following the first testing phase, the students who learned the material by drawing had a significantly higher average grade on the associated quiz than those who learned the material by computer. The difference was lost in the second, cross-over trial. For neither group was there a correlation between the grade students thought they had earned through self-evaluation and the grade they actually received. In terms of the different measures of task engagement, there were no significant differences between the two groups. The study also showed a positive correlation between grade and self-reported time spent playing video games, and a negative correlation between grade and self-reported interest in drawing. This study provides little evidence to support claims that the use of digital tools enhances learning, but it does provide evidence that drawing by hand is beneficial for learning biological images. However, the small sample size, the limited number and type of learning tasks, and the indirect means of measuring metacognition and task engagement restrict the generalisation of these conclusions. Nevertheless, this study indicates that teachers should not use digital learning tools to the exclusion of traditional drawing activities; further studies on the effectiveness of these tools are warranted.
Students in this study commented that the computer tool seemed more accurate and detailed, even though the two learning tools carried identical information. There was thus a mismatch between the perceived usefulness of computers as a learning tool and the reality, which again points to the need for objective assessment. Students should be given the opportunity to try out a variety of traditional and digital learning tools in order to address their different learning preferences.
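As a worked illustration of the comparisons reported above, the sketch below runs a two-sample test on first-intervention quiz grades and a grade-versus-gaming-hours correlation. All numbers are synthetic placeholders, and the choice of a t-test and Pearson correlation is an assumption; the study does not state which tests were used.

```python
# Illustrative re-analysis of the two reported results: a between-group
# comparison of first-intervention quiz grades, and a grade vs.
# self-reported gaming-hours correlation. All numbers are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
grades_drawing = rng.normal(78, 8, size=16)   # Group 2: hand-drawing (toy)
grades_computer = rng.normal(70, 8, size=17)  # Group 1: computer (toy)

# First intervention: did the drawing group score significantly higher?
t, p = stats.ttest_ind(grades_drawing, grades_computer)
print(f"two-sample t-test: t = {t:.2f}, p = {p:.4f}")

# Correlation between quiz grade and self-reported video-game hours
# across all 33 students (the study reported a positive correlation).
all_grades = np.concatenate([grades_drawing, grades_computer])
gaming_hours = rng.uniform(0, 20, size=33)
r, p = stats.pearsonr(all_grades, gaming_hours)
print(f"Pearson correlation: r = {r:.2f}, p = {p:.4f}")
```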
Abstract:
The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time-delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because the same noises occur multiple times across the different raw data streams. Originally, these observables were generated manually, starting with LISA as a simple stationary array and then adjusting for the antenna's motions. However, none of the observables survived the flexing of the arms: with the same structure, they no longer led to cancellation. The principal component approach, presented by Romano and Woan, is another way of handling these noises; it simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that an eigendecomposition of this matrix produces two distinct sets of eigenvalues, distinguished by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors likewise produces data free from the laser frequency noises. This result led to the idea that the principal components may actually be time-delay interferometry observables, since they produce the same outcome: data free from laser frequency noise. The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to analysis using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. To test the connection between the principal components and the TDI observables, a 10×10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. Results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, so analysis using principal components should give the same results as analysis using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables.
This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, whose computation requires three properties of LISA: phase-locking, arm lengths, and noise variances. Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which appear in the covariance matrix; in our toy-model investigations this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computation methods that take advantage of this structure. Separating the two sets of data for the analysis was not necessary, because the laser frequency noises are very large compared to the photodetector noises, which results in a significant reduction of the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing them to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and the non-stationarity do not show up, because of the summation in the Fourier transform.
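A toy numerical sketch of the eigendecomposition argument described above: readings that all share a single large laser-frequency noise yield a covariance matrix whose spectrum splits into one noise-dominated eigenvalue and a set of small, laser-noise-free ones. The array size, variances, and sample count are arbitrary choices, not LISA parameters.

```python
# Toy version of the Romano-Woan construction: four readings share one
# large common laser-frequency noise plus small independent secondary
# noises. Eigendecomposing the covariance matrix separates a single
# laser-dominated eigenvalue from laser-noise-free ones.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_readings = 100_000, 4

laser = rng.normal(0.0, 100.0, size=n_samples)                  # common laser noise
secondary = rng.normal(0.0, 1.0, size=(n_samples, n_readings))  # independent noise
data = laser[:, None] + secondary     # every reading contains the laser term

cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
print(eigvals)  # three values near 1, one near n_readings * 100**2

# Projecting onto the small-eigenvalue eigenvectors cancels the common
# laser term, analogous to forming laser-noise-free TDI combinations.
clean = data @ eigvecs[:, :-1]
print(clean.std(axis=0))  # ~1: only the secondary noises remain
```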
Abstract:
This research evaluates the achievements and challenges of the ASEAN Community project in each of its three areas of action (the Economic Community, the Political-Security Community, and the Socio-Cultural Community) as applied in Thailand. It analyzes the impact the project has had on Thailand's human development during the period 2004-2014. Through an analysis of the current status in light of the concept of instrumental freedoms, the outcomes of the projects are evaluated, along with whether or not they are conducive to the human development of Thai society.
Abstract:
This paper aims to describe the health implications of the use of biosimilar drugs compared with biological drugs in Colombia, as well as the regulatory context of biosimilar use and the recommendations and guidelines on the safety and effectiveness of biosimilar and biological drugs, starting from their biomolecular differences. To this end, an electronic and manual documentary review of the literature was carried out in databases, journals, and books, limited to MeSH terms. The selection of articles included full documents published in indexed journals over the last 10 years, in Spanish and English; the information collected was organized for the construction of this paper. In conclusion, the patents of many biological drugs have expired or are close to expiring, and several biosimilars are being developed and marketed, even in countries without strict regulations. Because of their molecular complexity, biosimilars can never be identical to the original; we must therefore integrate them into pharmacovigilance systems, improving traceability and identifying their origin while distinguishable common denominations are established. Current evidence suggests that the regulation of biosimilar drugs should be evaluated and harmonized worldwide.
Abstract:
The 1990s were a period of great economic, social, and political transformation in Colombia. One of the most significant transformations in education was the introduction in schools of a series of practices intended to guarantee the democratic formation of citizens, with the aim of consolidating in society a set of habits, values, and practices consistent with the new social order established and protected by the Political Constitution of 1991. Despite legislative and state efforts to promote democracy in the school environment, the practices carried out in educational institutions have not effectively met the goal of consolidating a democratic environment in schools and in society at large. This paper therefore analyzes how one particular educational institution takes up and implements the legally mandated democratic proposals, the tensions raised by the introduction of the democratic proposal in the school, and the conceptualizations of democracy that these practices help to shape, which in turn give form to the everyday imaginaries, actions, and discourses of students and teachers.