914 results for software as teaching tool
Abstract:
Standardization facilitates communication and allows information to be exchanged with any national or international institution. This goal is achieved through communication formats for automated information exchange such as CEPAL, MARC, and FCC. The Escuela de Bibliotecología, Documentación e Información of the Universidad Nacional uses the MICROISIS software on a network for teaching. The databases designed there use the MARC format, and the RCAA2 rules for bibliographic description. The experience with the "I&D" database on rural development is presented, including its Field Definition Table, worksheet, display format, and Field Selection Table.
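A MARC bibliographic record of the kind stored in such databases can be sketched as a mapping from numeric field tags to subfields. The tags below (100: main entry, personal name; 245: title statement; 650: subject heading) are standard MARC fields, but the record content and the `display_format` helper are invented for illustration:

```python
# Minimal sketch of a MARC-style bibliographic record, as used by the
# databases described above. The field tags are standard MARC 21;
# the record content is invented.
record = {
    "100": {"a": "Pérez, Juan"},                                  # author
    "245": {"a": "Desarrollo rural :", "b": "estudios de caso"},  # title
    "650": {"a": "Rural development"},                            # subject
}

def display_format(rec):
    """Render a record roughly as a display format would (hypothetical helper)."""
    author = rec.get("100", {}).get("a", "")
    title = " ".join(rec.get("245", {}).values())
    return f"{author}. {title}"

print(display_format(record))
```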
Abstract:
Dissertation (master's)—Universidade de Brasília, Instituto de Física, Programa de Pós-Graduação de Mestrado Profissional em Ensino de Física, Mestrado Nacional Profissional em Ensino de Física, 2015.
Abstract:
In this thesis, we explore approaches to faculty instructional change in astronomy and physics. We primarily focus on professional development (PD) workshops, which are a central mechanism used within our community to help faculty improve their teaching. Although workshops serve a critical role in promoting more equitable instruction, we rarely assess them through careful consideration of how they engage faculty. To encourage a shift towards more reflective, research-informed PD, we developed the Real-Time Professional Development Observation Tool (R-PDOT) to document the form and focus of faculty's engagement during workshops. We then analyze video recordings of faculty's interactions during the Physics and Astronomy New Faculty Workshop, focusing on instances where faculty might engage in pedagogical sense-making. Finally, we consider insights gained from our own local, team-based effort to improve a course sequence for astronomy majors. We conclude with recommendations for PD leaders and researchers.
Abstract:
Purpose: Nurses and nursing students are often first responders to in-hospital cardiac arrest events; thus they are expected to perform Basic Life Support (BLS) and use an automated external defibrillator (AED) without delay. The aim of this study was to explore the relationship between nursing students' self-efficacy and performance before and after receiving a particular training intervention in BLS/AED. Materials and methods: Explanatory correlational study. 177 nursing students received a 4-h training session in BLS/AED after being randomized to either a self-directed (SDG) or an instructor-directed teaching group (IDG). A validated self-efficacy scale, the Cardiff Test and Laerdal SkillReporter® software were used to assess students' self-efficacy and performance in BLS/AED at pre-test, post-test and 3-month retention-test. Independent t-test analyses were performed to compare the differences between groups at pre-test. The Pearson coefficient (r) was used to calculate the strength of the relationship between self-efficacy and performance in both groups at pre-test, post-test and retention-test. Results: Independent t-test analyses showed no significant differences (p > 0.05) between groups for any of the variables measured. At pre-test, the correlation between self-efficacy and performance was moderate for the IDG (r = 0.53; p < 0.05) and the SDG (r = 0.49; p < 0.05). At post-test, the correlation between self-efficacy and performance was much higher for the SDG (r = 0.81; p < 0.05) than for the IDG (r = 0.32; p < 0.05), which was in fact weaker than at pre-test. Finally, whereas the correlation between self-efficacy and performance increased from post-test to retention-test, almost reaching baseline levels, for the IDG (r = 0.52; p < 0.05), it slightly decreased in this phase for the SDG (r = 0.77; p < 0.05).
Conclusion: Student-directed strategies may be more effective than instructor-directed strategies at promoting self-assessment and, therefore, may help to improve and maintain the relationship between nursing student self-efficacy and actual ability to perform BLS/AED.
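The strength-of-relationship statistic used throughout the study is Pearson's r; a minimal sketch of its computation on paired self-efficacy and performance scores follows (the sample data are invented for illustration, not taken from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: self-efficacy scores vs. skill-test performance
self_efficacy = [55, 60, 70, 80, 90]
performance   = [40, 52, 58, 71, 79]
r = pearson_r(self_efficacy, performance)
print(round(r, 2))
```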
Abstract:
In hydrocarbon exploration, the central enigma is the location of the deposits. Great efforts are made to identify and locate them and, at the same time, to improve the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram represents the Earth's interior and its structures through a conveniently arranged set of data obtained by seismic reflection. A major problem in this representation is the intensity and variety of noise present in the seismogram, such as surface-borne noise that contaminates the relevant signals and may mask the desired information carried by waves scattered in deeper regions of the geological layers. A tool was developed to suppress these noises, based on the 1D and 2D wavelet transforms. The program, written in Java, separates seismic images according to direction (horizontal, vertical, mixed, or local) and wavelength bands, using Daubechies wavelets, auto-resolution, and tensor products of wavelet bases. In addition, an option was developed to decompose a single image using the tensor product of two one-dimensional wavelets, or of a one-dimensional wavelet with the identity. In the latter case, the wavelet decomposition of a two-dimensional signal is performed in a single direction. This decomposition makes it possible to stretch the two-dimensional wavelets along a given direction, correcting scale effects by applying auto-resolutions. In other words, the treatment of a seismic image is improved by using 1D and 2D wavelets at different stages of auto-resolution. Improvements were also implemented in the display of the images associated with the decompositions at each auto-resolution, facilitating the selection of images containing the signals of interest for noise-free image reconstruction. The program was tested with real data and the results were good.
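The separation into wavelength bands described above can be illustrated with the simplest Daubechies wavelet, the Haar wavelet. This sketch (not the thesis's Java implementation) performs one level of 1D decomposition into a low-frequency approximation band and a high-frequency detail band; the seismic tool applies the same idea with longer Daubechies filters and in 2D via tensor products:

```python
import math

def haar_dwt(signal):
    """One level of the 1D Haar (simplest Daubechies) wavelet transform:
    splits a signal of even length into approximation (low-pass)
    and detail (high-pass) coefficients."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfect reconstruction of the original signal."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

trace = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]  # invented seismic trace
approx, detail = haar_dwt(trace)
# Zeroing the detail band suppresses the highest-frequency content,
# a crude analogue of removing short-wavelength noise:
denoised = haar_idwt(approx, [0.0] * len(detail))
print(denoised)
```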
Abstract:
Gas stations are among the potentially polluting economic activities that compromise groundwater quality. The city of Natal has about 120 gas stations, only a fraction of which have an environmental license for operation. The non-compliant stations were notified by the Public Prosecutor's Office (Ministério Público) of Rio Grande do Norte to carry out environmental adaptations, among which is the investigation of environmental liabilities. The preliminary and confirmatory stages of this investigation consisted of soil gas surveys together with two confirmatory chemical analyses of BTEX, PAH and TPH. For a good evaluation and interpretation of the results obtained in the field, a three-dimensional representation of them became necessary. CAD software was used to model the equipment installed at a retail fuel service station in Natal, as well as the plumes of contamination by volatile organic compounds. With this tool it was concluded that the contamination does not originate in the current underground fuel storage system, but reflects the site's past, when non-tight gasoline and diesel tanks were removed.
Abstract:
Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required before it can be used properly. To help advance Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments as well as to process the raw speckle images into contrast images while they are being acquired. The design of the system was successful, and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI and offers a full introduction to the history, theory, and applications of LSCI.
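The core computation such a GUI performs, converting raw speckle frames into contrast images, is the local ratio K = σ/μ over a sliding window. A minimal pure-Python sketch (not the thesis's Matlab code; the window size and test frame are invented):

```python
import math

def speckle_contrast(frame, win=3):
    """Compute the speckle contrast image K = sigma/mu over a sliding
    win x win window (valid region only, no border padding)."""
    h, w = len(frame), len(frame[0])
    r = win // 2
    out = []
    for i in range(r, h - r):
        row = []
        for j in range(r, w - r):
            vals = [frame[a][b]
                    for a in range(i - r, i + r + 1)
                    for b in range(j - r, j + r + 1)]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals)
            row.append(math.sqrt(var) / mu if mu else 0.0)
        out.append(row)
    return out

# A perfectly uniform frame has zero contrast everywhere;
# blurred speckle (e.g. from blood flow) lowers K locally.
flat = [[100.0] * 5 for _ in range(5)]
print(speckle_contrast(flat))
```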
Abstract:
What qualities, skills, and knowledge produce quality teachers? Many stakeholders in education argue that teacher quality should be measured by student achievement. This qualitative study shows that good teachers are multi-dimensional; their effectiveness cannot be represented by students' test scores alone. The purpose of this phenomenological study was to gain a deeper understanding of quality in teaching by examining the lived experiences of 10 winners or finalists of the Teacher of the Year (ToY) Award. Phenomenology describes individuals' daily experiences of phenomena, examines how these experiences are structured, and focuses analysis on the perspectives of the persons having the experience (Moustakas, 1994). This inquiry asked two questions: (a) How is teaching experienced by teachers recognized as outstanding Teachers of the Year? and (b) How do ToYs' feelings and perceptions about being good teachers provide insight, if any, about concepts such as pedagogical tact, teacher selfhood, and professional dispositions? Ten participants formed the purposive sample; the major data collection tool was semi-structured interviews (Patton, 1990; Seidman, 2006). Sixty- to 90-minute interviews were conducted with each participant. Data also included the participants' ToY application essays. Data analysis followed a three-phase process: description, reduction, interpretation. Findings revealed that the ToYs are dedicated, hard-working individuals. They exhibit behaviors such as working beyond the school day, engaging in lifelong learning, and assisting colleagues to improve their practice. Working as teachers is their life's compass, guiding and wrapping them into meaningful and purposeful lives. Pedagogical tact, teacher selfhood, and professional dispositions were shown to be relevant, offering important insights into good teaching.
Results indicate that for these ToYs, good teaching is experienced by getting through to students using effective and moral means; they are emotionally open, have a sense of the sacred, and operate from a sense of intentionality. The essence of the ToYs' teaching experience was their being properly engaged in their craft, embodying logical, psychological, and moral realms. Findings challenge the current process-product orthodoxy of teacher effectiveness, which makes a causal connection between effective teaching and student test scores and assumes that effective teaching arises solely from and because of the actions of the teacher.
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial-order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
Abstract:
The Virtual Map Library (Mapoteca Virtual), www.mapoteca.geo.una.ac.cr, is a website built on the Joomla platform, supported by the School of Geographic Sciences of the Universidad Nacional, Costa Rica, in collaboration with UNA VIRTUAL. The site supports teaching by allowing instructors to upload and disseminate digital cartography for students, and it helps researchers locate the online cartography needed for their work in different areas of knowledge. In addition, it offers a space for current documents on the practice of cartography and related sciences, while promoting collaboration and free access to digital cartography. Key words: Cartography, Mapoteca Virtual, Virtual Map Library, Joomla, digital maps, online teaching tools, School of Geographic Sciences, National University, Costa Rica.
Abstract:
The research fills the historiographic lacuna concerning Leonardo Ricci's work in the United States, focusing on the span 1952-1972 as a fundamental period for the architect's research, which moved from the project for the community space to macrostructures. The period considered runs from Ricci's first travel to the United States to his resignation from the University of Florida, one year before his resignation from the deanship of the faculty of architecture of Florence (1973). The research philologically retraces the stages of Ricci's activity in the U.S.A., unveiling the premises and results of his American transfer and the extent to which it marked a turning point for his work as educator and designer and for the wider historiographic context of the Sixties. The American transfer helped him ground his belief in avoiding a priori morphological results in favor of what he called the "form-act" design method. Ricci's research in the U.S.A. is described in his books Anonymous (XX century) and City of the Earth (unpublished). In them and in Ricci's projects one common thread is traceable: the application of the "form-act" as the best tool for conceiving urban design, a discipline established in the United States during Ricci's first stay at M.I.T., in which he found the balance point between architecture and urban planning, between the architect's sign and his being anonymous, between the collective and the individual dimension. Together with the notions of "anonymous architecture" and "form-act", Urban Design and "open work" are the key words for understanding Ricci's work in the United States and in Italy. Urban design's main goal, to design the city as a collective work of art, was the solution to the dichotomous research that enlivened Ricci's work and one possible answer to that tension, useful for him to seek the truth of architecture.
Abstract:
Many oncological studies start from the analysis of histological specimens, i.e., tissue samples taken from the patient. Thanks to specific markers, selective stains applied to the section to be analyzed, specific parts of the sample are studied. Often, several markers are used to gather more information from the sample. However, these cannot always be applied in parallel, and they are often used serially, after washing the sample. The images thus obtained must then be aligned in order to carry out colocalization studies, simulating a parallel acquisition of the various signals. However, there is no standard procedure for aligning such images. Manual alignment is time-consuming and error-prone; software can make the process faster and more reliable. In particular, DS4H Image Alignment is an open-source plug-in implemented for ImageJ/Fiji to align multimodal grayscale images. A first version of the software was used to align a series of images manually, asking the user to define reference points common to all the images. A later version added the possibility of performing automatic alignment. However, this was not optimized and caused a loss of information in the areas not overlapping the image defined as the reference. In this work, an optimized automatic image-registration module was developed that assumes no reference image and preserves all pixels of the original images by creating a stack large enough to contain them all. Moreover, the architecture of the whole software was extended to register color images as well.
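A toy version of the registration step (not the DS4H implementation) can be sketched as an exhaustive search for the integer translation that minimizes the sum of squared differences between two grayscale images; the image data and search range below are invented:

```python
def score(a, b, dy, dx):
    """Sum of squared differences between image b and image a translated
    by (dy, dx), computed over the overlapping region only."""
    h, w = len(a), len(a[0])
    total, n = 0.0, 0
    for i in range(h):
        for j in range(w):
            ii, jj = i - dy, j - dx
            if 0 <= ii < h and 0 <= jj < w:
                total += (b[i][j] - a[ii][jj]) ** 2
                n += 1
    return total / n

def register(a, b, max_shift=2):
    """Exhaustively search integer translations; return the (dy, dx)
    that best maps image a onto image b."""
    shifts = [(dy, dx) for dy in range(-max_shift, max_shift + 1)
                       for dx in range(-max_shift, max_shift + 1)]
    return min(shifts, key=lambda s: score(a, b, *s))

# Invented 5x5 ramp image and a copy shifted down 1 and right 2:
img = [[5 * i + j for j in range(5)] for i in range(5)]
moved = [[img[i - 1][j - 2] if i >= 1 and j >= 2 else 0 for j in range(5)]
         for i in range(5)]
print(register(img, moved))   # detected translation: down 1, right 2
```

Preserving every pixel, as the optimized module does, then only requires enlarging the output canvas by the detected shift (h + |dy| by w + |dx|) before compositing the images into the stack.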
Abstract:
The IoT is growing every year and is becoming so ubiquitous that it includes heterogeneous devices with different hardware and software constraints, leading to a highly fragmented ecosystem. Devices use different protocols with different paradigms and are not compatible with each other; some devices use request-response protocols like HTTP or CoAP, while others use publish-subscribe protocols like MQTT. Integration in IoT is still an open research topic. When handling and testing IoT sensors, there are some common tasks people may be interested in: reading and visualizing the current value of the sensor; aggregating a set of values in order to compute statistical features; saving the history of the data to a time-series database; forecasting future values in order to react in advance to a future condition; and bridging the protocol of the sensor in order to integrate the device with other tools. In this work we show a working implementation of a low-code, flow-based tool prototype that supports the common operations mentioned above, based on Node-RED and Python. Since this system is just a prototype, it has some issues and limitations that are discussed in this work.
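One of the common tasks listed above, aggregating sensor values into statistical features, can be sketched outside Node-RED in a few lines of Python; the class name, window size, and readings below are invented for illustration:

```python
from collections import deque

class SensorAggregator:
    """Keep the last `size` readings and expose simple statistical
    features, like an aggregation node in a flow-based IoT tool."""
    def __init__(self, size=5):
        self.window = deque(maxlen=size)   # old readings fall off automatically

    def push(self, value):
        self.window.append(value)

    def features(self):
        vals = list(self.window)
        return {
            "count": len(vals),
            "min": min(vals),
            "max": max(vals),
            "mean": sum(vals) / len(vals),
        }

agg = SensorAggregator(size=3)
for reading in [21.0, 21.5, 22.0, 30.0]:   # e.g. temperature samples
    agg.push(reading)
print(agg.features())                      # only the last 3 readings are kept
```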
Abstract:
The models of teaching social sciences and clinical practice are insufficient for the needs of practical-reflective teaching of the social sciences applied to health. The scope of this article is to reflect on the challenges and perspectives of social science education for health professionals. The important movement bringing together the social sciences and the field of health began in the 1950s; however, weak credentials still prevail. This is due to the low professional status of social scientists in health and the ill-defined position of social science professionals in the health field. It is also due to the scant importance attributed by students to the social sciences, the small number of professionals, and the colonization of the social sciences by the biomedical culture in the health field. Thus, the professionals of the social sciences applied to health still face the need to build an identity, even after six decades of presence in the field of health. This is because their ambivalent status has established them as a partial, incomplete and virtual presence, requiring a complex survival strategy in the nebulous area between the social sciences and health.
Abstract:
The Centers for High-Cost Medication (Centros de Medicação de Alto Custo, CEDMAC) of the São Paulo Health Department were instituted through a project in partnership with the Clinical Hospital of the Faculty of Medicine, USP, sponsored by the Foundation for Research Support of the State of São Paulo (Fundação de Amparo à Pesquisa do Estado de São Paulo, FAPESP), aimed at forming a statewide network for the comprehensive care of patients referred for the use of immunobiological agents in rheumatological diseases. The CEDMAC of the Hospital de Clínicas, Universidade Estadual de Campinas (HC-Unicamp), implemented by the Division of Rheumatology, Faculty of Medical Sciences, identified the need to standardize the conduct of the multidisciplinary team, given the specificity of its care procedures, and verified the importance of describing its operational and technical processes in manual format. The aim of this study is to present the methodology applied to the elaboration of the CEDMAC/HC-Unicamp Manual as an institutional tool, with the aim of offering the best assistance and administrative quality. In the methodology for preparing manuals at HC-Unicamp since 2008, the premise has been to obtain a document that is participatory and multidisciplinary, focused on work processes integrated with institutional rules, with objective and didactic descriptions, in a standardized format, and with electronic dissemination. The CEDMAC/HC-Unicamp Manual was elaborated in 10 months, with the involvement of the entire multidisciplinary team, and comprises 19 chapters on work processes and techniques, in addition to those concerning the organizational structure and its annexes. Published in the electronic portal of HC Manuals in July 2012 as an e-Book (ISBN 978-85-63274-17-5), the manual has been a valuable instrument in guiding professionals in healthcare, teaching and research activities.