896 results for Art teaching methods
Abstract:
Process systems design, operation, and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. With a scenario-based formulation, these problems lead to large-scale, well-structured MILPs/MINLPs. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. The method alternates between DWD iterations and BD iterations: DWD restricted master problems and BD primal problems yield a sequence of upper bounds, while BD relaxed master problems yield a sequence of lower bounds. A variant of CD, called multicolumn-multicut CD, which adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD at the first level and DWD at the second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy and over solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for an explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are solved systematically in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary.
A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies drawn from renewable-resource and fossil-fuel-based applications in process systems engineering show that these novel decomposition approaches offer significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
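The bound alternation that drives cross decomposition can be sketched as a generic loop: feasibility-side subproblems tighten an upper bound, relaxation-side subproblems tighten a lower bound, and finite convergence is declared when the gap closes. The subproblem "solvers" below are toy stand-ins, not the thesis's MILP formulations.

```python
# Illustrative sketch of the upper/lower bound alternation in a
# cross-decomposition-style loop. The two "solvers" are hypothetical
# stand-ins that merely produce tightening bound sequences.

def solve_dwd_restricted_master(k):
    # Hypothetical: objective of a feasible solution, i.e. an upper
    # bound that tightens as iterations proceed.
    return 10.0 + 1.0 / (k + 1) ** 2

def solve_bd_relaxed_master(k):
    # Hypothetical: objective of a relaxation, i.e. a lower bound
    # that tightens as iterations proceed.
    return 10.0 - 1.0 / (k + 1) ** 2

def cross_decomposition(tol=1e-3, max_iter=100):
    best_ub, best_lb = float("inf"), float("-inf")
    for k in range(max_iter):
        # Keep the best bounds seen so far; they only improve.
        best_ub = min(best_ub, solve_dwd_restricted_master(k))
        best_lb = max(best_lb, solve_bd_relaxed_master(k))
        if best_ub - best_lb <= tol:  # convergence check on the gap
            break
    return best_ub, best_lb

ub, lb = cross_decomposition()
```

The real method interleaves DWD columns and BD cuts between these bound updates; this sketch only captures the stopping logic shared by both.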
Abstract:
Literature is not generally considered a coherent branch of the curriculum in relation to language development, in either native or foreign language teaching. As teachers of English in multicultural Indian classrooms, we come across students with varying degrees of competence in English. Although language learning is a natural process for native speakers, students of other languages put colossal effort into learning English, and despite their sincere efforts they face challenges with pronunciation, spelling, and vocabulary. Indian classrooms are a microcosm of the larger society, so teaching English in a manner that equips students to face cutthroat competition has become both a necessity and a challenge for English language teachers. English today has become a key determinant of career success. Hackneyed, stereotypical teaching methods are no longer acceptable. Teachers are no longer arbitrary dispensers of knowledge; they play the role of guide and facilitator for their students. Teachers of English are using innovative ideas to make English language teaching and learning interesting and simple, and have started using literary texts and their analysis to explore and ignite students' imagination and creative skills. One needs to think and rethink the contribution of literature to intelligent thinking, as well as its role in the teaching/learning process. This article is, therefore, an attempt to explore the nature of the literary experience in present-day classrooms and the broader role of literature in life.
Abstract:
Presentation given at the 2016 British Educational Research Association (BERA) conference
Abstract:
Within this booklet, teachers will find instructional resources covering a wide array of genres, including dance, choral music, general music, instrumental music, media arts, theatre, and the visual arts. These lesson plans are explicitly designed to integrate artistic expression and comprehension with other academic disciplines, such as English, History, and Social Studies. Each submission highlights the grade level, artistic genre, sources, learning objectives, instructional plans, and modes of evaluation. This Arts Integration Supplement to the Teacher's Guide to African American Historic Places in South Carolina outlines 22 lesson plans that meet the 2010 Visual and Performing Arts Standards of South Carolina and integrate the arts into classroom instruction. Where applicable, other standards, such as those for math and social studies, are listed with each lesson plan. The teaching activities in this supplement are provided to aid in the development of lesson plans or to complement existing lessons. Teaching activities are the simplest means of integrating art into classroom instruction.
Abstract:
To determine the concurrent validity of the System for Observing Fitness Instruction Time (SOFIT) against accelerometry, as a method for measuring the physical activity (PA) levels of schoolchildren in grades 1 to 9 during physical education classes in three public schools in Bogotá, Colombia. Cross-sectional study conducted between October 2014 and March 2015, with measurements taken in three public schools in Bogotá. Forty-eight students participated (25 girls; 23 boys), aged 5 to 17, selected according to the SOFIT protocol. The outcome is categorized as the percentage of time spent in sedentary behavior, moderate PA, vigorous PA, and moderate-to-vigorous PA. Validation used accelerometry, in the same categories, as the gold standard. Mean differences, linear regression, and a fixed-effects model were computed. The correlation between SOFIT and accelerometry was good for moderate PA (rho=0.958; p=0.000), vigorous PA (rho=0.937; p=0.000), and moderate-to-vigorous PA (rho=0.962; p=0.000). Likewise, under the fixed-effects model, moderate PA (β1=0.92; p=0.00), vigorous PA (β1=0.94; p=0.00), and moderate-to-vigorous PA (β1=0.95; p=0.00) showed no significant differences between the two methods for measuring PA levels. Sedentary behavior correlated positively under Spearman (rho=0.965; p=0.000). The SOFIT system proved valid for measuring PA levels in physical education classes, given its good correlation and agreement with accelerometry. SOFIT is an easily accessible, low-cost instrument for measuring PA during physical education classes in the school context, and its use is recommended in future studies.
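The validation above rests on Spearman's rank correlation between the two measurement methods. A minimal pure-Python version of the statistic is sketched below; the data points are made-up illustration values, not the study's measurements.

```python
# Minimal Spearman rank correlation (the statistic used to compare
# SOFIT against accelerometry). Illustration data only.

def ranks(xs):
    # Assign 1-based ranks, averaging ranks over ties.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    # Pearson correlation computed on the ranks.
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

sofit = [40.0, 25.0, 12.0, 8.0]   # hypothetical % time in MVPA per lesson
accel = [38.5, 26.0, 13.5, 7.0]   # hypothetical accelerometer values
rho = spearman(sofit, accel)      # identical ordering, so rho = 1.0
```

In practice one would use a library routine (e.g. a statistics package's Spearman implementation), which also reports the p-values quoted in the abstract.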
Abstract:
This work falls within one of the major fields of organizational studies: strategy. The classical perspective in this field promoted the idea that projecting into the future implies designing a plan (a series of deliberate actions). Later advances showed that strategy could be understood in other ways. However, the evolution of the field to some extent privileged the classical view, establishing, for example, multiple models for 'formulating' a strategy while relegating to second place the way in which a strategy can 'emerge'. The purpose of this research is therefore to contribute to the current level of understanding of emergent strategies in organizations. To do so, we considered a concept opposed to, although complementary with, 'planning', and in fact very close in nature to this type of strategy: improvisation. Since this concept has been nourished by valuable contributions from the world of music, we drew on the knowledge of that domain, using 'metaphor' as a theoretical device to understand it and achieve the stated objective. The results show that 1) deliberate and emergent strategies coexist and complement each other, 2) improvisation is always present in the organizational context, 3) improvisation is more intense in the 'how' of strategy than in the 'what', and, contrary to the conventional idea on the matter, 4) a certain degree of preparation is required in order to improvise adequately.
Abstract:
This work is a literature review covering a selection of articles available in specialized databases, published between 2006 and 2016 for scientific articles and between 2000 and 2016 for books. In total, the review covered 1 doctoral thesis, 1 master's thesis, 111 articles, and 9 books or book chapters. It presents various definitions of mindfulness and ways of conceptualizing it, its mechanisms of action, its predominant psychotherapeutic approaches, the effects of its stable practice, its main fields of application, and the importance of training the teachers who deliver the practice. Finally, some conclusions are presented on the dialogue between the psychological literature on mindfulness and some conceptions of the Buddhist tradition regarding meditation.
Abstract:
Sexuality is recognized as part of holistic nursing care, but its inclusion in clinical practice and nursing training is inconsistent. Based on the question "How do students and teachers acknowledge sexuality in teaching and learning?", we developed a study to characterize the process of teaching and learning sexuality from a micro perspective of curriculum development. We used a mixed-methods design with a sequential QUAN-qual strategy of descriptive and explanatory type, in which 646 students and teachers participated. The quantitative component used questionnaire surveys; document analysis was used in the complementary component. A curricular dimension of sexuality emerges, guided by a behaviourist line and based on a biological vision. The issues considered safe are highlighted, framed within the stages of adolescence and adulthood and attached mostly to female sexuality and the procreative aspect. A hidden curriculum is also emerging, evidenced by references to content from other dimensions of sexuality that are expressed less often. Theoretical learning follows a communicational model of reality through abstraction strategies, which suggests a deductive method of learning with a behaviourist approach to assessment. Clinical teaching addresses sexuality in combination with reproductive health nursing. The factors influencing the teaching and learning of sexuality were also explored. We conclude that the vision of female sexuality taught and learned in relation to women projects onto clinical practice care based on the same principles.
Abstract:
The purpose of this paper is to present the results of two online forums carried out with the participation of 42 students of the Licenciaturas in Preschool Education, Primary Education, and Secondary Education at the University of Costa Rica. The main purpose of the forums was to determine the participant students' insights about the competencies they have achieved in the field of education research, and which tools have been essential for them to systematize their own teaching practices. The discussion forums were part of the course FD5091 Métodos de Investigación Educativa [Education Research Methods] of the School of Teacher Education, delivered in March-April 2010. Of the sample, 60 percent were students of the Preschool teaching program, 35 percent were from the Primary Education teaching program, and 5 percent were from the Secondary Education teaching program in the fields of Science, Mathematics, and Social Studies. According to the insights and beliefs shown by the participants (both future teachers and practicing professionals), there are no opportunities in their current work situation for research on, or systematization of, their own teaching mediation. (1) Translator's Note: In Costa Rica, the "Licenciatura" is a one-year post-Bachelor study program, usually including a thesis. "Primary Education" refers to students from the 1st to 6th grades, and "Secondary Education" refers to students from the 7th to 11th grades.
Abstract:
The research addressed the historiographic lacuna concerning Leonardo Ricci's work in the United States, focusing on the span 1952-1972 as a fundamental period for the architect's research, which moved from the project for community space to macrostructures. The period considered runs from Ricci's first travel to the United States to the date of his resignation from the University of Florida, one year before his resignation from the deanship of the faculty of architecture of Florence (1973). The research philologically retraced the stages of Ricci's activity in the U.S.A., unveiling the premises and results of his American transfer and the extent to which it marked a turning point for his work as educator and designer and for the wider historiographic context of the Sixties. The American transfer helped him ground his belief in avoiding a priori morphological results in favor of what he called the "form-act" design method. Ricci's research in the U.S.A. is described in his books Anonymous (XX century) and City of the Earth (unpublished). In them and in Ricci's projects one common thread is traceable: the application of the "form-act" as the best tool for conceiving urban design, a discipline established in the United States during Ricci's first stay at M.I.T., in which he found the balance point between architecture and urban planning, between the architect's sign and his being anonymous, between the collective and the individual dimension. Together with the notions of "anonymous architecture" and "form-act", Urban Design and "open work" are the key words for understanding Ricci's work in the United States and in Italy. Urban design's main goal, to design the city as a collective work of art, was the solution to the dichotomous research that enlivened Ricci's work and one possible answer to the tension that drove him to seek the truth of architecture.
Abstract:
The dissertation addresses the still unsolved challenges of source-based digital 3D reconstruction, visualisation, and documentation in the domains of archaeology, art, and architecture history. The emerging BIM methodology and the IFC data exchange format are changing how collaboration, visualisation, and documentation are carried out in the planning, construction, and facility management process. The introduction and development of the Semantic Web (Web 3.0), spreading the idea of structured, formalised, and linked data, offers semantically enriched human- and machine-readable data. In contrast to civil engineering and cultural heritage, academic object-oriented disciplines such as archaeology, art, and architecture history act as outside spectators. Since the 1990s, it has been argued that a 3D model is not likely to be considered a scientific reconstruction unless it is grounded on accurate documentation and visualisation. However, these standards are still missing, and validation of the outcomes is not achieved. Meanwhile, digital research data remain ephemeral and continue to fill growing digital cemeteries. This study therefore focuses on the evaluation of source-based digital 3D reconstructions and, especially, on uncertainty assessment in the case of hypothetical reconstructions of destroyed or never-built artefacts according to scientific principles, making the models shareable and reusable by a potentially wide audience. The work initially focuses on terminology and on the definition of a workflow related especially to the classification and visualisation of uncertainty. The workflow is then applied to specific cases of 3D models uploaded to the DFG repository of the AI Mainz. In this way, the available methods of documenting, visualising, and communicating uncertainty are analysed.
In the end, this process will lead to a validation or a correction of the workflow and the initial assumptions, but also, when dealing with different hypotheses, to a better definition of the levels of uncertainty.
Abstract:
Deep Neural Networks (DNNs) have revolutionized a wide range of applications beyond traditional machine learning and artificial intelligence fields, e.g., computer vision, healthcare, natural language processing and others. At the same time, edge devices have become central in our society, generating an unprecedented amount of data which could be used to train data-hungry models such as DNNs. However, the potentially sensitive or confidential nature of gathered data poses privacy concerns when storing and processing them in centralized locations. To this purpose, decentralized learning decouples model training from the need of directly accessing raw data, by alternating on-device training and periodic communications. The ability of distilling knowledge from decentralized data, however, comes at the cost of facing more challenging learning settings, such as coping with heterogeneous hardware and network connectivity, statistical diversity of data, and ensuring verifiable privacy guarantees. This Thesis proposes an extensive overview of decentralized learning literature, including a novel taxonomy and a detailed description of the most relevant system-level contributions in the related literature for privacy, communication efficiency, data and system heterogeneity, and poisoning defense. Next, this Thesis presents the design of an original solution to tackle communication efficiency and system heterogeneity, and empirically evaluates it on federated settings. For communication efficiency, an original method, specifically designed for Convolutional Neural Networks, is also described and evaluated against the state-of-the-art. Furthermore, this Thesis provides an in-depth review of recently proposed methods to tackle the performance degradation introduced by data heterogeneity, followed by empirical evaluations on challenging data distributions, highlighting strengths and possible weaknesses of the considered solutions. 
Finally, this Thesis presents a novel perspective on the use of Knowledge Distillation as a means of optimizing decentralized learning systems in settings characterized by data or system heterogeneity. Our vision of relevant future research directions closes the manuscript.
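The federated settings evaluated above typically revolve around periodic aggregation of locally trained models. A common aggregation rule is FedAvg-style weighted averaging by local dataset size; the sketch below uses plain lists of floats (real systems use tensors) and is an illustration of the general scheme, not of the thesis's specific method.

```python
# Sketch of the weighted model averaging at the heart of many
# federated/decentralized learning schemes (FedAvg-style).

def federated_average(client_params, client_sizes):
    """Average client parameter vectors, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    avg = [0.0] * dim
    for params, n in zip(client_params, client_sizes):
        w = n / total  # larger clients contribute more
        for i, p in enumerate(params):
            avg[i] += w * p
    return avg

# Two clients with heterogeneous data amounts: the larger client
# pulls the global model toward its local parameters.
global_model = federated_average(
    client_params=[[1.0, 2.0], [3.0, 4.0]],
    client_sizes=[30, 10],
)
# weights 0.75 and 0.25 -> [1.5, 2.5]
```

Data heterogeneity shows up precisely here: when clients hold statistically different data, this plain average can degrade, which motivates the corrective methods the Thesis reviews.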
Abstract:
This thesis reports on the two main areas of our research: introductory programming as the traditional way of accessing informatics, and the cultural teaching of informatics through unconventional pathways. The research on introductory programming aims to overcome challenges in traditional programming education and thus increase participation in informatics. Improving access to informatics enables individuals to pursue more and better professional opportunities and to contribute to advancements in informatics. We aimed to balance active, student-centered activities with optimal support for novices at their level. Inspired by Productive Failure and exploring the concept of the notional machine, our work focused on developing Necessity Learning Design, a design to help novices tackle new programming concepts. Using this design, we implemented a learning sequence to introduce arrays and evaluated it in a real high-school context. The subsequent chapters discuss our experiences teaching CS1 in a remote-only scenario during the COVID-19 pandemic, and our collaborative effort with primary school teachers to develop a learning module for teaching iteration using a visual programming environment. The research on teaching informatics principles through unconventional pathways, such as cryptography, aims to introduce informatics to a broader audience, particularly younger individuals who are less technically and professionally oriented. It emphasizes the importance of understanding the cultural and scientific aspects of informatics, focusing on its societal value and on its principles for active citizenship. After reflecting on computational thinking, and inspired by the big ideas of science and informatics, we describe our hands-on approach to teaching cryptography in high school, which leverages its key scientific elements to emphasize its social aspects.
Additionally, we present an activity for teaching public-key cryptography using graphs, exploring fundamental concepts and methods of informatics and mathematics and their interdisciplinarity. In broadening the understanding of informatics, these research initiatives also aim to foster motivation and prepare students for more professional learning of informatics.
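The thesis's classroom activity is graph-based; as a simpler, more widely known illustration of the public-key idea suitable for the same kind of hands-on lesson, here is a toy Diffie-Hellman key exchange with deliberately tiny, insecure numbers. This is a substitute example, not the activity described in the abstract.

```python
# Toy Diffie-Hellman key exchange with tiny (insecure) parameters,
# as a hands-on illustration of the public-key idea: a shared secret
# is derived even though only public values are exchanged.

P, G = 23, 5                            # public modulus and generator

alice_secret, bob_secret = 6, 15        # private keys, never exchanged
alice_public = pow(G, alice_secret, P)  # G^a mod P, sent in the clear
bob_public = pow(G, bob_secret, P)      # G^b mod P, sent in the clear

# Each side combines the other's public value with its own secret;
# both arrive at G^(a*b) mod P without ever revealing a or b.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
```

With pencil-and-paper-sized numbers like these, students can verify every step by hand, which is what makes the format work in a high-school setting.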
Abstract:
Machine Learning makes computers capable of performing tasks that typically require human intelligence. A domain where it is having a considerable impact is the life sciences, enabling new biological analysis protocols, faster and more efficient development of patient treatments, and reduced healthcare costs. This Thesis presents new Machine Learning methods and pipelines for the life sciences, focusing on the unsupervised field. At the methodological level, two methods are presented. The first, "Ab Initio Local Principal Path", is a revised and improved version of a pre-existing algorithm in the manifold learning realm. The second contribution improves the Import Vector Domain Description (one-class learning) through the Kullback-Leibler divergence; it hybridizes kernel methods with Deep Learning, obtaining a scalable solution, an improved probabilistic model, and state-of-the-art performance. Both methods are tested through several experiments, with a central focus on their relevance to the life sciences, and the results show that they improve on the performance of their previous versions. At the applicative level, two pipelines are presented. The first, for the analysis of RNA-Seq datasets (both transcriptomic and single-cell data), aims at identifying genes that may be involved in biological processes (e.g., the transition of tissues from normal to cancer); for this project, an R package is released on CRAN to make the pipeline accessible to the bioinformatics community through high-level APIs. The second pipeline, in the drug discovery domain, is useful for identifying druggable pockets, namely regions of a protein with a high probability of accepting a small molecule (a drug). Both pipelines achieve remarkable results. Lastly, a side application is developed to identify the strengths and limitations of the "Principal Path" algorithm by analyzing vector spaces induced by Convolutional Neural Networks.
This application is conducted in the music and visual arts domains.
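The Kullback-Leibler divergence mentioned above measures how one probability distribution diverges from another; for discrete distributions it is KL(P||Q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal illustration follows; the distributions are made up, and this sketch says nothing about how the thesis integrates the divergence into the one-class learner.

```python
# Discrete Kullback-Leibler divergence: KL(P || Q) = sum p_i * log(p_i / q_i).
# Non-negative, and zero exactly when P == Q.

import math

def kl_divergence(p, q):
    # Assumes valid distributions with q_i > 0 wherever p_i > 0;
    # terms with p_i == 0 contribute 0 by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]          # hypothetical reference distribution
q = [0.9, 0.1]          # hypothetical model distribution
d = kl_divergence(p, q)  # positive: q diverges from p
```

Note that KL is asymmetric (KL(P||Q) ≠ KL(Q||P) in general), which is one reason its role in a model must be chosen deliberately.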
Abstract:
Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, due to the possible generation of cascading events resulting in severe technological accidents (so-called Natech scenarios). Natural events may damage storage and process equipment containing hazardous substances, whose release may lead to major accident scenarios. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to quantify Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models that estimate the damage probability of an equipment item or pipeline segment as a result of the impact of the natural event. Therefore, the first aim of the PhD project was to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes, and wind. The project also aimed at developing new vulnerability models to fill gaps in the literature; in particular, vulnerability models were developed for vertical equipment subject to wind and to flood. Finally, to improve the calculation of Natech risk for linear sources, an original quantitative risk assessment methodology was developed for pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for the assessment of linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
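Vulnerability models of the kind surveyed map a natural-event intensity (flood depth, peak ground acceleration, wind speed) to a damage probability. A common functional form in the fragility literature is the lognormal curve sketched below; the parameters are illustrative assumptions, not values from the thesis.

```python
# Lognormal fragility curve, a common form for equipment vulnerability
# models: P(damage | intensity) = Phi(ln(intensity / median) / beta),
# where Phi is the standard normal CDF. Parameters are hypothetical.

import math

def fragility(intensity, median, beta):
    """Damage probability at a given event intensity."""
    z = math.log(intensity / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# e.g. a vertical tank under flood: assumed median critical depth of
# 1.0 m and log-standard deviation 0.5 (made-up parameters)
p_low = fragility(0.5, median=1.0, beta=0.5)   # shallow flood
p_med = fragility(1.0, median=1.0, beta=0.5)   # at the median: 0.5
p_high = fragility(2.0, median=1.0, beta=0.5)  # deep flood
```

In a quantitative Natech risk assessment, such probabilities are combined with event frequencies and consequence models over all equipment items or pipeline segments.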