766 results for Algebraic thinking
Abstract:
Current reform initiatives recommend that geometry instruction include the study of three-dimensional geometric objects and provide students with opportunities to use spatial skills in problem-solving tasks. Geometer's Sketchpad (GSP) is a dynamic, interactive computer program that enables the user to investigate and explore geometric concepts and manipulate geometric structures. Research using GSP as an instructional tool has focused primarily on teaching and learning two-dimensional geometry. This study explored the effect of a GSP-based instructional environment on students' geometric thinking and three-dimensional spatial ability as they used GSP to learn three-dimensional geometry. For 10 weeks, 18 tenth-grade students from an urban school district used GSP to construct and analyze dynamic, two-dimensional representations of three-dimensional objects in a classroom environment that encouraged exploration, discussion, conjecture, and verification. The data were collected primarily from participant observations and clinical interviews and analyzed using qualitative methods. In addition, pretest and posttest measures of three-dimensional spatial ability and van Hiele level of geometric thinking were obtained; spatial ability measures were analyzed using a standard t-test. The data from this study indicate that GSP is a viable tool for teaching students about three-dimensional geometric objects. A comparison of students' pretest and posttest van Hiele levels showed an improvement in geometric thinking, especially for students at the lower levels of the van Hiele theory. Students' spatial ability improved significantly (p < .05). Specifically, GSP's dynamic, visual environment supported students' visualization and reasoning processes as they attempted to solve challenging tasks about three-dimensional geometric objects.
The GSP instructional activities also provided students with an experiential base and an intuitive understanding of three-dimensional objects from which more formal work in geometry could be pursued. This study demonstrates that by designing appropriate GSP-based instructional environments, it is possible to help students improve their spatial skills, develop more coherent and accurate intuitions about three-dimensional geometric objects, and progress through the levels of geometric thinking proposed by van Hiele.
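The pretest/posttest comparison of spatial ability described above rests on a paired-sample t-test. A minimal sketch of that computation follows, using only the Python standard library; the scores shown are hypothetical illustrations, not the study's actual data.

```python
# Paired (pre/post) t-test sketch with hypothetical data -- illustration only.
import math
import statistics

def paired_t(pre, post):
    """Return the paired-sample t statistic for the post - pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample standard deviation (n - 1)
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical pretest/posttest spatial-ability scores for eight students.
pre  = [10, 12, 9, 14, 11, 13, 10, 12]
post = [13, 14, 12, 16, 12, 15, 13, 14]
print(round(paired_t(pre, post), 2))  # 9.0
```

The resulting t statistic would then be compared against the critical value of the t distribution with n − 1 degrees of freedom at the chosen significance level (here, α = .05).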
Abstract:
Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA’s behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
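The central definition above can be made concrete: an NFA recognizes a partial when it accepts every string the partial specifies as in the language and rejects every string it specifies as out, with behavior on unspecified strings left unconstrained. The following is a minimal sketch of that check (not the dissertation's ibas algorithm); the NFA representation and function names are assumptions for illustration.

```python
# Sketch of the "NFA recognizes a partial" condition -- not the ibas algorithm.

def nfa_accepts(delta, start, accept, word):
    """Subset simulation: delta maps (state, symbol) -> set of next states."""
    current = {start}
    for symbol in word:
        current = set().union(*(delta.get((q, symbol), set()) for q in current))
    return bool(current & accept)

def recognizes_partial(delta, start, accept, positive, negative):
    """True iff the NFA accepts every 'positive' string and no 'negative' one."""
    return (all(nfa_accepts(delta, start, accept, w) for w in positive) and
            not any(nfa_accepts(delta, start, accept, w) for w in negative))

# A 2-state unary NFA accepting a^n for n >= 1: state 0 loops on 'a' and may
# also move to the accepting state 1.
delta = {(0, 'a'): {0, 1}}
print(recognizes_partial(delta, 0, {1}, ['a', 'aa', 'aaa'], ['']))  # True
```

Because the partial leaves most strings unspecified, many non-equivalent NFAs can recognize the same partial, which is why a minimal NFA for a partial is also minimal for any language extending it.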
Abstract:
Through action research, the researchers engaged a group of third-grade children in a project that offered a variety of learning activities to develop the children's perspective-taking ability. As a result, the teachers found a significant increase in the students' emotional connection to the characters in the literature.
Abstract:
A Learning Assistant program that recruits strong STEM undergraduates to become mathematics teachers was explored through a qualitative study. Three program participants were purposively selected and interviewed. The program reaffirmed one participant's choice to become a teacher and clarified for another that teaching might be a career for him.
Abstract:
This paper makes a case for a direct relationship between digital literacy and nonlinear thinking styles, articulates a demand for nonlinear thinking styles in education and the workplace, and states implications for a connection between nonlinear thinking styles, visual literacy, and intuitive artistic practice.
Abstract:
This flyer promotes the event "Contemporary Cuban Culture: Notes on Alternative Thinking," a lecture by Madeline Cámara Betancourt. Credit for image on flyer: Baruj Salinas, Punta Cana VI, 1999.
Abstract:
The purpose of this study was to determine the knowledge and use of critical thinking teaching strategies by full-time and part-time faculty in Associate Degree Nursing (ADN) programs. Sander's CTI (1992) instrument was adapted for this study and pilot-tested prior to the general administration to ADN faculty in Southeast Florida. This modified instrument, now termed the Burroughs Teaching Strategy Inventory (BTSI), returned reliability estimates (Cronbach alphas of .71, .74, and .82 for the three constructs) comparable to the original instrument. The BTSI was administered to 113 full-time and part-time nursing faculty in three community college nursing programs. The response rate was 92% for full-time faculty (n = 58) and 61% for part-time faculty (n = 55). The majority of participants supported a combined definition of critical thinking in nursing: a composite of thinking skills that included reflective thinking, assessing alternative viewpoints, and the use of problem-solving. Full-time and part-time faculty used different teaching strategies. Full-time faculty most often used multiple-choice exams and lecture, while part-time faculty most frequently used discussion within their classes. One possible explanation for these choices and differences is that full-time faculty taught predominantly theory classes, where certain strategies would be more appropriate, while part-time faculty taught predominantly clinical classes. Both faculty types selected written nursing care plans as the second most effective critical thinking strategy. Faculty identified several strategies as effective in teaching critical thinking, including discussion, case studies, higher-order questioning, and concept analysis. These, however, were not always the strategies used in either the classroom or the clinical setting.
Based on this study, the author recommends that if the profession continues to stress critical thinking as a vital component of practice, nursing faculty should receive education in appropriate critical thinking teaching strategies. Both in-service seminars and workshops could be used to further faculty's knowledge and use of critical thinking strategies. Qualitative research should be done to determine why nursing faculty use the teaching strategies they select.
Abstract:
Dr. Madeline Cámara Betancourt gives a lecture that considers several key cultural, political, and literary events as "crossroads" that have generated alternative thinking in the quest for a Cuban identity after 1959.
Abstract:
We study the algebraic and topological genericity of certain subsets of locally recurrent functions, obtaining (among other results) algebrability and spaceability within these classes.
Abstract:
Funding: This work was supported by the German Research Foundation [DFG grant SFB 940/1]. Acknowledgements: We thank Lia Kvavilashvili for her helpful comments on this study during the International Conference on Prospective Memory (ICPM4) in Naples, Italy, 2014, and Daniel P. Sheppard for proofreading the manuscript.
Abstract:
My thesis thinks through the ways Newtonian logics require linear mobility in order to produce narratives of progress. I argue that this linear mobility, and the resulting logics, potentially erases the chaotic and non-linear motions that are required to navigate a colonial landscape. I suggest that these non-linear movements produce important critiques of the seeming stasis of colonial constructs and highlight the ways these logics must appear neutral and scientific in an attempt to conceal the constant and complex adjustments these frameworks require. In order to make room for these complex motions, I develop a quantum intervention. Specifically, I use quantum physics as a metaphor to think through the significance of black life, the double-consciousness of land, and the intricate motions of sound. In order to put forth this intervention, I look at news coverage of Hurricane Katrina, Du Bois's characterization of land in The Souls of Black Folk, and the aural mobilities of blackness articulated in an academic discussion and interview about post-humanism.