970 results for theoretical basis


Relevance:

100.00%

Publisher:

Abstract:

This book aims to offer a different overview of language teaching and to fill the gap frequently encountered by candidates for the Official Schools of Languages and Secondary Education teaching corps examinations when they look for information covering a broad syllabus. The work combines a theoretical basis, covering aspects of theories of language and of teaching, with a practical curriculum design. The first part consists of four blocks: 1) The communication process, language functions, and oral and written communication. 2) Communicative competence and the communicative teaching of English. 3) Theories of foreign language learning and acquisition, the concept of interlanguage, and the treatment of error. 4) The evolution of language teaching. The second part deals with curriculum design: the main objectives pursued in language teaching, the contents, the methodology to follow with students, and the assessment criteria. It is a treatise based on the author's own personal experience as a teacher of English.

Relevance:

100.00%

Publisher:

Abstract:

Most parameterizations for precipitating convection in use today are bulk schemes, in which an ensemble of cumulus elements with different properties is modelled as a single, representative entraining-detraining plume. We review the underpinning mathematical model for such parameterizations, in particular by comparing it with spectral models, in which elements are not combined into a representative plume. The chief merit of a bulk model is that the representative plume can be described by an equation set with the same structure as that which describes each element in a spectral model. The equivalence relies on an ansatz for detrained condensate introduced by Yanai et al. (1973) and on a simplified microphysics. There are also conceptual differences in the closure of bulk and spectral parameterizations. In particular, we show that the convective quasi-equilibrium closure of Arakawa and Schubert (1974) for spectral parameterizations cannot be carried over to a bulk parameterization in a straightforward way. Quasi-equilibrium of the cloud work function assumes a timescale separation between a slow forcing process and a rapid convective response. But for the natural bulk analogue of the cloud work function (the dilute CAPE), the relevant forcing is characterised by a different timescale, and so its quasi-equilibrium entails a different physical constraint. Closures of bulk parameterizations that use the non-entraining parcel value of CAPE do not suffer from this timescale issue. However, the Yanai et al. (1973) ansatz must be invoked as a necessary ingredient of those closures.
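The distinction between the dilute CAPE (the buoyancy integral of an entraining parcel) and the non-entraining parcel value of CAPE can be illustrated with a toy calculation. Everything below is an idealized sketch: the environmental profile, the lapse rates, and the entrainment rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

g = 9.81           # gravity (m/s^2)
eps = 2e-4         # assumed fractional entrainment rate (1/m), illustrative
dz = 50.0
z = np.arange(0.0, 10000.0, dz)

# Idealized environment: 300 K at the surface, lapse rate 6.5 K/km
T_env = 300.0 - 0.0065 * z

def cape(entrainment):
    """Integrate parcel buoyancy; an entraining parcel mixes toward the environment."""
    Tp = 302.0                       # parcel starts 2 K warmer than the environment
    total = 0.0
    for k in range(len(z)):
        b = g * (Tp - T_env[k]) / T_env[k]
        if b > 0.0:
            total += b * dz
        Tp -= 0.005 * dz                           # simplified moist cooling, 5 K/km
        Tp += entrainment * dz * (T_env[k] - Tp)   # dilution by entrained air
    return total

undilute_cape = cape(0.0)
dilute_cape = cape(eps)
print(undilute_cape, dilute_cape)   # dilution reduces the buoyancy integral
```

Even in this crude setup the two quantities respond differently to the same environment, which is the root of the closure issue discussed above.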

Relevance:

100.00%

Publisher:

Abstract:

This work is part of a study that analyzed the contributions of didactic activities related to the rhetorical characteristics of scientific language, aimed at developing students' ability to identify such characteristics in chemistry scientific texts and to read those texts critically. In this study, we present the theoretical basis adopted to characterize scientific discourse and to produce the didactic material used in those activities. Studies by Latour, Coracini, and Campanario on the persuasive rhetorical strategies present in scientific articles guided the production of that material.

Relevance:

100.00%

Publisher:

Abstract:

The power of computer game technology is currently being harnessed to produce “serious games”. These “games” are targeted at the education and training marketplace, and employ key game-engine components such as the graphics and physics engines to produce realistic “digital-world” simulations of the real “physical world”. Many approaches are driven by the technology and often lack a firm pedagogical underpinning. The authors believe that an analysis and deployment of both the technological and pedagogical dimensions should occur together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions and considers how “pedagogy may inform the use of technology”: how various learning theories may be mapped onto the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to theories of knowledge structures, especially “concept maps”. The temporal dimension is related to “experiential learning”, especially the approach of Kolb. The multi-player aspect of serious games is related to theories of “collaborative learning”, which is broken down into a discussion of “discourse” versus “dialogue”. Several general guiding principles are explored, such as the use of “metaphor” (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the “serious game community”.

Relevance:

100.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to propose a theoretical framework, based on contemporary philosophical aesthetics, from which principled assessments of the aesthetic value of information organization frameworks may be conducted.

Design/methodology/approach – This paper identifies appropriate discourses within the field of philosophical aesthetics and constructs from them a framework for assessing aesthetic properties of information organization frameworks. This framework is then applied in two case studies examining the Library of Congress Subject Headings (LCSH) and Sexual Nomenclature: A Thesaurus.

Findings – In both information organization frameworks studied, the aesthetic analysis was useful in identifying judgments of the frameworks as aesthetic judgments, in promoting discovery of further areas of aesthetic judgment, and in prompting reflection on the nature of these aesthetic judgments.

Research limitations/implications – This study provides proof of concept for the aesthetic evaluation of information organization frameworks. Areas of future research include the role of cultural relativism in such aesthetic evaluation and the identification of appropriate aesthetic properties of information organization frameworks.

Practical implications – By identifying a subset of judgments of information organization frameworks as aesthetic judgments, aesthetic evaluation of such frameworks can be made explicit and principled. Aesthetic judgments can be separated from questions of economic feasibility, functional requirements, and user orientation. Design and maintenance of information organization frameworks can be based on these principles.

Originality/value – This study introduces a new evaluative axis for information organization frameworks based on philosophical aesthetics. By improving the evaluation of such novel frameworks, design and maintenance can be guided by these principles.

Keywords – Evaluation, Research methods, Analysis, Bibliographic systems, Indexes, Retrieval languages

Relevance:

70.00%

Publisher:

Abstract:

C(13)H(16)Cl(2)Te, M(r) = 370.76, P2(1)/a, a = 8.1833(8), b = 8.4163(8), c = 20.787(2) Å, β = 99.52(1)°, Z = 4, R(1) = 0.0275. The primary coordination around the Te(IV) atom is consistent with a pseudo-trigonal-bipyramidal bond configuration, with the two Cl atoms occupying the axial positions while the C atoms and the lone pair of electrons occupy the equatorial positions. The Te(IV) atom is involved in an intermolecular secondary interaction, resulting in the self-assembly of a supramolecular array of zigzag chains. In order to determine the theoretical basis set for the Te atom that leads to the best agreement with the experimental data, a large series of geometry optimizations was performed on dichlorodimethyl Te(IV) as a model compound, and the results were compared with the mean distances and angles obtained from 45 X-ray structures. The Ahlrichs basis set plus the Hay & Wadt ECP was selected and used for a series of calculations performed on the title compound.
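As a quick consistency check on the reported cell, the monoclinic unit-cell volume (V = abc·sin β) and the implied crystal density can be computed directly from the parameters quoted above; the resulting density is a derived illustration, not a value stated in the abstract.

```python
import math

# Cell parameters from the abstract (monoclinic, P2(1)/a)
a, b, c = 8.1833, 8.4163, 20.787          # Angstroms
beta_deg, Z, Mr = 99.52, 4, 370.76        # degrees, formula units per cell, g/mol
NA = 6.02214076e23                        # Avogadro's number

V = a * b * c * math.sin(math.radians(beta_deg))   # cell volume in A^3
density = Z * Mr / (NA * V * 1e-24)                # g/cm^3 (1 A^3 = 1e-24 cm^3)
print(V, density)
```

The computed density of roughly 1.7 g/cm^3 is in the range typical of organotellurium halides, which supports the internal consistency of the reported cell.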

Relevance:

70.00%

Publisher:

Abstract:

Background

Blood leukocytes constitute two interchangeable sub-populations, the marginated and circulating pools. These two sub-compartments are found under normal conditions and are potentially affected by abnormal situations, whether pathological or physiological. The dynamics between the compartments is governed by rate constants of margination (M) and return to circulation (R). Estimates of M and R may therefore prove of great importance to a deeper understanding of many conditions. However, there has been a lack of formalism for approaching such estimates. The few attempts to furnish estimates of M and R neither rely on clearly stated models that specify precisely which rate constant is being estimated, nor recognize which factors may influence the estimation.

Results

The return of the blood pools to steady-state values after a perturbation (e.g., an epinephrine injection) was modeled by a second-order differential equation. This equation has two eigenvalues, related to a fast and a slow component of the dynamics. The model makes it possible to show that these components are partitioned among three constants: R, M and SB, where SB is a time-invariant rate constant of exit to the tissues. Three examples of the computations are worked out, and a tentative estimate of R for mouse monocytes is presented.

Conclusions

This study establishes a firm theoretical basis for estimating the rate constants of the dynamics between the blood sub-compartments of white cells. It shows, for the first time, that the estimation must also take into account the exit-to-tissues rate constant, SB.
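One plausible linear two-compartment arrangement behind such a second-order equation can be sketched as follows. The rate-constant values and the exact placement of the tissue-exit term SB are illustrative assumptions for the sketch, not the paper's fitted estimates.

```python
import numpy as np

# Assumed illustrative rate constants (1/h)
M, R, SB = 0.8, 0.5, 0.1

# Linear system for circulating (C) and marginated (Mg) pools:
#   dC/dt  = -M*C + R*Mg
#   dMg/dt =  M*C - (R + SB)*Mg     (SB: time-invariant exit to tissues)
A = np.array([[-M, R],
              [M, -(R + SB)]])

# Two negative eigenvalues: the fast and the slow relaxation modes
fast, slow = np.sort(np.linalg.eigvals(A).real)
print(fast, slow)
```

With these numbers the two eigenvalues differ by more than a factor of twenty, which is exactly the fast/slow partition of the dynamics the abstract describes; fitting both modes to perturbation data is what allows M, R and SB to be disentangled.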

Relevance:

70.00%

Publisher:

Abstract:

Although the Standard Model of particle physics (SM) provides an extremely successful description of ordinary matter, astronomical observations show that it accounts for only around 5% of the total energy density of the Universe, whereas around 30% is contributed by dark matter. Motivated by anomalies in cosmic-ray observations and by attempts to resolve puzzles of the SM such as the (g-2)_mu discrepancy, proposed U(1) extensions of the SM gauge group have attracted attention in recent years. In the considered U(1) extensions, a new light messenger particle, the hidden photon, couples to the hidden sector as well as to the electromagnetic current of the SM through kinetic mixing. This permits a search for the particle in laboratory experiments exploring the electromagnetic interaction. Various experimental programs have been started to search for hidden photons, for instance in electron-scattering experiments, which are a versatile tool for exploring a range of physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies, as performed at MAMI or at JLAB. These experiments investigate the scattering of an electron beam off a hadronic target, e+(A,Z)->e+(A,Z)+l^+l^-, and search for a very narrow resonance in the invariant mass distribution of the lepton pair. This requires an accurate understanding of the theoretical basis of the underlying processes. To this end, the first part of this work demonstrates how the hidden photon can be motivated from existing puzzles encountered at the precision frontier of the SM. The main part of this thesis deals with the theoretical framework for electron-scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for the bremsstrahlung emission of hidden photons in such experiments is studied.

Based on these results, the applicability of the Weizsäcker-Williams approximation for calculating the signal cross section of the process, which is widely used in the design of such experimental setups, is investigated. Next, the reaction e+(A,Z)->e+(A,Z)+l^+l^- is analyzed as both signal and background process in order to describe existing data obtained by the A1 experiment at MAMI, with the aim of giving accurate predictions of exclusion limits for the hidden-photon parameter space. Finally, the derived methods are used to make predictions for future experiments, e.g., at MESA or at JLAB, allowing a comprehensive study of the discovery potential of these complementary experiments. In the last part, a feasibility study for probing the hidden-photon model with rare kaon decays is performed. To this end, invisible as well as visible decays of the hidden photon are considered within different classes of models. This makes it possible to derive bounds on the parameter space from existing data and to estimate the reach of future experiments.
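The resonance search described above rests on reconstructing the invariant mass of the lepton pair from its measured four-momenta, m² = (E₁+E₂)² − |p₁+p₂|². A minimal sketch of that reconstruction follows; the four-momenta are made-up values for a hypothetical 50 MeV state decaying at rest, with the lepton masses neglected.

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a lepton pair from four-momenta (E, px, py, pz), in GeV."""
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Hypothetical 0.05 GeV particle decaying at rest into back-to-back leptons
p_plus = (0.025, 0.025, 0.0, 0.0)
p_minus = (0.025, -0.025, 0.0, 0.0)
m = invariant_mass(p_plus, p_minus)
print(m)   # recovers the parent mass, 0.05 GeV
```

In the experiment, a histogram of this quantity over many events is scanned for a narrow bump above the smooth QED background.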

Relevance:

70.00%

Publisher:

Abstract:

Students are now involved in a vastly different textual landscape from that of many English scholars, one that relies on the “reading” and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games into English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory; combined to account for the multiple modalities and complexities of gaming, these elements can provide new insights about those theories and practices across all kinds of media, whether written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective encompassing textual construction, use, and interpretation. To foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for the textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses.

This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can come to view, and transform, the systems in their own lives as audiences, citizens, and workers.

Relevance:

70.00%

Publisher:

Abstract:

The theoretical basis for evaluating shear strength in rock joints is presented and used to derive an equation that governs the relationship between the tangential and normal stresses on the joint during slippage between the joint faces. The dependent variables include geometric dilatancy, the instantaneous friction angle, and a parameter that accounts for joint surface roughness. The effect of roughness is studied, and the aforementioned formula is used to analyse joints under different conditions. A mathematical expression is deduced that explains Barton's value for the joint roughness coefficient (JRC) in terms of the roughness geometry. In particular, when the Hoek and Brown failure criterion is used for the rock in contact with the surface roughness plane, it is possible to determine the shear strength of the joint as a function of the ratio of the uniaxial compressive strength of the wall to the normal stress acting on it. Finally, theoretical results obtained for the geometry of a three-dimensional joint are compared with those of Barton's formulation.
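For reference, Barton's empirical criterion relates peak shear strength to the normal stress, the JRC, and the joint wall compressive strength (JCS) as τ = σₙ·tan(φᵣ + JRC·log₁₀(JCS/σₙ)). It can be evaluated directly; the numerical inputs below are illustrative, not taken from the paper.

```python
import math

def barton_shear_strength(sigma_n, jrc, jcs, phi_r_deg):
    """Barton's empirical peak shear strength of a rough rock joint.
    sigma_n: normal stress (MPa); jrc: joint roughness coefficient;
    jcs: joint wall compressive strength (MPa); phi_r_deg: residual friction angle (deg)."""
    angle_deg = phi_r_deg + jrc * math.log10(jcs / sigma_n)
    return sigma_n * math.tan(math.radians(angle_deg))

# Illustrative comparison: a rougher joint (higher JRC) carries more shear stress
tau_smooth = barton_shear_strength(2.0, jrc=4.0, jcs=100.0, phi_r_deg=30.0)
tau_rough = barton_shear_strength(2.0, jrc=16.0, jcs=100.0, phi_r_deg=30.0)
print(tau_smooth, tau_rough)
```

Note the dependence on the ratio JCS/σₙ, the same ratio of wall strength to normal stress that the abstract identifies as controlling joint shear strength.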

Relevance:

70.00%

Publisher:

Abstract:

Neural networks have often been motivated by superficial analogy with biological nervous systems. Recently, however, it has become widely recognised that the effective application of neural networks requires instead a deeper understanding of the theoretical foundations of these models. Insight into neural networks comes from a number of fields including statistical pattern recognition, computational learning theory, statistics, information geometry and statistical mechanics. As an illustration of the importance of understanding the theoretical basis for neural network models, we consider their application to the solution of multi-valued inverse problems. We show how a naive application of the standard least-squares approach can lead to very poor results, and how an appreciation of the underlying statistical goals of the modelling process allows the development of a more general and more powerful formalism which can tackle the problem of multi-modality.
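The failure mode described here can be reproduced with a classic toy inverse problem (of the kind discussed by Bishop): the forward map t = x + 0.3·sin(2πx) is single-valued, but its inverse is not. Least squares trained to predict x from t converges to the conditional mean E[x | t], a single value that cannot represent the several valid branches. The setup below is an illustrative sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward problem: single-valued map plus small noise
x = rng.uniform(0.0, 1.0, 5000)
t = x + 0.3 * np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.01, x.size)

# Inverse problem: predict x from t. Inspect the data in a narrow slice of t:
# the slice contains several well-separated branches of x.
xs = np.sort(x[np.abs(t - 0.5) < 0.02])
gaps = np.diff(xs)

cond_mean = xs.mean()   # the single value a least-squares model would output
print(cond_mean, xs.min(), xs.max(), gaps.max())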

Relevance:

60.00%

Publisher:

Abstract:

Over the last decade, the combined use of chemometrics and molecular spectroscopic techniques has become a new alternative for direct drug determination, without the need for physical separation. Among the new methodologies developed, the application of PARAFAC to the decomposition of spectrofluorimetric data deserves to be highlighted. The first objective of this article is to describe the theoretical basis of PARAFAC. To this end, we first discuss the order of the chemometric methods used in multivariate calibration and the development of multi-dimensional methods. The other objective of this article is to present to the Brazilian chemical community the potential of combining PARAFAC with spectrofluorimetry for the determination of drugs in complex biological matrices. To this end, two applications are presented, aimed at determining doxorubicin and salicylate, respectively, in human plasma.
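The core of PARAFAC is a trilinear decomposition of a three-way array, fitted by alternating least squares (ALS). The sketch below builds a noiseless synthetic excitation × emission × sample array from known factors and recovers it with a minimal NumPy ALS loop; the data, rank, and iteration count are illustrative assumptions, and a production analysis would use a dedicated library.

```python
import numpy as np

rng = np.random.default_rng(1)

def kr(u, v):
    """Khatri-Rao (column-wise Kronecker) product."""
    return (u[:, None, :] * v[None, :, :]).reshape(-1, u.shape[1])

# Synthetic EEM-like data: 2 fluorophores, excitation x emission x sample
R, I, J, K = 2, 20, 30, 8
A_true = rng.random((I, R))   # excitation profiles
B_true = rng.random((J, R))   # emission profiles
C_true = rng.random((K, R))   # concentrations per sample
X = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

# PARAFAC by alternating least squares on the three unfoldings
A = rng.random((I, R)); B = rng.random((J, R)); C = rng.random((K, R))
for _ in range(200):
    A = X.transpose(0, 2, 1).reshape(I, -1) @ kr(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
    B = X.transpose(1, 2, 0).reshape(J, -1) @ kr(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
    C = X.transpose(2, 1, 0).reshape(K, -1) @ kr(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))

X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(rel_err)   # small: the trilinear model reproduces the data
```

The "second-order advantage" exploited in the plasma applications comes from this structure: the sample mode (C) carries the concentrations, so an analyte can be quantified even in the presence of uncalibrated interferents.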

Relevance:

60.00%

Publisher:

Abstract:

In this article my main objective is to address some questions related to the tests used in the psychodiagnostic processes of children considered to have learning difficulties. Taking Foucault's Discipline and Punish (1975/1986) as a theoretical basis, I investigate the hypothesis that the child is deemed ill or abnormal owing to factors related to imposed norms rather than to organic aspects and/or neurological pathology. My interest is to analyse the signs that allow us to identify the conception of written language underlying the tests. This is the language that is expected and privileged, yet it may not be the language that the child uses and experiences every day.