300 results for G00-1417


Relevance: 10.00%

Abstract:

New law on school organization in Austria. Text in French.

Relevance: 10.00%

Abstract:

Objective: to determine whether hypermedia systems support the learning of complex knowledge domains. Sample: 12 first- and second-year Pedagogy students at the Universidad Pontificia de Salamanca. The study is grounded in one of the most recent constructivist approaches, the Cognitive Flexibility Theory developed by Spiro and colleagues. The first part of the work presents the theoretical foundations that ground, justify and contextualize the research proposal; the second part addresses multimedia systems and the development of hypermedia environments; the third part presents the investigation itself, centred on applying Cognitive Flexibility Theory to hypermedia environments. Starting from two hypermedia programs built with an authoring system (Macromedia Director), the study seeks to verify some of the theory's postulates and to show the computer's importance as a facilitator of learning activities, rather than as a mere instrument for presenting or transmitting knowledge.

Dependent variables: (1) degree of expertise in ill-structured knowledge domains; (2) learning of complex knowledge domains; (3) transfer potential; (4) learning of ill-structured domains. Independent variables: (1) mode of presentation of the information; (2) linear versus non-linear systems; (3) treatment of the information from a single point of view; (4) use of a non-linear model.

Conclusions: (1) Treating a complex knowledge domain from different dimensions or thematic perspectives considerably aids its comprehension and learning. (2) Rather than continually criss-crossing large amounts of information, it is preferable to offer fewer perspectives with richer interrelations and a coherent, flexible interactive network; it follows that presenting information through one-dimensional models does not give subjects authentic, complete conceptions of the intended meaning. (3) Hypermedia systems, as non-linear systems adapted to an interactive approach, provide advantages for learning these complex knowledge domains. (4) When building a hypermedia system, the subjects' cognitive development and the user's starting level of knowledge must also be taken into account; from this perspective, hypermedia systems are a very valuable tool for constructing knowledge and supporting users in a continuous search for understanding. (5) For subjects to construct their own learning sequence, they must be able to match their needs to the available information categories (buttons, menus, titles, etc.); otherwise they cannot navigate in a goal-directed way. (6) A mediator between the system and the user is indispensable; anything from the teacher, tutor or educator to an assistant or help features (glossaries, indexes, etc.) built into the program itself can serve as mediator; what matters is that some aid offers guidance while subjects explore large domains of information. (7) The field of new technologies opens endless possibilities for research into the effects it may have on education, and it falls to us to consider them when thinking about the task of restructuring schools.

Relevance: 10.00%

Abstract:

Abstract based on that of the publication. Abstract in English.

Relevance: 10.00%

Abstract:

Students who are deaf or hard of hearing typically have difficulty in mathematics; however, this problem is often overlooked because of their difficulties with language and reading. This study aims to identify the most appropriate mathematics curriculum for deaf or hard of hearing students in an oral deaf education program.

Relevance: 10.00%

Abstract:

In the Radiative Atmospheric Divergence Using ARM Mobile Facility GERB and AMMA Stations (RADAGAST) project we calculate the divergence of radiative flux across the atmosphere by comparing fluxes measured at each end of an atmospheric column above Niamey, in the African Sahel region. The combination of broadband flux measurements from geostationary orbit and the deployment for over 12 months of a comprehensive suite of active and passive instrumentation at the surface eliminates a number of sampling issues that could otherwise affect divergence calculations of this sort. However, one sampling issue that challenges the project is the fact that the surface flux data are essentially measurements made at a point, while the top-of-atmosphere values are taken over a solid angle that corresponds to an area at the surface of some 2500 km². Variability of cloud cover and aerosol loading in the atmosphere means that the downwelling fluxes, even when averaged over a day, will not exactly match the area-averaged value over that larger area, although we might expect them to be an unbiased estimate thereof. The heterogeneity of the surface, for example fixed variations in albedo, further means that there is likely a systematic difference in the corresponding upwelling fluxes. In this paper we characterize and quantify this spatial sampling problem. We bound the root-mean-square error in the downwelling fluxes by exploiting a second set of surface flux measurements from a site that was run in parallel with the main deployment. The differences between the two sets of fluxes lead us to one upper bound on the sampling uncertainty, and their correlation leads to another, which is probably optimistic as it requires certain other conditions to be met. For the upwelling fluxes we use data products from a number of satellite instruments to characterize the relevant heterogeneities and so estimate the systematic effects that arise from the flux measurements having to be taken at a single point. The sampling uncertainties vary with the season, being higher during the monsoon period. We find that the sampling errors for the daily average flux are small for the shortwave irradiance, generally less than 5 W m−2 under relatively clear skies, but increase to about 10 W m−2 during the monsoon. For the upwelling fluxes, again taking daily averages, systematic errors are of order 10 W m−2 as a result of albedo variability. The uncertainty on the longwave component of the surface radiation budget is smaller than that on the shortwave component in all conditions, but a bias of 4 W m−2 is calculated to exist in the surface-leaving longwave flux.
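As a rough illustration of the two calculations this abstract describes, the sketch below computes the column divergence as the difference between TOA and surface net fluxes, and bounds the point-sampling error from the paired surface sites. It is a minimal Python sketch on synthetic numbers; the array names and values are assumptions, not RADAGAST data.

```python
import numpy as np

rng = np.random.default_rng(0)
days = 30
# Synthetic daily-mean net downward fluxes in W m^-2 (stand-ins for the
# GERB top-of-atmosphere and ARM surface measurements).
toa_net = 80.0 + 10.0 * rng.standard_normal(days)
sfc_net_main = 120.0 + 15.0 * rng.standard_normal(days)       # main site
sfc_net_aux = sfc_net_main + 5.0 * rng.standard_normal(days)  # parallel site

# Divergence of radiative flux across the column: the net flux entering
# at the top minus the net flux leaving through the bottom.
divergence = toa_net - sfc_net_main

# The spread between the two surface sites bounds the error of using a
# point measurement for the area mean; if the site errors are independent
# and unbiased, the single-site RMS error is pair_rms / sqrt(2).
pair_rms = np.sqrt(np.mean((sfc_net_main - sfc_net_aux) ** 2))
print(f"mean divergence: {divergence.mean():.1f} W m^-2")
print(f"sampling RMS bound: {pair_rms:.1f} (upper), "
      f"{pair_rms / np.sqrt(2):.1f} (independent errors) W m^-2")
```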

Relevance: 10.00%

Abstract:

In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
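MPSpack itself is a MATLAB toolbox; as a language-neutral illustration of the core idea (a basis of exact Helmholtz solutions fitted by linear least squares), here is a minimal Python sketch that solves sound-soft scattering of a plane wave from the unit disk using fundamental solutions. The wavenumber, point counts, and charge-curve radius are illustrative choices, and the corner-adapted basis functions discussed in the paper are not included.

```python
import numpy as np
from scipy.special import hankel1

k = 10.0                 # wavenumber
n_src, n_col = 80, 200   # interior source points, boundary collocation points

# Boundary of the unit disk and a circle of "charge" points inside it,
# both represented as complex numbers.
t_col = 2 * np.pi * np.arange(n_col) / n_col
bdry = np.exp(1j * t_col)
t_src = 2 * np.pi * np.arange(n_src) / n_src
src = 0.7 * np.exp(1j * t_src)

# Fundamental solution of the 2D Helmholtz equation, (i/4) H_0^(1)(k|x-y|);
# the constant factor is absorbed into the coefficients.
A = hankel1(0, k * np.abs(bdry[:, None] - src[None, :]))

# Sound-soft condition: scattered field cancels the incident plane wave
# (travelling in +x) on the boundary. Fit by least squares.
u_inc = np.exp(1j * k * bdry.real)
coef, *_ = np.linalg.lstsq(A, -u_inc, rcond=None)

# Measure how well the boundary condition is satisfied on a finer grid.
t_test = 2 * np.pi * (np.arange(500) + 0.5) / 500
test = np.exp(1j * t_test)
resid = hankel1(0, k * np.abs(test[:, None] - src[None, :])) @ coef \
        + np.exp(1j * k * test.real)
print("max boundary residual:", np.abs(resid).max())
```

Increasing n_src drives the boundary residual down rapidly for this smooth boundary, in the spirit of the exponential convergence the paper analyzes for its basis sets.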

Relevance: 10.00%

Abstract:

Background: The cognitive bases of language impairment in specific language impairment (SLI) and autism spectrum disorders (ASD) were investigated in a novel non-word comparison task that manipulated phonological short-term memory (PSTM) and speech perception, both implicated in poor non-word repetition. Aims: This study aimed to investigate the contributions of PSTM and speech perception to non-word processing, and whether individuals with SLI and ASD plus language impairment (ALI) show similar or different patterns of deficit in these cognitive processes. Method & Procedures: Three groups of adolescents (aged 14–17 years), 14 with SLI, 16 with ALI, and 17 age- and non-verbal-IQ-matched typically developing (TD) controls, made speeded discriminations between non-word pairs. Stimuli varied in PSTM load (two or four syllables) and speech perception load (mismatches on a word-initial or word-medial segment). Outcomes & Results: Reaction times showed effects of both non-word length and mismatch position, and these factors interacted: four-syllable and word-initial mismatch stimuli resulted in the slowest decisions. In the reaction time data, individuals with language impairment showed the same pattern of performance as those with typical development. A marginal interaction between group and item length was driven by the SLI and ALI groups being less accurate with long items than with short ones, a difference not found in the TD group. Conclusions & Implications: Non-word discrimination suggests that there are similarities and differences between adolescents with SLI and ALI and their TD peers. Reaction times appear to be affected by increasing PSTM and speech perception loads in a similar way. However, there was some, albeit weaker, evidence that adolescents with SLI and ALI are less accurate than TD individuals, with both groups showing an effect of PSTM load. This may indicate that, at some level, the processing substrate supporting both PSTM and speech perception is intact in adolescents with SLI and ALI, but that in both groups access to PSTM resources may be impaired.

Relevance: 10.00%

Abstract:

As a major mode of intraseasonal variability that interacts with weather and climate systems on a near-global scale, the Madden–Julian Oscillation (MJO) is a crucial source of predictability for numerical weather prediction (NWP) models. Despite its global significance and comprehensive investigation, improvements in the representation of the MJO in an NWP context remain elusive. However, recent modifications to the model physics in the ECMWF model led to advances in the representation of atmospheric variability and the unprecedented propagation of the MJO signal through the entire integration period. In light of these recent advances, a set of hindcast experiments has been designed to assess the sensitivity of MJO simulation to the formulation of convection. Through the application of established MJO diagnostics, it is shown that the improvements in the representation of the MJO can be directly attributed to the modified convective parametrization. Furthermore, the improvements are attributed to the move from a moisture-convergence-dependent to a relative-humidity-dependent formulation for organized deep entrainment. It is concluded that, in order to understand the physical mechanisms through which a relative-humidity-dependent formulation for entrainment led to an improved simulation of the MJO, a more process-based approach should be taken. The application of process-based diagnostics to the hindcast experiments presented here will be the focus of Part II of this study.
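As a toy illustration of the change the abstract credits with the improvement, the sketch below contrasts an entrainment rate driven by moisture convergence with one that grows as the environment dries. The functional forms and constants are illustrative assumptions for exposition, not the IFS parametrization.

```python
import numpy as np

def entrain_moisture_conv(mconv, eps_min=1e-4, scale=0.5):
    """Toy rate (m^-1) increasing with column moisture convergence
    (normalized units); insensitive to how dry the free troposphere is."""
    return eps_min * (1.0 + scale * np.maximum(mconv, 0.0))

def entrain_rh(rh, eps0=1.8e-3):
    """Toy rate (m^-1) increasing as relative humidity drops, so plumes
    rising through dry mid-tropospheric air are diluted more strongly."""
    return eps0 * np.clip(1.3 - rh, 0.0, None)

print(f"moisture-convergence form, mconv = 1: "
      f"{entrain_moisture_conv(1.0):.2e} m^-1 (independent of RH)")
for rh in (0.9, 0.7, 0.5):
    print(f"RH-dependent form, RH = {rh:.1f}: {entrain_rh(rh):.2e} m^-1")
```

The qualitative point is that the RH-dependent form suppresses deep convection in a dry troposphere, letting moisture build up gradually ahead of the active MJO phase instead of being consumed immediately.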

Relevance: 10.00%

Abstract:

Diatom, geochemical and isotopic data provide a record of environmental change in Laguna La Gaiba, lowland Bolivia (17°45'S, 57°35'W), over the last ca. 25 000 years. High-resolution diatom analysis around the Last Glacial–Interglacial Transition provides new insights into this period of change. During the full and late glacial the lake was generally quite shallow, but with evidence of periodic flooding. At about 13 100 cal a BP, just before the start of the Younger Dryas chronozone, the diatoms indicate shallower water conditions, but there is a marked change at about 12 200 cal a BP indicating the onset of a period of high variability, with rising water levels punctuated by periodic drying. From ca. 11 800 to 10 000 cal a BP, stable, deeper-water conditions persisted. There is evidence for drying in the early to middle Holocene, but not as pronounced as that reported from elsewhere in the southern-hemisphere tropics of South America. This was followed by the onset of wetter conditions in the late Holocene, consistent with insolation forcing. Conditions very similar to the present were established about 2100 cal a BP. A complex response to both insolation forcing and millennial-scale events originating in the North Atlantic is noted.

Relevance: 10.00%

Abstract:

Windstorm Kyrill affected large parts of Europe in January 2007 and caused widespread havoc and loss of life. In this study the formation of a secondary cyclone, Kyrill II, along the occluded front of the mature cyclone Kyrill, and the occurrence of severe wind gusts as Kyrill II passed over Germany, are investigated with the help of high-resolution regional climate model simulations. Kyrill underwent explosive cyclogenesis south of Greenland as the storm crossed polewards of an intense upper-level jet stream. Later in its life cycle, secondary cyclogenesis occurred just west of the British Isles. The formation of Kyrill II along the occluded front was associated (a) with frontolytic strain and (b) with strong diabatic heating in combination with a developing upper-level short-wave trough. Sensitivity studies with reduced latent heat release feature a similar development but a weaker secondary cyclone, revealing the importance of diabatic processes during the formation of Kyrill II. Kyrill II moved further towards Europe and its development was favored by a split jet structure aloft, which maintained the cyclone's exceptionally deep core pressure (below 965 hPa) for at least 36 hours. The occurrence of hurricane-force winds related to the strong cold front over northern and central Germany is analyzed using convection-permitting simulations. The lower troposphere exhibits conditional instability, a turbulent flow and evaporative cooling. Simulation at high spatio-temporal resolution suggests that the downward mixing of high momentum (the wind speed at 875 hPa widely exceeded 45 m s−1) accounts for the severe surface wind gusts, in agreement with the observed widespread losses.
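A minimal sketch of the gust diagnostic implied by the "downward mixing of high momentum" argument: in a conditionally unstable, turbulent post-frontal flow, air from around 875 hPa can be brought to the surface with little dilution, so the surface gust is estimated from the low-level wind speed. The mixing fraction below is a hypothetical parameter for illustration, not a value from the study's simulations.

```python
def gust_from_mixing(wind_875hpa, wind_10m, mixing_fraction=0.9):
    """Estimate the surface gust (m/s) as the stronger of the mean 10 m
    wind and a fraction of the 875 hPa wind mixed down to the surface."""
    return max(wind_10m, mixing_fraction * wind_875hpa)

# With 875 hPa winds widely exceeding 45 m/s, as simulated for Kyrill II:
print(gust_from_mixing(wind_875hpa=45.0, wind_10m=25.0))  # 40.5, hurricane force
```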