983 results for "explicit undervisning"
Abstract:
The results of the most recent PIRLS surveys show that Swedish pupils' reading comprehension has deteriorated in recent years. According to research, teaching should focus on reading strategies if pupils are to develop reading comprehension. For pupils to learn to use different reading strategies independently when interpreting text and creating meaning, they need to be taught explicitly. Teaching should be structured and varied, while the teacher supports the pupils and adapts instruction to their individual needs. The aim of the study has been to investigate how teachers in grades F–3 teach reading comprehension and reading strategies, in order to identify strengths and shortcomings. The study uses a qualitative method: to collect material, five semi-structured interviews were conducted with five practising teachers (four class teachers and one special-needs educator) at four different schools. The results show that the teachers use a range of methods when teaching reading comprehension, such as En läsande klass, think-aloud, the circle model and Chambers' model for book talks. What seems to permeate the teachers' reading-comprehension teaching is communication, which is closely connected to the study's theoretical starting point, the sociocultural perspective. According to the teachers, they talk a great deal with the pupils about texts, and they appear to try to adapt instruction to the pupils' individual needs by choosing texts the pupils have prior knowledge of and about. When teaching explicitly, the teachers seem to discuss the usefulness of different reading strategies by talking about how and why particular strategies are suitable in particular contexts. According to the teachers, pupils are offered instruction in various group constellations, such as whole-class, half-class, small-group, pair and individual work. The teachers appear to be explicit in their teaching so that pupils become aware of the purpose for which they are reading.
Abstract:
The syllabus for Swedish states that pupils are to develop multimodal writing within the subject. Multimodal writing means that words, images and sound are combined and interact. The main aim of this literature study has been to investigate what multimodal writing in the subject of Swedish for grades 4-6 can look like, which competences and resources are required to conduct multimodal writing instruction, and what kind of learning multimodal writing can give rise to in pupils. The literature study shows that multimodal writing can occur both in analogue and in digital form. It further shows that Swedish research in the area is very limited. The articles and dissertations included in the study show that researchers agree that teachers need to develop their knowledge of different modes, such as auditory and visual ones, in order to make pupils aware of the modes' meaning potential and interplay. Multimodal writing gives rise to a form of coordinated learning, since it is a complex process that requires explicit instruction both in the relevant digital software and in the meaning-making of the modes. Multimodal writing is a common practice outside school, but should be given a place in the school world as well. This presupposes that digital resources are available and that teachers take a positive view of multimodal writing development.
Abstract:
International studies, PIRLS 2011 and PISA 2012, show declining levels of reading comprehension among Swedish pupils. The school's 2011 steering document, Lgr 11, explicitly prescribes the teaching of reading-comprehension strategies in the subject of Swedish at all stages of compulsory school. The downward trend has intensified the discussion about how reading-comprehension teaching should be developed. Reading researchers warn that teachers are left to interpret on their own how to teach the reading strategies that affect comprehension. In spring 2014 the teaching material En läsande klass was launched, a private initiative with the long-term goal of increasing pupils' reading comprehension and the aim of highlighting its importance. Comprehensive Swedish research in the area is, however, lacking. The aim of this study was therefore to investigate how practising teachers in primary and middle school view their reading-comprehension teaching and their use of methods and strategies, both for pupils with good reading comprehension and for pupils who encounter obstacles in theirs. The investigation was carried out with a qualitative method based on semi-structured interviews. Four teachers, all of whom teach Swedish and work with reading comprehension in their teaching, were interviewed. The results showed that all the teachers work with reading-comprehension strategies at different levels and that they use the material En läsande klass to varying extents in their teaching. The results also indicate that teachers need directives on how explicit reading-comprehension teaching should be conducted in school in order to ensure the quality of their work with reading comprehension.
Abstract:
The aim of the thesis has been to examine Finnish learners' use of connectors at CEFR levels A1, A2 and B1 longitudinally, from a functional perspective. I have studied what characterises connector use at these CEFR levels and in which functions the connectors have been used at each level. Furthermore, the connector use in the material has been compared with what the CEFR criteria state. Finally, I have also examined how connector use develops. As material I have used narrative texts (n = 303) written by 101 Finnish-speaking comprehensive-school and upper-secondary pupils. The material is part of the Topling project (Inlärningsgångar i andraspråket) at the University of Jyväskylä. Both quantitative and qualitative methods have been used: I have counted the frequencies of the connectors and connector categories and analysed the functions in which the connectors were used. The functional analysis draws on systemic functional linguistics (Halliday & Matthiessen 2004) and Labov's (1972) model of narrative structure. The analysis has shown that connector use differs between CEFR levels A1, A2 and B1. The number of connectors increases both from level A1 to A2 and from level A2 to B1, and the share of additive and non-target-like connectors decreases, while the share of temporal, causal and comparative connectors, as well as of att, increases. The connectors have been used first and foremost in their prototypical functions at all these CEFR levels. Some connectors (när, eftersom, att) also seem to have a function in the narrative structure. Comparing the connector use with the CEFR criteria, one can note that learners at level A1 use the pronoun den instead of sedan, even though the latter connector is mentioned in the CEFR criteria for level A1. Connector use thus seems to develop in such a way that the total number of connectors grows and the share of temporal, causal and comparative connectors, as well as of att, increases, while the share of additive and non-target-like connectors decreases.
Furthermore, learners begin to use a greater variety of connectors, and at level B1 also less frequent ones such as om and fast. Future research should examine connector use in different text types and study whether explicit teaching affects learners' connector use.
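The frequency side of the method above (counting connectors and connector categories per text) can be sketched as follows; the connector inventory and category labels here are illustrative assumptions, not the thesis's actual coding scheme.

```python
from collections import Counter

# Hypothetical connector-to-category mapping (illustrative only; the thesis
# uses its own inventory and coding scheme).
CONNECTORS = {
    "och": "additive", "men": "comparative", "när": "temporal",
    "sedan": "temporal", "eftersom": "causal", "att": "att",
}

def connector_profile(tokens):
    """Relative frequency of each connector category in a tokenised text;
    shares rather than raw counts make texts of different lengths comparable."""
    counts = Counter(CONNECTORS[t] for t in tokens if t in CONNECTORS)
    total = sum(counts.values())
    return {cat: c / total for cat, c in counts.items()} if total else {}

text = "jag gick hem när det regnade och jag tänkte att det var kallt"
profile = connector_profile(text.split())  # one third each: temporal, additive, att
```

Profiles computed this way per CEFR level could then be compared directly, mirroring the thesis's observation that category shares shift between A1, A2 and B1.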
Abstract:
In the protein folding problem, solvent-mediated forces are commonly represented by intra-chain pairwise contact energies. Although this approximation has proven useful in several circumstances, it is limited in other aspects of the problem. Here we show that it is possible to construct two models of the chain-solvent system, one with implicit and the other with explicit solvent, such that both reproduce the same thermodynamic results. First, lattice models treated by analytical methods were used to show that the implicit and explicit representations of solvent effects can be energetically equivalent only if local solvent properties are invariant in time and space. Next, applying the same reasoning used for the lattice models, two inter-consistent Monte Carlo off-lattice models with implicit and explicit solvent are constructed, where in the latter the solvent properties are now allowed to fluctuate. It is then shown that the chain's configurational evolution, as well as the equilibrium globule conformation, are significantly distinct in the implicit- and explicit-solvent systems. In fact, in strong contrast with the implicit-solvent version, the explicit-solvent model predicts: (i) a malleable globule, in agreement with the estimated large protein-volume fluctuations; (ii) thermal conformational stability, resembling the conformational heat resistance of globular proteins, whose radii of gyration are practically insensitive to thermal effects over a relatively wide range of temperatures; and (iii) smaller radii of gyration at higher temperatures, indicating that the chain conformational entropy in the unfolded state is significantly smaller than that estimated from random-coil configurations. Finally, we comment on the meaning of these results with respect to the understanding of the folding process. (C) 2009 Elsevier B.V. All rights reserved.
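A minimal sketch of the two ingredients named above, an intra-chain pairwise contact energy and Metropolis Monte Carlo acceptance, is given below; it is a toy square-lattice illustration under assumed conventions (unit contact energy, nearest-neighbour metric), not the paper's actual lattice or off-lattice models.

```python
import math
import random

def contact_energy(coords, eps=-1.0):
    """Intra-chain pairwise contact energy on a square lattice: every
    non-bonded pair of monomers sitting on adjacent sites contributes eps."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):  # i+1 is a bonded neighbour, skip it
            dist = abs(coords[i][0] - coords[j][0]) + abs(coords[i][1] - coords[j][1])
            if dist == 1:
                e += eps
    return e

def metropolis_accept(e_old, e_new, temperature, rng=random.random):
    """Metropolis rule: always accept downhill moves, accept uphill moves
    with Boltzmann probability exp(-dE/T)."""
    if e_new <= e_old:
        return True
    return rng() < math.exp(-(e_new - e_old) / temperature)

# A U-shaped 4-monomer chain has exactly one non-bonded contact.
u_chain = [(0, 0), (1, 0), (1, 1), (0, 1)]
energy = contact_energy(u_chain)  # -1.0
```

An explicit-solvent variant would add solvent degrees of freedom whose local properties fluctuate, which is precisely the difference the abstract shows to matter thermodynamically.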
Abstract:
We present a fast method for finding optimal parameters for a low-resolution (threading) force field intended to distinguish correct from incorrect folds for a given protein sequence. In contrast to other methods, the parameterization uses information from more than 10^7 misfolded structures as well as a set of native sequence-structure pairs. In addition to testing the resulting force field's performance on the protein sequence threading problem, results are shown that characterize the number of parameters necessary for effective structure recognition.
Abstract:
1. Although population viability analysis (PVA) is widely employed, forecasts from PVA models are rarely tested. This study in a fragmented forest in southern Australia contrasted field data on patch occupancy and abundance for the arboreal marsupial greater glider Petauroides volans with predictions from a generic spatially explicit PVA model. This work represents one of the first landscape-scale tests of its type. 2. Initially we contrasted field data from a set of eucalypt forest patches totalling 437 ha with a naive null model in which forecasts of patch occupancy were made, assuming no fragmentation effects and based simply on remnant area and measured densities derived from nearby unfragmented forest. The naive null model predicted an average total of approximately 170 greater gliders, considerably greater than the true count (n = 81). 3. Congruence was examined between field data and predictions from PVA under several metapopulation modelling scenarios. The metapopulation models performed better than the naive null model. Logistic regression showed highly significant positive relationships between predicted and actual patch occupancy for the four scenarios (P = 0.001-0.006). When the model-derived probability of patch occupancy was high (0.50-0.75, 0.75-1.00), there was greater congruence between actual patch occupancy and the predicted probability of occupancy. 4. For many patches, probability distribution functions indicated that model predictions for animal abundance in a given patch were not outside those expected by chance. However, for some patches the model either substantially over-predicted or under-predicted actual abundance. Some important processes, such as inter-patch dispersal, that influence the distribution and abundance of the greater glider may not have been adequately modelled. 5. Additional landscape-scale tests of PVA models, on a wider range of species, are required to assess further predictions made using these tools. 
This will help determine those taxa for which predictions are and are not accurate and give insights for improving models for applied conservation management.
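The naive null model in point 2 is simple enough to sketch: expected abundance per patch is just the density measured in unfragmented forest times patch area. The patch split and density below are hypothetical, chosen only so that the total area matches the abstract's 437 ha and the total lands near its naive prediction of ~170 animals.

```python
def naive_null_counts(patch_areas_ha, density_per_ha):
    """Expected animals per patch = unfragmented-forest density x patch area,
    ignoring all fragmentation effects."""
    return {name: area * density_per_ha for name, area in patch_areas_ha.items()}

# Hypothetical split of the abstract's 437 ha of forest patches; the density
# (gliders per ha) is likewise an assumed figure, not a value from the study.
patches = {"A": 200.0, "B": 150.0, "C": 87.0}
predicted = naive_null_counts(patches, density_per_ha=0.39)
total = sum(predicted.values())  # ~170, versus the true count of 81
```

The gap between this null expectation and the field count (81) is what motivates the metapopulation PVA scenarios that the study then tests.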
Abstract:
Corporate portals, enabled by information and communication technology tools, provide integration of heterogeneous data from internal information systems, made available for access and sharing by the interested community. They can be considered an important instrument for evaluating explicit knowledge in the organization, since they allow faster and safer information exchanges, enabling a healthy collaborative environment. In the specific case of major Brazilian universities, corporate portals assume a fundamental role, since they offer an enormous variety and amount of information and knowledge, owing to the multiplicity of their activities. This study aims to point out important aspects of the explicit knowledge expressed by the surveyed universities through analysis of the content offered in their corporate portals. It is an exploratory study carried out through direct observation of the contents of the corporate portals of two public universities as well as three private ones. A comparative analysis of the contents of these portals was carried out; it can be useful for evaluating their use as a factor in optimizing the explicit knowledge generated in the university. As a result, important differences could be verified in the composition and content of the corporate portals of the public universities compared with the private institutions. The main differences concern the kinds of services and the destination of the information, which target different audiences.
It could also be concluded that the surveyed private universities focus on processes related to student services, support for the courses and the dissemination of information to the public interested in joining the institution, whereas the analyzed public universities prioritize more specific information, directed to the dissemination of research developed internally or with institutional objectives.
Abstract:
Reaction between 5-(4-amino-2-thiabutyl)-5-methyl-3,7-dithianonane-1,9-diamine (N3S3) and 5'-methyl-2,2'-bipyridine-5-carbaldehyde, and subsequent reduction of the resulting imine with sodium borohydride, results in a potentially ditopic ligand (L). Treatment of L with one equivalent of an iron(II) salt led to the monoprotonated complex [Fe(HL)](3+), isolated as the hexafluorophosphate salt. The presence of characteristic bands for the tris(bipyridyl)iron(II) chromophore in the UV/vis spectrum indicated that the iron(II) atom is coordinated octahedrally by the three bipyridyl (bipy) groups. The [Fe(bipy)3] moiety encloses a cavity composed of the N3S3 portion of the ditopic ligand. The mononuclear and monomeric nature of the complex [Fe(HL)](3+) has also been established by accurate mass analysis. [Fe(HL)](3+) displays reduced stability towards base compared with the complex [Fe(bipy)3](2+). In aqueous solution [Fe(HL)](3+) exhibits irreversible electrochemical behaviour, with an oxidation wave ca. 60 mV more positive in potential than [Fe(bipy)3](2+). Investigations of the interaction of [Fe(L)](2+) with copper(II), iron(II) and mercury(II) using mass spectrometric and potentiometric methods suggested that, where complexation occurred, fewer than six of the N3S3 cavity donors were involved. The high affinity of the complex [Fe(L)](2+) for protons is suggested as one reason contributing to its reluctance to coordinate a second metal ion.
Abstract:
We detail the automatic construction of R matrices corresponding to (the tensor products of) the one-parameter (0̇_m|α) families of highest-weight representations of the quantum superalgebras Uq[gl(m|n)]. These representations are irreducible, contain a free complex parameter α, and are 2^(mn)-dimensional. Our R matrices are actually (sparse) rank-4 tensors, containing a total of 2^(4mn) components, each of which is in general an algebraic expression in the two complex variables q and α. Although the constructions are straightforward, we describe them in full here, to fill a perceived gap in the literature. As the algorithms are generally impracticable for manual calculation, we have implemented the entire process in MATHEMATICA, illustrating our results with Uq[gl(3|1)]. (C) 2002 Published by Elsevier Science B.V.
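The R matrices of the abstract are far too large to reproduce here, but the defining property any such construction must satisfy, the braid-form Yang-Baxter equation, can be checked numerically on the smallest standard example: the 4x4 braid R matrix of U_q(sl2) in its fundamental representation. This classic toy case is not the Uq[gl(m|n)] construction itself, only an illustration of the consistency condition.

```python
# Numerical check of the braid relation R12 R23 R12 = R23 R12 R23 for the
# standard 4x4 braid R matrix of U_q(sl2), using plain nested lists.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def kron(A, B):
    na, nb = len(A), len(B)
    return [[A[i // nb][j // nb] * B[i % nb][j % nb]
             for j in range(na * nb)] for i in range(na * nb)]

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

q = 1.7  # a generic value of the deformation parameter
R = [[q, 0, 0, 0],
     [0, q - 1 / q, 1, 0],
     [0, 1, 0, 0],
     [0, 0, 0, q]]

R12 = kron(R, identity(2))      # R acting on factors 1 and 2 of (C^2)^3
R23 = kron(identity(2), R)      # R acting on factors 2 and 3
lhs = matmul(matmul(R12, R23), R12)
rhs = matmul(matmul(R23, R12), R23)
braid_holds = all(abs(lhs[i][j] - rhs[i][j]) < 1e-10
                  for i in range(8) for j in range(8))
```

The abstract's construction produces analogues of R with 2^(4mn) symbolic components in q and α, which is why the authors automate it in MATHEMATICA rather than verify such identities by hand.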
Abstract:
Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high-frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized-alpha method. The algorithms optimize high-frequency dissipation effectively, and despite recent work on algorithms that possess momentum-conserving/energy-dissipative properties in a non-linear context, the generalized-alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized-alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
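For context, the explicit end of the Newmark family discussed above (beta = 0, gamma = 1/2, i.e. central differences) can be sketched for a single-degree-of-freedom oscillator; this is a minimal baseline illustration, not the generalized-alpha algorithm of the paper.

```python
import math

def central_difference(m, k, u0, v0, dt, steps):
    """Explicit Newmark (beta=0, gamma=1/2), equivalent to velocity Verlet,
    for the undamped SDOF system m*u'' + k*u = 0. Conditionally stable:
    requires dt < 2/omega with omega = sqrt(k/m)."""
    u, v = u0, v0
    a = -k * u / m               # initial acceleration from equilibrium
    history = [u]
    for _ in range(steps):
        v_half = v + 0.5 * dt * a    # predictor: half-step velocity
        u = u + dt * v_half          # displacement update
        a = -k * u / m               # new acceleration
        v = v_half + 0.5 * dt * a    # corrector: complete the velocity
        history.append(u)
    return history

# One period of an omega = 1 rad/s oscillator, resolved with 1000 steps,
# returns very close to the starting displacement (the scheme is symplectic).
hist = central_difference(m=1.0, k=1.0, u0=1.0, v0=0.0,
                          dt=2 * math.pi / 1000, steps=1000)
```

The implicit members of the family (and the generalized-alpha method) instead solve for the new state from a balance equation at a weighted time point, which is what allows the dissipation and stability properties the abstract compares.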
Abstract:
Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to interact efficiently with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application in which user input can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user-interaction paradigm using a geometrical approach, in which the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term that improves interaction with the segmented surface, making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and total analysis time, contributing to a more efficient use of existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire algorithm.
Abstract:
OBJECTIVE: To develop an instrument to assess discrimination effects on health outcomes and behaviors, capable of distinguishing harmful differential treatment effects from their interpretation as discriminatory events. METHODS: Successive versions of an instrument were developed based on a systematic review of instruments assessing racial discrimination, focus groups and review by a panel comprising seven experts. The instrument was refined using cognitive interviews and pilot-testing. The final version of the instrument was administered to 424 undergraduate college students in the city of Rio de Janeiro, Southeastern Brazil, in 2010. Structural dimensionality, two types of reliability and construct validity were analyzed. RESULTS: Exploratory factor analysis corroborated the hypothesis of the instrument's unidimensionality, and seven experts verified its face and content validity. The internal consistency was 0.8, and test-retest reliability was higher than 0.5 for 14 out of 18 items. The overall score was higher among socially disadvantaged individuals and correlated with adverse health behaviors/conditions, particularly when differential treatments were attributed to discrimination. CONCLUSIONS: These findings indicate the validity and reliability of the instrument developed. The proposed instrument enables the investigation of novel aspects of the relationship between discrimination and health.
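The reported internal consistency of 0.8 is presumably Cronbach's alpha, although the abstract does not name the coefficient; a minimal sketch of that statistic, on made-up data, is given below.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` holds one list of
    respondent scores per item (the columns of the item-response matrix).
    All data here are illustrative, not the study's."""
    k = len(items)               # number of items
    n = len(items[0])            # number of respondents

    def var(xs):                 # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Sanity check: perfectly correlated items give alpha == 1.0 by construction.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

On real scale data with 18 items, as in the study, less-than-perfect inter-item correlations pull alpha below 1, and values around 0.8 are conventionally read as acceptable internal consistency.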