996 results for Lessons


Relevance: 20.00%

Abstract:

Research question/Introduction: To date, hardly any data exist for the German-speaking countries on which errors are most frequently made when writing written examination questions. Such findings could be helpful for training question authors in workshops focused on the most frequent errors. The present project investigates which errors are made most frequently in the writing of written examination questions and which conclusions can be drawn from this for author training. Methods: At the Institut für Medizinische Lehre Bern, the staff members involved in content-related and/or formal question review (N=14) are asked in semi-structured interviews which kinds of deficiencies they deal with most frequently in the questions of the written examinations they support. In addition, we survey how, from their perspective, this need for revision can best be addressed in training. Results: The preliminary results indicate that the need for revision is greatest in the following areas: a clear focus on a specific learning objective; an authentic and relevant vignette; a level of difficulty appropriate to the stage of training; an unambiguous solution; and formal and linguistic correctness. Training should therefore place particular emphasis on these areas. Discussion/Conclusion: The preliminary results indicate that deficiencies in written examination questions frequently concern focus, vignette, difficulty, unambiguous solutions, and formal-linguistic aspects. Author training should place these aspects in the foreground. Our definitive results, which will be available by the time of the GMA conference, can help align question-writing workshops even more closely with training needs.

Relevance: 20.00%

Abstract:

Prospective cohort studies significantly contribute to answering specific research questions in a defined population. Since 2008, the Swiss Transplant Cohort Study (STCS) has systematically enrolled >95% of all transplant recipients in Switzerland, collecting predefined data at determined time points. Designed as an open cohort, the STCS has included >3900 patients to date, with a median follow-up of 2.96 years (IQR 1.44-4.73). This review highlights some relevant findings in the field of transplant-associated infections gained by the STCS so far. Three key general aspects have crystallized: (i) well-run cohort studies are a powerful tool for conducting genetic studies, which depend crucially on a meticulously described phenotype; (ii) long-term real-life observations add a distinct layer of information that cannot be obtained from randomized studies; and (iii) the systematic collection of data, close interdisciplinary collaboration, and continuous analysis of key outcome data such as infectious disease endpoints can improve patient care.

Relevance: 20.00%

Abstract:

A higher risk of future range losses as a result of climate change is expected to be one of the main drivers of extinction trends in vascular plants occurring in habitat types of high conservation value. Nevertheless, the impact of the climate changes of the last 60 years on the current distribution and extinction patterns of plants is still largely unclear. We applied species distribution models to study the impact of environmental variables (climate, soil conditions, land cover, topography) on the current distribution of 18 vascular plant species characteristic of three threatened habitat types in southern Germany: (i) xero-thermophilous vegetation, (ii) mesophilous mountain grasslands (mountain hay meadows and matgrass communities), and (iii) wetland habitats (bogs, fens, and wet meadows). Climate and soil variables were the most important variables affecting plant distributions at a spatial resolution of 10 × 10 km. Extinction trends in our study area revealed that plant species occurring in wetland habitats faced higher extinction risks than those in xero-thermophilous vegetation, with the risk for species in mesophilous mountain grasslands being intermediate. For three plant species characteristic of either mesophilous mountain grasslands or wetland habitats, we showed by way of example that extinctions from 1950 to the present day have occurred at the edge of the species' current climatic niche, indicating that climate change has likely been the main driver of extinction. This is largely consistent with current extinction trends reported in other studies. Our study indicates that the analysis of past extinctions is an appropriate means of assessing the impact of climate change on species and that vulnerability to climate change is both species- and habitat-specific.
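For readers unfamiliar with species distribution models, the core step can be sketched in a few lines. The sketch below is illustrative only: the synthetic data, the four stand-in covariates, and the logistic-regression learner are assumptions, since the abstract does not state which SDM algorithm was used.

```python
# Minimal species distribution model (SDM) sketch: presence/absence of one
# species regressed on grid-cell environmental covariates. Synthetic data and
# a logistic-regression learner stand in for the study's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cells = 2000  # hypothetical 10 x 10 km grid cells

# Standardised covariates per cell: climate, soil, land cover, topography.
X = rng.normal(size=(n_cells, 4))
# Synthetic "true" response dominated by climate and soil, echoing the
# study's finding that these were the most important variables.
logit = 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * X[:, 3]
y = rng.random(n_cells) < 1 / (1 + np.exp(-logit))  # presence/absence

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
sdm = LogisticRegression().fit(X_tr, y_tr)

print("AUC:", round(roc_auc_score(y_te, sdm.predict_proba(X_te)[:, 1]), 3))
print("coefficients (climate, soil, land cover, topography):", sdm.coef_.round(2))
```

Fitted coefficients of such a model indicate the relative importance of each environmental variable for the modelled distribution, which is the kind of comparison the abstract reports.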

Relevance: 20.00%

Abstract:

Our knowledge of the many aspects of mammalian reproduction in general, and equine reproduction in particular, has greatly increased during the last 15 years. Advances in the understanding of the physiology, cell biology, and biochemistry of reproduction have facilitated genetic analyses of fertility. Currently, more than 200 genes are known to be involved in the production of fertile sperm cells. The completion of a number of mammalian genome projects will aid the investigation of these genes in different species. Great progress has been made in understanding the genetic aberrations that lead to male infertility. Additionally, the first genetic mechanisms contributing to the quantitative variation of fertility traits in fertile male animals are being discovered. As artificial insemination (AI) is a widespread technology in horse breeding, semen quality traits may eventually become an additional selection criterion for breeding stallions. Current research aims to identify genetic markers that correlate with these semen quality traits. Here, we review the current state of genetic research in male fertility and offer some perspectives for future research in horses.

Relevance: 20.00%

Abstract:

This paper examines how the geospatial accuracy of samples and sample size influence the conclusions drawn from geospatial analyses. It does so using the example of a study investigating the global phenomenon of large-scale land acquisitions and the socio-ecological characteristics of the areas they target. First, we analysed land deal datasets of varying geospatial accuracy and varying sizes and compared the results in terms of land cover, population density, and two indicators of agricultural potential: yield gap and availability of uncultivated land suitable for rainfed agriculture. We found that an increase in geospatial accuracy changed the conclusions about the targeted land cover types substantially more than an increase in sample size did, suggesting that a sample of higher geospatial accuracy improves results more than a larger sample does. The same finding emerged for population density, yield gap, and the availability of uncultivated land suitable for rainfed agriculture. Furthermore, the statistical median proved more consistent than the mean when comparing descriptive statistics across datasets of different geospatial accuracy. Second, we analysed the effects of geospatial accuracy on estimates of the potential for advancing agricultural development in target contexts. Our results show that, for the majority of land deals in our sample whose geolocation is known with a high level of accuracy, the target contexts contain smaller amounts of suitable but uncultivated land than regional- and national-scale averages suggest. Consequently, the more target contexts vary within a country, the more detailed the spatial scale of analysis has to be in order to draw meaningful conclusions about the phenomena under investigation. We therefore advise against using national-scale statistics to approximate or characterize phenomena that have a local-scale impact, particularly if key indicators vary widely within a country.
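The reported robustness of the median under geolocation error can be illustrated with a small simulation. Everything in the sketch below is assumed for illustration (the hotspot-shaped indicator surface, the error magnitudes, the sample size); it is not the study's actual data or method.

```python
# Sketch: why the median of a sampled indicator (e.g. population density)
# can be more stable than the mean when sample points are located imprecisely.
# The spatial field, error sizes, and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def indicator(x, y):
    # Hypothetical skewed indicator surface: a few dense hotspots on a
    # low-valued background, loosely mimicking population density.
    hotspots = [(20, 30), (70, 60), (40, 80)]
    val = 5.0
    for hx, hy in hotspots:
        val = val + 500.0 * np.exp(-((x - hx) ** 2 + (y - hy) ** 2) / 50.0)
    return val

true_xy = rng.uniform(0, 100, size=(300, 2))  # "exact" point locations
exact = indicator(true_xy[:, 0], true_xy[:, 1])

for err in (1, 10, 50):  # increasing geolocation error (same units as grid)
    noisy_xy = true_xy + rng.normal(scale=err, size=true_xy.shape)
    vals = indicator(noisy_xy[:, 0], noisy_xy[:, 1])
    print(f"error ±{err:>2}: mean {vals.mean():7.1f} (exact {exact.mean():7.1f}), "
          f"median {np.median(vals):6.1f} (exact {np.median(exact):6.1f})")
```

On a skewed surface like this, the mean is driven by the few hotspot hits, which shift as location error grows, while the median stays close to the background value, consistent with the comparison the abstract describes.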

Relevance: 20.00%

Abstract:

In the past ten years, reading comprehension instruction has received significant attention from educational researchers. Drawing on studies from cognitive psychology, reader response theory, and language arts research, current best practice in reading comprehension instruction is characterized by a strategies approach in which students are taught to think like proficient readers who visualize, infer, activate schema, question, and summarize as they read. Studies investigating the impact of comprehension strategy instruction on student achievement in reading suggest that, when implemented consistently, the intervention has a positive effect on achievement. Research also shows, however, that few teachers embrace this approach to reading instruction despite its effectiveness, even when the conditions for substantive professional development (i.e., prolonged engagement, support, resources, time) are present. The interpretive case study reported in this dissertation examined the year-long experience of one fourth-grade teacher, Ellen, as she learned about comprehension strategy instruction and attempted to integrate the approach into her reading program. The goal of the study was to extend current understanding of the factors that support or inhibit an individual teacher's instructional decision making. The research explored how Ellen's academic preparation, beliefs about reading comprehension instruction, and attitudes toward teacher-student interaction influenced her efforts to employ comprehension strategy instruction. Qualitative methods were the basis of this study's research design. The primary methods for collecting data included pre- and post-interviews, field notes from classroom observations and staff development sessions, informal interviews, e-mail correspondence, and artifacts such as reading assignments, professional writing, school newsletters, and photographs of the classroom. Transcripts from interviews, as well as field notes, e-mail, and artifacts, were analyzed according to grounded theory's constant-comparative method. The results of the study suggest that three factors were pivotal in Ellen's successful implementation of reading strategy instruction: pedagogical beliefs, classroom relationships, and professional community. Research on instructional change generally focuses on issues of time, resources, feedback, and follow-through. The research reported here recognizes the importance of these components but expands contemporary thinking by showing how, in Ellen's case, a teacher's existing theories, her relationship with her students, and her professional interaction with peers impacted instructional decisions.

Relevance: 20.00%

Abstract:

The paper discusses the meaning and measurement of pro-poor growth and reviews evidence of pro-poor growth (or the lack of it) across a large cross-section of countries and time periods. The emerging story is that many episodes of growth are not pro-poor, and that although economic reforms have had positive effects in countries that have been steadfast in implementing market reforms, the overall impact on growth has been small in many countries and in most cases not pro-poor. I present a general theory of pro-poor growth comprising ten principles that should be incorporated into any economic reform seeking to generate pro-poor growth. These principles highlight the importance of understanding the poor: their economic activities, their capabilities, the constraints that impede their participation in markets, and the linkages within sectors and regions. It is argued that pro-poor reforms cannot have the intended impact unless there are significant changes in the institutions of governance. Finally, the principles presented underscore the fact that pro-poor growth policies cannot be sustained without workable partnerships between markets and states in the ever-changing and complex processes of social and economic development.
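For context on the measurement debate the abstract refers to, one widely cited statistic is Ravallion and Chen's "rate of pro-poor growth"; it is shown here for reference only, and the paper may define or measure pro-poor growth differently.

```latex
% Ravallion & Chen (2003): the rate of pro-poor growth is the mean of the
% quantile-specific income growth rates g_t(p), averaged over the initially
% poor, i.e. up to the initial headcount ratio H_{t-1}.
g^{\mathrm{pp}}_t \;=\; \frac{1}{H_{t-1}} \int_{0}^{H_{t-1}} g_t(p)\,\mathrm{d}p
```

Growth is then pro-poor in this sense when the poor's incomes grow faster than this averaging would indicate under distribution-neutral growth, i.e. when the growth incidence curve lies above the overall mean growth rate at the low quantiles.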

Relevance: 20.00%

Abstract:

In a marvelous but somewhat neglected paper, 'The Corporation: Will It Be Managed by Machines?', Herbert Simon articulated from the perspective of 1960 his vision of what we now call the New Economy: the machine-aided system of production and management of the late twentieth century. Simon's analysis sprang from what I term the principle of cognitive comparative advantage: one has to understand the quite different cognitive structures of humans and machines (including computers) in order to explain and predict the tasks to which each will be most suited. Perhaps unlike Simon's better-known predictions about progress in artificial intelligence research, the predictions of this 1960 article hold up remarkably well and continue to offer important insights. In what follows I attempt to tell a coherent story about the evolution of machines and the division of labor between humans and machines. Although inspired by Simon's 1960 paper, I weave many other strands into the tapestry, from classical discussions of the division of labor to present-day evolutionary psychology. The basic conclusion is that, with growth in the extent of the market, we should see humans 'crowded into' tasks that call for the kinds of cognition with which humans have been equipped by biological evolution. These human cognitive abilities range from the exercise of judgment in situations of ambiguity and surprise to more mundane abilities in spatio-temporal perception and locomotion. Conversely, we should see machines 'crowded into' tasks with a well-defined structure. This conclusion is not based (merely) on a claim that machines, including computers, are specialized idiots-savants today because of the limits (whether temporary or permanent) of artificial intelligence; rather, it rests on a claim that, for what are broadly 'economic' reasons, it will continue to make economic sense to create machines that are idiots-savants.