918 results for Reading strategies and techniques
Abstract:
The purpose of this study was to develop an understanding of the current state of scientific data sharing that stakeholders could use to develop and implement effective data sharing strategies and policies. The study developed a conceptual model to describe the process of data sharing, and the drivers, barriers, and enablers that determine stakeholder engagement. The conceptual model was used as a framework to structure discussions and interviews with key members of all stakeholder groups. Analysis of data obtained from interviewees identified a number of themes that highlight key requirements for the development of a mature data sharing culture.
Abstract:
Observations of atmospheric conditions and processes in cities are fundamental to understanding the interactions between the urban surface and weather/climate, improving the performance of urban weather, air quality and climate models, and providing key information for city end-users (e.g. decision-makers, stakeholders, public). In this paper, Shanghai's urban integrated meteorological observation network (SUIMON) and some examples of intended applications are introduced. Its characteristics include being: multi-purpose (e.g. forecast, research, service), multi-function (high impact weather, city climate, special end-users), multi-scale (e.g. macro/meso-, urban-, neighborhood, street canyon), multi-variable (e.g. thermal, dynamic, chemical, bio-meteorological, ecological), and multi-platform (e.g. radar, wind profiler, ground-based, satellite-based, in-situ observation/sampling). Underlying SUIMON is a data management system to facilitate exchange of data and information. The overall aim of the network is to improve coordination strategies and instruments; to identify data gaps based on science- and user-driven requirements; and to intelligently combine observations from a variety of platforms by using a data assimilation system that is tuned to produce the best estimate of the current state of the urban atmosphere.
Abstract:
Measurements of the ionospheric E region during total solar eclipses in the period 1932-1999 have been used to investigate the fraction of Extreme Ultraviolet and soft X-ray radiation, phi, that is emitted from the limb corona and chromosphere. The relative apparent sizes of the Moon and the Sun are different for each eclipse, and techniques are presented which correct the measurements and, therefore, allow direct comparisons between different eclipses. The results show that the fraction of ionising radiation emitted by the limb corona has a clear solar cycle variation and that the underlying trend shows this fraction has been increasing since 1932. Data from the SOHO spacecraft are used to study the effects of short-term variability, and it is shown that the observed long-term rise in phi has a negligible probability of being a chance occurrence.
Abstract:
With movement toward kilometer-scale ensembles, new techniques are needed for their characterization. A new methodology is presented for detailed spatial ensemble characterization using the fractions skill score (FSS). To evaluate spatial forecast differences, the average and standard deviation are taken of the FSS calculated over all ensemble member–member pairs at different scales and lead times. These methods were found to give important information about the ensemble behavior, allowing the identification of useful spatial scales, spinup times for the model, and upscale growth of errors and forecast differences. The ensemble spread was found to be highly dependent on the spatial scales considered and the threshold applied to the field. High thresholds picked out localized and intense values that gave large temporal variability in ensemble spread: local processes and undersampling dominate for these thresholds. For lower thresholds the ensemble spread increases with time as differences between the ensemble members upscale. Two convective cases were investigated based on the Met Office Unified Model run at 2.2-km resolution. Different ensemble types were considered: ensembles produced using the Met Office Global and Regional Ensemble Prediction System (MOGREPS) and an ensemble produced using different model physics configurations. Comparison of the MOGREPS and multiphysics ensembles demonstrated the utility of spatial ensemble evaluation techniques for assessing the impact of different perturbation strategies and the need for assessing spread at different, believable spatial scales.
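As an illustration of the member-pair FSS evaluation described in this abstract, the following is a minimal Python sketch, not the authors' code; the random fields, the 1 mm/h threshold and the 30-gridpoint neighbourhood scale are assumed values for demonstration only.

```python
# Minimal sketch of member-member FSS ensemble evaluation (assumed inputs).
import itertools
import numpy as np
from scipy.ndimage import uniform_filter

def fss(field_a, field_b, threshold, scale):
    """Fractions skill score between two 2-D fields at one threshold and neighbourhood scale."""
    fa = uniform_filter((field_a >= threshold).astype(float), size=scale)
    fb = uniform_filter((field_b >= threshold).astype(float), size=scale)
    num = np.mean((fa - fb) ** 2)
    den = np.mean(fa ** 2) + np.mean(fb ** 2)
    return 1.0 - num / den if den > 0 else np.nan

def ensemble_fss_spread(members, threshold, scale):
    """Mean and standard deviation of FSS over all ensemble member-member pairs."""
    scores = [fss(a, b, threshold, scale)
              for a, b in itertools.combinations(members, 2)]
    return np.mean(scores), np.std(scores)

# Example: 12 hypothetical precipitation fields, 1 mm/h threshold, 30-gridpoint scale.
members = [np.random.gamma(0.5, 2.0, (300, 300)) for _ in range(12)]
print(ensemble_fss_spread(members, threshold=1.0, scale=30))
```

Repeating the calculation over a range of scales and lead times would give the kind of scale- and time-dependent spread information the abstract describes.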
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
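The subsampling approach described in this abstract can be illustrated with a short Python sketch; the synthetic hourly series, the sampling hour, and the class labels and boundary values below are hypothetical placeholders, not the actual WFD standards.

```python
# Minimal sketch of simulating sampling strategies from high-frequency data (assumed inputs).
import numpy as np
import pandas as pd

def subsample(hourly: pd.Series, freq: str, hour: int = 10) -> pd.Series:
    """Simulate a sampling strategy, e.g. one sample per week ('W') or per month ('MS'),
    always taken at the same time of day."""
    daily = hourly[hourly.index.hour == hour]
    return daily.resample(freq).first().dropna()

def classify(value: float, boundaries: dict) -> str:
    """Assign a quality class from a statistic and ordered upper class boundaries."""
    for label, upper in boundaries.items():
        if value <= upper:
            return label
    return "bad"

# Hypothetical hourly water-temperature record and illustrative class boundaries (deg C).
idx = pd.date_range("2010-01-01", "2010-12-31 23:00", freq="h")
temp = pd.Series(12 + 6 * np.sin(2 * np.pi * idx.dayofyear / 365)
                 + np.random.normal(0, 0.5, len(idx)), index=idx)
boundaries = {"high": 20.0, "good": 23.0, "moderate": 28.0, "poor": 30.0}

for freq in ("W", "MS"):
    stat = np.percentile(subsample(temp, freq), 98)  # UK temperature standard uses the 98th percentile
    print(freq, round(stat, 2), classify(stat, boundaries))
```

Repeating the subsampling many times with different start offsets would give the confidence intervals on the classification that the abstract discusses.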
Abstract:
Probiotics are live microorganisms that confer a health benefit on the host when administered in appropriate amounts. Over 700 randomized, controlled, human studies have been conducted with probiotics thus far, with the results providing strong support for the use of probiotics in the clinical prevention or treatment of gastrointestinal tract disorders and metabolic syndrome. The present review is based on webinar presentations that were developed by the American Gastroenterological Association (AGA) in partnership with the International Scientific Association for Probiotics and Prebiotics (ISAPP) and the North American branch of the International Life Sciences Institute (ILSI North America). The presentations provided gastroenterologists and researchers with fundamental and current scientific information on the influence of gut microbiota on human health and disease, as well as clinical intervention strategies and practical guidelines for the use of probiotics and prebiotics.
Abstract:
We monitored 8- and 10-year-old children’s eye movements as they read sentences containing a temporary syntactic ambiguity to obtain a detailed record of their online processing. Children showed the classic garden-path effect in online processing. Their reading was disrupted following disambiguation, relative to control sentences containing a comma to block the ambiguity, although the disruption occurred somewhat later than would be expected for mature readers. We also asked children questions to probe their comprehension of the syntactic ambiguity offline. They made more errors following ambiguous sentences than following control sentences, demonstrating that the initial incorrect parse of the garden-path sentence influenced offline comprehension. These findings are consistent with “good enough” processing effects seen in adults. While faster reading times and more regressions were generally associated with better comprehension, spending longer reading the question predicted comprehension success specifically in the ambiguous condition. This suggests that reading the question prompted children to reconstruct the sentence and engage in some form of processing, which in turn increased the likelihood of comprehension success. Older children were more sensitive to the syntactic function of commas, and, overall, they were faster and more accurate than younger children.
Abstract:
Compared to skilled adult readers, children typically make more fixations that are longer in duration, shorter saccades, and more regressions, thus reading more slowly (Blythe & Joseph, 2011). Recent attempts to understand the reasons for these differences have discovered some similarities (e.g., children and adults target their saccades similarly; Joseph, Liversedge, Blythe, White, & Rayner, 2009) and some differences (e.g., children’s fixation durations are more affected by lexical variables; Blythe, Liversedge, Joseph, White, & Rayner, 2009) that have yet to be explained. In this article, the E-Z Reader model of eye-movement control in reading (Reichle, 2011; Reichle, Pollatsek, Fisher, & Rayner, 1998) is used to simulate various eye-movement phenomena in adults versus children in order to evaluate hypotheses about the concurrent development of reading skill and eye-movement behavior. These simulations suggest that the primary difference between children and adults is their rate of lexical processing, and that different rates of (post-lexical) language processing may also contribute to some phenomena (e.g., children’s slower detection of semantic anomalies; Joseph et al., 2008). The theoretical implications of this hypothesis are discussed, including possible alternative accounts of these developmental changes, how reading skill and eye movements change across the entire lifespan (e.g., college-aged vs. elderly readers), and individual differences in reading ability.
Abstract:
The present study examined the effects of word length on children’s eye movement behaviour when other variables were carefully controlled. Importantly, the results showed that word length influenced children’s reading times and fixation positions on words. Furthermore, children exhibited stronger word length effects than adults in gaze durations and refixations. Adults and children generally did not differ in initial landing positions, but did differ in refixation behaviour. Overall, the results indicated that while adults and children show similar effects of word length for early measures of eye movement behaviour, differences emerge in later measures.
Abstract:
Two experiments were undertaken to examine whether there is an age-related change in the speed with which readers can capture visual information during fixations in reading. Children’s and adults’ eye movements were recorded as they read sentences that were presented either normally or as “disappearing text”. The disappearing text manipulation had a surprisingly small effect on the children, inconsistent with the notion of an age-related change in the speed with which readers can capture visual information from the page. Instead, we suggest that differences between adults and children are related to the level of difficulty of the sentences for readers of different ages.
Abstract:
Observations from the Heliospheric Imager (HI) instruments aboard the twin STEREO spacecraft have enabled the compilation of several catalogues of coronal mass ejections (CMEs), each characterizing the propagation of CMEs through the inner heliosphere. Three such catalogues are the Rutherford Appleton Laboratory (RAL)-HI event list, the Solar Stormwatch CME catalogue, and, presented here, the J-tracker catalogue. Each catalogue uses a different method to characterize the location of CME fronts in the HI images: manual identification by an expert, the statistical reduction of the manual identifications of many citizen scientists, and an automated algorithm. We provide a quantitative comparison of the differences between these catalogues and techniques, using 51 CMEs common to each catalogue. The time-elongation profiles of these CME fronts are compared, as are the estimates of the CME kinematics derived from application of three widely used single-spacecraft-fitting techniques. The J-tracker and RAL-HI profiles are most similar, while the Solar Stormwatch profiles display a small systematic offset. Evidence is presented that these differences arise because the RAL-HI and J-tracker profiles follow the sunward edge of CME density enhancements, while Solar Stormwatch profiles track closer to the antisunward (leading) edge. We demonstrate that the method used to produce the time-elongation profile typically introduces more variability into the kinematic estimates than differences between the various single-spacecraft-fitting techniques. This has implications for the repeatability and robustness of these types of analyses, arguably especially so in the context of space weather forecasting, where it could make the results strongly dependent on the methods used by the forecaster.
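One of the widely used single-spacecraft-fitting techniques referred to in this abstract is the Fixed-Phi approximation, in which the CME is treated as a point moving radially at constant speed at a fixed angle from the observer's Sun-spacecraft line. The sketch below fits that geometry to a synthetic time-elongation profile; the observer distance, speed, propagation angle and noise level are assumed, for illustration only.

```python
# Minimal sketch of Fixed-Phi fitting of a time-elongation profile (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

AU_KM = 1.496e8          # astronomical unit in km
R_OBS = 0.96 * AU_KM     # assumed observer (spacecraft) heliocentric distance

def fixed_phi_elongation(t_sec, v_kms, phi_deg, t0_sec):
    """Elongation (degrees) vs time for a point moving radially at constant speed."""
    phi = np.radians(phi_deg)
    r = v_kms * (t_sec - t0_sec)                 # heliocentric distance of the point
    return np.degrees(np.arctan2(r * np.sin(phi), R_OBS - r * np.cos(phi)))

# Synthetic "observed" time-elongation profile with noise.
t = np.linspace(5, 60, 40) * 3600.0              # seconds after an assumed launch time
true = fixed_phi_elongation(t, 450.0, 60.0, 0.0)
obs = true + np.random.normal(0, 0.3, t.size)

popt, _ = curve_fit(fixed_phi_elongation, t, obs, p0=(400.0, 50.0, 0.0))
print("fitted speed %.0f km/s, phi %.1f deg" % (popt[0], popt[1]))
```

Applying the same fit to time-elongation profiles extracted by different methods (manual, citizen-science, automated) is the kind of comparison the abstract describes, and differences in the extracted profiles propagate directly into the fitted kinematics.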
Abstract:
This paper looks at the blockages to the publication of children’s literature caused by the intellectual climate of the postwar era, through a case study of the editorial policy of Hachette, the largest publisher for children at this time. This period witnessed heightened tensions surrounding the social and humanitarian responsibilities of literature. Writers were blamed for having created a culture of defeatism, and collaborationist authors were punished harshly in the purges. In the case of children’s literature, the discourse on responsibility was made more urgent by the assumption that children were easily influenced by their reading material, and by the centrality of the young to the discourse on the moral reconstruction of France. As the politician and education reformer Gustave Monod put it: “penser l’avenir, c’est penser le sort des enfants et de la jeunesse” (“to think about the future is to think about the fate of children and young people”). These concerns led to the expansion of associations and publications dedicated to protecting children and promoting “good” reading matter for them, and, famously, to the 1949 law regulating publications for children, which banned the depiction of crime, debauchery and violence that might demoralise young readers. Using the testimonials of former employees, along with readers’ reports and editorial correspondence preserved in the Hachette archives, this paper will examine how individual editorial decisions and self-censorship strategies were shaped by the 1949 law with its attendant discourse of moral panic on children’s reading, and how national concerns for future citizens were balanced with commercial imperatives.
Abstract:
An increasing world population has put great pressure on agricultural landscapes to continually increase in efficiency whilst avoiding negative impacts on the environment. Protected areas, mass-flowering crops and agri-environment schemes have been identified as three broad complementary mitigation strategies to protect and conserve pollinators. Each strategy differs temporally and spatially but all offer significant benefits to pollinators. It is vital we identify the value of these mitigation strategies and their complementarity if we are to tailor landscape management for optimal results and work towards safeguarding our pollination service.
Abstract:
This PhD dissertation investigates the rise of emerging-country multinationals (EMNEs), a phenomenon that has opened up a series of research themes and debates. The main debate in this field is the extent to which the theories/frameworks on foreign direct investment (FDI), which were developed from investigations of multinationals from developed countries, are relevant in explaining outward FDI from EMNEs. This debate is sparked by research suggesting that EMNEs supposedly do not hold the characteristics seen as prerequisites for engaging in FDI. The underlying theme of this PhD is that the field should move away from a one-size-fits-all categorisation of EMNEs and explore the heterogeneity within them. Collecting data from various databases, archival articles and annual reports, the internationalisation process of 136 Latin American multinationals (LAMNEs) was examined. The research explores the differences in internationalisation trajectories and global strategies and classifies firms into one of four categories: Natural-Resource Vertical Integrators, firms in resource-seeking sectors; Accelerated Globals, firms that have become global over a very short period of time; Traditional Globals, EMNEs that have internationalised at the same pace as developed-country MNEs; and Local Optimisers, which only acquire in, or internationalise to, developing countries. The analysis also looks at the decade in which LAMNEs engaged in FDI, to see whether LAMNEs that internationalised during the 1970s and 1980s, when Latin America had a closed economy, differed from LAMNEs that internationalised during the Washington Consensus era of the 1990s or from firms that have only internationalised within the last decade. The findings show that LAMNEs that internationalised before 1990 were more likely to adopt Local Optimiser strategies. More LAMNEs that began to internationalise during the 1990s adopted Traditional Global strategies, although Local Optimiser remained the most prominent strategy. From 2002, Accelerated Global strategies became more prominent and there was much more heterogeneity among LAMNEs. Natural-Resource Vertical Integrator LAMNEs tended to start the internationalisation process during the 1970s/1980s. Despite the rise of EMNEs, and by extension LAMNEs, opting to use cross-border mergers and acquisitions (M&A), there is little research on whether this entry mode has been successful. Contrary to the argument that EMNEs are “internationalising successfully” through this strategy, the findings show that these firms are highly geared and run less efficiently than their Western competitors. In comparison, LAMNEs internationalising through a more gradual approach outperform their Western competitors on efficiency and are not highly geared, i.e. do not hold a lot of debt. The thesis concludes by emphasising a move away from evaluating firms by their country or region of origin and towards evaluating them through the global strategy they are using. This will give a more robust firm-level analysis and help develop the understanding of EMNEs and international business theory.