594 results for Visions in global education
Abstract:
The purpose of this paper is to explore the implementation of online learning in distance educational delivery at Yellow Fields University (pseudonymous) in Sri Lanka. The implementation of online distance education at the University included the use of blended learning. The policy initiative to introduce online learning for distance education in Sri Lanka was guided by the expectation of cost reduction, and the implementation was financed under the Distance Education Modernization Project. The paper presents one case study from a larger multiple-case-study research project that employed an ethnographic research approach in investigating the impact of ICT on distance education in Sri Lanka. Documents, questionnaires and qualitative interviews were used for data collection. There was a significant positive relationship between ownership of computers and students’ ability to use computers for word processing, emailing and Web searching. The lack of access to computers and the Internet, the lack of infrastructure, low levels of computer literacy, the lack of local-language content, and the lack of formal student support services at the University were found to be major barriers to implementing compulsory online activities at the University.
Abstract:
Combining satellite data, atmospheric reanalyses and climate model simulations, variability in the net downward radiative flux imbalance at the top of Earth’s atmosphere (N) is reconstructed and linked to recent climate change. Over the 1985-1999 period, mean N (0.34 ± 0.67 W m−2) is lower than for the 2000-2012 period (0.62 ± 0.43 W m−2; uncertainties at the 90% confidence level), despite the slower rate of surface temperature rise since 2000. While the precise magnitude of N remains uncertain, the reconstruction captures interannual variability, which is dominated by the eruption of Mt. Pinatubo in 1991 and the El Niño–Southern Oscillation. Monthly deseasonalized interannual variability in N generated by an ensemble of 9 climate model simulations, using prescribed sea surface temperatures and radiative forcings, and from the satellite-based reconstruction is significantly correlated (r ∼ 0.6) over the 1985-2012 period.
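As a hedged illustration for readers reproducing this kind of analysis (not code from the paper), the deseasonalizing step mentioned above, removing the mean seasonal cycle so that only interannual anomalies remain, can be sketched as follows; the series and its values are purely hypothetical:

```python
import numpy as np

def deseasonalize(monthly, period=12):
    """Subtract each calendar month's climatological mean from a monthly
    series, leaving the deseasonalized (interannual) anomalies."""
    x = np.asarray(monthly, dtype=float)
    anom = np.empty_like(x)
    for m in range(period):
        # All values for calendar month m, minus that month's multi-year mean
        anom[m::period] = x[m::period] - x[m::period].mean()
    return anom

# Hypothetical 3-year monthly series: a seasonal cycle plus a small trend
t = np.arange(36)
series = np.sin(2 * np.pi * t / 12) + 0.01 * t
anom = deseasonalize(series)
# The seasonal cycle cancels out; only the trend-driven anomalies remain
```

A correlation such as the r ∼ 0.6 quoted above would then be computed between two series deseasonalized in this way.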
Abstract:
This paper introduces an ontology-based knowledge model for knowledge management. This model can facilitate knowledge discovery, providing users with insight for decision making. The users requiring such insight normally play different roles, with different requirements, in an organisation. To meet these requirements, insights are created from purposely aggregated transactional data. This involves a semantic data integration process. In this paper, we present a knowledge management system which is capable of representing knowledge requirements in a domain context and enabling semantic data integration through ontology modelling. The knowledge domain context of the United Bible Societies is used to illustrate the features of the knowledge management capabilities.
Abstract:
In this study we examine the performance of 31 global model radiative transfer schemes in cloud-free conditions with prescribed gaseous absorbers and no aerosols (Rayleigh atmosphere), with prescribed scattering-only aerosols, and with more absorbing aerosols. Results are compared to benchmark results from high-resolution, multi-angular line-by-line radiation models. For purely scattering aerosols, model bias relative to the line-by-line models in the top-of-the-atmosphere aerosol radiative forcing ranges from roughly −10 to 20%, with over- and underestimates of radiative cooling at lower and higher solar zenith angle, respectively. Inter-model diversity (relative standard deviation) increases from ~10 to 15% as solar zenith angle decreases. Inter-model diversity in atmospheric and surface forcing decreases with increased aerosol absorption, indicating that the treatment of multiple scattering is more variable than that of aerosol absorption in the models considered. Aerosol radiative forcing results from multi-stream models are generally in better agreement with the line-by-line results than those from the simpler two-stream schemes. Considering radiative fluxes, model performance is generally the same as or slightly better than results from previous radiation scheme intercomparisons. However, the inter-model diversity in aerosol radiative forcing remains large, primarily as a result of the treatment of multiple scattering. Results indicate that global models that estimate aerosol radiative forcing with two-stream radiation schemes may be subject to persistent biases introduced by these schemes, particularly for regional aerosol forcing.
Abstract:
This paper evaluates the current status of global modeling of the organic aerosol (OA) in the troposphere and analyzes the differences between models as well as between models and observations. Thirty-one global chemistry transport models (CTMs) and general circulation models (GCMs) have participated in this intercomparison, in the framework of AeroCom phase II. The simulation of OA varies greatly between models in terms of the magnitude of primary emissions, secondary OA (SOA) formation, the number of OA species used (2 to 62), the complexity of OA parameterizations (gas-particle partitioning, chemical aging, multiphase chemistry, aerosol microphysics), and the OA physical, chemical and optical properties. The diversity of the global OA simulation results has increased since earlier AeroCom experiments, mainly due to the increasing complexity of the SOA parameterization in models, and the implementation of new, highly uncertain, OA sources. The modeled vertical distribution of OA concentrations varies by over one order of magnitude between models, a spread that deserves a dedicated future study. Furthermore, although the OA / OC ratio depends on OA sources and atmospheric processing, and is important for model evaluation against OA and OC observations, it is resolved only by a few global models. The median global primary OA (POA) source strength is 56 Tg a−1 (range 34–144 Tg a−1) and the median SOA source strength (natural and anthropogenic) is 19 Tg a−1 (range 13–121 Tg a−1). Among the models that take into account the semi-volatile nature of SOA, the median source is calculated to be 51 Tg a−1 (range 16–121 Tg a−1), much larger than the median value of the models that calculate SOA in a more simplistic way (19 Tg a−1; range 13–20 Tg a−1, with one model at 37 Tg a−1). The median atmospheric burden of OA is 1.4 Tg (24 models in the range of 0.6–2.0 Tg and 4 between 2.0 and 3.8 Tg), with a median OA lifetime of 5.4 days (range 3.8–9.6 days).
In models that reported both OA and sulfate burdens, the median value of the OA/sulfate burden ratio is calculated to be 0.77; 13 models calculate a ratio lower than 1, and 9 models higher than 1. For the 26 models that reported OA deposition fluxes, the median wet removal is 70 Tg a−1 (range 28–209 Tg a−1), which is on average 85% of the total OA deposition. Fine aerosol organic carbon (OC) and OA observations from continuous monitoring networks and individual field campaigns have been used for model evaluation. At urban locations, the model–observation comparison indicates missing knowledge on anthropogenic OA sources, in terms of both strength and seasonality. The combined model–measurement analysis suggests the existence of increased OA levels during summer due to biogenic SOA formation over large areas of the USA, which can be of the same order of magnitude as the POA, even at urban locations, and contribute to the measured urban seasonal pattern. Global models are able to simulate the high secondary character of OA observed in the atmosphere as a result of SOA formation and POA aging, although the amount of OA present in the atmosphere remains largely underestimated, with a mean normalized bias (MNB) equal to −0.62 (−0.51) based on the comparison against OC (OA) urban data of all models at the surface, −0.15 (+0.51) when compared with remote measurements, and −0.30 for marine locations with OC data. The mean temporal correlations across all stations are low when compared with OC (OA) measurements: 0.47 (0.52) for urban stations, 0.39 (0.37) for remote stations, and 0.25 for marine stations with OC data. The combination of a high (negative) MNB and higher correlation at urban stations, compared with the low MNB and lower correlation at remote sites, suggests that knowledge about the processes that govern aerosol processing, transport and removal, in addition to their sources, is important at the remote stations.
There is no clear change in model skill with increasing model complexity with regard to OC or OA mass concentration. However, this complexity is needed in models in order to distinguish between anthropogenic and natural OA, as required for climate mitigation, and to calculate the impact of OA on climate accurately.
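For readers unfamiliar with the evaluation metrics quoted in this abstract, the mean normalized bias (MNB) and temporal correlation can be illustrated with a minimal sketch; the station values below are hypothetical, not data from the study:

```python
import numpy as np

def mean_normalized_bias(model, obs):
    """MNB = mean of (model - obs) / obs over all data points.
    Negative values indicate that the model underestimates observations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return float(np.mean((model - obs) / obs))

def temporal_correlation(model, obs):
    """Pearson correlation between modelled and observed time series."""
    return float(np.corrcoef(model, obs)[0, 1])

# Hypothetical monthly OC concentrations (ug m^-3) at one urban station
obs = [2.0, 3.0, 4.0, 5.0]
model = [1.0, 1.2, 1.6, 2.0]
mnb = mean_normalized_bias(model, obs)  # negative: model biased low
r = temporal_correlation(model, obs)    # high: timing well captured
```

A strongly negative MNB combined with a high correlation, as at the urban stations discussed above, is the signature of a model that captures the timing of variability but underestimates source strength.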
Abstract:
This chapter presents findings on English Language instruction at the lower primary level in the context of policies for curricular innovation at national, school and classroom levels. The focus is on policies which connect national and school levels, and on how they might be interpreted when implemented in multiple schools within Singapore’s educational system. Referring to case studies in two schools and to individual lesson observations in 10 schools, we found much agreement with national policies in terms of curriculum (i.e. lesson content and activity selection), leading to great uniformity in the lessons taught by different teachers in different schools. In addition, we found that schools had an important mediating influence on the implementation of national policies. However, adoptions and adaptations of policy innovations at the classroom level were somewhat superficial, as they were more related to changes in educational facilities and procedures than in philosophies.
Abstract:
This paper explores the idea that stakeholder proximity, that is, how much or how little experience a stakeholder has with a focal organization, impacts the extent to which stakeholders rely on strategic group characteristics as an anchor when judging the reputation of higher education institutions. We synthesize theories from psychology (i.e., cognitive categorization theory) and management (i.e., strategic group theory) to explore how stakeholder proximity may influence the formation of organizational reputation. Specifically, we examine how the proximity of three key stakeholders (N=1,049; prospective students, parents of students and hiring managers of new graduates) influences the perceived strategic character and generalized favorability of three distinct groups of post-secondary institutions (research-intensive universities, teaching-intensive universities and career colleges). Our results suggest that high-proximity stakeholders rely less on strategic group characteristics, while reputation at the strategic group level appears to have greater influence on stakeholders who have less direct experience of, and lower proximity to, an organization. Interestingly, our findings reveal some consistent differences between the perceptions of prospective students and hiring managers that pose important theoretical questions about the role and impact of direct experiences in the reputation-building process, while also suggesting that higher education institutions may benefit significantly from marketing strategies differentiated according to stakeholder proximity.
Abstract:
The Arctic is an important region in the study of climate change, but monitoring surface temperatures in this region is challenging, particularly in areas covered by sea ice. Here, in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic surface air temperature (SAT) anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques, with kriging techniques providing the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, which are not currently utilised in temperature anomaly datasets, would improve estimates of Arctic surface air temperature anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at the locations of supplementary data sources over Arctic land and sea ice areas. For the in situ data study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in root mean square error can be up to 0.2 K for Arctic-average anomalies and more than 1 K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
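The accuracy metric used in this abstract can be made concrete with a short sketch; all anomaly values below are hypothetical illustrations, not results from the study:

```python
import numpy as np

def rmse(estimated, reference):
    """Root mean square error between estimated and reference anomalies (K)."""
    err = np.asarray(estimated, dtype=float) - np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical Arctic-average SAT anomalies (K)
reference = [0.5, 1.1, 0.8, 1.4]         # reanalysis "truth" in the testbed
interpolated = [0.6, 1.0, 0.9, 1.3]      # e.g. a kriging-based estimate
non_interpolated = [0.9, 0.6, 1.3, 0.9]  # a non-interpolating estimate

err_interp = rmse(interpolated, reference)
err_non_interp = rmse(non_interpolated, reference)
# In this toy setup the interpolating technique yields the smaller error,
# mirroring the qualitative finding reported in the abstract
```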
Abstract:
Budgeting systems have traditionally been viewed as control mechanisms rather than as communication tools that facilitate the institutionalisation of organisational change. A good budgeting system not only reflects the organisational reality but also socially constructs that reality. This paper uses the structuration perspective to understand budget-related behaviour in a UK research-intensive university and, in particular, to study the role of the budgeting system in achieving organisational sustainability. Giddens’ structuration theory offers a valuable framework for the study of the duality of structure and emphasises the structural properties of social systems. Based on semi-structured interviews with top management and budget holders, it is concluded that, in this specific context, the budgeting system may play a significant role in establishing and legitimising institutional change.
Abstract:
Ocean–sea ice reanalyses are crucial for assessing the variability and recent trends in the Arctic sea ice cover. This is especially true for sea ice volume, as long-term, large-scale sea ice thickness observations are nonexistent. Results from the Ocean ReAnalyses Intercomparison Project (ORA-IP) are presented, with a focus on Arctic sea ice fields reconstructed by state-of-the-art global ocean reanalyses. Differences between the various reanalyses are explored in terms of the effects of data assimilation, model physics and atmospheric forcing on properties of the sea ice cover, including concentration, thickness, velocity and snow. Amongst the 14 reanalyses studied here, 9 assimilate sea ice concentration, and none assimilate sea ice thickness data. The comparison reveals an overall agreement in the reconstructed concentration fields, mainly because of the constraints on surface temperature imposed by direct assimilation of ocean observations, prescribed or assimilated atmospheric forcing, and assimilation of sea ice concentration. However, some spread still exists amongst the reanalyses, due to a variety of factors. In particular, a large spread in sea ice thickness is found within the ensemble of reanalyses, partially caused by biases inherited from their sea ice model components. Biases are also affected by the assimilation of sea ice concentration and the treatment of sea ice thickness in the data assimilation process. An important outcome of this study is that the spatial distribution of ice volume varies widely between products, with no reanalysis standing out as clearly superior when compared to altimetry estimates. The ice thickness from systems without assimilation of sea ice concentration is no worse than that from systems constrained with sea ice observations. An evaluation of the sea ice velocity fields reveals that ice drifts too fast in most systems.
As an ensemble, the ORA-IP reanalyses capture trends in Arctic sea ice area and extent relatively well. However, the ensemble cannot be used to obtain a robust estimate of recent trends in Arctic sea ice volume. Biases in the reanalyses certainly impact the simulated air–sea fluxes in the polar regions and call into question the suitability of current sea ice reanalyses for initializing seasonal forecasts.
Abstract:
The collective representation within global models of aerosol, cloud, precipitation, and their radiative properties remains unsatisfactory. These processes constitute the largest source of uncertainty in predictions of climatic change and hamper the ability of numerical weather prediction models to forecast high-impact weather events. The joint European Space Agency (ESA)–Japan Aerospace Exploration Agency (JAXA) Earth Clouds, Aerosol and Radiation Explorer (EarthCARE) satellite mission, scheduled for launch in 2018, will help to resolve these weaknesses by providing global profiles of cloud, aerosol, precipitation, and associated radiative properties inferred from a combination of measurements made by its collocated active and passive sensors. EarthCARE will improve our understanding of cloud and aerosol processes by extending the invaluable dataset acquired by the A-Train satellites CloudSat, Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and Aqua. Specifically, EarthCARE’s cloud profiling radar, with 7 dB more sensitivity than CloudSat, will detect more thin clouds, and its Doppler capability will provide novel information on convection and on precipitating ice particle and raindrop fall speeds. EarthCARE’s 355-nm high-spectral-resolution lidar will measure cloud and aerosol extinction and optical depth directly and accurately. Combining this with backscatter and polarization information should lead to an unprecedented ability to identify aerosol type. The multispectral imager will provide a context for, and the ability to construct, the cloud and aerosol distribution in 3D domains around the narrow 2D retrieved cross section. The consistency of the retrievals will be assessed to within a target of ±10 W m−2 on the (10 km)2 scale by comparing the multiview broadband radiometer observations to the top-of-atmosphere fluxes estimated by 3D radiative transfer models acting on the retrieved 3D domains.
Abstract:
This paper discusses how global financial institutions are using big data analytics within their compliance operations. Much previous research has focused on the strategic implications of big data, but little has considered how such tools are entwined with regulatory breaches and investigations in financial services. Our work covers two in-depth qualitative case studies, each addressing a distinct type of analytics. The first case focuses on analytics which manage everyday compliance breaches and so are expected by managers. The second case focuses on analytics which facilitate investigation and litigation where serious unexpected breaches may have occurred. In doing so, the study focuses on micro-level data practices to understand how these tools are influencing operational risks and practices. The paper draws on two bodies of literature, the social studies of information systems and of finance, to guide our analysis and practitioner recommendations. The cases illustrate how technologies are implicated in multijurisdictional challenges and regulatory conflicts at each end of the operational risk spectrum. We find that compliance analytics both shape and report on regulatory matters, yet firms often have difficulty recruiting individuals with relevant but diverse skill sets. The cases also underscore the increasing need for financial organizations to adopt robust information governance policies and processes to ease future remediation efforts.