165 results for Small Scale Municipally


Relevance: 80.00%

Abstract:

Different systems, different purposes – but how do they compare as learning environments? We undertook a survey of students at the University, asking whether they learned from their use of the systems, whether they made contact with other students through them, and how often they used them. Although it was a small-scale survey, the results are quite enlightening and quite surprising. Blackboard is populated with learning material, has all the students on a module signed up to it, provides a safe environment (in terms of Acceptable Use and some degree of staff monitoring), and offers privacy within the learning group (plus the lecturer and relevant support staff). Facebook, on the other hand, has no learning material, has only some of the students using the system, and, on the face of it, leaves room for slips in privacy and potential bullying, because its Acceptable Use policy is more lax than an institutional one and breaches must be dealt with on an exception basis, when reported. So why do more students find people on their courses through Facebook than through Blackboard? And why are up to 50% of students reporting that they have learned from using Facebook? Interviews indicate that students in subjects which use seminars are using Facebook to facilitate working groups – they can set up private groups which give them the privacy to discuss ideas in an environment perceived as safer than Blackboard can provide: no staff interference, unless they choose to invite staff in, and the opportunity to select who in the class can engage. The other striking finding is the difference in use between the genders. Males are using Blackboard more frequently than females, whilst the reverse is true for Facebook. Interviews suggest that this may have something to do with needing to access lecture notes… Overall, though, it appears that there is little relationship between the time spent engaging with Blackboard and reports that students have learned from it. Because Blackboard is our central repository for notes, any contact is likely to result in some learning. Facebook, however, shows a clear relationship between frequency of use and perception of learning – and our students post frequently to Facebook. Whilst much of this is probably trivia and social chit-chat, the educational elements of it are, de facto, constructivist in nature. Further questions need to be answered: is the reason the students learn from Facebook because they are creating content which others will see and comment on? Is it because they can engage in a dialogue without the risk of interruption by others?

Relevance: 80.00%

Abstract:

Sudden stratospheric warmings (SSWs) are usually considered to be initiated by planetary wave activity. Here it is asked whether small-scale variability (e.g., related to gravity waves) can lead to SSWs given a certain amount of planetary wave activity that is by itself not sufficient to cause a SSW. A highly vertically truncated version of the Holton–Mass model of stratospheric wave–mean flow interaction, recently proposed by Ruzmaikin et al., is extended to include stochastic forcing. In the deterministic setting, this low-order model exhibits multiple stable equilibria corresponding to the undisturbed vortex and SSW state, respectively. Momentum forcing due to quasi-random gravity wave activity is introduced as an additive noise term in the zonal momentum equation. Two distinct approaches are pursued to study the stochastic system. First, the system, initialized at the undisturbed state, is numerically integrated many times to derive statistics of first passage times of the system undergoing a transition to the SSW state. Second, the Fokker–Planck equation corresponding to the stochastic system is solved numerically to derive the stationary probability density function of the system. Both approaches show that even small to moderate strengths of the stochastic gravity wave forcing can be sufficient to cause a SSW for cases for which the deterministic system would not have predicted a SSW.
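
As a rough illustration of the first (Monte Carlo) approach described above, the sketch below integrates an assumed generic bistable system with additive noise (a double-well stand-in, not the truncated Holton–Mass model itself) and collects first passage times for transitions from the undisturbed well to the disturbed one; the noise strength, threshold and time limit are hypothetical.

import numpy as np

rng = np.random.default_rng(1)

n_runs, dt, t_max = 200, 0.01, 500.0
sigma = 0.35                                  # assumed noise strength ("gravity wave" forcing)
n_steps = int(t_max / dt)

x = np.full(n_runs, -1.0)                     # all members start in the undisturbed well (x = -1)
fpt = np.full(n_runs, np.nan)                 # first passage times (NaN = no transition yet)

for step in range(n_steps):
    # Euler-Maruyama step for dx = (x - x**3) dt + sigma dW (double-well drift, wells at +/-1)
    x += (x - x**3) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_runs)
    newly_crossed = np.isnan(fpt) & (x > 0.9)  # members that just reached the disturbed state
    fpt[newly_crossed] = (step + 1) * dt

crossed = fpt[~np.isnan(fpt)]
print(f"{crossed.size}/{n_runs} members reached the disturbed state; "
      f"mean first passage time = {crossed.mean():.1f}")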

Relevance: 80.00%

Abstract:

Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways that conventional bulk-formula representations cannot. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.

Relevance: 80.00%

Abstract:

Aerosols from anthropogenic and natural sources have been recognized as having an important impact on the climate system. However, the small size of aerosol particles (ranging from 0.01 to more than 10 μm in diameter) and their influence on solar and terrestrial radiation make them difficult to represent within the coarse resolution of general circulation models (GCMs), so that small-scale processes, for example sulfate formation and conversion, need parameterizing. It is the parameterization of emissions, conversion, and deposition and the radiative effects of aerosol particles that cause uncertainty in their representation within GCMs. The aim of this study was to perturb aspects of a sulfur cycle scheme used within a GCM to represent the climatological impacts of sulfate aerosol derived from natural and anthropogenic sulfur sources. It was found that perturbing volcanic SO2 emissions and the scavenging rate of SO2 by precipitation had the largest influence on the sulfate burden. When these parameters were perturbed, the sulfate burden ranged from 0.73 to 1.17 TgS for 2050 sulfur emissions (A2 Special Report on Emissions Scenarios (SRES)), comparable with the range in sulfate burden across all the Intergovernmental Panel on Climate Change SRESs. Thus, the results here suggest that the range in sulfate burden due to model uncertainty is comparable with scenario uncertainty. Despite the large range in sulfate burden there was little influence on the climate sensitivity, which had a range of less than 0.5 K across the ensemble. We hypothesize that this small effect was partly associated with high sulfate loadings in the control phase of the experiment.
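
As a purely schematic illustration of the perturbed-parameter approach (the burden function below is a made-up placeholder, not the GCM sulfur cycle scheme, and the perturbation ranges are assumptions), one could sample the two influential parameters and collect the resulting spread:

import numpy as np

rng = np.random.default_rng(42)
n_members = 20

# Hypothetical multiplicative perturbations to the two most influential parameters.
volcanic_so2_scale = rng.uniform(0.5, 2.0, n_members)   # scaling of volcanic SO2 emissions
scavenging_scale = rng.uniform(0.5, 2.0, n_members)     # scaling of SO2 scavenging by precipitation

def toy_sulfate_burden(volc, scav, base_burden_tgs=0.95):
    """Placeholder response: burden rises with emissions and falls with scavenging."""
    return base_burden_tgs * volc / scav

burdens = toy_sulfate_burden(volcanic_so2_scale, scavenging_scale)
print(f"Toy ensemble sulfate burden range: {burdens.min():.2f} to {burdens.max():.2f} TgS")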

Relevance: 80.00%

Abstract:

This article describes the development and evaluation of the U.K.’s new High-Resolution Global Environmental Model (HiGEM), which is based on the latest climate configuration of the Met Office Unified Model, known as the Hadley Centre Global Environmental Model, version 1 (HadGEM1). In HiGEM, the horizontal resolution has been increased to 0.83° latitude × 1.25° longitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. Multidecadal integrations of HiGEM, and the lower-resolution HadGEM, are used to explore the impact of resolution on the fidelity of climate simulations. Generally, SST errors are reduced in HiGEM. Cold SST errors associated with the path of the North Atlantic drift improve, and warm SST errors are reduced in upwelling stratocumulus regions where the simulation of low-level cloud is better at higher resolution. The ocean model in HiGEM allows ocean eddies to be partially resolved, which dramatically improves the representation of sea surface height variability. In the Southern Ocean, most of the heat transport in HiGEM is achieved by resolved eddy motions, which replaces the parameterized eddy heat transport in the lower-resolution model. HiGEM is also able to simulate more realistically the small-scale features in the wind stress curl around islands and oceanic SST fronts, which may have implications for oceanic upwelling and ocean biology. Higher resolution in both the atmosphere and the ocean allows coupling to occur on small spatial scales. In particular, the small-scale interaction recently seen in satellite imagery between the atmosphere and tropical instability waves in the tropical Pacific Ocean is realistically captured in HiGEM. Tropical instability waves play a role in improving the simulation of the mean state of the tropical Pacific, which has important implications for climate variability. In particular, all aspects of the simulation of ENSO (spatial patterns, the time scales at which ENSO occurs, and global teleconnections) are much improved in HiGEM.

Relevance: 80.00%

Abstract:

Problem structuring methods or PSMs are widely applied across a range of variable but generally small-scale organizational contexts. However, it has been argued that they are seen and experienced less often in areas of wide-ranging and highly complex human activity – specifically those relating to sustainability, environment, democracy and conflict (or SEDC). In an attempt to plan, track and influence human activity in SEDC contexts, the authors in this paper make the theoretical case for a PSM derived from various existing approaches. They show how it could make a contribution in a specific practical context – within sustainable coastal development projects around the Mediterranean which have utilized systemic and prospective sustainability analysis or, as it is now known, Imagine. The latter is itself a PSM, but one which is 'bounded' within the limits of the project to help deliver the required 'deliverables' set out in the project blueprint. The authors argue that sustainable development projects would benefit from a deconstruction of process by those engaged in the project, and they suggest one approach that could be taken – a breakout from a project-bounded PSM to an analysis that embraces the project itself. The paper begins with an introduction to the sustainable development context and literature, and then grounds the debate within a set of projects facilitated by Blue Plan for Mediterranean coastal zones. The paper goes on to show how the analytical framework could be applied and what insights might be generated.

Relevance: 80.00%

Abstract:

This paper describes some of the results of a detailed farm-level survey of 32 small-scale cotton farmers in the Makhathini Flats region of South Africa. The aim was to assess and measure some of the impacts (especially in terms of savings in pesticide and labour, as well as benefits to human health) attributable to the use of insect-tolerant Bt cotton. The study reveals a direct cost benefit for Bt growers of SAR416 ($51) per hectare per season due to a reduction in the number of insecticide applications. Cost savings emerged in the form of lower requirements for pesticide, but reduced requirements for water and labour were also important. The reduction in the number of sprays was particularly beneficial to the women who do some of the spraying and the children who collect water and assist in spraying. The increasing adoption rate of Bt cotton appears to have a health benefit measured in terms of reported rates of accidental insecticide poisoning, which appear to be declining as the uptake of Bt cotton increases. However, local farmers' understanding of refugia and their management is deficient and needs improving. Finally, Bt cotton growers emerge as more resilient in absorbing price fluctuations.

Relevance: 80.00%

Abstract:

This paper summarizes some of the geoarchaeological evidence for early arable agriculture in Britain and Europe, and introduces new evidence for small-scale but very intensive cultivation in the Neolithic, Bronze Age and Iron Age in Scotland. The Scottish examples demonstrate that, from the Neolithic to the Iron Age, midden heaps were sometimes ploughed in situ; this means that, rather than spreading midden material onto the fields, the early farmers simply ran an ard over their compost heaps and sowed the resulting plots. The practice appears to have been common in Scotland, and may also have occurred in England. Neolithic cultivation of a Mesolithic midden is suggested, based on thin-section analysis of the middens at Northton, Harris. The fertility of the Mesolithic middens may partly explain why Neolithic farmers re-settled Mesolithic sites in the Northern and Western Isles.

Relevance: 80.00%

Abstract:

The promotion of technologies seen to be aiding in the attainment of agricultural sustainability has been popular amongst Northern-based development donors for many years. One of these, botanical insecticides (e.g. those based on neem, pyrethrum and tobacco), has been a particular favorite, as they are equated with being 'natural' and hence less damaging to human health and the environment. This paper describes the outcome of interactions between one non-government organisation (NGO), the Diocesan Development Services (DDS), based in Kogi State, Nigeria, and a major development donor based in Europe that led to the establishment of a programme designed to promote the virtues of a tobacco-based insecticide to small-scale farmers. The Tobacco Insecticide Programme (TIP) began in the late 1980s and ended in 2001, absorbing significant quantities of resources in the process. TIP began with exploratory investigations of efficacy on the DDS seed multiplication farm, followed by stages of researcher-managed and farmer-managed on-farm trials. A survey in 2002 assessed adoption of the technology by farmers. While yield benefits from using the insecticide were nearly always positive and statistically significant relative to an untreated control, they were not as good as those from commercial insecticides. However, adoption of the tobacco insecticide by local farmers was poor. The paper discusses the reasons for poor adoption, including relative benefits in gross margin, and uses the TIP example to explore the differing power relationships that exist between donors, their field partners and farmers. (C) 2004 by The Haworth Press, Inc. All rights reserved.

Relevance: 80.00%

Abstract:

These notes were issued on a small scale in 1983 and 1987 and on request at other times. This issue follows two items of news. First, Walter Colquitt and Luther Welsh found the 'missed' Mersenne prime M110503 and advanced the frontier of complete Mp-testing to 139,267. In so doing, they terminated Slowinski's significant string of four consecutive Mersenne primes. Secondly, a team of five established a non-Mersenne number as the largest known prime. This result terminated the 1952-89 reign of Mersenne primes. All the original Mersenne numbers with p < 258 were factorised some time ago. The Sandia Laboratories team of Davis, Holdridge & Simmons, with some little assistance from a CRAY machine, cracked M211 in 1983 and M251 in 1984. They contributed their results to the 'Cunningham Project', care of Sam Wagstaff. That project is now moving apace thanks to developments in technology, factorisation and primality testing. New levels of computer power and new computer architectures motivated by the open-ended promise of parallelism are now available. Once again, the suppliers may be offering free buildings with the computer. However, the Sandia '84 CRAY-1 implementation of the quadratic-sieve method is now outpowered by the number-field sieve technique. This is deployed on either purpose-built hardware or large syndicates, even distributed world-wide, of collaborating standard processors. New factorisation techniques of both special and general applicability have been defined and deployed. The elliptic-curve method finds large factors with helpful properties, while the number-field sieve approach is breaking down composites with over one hundred digits. The material is updated on an occasional basis to follow the latest developments in primality-testing large Mp and factorising smaller Mp; all dates derive from the published literature or referenced private communications. Minor corrections, additions and changes merely advance the issue number after the decimal point. The reader is invited to report any errors and omissions that have escaped the proof-reading, to answer the unresolved questions noted and to suggest additional material associated with this subject.
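
For context, complete Mp-testing of this kind rests on the Lucas–Lehmer test. A minimal Python sketch is shown below; it is fine for small exponents, whereas the record-breaking computations referred to above rely on heavily optimised large-integer arithmetic.

def lucas_lehmer(p):
    """Return True if M_p = 2**p - 1 is prime (p itself must be an odd prime)."""
    m = (1 << p) - 1                 # the Mersenne number M_p
    s = 4                            # Lucas-Lehmer seed
    for _ in range(p - 2):
        s = (s * s - 2) % m          # iterate s -> s^2 - 2 (mod M_p)
    return s == 0

# Example: M_13 = 8191 is prime, while M_11 = 2047 = 23 * 89 is not.
print(lucas_lehmer(13), lucas_lehmer(11))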

Relevance: 80.00%

Abstract:

We use the third perihelion pass by the Ulysses spacecraft to illustrate and investigate the “flux excess” effect, whereby open solar flux estimates from spacecraft increase with increasing heliocentric distance. We analyze the potential effects of small-scale structure in the heliospheric field (giving fluctuations in the radial component on timescales smaller than 1 h) and kinematic time-of-flight effects of longitudinal structure in the solar wind flow. We show that the flux excess is explained neither by very small-scale structure (timescales < 1 h) nor by the kinematic “bunching effect” on spacecraft sampling. The observed flux excess is, however, well explained by the kinematic effect of larger-scale (>1 day) solar wind speed variations on the frozen-in heliospheric field. We show that averaging over an interval T (that is long enough to eliminate structure originating in the heliosphere yet small enough to avoid cancelling opposite-polarity radial field that originates from genuine sector structure in the coronal source field) is only an approximately valid way of allowing for these effects and does not adequately explain or account for differences between the streamer belt and the polar coronal holes.
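
As a toy illustration of the averaging-interval point (synthetic hourly data with an assumed two-sector polarity pattern plus small-scale noise, not the Ulysses observations), the sketch below shows how an open-flux proxy built from T-hour averages of the radial field first falls as small-scale fluctuations cancel and then falls further once genuine sector structure begins to cancel too.

import numpy as np

# Synthetic hourly radial-field series over one ~27-day solar rotation.
rng = np.random.default_rng(0)
hours = np.arange(24 * 27)
sector = np.sign(np.sin(2 * np.pi * hours / (24 * 27)))   # genuine two-sector polarity
br = 2.0 * sector + rng.normal(0.0, 1.5, hours.size)      # radial field, arbitrary units

def open_flux_proxy(series, T):
    """Average Br over blocks of T hours, then average the block magnitudes."""
    n = series.size // T
    blocks = series[: n * T].reshape(n, T)
    return np.abs(blocks.mean(axis=1)).mean()

for T in (1, 6, 24, 72, 24 * 27):
    print(f"T = {T:4d} h   <|<Br>_T|> = {open_flux_proxy(br, T):.2f}")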

Relevance: 80.00%

Abstract:

Chemical and meteorological parameters measured on board the Facility for Airborne Atmospheric Measurements (FAAM) BAe 146 Atmospheric Research Aircraft during the African Monsoon Multidisciplinary Analysis (AMMA) campaign are presented to show the impact of NOx emissions from recently wetted soils in West Africa. NO emissions from soils have previously been observed in many geographical areas with different types of soil/vegetation cover during small-scale studies, and have been inferred at large scales from satellite measurements of NOx. This study is the first dedicated to showing the emissions of NOx at an intermediate scale between local surface sites and continental satellite measurements. The measurements reveal pronounced mesoscale variations in NOx concentrations closely linked to spatial patterns of antecedent rainfall. Fluxes required to maintain the NOx concentrations observed by the BAe-146 in a number of case studies, and for a range of assumed OH concentrations (1×10⁶ to 1×10⁷ molecules cm⁻³), are calculated to be in the range 8.4 to 36.1 ng N m⁻² s⁻¹. These values are comparable to the range of fluxes from 0.5 to 28 ng N m⁻² s⁻¹ reported from small-scale field studies in a variety of non-nutrient-rich tropical and sub-tropical locations in the review of Davidson and Kingerlee (1997). The fluxes calculated in the present study have been scaled up to cover the area of the Sahel bounded by 10° to 20° N and 10° E to 20° W, giving an estimated emission of 0.03 to 0.30 Tg N from this area for July and August 2006. The observed chemical data also suggest that the NOx emitted from soils is taking part in ozone formation, as ozone concentrations exhibit similar fine-scale structure to the NOx, with enhancements over the wet soils. Such variability cannot be explained on the basis of transport from other areas. Delon et al. (2008) is a companion paper to this one which models the impact of soil NOx emissions on the NOx and ozone concentrations over West Africa during AMMA. It employs an artificial neural network to define the emissions of NOx from soils, integrated into a coupled chemistry-dynamics model, and its results are compared to the observed data presented in this paper. Here we compare fluxes deduced from the observed data with the model-derived values from Delon et al. (2008).

Relevance: 80.00%

Abstract:

In real-world applications, sequential algorithms for data mining and data exploration are often unsuitable for datasets of enormous size, high dimensionality and complex data structure. Grid computing promises unprecedented opportunities for unlimited computing and storage resources. In this context there is a need to develop high-performance distributed data mining algorithms. However, the computational complexity of the problem and the large amount of data to be explored often make the design of large-scale applications particularly challenging. In this paper we present the first distributed formulation of a frequent subgraph mining algorithm for discriminative fragments of molecular compounds. Two distributed approaches have been developed and compared on the well-known National Cancer Institute’s HIV-screening dataset. We present experimental results on a small-scale computing environment.

Relevance: 80.00%

Abstract:

This report presents key findings from a small-scale pilot research project that explored the experiences and priorities of young people caring for their siblings in sibling-headed households affected by AIDS in Tanzania and Uganda. Qualitative and participatory research was conducted with 33 young people living in sibling-headed households and 39 NGO staff and community members in rural and urban areas of Tanzania and Uganda. The report analyses the ways that young people manage transitions to caring for their younger siblings following their parents’ death and the impacts of caring on their family relations, education, emotional wellbeing and health, social lives and their transitions to adulthood. The study highlights gender- and age-related differences in the nature and extent of young people’s care work and discusses young people’s needs and priorities for action, based on the views of young people, NGO staff and community members. Meeting the basic needs of young people living in sibling-headed households, listening to young people’s views, fostering peer support and relationships of trust with supportive adults, raising awareness and advocacy emerge as key priorities to safeguard the rights of children and young people living in sibling-headed households and challenge the stigma and marginalisation they sometimes face.

Relevance: 80.00%

Abstract:

Atmospheric Factors Governing Banded Orographic Convection

The three-dimensional structure of shallow orographic convection is investigated through simulations performed with a cloud-resolving numerical model. In moist flows that overcome a given topographic barrier to form statically unstable cap clouds, the organization of the convection depends on both the atmospheric structure and the mechanism by which the convection is initiated. Convection initiated by background thermal fluctuations embedded in the flow over a smooth mountain (without any small-scale topographic features) tends to be cellular and disorganized, except that shear-parallel bands may form in flows with strong unidirectional vertical shear. The development of well-organized bands is favored when there is weak static instability inside the cloud and when the dry air surrounding the cloud is strongly stable. These bands move with the flow and distribute their cumulative precipitation evenly over the mountain upslope. Similar shear-parallel bands also develop in flows where convection is initiated by small-scale topographic noise superimposed onto the main mountain profile, but in this case stronger circulations are also triggered that create stationary rainbands parallel to the low-level flow. This second dominant mode, which is less sensitive to the atmospheric structure and the strength of forcing, is triggered by lee waves that form over small-scale topographic bumps near the upstream edge of the main orographic cloud. Due to their stationarity, these flow-parallel bands can produce locally heavy precipitation amounts.