864 results for Implementation of ERP-system
Abstract:
Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and the concept of oligotrophic and eutrophic lakes that he developed with Einar Naumann remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept, and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300,000 standing water bodies in England and Wales alone, and maybe as many again in Scotland. More than 80,000 are sizable (> 1 ha). Some classification scheme to cope with these numbers is needed and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system that has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system which will be workable on a north European basis and presents some early results.
Abstract:
The article discusses various reports published within the issue, including one by Carmine Bianchi on understanding the public sector from different levels and perspectives, one by Mauro Lo Tennero on Sicily's aspiration and structure to enforce public policy, and one by Nuno Videira and colleagues on the use of group model building in the public sector to agree on sustainable policies.
Abstract:
Although Marine Protected Areas (MPAs) are an increasingly popular policy tool for protecting marine stocks and biodiversity, they pose high costs for small-scale fisherfolk in poor countries. With Tanzania’s Mnazi Bay Ruvuma Estuary Marine Park as an example, we develop a spatial economic decision-modelling framework as a lens to examine fishers’ reactions to incentives created by an MPA. We argue that MPAs in poor countries can only contribute to sustainability if management induces changes in incentives to fish through a combination of enforcement (‘sticks’) and livelihood projects (‘carrots’). We emphasise practical implementation issues and implications for fostering marine ecosystem sustainability.
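The decision-modelling framework itself is not reproduced in the abstract, but the 'sticks and carrots' incentive logic can be illustrated with a small sketch. The following Python snippet is a hypothetical expected-payoff comparison for an individual fisher choosing between fishing inside the MPA, fishing outside it, or taking up an alternative livelihood; the functions, parameters and numbers are assumptions for illustration, not the authors' specification.

```python
# Minimal sketch (not the authors' model): expected-payoff site choice for a
# fisher weighing fishing inside the MPA, fishing outside it, or an
# alternative livelihood. All names and numbers are illustrative assumptions.

def payoff_inside(catch_value, detection_prob, fine):
    """Fishing inside the MPA: better catch, but enforcement risk (the 'stick')."""
    return catch_value - detection_prob * fine

def payoff_outside(catch_value, travel_cost):
    """Fishing on legal grounds outside the MPA."""
    return catch_value - travel_cost

def choose_activity(inside_catch, outside_catch, livelihood_income,
                    detection_prob, fine, travel_cost):
    """Return the activity with the highest expected payoff."""
    options = {
        "fish_inside_mpa": payoff_inside(inside_catch, detection_prob, fine),
        "fish_outside_mpa": payoff_outside(outside_catch, travel_cost),
        "alternative_livelihood": livelihood_income,  # the 'carrot'
    }
    best = max(options, key=options.get)
    return best, options

if __name__ == "__main__":
    # Weak enforcement and no meaningful livelihood alternative: poaching pays.
    print(choose_activity(100, 70, 20, detection_prob=0.05, fine=200, travel_cost=10))
    # Stronger enforcement plus a livelihood 'carrot' shifts the choice.
    print(choose_activity(100, 70, 75, detection_prob=0.40, fine=200, travel_cost=10))
```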
Abstract:
Sri Lanka's participation rates in higher education are low and have risen only slightly in the last few decades; the number of places in the state university system caters for only around 3% of the university entrant age cohort. The literature reveals that the highly competitive global knowledge economy increasingly favours workers with high levels of education who are also lifelong learners. This lack of access to higher education for a sizable proportion of the labour force is identified as a severe impediment to Sri Lanka's competitiveness in the global knowledge economy. The literature also suggests that Information and Communication Technologies (ICTs) are increasingly relied upon in many contexts to deliver flexible learning, catering especially for the needs of lifelong learners in today's higher educational landscape. The government of Sri Lanka invested heavily in ICTs for distance education during the period 2003-2009 in a bid to increase access to higher education, but there has been little research into the impact of this investment. To address this gap, this study investigated the impact of ICTs on distance education in Sri Lanka with respect to increasing access to higher education. To achieve this aim, the research examined Sri Lanka's effort from three perspectives: policy, implementation and user. A multiple case study using an ethnographic approach was conducted to observe Orange Valley University's and Yellow Fields University's (pseudonymous) implementation of distance education programmes, using questionnaires, qualitative interviewing and document analysis. In total, data for the analysis were collected from 129 questionnaires, 33 individual interviews and 2 group interviews. The research revealed that ICTs have indeed increased opportunities for higher education, but mainly for people from affluent families in the Western Province. The issues identified were categorized under the themes of quality assurance, location, language, digital literacies and access to resources, and recommendations were offered to tackle them in accordance with the study findings. The study also revealed the strong presence of a multifaceted digital divide in the country. In conclusion, this research has shown that although ICT-enabled distance education has the potential to increase access to higher education, the present implementation of the system in Sri Lanka has been less than successful.
Abstract:
This study puts forward a method to model and simulate the complex system of a hospital on the basis of multi-agent technology. Hospital agents with intelligent and coordinative characteristics were designed, the message object was defined, and the model's operating mechanism for autonomous activities and its coordination mechanism were also designed. In addition, an Ontology library and a Norm library, among others, were introduced using semiotic methods and theory to enrich the approach to system modelling. Swarm was used to develop the multi-agent based simulation system, which can help hospitals improve their organization and management, optimize working procedures, improve the quality of medical care and reduce medical costs.
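As a rough illustration of the agent and message design described above (the study itself used the Swarm toolkit together with Ontology and Norm libraries), the following Python sketch shows agents exchanging message objects through a simple synchronous scheduler. The agent classes, message fields and scheduling loop are illustrative assumptions, not the paper's implementation.

```python
# Minimal multi-agent sketch: agents with inboxes exchange message objects via
# a tiny synchronous scheduler. Illustrative only; not the Swarm-based system.
from dataclasses import dataclass
from collections import deque

@dataclass
class Message:
    sender: str
    receiver: str
    content: str

class Agent:
    def __init__(self, name):
        self.name = name
        self.inbox = deque()

    def receive(self, msg):
        self.inbox.append(msg)

    def step(self, post):
        """Process at most one message per tick; subclasses define the reaction."""
        if self.inbox:
            self.handle(self.inbox.popleft(), post)

class PatientAgent(Agent):
    def handle(self, msg, post):
        if msg.content == "diagnosis_ready":
            print(f"{self.name}: received diagnosis from {msg.sender}")

class DoctorAgent(Agent):
    def handle(self, msg, post):
        if msg.content == "request_consultation":
            post(Message(self.name, msg.sender, "diagnosis_ready"))

def run(agents, initial_messages, ticks=5):
    """A very small stand-in for the simulation engine's scheduling loop."""
    registry = {a.name: a for a in agents}
    post = lambda m: registry[m.receiver].receive(m)
    for m in initial_messages:
        post(m)
    for _ in range(ticks):
        for a in agents:
            a.step(post)

if __name__ == "__main__":
    patient, doctor = PatientAgent("patient_1"), DoctorAgent("doctor_1")
    run([patient, doctor], [Message("patient_1", "doctor_1", "request_consultation")])
```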
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several 1000-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is approximately the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model climate–carbon feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
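The single-forcing attribution can be summarised with a simple additivity check: compare the all-forcings response with the sum of the single-forcing responses. The Python sketch below uses invented numbers purely to show the bookkeeping; it is not EMIC output.

```python
# Illustrative additivity check for single-forcing experiments. All values are
# invented for demonstration; real analyses would use multi-century time series.
single_forcing_warming_K = {
    "solar": 0.08,
    "volcanic": -0.05,
    "co2": 0.55,
    "other_ghg": 0.20,
    "land_use": -0.06,
    "aerosols": -0.25,
}

all_forcings_warming_K = 0.49  # hypothetical all-forcings 20th-century trend

linear_sum = sum(single_forcing_warming_K.values())
residual = all_forcings_warming_K - linear_sum
print(f"sum of single-forcing responses: {linear_sum:+.2f} K")
print(f"all-forcings response:           {all_forcings_warming_K:+.2f} K")
print(f"non-additive residual:           {residual:+.2f} K")
```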
Abstract:
Earth system models are increasing in complexity and incorporating more processes than their predecessors, making them important tools for studying the global carbon cycle. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes, with coupled climate-carbon cycle models that represent land-use change simulating total land carbon stores by 2100 that vary by as much as 600 Pg C given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous model evaluation methodologies. Here we assess the state of the art in the evaluation of Earth system models, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeo data and (ii) metrics for evaluation, and discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute towards the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but it is also a challenge, as more knowledge about data uncertainties is required in order to determine robust evaluation methodologies that move the field of ESM evaluation from a "beauty contest" toward the development of useful constraints on model behaviour.
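One of the strategies mentioned, emergent constraints, can be sketched with a toy calculation: regress a projected quantity against a present-day observable across an ensemble of models, then use the real-world observation to narrow the projected range. The numbers below are synthetic and the variable names are assumptions; they do not come from any real ESM ensemble.

```python
# Hedged sketch of an emergent-constraint calculation with synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: observable metric x (e.g. a present-day carbon-cycle
# sensitivity) and projected quantity y (e.g. land carbon loss by 2100, Pg C).
n_models = 12
x = rng.normal(5.0, 1.5, n_models)
y = 40.0 * x + rng.normal(0.0, 20.0, n_models)   # emergent relationship + scatter

slope, intercept = np.polyfit(x, y, 1)           # cross-ensemble regression

x_obs, x_obs_err = 4.5, 0.5                      # hypothetical observation
y_constrained = slope * x_obs + intercept
y_spread = abs(slope) * x_obs_err                # first-order uncertainty propagation

print(f"unconstrained ensemble range: {y.min():.0f} to {y.max():.0f} Pg C")
print(f"observationally constrained:  {y_constrained:.0f} +/- {y_spread:.0f} Pg C")
```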
Abstract:
The Ultra Weak Variational Formulation (UWVF) is a powerful numerical method for the approximation of acoustic, elastic and electromagnetic waves in the time-harmonic regime. The use of Trefftz-type basis functions incorporates the known wave-like behaviour of the solution in the discrete space, allowing large reductions in the required number of degrees of freedom for a given accuracy, when compared to standard finite element methods. However, the UWVF is not well suited to the accurate approximation of singular sources in the interior of the computational domain. We propose an adjustment to the UWVF for seismic imaging applications, which we call the Source Extraction UWVF. Different fields are solved for in subdomains around the source and matched on the inter-domain boundaries. Numerical results are presented for a domain of constant wavenumber and for a domain of varying sound speed in a model used for seismic imaging.
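A hedged sketch of the splitting that underlies the source-extraction idea, written here for the constant-wavenumber 2D Helmholtz case (the notation is assumed, not taken from the paper): the singular part of the field is represented analytically, and only a smooth remainder is approximated near the source.

```latex
% Point source at x_s in a constant-wavenumber region, with its known
% fundamental solution u_s:
\Delta u + k^2 u = -\delta(\mathbf{x}-\mathbf{x}_s) \quad \text{in } \Omega,
\qquad
u_s(\mathbf{x}) = \tfrac{i}{4}\, H_0^{(1)}\!\bigl(k\,|\mathbf{x}-\mathbf{x}_s|\bigr).
% In a subdomain \Omega_s containing the source, solve for the smooth remainder
% u_r = u - u_s, which satisfies the homogeneous equation there:
\Delta u_r + k^2 u_r = 0 \quad \text{in } \Omega_s,
% and match u_r + u_s to the total field u across \partial\Omega_s through the
% usual UWVF impedance traces on the inter-domain boundary.
```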
Abstract:
A parameterization of mesoscale eddies in coarse-resolution ocean general circulation models (GCMs) is formulated and implemented using a residual-mean formalism. In that framework, mean buoyancy is advected by the residual velocity (the sum of the Eulerian and eddy-induced velocities) and modified by a residual flux which accounts for the diabatic effects of mesoscale eddies. The residual velocity is obtained by stepping forward a residual-mean momentum equation in which eddy stresses appear as forcing terms. Study of the spatial distribution of eddy stresses, derived by using them as control parameters to 'fit' the residual-mean model to observations, supports the idea that eddy stresses can be likened to a vertical down-gradient flux of momentum with a coefficient which is constant in the vertical. The residual eddy flux is set to zero in the ocean interior, where mesoscale eddies are assumed to be quasi-adiabatic, but is parameterized by a horizontal down-gradient diffusivity near the surface where eddies develop a diabatic component as they stir properties horizontally across steep isopycnals. The residual-mean model is implemented and tested in the MIT general circulation model. It is shown that the resulting model (1) has a climatology that is superior to that obtained using the Gent and McWilliams parameterization scheme with a spatially uniform diffusivity and (2) allows one to significantly reduce the (spurious) horizontal viscosity used in coarse-resolution GCMs.
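The key relations of the residual-mean framework described above can be sketched as follows; the symbols are generic notation assumed for illustration rather than the paper's own.

```latex
% Residual velocity = Eulerian mean + eddy-induced part:
\mathbf{v}_{\mathrm{res}} = \overline{\mathbf{v}} + \mathbf{v}^{*},
% Mean buoyancy advected by the residual velocity and modified by a residual eddy flux:
\frac{\partial \overline{b}}{\partial t}
  + \mathbf{v}_{\mathrm{res}}\cdot\nabla\overline{b}
  = -\,\nabla\cdot\mathbf{F}_{\mathrm{res}},
% Eddy stress modelled as a vertical down-gradient momentum flux with a
% vertically constant coefficient \nu_e:
\boldsymbol{\tau}_{e} \simeq \nu_{e}\,\frac{\partial \overline{\mathbf{u}}}{\partial z},
% Residual flux: zero in the quasi-adiabatic interior, horizontal down-gradient
% diffusion with coefficient K in the near-surface diabatic layer:
\mathbf{F}_{\mathrm{res}} = 0 \ \text{(interior)},
\qquad
\mathbf{F}_{\mathrm{res}} \simeq -K\,\nabla_{h}\overline{b} \ \text{(near surface)}.
```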