851 results for Meaningful teachings
Abstract:
This paper describes the results of research intended to explore the volatility inherent in the United Nations Development Programme's (UNDP) Human Development Index (HDI). The HDI is intended to be a simple and transparent device for comparing progress in human development, and is an aggregate of life expectancy, education and GDP per capita. Values of the HDI for each country are presented in the Human Development Reports (HDRs), the first being published in 1990. However, while the methodology is consistent for all countries in each year, there are notable differences between years that make temporal comparisons of progress difficult. The paper presents the results of recalculating the HDI for a simplified sample of 114 countries using various methodologies employed by the UNDP. The results are a set of deviations of recalculated HDI ranks compared to the original ranks given in the HDRs. The volatility that can result from such recalculation is shown to be substantial (±10-15 ranks), yet reports in the popular press are frequently sensitive to movements of only a few ranks. Such movement can easily be accounted for by changes in the HDI methodology rather than genuine progress in human development. While the HDRs often carry warnings about the inadvisability of such year-on-year comparisons, it is argued that the existence of such a high-profile index and its overt presentation within league tables do encourage such comparison. Assuming that the HDI will be retained as a focal point within the HDRs, it is suggested that greater focus be placed on more meaningful and robust categories of human development (e.g. low, medium and high) rather than league tables, where shifts of a few places, perhaps as a result of nothing more than a methodological or data artefact, may be highlighted in the press and by policy makers. (C) 2003 Elsevier Science B.V. All rights reserved.
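The aggregation the abstract describes can be made concrete. The sketch below assumes the goalpost normalisation and component weights used in HDRs of that era (life-expectancy goalposts of 25 and 85 years, an education index of 2/3 adult literacy plus 1/3 gross enrolment, and log-scaled GDP per capita between $100 and $40,000); these goalposts are illustrative assumptions, and changing any of them is precisely the kind of methodological shift the paper shows can reshuffle ranks.

```python
import math

def dim_index(x, lo, hi):
    # Goalpost normalisation: map a raw indicator onto [0, 1]
    return (x - lo) / (hi - lo)

def hdi(life_exp, adult_literacy, gross_enrolment, gdp_per_capita):
    """Simple average of the three dimension indices (pre-2010 HDI style)."""
    life = dim_index(life_exp, 25.0, 85.0)
    education = (2.0 / 3.0) * adult_literacy + (1.0 / 3.0) * gross_enrolment
    income = dim_index(math.log(gdp_per_capita),
                       math.log(100.0), math.log(40000.0))
    return (life + education + income) / 3.0

print(round(hdi(70.0, 0.90, 0.80, 10000.0), 3))  # prints 0.795
```

Country ranks would then be obtained by sorting on this value, which is exactly where small changes to goalposts or component definitions translate into movements of several rank places.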
Abstract:
Efforts to decentralise the pursuit of economic and social development have increased in recent years. The authors examine the rationale for establishing local development companies in areas of high unemployment and deprivation. The broad purpose is to establish a new style of organisation that combines attributes of the public and private sectors: to adapt and integrate economic and social services to meet local needs, to champion local interests in external arenas, and to act as enabling agents to promote local investment and development. These arguments are elaborated and illustrated with reference to one of Britain's most successful local development companies, Govan Initiative. The analysis reveals important strengths of the Initiative, including its action orientation, commitment to quality, and a local leadership role, but also certain weaknesses including its limited leverage over wider policies and resource flows. Local development companies need meaningful commitment from regional and national public organisations to fulfil their potential.
Abstract:
Data from the MIPAS instrument on Envisat, supplemented by meteorological analyses from ECMWF and the Met Office, are used to study the meteorological and trace-gas evolution of the stratosphere in the southern hemisphere during winter and spring 2003. A pole-centred approach is used to interpret the data in the physically meaningful context of the evolving stratospheric polar vortex. The following salient dynamical and transport features are documented and analysed: the merger of anticyclones in the stratosphere; the development of an intense, quasi-stationary anticyclone in spring; the associated top-down breakdown of the polar vortex; the systematic descent of air into the polar vortex; and the formation of a three-dimensional structure of a tracer filament on a planetary scale. The paper confirms and extends existing paradigms of the southern hemisphere vortex evolution. The quality of the MIPAS observations is seen to be generally good, though the water vapour retrievals are unrealistic above 10 hPa in the high-latitude winter.
Abstract:
Despite the success of studies attempting to integrate remotely sensed data and flood modelling, and the need to provide near-real-time data routinely on a global scale as well as to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing modelling efforts. The objective of this project is therefore to provide a global evaluation and benchmark data set of floodplain water stages, with uncertainties, and their assimilation in a large-scale flood model using space-borne radar imagery. An algorithm is developed for automated retrieval of water stages with uncertainties from a sequence of radar imagery, and the data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. In our case we first identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular 'membership' function. This procedure allows the computation of possible values of water stages at maximum flood extent at many different locations along a river. At a later stage in the project these data are used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
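The triangular 'membership' construction described above can be sketched as follows. The function shape and the min-combination across parameters are an illustrative reading of standard possibility theory, not the authors' exact implementation.

```python
def triangular_possibility(x, lo, peak, hi):
    """Possibility of value x given a physically meaningful range [lo, hi]
    whose most plausible value is `peak` (a triangular membership function)."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)   # rising edge
    return (hi - x) / (hi - peak)       # falling edge

def joint_possibility(values, ranges):
    """Combine parameters conjunctively: the possibility of a configuration
    is the minimum possibility over its individual parameters."""
    return min(triangular_possibility(x, lo, pk, hi)
               for x, (lo, pk, hi) in zip(values, ranges))
```

A water-stage estimate would then inherit a possibility score from the parameter configuration that produced it, yielding possible water-stage values with attached uncertainty at each location.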
Abstract:
We use proper orthogonal decomposition (POD) to study a transient teleconnection event at the onset of the 2001 planet-encircling dust storm on Mars, in terms of empirical orthogonal functions (EOFs). There are several differences between this and previous studies of atmospheric events using EOFs. First, instead of using a single variable such as surface pressure or geopotential height on a given pressure surface, we use a dataset describing the evolution in time of global and fully three-dimensional atmospheric fields such as horizontal velocity and temperature. These fields are produced by assimilating Thermal Emission Spectrometer observations from NASA's Mars Global Surveyor spacecraft into a Mars general circulation model. We use total atmospheric energy (TE) as a physically meaningful quantity which weights the state variables. Second, instead of adopting the EOFs to define teleconnection patterns as planetary-scale correlations that explain a large portion of long time-scale variability, we use EOFs to understand transient processes due to localised heating perturbations that have implications for the atmospheric circulation over distant regions. The localised perturbation is given by anomalous heating due to the enhanced presence of dust around the northern edge of the Hellas Planitia basin on Mars. We show that the localised disturbance is seemingly restricted to a small number (a few tens) of EOFs. These can be classified as low-order, transitional, or high-order EOFs according to the TE amount they explain throughout the event. Despite the global character of the EOFs, they show the capability of accounting for the localised effects of the perturbation via the presence of specific centres of action. We finally discuss possible applications for the study of terrestrial phenomena with similar characteristics.
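As a minimal illustration of the decomposition itself (with random data standing in for the assimilated, energy-weighted state vectors), POD/EOF analysis reduces to a singular value decomposition of the mean-removed snapshot matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_state = 40, 300          # time snapshots x weighted state variables
snapshots = rng.standard_normal((n_time, n_state))

# Remove the time mean so the EOFs describe variability about it
anomalies = snapshots - snapshots.mean(axis=0)

# POD via SVD: rows of Vt are the EOFs (spatial patterns);
# U * s gives the corresponding principal-component time series
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
explained = s**2 / np.sum(s**2)    # fraction of total energy per EOF
```

In the study's setting the state vector would hold the three-dimensional velocity and temperature fields weighted so that squared amplitude corresponds to total atmospheric energy; `explained` is then the TE fraction each EOF accounts for, the quantity used to classify EOFs as low-order, transitional or high-order.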
Abstract:
The Joint UK Land Environment Simulator (JULES) was run offline to investigate the sensitivity of the land surface to changes in surface type over South Africa. Sensitivity tests were made in idealised experiments in which the actual land surface cover was replaced by a single homogeneous surface type; the vegetation surface types used in some of the experiments are static. Experimental runs were evaluated against the control. The model results show, among other things, that changing the surface cover changes other variables such as soil moisture, albedo and net radiation. These changes are also visible in the spin-up process, with different surfaces spinning up over different numbers of cycles. Because JULES is the land surface model of the Unified Model, the results could be made more physically meaningful by coupling it to the Unified Model.
Abstract:
Productivity growth is conventionally measured by indices representing discrete approximations of the Divisia TFP index under the assumption that technological change is Hicks-neutral. When this assumption is violated, these indices are no longer meaningful because they conflate the effects of factor accumulation and technological change. We propose a way of adjusting the conventional TFP index that solves this problem. The method adopts a latent variable approach to the measurement of technical change biases that provides a simple means of correcting product and factor shares in the standard Tornqvist-Theil TFP index. An application to UK agriculture over the period 1953-2000 demonstrates that technical progress is strongly biased. The implications of that bias for productivity measurement are shown to be very large, with the conventional TFP index severely underestimating productivity growth. The result is explained primarily by the fact that technological change has favoured the rapidly accumulating factors against labour, the factor leaving the sector. (C) 2004 Elsevier B.V. All rights reserved.
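The standard Tornqvist-Theil index the paper starts from can be sketched as: log TFP growth equals share-weighted log output growth minus share-weighted log input growth, with each share averaged over the two periods. The function below illustrates that standard index only, not the authors' latent-variable bias correction of the shares.

```python
import math

def tornqvist_tfp_growth(y0, y1, rshare0, rshare1, x0, x1, cshare0, cshare1):
    """Log growth of the Tornqvist-Theil TFP index between periods 0 and 1.

    y*: output quantities, rshare*: revenue shares (summing to 1 each period);
    x*: input quantities,  cshare*: cost shares   (summing to 1 each period).
    """
    out = sum(0.5 * (r0 + r1) * math.log(b / a)
              for a, b, r0, r1 in zip(y0, y1, rshare0, rshare1))
    inp = sum(0.5 * (s0 + s1) * math.log(b / a)
              for a, b, s0, s1 in zip(x0, x1, cshare0, cshare1))
    return out - inp
```

With a single output growing 10% and inputs unchanged, the index grows by ln(1.1), roughly 9.5%. The paper's point is that when technical change is biased, the observed shares are the wrong weights, so this conventional calculation misstates productivity growth.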
Abstract:
This article is a commentary on several research studies conducted on the prospects for aerobic rice production systems that aim at reducing the demand for irrigation water, which in certain major rice-producing areas of the world is becoming increasingly scarce. The research studies considered, as reported in published articles mainly under the aegis of the International Rice Research Institute (IRRI), have a narrow scope in that they test only 3 or 4 rice varieties under different soil moisture treatments obtained with controlled irrigation, with other agronomic factors of production held constant. Consequently, these studies do not permit an assessment of the interactions among agronomic factors that will be of critical significance to the performance of any production system. Varying the production factor of "water" will also seriously affect the levels of the other factors required to optimise the performance of a production system. The major weakness in the studies analysed in this article originates from not taking account of the interactions between experimental and non-experimental factors involved in the comparisons between different production systems. This applies to the experimental field design used for the research studies as well as to the subsequent statistical analyses of the results. The existence of such interactions is a serious complicating element that makes meaningful comparisons between different crop production systems difficult. Consequently, the data and conclusions drawn from such research readily become biased towards proposing standardised solutions for possible introduction to farmers through a linear technology transfer process. Yet the variability and diversity encountered in the real-world farming environment demand more flexible solutions and approaches in the dissemination of knowledge-intensive production practices through "experiential learning" types of processes, such as those employed by farmer field schools.
This article illustrates, based on expertise with the 'system of rice intensification' (SRI), that several cost-effective and environment-friendly agronomic solutions to reduce the demand for irrigation water, other than the asserted need for the introduction of new cultivars, are feasible. Further, these agronomic solutions can offer immediate benefits of reduced water requirements and increased net returns that would be readily accessible to a wide range of rice producers, particularly resource-poor smallholders. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Recent studies of the current state of rural education and training (RET) systems in sub-Saharan Africa have assessed their ability to provide for the learning needs essential for more knowledgeable and productive small-scale rural households. These are most necessary if the endemic causes of rural poverty (poor nutrition, lack of sustainable livelihoods, etc.) are to be overcome. A brief historical background and analysis of the major current constraints to improvement in the sector are discussed. Paramount among those factors leading to its present 'malaise' is the lack of a whole-systems perspective and the absence of any coherent policy framework in most countries. There is evidence of some recent innovations, both in the public sector and through the work of non-governmental organisations (NGOs), civil society organisations (CSOs) and other private bodies. These provide hope of a new sense of direction that could lead towards meaningful 'revitalisation' of the sector. A suggested framework offers 10 key steps which, it is argued, could largely be achieved with modest internal resources and very little external support, provided that the necessary leadership and managerial capacities are in place. (C) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Background: The objective was to evaluate the efficacy and tolerability of donepezil (5 and 10 mg/day) compared with placebo in alleviating manifestations of mild to moderate Alzheimer's disease (AD). Method: A systematic review of individual patient data from Phase II and III double-blind, randomised, placebo-controlled studies of up to 24 weeks, completed by 20 December 1999. The main outcome measures were the ADAS-cog, the CIBIC-plus, and reports of adverse events. Results: A total of 2376 patients from ten trials were randomised to either donepezil 5 mg/day (n = 821), 10 mg/day (n = 662) or placebo (n = 893). Cognitive performance was better in patients receiving donepezil than in patients receiving placebo. At 12 weeks the differences in ADAS-cog scores were 5 mg/day-placebo: -2.1 [95% confidence interval (CI), -2.6 to -1.6; p < 0.001], 10 mg/day-placebo: -2.5 (-3.1 to -2.0; p < 0.001). The corresponding results at 24 weeks were -2.0 (-2.7 to -1.3; p < 0.001) and -3.1 (-3.9 to -2.4; p < 0.001). The difference between the 5 and 10 mg/day doses was significant at 24 weeks (p = 0.005). The odds ratios (OR) of improvement on the CIBIC-plus at 12 weeks were: 5 mg/day-placebo 1.8 (1.5 to 2.1; p < 0.001), 10 mg/day-placebo 1.9 (1.5 to 2.4; p < 0.001). The corresponding values at 24 weeks were 1.9 (1.5 to 2.4; p = 0.001) and 2.1 (1.6 to 2.8; p < 0.001). Donepezil was well tolerated; adverse events were cholinergic in nature and generally of mild severity and brief in duration. Conclusion: Donepezil (5 and 10 mg/day) provides meaningful benefits in alleviating deficits in cognitive and clinician-rated global function in AD patients relative to placebo. Greater improvements in cognition were indicated for the higher dose. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
In this paper, we ask why so much ecological scientific research does not have a greater policy impact in the UK. We argue that there are two potentially important and related reasons for this failing. First, much current ecological science is not being conducted at a scale that is readily meaningful to policy-makers. Second, to make much of this research policy-relevant requires collaborative interdisciplinary research between ecologists and social scientists. However, the challenge of undertaking useful interdisciplinary research only re-emphasises the problems of scale: ecologists and social scientists traditionally frame their research questions at different scales and consider different facets of natural resource management, setting different objectives and using different language. We argue that if applied ecological research is to have greater impact in informing environmental policy, much greater attention needs to be given to the scale of the research efforts as well as to the interaction with social scientists. Such an approach requires an adjustment in existing research and funding infrastructures.
Abstract:
There is a concerted global effort to digitize biodiversity occurrence data from herbarium and museum collections that together offer an unparalleled archive of life on Earth over the past few centuries. The Global Biodiversity Information Facility provides the largest single gateway to these data. Since 2004 it has provided a single point of access to specimen data from databases of biological surveys and collections. Biologists now have rapid access to more than 120 million observations for use in many biological analyses. We investigate the quality and coverage of the data digitally available, from the perspective of a biologist seeking distribution data for spatial analysis on a global scale. We present an example of automatic verification of geographic data, using distributions from the International Legume Database and Information Service to test empirically issues of geographic coverage and accuracy. There are over half a million records covering 31% of all legume species, and 84% of these records pass geographic validation. These data are not yet a global biodiversity resource for all species or all countries. A user will encounter many biases and gaps in these data, which should be understood before the data are used or analyzed. The data are notably deficient in many of the world's biodiversity hotspots. The deficiencies in data coverage can be resolved by an increased application of resources to digitize and publish data throughout these most diverse regions. But in the push to provide ever more data online, we should not forget that consistent data quality is of paramount importance if the data are to be useful in capturing a meaningful picture of life on Earth.
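The paper's automatic verification checks specimen coordinates against ILDIS distribution data. As a deliberately simplified stand-in, the sketch below validates a record against a per-country bounding box; the box values and record fields are hypothetical, and a real check would use the far richer species-level distributions.

```python
# Hypothetical reference extents (min_lon, min_lat, max_lon, max_lat);
# the actual validation uses ILDIS species distributions.
COUNTRY_BBOX = {
    "BR": (-74.0, -34.0, -34.0, 5.3),
    "KE": (33.9, -4.7, 41.9, 5.5),
}

def passes_geographic_validation(record):
    """Crude plausibility test: does the record's coordinate fall inside
    the bounding box of the country it claims to come from?"""
    bbox = COUNTRY_BBOX.get(record["country"])
    if bbox is None:
        return False  # no reference extent to check against
    min_lon, min_lat, max_lon, max_lat = bbox
    return (min_lon <= record["lon"] <= max_lon
            and min_lat <= record["lat"] <= max_lat)
```

Records failing such a check typically have swapped or sign-flipped coordinates, or country labels inconsistent with the georeference, which is exactly the class of error a user should screen for before spatial analysis.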
Abstract:
It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One area of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review will provide some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.
Abstract:
In this review we describe how concepts of shoot apical meristem function have developed over time. The role of the scientist is emphasized, as proposer, receiver and evaluator of ideas about the shoot apical meristem. Models have become increasingly popular over the last 250 years, and we consider their role. They provide valuable grounding for the development of hypotheses, but in addition they have a strong human element and their uptake relies on various degrees of persuasion. The most influential models are probably those that most data support, consolidating them as an insight into reality; but they also work by altering how we see meristems, re-directing us to influence the data we collect and the questions we consider meaningful.