19 results for Territorial approach on development
in CentAUR: Central Archive University of Reading - UK
Abstract:
The ≈3,450-million-year-old Strelley Pool Formation in Western Australia contains a reef-like assembly of laminated sedimentary accretion structures (stromatolites) that have macroscale characteristics suggestive of biological influence. However, direct microscale evidence of biology—namely, organic microbial remains or biosedimentary fabrics—has to date eluded discovery in the extensively recrystallized rocks. Recently identified outcrops with relatively good textural preservation record microscale evidence of primary sedimentary processes, including some that indicate probable microbial mat formation. Furthermore, we find relict fabrics and organic layers that covary with stromatolite morphology, linking morphologic diversity to changes in sedimentation, seafloor mineral precipitation, and inferred microbial mat development. Thus, the most direct and compelling signatures of life in the Strelley Pool Formation are those observed at the microscopic scale. By examining spatiotemporal changes in microscale characteristics it is possible not only to recognize the presence of probable microbial mats during stromatolite development, but also to infer aspects of the biological inputs to stromatolite morphogenesis. The persistence of an inferred biological signal through changing environmental circumstances and stromatolite types indicates that benthic microbial populations adapted to shifting environmental conditions in early oceans.
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. 
Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges. Copyright © 2009 Royal Meteorological Society
Abstract:
In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. We argue that current approaches are regressive and fail to reflect how the ability of sites to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise, rather than replace, the existing practice of development viability appraisal. It is based upon the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is ‘good enough’ to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a more durable, equitable, simple, consistent and cheaper method for policy formation regarding planning obligations.
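The ‘good enough’ simple viability model described above can be sketched as a residual calculation in which the maximum planning obligation is whatever development value remains after build costs and benchmark returns; all figures and benchmark rates below are illustrative assumptions, not the authors' calibrated model.

```python
# Illustrative residual viability sketch (hypothetical figures, not the
# authors' calibrated model): the maximum planning obligation is what is
# left of development value after build costs and benchmark returns.

def max_planning_obligation(development_value, build_cost,
                            developer_return_rate=0.20,
                            landowner_premium=0.15):
    """Residual available for planning obligations, floored at zero.

    developer_return_rate: benchmark profit as a share of development value.
    landowner_premium: benchmark land value as a share of development value.
    Both rates are assumptions for illustration only.
    """
    developer_return = developer_return_rate * development_value
    landowner_return = landowner_premium * development_value
    residual = development_value - build_cost - developer_return - landowner_return
    return max(residual, 0.0)

# Two schemes with the same development value and costs yield the same
# obligation regardless of location: the consistency the paper argues for.
print(max_planning_obligation(10_000_000, 5_500_000))  # prints 1000000.0
```

A scheme whose costs exhaust its development value simply yields a zero obligation rather than a negative one, which is the sense in which a single simple model can cap obligations consistently across sites.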
Abstract:
There has been an increased emphasis upon the application of science for humanitarian and development planning, decision-making and practice, particularly in the context of understanding, assessing and anticipating risk (e.g. HERR, 2011). However, there remains very little guidance for practitioners on how to integrate sciences they may have had little contact with in the past (e.g. climate science). This has led to confusion as to which ‘science’ might be of use and how it would be best utilised. Furthermore, since this integration has stemmed from a need to be more predictive, agencies are struggling with the problems associated with uncertainty and probability. Whilst a range of expertise is required to build resilience, these guidelines focus solely upon the relevant data, information, knowledge, methods, principles and perspectives that scientists can provide, which typically lie outside of current humanitarian and development approaches. Using checklists, real-life case studies and scenarios, the full guidelines take practitioners through a five-step approach to finding, understanding and applying science. This document provides a short summary of the five steps and some key lessons for integrating science.
Abstract:
Movement disorders (MD) include a group of neurological disorders that involve neuromotor systems. MD can result in several abnormalities ranging from an inability to move to severe, constant and excessive movements. Strokes are a leading cause of disability, largely affecting older people worldwide. Traditional treatments rely on the use of physiotherapy that is partially based on theories and heavily reliant on the therapist's training and past experience. The lack of evidence to prove that one treatment is more effective than another makes the rehabilitation of stroke patients a difficult task. Upper limb (UL) motor re-learning and recovery levels tend to improve with intensive physiotherapy delivery. The need for conclusive evidence supporting one method over another, and the need to stimulate the stroke patient, clearly suggest that traditional methods lack high motivational content, as well as objective, standardised analytical methods for evaluating a patient's performance and assessing therapy effectiveness. Despite all the advances in machine-mediated therapies, there is still a need to improve therapy tools. This chapter describes a new approach to robot-assisted neuro-rehabilitation for upper limb rehabilitation. Gentle/S introduces a new approach to the integration of appropriate haptic technologies with high-quality virtual environments, so as to deliver challenging and meaningful therapies to people with upper limb impairment as a consequence of a stroke. The described approach can enhance traditional therapy tools, provide therapy "on demand" and can present accurate, objective measurements of a patient's progression. Our recent studies suggest the use of tele-presence and VR-based systems can potentially motivate patients to exercise for longer periods of time. Two identical prototypes have undergone extended clinical trials in the UK and Ireland with a cohort of 30 stroke subjects.
From the lessons learnt with the Gentle/S approach, it is clear that high-quality therapy devices of this nature have a role in the future delivery of stroke rehabilitation, and that machine-mediated therapies should be available to the patient and his/her clinical team from initial hospital admission through to long-term placement in the patient's home following hospital discharge.
Abstract:
Descriptions of graphic language are relatively rare compared to descriptions of spoken language. This paper presents an analytical approach to studying the visual attributes and conventions in children’s reading and information books. The approach comprises development of a checklist to record ‘features’ of visual organization, such as those relevant to typography and layout, illustration and the material qualities of the books, and consideration of the contextual factors that influence the ways that features have been organized or treated. The contextual factors particularly relevant to children’s reading include educational policy, legibility and vision research and typeface development and availability. The approach to analysis and description is illustrated with examples of children’s reading and information books from the Typographic Design for Children database, which also demonstrates an application of the checklist approach.
Abstract:
In accord with the general program of researching factors relating to ultimate attainment and maturational constraints in adult language acquisition, this commentary highlights the importance of input differences in amount, type, and setting between naturalistic and classroom learners of an L2. It is suggested that these variables are often confounded with age factors. Herein, we wish to call attention to the possible deterministic role that the differences in the grammatical quality of classroom input have on development and on competence outcomes. Framing what we see as greater formal complexity of the learning task for classroom learners, we suggest that one might benefit from focusing less on difference and more on how classroom L2 learners, at least some of them, come to acquire all that they do despite crucial qualitative differences in their input.
Abstract:
We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K per decade, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability, of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes the development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include the generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.
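One way to picture the 0.05 K per decade stability target is to fit a linear trend to the difference between a candidate SST record and an independent reference; the sketch below uses synthetic data and is not part of the (A)RC processing itself.

```python
# Sketch of a stability check: fit a linear trend to the difference between
# a candidate SST record and an independent reference. Synthetic data only;
# the 0.002 K/yr drift and 0.01 K noise are invented for illustration.
import numpy as np

def trend_per_decade(years, sst_diff_k):
    """Linear trend of an SST difference series, in K per decade."""
    slope_per_year = np.polyfit(years, sst_diff_k, 1)[0]
    return slope_per_year * 10.0

rng = np.random.default_rng(0)
years = np.arange(1991, 2008, dtype=float)        # the 1991-2007 record
diff = 0.002 * (years - years[0]) + rng.normal(0.0, 0.01, years.size)
print(abs(trend_per_decade(years, diff)) < 0.05)  # within the target
```

In practice such a check would be repeated against several independent references (drifting buoys, other satellite analyses), since the target is independence from, not agreement with, any single existing SST analysis.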
Abstract:
Within the literature, many authors have argued that the rapid growth of the field of Information and Communication Technologies for Development (ICT4D) has resulted in an emphasis on applications rather than on theory. However, it is not theory itself, but rather the integration of theory and practice, that is often lacking. To address this gap, the authors begin by exploring some of the popular theoretical approaches to ICT4D with a view to identifying those theories relevant to shared impacts: development, delivery and communication. To unify practice and theory, we offer a framework to directly assess the impact of ICT4D on development.
Abstract:
Crop production is inherently sensitive to variability in climate. Temperature is a major determinant of the rate of plant development and, under climate change, warmer temperatures that shorten development stages of determinate crops will most probably reduce the yield of a given variety. Earlier crop flowering and maturity have been observed and documented in recent decades, and these are often associated with warmer (spring) temperatures. However, farm management practices have also changed and the attribution of observed changes in phenology to climate change per se is difficult. Increases in atmospheric [CO2] often advance the time of flowering by a few days, but measurements in FACE (free air CO2 enrichment) field-based experiments suggest that elevated [CO2] has little or no effect on the rate of development other than small advances in development associated with a warmer canopy temperature. The rate of development (the inverse of the duration from sowing to flowering) is largely determined by responses to temperature and photoperiod, and the effects of temperature and of photoperiod at optimum and suboptimum temperatures can be quantified and predicted. However, responses to temperature, and more particularly photoperiod, at supraoptimal temperatures are not well understood. Analysis of a comprehensive data set of time to tassel initiation in maize (Zea mays) with a wide range of photoperiods above and below the optimum suggests that photoperiod modulates the negative effects of temperature above the optimum. A simulation analysis of the effects of prescribed increases in temperature (0–6 °C in 1 °C steps) and temperature variability (0% and +50%) on days to tassel initiation showed that tassel initiation occurs later, and variability was increased, as the temperature exceeds the optimum in models both with and without photoperiod sensitivity.
However, the inclusion of photoperiod sensitivity above the optimum temperature resulted in a higher apparent optimum temperature and less variability in the time of tassel initiation. Given the importance of changes in plant development for crop yield under climate change, the effects of photoperiod and temperature on development rates above the optimum temperature clearly merit further research, and some of the knowledge gaps are identified herein.
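The rate-of-development modelling described above can be sketched as a simple response function with linear sub- and supra-optimal temperature responses and a photoperiod delay beyond a critical daylength; the cardinal temperatures, critical photoperiod and sensitivity coefficient below are illustrative assumptions, not fitted maize parameters.

```python
# Minimal sketch of a rate-of-development model of the kind discussed above
# (all cardinal temperatures and coefficients are illustrative assumptions,
# not fitted maize parameters).

def development_rate(temp_c, photoperiod_h,
                     t_base=8.0, t_opt=31.0, t_max=42.0,
                     p_crit=12.5, p_sens=0.03):
    """Daily development rate (1 / days to tassel initiation).

    Sub-optimal: linear increase from t_base to t_opt.
    Supra-optimal: linear decline toward t_max, with photoperiods above
    p_crit slowing development further (a short-day response).
    """
    if temp_c <= t_base or temp_c >= t_max:
        return 0.0
    if temp_c <= t_opt:
        rate = (temp_c - t_base) / (t_opt - t_base)
    else:
        rate = (t_max - temp_c) / (t_max - t_opt)
    # Photoperiod delay applies only beyond the critical photoperiod.
    delay = p_sens * max(photoperiod_h - p_crit, 0.0)
    return max(rate * (1.0 - delay), 0.0) / 25.0  # scale: ~25 d at optimum

duration_days = 1.0 / development_rate(31.0, 12.5)
print(round(duration_days))  # prints 25: fastest development at the optimum
```

Because the supra-optimal branch interacts with photoperiod, the same warming step delays development more under long days than short ones, which is the kind of modulation the maize analysis above points to.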
Abstract:
One of the key hindrances to the development of solid catalysts containing cobalt species for the partial oxidation of organic molecules under mild conditions in the conventional liquid phase is severe metal leaching. The leached soluble Co species, with a higher degree of freedom, always out-performs solid-supported Co species in oxidation catalysis. However, the homogeneous Co species concomitantly introduces separation problems. We have recently reported, for the first time, a new oxidation catalyst system for the oxidation of organic molecules in supercritical CO₂ using the principle of micellar catalysis. [CF₃(CF₂)₈COO]₂Co·xH₂O (the fluorinated anionic moiety forms aqueous reverse micelles carrying water-soluble Co²⁺ cations in scCO₂) was previously shown to be extremely active for the oxidation of toluene in the presence of sodium bromide in a water–CO₂ mixture, giving 98% conversion and 99% selectivity to benzoic acid at 120 °C. In this study, we show the effects of varying the type of surfactant counterion and the length of the surfactant chains on catalysis. It is found that [CF₃(CF₂)₈COO]₂Mg·yH₂O/Co(II) acetate is as effective as [CF₃(CF₂)₈COO]₂Co·xH₂O, and that the fluorinated chain length used has a subtle effect on the measured catalytic rate. It is also demonstrated that this new type of micellar catalyst in scCO₂ can be easily separated via CO₂ depressurisation and reused without noticeable deactivation. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Several pixel-based people counting methods have been developed over the years. Among these, the product of scale-weighted pixel sums and a linear correlation coefficient is a popular people counting approach. However, most approaches have paid little attention to resolving the true background and instead take all foreground pixels into account. With large crowds moving at varying speeds, and with the presence of other moving objects such as vehicles, this approach is prone to problems. In this paper we present a method which concentrates on determining the true foreground, i.e. human-image pixels only. To do this we have proposed, implemented and comparatively evaluated a human detection layer to make people counting more robust in the presence of noise and a lack of empty background sequences. We show the effect of combining human detection with a pixel-map based algorithm to i) count only human-classified pixels and ii) prevent foreground pixels belonging to humans from being absorbed into the background model. We evaluate the performance of this approach on the PETS 2009 dataset using various configurations of the proposed methods. Our evaluation demonstrates that the basic benchmark method we implemented can achieve an accuracy of up to 87% on sequence 'S1.L1 13-57 View 001', and our proposed approach can achieve up to 82% on sequence 'S1.L3 14-33 View 001', where the crowd stops and the benchmark accuracy falls to 64%.
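The scale-weighted pixel-sum scheme with a human-detection mask can be sketched roughly as follows; the scene geometry, perspective weights and linear coefficients are invented for illustration, not the values fitted on PETS 2009.

```python
# Sketch of scale-weighted pixel-sum counting with a human-detection mask
# restricting the sum to human-classified foreground (weights and linear
# coefficients below are illustrative, not PETS-fitted values).
import numpy as np

def count_people(foreground, human_mask, scale_map, a=0.004, b=0.0):
    """Estimate crowd size from a binary foreground map.

    foreground: binary map of foreground pixels from background subtraction.
    human_mask: binary map from a human detector; suppresses non-human
                foreground such as vehicles.
    scale_map:  per-pixel weight compensating for perspective (distant
                pixels count for more).
    a, b:       linear correlation coefficients, assumed fitted offline.
    """
    true_foreground = foreground & human_mask            # human pixels only
    weighted_sum = float((true_foreground * scale_map).sum())
    return a * weighted_sum + b

fg = np.zeros((240, 320), dtype=bool)
fg[100:140, 50:250] = True          # a band of foreground pixels
vehicles = np.zeros_like(fg)
vehicles[100:140, 200:250] = True   # vehicle region wrongly in foreground
human = fg & ~vehicles              # human detector rejects the vehicle
scale = np.linspace(2.0, 0.5, 240)[:, None] * np.ones((240, 320))
print(count_people(fg, human, scale))
```

Masking before summation is what keeps the vehicle's pixels out of the count here; the paper's second use of the mask, preventing stationary humans from being absorbed into the background model, would act one step earlier, inside the background-subtraction update.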
Abstract:
This study adopts the resource-based view (RBV) of the firm in order to identify critical advantage-generating resources and capabilities with strong positive export strategy and performance implications. The proposed export performance model is tested using a structural equation modeling approach on a sample of 356 British exporters. We examine the individual as well as the concurrent (simultaneous) direct and indirect effects of five resource bundles on export performance. We find that four resources/capabilities – managerial, knowledge, planning, and technology – have a significant positive direct effect on export performance, while relational and physical resources exhibit no unique positive effect. We also find that the firm’s export strategy mediates the resource–performance nexus in the case of managerial and knowledge-based resources. The theoretical and methodological grounding of this study contributes to the advancement of export-related research by providing a better specification of the nature of the effects – direct or indirect – of particular resource factors on export performance.
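The mediation finding above (export strategy carrying part of a resource's effect on performance) follows the standard path-analysis decomposition, in which the indirect effect is the product of the two mediated path coefficients; a minimal sketch, with invented coefficients rather than the study's estimates:

```python
# Illustrative decomposition of direct and mediated (indirect) effects of a
# resource bundle on export performance, as in a simple mediation model
# (coefficients are invented for illustration, not the study's estimates).

def mediation_effects(resource_to_strategy, strategy_to_performance,
                      direct_effect):
    """Indirect effect is the product of the two mediated path coefficients;
    total effect is direct plus indirect (standard path-analysis rule)."""
    indirect = resource_to_strategy * strategy_to_performance
    return {"direct": direct_effect,
            "indirect": indirect,
            "total": direct_effect + indirect}

effects = mediation_effects(0.4, 0.5, 0.3)
print(effects)  # {'direct': 0.3, 'indirect': 0.2, 'total': 0.5}
```

A resource with a zero direct path but non-zero mediated paths would correspond to full mediation by export strategy, whereas the managerial and knowledge resources in the study show both components.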