85 results for top-down approach
Abstract:
This paper explores the role of local government in urban regeneration in England. The first part describes local-central government relations during recent decades. It concludes that 'actually occurring' regeneration fuses top-down and bottom-up priorities and preferences, as well as path dependencies created by past decisions and local relations. The second part illustrates this contention by examining the regeneration of inner-city Salford over a 25-year period. It describes Salford City Council's approach in achieving the redevelopment of the former Salford Docks and how this created the confidence for the council to embark on further regeneration projects. Yet the top-down decision-making model has failed to satisfy local expectations, creating apathy which threatens the Labour government's desire for active citizens in regeneration projects.
Abstract:
The degree to which perceived controllability alters the way a stressor is experienced varies greatly among individuals. We used functional magnetic resonance imaging to examine the neural activation associated with individual differences in the impact of perceived controllability on self-reported pain perception. Subjects with greater activation in response to uncontrollable (UC) rather than controllable (C) pain in the pregenual anterior cingulate cortex (pACC), periaqueductal gray (PAG), and posterior insula/SII reported higher levels of pain during the UC versus C conditions. Conversely, subjects with greater activation in the ventral lateral prefrontal cortex (VLPFC) in anticipation of pain in the UC versus C conditions reported less pain in response to UC versus C pain. Activation in the VLPFC was significantly correlated with the acceptance and denial subscales of the COPE inventory [Carver, C. S., Scheier, M. F., & Weintraub, J. K. Assessing coping strategies: A theoretically based approach. Journal of Personality and Social Psychology, 56, 267–283, 1989], supporting the interpretation that this anticipatory activation was associated with an attempt to cope with the emotional impact of uncontrollable pain. A regression model containing the two prefrontal clusters (VLPFC and pACC) predicted 64% of the variance in pain rating difference, with activation in the two additional regions (PAG and insula/SII) predicting almost no additional variance. In addition to supporting the conclusion that the impact of perceived controllability on pain perception varies highly between individuals, these findings suggest that these effects are primarily top-down, driven by processes in regions of the prefrontal cortex previously associated with cognitive modulation of pain and emotion regulation.
Abstract:
Data from the MIPAS instrument on Envisat, supplemented by meteorological analyses from ECMWF and the Met Office, are used to study the meteorological and trace-gas evolution of the stratosphere in the southern hemisphere during winter and spring 2003. A pole-centred approach is used to interpret the data in the physically meaningful context of the evolving stratospheric polar vortex. The following salient dynamical and transport features are documented and analysed: the merger of anticyclones in the stratosphere; the development of an intense, quasi-stationary anticyclone in spring; the associated top-down breakdown of the polar vortex; the systematic descent of air into the polar vortex; and the formation of a three-dimensional structure of a tracer filament on a planetary scale. The paper confirms and extends existing paradigms of the southern hemisphere vortex evolution. The quality of the MIPAS observations is seen to be generally good, though the water vapour retrievals are unrealistic above 10 hPa in the high-latitude winter.
Abstract:
Given the growing impact of human activities on the sea, managers are increasingly turning to marine protected areas (MPAs) to protect marine habitats and species. Many MPAs have been unsuccessful, however, and lack of income has been identified as a primary reason for failure. In this study, data from a global survey of 79 MPAs in 36 countries were analysed and attempts made to construct predictive models to determine the income requirements of any given MPA. Statistical tests were used to uncover possible patterns and relationships in the data, with two basic approaches. In the first of these, an attempt was made to build an explanatory "bottom-up" model of the cost structures that might be required to pursue various management activities. This proved difficult in practice owing to the very broad range of applicable data, spanning many orders of magnitude. In the second approach, a "top-down" regression model was constructed using logarithms of the base data, in order to address the breadth of the data ranges. This approach suggested that MPA size and visitor numbers together explained 46% of the minimum income requirements (P < 0.001), with area being the slightly more influential factor. The significance of area to income requirements was of little surprise, given its profile in the literature. However, the relationship between visitors and income requirements might go some way to explaining why northern hemisphere MPAs with apparently high incomes still claim to be under-funded. The relationship between running costs and visitor numbers has important implications not only in determining a realistic level of funding for MPAs, but also in assessing from where funding might be obtained. Since a substantial proportion of the income of many MPAs appears to be utilized for amenity purposes, a case may be made for funds to be provided from the typically better resourced government social and educational budgets as well as environmental budgets. Similarly visitor fees, already an important source of funding for some MPAs, might have a broader role to play in how MPAs are financed in the future.
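The "top-down" log-log regression described above lends itself to a compact illustration. The following Python sketch is not the authors' code: the data are synthetic and the variable names and coefficients are hypothetical. It simply shows how taking logarithms tames data spanning several orders of magnitude before regressing income requirements on area and visitor numbers.

```python
# Illustrative sketch (not the study's code) of a "top-down" log-log
# regression: predict an MPA's minimum income requirement from its size
# and visitor numbers. All data and coefficients are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 79  # the study surveyed 79 MPAs

# Hypothetical raw data spanning several orders of magnitude.
area_km2 = 10 ** rng.uniform(0, 4, n)   # 1 to 10,000 km^2
visitors = 10 ** rng.uniform(2, 6, n)   # 100 to 1,000,000 per year
income = 5e3 * area_km2**0.4 * visitors**0.3 * 10 ** rng.normal(0, 0.3, n)

# Taking logarithms compresses the wide data ranges, as in the paper.
X = np.column_stack([np.ones(n), np.log10(area_km2), np.log10(visitors)])
y = np.log10(income)

coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"intercept={coef[0]:.2f}, b_area={coef[1]:.2f}, "
      f"b_visitors={coef[2]:.2f}, R^2={r2:.2f}")
```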
Abstract:
Development research has responded to a number of charges over the past few decades. For example, when traditional research was accused of being 'top-down', the response was participatory research, linking the 'receptors' to the generators of research. As participatory processes were recognised as producing limited outcomes, the demand-led agenda was born. In response to the alleged failure of research to deliver its products, the 'joined-up' model, which links research with the private sector, has become popular. However, using examples from animal-health research, this article demonstrates that all the aforementioned approaches are seriously limited in their attempts to generate outputs to address the multi-faceted problems facing the poor. The article outlines a new approach to research: the Mosaic Model. By combining different knowledge forms, and focusing on existing gaps, the model aims to bridge basic and applied findings to enhance the efficiency and value of research, past, present, and future.
Abstract:
This paper investigates whether and to what extent a wide range of actors in the UK are adapting to climate change, and whether this is evidence of a social transition. We document evidence of over 300 examples of early adopters of adaptation practice to climate change in the UK. These examples span a range of activities from small adjustments (or coping) to building adaptive capacity, implementing actions and creating deeper systemic change in public and private organisations in a range of sectors. We find that adaptation in the UK has been dominated by government initiatives and has principally occurred in the form of research into climate change impacts. These actions within government stimulate a further set of actions at other scales in public agencies, regulatory agencies and regional government (or in the devolved administrations), though with little real evidence of climate change adaptation initiatives trickling down to local government level. The water supply and flood defence sectors, requiring significant investment in large scale infrastructure such as reservoirs and coastal defences, have invested more heavily in identifying potential impacts and adaptations. Economic sectors that are not dependent on large scale infrastructure appear to be investing far less effort and resources in preparing for climate change. We conclude that while the government-driven top-down targeted adaptation approach has generated anticipatory action at low cost, it may also have created enough niche activities to allow for diffusion of new adaptation practices in response to real or perceived climate change. These results have significant implications for how climate policy can be developed to support autonomous adaptors in the UK and other countries.
Abstract:
Several methods for assessing the sustainability of agricultural systems have been developed. These methods do not fully: (i) take into account the multi-functionality of agriculture; (ii) include multidimensionality; (iii) utilize and implement the assessment knowledge; and (iv) identify conflicting goals and trade-offs. This paper reviews seven recently developed multidisciplinary indicator-based assessment methods with respect to their contribution to these shortcomings. All approaches include (1) normative aspects such as goal setting, (2) systemic aspects such as a specification of scale of analysis, and (3) a reproducible structure of the approach. The approaches can be categorized into three typologies. The top-down farm assessments focus on field or farm assessment. They have a clear procedure for measuring the indicators and assessing the sustainability of the system, which allows for benchmarking across farms. The degree of participation is low, potentially affecting the implementation of the results negatively. The top-down regional assessments assess the on-farm and the regional effects. They include some participation to increase acceptance of the results. However, they miss the analysis of potential trade-offs. The bottom-up, integrated participatory or transdisciplinary approaches focus on a regional scale. Stakeholders are included throughout the whole process, assuring the acceptance of the results and increasing the probability of implementation of developed measures. As they include the interaction between the indicators in their system representation, they allow for performing a trade-off analysis. The bottom-up, integrated participatory or transdisciplinary approaches seem to better overcome the four shortcomings mentioned above.
Abstract:
Providing high quality and timely feedback to students is often a challenge for many staff in higher education as it can be both time-consuming and frustratingly repetitive. From the student perspective, feedback may sometimes be considered unhelpful, confusing and inconsistent and may not always be provided within a timeframe that is considered to be 'useful'. The ASSET project, based at the University of Reading, addresses many of these inherent challenges by encouraging the provision of feedback that supports learning, i.e. feedback that contains elements of 'feed-forward', is of a high quality and is delivered in a timely manner. In particular, the project exploits the pedagogic benefits of video/audio media within a Web 2.0 context to provide a new, interactive resource, 'ASSET', to enhance the feedback experience for both students and staff. A preliminary analysis of both our quantitative and qualitative pedagogic data demonstrates that the ASSET project has instigated change in the ways in which both staff and students think about, deliver, and engage with feedback. For example, data from our online questionnaires and focus groups with staff and students indicate a positive response to the use of video as a medium for delivering feedback to students. In particular, the academic staff engaged in piloting the ASSET resource indicated that i) using video has made them think more, and in some cases differently, about the ways in which they deliver feedback to students and ii) they now see video as an effective means of making feedback more useful and engaging for students. Moreover, the majority of academic staff involved in the project have said they will continue to use video feedback. From the student perspective, 60% of those students whose lecturers used ASSET to provide video feedback said that "receiving video feedback encouraged me to take more notice of the feedback compared with normal methods" and 80% would like their lecturer to continue to use video as a method for providing feedback. An important aim of the project was for it to complement existing University-wide initiatives on feedback and for ASSET to become a 'model' resource for staff and students wishing to explore video as a medium for feedback provision. An institutional approach was therefore adopted and key members of Senior Management, academics, T&L support staff, IT support and Student Representatives were embedded within the project from the start. As with all initiatives of this kind, a major issue is the future sustainability of the ASSET resource, and to have had both 'top-down' and 'bottom-up' support for the project has been extremely beneficial. In association with the project team, the University is currently exploring the creation of an open-source, two-tiered video supply solution and a 'framework' (that other HEIs can adopt and/or adapt) to support staff in using video for feedback provision. In this way students and staff will have new opportunities to explore video and to exploit the benefits of this medium for supporting learning.
Abstract:
Recently, major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high performance computing systems, research on the corresponding algorithms must be on the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism: The first contribution addresses the classic problem of distributed association rule mining, focusing on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed memory systems. This approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection. It describes a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
Abstract:
Practical applications of portfolio optimisation tend to proceed on a “top down” basis where funds are allocated first at asset class level (between, say, bonds, cash, equities and real estate) and then, progressively, at sub-class level (within property to sectors, office, retail, industrial for example). While there are organisational benefits from such an approach, it can potentially lead to sub-optimal allocations when compared to a “global” or “side-by-side” optimisation. This will occur where there are correlations between sub-classes across the asset divide that are masked in aggregation – between, for instance, City offices and the performance of financial services stocks. This paper explores such sub-class linkages using UK monthly stock and property data. Exploratory analysis using clustering procedures and factor analysis suggests that property performance and equity performance are distinctive: there is little persuasive evidence of contemporaneous or lagged sub-class linkages. Formal tests of the equivalence of optimised portfolios using top-down and global approaches failed to demonstrate significant differences, whether or not allocations were constrained. While the results may be a function of measurement of market returns, it is those returns that are used to assess fund performance. Accordingly, the treatment of real estate as a distinct asset class with diversification potential seems justified.
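To make the contrast concrete, the following minimal Python sketch (entirely hypothetical returns data, not the paper's tests) compares a two-stage "top-down" minimum-variance allocation, which splits funds between asset classes before optimising within each class, with a single "global" optimisation over all sub-classes at once. The global solution can exploit cross-class correlations that the top-down route masks.

```python
# Sketch with hypothetical data: "top-down" (two-stage) vs "global"
# minimum-variance allocation over equity and property sub-classes.
import numpy as np

def min_var_weights(cov):
    """Unconstrained minimum-variance weights: w proportional to inv(Cov) @ 1."""
    w = np.linalg.solve(cov, np.ones(len(cov)))
    return w / w.sum()

rng = np.random.default_rng(1)
# Hypothetical monthly returns: 2 equity sectors + 2 property sectors.
returns = rng.normal(0.005, 0.04, size=(120, 4))
returns[:, 2:] *= 0.5  # property sub-classes assumed less volatile here

cov = np.cov(returns, rowvar=False)

# Global ("side-by-side") optimisation sees all sub-class correlations.
w_global = min_var_weights(cov)

# Top-down: first allocate between equities and property using the
# covariance of the two class-level (equally weighted) portfolios...
class_rets = np.column_stack([returns[:, :2].mean(1), returns[:, 2:].mean(1)])
w_class = min_var_weights(np.cov(class_rets, rowvar=False))
# ...then optimise within each class separately, masking cross-class links.
w_eq = min_var_weights(cov[:2, :2]) * w_class[0]
w_prop = min_var_weights(cov[2:, 2:]) * w_class[1]
w_topdown = np.concatenate([w_eq, w_prop])

for name, w in [("global", w_global), ("top-down", w_topdown)]:
    print(name, np.round(w, 3), "variance:", round(w @ cov @ w, 6))
```

The global portfolio's variance is never higher than the top-down one's; how large the gap is depends on the cross-class correlations, which is exactly the question the paper tests empirically.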
Abstract:
Dorsolateral prefrontal cortex (DLPFC) is recruited during visual working memory (WM) when relevant information must be maintained in the presence of distracting information. The mechanism by which DLPFC might ensure successful maintenance of the contents of WM is, however, unclear; it might enhance neural maintenance of memory targets or suppress processing of distracters. To adjudicate between these possibilities, we applied time-locked transcranial magnetic stimulation (TMS) during functional MRI, an approach that permits causal assessment of a stimulated brain region's influence on connected brain regions, and evaluated how this influence may change under different task conditions. Participants performed a visual WM task requiring retention of visual stimuli (faces or houses) across a delay during which visual distracters could be present or absent. When distracters were present, they were always from the opposite stimulus category, so that targets and distracters were represented in distinct posterior cortical areas. We then measured whether DLPFC-TMS, administered in the delay at the time point when distracters could appear, would modulate posterior regions representing memory targets or distracters. We found that DLPFC-TMS influenced posterior areas only when distracters were present and, critically, that this influence consisted of increased activity in regions representing the current memory targets. DLPFC-TMS did not affect regions representing current distracters. These results provide a new line of causal evidence for a top-down DLPFC-based control mechanism that promotes successful maintenance of relevant information in WM in the presence of distraction.
Abstract:
Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
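The thinning step can be pictured as a divisive, top-down clustering: recursively split the point set along its widest coordinate until every cluster is spatially compact, then keep one representative per cluster. Below is an illustrative Python sketch; the 100 m extent threshold and the synthetic points are hypothetical, and this is not the paper's exact algorithm.

```python
# Illustrative top-down (divisive) thinning of spatially correlated
# points: split along the widest coordinate until each cluster is
# below a size threshold, then keep one representative per cluster.
import numpy as np

def thin_points(points, max_extent=100.0):
    """Return one representative per compact cluster of nearby points."""
    pts = np.asarray(points, dtype=float)
    extent = pts.max(axis=0) - pts.min(axis=0)
    if extent.max() <= max_extent or len(pts) == 1:
        # Cluster is compact: keep the point nearest its centroid.
        centroid = pts.mean(axis=0)
        return [pts[np.argmin(np.linalg.norm(pts - centroid, axis=1))]]
    # Split top-down along the dimension with the largest spread.
    dim = int(np.argmax(extent))
    median = np.median(pts[:, dim])
    left = pts[pts[:, dim] <= median]
    right = pts[pts[:, dim] > median]
    if len(left) == 0 or len(right) == 0:  # degenerate split
        return [pts.mean(axis=0)]
    return thin_points(left, max_extent) + thin_points(right, max_extent)

rng = np.random.default_rng(2)
waterline = rng.uniform(0, 1000, size=(500, 2))  # hypothetical x, y (metres)
kept = thin_points(waterline, max_extent=100.0)
print(f"{len(waterline)} candidate points thinned to {len(kept)}")
```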
Abstract:
Methods for assessing the sustainability of agricultural systems often do not fully (i) take into account the multifunctionality of agriculture, (ii) include multidimensionality, (iii) utilize and implement the assessment knowledge and (iv) identify conflicting goals and trade-offs. This chapter reviews seven recently developed multidisciplinary indicator-based assessment methods with respect to their contribution to these shortcomings. All approaches include (1) normative aspects such as goal setting, (2) systemic aspects such as a specification of scale of analysis and (3) a reproducible structure of the approach. The approaches can be categorized into three typologies: first, top-down farm assessments, which focus on field or farm assessment; second, top-down regional assessments, which assess the on-farm and the regional effects; and third, bottom-up, integrated participatory or transdisciplinary approaches, which focus on a regional scale. Our analysis shows that the bottom-up, integrated participatory or transdisciplinary approaches seem to better overcome the four shortcomings mentioned above.
Abstract:
The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
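As a flavour of what such a model involves, here is a deliberately minimal agent-based sketch: a Granovetter-style threshold model with entirely hypothetical parameters, not the paper's prototype. Each household adopts a green technology once the perceived benefit, which grows with overall adoption and a policy subsidy, exceeds its individual threshold.

```python
# Minimal agent-based sketch (hypothetical parameters) of
# green-technology diffusion among households.
import random

random.seed(3)

N_AGENTS, N_YEARS = 1000, 20
SUBSIDY = 0.10  # policy lever: fraction of cost covered (hypothetical)
thresholds = [random.uniform(0.0, 1.0) for _ in range(N_AGENTS)]
adopted = [False] * N_AGENTS

for year in range(N_YEARS):
    adoption_rate = sum(adopted) / N_AGENTS
    for i in range(N_AGENTS):
        if adopted[i]:
            continue
        # Perceived benefit grows with peer adoption and the subsidy.
        benefit = 0.05 + 0.8 * adoption_rate + SUBSIDY
        if benefit >= thresholds[i]:
            adopted[i] = True
    print(f"year {year + 1:2d}: {sum(adopted) / N_AGENTS:.1%} adopted")
```

Varying SUBSIDY and re-running illustrates how such a model can be used to compare policy measures, which is the kind of analysis the abstract envisages, though a realistic model would need the large amounts of input data the authors mention.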
Abstract:
In a world where data is captured on a large scale, the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules: one is the divide and conquer approach, also known as the top-down induction of decision trees; the other is called the separate and conquer approach. A considerable amount of work has been done on scaling up the divide and conquer approach. However, very little work has been conducted on scaling up the separate and conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate and conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms to harvest additional computer resources in a network of computers in order to make the induction of classification rules scale better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
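For readers unfamiliar with the Prism family, the underlying separate-and-conquer strategy can be sketched compactly: induce one rule at a time by greedily adding the attribute-value test with the highest precision for the target class, then remove ("separate") the instances the rule covers and repeat ("conquer") on the remainder. The serial Python sketch below uses a hypothetical toy dataset and assumes noise-free data (no guard against contradictory instances); it illustrates the strategy, not the authors' parallel framework.

```python
# Serial sketch of separate-and-conquer rule induction in the style of
# the Prism family (toy data; assumes a noise-free dataset).
def induce_rules_for_class(rows, target, label_idx=-1):
    rows = list(rows)
    rules = []
    while any(r[label_idx] == target for r in rows):
        covered, conditions = rows, []
        # Greedily add the attribute-value test with the highest
        # precision for the target class until the rule is pure.
        while any(r[label_idx] != target for r in covered):
            best = max(
                ((a, v) for a in range(len(covered[0]) - 1)
                 for v in {r[a] for r in covered}),
                key=lambda av: precision(covered, av, target, label_idx),
            )
            conditions.append(best)
            covered = [r for r in covered if r[best[0]] == best[1]]
        rules.append(conditions)
        # "Separate": drop the instances the new rule covers.
        rows = [r for r in rows if not all(r[a] == v for a, v in conditions)]
    return rules

def precision(rows, av, target, label_idx):
    a, v = av
    match = [r for r in rows if r[a] == v]
    if not match:
        return 0.0
    return sum(r[label_idx] == target for r in match) / len(match)

# Hypothetical weather-style dataset: (outlook, windy, play)
data = [("sunny", "yes", "no"), ("sunny", "no", "yes"),
        ("rainy", "yes", "no"), ("overcast", "no", "yes"),
        ("rainy", "no", "yes"), ("overcast", "yes", "yes")]
print(induce_rules_for_class(data, target="yes"))
```

Because each induced rule is independent of the others once its training instances are removed, the rule-term search over attribute-value candidates is a natural target for the data-parallel distribution the framework described above provides.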