911 results for top-down approach
Abstract:
Dorsolateral prefrontal cortex (DLPFC) is recruited during visual working memory (WM) when relevant information must be maintained in the presence of distracting information. The mechanism by which DLPFC might ensure successful maintenance of the contents of WM is, however, unclear; it might enhance neural maintenance of memory targets or suppress processing of distracters. To adjudicate between these possibilities, we applied time-locked transcranial magnetic stimulation (TMS) during functional MRI, an approach that permits causal assessment of a stimulated brain region's influence on connected brain regions, and evaluated how this influence may change under different task conditions. Participants performed a visual WM task requiring retention of visual stimuli (faces or houses) across a delay during which visual distracters could be present or absent. When distracters were present, they were always from the opposite stimulus category, so that targets and distracters were represented in distinct posterior cortical areas. We then measured whether DLPFC-TMS, administered in the delay at the time point when distracters could appear, would modulate posterior regions representing memory targets or distracters. We found that DLPFC-TMS influenced posterior areas only when distracters were present and, critically, that this influence consisted of increased activity in regions representing the current memory targets. DLPFC-TMS did not affect regions representing current distracters. These results provide a new line of causal evidence for a top-down DLPFC-based control mechanism that promotes successful maintenance of relevant information in WM in the presence of distraction.
Abstract:
Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
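To make the thinning step concrete, the sketch below shows one way a top-down (divisive) clustering could thin spatially autocorrelated waterline points: a cluster of candidate points is recursively split along its widest spatial extent until each cluster is compact, and one representative level is kept per cluster. This is an illustrative reading only, not the method from the paper; the split threshold, the splitting rule and the choice of the median level as representative are all assumptions.

```python
import numpy as np

def thin_waterline_points(points, levels, max_extent=250.0):
    """Recursively split candidate waterline points (x, y) along the axis of
    largest spatial extent until each cluster is smaller than max_extent
    (metres), then keep one representative level per cluster.

    Illustrative only: threshold and representative choice are assumptions.
    """
    points = np.asarray(points, dtype=float)
    levels = np.asarray(levels, dtype=float)

    def split(idx):
        extent = points[idx].max(axis=0) - points[idx].min(axis=0)
        axis = int(np.argmax(extent))           # widest spatial dimension
        if extent[axis] <= max_extent or len(idx) <= 1:
            # Cluster is compact enough: keep its centroid and median level
            centroid = points[idx].mean(axis=0)
            return [(centroid, float(np.median(levels[idx])))]
        cut = np.median(points[idx, axis])      # split at the median coordinate
        left = idx[points[idx, axis] <= cut]
        right = idx[points[idx, axis] > cut]
        if len(left) == 0 or len(right) == 0:   # degenerate split, stop here
            centroid = points[idx].mean(axis=0)
            return [(centroid, float(np.median(levels[idx])))]
        return split(left) + split(right)

    return split(np.arange(len(points)))

# Example: 500 candidate points along a flood edge, thinned to representatives
rng = np.random.default_rng(0)
pts = rng.uniform(0, 2000, size=(500, 2))
lev = 10.0 + 0.001 * pts[:, 0] + rng.normal(0, 0.05, 500)
representatives = thin_waterline_points(pts, lev)
print(len(representatives), "thinned points")
```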
Abstract:
Methods for assessing the sustainability of agricultural systems often do not fully (i) take into account the multifunctionality of agriculture, (ii) include multidimensionality, (iii) utilize and implement the assessment knowledge and (iv) identify conflicting goals and trade-offs. This chapter reviews seven recently developed multidisciplinary indicator-based assessment methods with respect to how they address these shortcomings. All approaches include (1) normative aspects such as goal setting, (2) systemic aspects such as a specification of the scale of analysis and (3) a reproducible structure of the approach. The approaches can be categorized into three typologies: first, top-down farm assessments, which focus on field or farm assessment; second, top-down regional assessments, which assess the on-farm and the regional effects; and third, bottom-up, integrated participatory or transdisciplinary approaches, which focus on a regional scale. Our analysis shows that the bottom-up, integrated participatory or transdisciplinary approaches seem to better overcome the four shortcomings mentioned above.
Abstract:
The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
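To indicate what such an agent-based formulation might look like (this is an illustrative toy, not the prototype described in the paper), the sketch below models households that adopt a green technology once a simple utility combining a subsidy, price sensitivity and peer adoption turns positive; the adoption rule and all parameter values are assumptions.

```python
import random

class Household:
    """A household agent that may adopt a green technology.

    The adoption rule (price sensitivity plus peer influence) and all
    parameter values are illustrative assumptions, not the paper's model.
    """
    def __init__(self, price_sensitivity, social_weight):
        self.price_sensitivity = price_sensitivity
        self.social_weight = social_weight
        self.adopted = False

    def step(self, subsidy, adoption_rate):
        if self.adopted:
            return
        # Utility rises with subsidies and with the share of peers who adopted
        utility = subsidy - self.price_sensitivity + self.social_weight * adoption_rate
        if utility > 0:
            self.adopted = True

def run(n_agents=1000, years=20, subsidy=0.3, seed=1):
    random.seed(seed)
    agents = [Household(random.uniform(0.0, 1.0), random.uniform(0.0, 0.5))
              for _ in range(n_agents)]
    history = []
    for _ in range(years):
        rate = sum(a.adopted for a in agents) / n_agents
        for a in agents:
            a.step(subsidy, rate)
        history.append(sum(a.adopted for a in agents) / n_agents)
    return history

# Compare two policy scenarios by varying the subsidy level
print("low subsidy :", [round(r, 2) for r in run(subsidy=0.2)][-5:])
print("high subsidy:", [round(r, 2) for r in run(subsidy=0.4)][-5:])
```

Even a toy of this kind shows why agent approaches need large amounts of input data: the adoption trajectory is driven entirely by the assumed distributions of household parameters.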
Abstract:
In a world where data is captured on a large scale, the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules: one is the divide and conquer approach, also known as the top-down induction of decision trees; the other is the separate and conquer approach. A considerable amount of work has been done on scaling up the divide and conquer approach. However, very little work has been conducted on scaling up the separate and conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate and conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms to harness additional computer resources in a network of computers in order to make the induction of classification rules scale better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
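The separate and conquer strategy that the Prism family follows can be sketched in a few lines: for each target class, a rule is specialised term by term towards the attribute-value pair with the highest target-class probability among the currently covered examples, the examples the finished rule covers are removed, and induction continues on the remainder. The sketch below is a simplified serial reading of that loop, not the parallel framework described in the work; tie-breaking, numeric attributes and pruning are ignored.

```python
def induce_prism_rules(instances, target_class):
    """Simplified separate-and-conquer rule induction in the style of Prism.

    instances: list of (dict_of_attribute_values, class_label) pairs.
    Returns a list of rules, each a dict of attribute -> value.
    Illustrative sketch only.
    """
    remaining = list(instances)
    rules = []
    while any(label == target_class for _, label in remaining):
        covered = list(remaining)
        rule = {}
        # Specialise the rule until it covers only the target class
        while any(label != target_class for _, label in covered) and covered:
            candidates = {(a, v) for attrs, _ in covered for a, v in attrs.items()
                          if a not in rule}
            if not candidates:
                break
            best_term, best_prob = None, -1.0
            for attr, value in candidates:
                subset = [label for attrs, label in covered
                          if attrs.get(attr) == value]
                prob = sum(label == target_class for label in subset) / len(subset)
                if prob > best_prob:
                    best_term, best_prob = (attr, value), prob
            rule[best_term[0]] = best_term[1]
            covered = [(attrs, label) for attrs, label in covered
                       if attrs.get(best_term[0]) == best_term[1]]
        rules.append(rule)
        # "Separate": remove the examples this rule covers, then "conquer" the rest
        remaining = [(attrs, label) for attrs, label in remaining
                     if not all(attrs.get(a) == v for a, v in rule.items())]
    return rules

data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "no-play"),
        ({"outlook": "rain", "windy": "no"}, "play"),
        ({"outlook": "rain", "windy": "yes"}, "no-play")]
print(induce_prism_rules(data, "play"))
```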
Abstract:
Inducing rules from very large datasets is one of the most challenging areas in data mining. Several approaches exist to scaling up classification rule induction to large datasets, namely data reduction and the parallelisation of classification rule induction algorithms. In the area of parallelisation of classification rule induction algorithms, most of the work has concentrated on the Top Down Induction of Decision Trees (TDIDT), also known as the ‘divide and conquer’ approach. However, powerful alternative algorithms exist that induce modular rules. Most of these alternative algorithms follow the ‘separate and conquer’ approach of inducing rules, but very little work has been done to make the ‘separate and conquer’ approach scale better on large training data. This paper examines the potential of the recently developed blackboard-based J-PMCRI methodology for parallelising modular classification rule induction algorithms that follow the ‘separate and conquer’ approach. A concrete implementation of the methodology is evaluated empirically on very large datasets.
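The parallelisation idea behind methodologies of this kind can be sketched at a high level: candidate rule terms are evaluated in parallel over partitions of the attribute space, each worker returns its locally best term, and a coordinating process selects the globally best one. The sketch below illustrates only that local-best/global-best pattern; the partitioning scheme, the use of a process pool and the probability-based term score are assumptions for illustration, not the J-PMCRI blackboard architecture itself.

```python
from concurrent.futures import ProcessPoolExecutor

def best_local_term(args):
    """Evaluate the candidate terms of one attribute partition and return the
    locally best (attribute, value, probability) triple for the target class."""
    partition, covered, target_class = args
    best = (None, None, -1.0)
    for attr in partition:
        for value in {attrs[attr] for attrs, _ in covered if attr in attrs}:
            subset = [label for attrs, label in covered if attrs.get(attr) == value]
            prob = sum(label == target_class for label in subset) / len(subset)
            if prob > best[2]:
                best = (attr, value, prob)
    return best

def best_global_term(attributes, covered, target_class, n_workers=4):
    """Split the attributes across workers, gather each worker's locally best
    term, and return the globally best one (the coordinating step)."""
    partitions = [attributes[i::n_workers] for i in range(n_workers)]
    jobs = [(p, covered, target_class) for p in partitions if p]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        local_bests = list(pool.map(best_local_term, jobs))
    return max(local_bests, key=lambda term: term[2])

if __name__ == "__main__":
    covered = [({"outlook": "sunny", "windy": "no"}, "play"),
               ({"outlook": "sunny", "windy": "yes"}, "no-play"),
               ({"outlook": "rain", "windy": "no"}, "play")]
    print(best_global_term(["outlook", "windy"], covered, "play"))
```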
Abstract:
The Prism family of algorithms induces modular classification rules, in contrast to the Top Down Induction of Decision Trees (TDIDT) approach, which induces classification rules in the intermediate form of a tree structure. Both approaches achieve comparable classification accuracy; however, in some cases Prism outperforms TDIDT. For both approaches, pre-pruning facilities have been developed to prevent the induced classifiers from overfitting on noisy datasets, by cutting rule terms or whole rules or by truncating decision trees according to certain metrics. Many pre-pruning mechanisms have been developed for the TDIDT approach, but for the Prism family the only existing pre-pruning facility is J-pruning. J-pruning works not only on Prism algorithms but also on TDIDT. Although it has been shown that J-pruning produces good results, this work points out that J-pruning does not realise its full potential. The original J-pruning facility is examined and the use of a new pre-pruning facility, called Jmax-pruning, is proposed and evaluated empirically. A possible pre-pruning facility for TDIDT based on Jmax-pruning is also discussed.
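For context, J-pruning and Jmax-pruning are built around the J-measure of Smyth and Goodman, which scores the information content of a rule from the probability that the rule body fires, the prior class probability and the class probability given the body; a rule is truncated when further specialisation stops improving this score. The snippet below computes the measure as it is commonly defined; it is a generic illustration, not the specific pruning procedures evaluated in the paper, and the clipping of degenerate probabilities is an implementation assumption.

```python
from math import log2

def j_measure(p_body, p_class, p_class_given_body):
    """J-measure of a rule 'IF body THEN class' (Smyth & Goodman): the
    probability that the rule fires times the cross-entropy between the
    posterior and prior class distributions. Probabilities of 0 or 1 are
    clipped to avoid log(0); the clipping is an implementation assumption."""
    eps = 1e-12
    p = min(max(p_class_given_body, eps), 1 - eps)
    q = min(max(p_class, eps), 1 - eps)
    j = p * log2(p / q) + (1 - p) * log2((1 - p) / (1 - q))
    return p_body * j

# A rule firing on 20% of the data and raising the class probability
# from a prior of 0.4 to 0.9 among covered instances:
print(round(j_measure(0.2, 0.4, 0.9), 4))
```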
Abstract:
Ensemble learning techniques generate multiple classifiers, so-called base classifiers, whose combined classification results are used to increase the overall classification accuracy. In most ensemble classifiers the base classifiers are based on the Top Down Induction of Decision Trees (TDIDT) approach. However, an alternative approach for the induction of rule-based classifiers is the Prism family of algorithms. Prism algorithms produce modular classification rules that do not necessarily fit into a decision tree structure. Prism classification rulesets achieve comparable, and on noisy and large data sometimes higher, classification accuracy than decision tree classifiers. Yet Prism still suffers from overfitting on noisy and large datasets. In practice, ensemble techniques tend to reduce this overfitting; however, no ensemble learner exists for modular classification rule inducers such as the Prism family of algorithms. This article describes the first development of an ensemble learner based on the Prism family of algorithms, intended to enhance Prism’s classification accuracy by reducing overfitting.
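The general ensemble mechanism can be illustrated independently of Prism: a bagging-style ensemble trains each base classifier on a bootstrap sample of the training data and combines predictions by majority vote, which is one standard way ensembles reduce overfitting. In the sketch below the base learner is a trivial placeholder; the ensemble described in the article is built on Prism-family rule inducers instead.

```python
import random
from collections import Counter

def bagging_ensemble(train, induce, n_classifiers=10, seed=0):
    """Train n_classifiers base classifiers, each on a bootstrap sample of the
    training data (sampling with replacement). 'induce' is any function that
    maps a training set to a callable classifier; here it stands in for a
    Prism-style rule inducer."""
    rng = random.Random(seed)
    classifiers = []
    for _ in range(n_classifiers):
        sample = [rng.choice(train) for _ in range(len(train))]
        classifiers.append(induce(sample))
    return classifiers

def predict(classifiers, instance):
    """Combine the base classifiers' predictions by majority vote."""
    votes = Counter(clf(instance) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Toy placeholder inducer: predicts the majority class of its training sample.
def majority_class_inducer(sample):
    majority = Counter(label for _, label in sample).most_common(1)[0][0]
    return lambda instance: majority

train = [({"x": i}, "a" if i % 3 else "b") for i in range(30)]
ensemble = bagging_ensemble(train, majority_class_inducer)
print(predict(ensemble, {"x": 5}))
```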
Abstract:
In order to gain knowledge from large databases, scalable data mining technologies are needed. Data are captured on a large scale and databases are thus growing at a fast pace, which leads to the use of parallel computing technologies to cope with large amounts of data. In the area of classification rule induction, parallelisation efforts have focused on the divide and conquer approach, also known as the Top Down Induction of Decision Trees (TDIDT). An alternative approach to classification rule induction is separate and conquer, which has only recently become a focus of parallelisation. This work introduces and empirically evaluates a framework for the parallel induction of classification rules generated by members of the Prism family of algorithms. All members of the Prism family follow the separate and conquer approach.
Abstract:
In recent years, the potential role of planned, internal resettlement as a climate change adaptation measure has been highlighted by national governments and the international policy community. However, in many developing countries, resettlement is a deeply political process that often results in an unequal distribution of costs and benefits amongst relocated persons. This paper examines these tensions in Mozambique, drawing on a case study of flood-affected communities in the Lower Zambezi River valley. It takes a political ecology approach – focusing on discourses of human-environment interaction, as well as the power relationships that are supported by such discourses – to show how a dominant narrative of climate change-induced hazards for small-scale farmers is contributing to their involuntary resettlement to higher-altitude, less fertile areas of land. These forced relocations are buttressed by a series of wider economic and political interests in the Lower Zambezi River region, such as dam construction for hydroelectric power generation and the extension of control over rural populations, from which resettled people derive little direct benefit. Rather than engaging with these challenging issues, most international donors present in the country accept the ‘inevitability’ of extreme weather impacts and view resettlement as an unfortunate and, in some cases, necessary step to increase people’s ‘resilience’, thus rationalising the top-down imposition of unpopular social policies. The findings add weight to the argument that a depoliticised interpretation of climate change can deflect attention away from underlying drivers of vulnerability and poverty, as well as obscure the interests of governments that are intent on reordering poor and vulnerable populations.
Abstract:
Climate change is putting Colombian agriculture under significant stress and, if no adaptation is made, it will be severely impacted over the coming decades. Ramirez-Villegas et al. (2012) set out a government-led, top-down, techno-scientific proposal for a way forward by which Colombian agriculture could adapt to climate change. However, this proposal largely overlooks the root causes of vulnerability of Colombian agriculture, and of smallholders in particular. I discuss some of the hidden assumptions underpinning this proposal and the arguments employed by Ramirez-Villegas et al., based on existing literature on Colombian agriculture and the wider scientific debate on adaptation to climate change. While technical measures may play an important role in the adaptation of Colombian agriculture to climate change, I question whether the actions listed in the proposal, on their own and specifically for smallholders, truly represent priority issues. I suggest that by i) looking at vulnerability before adaptation, ii) contextualising climate change as one of multiple exposures, and iii) truly putting smallholders at the centre of adaptation, i.e. learning about and with them, different and perhaps more urgent priorities for action can be identified. Ultimately, I argue that what is at stake is not only a list of adaptation measures but, more importantly, the scientific approach from which priorities for action are identified. In this respect, I propose that transformative rather than technical-fix adaptation represents a better approach for Colombian agriculture, and smallholders in particular, in the face of climate change.
Abstract:
The reality of the current international order makes it imperative that a just and effective climate regime balance the historical responsibility of developed countries with the increasing absolute emissions from many developing nations. In this short outlook article, key pillars are proposed for a new international climate architecture that envisions replacing the current annex system with two new annexes: Annex α, for countries with high current emissions and historically high emissions, and Annex β, for countries with high current emissions and historically low emissions. Countries in both annexes would implement legally binding targets under this framework. Additionally, this proposal includes tweaks and revisions to funding and technology transfer mechanisms to correct for weaknesses and inequities under the current Kyoto architecture. The proposed framework stems from a belief that a top-down, international approach to climate policy remains the most effective for ensuring environmental integrity. Given the slow rate of institutional learning, reforming and improving the current system is held as a more efficient course of action than abandoning the progress already achieved. It is argued that the proposed framework effectively accommodates key equity, environmental integrity and political feasibility concerns.
Abstract:
This paper investigates whether and to what extent a wide range of actors in the UK are adapting to climate change, and whether this is evidence of a social transition. We document evidence of over 300 examples of early adopters of adaptation practice to climate change in the UK. These examples span a range of activities from small adjustments (or coping), to building adaptive capacity, to implementing actions and to creating deeper systemic change in public and private organisations in a range of sectors. We find that adaptation in the UK has been dominated by government initiatives and has principally occurred in the form of research into climate change impacts. These government initiatives have stimulated a further set of actions at other scales in public agencies, regulatory agencies and regional government (and the devolved administrations), though with little real evidence of climate change adaptation initiatives trickling down to local government level. Sectors requiring significant investment in large-scale infrastructure have invested more heavily in identifying potential impacts and adaptations than those that do not. Thus we find a higher level of adaptation activity by the water supply and flood defence sectors. Sectors that are not dependent on large-scale infrastructure appear to be investing far less effort and resources in preparing for climate change. We conclude that the UK's government-driven, top-down, targeted adaptation approach has generated anticipatory action at low cost in some areas. We also conclude that these actions may have created enough niche activities to allow for diffusion of new adaptation practices in response to real or perceived climate change. These results have significant implications for how climate policy can be developed to support autonomous adaptors in the UK and other countries.
Abstract:
The use of information and communication technologies (ICT) for transforming the way public services are delivered has been an area of investment and focus in many countries in recent years. The UK government envisioned moving from e-Government to transformational government by 2008, and initiatives such as the National Programme for IT (NPfIT) were underway towards this end. NPfIT was the largest civil IT programme worldwide, at an initial estimated cost of £12.4bn over a ten-year period. It was launched in 2002 by the UK government as part of its policy to transform the English NHS and to implement standardised IT solutions at a national level. However, this top-down, government-led approach came under increasing scrutiny, and is now being reconfigured towards a more decentralised mode of operations. This paper looks into the implementation of NPfIT and analyses the reasons behind its failure, and what effect the new NHS reforms are likely to have on the health sector. We draw from past studies (Weill and Ross, 2005) to highlight the key areas of concern in IT governance, using the NPfIT as an illustration.
Abstract:
Modeling aging and age-related pathologies presents a substantial analytical challenge given the complexity of gene-environment influences and interactions operating on an individual. A top-down systems approach is used to model the effects of lifelong caloric restriction, which is known to extend life span in several animal models. The metabolic phenotypes of caloric-restricted (CR; n = 24) and pair-housed control-fed (CF; n = 24) Labrador Retriever dogs were investigated by use of orthogonal projection to latent structures discriminant analysis (OPLS-DA) to model both generic and age-specific responses to caloric restriction from the 1H NMR blood serum profiles of young and older dogs. Three aging metabolic phenotypes were resolved: (i) an aging metabolic phenotype independent of diet, characterized by high levels of glutamine, creatinine, methylamine, dimethylamine, trimethylamine N-oxide, and glycerophosphocholine and decreasing levels of glycine, aspartate, creatine and citrate, indicative of metabolic changes associated largely with muscle mass; (ii) an aging metabolic phenotype specific to CR dogs, consisting of relatively lower levels of glucose, acetate, choline, and tyrosine and relatively higher serum levels of phosphocholine with increased age in the CR population; (iii) an aging metabolic phenotype specific to CF dogs, including lower levels of lipoprotein fatty acyl groups and allantoin and relatively higher levels of formate with increased age in the CF population. There was no diet metabotype that consistently differentiated the CF and CR dogs irrespective of age. Glucose consistently discriminated between feeding regimes in older dogs (≥312 weeks), being relatively lower in the CR group. However, creatine and amino acids (valine, leucine, isoleucine, lysine, and phenylalanine) were lower in the younger CR dogs (<312 weeks), suggestive of differences in energy source utilization. 1H NMR spectroscopic analysis of longitudinal serum profiles enabled an unbiased evaluation of the metabolic markers modulated by a lifetime of caloric restriction and showed differences in the metabolic phenotype of aging due to caloric restriction, which contributes to longevity studies in caloric-restricted animals. Furthermore, OPLS-DA provided a framework such that significant metabolites relating to life extension could be differentiated and integrated with aging processes.
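As a loose illustration of this kind of supervised projection (not the authors' OPLS-DA workflow, which additionally separates predictive from orthogonal variation), a plain PLS-DA can be run by regressing a one-hot group-membership matrix on binned spectral intensities and inspecting the resulting scores and loadings. The sketch below uses scikit-learn's PLSRegression as a simplified stand-in and entirely synthetic data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-in for binned 1H NMR serum spectra: 48 dogs x 200 bins,
# with a handful of bins shifted between the two feeding groups.
rng = np.random.default_rng(42)
X = rng.normal(size=(48, 200))
groups = np.array([0] * 24 + [1] * 24)          # 0 = control-fed, 1 = restricted
X[groups == 1, :5] += 0.8                        # group-dependent metabolite bins

# PLS-DA: regress a one-hot class matrix on the spectra, then inspect scores
Y = np.eye(2)[groups]
pls = PLSRegression(n_components=2)
pls.fit(X, Y)
scores = pls.transform(X)                        # latent-variable scores per dog
print("component-1 score by group:",
      scores[groups == 0, 0].mean().round(2),
      scores[groups == 1, 0].mean().round(2))

# Loadings on the first component point to the discriminating bins
top_bins = np.argsort(np.abs(pls.x_loadings_[:, 0]))[::-1][:5]
print("most discriminating bins:", sorted(top_bins.tolist()))
```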