51 results for Eutrophication. Ecological modeling. Eutrophication model. Top-down control

in Aston University Research Archive


Relevance: 100.00%

Abstract:

This study compared the molecular lipidomic profile of LDL in patients with nondiabetic advanced renal disease and no evidence of CVD to that of age-matched controls, with the hypothesis that it would reveal proatherogenic lipid alterations. LDL was isolated from 10 normocholesterolemic patients with stage 4/5 renal disease and 10 controls, and lipids were analyzed by accurate mass LC/MS. Top-down lipidomics analysis and manual examination of the data identified 352 lipid species, and automated comparative analysis demonstrated alterations in lipid profile in disease. The total lipid and cholesterol content was unchanged, but levels of triacylglycerides and N-acyltaurines were significantly increased, while phosphatidylcholines, plasmenyl ethanolamines, sulfatides, ceramides, and cholesterol sulfate were significantly decreased in chronic kidney disease (CKD) patients. Chemometric analysis of individual lipid species showed very good discrimination of control and disease samples despite the small cohorts, and identified individual unsaturated phospholipids and triglycerides mainly responsible for the discrimination. These findings illustrate the point that although the clinical biochemistry parameters may not appear abnormal, there may be important underlying lipidomic changes that contribute to disease pathology. The lipidomic profile of CKD LDL offers potential for new biomarkers and novel insights into lipid metabolism and cardiovascular risk in this disease. -Reis, A., A. Rudnitskaya, P. Chariyavilaskul, N. Dhaun, V. Melville, J. Goddard, D. J. Webb, A. R. Pitt, and C. M. Spickett. Top-down lipidomics of low density lipoprotein reveal altered lipid profiles in advanced chronic kidney disease. J. Lipid Res. 2015.
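To make the chemometric step concrete, the following is a minimal sketch of a discriminant analysis of the kind described, assuming a (samples x lipid species) intensity matrix with 10 control and 10 CKD samples; the data, preprocessing and model choice (PLS) are illustrative placeholders, not the authors' actual pipeline.

```python
# Minimal sketch: random placeholder data stand in for measured LC/MS intensities.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_lipids = 20, 352                    # 10 controls + 10 CKD; 352 species
X = rng.lognormal(size=(n_samples, n_lipids))    # placeholder intensities
y = np.repeat([0.0, 1.0], 10)                    # 0 = control, 1 = CKD

X_scaled = StandardScaler().fit_transform(np.log(X))   # log-transform and autoscale
pls = PLSRegression(n_components=2).fit(X_scaled, y)

# Weights with the largest magnitude on the first component point to the
# lipid species contributing most to the control/disease separation.
top = np.argsort(np.abs(pls.x_weights_[:, 0]))[::-1][:10]
print("most discriminating lipid indices:", top)
```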

Relevance: 100.00%

Abstract:

Neuronal operations associated with the top-down control process of shifting attention from one locus to another involve a network of cortical regions, and their influence is deemed fundamental to visual perception. However, the extent and nature of these operations within primary visual areas are unknown. In this paper, we used magnetoencephalography (MEG) in combination with magnetic resonance imaging (MRI) to determine whether, prior to the onset of a visual stimulus, neuronal activity within early visual cortex is affected by covert attentional shifts. Time/frequency analyses were used to identify the nature of this activity. Our results show that shifting attention towards an expected visual target results in a late-onset (600 ms post-cue onset) depression of alpha activity which persists until the appearance of the target. Independent component analysis (ICA) and dipolar source modeling confirmed that the neuronal changes we observed originated from within the calcarine cortex. Our results further show that the amplitude changes in alpha activity were induced, not evoked (i.e., not phase-locked to the cued attentional task). We argue that the decrease in alpha prior to the onset of the target may serve to prime the early visual cortex for incoming sensory information. We conclude that attentional shifts affect activity within the human calcarine cortex by altering the amplitude of spontaneous alpha rhythms, and that subsequent modulation of visual input with attentional engagement follows as a consequence of these localized changes in oscillatory activity. © 2005 Elsevier B.V. All rights reserved.
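The induced/evoked distinction the authors draw can be illustrated with a short sketch under the standard definitions: evoked power is computed from the trial-averaged signal (only phase-locked activity survives averaging), while induced power is the average of single-trial power. The simulated data and frequency band below are placeholders.

```python
import numpy as np

fs = 1000                                    # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

# 50 trials of 10 Hz alpha with a random phase per trial: induced, not evoked
trials = np.array([np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi))
                   + 0.5 * rng.standard_normal(t.size) for _ in range(50)])

def alpha_power(x):
    """Total spectral power in the 8-12 Hz band."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return spec[(freqs >= 8) & (freqs <= 12)].sum()

evoked = alpha_power(trials.mean(axis=0))               # power of the average
induced = np.mean([alpha_power(tr) for tr in trials])   # average of the powers
print(f"evoked: {evoked:.1f}  induced: {induced:.1f}")  # induced >> evoked
```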

Relevance: 100.00%

Abstract:

The Biased Competition Model (BCM) suggests that both top-down and bottom-up biases operate on selective attention (e.g., Desimone & Duncan, 1995). It has been suggested that top-down control signals may arise from working memory. In support, Downing (2000) found faster responses to probes presented in the location of stimuli held vs. not held in working memory. Soto, Heinke, Humphreys, and Blanco (2005) showed the involuntary nature of this effect and that shared features between stimuli were sufficient to attract attention. Here we show that stimuli held in working memory influenced the deployment of attentional resources even when: (1) it was detrimental to the task, (2) there was equal prior exposure, and (3) there was no bottom-up priming. These results provide further support for involuntary top-down guidance of attention from working memory and the basic tenets of the BCM, and further discredit the notion that bottom-up priming is necessary for the effect to occur.

Relevance: 100.00%

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable or independent variables in the cost expressions that are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ,R,T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M,T), each order increases the order cover to M. Finally, in (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. The shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to be randomly distributed. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
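As a rough illustration of how the four policies differ in operation, here is a toy simulation under simplifying assumptions (every step is a review, so (Q,R) is effectively continuous review; zero lead time; arbitrary parameter values). It sketches the ordering logic only, not the thesis's cost models.

```python
import numpy as np

rng = np.random.default_rng(1)
demand = np.clip(rng.normal(20, 5, 365), 0, None)   # truncated-normal daily demand

def order_quantity(policy, cover, Q=100, R=150, M=250):
    """Order size for one review under each policy (0 = no order placed)."""
    if policy == "Q,R":          # order a fixed Q whenever cover <= R
        return Q if cover <= R else 0
    if policy == "nQ,R,T":       # smallest multiple of Q lifting cover above R
        return (int((R - cover) // Q) + 1) * Q if cover <= R else 0
    if policy == "M,T":          # always order up to M
        return M - cover
    if policy == "M,R,T":        # order up to M only if cover <= R
        return M - cover if cover <= R else 0
    raise ValueError(policy)

for policy in ("Q,R", "nQ,R,T", "M,T", "M,R,T"):
    cover, orders = 250.0, 0
    for d in demand:             # every step is a review (T = one day)
        cover -= d
        q = order_quantity(policy, cover)
        if q > 0:
            cover += q           # zero lead time in this toy model
            orders += 1
    print(f"{policy:7s} placed {orders} orders")
```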

Relevance: 100.00%

Abstract:

This article considers two contrasting approaches to reforming public services in order to meet the needs of people living in poverty. The first approach is top-down, involves categorising individuals (as 'hard to help', 'at risk', etc.) and invokes scientific backing for justification. The second approach is bottom-up and emancipatory, relates to people as individuals, and treats people who have experience of poverty and social exclusion as experts. The article examines each approach through brief examples in the fields of unemployment and parenting policy, two fields that have been central to theories of 'cycles of deprivation'. It is suggested here that the two approaches differ in terms of their scale, type of user involvement and type of evidence used for their legitimation. While the article suggests that direct comparison between the two approaches is difficult, it highlights the prevalence of top-down approaches to services for people living in poverty, despite increasing support for bottom-up approaches in other policy areas.

Relevance: 100.00%

Abstract:

The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. It is the purpose of this paper to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data that can be considered useful when defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
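As a sketch of the kind of computation such a service performs, habitat-similarity tools of this family commonly score grid cells by Mahalanobis distance from a reference ecosystem across thematic layers; the code below illustrates that idea with synthetic data and does not reproduce the actual eHabitat WPS interface.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
# Rows = grid cells, columns = thematic variables (e.g. rainfall, elevation)
cells = rng.normal(size=(1000, 4))
reference = rng.normal(size=(200, 4))     # cells of the reference ecosystem

mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
diff = cells - mu
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distance

# Map distance to a 0-1 likelihood: probability of a distance at least this
# large under a chi-squared distribution with one df per thematic variable.
similarity = chi2.sf(d2, df=cells.shape[1])
print("cells with similarity > 0.5:", int((similarity > 0.5).sum()))
```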

Relevance: 100.00%

Abstract:

How are the image statistics of global image contrast computed? We answered this by using a contrast-matching task for checkerboard configurations of Battenberg micro-patterns, where the contrasts and spatial spreads of interdigitated pairs of micro-patterns were adjusted independently. Test stimuli were 20 × 20 arrays with various cluster widths, matched to standard patterns of uniform contrast. When one of the test patterns contained a pattern with much higher contrast than the other, that determined global pattern contrast, as in a max() operation. Crucially, however, the full matching functions had a curious intermediate region where low-contrast additions to one pattern, at intermediate contrasts of the other, caused a paradoxical reduction in perceived global contrast. None of the following models predicted this: RMS, energy, linear sum, max(), Legge and Foley. However, a gain-control model incorporating wide-field integration and suppression of nonlinear contrast responses predicted the results with no free parameters. This model was derived from experiments on summation of contrast at threshold, and from masking and summation effects in dipper functions. Those experiments were also inconsistent with the failed models above. Thus, we conclude that our contrast gain-control model (Meese & Summers, 2007) describes a fundamental operation in human contrast vision.
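A generic divisive gain-control computation of the general shape described (wide-field summation of nonlinear contrast responses, divided by a broad suppressive pool) can reproduce the paradoxical effect qualitatively. The exponents and constant below are illustrative textbook-style values, not the parameter-free model of Meese and Summers (2007).

```python
import numpy as np

def pooled_response(contrasts, p=2.4, q=2.0, z=0.05):
    """Wide-field gain control: pooled excitation / (z + pooled suppression)."""
    c = np.asarray(contrasts, dtype=float)
    excitation = (c ** p).sum()       # nonlinear responses summed over the field
    suppression = z + (c ** q).sum()  # broad divisive suppressive pool
    return excitation / suppression

# Intermediate-contrast elements alone vs. with low-contrast elements added in
# the interdigitated positions: because c**p < c**q for c < 1 when p > q, the
# additions feed the suppressive pool more than the excitatory one.
alone = pooled_response([0.32] * 8)
mixed = pooled_response([0.32] * 8 + [0.05] * 8)
print(f"alone: {alone:.3f}  mixed: {mixed:.3f}")   # mixed < alone
```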

Relevance: 100.00%

Abstract:

The mechanics-based analysis framework predicts the time to initiation of top-down fatigue cracking in asphalt concrete pavements by utilising fracture mechanics and a mixture morphology-based property. To reduce the level of complexity involved, traffic data were characterised and incorporated into the framework using the equivalent single axle load (ESAL) approach. There is a concern that this kind of simplistic traffic characterisation might result in erroneous performance predictions and pavement structural designs. This paper integrates axle load spectra and other traffic characterisation parameters into the mechanics-based analysis framework and studies the impact these parameters have on predicted fatigue cracking performance. The traffic characterisation inputs studied are traffic growth rate, axle load spectra, lateral wheel wander and volume adjustment factors. For this purpose, a traffic integration approach incorporating Monte Carlo simulation and representative traffic characterisation inputs was developed. The significance of these parameters was established by evaluating a number of field pavement sections. The results show that all the traffic characterisation parameters except truck wheel wander have a significant influence on predicted top-down fatigue cracking performance.
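The traffic integration idea can be sketched briefly: instead of collapsing traffic to a single ESAL count, axle passes are sampled from a load spectrum by Monte Carlo simulation and converted to equivalent standard-axle damage. The spectrum, growth rate and the fourth-power damage approximation below are illustrative placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
# Axle load spectrum: bin midpoints (kN) and relative frequencies
loads = np.array([40, 60, 80, 100, 120, 140])
freqs = np.array([0.30, 0.25, 0.20, 0.15, 0.07, 0.03])

def sampled_esals(n_axles, standard_load=80.0):
    """Draw axle passes from the spectrum; convert to standard-axle damage."""
    sample = rng.choice(loads, size=n_axles, p=freqs)
    return ((sample / standard_load) ** 4).sum()     # fourth-power law

growth = 0.03                                        # annual traffic growth rate
yearly = [sampled_esals(int(2_000_000 * (1 + growth) ** y)) for y in range(5)]
print(["%.0f" % e for e in yearly])                  # per-year damage grows with traffic
```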

Relevance: 100.00%

Abstract:

According to some models of visual selective attention, objects in a scene activate corresponding neural representations, which compete for perceptual awareness and motor behavior. During a visual search for a target object, top-down control exerted by working memory representations of the target's defining properties resolves competition in favor of the target. These models, however, ignore the existence of associative links among object representations. Here we show that such associations can strongly influence deployment of attention in humans. In the context of visual search, objects associated with the target were both recalled more often and recognized more accurately than unrelated distractors. Notably, both target and associated objects competitively weakened recognition of unrelated distractors and slowed responses to a luminance probe. Moreover, in a speeded search protocol, associated objects rendered search both slower and less accurate. Finally, the first saccades after onset of the stimulus array were more often directed toward associated than control items.

Relevance: 100.00%

Abstract:

Single-cell recordings in monkeys support the notion that the lateral prefrontal cortex (PFC) controls the reactivation of visual working memory representations when rehearsal is disrupted. In contrast, recent fMRI findings yielded a double dissociation between PFC and the medial temporal lobe (MTL) in a letter working memory task: PFC was engaged in interference protection during reactivation, while MTL was prominently involved in the retrieval of the letter representations. We present event-related potential (ERP) data that support PFC involvement in the top-down control of reactivation during a visual working memory task with endogenously triggered recovery after visual interference. A differentiating view is proposed for the role of PFC in working memory with respect to endogenous/exogenous control and to stimulus type. General implications for binding and retention mechanisms are discussed.

Relevance: 100.00%

Abstract:

The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software toolkit was developed and used to implement a Petri net analysis tool, and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can-sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
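For readers unfamiliar with the formalism, the sketch below shows the token-game semantics that Petri net analysis and scheduling tools are built on (enabling check, firing rule, and a naive run-to-deadlock loop); the net itself is an arbitrary example, not one of the thesis's models.

```python
import numpy as np

# Places p0..p2 and transitions t0..t1 as pre/post incidence matrices
pre = np.array([[1, 0],      # tokens each transition consumes from each place
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],     # tokens each transition deposits in each place
                 [1, 0],
                 [0, 1]])
marking = np.array([1, 0, 0])            # initial marking: one token in p0

def enabled(m):
    """Transitions whose input tokens are all available under marking m."""
    return [t for t in range(pre.shape[1]) if (m >= pre[:, t]).all()]

def fire(m, t):
    """Firing rule: consume input tokens, produce output tokens."""
    return m - pre[:, t] + post[:, t]

# Naive run until deadlock; reachability analyses of this kind underpin
# checks for deadlock-free operation.
while (ts := enabled(marking)):
    marking = fire(marking, ts[0])
    print(f"fired t{ts[0]} -> marking {marking}")
```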

Relevance: 100.00%

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, providing fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is that of state space explosion; a modular approach to both design and analysis helps cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered, and a hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules. In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. The hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
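For reference, the standard two-phase commit protocol that the work extends can be sketched as follows; this shows only the textbook voting and decision phases, without the fault-tolerance and real-time extensions the thesis develops.

```python
from enum import Enum

class Vote(Enum):
    YES = "yes"
    NO = "no"

class Participant:
    """A resource that can vote on, then commit or abort, a transaction."""
    def __init__(self, name, ready):
        self.name, self.ready = name, ready
    def prepare(self):
        return Vote.YES if self.ready else Vote.NO
    def commit(self):
        print(f"{self.name}: commit")
    def abort(self):
        print(f"{self.name}: abort")

def two_phase_commit(participants):
    """Coordinator: phase 1 collects votes, phase 2 broadcasts the outcome."""
    votes = [p.prepare() for p in participants]          # phase 1: voting
    if all(v is Vote.YES for v in votes):                # phase 2: decision
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:
        p.abort()
    return "aborted"

print(two_phase_commit([Participant("valve", True), Participant("pump", True)]))
```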

Relevance: 100.00%

Abstract:

In construction projects, the aim of project control is to ensure that projects finish on time, within budget, and achieve other project objectives. During the last few decades, numerous project control methods have been developed and adopted by project managers in practice. However, many existing methods focus on describing what the processes and tasks of project control are, not on how these tasks should be conducted. There is also a potential gap between the principles that underlie these methods and project control practice. As a result, time and cost overruns are still common in construction projects, partly attributable to deficiencies of existing project control methods and difficulties in implementing them. This paper describes a new project cost and time control model, the project control and inhibiting factors management (PCIM) model, developed through a study involving extensive interaction with construction practitioners in the UK, which better reflects the real needs of project managers. A good-practice checklist is also developed to facilitate implementation of the model. © 2013 American Society of Civil Engineers.

Relevance: 100.00%

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failure in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
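The core calculation each cell-level MRP II system would run locally is the familiar netting-and-offsetting pass; the sketch below shows that calculation with illustrative demand figures, lot size and lead time, and is not the DMRP algorithm itself.

```python
def mrp_schedule(gross_req, on_hand, lead_time, lot_size=50):
    """Net requirements against stock, lot-size them, offset by lead time."""
    releases = [0] * len(gross_req)
    stock = on_hand
    for t, demand in enumerate(gross_req):
        net = demand - stock                         # netting step
        if net > 0:
            order = -(-net // lot_size) * lot_size   # round up to full lots
            releases[max(0, t - lead_time)] += order # lead-time offset
            stock += order
        stock -= demand
    return releases

# Planned order releases for one item in one cell (illustrative figures)
print(mrp_schedule([0, 40, 60, 0, 120], on_hand=50, lead_time=2))
# -> [50, 0, 150, 0, 0]
```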