27 results for Computations Driven Systems
Abstract:
Occupants’ behaviour when improving the indoor environment plays a significant role in saving energy in buildings. Therefore, the key step to reducing energy consumption and carbon emissions from buildings is to understand how occupants interact with the environment they are exposed to in pursuit of thermal comfort and well-being, though such interaction is complex. This paper presents a dynamic process of occupant behaviours involving technological, personal and psychological adaptations in response to varied thermal conditions, based on data covering four seasons gathered from a field study in Chongqing, China. It demonstrates that occupants are active players in environmental control and that their adaptive responses are driven strongly by ambient thermal stimuli, varying from season to season and from time to time, even on the same day. Positive, dynamic behavioural adaptation will help save energy used in heating and cooling buildings. However, when environmental parameters cannot fully satisfy occupants’ requirements, negative behaviours can conflict with energy saving. The survey revealed that about 23% of windows are partly open for fresh air while air-conditioners are in operation in summer. This paper addresses the issue of how building and environmental systems should be designed, operated and managed so as to meet the requirements of energy efficiency without compromising well-being and productivity.
Abstract:
Stroke is a medical emergency and can cause neurological damage, affecting the motor and sensory systems. Harnessing brain plasticity should make it possible to reconstruct the closed loop between the brain and the body; that is, associating the generation of the motor command with the somatic sensory feedback might enhance motor recovery. To aid reconstruction of this loop with a robotic device, it is necessary to assist the paretic side of the body at the right moment, so that the motor command and the feedback signal to the somatic sensory area of the brain coincide. To this end, we propose an integrated EEG-driven assistive robotic system for stroke rehabilitation. Depending on the level of motor recovery, it is important to provide adequate stimulation for upper-limb motion. We therefore propose an assist arm incorporating a Magnetic Levitation Joint that can generate compliant motion owing to its levitation and mechanical redundancy. This paper reports on a feasibility study carried out to verify the validity of the robot sensing, and on EEG measurements conducted with healthy volunteers performing spontaneous arm flexion/extension movements. A characteristic feature was found in the temporal evolution of the EEG signal in a single motion prior to the executed motion, which can aid in timing the onset of robotic arm assistance.
Abstract:
This paper analyses and describes the semi-arid rangelands of southern Africa. These rangelands are found in the grassland, savanna and thicket biomes and comprise all the remaining land that does not support commercial rainfed agriculture, some 778,221 km² in extent (66% of South Africa). Although production is driven primarily by rainfall, rangeland management systems have been developed to cope with the uncertain climate and to ameliorate the impact of inter-annual variation in production. We describe the rangeland types that occur, provide an insight into their management and examine some constraints on livestock production posed by the socio-economic environment. We describe the grazing management systems that apply under the two land-tenure systems, namely freehold and leasehold tenure, and discuss how each of these systems affects livestock production, management and resource condition.
Abstract:
The endocannabinoid system (ECS) was only 'discovered' in the 1990s. Since then, many new ligands have been identified, as well as many new intracellular targets, ranging from the PPARs to mitochondria to lipid rafts. It was thought that blocking the CB-1 receptor might reverse obesity and the metabolic syndrome, on the basis that the ECS was dysfunctional in these conditions; this has met with limited success. The reason may be that the ECS is a homeostatic system, which integrates energy-seeking and storage behaviour with resistance to oxidative stress. It could be viewed as having thrifty actions. Thriftiness is an innate property of life, which is programmed to a set point by both environment and genetics, resulting in an epigenotype perfectly adapted to its environment. This thrifty set point can be modulated by hormetic stimuli, such as exercise, cold and plant micronutrients. We have proposed that the physiological and protective insulin resistance that underlies thriftiness encapsulates something called 'redox thriftiness', whereby insulin resistance is determined by the ability to resist oxidative stress. Modern man has removed most hormetic stimuli and replaced them with a calorific, sedentary lifestyle, leading to increased risk of metabolic inflexibility. We suggest that there is a tipping point where lipotoxicity in adipose and hepatic cells induces mild inflammation, which switches thrifty insulin resistance to inflammation-driven insulin resistance. To understand this, we propose that the metabolic syndrome could be seen from the viewpoint of the ECS, the mitochondrion and the FOXO group of transcription factors. FOXO has many thrifty actions, including increasing insulin resistance and appetite, suppressing oxidative stress and shifting the organism towards using fatty acids. In concert with factors such as PGC-1, FOXO factors also modify mitochondrial function and biogenesis.
Hence, the ECS and FOXO may interact at many points, one of which may be via intracellular redox signalling. As cannabinoids have been shown to modulate reactive oxygen species production, it is possible that they can upregulate anti-oxidant defences. This suggests they may have an 'endohormetic' signalling function. The tipping point into the metabolic syndrome may be the result of a chronic lack of hormetic stimuli (in particular, physical activity), and thus of stimulus for PGC-1, with a resultant reduction in mitochondrial function and a reduced lipid capacitance. This, in the context of a positive-calorie environment, will result in increased visceral adipose tissue volume, abnormal ectopic fat content and systemic inflammation, worsening the inflammation-driven pathological insulin resistance and the inability to deal with lipids. The resultant oxidative stress may therefore drive a compensatory anti-oxidative response epitomised by the ECS and FOXO. Thus, although blocking the ECS (e.g. via rimonabant) may induce temporary weight loss, it may compromise long-term stress resistance. Clues about how to modulate the system more safely are emerging from observations that some polyphenols, such as resveratrol and possibly some phytocannabinoids, can modulate mitochondrial function and might improve resistance to a modern lifestyle.
Abstract:
Understanding how climate change can affect crop-pollinator systems helps predict potential geographical mismatches between a crop and its pollinators, and therefore identify areas vulnerable to loss of pollination services. We examined the distribution of orchard species (apples, pears, plums and other top fruits) and their pollinators in Great Britain, for present climatic conditions and for future conditions projected for 2050 under the SRES A1B emissions scenario. We used a relative index of pollinator availability as a proxy for pollination service. At present there is a large spatial overlap between orchards and their pollinators, but predictions for 2050 revealed that the most suitable areas for orchards corresponded to low pollinator availability. However, we found that pollinator availability may persist in areas currently used for fruit production, but which are predicted to provide sub-optimal environmental suitability for orchard species in the future. Our results may be used to identify mitigation options to safeguard orchard production against the risk of pollination failure in Great Britain over the next 50 years, for instance by choosing fruit tree varieties that are adapted to future climatic conditions, or by boosting wild pollinators through improved landscape resources. Our approach can be readily applied to other regions and crop systems, and expanded to include different climatic scenarios.
Abstract:
Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success was achieved in relating source code to achieved performance for the K10 series of Opterons, but the method was found to be inadequate for the next-generation Interlagos processor. This experience led to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
Abstract:
The glutamate decarboxylase (GAD) system has been shown to be important for the survival of Listeria monocytogenes in low-pH environments. The bacterium can use this faculty to maintain pH homeostasis under acidic conditions. The accepted model for the GAD system proposes that the antiport of glutamate into the bacterial cell, in exchange for γ-aminobutyric acid (GABA), is coupled to an intracellular decarboxylation reaction of glutamate into GABA that consumes protons and therefore facilitates pH homeostasis. Most strains of L. monocytogenes possess three decarboxylase genes (gadD1, D2 and D3) and two antiporter genes (gadT1 and gadT2). Here, we confirm that gadD3 encodes a glutamate decarboxylase dedicated to the intracellular GAD system (GADi), which produces GABA from cytoplasmic glutamate in the absence of antiport activity. We also compare the functionality of the GAD system between two commonly studied reference strains, EGD-e and 10403S, which differ in acid resistance. Through functional genomics we show that EGD-e is unable to export GABA and relies exclusively on the GADi system, which is driven primarily by GadD3 in this strain. In contrast, 10403S relies upon GadD2 to maintain both an intracellular and an extracellular GAD system (GADi/GADe). Through experiments with a murinised variant of EGD-e (EGDm) in mice, we found that the GAD system plays a significant role in the overall virulence of this strain. Double mutants lacking either gadD1D3 or gadD2D3 displayed reduced acid tolerance and were significantly impaired in their ability to cause infection following oral inoculation. Since EGDm exploits GADi but not GADe, the results indicate that the GADi system contributes to virulence within the mouse. Furthermore, we provide evidence that there may be a separate line of evolution in the GAD system between the two commonly used reference strains.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
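The benchmark-plus-interpolation methodology described in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' model: the function names, benchmark numbers and two-part cost decomposition are all hypothetical stand-ins.

```python
import numpy as np

def build_time_model(sizes, times):
    """Build a runtime predictor by linear interpolation between
    measured benchmark points (all measurements here are invented)."""
    sizes = np.asarray(sizes, dtype=float)
    times = np.asarray(times, dtype=float)
    order = np.argsort(sizes)           # np.interp needs ascending x
    sizes, times = sizes[order], times[order]
    return lambda n: float(np.interp(n, sizes, times))

# Hypothetical benchmark results for the two kinds of work the
# abstract identifies: loop-based array updates and halo exchanges.
compute = build_time_model([64**2, 128**2, 256**2], [0.002, 0.009, 0.038])
halo = build_time_model([64, 128, 256], [1.0e-4, 1.8e-4, 3.5e-4])

def step_time(nx, ny):
    """Predicted cost of one timestep on an nx-by-ny local domain:
    array-update cost plus nearest-neighbour exchange cost."""
    return compute(nx * ny) + halo(max(nx, ny))
```

Predictions for an untested decomposition (say, a 96x96 local domain) then fall out of the interpolation; the model in the paper additionally distinguishes execution scenarios such as node population and core affinity when choosing which benchmark results to interpolate.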
Abstract:
The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km²) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need for improved NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show the TIGGE database to be a promising tool for forecasting flood inundation, comparable with forecasts driven by rain gauge observations.
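The probabilistic end product described here reduces to simple statistics over the ensemble members once each has been run through the cascade. The sketch below is purely illustrative, with invented discharge numbers and helper names; it is not the study's code.

```python
import numpy as np

def exceedance_probability(ensemble_q, threshold):
    """Fraction of members whose peak discharge exceeds a flood
    threshold; ensemble_q has shape (members, timesteps), in m^3/s."""
    peaks = ensemble_q.max(axis=1)
    return float((peaks > threshold).mean())

def ensemble_spread(ensemble_q):
    """Per-timestep standard deviation across members, one simple
    measure of forecast uncertainty."""
    return ensemble_q.std(axis=0)

# Hypothetical 5-member discharge forecast over 4 timesteps
q = np.array([
    [10.0, 40.0,  90.0, 30.0],
    [12.0, 55.0, 120.0, 35.0],
    [ 9.0, 35.0,  70.0, 25.0],
    [11.0, 60.0, 150.0, 40.0],
    [10.0, 45.0,  95.0, 28.0],
])
p_flood = exceedance_probability(q, threshold=100.0)  # 2 of 5 members exceed
```

The large centre-to-centre spread reported in the abstract would show up directly in `ensemble_spread`, and warnings can be issued when the exceedance probability crosses a chosen decision level.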
Abstract:
We present a data-driven mathematical model of a key initiating step in platelet activation, a central process in the prevention of bleeding following injury. In vascular disease, this process is activated inappropriately and causes thrombosis, heart attacks and stroke. The collagen receptor GPVI is the primary trigger for platelet activation at sites of injury. Understanding the complex molecular mechanisms initiated by this receptor is important for the development of more effective antithrombotic medicines. In this work we developed a series of nonlinear ordinary differential equation models that are direct representations of biological hypotheses surrounding the initial steps in GPVI-stimulated signal transduction. At each stage, model simulations were compared to our own quantitative, high-temporal-resolution experimental data, guiding further experimental design, data collection and model refinement. Much is known about the linear forward reactions within platelet signalling pathways, but the roles of putative reverse reactions are poorly understood. An initial model, which included a simple constitutively active phosphatase, was unable to explain the experimental data. Model revisions incorporating a more complex pathway of interactions (and specifically the phosphatase TULA-2) provided a good description of the experimental data, both for observations of phosphorylation in samples from one donor and for those of a wider population. Our model was used to investigate the levels of proteins involved in regulating the pathway and the effect of the low GPVI levels that have been associated with disease. Results indicate a clear separation between healthy and GPVI-deficient states in the signalling cascade dynamics associated with Syk tyrosine phosphorylation and activation.
Our approach reveals the central importance of this negative feedback pathway, in which the temporal regulation of a specific class of protein tyrosine phosphatases controls the rate, and therefore the extent, of GPVI-stimulated platelet activation.
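The kind of ODE model with phosphatase-mediated negative feedback described in this abstract can be illustrated with a toy two-species system. The species, equations and rate constants below are hypothetical stand-ins, not the published model: S represents active (phosphorylated) Syk and P a TULA-2-like phosphatase induced by S.

```python
import numpy as np

# Hypothetical rate constants, chosen only for illustration
k_act, k_dephos, k_ind, k_decay = 1.0, 2.0, 0.5, 0.2

dt, T = 0.01, 50.0
n = int(T / dt)
S = np.zeros(n + 1)  # fraction of Syk that is phosphorylated
P = np.zeros(n + 1)  # activity of the feedback phosphatase
for i in range(n):
    # activation of unphosphorylated Syk minus feedback dephosphorylation
    dS = k_act * (1.0 - S[i]) - k_dephos * P[i] * S[i]
    # phosphatase induced by active Syk, with first-order decay
    dP = k_ind * S[i] - k_decay * P[i]
    S[i + 1] = S[i] + dt * dS  # forward-Euler step
    P[i + 1] = P[i] + dt * dP
```

Because the phosphatase lags the kinase, S overshoots and then relaxes to a lower steady state; it is this temporal regulation, rather than a constitutively active phosphatase, that sets the rate and extent of activation in the abstract's account.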
Abstract:
ISO19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of environmental observations. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data and for user navigation around data holdings. The implementation described here, “CEDA-MOLES”, also supports data management functions for the Centre for Environmental Data Archival (CEDA). The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period, important lessons were learnt both about the information needed and about how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the need for improved data provenance, for further structured information to support ISO19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free-text fields, and the necessity of supporting as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into the potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the ‘Fire suppression’ scenario. The findings highlight the importance of strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that face an increased risk of large wildfires.
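A fuzzy cognitive map of the kind used in this study is, mechanically, a weighted concept graph iterated to a fixed point. The miniature sketch below is purely illustrative: the three concepts and their weights are invented, not the values elicited in the Chiquitania focus groups.

```python
import numpy as np

def run_fcm(W, x0, steps=50, clamp=None):
    """Iterate x <- sigmoid(W @ x); `clamp` pins selected concepts
    to fixed activations, representing a scenario driver."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-(W @ x)))  # squash activations into (0, 1)
        if clamp:
            for i, v in clamp.items():
                x[i] = v
    return x

# Invented 3-concept map: drought -> wildfire risk -> production loss
W = np.array([
    [0.0, 0.0, 0.0],  # drought: external driver, no incoming influences
    [0.8, 0.0, 0.0],  # wildfire risk rises with drought
    [0.0, 0.7, 0.0],  # production loss rises with wildfire risk
])
baseline = run_fcm(W, [0.2, 0.2, 0.2], clamp={0: 0.2})
drier = run_fcm(W, [0.2, 0.2, 0.2], clamp={0: 0.9})  # drier-climate scenario
```

Comparing the two fixed points shows both wildfire risk and production loss settling higher under the drier scenario; the study's maps add many more concepts and the reinforcing feedbacks described above.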