828 results for "Generazione Distribuita Rinnovabili Controllo Tensione Smart Grid"
Abstract:
Flavonoids reduce cardiovascular disease risk through anti-inflammatory, anti-coagulant and anti-platelet actions. One key flavonoid inhibitory mechanism is blocking kinase activity that drives these processes. Flavonoids attenuate activities of kinases including phosphoinositide-3-kinase (PI3K), Fyn, Lyn, Src, Syk, PKC, PIM1/2, ERK, JNK, and PKA. X-ray crystallographic analyses of kinase-flavonoid complexes show that flavonoid ring systems and their hydroxyl substitutions are important structural features for their binding to kinases. A clearer understanding of structural interactions of flavonoids with kinases is necessary to allow construction of more potent and selective counterparts. We examined flavonoid (quercetin, apigenin and catechin) interactions with Src-family kinases (Lyn, Fyn and Hck) applying the Sybyl docking algorithm and GRID. A homology model (Lyn) was used in our analyses to demonstrate that high quality predicted kinase structures are suitable for flavonoid computational studies. Our docking results revealed potential hydrogen bond contacts between flavonoid hydroxyls and kinase catalytic site residues. Identification of plausible contacts indicated that quercetin formed the most energetically stable interactions, apigenin lacked hydroxyl groups necessary for important contacts, and the non-planar structure of catechin could not support predicted hydrogen bonding patterns. GRID analysis using a hydroxyl functional group supported docking results. Based on these findings, we predicted that quercetin would inhibit activities of Src-family kinases with greater potency than apigenin and catechin. We validated this prediction using in vitro kinase assays. We conclude that our study can be used as a basis to construct virtual flavonoid interaction libraries to guide drug discovery using these compounds as molecular templates.
Abstract:
In this study we applied a smart biomaterial formed from a self-assembling, multi-functional synthetic peptide amphiphile (PA) to coat substrates with various surface chemistries. The combination of PA coating and alignment-inducing functionalised substrates provided a template to instruct human corneal stromal fibroblasts to adhere, become aligned and then bio-fabricate a highly-ordered, multi-layered, three-dimensional tissue by depositing an aligned, native-like extracellular matrix. The newly-formed corneal tissue equivalent was subsequently able to eliminate the adhesive properties of the template and govern its own complete release via the action of endogenous proteases. Tissues recovered through this method were structurally stable, easily handled, and carrier-free. Furthermore, topographical and mechanical analysis by atomic force microscopy showed that tissue equivalents formed on the alignment-inducing PA template had highly-ordered, compact collagen deposition, with a two-fold higher elastic modulus compared to the less compact tissues produced on the non-alignment template, the PA-coated glass. We suggest that this technology represents a new paradigm in tissue engineering and regenerative medicine, whereby all processes for the biofabrication and subsequent self-release of natural, bioprosthetic human tissues depend solely on simple template-tissue feedback interactions.
Abstract:
Existing urban meteorological networks have an important role to play as test beds for inexpensive and more sustainable measurement techniques that are now becoming possible in our increasingly smart cities. The Birmingham Urban Climate Laboratory (BUCL) is a near-real-time, high-resolution urban meteorological network (UMN) of automatic weather stations and inexpensive, nonstandard air temperature sensors. The network has recently been implemented with an initial focus on monitoring urban heat, infrastructure, and health applications. A number of UMNs exist worldwide; however, BUCL is novel in its density, the low-cost nature of the sensors, and the use of proprietary Wi-Fi networks. This paper provides an overview of the logistical aspects of implementing a UMN test bed at such a density, including selecting appropriate urban sites; testing and calibrating low-cost, nonstandard equipment; implementing strict quality-assurance/quality-control mechanisms (including metadata); and utilizing preexisting Wi-Fi networks to transmit data. Also included are visualizations of data collected by the network, including data from the July 2013 U.K. heatwave, and a discussion of potential applications. The paper is an open invitation to use the facility as a test bed for evaluating models and/or other nonstandard observation techniques, such as those generated via crowdsourcing.
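The quality-control mechanisms mentioned above can be illustrated with a minimal sketch of two classic automated checks on a temperature series, a plausible-range test and a step test. The thresholds and flag names below are invented for illustration, not BUCL's actual rules:

```python
# Minimal quality-control sketch for air-temperature readings from a
# low-cost urban sensor. Thresholds are illustrative, not BUCL's rules.

def qc_flags(readings, lo=-20.0, hi=45.0, max_step=5.0):
    """Return a QC flag per reading: 'ok', 'range' or 'step'."""
    flags = []
    prev = None
    for t in readings:
        if not (lo <= t <= hi):
            flags.append("range")        # physically implausible value
        elif prev is not None and abs(t - prev) > max_step:
            flags.append("step")         # implausible jump between samples
        else:
            flags.append("ok")
        prev = t
    return flags

series = [18.2, 18.4, 30.1, 18.6, 99.9]  # one spike, one sensor fault
print(qc_flags(series))
# → ['ok', 'ok', 'step', 'step', 'range']
```

In a real deployment such flags would feed the network's metadata records, so that suspect values can be excluded or reviewed rather than silently deleted.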
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model's output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
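The core idea, sampling realisations of true-class proportions given a mapped class and a confusion matrix, can be sketched in a few lines. The 3-class confusion matrix and pixel counts below are made up for illustration, and the column-normalised posterior is just one simple choice; the paper's full Bayesian analysis also uses spatial information that this toy omits:

```python
import random

# Toy Monte Carlo sketch: propagate confusion-matrix uncertainty into
# land-cover proportions for a site. All numbers are invented.
# Rows = true class, columns = class assigned by the map.
CLASSES = ["forest", "crop", "urban"]
CONFUSION = [            # CONFUSION[true][mapped], validation counts
    [80, 15, 5],
    [10, 70, 20],
    [2, 8, 90],
]

def posterior_true_given_mapped(mapped):
    """P(true class | mapped class), proportional to the matrix column."""
    col = [CONFUSION[t][mapped] for t in range(len(CLASSES))]
    total = sum(col)
    return [c / total for c in col]

def sample_proportions(mapped_counts, rng):
    """One realisation of true-class proportions for a site, given how
    many of its pixels the map assigned to each class."""
    counts = [0] * len(CLASSES)
    for m, n in enumerate(mapped_counts):
        probs = posterior_true_given_mapped(m)
        for _ in range(n):
            counts[rng.choices(range(len(CLASSES)), weights=probs)[0]] += 1
    total = sum(counts)
    return [c / total for c in counts]

rng = random.Random(42)
# Site with 50 forest-mapped, 30 crop-mapped, 20 urban-mapped pixels:
realisations = [sample_proportions([50, 30, 20], rng) for _ in range(200)]
mean_forest = sum(r[0] for r in realisations) / len(realisations)
print(f"mean forest proportion over realisations: {mean_forest:.2f}")
```

The spread of the 200 realisations, not just their mean, is what a land-surface model would ingest to quantify how map error propagates into its outputs.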
Abstract:
In 2006 the Route load balancing algorithm was proposed and compared to other techniques for optimizing process allocation in grid environments. This algorithm schedules tasks of parallel applications considering computer neighborhoods (where distance is defined by network latency). Route performs well in large environments, but there are cases where the neighbors have neither enough computational capacity nor a communication system capable of serving the application. In those situations Route migrates tasks until they stabilize in a grid area with sufficient resources. This migration may take a long time, which reduces overall performance. To shorten this stabilization time, this paper proposes RouteGA (Route with Genetic Algorithm support), which considers historical information on parallel application behavior, as well as computer capacities and loads, to optimize the scheduling. This information is extracted by monitors and summarized in a knowledge base used to quantify the resource occupation of tasks. That information then parameterizes a genetic algorithm responsible for optimizing the task allocation. Results confirm that RouteGA outperforms the load balancing carried out by the original Route, which had previously outperformed other scheduling algorithms from the literature.
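The genetic-algorithm core of such a scheduler can be sketched minimally: a chromosome assigns each task to a machine, and fitness is the resulting makespan. Task costs, machine speeds, and GA parameters below are invented; the real RouteGA also draws on monitored load and application history, which this sketch omits:

```python
import random

# Toy GA for task allocation: minimize makespan of tasks on machines.
# All numbers are invented for illustration.
TASK_COST = [4, 8, 3, 7, 5, 6]      # work units per task
MACHINE_SPEED = [1.0, 2.0, 1.5]     # work units processed per second

def makespan(assignment):
    """Completion time of the most loaded machine under this assignment."""
    load = [0.0] * len(MACHINE_SPEED)
    for task, machine in enumerate(assignment):
        load[machine] += TASK_COST[task] / MACHINE_SPEED[machine]
    return max(load)

def evolve(generations=60, pop_size=30, seed=1):
    rng = random.Random(seed)
    n_tasks, n_mach = len(TASK_COST), len(MACHINE_SPEED)
    pop = [[rng.randrange(n_mach) for _ in range(n_tasks)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                 # lower makespan = fitter
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_tasks)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # mutation: reassign one task
                child[rng.randrange(n_tasks)] = rng.randrange(n_mach)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```

In RouteGA's setting the fitness function would additionally penalize assignments that contradict the knowledge base (historical behavior and current load), steering the search toward grid areas with sufficient resources without long migration phases.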
Abstract:
The aim of task scheduling is to minimize the makespan of applications while making the best possible use of shared resources. Applications have requirements that call for customized environments for their execution. One way to provide such environments is virtualization on demand. This paper presents two schedulers based on integer linear programming, which schedule virtual machines (VMs) on grid resources and tasks on these VMs. The schedulers differ from previous work in the joint scheduling of tasks and VMs and in considering the impact of the available bandwidth on the quality of the schedule. Experiments show the efficacy of the schedulers in scenarios with different network configurations.
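The effect of bandwidth on schedule quality can be illustrated with a tiny stand-in for the ILP: an exhaustive search over the same 0/1 decisions the ILP would encode (which host each task's VM is placed on), scoring each schedule by a makespan that includes input-transfer time over the host's link. All numbers are invented; a realistic instance would be handed to an ILP solver rather than enumerated:

```python
from itertools import product

# Toy stand-in for a joint VM/task ILP: enumerate placements and pick
# the one with minimum makespan, counting data transfer over each
# host's limited bandwidth. All numbers are invented.
TASKS = [            # (compute seconds on a unit-speed VM, input megabits)
    (10.0, 400.0),
    (6.0, 1200.0),
    (8.0, 100.0),
]
HOSTS = [            # (relative CPU speed, link bandwidth in Mbit/s)
    (1.0, 100.0),
    (2.0, 10.0),     # fast CPU behind a slow link
]

def makespan(placement):
    """placement[i] = host index for task i's VM; each host processes
    its tasks (transfer then compute) back to back."""
    finish = [0.0] * len(HOSTS)
    for (compute, data), host in zip(TASKS, placement):
        speed, bw = HOSTS[host]
        finish[host] += data / bw + compute / speed
    return max(finish)

best = min(product(range(len(HOSTS)), repeat=len(TASKS)), key=makespan)
print(best, makespan(best))
# → (0, 0, 1) 32.0
```

Note how the data-heavy second task stays on the slower CPU with the fast link: ignoring bandwidth, as earlier schedulers did, would send it to the fast host and roughly quadruple the makespan.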
Abstract:
The InteGrade project is a multi-university effort to build a novel grid computing middleware based on the opportunistic use of resources belonging to user workstations. The InteGrade middleware currently enables the execution of sequential, bag-of-tasks, and parallel applications that follow the BSP or the MPI programming models. This article presents the lessons learned over the last five years of the InteGrade development and describes the solutions achieved concerning the support for robust application execution. The contributions cover the related fields of application scheduling, execution management, and fault tolerance. We present our solutions, describing their implementation principles and evaluation through the analysis of several experimental results.
Abstract:
Selective Estrogen Receptor Modulators (SERMs) have been developed, but their selectivity towards the receptor subtypes (ERα or ERβ) is not well understood. Based on the three-dimensional structural properties of the ligand binding domains, a model that takes this aspect into account was developed via molecular interaction fields and consensus principal component analysis (GRID/CPCA).