928 results for Data-driven energy efficiency


Relevance:

100.00%

Publisher:

Abstract:

The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together, these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
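The fluctuation-driven escape described above can be illustrated with a toy one-dimensional Langevin sketch. This is an illustrative assumption, not the paper's fitted network models: noise occasionally pushes activity resting at a stable equilibrium past a locomotion threshold, producing bouts.

```python
import numpy as np

rng = np.random.default_rng(0)

def locomotor_bouts(steps=20000, dt=0.01, tau=1.0, noise=0.8, threshold=1.0):
    """Toy sketch (hypothetical parameters): activity x relaxes toward a
    stable equilibrium at x=0 but is perturbed by Gaussian fluctuations;
    time steps where x exceeds `threshold` count as locomotion."""
    x = 0.0
    walking = np.zeros(steps, dtype=bool)
    for t in range(steps):
        # relaxation toward equilibrium plus ongoing activity fluctuations
        x += dt * (-x / tau) + noise * np.sqrt(dt) * rng.standard_normal()
        walking[t] = x > threshold
    return walking

bouts = locomotor_bouts()
print(f"fraction of time walking: {bouts.mean():.3f}")
```

Without the noise term, the activity would settle at the equilibrium and never cross the threshold; the fluctuations alone generate the intermittent bout structure.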

The attached file was created with Scientific WorkPlace (LaTeX).

A data centre is a centralized repository, either physical or virtual, for the storage, management, and dissemination of data and information organized around a particular body, and it is a nerve centre of the present IT revolution. Data centres are expected to serve uninterruptedly round the year, and in doing so they consume enormous amounts of energy. Tremendous growth in demand from the IT industry has made it customary to develop newer technologies for the better operation of data centres. Energy conservation activities in data centres mainly concentrate on the air conditioning system, since it is the major mechanical sub-system and consumes a considerable share of the total power consumption of the data centre. The data centre energy metric is best represented by power utilization efficiency (PUE), defined as the ratio of the total facility power to the IT equipment power. Its value is always greater than one, and a large PUE value indicates that the sub-systems draw more power from the facility and that the data centre performs poorly from the standpoint of energy conservation. PUE values of 1.4 to 1.6 are achievable by proper design and management techniques. Optimizing the air conditioning system brings enormous opportunity for bringing down the PUE value. The air conditioning system can be optimized by two approaches, namely thermal management and air flow management. Thermal management systems have been introduced by some companies, but they are highly sophisticated and costly and have not attracted much attention.
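The PUE metric defined above is a simple ratio; a minimal sketch with a hypothetical helper name and example figures:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power utilization efficiency: total facility power over IT equipment
    power. PUE >= 1 by definition; values near 1 mean most power reaches
    the IT equipment rather than cooling and other overheads."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Example: 1200 kW drawn by the facility, of which 800 kW powers IT gear.
print(round(pue(1200, 800), 2))  # → 1.5, within the achievable 1.4-1.6 band
```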

Data caching is an attractive solution for reducing bandwidth demands and network latency in mobile ad hoc networks. Deploying caches in mobile nodes can reduce the overall traffic considerably. Cache hits eliminate the need to contact the data source frequently, which avoids additional network overhead. In this paper we propose a data discovery and cache management policy for cooperative caching, which reduces the power usage, caching overhead, and delay by reducing the number of control messages flooded into the network. A cache discovery process based on the position coordinates of neighboring nodes is developed for this purpose. The simulation results are promising with respect to the metrics studied.
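A position-based discovery step of the kind described might look as follows. This is a minimal sketch with hypothetical node and record structures, not the paper's actual protocol: instead of flooding a query, the requester only considers neighbors whose advertised coordinates fall within a radius.

```python
import math

def nearest_cached_neighbor(requester, neighbors, item_id, radius):
    """Pick the closest neighbor within `radius` whose cache holds
    `item_id`, avoiding a network-wide flood of control messages.
    Nodes are dicts with a 'pos' coordinate tuple and a 'cache' set."""
    best, best_dist = None, float("inf")
    for node in neighbors:
        dist = math.dist(requester["pos"], node["pos"])
        if dist <= radius and item_id in node["cache"] and dist < best_dist:
            best, best_dist = node, dist
    return best
```

If no nearby node caches the item, the policy would fall back to contacting the data source.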

The theoretical and applied contributions of complexity in economics have taken so many directions and have been so frenetic in recent decades that, to the best of our knowledge, no recent work compiles and analyses them in an integrated way. The objective of this project, therefore, is to develop a situational overview of the different conceptual, theoretical, methodological, and technological applications of the complexity sciences in economics. It also aims to analyse recent trends in the study of the complexity of economic systems and the horizons that the complexity sciences offer for addressing the economic phenomena of the contemporary globalized world.

Background: In many experimental pipelines, clustering of multidimensional biological datasets is used to detect hidden structures in unlabelled input data. Taverna is a popular workflow management system that is used to design and execute scientific workflows and aid in silico experimentation. The availability of fast unsupervised methods for clustering and visualization in the Taverna platform is important to support data-driven scientific discovery in complex and explorative bioinformatics applications. Results: This work presents a Taverna plugin, the Biological Data Interactive Clustering Explorer (BioDICE), that performs clustering of high-dimensional biological data and provides a nonlinear, topology-preserving projection for the visualization of the input data and their similarities. The core algorithm in the BioDICE plugin is the Fast Learning Self Organizing Map (FLSOM), an improved variant of the Self Organizing Map (SOM) algorithm. The plugin generates an interactive 2D map that allows the visual exploration of multidimensional data and the identification of groups of similar objects. The effectiveness of the plugin is demonstrated on a case study related to chemical compounds. Conclusions: The number and variety of available tools, together with its extensibility, have made Taverna a popular choice for the development of scientific data workflows. This work presents a novel plugin, BioDICE, which adds a data-driven knowledge discovery component to Taverna. BioDICE provides an effective and powerful clustering tool, which can be adopted for the explorative analysis of biological datasets.
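Setting FLSOM's specific improvements aside, the underlying SOM training step it builds on can be sketched as follows; the decay constants here are assumptions for illustration, not BioDICE's actual parameters.

```python
import numpy as np

def som_update(weights, x, bmu, t, lr0=0.5, sigma0=1.5):
    """One training step of a standard Self Organizing Map: move the
    best-matching unit (BMU) and its grid neighbours toward sample x,
    with learning rate and neighbourhood width shrinking over time t.
    `weights` has shape (rows, cols, dim); `bmu` is a (row, col) index."""
    lr = lr0 * np.exp(-t / 100.0)
    sigma = sigma0 * np.exp(-t / 100.0)
    rows, cols, _ = weights.shape
    for i in range(rows):
        for j in range(cols):
            grid_dist2 = (i - bmu[0]) ** 2 + (j - bmu[1]) ** 2
            h = np.exp(-grid_dist2 / (2 * sigma ** 2))  # neighbourhood kernel
            weights[i, j] += lr * h * (x - weights[i, j])
    return weights
```

Iterating this update over all samples is what produces the topology-preserving 2D map: nearby grid units end up representing similar input vectors.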

In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solutions, computational efficiency, and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is a fully controlled environment where domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. The data assimilation of the model-predicted velocity and laboratory observations significantly improves model predictions for both turbulence models by adjusting modelled flows in the harbour to match de-errored observations. 3DVAR also makes it possible to identify and quantify shortcomings of the numerical model. Such comprehensive analysis gives an optimal solution based on which numerical model parameters can be estimated. The process of turbulence model optimization by reparameterization and tuning towards an optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing flows or recirculating flows.
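For a linear observation operator, the 3DVAR cost function has a closed-form minimizer; the sketch below is a generic matrix formulation under that assumption, not DIVAST's actual implementation.

```python
import numpy as np

def threedvar_analysis(xb, B, y, H, R):
    """Minimizer of the 3DVAR cost function
        J(x) = (x - xb)^T B^{-1} (x - xb) + (y - Hx)^T R^{-1} (y - Hx)
    for linear H, via the closed-form update x_a = xb + K (y - H xb),
    where K = B H^T (H B H^T + R)^{-1} weighs background error covariance
    B against observation error covariance R."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)
```

With equal background and observation error variances, the analysis lands halfway between the model background and the observation, which is the intuition behind "adjusting modelled flows to match de-errored observations".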

Due to the increase in water demand and hydropower energy, it is becoming more important to operate hydraulic structures in an efficient manner while sustaining multiple demands. In particular, companies, governmental agencies, and consulting offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants, and related elements such as canals, by merging hydrological and reservoir simulation/optimization models with various numerical weather predictions and radar and satellite data. The model performance is highly related to the streamflow forecast, the related uncertainty, and its consideration in the decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on an earlier developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, which is the main water supply of Kocaeli City, located in the northwestern part of Turkey. The reservoir represents a typical example by its limited capacity, downstream channel restrictions, and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main input, and the flow forecasts are made for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Thus, compared with the deterministic approach, it provides the operator with risk ranges in terms of spillway discharges and reservoir level.
The framework is fully data-driven, applicable, and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
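Turning equally probable EPS scenarios into operator-facing risk ranges can be as simple as taking percentiles across ensemble members; a minimal sketch with a hypothetical function name and percentile choices:

```python
import numpy as np

def risk_range(ensemble_levels, lower_pct=10, upper_pct=90):
    """Summarize equally probable ensemble scenarios (one row per EPS
    member, one column per forecast step, e.g. simulated reservoir
    levels) into a lower/upper risk band per step, as an alternative
    to a single deterministic trajectory."""
    ensemble_levels = np.asarray(ensemble_levels, dtype=float)
    return (np.percentile(ensemble_levels, lower_pct, axis=0),
            np.percentile(ensemble_levels, upper_pct, axis=0))
```

A widening band over the forecast horizon shows the operator exactly how uncertainty spreads into the future, which a single deterministic trajectory cannot convey.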

Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extrapolate the point-scale values registered at weather stations to the watershed scale. In mountainous areas, where the monitoring network ineffectively covers the complex terrain heterogeneity, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent software tool. The individual interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007) and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event-duration, or annual scales. It offers its own pre- and post-processing tools, including video output, map printing, and the possibility of exporting the maps to images or ASCII ArcGIS formats. This study presents the friendly user interface of the software and shows some case studies with applications to hydrological modeling.
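The height correction for temperature mentioned above can be illustrated with a simple lapse-rate adjustment; the constant below is the standard-atmosphere lapse rate, an assumption for illustration rather than the calibrated value from Herrero et al. (2007).

```python
def correct_temperature(t_station_c, z_station_m, z_target_m,
                        lapse_rate_c_per_m=-0.0065):
    """Extrapolate a station temperature to a target elevation by applying
    a lapse rate (here -6.5 °C/km, the standard-atmosphere value) over the
    elevation difference; useful where stations poorly sample the terrain."""
    return t_station_c + lapse_rate_c_per_m * (z_target_m - z_station_m)

# Station at 500 m reads 15 °C; estimate at a grid cell at 1500 m.
print(correct_temperature(15.0, 500, 1500))  # → 8.5
```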

Inelasticity distributions in high-energy p-nucleus collisions are computed in the framework of the interacting gluon model, with the impact-parameter fluctuation included. A proper account of the peripheral events by this fluctuation has been shown to be vital for the overall agreement with several reported data. The energy dependence is found to be weak.

This work presents practical results of systematic attention given to the processing and seismic interpretation of some land lines from the Tacutu graben (Brazil) dataset, to which fundamental stages of the data-driven WIT imaging system for CRS (Common Reflection Surface) stacking were applied. As a result, we expect to establish a workflow for the seismic re-evaluation of sedimentary basins. Based on the wavefront attributes resulting from the CRS stack, a smooth velocity macro-model was obtained through tomographic inversion. Using this macro-model, pre- and post-stack depth migration was performed. In addition, other CRS-stack-based techniques were applied in parallel, such as residual static correction and limited-aperture migration based on the projected Fresnel zone. A geological interpretation of the stacked and migrated sections was sketched. From the visual details of the panels it is possible to interpret unconformities, thinnings, and a main faulted anticline with sets of horsts and grabens. Also, part of the selected line requires more detailed processing to better reveal any structures present in the subsurface.

A measurement of differential cross sections for the production of a pair of isolated photons in proton-proton collisions at √s = 7 TeV is presented. The data sample corresponds to an integrated luminosity of 5.0 fb⁻¹ collected with the CMS detector. A data-driven isolation template method is used to extract the prompt diphoton yield. The measured cross section for two isolated photons, with transverse energy above 40 and 25 GeV respectively, in the pseudorapidity range |η| < 2.5 (excluding |η| ∈ [1.44, 1.57]) and with an angular separation ΔR > 0.45, is 17.2 ± 0.2 (stat) ± 1.9 (syst) ± 0.4 (lumi) pb. Differential cross sections are measured as a function of the diphoton invariant mass, the diphoton transverse momentum, the azimuthal angle difference between the two photons, and the cosine of the polar angle in the Collins-Soper reference frame of the diphoton system. The results are compared to theoretical predictions at leading, next-to-leading, and next-to-next-to-leading order in quantum chromodynamics.

In this thesis, three measurements of the top-antitop differential cross section at a center-of-mass energy of 7 TeV are presented, as a function of the transverse momentum, the mass, and the rapidity of the top-antitop system. The analysis has been carried out on a data sample of about 5 fb⁻¹ recorded with the ATLAS detector. The events have been selected with a cut-based approach in the "one lepton plus jets" channel, where the lepton can be either an electron or a muon. The most relevant backgrounds (multi-jet QCD and W+jets) have been extracted using data-driven methods; the others (Z+jets, diboson, and single top) have been simulated with Monte Carlo techniques. The final, background-subtracted distributions have been corrected, using unfolding methods, for detector and selection effects. Finally, the results have been compared with the theoretical predictions. The measurements are dominated by systematic uncertainties and show no relevant deviation from the Standard Model predictions.

The Standard Model of particle physics is a very successful theory which describes nearly all known processes of particle physics very precisely. Nevertheless, there are several observations which cannot be explained within the existing theory. In this thesis, two analyses with high-energy electrons and positrons using data of the ATLAS detector are presented: one probing the Standard Model of particle physics and another searching for phenomena beyond the Standard Model.
The production of an electron-positron pair via the Drell-Yan process leads to a very clean signature in the detector with low background contributions. This allows for a very precise measurement of the cross-section and can be used as a precision test of perturbative quantum chromodynamics (pQCD), where this process has been calculated at next-to-next-to-leading order (NNLO). The invariant mass spectrum m_ee is sensitive to parton distribution functions (PDFs), in particular to the poorly known distribution of antiquarks at large momentum fraction (Bjorken x). The measurement of the high-mass Drell-Yan cross-section in proton-proton collisions at a center-of-mass energy of √s = 7 TeV is performed on a dataset collected with the ATLAS detector, corresponding to an integrated luminosity of 4.7 fb⁻¹. The differential cross-section of pp → Z/γ + X → e⁺e⁻ + X is measured as a function of the invariant mass in the range 116 GeV < m_ee < 1500 GeV. The background is estimated using a data-driven method and Monte Carlo simulations. The final cross-section is corrected for detector effects and different levels of final-state radiation corrections. A comparison is made to various event generators and to predictions of pQCD calculations at NNLO. A good agreement within the uncertainties between measured cross-sections and Standard Model predictions is observed.
Examples of observed phenomena which cannot be explained by the Standard Model are the amount of dark matter in the universe and neutrino oscillations. To explain these phenomena, several extensions of the Standard Model have been proposed, some of them leading to new processes with a high multiplicity of electrons and/or positrons in the final state. A model-independent search in multi-object final states, with objects defined as electrons and positrons, is performed to search for these phenomena. The dataset collected at a center-of-mass energy of √s = 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹, is used. The events are separated into different categories using the object multiplicity. The data-driven background method already used for the cross-section measurement was developed further for up to five objects, to obtain an estimate of the number of events including fake contributions. Within the uncertainties, the comparison between data and Standard Model predictions shows no significant deviations.

Wind energy has been one of the fastest growing sectors of the nation's renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years given the aggressive governmental policies for the reduction of fossil fuel dependency. The so-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has grown exponentially over time. However, safety and economic concerns have emerged as a result of the new design tendencies for massive-scale wind turbine structures presenting high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information regarding actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, have called for the use of a more sophisticated mathematical framework that can properly handle all these sources of indeterminacy. The modeling complexity that arises as a result of these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model.
For this aim, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven numerical schemes, capable of predicting actual dynamic states (eigenrealizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric based on the so-called Subspace Realization Theory is proposed, now adapted for stochastic, non-stationary, and time-varying systems, as is the case of a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components the wind turbine is made of. In the long run, both the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, also known as an Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
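The Modal Assurance Criterion that drives the model-updating engine is a normalized squared inner product between mode-shape vectors; a minimal real-valued sketch:

```python
import numpy as np

def mac(phi_i, phi_j):
    """Modal Assurance Criterion between two real mode-shape vectors:
        MAC = |phi_i^T phi_j|^2 / ((phi_i^T phi_i) (phi_j^T phi_j)).
    MAC → 1 for collinear (well-correlated) modes, → 0 for orthogonal ones,
    so it is insensitive to mode-shape scaling."""
    phi_i = np.asarray(phi_i, dtype=float)
    phi_j = np.asarray(phi_j, dtype=float)
    num = np.dot(phi_i, phi_j) ** 2
    den = np.dot(phi_i, phi_i) * np.dot(phi_j, phi_j)
    return num / den

print(mac([1, 2, 3], [2, 4, 6]))  # → 1.0 (same shape up to scale)
```

In a model-updating loop like the ASA engine described above, candidate parameter sets would be scored by how close the MAC between identified and simulated mode shapes gets to one.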