43 results for "Multi-scale hierarchical framework"
Abstract:
Multi-relational data mining enables pattern mining from multiple tables. Existing multi-relational association rule mining algorithms cannot process large volumes of data, because the amount of memory required exceeds the amount available. The proposed algorithm, MR-Radix, provides a framework that optimizes memory usage and applies the concept of partitioning to handle large volumes of data. The original contribution of this proposal is to deliver superior performance compared to related algorithms and to successfully complete the task of mining association rules in large databases, bypassing the limit of available memory. One of the tests showed that MR-Radix uses fourteen times less memory than GFP-growth. © 2011 IEEE.
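The partitioning idea the abstract relies on can be sketched generically (this is not the MR-Radix algorithm itself, whose details are not given here): transactions are split into chunks small enough to fit in memory, each chunk is mined independently, and only the union of locally frequent itemsets is verified in one final full pass.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets_partitioned(transactions, min_support, n_partitions=2, max_size=2):
    """Partition-based frequent itemset mining: each partition is mined
    independently (so it fits in memory), then the union of locally
    frequent itemsets is checked with a single global counting pass."""
    n = len(transactions)
    size = (n + n_partitions - 1) // n_partitions
    candidates = set()
    # Pass 1: itemsets frequent in at least one partition become candidates.
    for start in range(0, n, size):
        part = transactions[start:start + size]
        local_min = min_support * len(part) / n  # scale threshold to partition
        counts = Counter()
        for t in part:
            for k in range(1, max_size + 1):
                for combo in combinations(sorted(t), k):
                    counts[combo] += 1
        candidates |= {c for c, v in counts.items() if v >= local_min}
    # Pass 2: one full scan keeps only globally frequent itemsets.
    global_counts = Counter()
    for t in transactions:
        s = set(t)
        for c in candidates:
            if s.issuperset(c):
                global_counts[c] += 1
    return {c: v for c, v in global_counts.items() if v >= min_support}

txns = [{"a", "b"}, {"a", "c"}, {"a", "b"}, {"b", "c"}]
result = frequent_itemsets_partitioned(txns, min_support=2)
```

The key property (any globally frequent itemset is frequent in at least one partition) guarantees that the final pass misses nothing, which is what lets such algorithms trade one extra scan for a bounded memory footprint.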
Abstract:
A multi-agent system with a percolation approach to simulate the driving patterns of Plug-in Electric Vehicles (PEVs), especially suited to simulating PEV behavior on any distribution system, is presented. This tool is intended to complement driving-pattern databases on systems where that kind of information is not available, so this paper aims to provide a framework able to work with any kind of PEV technology and generated load. The service zone is divided into several sub-zones; each sub-zone is considered an independent agent identified with a corresponding load level, and its relationships with the neighboring zones are represented as network probabilities. A percolation approach is used to characterize the autonomy of the PEV batteries to move through the city. The methodology is tested with data from the real distribution system of a mid-size city. The results show the sub-areas where PEV batteries will need to be recharged and give distribution system planners the necessary input for medium- to long-term network planning in a smart grid environment. © 2012 IEEE.
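The battery-autonomy analysis can be illustrated with a minimal sketch (the paper's agents and network probabilities are not reproduced; the grid, start zone, and range below are hypothetical): a breadth-first search over the sub-zone grid finds which zones a PEV can reach before exhausting its battery range.

```python
from collections import deque

def reachable_zones(grid, start, battery_range):
    """BFS over a grid of sub-zones: a PEV starting at `start` can visit
    any passable zone (grid value 1) within `battery_range` moves -- a
    simplified stand-in for a percolation-style autonomy analysis."""
    rows, cols = len(grid), len(grid[0])
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        (r, c), d = queue.popleft()
        if d == battery_range:
            continue  # battery exhausted along this path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), d + 1))
    return seen

city = [
    [1, 1, 0],
    [0, 1, 1],
    [1, 1, 1],
]
zones = reachable_zones(city, (0, 0), battery_range=2)
```

In a percolation reading, sub-zones outside the reachable set are where recharging infrastructure would be needed for the vehicle to "percolate" through the city.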
Abstract:
A recently proposed scenario for baryogenesis, called post-sphaleron baryogenesis (PSB), is discussed within a class of quark-lepton unified frameworks based on the gauge symmetry SU(2)_L × SU(2)_R × SU(4)_c realized at the multi-TeV scale. The baryon asymmetry of the Universe in this model is produced below the electroweak phase transition temperature, after the sphalerons have decoupled from the Hubble expansion. These models naturally embed the seesaw mechanism for neutrino masses and predict color-sextet scalar particles in the TeV range, which may be accessible to the LHC experiments. A necessary consequence of this scenario is the baryon-number-violating ΔB = 2 process of neutron-antineutron (n-n̄) oscillations. In this paper we show that the constraints of PSB, when combined with the neutrino oscillation data and restrictions from flavor-changing neutral currents mediated by the colored scalars, imply an upper limit on the n-n̄ oscillation time of 5×10^10 s regardless of the quark-lepton unification scale. If this scale is relatively low, in the 200-250 TeV range, τ_{n-n̄} is predicted to be less than 10^10 s, which is accessible to the next generation of proposed experiments. © 2013 American Physical Society.
Abstract:
The effectiveness of ecological restoration actions toward biodiversity conservation depends on both local and landscape constraints. Extensive information on local constraints is already available, but few studies consider the landscape context when planning restoration actions. We propose a multiscale framework based on the landscape attributes of habitat amount and connectivity to infer landscape resilience and to set priority areas for restoration. Landscapes with intermediate habitat amount and where connectivity remains sufficiently high to favor recolonization were considered to be of intermediate resilience, with a high likelihood of restoration effectiveness, and thus were designated as priority areas for restoration actions. The proposed method consists of three steps: (1) quantifying habitat amount and connectivity; (2) using landscape ecology theory to identify intermediate-resilience landscapes based on habitat amount, percolation theory, and landscape connectivity; and (3) ranking landscapes according to their importance as corridors or bottlenecks for biological flows on a broader scale, based on a graph theory approach. We present a case study for the Brazilian Atlantic Forest (approximately 150 million hectares) in order to demonstrate the proposed method. For the Atlantic Forest, landscapes that present high restoration effectiveness represent only 10% of the region, but contain approximately 15 million hectares that could be targeted for restoration actions (an area similar to today's remaining forest extent). The proposed method represents a practical way to both plan restoration actions and optimize biodiversity conservation efforts by focusing on landscapes that would result in greater conservation benefits. © 2013 Society for Ecological Restoration.
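Step 2 of the three-step method might be sketched as follows; the thresholds below are illustrative placeholders, not the values used by the authors:

```python
def priority_landscapes(metrics, amount_range=(0.2, 0.5), min_connectivity=0.6):
    """Flag landscapes of intermediate resilience: habitat amount inside
    `amount_range` (an illustrative band loosely motivated by percolation
    theory) and connectivity still high enough to favor recolonization."""
    lo, hi = amount_range
    return [name for name, (amount, conn) in sorted(metrics.items())
            if lo <= amount <= hi and conn >= min_connectivity]

landscapes = {                 # (habitat fraction, connectivity index)
    "A": (0.05, 0.9),          # too little habitat remaining
    "B": (0.35, 0.8),          # intermediate amount, well connected
    "C": (0.35, 0.3),          # intermediate amount but poorly connected
    "D": (0.80, 0.9),          # already high habitat amount
}
priorities = priority_landscapes(landscapes)
```

Only landscape "B" is flagged: it sits in the intermediate habitat band and stays connected, the combination the framework identifies as most cost-effective to restore. Step 3 (corridor/bottleneck ranking) would then order such candidates with graph-theoretic centrality measures.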
Abstract:
In different regions of Brazil, population growth and economic development can degrade water quality, compromising watershed health and human water supply. Because of its ability to combine spatial and temporal data in the same environment and to create water resources management (WRM) models, a Geographical Information System (GIS) is a powerful tool for managing water resources, preventing floods and estimating water supply. This paper discusses the integration between GIS and hydrological models and presents a case study relating to the upper section of the Paraíba do Sul Basin (São Paulo State portion), situated in the Southeast of Brazil. The case study presented in this paper has a database suitable for the basin's dimensions, including digitized topographic maps at 1:50,000 scale. From an ArcGIS®/ArcHydro Framework Data Model, a geometric network was created to produce different raster products. The first grid derived from the digital elevation model (DEM) is the flow direction map, followed by the flow accumulation, stream, and catchment maps. The next steps in this research are to include the different multipurpose reservoirs situated along the Paraíba do Sul River and to incorporate rainfall time series data in ArcHydro to build a hydrologic data model within a GIS environment in order to produce a comprehensive spatial-temporal model.
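The first raster product mentioned, the flow direction map, is conventionally derived with the D8 rule: each cell drains toward its steepest-downhill neighbor. A minimal sketch on a toy DEM (ArcHydro's actual direction encoding and pit-filling preprocessing are not reproduced here):

```python
def d8_flow_direction(dem):
    """D8 flow direction from a DEM: each cell points to the neighbor with
    the steepest drop (elevation difference over distance), or None for
    pits/flats with no downhill neighbor."""
    rows, cols = len(dem), len(dem[0])
    directions = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best_drop, best = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        dist = (dr * dr + dc * dc) ** 0.5  # diagonals cost sqrt(2)
                        drop = (dem[r][c] - dem[nr][nc]) / dist
                        if drop > best_drop:
                            best_drop, best = drop, (dr, dc)
            directions[r][c] = best
    return directions

dem = [
    [9, 8, 7],
    [8, 6, 5],
    [7, 5, 3],
]
flow = d8_flow_direction(dem)
```

Flow accumulation, the next grid in the chain, is then obtained by counting, for every cell, how many upstream cells drain into it along these directions.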
Abstract:
Bio-molecular computing, 'computations performed by bio-molecules', is already challenging traditional approaches to computation both theoretically and technologically. Often placed within the wider context of 'bio-inspired', 'natural' or even 'unconventional' computing, the study of natural and artificial molecular computations is adding to our understanding of biology, the physical sciences and computer science well beyond the framework of existing design and implementation paradigms. In this introduction, we wish to outline the current scope of the field and assemble some basic arguments that bio-molecular computation is of central importance to computer science, the physical sciences and biology, using HOL (Higher Order Logic). HOL is used as the computational tool in our R&D work. DNA was analyzed as a chemical computing engine in our effort to develop novel formalisms to understand molecular-scale bio-chemical computing behavior using HOL. In our view, this focus is one of the pioneering efforts in the promising domain of nano-bio scale chemical information processing dynamics.
Abstract:
This paper presents a multi-agent system for the real-time operation of a simulated microgrid using the Smart-Grid Test Bed at Washington State University. The multi-agent system (MAS) was developed in JADE (Java Agent DEvelopment Framework), a Foundation for Intelligent Physical Agents (FIPA) compliant open-source multi-agent platform. The proposed operational strategy focuses on using appropriate energy management and control strategies to improve the operation of an islanded microgrid formed by photovoltaic (PV) solar energy, batteries, and resistive and rotating-machine loads. The focus is on resource management and on avoiding impacts on loads from abrupt variations or interruptions that change the operating conditions. The management and control of the PV system is performed in JADE, while the microgrid model is simulated in RSCAD/RTDS (Real-Time Digital Simulator). Finally, the outcome of the simulation studies demonstrated the feasibility of the proposed multi-agent approach for real-time operation of a microgrid.
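The kind of rule an energy-management agent applies each control step can be sketched generically (a hypothetical single-step dispatch, not the paper's JADE/RTDS implementation): PV serves the load first, the battery absorbs any surplus or covers the deficit, and only the remaining shortfall is shed.

```python
def dispatch_step(pv_kw, load_kw, soc_kwh, capacity_kwh, dt_h=1.0):
    """One illustrative energy-management step for an islanded microgrid.
    Returns the updated battery state of charge and the load shed (kW).
    A generic sketch; the paper's agent logic is not reproduced here."""
    net = pv_kw - load_kw
    if net >= 0:
        # PV surplus charges the battery, clipped at capacity.
        soc_kwh = min(capacity_kwh, soc_kwh + net * dt_h)
        shed_kw = 0.0
    else:
        # Deficit is drawn from the battery; anything left is shed.
        available_kw = soc_kwh / dt_h
        discharge_kw = min(-net, available_kw)
        soc_kwh -= discharge_kw * dt_h
        shed_kw = -net - discharge_kw
    return soc_kwh, shed_kw

# 5 kW of PV against an 8 kW load with 2 kWh stored:
soc, shed = dispatch_step(pv_kw=5.0, load_kw=8.0, soc_kwh=2.0, capacity_kwh=10.0)
```

Shedding only the residual deficit, rather than disconnecting whole feeders, is one way to limit the abrupt load impacts the abstract is concerned with.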
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Scintillations are rapid fluctuations in the phase and amplitude of transionospheric radio signals, caused by small-scale plasma density irregularities in the ionosphere. In the case of Global Navigation Satellite System (GNSS) receivers, scintillation can cause cycle slips, degrade positioning accuracy and, when severe enough, even lead to a complete loss of signal lock. Thus, the required levels of availability, accuracy, integrity and reliability for GNSS applications may not be met during scintillation occurrence; this poses a major threat to a large number of modern-day GNSS-based applications. The whole of Latin America, and Brazil in particular, is located in one of the regions most affected by scintillations. These effects will be exacerbated during solar maxima, the next of which is predicted for 2013. This paper presents initial results from research aimed at tackling ionospheric scintillation effects for GNSS users in Latin America. This research is part of the CIGALA (Concept for Ionospheric Scintillation Mitigation for Professional GNSS in Latin America) project, co-funded by the EC Seventh Framework Program and supervised by the GNSS Supervisory Authority (GSA), which aims to develop and test ionospheric scintillation countermeasures to be implemented in multi-frequency, multi-constellation GNSS receivers.
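Amplitude scintillation is conventionally quantified with the S4 index, the normalized standard deviation of received signal intensity, S4 = sqrt((⟨I²⟩ − ⟨I⟩²) / ⟨I⟩²). A minimal sketch (the intensity samples below are made up for illustration):

```python
def s4_index(intensity):
    """S4 amplitude scintillation index: normalized standard deviation of
    signal intensity samples, typically computed over ~60 s windows."""
    n = len(intensity)
    mean = sum(intensity) / n
    mean_sq = sum(i * i for i in intensity) / n
    return ((mean_sq - mean * mean) / (mean * mean)) ** 0.5

calm = s4_index([1.0, 1.0, 1.0, 1.0])       # steady signal -> S4 = 0
disturbed = s4_index([0.2, 1.8, 0.5, 1.5])  # strongly fluctuating signal
```

Receivers of the kind the CIGALA project targets report S4 (and its phase counterpart, σ_φ) per satellite, which is what makes regional scintillation monitoring and mitigation studies possible.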
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This chapter addresses the mismatch between existing knowledge, techniques and management methods for improved soil carbon management and the deficits in their implementation. It gives a short overview of the evolution of the concept of soil carbon, which illustrates the interactions between scientific, industrial, technical, societal and economic change. It then goes on to show that sufficient techniques are available for the large-scale implementation of soil organic carbon (SOC) sequestration. A subsequent analysis of the bottlenecks that prevent implementation identifies where issues need to be addressed in order to enable robust, integrated and sustainable SOC management strategies.
Abstract:
Assuming that neutrinos are Majorana particles, in a three-generation framework, current and future neutrino oscillation experiments can determine six of the nine parameters which fully describe the structure of the neutrino mass matrix. We try to clarify the interplay among the remaining parameters, the absolute neutrino mass scale and two CP-violating Majorana phases, and how they can be accessed by future neutrinoless double beta (0νββ) decay experiments, for the normal as well as the inverted ordering of the neutrino mass spectrum. Assuming the oscillation parameters to be in the range presently allowed by atmospheric, solar, reactor, and accelerator neutrino experiments, we quantitatively estimate the bounds on m_0, the lightest neutrino mass, that can be inferred if the next generation of 0νββ decay experiments can probe the effective Majorana mass (m_ee) down to ∼1 meV. In this context we conclude that, in the case that neutrinos are Majorana particles, (a) if m_0 ≳ 300 meV, i.e., within the range directly attainable by future laboratory experiments as well as astrophysical observations, then m_ee ≳ 30 meV must be observed; (b) if m_0 ≤ 300 meV, results from future 0νββ decay experiments combined with stringent bounds on the neutrino oscillation parameters, especially the solar ones, will place much stronger limits on the allowed values of m_0 than these direct experiments. For instance, if a positive signal is observed around m_ee = 10 meV, we estimate 3 ≲ m_0/meV ≲ 65 at 95% C.L.; on the other hand, if no signal is observed down to m_ee = 10 meV, then m_0 ≲ 55 meV at 95% C.L.
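In the standard parametrization the effective Majorana mass probed by 0νββ experiments is m_ee = |c12² c13² m1 + s12² c13² m2 e^{iα} + s13² m3 e^{iβ}|, where α and β are the two Majorana phases discussed above. A minimal numerical sketch (the input values below are illustrative, not the paper's fitted ranges):

```python
import cmath
import math

def effective_majorana_mass(m1, m2, m3, theta12, theta13, alpha, beta):
    """|m_ee| = |c12^2 c13^2 m1 + s12^2 c13^2 m2 e^{i alpha}
                 + s13^2 m3 e^{i beta}|.
    Masses in meV, mixing angles and Majorana phases in radians."""
    c12, s12 = math.cos(theta12), math.sin(theta12)
    c13, s13 = math.cos(theta13), math.sin(theta13)
    return abs(
        c12**2 * c13**2 * m1
        + s12**2 * c13**2 * m2 * cmath.exp(1j * alpha)
        + s13**2 * m3 * cmath.exp(1j * beta)
    )

# Illustrative normal-ordering-like inputs (meV; angles/phases in radians).
val = effective_majorana_mass(m1=2.0, m2=9.0, m3=50.0,
                              theta12=0.59, theta13=0.15,
                              alpha=0.0, beta=0.0)
```

Varying α and β between 0 and π makes the three terms interfere destructively, which is why, for a given m_0 and fixed oscillation parameters, m_ee spans a band rather than a single value, the interplay the abstract quantifies.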