965 results for Network Data management


Relevance:

90.00%

Publisher:

Abstract:

Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization efforts for providing interoperability among GIS databases focus on the design of open interfaces, but this work has not considered procedures and methods for designing river geospatial data. River geospatial data therefore retains its own model, and when the data are shared through an open interface among heterogeneous GIS databases, differences between models result in a loss of information. In this study, a plan was proposed both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and by improving and redesigning the database. Primary and foreign keys, which distinguish attribute information and entity linkages, were redefined to increase usability. The construction of attribute tables and the entity-relationship diagram were redefined to redesign the linkages among tables from the perspective of a river standard database. In addition, this study sought to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
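
The primary-key/foreign-key redesign described above can be sketched as follows; the table and column names (river_reach, reach_attribute, reach_id) and the sample values are illustrative, not taken from the study's actual schema.

```python
import sqlite3

# Minimal sketch of the redesigned key structure, using SQLite so the
# schema is runnable. Names and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Entity table: each river reach gets a stable primary key.
conn.execute("""
    CREATE TABLE river_reach (
        reach_id   INTEGER PRIMARY KEY,
        river_name TEXT NOT NULL
    )""")

# Attribute table: linked back to the entity by a foreign key, so
# attribute records cannot reference a nonexistent reach.
conn.execute("""
    CREATE TABLE reach_attribute (
        attr_id  INTEGER PRIMARY KEY,
        reach_id INTEGER NOT NULL REFERENCES river_reach(reach_id),
        name     TEXT NOT NULL,
        value    TEXT
    )""")

conn.execute("INSERT INTO river_reach VALUES (1, 'Han River')")
conn.execute(
    "INSERT INTO reach_attribute VALUES (1, 1, 'mean_width_m', '240')")

# Joining on the shared key recovers attributes per entity.
row = conn.execute("""
    SELECT r.river_name, a.name, a.value
    FROM river_reach r JOIN reach_attribute a ON a.reach_id = r.reach_id
""").fetchone()
print(row)  # ('Han River', 'mean_width_m', '240')
```

Enforcing the linkage at the key level is what lets heterogeneous databases exchange entities and their attributes without the silent information loss the abstract describes.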

Relevance:

90.00%

Publisher:

Abstract:

Regional accreditors often desire the same metrics and data collected by professional program reviews, but may use different terminologies to describe this information; as a result, some schools must manually translate or even recollect data already stored. This report profiles strategies to proactively consolidate the language and policies of accreditation to avoid duplication of labor and to efficiently route program accreditation data that will be repurposed in regional review. It also suggests ways to select new technologies that can streamline data collection, storage, and presentation.

Relevance:

90.00%

Publisher:

Abstract:

The objective of this research was to understand how the emergence of network-based styles of public management can contribute to the construction of public policies related to the prevention and repression of money laundering. The theoretical framework was grounded in theories of interorganizational networks, more specifically public policy networks. Relationships of collaboration and cooperation were studied which, by transcending organizational boundaries, grant greater flexibility and breadth to the process of constructing public policy. Reflections on these theories were developed within a specific empirical context: the articulation among the Brazilian institutions that work on the prevention and repression of money laundering. This context was chosen from the perception that financial crime causes enormous harm to nations that need to overcome economic and social inequality. The methodological choice was motivated by ontological and epistemological questions that pointed to reflexive methodology as the most suitable for achieving the research objectives. From the construction of the empirical data it was possible to create constructs that synthesize the benefits and challenges of building public policy networks. The subsequent interpretive connections reinforce the idea that democratic, republican, and cooperative spirits should guide the values shared in the network. Thus, as the internal workings of the network improve, it becomes possible to overcome obstacles both endogenous (a focus only on good external reputation) and exogenous (pressure from corrupt elites) to the network, resulting in better interaction among participants and more effective outcomes for society.

Relevance:

90.00%

Publisher:

Abstract:

This paper applies decentralized OPF optimization to the AC power flow problem in power systems with interconnected areas operated by different transmission system operators (TSOs). The proposed methodology finds the operating point of a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. The methodology is based on the decomposition of the first-order optimality conditions of the AC power flow, which is formulated as a nonlinear programming problem. To better illustrate the concept of independent operation of each TSO, an artificial neural network has been used to compute the border information of the interconnected TSOs. A multi-area power flow tool can be seen as a basic building block able to address a large number of problems under a multi-TSO competitive market philosophy. The IEEE RTS-96 power system is used to show the operation and effectiveness of the decentralized AC power flow. ©2010 IEEE.
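
The coordination pattern, in which each area optimizes locally and exchanges only border values, can be illustrated with a toy fixed-point iteration. This is a deliberately simplified stand-in (quadratic local costs with closed-form minimizers), not the paper's actual AC power-flow equations.

```python
# Toy illustration of decentralized coordination via border-variable
# exchange: each area minimizes a private quadratic cost over the shared
# tie-line variable t, seeing only the neighbor's last value of t.

def area_step(local_target, t_neighbor):
    """Locally minimize (t - local_target)^2 + (t - t_neighbor)^2.
    Closed form: t = (local_target + t_neighbor) / 2."""
    return (local_target + t_neighbor) / 2.0

t1, t2 = 0.0, 0.0            # each area's current border estimate
for _ in range(50):          # iterate until the border values agree
    t1 = area_step(2.0, t2)  # area 1 updates using area 2's border value
    t2 = area_step(-1.0, t1) # area 2 updates using area 1's border value

print(round(t1, 3), round(t2, 3))
```

Neither area ever sees the other's internal data (here, the private targets 2.0 and -1.0); agreement is reached purely through the exchanged tie-line values, which is the essence of the decomposition.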

Relevance:

90.00%

Publisher:

Abstract:

In this paper a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem by iteratively solving smaller subproblems, each associated with one area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it requires solving only 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
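
The 2n-evaluation idea behind the TPM can be sketched as follows. This uses the simplified symmetric two-point scheme (zero-skewness locations mu_k ± sqrt(n)·sigma_k with weights 1/(2n)); the function f is an arbitrary nonlinear stand-in for a power flow, not the paper's model.

```python
import math
import random

def f(loads):
    # Placeholder "power flow": any nonlinear mapping of the n loads.
    return sum(l ** 2 for l in loads)

mu = [1.0, 2.0, 3.0]       # load means (illustrative)
sigma = [0.1, 0.2, 0.3]    # load standard deviations (illustrative)
n = len(mu)

# TPM: 2n deterministic evaluations instead of thousands of samples.
e_tpm = 0.0
for k in range(n):
    for sign in (+1.0, -1.0):
        point = list(mu)
        point[k] = mu[k] + sign * math.sqrt(n) * sigma[k]
        e_tpm += f(point) / (2 * n)

# Monte Carlo reference for comparison.
random.seed(0)
samples = [f([random.gauss(m, s) for m, s in zip(mu, sigma)])
           for _ in range(100_000)]
e_mc = sum(samples) / len(samples)

print(round(e_tpm, 3), round(e_mc, 3))
```

With n = 3 loads the TPM needs only 6 evaluations of f, while the Monte Carlo reference needs 100,000; the two estimates of E[f] agree closely, which is the trade-off the abstract highlights.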

Relevance:

90.00%

Publisher:

Abstract:

Automation in data management and analysis has become a crucial factor for companies that need efficient solutions in an increasingly competitive corporate world. The explosion in data volume, which has kept growing in recent years, demands ever more effort in finding strategies to manage and, above all, extract valuable strategic information through data mining algorithms, which commonly need to perform exhaustive searches over the database to obtain the statistics that solve or optimize the parameters of the knowledge-extraction model in use; this process requires intensive computation and frequent database access. Given their efficiency in handling uncertainty, Bayesian networks have been widely used in this process; however, as the volume of data (records and/or attributes) grows, extracting relevant information from a knowledge base becomes ever more costly and time-consuming. The focus of this work is to propose a new approach to optimizing the learning of Bayesian network structure in a Big Data context through the MapReduce process, with a view to improving processing time. To this end, a new methodology was developed that includes the creation of an intermediate database containing all the probabilities needed to compute the network structure. The analyses presented in this study show that combining the proposed methodology with the MapReduce process is a good alternative for solving the scalability problem in the frequency-search steps of the K2 algorithm and, consequently, for reducing the response time of network generation.
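
The "intermediate database" idea can be sketched as a MapReduce-style counting pass: in one scan over the records, precompute every (variable, parent values, value) count that the K2 frequency search would otherwise re-query. The variable names and the tiny dataset below are illustrative, not the work's actual data.

```python
from collections import Counter
from itertools import chain

records = [
    {"rain": 1, "sprinkler": 0, "wet": 1},
    {"rain": 0, "sprinkler": 1, "wet": 1},
    {"rain": 0, "sprinkler": 0, "wet": 0},
    {"rain": 1, "sprinkler": 0, "wet": 1},
]

def map_record(record, child, parents):
    # Map phase: emit one keyed count per record.
    parent_vals = tuple(record[p] for p in parents)
    yield ((child, parent_vals, record[child]), 1)

def reduce_counts(pairs):
    # Reduce phase: sum counts by key.
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return counts

# Counts for child "wet" with candidate parents ("rain", "sprinkler"):
# exactly the sufficient statistics the K2 score needs for this family.
table = reduce_counts(chain.from_iterable(
    map_record(r, "wet", ("rain", "sprinkler")) for r in records))
print(dict(table))
```

Because both phases are embarrassingly parallel over records and keys, the same pass scales out on a MapReduce cluster, which is what removes the exhaustive per-query database scans from the K2 search loop.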

Relevance:

90.00%

Publisher:

Abstract:

This paper analyses the scientific collaboration network formed by the Brazilian universities that carry out research in the dentistry area. The network is constructed from the documents published in the Scopus (Elsevier) database over a period of ten years. Social network analysis is used as the methodological approach best suited to visualizing the capacity for collaboration, dissemination, and transmission of new knowledge among universities. The cohesion and density of the collaboration network are analyzed, as well as the centrality of universities as key actors and the occurrence of subgroups within the network. Data were analyzed using the UCINET and NetDraw software. The number of documents published by each university was used as an indicator of its scientific production.
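
Two of the measures mentioned, density and degree centrality, can be computed directly from an adjacency list; the toy undirected co-authorship network below uses placeholder labels, not results from the study.

```python
# Toy undirected collaboration network: each edge is a co-authorship tie.
edges = {("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")}
nodes = sorted({u for e in edges for u in e})
n = len(nodes)

# Density: observed edges over the n*(n-1)/2 possible undirected edges.
density = len(edges) / (n * (n - 1) / 2)

# Normalized degree centrality: number of neighbors over (n - 1).
degree = {v: sum(v in e for e in edges) / (n - 1) for v in nodes}

print(density, degree)
```

These are the same quantities UCINET reports for a dichotomized co-authorship matrix; here node C, tied to every other node, is the most central actor.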

Relevance:

90.00%

Publisher:

Abstract:

Nowadays companies generate great amounts of data from different sources; however, some of them produce more data than they can analyze. Big Data is a set of data that grows very fast, collected many times during a short period of time. This work focuses on the importance of the correct management of Big Data in an industrial plant. Through a case study of a company in the pulp and paper sector, the resolution of its problems through appropriate data management is presented. In the final chapters, the results achieved by the company are discussed, showing how the correct choice of the data to be monitored and analyzed brought benefits to the company, and best practices for Big Data management are recommended.

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a proposal for an energy management system model using photovoltaic panels. The proposed module monitors photovoltaic panels, whose power generation is intermittent due to environmental or load conditions, in order to control the coupling between the panel and the load through the charge controller, aiming to keep the panel operating as close as possible to the maximum power transfer point. For this, the maximum power point tracking (MPPT) technique was implemented in LabVIEW software, using the NI myDAQ data acquisition card. In addition, a remote-access module for the controller was implemented, based on the sharing of network data, so that the performance of the panels can be monitored and controlled from a tablet with no need for direct contact with the supervisory server.
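
The abstract names MPPT but not the specific algorithm; one common choice, perturb and observe, can be sketched as below. The parabolic P(V) curve is a toy stand-in for a real panel model, and the numbers are illustrative.

```python
# Perturb-and-observe MPPT sketch (assumed algorithm, not confirmed by
# the paper): nudge the operating voltage, keep the direction while
# power rises, reverse it when power falls.

def panel_power(v):
    # Toy power-voltage curve: peak of 100 W at 17 V.
    return max(0.0, 100.0 - (v - 17.0) ** 2)

v, step = 12.0, 0.5       # starting voltage and perturbation size
p_prev = panel_power(v)
for _ in range(100):
    v += step
    p = panel_power(v)
    if p < p_prev:        # power dropped: reverse the perturbation
        step = -step
    p_prev = p

print(round(v, 1), round(p_prev, 1))
```

The tracker climbs to the knee of the curve and then oscillates within one step of the maximum power point, which is the characteristic steady-state behavior of perturb and observe.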

Relevance:

90.00%

Publisher:

Abstract:

Background: Dizziness is a common complaint among older adults and has been linked to a wide range of health conditions and psychological and social characteristics in this population. However, a profile of dizziness is still uncertain, which hampers clinical decision-making. We therefore sought to explore the relationship between dizziness and a comprehensive range of demographic data, diseases, health and geriatric conditions, and geriatric syndromes in a representative sample of community-dwelling older people. Methods: This is a cross-sectional, population-based study derived from FIBRA (Network for the Study of Frailty in Brazilian Elderly Adults), with 391 elderly adults, both men and women, aged 65 years and older. Elderly participants living at home in an urban area were enrolled through random cluster sampling of census regions. The outcome variable was self-reported dizziness in the last year. Several sensations of dizziness were investigated, including vertigo, spinning, light- or heavy-headedness, floating, fuzziness, giddiness, and instability. A multivariate logistic regression analysis was conducted to estimate adjusted odds ratios and build the probability model for dizziness. Results: Dizziness was reported by 45% of elderly adults, of whom 71.6% were women (p = 0.004). The multivariate regression analysis revealed that dizziness is associated with depressive symptoms (OR = 2.08; 95% CI 1.29–3.35), perceived fatigue (OR = 1.93; 95% CI 1.21–3.10), recurring falls (OR = 2.01; 95% CI 1.11–3.62), and excessive drowsiness (OR = 1.91; 95% CI 1.11–3.29). The discrimination of the final model was AUC = 0.673 (95% CI 0.619–0.727) (p < 0.001). Conclusions: The prevalence of dizziness in community-dwelling elderly adults is substantial. It is associated with other common geriatric conditions usually neglected in elderly adults, such as fatigue and drowsiness, supporting its possible multifactorial manifestation. Our findings demonstrate the need to expand the design in future studies, aiming to estimate risk and identify possible causal relations.

Relevance:

90.00%

Publisher:

Abstract:

In fluid dynamics research, pressure measurements are of great importance for characterizing the flow field acting on aerodynamic surfaces. Indeed, the experimental approach is fundamental for avoiding the complexity of the mathematical models used to predict fluid phenomena. It is important to note that when in-situ sensors are used to monitor pressure over large domains with highly unsteady flows, classical techniques run into several problems related to transducer cost, intrusiveness, time response, and operating range. An interesting approach for satisfying these sensor requirements is to implement a sensor network capable of acquiring pressure data on an aerodynamic surface using a wireless communication system, collecting the pressure data with the lowest possible level of environmental intrusion. In this thesis a wireless sensor network for pressure in fluid fields has been designed, built, and tested. To develop the system, a capacitive pressure sensor based on a polymeric membrane, together with microcontroller-based readout circuitry, has been designed, built, and tested. Wireless communication has been implemented on the Zensys Z-WAVE platform, and network and data management have been put in place. Finally, the full embedded system with antenna has been created. As a proof of concept, monitoring the pressure on the top of the mainsail of a sailboat was chosen as a working example.
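
The capacitive readout chain can be sketched under a parallel-plate small-deflection assumption: C = eps * A / d, with the membrane gap d shrinking roughly linearly with applied pressure, d = d0 - k * p. All constants below are illustrative placeholders, not the thesis's calibration values.

```python
# Hypothetical forward/inverse model for a membrane capacitive sensor.
EPS0 = 8.854e-12   # vacuum permittivity, F/m
AREA = 1.0e-4      # electrode area, m^2 (hypothetical)
D0   = 50e-6       # rest gap, m (hypothetical)
K    = 1.0e-11     # gap change per pascal, m/Pa (hypothetical)

def capacitance(pressure_pa):
    """Forward model: capacitance for a given applied pressure."""
    return EPS0 * AREA / (D0 - K * pressure_pa)

def pressure_from_capacitance(c):
    """Inverse model the readout firmware would apply."""
    return (D0 - EPS0 * AREA / c) / K

# Round-trip check at 200 Pa: measure C, recover the pressure.
c = capacitance(200.0)
p_rt = pressure_from_capacitance(c)
print(round(p_rt, 6))
```

In the real system this inversion would run on the microcontroller before each reading is handed to the Z-WAVE radio, so that only calibrated pressure values travel over the network.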

Relevance:

90.00%

Publisher:

Abstract:

This study, carried out in collaboration with Hera, is an analysis of waste management in Bologna. The research was conducted on several levels: a strategic level, whose aim is to identify new waste-collection methods as a function of the characteristics of the city's territory; an analytical level, concerning the improvement of the supporting software applications; and an environmental level, concerning the calculation of atmospheric emissions from the vehicles used for waste collection and transport. First of all, it was necessary to study Bologna and the current state of its waste-collection services. It is by crossing these components that changes in the waste-management sector have been made over the last three years. The following chapters concern the software applications supporting these activities: Siget and Optit. Siget is the service-management program currently used for all activities connected with waste collection. It is a program made up of different modules, but limited to data management alone. The experimentation with Optit added to data management the possibility of viewing those data on a map and of associating a routing algorithm with them. The data stored in Siget were the starting point, the input, and reaching all the collection points was the final objective. The last chapter concerns the study of the environmental impact of these waste-collection routes. This analysis, based on empirical evaluation and on the implementation in Excel of the Corinair formulas, gives a snapshot of the service in 2010. On this aspect Optit provided its added value by also implementing the emission-calculation formulas in its algorithm.
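
The Corinair-style calculation implemented in Excel amounts to a distance-weighted sum of per-kilometre emission factors that depend on the driving regime. The sketch below illustrates the structure only; the factors are placeholders, not official Corinair coefficients.

```python
# Illustrative emission estimate for one collection route.
EF_G_PER_KM = {            # hypothetical CO2-equivalent factors, g/km
    "urban_stop_go": 1350.0,
    "urban_flow":    950.0,
    "extra_urban":   780.0,
}

route_legs = [             # (driving regime, distance in km), invented
    ("urban_stop_go", 6.2),
    ("urban_flow", 3.8),
    ("extra_urban", 11.5),
]

total_g = sum(EF_G_PER_KM[regime] * km for regime, km in route_legs)
print(round(total_g / 1000.0, 2), "kg")
```

Embedding this same sum inside the routing algorithm, as Optit did, lets each candidate route be scored for emissions at the moment it is generated rather than in a separate spreadsheet pass.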

Relevance:

90.00%

Publisher:

Abstract:

The CMS experiment at the LHC collected huge volumes of data during Run-1 and is using the shutdown period (LS1) to evolve its computing system. Among the possible improvements, there is wide room for optimizing the use of storage at the Tier-2 computing centres, which represent, within the Worldwide LHC Computing Grid (WLCG), the core of the resources dedicated to distributed analysis on the Grid. This thesis presents a study of the popularity of CMS data in distributed Grid analysis at the Tier-2s. The goal of the work is to equip the CMS computing system with a way to systematically evaluate the amount of disk space at the Tier-2 centres that is written but never accessed, contributing to the construction of an advanced dynamic data-management system that can adapt elastically to different operating conditions, removing unneeded data replicas or adding replicas of the most "popular" data, and thus, ultimately, increasing the overall analysis throughput. Chapter 1 provides an overview of the CMS experiment at the LHC. Chapter 2 describes the CMS Computing Model in general terms, focusing mainly on data management and the infrastructures connected to it. Chapter 3 describes the CMS Popularity Service, giving an overview of the data-popularity services already present in CMS before this work began. Chapter 4 describes the architecture of the toolkit developed for this thesis, laying the groundwork for the following chapter. Chapter 5 presents and discusses the data-popularity studies conducted on the data collected through the previously developed infrastructure. Appendix A collects two examples of the code created to drive the toolkit through which the data are collected and processed.
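
The "written but never accessed" metric can be sketched as a join between a replica catalogue and an access log: sum the space held by datasets with zero recorded accesses. Dataset names and sizes below are invented for illustration, not CMS data.

```python
# Toy cold-storage metric for one Tier-2 site.
replicas_tb = {            # resident datasets and their size in TB
    "/DoubleMu/Run2012A": 4.2,
    "/SingleEl/Run2012B": 6.8,
    "/MinBias/Summer12":  2.5,
}

access_log = [             # dataset names seen in analysis-job records
    "/DoubleMu/Run2012A",
    "/DoubleMu/Run2012A",
    "/MinBias/Summer12",
]

accessed = set(access_log)
cold_tb = sum(size for name, size in replicas_tb.items()
              if name not in accessed)
print(cold_tb)  # disk space eligible for replica removal
```

Running this evaluation periodically per site is what would let a dynamic data manager reclaim cold replicas and reinvest the space in extra copies of popular datasets.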