155 results for Burglar alarms.
Abstract:
One of the current challenges of Ubiquitous Computing is the development of complex applications, which are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware, and developers need to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables easy development of ubiquitous applications via the definition of semantic workflows containing the abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan is a workflow instance whose activities are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform supports execution adaptation in case of service failures, user mobility, and degradation of service quality. OpenCOPI is validated through the development of case studies, specifically applications from the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, compares it with the benefits provided, and assesses the efficiency of OpenCOPI's selection and adaptation mechanisms.
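As a rough illustration of the idea of turning a semantic workflow into an execution plan, the sketch below picks one concrete Web service per abstract activity; the names (Activity, ServiceOffer, select_execution_plan), the endpoints and the quality scores are hypothetical and are not part of the actual OpenCOPI API.

# Hypothetical sketch: build an execution plan by choosing one concrete
# Web service (from possibly different middleware) per abstract activity.
from dataclasses import dataclass

@dataclass
class ServiceOffer:
    endpoint: str       # concrete Web service URL (illustrative)
    quality: float      # aggregated quality score for this candidate

@dataclass
class Activity:
    name: str
    candidates: list    # ServiceOffer objects discovered at runtime

def select_execution_plan(workflow):
    """Pick the best-rated candidate service for every abstract activity."""
    plan = {}
    for activity in workflow:
        if not activity.candidates:
            raise RuntimeError(f"no service available for {activity.name}")
        plan[activity.name] = max(activity.candidates, key=lambda s: s.quality)
    return plan

workflow = [
    Activity("ReadWellPressure", [ServiceOffer("http://mw-a/pressure", 0.8),
                                  ServiceOffer("http://mw-b/pressure", 0.9)]),
    Activity("NotifyOperator",   [ServiceOffer("http://mw-c/notify", 0.7)]),
]
print(select_execution_plan(workflow))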
Abstract:
The Hotelling T2 control chart has been the main statistical device used in monitoring multivariate processes. The technological development of control and automation systems now allows information from production systems to be collected at a high rate, in very short time intervals, causing dependency between successive observations. This phenomenon, known as autocorrelation, produces a high rate of false alarms in multivariate statistical process control and degrades chart performance, since it violates the assumptions of independence and normality of the distribution. In this thesis we consider not only the correlation between two variables, but also the dependence between observations of the same variable, that is, autocorrelation. The bivariate case and the effect of autocorrelation on the performance of the Hotelling T2 chart were studied by simulation.
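The following simulation is only an illustrative sketch of the phenomenon described above, not the thesis's own experiment: when the in-control covariance matrix of a bivariate AR(1) process is estimated with the usual successive-differences estimator, the Hotelling T2 chart's false-alarm rate far exceeds its nominal value. The AR(1) parameter, cross-correlation and control limit are assumed values.

# Illustrative sketch of false-alarm inflation under autocorrelation.
import numpy as np

rng = np.random.default_rng(1)
phi, rho = 0.8, 0.5                           # AR(1) parameter and cross-correlation
sigma = np.array([[1.0, rho], [rho, 1.0]])
ucl = 10.597                                  # nominal chi-square(2) limit, alpha = 0.005

def simulate(n=50_000):
    e = rng.multivariate_normal(np.zeros(2), (1 - phi**2) * sigma, size=n)
    x = np.zeros((n, 2))
    for t in range(1, n):                     # in-control bivariate AR(1) series
        x[t] = phi * x[t - 1] + e[t]
    diffs = np.diff(x, axis=0)
    sigma_hat = np.cov(diffs.T) / 2.0         # successive-differences covariance estimator
    inv = np.linalg.inv(sigma_hat)
    t2 = np.einsum("ij,jk,ik->i", x, inv, x)  # T2 statistic for each observation (known mean 0)
    return np.mean(t2 > ucl)

print(f"false-alarm rate with autocorrelated data: {simulate():.3f} (nominal 0.005)")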
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
A methodology for pipeline leakage detection using a combination of clustering and classification tools for fault detection is presented here. A fuzzy system is used to classify the running mode and identify the operational and process transients. The relationship between these transients and the mass balance deviation is discussed. This strategy allows better identification of the leakage because the thresholds are adjusted by the fuzzy system as a function of the running mode and the classified transient level. The fuzzy system is initially trained off-line with a modified data set that includes simulated leakages. The methodology is applied to a small-scale LPG pipeline monitoring case where portability, robustness and reliability are amongst the most important criteria for the detection system. The results are very encouraging, with relatively low levels of false alarms and improved leakage detection at low computational cost. (c) 2005 Elsevier B.V. All rights reserved.
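A heavily simplified sketch of the adaptive-threshold idea follows; the membership ramps, base threshold and relaxation factor are invented for illustration and merely stand in for the paper's fuzzy rule base.

# Sketch: relax the mass-balance alarm threshold during classified transients.
def transient_level(flow_rate_change, pressure_change):
    """Return a degree in [0, 1] of how 'transient' the current operation is."""
    def ramp(x, lo, hi):                       # simple piecewise-linear membership
        return min(1.0, max(0.0, (abs(x) - lo) / (hi - lo)))
    return max(ramp(flow_rate_change, 0.5, 5.0), ramp(pressure_change, 0.1, 1.0))

def leak_alarm(mass_balance_dev, flow_rate_change, pressure_change,
               base_threshold=0.02, relax_factor=4.0):
    """Raise an alarm only if the deviation exceeds the transient-adjusted threshold."""
    level = transient_level(flow_rate_change, pressure_change)
    threshold = base_threshold * (1.0 + relax_factor * level)
    return mass_balance_dev > threshold

# Steady operation: a small deviation is already suspicious.
print(leak_alarm(0.03, flow_rate_change=0.1, pressure_change=0.02))   # True
# Strong transient (e.g. pump start-up): the same deviation is tolerated.
print(leak_alarm(0.03, flow_rate_change=4.0, pressure_change=0.8))    # False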
Abstract:
This paper presents an economic design of X̄ control charts with variable sample sizes, variable sampling intervals, and variable control limits. The sample size n, the sampling interval h, and the control limit coefficient k vary between minimum and maximum values, tightening or relaxing the control. The control is relaxed when an X̄ value falls close to the target and is tightened when an X̄ value falls far from the target. A cost model is constructed that involves the cost of false alarms, the cost of finding and eliminating the assignable cause, the cost associated with production in an out-of-control state, and the cost of sampling and testing. The assumption of an exponential distribution to describe the length of time the process remains in control allows the application of the Markov chain approach for developing the cost function. A comprehensive study is performed to examine the economic advantages of varying the X̄ chart parameters.
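The switching logic behind such a variable-parameter chart can be sketched as follows; the two design settings and the warning coefficient are illustrative values rather than the economically optimal ones obtained from the cost model.

# Sketch of the adaptive rule: pick (n, h, k) for the next sample from the
# position of the current X-bar relative to warning and control limits.
RELAXED = {"n": 3,  "h": 2.0, "k": 3.2}   # small sample, long interval, wide limits
TIGHT   = {"n": 10, "h": 0.5, "k": 2.8}   # large sample, short interval, narrow limits
WARNING = 1.0                             # warning-limit coefficient, in sigma/sqrt(n) units

def next_design(xbar, target, sigma, n_current, k_current):
    """Return 'signal' or the design (n, h, k) to use for the next sample."""
    z = abs(xbar - target) / (sigma / n_current ** 0.5)
    if z > k_current:
        return "signal"                   # out of control: search for the assignable cause
    return TIGHT if z > WARNING else RELAXED

print(next_design(xbar=10.4, target=10.0, sigma=1.0, n_current=3, k_current=3.2))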
Abstract:
In this work the problem of fault location in power systems is formulated as a binary linear programming (BLP) model based on the historical alarm database of control and protection devices from the system control center, the set theory of minimal coverage (AI), and the protection philosophy adopted by the electric utility. In this model, circuit breaker operations are compared to their expected states in a strictly mathematical manner. For solving this BLP problem, which presents a great number of decision variables, a dedicated Genetic Algorithm (GA) is proposed. Control parameters of the GA, such as crossover and mutation rates, population size, number of iterations and population diversification, are calibrated in order to obtain efficiency and robustness. Results for a test system found in the literature are presented and discussed. © 2004 IEEE.
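A toy version of the GA-based search can be sketched as below; the coverage matrix, alarm vector and GA settings are made up for illustration and only mimic the structure of the BLP formulation, in which mismatches between expected and reported device states are minimized.

# Toy sketch: a binary string marks which sections are hypothesised as faulted;
# fitness penalises mismatches between implied and reported device operations.
import random
random.seed(0)

# coverage[d][s] = 1 if a fault in section s should operate device d (illustrative)
coverage = [[1, 1, 0], [0, 1, 1], [0, 0, 1]]
observed = [1, 1, 0]                      # alarms actually reported by the devices

def expected(hypothesis):
    return [int(any(c and h for c, h in zip(row, hypothesis))) for row in coverage]

def fitness(hypothesis):
    mismatch = sum(e != o for e, o in zip(expected(hypothesis), observed))
    return mismatch + 0.1 * sum(hypothesis)              # parsimony term

def genetic_search(pop_size=20, generations=50, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                   # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randint(1, 2)                   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation) for bit in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

print(genetic_search())   # expected result: only the middle section faulted -> [0, 1, 0]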
Abstract:
In this paper, a methodology based on an Unconstrained Binary Programming (UBP) model and Genetic Algorithms (GAs) is proposed for estimating fault sections in automated distribution substations. The UBP model, established by using the parsimonious set covering theory, looks for the match between the relays' protective alarms informed by the SCADA system and their expected states. The GA is developed to minimize the UBP model and estimate the fault sections in a swift and reliable manner. The proposed methodology is tested on a real-life automated distribution substation. Control parameters of the GA are tuned to achieve maximum computational efficiency and reduction of processing time. Results show the potential and efficiency of the methodology for estimating fault sections in real time at Distribution Control Centers. ©2009 IEEE.
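For comparison with the previous sketch, the shape of an unconstrained binary programming objective of this kind can be written as follows; the relay/section relationship and the reported alarms are hypothetical and not taken from the real substation of the paper.

# Sketch of a parsimonious set-covering objective for fault-section estimation.
def ubp_objective(x, expected_state, reported_alarms, weight=1.0):
    """Sum of mismatches between expected and reported relay states, plus a
    parsimony penalty on the number of sections declared faulted."""
    mismatches = sum(abs(expected_state(x, r) - reported_alarms[r])
                     for r in range(len(reported_alarms)))
    return mismatches + weight * sum(x)

# Example: relay r trips if any section it protects (here, r itself or r+1) is faulted.
def expected_state(x, r):
    return int(any(x[r:r + 2]))

print(ubp_objective([0, 1, 0], expected_state, reported_alarms=[1, 1, 0]))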
Abstract:
Includes bibliography
Abstract:
Multisensor data fusion is a technique that combines the readings of multiple sensors to detect some phenomenon. Data fusion applications are numerous and can be used in smart buildings, environment monitoring, industry and defense applications. The main goal of multisensor data fusion is to minimize false alarms and maximize the probability of detection based on the detections of multiple sensors. In this paper a local data fusion algorithm based on luminosity, temperature and flame sensing for fire detection is presented. The data fusion approach was embedded in a low-cost mobile robot. Validation tests on the prototype indicated that our approach can detect fire occurrences. Moreover, the low-cost design allows the development of robots that can be regarded as disposable in their fire detection missions. © 2013 IEEE.
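A minimal sketch of such a local fusion rule is shown below; the thresholds and weights are illustrative and are not the ones embedded in the robot described above.

# Sketch: fuse luminosity, temperature and flame-sensor readings into one score.
def fire_probability(luminosity, temperature_c, flame_detected):
    """Combine three local sensor readings into a detection score in [0, 1]."""
    lum_score   = min(1.0, max(0.0, (luminosity - 600) / 400))   # bright glow
    temp_score  = min(1.0, max(0.0, (temperature_c - 40) / 30))  # abnormal heat
    flame_score = 1.0 if flame_detected else 0.0
    # weighted fusion: no single sensor alone should trigger the alarm
    return 0.25 * lum_score + 0.35 * temp_score + 0.40 * flame_score

readings = dict(luminosity=850, temperature_c=62, flame_detected=True)
score = fire_probability(**readings)
print(f"fused score = {score:.2f}, alarm = {score > 0.6}")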
Economic design of X̄ control charts for monitoring autocorrelated processes
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Maritime communications are still carried out largely over conventional radio circuits using ground waves, in the MF to VHF bands, especially after the introduction of the digital selective calling scheme. However, a significant number of operational radiocommunication procedures still depend on operator intervention, reducing their efficiency and leaving room for errors such as the transmission of false distress alarms. These errors have been frequent and have diverted much attention from real occurrences and wasted time at the agencies responsible for search and rescue at sea, seriously affecting the reliability of the terrestrial communications process. In view of this, this research proposes schemes to control the transmission of false distress alerts over ground waves and to improve the current call transmission process. Distress calls received by the coast stations are identified by them according to the stations that transmitted them, in order to support corrective actions by the competent authorities and inhibit new occurrences. In parallel, a higher level of automation is proposed for the calling processes, reducing the dependence on manual procedures and thus the possibility of mistaken transmissions, besides making better use of the selective calling concept of the current system. These proposals give greater efficiency and reliability to terrestrial communications, which are especially useful in emergency situations, without adding significant operational demands to the stations, encouraging and improving the conditions for ground-wave communications in the maritime mobile service (SMM).
Abstract:
This dissertation presents the implementation of navigation in the virtual environment, gesture recognition, and interface control, performed through the Kinect device, in the ITV System: a training system for operators and maintainers of hydroelectric power plants and electrical substations. Certain recent improvements are also shown, such as video conversion, audible and visual alarm screens, three-dimensional sound ambience, and process narration. In addition to the presentation of the ITV System, the Kinect device and the algorithm used to compare movement patterns, DTW, are described. Next, the design and implementation of navigation, gesture recognition, and interface control are covered in detail. As a case study, a Virtual Technical Instruction (ITV), developed specifically to test and evaluate the proposed new interface, is presented. Subsequently, the results, considered satisfactory, obtained through the analysis of qualitative questionnaires applied to students of the Universidade Federal do Pará, are presented. Finally, the considerations concerning this work are made and ideas for future work are discussed.
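The DTW comparison mentioned above can be sketched as follows, with the skeleton sequences reduced to one-dimensional toy signals; this is the textbook dynamic-programming algorithm, not the system's actual implementation.

# Sketch: DTW distance between a captured movement and a stored gesture template.
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

template = [0.0, 0.2, 0.9, 1.0, 0.3, 0.0]          # stored gesture
captured = [0.0, 0.1, 0.2, 0.8, 1.0, 0.9, 0.2]     # same gesture, performed more slowly
print(dtw_distance(template, captured))            # small value -> gesture recognised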
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The steady-state average run length is used to measure the performance of the recently proposed synthetic double sampling X̄ chart (synthetic DS chart). The overall performance of the DS X̄ chart in signaling process mean shifts of different magnitudes does not improve when it is integrated with the conforming run length chart, except when the integrated charts are designed to offer very high protection against false alarms and the use of large samples is prohibitive. The standard synthetic chart signals when a second point falls beyond the control limits, regardless of whether one of the two points falls above the centerline and the other below it; with the side-sensitive feature, the synthetic chart does not signal when they fall on opposite sides of the centerline. We also investigated the steady-state average run length of the side-sensitive synthetic DS X̄ chart. With the side-sensitive feature, the overall performance of the synthetic DS X̄ chart improves, but not enough to outperform the non-synthetic DS X̄ chart. Copyright (C) 2014 John Wiley & Sons, Ltd.
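A Monte Carlo sketch of how a synthetic chart's run length can be estimated is given below; it uses a plain X̄ sub-chart instead of the double sampling sub-chart of the paper, computes the zero-state rather than the steady-state run length, adopts one common start-up convention, and all parameter values are illustrative.

# Sketch: a sample is "nonconforming" when its mean exceeds the limits; the
# synthetic chart signals when two nonconforming samples occur within L samples.
import random
random.seed(2)

def run_length(n=5, k=2.7, L=10, shift=0.0):
    """Number of samples until the synthetic chart signals, for a given mean shift."""
    t, last_nonconforming = 0, None           # no head-start nonconforming sample assumed
    while True:
        t += 1
        xbar = shift + random.gauss(0.0, 1.0 / n ** 0.5)   # sample mean of n observations
        if abs(xbar) > k / n ** 0.5:                       # nonconforming sample
            if last_nonconforming is not None and t - last_nonconforming <= L:
                return t                                   # second one within L: signal
            last_nonconforming = t

arl0 = sum(run_length() for _ in range(500)) / 500
arl_shift = sum(run_length(shift=1.0) for _ in range(500)) / 500
print(f"estimated ARL at shift 0: {arl0:.0f}, at shift 1.0 sigma: {arl_shift:.0f}")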