120 results for Detectors: scintillator
Abstract:
This paper proposes a novel and simple positive sequence detector (PSD) that is inherently self-adjusting to fundamental frequency deviations by means of a software-based PLL (Phase-Locked Loop). Since the proposed detector is not based on Fortescue's classical decomposition and requires no special input filtering, its dynamic response can be as fast as one fundamental cycle. The digital PLL ensures that the positive sequence components can be calculated even under distorted waveform conditions and fundamental frequency deviations. To validate the proposed models, the positive sequence detector has been implemented in a PC-based Power Quality Monitor, and experimental results illustrate its good performance. The PSD algorithm has also been evaluated in the control loop of a Series Active Filter, and simulation results demonstrate its effectiveness in a closed-loop system. Moreover, for single-phase applications, this paper also proposes a general single-phase PLL and a Fundamental Wave Detector (FWD) immune to frequency variations and waveform distortions. © 2005 IEEE.
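For illustration, below is a minimal Python/NumPy sketch of a software PLL of the kind the abstract relies on: a mixer phase detector followed by a PI controller that keeps tracking the fundamental phase under frequency deviations. The gains, sampling rate, and loop structure are generic textbook choices, not the authors' algorithm.

import numpy as np

def track_phase(v, fs, f0=60.0, kp=50.0, ki=1000.0):
    """Return per-sample phase estimates theta[n] for the input samples v."""
    theta = 0.0            # estimated phase (rad)
    integ = 0.0            # integrator state of the PI controller
    out = np.empty_like(v)
    for n, vn in enumerate(v):
        # Mixer phase detector: the product with a quadrature reference has a
        # term proportional to the phase error (plus double-frequency ripple
        # that the loop filter averages out).
        err = vn * np.cos(theta)
        integ += ki * err / fs
        w = 2.0 * np.pi * f0 + kp * err + integ   # rad/s, around nominal
        theta = (theta + w / fs) % (2.0 * np.pi)
        out[n] = theta
    return out

fs = 12000.0
t = np.arange(0.0, 0.1, 1.0 / fs)
v = np.sin(2 * np.pi * 61.0 * t)   # fundamental off-nominal at 61 Hz
theta = track_phase(v, fs)         # theta tracks 2*pi*61*t after lock-in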
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between customer service quality and cost, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk for an electric power utility due to the uncertainty associated with maintaining a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the Maximum Continuous Interruption Duration per customer (MCID). This paper describes a chronological Monte Carlo simulation approach to evaluate probability distributions of reliability indices, including the MCID, and the corresponding penalties. To achieve the desired efficiency, modern computational techniques are used for modeling (UML - Unified Modeling Language) as well as for programming (Object-Oriented Programming). Case studies on a simple distribution network and on real Brazilian distribution systems are presented and discussed. © Copyright KTH 2006.
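As a toy illustration of the chronological simulation idea, the sketch below samples failure/repair cycles for a single load point and estimates the probability that the longest interruption in a year exceeds a regulatory limit. The rates and the limit are invented for the example; they are not the paper's data or model.

import random

LAM = 4.0 / 8760.0   # failure rate, failures per hour (assumed)
MTTR = 3.0           # mean time to repair, hours (assumed)
LIMIT = 8.0          # regulatory MCID limit, hours (assumed)

def simulate_year():
    """Longest single interruption seen by one load point in one year."""
    t, worst = 0.0, 0.0
    while True:
        t += random.expovariate(LAM)        # time to the next failure
        if t >= 8760.0:
            return worst
        repair = random.expovariate(1.0 / MTTR)
        t += repair
        worst = max(worst, repair)

years = [simulate_year() for _ in range(20000)]
p_penalty = sum(w > LIMIT for w in years) / len(years)
print(f"P(MCID > {LIMIT} h) ~= {p_penalty:.3f}")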
Abstract:
The purpose of this paper is to introduce a new approach for edge detection in gray-shaded images. The proposed approach is based on fuzzy number theory. The idea is to deal with the uncertainties in the gray shades making up the image, and thus calculate the appropriateness of each pixel with respect to a homogeneous region around it. Pixels not belonging to the region are then classified as border pixels. The results show that the technique is simple, computationally efficient, and compares well with both traditional border detectors and fuzzy edge detectors. © 2007 IEEE.
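A rough Python sketch of the stated idea, with a triangular membership function over a 3x3 neighborhood standing in for the paper's fuzzy-number formulation (both the membership shape and the parameters are assumptions for the example):

import numpy as np

def fuzzy_edges(img, spread=20.0, cut=0.5):
    """Mark pixels whose grey level fits poorly into the fuzzy number
    built from their 3x3 neighborhood."""
    img = img.astype(float)
    h, w = img.shape
    edges = np.zeros((h, w), dtype=bool)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            mean = (win.sum() - img[i, j]) / 8.0   # neighborhood mean
            # Triangular membership centered on the neighborhood mean:
            mu = max(0.0, 1.0 - abs(img[i, j] - mean) / spread)
            edges[i, j] = mu < cut   # low appropriateness -> border pixel
    return edges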
Abstract:
This paper discusses two pitch detection algorithms (PDAs) for simple audio signals, based on the zero-crossing rate (ZCR) and the autocorrelation function (ACF). As is well known, pitch detection methods based on ZCR and ACF are widely used in signal processing. This work shows some features and problems of these methods, as well as some improvements developed to increase their performance. © 2008 IEEE.
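For reference, here is a minimal detector for the ACF branch of the two methods discussed: the pitch is read off the lag of the autocorrelation peak. The frame length and search band are illustrative choices, not the paper's settings.

import numpy as np

def pitch_acf(frame, fs, fmin=50.0, fmax=1000.0):
    """Estimate the pitch (Hz) of one frame from the autocorrelation peak."""
    frame = frame - frame.mean()
    acf = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)   # lag search band
    lag = lo + np.argmax(acf[lo:hi])
    return fs / lag

fs = 8000
t = np.arange(0.0, 0.04, 1.0 / fs)
frame = np.sin(2 * np.pi * 220.0 * t)   # 220 Hz test tone
print(pitch_acf(frame, fs))             # ~220 Hz, up to integer-lag rounding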
Abstract:
We report on a first search for resonant pair production of neutral long-lived particles (NLLPs), each of which decays to a bb̄ pair, using 3.6 fb⁻¹ of data recorded with the D0 detector at the Fermilab Tevatron collider. We search for pairs of displaced vertices in the tracking detector at radii in the range 1.6–20 cm from the beam axis. No significant excess is observed above background, and upper limits are set on the production rate in a hidden-valley benchmark model for a range of Higgs boson masses and NLLP masses and lifetimes. © 2009 The American Physical Society.
Abstract:
The purpose of this paper is to introduce a new approach for edge detection in grey-shaded images. The proposed approach is based on fuzzy number theory. The idea is to deal with the uncertainties in the grey shades making up the image and, thus, calculate the appropriateness of each pixel with respect to a homogeneous region around it. Pixels not belonging to the region are then classified as border pixels. The results show that the technique is simple, computationally efficient, and compares well with both traditional border detectors and fuzzy edge detectors. Copyright © 2009, Inderscience Publishers.
Abstract:
X-ray computed tomography (CT) refers to cross-sectional imaging of an object by measuring the transmitted radiation in different directions. In this work, we describe the development of a low-cost micro-CT X-ray scanner being developed for nondestructive testing. The tomograph operates with a microfocus X-ray source and uses silicon photodiodes as detectors. The spatial resolution of the system was estimated through its Modulation Transfer Function (MTF); the value obtained at 10% of the MTF is 661 μm. It was built as a general-purpose nondestructive testing device. © 2009 American Institute of Physics.
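The 10%-of-MTF figure can be read off an MTF curve as sketched below; the Gaussian line-spread function, the pixel pitch, and the half-period convention for resolution are synthetic stand-ins, not the scanner's measured data.

import numpy as np

px = 50e-6                                     # pixel pitch in metres (assumed)
x = np.arange(-64, 64) * px
lsf = np.exp(-0.5 * (x / 150e-6) ** 2)         # synthetic line-spread function
mtf = np.abs(np.fft.rfft(lsf / lsf.sum()))     # MTF = |FT| of normalised LSF
f = np.fft.rfftfreq(len(lsf), d=px)            # spatial frequency, cycles/m
f10 = np.interp(0.10, mtf[::-1], f[::-1])      # frequency where MTF drops to 10%
print(f"resolution ~ {1e6 / (2 * f10):.0f} um")  # half-period, in micrometres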
Abstract:
The outdating of cartographic products affects planning, so it is important to propose methods that help detect changes on the surface. The combined use of remote sensing imagery and digital image processing techniques has contributed significantly to minimizing such outdating. Mathematical morphology is an image processing technique that quantitatively describes geometric structures present in the image and provides tools such as edge detectors and morphological filters. Previous studies have shown that the technique has potential for the detection of significant features. Thus, this paper proposes a routine of morphological operators to detect a road network. The test area corresponds to an excerpt of a QuickBird image whose feature of interest is an avenue in the city of Presidente Prudente, SP. In the processing, the main morphological operators used were threshad, areaopen, binary, and erosion. To estimate the accuracy with which the linear features were detected, a linear correlation analysis was performed between the vectors of the detected features and the corresponding topographic map of the region. The results showed that mathematical morphology can be used in cartography, in particular in conventional cartographic updating processes.
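A rough scikit-image analogue of the named pipeline (threshad approximated by global thresholding, areaopen by small-object removal, plus binarization and erosion); the parameters are illustrative, not the paper's:

from skimage import filters, morphology

def road_mask(gray):
    """Extract elongated bright features (e.g. a paved avenue) from a
    grey-level satellite image given as a 2-D array."""
    binary = gray > filters.threshold_otsu(gray)            # global threshold
    binary = morphology.remove_small_objects(binary, 500)   # ~ area opening
    binary = morphology.binary_erosion(binary, morphology.disk(1))  # thin noise
    return binary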
Abstract:
Conventional radiography, using industrial radiographic films, has its days numbered. Digital radiography has recently taken its place in various segments of products and services, such as medicine, aerospace, security, and automotive. Along with the technological trend, the digital technique has brought proven benefits in terms of productivity, sensitivity, the environment, image treatment tools, cost reductions, etc. If the weld to be inspected is on a series-produced product such as a pipe, the best option for the use of digital radiography is the plane detector, since its use can shorten the inspection cycle due to its high degree of automation. This work tested welded joints produced with the submerged arc process, specially prepared to contain small artificial cracks, which served as the basis for comparing the sensitivity levels of the techniques involved. After the various experiments were carried out, the digital method showed the highest sensitivity for the wire image quality indicator (IQI) and also in detecting small discontinuities, indicating that digital radiography using the plane detector had advantages over the conventional technique (Moreira et al. Digital radiography, the use of plane detectors for the inspection of welds in oil pipes and gas pipes. 9th COTEQ and XXV National Testing Congress for Non Destructive Testing and Inspection; Salvador, Bahia, Brazil; and Bavendiek et al. New digital radiography procedure exceeds film sensitivity considerably in aerospace applications. ECNDT; 2006; Berlin). The work was carried out on the basis of the specifications for oil and gas pipelines, API 5L 2004 edition (American Petroleum Institute. API 5L: specification for line pipe. 4th ed. p. 155; 2004) and ISO 3183 2007 edition (International Organization for Standardization, ISO 3183. Petroleum and gas industries - steel pipes for pipelines transportation systems. p. 143; 2007). © 2010 Taylor & Francis.
Abstract:
The intention of this paper was to review and discuss some of the current quantitative analytical procedures used for quality control of pharmaceutical products. The selected papers were organized according to the analytical technique employed. Techniques such as ultraviolet/visible spectrophotometry, fluorimetry, titrimetry, electroanalytical techniques, chromatographic methods (thin-layer chromatography, gas chromatography, and high-performance liquid chromatography), capillary electrophoresis, and vibrational spectroscopies are the main techniques that have been used for the quantitative analysis of pharmaceutical compounds. In conclusion, although simple techniques such as UV/VIS spectrophotometry and TLC are still extensively employed, HPLC is the most popular instrumental technique for the analysis of pharmaceuticals. Moreover, a review of recent work in the area of pharmaceutical analysis showed a trend toward increasingly rapid techniques, such as ultra-performance liquid chromatography, and the use of sensitive and specific detectors such as mass spectrometers.
Abstract:
The CMS Collaboration conducted a month-long data-taking exercise known as the Cosmic Run At Four Tesla in late 2008 in order to complete the commissioning of the experiment for extended operation. The operational lessons resulting from this exercise were addressed in the subsequent shutdown to better prepare CMS for LHC beams in 2009. The cosmic data collected have been invaluable to study the performance of the detectors, to commission the alignment and calibration techniques, and to make several cosmic ray measurements. The experimental setup, conditions, and principal achievements from this data-taking exercise are described along with a review of the preceding integration activities. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
The operation and general performance of the CMS electromagnetic calorimeter using cosmic-ray muons are described. These muons were recorded after the closure of the CMS detector in late 2008. The calorimeter is made of lead tungstate crystals and the overall status of the 75 848 channels corresponding to the barrel and endcap detectors is reported. The stability of crucial operational parameters, such as high voltage, temperature and electronic noise, is summarised and the performance of the light monitoring system is presented. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
Malicious programs (malware) can cause severe damage to computer systems and data. The mechanism the human immune system uses to detect and protect against organisms that threaten the human body is efficient and can be adapted to detect malware attacks. In this paper we propose a system to perform distributed malware collection, analysis, and detection, the last of which is inspired by the human immune system. After malware samples are collected from the Internet, they are dynamically analyzed to provide operating-system-level execution traces and network flows, which are used to create a behavioral model and to generate a detection signature. Those signatures serve as input to a malware detector, acting as the antibodies in the antigen detection process. This allows us to understand the malware attack and aids in the infection removal procedures. © 2012 Springer-Verlag.
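A toy sketch of the antibody analogy: behavioral signatures are modeled as sets of observed events, and a trace is flagged when it is sufficiently similar to a known signature. The event names, the similarity measure, and the threshold are invented for the example, not the paper's scheme.

def jaccard(a, b):
    """Set similarity in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

SIGNATURES = {   # "antibodies" distilled from dynamic analysis (invented)
    "dropper-x": {("CreateFile", "autorun.inf"), ("RegSetValue", "Run")},
}

def detect(trace, threshold=0.6):
    """Return the names of signatures that the observed event set matches."""
    return [name for name, sig in SIGNATURES.items()
            if jaccard(trace, sig) >= threshold]

events = {("CreateFile", "autorun.inf"), ("RegSetValue", "Run"),
          ("Connect", "198.51.100.7:80")}
print(detect(events))   # ['dropper-x']: overlap 2/3 >= 0.6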
Abstract:
The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 pb⁻¹ of data collected in pp collisions at √s = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV/c is above 95% over the whole region of pseudorapidity covered by the CMS muon system, |η| < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV/c is higher than 90% over the full η range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV/c and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV/c. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. © 2012 IOP Publishing Ltd and Sissa Medialab srl.
Abstract:
Results are presented from a search for the pair production of third-generation scalar and vector leptoquarks, as well as for top squarks in R-parity-violating supersymmetric models. In either scenario, the new, heavy particle decays into a τ lepton and a b quark. The search is based on a data sample of pp collisions at √s = 7 TeV, which is collected by the CMS detector at the LHC and corresponds to an integrated luminosity of 4.8 fb⁻¹. The number of observed events is found to be in agreement with the standard model prediction, and exclusion limits on mass parameters are obtained at the 95% confidence level. Vector leptoquarks with masses below 760 GeV are excluded and, if the branching fraction of the scalar leptoquark decay to a τ lepton and a b quark is assumed to be unity, third-generation scalar leptoquarks with masses below 525 GeV are ruled out. Top squarks with masses below 453 GeV are excluded for a typical benchmark scenario, and limits on the coupling between the top squark, τ lepton, and b quark, λ′333, are obtained. These results are the most stringent for these scenarios to date. © 2013 CERN.