Abstract:
Retinal ultra-wide field-of-view images (fundus images) provide visualization of a large part of the retina; however, artifacts may appear in those images. Eyelashes and eyelids often cover the clinical region of interest and, worse, eyelashes can be mistaken for arteries and/or veins when those images are put through automatic diagnosis or segmentation software, creating in those cases false-positive results. Correcting this problem is the first step in the development of qualified automatic disease-diagnosis programs, and in that way an objective tool to assess diseases, eradicating human error from those processes, can also be achieved. In this work, the development of a tool that automatically delimits the clinical region of interest is proposed: features are retrieved from the images and analyzed by an automatic classifier, which evaluates the information and decides which part of the image is of interest and which part contains artifacts. The methodology was implemented as software in the C# language and validated through a statistical analysis. From those results it was confirmed that the presented methodology is capable of detecting artifacts and selecting the clinical region of interest in fundus images of the retina.
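As a rough illustration of the kind of pipeline this abstract describes (feature extraction followed by an automatic classifier), the minimal sketch below trains a classifier to separate region-of-interest patches from artifact patches. The features, synthetic data and choice of random forest are assumptions made for illustration only; the thesis implementation itself is in C# and its actual features and classifier are not specified here.

# Hypothetical sketch of a feature-extraction + classification pipeline;
# features, data and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def patch_features(patch):
    # Simple per-patch statistics standing in for the thesis' image features.
    return [patch.mean(), patch.std(), np.median(patch)]

rng = np.random.default_rng(0)
# Toy data: bright low-variance patches ~ clinical region of interest (label 1),
# dark high-variance patches ~ eyelash/eyelid artifacts (label 0).
roi = [rng.normal(0.7, 0.05, (16, 16)) for _ in range(50)]
art = [rng.normal(0.2, 0.20, (16, 16)) for _ in range(50)]
X = [patch_features(p) for p in roi + art]
y = [1] * 50 + [0] * 50

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([patch_features(rng.normal(0.7, 0.05, (16, 16)))]))  # -> [1]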
Abstract:
The extraction of relevant terms from texts is an extensively researched task in Text Mining. Relevant terms have been applied in areas such as Information Retrieval or document clustering and classification. However, relevance has a rather fuzzy nature, since the classification of some terms as relevant or not relevant is not consensual. For instance, while words such as "president" and "republic" are generally considered relevant by human evaluators, and words like "the" and "or" are not, terms such as "read" and "finish" gather no consensus about their semantics and informativeness. Concepts, on the other hand, have a less fuzzy nature. Therefore, instead of deciding on the relevance of a term during the extraction phase, as most extractors do, I propose to first extract from texts what I have called generic concepts (all concepts) and postpone the decision about relevance to downstream applications, according to their needs. For instance, a keyword extractor may assume that the most relevant keywords are the most frequent concepts in the documents. Moreover, most statistical extractors are incapable of extracting single-word and multi-word expressions using the same methodology. These factors led to the development of the ConceptExtractor, a statistical and language-independent methodology which is explained in Part I of this thesis. In Part II, I show that the automatic extraction of concepts has great applicability. For instance, for the extraction of keywords from documents, using the Tf-Idf metric only on concepts yields better results than using Tf-Idf without concepts, especially for multi-word expressions. In addition, since concepts can be semantically related to other concepts, they allow us to build implicit document descriptors. These applications led to published work. Finally, I present some work that, although not yet published, is briefly discussed in this document.
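As a minimal sketch of the downstream use mentioned above (ranking pre-extracted concepts by Tf-Idf), the toy concept lists below are hypothetical stand-ins for ConceptExtractor output; the scoring simply applies the standard Tf-Idf formula to concepts instead of raw terms.

# Ranking pre-extracted concepts by Tf-Idf; concept lists are hypothetical
# stand-ins for ConceptExtractor output.
import math
from collections import Counter

docs_concepts = [
    ["president", "republic", "press conference", "republic"],
    ["press conference", "budget"],
    ["president", "budget", "budget"],
]

def tfidf(concept, doc, docs):
    tf = Counter(doc)[concept] / len(doc)           # term frequency in the doc
    df = sum(concept in d for d in docs)            # document frequency
    return tf * math.log(len(docs) / df)            # classic Tf-Idf score

doc = docs_concepts[0]
ranked = sorted(set(doc), key=lambda c: tfidf(c, doc, docs_concepts), reverse=True)
print(ranked)  # most relevant keywords = highest-scoring concepts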
Abstract:
The “CMS Safety Closing Sensors System” (SCSS, or CSS for brevity) is a remote monitoring system designed to control safety clearance and tight mechanical movements of parts of the CMS detector, especially during CMS assembly phases. We present the different systems that make up the SCSS: its sensor technologies, the readout system, and the data acquisition and control software. We also report on calibration and installation details, which determine the resolution and limits of the system. We present as well our experience from the operation of the system and the analysis of the data collected since 2008. Special emphasis is given to studying positioning reproducibility during detector assembly and to understanding how the magnetic fields influence the detector structure.
Abstract:
In cataract surgery, the eye’s natural lens is removed because it has gone opaque and no longer allows clear vision. To maintain the eye’s optical power, a new artificial lens must be inserted. Called an Intraocular Lens (IOL), it needs to be modelled in order to have the correct refractive power to substitute the natural lens. Calculating the refractive power of this substitution lens requires precise anterior eye chamber measurements. An interferometry instrument, the AC Master from Zeiss Meditec AG, was in use for half a year to perform these measurements. A Low Coherence Interferometry (LCI) measurement beam is aligned with the eye’s optical axis for precise measurements of anterior eye chamber distances. The eye follows a fixation target in order to align the visual axis with the optical axis. Performance problems occurred, however, at this step. There was therefore a need to develop a new procedure that ensures better alignment between the eye’s visual and optical axes, allowing a more user-friendly and versatile procedure and eventually automating the whole process. With this instrument, the alignment between the eye’s optical and visual axes is detected when Purkinje reflections I and III overlap as the eye follows a fixation target. In this project, image analysis is used to detect the positions of these Purkinje reflections and, eventually, to automatically detect when they overlap. Automatic detection of the third Purkinje reflection of an eye following a fixation target is possible with some restrictions. Each pair of detected third Purkinje reflections is used to automatically calculate an acceptable starting position for the fixation target, required for precise measurements of anterior eye chamber distances.
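A minimal sketch of the image-analysis step described above, assuming the reflections appear as the brightest blobs in the frame; the thresholding, blob labelling and overlap tolerance are illustrative assumptions, not the project's actual algorithm.

# Locating bright reflections by thresholding + centroid extraction;
# parameters are assumptions, not the project's actual algorithm.
import numpy as np
from scipy import ndimage

def reflection_centroids(image, thresh=0.9):
    # Bright blobs above a fraction of the maximum are reflection candidates.
    mask = image > thresh * image.max()
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(mask, labels, list(range(1, n + 1)))

def overlapping(p1, p3, tol=2.0):
    # Reflections I and III treated as overlapped if centroids are close.
    return float(np.hypot(p1[0] - p3[0], p1[1] - p3[1])) <= tol

# Synthetic frame with two bright spots standing in for reflections I and III.
img = np.zeros((64, 64))
img[30:33, 30:33] = 1.0
img[31:34, 40:43] = 0.95
print(reflection_centroids(img))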
Abstract:
Application of Experimental Design techniques has proven essential in various research fields, due to their statistical capability of processing the effect of interactions among independent variables, known as factors, on a system’s response. The advantages of this methodology can be summarized as more resource- and time-efficient experimentation while providing more accurate results. This research focuses on the quantification of the extraction of four antioxidants, at two different concentrations, prepared according to an experimental procedure and measured by a Photodiode Array Detector. Experimental planning followed a Central Composite Design, a type of DoE that allows the quadratic component of Response Surfaces to be considered, a component that includes pure-curvature studies on the model produced. This work was executed with the intention of analyzing the responses, peak areas obtained from chromatograms plotted by the detector’s system, and understanding whether the factors considered, acquired from an extensive literature review, produced the expected effect on the response. Completion of this work will allow conclusions to be drawn regarding which factors should be considered for optimization studies of antioxidant extraction in an Oca (Oxalis tuberosa) matrix.
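For concreteness, here is a short sketch of how a Central Composite Design matrix is laid out in coded units (factorial, axial and center points); the two-factor setup and the rotatable axial distance α = (2^k)^(1/4) are textbook choices, not necessarily those used in this work.

# Two-factor Central Composite Design matrix in coded units (textbook layout).
from itertools import product

k = 2                       # number of factors
alpha = (2 ** k) ** 0.25    # rotatable design: alpha = (n_factorial)**(1/4)

factorial = list(product([-1, 1], repeat=k))          # corner points
axial = [tuple(a if i == j else 0 for j in range(k))  # star points on each axis
         for i in range(k) for a in (-alpha, alpha)]
center = [(0,) * k]                                   # center point(s)

for run in factorial + axial + center:
    print(run)  # each tuple is one experimental run in coded factor levels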
Abstract:
This project aims to provide feasible solutions to improve the customer Help Area at Continente Online. The goal is to increase satisfaction and loyalty by reducing the main causes that lead customers to resort to the Call Center or abandon the website. The pursued solution is the implementation of Web Self-Service, and the vision taken focuses not only on providing customers with basic help tools but also on innovating with international best practices to sustain Sonae MC’s present and future market-leader position. Customer feedback, costs and impact are taken into consideration to find the best fit for the company.
Abstract:
Ship tracking systems allow Maritime Organizations concerned with Safety at Sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, in recent years the geographical coverage of ship tracking platforms has increased significantly, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master’s project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a Maritime Safety expert.
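The following is a generic genetic-algorithm skeleton of the kind the abstract mentions; the toy historical track and the fitness function (closeness of candidate route waypoints to historical positions) are simplified stand-ins, not the thesis' actual encoding or selection criterion.

# Generic GA skeleton: evolve candidate routes towards historical positions.
import random

random.seed(0)
HISTORY = [(float(i), float(i)) for i in range(11)]  # toy historical track

def fitness(route):
    # Higher is better: penalize distance of each waypoint to its nearest fix.
    return -sum(min((wx - hx) ** 2 + (wy - hy) ** 2 for hx, hy in HISTORY)
                for wx, wy in route)

def random_route(n=5):
    return [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(n)]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(route, rate=0.2):
    return [(x + random.gauss(0, 0.5), y + random.gauss(0, 0.5))
            if random.random() < rate else (x, y) for x, y in route]

pop = [random_route() for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                               # elitist selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
print(max(pop, key=fitness))  # best candidate route found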
Abstract:
Due to advances in information technology (e.g., digital video cameras, ubiquitous sensors), the automatic detection of human behaviors from video is a very recent research topic. In this paper, we perform a systematic review of the recent literature on this topic, from 2000 to 2014, covering a selection of 193 papers retrieved from six major scientific publishers. The selected papers were classified into three main subjects: detection techniques, datasets and applications. The detection techniques were divided into four categories (initialization, tracking, pose estimation and recognition). The list of datasets includes eight examples (e.g., Hollywood action). Finally, several application areas were identified, including human detection, abnormal activity detection, action recognition, player modeling and pedestrian detection. Our analysis provides a road map to guide future research in designing automatic visual human behavior detection systems.
Abstract:
ETL conceptual modeling is a very important activity in any data warehousing system project implementation. Having a high-level system representation that allows for a clear identification of the main parts of a data warehousing system is clearly a great advantage, especially in the early stages of design and development. However, the effort of conceptually modeling an ETL system is rarely properly rewarded. Translating ETL conceptual models directly into something that saves work and time on the concrete implementation of the system would, in fact, be a great help. In this paper we present and discuss a hybrid approach to this problem, combining the simplicity of interpretation and expressive power of BPMN for ETL system conceptualization with the use of ETL patterns to automatically produce an ETL skeleton, a first system prototype, which can be executed in a commercial ETL tool such as Kettle.
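As an illustration of pattern-based skeleton generation of the kind described above, the sketch below maps hypothetical ETL pattern names to stub step sequences and assembles them into a first skeleton; the pattern names and step vocabulary are invented for illustration and do not reflect the paper's actual pattern catalogue, BPMN metamodel or Kettle output.

# Hypothetical pattern-to-skeleton expansion; names are illustrative only.
PATTERNS = {
    "surrogate_key_pipeline": ["lookup_surrogate", "generate_key", "map_key"],
    "slowly_changing_dimension": ["lookup_key", "compare_attributes",
                                  "insert_or_update"],
    "data_quality_check": ["validate_nulls", "validate_domains", "route_errors"],
}

def build_skeleton(conceptual_model):
    # Expand an ordered list of pattern names (from a BPMN-like conceptual
    # model) into skeleton steps for an ETL tool to flesh out.
    skeleton = []
    for pattern in conceptual_model:
        skeleton.extend(f"{pattern}.{step}" for step in PATTERNS[pattern])
    return skeleton

print(build_skeleton(["surrogate_key_pipeline", "data_quality_check"]))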
Abstract:
The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data with a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT^jet < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, of less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT^jet < 500 GeV. For central jets at lower pT, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for pT^jet > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6% for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.
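Schematically, the in situ balance techniques mentioned above compare the average jet response between data and simulation; a simplified form of the residual correction is the following (the full ATLAS procedure includes further corrections, e.g. for additional radiation and pile-up):

% Simplified pT-balance response and residual correction (illustrative form)
\[
  \mathcal{R} = \left\langle \frac{p_{\mathrm{T}}^{\mathrm{jet}}}{p_{\mathrm{T}}^{\mathrm{ref}}} \right\rangle ,
  \qquad
  c_{\mathrm{residual}} = \frac{\mathcal{R}_{\mathrm{MC}}}{\mathcal{R}_{\mathrm{data}}} ,
\]

where pT^ref is the transverse momentum of the reference object (photon or Z boson).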
Abstract:
Measurements of the centrality and rapidity dependence of inclusive jet production in √s_NN = 5.02 TeV proton–lead (p+Pb) collisions and the jet cross-section in √s = 2.76 TeV proton–proton collisions are presented. These quantities are measured in datasets corresponding to integrated luminosities of 27.8 nb⁻¹ and 4.0 pb⁻¹, respectively, recorded with the ATLAS detector at the Large Hadron Collider in 2013. The p+Pb collision centrality was characterised using the total transverse energy measured in the pseudorapidity interval −4.9 < η < −3.2 in the direction of the lead beam. Results are presented for the double-differential per-collision yields as a function of jet rapidity and transverse momentum (pT) for minimum-bias and centrality-selected p+Pb collisions, and are compared to the jet rate from the geometric expectation. The total jet yield in minimum-bias events is slightly enhanced above the expectation in a pT-dependent manner, but is consistent with the expectation within uncertainties. The ratios of jet spectra from different centrality selections show a strong modification of jet production at all pT at forward rapidities and for large pT at mid-rapidity, which manifests as a suppression of the jet yield in central events and an enhancement in peripheral events. These effects imply that the factorisation between hard and soft processes is violated at an unexpected level in proton–nucleus collisions. Furthermore, the modifications at forward rapidities are found to be a function of the total jet energy only, implying that the violations may have a simple dependence on the hard parton–parton kinematics.
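For reference, centrality ratios of the kind discussed above are conventionally formed as central-to-peripheral ratios of per-collision jet yields; the standard schematic form is shown below (the ⟨N_coll⟩ normalisation from a Glauber model is an assumption of this sketch, and the analysis' exact construction may differ):

% Standard central-to-peripheral ratio of per-collision yields (schematic)
\[
  R_{\mathrm{CP}} = \frac{\langle N_{\mathrm{coll}}^{\mathrm{periph}} \rangle}
                         {\langle N_{\mathrm{coll}}^{\mathrm{cent}} \rangle}
  \times
  \frac{\mathrm{d}N^{\mathrm{cent}}/\mathrm{d}p_{\mathrm{T}}}
       {\mathrm{d}N^{\mathrm{periph}}/\mathrm{d}p_{\mathrm{T}}}
\]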
Abstract:
Various differential cross-sections are measured in top-quark pair (tt̄) events produced in proton–proton collisions at a centre-of-mass energy of √s = 7 TeV at the LHC with the ATLAS detector. These differential cross-sections are presented in a data set corresponding to an integrated luminosity of 4.6 fb⁻¹. The differential cross-sections are presented in terms of kinematic variables of a top-quark proxy referred to as the pseudo-top-quark, whose dependence on theoretical models is minimal. The pseudo-top-quark can be defined in terms of either reconstructed detector objects or stable particles in an analogous way. The measurements are performed on tt̄ events in the lepton+jets channel, requiring exactly one charged lepton and at least four jets, with at least two of them tagged as originating from a b-quark. The hadronic and leptonic pseudo-top-quarks are defined via the leptonic or hadronic decay mode of the W boson produced by the top-quark decay in events with a single charged lepton. The cross-section is measured as a function of the transverse momentum and rapidity of both the hadronic and leptonic pseudo-top-quark, as well as the transverse momentum, rapidity and invariant mass of the pseudo-top-quark pair system. The measurements are corrected for detector effects and are presented within a kinematic range that closely matches the detector acceptance. Differential cross-section measurements of the pseudo-top-quark variables are compared with several Monte Carlo models that implement next-to-leading-order or leading-order multi-leg matrix-element calculations.
Abstract:
This Letter reports evidence of triple gauge boson production pp → W(ℓν)γγ + X, which is accessible for the first time with the 8 TeV LHC data set. The fiducial cross section for this process is measured in a data sample corresponding to an integrated luminosity of 20.3 fb⁻¹, collected by the ATLAS detector in 2012. Events are selected using the W boson decay to eν or μν as well as by requiring two isolated photons. The measured cross section is used to set limits on anomalous quartic gauge couplings in the high diphoton mass region.
Abstract:
A measurement of spin correlation in tt̄ production is presented using data collected with the ATLAS detector at the Large Hadron Collider in proton–proton collisions at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹. The correlation between the top and antitop quark spins is extracted from dilepton tt̄ events using the difference in azimuthal angle between the two charged leptons in the laboratory frame. In the helicity basis the measured degree of correlation corresponds to A_helicity = 0.38 ± 0.04, in agreement with the Standard Model prediction. A search is performed for pair production of top squarks with masses close to the top quark mass, decaying to predominantly right-handed top quarks and a light neutralino, the lightest supersymmetric particle. Top squarks with masses between the top quark mass and 191 GeV are excluded at the 95% confidence level.
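For context, the quoted coefficient follows the standard definition of the spin correlation in a chosen quantisation basis (here the helicity basis), counting events by the relative alignment of the t and t̄ spins:

% Standard spin-correlation coefficient in a given quantisation basis
\[
  A = \frac{N(\uparrow\uparrow) + N(\downarrow\downarrow)
          - N(\uparrow\downarrow) - N(\downarrow\uparrow)}
           {N(\uparrow\uparrow) + N(\downarrow\downarrow)
          + N(\uparrow\downarrow) + N(\downarrow\uparrow)}
\]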