853 results for Detection and representation
OFDM joint data detection and phase noise cancellation based on minimum mean square prediction error
Abstract:
This paper proposes a new iterative algorithm for orthogonal frequency division multiplexing (OFDM) joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the relatively little-studied problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Specifically, we apply a hard-decision procedure at every iterative step to overcome overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the PHN, and finally a more robust and compact fast process based on Givens rotations is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm. (C) 2008 Elsevier B.V. All rights reserved.
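The Givens rotation mentioned above is a standard numerical tool for selectively zeroing matrix entries at low cost. As an illustration only, not the paper's specific fast process, a minimal sketch of a single plane rotation:

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0].

    This is the basic building block of Givens-rotation-based fast
    algorithms: each rotation zeroes one entry while preserving norms.
    """
    if b == 0.0:
        return 1.0, 0.0
    r = np.hypot(a, b)  # sqrt(a^2 + b^2) without overflow
    return a / r, b / r

# Rotating the vector [3, 4] zeroes its second component.
c, s = givens(3.0, 4.0)
G = np.array([[c, s], [-s, c]])
rotated = G @ np.array([3.0, 4.0])
```

Because each rotation touches only two rows, a sequence of them can triangularize a matrix with modest complexity, which is the general idea behind "fast processes" of this kind.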
Abstract:
This correspondence proposes a new algorithm for OFDM joint data detection and phase noise (PHN) cancellation for constant modulus modulations. We highlight that it is important to address the overfitting problem, since it is a major detrimental factor impairing the joint detection process. To attack the overfitting problem, we propose an iterative approach based on minimum mean square prediction error (MMSPE), subject to the constraint that the estimated data symbols have constant power. The proposed constrained MMSPE algorithm (C-MMSPE) significantly improves on the performance of existing approaches with little extra complexity. Simulation results are also given to verify the proposed algorithm.
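The constant-power (hard-decision) step that both abstracts use against overfitting can be illustrated with a toy snippet. The QPSK alphabet and the function name are assumptions for illustration, not the authors' code:

```python
import numpy as np

def qpsk_hard_decision(soft_symbols):
    """Project soft symbol estimates onto the unit-power QPSK alphabet.

    Whatever the soft estimate, the output always satisfies |s| = 1,
    so the iterative estimator cannot shrink the data toward a
    trivial (overfitted) solution.
    """
    # Nearest QPSK point: quantize real and imaginary parts separately.
    return (np.sign(soft_symbols.real)
            + 1j * np.sign(soft_symbols.imag)) / np.sqrt(2)

# Noisy soft estimates snap back onto the constant-modulus circle.
soft = np.array([0.9 + 0.8j, -0.2 + 1.1j, 0.05 - 0.6j])
hard = qpsk_hard_decision(soft)
```

In an iterative joint estimator, a projection like this would be applied after each data-estimation step before re-estimating the phase noise.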
Abstract:
There is a rising demand for quantitative performance evaluation of automated video surveillance. To advance research in this area, it is essential that comparisons between detection and tracking approaches can be drawn and that improvements in existing methods can be measured. There are a number of challenges related to the proper evaluation of motion segmentation, tracking, event recognition, and other components of a video surveillance system that are unique to the video surveillance community. These include the volume of data that must be evaluated, the difficulty of obtaining ground truth data, the definition of appropriate metrics, and the achievement of meaningful comparisons between diverse systems. This chapter provides descriptions of useful benchmark datasets and their availability to the computer vision community. It outlines some ground truth and evaluation techniques, and provides links to useful resources. It concludes by discussing future directions for benchmark datasets and their associated processes.
Abstract:
Deception-detection is the crux of Turing’s experiment to examine machine thinking, conveyed through a capacity to respond with sustained and satisfactory answers to unrestricted questions put by a human interrogator. However, in the 60 years to the month since the publication of Computing Machinery and Intelligence, little agreement exists on a canonical format for Turing’s textual game of imitation, deception and machine intelligence. This research recovers, from the trapped mine of philosophical claims, counter-claims and rebuttals, Turing’s own distinct five-minute question-answer imitation game, which he envisioned practicalised in two different ways: a) a two-participant, interrogator-witness viva voce; b) a three-participant comparison of a machine with a human, both questioned simultaneously by a human interrogator. Using Loebner’s 18th Prize for Artificial Intelligence contest, and Colby et al.’s 1972 transcript analysis paradigm, this research practicalised Turing’s imitation game with over 400 human participants and 13 machines across three original experiments. Results show that, at the current state of technology, a deception rate of 8.33% was achieved by machines in 60 human-machine simultaneous comparison tests. Results also show that more than 1 in 3 reviewers succumbed to hidden interlocutor misidentification after reading transcripts from experiment 2. Deception-detection is essential to uncover the increasing number of malfeasant programmes, such as CyberLover, developed to steal identities and financially defraud users in chatrooms across the Internet. Practicalising Turing’s two tests can assist in understanding natural dialogue and mitigate the risk from cybercrime.
Abstract:
Internal bacterial communities of the synanthropic mites Acarus siro, Dermatophagoides farinae, Lepidoglyphus destructor, and Tyrophagus putrescentiae (Acari: Astigmata) were analyzed by culturing and culture-independent approaches from specimens obtained from laboratory colonies. Homogenates of surface-sterilized mites were used for cultivation on non-selective agar and for DNA extraction. Isolated bacteria were identified by sequencing of the 16S rRNA gene. PCR-amplified 16S rRNA genes were analyzed by terminal restriction fragment length polymorphism analysis (T-RFLP) and by cloning and sequencing. Fluorescence in situ hybridization using universal bacterial probes was used for direct bacterial localization. T-RFLP analysis of the 16S rRNA gene revealed distinct species-specific bacterial communities. The results were further confirmed by cloning and sequencing (284 clones). L. destructor and D. farinae showed more diverse communities than A. siro and T. putrescentiae. In the cultivated part of the community, the mean CFUs from the four mite species ranged from 5.2 × 10² to 1.4 × 10³ per mite. D. farinae had significantly higher CFUs than the other species. Bacteria were located in the digestive and reproductive tracts, parenchymal tissue, and bacteriocytes. Among the clones, Bartonella-like bacteria occurring in A. siro and T. putrescentiae represented a distinct group related to Bartonellaceae and to Bartonella-like symbionts of ants. Clones of high similarity to Xenorhabdus cabanillasii were found in L. destructor and D. farinae, and one clone related to Photorhabdus temperata in A. siro. Members of Sphingobacteriales cloned from D. farinae and A. siro clustered with the sequences of “Candidatus Cardinium hertigii” and as a separate novel cluster.
Abstract:
The Intergovernmental Panel on Climate Change fourth assessment report, published in 2007, came to a more confident assessment of the causes of global temperature change than previous reports and concluded that ‘it is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent except Antarctica.’ Since then, warming over Antarctica has also been attributed to human influence, and further evidence has accumulated attributing a much wider range of climate changes to human activities. Such changes are broadly consistent with theoretical understanding, and with climate model simulations, of how the planet is expected to respond. This paper reviews this evidence from a regional perspective to reflect a growing interest in understanding the regional effects of climate change, which can differ markedly across the globe. We set out the methodological basis for detection and attribution and discuss the spatial scales on which it is possible to make robust attribution statements. We review the evidence showing significant human-induced changes in regional temperatures, and for the effects of external forcings on changes in the hydrological cycle, the cryosphere, circulation changes, oceanic changes, and changes in extremes. We then discuss future challenges for the science of attribution. To better assess the pace of change, and to understand more about the regional changes to which societies need to adapt, we will need to refine our understanding of the effects of external forcing and internal variability.
Abstract:
Diaminofluoresceins are widely used probes for the detection and intracellular localization of NO formation in cultured/isolated cells and intact tissues. The fluorinated derivative, 4-amino-5-methylamino-2′,7′-difluorofluorescein (DAF-FM), has gained increasing popularity in recent years due to its improved NO-sensitivity, pH-stability, and resistance to photo-bleaching compared to the first-generation compound, DAF-2. Detection of NO production by either reagent relies on conversion of the parent compound into a fluorescent triazole, DAF-FM-T and DAF-2-T, respectively. While this reaction is specific for NO and/or reactive nitrosating species, it is also affected by the presence of oxidants/antioxidants. Moreover, the reaction with other molecules can lead to the formation of fluorescent products other than the expected triazole. Thus additional controls and structural confirmation of the reaction products are essential. Using human red blood cells as an exemplary cellular system, we here describe robust protocols for the analysis of intracellular DAF-FM-T formation using an array of fluorescence-based methods (laser-scanning fluorescence microscopy, flow cytometry and fluorimetry) and analytical separation techniques (reversed-phase HPLC and LC-MS/MS). When used in combination, these assays afford unequivocal identification of the fluorescent signal as being derived from NO and are applicable to most other cellular systems without or with only minor modifications.
Abstract:
We designed FISH probes for two distinct microsporidian clades and demonstrated their application in detecting Nosema/Vairimorpha and Dictyocoela species, respectively. We applied them to study the vertical transmission of two microsporidia infecting the amphipod Gammarus duebeni.
Abstract:
Safety is of the highest priority in mining operations, and many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, using early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between geological conditions and mining method design establishes how to implement a continuous monitoring system. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, and their relevant measuring ranges, are then presented; the advantages, management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining, given the number of these events each year worldwide, in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety at underground mines is proposed; this approach offers a contribution to the design of personalized monitoring networks, and the experience developed in coal mines provides a tool that facilitates the application of this technology within underground coal mines.
Abstract:
The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset—the period 1989–2009 of the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim). This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well on the geographical distribution in large oceanic regions, the interannual variability of cyclone numbers, the geographical patterns of strong trends, and the distribution shapes of many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and the distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and dissolution phases.
Abstract:
Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic-scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application to the Northern Hemisphere winter season. NCEP reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed both for case studies and from a climatological point of view. Results show that the choice of settings is of major relevance, especially for the tracking of smaller-scale and fast-moving systems. With an appropriate setting, the algorithm is capable of automatically tracking different types of cyclones at the same time: both fast-moving and developing systems over the large ocean basins and smaller-scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds, and areas of cyclogenesis and -lysis, gives detailed information on typical cyclone life cycles for different regions. Lowering the spatial and temporal resolution of the input data from the full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast-moving systems, which is relevant for the cyclone track density. Lowering the spatial resolution alone mainly reduces the number of weak cyclones.
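Detection schemes of the kind compared in the two abstracts above typically begin by locating candidate cyclone centres as local minima of the mean sea level pressure field. As an illustration only, not the algorithm of either study, a minimal sketch with assumed names and thresholds:

```python
import numpy as np

def detect_pressure_minima(mslp, threshold_hpa=1000.0):
    """Return (row, col) grid points that are candidate cyclone centres.

    Toy stand-in for the detection step of an automated cyclone
    tracker: a point qualifies if its pressure is below `threshold_hpa`
    and is the unique minimum of its 3x3 neighbourhood. Real schemes
    add smoothing, depth/gradient criteria and tracking in time, which
    is where much of the inter-method spread arises.
    """
    centres = []
    rows, cols = mslp.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = mslp[i - 1:i + 2, j - 1:j + 2]
            if (mslp[i, j] < threshold_hpa
                    and mslp[i, j] == window.min()
                    and np.count_nonzero(window == window.min()) == 1):
                centres.append((i, j))
    return centres

# Synthetic field: uniform 1015 hPa with one 980 hPa low at (5, 5).
field = np.full((10, 10), 1015.0)
field[5, 5] = 980.0
```

The sensitivity to `threshold_hpa` and to the neighbourhood size in even this toy version hints at why the choice of settings matters so much for weak and small-scale systems.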