853 results for Detection and representation
Abstract:
The advent of molecular biology has had a dramatic impact on all aspects of biology, not least applied microbial ecology. Microbiological testing of water has traditionally depended largely on culture techniques. Growing understanding that only a small proportion of microbial species are culturable, and that many microorganisms may attain a viable but non-culturable state, has promoted the development of novel approaches to monitoring pathogens in the environment. This has been paralleled by an increased awareness of the surprising genetic diversity of natural microbial populations. By targeting gene sequences that are specific for particular microorganisms, for example genes that encode diagnostic enzymes, or species-specific domains of conserved genes such as 16S ribosomal RNA coding sequences (rrn genes), the problems of culture can be avoided. Technical developments, notably in the area of in vitro amplification of DNA using the polymerase chain reaction (PCR), now permit routine detection and identification of specific microorganisms, even when present in very low numbers. Although the techniques of molecular biology have provided some very powerful tools for environmental microbiology, it should not be forgotten that these have their own drawbacks and biases in sampling. For example, molecular techniques are dependent on efficient lysis and recovery of nucleic acids from both vegetative forms and spores of microbial species that may differ radically when growing in the laboratory compared with the natural environment. Furthermore, PCR amplification can introduce its own bias depending on the nature of the oligonucleotide primers utilised. However, despite these potential caveats, it seems likely that a molecular biological approach, particularly with its potential for automation, will provide the mainstay of diagnostic technology for the foreseeable future.
Abstract:
It is widely recognised that conventional culture techniques may underestimate true viable bacterial numbers by several orders of magnitude. The basis of this discrepancy is that a culture in or on media of high nutrient concentration is highly selective (either through “nutrient shock” or failure to provide vital co-factors) and decreases apparent diversity; thus it is unrepresentative of the natural community. In addition, the non-culturable but viable state (NCBV) is a strategy adopted by some bacteria as a response to environmental stress. The basis for the non-culturable state is that cells placed in conditions present in the environment cannot be recultured but can be shown to maintain their viability. Consequently, these cells would not be detected by standard water quality techniques that are based on culture. In the case of pathogens, it may explain outbreaks of disease in populations that have not come into contact with the pathogen. However, the NCBV state is difficult to attribute, due to the failure to distinguish between NCBV and non-viable cells. This article will describe experiences with the fish pathogen Aeromonas salmonicida subsp. salmonicida and the application of molecular techniques for its detection and physiological analysis.
Abstract:
Tastes and odours are amongst the few water quality standards immediately apparent to a consumer and, as a result, account for most consumer complaints about water quality. Although taste and odour problems can arise from a great many sources, from an operational point of view they are either “predictable” or “unpredictable”. The former - which include problems related to actinomycete and algal growth - have a tendency to occur in certain types of water under certain combinations of conditions, whereas the latter - typically chemical spills - can occur anywhere. Long-term control is one option for predictable problems, although biomanipulation on a large scale has had little success. Detection and avoidance is a more practicable option for both predictable and unpredictable problems, particularly if the distribution network can be serviced from other sources. Where these are not feasible, then water treatment, typically using activated carbon, is possible. In general there is a reasonable understanding of what compounds cause taste and odour problems, and how to treat these. An efficient taste and odour control programme therefore relies ultimately on good management of existing resources. However, a number of problems lie outside the remit of water supply companies and will require more fundamental regulation of activities in the catchment.
Abstract:
We present a simple and practical method for single-ended distributed fiber temperature measurement using microwave (11-GHz) coherent detection and the instantaneous frequency measurement (IFM) technique to detect the spontaneous Brillouin backscattered signal, in which a specially designed RF bandpass filter at 11 GHz is used as a frequency discriminator to transform frequency shift into intensity fluctuation. A Brillouin temperature signal can be obtained at 11 GHz over a sensing length of 10 km. The temperature sensitivity of the detected power, induced by the Brillouin frequency shift, is measured as 2.66%/K. © 2007 Society of Photo-Optical Instrumentation Engineers.
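The reported 2.66%/K power sensitivity implies a direct conversion from a relative power change to a temperature change. A minimal sketch of that conversion (the readings below are hypothetical; only the sensitivity figure comes from the abstract):

```python
def temperature_change_from_power(p_ref, p_meas, sensitivity_per_k=0.0266):
    """Convert a relative change in detected Brillouin backscatter power
    into a temperature change, given a power sensitivity in 1/K."""
    relative_change = (p_meas - p_ref) / p_ref
    return relative_change / sensitivity_per_k

# A 5.32% rise in detected power corresponds to roughly +2 K.
dt = temperature_change_from_power(1.0, 1.0532)
print(round(dt, 6))  # → 2.0
```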
Abstract:
The objective of the work was to develop a non-invasive methodology for image acquisition, processing and nonlinear trajectory analysis of the collective fish response to a stochastic event. Object detection and motion estimation were performed by an optical flow algorithm in order to detect moving fish and simultaneously eliminate background, noise and artifacts. The entropy and the fractal dimension (FD) of the trajectory followed by the centroids of the groups of fish were calculated using Shannon and permutation entropy, and the Katz, Higuchi and Katz-Castiglioni FD algorithms, respectively. The methodology was tested on three case groups of European sea bass (Dicentrarchus labrax), two of which were similar (C1 control and C2 tagged fish) and very different from the third (C3, tagged fish submerged in methylmercury contaminated water). The results indicate that Shannon entropy and the Katz-Castiglioni FD were the most sensitive algorithms and proved to be promising tools for the non-invasive identification and quantification of differences in fish responses. In conclusion, we believe that this methodology has the potential to be embedded in an online/real-time architecture for contaminant monitoring programs in the aquaculture industry.
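The Katz fractal dimension mentioned above has a compact closed form. A minimal sketch of the standard Katz formula for a 2-D trajectory (the authors' exact implementation, and the Castiglioni variant, may differ in details such as how the step count is defined):

```python
import math

def katz_fd(xs, ys):
    """Katz fractal dimension of a planar trajectory:
    FD = log10(n) / (log10(d/L) + log10(n)), where L is the total
    path length, d the maximum distance from the first point,
    and n the number of steps."""
    n = len(xs) - 1
    L = sum(math.dist((xs[i], ys[i]), (xs[i + 1], ys[i + 1])) for i in range(n))
    d = max(math.dist((xs[0], ys[0]), (x, y)) for x, y in zip(xs[1:], ys[1:]))
    return math.log10(n) / (math.log10(d / L) + math.log10(n))

# A straight-line trajectory has fractal dimension 1.
print(katz_fd([0, 1, 2, 3, 4], [0, 0, 0, 0, 0]))  # → 1.0
```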
Abstract:
NOAA’s National Centers for Coastal Ocean Science Biogeography Branch has mapped and characterized large portions of the coral reef ecosystems inside the U.S. coastal and territorial waters, including the U.S. Caribbean. The complementary protocols used in these efforts have enabled scientists and managers to quantitatively and qualitatively compare marine ecosystems in tropical U.S. waters. The Biogeography Branch used similar protocols to generate new benthic habitat maps for Fish Bay, Coral Bay and the St. Thomas East End Reserve (STEER). While this mapping effort marks the third time that some of these shallow-water habitats (≤40 m) have been mapped, it is the first time that nearly 100% of the seafloor has been characterized in each of these areas. It is also the first time that high resolution imagery describing seafloor depth has been collected in each of these areas. Consequently, these datasets provide new information describing the distribution of coral reef ecosystems and serve as a spatial baseline for monitoring change in Fish Bay, Coral Bay and the STEER. Benthic habitat maps were developed for approximately 64.3 square kilometers of seafloor in and around Fish Bay, Coral Bay and the STEER. Twenty-seven percent (17.5 square kilometers) of these habitat maps describe the seafloor inside the boundaries of the STEER, the Virgin Islands National Park and the Virgin Islands Coral Reef National Monument. The remaining 73% (46.8 square kilometers) describe the seafloor outside of these MPA boundaries. These habitat maps were developed using a combination of semi-automated and manual classification methods. Habitats were interpreted from aerial photographs and LiDAR (Light Detection and Ranging) imagery. In total, 155 distinct combinations of habitat classes describing the geology and biology of the seafloor were identified from the source imagery.
Abstract:
Detection and perception of ecological relationships between biota and their surrounding habitats are sensitive to analysis scale and resolution of habitat data. We measured strength of univariate linear correlations between reef fish and seascape variables at multiple spatial scales (25 to 800 m). Correlation strength was used to identify the scale that best associates fish to their surrounding habitat. To evaluate the influence of map resolution, seascape variables were calculated based on 4 separate benthic maps produced using 2 levels of spatial and thematic resolution, respectively. Individual seascape variables explained only 25% of the variability in fish distributions. Length of reef edge was correlated with more aspects of the fish assemblage than other features. Area of seagrass and bare sand correlated with distribution of many fish, not just obligate users. No fish variables correlated with habitat diversity. Individual fish species achieved a wider range of correlations than mobility guilds or the entire fish assemblage. Scales of peak correlation were the same for juveniles and adults in a majority of comparisons. Highly mobile species exhibited broader scales of peak correlation than either resident or moderately mobile fish. Use of different input maps changed perception of the strength and even the scale of peak correlations for many comparisons involving hard bottom edge length and area of sand, whereas results were consistent regardless of map type for comparisons involving area of seagrass and habitat diversity.
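The scale-selection procedure described here — compute the fish–habitat correlation at each buffer scale and keep the scale with the strongest absolute correlation — can be sketched as follows (the data values and scales below are hypothetical illustrations, not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson linear correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def scale_of_peak_correlation(fish, seascape_by_scale):
    """Return the buffer scale whose seascape variable has the strongest
    absolute linear correlation with fish abundance."""
    return max(seascape_by_scale,
               key=lambda s: abs(pearson_r(fish, seascape_by_scale[s])))

fish = [1, 2, 3, 4]                 # abundance at four sites
seascape = {25: [4, 3, 2, 1],       # reef edge length per buffer scale (m)
            100: [1, 2, 3, 5]}
print(scale_of_peak_correlation(fish, seascape))  # → 25
```

The 25 m scale wins here because its (negative) correlation is perfect, illustrating that the criterion uses the absolute value of r.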
Abstract:
Nonindigenous species (NIS) are a major threat to marine ecosystems, with possible dramatic effects on biodiversity, biological productivity, habitat structure and fisheries. The Papahānaumokuākea Marine National Monument (PMNM) has taken active steps to mitigate the threats of NIS in the Northwestern Hawaiian Islands (NWHI). Of particular concern are the 13 NIS already detected in NWHI and two invasive species found among the main Hawaiian Islands, snowflake coral (Carijoa riisei) and a red alga (Hypnea musciformis). Much of the information regarding NIS in NWHI has been collected or informed by surveys using conventional SCUBA or fishing gear. These technologies have significant drawbacks. SCUBA is generally constrained to depths shallower than 40 m and several NIS of concern have been detected well below this limit (e.g., L. kasmira – 256 m), and fishing gear is highly selective. Consequently, not all habitats or species can be properly represented. Effective management of NIS requires knowledge of their spatial distribution and abundance over their entire range. Surveys which provide this requisite information can be expensive, especially in the marine environment and even more so in deepwater. Technologies which minimize costs, increase the probability of detection and are capable of satisfying multiple objectives simultaneously are desired. This report examines survey technologies, with a focus on towed camera systems (TCSs), and modeling techniques which can increase NIS detection and sampling efficiency in deepwater habitats of NWHI, thus filling a critical data gap in present datasets. A pilot study conducted in 2008 at French Frigate Shoals and Brooks Banks was used to investigate the application of TCSs for surveying NIS in habitats deeper than 40 m. Cost and data quality were assessed. Over 100 hours of video was collected, in which 124 sightings of NIS were made among benthic habitats from 20 to 250 m.
Most sightings were of a single cosmopolitan species, Lutjanus kasmira, but Cephalopholis argus and Lutjanus fulvus were also detected. The data expand the spatial distributions of observed NIS into deepwater habitats, identify algal plain as an important habitat, and complement existing data collected using SCUBA and fishing gear. The technology’s principal drawback was its inability to identify organisms of particular concern, such as Carijoa riisei and Hypnea musciformis, due to inadequate camera resolution and the inability to thoroughly inspect sites. To solve this issue we recommend incorporating high-resolution cameras into TCSs, or using alternative technologies, such as technical SCUBA diving or remotely operated vehicles, in place of TCSs. We compared several different survey technologies by cost and their ability to detect NIS; these results are summarized in Table 3.
Abstract:
The Indo-Pacific lionfishes, Pterois miles and P. volitans, are now established along the Southeast U.S. and Caribbean and are expected to expand into the Gulf of Mexico and Central and South America. Prior to this invasion little was known regarding the biology and ecology of these lionfishes. I provide a synopsis of chronology, taxonomy, local abundance, reproduction, early life history and dispersal, venomology, feeding ecology, parasitology, potential impacts, and possible control and management strategies for the lionfish invasion. This information was collected by review of the literature and by direct field and experimental study. I confirm the existence of an unusual supraocular tentacle phenotype and suggest that the high prevalence of this phenotype in the Atlantic is not the result of selection, but likely ontogenetic change. To describe the trophic impacts of lionfish, I report a comprehensive assessment of diet that describes lionfish as a generalist piscivore that preys on over 40 teleost species spanning more than 20 families. Next, I use the histology of gonads to describe both oogenesis and reproductive dynamics of lionfish. Lionfish mature relatively early and reproduce several times per month throughout the entire calendar year off North Carolina and the Bahamas. To investigate predation, an important component of natural mortality, I assessed the vulnerability of juvenile lionfish to predation by native serranids. Juvenile lionfish are not readily consumed by serranids, even after extreme periods of starvation. Last, I used a stage-based, matrix population model to estimate the scale of control that would be needed to reduce an invading population of lionfish. Together, this research provides the first comprehensive assessment of lionfish biology and ecology and explains a number of life history and ecological interactions that have facilitated the unprecedented and rapid establishment of this invasive finfish.
Future research is needed to understand the scale of impacts that lionfish could cause, especially in coral reef ecosystems, which are already heavily stressed. This research further demonstrates the need for lionfish control strategies and more rigorous prevention and early detection and rapid response programs for marine non-native introductions.
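The stage-based matrix population model mentioned above projects a stage vector (e.g. larvae, juveniles, adults) with a transition matrix; the invading population grows whenever the matrix's dominant eigenvalue exceeds 1, and control strategies can be screened by how far they push that eigenvalue below 1. A minimal power-iteration sketch with entirely hypothetical parameter values (the dissertation's actual matrix and rates differ):

```python
def growth_rate(matrix, steps=200):
    """Approximate the dominant eigenvalue (annual growth rate, lambda)
    of a stage-transition matrix by power iteration."""
    n = len(matrix)
    pop = [1.0] * n
    for _ in range(steps):
        new = [sum(matrix[i][j] * pop[j] for j in range(n)) for i in range(n)]
        total = sum(new)
        pop = [v / total for v in new]  # renormalize to avoid overflow
    # One more projection of the converged (sum-1) vector reads off lambda.
    new = [sum(matrix[i][j] * pop[j] for j in range(n)) for i in range(n)]
    return sum(new)

# Hypothetical larva/juvenile/adult matrix: fecundity in the top row,
# survival and stage-transition probabilities below.
A = [[0.0,  0.0, 20000.0],   # recruits produced per adult per year
     [3e-5, 0.2,     0.0],   # larval survival; juveniles remaining juvenile
     [0.0,  0.3,     0.8]]   # juvenile-to-adult transition; adult survival
lam = growth_rate(A)
print(lam > 1.0)  # → True: lambda > 1 means the invading population grows
```

Reducing adult survival (e.g. by culling) lowers the bottom-right entry, and the required removal rate is the one that drives lambda below 1.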
Abstract:
A dynamic programming algorithm for joint data detection and carrier phase estimation of continuous-phase-modulated signals is presented. The intent is to combine the robustness of noncoherent detectors with the superior performance of coherent ones. The algorithm differs from the Viterbi algorithm only in the metric that it maximizes over the possible transmitted data sequences. This metric is influenced both by the correlation with the received signal and the current estimate of the carrier phase. Carrier-phase estimation is based on decision guiding, but there is no external phase-locked loop. Instead, the phase of the best complex correlation with the received signal over the last few signaling intervals is used. The algorithm is slightly more complex than the coherent Viterbi algorithm but does not require narrowband filtering of the recovered carrier, as earlier approaches did, to achieve the same level of performance.
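The phase-estimation idea — derotate by the current phase estimate, decide, then re-estimate the phase from the complex correlation over the last few intervals, with no phase-locked loop — can be illustrated with a greatly simplified symbol-rate PSK sketch. The paper's algorithm runs this logic per survivor inside a CPM trellis; the toy below keeps only the decision-directed phase loop, and all signal values are hypothetical:

```python
import cmath

def detect_with_phase_tracking(received, alphabet=(1 + 0j, -1 + 0j), window=4):
    """Decision-directed detection sketch: derotate each sample by the
    current phase estimate, decide the symbol, then re-estimate the phase
    as the angle of the accumulated complex correlation over the last
    `window` intervals."""
    phase = 0.0
    decisions, recent = [], []
    for r in received:
        derotated = r * cmath.exp(-1j * phase)
        # Decide the symbol maximizing the real part of the correlation.
        best = max(alphabet, key=lambda a: (derotated * a.conjugate()).real)
        decisions.append(best)
        # Remove the modulation and keep a sliding window of correlations.
        recent.append(r * best.conjugate())
        if len(recent) > window:
            recent.pop(0)
        phase = cmath.phase(sum(recent))
    return decisions

# BPSK symbols received with a constant 0.3 rad carrier phase offset.
rx = [cmath.exp(0.3j) * s for s in (1, -1, -1, 1)]
print(detect_with_phase_tracking(rx) == [1, -1, -1, 1])  # → True
```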
Abstract:
An approach to reconfiguring control systems in the event of major failures is advocated. The approach relies on the convergence of several technologies which are currently emerging: constrained predictive control, high-fidelity modelling of complex systems, fault detection and identification, and model approximation and simplification. Much work is needed, both theoretical and algorithmic, to make this approach practical, but we believe that there is enough evidence, especially from existing industrial practice, for the scheme to be considered realistic. After outlining the problem and proposed solution, the paper briefly reviews constrained predictive control and object-oriented modelling, which are the essential ingredients for practical implementation. The prospects for automatic model simplification are also reviewed briefly. The paper emphasizes some emerging trends in industrial practice, especially as regards modelling and control of complex systems. Examples from process control and flight control are used to illustrate some of the ideas.
Abstract:
There are over 600,000 bridges in the US, and not all of them can be inspected and maintained within the specified time frame. This is because manually inspecting bridges is a time-consuming and costly task, and some state Departments of Transportation (DOT) cannot afford the necessary cost and manpower. In this paper, a novel method that can detect large-scale bridge concrete columns is proposed for the purpose of eventually creating an automated bridge condition assessment system. The method employs image stitching techniques (feature detection and matching, image affine transformation and blending) to combine images containing different segments of one column into a single image. Following that, bridge columns are detected by locating their boundaries and classifying the material within each boundary in the stitched image. Preliminary test results on 114 concrete bridge columns stitched from 373 close-up, partial images of the columns indicate that the method can correctly detect 89.7% of these elements, demonstrating the viability of the proposed approach.
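The affine-transformation step of the stitching pipeline maps matched feature points from one image into the coordinate frame of another. In practice the 2×3 affine matrix would be fit by least squares over many matches (typically with outlier rejection); the sketch below solves the exact three-point case — the minimum number of correspondences — using Cramer's rule, with hypothetical point values:

```python
def affine_from_3pts(src, dst):
    """Solve for the 2x3 affine matrix M mapping three non-collinear
    source points onto three destination points exactly:
    [x', y'] = M @ [x, y, 1] for each correspondence."""
    (x1, y1), (x2, y2), (x3, y3) = src

    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    def solve_row(v1, v2, v3):
        # Solve [[x1,y1,1],[x2,y2,1],[x3,y3,1]] @ [a,b,c] = [v1,v2,v3]
        # by Cramer's rule (replace one column with the RHS at a time).
        A = [[x1, y1, 1], [x2, y2, 1], [x3, y3, 1]]
        d = det(A)
        row = []
        for k in range(3):
            Ak = [r[:] for r in A]
            for i, v in enumerate((v1, v2, v3)):
                Ak[i][k] = v
            row.append(det(Ak) / d)
        return row

    return [solve_row(*[p[0] for p in dst]),
            solve_row(*[p[1] for p in dst])]

# Matched features related by a pure translation of (10, 5):
src = [(0, 0), (1, 0), (0, 1)]
dst = [(10, 5), (11, 5), (10, 6)]
print(affine_from_3pts(src, dst))  # → [[1.0, 0.0, 10.0], [0.0, 1.0, 5.0]]
```

Once the matrix is known, every pixel of the partial image can be warped into the reference frame and blended with its neighbor.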
Abstract:
The automated detection of structural elements (e.g., columns and beams) from visual data can be used to facilitate many construction and maintenance applications. Research in this area is still at an early stage. Existing methods rely solely on color and texture information, which makes them unable to distinguish individual structural elements when the elements connect to each other and are made of the same material. This paper presents a novel method of automated concrete column detection from visual data. The method overcomes this limitation by combining columns’ boundary information with their color and texture cues. It starts by recognizing long vertical lines in an image/video frame through edge detection and the Hough transform. The bounding rectangle for each pair of lines is then constructed. When the rectangle resembles the shape of a column and the color and texture contained between the pair of lines match one of the concrete samples in the knowledge base, a concrete column surface is assumed to be located. In this way, one concrete column in images/videos is detected. The method was tested using real images/videos, and the results were compared with manual detection to validate the method.
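The line-recognition step — edge pixels voting in a Hough accumulator, then keeping long vertical lines — can be sketched in a degenerate form where only the vertical orientation bin (theta = 0) is kept, so each edge pixel simply votes for its column. This is a toy illustration on a synthetic edge map, not the full Hough transform the paper would run on real imagery:

```python
def vertical_lines(edges, min_votes):
    """Degenerate Hough transform restricted to vertical lines:
    each edge pixel votes for its column (rho = x at theta = 0);
    columns whose vote count reaches min_votes are reported."""
    votes = {}
    for row in edges:
        for x, on in enumerate(row):
            if on:
                votes[x] = votes.get(x, 0) + 1
    return sorted(x for x, v in votes.items() if v >= min_votes)

# A toy 8x10 edge map with two long vertical edges at x=2 and x=7,
# e.g. the left and right boundaries of one column.
edges = [[1 if x in (2, 7) else 0 for x in range(10)] for _ in range(8)]
print(vertical_lines(edges, min_votes=6))  # → [2, 7]
```

In the paper's pipeline the pair (2, 7) would then define the bounding rectangle whose interior is checked against the concrete samples.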
Abstract:
After earthquakes, licensed inspectors use established codes to assess the impact of damage on structural elements, a process that typically takes days to weeks. However, emergency responders (e.g. firefighters) must act within hours of a disaster event to enter damaged structures to save lives, and therefore cannot wait until an official assessment is complete. This is a risk that firefighters have to take. Although Search and Rescue Organizations offer training seminars to familiarize firefighters with structural damage assessment, their effectiveness is hard to guarantee when firefighters perform life rescue and damage assessment operations simultaneously; moreover, the training is not available to every firefighter. The authors therefore propose a novel framework that can provide firefighters with a quick but crude assessment of damaged buildings by evaluating the visible damage on their critical structural elements (i.e. concrete columns in this study). This paper presents the first step of the framework: automating the detection of concrete columns from visual data. To achieve this, the typical shape of columns (long vertical lines) is recognized using edge detection and the Hough transform. The bounding rectangle for each pair of long vertical lines is then formed. When the resulting rectangle resembles a column and the material contained in the region between the two long vertical lines is recognized as concrete, the region is marked as a concrete column surface. Real video/image data were used to test the method. The preliminary results indicate that concrete columns can be detected when they are not distant and have at least one surface visible.