937 results for Underwater pipeline inspection


Relevance: 20.00%

Abstract:

This paper presents a motion control system for guidance of an underactuated Unmanned Underwater Vehicle (UUV) along a helical trajectory. The control strategy is developed using Port-Hamiltonian theory and interconnection and damping assignment passivity-based control (IDA-PBC). Using energy routing, the trajectory of a virtual, fully actuated plant is guided onto a vector field. A tracking controller then commands the underactuated plant to follow the velocity of the virtual plant. Integral control inserted between the two control layers adds robustness and disturbance rejection to the design.
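For context, the generic port-Hamiltonian model and the IDA-PBC matching objective take the following standard form; the paper's specific interconnection, damping and energy functions are not given in this abstract:

```latex
% Open-loop port-Hamiltonian model: J skew-symmetric interconnection,
% R >= 0 damping, H the total energy (Hamiltonian), g the input map.
\dot{x} = \bigl(J(x) - R(x)\bigr)\,\nabla H(x) + g(x)\,u
% IDA-PBC chooses the feedback u(x) so that the closed loop matches a
% target port-Hamiltonian system with shaped energy H_d, whose minimum
% sits at the desired state:
\dot{x} = \bigl(J_d(x) - R_d(x)\bigr)\,\nabla H_d(x)
```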

Relevance: 20.00%

Abstract:

This paper presents a motion control system for tracking the attitude and speed of an underactuated, slender-hull unmanned underwater vehicle. The feedback control strategy is developed using Port-Hamiltonian theory. By shaping the target dynamics (the desired closed-loop dynamic response), with particular attention to the target mass matrix, the influence of the unactuated dynamics on the controlled system is suppressed. This results in achievable dynamics that are independent of the stable uncontrolled states. Throughout the design, insight into the physical phenomena involved is used to propose the desired target dynamics. Integral action is added to the system for robustness and to reject steady disturbances. This is achieved via a change of coordinates that results in input-to-state stable (ISS) target dynamics. As a final step in the design, an anti-windup scheme is implemented to account for limited actuator capacity, namely saturation. The performance of the design is demonstrated through simulation with a high-fidelity model.
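As a point of reference for the saturation issue mentioned above, here is a minimal, generic sketch of integral action with back-calculation anti-windup. It is not the paper's port-Hamiltonian construction, and all gains are illustrative:

```python
import numpy as np

def pi_with_antiwindup(err, xi, u_max, kp=2.0, ki=0.5, kaw=1.0, dt=0.01):
    """One step of a PI law with back-calculation anti-windup.

    err   : tracking error
    xi    : integrator state
    u_max : symmetric actuator saturation limit
    Gains are illustrative placeholders, not values from the paper.
    """
    u_cmd = kp * err + ki * xi            # unsaturated command
    u = np.clip(u_cmd, -u_max, u_max)     # what the actuator can deliver
    # Back-calculation: bleed off the integrator while the actuator is
    # saturated, so the integral state does not wind up.
    xi = xi + dt * (err + kaw * (u - u_cmd))
    return u, xi
```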

Relevance: 20.00%

Abstract:

Background: Tuberculosis still remains one of the largest killer infectious diseases, warranting the identification of newer targets and drugs. Identification and validation of appropriate targets for designing drugs are critical steps in drug discovery, and at present they are major bottlenecks. A majority of drugs in current clinical use for many diseases have been designed without knowledge of their targets, perhaps because standard methodologies to identify such targets in a high-throughput fashion do not really exist. With the different kinds of 'omics' data now available, computational approaches can be powerful means of obtaining shortlists of possible targets for further experimental validation. Results: We report a comprehensive in silico target identification pipeline, targetTB, for Mycobacterium tuberculosis. The pipeline incorporates a network analysis of the protein-protein interactome, a flux balance analysis of the reactome, experimentally derived phenotype essentiality data, sequence analyses and a structural assessment of targetability, using novel algorithms recently developed by us. Using flux balance analysis and network analysis, proteins critical for the survival of M. tuberculosis are first identified, followed by comparative genomics with the host, and finally a novel structural analysis of the binding sites to assess the feasibility of a protein as a target. Further analyses include correlation with expression data and non-similarity to gut flora proteins as well as to 'anti-targets' in the host, leading to the identification of 451 high-confidence targets. Through phylogenetic profiling against 228 pathogen genomes, the shortlisted targets have been further explored to identify broad-spectrum antibiotic targets, while also identifying those specific to tuberculosis. Targets that address mycobacterial persistence and drug resistance mechanisms are also analysed. Conclusion: The pipeline provides a rational schema for drug target identification that is likely to have a high rate of success, which is expected to save enormous amounts of money, resources and time in the drug discovery process. A thorough comparison with previously suggested targets in the literature demonstrates the usefulness of our integrated approach, highlighting in particular the importance of systems-level analyses. The method has the potential to be used as a general strategy for target identification and validation, and hence to significantly impact most drug discovery programmes.
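The filtering logic described reads as successive intersections and exclusions over the proteome. A minimal sketch of that set logic, with hypothetical inputs standing in for the FBA, interactome, sequence and structural analyses:

```python
# Hypothetical sketch of the successive-filter logic in the abstract;
# the real targetTB pipeline computes these sets with flux balance
# analysis, interactome network analysis, sequence comparison and
# structural assessment. All names here are illustrative.

def shortlist_targets(proteome, essential, host_homologs,
                      gut_flora_like, anti_target_like, targetable):
    candidates = set(proteome)
    candidates &= essential          # FBA / network / phenotype essentiality
    candidates -= host_homologs      # comparative genomics against the host
    candidates -= gut_flora_like     # avoid hitting gut flora proteins
    candidates -= anti_target_like   # avoid similarity to host 'anti-targets'
    candidates &= targetable         # structural assessment of binding sites
    return candidates
```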

Relevance: 20.00%

Abstract:

Reflection and transmission coefficients of rubberized coir pads over the frequency band 200 kHz to 4 MHz are presented in this paper. These results are compared with those reported for neoprene, paraffin wax, rubber car mat and plastic door mat [1]. The rubberized coir pads were found to possess wideband absorption characteristics. It was found experimentally that 0.05 m thick coir pads have almost 100% absorption in the frequency range 800 kHz to 3 MHz, with a maximum at 2.35 MHz. We have used this material to line a water tank for underwater acoustic studies.
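The quoted absorption figures follow from the measured coefficients through the usual plane-wave energy balance (a standard relation, assuming the same fluid on both sides of the pad; it is not stated explicitly in the abstract):

```latex
% Fraction of incident acoustic energy absorbed by the panel, from the
% measured pressure reflection (R) and transmission (T) coefficients:
\alpha \;=\; 1 - |R|^{2} - |T|^{2}
```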

Relevance: 20.00%

Abstract:

Replacement of deteriorated water pipes is a capital-intensive activity for utility companies. Replacement planning aims to minimize total costs while maintaining a satisfactory level of service and is usually conducted for individual pipes. Scheduling replacement in groups is seen to be a better method and has the potential to provide benefits such as the reduction of maintenance costs and service interruptions. However, developing group replacement schedules is a complex task and often beyond the ability of a human expert, especially when multiple or conflicting objectives need to be catered for, such as minimization of total costs and service interruptions. This paper describes the development of a novel replacement decision optimization model for group scheduling (RDOM-GS), which enables multiple group-scheduling criteria by integrating new cost functions, a service interruption model, and optimization algorithms into a unified procedure. An industry case study demonstrates that RDOM-GS can improve replacement planning significantly and reduce costs and service interruptions.
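The abstract does not disclose RDOM-GS's cost functions, but the trade-off it describes is commonly posed as a weighted objective over candidate group schedules. A purely illustrative sketch:

```python
# Illustrative only: one common way to trade off the two objectives the
# abstract names (total cost vs. service interruptions) is a weighted
# sum over candidate group schedules. RDOM-GS's actual cost functions,
# interruption model and optimization algorithms are not given here.

def schedule_score(groups, pipe_cost, interruption_cost, w=0.5):
    """groups: list of lists of pipe IDs scheduled for joint replacement.

    pipe_cost / interruption_cost: hypothetical callables scoring one group.
    """
    total_cost = sum(pipe_cost(g) for g in groups)
    total_interruption = sum(interruption_cost(g) for g in groups)
    return w * total_cost + (1.0 - w) * total_interruption

# An optimizer (e.g., a genetic algorithm) would then minimize
# schedule_score over feasible groupings of pipes.
```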

Relevance: 20.00%

Abstract:

The paper describes the application of the pipelining principle to the realization of an analogue-to-ternary converter. The circuit shows a considerable saving in hardware compared with an earlier proposed circuit. The main hardware components used are analogue comparators, subtractors and delay elements; hence this method of A/T conversion can operate at a higher sampling frequency.
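A behavioural sketch of how such a pipeline could work, assuming balanced ternary digits (-1/0/+1); the actual circuit in the paper may differ, and this models only the comparator/subtractor/delay arithmetic:

```python
# Each hardware stage needs only comparators, a subtractor and a delay
# element, which is what lets a new sample enter the pipeline every
# clock cycle. Illustrative model, not the paper's circuit.

def at_stage(residue):
    """One pipeline stage: input residue in [-0.5, 0.5]."""
    y = 3.0 * residue                 # analogue gain of 3
    if y > 0.5:                       # comparator thresholds at +/-0.5
        digit = 1
    elif y < -0.5:
        digit = -1
    else:
        digit = 0
    return digit, y - digit           # subtractor forms the new residue

def convert(x, n_stages=6):
    """Convert x in [-0.5, 0.5] to a list of balanced ternary digits."""
    digits, r = [], x
    for _ in range(n_stages):
        d, r = at_stage(r)
        digits.append(d)
    return digits                     # x ~= sum(d * 3**-(i+1))

# Reconstruction error is bounded by 3**-n_stages:
assert abs(sum(d * 3.0**-(i + 1)
               for i, d in enumerate(convert(0.3))) - 0.3) < 3.0**-6
```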

Relevance: 20.00%

Abstract:

In 1999, the Department of Employment, Economic Development and Innovation (DEEDI), Fisheries Queensland, undertook a new initiative to collect long-term monitoring data on various important stocks, including reef fish. This entry contains the data and monitoring manual for the reef fish component of that program, which was based on Underwater Visual Census methodology applied to 24 reefs on the Great Barrier Reef between 1999 and 2004. Data were collected using six 50 m x 5 m transects at 4 sites on each of the 24 reefs. Benthic cover type was also recorded for 10 m of each transect. The attached Access database contains 5 tables:

SITE DETAILS: Survey year; Data entry complete; REF survey site ID; Site # (1-4); Location (reef name); Site Date (date surveyed); Observer 1 (3 initials identifying who estimated fish lengths and recorded benthic cover).

TRANSECT DETAILS: Survey ID; Transect Number (1-6); Time (the transect was surveyed); Visibility (in metres); Minimum Depth surveyed (m); Maximum Depth surveyed (m); Percent of survey completed (%); Comments.

SUBSTRATE: Survey ID; Transect Number (1-6); then % cover of each of the following benthic cover types: Dead Coral, Live Coral, Soft Coral, Rubble, Sand, Sponge, Algae, Sea Grass, Other.

COORDINATES (of survey sites): from -14 38.792 to -19 44.233 and from 145 21.507 to 149 55.515.

SIGHTINGS: ID; Survey ID; Transect Number (1-6); CAAB Code; Scientific Name; Reef Fish Length (estimated fork length of fish; -1 = unknown or not recorded); Outside Transect (-1 recorded if a fish was observed outside a transect); Morph Code (F = footballer morph of Plectropomus laevis, S = spawning colour morph displayed).
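For illustration, two of the five tables rendered as Python dataclasses; field names paraphrase the listing above, and the types are assumptions, since the Access type definitions are not given:

```python
from dataclasses import dataclass

@dataclass
class TransectDetail:
    survey_id: int
    transect_number: int      # 1-6
    time_surveyed: str
    visibility_m: float
    min_depth_m: float
    max_depth_m: float
    percent_complete: float
    comments: str

@dataclass
class Sighting:
    survey_id: int
    transect_number: int      # 1-6
    caab_code: str
    scientific_name: str
    fork_length: float        # estimated fork length; -1 = not recorded
    outside_transect: int     # -1 if observed outside the transect
    morph_code: str           # F = footballer morph (Plectropomus laevis),
                              # S = spawning colour morph
```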

Relevance: 20.00%

Abstract:

Laboratory confirmation methods are important in bovine cysticercosis diagnosis, as other pathologies can result in morphologically similar lesions, leading to false identifications. We developed a probe-based real-time PCR assay to identify Taenia saginata in suspect cysts encountered at meat inspection and compared its use with the traditional method of identification, histology, as well as with a published nested PCR. The assay simultaneously detects T. saginata DNA and a bovine internal control using the cytochrome c oxidase subunit 1 gene of each species, and shows specificity against parasites causing lesions morphologically similar to those of T. saginata. The assay was sufficiently sensitive to detect 1 fg (Ct 35.09 +/- 0.95) of target DNA using serially diluted plasmid DNA in reactions spiked with bovine DNA, as well as in all viable and caseated positive-control cysts. A loss in PCR sensitivity was observed with increasing cyst degeneration, as seen in other molecular methods. In comparison to histology, the assay offered greater sensitivity and accuracy, with 10/19 (53%) T. saginata positives detected by real-time PCR and none by histology. When the results were compared with the reference PCR, the assay was less sensitive but offered the advantages of faster turnaround times and reduced contamination risk. Estimates of the assay's repeatability and reproducibility showed it is highly reliable, with reliability coefficients greater than 0.94.

Relevance: 20.00%

Abstract:

A rough hydrophobic surface, when immersed in water, can result in a "Cassie" state of wetting in which the water is in contact with both the solid surface and the entrapped air. The sustainability of the entrapped air on such surfaces is important for underwater applications such as reduction of flow resistance in microchannels and drag reduction of submerged bodies such as hydrofoils. We utilize an optical technique based on total internal reflection of light at the water-air interface to quantify the spatial distribution of trapped air on such a surface and its variation with immersion time. With this technique, we evaluate the sustainability of the Cassie state on hydrophobic surfaces with four different kinds of textures. The textures studied are regular arrays of pillars, ridges, and holes that were created in silicon by a wet etching technique, and also a texture of random craters obtained through electrodischarge machining of aluminum. These surfaces were rendered hydrophobic with a self-assembled layer of fluorooctyl trichlorosilane. Depending on the texture, the size and shape of the trapped air pockets were found to vary. However, irrespective of the texture, both the size and the number of air pockets were found to decrease gradually with time and eventually disappear, suggesting that the sustainability of the "Cassie" state is finite for all the microstructures studied. This is possibly due to diffusion of air from the trapped air pockets into the water. The time scale for disappearance of air pockets was found to depend on the kind of microstructure and the hydrostatic pressure at the water-air interface. For the surface with a regular array of pillars, the air pockets were found to be in the form of a thin layer perched on top of the pillars, with a large lateral extent compared to the spacing between pillars. For the other surfaces studied, the air pockets are smaller, of the same order as the characteristic length scale of the texture. Measurements for the surface with holes indicate that the time for air-pocket disappearance reduces as the hydrostatic pressure is increased.
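The optical technique exploits total internal reflection: light arriving from the water side is fully reflected at a water-air interface beyond the critical angle, so trapped air pockets show up as bright regions. The governing relation is standard optics, not spelled out in the abstract:

```latex
% Critical angle for total internal reflection at a water-air
% interface, light incident from the water side:
\theta_c = \arcsin\!\left(\frac{n_{\mathrm{air}}}{n_{\mathrm{water}}}\right)
         \approx \arcsin\!\left(\frac{1.00}{1.33}\right)
         \approx 48.8^{\circ}
```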

Relevance: 20.00%

Abstract:

The application of computer-aided inspection, integrating coordinate measuring machines and laser scanners to inspect manufactured aircraft parts through robust registration of two point datasets, is a subject of active research in computational metrology. This paper presents a novel approach to automated inspection by matching shapes based on a modified iterative closest point (ICP) method to define a criterion for the acceptance or rejection of a part. The procedure improves upon existing methods by removing the need to construct either a tessellated or smooth representation of the inspected part, and by dropping the requirement for a priori knowledge of approximate registration and correspondence between the points representing the computer-aided design dataset and the part to be inspected. In addition, the procedure establishes a better measure of error between the two matched datasets. The use of localized region-based triangulation is proposed for tracking the error. The approach described improves the convergence of the ICP technique with a dramatic decrease in computational effort. Experimental results obtained by implementing the proposed approach on both synthetic and practical data show that the method is efficient and robust; these examples validate the algorithm and demonstrate its potential for use in engineering applications.
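For reference, a generic point-to-point ICP iteration (nearest-neighbour correspondence plus SVD-based rigid-transform estimation); the paper's modifications and its region-based error tracking are not reproduced here:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding points.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(measured, cad, iters=50, tol=1e-8):
    """Register measured points onto a CAD point set."""
    tree = cKDTree(cad)
    src = measured.copy()
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(src)       # closest CAD point to each sample
        R, t = best_rigid_transform(src, cad[idx])
        src = src @ R.T + t
        err = float(np.mean(dist ** 2))
        if abs(prev_err - err) < tol:     # stop when the error plateaus
            break
        prev_err = err
    return src, err   # registered points and mean-squared deviation
```

An accept/reject criterion would then compare the converged deviation against the part's tolerance.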

Relevance: 20.00%

Abstract:

We propose a solution for deep packet inspection based on message-passing bipartite networks, addressing the speed and memory issues that limit current solutions. We report on a preliminary implementation and propose a parallel architecture.

Relevance: 20.00%

Abstract:

Deep packet inspection is a technology which enables the examination of the content of information packets being sent over the Internet. The Internet was originally set up using “end-to-end connectivity” as part of its design, allowing nodes of the network to send packets to all other nodes of the network, without requiring intermediate network elements to maintain status information about the transmission. In this way, the Internet was created as a “dumb” network, with “intelligent” devices (such as personal computers) at the end or “last mile” of the network. The dumb network does not interfere with an application's operation, nor is it sensitive to the needs of an application, and as such it treats all information sent over it as (more or less) equal. Yet deep packet inspection allows the examination of packets at places on the network which are not endpoints. In practice, this permits entities such as Internet service providers (ISPs) or governments to observe the content of the information being sent, and perhaps even manipulate it. Indeed, the existence and implementation of deep packet inspection may profoundly challenge the egalitarian and open character of the Internet. This paper will firstly elaborate on what deep packet inspection is and how it works from a technological perspective, before going on to examine how it is being used in practice by governments and corporations. Legal problems have already been created by the use of deep packet inspection, involving fundamental rights (especially of Internet users), such as freedom of expression and privacy, as well as more economic concerns, such as competition and copyright. These issues will be considered, and an assessment of the conformity of the use of deep packet inspection with law will be made. There will be a concentration on the use of deep packet inspection in European and North American jurisdictions, where it has already provoked debate, particularly in the context of discussions on net neutrality. This paper will also incorporate a more fundamental assessment of the values that are desirable for the Internet to respect and exhibit (such as openness, equality and neutrality), before concluding with the formulation of a legal and regulatory response to the use of this technology, in accordance with these values.

Relevance: 20.00%

Abstract:

The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves and local light-element abundances into the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data had exploded, incorporating new, exciting cosmological observables such as lensing, Lyman-alpha forests, type Ia supernovae, baryon acoustic oscillations and Sunyaev-Zeldovich regions, to name a few.

The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, encoded in delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high-precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred level precision. This progress has made it possible to build and test models of the Universe that differ in the way the cosmos evolved during some fraction of the first second after the Big Bang.

This thesis is concerned with high-precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The approximate methods studied are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage.

We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. Next we discuss the map-making problem of a CMB experiment and the characterization of the residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model with correlated isocurvature fluctuations. Currently available data are shown to indicate that future experiments are certainly needed to provide more information on these extra degrees of freedom. Any solid evidence of isocurvature modes would have a considerable impact, owing to their power in model selection.
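Destriping, the map-making approach studied in the thesis, is conventionally written as a baseline-offset data model; a standard statement of it (the notation here is generic, not necessarily the thesis's):

```latex
% Destriping data model: time-ordered data d, pointing matrix P,
% sky map m, offset basis F with baseline amplitudes a, white noise n.
d = P\,m + F\,a + n
% Maximum-likelihood baseline amplitudes, with C_n the white-noise
% covariance and Z the operator projecting out the sky signal:
\hat{a} = \bigl(F^{\mathsf T} C_n^{-1} Z\, F\bigr)^{-1}
          F^{\mathsf T} C_n^{-1} Z\, d,
\qquad
Z = I - P\,\bigl(P^{\mathsf T} C_n^{-1} P\bigr)^{-1} P^{\mathsf T} C_n^{-1}
```

Once the baselines are estimated, the map follows from binning the cleaned data, and the residual noise in the map can be characterized from the uncertainty in the recovered amplitudes.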