876 results for "high-turbidity coastal environments"


Relevance:

100.00%

Publisher:

Abstract:

Vision-based underwater navigation and obstacle avoidance demand robust computer vision algorithms, particularly for operation in turbid water with reduced visibility. This paper describes a novel method for simultaneous underwater image quality assessment, visibility enhancement and disparity computation that increases stereo range resolution under dynamic, natural lighting and turbid conditions. The technique estimates the visibility properties from a sparse 3D map of the original degraded image using a physical underwater light attenuation model. First, an iterated distance-adaptive image contrast enhancement enables dense disparity computation and visibility estimation. Second, using a light attenuation model for ocean water, a color-corrected stereo underwater image is obtained along with a visibility distance estimate. Experimental results in shallow, naturally lit, high-turbidity coastal environments show that the proposed technique improves range estimation over the original images, as well as image quality and color for habitat classification. Furthermore, the recursiveness and robustness of the technique allow implementation onboard an Autonomous Underwater Vehicle to improve navigation and obstacle avoidance performance.
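A distance-dependent color correction of the general kind this abstract alludes to can be sketched with a simple Beer-Lambert attenuation model. The per-channel coefficients and the transmission floor below are illustrative assumptions, not values or code from the paper:

```python
import numpy as np

# Hypothetical per-channel attenuation coefficients for ocean water (1/m);
# red light attenuates fastest, which is why underwater scenes look blue-green.
BETA = np.array([0.60, 0.12, 0.08])  # R, G, B

def color_correct(image, distance, t_floor=0.1):
    """Invert a simple Beer-Lambert attenuation model.

    image    -- HxWx3 float array in [0, 1], the degraded underwater image
    distance -- HxW float array of scene distances in metres (e.g. a stereo
                disparity map converted to depth and densified)
    t_floor  -- lower bound on transmission, to avoid amplifying noise at range
    """
    # Direct transmission t = exp(-beta * d), per color channel
    t = np.exp(-distance[..., None] * BETA)
    restored = image / np.maximum(t, t_floor)
    return np.clip(restored, 0.0, 1.0)
```

In a real system the distance map would come from the densified stereo disparity, and the attenuation coefficients would be estimated from the imagery rather than fixed.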

Sedimentation and high turbidity have long been considered major threats to corals, causing worldwide concern for the health of coral reefs in coastal environments. While studies have demonstrated that sediment conditions characteristic of inshore reefs cause stress in corals, the consequences of such conditions for the physiological status of corals require testing in field situations. Here, I compare the size of energy stores (as lipid content), a proxy for physiological condition, of 2 coral species (Turbinaria mesenterina and Acropora valida) between coastal and offshore environments. Corals on coastal reefs contained 4-fold (T. mesenterina) and 2-fold (A. valida) more lipid than conspecifics offshore, despite turbidity levels 1 order of magnitude higher inshore. Results were consistent across 4 sites in each environment. Reproductive investment in A. valida (a seasonal mass spawner) did not vary between environments, suggesting that the larger lipid stores in corals on coastal reefs are mainly somatic energy reserves. These results demonstrate that the environmental conditions on inshore, high-turbidity reefs do not always negatively impact the physiology of corals. The contrasting lipid levels of T. mesenterina between environments may explain its greater success on coastal reefs.

Purification of drinking water is routinely achieved by use of conventional coagulants and disinfection procedures. However, there are instances, such as flood events, when turbidity reaches extreme levels, while natural organic matter (NOM) may be an issue throughout the year. Consequently, there is a need to develop technologies which can effectively treat water of high turbidity during flood events and remove NOM year round. Our hypothesis was that pebble matrix filtration potentially offered a relatively cheap, simple and reliable means to clarify such challenging water samples. Therefore, a laboratory-scale pebble matrix filter (PMF) column was used to evaluate turbidity and NOM pre-treatment performance on 2013 Brisbane River flood water. Since the high turbidity was only a seasonal and short-term problem, the general applicability of pebble matrix filters for NOM removal was also investigated. A 1.0 m deep bed of pebbles (the matrix), partly in-filled with either sand or crushed glass, was tested, upon which was situated a layer of granular activated carbon (GAC). Turbidity was measured as a surrogate for suspended solids (SS), whereas total organic carbon (TOC) and UV absorbance at 254 nm were measured as surrogate parameters for NOM. Experiments using natural flood water showed that, without the addition of any chemical coagulants, PMF columns achieved at least 50% turbidity reduction when the source water contained moderate hardness levels. For harder water samples, above 85% turbidity reduction was obtained. The ability to remove 50% of turbidity without chemical coagulants may represent significant cost savings to water treatment plants, with added environmental benefits accruing from reduced sludge formation. TOC reductions of 35-47% and UV-254 nm reductions of 24-38% were also observed.
In addition to turbidity removal during flood periods, the ability to remove NOM using the pebble matrix filter throughout the year may have the benefit of reducing disinfection by-product (DBP) formation potential and coagulant demand at water treatment plants. Final head losses were remarkably low, reaching only 11 cm at a filtration velocity of 0.70 m/h.

The 2004 earthquake left several traces of coseismic land deformation and tsunami deposits, both on islands along the plate boundary and on distant shores of the Indian Ocean rim countries. Researchers are now exploring these sites to develop a chronology of past events. Where coastal regions are also inundated by storm surges, there is an additional challenge in discriminating between the deposits formed by these two processes. Paleo-tsunami research relies largely on finding deposits where preservation potential is high and a storm surge origin can be excluded. During the past decade of our work along the Andaman and Nicobar Islands and the east coast of India, we have observed that the 2004 tsunami deposits are best preserved in lagoons, inland streams and on elevated terraces. Chronological evidence for older events obtained from such sites correlates better with that from Thailand, Sri Lanka and Indonesia, reiterating the usefulness of such sites in tsunami geology studies. (C) 2014 Elsevier Ltd. All rights reserved.

The spectral reflectance of the sea surface recorded using ocean colour satellite sensors has been used to estimate chlorophyll-a concentrations for decades. However, in bio-optically complex coastal waters, these estimates are compromised by the presence of several other coloured components besides chlorophyll, especially in regions affected by low-salinity waters. The present work aims to (a) describe the influence of the freshwater plume from the La Plata River on the variability of in situ remote sensing reflectance and (b) evaluate the performance of operational ocean colour chlorophyll algorithms applied to Southwestern Atlantic waters, which receive a marked seasonal contribution from La Plata River discharges. Data from three oceanographic cruises are used, in addition to a historical regional bio-optical dataset. Deviations between measured and estimated chlorophyll-a concentrations are examined in relation to surface water salinity and turbidity gradients to investigate the source of errors in satellite estimates of pigment concentrations. We observed significant seasonal variability in surface reflectance properties, strongly driven by La Plata River plume dynamics and arising from the presence of high levels of inorganic suspended solids and coloured dissolved materials. As expected, existing operational algorithms overestimate the concentration of chlorophyll-a, especially in waters of low salinity (S < 33.5) and high turbidity (Rrs(670) > 0.0012 sr^-1). Additionally, an updated version of the regional algorithm is presented, which clearly improves chlorophyll estimation in these types of coastal environment. In general, the techniques presented here allow us to directly distinguish the bio-optical water types to be considered in algorithm studies by the ocean colour community.
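Operational band-ratio chlorophyll algorithms of the kind evaluated here typically follow the OCx form: a polynomial in the log of a blue-to-green Rrs ratio. A minimal sketch, with coefficients in the style of NASA's OC4 algorithm for illustration (the paper's regional algorithm and its coefficients are not reproduced here):

```python
import math

# Illustrative 4th-order polynomial coefficients in the style of NASA's
# OCx (OC4-type) band-ratio algorithms; operational and regional values differ.
A = [0.3272, -2.9940, 2.7218, -1.2259, -0.5683]

def chlorophyll_ocx(rrs_blue, rrs_green, coeffs=A):
    """Estimate chlorophyll-a (mg m^-3) from a blue/green Rrs band ratio.

    rrs_blue  -- remote sensing reflectance in the blue band (sr^-1)
    rrs_green -- remote sensing reflectance in the green band (sr^-1)
    """
    # Log of the band ratio; tiny floor guards against zero reflectances
    r = math.log10(max(rrs_blue, 1e-6) / max(rrs_green, 1e-6))
    log_chl = sum(c * r**i for i, c in enumerate(coeffs))
    return 10.0 ** log_chl
```

Because high loads of suspended solids and coloured dissolved material raise the green and red reflectances independently of chlorophyll, a ratio-based form like this overestimates chlorophyll-a in turbid plume waters, which is the behaviour the abstract reports.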

One of the key environmental concerns about shrimp farming is the discharge of waters with high levels of nutrients and suspended solids into adjacent waterways. In this paper we synthesize the results of our multidisciplinary research linking ecological processes in intensive shrimp ponds with their downstream impacts in tidal, mangrove-lined creeks. The incorporation of process measurements and bioindicators, in addition to water quality measurements, improved our understanding of the effect of shrimp farm discharges on the ecological health of the receiving water bodies. Changes in water quality parameters alone were an oversimplification of the ecological effects of water discharges, and the use of key measures, including primary production rates, phytoplankton responses to nutrients, community shifts in zooplankton and δ15N ratios in marine plants, has the potential to provide more integrated and robust measures. Ultimately, reduction in nutrient discharges is most likely to ensure the future sustainability of the industry. (C) 2003 Elsevier Ltd. All rights reserved.

Shiga toxin-producing Escherichia coli (STEC) and enteropathogenic E. coli (EPEC) strains may be responsible for food-borne infections in humans. Twenty-eight STEC and 75 EPEC strains previously isolated from French shellfish-harvesting areas and their watersheds and belonging to 68 distinguishable serotypes were characterized in this study. High-throughput real-time PCR was used to search for the presence of 75 E. coli virulence-associated gene targets, and genes encoding Shiga toxin (stx) and intimin (eae) were subtyped using PCR tests and DNA sequencing, respectively. The results showed a high level of diversity between strains, with 17 unique virulence gene profiles for STEC and 56 for EPEC. Seven STEC and 15 EPEC strains were found to display a large number or a particular combination of genetic markers of virulence and the presence of stx and/or eae variants, suggesting their potential pathogenicity for humans. Among these, an O26:H11 stx1a eae-β1 strain was associated with a large number of virulence-associated genes (n = 47), including genes carried on the locus of enterocyte effacement (LEE) or other pathogenicity islands, such as OI-122, OI-71, OI-43/48, OI-50, OI-57, and the high-pathogenicity island (HPI). One O91:H21 STEC strain containing 4 stx variants (stx1a, stx2a, stx2c, and stx2d) was found to possess genes associated with pathogenicity islands OI-122, OI-43/48, and OI-15. Among EPEC strains harboring a large number of virulence genes (n = 34 to 50), eight belonged to serotype O26:H11, O103:H2, O103:H25, O145:H28, O157:H7, or O153:H2.

Current software tools for documenting and developing models of buildings focus on supporting a single user who is a specialist in the specific software used within their own discipline. Extensions to these tools for use by teams maintain the single-discipline view and focus on version and file management. There is a perceived need in industry for tools that specifically support collaboration among individuals from multiple disciplines with both a graphical representation of the design and a persistent data model. This project involves the development of a prototype of such a software tool. We have identified multi-user 3D virtual worlds as an appropriate software base for the development of a collaborative design tool. These worlds are inherently multi-user and therefore directly support collaboration through a sense of awareness of others in the virtual world and their location within it, and they provide various channels for direct and indirect communication. Such software platforms also provide a 3D building and modelling environment that can be adapted to the needs of the building and construction industry. DesignWorld is a prototype system for collaborative design developed by augmenting the Second Life (SL) commercial software platform with a collection of web-based tools for communication and design. Agents manage communication between the 3D virtual world and the web-based tools. In addition, agents maintain a persistent external model of designs in the 3D world which can be augmented with data such as relationships, disciplines and versions not usually associated with 3D virtual worlds but required in design scenarios.

This report is for one of the four Tasks of the CRC project ‘Regenerating Construction to Enhance Sustainability’. The report specifically addresses Task 2 ‘Design guidelines for delivering high quality indoor environments’.

The quality of office indoor environments is considered to consist of those factors that impact the occupants according to their health and well-being and (by consequence) their productivity. Indoor Environment Quality (IEQ) can be characterized by four indicators: • Indoor air quality indicators • Thermal comfort indicators • Lighting indicators • Noise indicators. Within each indicator, there are specific metrics that can be utilized in determining an acceptable quality of an indoor environment based on existing knowledge and best practice. Examples of these metrics are: indoor air levels of pollutants or odorants; operative temperature and its control; radiant asymmetry; task lighting; glare; ambient noise. The way in which these metrics impact occupants is not fully understood, especially when multiple metrics may interact in their impacts. It can be estimated that the potential cost of lost productivity from poor IEQ may be well in excess of other operating costs of a building. However, the relative productivity impacts of each of the four indicators are largely unknown. The CRC project ‘Regenerating Construction to Enhance Sustainability’ has a focus on IEQ impacts before and after building refurbishment. This paper provides an overview of IEQ impacts and criteria and the implementation of a CRC project that is currently researching these factors during the refurbishment of a Melbourne office building. IEQ measurements and their impacts will be reported in a future paper.

The quality of office indoor environments is considered to consist of those factors that impact occupants according to their health and well-being and (by consequence) their productivity. Indoor Environment Quality (IEQ) can be characterized by four indicators: • Indoor air quality indicators • Thermal comfort indicators • Lighting indicators • Noise indicators. Within each indicator, there are specific metrics that can be utilized in determining an acceptable quality of an indoor environment based on existing knowledge and best practice. Examples of these metrics are: indoor air levels of pollutants or odorants; operative temperature and its control; radiant asymmetry; task lighting; glare; ambient noise. The way in which these metrics impact occupants is not fully understood, especially when multiple metrics may interact in their impacts. While the potential cost of lost productivity from poor IEQ has been estimated to exceed building operation costs, the level of impact and the relative significance of the above four indicators are largely unknown. However, they are key factors in the sustainable operation or refurbishment of office buildings. This paper presents a methodology for assessing indoor environment quality (IEQ) in office buildings, and indicators with related metrics for high performance and occupant comfort. These are intended for integration into the specification of sustainable office buildings as key factors to ensure a high degree of occupant habitability, without this being impaired by other sustainability factors. The assessment methodology was applied in a case study on IEQ in Australia’s first ‘six star’ sustainable office building, Council House 2 (CH2), located in the centre of Melbourne. The CH2 building was designed and built with specific focus on sustainability and the provision of a high quality indoor environment for occupants. Actual IEQ performance was assessed in this study by field assessment after construction and occupancy. 
For comparison, the methodology was applied to a 30-year-old conventional building adjacent to CH2 which housed the same or similar occupants and activities. The impact of IEQ on occupant productivity will be reported in a separate future paper.

In today’s global design world, architectural and other related design firms design across time zones and geographically distant locations. High-bandwidth virtual environments have the potential to make a major impact on these global design teams. However, there is insufficient evidence about the way designers collaborate in their normal working environments using traditional and/or digital media. This paper presents a method to study the impact of communication and information technologies on collaborative design practice by comparing design tasks done in a normal working environment with design tasks done in a virtual environment. Before introducing high-bandwidth collaboration technology to the work environment, a baseline study is conducted to observe and analyze the existing collaborative process. Designers currently rely on phone, fax, email, and image files for communication and collaboration. Describing the current context is important for comparison with the following phases. We developed a coding scheme to be used in analyzing three stages of the collaborative design activity. The results will establish the basis for measures of collaborative design activity when a new technology is later introduced to the same work environment – for example, designers using electronic whiteboards, 3D virtual worlds, webcams, and internet phone. The results of this work will form the basis of guidelines for the introduction of technology into global design offices.

In order to support intelligent transportation system (ITS) road safety applications such as collision avoidance, lane departure warnings and lane keeping, a Global Navigation Satellite System (GNSS) based vehicle positioning system has to provide lane-level (0.5 to 1 m) or even in-lane-level (0.1 to 0.3 m) accurate and reliable positioning information to vehicle users. However, current vehicle navigation systems equipped with a single-frequency GPS receiver can only provide road-level accuracy of 5-10 meters. The positioning accuracy can be improved to sub-meter or better with augmented GNSS techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP), which have traditionally been used in land surveying or in slowly moving environments. In these techniques, GNSS correction data generated from a local, regional or global network of GNSS ground stations are broadcast to users via various communication data links, mostly 3G cellular networks and communication satellites. This research aimed to investigate precise positioning system performance when operating in high-mobility environments. This involved evaluation of the performance of both RTK and PPP techniques using: i) a state-of-the-art dual-frequency GPS receiver; and ii) a low-cost single-frequency GNSS receiver. Additionally, this research evaluated the effectiveness of several operational strategies for reducing the load on data communication networks due to correction data transmission, which may be problematic for future wide-area ITS service deployment. These strategies include the use of different data transmission protocols, different correction data format standards, and correction data transmission at less frequent intervals. A series of field experiments was designed and conducted for each research task. Firstly, the performance of the RTK and PPP techniques was evaluated in both static and kinematic (highway driving at speeds exceeding 80 km/h) experiments.
RTK solutions achieved an RMS precision of 0.09 to 0.2 m in static tests and 0.2 to 0.3 m in kinematic tests, while PPP achieved 0.5 to 1.5 m in static and 1 to 1.8 m in kinematic tests using the RTKLIB software. These RMS precision values could be further improved if better RTK and PPP algorithms were adopted. The test results also showed that RTK may be more suitable for lane-level-accuracy vehicle positioning. The professional-grade (dual-frequency) and mass-market-grade (single-frequency) GNSS receivers were tested for their RTK performance in static and kinematic modes. The analysis showed that mass-market-grade receivers provide good solution continuity, although their overall positioning accuracy is worse than that of professional-grade receivers. In an attempt to reduce the load on the data communication network, we first evaluated the use of different correction data format standards, namely the RTCM version 2.x and RTCM version 3.0 formats. A 24-hour transmission test was conducted to compare network throughput. The results showed that a 66% reduction in network throughput can be achieved by using the newer RTCM version 3.0 format compared to the older RTCM version 2.x format. Secondly, experiments were conducted to examine the use of two data transmission protocols, TCP and UDP, for correction data transmission through the Telstra 3G cellular network. The performance of each transmission method was analysed in terms of packet transmission latency, packet dropout, packet throughput, packet retransmission rate, etc. The overall network throughput and latency of UDP data transmission were 76.5% and 83.6% of those of TCP data transmission, while the overall accuracy of the positioning solutions remained at the same level. Additionally, due to the nature of UDP transmission, 0.17% of UDP packets were lost during the kinematic tests, but this loss did not lead to a significant reduction in the quality of the positioning results.
The experimental results from the static and kinematic field tests also showed that mobile network communication may be blocked for a couple of seconds, but the positioning solutions can be kept at the required accuracy level by appropriate setting of the age of differential. Finally, we investigated the effects of using less frequent correction data (transmitted at 1, 5, 10, 15, 20, 30 and 60 second intervals) on the precise positioning system. As the time interval increases, the percentage of ambiguity-fixed solutions gradually decreases, while the positioning error increases from 0.1 to 0.5 m. The results showed that the positioning accuracy could still be kept at the in-lane level (0.1 to 0.3 m) when using correction data transmitted at intervals of up to 20 seconds.
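The fire-and-forget character of UDP correction streaming discussed above can be sketched with a minimal sender/receiver pair. The localhost addresses and the default port (2101, the conventional NTRIP caster port, although NTRIP itself runs over TCP/HTTP) are illustrative assumptions, not details from the study:

```python
import socket

def make_receiver(host="127.0.0.1", port=0):
    """Bind a UDP socket on which a rover would listen for correction data."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock

def send_corrections(packets, host="127.0.0.1", port=2101):
    """Send correction packets over UDP (fire-and-forget).

    Unlike TCP, a lost or reordered datagram is simply dropped rather than
    retransmitted; the rover tolerates occasional loss as long as the age
    of differential stays within its configured limit.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for pkt in packets:
            sock.sendto(pkt, (host, port))
    finally:
        sock.close()
```

The absence of acknowledgements and retransmissions is what gives UDP the lower throughput and latency figures reported above, at the cost of the small packet-loss rate also reported.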

This paper describes a novel optimum path planning strategy for long duration AUV operations in environments with time-varying ocean currents. These currents can exceed the maximum achievable speed of the AUV, as well as temporally expose obstacles. In contrast to most other path planning strategies, paths have to be defined in time as well as space. The solution described here exploits ocean currents to achieve mission goals with minimal energy expenditure, or a tradeoff between mission time and required energy. The proposed algorithm uses a parallel swarm search as a means to reduce the susceptibility to large local minima on the complex cost surface. The performance of the optimisation algorithms is evaluated in simulation and experimentally with the Starbug AUV using a validated ocean model of Brisbane’s Moreton Bay.
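The parallel swarm search idea can be illustrated with a toy example: a particle swarm minimising an energy proxy for a path through a time-varying current field. The current model, the energy function and the PSO constants below are all invented for illustration and are not the paper's vehicle or ocean model:

```python
import math
import random

def current(x, t):
    """Illustrative time-varying cross-track current (m/s) at station x, time t."""
    return 0.8 * math.sin(0.5 * x + 0.3 * t)

def path_energy(offsets, speed=1.0):
    """Energy proxy for a path defined by cross-track offsets at fixed stations.

    Cost grows with the thrust needed against the current on each leg; the
    leg index crudely stands in for both position and arrival time.
    """
    cost = 0.0
    for i, y in enumerate(offsets):
        flow = current(i, i)                 # current encountered on leg i
        cost += (speed + abs(flow - y)) ** 2
    return cost

def swarm_search(dim, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm minimisation of path_energy."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    best_p = [p[:] for p in pos]             # personal best positions
    best_f = [path_energy(p) for p in pos]   # personal best costs
    g = best_p[best_f.index(min(best_f))][:] # global best position
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[k][d] = (0.7 * vel[k][d]
                             + 1.5 * r1 * (best_p[k][d] - pos[k][d])
                             + 1.5 * r2 * (g[d] - pos[k][d]))
                pos[k][d] += vel[k][d]
            f = path_energy(pos[k])
            if f < best_f[k]:
                best_f[k], best_p[k] = f, pos[k][:]
                if f < path_energy(g):
                    g = pos[k][:]
    return g, path_energy(g)
```

Running many such particles in parallel is what reduces the susceptibility to large local minima on a complex cost surface: each particle explores a different basin while the global best pulls the swarm toward the best region found so far.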

The Alliance for Coastal Technologies (ACT) Workshop entitled "Technologies for Measuring Currents in Coastal Environments" was held in Portland, Maine, October 26-28, 2005, with sponsorship by the Gulf of Maine Ocean Observing System (GoMOOS), an ACT partner organization. The primary goals of the event were to summarize recent trends in nearshore research and management applications for current meter technologies, identify how current meters can assist coastal managers in fulfilling their regulatory and management objectives, and recommend actions to overcome barriers to use of the technologies. The workshop was attended by 25 participants representing state and federal environmental management agencies, manufacturers of current meter technologies, and researchers from academic institutions and private industry. Common themes discussed during the workshop included 1) advantages and limitations of existing current measuring equipment, 2) reliability and ease of use of each instrument type, 3) data decoding and interpretation procedures, and 4) mechanisms to facilitate better training and guidance for a broad user group. Seven key recommendations, which were ranked in order of importance during the last day of the workshop, are listed below. 1. Forums should be developed to facilitate the exchange of information among users and industry: a) on-line forums that not only provide information on specific instruments and technologies, but also provide an avenue for the exchange of user experiences with various instruments (i.e. problems encountered, cautions, tips, advantages, etc.) (see References for manufacturer websites with links to application and technical forums at the end of the report); b) regional training meetings for operational managers to exchange ideas on methods for measuring currents and evaluating data; c) mini-meetings or tutorial sessions organized within larger conference venues. 2. A committee of major stakeholders should be convened to develop common standards (similar to the Institute of Electrical and Electronics Engineers (IEEE) committees) that enable users to switch sensors without losing software or display capabilities. (pdf contains 28 pages)