912 results for Macro-tidal Beaches
Abstract:
A pilot fish culture project was initiated by Shell Petroleum Development Company of Nigeria Limited in 1981 with specific aims and objectives. Site selection, survey, pond construction methods, and fish production are discussed in light of the experience gained, along with the problems encountered and the solutions found to date. Trials of freshwater fish species were carried out to check their adaptability to brackishwater ponds, and the promising growth rates of these species, compared with the traditional local brackishwater species selected for culture, are reported. The extension programme carried out to date is briefly described.
Abstract:
Some problems of edge waves and standing waves on beaches are examined.
The nonlinear interaction of a wave normally incident on a sloping beach with a subharmonic edge wave is studied. A two-timing expansion is used in the full nonlinear theory to obtain the modulation equations which describe the evolution of the waves. It is shown how large-amplitude edge waves are produced, and the results of the theory are compared with some recent laboratory experiments.
Traveling edge waves are considered in two situations. First, the full linear theory is examined to find the finite-depth effect on the edge waves produced by a moving pressure disturbance. In the second situation, a Stokes' expansion is used to discuss the nonlinear effects in shallow-water edge waves traveling over a bottom of arbitrary shape. The results are compared with those of the full theory for a uniformly sloping bottom.
The finite amplitude effects for waves incident on a sloping beach, with perfect reflection, are considered. A Stokes' expansion is used in the full nonlinear theory to find the corrections to the dispersion relation for the cases of normal and oblique incidence.
Finally, an abstract formulation of the linear water-waves problem is given in terms of a self-adjoint but nonlocal operator. The appropriate spectral representations are developed for two particular cases.
Abstract:
Congress established a legal imperative to restore the quality of our surface waters when it enacted the Clean Water Act in 1972. The act requires that existing uses of coastal waters such as swimming and shellfishing be protected and restored. Enforcement of this mandate is frequently measured in terms of the ability to swim and harvest shellfish in tidal creeks, rivers, sounds, bays, and ocean beaches. Public-health agencies carry out comprehensive water-quality sampling programs to check for bacterial contamination in coastal areas where swimming and shellfishing occur. Advisories that restrict swimming and shellfishing are issued when sampling indicates that bacteria concentrations exceed federal health standards. Such advisories place these coastal waters on the U.S. Environmental Protection Agency's (EPA) list of impaired waters, an action that triggers a federal mandate to prepare a Total Maximum Daily Load (TMDL) analysis, which should result in management plans that restore degraded waters to their designated uses. When coastal waters become polluted, most people assume that improper sewage treatment is to blame. Water-quality studies conducted over the past several decades have shown, however, that improper sewage treatment is a relatively minor source of this impairment. In states like North Carolina, it is estimated that about 80 percent of the pollution flowing into coastal waters is carried there by contaminated surface runoff. Studies show this runoff is the result of significant hydrologic modifications of the natural coastal landscape. In places such as North Carolina, virtually no surface runoff occurred when the coastal landscape was in its natural state: most rainfall soaked into the ground, evaporated, or was used by vegetation. Surface runoff is largely an artificial condition that is created when land uses harden and drain the landscape surfaces.
Roofs, parking lots, roads, fields, and even yards all result in dramatic changes in the natural hydrology of these coastal lands, and generate huge amounts of runoff that flow over the land’s surface into nearby waterways. (PDF contains 3 pages)
Abstract:
Beachfront jurisdictional lines were established by the South Carolina Beachfront Management Act (SC Code §48-39-250 et seq.) in 1988 to regulate the new construction, repair, or reconstruction of buildings and erosion control structures along the state’s ocean shorelines. Building within the state’s beachfront “setback area” is allowed, but is subject to special regulations. For “standard beaches” (those not influenced by tidal inlets or associated shoals), a baseline is established at the crest of the primary oceanfront sand dune; for “unstabilized inlet zones,” the baseline is drawn at the most landward point of erosion during the past forty years. The parallel setback line is then established landward of the baseline a distance of forty times the long-term average annual erosion rate (not less than twenty feet from the baseline in stable or accreting areas). The positions of the baseline and setback line are updated every 8-10 years using the best available scientific and historical data, including aerial imagery, LiDAR, historical shorelines, beach profiles, and long-term erosion rates. One advantage of science-based setbacks is that, by using actual historical and current shoreline positions and beach profile data, they reflect the general erosion threat to beachfront structures. However, recent experiences with revising the baseline and setback line indicate that significant challenges and management implications also exist. (PDF contains 3 pages)
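The setback rule described in this abstract lends itself to a simple numerical sketch. The function below is a hypothetical illustration of the stated formula (forty times the long-term average annual erosion rate, with a twenty-foot floor), not the statutory method itself; the function name and units (feet per year) are assumptions.

```python
def setback_distance(avg_annual_erosion_rate_ft: float) -> float:
    """Distance (ft) from the baseline to the setback line.

    Sketch of the rule in the SC Beachfront Management Act as summarized
    above: forty times the long-term average annual erosion rate, but
    never less than twenty feet (stable or accreting shorelines).
    """
    return max(40.0 * avg_annual_erosion_rate_ft, 20.0)

# A shoreline eroding at 2 ft/yr gets an 80 ft setback;
# a stable (0 ft/yr) shoreline still gets the 20 ft minimum.
print(setback_distance(2.0))   # 80.0
print(setback_distance(0.0))   # 20.0
```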
Abstract:
Soft engineering solutions are the current standard for addressing coastal erosion in the US. In South Carolina, beach nourishment from offshore sand deposits and navigation channels has largely replaced the construction of seawalls and groins, which were common in earlier decades. Soft engineering solutions typically provide a more natural product than hard solutions and avoid the negative impacts on adjacent areas that are often associated with hard solutions. One soft engineering solution that may be underutilized in certain areas is shoal manipulation. (PDF contains 4 pages)
Abstract:
In the quest to develop viable designs for third-generation optical interferometric gravitational-wave detectors, one strategy is to monitor the relative momentum or speed of the test-mass mirrors, rather than monitoring their relative position. The most straightforward design for a speed-meter interferometer that accomplishes this is described and analyzed in Chapter 2. This design (due to Braginsky, Gorodetsky, Khalili, and Thorne) is analogous to a microwave-cavity speed meter conceived by Braginsky and Khalili. A mathematical mapping between the microwave speed meter and the optical interferometric speed meter is developed and used to show (in accord with the speed being a quantum nondemolition observable) that in principle the interferometric speed meter can beat the gravitational-wave standard quantum limit (SQL) by an arbitrarily large amount, over an arbitrarily wide range of frequencies. However, in practice, to reach or beat the SQL, this specific speed meter requires exorbitantly high input light power. The physical reason for this is explored, along with other issues such as constraints on performance due to optical dissipation.
Chapter 3 proposes a more sophisticated version of a speed meter. This new design requires only a modest input power and appears to be a fully practical candidate for third-generation LIGO. It can beat the SQL (the approximate sensitivity of second-generation LIGO interferometers) over a broad range of frequencies (~ 10 to 100 Hz in practice) by a factor h/h_SQL ~ √(W_circ^SQL / W_circ). Here W_circ is the light power circulating in the interferometer arms and W_circ^SQL ≃ 800 kW is the circulating power required to beat the SQL at 100 Hz (the LIGO-II power). If squeezed vacuum (with a power-squeeze factor e^(−2R)) is injected into the interferometer's output port, the SQL can be beat with a much reduced laser power: h/h_SQL ~ √(e^(−2R) W_circ^SQL / W_circ). For realistic parameters (e^(−2R) ≃ 0.1 and W_circ ≃ 800 to 2000 kW), the SQL can be beat by a factor ~ 3 to 4 from 10 to 100 Hz. [However, as the power increases in these expressions, the speed meter becomes more narrowband; additional power and re-optimization of some parameters are required to maintain the wide band.] By performing frequency-dependent homodyne detection on the output (with the aid of two kilometer-scale filter cavities), one can markedly improve the interferometer's sensitivity at frequencies above 100 Hz.
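The quoted scaling can be checked with a back-of-envelope evaluation. The snippet below simply plugs the abstract's parameter ranges into h/h_SQL ~ √(e^(−2R) W_circ^SQL / W_circ); the function name is an invention for illustration, and this naive evaluation yields improvement factors of roughly 3 to 5, broadly consistent with the "~ 3 to 4" quoted above.

```python
from math import sqrt

def sql_beating_ratio(w_circ_kw, w_sql_kw=800.0, power_squeeze=1.0):
    """h/h_SQL ~ sqrt(e^{-2R} * W_SQL / W_circ); values < 1 beat the SQL.

    power_squeeze is e^{-2R} (1.0 means no squeezed-vacuum injection).
    """
    return sqrt(power_squeeze * w_sql_kw / w_circ_kw)

# With e^{-2R} ~ 0.1 and W_circ between 800 and 2000 kW, the SQL
# is beaten by a factor 1/(h/h_SQL) of roughly 3 to 5.
for w_circ in (800.0, 2000.0):
    factor = 1.0 / sql_beating_ratio(w_circ, power_squeeze=0.1)
    print(round(factor, 2))
```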
Chapters 2 and 3 are part of an ongoing effort to develop a practical variant of an interferometric speed meter and to combine the speed meter concept with other ideas to yield a promising third-generation interferometric gravitational-wave detector that entails low laser power.
Chapter 4 is a contribution to the foundations for analyzing sources of gravitational waves for LIGO. Specifically, it presents an analysis of the tidal work done on a self-gravitating body (e.g., a neutron star or black hole) in an external tidal field (e.g., that of a binary companion). The change in the mass-energy of the body as a result of the tidal work, or "tidal heating," is analyzed using the Landau-Lifshitz pseudotensor and the local asymptotic rest frame of the body. It is shown that the work done on the body is gauge invariant, while the body-tidal-field interaction energy contained within the body's local asymptotic rest frame is gauge dependent. This is analogous to Newtonian theory, where the interaction energy is shown to depend on how one localizes gravitational energy, but the work done on the body is independent of that localization. These conclusions play a role in analyses, by others, of the dynamics and stability of the inspiraling neutron-star binaries whose gravitational waves are likely to be seen and studied by LIGO.
Abstract:
1) The 4-beaches survey was the first of its kind on Lake Victoria. Drawing on Participatory Rural Appraisal (PRA) techniques, four landing sites around the lake were selected for long-term monitoring from March 2000 through October 2001. 2) Held in all three riparian countries of Lake Victoria, the stakeholders' workshops aimed to assess the necessity of fisheries management for Lake Victoria and to identify who the stakeholders in fisheries management would be.
Abstract:
As indicated in its title, this book section reviews the methodologies used in the 4-beaches survey and in the various stakeholders' workshops held in all three riparian countries of Lake Victoria.
Abstract:
This paper analyses the location, potential, and drawbacks of Nkombe Beach, the landing site chosen in Uganda for the 4-beaches survey.
Abstract:
This partial translation of a longer article describes the phenomenon of "Blasensand" (bubble sand). Blasensand forms when settled, dried-out sand is suddenly flooded from above. A more detailed explanation of Blasensand is given in this translated part of the paper.
Abstract:
Liquefaction is a devastating instability associated with saturated, loose, and cohesionless soils. It poses a significant risk to distributed infrastructure systems that are vital for the security, economy, safety, health, and welfare of societies. In order to make our cities resilient to the effects of liquefaction, it is important to be able to identify the areas that are most susceptible. Prevalent methodologies for identifying susceptible areas include conventional slope stability analysis and the use of so-called liquefaction charts. However, these methodologies have limitations, which motivate our research objectives. In this dissertation, we investigate the mechanics of the origin of liquefaction in a laboratory test using grain-scale simulations, which helps (i) understand why certain soils liquefy under certain conditions, and (ii) identify a necessary precursor for the onset of flow liquefaction. Furthermore, we investigate the mechanics of liquefaction charts using a continuum plasticity model; this can help in modeling the surface hazards of liquefaction following an earthquake. Finally, we investigate the microscopic definition of soil shear wave velocity, a soil property that is used as an index to quantify the liquefaction resistance of soil. We show that anisotropy in fabric, or grain arrangement, can be correlated with anisotropy in shear wave velocity. This has the potential to quantify the effects of sample disturbance when a soil specimen is extracted from the field. In conclusion, by developing a more fundamental understanding of soil liquefaction, this dissertation takes necessary steps toward a more physical assessment of liquefaction susceptibility at the field scale.
Abstract:
Estimating the abundance of marine macro-invertebrates is complicated by a variety of factors: 1) human factors, such as diver efficiency and diver error; and 2) biological factors, such as aggregation of organisms, crypsis, and nocturnal emergence behavior. Diver efficiency varied with the detectability of an organism, causing underestimation of density by up to 50% in some species. All common species were aggregated at scales of 10-50 m; transects need to be long enough to transcend the scale of patchiness in order to improve accuracy. Some species of sea urchins and sea cucumbers (pepinos) that are cryptic by day emerged at night, so daytime censuses underestimated their abundance by up to 10 times. In the sea cucumber fishery, estimates of abundance need to be made at the scale of the population, i.e. at the scale of hundreds of km. A strategy for this is proposed.
Abstract:
Depth data from archival tags on northern rock sole (Lepidopsetta polyxystra) were examined to assess whether fish used tidal currents to aid horizontal migration. Two northern rock sole, out of 115 released with archival tags in the eastern Bering Sea, were recovered 314 and 667 days after release. Both fish made periodic excursions away from the bottom during mostly night-time hours, but also during particular phases of the tide cycle. One fish that was captured and released in an area of rotary currents made vertical excursions that were correlated with tidal current direction. To test the hypothesis that the fish made vertical excursions to use tidal currents to aid migration, a hypothetical migratory path was calculated using a tide model to predict the current direction and speed during periods when the fish was off the bottom. This migration included limited movements from July through December, followed by a 200-km southern migration from January through February, then a return northward in March and April. The successful application of tidal current information to predict a horizontal migratory path not only provides evidence of selective tidal stream transport but indicates that vertical excursions were conducted primarily to assist horizontal migration.
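The path-reconstruction idea described above — accumulating modeled tidal-current drift only over the intervals when the fish was off the bottom — can be sketched in a few lines. The function below is a simplified illustration, not the study's method: the current samples, time step, and the assumption that the fish drifts passively with the current are all placeholders.

```python
def reconstruct_path(samples, dt_hours):
    """Integrate tidal-current velocity over off-bottom intervals.

    samples: iterable of (u_east_kmh, v_north_kmh, off_bottom) tuples,
    e.g. from a tide model; displacement accumulates only while the
    fish is off the bottom and drifting with the current.
    Returns the cumulative (east_km, north_km) track.
    """
    x = y = 0.0
    path = [(x, y)]
    for u, v, off_bottom in samples:
        if off_bottom:
            x += u * dt_hours
            y += v * dt_hours
        path.append((x, y))
    return path

# Hypothetical example: a southward current is used while off the
# bottom; a northward current is ignored while the fish holds bottom.
track = reconstruct_path([(0.0, -2.0, True), (0.0, 3.0, False)],
                         dt_hours=6.0)
print(track[-1])   # (0.0, -12.0)
```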