982 results for DETERMINES


Relevance:

10.00%

Publisher:

Abstract:

This document presents the results of the monitoring of a repaired coral reef injured by the M/V Jacquelyn L vessel grounding incident of July 7, 1991. This grounding occurred in Florida state waters within the boundaries of the Florida Keys National Marine Sanctuary (FKNMS). The National Oceanic and Atmospheric Administration (NOAA) and the Board of Trustees of the Internal Improvement Trust Fund of the State of Florida (“State of Florida” or “state”) are the co-trustees for the natural resources within the FKNMS and, thus, are responsible for mediating the restoration of the damaged marine resources and monitoring the outcome of the restoration actions. The restoration monitoring program tracks patterns of biological recovery, determines the success of restoration measures, and assesses the site's resiliency to environmental and anthropogenic disturbances over time. The monitoring program at the Jacquelyn L site was to have included an assessment of the structural stability of installed restoration modules and of the biological condition of reattached corals, performed on the following schedule: immediately (i.e., baseline), 1, 3, and 6 years after restoration, and following a catastrophic event. Restoration of this site was completed on July 20, 2000. Due to unavoidable delays in the settlement of the case, the “baseline” monitoring event for this site occurred in July 2004. The catastrophic monitoring event occurred on August 31, 2004, some 2½ weeks after Hurricane Charley passed nearby, almost directly over the Dry Tortugas. The year-one monitoring event occurred in September 2005, shortly after Hurricane Katrina passed some 70 km to the NW. This report presents the results of all three monitoring events. (PDF contains 31 pages.)

Relevance:

10.00%

Publisher:

Abstract:

This dissertation: 1) determines the factor(s) responsible for spawning induction in Nematostella vectensis; 2) isolates, describes, and documents the source of jelly from egg masses of N. vectensis; and 3) describes N. vectensis' early development. Nematostella vectensis were maintained on a 7-day mussel feeding/water change regime over 159 days. Within 36 hours of mussel feeding/water change, 69.1% of females and 78.5% of males spawned reliably. Through manipulation of feeding, water change, oxygen and nitrogenous waste concentrations, spawning induction was found to be triggered by the oxygen concentration associated with water change, and not by feeding. Ammonia, the anemones' major waste product, inhibited this induction in a concentration-dependent manner. Female N. vectensis release eggs in a persistent jellied egg mass, which is unique among the Actiniaria. The major component of this egg mass jelly was a periodic acid-Schiff-positive, 39.5-40.5 kDa glycoprotein. Antibodies developed in rabbits against this glycoprotein bound to jelly of intact egg masses and to granules (~2.8 μm in diameter) present in female anemone mesenteries and their associated filaments. Antibodies did not label male tissues. Nematostella vectensis embryos underwent first karyokinesis ~60 minutes following the addition of sperm to eggs. Second nuclear division took place, followed by first cleavage, 90-120 minutes later. Each of the 4 blastomeres that resulted from first cleavage contained a single nucleus. Arrangement of these blastomeres ranged from radial to pseudospiral. Embryonic development was both asynchronous and holoblastic. Following formation of the 4-cell stage, 71% of embryos proceeded to cleave again to form an 8-cell stage. In each of the remaining 29% of embryos, a fusion of 2-4 blastomeres resulted in 4 possible patterns, which had no effect on either cleavage interval timing or subsequent development. The fusion event was not due to ooplasmic segregation. Blastomeres isolated from 4-celled embryos were regulative and developed into normal planula larvae and juvenile anemones that were 1/4 the size of those that developed from intact 4-celled embryos. Embryos exhibiting the fusion phenomenon were examined at the fine-structural level. The fusion phenomenon resulted in formation of a secondary syncytium and was not a mere compaction of blastomeres.

Relevance:

10.00%

Publisher:

Abstract:

Aquaculture depends largely upon a good aquatic environment, and the quality of the aquatic medium largely determines success in aquaculture. The medium is particularly vulnerable to excessive abstraction (i.e., of surface or groundwater) and to contamination from a range of sources (industrial, agricultural or domestic), as well as to risks of self-pollution. Environmental management options proffered so far include improvements in farming performance (especially related to feed and feeding strategies, stocking densities, water quality management, disease prevention and control, use of chemicals, etc.), the selection of sites and culturable species, treatment of effluents, attention to the sensitivity of recipient waters, and enforcement of environmental regulations and guidelines specific to the culture system. There are presently conceptual frameworks for aquatic environment management, backed by legal and administrative tools, to create and enforce a rational system for water management and for fisheries and aquaculture development, strengthened by adaptive institutionalisation.

Relevance:

10.00%

Publisher:

Abstract:

Isograms of sea surface temperature (°C) have been produced for 1949-1968 for the areas of the eastern Pacific Ocean in which the majority of the skipjack catch is taken. These are in the immediate coastal zone, California (35°N) to Chile (20°S), and the Revillagigedo and Galapagos Islands groups. Skipjack occurrence and apparent abundance (as CSDF, i.e., catch per standard day's fishing, standardized in purse-seiner units) for 1951-1968 were then superimposed on the surface temperature isograms. Results show that skipjack occur at surface temperatures >17°C, but with the majority between 20° and 30°C. Apparent abundance at CSDF >1 ton/day is normally limited to 20°-29°C water, except in two areas in certain years: from the Gulf of Tehuantepec to Cape Mala, rates of 1-9 tons/day are relatively common at 29°-30°C, and off Chimbote (Peru) rates >9 tons/day are occasionally recorded down to 18°C. As expected, there were no apparent relationships between annual thermal conditions in the coastal zone and skipjack abundance (total catch or indices of abundance) in the same or 2 subsequent years. An Appendix to the report determines the quantitative relationships between surface temperature and skipjack abundance in relatively small areal strata in Baja California waters in 1955 and 1958. Relationships generally appeared significant, and opposite in sign, in these years, when temperatures were respectively anomalously cold and warm. (PDF contains 53 pages.)

Relevance:

10.00%

Publisher:

Abstract:

The loss of species is known to have significant effects on ecosystem functioning, but only recently has it been recognized that species loss might rival the effects of other forms of environmental change on ecosystem processes. There is a need for experimental studies that explicitly manipulate species richness and environmental factors concurrently to determine their relative impacts on key ecosystem processes such as plant litter decomposition. It is crucial to understand which factors affect the rate of plant litter decomposition, and the relative magnitude of their effects, because the rate at which plant litter is lost and transformed to other forms of organic and inorganic carbon determines the capacity for carbon storage in ecosystems and the rate at which greenhouse gases such as carbon dioxide are outgassed. Here we compared how an increase in water temperature of 5 °C and the loss of detritivorous invertebrate and plant litter species affect decomposition rates in a laboratory experiment simulating stream conditions. Like some prior studies, we found that species identity, rather than species richness per se, is a key driver of decomposition; additionally, we showed that the loss of particular species can equal or exceed temperature change in its impact on decomposition. Our results indicate that the loss of particular species can be as important a driver of decomposition as substantial temperature change, but also that predicting the relative consequences of species loss and other forms of environmental change requires knowledge of assemblages and their constituent species' ecology and ecophysiology.

Relevance:

10.00%

Publisher:

Abstract:

Two opposite size dependences of the elastic modulus have recently been reported for nanoscale samples: the modulus either decreases or increases with decreasing sample size. In this paper, based on intermolecular potentials and a one-dimensional model, we provide a unified understanding of these two opposite size effects. First, we analyzed the microstructural variation near the surface of an fcc nanofilm based on the Lennard-Jones potential. The atomic lattice near the surface is found to become looser in comparison with the bulk, indicating that atoms in the bulk sit at a balance dominated by repulsive nearest-neighbor forces; accordingly, the elastic moduli decrease with decreasing film thickness. The decrease in moduli should be attributed both to the looser surface layer and to the smaller coordination number of surface atoms. Furthermore, for a general pair potential either a looser or a tighter lattice can appear near the surface, and the governing mechanism is the surplus of the nearest-neighbor force over all other long-range interactions in the pair potential. Surprisingly, this surplus can be expressed simply as a sum of the long-range interactions, and whether that sum is positive or negative determines whether the lattice near the surface is looser or tighter, respectively. To test this concept, we examined ZnO in terms of a Buckingham potential with long-range Coulomb interactions. Compared with its bulk lattice, the ZnO lattice near the surface becomes tighter, indicating that atoms in the bulk sit at a balance dominated by attractive forces, owing to the long-range Coulomb interaction; correspondingly, the elastic modulus of a one-dimensional ZnO chain increases with decreasing size. Finally, a many-body potential for Cu was examined. In this case the surface layer becomes tighter than the bulk and the modulus increases with decreasing size, owing to the long-range repulsive pair interaction as well as the cohesive many-body interaction caused by electron redistribution.
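
To make the mechanism above concrete, here is a minimal numerical sketch (not from the paper; the chain length and Lennard-Jones parameters are illustrative, and the paper analyses an fcc film rather than a chain) that relaxes a one-dimensional Lennard-Jones chain and compares the surface bond length with the interior spacing:

```python
# A 1-D chain of atoms with a Lennard-Jones pair potential, relaxed to its
# minimum-energy configuration. Illustrative only: EPS, SIG and the chain
# length are arbitrary choices, not values from the paper.
import numpy as np
from scipy.optimize import minimize

EPS, SIG, N = 1.0, 1.0, 30  # illustrative LJ parameters and atom count

def energy(x):
    """Total LJ energy over all atom pairs (long-range tails included)."""
    r = np.abs(x[:, None] - x[None, :])[np.triu_indices(N, k=1)]
    return np.sum(4 * EPS * ((SIG / r) ** 12 - (SIG / r) ** 6))

# Start from the dimer equilibrium spacing 2^(1/6)*sigma and relax.
x0 = np.arange(N) * 2 ** (1 / 6) * SIG
x = np.sort(minimize(energy, x0, method="L-BFGS-B").x)

gaps = np.diff(x)
print(f"surface bond length : {gaps[0]:.5f}")
print(f"interior bond length: {gaps[N // 2]:.5f}")
# The end bonds relax to a larger spacing than the interior ones: the
# "looser" surface layer that softens the modulus of a thin LJ film.
```

The long-range attractive tails compress the interior bonds below the dimer minimum; end atoms lack neighbors on one side, so the end bonds stay longer, which is the looser surface layer described above.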

Relevance:

10.00%

Publisher:

Abstract:

A universal Biot number for ceramics, which not only determines the susceptibility of a ceramic to quenching but also indicates when the ceramic fails during thermal shock, is obtained theoretically. The present analysis shows that thermal shock failure of ceramics with a Biot number greater than this universal value is a very rapid process that occurs in the initial regime of heat conduction. This universal Biot number provides a guide for selecting ceramics for thermostructural engineering applications involving thermal shock.
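
For reference, the Biot number in question is the standard dimensionless group below; the abstract does not state the universal critical value, so none is quoted here.

```latex
% Standard definition of the Biot number (not specific to this paper):
%   h : surface heat-transfer coefficient  [W m^-2 K^-1]
%   L : characteristic length of the body  [m]
%   k : thermal conductivity of the solid  [W m^-1 K^-1]
\[
  \mathrm{Bi} \;=\; \frac{h\,L}{k}
\]
```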

Relevance:

10.00%

Publisher:

Abstract:

The first thesis topic is a perturbation method for resonantly coupled nonlinear oscillators. By successive near-identity transformations of the original equations, one obtains new equations with simple structure that describe the long time evolution of the motion. This technique is related to two-timing in that secular terms are suppressed in the transformation equations. The method has some important advantages. Appropriate time scalings are generated naturally by the method, and don't need to be guessed as in two-timing. Furthermore, by continuing the procedure to higher order, one extends (formally) the time scale of valid approximation. Examples illustrate these claims. Using this method, we investigate resonance in conservative, non-conservative and time dependent problems. Each example is chosen to highlight a certain aspect of the method.
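
For readers unfamiliar with the technique, the generic shape of such a transformation (standard normal-form notation, not copied from the thesis) is:

```latex
% Generic near-identity transformation: the u_k are chosen order by order
% in the small parameter epsilon so as to suppress secular terms and leave
% simplified equations for the long-time evolution of y.
\[
  x \;=\; y + \varepsilon\,u_1(y,t) + \varepsilon^2 u_2(y,t) + \cdots
\]
```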

The second thesis topic concerns the coupling of nonlinear chemical oscillators. The first problem is the propagation of chemical waves of an oscillating reaction in a diffusive medium. Using two-timing, we derive a nonlinear equation that determines how spatial variations in the phase of the oscillations evolve in time. This result is the key to understanding the propagation of chemical waves. In particular, we use it to account for certain experimental observations on the Belousov-Zhabotinskii reaction.
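
The abstract does not reproduce the equation itself; for weakly coupled limit-cycle oscillators in a diffusive medium, the long-time phase equation typically takes the Burgers-like form below (standard in the literature; the thesis's exact form may differ):

```latex
% Hedged sketch: theta(x,t) is the local phase of the oscillation, omega
% the bulk frequency, and alpha, beta constants set by the kinetics.
\[
  \theta_t \;=\; \omega + \alpha\,\nabla^2\theta + \beta\,\lvert\nabla\theta\rvert^2
\]
```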

Next, we analyse the interaction between a pair of coupled chemical oscillators. This time, we derive an equation for the phase shift, which measures how much the oscillators are out of phase. This result is the key to understanding M. Marek's and I. Stuchl's results on coupled reactor systems. In particular, our model accounts for synchronization and its bifurcation into rhythm splitting.

Finally, we analyse large systems of coupled chemical oscillators. Using a continuum approximation, we demonstrate mechanisms that cause auto-synchronization in such systems.
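
As a stand-in illustration of auto-synchronization (the thesis's own continuum model is not given in the abstract), a minimal simulation of the standard Kuramoto model shows a large population of oscillators with spread natural frequencies locking onto a common rhythm once the coupling is strong enough:

```python
# Minimal Kuramoto-model sketch of auto-synchronization in a large system
# of coupled oscillators. This is a standard textbook model, used here as
# an illustration; it is not the specific model derived in the thesis.
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 500, 2.0, 0.01, 5000
omega = rng.normal(0.0, 0.5, N)       # spread of natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)  # random initial phases

def order_parameter(theta):
    """|r| = 1 means perfect phase synchrony, |r| ~ 0 means incoherence."""
    return np.abs(np.exp(1j * theta).mean())

print(f"initial coherence: {order_parameter(theta):.3f}")
for _ in range(steps):
    # Mean-field coupling: each oscillator is pulled toward the mean phase.
    r = np.exp(1j * theta).mean()
    theta += dt * (omega + K * np.abs(r) * np.sin(np.angle(r) - theta))
print(f"final coherence  : {order_parameter(theta):.3f}")
```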

Relevance:

10.00%

Publisher:

Abstract:

We present a method of image-speckle contrast for measuring, without precalibration, the root-mean-square roughness and the lateral-correlation length of random surfaces with Gaussian correlation. We use a simplified model of the speckle fields produced by a weakly scattering object in the theoretical analysis. The explicit mathematical relation shows that the saturation value of the image-speckle contrast at a large aperture radius determines the roughness, while the variation of the contrast with the aperture radius determines the lateral-correlation length. In the experiment, we fabricate random surface samples with Gaussian correlation. The square of the image-speckle contrast is measured versus the radius of the aperture in the 4f system, and the roughness and the lateral-correlation length are extracted by fitting the theoretical result to the experimental data. Comparison with measurements by an atomic force microscope shows that our method has satisfactory accuracy. (C) 2002 Optical Society of America.
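
A sketch of the fitting step described above. The model function here is a generic saturating placeholder, not the explicit relation derived in the paper; it merely illustrates extracting a plateau (tied to the roughness) and a rise scale (tied to the correlation length) with a standard least-squares fit:

```python
# Fit squared contrast vs. aperture radius to recover a plateau and a
# rise scale. NOTE: model_c2 is a hypothetical stand-in, NOT the paper's
# derived relation, and the data below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def model_c2(rho, c2_sat, rho0):
    """Hypothetical model: squared contrast rising to a plateau c2_sat."""
    return c2_sat * (1.0 - np.exp(-(rho / rho0) ** 2))

# Synthetic stand-in data (aperture radius in mm, squared contrast).
rho = np.linspace(0.2, 5.0, 25)
c2 = model_c2(rho, 0.40, 1.3) + np.random.default_rng(1).normal(0, 0.01, 25)

(c2_sat, rho0), _ = curve_fit(model_c2, rho, c2, p0=[0.3, 1.0])
print(f"fitted plateau (-> roughness)      : {c2_sat:.3f}")
print(f"fitted rise scale (-> corr. length): {rho0:.3f} mm")
```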

Relevance:

10.00%

Publisher:

Abstract:

Historical definitions of what determines whether one lives in a coastal area have varied over time. According to Culliton (1998), a “coastal county” is a county with at least 15% of its total land area located within a nation's coastal watershed. This emphasizes the land areas within which water flows into the ocean or Great Lakes, but may be better suited to ecosystems or water quality research (Crowell et al. 2007). Some Federal Emergency Management Agency (FEMA) documents suggest that “coastal” includes shoreline-adjacent coastal counties, and perhaps even counties impacted by flooding from coastal storms. An accurate definition of “coastal” is critical in this regard, since FEMA uses such definitions to revise and modernize its Flood Insurance Rate Maps (Crowell et al. 2007). A recent map published by the National Oceanic and Atmospheric Administration's (NOAA) Coastal Services Center for the Coastal Change Analysis Program shows that the “coastal” boundary covers the entire states of New York and Michigan, while nearly all of South Carolina is considered “coastal.” The definition of “coastal” one chooses can have major implications, including for a simple count of the coastal population and for the reach of local or state coastal policies. One aspect of defining what is “coastal” has, however, often been overlooked: using long-term atmospheric climate variables to define the inland extent of the coastal zone. Such a definition, which incorporates temperature, precipitation, wind speed, and relative humidity, is scalable and globally applicable, even in the face of shifting shorelines. A robust definition using common climate variables should narrow the broad definition often associated with “coastal” so that completely landlocked locations would no longer be considered “coastal.” Moreover, the resulting definition, “coastal climate” or “climatology of the coast,” will help coastal resource managers make better-informed decisions on a wide range of climatologically influenced issues. The following sections outline the methodology employed to derive new maps of coastal boundaries in the United States. (PDF contains 3 pages)

Relevance:

10.00%

Publisher:

Abstract:

This thesis focuses mainly on linear algebraic aspects of combinatorics. Let N_t(H) be an incidence matrix with edges versus all subhypergraphs of a complete hypergraph that are isomorphic to H. Richard M. Wilson and the author find the general formula for the Smith normal form or diagonal form of N_t(H) for all simple graphs H and for a very general class of t-uniform hypergraphs H.
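
A tiny concrete instance (ours, for illustration; the thesis treats the general case): take H to be a triangle, index rows by the 6 edges of K_4 and columns by its 4 triangles, and compute the Smith normal form of the resulting incidence matrix over the integers:

```python
# Edge-vs-triangle incidence matrix of K_4 and its Smith normal form.
# Illustrative only: the general formulas for N_t(H) are in the thesis.
from itertools import combinations
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

vertices = range(4)
edges = list(combinations(vertices, 2))        # 6 edges of K_4
triangles = list(combinations(vertices, 3))    # 4 triangles of K_4

# Entry (i, j) is 1 exactly when edge i lies inside triangle j.
N = Matrix(len(edges), len(triangles),
           lambda i, j: int(set(edges[i]) <= set(triangles[j])))

print(smith_normal_form(N, domain=ZZ))
```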

As a continuation, the author determines the formula for diagonal forms of integer matrices obtained from other combinatorial structures, including incidence matrices for subgraphs of a complete bipartite graph and inclusion matrices for multisets.

One major application of diagonal forms is in zero-sum Ramsey theory. For instance, Caro's results in zero-sum Ramsey numbers for graphs and Caro and Yuster's results in zero-sum bipartite Ramsey numbers can be reproduced. These results are further generalized to t-uniform hypergraphs. Other applications include signed bipartite graph designs.

Research results on some other problems are also included in this thesis, such as a Ramsey-type problem on equipartitions, Hartman's conjecture on large sets of designs and a matroid theory problem proposed by Welsh.

Relevance:

10.00%

Publisher:

Abstract:

This thesis introduces fundamental equations and numerical methods for manipulating surfaces in three dimensions via conformal transformations. Conformal transformations are valuable in applications because they naturally preserve the integrity of geometric data. To date, however, there has been no clearly stated and consistent theory of conformal transformations that can be used to develop general-purpose geometry processing algorithms: previous methods for computing conformal maps have been restricted to the flat two-dimensional plane, or other spaces of constant curvature. In contrast, our formulation can be used to produce---for the first time---general surface deformations that are perfectly conformal in the limit of refinement. It is for this reason that we commandeer the title Conformal Geometry Processing.

The main contribution of this thesis is analysis and discretization of a certain time-independent Dirac equation, which plays a central role in our theory. Given an immersed surface, we wish to construct new immersions that (i) induce a conformally equivalent metric and (ii) exhibit a prescribed change in extrinsic curvature. Curvature determines the potential in the Dirac equation; the solution of this equation determines the geometry of the new surface. We derive the precise conditions under which curvature is allowed to evolve, and develop efficient numerical algorithms for solving the Dirac equation on triangulated surfaces.
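
For orientation, the published spin-transformation formulation that this paragraph describes can be sketched as follows (notation may differ from the thesis itself):

```latex
% Hedged sketch of the spin-transformation setup: lambda is a
% quaternion-valued function on the surface, D the intrinsic Dirac
% operator, and rho the prescribed curvature potential.
\[
  (D - \rho)\,\lambda = 0,
  \qquad
  d\tilde{f} = \bar{\lambda}\, df\, \lambda
\]
% The first equation constrains lambda so that the new differential
% d\tilde{f} integrates to an immersion inducing a conformally equivalent
% metric with the prescribed change in curvature.
```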

From a practical perspective, this theory has a variety of benefits: conformal maps are desirable in geometry processing because they do not exhibit shear, and therefore preserve textures as well as the quality of the mesh itself. Our discretization yields a sparse linear system that is simple to build and can be used to efficiently edit surfaces by manipulating curvature and boundary data, as demonstrated via several mesh processing applications. We also present a formulation of Willmore flow for triangulated surfaces that permits extraordinarily large time steps and apply this algorithm to surface fairing, geometric modeling, and construction of constant mean curvature (CMC) surfaces.

Relevance:

10.00%

Publisher:

Abstract:

Different conical emission (CE) patterns are obtained experimentally at various incident powers and beam sizes of pump laser pulses with pulse durations of 7 fs, 44 fs and 100 fs. The results show that it is the incident power, not the incident power density, that determines a given CE pattern. In addition, the critical powers for similar CE patterns are nearly the same for laser pulses with the same spectral bandwidth. Furthermore, for a given CE pattern, the wider the spectral bandwidth of the pump laser pulse, the higher the critical power. This will hopefully provide new insights into the generation of CE patterns in optical media.

Relevance:

10.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, having grown from patient zero to an estimated 650,000 to 900,000 infected Americans. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and spans 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, which covers the period from 1984 to October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival, and that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are likewise non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection there exist high levels of immunosuppression.
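
A sketch of the kind of distributional check described, run on synthetic stand-in data (not the clinic or MACS data): right-skewed counts fail a normality test, illustrating why Gaussian assumptions are unsafe for CD4 analyses:

```python
# Synthetic illustration: generate right-skewed "CD4-like" counts and
# test them against the Gaussian assumption. The gamma parameters are
# arbitrary stand-ins, not estimates from the datasets in this thesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
cd4 = rng.gamma(shape=3.0, scale=150.0, size=500)  # skewed, mean ~450

stat, p = stats.shapiro(cd4)
print(f"Shapiro-Wilk p-value: {p:.2e}  (small p -> non-Gaussian)")
print(f"skewness: {stats.skew(cd4):.2f}")
```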

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood; by quantifying the viral load, one now had a faster, more direct way to test the efficacy of an antiretroviral regimen. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover, and that if viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was the presence of an AIDS-defining illness. A high level of clinical failure, or progression to an endpoint, was found.

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which examines the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated; the direct lifetime cost of treating each HIV-infected patient with HAART is estimated at $353,000 to $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, which is comparable with the incremental cost per year of life saved by coronary artery bypass surgery.

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure, and HIV is not over. The results presented here suggest that the decreases in morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.