933 results for "evolving"


Relevance:

10.00%

Publisher:

Abstract:

The attractiveness of the trophic concept is that it was the first attempt at a holistic perspective on an ecosystem to meet with any degree of success. Just as temperature, pressure, and volume allow one to characterize the incomprehensible multitude of particulate motions in a simple gas, the hope is that a small set of figures, such as trophic storages or trophic efficiencies, permits one to compare two ecosystems of overwhelmingly disparate complexity. Thus, if it were possible to demonstrate that an arbitrary network of ecosystem flows could be reduced to a trophic configuration, the aggregation process thus defined would become a key component of the evolving discipline of "macroscopic ecology" (see also Odum 1977 and Ulanowicz 1979).
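The reduction the abstract envisions, collapsing an arbitrary network of flows into a trophic configuration, can be illustrated with a minimal sketch. The flow matrix and compartment names below are hypothetical, and the fractional-trophic-level formula is one standard convention, not necessarily the aggregation the author develops:

```python
import numpy as np

# Hypothetical 4-compartment flow matrix: flows[i, j] = flow from i to j.
# Compartment 0 is a primary producer (no incoming flows).
flows = np.array([
    [0.0, 8.0, 2.0, 0.0],   # producer -> herbivore, detritivore
    [0.0, 0.0, 0.0, 3.0],   # herbivore -> carnivore
    [0.0, 0.0, 0.0, 1.0],   # detritivore -> carnivore
    [0.0, 0.0, 0.0, 0.0],   # carnivore (top consumer)
])

def trophic_levels(flows):
    """Fractional trophic levels: TL_i = 1 + sum_j d_ji * TL_j,
    where d_ji is the fraction of i's total intake coming from j."""
    n = flows.shape[0]
    intake = flows.sum(axis=0)            # total inflow to each compartment
    diet = np.zeros((n, n))
    nonzero = intake > 0
    diet[:, nonzero] = flows[:, nonzero] / intake[nonzero]
    # Solve the linear system (I - diet^T) TL = 1
    return np.linalg.solve(np.eye(n) - diet.T, np.ones(n))

print(trophic_levels(flows))   # producer at level 1, top consumer at level 3
```

Compartments with mixed diets receive non-integer levels, which is the sense in which a whole flow network is aggregated into a small set of trophic figures.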

Relevance:

10.00%

Publisher:

Abstract:

A study was conducted to assess the status of ecological condition and potential human-health risks in subtidal estuarine waters throughout the North Carolina National Estuarine Research Reserve System (NERRS) (Currituck Sound, Rachel Carson, Masonboro Island, and Zeke’s Island). Field work was conducted in September 2006 and incorporated multiple indicators of ecosystem condition including measures of water quality (dissolved oxygen, salinity, temperature, pH, nutrients and chlorophyll, suspended solids), sediment quality (granulometry, organic matter content, chemical contaminant concentrations), biological condition (diversity and abundances of benthic fauna, fish contaminant levels and pathologies), and human dimensions (fish-tissue contaminant levels relative to human-health consumption limits, various aesthetic properties). A probabilistic sampling design permitted statistical estimation of the spatial extent of degraded versus non-degraded condition across these estuaries relative to specified threshold levels of the various indicators (where possible). With some exceptions, these reserves appeared to be in relatively good to fair ecological condition overall, with the majority of the area (about 54%) having various water quality, sediment quality, and biological (benthic) condition indicators rated in the healthy to intermediate range of corresponding guideline thresholds. Only three stations, representing 10.5% of the area, had one or more of these indicators rated as poor/degraded in all three categories. While such a conclusion is encouraging from a coastal management perspective, it should be viewed with some caution. For example, although co-occurrences of adverse biological and abiotic environmental conditions were limited, at least one indicator of ecological condition rated in the poor/degraded range was observed over a broader area (35.5%) represented by 11 of the 30 stations sampled.
In addition, the fish-tissue contaminant data were not included in these overall spatial estimates; however, the majority of samples (77% of fish that were analyzed, from 79% of stations where fish were caught) contained inorganic arsenic above the consumption limits for human cancer risks, though most likely derived from natural sources. Similarly, aesthetic indicators are not reflected in these spatial estimates of ecological condition, though there was evidence of noxious odors in sediments at many of the stations. Such symptoms reflect a growing realization that North Carolina estuaries are under multiple pressures from a variety of natural and human influences. These data also suggest that, while the current status of overall ecological condition appears to be good to fair, long-term monitoring is warranted to track potential changes in the future. This study establishes an important baseline of overall ecological condition within NC NERRS that can be used to evaluate any such future changes and to trigger appropriate management actions in this rapidly evolving coastal environment. (PDF contains 76 pages)
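The spatial-extent figures quoted above come from a probabilistic design. A minimal sketch of the idea follows, under the simplifying (and only approximate) assumption that all 30 stations carry equal area weights; the study's actual weights differ, which is why the naive estimate below is close to but not identical to the reported 35.5%:

```python
import math

# Hypothetical station ratings: 1 = at least one indicator rated poor/degraded.
# Under an equal-weight probabilistic design, each of the 30 stations would
# represent the same fraction of total estuarine area.
ratings = [1] * 11 + [0] * 19   # 11 of 30 stations with a degraded indicator

def extent_estimate(ratings):
    """Estimated % of area in the degraded class, with a binomial standard error."""
    n = len(ratings)
    p = sum(ratings) / n
    se = math.sqrt(p * (1 - p) / n)
    return 100 * p, 100 * se

pct, se = extent_estimate(ratings)
print(f"{pct:.1f}% of area (+/-{se:.1f}% SE)")
```

With unequal weights, the proportion becomes a weighted sum over stations, but the logic is the same.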

Relevance:

10.00%

Publisher:

Abstract:

As part of an ongoing program of benthic sampling and related assessments of sediment quality at Gray’s Reef National Marine Sanctuary (GRNMS) off the coast of Georgia, a survey of soft-bottom benthic habitats was conducted in spring 2005 to characterize condition of macroinfaunal assemblages and levels of chemical contaminants in sediments and biota relative to a baseline survey carried out in spring 2000. Distribution and abundance of macrobenthos were related foremost to sediment type (median particle size, % gravel), which in turn varied according to bottom-habitat mesoscale features (e.g., association with live bottom versus flat or rippled sand areas). Overall abundance and diversity of soft-bottom benthic communities were similar between the two years, though dominance patterns and relative abundances of component species were less repeatable. Seasonal summer pulses of a few taxa (e.g., the bivalve Ervilia sp. A) observed in 2000 were not observed in 2005. Concentrations of chemical contaminants in sediments and biota, though detectable in both years, were consistently at low, background levels and no exceedances of sediment probable bioeffect levels or FDA action levels for edible fish or shellfish were observed. Near-bottom dissolved oxygen levels and organic-matter content of sediments also have remained within normal ranges. Highly diverse benthic assemblages were found in both years, supporting the premise that GRNMS serves as an important reservoir of marine biodiversity. A total of 353 taxa (219 identified to species) were collected during the spring 2005 survey. Cumulatively, 588 taxa (371 identified to species) have been recorded in the sanctuary from surveys in 2000, 2001, 2002, and 2005. Species Accumulation Curves indicate that the theoretical maximum should be in excess of 600 species. 
Results of this study will be of value in advancing strategic science and management goals for GRNMS, including characterization and long-term monitoring of sanctuary resources and processes, as well as supporting evolving interests in ecosystem-based management of the surrounding South Atlantic Bight (SAB) ecosystem. (PDF contains 46 pages)
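The extrapolation from 588 recorded taxa to a theoretical maximum above 600 rests on species accumulation curves. One common incidence-based estimator that captures the idea is Chao2; the Q1/Q2 counts below are hypothetical, and this is not necessarily the estimator the survey used:

```python
def chao2(s_obs, q1, q2):
    """Chao2 incidence-based richness estimator:
    S_est = S_obs + Q1^2 / (2 * Q2), where Q1 and Q2 are the numbers of taxa
    found in exactly one and exactly two surveys, respectively."""
    if q2 == 0:
        return s_obs + q1 * (q1 - 1) / 2.0   # bias-corrected form when Q2 = 0
    return s_obs + q1 * q1 / (2.0 * q2)

# 588 taxa were recorded across the four surveys; Q1 and Q2 here are
# hypothetical illustration values, not the study's data.
print(chao2(588, 40, 30))   # an estimate somewhat above the observed 588
```

Rare taxa (those seen in only one or two surveys) drive the estimate upward, consistent with the curve not yet having reached an asymptote.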

Relevance:

10.00%

Publisher:

Abstract:

Although some services that support Open Access have developed a sustainable business model, many started as projects and continue to run on recurrent project funding or goodwill. If these are critical components of the evolving scholarly communication system, the foundation of Open Access is vulnerable. Knowledge Exchange has commissioned this study as part of a larger programme of work to look at the issue of sustaining key services into the long term. This report focuses on phases one and two of the programme. Phase one was a scoping exercise, carried out mainly through a literature review and an extensive stakeholder interview exercise, to describe the services that are currently available or would be valuable in the future. It also investigated what roles stakeholders could play in this future scenario. Phase two was a stakeholder consultation and engagement exercise. The aim was to engage stakeholders with the work programme so that they could contribute their views, get involved with the work and have a voice in the thinking about future scenarios. The key services are presented for three future scenarios: ‘Gold’ Open Access, fully ‘Green’ Open Access and ‘Green’ Open Access supplementing subscription access as ‘Gold’ OA grows. Three strategic areas are identified as having particular potential for future work. These are embedding business development expertise into service development; consideration of how to move money around the system to enable Open Access to be achieved optimally; and governance and coordination of the infrastructural foundation of Open Access. The report concludes with seven recommendations, both high-level and practical, for further work around these strategic areas.

Relevance:

10.00%

Publisher:

Abstract:

The principal purpose of this document is to assist programme teams throughout the development process when they are considering the development or review of a route through an award that will be delivered wholly, or primarily, via online distance learning. Please note that this document is current as of September 2015; it is an evolving document and is updated from time to time.

Relevance:

10.00%

Publisher:

Abstract:

This study is concerned with the measurement of total factor productivity in the marine fishing industries in general and in the Pacific coast trawl fishery in particular. The study is divided into two parts. Part I contains suitable empirical and introductory theoretical material for the examination of productivity in the Pacific coast trawl fleet. It is self-contained, and contains the basic formulae, empirical results, and discussion. Because the economic theory of index numbers and productivity is constantly evolving and is widely scattered throughout the economics literature, Part II draws together the theoretical literature into one place to allow ready access for readers interested in more details. The major methodological focus of the study is upon the type of economic index number that is most appropriate for use by economists with the National Marine Fisheries Service. This study recommends that the following types of economic index numbers be used: chain rather than fixed base; bilateral rather than multilateral; and one of the class of superlative indices, such as the Tornqvist or Fisher Ideal. (PDF file contains 40 pages.)
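The recommended superlative indices can be sketched directly. The prices and quantities below are hypothetical illustration values, not the fishery data:

```python
import math

def fisher(p0, p1, q0, q1):
    """Fisher Ideal price index: geometric mean of Laspeyres and Paasche."""
    lasp = sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))
    paas = sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))
    return math.sqrt(lasp * paas)

def tornqvist(p0, p1, q0, q1):
    """Tornqvist price index: expenditure-share-weighted geometric mean
    of the individual price relatives."""
    e0 = sum(p * q for p, q in zip(p0, q0))
    e1 = sum(p * q for p, q in zip(p1, q1))
    log_idx = sum(
        0.5 * (pa * qa / e0 + pb * qb / e1) * math.log(pb / pa)
        for pa, pb, qa, qb in zip(p0, p1, q0, q1)
    )
    return math.exp(log_idx)

# Hypothetical ex-vessel prices and landings for two species in two periods.
p0, p1 = [2.0, 5.0], [2.2, 6.0]
q0, q1 = [100.0, 40.0], [90.0, 50.0]
print(fisher(p0, p1, q0, q1), tornqvist(p0, p1, q0, q1))
```

A chained series, as the study recommends, applies these bilateral comparisons period-to-period and multiplies the results, rather than comparing every period against one fixed base.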

Relevance:

10.00%

Publisher:

Abstract:

The 800 km coastline of Nigeria is a huge gateway to a supply of food and raw materials. But while many perceive an immense fishery resource, its full exploitation is obstructed by how little is understood of the ocean processes necessary for effective utilisation. Much basic oceanographic research is needed as a prerequisite to evolving successful strategies for full application of Nigeria's marine fisheries resources.

Relevance:

10.00%

Publisher:

Abstract:

The study of exoplanets is rapidly evolving into an important and exciting field of its own. My investigations over the past half-decade have focused on understanding just a small sliver of what they are trying to tell us. That small sliver is their atmospheres. Atmospheres are the buffer between the bulk planet and the vacuum of space. The atmosphere is an important component of a planet as it is the most readily observable and contains the most information about the physical processes that can occur in a planet. I have focused on two aspects of exoplanetary atmospheres. First, I aimed to understand the chemical mechanisms that control the atmospheric abundances. Second, I focused on interpreting exoplanet atmospheric spectra and what they tell us about the temperatures and compositions through inverse modeling. Finally, I interpreted the retrieved temperature and abundances from inverse modeling in the context of chemical disequilibrium in the planetary atmospheres.

Relevance:

10.00%

Publisher:

Abstract:

The theories of relativity and quantum mechanics, the two most important physics discoveries of the 20th century, not only revolutionized our understanding of the nature of space-time and the way matter exists and interacts, but also became the building blocks of what we currently know as modern physics. My thesis studies both subjects in great depths --- this intersection takes place in gravitational-wave physics.

Gravitational waves are "ripples of space-time", long predicted by general relativity. Although indirect evidence of gravitational waves has been discovered from observations of binary pulsars, direct detection of these waves is still actively being pursued. An international array of laser interferometer gravitational-wave detectors has been constructed in the past decade, and a first generation of these detectors has taken several years of data without a discovery. At this moment, these detectors are being upgraded into second-generation configurations, which will have ten times better sensitivity. Kilogram-scale test masses of these detectors, highly isolated from the environment, are probed continuously by photons. The sensitivity of such a quantum measurement can often be limited by the Heisenberg Uncertainty Principle, and during such a measurement, the test masses can be viewed as evolving through a sequence of nearly pure quantum states.

The first part of this thesis (Chapter 2) concerns how to minimize the adverse effect of thermal fluctuations on the sensitivity of advanced gravitational-wave detectors, thereby making them closer to being quantum-limited. My colleagues and I present a detailed analysis of coating thermal noise in advanced gravitational-wave detectors, which is the dominant noise source of Advanced LIGO in the middle of the detection frequency band. We identified the two elastic loss angles, clarified the different components of the coating Brownian noise, and obtained their cross spectral densities.

The second part of this thesis (Chapters 3-7) concerns formulating experimental concepts and analyzing experimental results that demonstrate the quantum mechanical behavior of macroscopic objects - as well as developing theoretical tools for analyzing quantum measurement processes. In Chapter 3, we study the open quantum dynamics of optomechanical experiments in which a single photon strongly influences the quantum state of a mechanical object. We also explain how to engineer the mechanical oscillator's quantum state by modifying the single photon's wave function.

In Chapters 4-5, we build theoretical tools for analyzing the so-called "non-Markovian" quantum measurement processes. Chapter 4 establishes a mathematical formalism that describes the evolution of a quantum system (the plant), which is coupled to a non-Markovian bath (i.e., one with a memory) while at the same time being under continuous quantum measurement (by the probe field). This aims at providing a general framework for analyzing a large class of non-Markovian measurement processes. Chapter 5 develops a way of characterizing the non-Markovianity of a bath (i.e., whether and to what extent the bath remembers information about the plant) by perturbing the plant and watching for changes in its subsequent evolution. Chapter 6 re-analyzes a recent measurement of a mechanical oscillator's zero-point fluctuations, revealing nontrivial correlation between the measurement device's sensing noise and the quantum back-action noise.

Chapter 7 describes a model in which gravity is classical and matter motions are quantized, elaborating how the quantum motions of matter are affected by the fact that gravity is classical. It offers an experimentally plausible way to test this model (hence the nature of gravity) by measuring the center-of-mass motion of a macroscopic object.

The most promising gravitational waves for direct detection are those emitted from highly energetic astrophysical processes, sometimes involving black holes - a type of object predicted by general relativity whose properties depend highly on the strong-field regime of the theory. Although black holes have been inferred to exist at centers of galaxies and in certain so-called X-ray binary objects, detecting gravitational waves emitted by systems containing black holes will offer a much more direct way of observing black holes, providing unprecedented details of space-time geometry in the black-holes' strong-field region.

The third part of this thesis (Chapters 8-11) studies black-hole physics in connection with gravitational-wave detection.

Chapter 8 applies black hole perturbation theory to model the dynamics of a light compact object orbiting a massive central Schwarzschild black hole. In this chapter, we present a Hamiltonian formalism in which the low-mass object and the metric perturbations of the background spacetime are jointly evolved. Chapter 9 uses WKB techniques to analyze oscillation modes (quasi-normal modes, or QNMs) of spinning black holes. We obtain analytical approximations to the spectrum of the weakly-damped QNMs, with relative error O(1/L^2), and connect these frequencies to geometrical features of spherical photon orbits in Kerr spacetime. Chapter 11 focuses mainly on near-extremal Kerr black holes; we discuss a bifurcation in their QNM spectra for certain ranges of (l,m) (the angular quantum numbers) as a/M → 1. With tools prepared in Chapters 9 and 10, in Chapter 11 we also obtain an analytical approximation for the scalar Green function in Kerr spacetime.
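The WKB connection between weakly-damped QNMs and photon orbits can be illustrated in the simplest, non-spinning case. This is only the leading-order (eikonal) approximation in geometric units with M = 1, far cruder than the O(1/L^2)-accurate expressions the thesis derives:

```python
import math

def eikonal_qnm(l, n, M=1.0):
    """Leading-order (eikonal/WKB) Schwarzschild quasi-normal-mode frequency:
    omega ~ (l + 1/2) * Omega_c - i * (n + 1/2) * lambda_L,
    where Omega_c is the orbital frequency of the light ring (r = 3M) and
    lambda_L its Lyapunov exponent; for Schwarzschild both equal
    1/(3*sqrt(3)*M). Accurate only up to relative corrections of order 1/l."""
    omega_c = 1.0 / (3.0 * math.sqrt(3.0) * M)
    lam = omega_c
    return complex((l + 0.5) * omega_c, -(n + 0.5) * lam)

# Crude next to the known exact l=2, n=0 value M*omega = 0.3737 - 0.0890i,
# as expected for an O(1/l) approximation evaluated at small l.
print(eikonal_qnm(2, 0))
```

The real part tracks the photon-orbit frequency and the imaginary part its instability timescale, which is exactly the geometric correspondence Chapter 9 generalizes to Kerr.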

Relevance:

10.00%

Publisher:

Abstract:

Master's degree in Business Management through Innovation and Internationalization. Academic year 2013/2014

Relevance:

10.00%

Publisher:

Abstract:

Separating the dynamics of variables that evolve on different timescales is a common assumption in exploring complex systems, and a great deal of progress has been made in understanding chemical systems by treating independently the fast processes of an activated chemical species and the slower processes that precede activation. Protein motion underlies all biocatalytic reactions, and understanding the nature of this motion is central to understanding how enzymes catalyze reactions with such specificity and such rate enhancement. This understanding is challenged by evidence of breakdowns in the separability of timescales: dynamics in the active site are not independent of motions of the solvating protein. Quantum simulation methods that bridge these timescales by simultaneously evolving quantum and classical degrees of freedom provide an important means of exploring this breakdown. In the following dissertation, three problems of enzyme catalysis are explored through quantum simulation.

Relevance:

10.00%

Publisher:

Abstract:

Homologous recombination is a source of diversity in both natural and directed evolution. Standing genetic variation that has passed the test of natural selection is combined in new ways, generating functional and sometimes unexpected changes. In this work we evaluate the utility of homologous recombination as a protein engineering tool, both in comparison with and combined with other protein engineering techniques, and apply it to an industrially important enzyme: Hypocrea jecorina Cel5a.

Chapter 1 reviews work over the last five years on protein engineering by recombination. Chapter 2 describes the recombination of Hypocrea jecorina Cel5a endoglucanase with homologous enzymes in order to improve its activity at high temperatures. A chimeric Cel5a that is 10.1 °C more stable than wild-type and hydrolyzes 25% more cellulose at elevated temperatures is reported. Chapter 3 describes an investigation into the synergy of thermostable cellulases that have been engineered by recombination and other methods. An engineered endoglucanase and two engineered cellobiohydrolases synergistically hydrolyzed cellulose at high temperatures, releasing over 200% more reducing sugars over 60 h at their optimal mixture relative to the best mixture of wild-type enzymes. These results provide a framework for engineering cellulolytic enzyme mixtures for the industrial conditions of high temperatures and long incubation times.

In addition to this work on recombination, we explored three other problems in protein engineering. Chapter 4 describes an investigation into replacing enzymes with complex cofactors with simple cofactors, using an E. coli enolase as a model system. Chapter 5 describes engineering broad-spectrum aldehyde resistance in Saccharomyces cerevisiae by evolving an alcohol dehydrogenase simultaneously for activity and promiscuity. Chapter 6 describes an attempt to engineer gene-targeted hypermutagenesis into E. coli to facilitate continuous in vivo selection systems.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a study of the dynamical stability of nascent neutron stars resulting from the accretion induced collapse of rapidly rotating white dwarfs.

Chapter 2 and part of Chapter 3 study the equilibrium models for these neutron stars. They are constructed by assuming that the neutron stars have the same masses, angular momenta, and specific angular momentum distributions as the pre-collapse white dwarfs. If the pre-collapse white dwarf is rapidly rotating, the collapsed object will contain a high density central core of size about 20 km, surrounded by a massive accretion torus extending to hundreds of kilometers from the rotation axis. The ratio of rotational kinetic energy to gravitational binding energy, β, of these neutron stars is found to be less than 0.27 in all cases.

Chapter 3 studies the dynamical stability of these neutron stars by numerically evolving the linearized hydrodynamical equations. A dynamical bar-mode instability is observed when the β of the star is greater than the critical value βd ≈ 0.25. It is expected that the unstable mode will persist until a substantial amount of angular momentum is carried away by gravitational radiation. The detectability of these sources is studied and it is estimated that LIGO II is unlikely to detect them unless the event rate is greater than 10^-6/year/galaxy.

All the calculations on the structure and stability of the neutron stars in Chapters 2 and 3 are carried out using Newtonian hydrodynamics and gravity. Chapter 4 studies the relativistic effects on the structure of these neutron stars. New techniques are developed and used to construct neutron star models to the first post-Newtonian (1PN) order. The structures of the 1PN models are qualitatively similar to the corresponding Newtonian models, but the values of β are somewhat smaller. The maximum β for these 1PN neutron stars is found to be 0.24, which is 8% smaller than the Newtonian result (0.26). However, relativistic effects will also change the critical value βd. A detailed post-Newtonian stability analysis has yet to be carried out to study the relativistic effects on the dynamical stability of these neutron stars.
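The stability criterion running through these chapters reduces to a one-line check on β = T/|W|. A minimal sketch, with hypothetical energy values in arbitrary units:

```python
def bar_mode_unstable(T, W, beta_crit=0.25):
    """Dynamical bar-mode criterion: beta = T/|W| > beta_d (approximately 0.25),
    with T the rotational kinetic energy and W the (negative) gravitational
    binding energy. Returns (beta, unstable?)."""
    beta = T / abs(W)
    return beta, beta > beta_crit

# Hypothetical energies for a rapidly rotating model:
beta, unstable = bar_mode_unstable(T=0.26, W=-1.0)
print(beta, unstable)
```

The chapter's post-Newtonian result can be read against this check: if relativistic corrections cap β at 0.24 while βd stays near 0.25, the Newtonian instability window narrows or closes, which is why the relativistic value of βd itself matters.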

Relevance:

10.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, growing from patient zero to an estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database utilized was collected by myself from an outpatient HIV clinic. The data included dates from 1984 until the present. The second database was from the Multicenter AIDS Cohort Study (MACS) public dataset. The data from the MACS cover the time between 1984 and October 1992. Comparisons are made between both datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are likewise non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, when used in combination with other antiretroviral agents were found to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system as measured by CD4 T cell counts would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable with the incremental cost per year of life saved from coronary artery bypass surgery.

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
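The incremental cost-per-life-year figure discussed in the final chapter is a standard cost-effectiveness ratio. A sketch with purely hypothetical inputs (the study's actual cost and survival assumptions are not reproduced here):

```python
def icer(cost_new, cost_old, years_new, years_old):
    """Incremental cost-effectiveness ratio: additional cost of the new
    treatment divided by the additional life-years it yields."""
    return (cost_new - cost_old) / (years_new - years_old)

# Hypothetical illustration: HAART lifetime cost vs. pre-HAART care cost,
# and survival with vs. without HAART. These are NOT the study's numbers.
print(icer(cost_new=500_000, cost_old=200_000, years_new=13, years_old=10))
```

Because both the numerator and denominator depend on how long HAART prolongs life, the ratio can be far lower than the raw lifetime cost, which is the point of comparing HAART against interventions like bypass surgery on a per-life-year basis.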

Relevance:

10.00%

Publisher:

Abstract:

The dependence of the maximum and average energies of protons produced in the interaction of an intense laser pulse (~1 × 10^16 W cm^-2, 65 fs) with hydrogen clusters in a gas jet, backed up to 80 bar at liquid nitrogen temperature (~80 K), on the backing pressure has been studied. The general trend of the proton energy dependence on the square of the average cluster radius, which is determined by a calibrated Rayleigh scattering measurement, is similar to that described by theory under the single-size approximation. Calculations are made to fit the experimental results under a simplified model by taking into account both a log-normal cluster size distribution and the laser intensity attenuation in the interaction volume. A very good agreement between the experimental proton energy spectra and the calculations is obtained in the high-energy part of the proton energy distributions, but a discrepancy of the fits is revealed in the low-energy part at higher backing pressures, which are associated with denser flows. A possible mechanism which could be responsible for this discrepancy is discussed. Finally, from the fits, the cluster size distribution was found to vary with the gas backing pressure as well as with the evolving time of the gas flow of clusters.
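The model's key ingredients, a log-normal distribution of cluster radii with proton energy scaling as the square of the radius, can be sketched by Monte Carlo. The parameters mu, sigma, and the scale factor k below are hypothetical, not the fitted values:

```python
import math
import random

def mean_proton_energy(mu, sigma, k=1.0, n=100_000, seed=0):
    """Monte-Carlo sketch: cluster radii drawn from a log-normal distribution
    (mu, sigma are the mean and std of ln r); Coulomb-explosion proton energy
    taken as E = k * r^2. Returns the ensemble-mean energy."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        r = rng.lognormvariate(mu, sigma)
        total += k * r * r
    return total / n

# For a log-normal, E[r^2] = exp(2*mu + 2*sigma^2) exactly, so the Monte-Carlo
# estimate can be checked against the closed form:
mu, sigma = math.log(10.0), 0.3
print(mean_proton_energy(mu, sigma), math.exp(2 * mu + 2 * sigma**2))
```

Widening sigma fattens the high-energy tail of the resulting spectrum, which is qualitatively how a size distribution (rather than a single size) reshapes the fitted proton energy distributions.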