Abstract:
This thesis consists of three separate studies of roles that black holes might play in our universe.
In the first part we formulate a statistical method for inferring the cosmological parameters of our universe from LIGO/VIRGO measurements of the gravitational waves produced by coalescing black-hole/neutron-star binaries. This method is based on the cosmological distance-redshift relation, with "luminosity distances" determined directly, and redshifts indirectly, from the gravitational waveforms. Using the current estimates of binary coalescence rates and projected "advanced" LIGO noise spectra, we conclude that by our method the Hubble constant should be measurable to within an error of a few percent. The errors for the mean density of the universe and the cosmological constant will depend strongly on the size of the universe, varying from about 10% for a "small" universe up to and beyond 100% for a "large" universe. We further study the effects of random gravitational lensing and find that it may strongly impair the determination of the cosmological constant.
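For orientation, such an inference rests on the standard low-redshift expansion of the luminosity distance (a textbook relation, not quoted in the abstract), which makes plain why the Hubble constant is constrained at leading order while the density parameters enter only through higher-order terms, consistent with the error budget described above:

\[
  d_L(z) \simeq \frac{c}{H_0}\left[\, z + \tfrac{1}{2}\,(1 - q_0)\, z^{2} + O(z^{3}) \right],
  \qquad q_0 = \tfrac{1}{2}\,\Omega_m - \Omega_\Lambda .
\]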
In the second part of this thesis we disprove a conjecture that black holes cannot form in an early, inflationary era of our universe, because of a quantum-field-theory induced instability of the black-hole horizon. This instability was supposed to arise from the difference in temperatures of any black-hole horizon and the inflationary cosmological horizon; it was thought that this temperature difference would make every quantum state that is regular at the cosmological horizon be singular at the black-hole horizon. We disprove this conjecture by explicitly constructing a quantum vacuum state that is everywhere regular for a massless scalar field. We further show that this quantum state has all the nice thermal properties that one has come to expect of "good" vacuum states, both at the black-hole horizon and at the cosmological horizon.
In the third part of the thesis we study the evolution and implications of a hypothetical primordial black hole that might have found its way into the center of the Sun or any other solar-type star. As a foundation for our analysis, we generalize the mixing-length theory of convection to an optically thick, spherically symmetric accretion flow (and find in passing that the radial stretching of the inflowing fluid elements leads to a modification of the standard Schwarzschild criterion for convection). When the accretion is that of solar matter onto the primordial hole, the rotation of the Sun causes centrifugal hangup of the inflow near the hole, resulting in an "accretion torus" which produces an enhanced outflow of heat. We find, however, that the turbulent viscosity, which accompanies the convective transport of this heat, extracts angular momentum from the inflowing gas, thereby buffering the torus into a lower luminosity than one might have expected. As a result, the solar surface will not be influenced noticeably by the torus's luminosity until at most three days before the Sun is finally devoured by the black hole. As a simple consequence, accretion onto a black hole inside the Sun cannot be an answer to the solar neutrino puzzle.
Abstract:
Using an unperturbed scattering theory, the characteristics of H-atom photoionization by linearly and by circularly polarized one-cycle laser pulse sequences are studied. The asymmetry between photoelectrons emitted in two opposite directions is investigated. It is found that the degree of asymmetry varies with the carrier-envelope (CE) phase, the laser intensity, and the kinetic energy of the photoelectrons. For linear polarization, the maximal ionization rate varies with the CE phase, and the degree of asymmetry varies with the CE phase in a sine-like pattern. For circular polarization, the maximal ionization rate remains constant for various CE phases, but the degree of asymmetry still varies in a sine-like pattern.
Abstract:
The photoelectron angular distributions (PADs) from above-threshold ionization of atoms irradiated by one-cycle laser pulses satisfy a scaling law. The scaling law states that the main features of the PADs are determined by four dimensionless parameters: (1) the ponderomotive number u_p = U_p/ħω, the ponderomotive energy U_p in units of the laser photon energy; (2) the binding number ε_b = E_b/ħω, the atomic binding energy E_b in units of the laser photon energy; (3) the number of absorbed photons q; and (4) the carrier-envelope phase φ_0, the phase of the carrier wave with respect to the envelope. We verify the scaling law by theoretical analysis and numerical calculation and compare it with the long-pulse case. A possible experimental test of the scaling law is suggested.
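As a rough illustration of the dimensionless parameters involved, the short Python sketch below evaluates u_p and ε_b for a hydrogen atom in an 800 nm pulse; the pulse parameters are hypothetical example values, not taken from the paper.

# Illustrative only: evaluates the dimensionless parameters named in the
# abstract for hypothetical pulse parameters and atomic hydrogen.
def photon_energy_eV(wavelength_nm):
    return 1239.84 / wavelength_nm

def ponderomotive_energy_eV(intensity_W_cm2, wavelength_um):
    # U_p [eV] ~ 9.337e-14 * I [W/cm^2] * lambda^2 [um^2]
    return 9.337e-14 * intensity_W_cm2 * wavelength_um**2

wavelength_nm = 800.0   # assumed carrier wavelength
intensity = 1.0e14      # assumed peak intensity, W/cm^2
E_b = 13.6              # hydrogen binding energy, eV

hw = photon_energy_eV(wavelength_nm)
U_p = ponderomotive_energy_eV(intensity, wavelength_nm / 1000.0)

u_p = U_p / hw          # ponderomotive number
eps_b = E_b / hw        # binding number
print(f"photon energy = {hw:.3f} eV, U_p = {U_p:.3f} eV")
print(f"u_p = {u_p:.2f}, eps_b = {eps_b:.2f}")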
Abstract:
The 42-mile-long White Oak River is one of the last relatively unblemished watery jewels of the N.C. coast. The predominantly black water river meanders through Jones, Carteret and Onslow counties along the central N.C. coast, gradually widening as it flows past Swansboro and into the Atlantic Ocean. It drains almost 12,000 acres of estuaries -- saltwater marshes lined with cordgrass, narrow and impenetrable hardwood swamps and rare stands of red cedar that are flooded with wind tides. The lower portion of the river was so renowned for fat oysters and clams that in times past competing watermen came to blows over its bounty at places that now bear names like Battleground Rock. The lower river is also a designated primary nursery area for such commercially important species as shrimp, spot, Atlantic croaker, blue crabs, weakfish and southern flounder. But the river has been discovered. The permanent population along the lower White Oak increased by almost a third since 1990, and the amount of developed land increased 82 percent during the same period. With the growth have come bacteria. Since the late 1990s, much of the lower White Oak has been added to North Carolina’s list of impaired waters because of bacterial pollution. Forty-two percent of the river’s oyster and clam beds are permanently closed to shellfishing because of high bacteria levels. Fully two-thirds of the river’s shellfish beds are now permanently off limits or close temporarily after a moderate rain. State monitoring indicates that increased runoff from urbanization is the probable cause of the bacterial pollution. (PDF contains 4 pages)
Abstract:
The one-dimensional time-dependent Schrödinger equation is solved numerically to study the influence of laser intensity and wavelength on the ionization of the muonic atom μ^3He in muon-catalyzed fusion reactions. It is found that when the laser intensity is of order 10^19-10^23 W/cm^2, the muonic atom μ^3He has an ionization rate of about 2.7%; when the laser intensity reaches 6.0×10^24 W/cm^2, the ionization of μ^3He becomes significant, and the ionization rate increases with the laser intensity and wavelength, which in turn can effectively improve the muon catalysis efficiency.
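For readers unfamiliar with such calculations, the following is a minimal, generic sketch of split-operator propagation of a one-dimensional time-dependent Schrödinger equation in a laser field. It is an illustration in ordinary atomic units with a soft-core Coulomb potential and modest, assumed field parameters; it does not reproduce the muon-scaled system or the extreme intensities studied in the paper.

import numpy as np

# Generic 1D split-operator TDSE propagation (atomic units), soft-core
# Coulomb potential, dipole-coupled laser field in the length gauge.
N, L = 2048, 200.0                      # grid points, box size (a.u.)
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]
k = 2*np.pi*np.fft.fftfreq(N, d=dx)     # momentum grid

V0 = -1.0/np.sqrt(x**2 + 2.0)           # soft-core Coulomb potential
E0, omega = 0.05, 0.057                 # assumed field amplitude / frequency (a.u.)
dt, nsteps = 0.05, 4000

# crude ground state via imaginary-time relaxation
psi = np.exp(-x**2)
for _ in range(2000):
    psi = psi*np.exp(-0.5*dt*V0)
    psi = np.fft.ifft(np.exp(-0.5*dt*k**2)*np.fft.fft(psi))
    psi = psi*np.exp(-0.5*dt*V0)
    psi = psi/np.sqrt(np.sum(np.abs(psi)**2)*dx)
psi0 = psi.copy()

# real-time propagation in the laser field
for n in range(nsteps):
    t = n*dt
    V = V0 + x*E0*np.sin(omega*t)
    psi = psi*np.exp(-0.5j*dt*V)
    psi = np.fft.ifft(np.exp(-0.5j*dt*k**2)*np.fft.fft(psi))
    psi = psi*np.exp(-0.5j*dt*V)

survival = np.abs(np.sum(np.conj(psi0)*psi)*dx)**2
print(f"ground-state survival probability ~ {survival:.3f}")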
Abstract:
Working paper.
Abstract:
The Lake Elsinore quadrangle covers about 250 square miles and includes parts of the southwest margin of the Perris Block, the Elsinore trough, the southeastern end of the Santa Ana Mountains, and the Elsinore Mountains.
The oldest rocks consist of an assemblage of metamorphics of igneous effusive and sedimentary origin, probably, for the most part, of Triassic age. They are intruded by diorite and various hypabyssal rocks, then in turn by granitic rocks, which occupy over 40 percent of the area. Following this last igneous activity of probable Lower Cretaceous age, an extended period of sedimentation started with the deposition of the marine Upper Cretaceous Chico formation and continued during the Paleocene under alternating marine and continental conditions on the margins of the blocks. A marine regression towards the north, during the Neocene, accounts for the younger Tertiary strata in the region under consideration.
Outpouring of basalts to the southeast indicates that igneous activity was resumed toward the close of the Tertiary. The fault zone, which characterizes the Elsinore trough, marks one of the major tectonic lines of southern California. It separates the upthrown and tilted block of the Santa Ana Mountains to the south from the Perris Block to the north.
Most of the faults are normal in type and nearly parallel to the general trend of the trough, or intersect each other at an acute angle. Vertical displacements generally exceed the horizontal ones and several periods of activity are recognized.
Tilting of Tertiary and older Quaternary sediments in the trough has produced broad synclinal structures which have been modified by subsequent faulting.
Five old surfaces of erosion are exposed on the highlands.
The mineral resources of the region are mainly high-grade clay deposits and mineral waters.
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, starting from patient zero to now an estimated 650,000 to 900,000 Americans infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV from the beginning where there wasn't any treatment for HIV until the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are and projections as to where treatment of HIV/AIDS is headed.
Chapter Two describes the datasets that were used for the analyses. The primary database utilized was collected by me from an outpatient HIV clinic. The data cover the period from 1984 to the present. The second database was from the Multicenter AIDS Cohort Study (MACS) public dataset. The data from the MACS cover the time between 1984 and October 1992. Comparisons are made between both datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
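As a sketch of the kind of survival analysis described, the snippet below computes a minimal Kaplan-Meier estimate; the durations (months until an AIDS-defining illness) and censoring flags are synthetic placeholders, not the clinic data analyzed in the thesis.

import numpy as np

# Synthetic example: time on HAART (months) until an AIDS-defining illness
# (event) or censoring; values are illustrative only.
durations = np.array([3, 8, 12, 12, 18, 24, 30, 36, 36, 48], dtype=float)
observed  = np.array([1, 1,  0,  1,  1,  0,  1,  0,  1,  0])  # 1 = event, 0 = censored

survival = 1.0
for t in np.unique(durations):
    d = np.sum((durations == t) & (observed == 1))  # events at time t
    n = np.sum(durations >= t)                      # at risk just before t
    if d > 0:
        survival *= 1.0 - d / n
        print(f"t = {t:>4.0f} months: S(t) = {survival:.3f}")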
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in 90 percent of the population in controlling viral replication. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime costs for treating each HIV-infected patient with HAART are between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable with the incremental cost per year of life saved from coronary artery bypass surgery.
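For illustration of the cost-effectiveness arithmetic behind such figures, the snippet below computes an incremental cost per year of life saved; every input is a hypothetical placeholder, not a value from the study.

# Hypothetical illustration of an incremental cost-effectiveness ratio (ICER).
# None of these inputs are taken from the study; they only show the arithmetic.
cost_haart     = 450_000.0   # assumed lifetime cost with HAART, dollars
cost_no_haart  = 150_000.0   # assumed lifetime cost without HAART, dollars
years_haart    = 12.0        # assumed life expectancy with HAART, years
years_no_haart = 9.0         # assumed life expectancy without HAART, years

icer = (cost_haart - cost_no_haart) / (years_haart - years_no_haart)
print(f"incremental cost per year of life saved ~ ${icer:,.0f}")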
Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
For some time now, the Latino voice has been gradually gaining strength in American politics, particularly in such states as California, Florida, Illinois, New York, and Texas, where large numbers of Latino immigrants have settled and large numbers of electoral votes are at stake. Yet the issues public officials in these states espouse and the laws they enact often do not coincide with the interests and preferences of Latinos. The fact that Latinos in California and elsewhere have not been able to influence the political agenda in a way that is commensurate with their numbers may reflect their failure to participate fully in the political process by first registering to vote and then consistently turning out on election day to cast their ballots.
To understand Latino voting behavior, I first examine Latino political participation in California during the ten general elections of the 1980s and 1990s, seeking to understand what percentage of the eligible Latino population registers to vote, with what political party they register, how many registered Latinos go to the polls on election day, and what factors might increase their participation in politics. To ensure that my findings are not unique to California, I also consider Latino voter registration and turnout in Texas for the five general elections of the 1990s and compare these results with my California findings.
I offer a new approach to studying Latino political participation in which I rely on county-level aggregate data, rather than on individual survey data, and employ the ecological inference method of generalized bounds. I calculate and compare Latino and white voting-age populations, registration rates, turnout rates, and party affiliation rates for California's fifty-eight counties. Then, in a secondary grouped logit analysis, I consider the factors that influence these Latino and white registration, turnout, and party affiliation rates.
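As a minimal sketch of the deterministic bounds on which generalized-bounds ecological inference builds, the snippet below computes the range of possible Latino registration rates implied by county-level aggregates; the county figures are hypothetical, not taken from the study.

# Duncan-Davis style bounds from aggregate data.  For each county, x is the
# Latino share of the voting-age population and t is the overall registration
# rate; both values below are hypothetical examples.
counties = {
    "County A": {"x": 0.60, "t": 0.45},
    "County B": {"x": 0.70, "t": 0.50},
}

for name, c in counties.items():
    x, t = c["x"], c["t"]
    # The accounting identity t = x*b_L + (1-x)*b_W, with both rates in [0, 1],
    # bounds the Latino registration rate b_L:
    lower = max(0.0, (t - (1.0 - x)) / x)
    upper = min(1.0, t / x)
    print(f"{name}: Latino registration rate between {lower:.2f} and {upper:.2f}")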
I find that California Latinos register and turn out at substantially lower rates than do whites and that these rates are more volatile than those of whites. I find that Latino registration is motivated predominantly by age and education, with older and more educated Latinos being more likely to register. Motor voter legislation, which was passed to ease and simplify the registration process, has not encouraged Latino registration. I find that turnout among California's Latino voters is influenced primarily by issues, income, educational attainment, and the size of the Spanish-speaking communities in which they reside. Although language skills may be an obstacle to political participation for an individual, the number of Spanish-speaking households in a community does not encourage or discourage registration but may encourage turnout, suggesting that cultural and linguistic assimilation may not be the entire answer.
With regard to party identification, I find that Democrats can expect a steady Latino political identification rate between 50 and 60 percent, while Republicans attract 20 to 30 percent of Latino registrants. I find that education and income are the dominant factors in determining Latino political party identification, which appears to be no more volatile than that of the larger electorate.
Next, when I consider registration and turnout in Texas, I find that Latino registration rates are nearly equal to those of whites but that Texas Latino turnout rates are volatile and substantially lower than those of whites.
Low turnout rates among Latinos and the volatility of these rates may explain why Latinos in California and Texas have had little influence on the political agenda even though their numbers are large and increasing. Simply put, the voices of Latinos are little heard in the halls of government because they do not turn out consistently to cast their votes on election day.
While these findings suggest that there may not be any short-term or quick fixes to Latino participation, they also suggest that Latinos should be encouraged to participate more fully in the political process and that additional education may be one means of achieving this goal. Candidates should speak more directly to the issues that concern Latinos. Political parties should view Latinos as crossover voters rather than as potential converts. In other words, if Latinos were "a sleeping giant," they may now be a still-drowsy leviathan waiting to be wooed by either party's persuasive political messages and relevant issues.
Abstract:
Part I: An approach to the total synthesis of the triterpene shionone is described, which proceeds through the tetracyclic ketone i. The shionone side chain has been attached to this key intermediate in 5 steps, affording the olefin 2 in 29% yield. A method for the stereo-specific introduction of the angular methyl group at C-5 of shionone has been developed on a model system. The attempted utilization of this method to convert olefin 2 into shionone is described.
Part II: A method has been developed for activating the C-9 and C-10 positions of estrogenic steroids for substitution. Estrone has been converted to 4β,5β-epoxy-10β-hydroxyestr-3-one; cleavage of this epoxyketone using an Eschenmoser procedure, and subsequent modification of the product afforded 4-seco-9-estren-3,5-dione 3-ethylene acetal. This versatile intermediate, suitable for substitution at the 9 and/or 10 position, was converted to androst-4-ene-3-one by known procedures.
Abstract:
The velocity of selectively-introduced edge dislocations in 99.999 percent pure copper crystals has been measured as a function of stress at temperatures from 66°K to 373°K by means of a torsion technique. The range of resolved shear stress was 0 to 15 megadynes/cm^2 for seven temperatures (66°K, 74°K, 83°K, 123°K, 173°K, 296°K, and 373°K).
Dislocation mobility is characterized by two distinct features: (a) relatively high velocity at low stress (maximum velocities of about 9000 cm/sec were realized at low temperatures), and (b) increasing velocity with decreasing temperature at constant stress.
The relation between dislocation velocity and resolved shear stress is:
v = v_o(τ_r/τ_o)^n
where v is the dislocation velocity at resolved shear stress τ_r, v_o is a constant velocity chosen equal to 2000 cm/sec, τ_o is the resolved shear stress required to maintain velocity v_o, and n is the mobility coefficient. The experimental results indicate that τ_o decreases from 16.3 x 10^6 to 3.3 x 10^6 dynes/cm^2 and n increases from about 0.9 to 1.1 as the temperature is lowered from 296°K to 66°K.
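A quick numerical reading of this mobility law, using the parameter values quoted above (v_o = 2000 cm/sec; τ_o and n at the two temperature extremes) and an arbitrarily chosen example stress, reproduces the reported trend of higher velocity at lower temperature:

def dislocation_velocity(tau_r, tau_o, n, v_o=2000.0):
    # Edge-dislocation velocity (cm/sec) from the empirical law v = v_o*(tau_r/tau_o)^n.
    return v_o * (tau_r / tau_o) ** n

tau_r = 10.0e6  # example resolved shear stress, dynes/cm^2 (arbitrary choice)
for T, tau_o, n in [(296, 16.3e6, 0.9), (66, 3.3e6, 1.1)]:
    print(f"T = {T} K: v ~ {dislocation_velocity(tau_r, tau_o, n):.0f} cm/sec")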
The experimental dislocation behavior is consistent with an interpretation on the basis of phonon drag. However, the complete temperature dependence of dislocation mobility could not be closely approximated by the predictions of one or a combination of mechanisms.
Abstract:
This article describes the streams of this unique area of Britain and reviews the published and some unpublished information that is currently available. None of the rivers in the New Forest are more than 30 km long. Many reaches have been artificially straightened, channelized and regraded since the 1840s. The stream waters are typically base-poor, with low nutrient concentrations. Primary productivity and standing crops of algae are predictably low when compared with other streams carrying higher concentrations of minerals and nutrients. The earliest records on the macroinvertebrate fauna go back to the late 19th century. By 1940, over 20 species of Trichoptera and 10 species of Plecoptera had been recorded, but only four species of Ephemeroptera. Twenty species of fish occur in the streams of the New Forest, of which the most common are brown trout, minnow, bullhead, stone loach, brook lamprey and eel.
Abstract:
The influence upon the basic viscous flow about two axisymmetric bodies of (i) freestream turbulence level and (ii) the injection of small amounts of a drag-reducing polymer (Polyox WSR 301) into the test model boundary layer was investigated by the schlieren flow visualization technique. The changes in the type and occurrence of cavitation inception caused by the subsequent modifications in the viscous flow were studied. A nuclei counter using the holographic technique was built to monitor freestream nuclei populations and a few preliminary tests investigating the consequences of different populations on cavitation inception were carried out.
Both test models were observed to have a laminar separation over their respective test Reynolds number ranges. The separation on one test model was found to be insensitive to freestream turbulence levels of up to 3.75 percent. The second model was found to be very susceptible, with its critical velocity reduced from 30 feet per second at a 0.04 percent turbulence level to 10 feet per second at a 3.75 percent turbulence level. Cavitation tests on both models at the lowest turbulence level showed the value of the incipient cavitation number and the type of cavitation were controlled by the presence of the laminar separation. Cavitation tests on the second model at 0.65 percent turbulence level showed no change in the inception index, but the appearance of the developed cavitation was altered.
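For reference, the inception index referred to here is the standard cavitation number evaluated at the onset of cavitation (a textbook definition, not quoted in the abstract):

\[
  \sigma_i \;=\; \frac{p_\infty - p_v}{\tfrac{1}{2}\,\rho\, U_\infty^{2}} ,
\]

where p_∞ and U_∞ are the freestream pressure and velocity, p_v is the vapor pressure, and ρ is the liquid density.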
The presence of Polyox in the boundary layer resulted in a cavitation suppression comparable to that found by other investigators. The elimination of the normally occurring laminar separation on these bodies by a polymer-induced instability in the laminar boundary layer was found to be responsible for the suppression of inception.
Freestream nuclei populations at test conditions were measured and it was found that if there were many freestream gas bubbles the normally present laminar separation was eliminated and travelling bubble type cavitation occurred; the value of the inception index then depended upon the nuclei population. In cases where the laminar separation was present it was found that the value of the inception index was insensitive to the freestream nuclei populations.