994 results for Saab 900 GL.


Relevance:

10.00%

Publisher:

Abstract:

Large quantities of teleseismic short-period seismograms recorded at SCARLET provide travel time, apparent velocity and waveform data for study of upper mantle compressional velocity structure. Relative array analysis of arrival times from distant (30° < Δ < 95°) earthquakes at all azimuths constrains lateral velocity variations beneath southern California. We compare dT/dΔ, back azimuth and averaged arrival time estimates from the entire network for 154 events to the same parameters derived from small subsets of SCARLET. Patterns of mislocation vectors for over 100 overlapping subarrays delimit the spatial extent of an east-west striking, high-velocity anomaly beneath the Transverse Ranges. Thin-lens analysis of the averaged arrival time differences, called 'net delay' data, requires the mean depth of the corresponding lens to be more than 100 km. Our results are consistent with the PKP-delay times of Hadley and Kanamori (1977), who first proposed the high-velocity feature, but we place the anomalous material at substantially greater depths than their 40-100 km estimate.
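
As an aside for readers unfamiliar with relative array analysis, the sketch below shows the kind of plane-wave least-squares fit that yields apparent slowness (dT/dΔ) and back azimuth from relative arrival times; differencing a subarray estimate from the full-network estimate for the same event gives one mislocation vector. The station coordinates and picks are hypothetical, not SCARLET data.

    import numpy as np

    def plane_wave_fit(x_km, y_km, t_s):
        # Fit t = t0 + sx*x + sy*y by least squares (x east, y north, t relative arrival time in s).
        A = np.column_stack([np.ones_like(x_km), x_km, y_km])
        (t0, sx, sy), *_ = np.linalg.lstsq(A, t_s, rcond=None)
        slowness = np.hypot(sx, sy)                              # apparent slowness, s/km
        back_azimuth = np.degrees(np.arctan2(-sx, -sy)) % 360.0  # direction back toward the source
        return slowness, back_azimuth

    # Hypothetical four-station subarray.
    x = np.array([0.0, 12.0, 25.0, 8.0])
    y = np.array([0.0, 5.0, -10.0, 20.0])
    t = np.array([0.00, 0.45, 0.93, 0.31])
    print(plane_wave_fit(x, y, t))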

Detailed analysis of travel time, ray parameter and waveform data from 29 events occurring in the distance range 9° to 40° reveals the upper mantle structure beneath an oceanic ridge to depths of over 900 km. More than 1400 digital seismograms from earthquakes in Mexico and Central America yield 1753 travel times and 58 dT/dΔ measurements as well as high-quality, stable waveforms for investigation of the deep structure of the Gulf of California. The result of a travel time inversion with the tau method (Bessonova et al., 1976) is adjusted to fit the p(Δ) data, then further refined by incorporation of relative amplitude information through synthetic seismogram modeling; the resulting model is designated GCA. The application of a modified wave field continuation method (Clayton and McMechan, 1981) to the data with the final model confirms that GCA is consistent with the entire data set and also provides an estimate of the data resolution in velocity-depth space. We discover that the upper mantle under this spreading center has anomalously slow velocities to depths of 350 km, and place new constraints on the shape of the 660 km discontinuity.
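
For orientation, the tau method works with the standard delay-time function (textbook relations, not results specific to this study):

    \tau(p) = T(p) - p\,\Delta(p), \qquad \tau(p) = 2\int_0^{z(p)} \sqrt{u(z)^2 - p^2}\,\mathrm{d}z

where p = dT/dΔ is the ray parameter, u(z) is the slowness profile and z(p) is the turning depth; because τ(p) is single-valued and monotonic in p even where T(Δ) is triplicated, it is a convenient quantity to invert.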

Seismograms from 22 earthquakes along the northeast Pacific rim recorded in southern California form the data set for a comparative investigation of the upper mantle beneath the Cascade Ranges-Juan de Fuca region, an ocean-continent transition. These data consist of 853 seismograms (6° < Δ < 42°) which produce 1068 travel times and 40 ray parameter estimates. We use the spreading center model GCA initially in synthetic seismogram modeling, and perturb it until the Cascade Ranges data are matched. Wave field continuation of both data sets with a common reference model confirms that real differences exist between the two suites of seismograms, implying lateral variation in the upper mantle. The ocean-continent transition model, CJF, features velocities between 200 and 350 km depth that are intermediate between GCA and T7 (Burdick and Helmberger, 1978), a model for the inland western United States. Models of continental shield regions (e.g., King and Calcagnile, 1976) have higher velocities in this depth range, but all four model types are similar below 400 km. This variation in rate of velocity increase with tectonic regime suggests an inverse relationship between velocity gradient and lithospheric age above 400 km depth.

Relevance:

10.00%

Publisher:

Abstract:

The technique of variable-angle, electron energy-loss spectroscopy has been used to study the electronic spectroscopy of the diketene molecule. The experiment was performed using incident electron beam energies of 25 eV and 50 eV, and at scattering angles between 10° and 90°. The energy-loss region from 2 eV to 11 eV was examined. One spin-forbidden transition has been observed at 4.36 eV and three others that are spin-allowed have been located at 5.89 eV, 6.88 eV and 7.84 eV. Based on the intensity variation of these transitions with impact energy and scattering angle, and through analogy with simpler molecules, the first three transitions are tentatively assigned to an n → π* transition, a π → σ*(3s) Rydberg transition and a π → π* transition.

Thermal decomposition of chlorodifluoromethane, chloroform, dichloromethane and chloromethane under flash-vacuum pyrolysis conditions (900-1100°C) was investigated by the technique of electron energy-loss spectroscopy, using an impact energy of 50 eV and a scattering angle of 10°. The pyrolytic reaction follows a hydrogen chloride α-elimination pathway. The difluoromethylene radical was produced from chlorodifluoromethane pyrolysis at 900°C and identified by its X^1A_1 → A^1B_1 band at 5.04 eV.

Finally, a number of exploratory studies have been performed. The thermal decomposition of diketene was studied under flash-vacuum pressures (1-10 mTorr) and temperatures ranging from 500°C to 1000°C. The complete decomposition of the diketene molecule into two ketene molecules was achieved at 900°C. The pyrolysis of trifluoromethyl iodide at 1000°C produced an electron energy-loss spectrum with several sharp iodine-atom peaks and only a small shoulder at 8.37 eV as a possible trifluoromethyl radical feature. The electron energy-loss spectrum of trichlorobromomethane at 900°C mainly showed features from atomic bromine, molecular chlorine and tetrachloroethylene. Hexachloroacetone decomposed partially at 900°C, but showed well-defined features from chlorine, carbon monoxide and tetrachloroethylene. Bromodichloromethane was investigated at 1000°C and produced a congested electron energy-loss spectrum with bromine-atom, hydrogen-bromide, hydrogen-chloride and tetrachloroethylene features.

Relevance:

10.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, having grown from patient zero to an estimated 650,000 to 900,000 infected Americans today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the early years, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and spans 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, which covers the period from 1984 to October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine) and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus, distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection there is a high level of immunosuppression.
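
As an illustration of the distributional check involved (a minimal sketch on synthetic counts, not the clinic or MACS data), CD4 counts might be tested for normality as follows:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic, right-skewed CD4+ T cell counts (cells/uL) standing in for real data.
    cd4 = rng.lognormal(mean=6.0, sigma=0.5, size=500)

    stat, p = stats.normaltest(cd4)       # D'Agostino-Pearson omnibus test of normality
    print(f"normality test p = {p:.3g}")  # a small p-value argues against Gaussian-based methods
    # Non-parametric methods or a variance-stabilizing transform (log, square root)
    # are common alternatives when the counts are clearly non-Gaussian.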

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different viral enzyme than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to the levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was the occurrence of an AIDS-defining illness. A high level of clinical failure, or progression to an endpoint, was found.
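
A minimal sketch of this type of survival analysis, using the lifelines package on hypothetical follow-up times rather than the 213-patient cohort:

    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(1)
    months = rng.exponential(scale=36.0, size=213)   # hypothetical months on HAART
    event = rng.random(213) < 0.4                    # True = AIDS-defining illness observed

    kmf = KaplanMeierFitter()
    kmf.fit(durations=months, event_observed=event, label="long-term HAART")
    print(kmf.survival_function_.tail())  # estimated probability of remaining AIDS-free over time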

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which examines the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, that is, efforts to control viral replication through the administration of different combinations of antiretrovirals, failed to control viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated. The direct lifetime cost of treating each HIV-infected patient with HAART is estimated to be between $353,000 and $598,000, depending on how long HAART prolongs life. Measured as the incremental cost per year of life saved, it is only $101,000, which is comparable to the incremental cost per year of life saved by coronary artery bypass surgery.
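
The incremental figure quoted is an instance of the standard incremental cost-effectiveness ratio; the definition below is generic and introduces no numbers beyond those already in the text:

    \mathrm{ICER} = \frac{C_{\mathrm{HAART}} - C_{\mathrm{comparator}}}{E_{\mathrm{HAART}} - E_{\mathrm{comparator}}}

where C is the discounted lifetime cost and E the life expectancy (in years) under each strategy; the roughly $101,000 per year of life saved is this ratio, and it is the quantity being compared with coronary artery bypass surgery.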

Policymakers need to be aware that although HAART can delay disease progression, it is not a cure and the HIV epidemic is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance:

10.00%

Publisher:

Abstract:

We report the single-shot damage thresholds of a MgF2/ZnS omnidirectional reflector for laser pulse durations from 50 to 900 fs. A coupled dynamic model is applied to study the damage mechanisms, in which we consider not only the electronic excitation of the material but also the influence of the excitation-induced changes in the complex refractive index of the material on the laser pulse itself. The results indicate that this feedback effect plays a very important role in the damage of the material. Based on this model, we calculate the threshold fluences and the time-resolved excitation process of the multilayer. The theoretical calculations agree well with our experimental results. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

The brightness of a particular harmonic order is optimized with respect to the chirp and initial phase of the laser pulse by a genetic algorithm. The influences of the chirp and initial phase of the excitation pulse on the harmonic spectra are discussed in terms of the semi-classical model including propagation effects. The results indicate that the harmonic intensity and cutoff depend strongly on the chirp of the laser pulse but only slightly on its initial phase. The high-order harmonics can be enhanced by the optimal laser pulse, and the cutoff can be tuned by optimizing the chirp and initial phase of the laser pulse.
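
A schematic of the optimization loop described, shown only to make the genetic-algorithm step concrete: the two genes are the chirp and the initial phase, and the fitness function below is a placeholder, not the semi-classical propagation model used for the actual harmonic spectra.

    import numpy as np

    rng = np.random.default_rng(2)

    def brightness(chirp, phase):
        # Placeholder figure of merit standing in for the computed harmonic brightness.
        return np.exp(-chirp**2) * (1.0 + 0.1 * np.cos(phase))

    # Population of candidate pulses; columns are (chirp, initial phase).
    pop = rng.uniform([-2.0, 0.0], [2.0, 2.0 * np.pi], size=(40, 2))
    for _ in range(100):
        fit = np.array([brightness(c, p) for c, p in pop])
        parents = pop[np.argsort(fit)[-20:]]             # keep the fitter half (elitism)
        children = parents[rng.integers(0, 20, size=20)] + rng.normal(0.0, 0.05, size=(20, 2))
        pop = np.vstack([parents, children])             # mutated offspring replace the weaker half

    best_chirp, best_phase = max(pop, key=lambda g: brightness(*g))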

Relevance:

10.00%

Publisher:

Abstract:

Optimal feedback control of two-photon fluorescence in an ethanol solution of 4-dicyanomethylene-2-methyl-6-(p-dimethylaminostyryl)-4H-pyran (DCM), using a pulse-shaping technique based on a genetic algorithm, is demonstrated experimentally. The two-photon fluorescence of the DCM ethanol solution is enhanced in intensity by about 23%. The second harmonic generation frequency-resolved optical gating (SHG-FROG) trace indicates that the effective population transfer arises from the positively chirped pulse. The experimental results demonstrate the potential of coherent control for complicated molecular systems.

Relevance:

10.00%

Publisher:

Abstract:

This paper reviews the recent development of X-ray phase-contrast imaging techniques. Several typical imaging principles, including X-ray interferometric phase contrast, diffraction-enhanced phase contrast, in-line (quasi-coaxial) phase contrast and digital phase reconstruction, are described in detail. Factors that affect image contrast and resolution are discussed, and prospects for the development of phase imaging are given.

Relevance:

10.00%

Publisher:

Abstract:

Damage thresholds of SiO2 and YAG crystals against 60-900 fs, 800 nm laser pulses are reported. The breakdown mechanisms are discussed on the basis of the double-flux model and Keldysh theory. We found that impact ionization plays the dominant role in femtosecond laser-induced damage of crystalline SiO2, while the relative roles of photoionization and impact ionization in YAG crystals depend on the laser pulse duration. (C) 2007 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

This thesis describes a series of experimental studies of lead chalcogenide thermoelectric semiconductors, mainly PbSe. Focusing on a well-studied semiconductor and reporting good but not extraordinary zT, this thesis distinguishes itself by answering the following previously unanswered questions: What represents the thermoelectric performance of PbSe? Where does the high zT come from? How (and by how much) can we make it better? For the first question, samples were made with the highest quality. Each transport property was carefully measured, cross-verified and compared with both historical and contemporary reports to overturn a commonly believed underestimation of zT. For n- and p-type PbSe, zT at 850 K can reach 1.1 and 1.0, respectively. For the second question, a systematic approach based on the quality factor B was used. In n-type PbSe, zT benefits from a high-quality conduction band that combines good degeneracy, low band mass and low deformation potential, whereas in p-type PbSe zT is boosted when two mediocre valence bands converge (in band edge energy). In both cases the thermal conductivity of the PbSe lattice is inherently low. For the third question, the use of solid-solution lead chalcogenide alloys was first evaluated. Simple criteria were proposed to help quickly evaluate the potential for improving zT by introducing atomic disorder. For both PbTe1-xSex and PbSe1-xSx, the impacts on electron and phonon transport compensate each other; thus zT in each case was roughly the average of the two binary compounds. In p-type Pb1-xSrxSe alloys an improvement of zT from 1.1 to 1.5 at 900 K was achieved, owing to a band engineering effect that moves the two valence bands closer in energy. To date, making n-type PbSe better has not been accomplished, but possible strategies are discussed.
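
For reference, the figure of merit and, in one common formulation, the quality factor invoked above are

    zT = \frac{S^2 \sigma}{\kappa_e + \kappa_L}\,T, \qquad B \propto \frac{\mu_0\,(m^*/m_e)^{3/2}\,T^{5/2}}{\kappa_L}

where S is the Seebeck coefficient, σ the electrical conductivity, κ_e and κ_L the electronic and lattice thermal conductivities, and μ_0 (m*/m_e)^{3/2} the weighted mobility; under deformation-potential scattering the weighted mobility scales with N_v C_l / (m_I* Ξ²), which is why high valley degeneracy, low inertial band mass and low deformation potential, together with low κ_L, raise the attainable zT. This is the commonly used form of the quality factor; the thesis may employ an equivalent variant.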

Relevance:

10.00%

Publisher:

Abstract:

Based on the dispersion of the birefringence of crystalline quartz, the polarization interference spectrum of a quartz wave plate is analyzed theoretically and simulated numerically, and a polarization-interference method for calibrating the retardation and thickness of quartz wave plates is proposed. From the polarization interference spectrum, the retardation of a quartz wave plate can be obtained over the wide spectral range of 200-2000 nm; by precisely identifying the extremum wavelengths of the interference spectrum in the long-wavelength band, the thickness of the wave plate can be calculated accurately. The polarization interference spectrum of a quartz wave plate was measured with a Lambda 900 UV-visible-near-infrared spectrophotometer. With a wavelength accuracy of 0.1 nm, the thickness was measured with an accuracy of 0.1 μm. Error analysis indicates that the accuracy can be further improved by increasing the minimum spectral resolution and by selecting a longer-wavelength spectral band for the measurement and calculation.
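
The relations underlying the method are standard wave-plate optics, stated here only for orientation:

    \delta(\lambda) = \frac{2\pi\,\Delta n(\lambda)\,d}{\lambda}

where Δn(λ) is the birefringence of quartz and d the plate thickness; between crossed polarizers with the plate axes at 45°, the transmitted intensity varies as sin²(δ/2), so the interference extrema at wavelengths λ_k satisfy Δn(λ_k) d = k λ_k / 2, and an accurate reading of the long-wavelength extrema fixes d once the dispersion Δn(λ) is known.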

Relevance:

10.00%

Publisher:

Abstract:

This thesis consists of two separate parts. Part I (Chapter 1) is concerned with seismotectonics of the Middle America subduction zone. In this chapter, stress distribution and Benioff zone geometry are investigated along almost 2000 km of this subduction zone, from the Rivera Fracture Zone in the north to Guatemala in the south. Particular emphasis is placed on the effects on stress distribution of two aseismic ridges, the Tehuantepec Ridge and the Orozco Fracture Zone, which subduct at seismic gaps. Stress distribution is determined by studying seismicity distribution, and by analysis of 190 focal mechanisms, both new and previously published, which are collected here. In addition, two recent large earthquakes that have occurred near the Tehuantepec Ridge and the Orozco Fracture Zone are discussed in more detail. A consistent stress release pattern is found along most of the Middle America subduction zone: thrust events at shallow depths, followed down-dip by an area of low seismic activity, followed by a zone of normal events at over 175 km from the trench and 60 km depth. The zone of low activity is interpreted as showing decoupling of the plates, and the zone of normal activity as showing the breakup of the descending plate. The portion of subducted lithosphere containing the Orozco Fracture Zone does not differ significantly, in Benioff zone geometry or in stress distribution, from adjoining segments. The Playa Azul earthquake of October 25, 1981, Ms = 7.3, occurred in this area. Body and surface wave analysis of this event shows a simple source with a shallow thrust mechanism and gives Mo = 1.3×10^27 dyne-cm. A stress drop of about 45 bars is calculated; this is slightly higher than that of other thrust events in this subduction zone. In the Tehuantepec Ridge area, only minor differences in stress distribution are seen relative to adjoining segments. For both ridges, the only major difference from adjoining areas is the infrequency or lack of occurrence of large interplate thrust events.
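
For orientation, the stress-drop figure is of the kind given by the standard circular-crack relation (whether or not this exact formulation is the one used in the chapter):

    \Delta\sigma = \frac{7 M_0}{16 a^3}

where a is the source radius; with Mo = 1.3×10^27 dyne-cm, a stress drop near 45 bars corresponds to a source radius of roughly 20-25 km.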

Part II involves upper mantle P wave structure studies, for the Canadian shield and eastern North America. In Chapter 2, the P wave structure of the Canadian shield is determined through forward waveform modeling of the phases Pnl, P, and PP. Effects of lateral heterogeneity are kept to a minimum by using earthquakes just outside the shield as sources, with propagation paths largely within the shield. Previous mantle structure studies have used recordings of P waves in the upper mantle triplication range of 15-30°; however, the lack of large earthquakes in the shield region makes compilation of a complete P wave dataset difficult. By using the phase PP, which undergoes triplications at 30-60°, much more information becomes available. The WKBJ technique is used to calculate synthetic seismograms for PP, and these records are modeled almost as well as the direct P arrivals. A new velocity model, designated S25, is proposed for the Canadian shield. This model contains a thick, high-Q, high-velocity lid to 165 km and a deep low-velocity zone. These features combine to produce seismograms that are markedly different from those generated by other shield structure models. The upper mantle discontinuities in S25 are placed at 405 and 660 km, with a simple linear gradient in velocity between them. Details of the shape of the discontinuities are not well constrained. Below 405 km, this model is not very different from many proposed P wave models for both shield and tectonic regions.

Chapter 3 looks in more detail at recordings of Pnl in eastern North America. First, seismograms from four eastern North American earthquakes are analyzed, and seismic moments for the events are calculated. These earthquakes are important in that they are among the largest to have occurred in eastern North America in the last thirty years, yet in some cases were not large enough to produce many good long-period teleseismic records. A simple layer-over-a-halfspace model is used for the initial modeling, and is found to provide an excellent fit for many features of the observed waveforms. The effects on Pnl of varying lid structure are then investigated. A thick lid with a positive gradient in velocity, such as that proposed for the Canadian shield in Chapter 2, will have a pronounced effect on the waveforms, beginning at distances of 800 or 900 km. Pnl records from the same eastern North American events are recalculated for several lid structure models, to survey what kinds of variations might be seen. For several records it is possible to see likely effects of lid structure in the data. However, the dataset is too sparse to make any general observations about variations in lid structure. This type of modeling is expected to be important in the future, as the analysis is extended to more recent eastern North American events, and as broadband instruments make more high-quality regional recordings available.

Relevance:

10.00%

Publisher:

Abstract:

The initial probabilities of activated, dissociative chemisorption of methane and ethane on Pt(110)-(1 x 2) have been measured. The surface temperature was varied from 450 to 900 K with the reactant gas temperature held constant at 300 K. Under these conditions, we probe the kinetics of dissociation via a trapping-mediated (as opposed to 'direct') mechanism. It was found that the probabilities of dissociation of both methane and ethane were strong functions of the surface temperature, with apparent activation energies of 14.4 kcal/mol for methane and 2.8 kcal/mol for ethane, which implies that the methane and ethane molecules have fully accommodated to the surface temperature. Kinetic isotope effects were observed for both reactions, indicating that C-H bond cleavage is involved in the rate-limiting step. A mechanistic model based on the trapping-mediated mechanism is used to explain the observed kinetic behavior. The activation energies for C-H bond dissociation of the thermally accommodated methane and ethane on the surface extracted from the model are 18.4 and 10.3 kcal/mol, respectively.
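
To make the phrase "apparent activation energy" concrete, the sketch below shows the usual Arrhenius treatment of an initial dissociation probability measured as a function of surface temperature; the numbers are synthetic, not the Pt(110) measurements.

    import numpy as np

    R = 1.987e-3  # gas constant, kcal mol^-1 K^-1

    Ts = np.array([450.0, 550.0, 650.0, 750.0, 900.0])       # surface temperatures, K (synthetic)
    S0 = np.array([1.1e-6, 9.0e-6, 4.2e-5, 1.3e-4, 5.0e-4])  # initial dissociation probabilities (synthetic)

    slope, intercept = np.polyfit(1.0 / Ts, np.log(S0), 1)
    Ea_apparent = -slope * R
    print(f"apparent activation energy ~ {Ea_apparent:.1f} kcal/mol")
    # In the trapping-mediated picture this apparent barrier reflects the competition between
    # reaction and desorption of the thermally accommodated, molecularly adsorbed precursor.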

The studies of the catalytic decomposition of formic acid on the Ru(001) surface by thermal desorption mass spectrometry, following the adsorption of DCOOH and HCOOH on the surface at 130 and 310 K, are described. Formic acid (DCOOH) chemisorbs dissociatively on the surface both via cleavage of its O-H bond to form a formate and a hydrogen adatom, and via cleavage of its C-O bond to form carbon monoxide, a deuterium adatom and a hydroxyl (OH). The former is the predominant reaction. The rate of desorption of carbon dioxide is a direct measure of the kinetics of decomposition of the surface formate. It is characterized by a kinetic isotope effect, an increasingly narrow FWHM, and an upward shift in peak temperature with Ɵ_T, the coverage of the dissociatively adsorbed formic acid. The FWHM and the peak temperature change from 18 K and 326 K at Ɵ_T = 0.04 to 8 K and 395 K at Ɵ_T = 0.89. The increase in the apparent activation energy of the C-D bond cleavage is largely a result of self-poisoning by the formate, whose presence on the surface alters the electronic properties of the surface such that the activation energy of the decomposition of the formate is increased. The variation of the activation energy for carbon dioxide formation with Ɵ_T accounts for the observed sharp carbon dioxide peak. The coverage of surface formate can be adjusted over a relatively wide range, so that the activation energy for C-D bond cleavage in the case of DCOOH can be adjusted to be below, approximately equal to, or well above the activation energy for the recombinative desorption of the deuterium adatoms. Accordingly, the desorption of deuterium was observed to be governed completely by the desorption kinetics of the deuterium adatoms at low Ɵ_T, jointly by the kinetics of deuterium desorption and C-D bond cleavage at intermediate Ɵ_T, and solely by the kinetics of C-D bond cleavage at high Ɵ_T. The overall branching ratio of the formate to carbon dioxide and carbon monoxide is approximately unity, regardless of the initial coverage Ɵ_T, even though the activation energy for the production of carbon dioxide varies with Ɵ_T. The desorption of water, which implies C-O bond cleavage of the formate, appears at approximately the same temperature as that of carbon dioxide. These observations suggest that the cleavage of the C-D bond and that of the C-O bond of two surface formates are coupled, possibly via the formation of a short-lived surface complex that is the precursor to the decomposition.

The measurement of steady-state rates is demonstrated here to be valuable in determining the kinetics associated with short-lived, molecularly adsorbed precursors to further surface reactions, as illustrated by determining the kinetic parameters of the molecular precursor to formaldehyde dissociation on the Pt(110)-(1 x 2) surface.

Overlayers of nitrogen adatoms on Ru(001) have been characterized by thermal desorption mass spectrometry and low-energy electron diffraction, as well as chemically via the postadsorption and desorption of ammonia and carbon monoxide.

The nitrogen-adatom overlayer was prepared by decomposing ammonia thermally on the surface at a pressure of 2.8 x 10^(-6) Torr and a temperature of 480 K. The saturated overlayer prepared under these conditions has associated with it a (√247/10 x √247/10)R22.7° LEED pattern, has two peaks in its thermal desorption spectrum, and has a fractional surface coverage of 0.40. Annealing the overlayer to approximately 535 K results in a rather sharp (√3 x √3)R30° LEED pattern with an associated fractional surface coverage of one-third. Annealing the overlayer further to 620 K results in the disappearance of the low-temperature thermal desorption peak and the appearance of a rather fuzzy p(2x2) LEED pattern with an associated fractional surface coverage of approximately one-fourth. In the low coverage limit, the presence of the (√3 x √3)R30° N overlayer alters the surface in such a way that the binding energy of ammonia is increased by 20% relative to the clean surface, whereas that of carbon monoxide is reduced by 15%.

A general methodology for the indirect, relative determination of absolute fractional surface coverages has been developed and was utilized to determine the saturation fractional coverage of hydrogen on Ru(001). Formaldehyde was employed as a bridge from the known reference point of the saturation fractional coverage of carbon monoxide to the unknown fractional coverage of hydrogen on Ru(001), allowing the saturation fractional coverage of hydrogen to be determined accurately. We find that Ɵ_H^sat = 1.02 (±0.05), i.e., the surface stoichiometry is Ru : H = 1 : 1. The relative nature of the method, which cancels systematic errors, together with the use of a glass envelope around the mass spectrometer, which reduces spurious contributions to the thermal desorption spectra, results in high accuracy in the determination of absolute fractional coverages.

Relevance:

10.00%

Publisher:

Abstract:

The surface resistance and the critical magnetic field of lead electroplated on copper were studied at 205 MHz in a half-wave coaxial resonator. The observed surface resistance at low field levels below 4.2°K could be well described by the BCS surface resistance with the addition of a temperature-independent residual resistance. The available experimental data suggest that the major fraction of the residual resistance in the present experiment was due to the presence of an oxide layer on the surface. At higher magnetic field levels the surface resistance was found to be enhanced due to surface imperfections.

The attainable rf critical magnetic field between 2.2°K and T_c of lead was found to be limited not by the thermodynamic critical field but rather by the superheating field predicted by the one-dimensional Ginzburg-Landau theory. The observed rf critical field was very close to the expected superheating field, particularly in the higher reduced temperature range, but showed somewhat stronger temperature dependence than the expected superheating field in the lower reduced temperature range.
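
For reference, the one-dimensional Ginzburg-Landau superheating field invoked here has the standard asymptotic forms (quoted from the general theory, not derived in this work):

    H_{sh} \approx 0.89\,\kappa^{-1/2} H_c \quad (\kappa \ll 1), \qquad H_{sh} \approx 0.75\,H_c \quad (\kappa \gg 1)

so that, through H_c(T) and κ(T), the superheating field carries a temperature dependence close to, but not identical with, that of the thermodynamic critical field.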

The rf critical magnetic field was also studied at 90 MHz for pure tin and indium, and for a series of SnIn and InBi alloys spanning both type I and type II superconductivity. The samples were spherical, with typical diameters of 1-2 mm, and a helical resonator was used to generate the rf magnetic field in the measurement. The results for the pure samples of tin and indium showed that a vortex-like nucleation of the normal phase was responsible for the superconducting-to-normal phase transition in the rf field at temperatures up to about 0.98-0.99 T_c, where the ideal superheating limit was being reached. The results for the alloy samples showed that the attainable rf critical fields near T_c were well described by the superheating field predicted by the one-dimensional GL theory in both the type I and type II regimes. The measurement was also made at 300 MHz, resulting in no significant change in the rf critical field. Thus it was inferred that the nucleation time of the normal phase, once the critical field was reached, was small compared with the rf period in this frequency range.