5 results for Statistical distribution

in CaltechTHESIS


Relevance:

60.00%

Publisher:

Abstract:

The influence of composition on the structure and on the electric and magnetic properties of amorphous Pd-Mn-P and Pd-Co-P prepared by rapid quenching techniques was investigated in terms of (1) the filling of the 3d band of the first transition metal group, (2) the effect of the phosphorus concentration, which acts as an electron donor, and (3) the transition metal concentration.

The structure is essentially characterized by a set of polyhedral subunits inverse to the packing of hard spheres in real space. Examination of computer-generated distribution functions, using a Monte Carlo random statistical distribution of these polyhedral entities, demonstrated the reproducibility of the experimentally calculated atomic distribution function. As a result, several possible "structural parameters" are proposed, such as the number of nearest neighbors, the metal-to-metal distance, the degree of short-range order, and the affinity between metal-metal and metal-metalloid pairs. It is shown that the degree of disorder increases from Ni to Mn. Similar behavior is observed with increasing phosphorus concentration.
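A toy version of the Monte Carlo comparison described above can be sketched: place point centers at random in a periodic box (a crude stand-in for randomly distributed polyhedral subunits) and histogram pair distances into a radial distribution function g(r). Box size, number of centers, and bin width below are illustrative choices, not values from the thesis.

```python
import numpy as np

# Random placement of N point centers in a periodic cube of edge L.
rng = np.random.default_rng(0)
N, L = 500, 10.0                         # number of centers, box edge (arb. units)
pos = rng.uniform(0, L, size=(N, 3))

# Pairwise separations with the minimum-image convention (periodic box).
diff = pos[:, None, :] - pos[None, :, :]
diff -= L * np.round(diff / L)
dist = np.sqrt((diff ** 2).sum(-1))
pairs = dist[np.triu_indices(N, k=1)]

# Histogram and normalize by the ideal-gas shell count to obtain g(r);
# an uncorrelated placement gives g(r) fluctuating about 1.
edges = np.arange(0.1, L / 2, 0.1)
counts, edges = np.histogram(pairs, bins=edges)
r = 0.5 * (edges[:-1] + edges[1:])
shell_vol = 4.0 * np.pi * r ** 2 * np.diff(edges)
g = counts / (0.5 * N * (N / L ** 3) * shell_vol)
print(g[:5])
```

Comparing such a generated g(r) against an experimentally derived atomic distribution function is the essence of the procedure the abstract describes.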

The magnetic properties of Pd-Co-P alloys show that they are ferromagnetic, with a Curie temperature between 272 and 399°K as the cobalt concentration increases from 15 to 50 at.%. Below 20 at.% Co the short-range exchange interactions which produce the ferromagnetism are unable to establish long-range magnetic order, and a peak in the magnetization appears in the lowest temperature range. The electric resistivity measurements were performed from liquid helium temperatures up to the vicinity of the melting point (900°K). The thermomagnetic analysis was carried out under an applied field of 6.0 kOe. The electrical resistivity of Pd-Co-P shows the coexistence of a Kondo-like minimum with ferromagnetism. The minimum becomes less important as the transition metal concentration increases, and the coefficients of ℓn T and T^2 become smaller and strongly temperature dependent. The negative magnetoresistivity is a strong indication of the existence of localized moments.
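The low-temperature resistivity form implied above, with competing ℓn T and T^2 terms producing a Kondo-like minimum, can be sketched as a linear least-squares fit. All numbers below are invented for illustration, not data from the thesis.

```python
import numpy as np

# Synthetic resistivity rho(T) = rho0 + c1*ln(T) + c2*T^2; a negative c1
# combined with a positive c2 produces a resistivity minimum.
rng = np.random.default_rng(2)
T = np.linspace(2.0, 40.0, 60)                   # temperature (K)
rho_true = 100.0 - 1.5 * np.log(T) + 0.004 * T ** 2
rho = rho_true + rng.normal(0, 0.05, T.size)     # add measurement noise

# Linear least squares in the basis {1, ln T, T^2}.
X = np.column_stack([np.ones_like(T), np.log(T), T ** 2])
coef, *_ = np.linalg.lstsq(X, rho, rcond=None)
rho0, c1, c2 = coef

# The minimum sits where d(rho)/dT = c1/T + 2*c2*T = 0.
T_min = np.sqrt(-c1 / (2.0 * c2))
print(round(c1, 2), round(c2, 4), round(T_min, 1))
```

Extracting the fitted ℓn T and T^2 coefficients as a function of composition is the kind of analysis the abstract's statement about the coefficients implies.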

The temperature coefficient of resistivity, which is positive for Pd-Fe-P, Pd-Ni-P, and Pd-Co-P, becomes negative for Pd-Mn-P. It is possible to account for the negative temperature dependence by the localized spin fluctuation model and the high density of states at the Fermi energy, which becomes maximum between Mn and Cr. The magnetization curves for Pd-Mn-P are typical of those resulting from the interplay of different exchange forces. The established relationship between susceptibility and resistivity confirms the localized spin fluctuation model. The magnetoresistivity of Pd-Mn-P could be interpreted in terms of a short-range magnetic ordering that could arise from Ruderman-Kittel type interactions.

Relevance:

60.00%

Publisher:

Abstract:

The present work deals with the problem of the interaction of electromagnetic radiation with a statistical distribution of nonmagnetic dielectric particles immersed in an infinite, homogeneous, isotropic, nonmagnetic medium. The wavelength of the incident radiation can be less than, equal to, or greater than the linear dimension of a particle. The distance between any two particles is several wavelengths. A single particle in the absence of the others is assumed to scatter like a Rayleigh-Gans particle, i.e. interaction between the volume elements (self-interaction) is neglected. The interaction of the particles is taken into account (multiple scattering), and conditions are set up for the case of a lossless medium which guarantee that the multiple scattering contribution is more important than the self-interaction one. These conditions relate the wavelength λ, the linear dimension a of a particle, and the linear dimension D of the region occupied by the particles. It is found that for constant λ/a, D is proportional to λ and that |Δχ|, where Δχ is the difference in the dielectric susceptibilities between particle and medium, has to lie within a certain range.

The total scattered field is obtained as a series whose terms represent the corresponding multiple scattering orders. The first term is a single scattering term. The ensemble average of the total scattering intensity is then obtained as a series which does not involve terms due to products between terms of different orders. Thus the waves corresponding to different orders are independent and their Stokes parameters add.

The second and third order intensity terms are explicitly computed. The method used suggests a general approach for computing any order. It is found that in general the first order scattering intensity pattern (or phase function) peaks in the forward direction Θ = 0. The second order tends to smooth out the pattern, giving a maximum in the Θ = π/2 direction and minima in the Θ = 0, Θ = π directions. This ceases to be true if ka (where k = 2π/λ) becomes large (> 20). For large ka the forward direction is further enhanced. Similar features are expected from the higher orders even though the critical value of ka may increase with the order.
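The forward-peaked first-order pattern can be illustrated with the standard Rayleigh-Gans form factor for a homogeneous sphere of radius a, P(Θ) = [3(sin u − u cos u)/u³]² with u = 2ka sin(Θ/2). The ka values below are our illustrative choices, not values from the thesis.

```python
import numpy as np

# Rayleigh-Gans form factor of a homogeneous sphere.
def rg_phase(theta, ka):
    u = 2.0 * ka * np.sin(theta / 2.0)
    u = np.maximum(u, 1e-4)            # series limit: P -> 1 as u -> 0
    return (3.0 * (np.sin(u) - u * np.cos(u)) / u ** 3) ** 2

theta = np.linspace(0.0, np.pi, 181)   # scattering angle grid
for ka in (1.0, 5.0, 20.0):
    p = rg_phase(theta, ka)
    # The pattern peaks at theta = 0, and the forward lobe sharpens as ka
    # grows, consistent with the enhanced forward scattering noted above.
    print(ka, p[0], p[90] / p[0])
```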

The first order polarization of the scattered wave is determined. The ensemble average of the Stokes parameters of the scattered wave is explicitly computed for the second order. A similar method can be applied for any order. It is found that the polarization of the scattered wave depends on the polarization of the incident wave. If the latter is elliptically polarized then the first order scattered wave is elliptically polarized, but in the Θ = π/2 direction it is linearly polarized. If the incident wave is circularly polarized the first order scattered wave is elliptically polarized except in the directions Θ = π/2 (linearly polarized) and Θ = 0, π (circularly polarized). The handedness of the Θ = 0 wave is the same as that of the incident wave, whereas the handedness of the Θ = π wave is opposite. If the incident wave is linearly polarized the first order scattered wave is also linearly polarized. The second order makes the total scattered wave elliptically polarized for any Θ, no matter what the incident polarization is. However, the handedness of the total scattered wave is not altered by the second order. Higher orders have effects similar to those of the second order.

If the medium is lossy the general approach employed for the lossless case is still valid; only the algebra increases in complexity. It is found that the results of the lossless case are insensitive to first order in k_im D, where k_im is the imaginary part of the wave vector k and D a linear characteristic dimension of the region occupied by the particles. Thus moderately extended regions and small losses make (k_im D)^2 ≪ 1 and the lossy character of the medium does not alter the results of the lossless case. In general the presence of losses tends to reduce the forward scattering.

Relevance:

30.00%

Publisher:

Abstract:

Data were taken in 1979-80 by the CCFRR high-energy neutrino experiment at Fermilab. A total of 150,000 neutrino and 23,000 antineutrino charged-current events in the approximate energy range 25 < E_ν < 250 GeV are measured and analyzed. The structure functions F_2 and xF_3 are extracted for three assumptions about σ_L/σ_T: R = 0, R = 0.1, and R given by a QCD-based expression. Systematic errors are estimated and their significance is discussed. Comparisons of the x and Q^2 behaviour of the structure functions with results from other experiments are made.

We find that statistical errors currently dominate our knowledge of the valence quark distribution, which is studied in this thesis. xF_3 from different experiments has, within errors and apart from level differences, the same dependence on x and Q^2, except for the HPWF results. The CDHS F_2 shows a clear fall-off at low x relative to the CCFRR and EMC results, again apart from level differences, which are calculable from cross-sections.

The result for the GLS sum rule is found to be 2.83 ± .15 ± .09 ± .10, where the first error is statistical, the second is an overall level error, and the third covers the rest of the systematic errors. QCD studies of xF_3 to leading and second order have been done. The QCD evolution of xF_3, which is independent of R and the strange sea, does not depend on the gluon distribution, and fits yield

Λ_(LO) = 88^(+163)_(-78) ^(+113)_(-70) MeV

The systematic errors are smaller than the statistical errors. Second-order fits give somewhat different values of Λ, although α_s (at Q^2_0 = 12.6 GeV^2) is not so different.
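The leading-order relation between Λ and α_s underlying such statements is the standard one-loop running coupling, α_s(Q^2) = 12π / ((33 − 2 n_f) ln(Q^2/Λ^2)). The sketch below evaluates it at the thesis's reference scale Q^2_0 = 12.6 GeV^2 for the two fitted central values of Λ; the choice n_f = 4 is our assumption.

```python
import math

# One-loop (leading-order) strong coupling as a function of Q^2 and Lambda.
def alpha_s_LO(Q2_GeV2, Lambda_GeV, n_f=4):
    return 12.0 * math.pi / ((33.0 - 2.0 * n_f) * math.log(Q2_GeV2 / Lambda_GeV ** 2))

# Central Lambda values from the two fits quoted in the abstract (88 and 266 MeV).
for lam in (0.088, 0.266):
    print(lam, round(alpha_s_LO(12.6, lam), 3))
```

Because α_s depends on Λ only logarithmically, large fractional changes in Λ translate into much smaller changes in α_s, which is why α_s at a fixed Q^2_0 is better determined than Λ itself.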

A fit using the better-determined F_2 in place of xF_3 for x > 0.4, i.e., assuming q = 0 in that region, gives

Λ_(LO) = 266^(+114)_(-104) ^(+85)_(-79) MeV

Again, the statistical errors are larger than the systematic errors. An attempt to measure R was made and the measurements are described. Utilizing the inequality q(x) ≥ 0, we find that in the region x > 0.4, R is less than 0.55 at the 90% confidence level.
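The GLS sum-rule evaluation mentioned above amounts to integrating F_3 = xF_3/x over x; the quark-parton value of the integral is 3, reduced by QCD corrections. The sketch below does this numerically for a valence-like parametrization whose form and parameters are invented for illustration, not the thesis's measured xF_3.

```python
import numpy as np

# Hypothetical valence-like parametrization xF_3(x) = A * x^b * (1-x)^c.
def xF3(x, A=11.3, b=1.0, c=3.0):
    return A * x ** b * (1.0 - x) ** c

# GLS integral S = \int_0^1 (xF_3 / x) dx, evaluated by the trapezoid rule.
x = np.linspace(1e-6, 1.0, 100_000)
S = np.trapz(xF3(x) / x, x)
print(round(S, 2))
```

For this toy form the integral is A/(c+1) exactly, so the numerical result serves as a check of the quadrature rather than a physics statement.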

Relevance:

30.00%

Publisher:

Abstract:

This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.
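A minimal sketch of the regression building block behind the interacting-Gaussian-process approach: an independent GP over one pedestrian's observed 1-D positions, predicting future mean and uncertainty. The full model in the thesis couples such GPs through an interaction potential; the kernel choice, hyperparameters, and toy data below are illustrative assumptions.

```python
import numpy as np

# Squared-exponential (RBF) covariance between two sets of times.
def rbf(t1, t2, ell=1.0, sf=1.0):
    d = t1[:, None] - t2[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

t_obs = np.array([0.0, 0.5, 1.0, 1.5])       # observation times (s)
y_obs = np.array([0.0, 0.6, 1.1, 1.4])       # observed positions (m)
t_new = np.array([2.0, 2.5])                 # prediction times

noise = 1e-2
K = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
Ks = rbf(t_new, t_obs)
Kss = rbf(t_new, t_new)

# Standard GP posterior mean and covariance.
mean = Ks @ np.linalg.solve(K, y_obs)
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
print(mean, np.sqrt(np.diag(cov)))           # uncertainty grows with horizon
```

The growing predictive covariance at longer horizons is exactly the quantity that, if handled naively, leads to the overcautious behavior the freezing robot problem describes.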

Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours and, in the process, carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple goal interacting Gaussian processes algorithm performs comparably with human teleoperators in crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.

Finally, we produce a large database of ground truth pedestrian crowd data. We make this ground truth database publicly available for further scientific study of crowd prediction models, learning from demonstration algorithms, and human robot interaction models in general.

Relevance:

30.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database, which I collected from an outpatient HIV clinic, includes data from 1984 to the present. The second database is the public dataset from the Multicenter AIDS Cohort Study (MACS), covering the period between 1984 and October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
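The kind of distributional check described above can be sketched with a simple skewness statistic: CD4 counts are often modeled as roughly log-normal, which is right-skewed and clearly non-Gaussian. The distribution parameters below are invented for the sketch, not estimates from the thesis data.

```python
import numpy as np

# Simulated right-skewed "CD4-like" counts (log-normal, cells/mm^3 scale).
rng = np.random.default_rng(1)
cd4 = rng.lognormal(mean=6.0, sigma=0.5, size=5000)

# Sample skewness: third central moment over the cubed standard deviation.
def skewness(x):
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return float(((x - m) ** 3).mean() / s ** 3)

print(round(skewness(cd4), 2))                        # clearly positive
print(round(skewness(rng.normal(size=5000)), 2))      # near 0 for Gaussian data
```

A markedly nonzero skewness is one simple way a distribution reveals itself as non-Gaussian, motivating nonparametric or transformed analyses of the marker.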

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono- or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, when used in combination with other antiretroviral agents were found to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system as measured by CD4 T cell counts would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
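The survival analysis described above rests on estimators such as Kaplan-Meier, which handles censored follow-up. A minimal sketch with invented toy data (not the 213 patients from the thesis):

```python
import numpy as np

def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = endpoint reached, 0 = censored."""
    order = np.argsort(times)
    times = np.asarray(times)[order]
    events = np.asarray(events)[order]
    at_risk = len(times)
    surv, out = 1.0, []
    for t, e in zip(times, events):
        if e == 1:                       # survival drops only at event times
            surv *= (at_risk - 1) / at_risk
        out.append((float(t), surv))
        at_risk -= 1                     # both events and censorings leave the risk set
    return out

toy_times = [2, 3, 3, 5, 8, 10, 12]      # months of follow-up (toy data)
toy_events = [1, 1, 0, 1, 0, 1, 1]       # 1 = AIDS-defining illness, 0 = censored
for t, s in kaplan_meier(toy_times, toy_events):
    print(t, round(s, 3))
```

A high level of clinical failure corresponds to a survival curve that drops steeply over the follow-up period.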

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in 90 percent of the population in controlling viral replication. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime cost for treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable with the incremental cost per year of life saved from coronary artery bypass surgery.
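A back-of-the-envelope check of the figures quoted above: if the lifetime cost is treated as the incremental cost of HAART (our simplifying assumption), dividing it by the cost per year of life saved gives the implied years of life gained at each end of the range. The dollar figures come from the text; the interpretation is our sketch.

```python
# Dollar figures quoted in the chapter summary above.
lifetime_cost_low, lifetime_cost_high = 353_000, 598_000
incremental_cost_per_year = 101_000

# Implied years of life gained at each end of the lifetime-cost range,
# assuming the lifetime cost approximates the incremental cost of HAART.
print(lifetime_cost_low / incremental_cost_per_year)   # ~3.5 years
print(lifetime_cost_high / incremental_cost_per_year)  # ~5.9 years
```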

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.