951 results for statistical distribution


Relevance: 30.00%

Abstract:

The stress release model, a stochastic version of the elastic rebound theory, is applied to the large events from four synthetic earthquake catalogs generated by models with various levels of disorder in the distribution of fault zone strength (Ben-Zion, 1996). They include models with uniform properties (U), a Parkfield-type asperity (A), fractal brittle properties (F), and multi-size-scale heterogeneities (M). The results show that the degree of regularity or predictability of the assumed fault properties, based on both the Akaike information criterion and simulations, follows the order U, F, A, and M, in good agreement with the order obtained by pattern recognition techniques applied to the full set of synthetic data. Data simulated from the best-fitting stress release models reproduce, both visually and in distributional terms, the main features of the original catalogs. The differences in character and in quality of prediction between the four cases are shown to depend on two main aspects: the parameter controlling the sensitivity to departures from the mean stress level, and the frequency-magnitude distribution, which differs substantially between the four cases. In particular, it is shown that the predictability of the data is strongly affected by the form of the frequency-magnitude distribution, being greatly reduced if a pure Gutenberg-Richter form is assumed to hold out to high magnitudes.
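The fitted models themselves are not reproduced in the abstract, but the two ingredients it singles out, the frequency-magnitude form and AIC-based model comparison, can be sketched briefly. The snippet below is a minimal illustration only; the use of the Aki maximum-likelihood b-value estimator and all names are my assumptions, not taken from the paper.

```python
import numpy as np

def gutenberg_richter_b(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value of the Gutenberg-Richter law
    log10 N(>=M) = a - b*M, using events with magnitude >= m_min."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

def aic(log_likelihood, n_params):
    """Akaike information criterion; lower values favour the model after
    penalising the number of free parameters."""
    return 2 * n_params - 2 * log_likelihood
```

Two candidate stress release models fitted to the same catalog would then be ranked by comparing their `aic` values, which is the sense in which the abstract orders the catalogs U, F, A, M.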

Relevance: 30.00%

Abstract:

A model of dynamical processes and stochastic jumps has been put forward to study pattern evolution in damage and fracture. According to the final states of the evolution processes, the evolution modes can be classified as globally stable modes (GS modes) and evolution-induced catastrophic modes (EIC modes); the latter are responsible for fracture. A statistical description is introduced in this paper to clarify the pattern evolution. It is indicated that the appearance of fracture in disordered materials should be described by a probability distribution function.

Relevance: 30.00%

Abstract:

The maximum stress concentration factor in brittle materials with a high concentration of cavities is obtained. The interaction between the nearest cavities, in addition to the far-field interactions, is taken into account to evaluate the strength distribution based on a statistical analysis of the nearest-distance distribution. Through this investigation, it is found that the interaction between nearest neighbors is much more important than the far-field interactions and must be considered when calculating the strength of brittle materials, even when the volume fraction of cavities is small. The other important conclusion is that the maximum stress concentration factor has a widely scattered distribution.
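The interaction model itself is not given in the abstract, but the statistical ingredient it relies on, the distribution of nearest-neighbor distances between randomly placed cavities, is easy to illustrate. The sketch below is mine, not the paper's: it assumes cavities placed uniformly at random in a unit cube (a Poisson-like configuration, ignoring edge effects) and compares empirical nearest-neighbor distances with the analytical CDF for a 3-D Poisson process.

```python
import numpy as np
from scipy.spatial import cKDTree

def nearest_neighbor_distances(n_cavities=2000, seed=0):
    """Empirical nearest-neighbour distances for cavity centres placed
    uniformly at random in a unit cube (edge effects ignored)."""
    rng = np.random.default_rng(seed)
    centers = rng.random((n_cavities, 3))
    tree = cKDTree(centers)
    dists, _ = tree.query(centers, k=2)  # k=2: first neighbour is the point itself
    return dists[:, 1]

def hertz_cdf(r, density):
    """Analytical nearest-neighbour distance CDF for a 3-D Poisson process
    with the given number density: P(r) = 1 - exp(-(4/3)*pi*rho*r^3)."""
    return 1.0 - np.exp(-(4.0 / 3.0) * np.pi * density * np.asarray(r) ** 3)
```

For n cavities in a unit cube the number density is simply n, so `hertz_cdf(r, n_cavities)` gives the expected fraction of cavities whose nearest neighbor lies within distance r.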

Relevance: 30.00%

Abstract:

The vorticity dynamics of two-dimensional turbulence are investigated analytically, applying the method of Qian (1983). The vorticity equation and its Fourier transform are presented; a set of modal parameters and a modal dynamic equation are derived; and the corresponding Liouville equation for the probability distribution in phase space is solved using a Langevin/Fokker-Planck approach to obtain integral equations for the enstrophy and for the dynamic damping coefficient η. The equilibrium spectrum for inviscid flow is found to be a stationary solution of the enstrophy equation, and the inertial-range spectrum is determined by introducing a localization factor in the two integral equations and evaluating the localized versions numerically.

Relevance: 30.00%

Abstract:

We compare results of bottom trawl surveys off Washington, Oregon, and California in 1977, 1980, 1983, and 1986 to discern trends in population abundance, distribution, and biology. Catch per unit of effort, area-swept biomass estimates, and age and length compositions for 12 commercially important west coast groundfishes are presented to illustrate trends over the 10-year period. We discuss the precision, accuracy, and statistical significance of observed trends in abundance estimates. The influence of water temperature on the distribution of groundfishes is also briefly examined. Abundance estimates of canary rockfish, Sebastes pinniger, and yellowtail rockfish, S. flavidus, declined during the study period; greater declines were observed in Pacific ocean perch, S. alutus, lingcod, Ophiodon elongatus, and arrowtooth flounder, Atheresthes stomias. Biomass estimates of Pacific hake, Merluccius productus, and English, rex, and Dover soles (Pleuronectes vetulus, Errex zachirus, and Microstomus pacificus) increased, while bocaccio, S. paucispinis, and chilipepper, S. goodei, were stable. Sablefish, Anoplopoma fimbria, biomass estimates increased markedly from 1977 to 1980 and declined moderately thereafter. Precision was lowest for rockfishes, lingcod, and sablefish; it was highest for flatfishes because they were uniformly distributed. The accuracy of survey estimates could be gauged only for yellowtail rockfish, canary rockfish, and sablefish. All fishery-based analyses produced much larger estimates of abundance than the bottom trawl surveys, which is indicative of the true catchability of survey trawls. Population trends from all analyses compared well except for canary rockfish, the species that presents the greatest challenge to obtaining reasonable precision and one that casts doubt on the usefulness of bottom trawl surveys for estimating its abundance. (PDF file contains 78 pages.)
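The report's own estimators are not reproduced here, but the generic calculation behind "area-swept biomass estimates" can be sketched as follows. This is a minimal sketch assuming a single stratum and a trawl catchability of 1; the function and parameter names are illustrative, not the survey's.

```python
import numpy as np

def swept_area_biomass(catch_kg, tow_length_km, net_width_km, stratum_area_km2):
    """Swept-area biomass estimate: mean catch density per haul (kg per km^2
    of seabed swept) scaled up to the stratum area. Assumes every fish in the
    swept path is caught (catchability = 1)."""
    catch_kg = np.asarray(catch_kg, dtype=float)
    area_swept = np.asarray(tow_length_km, float) * np.asarray(net_width_km, float)
    density = catch_kg / area_swept            # kg per km^2 for each haul
    return density.mean() * stratum_area_km2   # kg in the stratum
```

A stratified survey would apply this haul-level calculation within each depth/latitude stratum and sum the stratum totals.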

Relevance: 30.00%

Abstract:

The Inter-American Tropical Tuna Commission is engaged in scientific studies of the tuna resources of the Eastern Tropical Pacific Ocean. One of the most important aspects of these investigations is the evaluation of the effects of fishing upon the populations of yellowfin tuna (Neothunnus macropterus) and skipjack (Katsuwonus pelamis) of this region, based upon the analysis of quantitative records of fishing effort and catch. The systematic collection and compilation of statistical information on the operations and production of the tuna fishing fleet have, therefore, been essential parts of the research program since its inception in 1951. (PDF contains 77 pages.)

Relevance: 30.00%

Abstract:

In a previous Commission Bulletin, Shimada (1957) described the geographical distribution of the yearly catches of yellowfin tuna (Neothunnus macropterus) and skipjack (Katsuwonus pelamis) from the Eastern Pacific Ocean for the period 1952 to 1955 inclusive, based on information obtained from logbook records of baitboats and purse seiners. In view of the seasonal nature of the fishery in different areas, a summary of the catches by smaller time units may be of additional value. Accordingly, the statistical data employed earlier by Shimada have been retabulated by quarters of the year and form the basis of the present report. (PDF contains 49 pages.)

Relevance: 30.00%

Abstract:

This paper describes Mateda-2.0, a MATLAB package for estimation of distribution algorithms (EDAs). The package can be used to solve single- and multi-objective discrete and continuous optimization problems using EDAs based on undirected and directed probabilistic graphical models. The implementation contains several methods commonly employed by EDAs and is conceived as an open package that allows users to incorporate different combinations of selection, learning, sampling, and local search procedures. It also includes methods to extract, process, and visualize the structures learned by the probabilistic models, which can unveil previously unknown information about the optimization problem domain. Mateda-2.0 additionally incorporates a module for creating and validating function models based on the probabilistic models learned by EDAs.
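Mateda-2.0 itself is a MATLAB toolbox; to show the kind of algorithm it implements without implying anything about its actual API, the sketch below is a minimal univariate EDA (UMDA) on a toy OneMax problem, written in Python purely for illustration.

```python
import numpy as np

def umda_onemax(n_bits=30, pop_size=100, n_select=50, generations=50, seed=0):
    """Univariate marginal distribution algorithm (UMDA) on OneMax: the
    probabilistic model is one Bernoulli parameter per bit, re-estimated each
    generation from the selected individuals."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                                    # initial model
    for _ in range(generations):
        pop = (rng.random((pop_size, n_bits)) < p).astype(int)  # sample model
        fitness = pop.sum(axis=1)                               # evaluate
        selected = pop[np.argsort(fitness)[-n_select:]]         # truncation selection
        p = selected.mean(axis=0).clip(0.05, 0.95)              # learn new model
    return p, fitness.max()
```

More expressive EDAs, such as those in Mateda-2.0, replace the independent per-bit Bernoulli model of this sketch with directed or undirected probabilistic graphical models learned from the selected individuals.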

Relevance: 30.00%

Abstract:

Transcription factor binding sites (TFBS) play key roles in gene expression and regulation. They are short sequence segments with a definite structure and can be correctly recognized by the corresponding transcription factors. From a statistical viewpoint, candidate TFBSs should be quite different from segments assembled at random from nucleotides. This paper proposes a combined statistical model for finding over-represented short sequence segments in different kinds of data sets. When an over-represented short sequence segment is described by a position weight matrix, the nucleotide distribution at most sites of the segment should be far from the background nucleotide distribution. The central idea of this approach is to search for such signals. The algorithm is tested on three data sets: the binding-site data set of the cyclic AMP receptor protein in E. coli; PlantProm DB, a non-redundant collection of proximal promoter sequences from different species; and a collection of the intergenic sequences of the whole E. coli genome. Even though the complexity of these three data sets is quite different, the results show that the model is rather general and sensible.
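The combined model itself is not spelled out in the abstract, but the position-weight-matrix machinery it builds on is standard. The sketch below shows a generic log-odds PWM; the pseudocount and the uniform 0.25 background are assumptions of mine, not parameters from the paper.

```python
import numpy as np

BASES = "ACGT"

def position_weight_matrix(sites, pseudocount=0.5, background=0.25):
    """Log-odds PWM from aligned binding sites of equal length."""
    counts = np.zeros((len(sites[0]), 4))
    for site in sites:
        for i, base in enumerate(site):
            counts[i, BASES.index(base)] += 1
    freqs = (counts + pseudocount) / (counts.sum(axis=1, keepdims=True) + 4 * pseudocount)
    return np.log2(freqs / background)

def score(pwm, segment):
    """Sum of per-position log-odds scores for a candidate segment."""
    return sum(pwm[i, BASES.index(b)] for i, b in enumerate(segment))
```

Segments whose column distributions depart strongly from the background, as the abstract requires, will score well above segments assembled at random from nucleotides.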

Relevance: 30.00%

Abstract:

Recently, probability models on rankings have been proposed in the field of estimation of distribution algorithms in order to solve permutation-based combinatorial optimisation problems. In particular, distance-based ranking models, such as the Mallows and Generalized Mallows models under the Kendall's-τ distance, have demonstrated their validity for this type of problem. Nevertheless, there are still many directions that deserve further study. In this paper, we extend the use of distance-based ranking models in the framework of EDAs by introducing new distance metrics, namely Cayley and Ulam. In order to analyse the performance of the Mallows and Generalized Mallows EDAs under the Kendall, Cayley, and Ulam distances, we run them on a benchmark of 120 instances from four well-known permutation problems. The conducted experiments showed that no single metric performs best across all the problems. However, the statistical tests pointed out that the Mallows-Ulam EDA is the most stable algorithm among the studied proposals.
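The Mallows model places probability P(σ) proportional to exp(-θ d(σ, σ0)) on permutations around a central permutation σ0, so the choice of distance d is exactly the ingredient being compared above. As a sketch of two of the metrics (assuming permutations are given as Python sequences of the same items; the Ulam distance, based on longest common subsequences, is omitted here):

```python
from itertools import combinations

def kendall_tau_distance(sigma, pi):
    """Number of pairwise disagreements (discordant pairs) between two
    permutations of the same items."""
    pos = {item: i for i, item in enumerate(pi)}
    mapped = [pos[item] for item in sigma]
    return sum(1 for i, j in combinations(range(len(mapped)), 2)
               if mapped[i] > mapped[j])

def cayley_distance(sigma, pi):
    """Minimum number of transpositions turning sigma into pi:
    n minus the number of cycles of the composed permutation."""
    pos = {item: i for i, item in enumerate(pi)}
    perm = [pos[item] for item in sigma]
    seen, cycles = [False] * len(perm), 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return len(perm) - cycles
```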

Relevance: 30.00%

Abstract:

Data were taken in 1979-80 by the CCFRR high-energy neutrino experiment at Fermilab. A total of 150,000 neutrino and 23,000 antineutrino charged-current events in the approximate energy range 25 < E_ν < 250 GeV are measured and analyzed. The structure functions F_2 and xF_3 are extracted for three assumptions about R = σ_L/σ_T: R = 0, R = 0.1, and R given by a QCD-based expression. Systematic errors are estimated and their significance is discussed. Comparisons of the x and Q^2 behaviour of the structure functions with results from other experiments are made.

We find that statistical errors currently dominate our knowledge of the valence quark distribution, which is studied in this thesis. xF_3 from different experiments has, within errors and apart from level differences, the same dependence on x and Q^2, except for the HPWF results. The CDHS F_2 shows a clear fall-off at low x relative to the CCFRR and EMC results, again apart from level differences, which are calculable from cross-sections.

The result for the GLS sum rule is found to be 2.83 ± 0.15 ± 0.09 ± 0.10, where the first error is statistical, the second is an overall level error, and the third covers the rest of the systematic errors. QCD studies of xF_3 to leading and second order have been done. The QCD evolution of xF_3, which is independent of R and the strange sea, does not depend on the gluon distribution, and the fits yield

Λ_(LO) = 88^(+163)_(-78) ^(+113)_(-70) MeV

The systematic errors are smaller than the statistical errors. Second-order fits give somewhat different values of Λ, although α_s (at Q^2_0 = 12.6 GeV^2) is not so different.

A fit using the better-determined F_2 in place of xF_3 for x > 0.4, i.e., assuming q = 0 in that region, gives

Λ_(LO) = 266^(+114)_(-104) ^(+85)_(-79) MeV

Again, the statistical errors are larger than the systematic errors. An attempt to measure R was made and the measurements are described. Utilizing the inequality q(x) ≥ 0, we find that in the region x > 0.4, R is less than 0.55 at the 90% confidence level.

Relevance: 30.00%

Abstract:

A tabulated summary is presented of the main fisheries data collected to date (1998) by the Nigerian-German Kainji Lake Fisheries Promotion Project, together with a current overview of the fishery. The data are given under the following sections: 1) Fishing localities and types; 2) Frame survey data; 3) Number of licensed fishermen by state; 4) Mesh size distribution; 5) Fishing net characteristics; 6) Fish yield; 7) Total annual fishing effort by gear type; 8) Total annual value of fish landed by gear type; 9) Graphs of effort and CPUE by gear type. (PDF contains 36 pages)

Relevance: 30.00%

Abstract:

A tabulated summary is presented of the main Lake Kainji fisheries data collected to date (1999) by the Nigerian-German Kainji Lake Fisheries Promotion Project, together with a current overview of the fishery. The data are given under the following sections: 1) Fishing localities and types; 2) Frame survey data; 3) Number of licensed fishermen by state; 4) Mesh size distribution; 5) Fishing net characteristics; 6) Fish yield; 7) Average monthly CPUE by gear type; 8) Average monthly fishing activity by gear type; 9) Total annual fishing effort by gear type; 10) Total annual value of fish landed by gear type; 11) Trends of the total yield by gear type. (PDF contains 34 pages)

Relevance: 30.00%

Abstract:

This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.
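The full interacting Gaussian process model couples one trajectory process per agent through a joint interaction potential; that coupling is not reproduced here. As a minimal sketch of the single-agent building block it rests on, an ordinary GP regression over a 1-D position track, with an RBF kernel and noise level chosen purely for illustration:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of time points."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(t_obs, x_obs, t_query, noise=0.05):
    """Posterior mean and covariance of a GP trajectory model given observed
    positions x_obs at times t_obs, evaluated at times t_query."""
    t_obs, x_obs, t_query = map(lambda v: np.asarray(v, float), (t_obs, x_obs, t_query))
    K = rbf_kernel(t_obs, t_obs) + noise**2 * np.eye(len(t_obs))
    K_s = rbf_kernel(t_query, t_obs)
    K_ss = rbf_kernel(t_query, t_query)
    alpha = np.linalg.solve(K, x_obs)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, cov
```

In the interacting model described above, each agent's GP prediction is reweighted by a term that penalises trajectories passing close to other agents, which is what produces the cooperative collision avoidance.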

Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours and, in the process, carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple-goal interacting Gaussian processes algorithm performs comparably with human teleoperators in crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple-goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we also show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.

Finally, we produce a large database of ground truth pedestrian crowd data. We make this ground truth database publicly available for further scientific study of crowd prediction models, learning from demonstration algorithms, and human robot interaction models in general.

Relevance: 30.00%

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, growing from patient zero to an estimated 650,000 to 900,000 infected Americans today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database utilized was collected by myself from an outpatient HIV clinic. The data included dates from 1984 until the present. The second database was from the Multicenter AIDS Cohort Study (MACS) public dataset. The data from the MACS cover the time between 1984 and October 1992. Comparisons are made between both datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus, distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
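The thesis's own distributional analysis is not reproduced in the abstract; as a generic illustration of checking the Gaussian assumption for a marker such as CD4 counts (the choice of the Shapiro-Wilk test and the transform suggestion are mine, not the thesis's), one might do:

```python
import numpy as np
from scipy import stats

def check_cd4_normality(cd4_counts):
    """Shapiro-Wilk test of normality plus a skewness estimate. CD4 counts are
    typically right-skewed, so a square-root or log transform is a common
    alternative to assuming a Gaussian distribution."""
    cd4 = np.asarray(cd4_counts, dtype=float)
    w, p = stats.shapiro(cd4)   # small p-value: reject the normality hypothesis
    return {"shapiro_W": w, "p_value": p, "skewness": stats.skew(cd4)}
```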

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different viral target than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was the presence of an AIDS-defining illness. A high level of clinical failure, or progression to an endpoint, was found.
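The abstract does not specify which survival estimator was used; as a minimal sketch of the standard Kaplan-Meier product-limit approach to such time-to-endpoint data (variable names and the censoring convention are illustrative assumptions), consider:

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival curve: 'time' is follow-up time per patient,
    'event' is 1 if the endpoint (e.g. an AIDS-defining illness) occurred
    and 0 if the patient was censored. Returns (time, survival) pairs."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, curve = 1.0, []
    n_at_risk = len(time)
    for t in np.unique(time):
        at_t = time == t
        d = event[at_t].sum()               # endpoints observed at time t
        if d > 0:
            surv *= 1.0 - d / n_at_risk     # product-limit update
        curve.append((t, surv))
        n_at_risk -= at_t.sum()             # drop events and censored patients
    return curve
```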

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated. It is estimated that the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable to the incremental cost per year of life saved from coronary artery bypass surgery.
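The cost inputs behind these figures are not given in the abstract. The quantity quoted, the incremental cost per year of life saved, is the standard incremental cost-effectiveness ratio, sketched below with deliberately hypothetical numbers rather than the thesis's data.

```python
def incremental_cost_per_life_year(cost_new, cost_old, life_years_new, life_years_old):
    """Incremental cost-effectiveness ratio: extra dollars spent per extra
    year of life gained by the new treatment relative to the comparator."""
    return (cost_new - cost_old) / (life_years_new - life_years_old)

# Hypothetical illustration only (values are not from the thesis):
# incremental_cost_per_life_year(450_000, 150_000, 12.0, 9.0) -> 100_000.0
```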

Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.