970 results for EXTENDED EXPONENTIAL DISTRIBUTION
Abstract:
The dynamics of the survival of recruiting fish are analyzed as evolving random processes of aggregation and mortality. The analyses draw on recent advances in the physics of complex networks and, in particular, the scale-free degree distribution arising from growing random networks with preferential attachment of links to nodes. In this study simulations were conducted in which recruiting fish 1) were subjected to mortality by using alternative mortality encounter models and 2) aggregated according to random encounters (two schools randomly encountering one another join into a single school) or preferential attachment (the probability of a successful aggregation of two schools is proportional to the school sizes). The simulations started from either a "disaggregated" (all schools comprised a single fish) or an aggregated initial condition. Results showed the transition of the school-size distribution with preferential attachment evolving toward a scale-free school-size distribution, whereas random attachment evolved toward an exponential distribution. Preferential attachment strategies performed better than random attachment strategies in terms of recruitment survival when mortality encounters were weighted toward schools rather than toward individual fish. Mathematical models were developed whose solutions (either analytic or numerical) mimicked the simulation results. The resulting models included both Beverton-Holt and Ricker-like recruitment, which predict recruitment as a function of initial mean school size as well as initial stock size. Results suggest that school-size distributions during recruitment may provide information on recruitment processes. The models also provide a template for expanding both theoretical and empirical recruitment research.
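The two aggregation rules described above are easy to prototype. Below is a minimal sketch, not the authors' simulation: the school count, number of merge steps, and the size-biased sampling rule are illustrative assumptions. Starting from the disaggregated initial condition, schools merge either uniformly at random or with probability proportional to school size.

```python
import random

def aggregate(schools, steps, preferential, rng):
    """Merge pairs of schools for a number of steps.

    preferential=True: each school is picked with probability
    proportional to its size (size-biased sampling), so large
    schools merge more often.
    preferential=False: two distinct schools are picked uniformly.
    """
    schools = list(schools)
    for _ in range(steps):
        if len(schools) < 2:
            break
        if preferential:
            i, j = rng.choices(range(len(schools)), weights=schools, k=2)
            while i == j:  # resample until two distinct schools are drawn
                i, j = rng.choices(range(len(schools)), weights=schools, k=2)
        else:
            i, j = rng.sample(range(len(schools)), 2)
        i, j = max(i, j), min(i, j)
        schools[j] += schools[i]   # the two schools join into one
        schools.pop(i)
    return schools

rng = random.Random(1)
start = [1] * 1000                 # "disaggregated" initial condition
out = aggregate(start, 600, preferential=True, rng=rng)
```

Each merge step conserves the total number of fish and reduces the number of schools by one; comparing the resulting size distributions for the two rules mirrors the scale-free versus exponential transition reported in the abstract.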
Abstract:
The field emission properties of nanostructured carbon films deposited by cathodic vacuum arc in a He atmosphere have been studied by measuring the emission currents and the emission site density. The films have an onset field of ∼3 V/μm. The emission site density, imaged on a phosphor anode, increases rapidly with the applied field. It is assumed that emission occurs from surface regions with a range of field enhancement factors but a constant work function. The field enhancement factor is found to follow an exponential distribution.
Abstract:
Extended quark distribution functions are presented, obtained by fitting a large amount of experimental data from l-A DIS processes on the basis of an improved nuclear density model. The experimental data of l-A DIS processes with A >= 3 in the region 0.0010 <= x <= 0.9500 are quite satisfactorily described by the extended formulae. Our knowledge of the influence of nuclear matter on the quark distributions is thereby deepened.
Abstract:
We study the average properties of the isospin effect in reactions induced by the halo-neutron nuclei He-8 and He-10 in intermediate-energy heavy-ion collisions using the isospin-dependent quantum molecular dynamics model (IQMD). The study is based on an extended neutron density distribution for the halo-neutron nuclei, which reflects both the average isospin effect on the reaction mechanism and the loose inner structure. The extended neutron density distribution brings an important isospin effect into the average properties of the reaction mechanism because the interaction potential and nucleon-nucleon (N-N) cross section in the IQMD model depend sensitively on the density distribution of the colliding system. To see clearly the average properties of the reaction mechanism induced by halo-neutron nuclei, we also compare the results for the neutron-halo colliding systems with those for the corresponding stable colliding systems under the same incident-channel conditions. We found that the extended density distribution of the neutron-halo projectile brings an important isospin effect to the reaction mechanism: it decreases the nuclear stopping R, yet clearly increases the neutron-proton ratio of emitted nucleons and the isospin fractionation ratio at all beam energies studied in this work, compared to the corresponding stable colliding system. Nuclear stopping, the neutron-proton ratio of emitted nucleons, and the isospin fractionation ratio induced by halo-neutron nuclei can therefore be used as possible probes for studying the average isospin effect on the reaction mechanism and for extracting information on the symmetry potential and the in-medium N-N cross section from neutron-halo nuclei in heavy-ion collisions.
Abstract:
Within the isospin-dependent quantum molecular dynamics model (IQMD), the important isospin effect in the reaction mechanism induced by a halo-neutron nucleus is investigated, and the form of the symmetry potential is extracted for intermediate-energy heavy-ion collisions. Because the interaction potential and in-medium nucleon-nucleon (N-N) cross section in the IQMD model depend sensitively on the density distribution of the colliding system, the study is based on the extended density distribution associated with the looser inner structure of the halo-neutron nucleus. Such a density distribution embodies the averaged characteristics of the isospin effect on the reaction mechanism and of the looser inner nuclear structure. To understand clearly the isospin effect of the reaction mechanism induced by the halo-neutron nucleus, the effects produced by the neutron-halo nucleus and by a stable nucleus of the same mass are compared under the same incident-channel conditions. It is found that, in the beam energy region considered, the ratio of emitted neutrons to protons and the isospin fractionation ratio are considerably larger in the neutron-halo case than in the stable-nucleus case. Information on the symmetry potential in heavy-ion collisions can therefore be extracted through such a procedure.
Abstract:
An open CNC system based on a PC and a multi-axis motion controller is considered the ideal open CNC architecture. This paper introduces the structure of an open CNC system based on PMAC, covering PMAC's interpolation, position control, and servo functions. A CNC system was built on a hardware platform consisting of a PMAC card and a PC, and its hardware composition and software design structure are analyzed. From the software design perspective, the functions and role of the PTALK control are introduced, and the software composition of the CNC system is described in detail. A friendly user interface was also designed, which is of practical significance in real applications.
Abstract:
Rock mass is widely recognized as a geologic body consisting of rock blocks and discontinuities. The deformation and failure of a rock mass are determined not only by the rock blocks but also, and in practice more importantly, by the discontinuities. The mutual cutting and combination of discontinuities control the mechanical properties of the rock mass, and this complex cutting produces intense anisotropy, especially under the action of ground stress. Engineering practice has shown that brittle failure of hard rock often occurs when the working stress is far below the yield and compressive strengths, and that such failure is directly related to fracture propagation along discontinuities; fracture propagation is the essence of hard rock failure. The discontinuous mechanical properties of rock mass can therefore be studied precisely by combining statistical analysis of discontinuities with fracture mechanics. According to the superposition principle of fracture mechanics, either Problem A or Problem C may be chosen. Problem A computes the crack-tip stress and displacement fields of internal discontinuities by numerical methods. Problem C computes the crack-tip fields under the assumption that the overall stress field of the rock mass is already known, and thus avoids the complex mutual interference of the discontinuity stress fields, known in fracture mechanics as the crack-system problem. Solving Problem C requires field measurement of the stress field in the rock mass, and the linear superposition of discontinuity strain energies is scientifically sound. The main difference between fracture mechanics of rock mass and of other materials can be stated as follows: for most other materials, fracture mechanics faces Problem A and cannot avoid the multi-crack puzzle, whereas rock mass fracture mechanics addresses Problem C.
Problem C avoids the puzzle of mutual interference among multiple discontinuities through ground stress testing, and on this basis fracture mechanics can be applied conveniently to rock mass. The statistical fracture constitutive relations introduced here are based on Problem C and on the linear superposition of discontinuity strain energies. They have several merits: first, they are physical rather than empirical constitutive relations; second, they are well suited to describing the anisotropy of rock mass; third, they account for exogenous factors such as ground stress. The statistical fracture constitutive relation is thus a practical approach to rock mass problems that are physical, anisotropic, and ground-stress dependent. Building on predecessors' statistical fracture constitutive relations, this thesis improves the discontinuity distribution function: it derives the limitations of the negative exponential distribution in regression analysis and advocates replacing it with a two-parameter negative exponential distribution. To solve two-dimensional stability problems on key engineering cross-sections, the thesis derives the planar flexibility tensor of rock mass and establishes a two-dimensional statistical fracture constitutive relation for through-going fractures on the basis of through-crack fracture mechanics. Drawing on crack-tip plasticity results for through-going fractures, such as the Irwin plastic equivalent crack, it establishes a way to treat the stress singularity and plastic yielding at discontinuity tips. Research on deformation parameters remains one of the most active areas of rock mass mechanics.
After excavation of the dam foundation of the Xiaowan hydroelectric power station, a great number of unloading cracks developed in the foundation rock mass, and its mechanical properties became intricate and strongly anisotropic. The foundation rock mass mainly developed three sets of discontinuities: gently dipping discontinuities, steeply dipping discontinuities, and schistosity planes, most of which show partial unloading loosening. According to ground stress field data, the foundation stress field is highly non-uniform, being strongly affected by the tectonic stress field, the self-weight stress field, the geometric boundary conditions of the excavation, and excavation unloading. The complexity of the discontinuities and the heterogeneity of the stress field make the mechanical properties of the foundation rock mass intricate and variable; unless every relevant influencing factor is considered as fully as possible, major errors are likely. This thesis calculated the elastic modulus of the rock mass after excavation of the dam foundation gutter at Xiaowan. The calculation region covers every monolith of the Xiaowan concrete double-curvature arch dam, and the through-going or buried statistical fracture constitutive relation was adopted for different monoliths as appropriate; the statistical fracture constitutive relation suits the intensely anisotropic and heterogeneous rock mass of the Xiaowan dam foundation. Comparison of the results with elastic moduli measured by inclined-plane load tests and estimated by the RMR method shows that the three sets of values agree closely, so the statistical fracture constitutive relations can be trusted.
Generally speaking, this thesis completed the following work on the basis of its predecessors: arguing Problem C of the superposition principle in fracture mechanics; establishing the two-dimensional through-going statistical fracture constitutive relation of rock mass; demonstrating the limitations of the negative exponential distribution and improving it; improving the three-dimensional buried statistical fracture constitutive relation of rock mass; computing the equivalent plastic zone at discontinuity tips; and calculating the rock mass elastic modulus on two-dimensional cross-sections. The overall line of research follows the "statistical rock mass mechanics" of Wu Faquan (1992).
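The two-parameter negative exponential distribution advocated above for discontinuity spacings is the shifted exponential. A minimal sketch of fitting it by maximum likelihood follows; the spacing data here are synthetic, not Xiaowan measurements, and the location/rate values are arbitrary illustrative choices.

```python
import random

def fit_shifted_exponential(spacings):
    """MLE for the two-parameter (shifted) negative exponential
    f(x) = rate * exp(-rate * (x - loc)), x >= loc:
    loc_hat  = sample minimum,
    rate_hat = 1 / (sample mean - loc_hat).
    """
    loc = min(spacings)
    mean = sum(spacings) / len(spacings)
    return loc, 1.0 / (mean - loc)

rng = random.Random(42)
# synthetic discontinuity spacings: loc = 0.05 m, rate = 4 per metre
sample = [0.05 + rng.expovariate(4.0) for _ in range(5000)]
loc_hat, rate_hat = fit_shifted_exponential(sample)
```

The extra location parameter captures a minimum spacing below which no discontinuities are observed, which is the limitation of the one-parameter form that the thesis argues against.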
Abstract:
C.G.G. Aitken, Q. Shen, R. Jensen and B. Hayes. The evaluation of evidence for exponentially distributed data. Computational Statistics & Data Analysis, vol. 51, no. 12, pp. 5682-5693, 2007.
Abstract:
Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale as near as possible to load centers, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in selecting distributed generation locations, taking into account the hourly load changes, i.e., the daily load cycle. The hourly load centers, for each of the different hourly load scenarios, are calculated deterministically. These location points, properly weighted according to their load magnitude, are used to fit a best-fit probability distribution, which in turn determines the maximum-likelihood perimeter of the area where each distributed generation source should preferably be located by the planning engineers. This allows factors such as the availability and cost of land lots, of special relevance in urban areas, as well as several obstacles important for the final selection of candidate distributed generation sites, to be taken into account. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund's exponential distribution, and the Weibull probability distribution. The methodology has been programmed in MATLAB. Results are presented and discussed for a realistic application case and demonstrate the ability of the proposed methodology to efficiently determine the best locations of distributed generation and the corresponding distribution networks.
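The core of the location step, fitting a weighted distribution to the hourly load centers and delimiting a maximum-likelihood region, can be sketched for the bivariate Gaussian case. This is a pure-Python illustration; the coordinates and weights below are made up, not data from the paper.

```python
def weighted_gaussian_fit(points, weights):
    """Weighted mean and 2x2 covariance of hourly load-center points,
    each point weighted by its load magnitude."""
    w = sum(weights)
    mx = sum(wi * x for (x, y), wi in zip(points, weights)) / w
    my = sum(wi * y for (x, y), wi in zip(points, weights)) / w
    sxx = sum(wi * (x - mx) ** 2 for (x, y), wi in zip(points, weights)) / w
    syy = sum(wi * (y - my) ** 2 for (x, y), wi in zip(points, weights)) / w
    sxy = sum(wi * (x - mx) * (y - my) for (x, y), wi in zip(points, weights)) / w
    return (mx, my), ((sxx, sxy), (sxy, syy))

def mahalanobis2(p, mean, cov):
    """Squared Mahalanobis distance of p from the fitted Gaussian.
    Points with value below a chi-square quantile (2 d.o.f.) lie inside
    the corresponding maximum-likelihood ellipse."""
    (mx, my), ((a, b), (_, c)) = mean, cov
    dx, dy = p[0] - mx, p[1] - my
    det = a * c - b * b
    return (c * dx * dx - 2 * b * dx * dy + a * dy * dy) / det

# hypothetical load centers (x, y in metres) weighted by hourly load (MW)
mean, cov = weighted_gaussian_fit(
    [(620, 310), (600, 330), (650, 305)], [0.9, 1.3, 0.8])
```

The planning engineers would then search for feasible land lots among the points whose squared Mahalanobis distance falls below the chosen likelihood threshold.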
Abstract:
The thesis deals with the analysis of some stochastic inventory models with pooling/retrial of customers. In the first model we analyze an (s,S) production inventory system with retrial of customers. Arrivals of customers from outside the system form a Poisson process, and the inter-production times are exponentially distributed with parameter µ. When the inventory level reaches zero, further arriving demands are sent to an orbit of capacity M (< ∞); customers who find the orbit full while the inventory level is zero are lost to the system. The times between demands from orbital customers are exponentially distributed with parameter γ. In Model II we extend these results to a perishable inventory system, assuming that the lifetime of each item is exponential with parameter θ. The study then considers an (s,S) production inventory with service times and retrial of unsatisfied customers, where primary demands occur according to a Markovian Arrival Process (MAP); an (s,S) retrial inventory with service time in which primary demands occur according to a Batch Markovian Arrival Process (BMAP); and an (s,S) inventory system with service time in which primary demands occur according to a Poisson process with parameter λ. The study further concentrates on two models. In the first we analyze an (s,S) inventory system with postponed demands, where arrivals of demands form a Poisson process. In the second we extend the results to a perishable inventory system, assuming that the lifetime of each item follows an exponential distribution with parameter θ; it is also assumed that when the inventory level is zero, an arriving demand enters the pool with probability β and is lost forever with complementary probability (1 − β). Finally, the thesis analyzes an (s,S) production inventory system with switching time. Much of the reported work assumes that the switching time is negligible, but this is not the case in several real-life situations.
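The (s,S) control rule common to these models can be illustrated with a toy simulation. This is only a sketch under simplified assumptions (no orbit, no service times, lost sales at stock-out, order-up-to-S after an exponential lead time), not one of the thesis's retrial models, and the parameter values are arbitrary.

```python
import random

def simulate_sS(s, S, lam, mu, horizon, rng):
    """Toy (s,S) inventory: Poisson demands (rate lam) each take one item;
    when the level drops to s or below, an order is placed and arrives
    after an Exponential(mu) lead time, raising the level back to S.
    Returns the fraction of demands lost at stock-out."""
    t, level, order_due = 0.0, S, None
    served = lost = 0
    while t < horizon:
        t += rng.expovariate(lam)            # next demand epoch
        if order_due is not None and order_due <= t:
            level, order_due = S, None       # replenishment arrived first
        if level > 0:
            level -= 1
            served += 1
            if level <= s and order_due is None:
                order_due = t + rng.expovariate(mu)   # place an order
        else:
            lost += 1                        # stock-out: demand lost
    return lost / (served + lost)

loss_frac = simulate_sS(s=2, S=10, lam=5.0, mu=1.0,
                        horizon=200.0, rng=random.Random(0))
```

Raising the replenishment rate mu (shorter switching/lead time) drives the lost fraction toward zero, which is why the negligible-switching-time assumption criticized above matters.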
Abstract:
This thesis is devoted to the study of some stochastic models in inventories. An inventory system is a facility at which items of materials are stocked. In order to promote the smooth and efficient running of a business, and to provide adequate service to customers, an inventory of materials is essential for any enterprise. When uncertainty is present, inventories are used as a protection against the risk of stock-out. It is advantageous to procure an item before it is needed, at a lower marginal cost, and by bulk purchasing the advantage of price discounts can be availed; all of these contribute to the formation of inventory. Maintaining inventories is a major expenditure for any organization, and for each inventory the fundamental questions are how much new stock should be ordered and when the orders should be placed. The present study considers several models for single- and two-commodity stochastic inventory problems. The thesis discusses two models. In the first model, we examine the case in which the times elapsed between two consecutive demand points are independent and identically distributed with common distribution function F(.) with finite mean, and in which the demand magnitude depends only on the time elapsed since the previous demand epoch; the time between disasters has an exponential distribution. In Model II, the inter-arrival times of disasters have a general distribution F(.) with finite mean, the quantity destroyed depends on the time elapsed between disasters, and demands form a compound Poisson process. The thesis also deals with a linearly correlated bulk-demand two-commodity inventory problem, where each arrival demands a random number of items of each commodity C1 and C2, the maximum quantities demanded being a (< S1) and b(
Abstract:
The present study concerns the characterization of probability distributions using the residual entropy function. The concept of entropy is extensively used in the literature as a quantitative measure of the uncertainty associated with a random phenomenon. The commonly used lifetime models in reliability theory are the exponential, Pareto, beta, Weibull, and gamma distributions. Several characterization theorems are obtained for these models using reliability concepts such as the failure rate, mean residual life function, vitality function, and variance residual life function. Most of the work on characterization of distributions in the reliability context centers around the failure rate or the residual life function. An important aspect of interest in the study of entropy is that of locating distributions for which Shannon's entropy is maximal subject to certain restrictions on the underlying random variable. The study introduces the geometric vitality function and examines its properties; it is established that the geometric vitality function determines the distribution uniquely. The problem of averaging the residual entropy function is examined, and truncated versions of entropies of higher order are defined. It is established that the residual entropy function determines the distribution uniquely and that its constancy is characteristic of the geometric distribution.
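Among the continuous lifetime models listed above, the exponential law illustrates the constancy property whose discrete counterpart is the geometric characterization mentioned here: for X ~ Exp(λ), the residual life X − t given X > t is again Exp(λ), so the residual entropy H(X; t) equals 1 − log λ for every t. A small numerical check follows; the integration limit and grid size are arbitrary choices.

```python
import math

def residual_entropy_exp(lam, t, upper=40.0, n=50000):
    """Numerically integrate H(X; t) = -int_0^inf g(u) log g(u) du,
    where g(u) = lam * exp(-lam * u) is the density of the residual
    life X - t given X > t, for X ~ Exp(lam).
    Closed form: 1 - log(lam), independent of t."""
    h = (upper - t) / n
    total = 0.0
    for i in range(n):
        x = t + (i + 0.5) * h                 # midpoint rule
        g = lam * math.exp(-lam * (x - t))
        total -= g * math.log(g) * h
    return total

h0 = residual_entropy_exp(2.0, 0.0)           # close to 1 - log 2
```

Evaluating at several truncation points t returns the same value, in line with the memoryless property; for a non-exponential lifetime the residual entropy varies with t.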
Abstract:
In this article it is proved that the stationary Markov sequences generated by minification models are ergodic and uniformly mixing. These results are used to establish the optimal properties of estimators for the parameters in the model. The problem of estimating the parameters in the exponential minification model is discussed in detail.
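The exponential minification model discussed here can be simulated directly. The parameterization below is the standard one from the minification literature, stated as an assumption rather than taken from the article: with X_t = K·min(X_{t−1}, ε_t), K > 1, and ε_t i.i.d. exponential with rate λ(K − 1), the stationary marginal of X_t is Exp(λ).

```python
import random

def exponential_minification(n, lam, K, rng):
    """Simulate X_t = K * min(X_{t-1}, e_t), e_t ~ Exp(rate = lam*(K-1)).
    If X_{t-1} ~ Exp(lam), then min(X_{t-1}, e_t) ~ Exp(lam*K) and
    K * min(...) ~ Exp(lam), so Exp(lam) is the stationary marginal."""
    x = rng.expovariate(lam)          # start in the stationary law
    path = [x]
    for _ in range(n - 1):
        e = rng.expovariate(lam * (K - 1))
        x = K * min(x, e)
        path.append(x)
    return path

rng = random.Random(7)
path = exponential_minification(20000, lam=2.0, K=1.5, rng=rng)
sample_mean = sum(path) / len(path)   # should be near 1/lam = 0.5
```

Because the sequence is uniformly mixing, as the article proves, sample averages along one trajectory converge to the stationary moments, which is what underpins the optimality of the parameter estimators.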
Abstract:
Lecture notes in LaTeX
Abstract:
Exam questions and solutions in PDF