912 results for Dwarf Galaxy Fornax Distribution Function Action Based
Abstract:
The thesis that follows is organized in two parts: a first part on galaxy mass models and a second part on the design of optical coatings and the control of their mechanical properties. The mass models presented in this thesis were built for a subsample of ten galaxies from the SINGS survey, comprising nine normal galaxies and one dwarf galaxy. This work aimed to fix the mass-to-light ratio of the disk at every radius by using the results of chemo-spectrophotometric galactic evolution models fitted specifically to each galaxy through its multi-band photometric profile. The results show that the stellar disks, as normalized by the mass-to-light ratios derived from these models, have consistent masses in all of the bands studied, from the ultraviolet through the visible to the near-infrared (FUV to IRAC2 bands). These disks can be considered maximal with respect to the kinematic data of the galaxies studied. This is because the M/L ratio is higher in the center than at the edges. Since the disks are maximal and physically justified, the effects of components such as bulges or bars can no longer be ignored, and the necessary corrections must be applied to the luminosity and rotation-velocity profiles of each galaxy. In the second part, the open-source software OpenFilters was modified to take mechanical stresses into account in the numerical design of optical coatings. Mechanical stresses in thin films have a deleterious effect on their optical performance. A coating intended to make the plates of a Fabry-Perot etalon used in astronomy reflective was designed and fabricated in order to evaluate the real-world performance of the method. This case was chosen because of the loss of finesse of a Fabry-Perot etalon caused by the curvature of the plates under stress. The results show that the measurements agree with the numerical models, and that it is therefore possible, with this software, to optimize coatings for their mechanical behavior as well as for their optical properties.
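To make the rotation-curve side of the first part concrete, the sketch below shows, under stated assumptions, how a stellar disk whose mass-to-light ratio is fixed at every radius by photometric models can be combined with a dark halo in a rotation-curve fit. The halo form, the M/L profile, and all numerical values are hypothetical placeholders, not the thesis's actual models or data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch (not the thesis code): decompose an observed rotation
# curve into a stellar disk scaled by a photometrically fixed M/L profile
# plus a simple pseudo-isothermal dark halo. All arrays and parameter values
# below are hypothetical placeholders.

def v_halo(r, v_inf, r_core):
    """Pseudo-isothermal halo rotation curve."""
    return v_inf * np.sqrt(1.0 - (r_core / r) * np.arctan(r / r_core))

def v_model(r, v_inf, r_core, v_disk_ml1, ml_profile):
    """Quadrature sum of the M/L-scaled disk and the halo contributions."""
    return np.sqrt(ml_profile * v_disk_ml1**2 + v_halo(r, v_inf, r_core)**2)

# Hypothetical data: radius [kpc], disk curve for M/L = 1, M/L(r) profile from
# the evolution models, and noisy "observed" velocities [km/s].
r = np.linspace(1.0, 20.0, 40)
v_disk_ml1 = 120.0 * np.sqrt(r) * np.exp(-r / 12.0)
ml_profile = 1.5 - 0.02 * r            # higher M/L in the centre than at the edges
v_obs = v_model(r, 150.0, 4.0, v_disk_ml1, ml_profile) + np.random.normal(0, 5, r.size)

# Only the halo parameters are free; the disk is fixed by the M/L profile.
popt, _ = curve_fit(lambda r_, v_inf, r_core: v_model(r_, v_inf, r_core, v_disk_ml1, ml_profile),
                    r, v_obs, p0=[150.0, 4.0])
print("fitted halo parameters (v_inf, r_core):", popt)
```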
Abstract:
This thesis is devoted to the study of some stochastic models in inventories. An inventory system is a facility at which items of material are stocked. In order to promote the smooth and efficient running of a business, and to provide adequate service to customers, an inventory of materials is essential for any enterprise. When uncertainty is present, inventories are used as protection against the risk of stock-out. It is advantageous to procure an item before it is needed, at a lower marginal cost, and bulk purchasing allows price discounts to be availed. All of these contribute to the formation of inventory. Maintaining inventories is a major expenditure for any organization, and for each inventory the fundamental questions are how much new stock should be ordered and when the orders should be placed. The present study considers several models for single- and two-commodity stochastic inventory problems. The thesis discusses two models. In the first model, the times elapsed between two consecutive demand points are independent and identically distributed with a common distribution function F(·) and finite mean, and the demand magnitude depends only on the time elapsed since the previous demand epoch; the time between disasters has an exponential distribution. In Model II, the inter-arrival times of disasters have a general distribution with finite mean, and the quantity destroyed depends on the time elapsed between disasters; demands form a compound Poisson process. The thesis also deals with a linearly correlated bulk-demand two-commodity inventory problem, where each arrival demands a random number of items of each commodity C1 and C2, the maximum quantities demanded being a (< S1) and b (< S2), respectively.
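As a loose illustration of the kind of dynamics described above (not the thesis's Model I, whose distributions are not specified here), the following minimal sketch simulates a single-commodity inventory with i.i.d. inter-demand times and demand magnitudes that grow with the time elapsed since the previous demand; the reorder rule, distributions, and parameters are all hypothetical.

```python
import random

# Minimal illustrative simulation: an (s, S) inventory with i.i.d. exponential
# inter-demand times and a demand magnitude that depends on the elapsed time.
# Distributions and parameters are hypothetical placeholders.

def simulate(T=1000.0, s=5, S=20, mean_gap=1.0, seed=1):
    random.seed(seed)
    t, stock, stockouts, orders = 0.0, S, 0, 0
    while t < T:
        gap = random.expovariate(1.0 / mean_gap)   # i.i.d. inter-demand time
        t += gap
        demand = 1 + int(2.0 * gap)                # magnitude depends on elapsed time
        if demand > stock:
            stockouts += 1
        stock = max(stock - demand, 0)
        if stock <= s:                             # reorder up to S
            stock = S
            orders += 1
    return stockouts, orders

print(simulate())
```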
Abstract:
The average availability of a repairable system is the expected proportion of time that the system is operating in the interval [0, t]. The present article discusses the nonparametric estimation of the average availability when (i) the data on 'n' complete cycles of system operation are available, (ii) the data are subject to right censorship, and (iii) the process is observed up to a specified time 'T'. In each case, a nonparametric confidence interval for the average availability is also constructed. Simulations are conducted to assess the performance of the estimators.
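The abstract does not give the estimators themselves, so the following is only a minimal sketch of case (i): a simple nonparametric point estimate of availability from n complete operating/repair cycles together with a percentile-bootstrap confidence interval. The cycle data and the bootstrap recipe are illustrative assumptions, not the article's procedure.

```python
import numpy as np

# Illustrative sketch only (not the article's estimator): a nonparametric point
# estimate and bootstrap confidence interval for availability based on n
# complete operating/repair cycles. The up/down durations below are hypothetical.
rng = np.random.default_rng(0)
up_times = rng.exponential(10.0, size=50)     # operating durations per cycle
down_times = rng.exponential(2.0, size=50)    # repair durations per cycle

def availability(up, down):
    """Fraction of time operating over the observed cycles."""
    return up.sum() / (up.sum() + down.sum())

point = availability(up_times, down_times)

# Percentile bootstrap over complete cycles.
boot = []
n = len(up_times)
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boot.append(availability(up_times[idx], down_times[idx]))
ci = np.percentile(boot, [2.5, 97.5])
print(f"estimate = {point:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```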
Abstract:
The plankton community, drawn from a very wide variety of animal phyla, forms the basic food supply of marine life and serves as an indicator of water masses. The term meroplankton generally refers to the portion of the zooplankton that is planktonic only transiently, spending the rest of its life in the nektonic or benthic environment. This group was selected for intensive study, considering the role of meroplankton in the economy of the sea and the scarcity of literature on them from the Indian Ocean. The present study, besides providing information on fixation and preservation techniques and on biochemical aspects of tropical meroplankton, also consolidates information on their zoogeography in the Indian Ocean region, with a view to amplifying the limited information available from this area. The distribution studies are based on the collections made during the International Indian Ocean Expedition (1960-65), whereas the material for the preservation and biochemical studies was collected from coastal waters during 1968-1978. Salient features: 2% formaldehyde buffered with 2% borax, added to the plankton in the ratio 9:1, was found to be the best fixative. On fixation the plankton underwent shrinkage due to loss of 15 to 87% of their water. Addition of antioxidants prevented colour fading. Narcotization with different specific reagents prior to fixation reduced distortion due to violent reactions and improved morphological condition. A one percent formaldehyde solution in sea water, buffered with borax or neutralised with calcium carbonate, preserved the majority of meroplankton perfectly; equally good was one percent propylene phenoxetol buffered with borax. The biochemical composition of various taxa varied according to age class, size group, metamorphosing stage, feeding mechanism, type of organism fed on and time of collection. Distribution studies of four meroplankton taxa - anthozoan larvae, cirripede larvae, sipunculoid larvae and gastropod larvae - showed abundance in coastal areas, especially during the SW monsoon period. Based on the larval distributions, different zoogeographical areas in the Indian Ocean are differentiated.
Abstract:
Electron transport in a self-consistent potential along a ballistic two-terminal conductor has been investigated. We have derived general formulas which describe the nonlinear current-voltage characteristics, differential conductance, and low-frequency current and voltage noise assuming an arbitrary distribution function and correlation properties of injected electrons. The analytical results have been obtained for a wide range of biases: from equilibrium to high values beyond the linear-response regime. The particular case of a three-dimensional Fermi-Dirac injection has been analyzed. We show that the Coulomb correlations are manifested in the negative excess voltage noise, i.e., the voltage fluctuations under high-field transport conditions can be less than in equilibrium.
Gemeinsam selbständig [Self-employed together]: An analysis of cooperative action in partnership-based business start-ups
Abstract:
At the center of this empirical study stands, on the one hand, a particular form of business founding and, on the other, a particular form of cooperation: the partnership-based start-up. The research project aims to explore cooperation processes in such entrepreneurial partnerships. In order to ultimately obtain a practically usable basis for measures to promote occupational self-employment, the characteristics and framework conditions of successful partnership-based start-ups are identified. The empirical work is based on qualitative interviews with people who have successfully founded and now run a company as a team. In a first step, concrete everyday cooperation processes are described in detail on the basis of the interviews. Building on this, deeper qualitative analyses trace developments at a more abstract level of action. The analysis covers individual developments of the founders as well as processes at the team level and at the company level. To explore these action processes, the study's own findings are enriched by selected theoretical approaches from cognitive and social psychology, sociological action theory, and business administration. At the individual level, processes of identity development towards an entrepreneurial-partnership identity prove particularly relevant. At the team level, preserving and overcoming self-will in interaction and building a trusting relationship are central. At the company level, finally, processes of meaning-making between individual employment plans and economic structures, the emergence of order, and problem-solving processes play a decisive role. Overall it becomes clear that the path into partnership-based self-employment is a multilayered learning process that emerges from practice, is essentially organized by the founders themselves, and is sustained by joint reflection. These findings provide first starting points for promoting occupational self-employment in higher education and in vocational training.
Abstract:
Observations in daily practice are sometimes registered as positive values larger than a given threshold α. The sample space is in this case the interval (α, +∞), α > 0, which can be structured as a real Euclidean space in different ways. This fact opens the door to alternative statistical models that depend not only on the assumed distribution function, but also on the metric considered appropriate, i.e. on the way differences, and thus variability, are measured.
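A minimal sketch of the point about metrics, under the assumption that one natural alternative structure on (α, +∞) is the coordinate log(x − α): the same pair of observations then yields different measures of difference, and hence of variability. The values below are arbitrary.

```python
import numpy as np

# Illustrative sketch: two ways to measure differences between observations on
# the sample space (alpha, +inf). The usual Euclidean difference treats the
# space as a subset of the real line; an alternative structures (alpha, +inf)
# as a Euclidean space via the coordinate log(x - alpha), so differences are
# measured on a relative scale. The value of alpha and the data are hypothetical.
alpha = 1.0
x, y = 1.5, 3.0

d_euclidean = abs(x - y)
d_log = abs(np.log(x - alpha) - np.log(y - alpha))

print(f"ordinary difference:       {d_euclidean:.3f}")
print(f"log-coordinate difference: {d_log:.3f}")
```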
Abstract:
This document theoretically reviews the Poisson probability distribution as a function that assigns, to each event defined on a discrete random variable, its probability of occurrence in a given time interval or disjoint region of space. It also reviews the negative exponential distribution used to model the time interval between consecutive Poisson events that occur independently; that is, events for which the probability of occurrence in one time interval does not depend on what happened in other time intervals, which is why the distribution is said to be memoryless. The Poisson process relates the Poisson function, which represents a set of independent events occurring in a time interval or region of space, to the times between occurrences of the events, which follow the negative exponential distribution. These concepts are used in queueing theory, the branch of operations research that describes and provides solutions for situations in which a set of individuals or items form queues while waiting to be served; application examples from the medical field are therefore presented.
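The link between the exponential and Poisson distributions described above can be illustrated directly: the sketch below simulates exponential inter-arrival times with rate λ and checks that the count of events in an interval of length t has mean λt, and then evaluates the textbook M/M/1 mean time in system as a simple queueing example; all numerical values are arbitrary.

```python
import numpy as np

# If the times between consecutive events are independent exponentials with
# rate lam, the number of events in an interval of length t follows a Poisson
# distribution with mean lam * t. The rate and interval below are arbitrary.
rng = np.random.default_rng(42)
lam, t, n_runs = 3.0, 2.0, 100_000

counts = []
for _ in range(n_runs):
    total, k = 0.0, 0
    while True:
        total += rng.exponential(1.0 / lam)   # exponential inter-arrival time
        if total > t:
            break
        k += 1
    counts.append(k)

print("simulated mean count:", np.mean(counts))   # should be close to lam * t = 6
print("theoretical mean    :", lam * t)

# Simple queueing illustration (M/M/1): with arrival rate lam and service rate
# mu, the mean time a patient spends in the system is 1 / (mu - lam) for lam < mu.
mu = 4.0
print("mean time in system :", 1.0 / (mu - lam))
```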
Abstract:
This work analyzes the actions of the Llanos guerrillas and of Guadalupe Salcedo between 1949 and 1957 on the basis of the repertoires of violent collective action they employed, drawing on the postulates of Charles Tilly.
Abstract:
A novel statistic for local wave amplitude of the 500-hPa geopotential height field is introduced. The statistic uses a Hilbert transform to define a longitudinal wave envelope and dynamical latitude weighting to define the latitudes of interest. Here it is used to detect the existence, or otherwise, of multimodality in its distribution function. The empirical distribution function for the 1960-2000 period is close to a Weibull distribution with shape parameters between 2 and 3. There is substantial interdecadal variability but no apparent local multimodality or bimodality. The zonally averaged wave amplitude, akin to the more usual wave amplitude index, is close to being normally distributed. This is consistent with the central limit theorem, which applies to the construction of the wave amplitude index. For the period 1960-70 it is found that there is apparent bimodality in this index. However, the different amplitudes are realized at different longitudes, so there is no bimodality at any single longitude. As a corollary, it is found that many statistics commonly used to detect multimodality in atmospheric fields potentially satisfy the assumptions underlying the central limit theorem and can therefore only show approximately normal distributions. The author concludes that these techniques may therefore be suboptimal for detecting any multimodality.
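A rough sketch of the envelope idea, assuming a synthetic 500-hPa height anomaly and omitting the paper's dynamical latitude weighting: the Hilbert transform along the latitude circle gives the local wave amplitude, to which a Weibull distribution can be fitted.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import weibull_min

# Rough sketch (not the paper's full statistic): take a 500-hPa height anomaly
# around a latitude circle, form the analytic signal with a Hilbert transform,
# and use its modulus as the local wave amplitude. The synthetic field below is
# hypothetical.
rng = np.random.default_rng(0)
lon = np.linspace(0, 2 * np.pi, 144, endpoint=False)

amplitudes = []
for day in range(500):
    z500 = (80 * np.cos(5 * lon + rng.uniform(0, 2 * np.pi))
            + 30 * rng.standard_normal(lon.size))      # synthetic anomaly [m]
    envelope = np.abs(hilbert(z500))                    # local wave amplitude
    amplitudes.extend(envelope)

# Fit a Weibull distribution to the pooled local amplitudes.
shape, loc, scale = weibull_min.fit(amplitudes, floc=0)
print(f"Weibull shape parameter: {shape:.2f}")
```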
Abstract:
This report presents key findings from a small-scale pilot research project that explored the experiences and priorities of young people caring for their siblings in sibling-headed households affected by AIDS in Tanzania and Uganda. Qualitative and participatory research was conducted with 33 young people living in sibling-headed households and 39 NGO staff and community members in rural and urban areas of Tanzania and Uganda. The report analyses the ways that young people manage transitions to caring for their younger siblings following their parents’ death and the impacts of caring on their family relations, education, emotional wellbeing and health, social lives and their transitions to adulthood. The study highlights gendered- and age-related differences in the nature and extent of young people’s care work and discusses young people’s needs and priorities for action, based on the views of young people, NGO staff and community members. Meeting the basic needs of young people living in sibling-headed households, listening to young people’s views, fostering peer support and relationships of trust with supportive adults, raising awareness and advocacy emerge as key priorities to safeguard the rights of children and young people living in sibling-headed households and challenge the stigma and marginalisation they sometimes face.
Abstract:
The orientational ordering of the nematic phase of a polyethylene glycol (PEG)-peptide block copolymer in aqueous solution is probed by small-angle neutron scattering (SANS), with the sample subjected to steady shear in a Couette cell. The PEG-peptide conjugate forms fibrils that behave as semiflexible rodlike chains. The orientational order parameters P̄2 and P̄4 are obtained by modeling the data using a series expansion approach to the form factor of uniform cylinders. The method used is independent of assumptions on the form of the singlet orientational distribution function. Good agreement with the anisotropic two-dimensional SANS patterns is obtained. The results show shear alignment starting at very low shear rates, and the orientational order parameters reach a plateau at higher shear rates with a pseudologarithmic dependence on shear rate. The most probable distribution functions correspond to fibrils parallel to the flow direction under shear, but a sample at rest shows a bimodal distribution with some of the rodlike peptide fibrils oriented perpendicular to the flow direction.
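For reference, the order parameters themselves are just averages of Legendre polynomials over the singlet orientational distribution function. The sketch below computes P̄2 and P̄4 for a hypothetical distribution; it is not the paper's SANS form-factor fitting, and the distribution's functional form and width are assumptions.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.integrate import quad

# Illustrative calculation: given a singlet orientational distribution function
# f(theta) for the angle between a fibril and the flow direction, the order
# parameters are the averages of the Legendre polynomials P2(cos theta) and
# P4(cos theta). The Maier-Saupe-like form of f and its width are hypothetical.
def f(theta, m=4.0):
    return np.exp(m * np.cos(theta) ** 2)        # unnormalized distribution

def average_Pn(n, m=4.0):
    Pn = legendre.Legendre.basis(n)
    num, _ = quad(lambda th: Pn(np.cos(th)) * f(th, m) * np.sin(th), 0, np.pi)
    den, _ = quad(lambda th: f(th, m) * np.sin(th), 0, np.pi)
    return num / den

print("P2_bar =", round(average_Pn(2), 3))
print("P4_bar =", round(average_Pn(4), 3))
```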
Abstract:
A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to enforce the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model, which restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the high-dimensional, ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact and accurate density estimates.
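A simplified sketch of the general construction, with nonnegative least squares plus renormalization standing in for the paper's multiplicative nonnegative quadratic programming update and a fixed set of quantile-based centres standing in for the orthogonal forward regression stage; data, bandwidth, and centre choice are all hypothetical.

```python
import numpy as np
from scipy.optimize import nnls

# Simplified sketch (not the paper's algorithm): build a sparse kernel density
# estimate by picking a few kernel centres and fitting nonnegative mixing
# weights, with nonnegative least squares plus normalization as a stand-in for
# the MNQP weight update. Data, bandwidth, and centre selection are hypothetical.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(1, 1.0, 300)])

centers = np.quantile(x, np.linspace(0.05, 0.95, 8))   # a few fixed centres
sigma = 0.6                                            # common width for simplicity

def gauss(u, c, s):
    return np.exp(-0.5 * ((u - c) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Target: a fine-grained histogram density; design matrix of kernel responses.
hist, edges = np.histogram(x, bins=200, density=True)
grid = 0.5 * (edges[:-1] + edges[1:])                  # bin centres
Phi = np.stack([gauss(grid, c, sigma) for c in centers], axis=1)

w, _ = nnls(Phi, hist)           # nonnegative weights
w = w / w.sum()                  # unity constraint

density = Phi @ w
print("estimated density at x=0:", np.interp(0.0, grid, density))
```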
Abstract:
Neurofuzzy modelling systems combine fuzzy logic with quantitative artificial neural networks via a concept of fuzzification by using a fuzzy membership function usually based on B-splines and algebraic operators for inference, etc. The paper introduces a neurofuzzy model construction algorithm using Bezier-Bernstein polynomial functions as basis functions. The new network maintains most of the properties of the B-spline expansion based neurofuzzy system, such as the non-negativity of the basis functions, and unity of support but with the additional advantages of structural parsimony and Delaunay input space partitioning, avoiding the inherent computational problems of lattice networks. This new modelling network is based on the idea that an input vector can be mapped into barycentric co-ordinates with respect to a set of predetermined knots as vertices of a polygon (a set of tiled Delaunay triangles) over the input space. The network is expressed as the Bezier-Bernstein polynomial function of barycentric co-ordinates of the input vector. An inverse de Casteljau procedure using backpropagation is developed to obtain the input vector's barycentric co-ordinates that form the basis functions. Extension of the Bezier-Bernstein neurofuzzy algorithm to n-dimensional inputs is discussed followed by numerical examples to demonstrate the effectiveness of this new data based modelling approach.
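The two building blocks of the network, barycentric coordinates and Bernstein polynomials, can be shown in a few lines. The sketch below maps a 2-D input into barycentric coordinates with respect to one Delaunay triangle of knots and evaluates a degree-2 Bernstein polynomial of those coordinates; the triangle, degree, and coefficients are hypothetical, and the inverse de Casteljau/backpropagation step is not shown.

```python
import numpy as np
from math import factorial

# Small sketch of the two building blocks (not the full network):
# (1) barycentric coordinates of a 2-D input w.r.t. a triangle of knots, and
# (2) a Bernstein polynomial of those coordinates. Triangle, degree, and
# coefficients are hypothetical.
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p w.r.t. triangle (a, b, c)."""
    T = np.array([[a[0] - c[0], b[0] - c[0]],
                  [a[1] - c[1], b[1] - c[1]]])
    l1, l2 = np.linalg.solve(T, np.asarray(p) - np.asarray(c))
    return np.array([l1, l2, 1.0 - l1 - l2])

def bernstein(lmbda, coeffs, degree):
    """Bernstein polynomial of the given degree over barycentric coordinates."""
    out = 0.0
    for (i, j, k), c in coeffs.items():            # indices satisfy i + j + k == degree
        mult = factorial(degree) / (factorial(i) * factorial(j) * factorial(k))
        out += c * mult * lmbda[0]**i * lmbda[1]**j * lmbda[2]**k
    return out

tri = ([0.0, 0.0], [1.0, 0.0], [0.0, 1.0])
lam = barycentric([0.25, 0.25], *tri)              # nonnegative inside the triangle
coeffs = {(2, 0, 0): 1.0, (0, 2, 0): 0.5, (0, 0, 2): -0.2,
          (1, 1, 0): 0.3, (1, 0, 1): 0.0, (0, 1, 1): 0.8}
print("barycentric coords:", lam)
print("network output    :", bernstein(lam, coeffs, 2))
```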
Abstract:
This paper compares a number of different extreme value models for determining the value at risk (VaR) of three LIFFE futures contracts. A semi-nonparametric approach is also proposed, where the tail events are modeled using the generalised Pareto distribution, and normal market conditions are captured by the empirical distribution function. The value at risk estimates from this approach are compared with those of standard nonparametric extreme value tail estimation approaches, with a small sample bias-corrected extreme value approach, and with those calculated from bootstrapping the unconditional density and bootstrapping from a GARCH(1,1) model. The results indicate that, for a holdout sample, the proposed semi-nonparametric extreme value approach yields superior results to other methods, but the small sample tail index technique is also accurate.
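One common peaks-over-threshold reading of the semi-nonparametric idea (not necessarily the paper's exact estimator) is sketched below: exceedances over a high threshold are fitted with a generalised Pareto distribution, the body is handled empirically, and VaR is read off the GPD tail with the standard quantile formula. The simulated losses, threshold, and confidence level are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

# Sketch of a peaks-over-threshold VaR estimate: losses above a high threshold u
# are modelled with a generalised Pareto distribution, the rest with the
# empirical distribution. Simulated returns, threshold choice, and confidence
# level are illustrative assumptions.
rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=5000) * 0.01       # hypothetical daily losses

u = np.quantile(losses, 0.95)                          # tail threshold
exceedances = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0)     # GPD shape and scale

n, n_u = losses.size, exceedances.size
q = 0.99                                               # VaR confidence level
# Standard POT quantile formula: VaR_q = u + (beta/xi) * (((n/n_u)*(1-q))**(-xi) - 1)
var_q = u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)

var_empirical = np.quantile(losses, q)                 # purely empirical comparison
print(f"GPD-tail VaR(99%):  {var_q:.4f}")
print(f"Empirical VaR(99%): {var_empirical:.4f}")
```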