991 results for weighted quasi-arithmetic means


Relevance: 30.00%

Abstract:

© 2015, Institute of Mathematical Statistics. All rights reserved. In order to use persistence diagrams as a true statistical tool, it would be very useful to have a good notion of mean and variance for a set of diagrams. In [23], Mileyko and his collaborators made the first study of the properties of the Fréchet mean in (Dp, Wp), the space of persistence diagrams equipped with the p-th Wasserstein metric. In particular, they showed that the Fréchet mean of a finite set of diagrams always exists but is not necessarily unique. The means of a continuously varying set of diagrams do not themselves (necessarily) vary continuously, which presents obvious problems when trying to extend the Fréchet mean definition to the realm of time-varying persistence diagrams, better known as vineyards. We fix this problem by altering the original definition of the Fréchet mean so that it becomes a probability measure on the set of persistence diagrams; in a nutshell, the mean of a set of diagrams will be a weighted sum of atomic measures, where each atom is itself a persistence diagram determined using a perturbation of the input diagrams. This definition gives, for each N, a map (Dp)^N → ℙ(Dp). We show that this map is Hölder continuous on finite diagrams and thus can be used to build a useful statistic on vineyards.
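As a toy illustration of the matching flavour behind Wasserstein-style means (not the paper's probability-measure construction), the sketch below brute-forces the optimal pairing between two equal-size "diagrams" and averages the matched points to produce one candidate mean; the diagram values and the squared-distance cost are illustrative assumptions.

```python
import itertools

def matching_cost(d1, d2, perm):
    # sum of squared Euclidean distances for a given pairing of points
    return sum((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
               for p, q in zip(d1, [d2[i] for i in perm]))

def frechet_mean_candidate(d1, d2):
    # brute-force the optimal matching (toy-sized diagrams only),
    # then average each matched pair of (birth, death) points
    best = min(itertools.permutations(range(len(d2))),
               key=lambda perm: matching_cost(d1, d2, perm))
    return [((p[0] + d2[i][0]) / 2, (p[1] + d2[i][1]) / 2)
            for p, i in zip(d1, best)]

d1 = [(0.0, 1.0), (2.0, 3.0)]   # hypothetical (birth, death) pairs
d2 = [(2.2, 3.1), (0.1, 0.9)]
print(frechet_mean_candidate(d1, d2))
```

Non-uniqueness of the matching is exactly where the paper's weighted sum of atomic measures comes in: each near-optimal perturbation contributes an atom.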

Relevance: 30.00%

Abstract:

The interaction between supernova ejecta and circumstellar matter, arising from previous episodes of mass loss, provides us with a means of constraining the progenitors of supernovae. Radio observations of a number of supernovae show quasi-periodic deviations from a strict power-law decline at late times. Although several possibilities have been put forward to explain these modulations, no single explanation has proven to be entirely satisfactory. Here we suggest that luminous blue variables undergoing S Doradus-type variations give rise to enhanced phases of mass loss that are imprinted on the immediate environment of the exploding star as a series of density enhancements. The variations in mass loss arise from changes in the ionization balance of Fe, the dominant ion that drives the wind. With this idea, we find that both the recurrence timescale of the variability and the amplitude of the modulations are in line with the observations. Our scenario thus provides a natural, single-star explanation for the observed behaviour that is, in fact, expected on theoretical grounds.

Relevance: 30.00%

Abstract:

Nitrogen dioxide (NO2) is known to act as an environmental trigger for many respiratory illnesses. As a pollutant it is difficult to map accurately, as concentrations can vary greatly over small distances. In this study three geostatistical techniques were compared, producing maps of NO2 concentrations in the United Kingdom (UK). The primary data source for each technique was NO2 point data, generated from background automatic monitoring and background diffusion tubes, which are analysed by different laboratories on behalf of local councils and authorities in the UK. The techniques used were simple kriging (SK), ordinary kriging (OK) and simple kriging with a locally varying mean (SKlm). SK and OK make use of the primary variable only. SKlm differs in that it utilises additional data to inform prediction, and hence potentially reduces uncertainty. The secondary data source was oxides of nitrogen (NOx) derived from dispersion modelling outputs, at 1 km × 1 km resolution for the UK. These data were used to define the locally varying mean in SKlm, using two regression approaches: (i) global regression (GR) and (ii) geographically weighted regression (GWR). Based upon summary statistics and cross-validation prediction errors, SKlm using GWR-derived local means produced the most accurate predictions. Therefore, using GWR to inform SKlm was beneficial in this study.

Relevance: 30.00%

Abstract:

Recent efforts in the finite element modelling of delamination have concentrated on the development of cohesive interface elements. These are characterised by a bilinear constitutive law, where there is an initial high positive stiffness until a threshold stress level is reached, followed by a negative tangent stiffness representing softening (or damage evolution). Complete decohesion occurs when the amount of work done per unit area of crack surface is equal to a critical strain energy release rate. It is difficult to achieve a stable, oscillation-free solution beyond the onset of damage using standard implicit quasi-static methods, unless a very refined mesh is used. In the present paper, a new solution strategy is proposed based on a pseudo-transient formulation and demonstrated through the modelling of a double cantilever beam undergoing Mode I delamination. A detailed analysis of the sensitivity to the user-defined parameters is also presented. Comparisons with other published solutions using a quasi-static formulation show that the pseudo-transient formulation gives improved accuracy and oscillation-free results with coarser meshes.
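The bilinear constitutive law described above can be sketched as a traction-separation function. A minimal sketch follows; the stiffness K, damage-onset separation delta0 and final separation deltaf are illustrative values, not parameters from the paper.

```python
def bilinear_traction(delta, K=1e5, delta0=1e-4, deltaf=1e-3):
    """Bilinear traction-separation law (illustrative parameter values).

    Linear elastic up to delta0, then linear softening (negative
    tangent stiffness) down to zero traction at deltaf."""
    if delta <= delta0:
        return K * delta                       # initial high positive stiffness
    if delta < deltaf:
        # softening branch: traction falls linearly to zero
        return K * delta0 * (deltaf - delta) / (deltaf - delta0)
    return 0.0                                 # complete decohesion

# critical strain energy release rate = area under the bilinear curve
Gc = 0.5 * (1e5 * 1e-4) * 1e-3   # 0.5 * peak traction * deltaf
```

The area-under-the-curve identity is what ties the two user-defined separations to the measured fracture toughness.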

Relevance: 30.00%

Abstract:

Data compression is the computing technique that aims to reduce the size of information in order to minimize the required storage space and to speed up data transmission over bandwidth-limited networks. Several compression techniques, such as LZ77 and its variants, suffer from a problem we call redundancy caused by the multiplicity of encodings. Multiplicity of encodings (ME) means that the source data can be encoded in several different ways. In its simplest case, ME occurs when a compression technique has the option, during the encoding process, of coding a symbol in more than one way. The bit-recycling compression technique was introduced by D. Dubé and V. Beaudoin to minimize the redundancy caused by ME. Variants of bit recycling have been applied to LZ77, and the experimental results obtained lead to better compression (a reduction of about 9% in the size of files compressed by Gzip, by exploiting ME). Dubé and Beaudoin pointed out that their technique may not perfectly minimize the redundancy caused by ME, because it is built on Huffman coding, which cannot handle codewords of fractional length; that is, it can only generate codewords of integral length. Moreover, Huffman-based bit recycling (HuBR) imposes additional constraints to avoid certain situations that degrade its performance. Unlike Huffman codes, arithmetic coding (AC) can handle codewords of fractional length. In addition, over recent decades arithmetic codes have attracted many researchers, as they are more powerful and more flexible than Huffman codes.
Consequently, this work aims to adapt bit recycling to arithmetic codes in order to improve coding efficiency and flexibility. We addressed this problem through our four (published) contributions, which are presented in this thesis and can be summarized as follows. First, we propose a new technique for adapting Huffman-based bit recycling (HuBR) to arithmetic coding, named arithmetic-coding-based bit recycling (ACBR). It describes the framework and principles of adapting HuBR to ACBR. We also present the theoretical analysis needed to estimate the redundancy that can be removed by HuBR and ACBR in applications that suffer from ME. This analysis shows that ACBR achieves perfect recycling in all cases, whereas HuBR achieves such performance only in very specific cases. Second, the problem with the aforementioned ACBR technique is that it requires arbitrary-precision arithmetic, and hence unlimited (or infinite) resources. To make it usable, we propose a new finite-precision version. The technique thus becomes efficient and applicable on computers with conventional fixed-size registers, and can easily be interfaced with applications that suffer from ME. Third, we propose the use of HuBR and ACBR as a means of reducing redundancy in order to obtain a variable-to-fixed binary code. We proved theoretically and experimentally that both techniques yield a significant improvement (less redundancy). In this respect, ACBR outperforms HuBR and covers a wider class of binary sources that can benefit from a plurally parsable dictionary. We also show that ACBR is more flexible than HuBR in practice.
Fourth, we use HuBR to reduce the redundancy of the balanced codes generated by Knuth's algorithm. To compare the performance of HuBR and ACBR, the corresponding theoretical results for both are presented. The results show that the two techniques achieve almost the same redundancy reduction on the balanced codes generated by Knuth's algorithm.
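A minimal sketch (not the thesis's HuBR or ACBR algorithms) of why integral-length Huffman codewords leave redundancy that fractional-length arithmetic coding can remove: for a skewed source, the expected Huffman code length strictly exceeds the source entropy. The probabilities below are illustrative.

```python
import heapq
import math

def huffman_lengths(probs):
    # standard Huffman construction; returns the codeword length per symbol
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:       # every symbol under the merged node gains a bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.8, 0.1, 0.05, 0.05]          # illustrative skewed source
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print(avg, entropy)   # the gap is the redundancy of integral-length codewords
```

An arithmetic coder spends close to -log2(p) bits per symbol even when that quantity is fractional, which is the property ACBR exploits.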

Relevance: 30.00%

Abstract:

This project is a quasi-experimental study involving eight classrooms in two senior elementary schools in St. Catharines, Ontario, which received a Project Business Program and were pre- and post-tested to determine the growth of knowledge acquisition in the area of business concepts. Four classrooms received the Project Business treatment while four classrooms acted as a control. The Project Business Program is sponsored by Junior Achievement of Canada; it ran over a twelve-week period, February to May 1981, and is delivered by business consultants who, through Action, Dialogue and Career Exploration, teach children about economics and business-related topics. The consultants were matched with teacher co-ordinators in whose classrooms they taught and with whom they discussed field trips, students, lesson planning, etc. The statistical analysis of pre- and post-test means revealed statistically significant growth in knowledge acquisition on the part of those students who received the Project Business Program, confirming that Project Business makes a difference. A search of the literature appears to advocate economic programs like Project Business, which are broadly based, relevant and process-oriented. The program recommends itself as a model for other areas of co-operative curricular interaction and as a bridge to future trends; as a result, several fruitful areas of research are suggested.

Relevance: 30.00%

Abstract:

A quasi-experimental study was carried out to compare the effect on physical workload of a technological and work-organization intervention among furnace workers performing coke extraction in Colombia. Physical workload was measured through heart rate and the relative cardiac cost index in a population of workers exposed (37) and not exposed (66) to a technological intervention. Heart rate monitoring was performed with 7 properly calibrated Polar RS 800cx heart-rate monitors. Numerical variables were described by the arithmetic mean, standard deviation and range. One-way analysis of variance was applied to assess differences between the group means in resting, mean and maximum heart rate, the relative cardiac cost index, and working energy expenditure. A significance level of α = 0.05 was set a priori. Statistically significant differences between the study groups were found in mean heart rate, maximum heart rate and the relative cardiac cost index. It was concluded that this study validates heart rate as a sensitive variable for measuring physical-workload risk and confirms its usefulness in evaluating ergonomic interventions. The study showed that the ergonomic intervention succeeded in controlling physical workload, with a significant reduction in heart rate in the intervention group.
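The one-way analysis of variance used to compare the group means can be sketched as follows; the heart-rate values are hypothetical illustrations, not the study's data.

```python
def one_way_anova_F(*groups):
    """One-way ANOVA F statistic for k independent groups (pure Python)."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    # between-group sum of squares (each group mean vs the grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares (each value vs its own group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# hypothetical mean heart rates (beats/min), intervention vs control workers
intervention = [95, 98, 102, 97, 100]
control = [110, 115, 108, 112, 116]
print(one_way_anova_F(intervention, control))
```

A large F relative to the F(k-1, N-k) critical value at α = 0.05 is what "statistically significant differences between the study groups" refers to.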

Relevance: 30.00%

Abstract:

A design tool is developed for damage-tolerance analysis in composites. The tool can predict the onset and propagation of interlaminar cracks, and it can also be used to assess and plan the need to repair or replace components during their service life. The model developed can simulate both static and fatigue loading. The proposed model is a thermodynamically consistent damage model that simulates delamination in composites under variable loading. The model is formulated within the framework of damage mechanics, making use of cohesive zone models. A methodology is presented for determining the parameters of the constitutive model that allows the use of coarser finite element meshes than can typically be employed. Finally, the model is also able to simulate delamination driven by fatigue loading.

Relevance: 30.00%

Abstract:

The interannual variability of the stratospheric polar vortex during winter in both hemispheres is observed to correlate strongly with the phase of the quasi-biennial oscillation (QBO) in tropical stratospheric winds. It follows that the lack of a spontaneously generated QBO in most atmospheric general circulation models (AGCMs) adversely affects the nature of polar variability in such models. This study examines QBO–vortex coupling in an AGCM in which a QBO is spontaneously induced by resolved and parameterized waves. The QBO–vortex coupling in the AGCM compares favorably to that seen in reanalysis data [from the 40-yr ECMWF Re-Analysis (ERA-40)], provided that careful attention is given to the definition of QBO phase. A phase angle representation of the QBO is employed that is based on the two leading empirical orthogonal functions of equatorial zonal wind vertical profiles. This yields a QBO phase that serves as a proxy for the vertical structure of equatorial winds over the whole depth of the stratosphere and thus provides a means of subsampling the data to select QBO phases with similar vertical profiles of equatorial zonal wind. Using this subsampling, it is found that the QBO phase that induces the strongest polar vortex response in early winter differs from that which induces the strongest late-winter vortex response. This is true in both hemispheres and for both the AGCM and ERA-40. It follows that the strength and timing of QBO influence on the vortex may be affected by the partial seasonal synchronization of QBO phase transitions that occurs both in observations and in the model. This provides a mechanism by which changes in the strength of QBO–vortex correlations may exhibit variability on decadal time scales. In the model, such behavior occurs in the absence of external forcings or interannual variations in sea surface temperatures.
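The phase-angle representation described above can be sketched as follows: treat the two leading principal-component scores of the equatorial zonal-wind profiles as coordinates and take their angle. The sign and orientation conventions below are a modelling choice, not the paper's exact definition.

```python
import math

def qbo_phase(pc1, pc2):
    """Phase angle in degrees, in [0, 360), from the scores of the two
    leading EOFs of equatorial zonal-wind vertical profiles."""
    return math.degrees(math.atan2(pc2, pc1)) % 360.0

# profiles with similar vertical structure map to nearby phase angles
print(qbo_phase(1.0, 0.0), qbo_phase(0.0, 1.0), qbo_phase(-1.0, 0.0))
```

Subsampling by this angle is what selects QBO phases with similar vertical profiles of equatorial zonal wind over the whole depth of the stratosphere.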

Relevance: 30.00%

Abstract:

A method to solve a quasi-geostrophic two-layer model including the variation of static stability is presented. The divergent part of the wind is incorporated by means of an iterative procedure. The procedure is rather fast, and the computation time is only 60–70% longer than for the usual two-layer model. The method of solution is justified by the conservation of the difference between the gross static stability and the kinetic energy. To eliminate the side-boundary conditions, the experiments have been performed on a zonal channel model. The investigation falls mainly into three parts. The first part (section 5) contains a discussion of the significance of some physically inconsistent approximations. It is shown that physical inconsistencies are rather serious: for the inconsistent models studied, the total kinetic energy increased faster than the gross static stability. In the next part (section 6) we study the effect of a Jacobian difference operator which conserves the total kinetic energy. The use of this operator in two-layer models gives a slight improvement but probably does not have any practical use in short-period forecasts. It is also shown that the energy-conservative operator changes the wave speed in an erroneous way if the wave number or the grid length is large in the meridional direction. In the final part (section 7) we investigate the behaviour of baroclinic waves for some different initial states and for two energy-consistent models, one with constant and one with variable static stability. According to linear theory, the waves adjust rather rapidly in such a way that the temperature wave lags behind the pressure wave, independent of the initial configuration. Thus, both models give rise to a baroclinic development even if the initial state is quasi-barotropic. The effect of the variation of static stability is very small; qualitative differences in the development are observed only during the first 12 hours. For an amplifying wave there is a stabilization over the troughs and a destabilization over the ridges.

Relevance: 30.00%

Abstract:

This thesis is concerned with the development of a funding mechanism, the Student Resource Index, which has been designed to resolve a number of difficulties which emerged following the introduction of integration or inclusion as an alternative means of providing educational support to students with disabilities in the Australian State of Victoria. Prior to 1984, the year in which the major integration or inclusion initiatives were introduced, the great majority of students with disabilities were educated in segregated special schools, however, by 1992 the integration initiatives had been successful in including within regular classes approximately half of the students in receipt of additional educational assistance on the basis of disability. The success of the integration program brought with it a number of administrative and financial problems which were the subject of three government enquiries. Central to these difficulties was the development of a dual system of special education provision. On one hand, additional resources were provided for the students attending segregated special schools by means of weighted student ratios, with one teacher being provided for each six students attending a special school. On the other hand, the requirements of individual students integrated into regular schools were assessed by school-based committees on the basis of their perceived extra educational needs. The major criticism of this dual system of special education funding was that it created inequities in the distribution of resources both between the systems and also within the systems. For example, three students with equivalent needs, one of whom attended a special school and two of whom attended different regular schools could each be funded at substantially differing levels. 
The solution to these inequities of funding was seen to lie in the development of a needs-based funding device encompassing all students in receipt of additional disability-related educational support. The Student Resource Index developed in this thesis is a set of behavioural descriptors designed to assess the degree of additional educational need across a number of disability domains. These domains include hearing, vision, communication, health, co-ordination (manual and mobility), intellectual capacity and behaviour. The completed Student Resource Index provides a profile of a student’s needs across all of these domains and as such addresses the multiple nature of many disabling conditions. The Student Resource Index was validated in terms of its capacity to predict the ‘known’ membership, that is, the type of special school which some 1200 students in the sample currently attended. The decision to use the existing special school populations as the criterion against which the Student Resource Index was validated was based on the premise that the differing resource levels of these schools had been historically determined by expert opinion, industrial negotiation and reference to other special education systems as the most reliable estimate of the enrolled students’ needs. When discriminant function analysis was applied to some 178 students attending one school for students with mild intellectual disability and one facility for students with moderate to severe intellectual disability, the Student Resource Index was successful in predicting the student’s known school in 92 percent of cases. An analysis of those students (8 percent) whose known school enrolment the Student Resource Index had failed to predict revealed that 13 students had, for a variety of reasons, been inappropriately placed in these settings. When these students were removed from the sample, the predictive accuracy of the Student Resource Index was raised to 96 percent of the sample.
By comparison, the domains of the Vineland Adaptive Behaviour Scale accurately predicted the known enrolments of 76 percent of the sample. By way of replication, discriminant function analysis was then applied to the Student Resource Index profiles of 518 students attending Day Special Schools (Mild Intellectual Disability) and 287 students attending Special Developmental Schools (Moderate to Severe Intellectual Disability). In this case, the Student Resource Index profiles were successful in predicting the known enrolments of 85 percent of students. When a third group was added, 147 students attending Day Special Schools for students with physical disabilities, the Student Resource Index predicted known enrolments in 80 percent of cases. The addition of a fourth group of 116 students attending Day Special Schools (Hearing Impaired) to the discriminant analysis led to a small reduction in predictive accuracy, from 80 percent to 78 percent of the sample. A final analysis, which included students attending a School for the Deaf-Blind, a Hospital School and a Social and Behavioural Unit, was successful in predicting known enrolments for 71 percent of the 1114 students in the sample. For reasons which are expanded upon within the thesis, it was concluded that the Student Resource Index, when used in conjunction with discriminant function analysis, was capable of isolating four distinct groups on the basis of their additional educational needs. If the historically determined and varied funding levels provided to these groups, inherent in the cash equivalent of the staffing ratios of Day Special Schools (Mild Intellectual Disability), Special Developmental Schools (Moderate to Severe Intellectual Disability), Day Special Schools (Physical Disability) and Day Special Schools (Hearing Impairment), are accepted as reasonable reflections of these students’ needs, these funding levels can be translated into funding bands.
These funding bands can then be applied to students in segregated or inclusive placements. The thesis demonstrates that a new applicant for funding can be introduced into the existing database and, by the use of discriminant function analysis, be allocated to one of the four groups. The analysis is in effect saying that this new student’s profile of educational needs has more in common with Group A than with the members of Groups B, C or D; the student would then be funded at the Group A level. It is immaterial from a funding point of view whether the student decides to attend a segregated or inclusive setting. The thesis then examines the impact of the introduction of Student Resource Index based funding upon the current funding of the special schools in one of the major metropolitan regions. Overall, such an initiative would lead to a reduction of 1.54 percent in the total funding accruing to the region’s special schools.
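The allocation step can be sketched with a simplified stand-in for discriminant function analysis: assign a new needs profile to the group whose centroid is closest. The profiles, domain scores and group names below are hypothetical.

```python
def centroid(profiles):
    # component-wise mean of a list of equal-length needs profiles
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def allocate(student, groups):
    """Assign a new needs profile to the closest group centroid
    (a simplified stand-in for discriminant function analysis)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {name: centroid(ps) for name, ps in groups.items()}
    return min(centroids, key=lambda name: dist2(student, centroids[name]))

# toy profiles over three needs domains (hypothetical scores)
groups = {
    "A": [[1, 1, 2], [2, 1, 1]],
    "B": [[5, 6, 5], [6, 5, 6]],
}
print(allocate([5, 5, 5], groups))
```

Full discriminant analysis additionally weights the domains by their within- and between-group variability, but the "most in common with Group A" logic is the same.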

Relevance: 30.00%

Abstract:

In this paper we investigate modeling capabilities of Bonferroni means and their generalizations. We will show that weighted Bonferroni means can model the concepts of hard and soft partial conjunction, and lead to several interesting special cases, with quite an intuitive interpretation.
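For reference, the unweighted Bonferroni mean B(x) with p = q = 1 averages all mixed products x_i * x_j over i ≠ j and takes a square root. A minimal sketch showing the mandatory-requirement behaviour the abstract alludes to (B is zero unless at least two inputs are positive); the input values are illustrative.

```python
def bonferroni_mean(x, p=1, q=1):
    """Bonferroni mean B^{p,q}: averages all products x_i^p * x_j^q, i != j."""
    n = len(x)
    s = sum(x[i] ** p * x[j] ** q
            for i in range(n) for j in range(n) if i != j)
    return (s / (n * (n - 1))) ** (1 / (p + q))

# B^{1,1} vanishes unless at least two criteria are satisfied (hard
# partial conjunction on pairs), yet averages when inputs are all positive
print(bonferroni_mean([0.9, 0.0, 0.0]), bonferroni_mean([0.8, 0.9, 0.7]))
```

The weighted variants studied in the paper attach weights to the pairs (i, j), which is what lets them interpolate between hard and soft partial conjunction.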

Relevance: 30.00%

Abstract:

We investigate the problem of averaging values on lattices, and in particular on discrete product lattices. This problem arises in image processing when several color values, given in RGB, HSL, or another coding scheme, need to be combined. We show how the arithmetic mean and the median can be constructed by minimizing appropriate penalties. We also discuss which of them coincide with the Cartesian product of the standard mean and median.
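On a totally ordered toy lattice (a chain of integers), the penalty-minimization construction can be sketched as follows; the product-lattice case applies the same idea component-wise. The input values, candidate set and penalty choices are illustrative assumptions.

```python
def penalty_minimizer(values, candidates, penalty):
    """Return the candidate minimizing the total penalty against the inputs."""
    return min(candidates, key=lambda y: sum(penalty(x, y) for x in values))

values = [1, 2, 2, 7]
candidates = range(10)
# squared-difference penalty recovers the arithmetic mean (here, 3)...
mean_like = penalty_minimizer(values, candidates, lambda x, y: (x - y) ** 2)
# ...while the absolute-difference penalty recovers the median (here, 2)
median_like = penalty_minimizer(values, candidates, lambda x, y: abs(x - y))
print(mean_like, median_like)
```

Whether minimizing component-wise equals minimizing over the whole product lattice is exactly the Cartesian-product question the abstract raises.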

Relevance: 30.00%

Abstract:

We investigate the problem of averaging values on lattices, and in particular on discrete product lattices. This problem arises in image processing when several color values, given in RGB, HSL, or another coding scheme, need to be combined. We show how the arithmetic mean and the median can be constructed by minimizing appropriate penalties, and we discuss which of them coincide with the Cartesian product of the standard mean and median. We apply these functions in image processing. We present three algorithms for color image reduction based on minimizing penalty functions on discrete product lattices.

Relevance: 30.00%

Abstract:

In the face of mass amounts of information and the need for transparent and fair decision processes, aggregation functions are essential for summarizing data and providing overall evaluations. Although families such as weighted means and medians have been well studied, there are still applications for which no existing aggregation functions can capture the decision makers' preferences. Furthermore, extensions of aggregation functions to lattices are often needed to model operations on L-fuzzy sets, interval-valued and intuitionistic fuzzy sets. In such cases, the aggregation properties need to be considered in light of the lattice structure, as otherwise counterintuitive or unreliable behavior may result. The Bonferroni mean has recently received attention in the fuzzy sets and decision making community as it is able to model useful notions such as mandatory requirements. Here, we consider its associated penalty function to extend the generalized Bonferroni mean to lattices. We show that different notions of dissimilarity on lattices can lead to alternative expressions.