838 results for Invariant Measure


Relevance:

30.00%

Publisher:

Abstract:

The ubiquity of time series data across almost all human endeavors has produced great interest in time series data mining over the last decade. While dozens of classification algorithms have been applied to time series, recent empirical evidence strongly suggests that simple nearest neighbor classification is exceptionally difficult to beat. The choice of distance measure used by the nearest neighbor algorithm is important and depends on the invariances required by the domain. For example, motion capture data typically requires invariance to warping, and cardiology data requires invariance to the baseline (the mean value). Similarly, recent work suggests that for time series clustering, the choice of clustering algorithm is much less important than the choice of distance measure. In this work we make a somewhat surprising claim: there is an invariance that the community seems to have missed, complexity invariance. Intuitively, the problem is that in many domains the different classes may have different complexities, and pairs of complex objects, even those which subjectively may seem very similar to the human eye, tend to be further apart under current distance measures than pairs of simple objects. This introduces errors in nearest neighbor classification, where some complex objects may be incorrectly assigned to a simpler class. Similarly, in clustering this effect can introduce errors by "suggesting" to the clustering algorithm that subjectively similar but complex objects belong in a sparser, larger-diameter cluster than is truly warranted. We introduce the first complexity-invariant distance measure for time series and show that it generally produces significant improvements in classification and clustering accuracy. We further show that this improvement does not compromise efficiency, since we can lower bound the measure and use a modification of the triangular inequality, thus making use of most existing indexing and data mining algorithms. We evaluate our ideas with the largest and most comprehensive set of time series mining experiments ever attempted in a single work, and show that complexity-invariant distance measures can produce improvements in classification and clustering in the vast majority of cases.
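To make the idea concrete, here is a minimal sketch of a complexity-corrected Euclidean distance in Python. The particular complexity estimate (root of the sum of squared successive differences) and the max/min correction factor are assumptions chosen to match the intuition in the abstract, not a verbatim reproduction of the paper's definition.

```python
import numpy as np

def complexity_estimate(ts):
    """Rough complexity proxy: the 'stretched length' of the series,
    measured as the root of the sum of squared successive differences."""
    return np.sqrt(np.sum(np.diff(ts) ** 2))

def complexity_invariant_distance(q, c):
    """Euclidean distance rescaled by a complexity-correction factor (>= 1),
    so that pairs mixing simple and complex series are pushed further apart."""
    ed = np.linalg.norm(q - c)                          # plain Euclidean distance
    ce_q, ce_c = complexity_estimate(q), complexity_estimate(c)
    cf = max(ce_q, ce_c) / max(min(ce_q, ce_c), 1e-12)  # complexity-correction factor
    return ed * cf

# toy usage: a smooth sine versus a noisy sine of the same mean and amplitude
t = np.linspace(0, 2 * np.pi, 200)
smooth = np.sin(t)
noisy = np.sin(t) + 0.3 * np.random.default_rng(0).standard_normal(200)
print(complexity_invariant_distance(smooth, noisy))
```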

Relevance:

30.00%

Publisher:

Abstract:

We present theory and experiments on the dynamics of reaction fronts in two-dimensional, vortex-dominated flows, for both time-independent and periodically driven cases. We find that the front propagation process is controlled by one-sided barriers that are either fixed in the laboratory frame (time-independent flows) or oscillate periodically (periodically driven flows). We call these barriers burning invariant manifolds (BIMs), since their role in front propagation is analogous to that of invariant manifolds in the transport and mixing of passive impurities under advection. Theoretically, the BIMs emerge from a dynamical systems approach when the advection-reaction-diffusion dynamics is recast as an ODE for front element dynamics. Experimentally, we measure the location of BIMs for several laboratory flows and confirm their role as barriers to front propagation.

Relevance:

30.00%

Publisher:

Abstract:

Background: At present, prostate cancer screening (PCS) guidelines require a discussion of risks, benefits, alternatives, and personal values, making decision aids an important tool to help convey information and to help clarify values. Objective: The overall goal of this study is to provide evidence of the reliability and validity of a PCS anxiety measure and the Decisional Conflict Scale (DCS). Methods: Using data from a randomized, controlled PCS decision aid trial that measured PCS anxiety at baseline and the DCS at baseline (T0) and at two weeks (T2), four psychometric properties were assessed: (1) internal consistency reliability, indicated by factor analysis, intraclass correlations, and Cronbach's α; (2) construct validity, indicated by patterns of Pearson correlations among subscales; (3) discriminant validity, indicated by the measure's ability to discriminate between undecided men and those with a definite screening intention; and (4) factor validity and invariance, using confirmatory factor analyses (CFA). Results: The PCS anxiety measure had adequate internal consistency reliability and good construct and discriminant validity. CFAs indicated that the 3-factor model did not have adequate fit. CFAs for a general PCS anxiety measure and a PSA anxiety measure indicated adequate fit. The general PCS anxiety measure was invariant across clinics. The DCS had adequate internal consistency reliability, except for the support subscale, and had adequate discriminant validity. Good construct validity was found at the private clinic, but was found only for the feeling-informed subscale at the public clinic. The traditional DCS did not have adequate fit at T0 or at T2. The alternative DCS had adequate fit at T0 but was not identified at T2. Factor loadings indicated that two subscales, feeling informed and feeling clear about values, were not distinct factors. Conclusions: Our general PCS anxiety measure can be used in PCS decision aid studies. The alternative DCS may be appropriate for men eligible for PCS. Implications: More emphasis needs to be placed on the development of PCS anxiety items relating to testing procedures. We recommend that the two DCS versions be validated in other samples of men eligible for PCS and in other health care decisions that involve uncertainty.
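For readers unfamiliar with the reliability index used above, the sketch below computes Cronbach's α from a respondents-by-items score matrix. The data are synthetic and the function is the generic textbook formula, not code or data from the study.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# synthetic example: 100 respondents answering 5 correlated Likert-type items
rng = np.random.default_rng(1)
trait = rng.normal(size=(100, 1))
responses = trait + 0.8 * rng.normal(size=(100, 5))
print(round(cronbach_alpha(responses), 3))
```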

Relevance:

30.00%

Publisher:

Abstract:

Quantum groups have been studied intensively for the last two decades from various points of view. The underlying mathematical structure is that of an algebra with a coproduct. Compact quantum groups admit Haar measures. However, if we want to have a Haar measure also in the noncompact case, we are forced to work with algebras without identity, and the notion of a coproduct has to be adapted. These considerations lead to the theory of multiplier Hopf algebras, which provides the mathematical tool for studying noncompact quantum groups with Haar measures. I will concentrate on the *-algebra case and assume positivity of the invariant integral. Doing so, I create an algebraic framework that serves as a model for the operator algebra approach to quantum groups. Indeed, the theory of locally compact quantum groups can be seen as the topological version of the theory of quantum groups as they are developed here in a purely algebraic context.

Relevance:

30.00%

Publisher:

Abstract:

Although it may seem reasonable to assume that American education continues to become more effective at sending high school students to college, a 2009 study by the Council of the Great City Schools states that "slightly more than half of entering ninth grade students arrive performing below grade level in reading and math, while one in five entering ninth grade students is more than two years behind grade level...[and] 25% received support in the form of remedial literacy instruction or interventions" (Council of the Great City Schools, 2009). Students are distracted by technology (Lei & Zhao, 2005), family (Xu & Corno, 2003), medical illnesses (Nielson, 2009), learning disabilities, and, perhaps most detrimental to academic success, a lack of interest in school itself (Ruch, 1963). In a Johns Hopkins research study, Building a Graduation Nation - Colorado (Balfanz, 2008), warning signs were apparent years before students dropped out of high school, and the ninth grade was often referenced as a critical point indicating success or failure to graduate. The research conducted by Johns Hopkins illustrates the problem: students who become disengaged from school have a much greater chance of dropping out of high school and not graduating. The first purpose of this study was to compare different measurement models of the Student School Engagement (SSE) instrument using factor analysis to verify model fit with student engagement. The second purpose was to determine the extent to which the SSE instrument measures student school engagement by investigating convergent validity (via the SSE and Appleton, Christenson, Kim, and Reschly's instrument and Fredricks, Blumenfeld, Friedel, and Paris's instrument), discriminant validity (via Huebner's Student Life Satisfaction Scale, SLSS), and criterion-related validity (via the sub-latent variables of Aspirations, Belonging, and Productivity and student outcome measures such as achievement, attendance, and discipline). Convergent validity was established between the SSE and Appleton, Christenson, Kim, and Reschly's model and Fredricks, Blumenfeld, Friedel, and Paris's (2005) Student Engagement Instrument (SEI); discriminant validity was established with the SLSS, with which the SSE's correlations were weak and not statistically significant. Criterion-related validity was established through structural equation modeling, in which the SSE was found to be a significant predictor of student outcome measures when both risk scores and CSAP scores were used. The third purpose of this study was to assess the factorial invariance of the SSE instrument across gender, to ensure that the instrument measures the intended construct across different groups. Configural, weak (metric), and stricter levels of invariance were established for the SSE, with a non-significant change in chi-square indicating that all parameters, including the error variances, were invariant across gender groups. Engagement is not a clearly defined psychological construct; it requires more research in order to fully comprehend its complexity. Hopefully, with parental and teacher involvement and a sense of community, student engagement can be nurtured into a meaningful attachment to school and academic success.
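The invariance conclusion rests on chi-square difference tests between nested CFA models (e.g., a configural model versus one with loadings constrained equal across gender). A minimal, hypothetical illustration of that test is sketched below; the chi-square and degrees-of-freedom values are placeholders, not results from this study.

```python
from scipy.stats import chi2

def chi_square_difference_test(chi2_constrained, df_constrained,
                               chi2_configural, df_configural):
    """p-value for the change in chi-square between nested CFA models.

    A non-significant result (p > .05) indicates that the added equality
    constraints do not worsen fit, so the invariance hypothesis is retained."""
    delta_chi2 = chi2_constrained - chi2_configural
    delta_df = df_constrained - df_configural
    return chi2.sf(delta_chi2, delta_df)

# hypothetical fit statistics for a configural and a constrained model
print(chi_square_difference_test(chi2_constrained=215.4, df_constrained=172,
                                 chi2_configural=204.9, df_configural=164))
```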

Relevance:

30.00%

Publisher:

Abstract:

We show that quantum information can be encoded into entangled states of multiple indistinguishable particles in such a way that any inertial observer can prepare, manipulate, or measure the encoded state independent of their Lorentz reference frame. Such relativistically invariant quantum information is free of the difficulties associated with encoding into spin or other degrees of freedom in a relativistic context.

Relevance:

30.00%

Publisher:

Abstract:

Neural network learning rules can be viewed as statistical estimators. They should be studied in a Bayesian framework even if they are not Bayesian estimators. Generalisation should be measured by the divergence between the true distribution and the estimated distribution. Information divergences are invariant measures of the divergence between two distributions. The posterior average information divergence is used to measure the generalisation ability of a network. The optimal estimators for multinomial distributions with Dirichlet priors are studied in detail. This confirms that the definition is compatible with intuition. The results also show that many commonly used methods can be put under this unified framework by assuming particular priors and divergences.
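A small sketch of the setting described above: estimating a multinomial distribution under a Dirichlet prior and scoring the estimate by its KL divergence from the true distribution. The particular prior strength and true distribution below are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def dirichlet_posterior_mean(counts, alpha):
    """Posterior-mean estimator of a multinomial under a symmetric Dirichlet(alpha) prior."""
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * counts.size)

rng = np.random.default_rng(0)
true_p = np.array([0.5, 0.3, 0.15, 0.05])
counts = rng.multinomial(200, true_p)            # a modest sample of 200 draws

mle = counts / counts.sum()                      # maximum-likelihood estimate
bayes = dirichlet_posterior_mean(counts, 1.0)    # uniform (Laplace) prior

print("KL(true || MLE)   =", kl_divergence(true_p, mle))
print("KL(true || Bayes) =", kl_divergence(true_p, bayes))
```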

Relevance:

20.00%

Publisher:

Abstract:

Size distributions in woody plant populations have been used to assess their regeneration status, assuming that size structures with reverse-J shapes represent stable populations. We present an empirical approach to this issue using five woody species from the Cerrado. Considering count data for all plants of these five species over a 12-year period, we analyzed size distribution by a) plotting frequency distributions and their adjustment to the negative exponential curve and b) calculating the Gini coefficient. To look for a relationship between size structure and future trends, we considered the size structures from the first census year. We analyzed changes in numbers over time and performed a simple population viability analysis, which gives the mean population growth rate, its variance, and the probability of extinction in a given time period. Frequency distributions and the Gini coefficient were not able to predict future trends in population numbers. We recommend that managers not use measures of size structure as a basis for management decisions without applying more appropriate demographic studies.
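For reference, a minimal sketch of the Gini coefficient computed on a vector of plant sizes (e.g., stem diameters) is given below; the data are synthetic, not the census data of the study.

```python
import numpy as np

def gini_coefficient(sizes):
    """Gini coefficient of size inequality (0 = all plants equal in size)."""
    x = np.sort(np.asarray(sizes, dtype=float))
    n = x.size
    # standard rank-based formula on the sorted values
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

# synthetic stand: many small plants and a few large ones (reverse-J-like sizes)
rng = np.random.default_rng(42)
diameters = rng.exponential(scale=5.0, size=500)
print(round(gini_coefficient(diameters), 3))
```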

Relevance:

20.00%

Publisher:

Abstract:

Currently, the acoustic and nanoindentation techniques are two of the most widely used techniques for measuring the elastic modulus of materials. In this article, the fundamental principles and limitations of both techniques are presented and discussed, and recent advances in the nanoindentation technique are reviewed. An experimental study of ceramic, metallic, and composite materials and of single crystals was also carried out. The results show that the ultrasonic technique provides values in agreement with those reported in the literature. However, the ultrasonic technique does not allow measurement of the elastic modulus of some small samples and single crystals. On the other hand, the nanoindentation technique estimates elastic modulus values in reasonable agreement with those measured by acoustic methods, particularly in amorphous materials, while in some polycrystalline materials some deviation from the expected values was observed.
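As an illustration of the acoustic method, the sketch below recovers Young's modulus and Poisson's ratio from longitudinal and shear wave velocities in an isotropic solid using the standard elastic-wave relations; the velocity and density values are typical textbook numbers for a steel-like sample, not data from this study.

```python
def elastic_constants_from_ultrasound(v_long, v_shear, density):
    """Young's modulus (Pa) and Poisson's ratio of an isotropic solid.

    v_long, v_shear: longitudinal and shear wave velocities in m/s;
    density in kg/m^3. Standard relations for a homogeneous, isotropic solid."""
    vl2, vt2 = v_long ** 2, v_shear ** 2
    poisson = (vl2 - 2.0 * vt2) / (2.0 * (vl2 - vt2))
    young = density * vt2 * (3.0 * vl2 - 4.0 * vt2) / (vl2 - vt2)
    return young, poisson

# typical values for a steel-like sample (illustrative only)
E, nu = elastic_constants_from_ultrasound(v_long=5900.0, v_shear=3200.0,
                                          density=7850.0)
print(f"E = {E / 1e9:.0f} GPa, Poisson's ratio = {nu:.2f}")
```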

Relevance:

20.00%

Publisher:

Abstract:

Background: Microarray techniques have become an important tool for the investigation of genetic relationships and the assignment of different phenotypes. Since microarrays are still very expensive, most experiments are performed with small samples. This paper introduces a method to quantify dependency between data series composed of few sample points. The method is used to construct gene co-expression subnetworks of highly significant edges. Results: The results shown here are for an adapted subset of a Saccharomyces cerevisiae gene expression data set with low temporal resolution and poor statistics. The method reveals common transcription factors with a high confidence level and allows the construction of subnetworks of high biological relevance that reveal characteristic features of the processes driving the organism's adaptations to specific environmental conditions. Conclusion: Our method allows a reliable and sophisticated analysis of microarray data even under severe constraints. The use of systems biology improves biologists' ability to elucidate the mechanisms underlying cellular processes and to formulate new hypotheses.

Relevance:

20.00%

Publisher:

Abstract:

Background Data: Photodynamic therapy (PDT) involves the photoinduction of cytotoxicity using a photosensitizer agent, a light source of the proper wavelength, and the presence of molecular oxygen. A model for tissue response to PDT based on the photodynamic threshold dose (Dth) has been widely used: cells exposed to doses below Dth survive, while at doses above Dth necrosis takes place. Objective: This study evaluated light Dth values using two different methods of determination, one based on the depth of necrosis and the other on the width of superficial necrosis. Materials and Methods: Using normal rat liver, we investigated the depth and width of necrosis induced by PDT when a laser with a Gaussian intensity profile is used. Different light doses, photosensitizers (Photogem, Photofrin, Photosan, Foscan, Photodithazine, and Radachlorin), and concentrations were employed. Each experiment was performed on five animals, and the averages and standard deviations were calculated. Results: A simple model analysis of the depth and width of necrosis allows the threshold dose to be determined from either depth or surface measurements. Comparison shows that both measurements provide the same value within experimental error. Conclusion: This work demonstrates that, by knowing the extent of the superficial necrotic area of a target tissue irradiated by a Gaussian light beam, it is possible to estimate the threshold dose. This technique may find application where Dth must be determined without cutting the tissue.
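A minimal sketch of how the surface (width) measurement can yield a threshold dose for a Gaussian beam: at the edge of the superficial necrotic area the local surface dose has fallen exactly to Dth, so Dth follows from the central dose and the measured necrosis half-width. The Gaussian surface-dose assumption and the numbers below are illustrative, not the article's protocol or data.

```python
import numpy as np

def threshold_dose_from_width(central_dose, necrosis_radius, beam_waist):
    """Threshold dose from the measured half-width of superficial necrosis.

    Assumes a Gaussian surface fluence D(r) = D0 * exp(-2 r^2 / w^2); at the
    boundary of the necrotic area the local dose equals Dth.
    central_dose in J/cm^2, necrosis_radius and beam_waist (1/e^2 radius) in cm."""
    return central_dose * np.exp(-2.0 * necrosis_radius ** 2 / beam_waist ** 2)

# illustrative numbers only
print(threshold_dose_from_width(central_dose=150.0,    # J/cm^2 at the beam centre
                                necrosis_radius=0.25,  # cm, measured half-width
                                beam_waist=0.30))      # cm, 1/e^2 beam radius
```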

Relevance:

20.00%

Publisher:

Abstract:

We present a measurement of π⁺π⁻π⁺π⁻ photonuclear production in ultraperipheral Au-Au collisions at √s_NN = 200 GeV from the STAR experiment. The π⁺π⁻π⁺π⁻ final states are observed at low transverse momentum and are accompanied by mutual nuclear excitation of the beam particles. The strong enhancement of the production cross section at low transverse momentum is consistent with coherent photoproduction. The π⁺π⁻π⁺π⁻ invariant mass spectrum of the coherent events exhibits a broad peak around 1540 ± 40 MeV/c² with a width of 570 ± 60 MeV/c², in agreement with the photoproduction data for the ρ⁰(1700). We do not observe a corresponding peak in the π⁺π⁻ final state and measure an upper limit for the ratio of the branching fractions of the ρ⁰(1700) to π⁺π⁻ and π⁺π⁻π⁺π⁻ of 2.5% at the 90% confidence level. The ratio of the ρ⁰(1700) and ρ⁰(770) coherent production cross sections is measured to be 13.4 ± 0.8(stat.) ± 4.4(syst.)%.
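For context, the invariant mass spectrum referred to above is built from the summed four-momenta of the four charged-pion tracks; a generic sketch of that computation is shown below with made-up track momenta (not STAR data).

```python
import numpy as np

PION_MASS = 0.13957  # GeV/c^2, charged pion

def invariant_mass(momenta, mass=PION_MASS):
    """Invariant mass (GeV/c^2) of a set of particles from their 3-momenta.

    momenta: iterable of (px, py, pz) in GeV/c; every particle is assigned
    the same rest mass (here the charged-pion mass)."""
    p = np.asarray(momenta, dtype=float)
    energies = np.sqrt(np.sum(p ** 2, axis=1) + mass ** 2)
    e_tot = energies.sum()
    p_tot = p.sum(axis=0)
    return np.sqrt(e_tot ** 2 - np.dot(p_tot, p_tot))

# made-up low-pT four-pion candidate (GeV/c), for illustration only
tracks = [( 0.35,  0.10,  0.30),
          (-0.30, -0.05,  0.25),
          ( 0.20, -0.15, -0.20),
          (-0.22,  0.12, -0.15)]
print(round(invariant_mass(tracks), 3))  # roughly 1.5-1.6 GeV/c^2 for these numbers
```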

Relevance:

20.00%

Publisher:

Abstract:

The generation of a large recoil velocity from the inspiral and merger of binary black holes represents one of the most exciting results of numerical-relativity calculations. While many aspects of this process have been investigated and explained, the "antikick," namely, the sudden deceleration after the merger, has not yet found a simple explanation. We show that the antikick can be understood in terms of the radiation from a deformed black hole where the anisotropic curvature distribution on the horizon correlates with the direction and intensity of the recoil. Our analysis is focused on Robinson-Trautman spacetimes and allows us to measure both the energies and momenta radiated in a gauge-invariant manner. At the same time, this simpler setup provides the qualitative and quantitative features of merging black holes, opening the way to a deeper understanding of the nonlinear dynamics of black-hole spacetimes.

Relevance:

20.00%

Publisher:

Abstract:

It is by now well known that the Poincare group acts on the Moyal plane with a twisted coproduct. Poincare invariant classical field theories can be formulated for this twisted coproduct. In this paper we systematically study such a twisted Poincare action in quantum theories on the Moyal plane. We develop quantum field theories invariant under the twisted action from the representations of the Poincare group, ensuring also the invariance of the S-matrix under the twisted action of the group. A significant new contribution here is the construction of the Poincare generators using quantum fields.
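To make the notion concrete, the schematic Drinfeld-twist form of the deformed coproduct is recalled below; sign and normalization conventions for the noncommutativity parameter θ vary between papers, so this should be read as a reminder of the general structure rather than a formula quoted from this work.

```latex
% Moyal plane: [\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu} with constant, antisymmetric \theta.
% Twist element built from the translation generators P_\mu:
\[
  \mathcal{F}_{\theta} = \exp\!\Big(\tfrac{i}{2}\,\theta^{\mu\nu} P_{\mu}\otimes P_{\nu}\Big),
\]
% and the twisted coproduct of a Poincare generator Y:
\[
  \Delta_{\theta}(Y) = \mathcal{F}_{\theta}\,\Delta_{0}(Y)\,\mathcal{F}_{\theta}^{-1},
  \qquad
  \Delta_{0}(Y) = Y\otimes \mathbf{1} + \mathbf{1}\otimes Y .
\]
```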

Relevance:

20.00%

Publisher:

Abstract:

We propose an alternative fidelity measure (namely, a measure of the degree of similarity) between quantum states and benchmark it against a number of properties of the standard Uhlmann-Jozsa fidelity. This measure is a simple function of the linear entropy and the Hilbert-Schmidt inner product between the given states and is thus, in comparison, not as computationally demanding. It also features several remarkable properties such as being jointly concave and satisfying all of Jozsa's axioms. The trade-off, however, is that it is supermultiplicative and does not behave monotonically under quantum operations. In addition, metrics for the space of density matrices are identified and the joint concavity of the Uhlmann-Jozsa fidelity for qubit states is established.
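A small numerical sketch comparing the standard Uhlmann-Jozsa fidelity with a fidelity-like quantity built only from the Hilbert-Schmidt inner product Tr(ρσ) and the linear entropies 1 − Tr ρ², as described in the abstract. The specific combination used in `alternative_fidelity` is an assumption made for illustration, not necessarily the exact definition proposed in the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def uhlmann_jozsa_fidelity(rho, sigma):
    """Standard fidelity: (Tr sqrt( sqrt(rho) sigma sqrt(rho) ))^2."""
    s = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2)

def alternative_fidelity(rho, sigma):
    """Fidelity-like measure from Tr(rho sigma) and the linear entropies.

    Assumed form: Tr(rho sigma) + sqrt(1 - Tr rho^2) * sqrt(1 - Tr sigma^2);
    it needs no matrix square roots, so it is cheap to evaluate."""
    overlap = float(np.real(np.trace(rho @ sigma)))
    gap_rho = max(1.0 - float(np.real(np.trace(rho @ rho))), 0.0)
    gap_sigma = max(1.0 - float(np.real(np.trace(sigma @ sigma))), 0.0)
    return overlap + np.sqrt(gap_rho * gap_sigma)

# two single-qubit mixed states, for illustration only
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
sigma = np.array([[0.5, 0.1j], [-0.1j, 0.5]], dtype=complex)
print(uhlmann_jozsa_fidelity(rho, sigma), alternative_fidelity(rho, sigma))
```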