953 results for Generalized Resolvent Operator


Relevance: 20.00%

Abstract:

We study the influence of a background uniform magnetic field and boundary conditions on the vacuum of a quantized charged spinor matter field confined between two parallel neutral plates; the magnetic field is directed orthogonally to the plates. The admissible set of boundary conditions at the plates is determined by the requirement that the Dirac Hamiltonian operator be self-adjoint. It is shown that, in the case of a sufficiently strong magnetic field and a sufficiently large separation of the plates, the generalized Casimir force is repulsive, being independent of the choice of a boundary condition, as well as of the distance between the plates. The detection of this effect seems to be feasible in the foreseeable future.

Relevance: 20.00%

Abstract:

BACKGROUND: Despite long-standing calls to disseminate evidence-based treatments for generalized anxiety disorder (GAD), modest progress has been made in the study of how such treatments should be implemented. The primary objective of this study was to test three competing strategies for implementing a cognitive behavioral treatment (CBT) for out-patients with GAD (i.e., a comparison of one compensation model vs. two capitalization models). METHODS: For our three-arm, single-blinded, randomized controlled trial (implementation of CBT for GAD [IMPLEMENT]), we recruited adults with GAD using advertisements in high-circulation newspapers to participate in a 14-session cognitive behavioral treatment (Mastery of your Anxiety and Worry, MAW packet). We randomly assigned eligible patients using a full randomization procedure (1:1:1) to three different conditions of implementation: adherence priming (compensation model), which had a systematized focus on patients' individual GAD symptoms and how to compensate for these symptoms within the MAW packet, and resource priming and supportive resource priming (capitalization models), which had systematized focuses on patients' strengths and abilities and how these strengths can be capitalized on within the same packet. In the intention-to-treat population, an outcome composite of primary and secondary symptom-related self-report questionnaires was analyzed based on a hierarchical linear growth model from intake to the 6-month follow-up assessment. This trial is registered at ClinicalTrials.gov (identifier: NCT02039193) and is closed to new participants. FINDINGS: From June 2012 to Nov. 2014, of the 411 participants screened, 57 eligible participants were recruited and randomly assigned to the three conditions. Forty-nine patients (86%) provided outcome data at post-assessment (14% dropout rate). All three conditions showed a highly significant reduction of symptoms over time.
However, compared with the adherence priming condition, both resource priming conditions showed faster symptom reduction. Observer ratings of a sub-sample of recorded videos (n = 100) showed that therapists in the resource priming conditions conducted more strength-oriented interventions than in the adherence priming condition. No patients died or attempted suicide. INTERPRETATION: To our knowledge, this is the first trial that focuses on capitalization and compensation models during the implementation of one prescriptive treatment packet for GAD. We have shown that GAD-related symptoms were reduced significantly faster in the resource priming conditions, although the limitations of our study included a well-educated population. If replicated, our results suggest that therapists who implement a mental health treatment for GAD might profit from a systematized focus on capitalization models. FUNDING: Swiss National Science Foundation (SNSF-Nr. PZ00P1_136937/1) awarded to CF. KEYWORDS: Cognitive behavioral therapy; Evidence-based treatment; Implementation strategies; Randomized controlled trial

Relevance: 20.00%

Abstract:

A ladder operator solution to the particle in a box problem of elementary quantum mechanics is presented, although the pedagogical use of this method for this problem is questioned.
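For context, one standard route to ladder operators for the infinite well is the SUSY-QM factorization sketched below; this is illustrative and is not assumed to coincide with the paper's construction. On $(0,\pi)$ with $\hbar = 2m = 1$, so that $E_n = n^2$:

```latex
% Illustrative SUSY-QM factorization for the infinite well on (0,\pi),
% with \hbar = 2m = 1 so that E_n = n^2 (not assumed to be the
% paper's specific ladder operators):
A = \frac{d}{dx} - \cot x, \qquad
A^{\dagger} = -\frac{d}{dx} - \cot x, \qquad
H - E_1 = A^{\dagger} A ,
```

Here $A\psi_1 = 0$ for the ground state $\psi_1(x) \propto \sin x$, since $\cos x - \cot x \,\sin x = 0$, and the excited states of $H$ are related through $A^{\dagger}$ to the spectrum of the partner Hamiltonian $AA^{\dagger}$.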

Relevance: 20.00%

Abstract:

Determining the profit-maximizing input-output bundle of a firm requires data on prices. This paper shows how endogenously determined shadow prices can be used in place of actual prices to obtain the optimal input-output bundle at which the firm's shadow profit is maximized. This approach amounts to an application of the Weak Axiom of Profit Maximization (WAPM) formulated by Varian (1984), based on shadow prices rather than actual prices. At these prices the shadow profit of a firm is zero. Thus, the maximum profit that could have been attained at some other input-output bundle is a measure of the inefficiency of the firm. Because the benchmark input-output bundle is always an observed bundle from the data, it can be determined without having to solve any elaborate programming problem. An empirical application to U.S. airlines data illustrates the proposed methodology.
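The zero-shadow-profit benchmarking described above can be sketched in a few lines; the toy data, the price normalization, and all function names below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of a shadow-price WAPM inefficiency measure.
# Shadow prices are assumed normalized so that the reference firm's
# own shadow profit is zero; data and names are illustrative.

def shadow_profit(prices, bundle):
    """Shadow profit of an input-output bundle at given shadow prices.

    prices = (output_prices, input_prices); bundle = (outputs, inputs).
    """
    p, w = prices
    y, x = bundle
    return (sum(pi * yi for pi, yi in zip(p, y))
            - sum(wi * xi for wi, xi in zip(w, x)))

def wapm_inefficiency(own_prices, own_bundle, observed_bundles):
    """Gap between the best shadow profit attainable at any observed
    bundle and the firm's own shadow profit (zero after normalization)."""
    best = max(shadow_profit(own_prices, b) for b in observed_bundles)
    return best - shadow_profit(own_prices, own_bundle)

# Toy data: two firms, one output, one input.
bundles = [([10.0], [5.0]), ([8.0], [6.0])]
prices = ([1.0], [2.0])  # shadow prices making firm 0's profit zero
print(wapm_inefficiency(prices, bundles[1], bundles))  # -> 4.0
```

Because the maximum is taken over the finite set of observed bundles, no linear programming step is needed, mirroring the point made in the abstract.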

Relevance: 20.00%

Abstract:

We propose a nonparametric model for global cost minimization as a framework for optimal allocation of a firm's output target across multiple locations, taking account of differences in input prices and technologies across locations. This should be useful for firms planning production sites within a country and for foreign direct investment decisions by multi-national firms. Two illustrative examples are included. The first example considers the production location decision of a manufacturing firm across a number of adjacent states of the US. In the other example, we consider the optimal allocation of US and Canadian automobile manufacturers across the two countries.

Relevance: 20.00%

Abstract:

With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods to provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a binomial model, analyzed pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variance of sensitivity, specificity and correlation. We also applied an inverse Wishart prior to check the sensitivity of the results. The third model was a multinomial model where the test results were modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance. Vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference using Gibbs sampling' implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies. We also applied these models to a previously published meta-analysis dataset on cervical cancer as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent among Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from Bayesian bivariate models are not as good as those obtained from frequentist estimation regardless of which prior distribution was used for the covariance matrix. 
The Bayesian multinomial model consistently underestimated the sensitivity and specificity regardless of the sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously, as well as the intercorrelation between the two; and (3) it can be directly applied to sparse data without ad hoc correction.
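As a rough sketch of the likelihood layer of a bivariate binomial model of this kind: in the full model the study-level logits receive a bivariate normal prior and are sampled with Gibbs (e.g. via BUGS); that machinery is omitted here, and all names are illustrative assumptions:

```python
import math

def binomial_loglik(tp, n_diseased, tn, n_nondiseased, logit_se, logit_sp):
    """Log-likelihood of one study's 2x2 counts: true positives ~
    Binomial(n_diseased, Se) and true negatives ~
    Binomial(n_nondiseased, Sp), with Se and Sp on the logit scale."""
    se = 1.0 / (1.0 + math.exp(-logit_se))
    sp = 1.0 / (1.0 + math.exp(-logit_sp))

    def logbinom(k, n, p):
        # log of C(n, k) * p^k * (1-p)^(n-k) via lgamma
        return (math.lgamma(n + 1) - math.lgamma(k + 1)
                - math.lgamma(n - k + 1)
                + k * math.log(p) + (n - k) * math.log(1.0 - p))

    return logbinom(tp, n_diseased, se) + logbinom(tn, n_nondiseased, sp)

# The likelihood peaks at the empirical logits, e.g. Se = 0.9, Sp = 0.8:
ll_at_mle = binomial_loglik(90, 100, 80, 100, math.log(9), math.log(4))
```

Modeling the counts directly on the binomial scale is what lets the approach handle sparse 2x2 tables without continuity corrections, as noted in the abstract.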

Relevance: 20.00%

Abstract:

Complex diseases, such as cancer, are caused by various genetic and environmental factors and their interactions. Joint analysis of these factors and their interactions would increase the power to detect risk factors but is statistically challenging. Bayesian generalized linear models using Student-t prior distributions on coefficients are a novel method for simultaneously analyzing genetic factors, environmental factors, and interactions. I performed simulation studies using three different disease models and demonstrated that the variable selection performance of Bayesian generalized linear models is comparable to that of Bayesian stochastic search variable selection, an improved method for variable selection compared to standard methods. I further evaluated the variable selection performance of Bayesian generalized linear models using different numbers of candidate covariates and different sample sizes, and provided a guideline for the sample size required to achieve high power of variable selection using Bayesian generalized linear models, considering different scales of the number of candidate covariates. Polymorphisms in folate metabolism genes and nutritional factors have been previously associated with lung cancer risk. In this study, I simultaneously analyzed 115 tag SNPs in folate metabolism genes, 14 nutritional factors, and all possible genetic-nutritional interactions from 1239 lung cancer cases and 1692 controls using Bayesian generalized linear models stratified by never, former, and current smoking status. SNPs in MTRR were significantly associated with lung cancer risk across never, former, and current smokers. In never smokers, three SNPs in TYMS and three gene-nutrient interactions, including an interaction between SHMT1 and vitamin B12, an interaction between MTRR and total fat intake, and an interaction between MTR and alcohol use, were also identified as associated with lung cancer risk. These lung cancer risk factors are worthy of further investigation.

Relevance: 20.00%

Abstract:

Within the framework of a qualitative study, winery personnel in Luján and Maipú (Mendoza, Argentina) were interviewed to learn their opinion of the environmental pollution produced by these establishments. Four dimensions of pollution were studied: noise, odors, solid waste, and liquid waste. Eighteen wineries were selected by random sampling. In each one, a management-level employee and a plant worker were interviewed using a semi-structured form with open questions. The reduction of the qualitative information consisted of categorizing and grouping the responses, and analytic induction was used to generalize to the study area. It was concluded that the personnel perceive environmental pollution in the winery workplace as of medium magnitude. Liquid waste is considered to have a low external environmental impact, and no external impacts caused by the other dimensions are perceived. The degree of corporate awareness varies widely: in some companies there is behavior favorable to environmental care, but this cannot be generalized to all wineries. Workers are careless in their use of protection against noise and odors.

Relevance: 20.00%

Abstract:

The operator effect is a well-known methodological bias already quantified in some taphonomic studies. However, the replicability effect, i.e., the use of taphonomic attributes as a replicable scientific method, has not been taken into account until now. Here, we quantified this replicability bias for the first time using different multivariate statistical techniques, testing whether the operator effect is related to the replicability effect. We analyzed the results reported by 15 operators working on the same dataset. Each operator analyzed 30 biological remains (bivalve shells) from five different sites, considering the attributes fragmentation, edge rounding, corrasion, bioerosion, and secondary color. The operator effect followed the same pattern reported in previous studies, characterized by worse correspondence for those attributes having more than two levels of damage categories. However, the effect did not appear to be related to the replicability effect, because nearly all operators found differences among sites. Although the binary attribute bioerosion exhibited 83% correspondence among operators, it was also the taphonomic attribute that showed the highest dispersion among operators (28%). Therefore, we conclude that binary attributes, despite showing a reduction of the operator effect, diminish replicability, resulting in different interpretations of concordant data. We found that a variance of nearly 8% among operators was enough to generate a different taphonomic interpretation in a Q-mode cluster analysis. The results reported here show that the statistical method employed influences the level of replicability and comparability of a study, and that the availability of results may be a valid alternative to reduce bias.
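An attribute-level correspondence percentage of the kind discussed above can be computed with a sketch like the following; the scores and the all-operators-agree criterion are illustrative assumptions, not the study's data or exact metric:

```python
def percent_correspondence(scores):
    """Percent of specimens on which all operators assigned the same
    damage category for one attribute. `scores` is a list of rows,
    one row per operator, one column per specimen."""
    n_specimens = len(scores[0])
    agree = sum(1 for j in range(n_specimens)
                if len({row[j] for row in scores}) == 1)
    return 100.0 * agree / n_specimens

# Toy binary bioerosion scores from three hypothetical operators
# (1 = damage present, 0 = absent), six shells each:
ops = [
    [1, 0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 1],
]
correspondence = percent_correspondence(ops)
```

On this toy data the operators score four of the six shells identically, so the correspondence is about 66.7%.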

Relevance: 20.00%

Abstract:

We show the existence of free dense subgroups, generated by two elements, in the holomorphic shear and overshear group of complex-Euclidean space and extend this result to the group of holomorphic automorphisms of Stein manifolds with the density property, provided there exists a generalized translation. The conjugation operator associated to this generalized translation is hypercyclic on the topological space of holomorphic automorphisms.

Relevance: 20.00%

Abstract:

The aim of this work is to solve a question raised for average sampling in shift-invariant spaces by using the well-known matrix pencil theory. In many common situations in sampling theory, the available data are samples of some convolution operator acting on the function itself: this leads to the problem of average sampling, also known as generalized sampling. In this paper we deal with the existence of a sampling formula involving these samples and having reconstruction functions with compact support. Thus, low computational complexity is involved and truncation errors are avoided. In practice, it is accomplished by means of a FIR filter bank. An answer is given in the light of the generalized sampling theory by using the oversampling technique: more samples than strictly necessary are used. The original problem reduces to finding a polynomial left inverse of a polynomial matrix intimately related to the sampling problem which, for a suitable choice of the sampling period, becomes a matrix pencil. This matrix pencil approach allows us to obtain a practical method for computing the compactly supported reconstruction functions for the important case where the oversampling rate is minimum. Moreover, the optimality of the obtained solution is established.
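For orientation, an average (generalized) sampling expansion of the kind discussed typically has the following shape; the notation is illustrative and not taken from the paper. With $s$ convolution channels $\mathcal{L}_j f = f * h_j$ and sampling period $r$:

```latex
% Illustrative generalized sampling formula in a shift-invariant
% space V_\varphi (notation assumed, not the paper's):
f(t) \;=\; \sum_{j=1}^{s} \sum_{n \in \mathbb{Z}}
  \bigl(\mathcal{L}_j f\bigr)(rn)\, S_j(t - rn),
\qquad f \in V_{\varphi},
```

Oversampling means $s > r$, and compact support of the reconstruction functions $S_j$ is what allows the expansion to be implemented as a FIR filter bank with no truncation error.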

Relevance: 20.00%

Abstract:

We present a methodology for reducing a straight line fitting regression problem to a Least Squares minimization one. This is accomplished through the definition of a measure on the data space that takes into account directional dependences of errors, and the use of polar descriptors for straight lines. This strategy improves the robustness by avoiding singularities and non-describable lines. The methodology is powerful enough to deal with non-normal bivariate heteroscedastic data error models, but can also supersede classical regression methods by making some particular assumptions. An implementation of the methodology for the normal bivariate case is developed and evaluated.
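A minimal sketch of line fitting with polar (Hesse normal form) descriptors, which avoids the vertical-line singularity of slope-intercept regression: this uses plain perpendicular least squares and omits the paper's directionally dependent error measure; all names are illustrative:

```python
import math

def fit_line_polar(points):
    """Fit x*cos(theta) + y*sin(theta) = rho by minimizing squared
    perpendicular distances. The polar descriptors (theta, rho) can
    represent any line, including vertical ones."""
    n = len(points)
    xbar = sum(p[0] for p in points) / n
    ybar = sum(p[1] for p in points) / n
    sxx = sum((p[0] - xbar) ** 2 for p in points)
    syy = sum((p[1] - ybar) ** 2 for p in points)
    sxy = sum((p[0] - xbar) * (p[1] - ybar) for p in points)
    # Minimize sxx*cos^2 + 2*sxy*sin*cos + syy*sin^2 over theta;
    # the closed form below picks the minimizing branch.
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy) + math.pi / 2.0
    rho = xbar * math.cos(theta) + ybar * math.sin(theta)
    return theta, rho

# A vertical line x = 3, impossible for y = a*x + b regression:
theta, rho = fit_line_polar([(3, 0), (3, 1), (3, 2)])
```

All three points satisfy the fitted equation to machine precision, which is exactly the kind of "non-describable line" a slope-intercept parameterization cannot handle.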

Relevance: 20.00%

Abstract:

Networks of Evolutionary Processors (NEPs) are computing mechanisms directly inspired by the behavior of cell populations, more specifically by point mutations in DNA strands. These mechanisms have been used for solving NP-complete problems under a postulate of parallel computation. This paper describes an implementation of the basic NEP model using Web technologies, and it also supports designing some of the most common variants of the model through a web-page interface that eases the configuration of a given problem. The system is intended to be used on a multicore processor in order to benefit from multithreading.
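A toy sketch of one step of a basic NEP (evolution by single-symbol substitution, then communication through input filters); the rule format, filters, and two-node layout below are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical two-node NEP step: each node mutates its words, then
# words travel along edges whose target filter accepts them.

def mutate(words, rules):
    """Apply every single-symbol substitution rule at every position
    (point mutations); the original words are kept as well."""
    out = set(words)
    for w in words:
        for a, b in rules:
            for i, c in enumerate(w):
                if c == a:
                    out.add(w[:i] + b + w[i + 1:])
    return out

def communicate(nodes, edges, in_filters):
    """Copy each word along every edge whose target filter accepts it."""
    new = {n: set(ws) for n, ws in nodes.items()}
    for src, dst in edges:
        for w in nodes[src]:
            if in_filters[dst](w):
                new[dst].add(w)
    return new

nodes = {"A": {"aa"}, "B": set()}
rules = {"A": [("a", "b")], "B": []}
edges = [("A", "B")]
in_filters = {"A": lambda w: True, "B": lambda w: "b" in w}

# One NEP step: evolution (mutation) then communication.
nodes = {n: mutate(ws, rules[n]) for n, ws in nodes.items()}
nodes = communicate(nodes, edges, in_filters)
```

After one step, node A holds {"aa", "ab", "ba"} and node B has received the two mutated words containing "b"; in a real implementation each node would run in its own thread, which is the multicore motivation mentioned above.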