Abstract:
Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that, unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the model validity. Therefore, there is arguably a single approach regardless of the factors being traded or not, or the use of excess or gross returns. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
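As a concrete point of reference for the single-step estimation discussed above, the sketch below sets up a continuously updated GMM (CU-GMM) criterion for a linear SDF model priced on excess returns. It is a minimal sketch under assumed conventions (a centred SDF m_t = 1 - b'(f_t - f̄) and moment conditions E[m_t r_t] = 0); the simulated data and variable names are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T, N, K = 500, 6, 2                        # periods, excess returns, factors
F = rng.normal(size=(T, K))                # illustrative factor draws
R = F @ rng.normal(size=(K, N)) * 0.5 + rng.normal(scale=0.1, size=(T, N))

def cu_gmm_objective(b, R, F):
    """CU-GMM criterion for a centred linear SDF m_t = 1 - b'(f_t - fbar)."""
    m = 1.0 - (F - F.mean(axis=0)) @ b     # SDF realisations (length T)
    g = R * m[:, None]                     # moment contributions m_t * r_t (T x N)
    gbar = g.mean(axis=0)                  # sample pricing errors
    W = np.cov(g, rowvar=False)            # weighting matrix re-evaluated at b
    return len(R) * gbar @ np.linalg.solve(W, gbar)

res = minimize(cu_gmm_objective, x0=np.zeros(K), args=(R, F), method="BFGS")
print("estimated prices of risk b:", res.x)
print("overidentifying restrictions (J) statistic:", res.fun)
```

Minimising this criterion delivers the prices of risk and the J statistic in a single step, which is the setting in which the abstract's numerical-equivalence results apply.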
Abstract:
In Duchenne muscular dystrophy (DMD), a persistently altered and reorganizing extracellular matrix (ECM) within inflamed muscle promotes damage and dysfunction. However, the molecular determinants of the ECM that mediate inflammatory changes and faulty tissue reorganization remain poorly defined. Here, we show that fibrin deposition is a conspicuous consequence of muscle-vascular damage in dystrophic muscles of DMD patients and mdx mice and that elimination of fibrin(ogen) attenuated dystrophy progression in mdx mice. These benefits appear to be tied to: (i) a decrease in leukocyte integrin αMβ2-mediated proinflammatory programs, thereby attenuating counterproductive inflammation and muscle degeneration; and (ii) a release of satellite cells from persistent inhibitory signals, thereby promoting regeneration. Remarkably, Fibγ390-396A mice, which express a mutant form of fibrinogen with normal clotting function but lacking the αMβ2 binding motif, showed ameliorated dystrophic pathology. Delivery of a fibrinogen/αMβ2 blocking peptide was similarly beneficial. Conversely, intramuscular fibrinogen delivery sufficed to induce inflammation and degeneration in fibrinogen-null mice. Thus, local fibrin(ogen) deposition drives dystrophic muscle inflammation and dysfunction, and disruption of fibrin(ogen)-αMβ2 interactions may provide a novel strategy for DMD treatment.
Abstract:
Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to the Cramer–Rao bounds.
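The paper's estimators follow from the Gaussian statistics of the scattering matrix and are not reproduced here. As a much simplified, hedged illustration of maximum-likelihood estimation in this setting, the sketch below recovers a specific differential phase from a simulated differential-phase range profile; under independent Gaussian noise, the ML estimate of the slope reduces to an ordinary least-squares fit. All values and variable names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated range profile of differential phase Phi_dp (degrees) along a rain path.
r_km = np.linspace(0.0, 10.0, 200)                 # range gates (km)
kdp_true = 1.2                                     # specific differential phase (deg/km), assumed
phi0 = 30.0                                        # system differential phase offset (deg), assumed
phi_dp = phi0 + 2.0 * kdp_true * r_km              # two-way accumulation: dPhi/dr = 2*Kdp
phi_dp += rng.normal(scale=2.0, size=r_km.size)    # Gaussian measurement noise

# With i.i.d. Gaussian errors, the ML estimate of (phi0, Kdp) is the least-squares fit.
A = np.column_stack([np.ones_like(r_km), 2.0 * r_km])
coef, *_ = np.linalg.lstsq(A, phi_dp, rcond=None)
phi0_hat, kdp_hat = coef
print(f"Kdp estimate: {kdp_hat:.3f} deg/km (true {kdp_true})")
```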
Abstract:
This issue opens the fourth volume of the Intangible Capital journal, which makes its way towards its fifth year of publication. As usual, we begin the new volume by evaluating the previous one and outlining new directions. Among the main contributions of 2007, we highlight the renewal of the journal's scientific indexing agreements, the migration of the platform to OJS, the appointment of a new editor, the new composition of the editorial board and of the board of reviewers, the change to a bilingual model, the new funding obtained and, last but not least, the work undertaken together with many editors of open-access scientific journals in Spain to obtain recognition from the CNEAI (National Commission for the Evaluation of Research Activity) as a proof of scientific excellence.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
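For orientation, the sketch below shows the basic maximum-likelihood expectation-maximization (MLEM) update underlying MLE-type emission reconstruction, started from a uniform initial image as the abstract recommends. It is a toy sketch: the system matrix and dimensions are made up, and the FMAPE algorithm's entropy prior, contrast parameter and acceleration exponent are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy emission-tomography setup: A maps image pixels to detector bins.
n_pixels, n_bins = 64, 96
A = rng.random((n_bins, n_pixels))           # illustrative system matrix
x_true = rng.random(n_pixels) * 10.0         # illustrative activity image
y = rng.poisson(A @ x_true)                  # Poisson-distributed projection data

# MLEM: with no prior knowledge, start from a uniform field (per the abstract).
x = np.ones(n_pixels)
sensitivity = A.T @ np.ones(n_bins)          # A^T 1, normalisation term
for _ in range(50):
    expected = A @ x                         # forward projection of current estimate
    ratio = y / np.maximum(expected, 1e-12)  # measured / expected counts
    x *= (A.T @ ratio) / sensitivity         # multiplicative MLEM update

print("relative error after 50 iterations:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```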
Abstract:
A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to the previous ones: the effects of the selection of the sample, the observational errors, the galactic rotation and the interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors is based on the extensive use of numerical methods, so avoiding the need for simplification of the equations and thus the bias they could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
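The full method fits the luminosity calibration, kinematics and spatial distribution simultaneously and is not reproduced here. As a loosely related, hedged illustration of identifying and characterizing physically distinct groups in an inhomogeneous sample by maximum likelihood, the sketch below fits a two-component Gaussian mixture (via EM) to made-up two-dimensional data; all numbers and names are placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Two illustrative "populations" mixed in one sample, e.g. differing in
# absolute magnitude M and one velocity component U (values are made up).
group_a = rng.normal(loc=[1.0, -10.0], scale=[0.3, 15.0], size=(300, 2))
group_b = rng.normal(loc=[3.5, -40.0], scale=[0.5, 25.0], size=(150, 2))
sample = np.vstack([group_a, group_b])

# Maximum-likelihood fit of a two-component Gaussian mixture: the fit both
# classifies the objects and characterises each group's parameters.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(sample)

print("members per component:", np.bincount(labels))
print("component means:\n", gmm.means_)
print("component weights:", gmm.weights_)
```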
Abstract:
This paper examines the most productive authors, institutions and countries in regional and urban science from 1991 to 2000, using information on published articles (and pages) from a sample of widely recognized journals in this field: ARS, JUE, JRS, IJURR, IRSR, PRS, RSUE, RS and US. We also consider the relation between the country of the institution named in an article and the country in which the journal is published, in order to determine whether there is a home publication bias in regional and urban science. The analysis was carried out for the whole decade and by subperiods, which allowed a more dynamic interpretation of the results.
Abstract:
Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights of a weighted-sum fusion of score combinations by means of a likelihood model. The maximum-likelihood estimation is set up as a linear programming problem. Each score is derived from a GMM classifier working on a different feature extractor. Our experimental results assessed the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of scores to be fused in the same way as the simplex method for linear programming.
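The abstract does not spell out the likelihood model or the exact LP formulation, so the sketch below illustrates one common way to cast fusion-weight selection as a linear program: choose nonnegative weights summing to one that maximise the worst-case margin between fused genuine and impostor scores. This is an assumed formulation for illustration, not necessarily the authors'; the score arrays are made up.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n_classifiers = 3

# Illustrative per-classifier scores (higher = more likely genuine).
genuine = rng.normal(loc=1.0, scale=0.5, size=(200, n_classifiers))
impostor = rng.normal(loc=0.0, scale=0.5, size=(200, n_classifiers))

# Variables: [w_1..w_K, t]; maximise t subject to
#   w'g_i >= t   for every genuine score vector g_i
#   w'x_j <= -t  for every impostor score vector x_j   (margin around 0)
#   sum(w) = 1, w >= 0
c = np.zeros(n_classifiers + 1)
c[-1] = -1.0                                              # linprog minimises, so minimise -t
A_ub = np.vstack([
    np.hstack([-genuine,  np.ones((len(genuine), 1))]),   # -w'g_i + t <= 0
    np.hstack([ impostor, np.ones((len(impostor), 1))]),  #  w'x_j + t <= 0
])
b_ub = np.zeros(len(genuine) + len(impostor))
A_eq = np.hstack([np.ones((1, n_classifiers)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0, None)] * n_classifiers + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights = res.x[:n_classifiers]
print("fusion weights:", weights, "margin:", res.x[-1])
```

The fused score for a new sample is then the weighted sum of its per-classifier scores using the weights above.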
Abstract:
BACKGROUND: Pharmacists can play a decisive role in the management of ambulatory patients with depression who have poor adherence to antidepressant drugs. OBJECTIVE: To systematically evaluate the effectiveness of pharmacist care in improving adherence of depressed outpatients to antidepressants. METHODS: A systematic review and meta-analysis of randomized controlled trials (RCTs) was conducted. RCTs were identified through electronic databases (MEDLINE, Cochrane Central Register of Controlled Trials, Institute for Scientific Information Web of Knowledge, and Spanish National Research Council) from inception to April 2010, reference lists were checked, and experts were consulted. RCTs that evaluated the impact of pharmacist interventions on improving adherence to antidepressants in depressed patients in an outpatient setting (community pharmacy or pharmacy service) were included. Methodologic quality was assessed and methodologic details and outcomes were extracted in duplicate. RESULTS: Six RCTs were identified. A total of 887 patients with an established diagnosis of depression who were initiating or maintaining pharmacologic treatment with antidepressant drugs and who received pharmacist care (459 patients) or usual care (428 patients) were included in the review. The most commonly reported interventions were patient education and monitoring, monitoring and management of toxicity and adverse effects, adherence promotion, provision of written or visual information, and recommendation or implementation of changes or adjustments in medication. Overall, no statistical heterogeneity or publication bias was detected. The pooled odds ratio, using a random effects model, was 1.64 (95% CI 1.24 to 2.17). Subgroup analysis showed no statistically significant differences in results by type of pharmacist involved, adherence measure, diagnostic tool, or analysis strategy. CONCLUSIONS: These results suggest that pharmacist intervention is effective in the improvement of patient adherence to antidepressants. However, data are still limited and we would recommend more research in this area, specifically outside of the US.
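The pooled odds ratio quoted above comes from a random-effects meta-analysis. As a hedged illustration of how such a pooled estimate is typically computed, the sketch below applies DerSimonian-Laird pooling to study-level log odds ratios; the input arrays are hypothetical placeholders, not the six trials from the review.

```python
import numpy as np

def pooled_or_random_effects(or_values, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of study odds ratios."""
    y = np.log(or_values)                                   # per-study log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)    # SE recovered from 95% CIs
    w = 1.0 / se**2                                         # fixed-effect (inverse-variance) weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                         # Cochran's Q heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                             # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    ci = np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])
    return np.exp(y_re), ci

# Hypothetical per-study odds ratios and 95% CIs, for illustration only.
or_hat, ci = pooled_or_random_effects(
    or_values=np.array([1.4, 2.0, 1.1, 1.8]),
    ci_low=np.array([0.8, 1.1, 0.6, 0.9]),
    ci_high=np.array([2.4, 3.6, 2.0, 3.6]),
)
print(f"pooled OR {or_hat:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```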
Abstract:
The restricted maximum likelihood is preferred by many to the full maximum likelihood for estimation with variance component and other random-coefficient models, because the variance estimator is unbiased. It is shown that this unbiasedness is accompanied in some balanced designs by an inflation of the mean squared error. An estimator of the cluster-level variance that is uniformly more efficient than the full maximum likelihood is derived. Estimators of the variance ratio are also studied.
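As a point of reference for the REML-versus-ML comparison, the sketch below fits a random-intercept model to simulated balanced clustered data with both estimators and reports the cluster-level variance from each; the ML estimate is typically smaller than the REML one. The simulated data and the choice of library (statsmodels) are illustrative, and the paper's more efficient estimator is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Balanced one-way design: m clusters of size n with random cluster intercepts.
m, n = 20, 5
sigma2_cluster, sigma2_resid = 4.0, 1.0
groups = np.repeat(np.arange(m), n)
u = rng.normal(scale=np.sqrt(sigma2_cluster), size=m)        # cluster effects
y = 10.0 + u[groups] + rng.normal(scale=np.sqrt(sigma2_resid), size=m * n)
X = np.ones((m * n, 1))                                      # intercept only

model = sm.MixedLM(y, X, groups=groups)
fit_reml = model.fit(reml=True)                              # restricted maximum likelihood
fit_ml = model.fit(reml=False)                               # full maximum likelihood

print("REML cluster-level variance:", float(np.asarray(fit_reml.cov_re)[0, 0]))
print("ML   cluster-level variance:", float(np.asarray(fit_ml.cov_re)[0, 0]))
```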
Abstract:
Motivation: The comparative analysis of gene gain and loss rates is critical for understanding the role of natural selection and adaptation in shaping gene family sizes. Studying complete genome data from closely related species allows accurate estimation of gene family turnover rates. Current methods and software tools, however, are not well designed for dealing with certain kinds of functional elements, such as microRNAs or transcription factor binding sites. Results: Here, we describe BadiRate, a new software tool to estimate family turnover rates, as well as the number of elements at internal phylogenetic nodes, by likelihood-based methods and parsimony. It implements two stochastic population models, which provide the appropriate statistical framework for testing hypotheses such as lineage-specific gene family expansions or contractions. We have assessed the accuracy of BadiRate by computer simulations, and have also illustrated its functionality by analyzing a representative empirical dataset.
A Survey on Detection Techniques to Prevent Cross-Site Scripting Attacks on Current Web Applications