894 results for "consistent and asymptotically normal estimators"


Relevance:

100.00%

Publisher:

Abstract:

Except for complement C1q, the immunological functions of other C1q family members have remained unclear. Here we describe zebrafish C1q-like, whose transcripts and protein are uniformly distributed in early embryos and restricted to the mid-hindbrain and eye in later embryos. In vitro studies showed that C1q-like could inhibit apoptosis induced by ActD and CHX in EPC cells by repressing caspase 3/9 activities. Its physiological roles were further studied by morpholino-mediated knockdown during zebrafish embryogenesis. Compared with control embryos, C1q-like knockdown embryos displayed obvious defects in head and craniofacial development mediated through p53-induced apoptosis, which was confirmed by co-injection of in vitro transcribed C1q-like mRNA or p53 MO. TUNEL assays revealed extensive cell death, and caspase 3/9 activity measurements showed an approximately twofold increase in C1q-like morphant embryos, which was suppressed by p53 MO co-injection. Real-time quantitative PCR showed up-regulated expression of several apoptosis regulators such as p53, mdm2, p21, Bax and caspase 3, and down-regulated expression of hbae1 in the C1q-like morphant embryos. Knockdown of C1q-like in zebrafish embryos decreased hemoglobin production and impaired the organization of the mesencephalic vein and other brain blood vessels. Interestingly, exposure of zebrafish embryos to UV increased C1q-like mRNA expression, whereas over-expression of C1q-like was not sufficient to resist the damage. Furthermore, C1q-like transcription was up-regulated in response to the pathogen Aeromonas hydrophila, and embryo survival decreased significantly in C1q-like morphants after exposure to the bacteria.
The data suggest that C1q-like plays an antiapoptotic and protective role by inhibiting p53-dependent, caspase 3/9-mediated apoptosis during embryogenesis, especially in brain development, and that C1q-like is a novel regulator of cell survival during zebrafish embryogenesis. (c) 2008 Elsevier Inc. All rights reserved.


Human hepatoma (SMMC-7721) and normal liver (L02) cells were irradiated with γ-rays, 12C6+ and 36Ar18+ ion beams at the Heavy Ion Research Facility in Lanzhou (HIRFL). Using the calyculin-A-induced premature chromosome condensation technique, chromatid-type breaks and isochromatid-type breaks were scored separately. Tumor cells irradiated with heavy ions produced mostly isochromatid breaks, whereas chromatid breaks dominated when cells were exposed to γ-rays. The relative biological effectiveness (RBE) for irradiation-induced chromatid breaks was 3.6 for L02 and 3.5 for SMMC-7721 cells at the LET peak of 96 keV/μm for 12C6+ ions, and 2.9 for both cell lines at 512 keV/μm for 36Ar18+ ions. This suggests that the RBE of isochromatid-type breaks is markedly elevated under high-LET irradiation. We therefore conclude that the high yield of isochromatid-type breaks, induced by the densely ionizing track structure, can be regarded as a signature of high-LET radiation exposure.


Endoscopic sphincterotomy (ES) is indicated in patients with bile duct stones confirmed at endoscopic retrograde cholangiopancreatography (ERCP). The role of ES in preventing recurrent biliary symptoms in patients with suspected bile duct stones but a normal cholangiogram, when cholecystectomy is not planned, remains unclear.


Purpose. To evaluate differences in optic disc and visual field damage between African-American and Caucasian Normal Tension Glaucoma (NTG) patients. Methods. We retrospectively selected 33 African-American patients with a diagnosis of NTG and age-matched them with 33 Caucasian patients with the same diagnosis. Three masked observers graded disc photographs and visual fields of both eyes of each subject as normal, globally damaged, or focally damaged. The chi-square test was used to evaluate statistically significant differences between groups. Results. For the visual fields, in the African-American group 24% were graded normal, 30% showed global damage, and 46% showed focal damage, compared with 41%, 22%, and 37%, respectively, in the Caucasian group (p = 0.28). For the optic disc photographs, in the African-American group 25% were graded normal, 45% showed global damage, and 30% showed focal damage, compared with 43%, 32%, and 25%, respectively, in the Caucasian group (p = 0.16). Conclusions. In our study there was no difference in the frequency of globally damaged, focally damaged, or normal graded discs or visual fields between African-American and Caucasian NTG patients.
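The group comparison described above can be sketched as a chi-square test of independence on a 2x3 contingency table. The counts below are illustrative reconstructions from the reported visual-field percentages, assuming 66 graded eyes per group; they are not the study's raw data.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x3 table: rows = groups, columns = normal / global / focal.
# Counts reconstructed from the reported percentages assuming 66 graded
# eyes per group (an assumption, not the study's data).
table = [
    [16, 20, 30],  # African-American: ~24%, 30%, 46%
    [27, 15, 24],  # Caucasian:        ~41%, 22%, 37%
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

With these reconstructed counts the p-value is above 0.05, consistent with the abstract's conclusion of no significant group difference.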


The infrared and Raman spectra of CSeClF have been obtained for the first time and analysed to give the in-plane normal vibrational frequencies of the molecule in the ground state. A normal co-ordinate analysis has been carried out for the molecules CSF2, CSClF and CSeCl2 using a Urey-Bradley type of potential function, and the elements of the [L] matrix, the distribution of the potential energy in Urey-Bradley space, and the displacement vector diagrams for the normal modes of vibration of these molecules have been obtained. The bond force constants obtained through the normal co-ordinate analysis have given some interesting results. The stretching force constant, KCS, varies markedly with halogen substitution, and the force constants KCF and KCCl also vary with substitution.
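The numerical core of a normal co-ordinate analysis can be sketched as the Wilson GF eigenproblem: vibrational frequencies follow from the eigenvalues of G F, where F holds the force constants and G the inverse kinetic-energy terms. The 2x2 matrices below are hypothetical illustrative values, not the paper's Urey-Bradley constants.

```python
import numpy as np

# Hypothetical force-constant matrix F (mdyn/Angstrom) and
# inverse-kinetic-energy matrix G (1/amu) for two coupled
# in-plane co-ordinates; values are illustrative only.
F = np.array([[7.5, 0.4],
              [0.4, 1.2]])
G = np.array([[0.18, -0.02],
              [-0.02, 0.11]])

# Eigenvalues lambda of G @ F give the normal-mode frequencies via
# lambda = 4 pi^2 c^2 nu^2, i.e. nu (cm^-1) = 1302.8 * sqrt(lambda).
eigvals = np.linalg.eigvals(G @ F)
frequencies = sorted(1302.8 * np.sqrt(eigvals.real), reverse=True)
print([round(f, 1) for f in frequencies])
```

The eigenvectors of the same matrix give the [L] matrix columns, i.e. the displacement patterns of the normal modes mentioned in the abstract.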


Among the methods for estimating the parameters of a probability distribution in statistics, maximum likelihood is one of the most popular techniques, since, under mild conditions, the resulting estimators are consistent and asymptotically efficient. Maximum likelihood problems can be treated as nonlinear, possibly nonconvex, programming problems, for which two major classes of solution methods are trust-region techniques and line-search methods. Moreover, it is possible to exploit the structure of these problems to try to accelerate the convergence of these methods, under certain assumptions. In this work, we revisit some classical or recently developed approaches in nonlinear optimization in the particular context of maximum likelihood estimation. We also develop new algorithms to solve this problem, reconsidering various Hessian approximation techniques, and propose new step-size computation methods, in particular within line-search algorithms. These notably include algorithms that allow us to switch the Hessian approximation and to adapt the step length along a fixed search direction. Finally, we evaluate the numerical efficiency of the proposed methods for the estimation of discrete choice models, in particular mixed logit models.
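The line-search approach to maximum likelihood described above can be sketched as follows. This is a minimal illustration using a one-parameter exponential model with an exact Hessian, not the mixed logit models or Hessian approximations developed in the thesis.

```python
import math

def neg_loglik(theta, data):
    # Negative log-likelihood of an exponential(rate=theta) model
    # (illustrative model choice; the MLE is 1 / mean(data)).
    n, s = len(data), sum(data)
    return -(n * math.log(theta) - theta * s)

def mle_newton_armijo(data, theta0=0.2, tol=1e-10, max_iter=100):
    """Newton's method with a backtracking (Armijo) line search."""
    theta = theta0
    n, s = len(data), sum(data)
    for _ in range(max_iter):
        grad = -n / theta + s        # derivative of neg_loglik
        hess = n / theta ** 2        # second derivative (positive)
        if abs(grad) < tol:
            break
        direction = -grad / hess     # Newton search direction
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease condition
        # holds and the iterate stays feasible (theta > 0).
        while (theta + t * direction <= 0 or
               neg_loglik(theta + t * direction, data)
               > neg_loglik(theta, data) + 1e-4 * t * grad * direction):
            t *= 0.5
        theta += t * direction
    return theta

data = [0.4, 1.3, 0.7, 2.1, 0.5]   # toy sample with mean 1.0
theta_hat = mle_newton_armijo(data)
print(theta_hat)
```

Replacing `hess` with a quasi-Newton or statistical approximation (e.g. the outer product of gradients) changes only the direction computation, which is the kind of swap the thesis studies.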


This thesis is organized in three chapters. The first two concern the evaluation, through estimation methods, of causal or treatment effects in a data-rich environment. The last chapter concerns the economics of education; more precisely, I evaluate the effect of high school specialization on the choice of college major and on performance. In the first chapter, I study the efficient estimation of a finite-dimensional parameter in a linear model where the number of instruments may be very large or infinite. Using a large number of moment conditions improves the asymptotic efficiency of instrumental variables estimators but increases the bias. I propose a regularized version of the LIML estimator based on three different regularization methods, Tikhonov, Landweber-Fridman, and principal components, which reduce the bias. The second chapter extends this work by allowing for a large number of weak instruments. The weak instrument problem is the consequence of a very small concentration parameter; to increase the concentration parameter, I propose to increase the number of instruments. I then show that the regularized 2SLS and LIML estimators are consistent and asymptotically normal. The third chapter analyzes the effect of high school specialization on the choice of college major. Using American data, I evaluate the relationship between performance in college and the different types of courses taken during high school. The results suggest that students choose the majors in which they acquired more skills in high school. However, there is a U-shaped relationship between diversification and college performance, suggesting a tension between specialization and diversification.
The underlying trade-off is assessed by estimating a structural model of high school human capital acquisition and major choice. Counterfactual analyses imply that one additional course in a quantitative subject increases enrollment in science and technology majors by 4 percentage points.
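The Tikhonov-regularized instrumental variables idea can be sketched numerically: the first stage is a ridge regression of the endogenous regressor on many instruments. The data-generating process, instrument strengths, and penalty value below are illustrative assumptions, not the thesis's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear IV model (illustrative): y = x * beta + u, with x
# endogenous through a shared shock e, and many instruments Z.
n, k = 500, 30
beta_true = 1.0
Z = rng.normal(size=(n, k))
pi = np.full(k, 0.15)                   # assumed instrument strengths
e = rng.normal(size=n)
u = 0.8 * e + 0.6 * rng.normal(size=n)  # endogeneity via e
x = Z @ pi + e
y = beta_true * x + u

def tikhonov_2sls(y, x, Z, alpha):
    """2SLS whose first stage is ridge (Tikhonov) regularized."""
    # First stage: x_hat = Z (Z'Z + alpha I)^{-1} Z'x
    ZtZ = Z.T @ Z
    coef = np.linalg.solve(ZtZ + alpha * np.eye(Z.shape[1]), Z.T @ x)
    x_hat = Z @ coef
    # Second stage: IV estimate using the regularized fitted values.
    return float(x_hat @ y / (x_hat @ x))

beta_hat = tikhonov_2sls(y, x, Z, alpha=10.0)
print(round(beta_hat, 3))
```

Shrinking the first stage damps the many-instrument bias that unregularized 2SLS accumulates as k grows; Landweber-Fridman and principal-components regularization replace the ridge solve with iterative or spectral truncation steps.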


The question of the stability of black holes was first studied by Regge and Wheeler, who investigated linear perturbations of the exterior Schwarzschild spacetime. Further work on this problem led to the study of quasi-normal modes, which are regarded as the characteristic sound of black holes. Quasi-normal modes (QNMs) describe the damped oscillations of the geometry surrounding a black hole under perturbations, with frequencies and damping times fixed entirely by the black hole parameters. In the present work we study the influence of a cosmic string on the QNMs of various black hole background spacetimes perturbed by a massless Dirac field.


The literature on skew-normal distributions has grown rapidly in recent years, but so far few applications concern the description of natural phenomena with this type of probability model, or the interpretation of its parameters. The skew-normal family extends the normal family by adding a parameter (λ) that regulates the skewness. The development of this theoretical field has followed the general tendency in Statistics towards more flexible methods that represent features of the data as adequately as possible and reduce unrealistic assumptions, such as the normality that underlies most methods of univariate and multivariate analysis. In this paper, the shape of the frequency distribution of the logratio ln(Cl−/Na+), whose components relate to the composition of waters from 26 wells, has been investigated. Samples have been collected around the active center of Vulcano Island (Aeolian archipelago, southern Italy) from 1977 up to now, at time intervals of about six months. The logratio data have been tentatively modeled by evaluating the performance of the skew-normal model for each well. Values of the λ parameter have been compared in relation to the temperature and spatial position of the sampling points. Preliminary results indicate that changes in λ values can be related to the nature of the environmental processes affecting the data.
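Fitting a skew-normal model to a logratio sample can be sketched with scipy's `skewnorm` distribution, whose shape parameter `a` plays the role of λ. The synthetic sample below stands in for the well measurements, which are not reproduced in the abstract.

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(42)

# Synthetic stand-in for ln(Cl-/Na+) values from one well: a right-skewed
# skew-normal sample (true shape a=4; illustrative values only).
sample = skewnorm.rvs(a=4.0, loc=0.5, scale=0.8, size=300, random_state=rng)

# Maximum likelihood fit; a_hat estimates the skewness parameter lambda.
a_hat, loc_hat, scale_hat = skewnorm.fit(sample)
print(f"lambda ~ {a_hat:.2f}, location ~ {loc_hat:.2f}, scale ~ {scale_hat:.2f}")
```

Repeating the fit well by well and comparing the estimated λ values against temperature and sampling position mirrors the comparison described in the abstract.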


Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package Maple®.
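The note's worked examples use Maple; the same computation can be sketched with the sympy computer algebra package (a substitute for Maple, not the note's own code), using a matrix that is not diagonalizable and therefore needs a genuine Jordan block.

```python
from sympy import Matrix

# A matrix with a repeated eigenvalue (3) but only one independent
# eigenvector, so diagonalization fails and a Jordan block appears.
M = Matrix([[5, 4],
            [-1, 1]])

P, J = M.jordan_form()    # decomposition M = P * J * P**-1
print(J)                  # the Jordan normal form of M
assert (P * J * P.inv() - M).is_zero_matrix
```

For a diagonalizable matrix the same call simply returns a diagonal `J`, which lets students see Jordan form as the natural generalization of diagonalization.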


Two simple and frequently used capture-recapture estimators of population size are compared: Chao's lower-bound estimator and Zelterman's estimator, which allows for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, Zelterman's estimator is always bounded above by Chao's. If counts larger than two exist, Zelterman's estimator exceeds Chao's only if the ratio of the frequency of counts of twos to that of ones is small enough. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can suffer a large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
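In the Poisson case both estimators have simple closed forms in terms of f1 and f2, the frequencies of units observed exactly once and exactly twice among n observed units: Chao's lower bound is n + f1^2 / (2 f2), and Zelterman's estimator is n / (1 - exp(-2 f2 / f1)). A sketch with made-up frequencies:

```python
import math

def chao(n_obs, f1, f2):
    """Chao's lower-bound estimate of the population size."""
    return n_obs + f1 ** 2 / (2 * f2)

def zelterman(n_obs, f1, f2):
    """Zelterman's estimate, designed to be robust to contamination."""
    lam = 2 * f2 / f1              # Poisson rate estimated from f1, f2
    return n_obs / (1 - math.exp(-lam))

# Made-up frequencies: 70 units observed, 50 once and 20 twice.
n_obs, f1, f2 = 70, 50, 20
print(chao(n_obs, f1, f2))         # 70 + 2500/40 = 132.5
print(zelterman(n_obs, f1, f2))
```

Since this example has only counts of ones and twos, the Zelterman value comes out below Chao's, illustrating the ordering proved in the paper.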


Decadal hindcast simulations of Arctic Ocean sea ice thickness made by a modern dynamic-thermodynamic sea ice model, forced independently by the ERA-40 and NCEP/NCAR reanalysis data sets, are compared for the first time. Using comprehensive data sets of observations of sea ice thickness, draft, extent, and speeds made between 1979 and 2001, we find that it is possible to tune model parameters to give satisfactory agreement with observed data, thereby highlighting the skill of modern sea ice models, though the parameter values chosen differ according to the model forcing used. We find a consistent decreasing trend in Arctic Ocean sea ice thickness since 1979, and a steady decline in the Eastern Arctic Ocean over the full 40-year period of comparison that accelerated after 1980, but the predictions of Western Arctic Ocean sea ice thickness between 1962 and 1980 differ substantially. The origins of the differing thickness trends and variability were traced not to parameter differences but to differences in the forcing fields applied, and in how they are applied. It is argued that uncertainty, differences, and errors in sea ice model forcing sets complicate the use of models to determine the exact causes of the recently reported decline in Arctic sea ice thickness, but help in the determination of robust features if the models are tuned appropriately against observations.