936 results for "Consistent and asymptotically normal estimators"
Abstract:
GLUT2-null mice are hyperglycemic, hypoinsulinemic, hyperglucagonemic, and glycosuric, and die within the first 3 weeks of life. Their endocrine pancreas shows a loss of first-phase glucose-stimulated insulin secretion (GSIS) and an inverted alpha to beta cell ratio. Here we show that reexpression by transgenesis of either GLUT1 or GLUT2 in the pancreatic beta cells of these mice allowed mouse survival and breeding. The rescued mice had normal fed glycemia but fasted hypoglycemia, glycosuria, and an elevated glucagon to insulin ratio. Glucose tolerance was, however, normal. In vivo, insulin secretion assessed during hyperglycemic clamps was normal. In vitro, islet perifusion studies revealed that the first phase of insulin secretion was restored by GLUT1 as well as by GLUT2, and this was accompanied by normalization of the glucose utilization rate. The ratio of pancreatic insulin to glucagon and the volume densities of alpha to beta cells were, however, not corrected. These data demonstrate that 1) reexpression of GLUT1 or GLUT2 in beta cells is sufficient to rescue GLUT2-null mice from lethality, 2) GLUT1 as well as GLUT2 can restore normal GSIS, and 3) restoration of GSIS does not correct the abnormal composition of the endocrine pancreas. Thus, normal GSIS does not depend on transporter affinity but on the rate of glucose uptake at stimulatory glucose concentrations.
Abstract:
The infrared and Raman spectra of CSClF have been obtained for the first time and analysed to give the in-plane normal vibrational frequencies of the molecule in the ground state. A normal coordinate analysis has been carried out for the molecules CSF2, CSClF and CSCl2 using a Urey-Bradley type of potential function, and the elements of the [L] matrix, the distribution of the potential energy in Urey-Bradley space, and the displacement vector diagrams for the normal modes of vibration of these molecules have been obtained. The bond force constants obtained through the normal coordinate analysis have given some interesting results. The stretching force constant, K_CS, varies markedly with halogen substitution, and the force constants K_CF and K_CCl also vary with substitution.
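The core of a normal coordinate analysis is the Wilson GF eigenvalue problem: vibrational wavenumbers follow from the eigenvalues of G F, where F collects the force constants and G the inverse-kinetic-energy elements. The sketch below illustrates this for the two stretching modes of a linear symmetric XY2 molecule, using textbook CO2 values as stand-ins; it is not the paper's Urey-Bradley calculation for CSF2, CSClF or CSCl2, and all numbers are illustrative.

```python
import numpy as np

# Wilson GF method sketch for the two stretching modes of a linear,
# symmetric XY2 molecule (CO2 stand-in; NOT the paper's molecules).
m_C, m_O = 12.011, 15.999      # atomic masses, amu
f_r, f_rr = 15.5, 1.3          # stretch and stretch-stretch interaction
                               # force constants, mdyn/Angstrom (illustrative)

mu_C, mu_O = 1.0 / m_C, 1.0 / m_O
G = np.array([[mu_C + mu_O, -mu_C],   # off-diagonal = mu_C * cos(180 deg)
              [-mu_C, mu_C + mu_O]])
F = np.array([[f_r, f_rr],
              [f_rr, f_r]])

# secular equation |GF - lambda I| = 0; eigenvalues of GF are real
lam = np.sort(np.linalg.eigvals(G @ F).real)
freqs = 1302.8 * np.sqrt(lam)  # amu/(mdyn/A) eigenvalues -> wavenumbers, cm^-1
```

With these inputs the two roots land near the observed CO2 stretching fundamentals (~1335 and ~2349 cm^-1), showing how the force field fixes the frequencies.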
Abstract:
Among the methods for estimating the parameters of probability distributions in statistics, maximum likelihood is one of the most popular techniques, since, under mild conditions, the resulting estimators are consistent and asymptotically efficient. Maximum likelihood problems can be treated as nonlinear, possibly nonconvex, programming problems, for which two major classes of solution methods are trust-region techniques and line-search methods. Moreover, the structure of these problems can be exploited in an attempt to accelerate the convergence of these methods, under certain assumptions. In this work, we revisit some classical or recently developed approaches in nonlinear optimization in the specific context of maximum likelihood estimation. We also develop new algorithms to solve this problem, reconsidering various Hessian approximation techniques, and propose new step-size computation methods, in particular within line-search algorithms. These include algorithms that allow us to switch between Hessian approximations and to adapt the step length along a fixed search direction. Finally, we assess the numerical efficiency of the proposed methods for the estimation of discrete choice models, in particular mixed logit models.
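As a minimal sketch of the line-search approach described above (not the thesis's algorithms), the following fits a simple binary logit by Newton ascent on the log-likelihood with Armijo backtracking; the data-generating parameters are invented for illustration.

```python
import numpy as np

# Simulated binary logit data (illustrative, not from the thesis)
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, -1.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

def loglik(b):
    z = X @ b
    return np.sum(y * z - np.log1p(np.exp(z)))

def grad(b):
    p = 1.0 / (1.0 + np.exp(-(X @ b)))
    return X.T @ (y - p)

def hess(b):
    p = 1.0 / (1.0 + np.exp(-(X @ b)))
    return -(X * (p * (1.0 - p))[:, None]).T @ X

b = np.zeros(2)
for _ in range(50):
    g = grad(b)
    d = np.linalg.solve(-hess(b), g)   # Newton ascent direction
    t, f0 = 1.0, loglik(b)
    # Armijo backtracking: shrink the step until sufficient increase
    while loglik(b + t * d) < f0 + 1e-4 * t * (g @ d):
        t *= 0.5
    b = b + t * d
    if np.linalg.norm(g) < 1e-8:
        break
```

Swapping `hess` for a quasi-Newton or BHHH-type approximation, and changing how `t` is chosen, are exactly the degrees of freedom the thesis explores.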
Abstract:
This thesis is organized in three chapters. The first two deal with the evaluation, through estimation methods, of causal or treatment effects in a data-rich environment. The last chapter concerns the economics of education: specifically, I evaluate the effect of specialization in high school on the choice of college major and on performance. In the first chapter, I study the efficient estimation of a finite-dimensional parameter in a linear model where the number of instruments may be very large or infinite. Using many moment conditions improves the asymptotic efficiency of instrumental-variable estimators but increases their bias. I propose a regularized version of the LIML estimator based on three different regularization methods, Tikhonov, Landweber-Fridman, and principal components, which reduce the bias. The second chapter extends this work by allowing for the presence of many weak instruments. The weak-instrument problem is the consequence of a very small concentration parameter. To increase the size of the concentration parameter, I propose to increase the number of instruments. I then show that the regularized 2SLS and LIML estimators are consistent and asymptotically normal. The third chapter analyzes the effect of specialization in high school on the choice of college major. Using American data, I evaluate the relationship between performance in college and the different types of courses taken in high school. The results suggest that students choose the majors in which they acquired more skills in high school. However, there is a U-shaped relationship between diversification and performance in college, suggesting a tension between specialization and diversification. The underlying trade-off is evaluated by estimating a structural model of human capital acquisition in high school and major choice. Counterfactual analyses imply that one additional course in a quantitative subject increases enrollment in science and technology majors by 4 percentage points.
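The Tikhonov-regularized instrumental-variable idea can be sketched as follows: replace the first-stage projection Z(Z'Z)^{-1}Z' with Z(Z'Z + alpha*I)^{-1}Z', which damps the many-instrument bias. This is a toy version under invented data, not the thesis's estimator, and the regularization parameter alpha is picked arbitrarily here (its data-driven choice is part of the thesis's contribution).

```python
import numpy as np

# Simulated linear IV model with many instruments (illustrative only)
rng = np.random.default_rng(1)
n, L = 400, 50
Z = rng.normal(size=(n, L))          # many instruments
pi = np.full(L, 0.15)                # modest first-stage coefficients
u = rng.normal(size=n)
x = Z @ pi + u                       # endogenous regressor (correlated with u)
y = 1.0 * x + u + rng.normal(size=n)  # true beta = 1

def tikhonov_2sls(y, x, Z, alpha):
    # Regularized projection P_a = Z (Z'Z + alpha*I)^{-1} Z'
    K = Z.T @ Z + alpha * np.eye(Z.shape[1])
    Px = Z @ np.linalg.solve(K, Z.T @ x)
    return (Px @ y) / (Px @ x)

beta_ols = (x @ y) / (x @ x)                      # biased by endogeneity
beta_reg = tikhonov_2sls(y, x, Z, alpha=50.0)     # regularized 2SLS
```

Landweber-Fridman iteration and principal components are alternative ways of damping the same inverse, with analogous code.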
Abstract:
The question of the stability of black holes was first studied by Regge and Wheeler, who investigated linear perturbations of the exterior Schwarzschild spacetime. Further work on this problem led to the study of quasi-normal modes, which are regarded as the characteristic "sound" of black holes. Quasi-normal modes (QNMs) describe the damped oscillations of the geometry surrounding a black hole under perturbations, with oscillation frequencies and damping times fixed entirely by the black hole parameters. In the present work we study the influence of a cosmic string on the QNMs of various black hole background spacetimes perturbed by a massless Dirac field.
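The statement that QNM frequencies and damping times are fixed by the black hole parameters can be illustrated with the eikonal (large-multipole) approximation for Schwarzschild, where both are set by the light ring at r = 3M. This is a crude textbook estimate under one common convention, not the thesis's Dirac-field computation, and it is inaccurate at low multipole number l.

```python
import math

def schwarzschild_qnm_eikonal(l, n, M=1.0):
    """Eikonal (large-l) QNM estimate for a Schwarzschild black hole.

    The circular photon orbit at r = 3M has orbital frequency Omega_c
    and Lyapunov exponent lam, both equal to 1/(3*sqrt(3)*M):
        omega ~ Omega_c*(l + 1/2) - i*(n + 1/2)*lam
    Real part = oscillation frequency, imaginary part = damping rate,
    both fixed by the mass M alone.
    """
    Omega_c = 1.0 / (3.0 * math.sqrt(3.0) * M)
    lam = Omega_c
    return complex(Omega_c * (l + 0.5), -(n + 0.5) * lam)

w = schwarzschild_qnm_eikonal(l=2, n=0)   # fundamental l = 2 mode estimate
```

Accurate low-l frequencies require solving the perturbation equations (e.g. by WKB or continued fractions), which is what the thesis does for the Dirac field.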
Abstract:
The literature on skew-normal distributions has grown rapidly in recent years, but so far few applications concern the description of natural phenomena with this type of probability model, or the interpretation of its parameters. The skew-normal family extends the normal family by adding a parameter (λ) that regulates skewness. The development of this theoretical field has followed the general tendency in Statistics towards more flexible methods that represent features of the data as adequately as possible and reduce unrealistic assumptions, such as the normality that underlies most methods of univariate and multivariate analysis. In this paper the shape of the frequency distribution of the logratio ln(Cl−/Na+), whose components are related to the composition of waters from 26 wells, has been investigated. Samples have been collected around the active center of Vulcano island (Aeolian archipelago, southern Italy) from 1977 up to now at time intervals of about six months. The logratio data have been tentatively modeled by evaluating the performance of the skew-normal model for each well. Values of the λ parameter have been compared with the temperature and spatial position of the sampling points. Preliminary results indicate that changes in λ values can be related to the nature of the environmental processes affecting the data.
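The skew-normal density is f(x) = 2φ(x)Φ(λx), reducing to the standard normal at λ = 0. A minimal sketch of the per-well fitting exercise, on simulated rather than the Vulcano data, can use SciPy's `skewnorm` (whose shape parameter `a` plays the role of λ):

```python
import numpy as np
from scipy import stats

# Simulated "logratio-like" sample from a skew-normal with lambda = 4
# (illustrative stand-in for one well's data, not the paper's measurements)
rng = np.random.default_rng(42)
sample = stats.skewnorm.rvs(a=4.0, loc=0.0, scale=1.0,
                            size=2000, random_state=rng)

# Maximum likelihood fit of shape (lambda), location and scale
a_hat, loc_hat, scale_hat = stats.skewnorm.fit(sample)

# Fitted density at a point: f(x) = (2/scale)*phi(z)*Phi(a*z), z=(x-loc)/scale
pdf_val = stats.skewnorm.pdf(1.0, a_hat, loc_hat, scale_hat)
```

Comparing the fitted `a_hat` across wells, as the paper does with λ, then links the skewness to temperature and sampling position.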
Abstract:
Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package Maple®.
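The note works in Maple; the same computation can be sketched in Python with SymPy's `jordan_form` (a stand-in, not the authors' Maple worksheet). The matrix below has eigenvalue 3 with algebraic multiplicity 2 but only one independent eigenvector, so it is not diagonalizable and a 2x2 Jordan block appears:

```python
import sympy as sp

# A non-diagonalizable matrix: repeated eigenvalue 3, one eigenvector
A = sp.Matrix([[2, 1],
               [-1, 4]])

# jordan_form returns P and J with A = P * J * P**-1;
# J consists of Jordan blocks, here a single 2x2 block for eigenvalue 3
P, J = A.jordan_form()
```

Diagonalizable matrices come out of the same call with a diagonal `J`, which makes the contrast easy to demonstrate in class.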
Abstract:
Two simple and frequently used capture-recapture estimators of population size are compared: Chao's lower-bound estimator and Zelterman's estimator allowing for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, Zelterman's estimator is always bounded above by Chao's estimator. If counts larger than two exist, Zelterman's estimator becomes larger than Chao's only if the ratio of the frequency of counts of twos to that of ones is small enough. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can suffer from large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
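The two classical estimators compared above have closed forms: with f_k the number of units observed exactly k times and n the total observed, Chao's lower bound is n + f_1^2/(2 f_2) and Zelterman's is n/(1 - exp(-2 f_2/f_1)). A minimal sketch, on invented count frequencies:

```python
import math

def chao(f):
    # Chao's lower bound: n + f1^2 / (2*f2)
    n = sum(f.values())
    return n + f.get(1, 0) ** 2 / (2.0 * f.get(2, 0))

def chao_bc(f):
    # Bias-corrected variant: n + f1*(f1-1) / (2*(f2+1))
    n = sum(f.values())
    f1, f2 = f.get(1, 0), f.get(2, 0)
    return n + f1 * (f1 - 1) / (2.0 * (f2 + 1))

def zelterman(f):
    # Poisson rate estimated from ones and twos only: lam = 2*f2/f1
    n = sum(f.values())
    lam = 2.0 * f.get(2, 0) / f.get(1, 0)
    return n / (1.0 - math.exp(-lam))

# Hypothetical frequency-of-frequencies data: f[k] = #units seen k times
freq = {1: 50, 2: 20, 3: 10, 4: 5}
```

Here the ratio f_2/f_1 = 0.4 is small enough that Zelterman exceeds Chao, matching the ordering result discussed in the abstract.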
Abstract:
Decadal hindcast simulations of Arctic Ocean sea ice thickness made by a modern dynamic-thermodynamic sea ice model, forced independently by both the ERA-40 and NCEP/NCAR reanalysis data sets, are compared for the first time. Using comprehensive data sets of observations of sea ice thickness, draft, extent, and speed made between 1979 and 2001, we find that it is possible to tune model parameters to give satisfactory agreement with observed data, thereby highlighting the skill of modern sea ice models, though the parameter values chosen differ according to the model forcing used. We find a consistent decreasing trend in Arctic Ocean sea ice thickness since 1979, and a steady decline in the Eastern Arctic Ocean over the full 40-year period of comparison that accelerated after 1980; the predictions of Western Arctic Ocean sea ice thickness between 1962 and 1980, however, differ substantially. The origins of the differing thickness trends and variability were traced not to parameter differences but to differences in the forcing fields applied, and in how they are applied. It is argued that uncertainty, differences, and errors in sea ice model forcing sets complicate the use of models to determine the exact causes of the recently reported decline in Arctic sea ice thickness, but help in the determination of robust features if the models are tuned appropriately against observations.
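The thermodynamic half of a dynamic-thermodynamic sea ice model is, at its simplest, conductive growth governed by Stefan's law, dh/dt = k ΔT / (ρ L h). The sketch below is only this textbook limit with standard physical constants, not the full model used in the paper, and the forcing ΔT is an invented round number:

```python
import math

# Standard physical constants (illustrative sketch, not the paper's model)
K_ICE = 2.2        # thermal conductivity of ice, W m^-1 K^-1
RHO_ICE = 917.0    # ice density, kg m^-3
L_FUSION = 3.34e5  # latent heat of fusion, J kg^-1

def stefan_thickness(h0, delta_T, seconds):
    """Analytic Stefan-law growth: dh/dt = K*dT/(rho*L*h),
    integrated to h(t) = sqrt(h0^2 + 2*K*dT*t/(rho*L))."""
    return math.sqrt(h0 ** 2 +
                     2.0 * K_ICE * delta_T * seconds / (RHO_ICE * L_FUSION))

# 90 days of growth under a 20 K surface-to-ocean temperature difference
h = stefan_thickness(h0=0.0, delta_T=20.0, seconds=90 * 86400)
```

Because growth depends directly on the imposed temperature difference, even this toy model shows why differences in forcing fields, rather than in internal parameters, dominate simulated thickness trends.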
Abstract:
Oxidative damage to DNA is thought to play a role in carcinogenesis by causing mutations, and indeed accumulation of oxidized DNA bases has been observed in samples obtained from tumors but not from surrounding tissue within the same patient. Base excision repair (BER) is the main pathway for the repair of oxidized modifications in both nuclear and mitochondrial DNA. To ascertain whether diminished BER capacity might account for increased levels of oxidative DNA damage in cancer cells, the activities of BER enzymes in three different lung cancer cell lines and their non-cancerous counterparts were measured using oligonucleotide substrates carrying single DNA lesions specific to individual BER enzymes. The activities of four BER enzymes, OGG1, NTH1, UDG and APE1, were compared in mitochondrial and nuclear extracts. For each specific lesion, the repair activities were similar among the three cell lines used. However, the specific activities, and the cancer versus control comparison, differed significantly between the nuclear and mitochondrial compartments. OGG1 activity, as measured by 8-oxodA incision, was up-regulated in cancer cell mitochondria but down-regulated in the nucleus when compared to control cells. Similarly, NTH1 activity was also up-regulated in mitochondrial extracts from cancer cells but did not change significantly in the nucleus. Together, these results support the idea that alterations in BER capacity are associated with carcinogenesis.
Abstract:
This paper provides a systematic and unified treatment of developments in the area of kernel estimation in econometrics and statistics. Both estimation and hypothesis-testing issues are discussed for nonparametric and semiparametric regression models. A discussion of the choice of window width (bandwidth) is also presented.
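The canonical kernel regression estimator surveyed in this literature is the Nadaraya-Watson estimator, a weighted average of the responses with weights set by a kernel and the window width h. A minimal sketch on simulated data (the function and sample sizes are invented for illustration):

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate at the points x0.

    Gaussian kernel; h is the window width (bandwidth), which governs
    the bias-variance trade-off central to the survey's discussion.
    """
    u = (x0[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u ** 2)          # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)     # local weighted average of y

# Simulated regression data: y = sin(x) + noise
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=400)
y = np.sin(x) + rng.normal(scale=0.3, size=400)
grid = np.linspace(-2, 2, 9)
m_hat = nadaraya_watson(grid, x, y, h=0.4)
```

Shrinking h reduces bias but inflates variance, and vice versa, which is why data-driven window-width selection receives its own treatment in the paper.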
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)