50 results for Mean diameter
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Background: PPP1R6 is a protein phosphatase 1 glycogen-targeting subunit (PP1-GTS) abundant in skeletal muscle with an undefined role in metabolic control. Here, the effects of PPP1R6 on myotube glycogen metabolism, particle size and subcellular distribution are examined and compared with PPP1R3C/PTG and PPP1R3A/GM. Results: PPP1R6 overexpression activates glycogen synthase (GS), reduces its phosphorylation at Ser-641/0 and increases the extracted and cytochemically stained glycogen content, less than PTG but more than GM. PPP1R6 does not change glycogen phosphorylase activity. All tested PP1-GTS cells have more glycogen particles than controls, as found by electron microscopy of myotube sections. Glycogen particle size spans a continuous range in all cell types, but PPP1R6 forms smaller particles (mean diameter 14.4 nm) than PTG (36.9 nm), GM (28.3 nm) or control cells (29.2 nm). Both PPP1R6- and GM-derived glycogen particles lie in the cytosol associated with cellular structures; PTG-derived glycogen is found in membrane- and organelle-devoid cytosolic glycogen-rich areas; and glycogen particles are dispersed throughout the cytosol in control cells. PPP1R6 tagged at the C-terminus with EGFP shows a diffuse cytosolic pattern in glucose-replete and -depleted cells and a punctate pattern surrounding the nucleus in glucose-depleted cells, which colocalizes with RFP tagged with the Golgi-targeting domain of β-1,4-galactosyltransferase, in agreement with a computational prediction of a Golgi location for PPP1R6. Conclusions: PPP1R6 exerts a powerful glycogenic effect in cultured muscle cells, greater than GM and smaller than PTG. PPP1R6 protein translocates from a Golgi to a cytosolic location in response to glucose. The molecular size and subcellular location of myotube glycogen particles are determined by the PPP1R6, PTG and GM scaffolds.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The main objective of this study was to locate, analyze and assess the singular trees that are candidates for declaration as monumental trees within the Parc Natural de l'Alt Pirineu. Specifically, the Vall Ferrera and the Vall de Cardós were inventoried. The secondary objective was to make an innovative environmental education proposal, using the tree as a pedagogical instrument. Twenty-three trees were inventoried, one of them, "l'Avet del Pla de la Selva", already declared a Monumental Tree. First, the trees were located with the help of the Park's technicians, popular knowledge and documentation. A methodology based on previous studies was used, by means of field forms that record all the ecological and sociocultural characteristics of each tree. The data obtained were then analyzed and the assessment was carried out. A quantitative method and a qualitative method (Monumental Tree Ranking) were proposed. The latter rates each tree by comparing it, according to three parameters (height, trunk girth and crown diameter), against a list compiled by the Generalitat de Catalunya of all the monumental trees of the same species in Catalan territory. Finally, a level of protection is proposed for each tree according to its state of conservation and other parameters. One of the results of this study was the production of a folder of educational material using each tree as the central axis to explain the natural environment that surrounds it. This initiative seeks to highlight the important role of monumental trees as connectors with the natural and sociocultural environment, and the need to protect singular trees in all Natural Parks.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This paper conducts an empirical analysis of the relationship between wage inequality, employment structure, and returns to education in urban areas of Mexico during the past two decades (1987-2008). Applying Melly’s (2005) quantile regression based decomposition, we find that changes in wage inequality have been driven mainly by variations in educational wage premia. Additionally, we find that changes in employment structure, including occupation and firm size, have played a vital role. This evidence seems to suggest that the changes in wage inequality in urban Mexico cannot be interpreted in terms of a skill-biased change, but rather they are the result of an increasing demand for skills during that period.
Abstract:
Kriging is an interpolation technique whose optimality criteria are based on normality assumptions either for observed or for transformed data. This is the case of normal, lognormal and multigaussian kriging. When kriging is applied to transformed scores, optimality of the obtained estimators becomes a cumbersome concept: back-transformed optimal interpolations in transformed scores are not optimal in the original sample space, and vice versa. This lack of compatible criteria of optimality induces a variety of problems in both point and block estimates. For instance, lognormal kriging, widely used to interpolate positive variables, has no straightforward way to build consistent and optimal confidence intervals for estimates. These problems are ultimately linked to the assumed space structure of the data support: for instance, positive values, when modelled with lognormal distributions, are assumed to be embedded in the whole real space, with the usual real space structure and Lebesgue measure.
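The back-transformation problem described in this abstract can be seen in a minimal numeric illustration (not from the paper): for lognormal data, exponentiating the optimal estimate in log space recovers the median, not the mean, of the original-space distribution. The parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma = 1.0, 0.8
x = rng.lognormal(mu, sigma, 1_000_000)

# Optimal (unbiased) estimate in log space, naively back-transformed:
back_transformed = np.exp(np.log(x).mean())   # estimates exp(mu), the median

# The true mean of a lognormal carries an extra variance term:
true_mean = np.exp(mu + sigma**2 / 2)

# The naive back-transform is biased low relative to the sample mean.
print(back_transformed, x.mean(), true_mean)
```

The gap between `back_transformed` (about 2.72 here) and the sample mean (about 3.74) is exactly the kind of incompatibility between optimality in the transformed and original spaces that the abstract refers to.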
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth; however, we can always replace that zero with the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same occurs if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros". So, we take the lower quartile of the real molybdenum values and establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
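The Cu/Mo imputation described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the linear regression form, and the synthetic data are assumptions of the sketch; only the idea (fit the lower quartile of detected Mo against Cu, then predict the censored values) comes from the abstract.

```python
import numpy as np

def impute_rounded_zeros(cu, mo, detection_limit):
    """Impute below-detection ('rounded zero') Mo values from Cu.

    Fits a linear regression of Mo on Cu using only the lower quartile
    of the *detected* Mo values (closest to the censored range), then
    predicts Mo wherever it fell below the detection limit.
    """
    cu, mo = np.asarray(cu, float), np.asarray(mo, float)
    detected = mo >= detection_limit
    q1 = np.quantile(mo[detected], 0.25)       # lower quartile of real Mo values
    fit_mask = detected & (mo <= q1)           # samples used for the regression
    slope, intercept = np.polyfit(cu[fit_mask], mo[fit_mask], 1)
    imputed = mo.copy()
    imputed[~detected] = slope * cu[~detected] + intercept
    return imputed

# Synthetic correlated Cu/Mo data with some Mo values censored to 0.
rng = np.random.default_rng(0)
cu = rng.uniform(0.1, 2.0, 200)
mo_true = 0.05 * cu + rng.normal(0.0, 0.005, 200)
dl = 0.03
mo = np.where(mo_true < dl, 0.0, mo_true)      # rounded zeros

out = impute_rounded_zeros(cu, mo, dl)
```

Note the main advantage the abstract claims: each imputed value depends on the sample's Cu content, so the censored Mo values are not all replaced by one fixed constant.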
Abstract:
Stone groundwood (SGW) is a fibrous matter commonly prepared in a high yield process, and mainly used for papermaking applications. In this work, the use of SGW fibers is explored as a reinforcing element of polypropylene (PP) composites. Due to its chemical and superficial features, the use of coupling agents is needed for good adhesion and stress transfer across the fiber-matrix interface. The intrinsic strength of the reinforcement is a key parameter to predict the mechanical properties of the composite and to perform an interface analysis. The main objective of the present work was the determination of the intrinsic tensile strength of stone groundwood fibers. Coupled and non-coupled PP composites from stone groundwood fibers were prepared. The influence of the surface morphology and the quality of the interface on the final properties of the composite was analyzed and compared to that of fiberglass PP composites. The intrinsic tensile properties of stone groundwood fibers, as well as the fiber orientation factor and the interfacial shear strength of the current composites, were determined.
Abstract:
The behavior of stone groundwood / polypropylene injection-molded composites was evaluated with and without coupling agent. Stone groundwood (SGW) is a fibrous material commonly prepared in a high yield process and mainly used for papermaking applications. In this work, the use of SGW fibers was explored as a reinforcing element of polypropylene (PP) composites. The surface charge density of the composite components was evaluated, as well as the fibers' length and diameter inside the composite material. Two mixing extrusion processes were evaluated, and the use of a kinetic mixer, instead of an internal mixer, resulted in longer mean fiber lengths of the reinforcing fibers. On the other hand, the accessibility of surface hydroxyl groups of stone groundwood fibers was improved by treating the fibers with 5% sodium hydroxide, resulting in a noticeable increase of the tensile strength of the composites for a similar percentage of coupling agent. A new parameter, called the Fiber Tensile Strength Factor, is defined and used as a baseline for the comparison of the properties of the different composite materials. Finally, the competitiveness of the stone groundwood / polypropylene / polypropylene-co-maleic anhydride system, which compared favorably to sized glass-fiber / polypropylene (GF/PP) and glass-fiber / polypropylene / polypropylene-co-maleic anhydride composite formulations, was quantified by means of the fiber tensile strength factor.
Abstract:
We establish the validity of subsampling confidence intervals for the mean of a dependent series with heavy-tailed marginal distributions. Using point process theory, we study both linear and nonlinear GARCH-like time series models. We propose a data-dependent method for the optimal block size selection and investigate its performance by means of a simulation study.
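The basic subsampling construction behind such intervals can be sketched as follows. This is a generic textbook-style sketch, not the paper's procedure: it assumes a root-n convergence rate and a fixed, user-supplied block size, whereas the paper deals with heavy tails (where the rate may differ) and proposes a data-dependent block-size choice, neither of which is reproduced here.

```python
import numpy as np

def subsampling_ci(x, block_size, alpha=0.05):
    """Subsampling confidence interval for the mean (minimal sketch)."""
    x = np.asarray(x, float)
    n, b = len(x), block_size
    theta = x.mean()
    # Statistic over all overlapping blocks of length b.
    block_means = np.array([x[i:i + b].mean() for i in range(n - b + 1)])
    # Approximate the law of sqrt(n)*(mean - truth) by the subsample
    # distribution of sqrt(b)*(block_mean - full-sample mean).
    dist = np.sqrt(b) * (block_means - theta)
    lo, hi = np.quantile(dist, [alpha / 2, 1 - alpha / 2])
    return theta - hi / np.sqrt(n), theta - lo / np.sqrt(n)
```

For dependent, heavy-tailed data the choice of `block_size` is exactly the delicate point the abstract addresses; the sketch leaves it to the caller.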
Abstract:
Structural equation models are widely used in economic, social and behavioral studies to analyze linear interrelationships among variables, some of which may be unobservable or subject to measurement error. Alternative estimation methods that exploit different distributional assumptions are now available. The present paper deals with issues of asymptotic statistical inference, such as the evaluation of standard errors of estimates and chi-square goodness-of-fit statistics, in the general context of mean and covariance structures. The emphasis is on drawing correct statistical inferences regardless of the distribution of the data and the method of estimation employed. A (distribution-free) consistent estimate of $\Gamma$, the matrix of asymptotic variances of the vector of sample second-order moments, will be used to compute robust standard errors and a robust chi-square goodness-of-fit statistic. Simple modifications of the usual estimate of $\Gamma$ will also permit correct inferences in the case of multi-stage complex samples. We will also discuss the conditions under which, regardless of the distribution of the data, one can rely on the usual (non-robust) inferential statistics. Finally, a multivariate regression model with errors-in-variables will be used to illustrate, by means of simulated data, various theoretical aspects of the paper.
Spanning tests in return and stochastic discount factor mean-variance frontiers: A unifying approach
Abstract:
We propose new spanning tests that assess if the initial and additional assets share the economically meaningful cost and mean representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that, unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.
Abstract:
This paper presents a two-factor model of the term structure of interest rates. We assume that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Assuming that both factors follow a joint Ornstein-Uhlenbeck process, a general bond pricing equation is derived. We obtain a closed-form expression for bond prices and examine its implications for the term structure of interest rates. We also derive a closed-form solution for interest rate derivative prices. This expression is applied to price European options on discount bonds and more complex types of options. Finally, empirical evidence of the model's performance is presented.
Abstract:
Portfolio and stochastic discount factor (SDF) frontiers are usually regarded as dual objects, and researchers sometimes use one to answer questions about the other. However, the introduction of conditioning information and active portfolio strategies alters this relationship. For instance, the unconditional portfolio frontier in Hansen and Richard (1987) is not dual to the unconditional SDF frontier in Gallant, Hansen and Tauchen (1990). We characterise the dual objects to those frontiers, and relate them to the frontiers generated with managed portfolios, which are commonly used in empirical work. We also study the implications of a safe asset and other special cases.
Abstract:
The conceptualization of talent has become increasingly important for both academics and practitioners, in order to advance the study of talent management. In fact, confusion about the meaning of talent in business practice prevents reaching a consensus on the concept and practice of talent management. In this theoretical study we review the concept of talent in the business world in order to summarize what has been learned and to discuss the advantages and limitations of its different senses. We conclude by formulating a definition of the concept, since a correct interpretation of talent management, not to mention successful talent management, depends on having a clear understanding of what is meant by talent in an organizational context. Moreover, with the proposed definition of talent we delimit the concept, avoiding some problems detected in previous definitions (for example, generalities and tautologies) and highlighting the important variables that affect it and make it more manageable.