742 results for C12
Abstract:
This paper confirms the presence of GARCH(1,1) effects in the stock return time series of Vietnam's newborn stock market. We performed tests on four different time series, namely market returns (VN-Index) and the return series of the first four individual stocks listed on the Vietnamese exchange (the Ho Chi Minh City Securities Trading Center) since August 2000. The results are broadly consistent with previously reported empirical studies on other markets.
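The GARCH(1,1) effect tested in the paper is the conditional-variance recursion h_t = ω + α·r²_{t-1} + β·h_{t-1}. A minimal pure-Python sketch of that recursion, with hypothetical parameter values rather than estimates from the VN-Index data:

```python
# Sketch of the GARCH(1,1) conditional-variance recursion.
# Parameter and return values below are illustrative, not VN-Index estimates.

def garch11_variances(returns, omega, alpha, beta):
    """Conditional variances h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}."""
    # Initialise at the unconditional variance omega / (1 - alpha - beta).
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

# Hypothetical daily returns.
rets = [0.01, -0.02, 0.015, -0.005, 0.03]
h = garch11_variances(rets, omega=1e-5, alpha=0.1, beta=0.85)
```

With α + β close to one, as typically estimated on stock returns, volatility shocks decay slowly, which is the "GARCH effect" the paper documents.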
Abstract:
In this paper, we analyze the context of Vietnam's economic standing in the reform period. The first section discusses the most remarkable factors promoting the development of financial markets: (i) the Doi Moi policies of 1986 unleashed 'productive powers'; real GDP growth and key economic indicators improved, and the economy truly departed from the old-style command economy; (ii) the FDI component is present in the economy as a sine qua non: a crucial growth engine, forming part of the financial markets and planting the 'seeds' for their growth; and (iii) the private economy is both a result and a cause of the reform; its growth has been steady, and today it represents a powerhouse and helps form part of the genuine financial economy. A few noteworthy points found in the next section are: (i) no evidence of the existence of financial markets was found before Doi Moi; the reform has generated a bulk of private-sector financial companies, and the new developments have roots in the 1992-amended constitution (§3.2); (ii) the need to reform the financial sector started with the domino collapse of credit cooperatives in the early 1990s, with further stress caused by the 'blow' of banking deficiency in the late 1990s; and (iii) the laws on the SBV and credit institutions and the launch of the stock market were bold steps; besides, the Asian financial turmoil forced the economy to reaffirm its reform agenda. Our findings also indicate, through empirical evidence, that economic conditions have stabilized throughout the reform, thanks to the contributions of FDI and the private economic sector. Private investment flows continue to be an eminent factor driving economic growth.
Abstract:
In this paper, we examine exchange rates in Vietnam's transitional economy. Evidence of long-run equilibrium is established in most cases through a single cointegrating vector among the endogenous variables that determine the real exchange rates. This supports relative PPP, in which the error-correction term (ECT) of the system can be combined linearly into a stationary process, reducing deviation from PPP in the long run. Restricted coefficient vectors β' = (1, 1, -1) for the real exchange rates of the currencies in question are not rejected. This empirical support for relative PPP adds to the evidence found by many researchers, including Flre et al. (1999), Lee (1999), Johnson (1990), Culver and Papell (1999), and Cuddington and Liang (2001). Instead of testing different time series on a common base currency, we use different base currencies (USD, GBP, JPY and EUR). By doing so we ask whether the theory posits significant differences against any one currency. We have found consensus, given inevitable technical differences, even with the smaller data sample for EUR. Speeds of convergence to PPP and adjustment are faster than results reported in other studies for developed economies, using both observed and bootstrapped half-life (HL) measures. Perhaps the better explanation is the adjustment from the hyperinflation period, after which the theory indicates that the adjusting process actually accelerates. We observe that deviation appears to have been large in the early stages of the reform, mostly overvaluation. Over time, its correction took place, leading significant deviations to gradually disappear.
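The half-life (HL) measures of convergence to PPP mentioned above are commonly computed from the AR(1) persistence ρ of the PPP deviation as ln(0.5)/ln(ρ). A minimal sketch, with a hypothetical persistence value rather than an estimate from the paper's data:

```python
import math

def half_life(rho):
    """Half-life of an AR(1) deviation: periods for a shock to decay by half."""
    if not 0.0 < rho < 1.0:
        raise ValueError("half-life is defined for 0 < rho < 1")
    return math.log(0.5) / math.log(rho)

# Hypothetical persistence of the real-exchange-rate deviation.
hl = half_life(0.9)
```

A persistence of 0.9 per period implies a half-life of roughly 6.6 periods; faster convergence (as the paper reports for Vietnam) corresponds to a smaller ρ and hence a shorter half-life.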
Abstract:
This paper looks into economic insights offered by consideration of two important financial markets in Vietnam: gold and USD. In general, the paper focuses on time series properties, mainly returns at different frequencies, and tests the weak-form efficient market hypothesis. All the tests reject the efficiency of both the gold and foreign exchange markets. All time series exhibit strong serial correlations. ARMA-GARCH specifications appear to have performed well with the different time series. In all cases the changing-volatility phenomenon is strongly supported by the empirical data. An additional test is performed on the daily USD return to try to capture the impacts of the Asian financial crisis and the applicable daily price limits. No substantial impacts of the Asian crisis or the central bank-devised limits are found to influence the risk level of the daily USD return.
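The serial-correlation evidence behind the weak-form efficiency rejection can be illustrated with a Ljung-Box Q statistic: large Q values indicate autocorrelated returns and thus evidence against weak-form efficiency. A pure-Python sketch on hypothetical return data (not the paper's gold or USD series):

```python
def acf(x, lag):
    """Sample autocorrelation of x at a given lag."""
    n = len(x)
    mean = sum(x) / n
    denom = sum((v - mean) ** 2 for v in x)
    num = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    return num / denom

def ljung_box_q(x, max_lag):
    """Ljung-Box Q statistic; large values signal serial correlation."""
    n = len(x)
    return n * (n + 2) * sum(acf(x, k) ** 2 / (n - k)
                             for k in range(1, max_lag + 1))

# Hypothetical daily USD returns with an obvious alternating pattern.
usd_rets = [0.001, -0.002, 0.001, -0.002, 0.001, -0.002, 0.001, -0.002]
q = ljung_box_q(usd_rets, max_lag=2)
```

Under the null of no autocorrelation, Q is approximately chi-squared with max_lag degrees of freedom; here the strongly alternating series yields a Q far above the 5% critical value of about 5.99 for two lags.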
Abstract:
The potential use of negative electrospray ionisation mass spectrometry (ESI-MS) in the characterisation of the three polyacetylenes common in carrots (Daucus carota) has been assessed. The MS scans have demonstrated that the polyacetylenes undergo a modest degree of in-source decomposition in the negative ionisation mode, while the positive ionisation mode has shown predominantly sodiated ions and no [M+H](+) ions. Tandem mass spectrometric (MS/MS) studies have shown that the polyacetylenes follow two distinct fragmentation pathways: one that involves cleavage of the C3-C4 bond and the other with cleavage of the C7-C8 bond. The cleavage of the C7-C8 bond generated product ions m/z 105.0 for falcarinol, m/z 105/107.0 for falcarindiol, and m/z 147.0/149.1 for falcarindiol-3-acetate. In addition to these product ions, the transitions m/z 243.2 -> 187.1 (falcarinol), m/z 259.2 -> 203.1 (falcarindiol), and m/z 301.2 -> 255.2/203.1 (falcarindiol-3-acetate), mostly from the C3-C4 bond cleavage, can form the basis of multiple reaction monitoring (MRM)-quantitative methods, which are poorly represented in the literature. The 'MS3' experimental data confirmed a less pronounced homolytic cleavage site at the C11-C12 bond in the falcarinol-type polyacetylenes. The optimised liquid chromatography (LC)/MS conditions have achieved a baseline chromatographic separation of the three polyacetylenes investigated within a 40 min total run-time. Copyright (C) 2011 John Wiley & Sons, Ltd.
Abstract:
Macrophage cholesterol homeostasis is a key process involved in the initiation and progression of atherosclerosis. Peroxisome proliferator-activated receptors (PPARs) regulate the transcription of the genes involved in cholesterol homeostasis and thus represent an important therapeutic target in terms of reducing atherosclerosis. Conjugated linoleic acid (CLA) is a potent anti-atherogenic dietary fatty acid in animal models of atherosclerosis and is capable of activating PPARs in vitro and in vivo. Therefore, this study examined whether the anti-atherogenic effects of CLA in vivo could be ascribed to altered cholesterol homeostasis in macrophages and macrophage-derived foam cells. Of several genes that regulate cholesterol homeostasis investigated, CLA had the greatest effect on the class B scavenger receptor CD36. The cis-9,trans-11 CLA (c9,t11-CLA) and trans-10,cis-12 CLA (t10,c12-CLA) isomers augmented CD36 mRNA expression (P
Abstract:
Doctoral thesis, Pharmacy (Biochemistry), Universidade de Lisboa, Faculdade de Farmácia, 2014
Abstract:
In common with many other plasma membrane glycoproteins of eukaryotic origin, the promastigote surface protease (PSP) of the protozoan parasite Leishmania contains a glycosyl-phosphatidylinositol (GPI) membrane anchor. The GPI anchor of Leishmania major PSP was purified following proteolysis of the PSP and analyzed by two-dimensional 1H-1H NMR, compositional and methylation linkage analyses, chemical and enzymatic modifications, and amino acid sequencing. From these results, the structure of the GPI-containing peptide was found to be Asp-Gly-Gly-Asn-ethanolamine-PO4-6Man alpha 1-6Man alpha 1-4GlcN alpha 1-6myo-inositol-1-PO4-(1-alkyl-2-acyl-glycerol). The glycan structure is identical to the conserved glycan core regions of the GPI anchor of Trypanosoma brucei variant surface glycoprotein and rat brain Thy-1 antigen, supporting the notion that this portion of GPIs is highly conserved. The phosphatidylinositol moiety of the PSP anchor is unusual, containing a fully saturated, unbranched 1-O-alkyl chain (mainly C24:0) and a mixture of fully saturated, unbranched 2-O-acyl chains (C12:0, C14:0, C16:0, and C18:0). This lipid composition differs significantly from those of the GPIs of T. brucei variant surface glycoprotein and mammalian erythrocyte acetylcholinesterase but is similar to that of a family of glycosylated phosphoinositides found uniquely in Leishmania.
Abstract:
Manuscript produced in the personal scriptorium of Pierre d'Ailly (1351-1420), like mss. lat. 5703 and 17473 (cf. P. Hefti). This manuscript is the twin of ms. lat. 14531, which comes from Saint-Victor de Paris. The script style and decoration of the two manuscripts are identical (cf. P. Hefti).
Abstract:
Hg(18-crown-6)Cl2 and Cd(18-crown-6)Cl2 are isostructural, space group C1̄, Z = 2. For the mercury compound, a = 10.444(2) Å, b = 11.468(1) Å, c = 7.754(1) Å, α = 90.06(1)°, β = 82.20(1)°, γ = 90.07(1)°, Dobs = 1.87, Dcalc = 1.93, V = 920.05 Å³, R = 4.66%. For the cadmium compound, a = 10.374(1) Å, b = 11.419(2) Å, c = 7.729(1) Å, α = 89.95(1)°, β = 81.86(2)°, γ = 89.99(1)°, Dobs = 1.61, Dcalc = 1.64, V = 906.46 Å³, R = 3.95%. The mercury and cadmium ions exhibit hexagonal bipyramidal coordination, with the metal ion located on a centre of symmetry in the plane of the oxygen atoms. The main differences between the two structures are an increase in the metal-oxygen distance and a reduction in the metal-chloride distance when the central ion changes from Cd2+ to Hg2+. These differences may be explained in terms of the differences in hardness or softness of the metal ions and the donor atoms.
Abstract:
In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we highlight the particular light that econometrics can shed on certain general themes of methodology and philosophy of science, such as falsifiability as a criterion of the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we emphasize the contrast between two styles of modelling, the parsimonious approach and the statistical-descriptive approach, and we discuss the links between the theory of statistical tests and the philosophy of science.
Abstract:
A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods; yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics, as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
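The Monte Carlo test technique behind these exact p-values ranks the observed statistic among statistics simulated under the null: with N replications, the p-value (1 + #{simulated ≥ observed})/(N + 1) is exact by construction. A minimal sketch, using a toy stand-in statistic rather than any of the paper's homoskedasticity criteria:

```python
import random

def mc_p_value(observed_stat, simulate_stat, n_rep=99, seed=0):
    """Exact Monte Carlo p-value: rank the observed statistic among
    n_rep statistics drawn under the null hypothesis."""
    rng = random.Random(seed)
    sims = [simulate_stat(rng) for _ in range(n_rep)]
    ge = sum(1 for s in sims if s >= observed_stat)
    return (ge + 1) / (n_rep + 1)

# Toy null distribution: squared mean of 20 standard-normal errors
# (a hypothetical stand-in, not one of the paper's test statistics).
def sim(rng):
    errs = [rng.gauss(0.0, 1.0) for _ in range(20)]
    m = sum(errs) / len(errs)
    return m * m

p = mc_p_value(observed_stat=0.5, simulate_stat=sim, n_rep=99, seed=42)
```

The key point, exploited in the paper, is that no null distribution needs to be derived analytically: only the ability to simulate the statistic under the null is required, which is what makes intractable sup-type and combined statistics manageable.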
Abstract:
In this paper, we analyze recent developments in econometrics in the light of the theory of statistical tests. We first review some fundamental principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of testing theory as a formalization of the principle of falsification for probabilistic models, and the logical justification of the basic notions of testing theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze some specific cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality, or dynamic specification. We point out that these difficulties often stem from the ambition to weaken the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we stress the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties can be established in finite samples.
Abstract:
We propose finite sample tests and confidence sets for models with unobserved and generated regressors, as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, to test hypotheses about the "structural parameters" of interest and to build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general, possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests are examined in simulation experiments; in general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.
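The instrument substitution idea in the Anderson-Rubin spirit tests H0: β = β0 by regressing y − β0·x directly on the instrument: under the null the instrument should have no explanatory power, regardless of how weak it is. A scalar pure-Python sketch with illustrative data (one endogenous regressor, one instrument, no intercept; not the paper's full procedure):

```python
def ar_statistic(y, x, z, beta0):
    """Anderson-Rubin-type statistic for H0: beta = beta0 in y = beta*x + u:
    the squared t-statistic of the instrument z in an OLS regression of
    y - beta0*x on z. Its null distribution does not depend on instrument
    strength."""
    n = len(y)
    v = [yi - beta0 * xi for yi, xi in zip(y, x)]
    szz = sum(zi * zi for zi in z)
    szv = sum(zi * vi for zi, vi in zip(z, v))
    gamma = szv / szz                       # OLS slope of v on z
    resid = [vi - gamma * zi for vi, zi in zip(v, z)]
    s2 = sum(e * e for e in resid) / (n - 1)
    return gamma * gamma * szz / s2         # squared t-statistic

# Illustrative data with true beta = 2 (hypothetical numbers).
x = [1.0, 2.0, 3.0, 4.0]
z = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.05, 7.95]
stat_true = ar_statistic(y, x, z, beta0=2.0)   # small: H0 not rejected
stat_false = ar_statistic(y, x, z, beta0=0.0)  # large: H0 rejected
```

A confidence set is then obtained by inverting the test: collect the β0 values for which the statistic stays below the chosen critical value. Because the statistic stays valid under weak identification, such a set can be unbounded, which is exactly the behaviour the paper's procedures accommodate.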