943 results for Low Density Lipoprotein Cholesterol
Abstract:
"May 1976."
Abstract:
"The experimental study reported here has been performed under the sponsorship of the Fluid Dynamics Branch, Aeronautical Research laboratories of the U.S. Air Force Air Research and Development Command, Contract no. AF 33(616)-6025."
Abstract:
Recently, very potent extracorporeal cholesterol-lowering treatment options have become available for patients with hypercholesterolemia. LDL immunoapheresis treatment selectively removes LDL and lipoprotein(a) from the circulation. Since LDL is the major carrier of lipophilic antioxidants in plasma, the purpose of the present study was to assess the effects of a single LDL apheresis treatment on plasma concentrations of tocopherols (alpha- and gamma-tocopherol), carotenoids (alpha- and beta-carotene, zeaxanthin, cryptoxanthin, canthaxanthin, and lycopene), and retinol. Plasma antioxidant concentrations were determined by HPLC in 7 patients with familial hypercholesterolemia before and after LDL immunoapheresis treatment. Plasma concentrations of both alpha- and gamma-tocopherol and the different carotenoids were significantly reduced by LDL apheresis. However, when standardized for cholesterol to adjust for cholesterol removal, alpha- and gamma-tocopherol, retinol, and the more polar carotenoids lutein and zeaxanthin increased in response to apheresis treatment, while the more nonpolar carotenoids such as beta-carotene and lycopene did not change. These data demonstrate that a single LDL immunoapheresis treatment affects tocopherols and individual carotenoids differently. This may be explained by differences in chemical structure and preferential association with different lipoproteins. These results further imply that tocopherols, lutein, zeaxanthin, and retinol are associated in part with lipoproteins and other carriers, such as retinol-binding protein, that are not removed during apheresis treatment.
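To make the cholesterol-standardization step concrete, here is a minimal sketch in Python with invented, purely illustrative numbers (not the study's data): expressing each antioxidant per unit of total cholesterol shows whether it was removed proportionally with LDL.

```python
# Hypothetical illustration of cholesterol-standardization (not the study's data):
# an antioxidant that falls proportionally less than cholesterol shows an
# *increased* cholesterol-standardized level after apheresis.

def standardized(antioxidant_umol_l: float, cholesterol_mmol_l: float) -> float:
    """Antioxidant concentration expressed per unit of total cholesterol."""
    return antioxidant_umol_l / cholesterol_mmol_l

# Illustrative numbers only: cholesterol drops ~60%, alpha-tocopherol drops ~45%.
pre_chol, post_chol = 8.0, 3.2    # mmol/L
pre_toco, post_toco = 30.0, 16.5  # umol/L

pre_ratio = standardized(pre_toco, pre_chol)     # 3.75
post_ratio = standardized(post_toco, post_chol)  # ~5.16: standardized level rises
print(f"standardized alpha-tocopherol: {pre_ratio:.2f} -> {post_ratio:.2f}")
```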
Abstract:
Urban encroachment on dense, coastal koala populations has ensured that their management has received increasing government and public attention. The recently developed National Koala Conservation Strategy calls for maintenance of viable populations in the wild. Yet the success of this, and other, conservation initiatives is hampered by the lack of reliable and generally accepted national and regional population estimates. In this paper we address this problem in a potentially large, but poorly studied, regional population in the State that is likely to have the largest wild populations. We draw on findings from previous reports in this series and apply the faecal standing-crop method (FSCM) to derive a regional estimate of more than 59 000 individuals. Validation trials in riverine communities showed that estimates of animal density obtained from the FSCM and direct observation were in close agreement. Bootstrapping and Monte Carlo simulations were used to obtain variance estimates for our population estimates in different vegetation associations across the region. The most favoured habitat was riverine vegetation, which covered only 0.9% of the region but supported 45% of the koalas. We also estimated that between 1969 and 1995 approximately 30% of the native vegetation associations considered potential koala habitat were cleared, leading to a decline of perhaps 10% in koala numbers. Management of this large regional population has significant implications for the national conservation of the species: the continued viability of this population is critically dependent on the retention and management of riverine and residual vegetation communities, and future vegetation-management guidelines should be cognisant of the potential impacts of clearing even small areas of critical habitat. We also highlight eight management implications.
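The faecal standing-crop method rests on a simple relation: animal density equals the pellet standing crop divided by the product of the per-animal defecation rate and the mean pellet persistence time. A minimal sketch with hypothetical plot counts and rate/persistence values (the report's exact parameterization may differ), including a bootstrap for the variance step the abstract mentions:

```python
import random
import statistics

def fscm_density(pellets_per_ha, defecation_rate, persistence_days):
    """Faecal standing-crop method: animal density (animals/ha) from the
    standing crop of pellets, the per-animal daily defecation rate, and
    the mean pellet persistence time. Standard FSCM relation; the report's
    exact parameterization may differ."""
    return pellets_per_ha / (defecation_rate * persistence_days)

# Hypothetical plot counts (pellets/ha) and illustrative rate/persistence values.
plot_counts = [5200, 6100, 4800, 7300, 5600, 6900, 5100, 6400]
RATE, PERSIST = 130.0, 120.0  # pellets/animal/day, days

point = fscm_density(statistics.mean(plot_counts), RATE, PERSIST)

# Bootstrap the plots to attach a variance to the density estimate,
# echoing the resampling approach described in the abstract.
random.seed(1)
boots = [
    fscm_density(statistics.mean(random.choices(plot_counts, k=len(plot_counts))),
                 RATE, PERSIST)
    for _ in range(10_000)
]
print(f"density = {point:.3f} animals/ha, bootstrap SE = {statistics.stdev(boots):.3f}")
```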
Abstract:
The consensus from published studies is that plasma lipids are each influenced by genetic factors, and that this contributes to genetic variation in risk of cardiovascular disease. Heritability estimates for lipids and lipoproteins are in the range 0.48 to 0.87 when measured once per study participant. However, this ignores the confounding effects of biological variation, measurement error, and ageing, and a truer assessment of genetic effects on cardiovascular risk may be obtained from analysis of longitudinal twin or family data. We have analyzed information on plasma high-density lipoprotein (HDL) and low-density lipoprotein (LDL) cholesterol, and triglycerides, from 415 adult twins who provided blood on two to five occasions over 10 to 17 years. Multivariate modeling of genetic and environmental contributions to variation within and across occasions was used to assess the extent to which genetic and environmental factors have long-term effects on plasma lipids. Results indicated that more than one genetic factor influenced the HDL and LDL components of cholesterol, and triglycerides, over time in all studies. Nonshared environmental factors did not have significant long-term effects except for HDL. We conclude that when heritability of lipid risk factors is estimated on only one occasion, the existence of biological variation and measurement errors leads to underestimation of the importance of genetic factors as a cause of variation in long-term risk within the population. In addition, our data suggest that different genes may affect the risk profile at different ages.
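The study's multivariate longitudinal models are fitted with specialized structural-equation software; as a simpler illustration of the attenuation argument, here is a sketch using Falconer's classical cross-sectional formula (not the study's method) with invented correlations, showing how measurement error in a single-occasion measure deflates the heritability estimate:

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate h^2 = 2 * (r_MZ - r_DZ): a classical
    approximation, not the multivariate model used in the study."""
    return 2.0 * (r_mz - r_dz)

def attenuated(r: float, reliability: float) -> float:
    """A twin correlation computed from a single noisy measurement is
    scaled down by the measurement reliability (classical attenuation)."""
    return r * reliability

# Illustrative values only.
r_mz_true, r_dz_true, reliability = 0.80, 0.45, 0.75

h2_true = falconer_h2(r_mz_true, r_dz_true)                  # 0.70
h2_single = falconer_h2(attenuated(r_mz_true, reliability),
                        attenuated(r_dz_true, reliability))  # 0.525
print(f"long-term h2 ~ {h2_true:.2f}, single-occasion h2 ~ {h2_single:.2f}")
```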
Abstract:
A longitudinal capture-mark-recapture study was conducted to determine the temporal dynamics of rabbit haemorrhagic disease (RHD) in a European rabbit (Oryctolagus cuniculus) population of low to moderate density on sand-hill country in the lower North Island of New Zealand. A combination of sampling (trapping and radio-tracking) and diagnostic (cELISA, PCR and isotype ELISA) methods was employed to obtain data weekly from May 1998 until June 2001. Although rabbit haemorrhagic disease virus (RHDV) infection was detected in the study population in all 3 years, disease epidemics were evident only in the late summer or autumn months in 1999 and 2001. Overall, 20% of 385 samples obtained from adult animals older than 11 weeks were seropositive. An RHD outbreak in 1999 contributed to an estimated population decline of 26%. A second RHD epidemic in February 2001 was associated with a population decline of 52% over the subsequent month. Following the outbreaks, the seroprevalence in adult survivors was between 40% and 50%. During 2000, no deaths from RHDV were confirmed and mortalities were predominantly attributed to predation. Influx of seronegative immigrants was greatest in the 1999 and 2001 breeding seasons, and preceded the RHD epidemics in those years. Our data suggest that RHD epidemics require the population immunity level to fall below a threshold where propagation of infection can be maintained through the population.
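The closing threshold claim is the classical herd-immunity condition: infection can propagate while R0 times the susceptible fraction exceeds one. A minimal sketch with an assumed, purely illustrative R0 for RHDV (the study does not report one):

```python
def outbreak_possible(seroprevalence: float, r0: float) -> bool:
    """Classical threshold: an epidemic can propagate when
    R0 * (1 - immune fraction) > 1. R0 is an assumed illustrative
    value, not an estimate from the study."""
    return r0 * (1.0 - seroprevalence) > 1.0

R0 = 1.8  # hypothetical; implies a ~44% immunity threshold (1 - 1/R0)
# Post-outbreak seroprevalence (40-50%) sits near the threshold; influx of
# seronegative immigrants dilutes immunity and re-opens the epidemic window.
for sero in (0.50, 0.45, 0.40, 0.20):
    print(f"seroprevalence {sero:.0%}: outbreak possible = {outbreak_possible(sero, R0)}")
```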
Abstract:
Low-density parity-check codes with irregular constructions have recently been shown to outperform the most advanced error-correcting codes to date. In this paper we apply methods of statistical physics to study the typical properties of simple irregular codes. We use the replica method to find a phase transition which coincides with Shannon's coding bound when appropriate parameters are chosen. The decoding by belief propagation is also studied using statistical physics arguments; the theoretical solutions obtained are in good agreement with simulation results. We compare the performance of irregular codes with that of regular codes and discuss the factors that contribute to the improvement in performance.
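For readers unfamiliar with the decoding algorithm analysed here, the following is a textbook sum-product belief-propagation decoder for a binary LDPC code: a generic sketch, not the paper's replica-based analysis, and the toy parity-check matrix is purely illustrative.

```python
import numpy as np

def bp_decode(H, llr, max_iters=50):
    """Sum-product belief propagation for a binary LDPC code.
    H: (m, n) parity-check matrix over GF(2); llr: channel log-likelihood
    ratios, positive meaning bit 0 is more likely."""
    m, n = H.shape
    rows, cols = np.nonzero(H)
    v2c = {(i, j): llr[j] for i, j in zip(rows, cols)}  # variable-to-check msgs
    c2v = {}
    hard = (llr < 0).astype(int)
    for _ in range(max_iters):
        # Check-to-variable update (tanh rule), excluding the target edge.
        for i in range(m):
            js = np.nonzero(H[i])[0]
            t = np.tanh(np.array([v2c[(i, j)] for j in js]) / 2.0)
            for k, j in enumerate(js):
                prod = np.prod(np.delete(t, k))
                c2v[(i, j)] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # Variable-to-check update and tentative hard decision.
        total = llr.astype(float)
        for (i, j), msg in c2v.items():
            total[j] += msg
        for key in v2c:
            i, j = key
            v2c[key] = total[j] - c2v[key]
        hard = (total < 0).astype(int)
        if not np.any((H @ hard) % 2):  # zero syndrome: valid codeword
            return hard, True
    return hard, False

# Tiny example: all-zeros codeword with one flipped bit on a BSC.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
received = np.array([0, 0, 1, 0, 0, 0])
llr = np.where(received == 1, -2.0, 2.0)  # LLR > 0 favours bit 0
decoded, ok = bp_decode(H, llr)
print(decoded, ok)  # -> [0 0 0 0 0 0] True
```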
Abstract:
A variation of low-density parity check (LDPC) error-correcting codes defined over Galois fields (GF(q)) is investigated using statistical physics. A code of this type is characterised by a sparse random parity check matrix composed of C non-zero elements per column. We examine the dependence of the code performance on the value of q, for finite and infinite C values, both in terms of the thermodynamical transition point and the practical decoding phase characterised by the existence of a unique (ferromagnetic) solution. We find different q-dependence in the cases of C = 2 and C ≥ 3; the analytical solutions are in agreement with simulation results, providing a quantitative measure to the improvement in performance obtained using non-binary alphabets.
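As a concrete illustration of the non-binary setting: for prime q a parity check over GF(q) reduces to integer arithmetic mod q. The matrix below is a toy with C = 2 non-zero entries per column, not one from the paper; for non-prime q such as GF(4), full polynomial field arithmetic would be required.

```python
import numpy as np

def syndrome_gfq(H, x, q):
    """Syndrome of word x under parity-check matrix H over GF(q), q prime,
    computed with plain integer arithmetic mod q."""
    return (H @ x) % q

q = 5
H = np.array([[2, 0, 1, 3],   # toy matrix over GF(5), C = 2 per column
              [1, 4, 0, 2],
              [0, 3, 2, 0]])

codeword = np.array([0, 1, 1, 3])           # satisfies H x = 0 (mod 5)
print(syndrome_gfq(H, codeword, q))          # -> [0 0 0]: valid word
corrupted = (codeword + np.array([1, 0, 0, 0])) % q
print(syndrome_gfq(H, corrupted, q))         # non-zero syndrome flags the error
```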
Abstract:
We study the performance of Low Density Parity Check (LDPC) error-correcting codes using the methods of statistical physics. LDPC codes are based on the generation of codewords using Boolean sums of the original message bits by employing two randomly-constructed sparse matrices. These codes can be mapped onto Ising spin models and studied using common methods of statistical physics. We examine various regular constructions and obtain insight into their theoretical and practical limitations. We also briefly report on results obtained for irregular code constructions, for codes with non-binary alphabets, and on how finite system size affects the error probability.
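A minimal sketch of the encoding scheme described above, in the MacKay-Neal style where a transmitted word t satisfies A t = B s (mod 2) for message s and two sparse matrices A and B; the tiny matrices here are illustrative, while practical codes use large random sparse ones.

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2) by Gauss-Jordan elimination (A invertible)."""
    A, b = A.copy() % 2, b.copy() % 2
    n = A.shape[0]
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])
        A[[col, pivot]], b[[col, pivot]] = A[[pivot, col]], b[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] ^= A[col]   # row operations are XOR over GF(2)
                b[r] ^= b[col]
    return b

A = np.array([[1, 1, 0, 0],      # illustrative sparse, invertible A
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])
B = np.array([[1, 0, 1, 0],      # illustrative sparse B
              [0, 1, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]])
s = np.array([1, 0, 1, 1])            # message bits
t = gf2_solve(A, (B @ s) % 2)         # transmitted word: A t = B s (mod 2)
assert np.array_equal((A @ t) % 2, (B @ s) % 2)
print(t)  # -> [1 1 1 1]
```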
Abstract:
Modern digital communication systems achieve reliable transmission by employing error-correction techniques that add redundancy. Low-density parity-check codes work along the principles of the Hamming code, but their parity-check matrix is very sparse and multiple errors can be corrected. The sparseness of the matrix allows the decoding process to be carried out by probability-propagation methods similar to those employed in Turbo codes. The relation between spin systems in statistical physics and digital error-correcting codes is based on the existence of a simple isomorphism between the additive Boolean group and the multiplicative binary group. Shannon proved general results on the natural limits of compression and error correction by setting up the framework known as information theory. Error-correction codes are based on mapping the original space of words onto a higher-dimensional space in such a way that the typical distance between encoded words increases.
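The isomorphism in question is the map x -> (-1)^x, which carries XOR (addition in the Boolean group {0, 1}) to multiplication in {+1, -1}, so that parity checks become products of spins; a quick check:

```python
from itertools import product

# x -> (-1)**x maps the additive Boolean group ({0,1}, XOR) onto the
# multiplicative binary group ({+1,-1}, *).
spin = lambda x: (-1) ** x

assert all(spin(a ^ b) == spin(a) * spin(b) for a, b in product((0, 1), repeat=2))

# A parity check "sum(bits) mod 2 == 0" becomes the spin constraint "prod == +1".
bits = [1, 0, 1, 1, 1]
parity_ok = sum(bits) % 2 == 0
spin_prod = 1
for x in bits:
    spin_prod *= spin(x)
print(parity_ok, spin_prod == 1)  # -> True True
```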
Abstract:
Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel noise models.
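The two channel models differ in the log-likelihood ratio the decoder receives. A small sketch for BPSK signalling (0 -> +1, 1 -> -1) using the standard LLR formulas for Gaussian and Laplace noise densities (a generic illustration, not code from the paper):

```python
import numpy as np

def llr_awgn(y, sigma):
    """LLR for BPSK over the binary-input AWGN channel: 2*y / sigma^2."""
    return 2.0 * y / sigma**2

def llr_laplace(y, lam):
    """LLR for BPSK over the binary-input Laplace channel with scale lam:
    (|y + 1| - |y - 1|) / lam."""
    return (np.abs(y + 1.0) - np.abs(y - 1.0)) / lam

y = np.array([0.9, -0.2, 0.05])
print(llr_awgn(y, sigma=0.8))   # grows linearly in y
print(llr_laplace(y, lam=0.8))  # saturates at +/- 2/lam once |y| > 1
```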
Abstract:
We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multispin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems.
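Schematically, the mapping referred to here replaces bits by spins sigma_j = (-1)^{x_j}, so that each parity check becomes a multispin coupling; a generic form of the resulting Hamiltonian (details vary by construction) is:

```latex
% Schematic LDPC-to-Ising Hamiltonian: each check a couples the spins it involves.
\begin{equation}
  \mathcal{H}(\boldsymbol{\sigma})
  = -\sum_{a=1}^{M} J_a \prod_{j \in \partial a} \sigma_j
    - F \sum_{j=1}^{N} s_j \, \sigma_j
\end{equation}
```

Here \partial a denotes the set of bits entering check a, the couplings J_a enforce the parity constraints, and the field term carries the channel evidence s_j with strength F (matched to the true noise level on the Nishimori line).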