203 results for kernel density estimator


Relevance:

20.00%

Abstract:

Although a positive correlation between periodontal infections and cardiovascular diseases has been reported, the molecular mechanism linking periodontal pathogens to atherosclerosis formation remains unclear. Objective: To determine whether atherosclerosis-related genes are affected in foam cells during and after their formation by P. gingivalis lipopolysaccharide (LPS) stimulation. Methods: Macrophages derived from human THP-1 monocytes were treated with oxidized low-density lipoprotein (oxLDL) to induce foam cell formation. P. gingivalis LPS was added to cultures of either oxLDL-induced macrophages or foam cells. The expression of atherosclerosis-related genes was assayed by quantitative real-time PCR, and the production of granulocyte-macrophage colony-stimulating factor (GM-CSF), monocyte chemotactic protein-1 (MCP-1), IL-1β, IL-10, and IL-12 was determined by ELISA. Nuclear translocation of NF-κB p65 was detected by immunocytochemistry, and western blotting was used to evaluate IκB-α degradation to confirm activation of the NF-κB pathway. Results: P. gingivalis LPS stimulated atherosclerosis-related gene expression in foam cells and increased oxLDL-induced expression of chemokines, adhesion molecules, growth factors, apoptotic genes, and nuclear receptors in macrophages. Transcription of the pro-inflammatory cytokines IL-1β and IL-12 was elevated in response to LPS in both macrophages and foam cells, whereas the anti-inflammatory cytokine IL-10 was not affected. Increased NF-κB pathway activation was also observed in macrophages co-stimulated with LPS and oxLDL. Conclusion: P. gingivalis LPS appears to be an important factor in the development of atherosclerosis, stimulating atherosclerosis-related gene expression in both macrophages and foam cells via activation of the NF-κB pathway.

Relevance:

20.00%

Abstract:

Chondrocyte density in articular cartilage is known to change with the development and growth of the tissue and may play an important role in the formation of a functional extracellular matrix (ECM). The objective of this study was to determine how the initial chondrocyte density in an alginate hydrogel affects matrix composition, its distribution between the cell-associated matrix (CM) and further-removed matrix (FRM) fractions, and the tensile mechanical properties of the developing engineered cartilage. Alginate constructs containing primary bovine chondrocytes at densities of 0, 4, 16, and 64 million cells/ml were fabricated and cultured for 1 or 2 weeks, at which time structural, biochemical, and mechanical properties were analyzed. Both matrix content and distribution varied with the initial cell density. Increasing cell density resulted in increasing content of collagen and sulfated glycosaminoglycan (GAG) and an increasing proportion of these molecules localized in the CM. While the equilibrium tensile modulus of cell-free alginate did not change with time in culture, the constructs with the highest cell density were 116% stiffer than cell-free controls after 2 weeks of culture. The equilibrium tensile modulus was positively correlated with total collagen (r2 = 0.47, p < 0.001) and GAG content (r2 = 0.68, p < 0.001), and these relationships were stronger when only the matrix molecules in the CM fraction were considered (r2 = 0.60 and 0.72 for collagen and GAG, respectively, each p < 0.001). Overall, the results of this study indicate that initial cell density has a considerable effect on the developing composition, structure, and function of alginate–chondrocyte constructs.

Relevance:

20.00%

Abstract:

BACKGROUND - High-density lipoprotein (HDL) protects against arterial atherothrombosis, but it is unknown whether it protects against recurrent venous thromboembolism. METHODS AND RESULTS - We studied 772 patients after a first spontaneous venous thromboembolism (average follow-up 48 months) and recorded the end point of symptomatic recurrent venous thromboembolism, which developed in 100 of the 772 patients. The relationship between plasma lipoprotein parameters and recurrence was evaluated. Plasma apolipoproteins AI and B were measured by immunoassays for all subjects. Compared with those without recurrence, patients with recurrence had lower mean (±SD) levels of apolipoprotein AI (1.12±0.22 versus 1.23±0.27 mg/mL, P<0.001) but similar apolipoprotein B levels. The relative risk of recurrence was 0.87 (95% CI, 0.80 to 0.94) for each increase of 0.1 mg/mL in plasma apolipoprotein AI. Compared with patients with apolipoprotein AI levels in the lowest tertile (<1.07 mg/mL), the relative risk of recurrence was 0.46 (95% CI, 0.27 to 0.77) for the highest-tertile patients (apolipoprotein AI >1.30 mg/mL) and 0.78 (95% CI, 0.50 to 1.22) for midtertile patients (apolipoprotein AI of 1.07 to 1.30 mg/mL). Using nuclear magnetic resonance, we determined the levels of 10 major lipoprotein subclasses and HDL cholesterol for 71 patients with recurrence and 142 matched patients without recurrence. We found a strong trend for association between recurrence and low levels of HDL particles and HDL cholesterol. CONCLUSIONS - Patients with high levels of apolipoprotein AI and HDL have a decreased risk of recurrent venous thromboembolism. © 2007 American Heart Association, Inc.

Relevance:

20.00%

Abstract:

Background - Although dyslipoproteinemia is associated with arterial atherothrombosis, little is known about plasma lipoproteins in venous thrombosis patients. Methods and Results - We determined plasma lipoprotein subclass concentrations using nuclear magnetic resonance spectroscopy and antigenic levels of apolipoproteins AI and B in blood samples from 49 male venous thrombosis patients and matched controls aged <55 years. Venous thrombosis patients had significantly lower levels of HDL particles, large HDL particles, HDL cholesterol, and apolipoprotein AI and significantly higher levels of LDL particles and small LDL particles. The quartile-based odds ratios for decreased HDL particle and apolipoprotein AI levels in patients compared with controls were 6.5 and 6.0 (95% CI, 2.3 to 19 and 2.1 to 17), respectively. Odds ratios for the apolipoprotein B/apolipoprotein AI ratio and the LDL cholesterol/HDL cholesterol ratio were 6.3 and 2.7 (95% CI, 1.9 to 21 and 1.1 to 6.5), respectively. When polymorphisms in the genes for hepatic lipase, endothelial lipase, and cholesteryl ester transfer protein were analyzed, patients differed significantly from controls in the allelic frequency of the TaqI B1/B2 polymorphism in cholesteryl ester transfer protein, consistent with the observed pattern of lower HDL and higher LDL. Conclusions - Venous thrombosis in men aged <55 years is associated with dyslipoproteinemia involving lower levels of HDL particles, elevated levels of small LDL particles, and an elevated apolipoprotein B/apolipoprotein AI ratio. This dyslipoproteinemia appears to be associated with a corresponding difference in cholesteryl ester transfer protein genotype. © 2005 American Heart Association, Inc.

Relevance:

20.00%

Abstract:

Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space - classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data this gives a powerful transductive algorithm - using the labeled part of the data one can learn an embedding also for the unlabeled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
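As a concrete illustration, the sketch below sets up a simplified variant of this idea in Python with cvxpy: the kernel matrix over training and test points is restricted to a nonnegative combination of a few base kernels (which keeps it positive semidefinite without an explicit semidefinite constraint), its training block is normalized in Frobenius norm, and the weights are chosen to maximize the alignment of that block with the label kernel yy'. The toy data, RBF base kernels, and this reduced formulation are assumptions for brevity; the paper's general approach optimizes the kernel matrix directly with SDP.

```python
import numpy as np
import cvxpy as cp

# Toy data: 30 points, the first 20 labeled (training), the last 10 unlabeled (test).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = np.sign(rng.normal(size=20))

def rbf(A, gamma):
    sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

kernels = [rbf(X, g) for g in (0.1, 1.0, 10.0)]   # base kernels over train + test points
n_tr = len(y)

# Nonnegative combination of PSD base kernels, so K is automatically PSD.
mu = cp.Variable(len(kernels), nonneg=True)
K = sum(mu[i] * kernels[i] for i in range(len(kernels)))

# Maximize alignment between the training block of K and the label kernel y y',
# with the Frobenius norm of that block bounded to fix the scale.
align = cp.sum(cp.multiply(np.outer(y, y), K[:n_tr, :n_tr]))
constraints = [cp.sum_squares(K[:n_tr, :n_tr]) <= 1]
cp.Problem(cp.Maximize(align), constraints).solve()

print("learned kernel weights:", np.round(mu.value, 4))
# The learned K also covers the test rows/columns, so it can be plugged into a
# kernel classifier for transductive prediction on the unlabeled points.
```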

Relevance:

20.00%

Abstract:

Recent research on multiple kernel learning has led to a number of approaches for combining kernels in regularized risk minimization. The proposed approaches include different formulations of objectives and varying regularization strategies. In this paper we present a unifying optimization criterion for multiple kernel learning and show how existing formulations are subsumed as special cases. We also derive the criterion’s dual representation, which is suitable for general smooth optimization algorithms. Finally, we evaluate multiple kernel learning in this framework analytically using a Rademacher complexity bound on the generalization error and empirically in a set of experiments.
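For intuition about what such a criterion optimizes, here is a minimal Python sketch of multiple kernel learning as regularized risk minimization, using kernel ridge regression as the loss and alternating between the closed-form ridge solution and a gradient step on simplex-constrained kernel weights. The toy data, RBF base kernels, step size, and the crude renormalization step are illustrative assumptions and not any specific paper's algorithm.

```python
import numpy as np

def rbf(A, gamma):
    sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Toy regression data (an assumption; any kernels and targets would do).
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)

kernels = [rbf(X, g) for g in (0.1, 1.0, 10.0)]    # base kernels K_1..K_m
lam = 1e-1                                         # ridge regularization strength
d = np.full(len(kernels), 1.0 / len(kernels))      # kernel weights on the simplex

# Alternate between the closed-form kernel ridge solution for the combined
# kernel and a gradient step on the weights d. This is one simple instance of
# the regularized-risk-minimization view of MKL.
for _ in range(50):
    K = sum(dm * Km for dm, Km in zip(d, kernels))
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    # dJ/dd_m for the optimal-value function J(d) = lam * y'(K(d) + lam*I)^{-1} y
    grad = np.array([-lam * alpha @ Km @ alpha for Km in kernels])
    d = d - 0.01 * grad
    d = np.clip(d, 0.0, None)
    d = d / d.sum()          # crude renormalization, not an exact simplex projection

print("learned kernel weights:", np.round(d, 3))
```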

Relevance:

20.00%

Abstract:

Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space -- classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semi-definite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data this gives a powerful transductive algorithm -- using the labelled part of the data one can learn an embedding also for the unlabelled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method to learn the 2-norm soft margin parameter in support vector machines, solving another important open problem. Finally, the novel approach presented in the paper is supported by positive empirical results.
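To make the last point concrete, the sketch below writes out the dual of the 2-norm soft margin SVM in Python with cvxpy: the slack penalty appears only as a ridge term K + I/C added to the kernel, which is why the margin parameter can be folded into the convex kernel-matrix learning problem described in the abstract. The toy data, RBF kernel, fixed C, and the Cholesky reformulation of the quadratic term are assumptions made for a compact, solver-friendly example.

```python
import numpy as np
import cvxpy as cp

# Toy binary classification data (illustrative only).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1.0, 1.0, (15, 2)), rng.normal(1.0, 1.0, (15, 2))])
y = np.concatenate([-np.ones(15), np.ones(15)])
n = len(y)

sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)                     # RBF kernel on the training points
C = 1.0                                   # 2-norm soft margin parameter

# Dual of the 2-norm soft margin SVM:
#   max  sum(alpha) - 1/2 * alpha' diag(y) (K + I/C) diag(y) alpha
#   s.t. alpha >= 0,  y' alpha = 0
# The soft margin only replaces K with K + I/C inside the quadratic term.
Q = np.outer(y, y) * (K + np.eye(n) / C)
R = np.linalg.cholesky(Q + 1e-9 * np.eye(n))   # Q = R R', giving a DCP-friendly form

alpha = cp.Variable(n, nonneg=True)
objective = cp.Maximize(cp.sum(alpha) - 0.5 * cp.sum_squares(R.T @ alpha))
cp.Problem(objective, [y @ alpha == 0]).solve()

print("number of support vectors:", int(np.sum(alpha.value > 1e-6)))
```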

Relevance:

20.00%

Abstract:

In the multi-view approach to semi-supervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm, in which the views are reproducing kernel Hilbert spaces (RKHSs), and the disagreement penalty is the average squared difference in predictions. The final predictor is the pointwise average of the predictors from each view. We call the set of predictors that can result from this procedure the co-regularized hypothesis class. Our main result is a tight bound on the Rademacher complexity of the co-regularized hypothesis class in terms of the kernel matrices of each RKHS. We find that co-regularization reduces the Rademacher complexity by an amount that depends on the distance between the two views, as measured by a data-dependent metric. We then use standard techniques to bound the gap between training error and test error for the CoRLS algorithm. Experimentally, we find that the amount of reduction in complexity introduced by co-regularization correlates with the amount of improvement that co-regularization gives in the CoRLS algorithm.
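As a rough illustration of the mechanics, the sketch below implements a two-view co-regularized least squares fit in Python: by the representer theorem each view's predictor is a kernel expansion over the labeled and unlabeled points, the squared-loss-plus-disagreement objective is quadratic, and its normal equations form a single block linear system. The toy data, the use of two RBF bandwidths as stand-ins for genuinely different views, and the penalty values are assumptions, not the CoRLS paper's experimental setup.

```python
import numpy as np

def rbf(A, gamma):
    sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Toy data: n_l labeled + (n - n_l) unlabeled points. Two "views" are mimicked
# by two RBF bandwidths (real multi-view data would use separate feature sets).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = np.sin(X[:, 0])
n_l, n = 15, len(X)

K1 = rbf(X, 0.5)   # view-1 kernel over labeled + unlabeled points
K2 = rbf(X, 2.0)   # view-2 kernel
L = np.hstack([np.eye(n_l), np.zeros((n_l, n - n_l))])        # selects labeled rows
U = np.hstack([np.zeros((n - n_l, n_l)), np.eye(n - n_l)])    # selects unlabeled rows

g1 = g2 = 1e-2     # per-view RKHS norm penalties
mu = 1.0           # co-regularization (disagreement) penalty

# Normal equations of
#   ||y_l - L K1 a1||^2 + ||y_l - L K2 a2||^2
#   + g1 a1' K1 a1 + g2 a2' K2 a2 + mu ||U K1 a1 - U K2 a2||^2
A11 = K1 @ (L.T @ L + mu * U.T @ U) @ K1 + g1 * K1
A22 = K2 @ (L.T @ L + mu * U.T @ U) @ K2 + g2 * K2
A12 = -mu * K1 @ U.T @ U @ K2
A = np.block([[A11, A12], [A12.T, A22]])
b = np.concatenate([K1 @ L.T @ y[:n_l], K2 @ L.T @ y[:n_l]])
a = np.linalg.solve(A + 1e-8 * np.eye(2 * n), b)   # tiny ridge for numerical safety
a1, a2 = a[:n], a[n:]

# Final predictor: the pointwise average of the two views.
f = 0.5 * (K1 @ a1 + K2 @ a2)
print("labeled-point RMSE:", np.sqrt(np.mean((f[:n_l] - y[:n_l]) ** 2)))
```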