984 results for Meta-heuristics algorithms
Abstract:
BACKGROUND: Data on the association between subclinical thyroid dysfunction and fractures conflict. PURPOSE: To assess the risk for hip and nonspine fractures associated with subclinical thyroid dysfunction among prospective cohorts. DATA SOURCES: Search of MEDLINE and EMBASE (1946 to 16 March 2014) and reference lists of retrieved articles without language restriction. STUDY SELECTION: Two physicians screened and identified prospective cohorts that measured thyroid function and followed participants to assess fracture outcomes. DATA EXTRACTION: One reviewer extracted data using a standardized protocol, and another verified data. Both reviewers independently assessed methodological quality of the studies. DATA SYNTHESIS: The 7 population-based cohorts of heterogeneous quality included 50,245 participants with 1966 hip and 3281 nonspine fractures. In random-effects models that included the 5 higher-quality studies, the pooled adjusted hazard ratios (HRs) of participants with subclinical hyperthyroidism versus euthyroidism were 1.38 (95% CI, 0.92 to 2.07) for hip fractures and 1.20 (CI, 0.83 to 1.72) for nonspine fractures without statistical heterogeneity (P = 0.82 and 0.52, respectively; I² = 0%). Pooled estimates for the 7 cohorts were 1.26 (CI, 0.96 to 1.65) for hip fractures and 1.16 (CI, 0.95 to 1.42) for nonspine fractures. When thyroxine recipients were excluded, the HRs for participants with subclinical hyperthyroidism were 2.16 (CI, 0.87 to 5.37) for hip fractures and 1.43 (CI, 0.73 to 2.78) for nonspine fractures. For participants with subclinical hypothyroidism, HRs from higher-quality studies were 1.12 (CI, 0.83 to 1.51) for hip fractures and 1.04 (CI, 0.76 to 1.42) for nonspine fractures (P for heterogeneity = 0.69 and 0.88, respectively; I² = 0%). LIMITATIONS: Selective reporting cannot be excluded. Adjustment for potential common confounders varied and was not adequately done across all studies.
CONCLUSION: Subclinical hyperthyroidism might be associated with an increased risk for hip and nonspine fractures, but additional large, high-quality studies are needed. PRIMARY FUNDING SOURCE: Swiss National Science Foundation.
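The random-effects pooling reported in abstracts like the one above can be illustrated with a short sketch of the standard DerSimonian-Laird procedure; the hazard ratios and confidence intervals below are invented for illustration and are not taken from the review.

```python
import math

def pool_random_effects(log_effects, ses):
    """DerSimonian-Laird random-effects pooling of log effect sizes
    (e.g. log hazard ratios) with their standard errors."""
    w = [1.0 / se**2 for se in ses]          # inverse-variance (fixed-effect) weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sw
    # Cochran's Q and between-study variance tau^2
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights and pooled estimate
    w_re = [1.0 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * y for wi, y in zip(w_re, log_effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I-squared in percent
    return pooled, se_pooled, i2

# hypothetical study-level hazard ratios with 95% CIs
hrs = [1.4, 1.1, 1.6]
cis = [(0.9, 2.2), (0.7, 1.7), (1.0, 2.5)]
logs = [math.log(h) for h in hrs]
ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
pooled, se, i2 = pool_random_effects(logs, ses)
print(f"pooled HR = {math.exp(pooled):.2f}, "
      f"95% CI {math.exp(pooled - 1.96 * se):.2f}-{math.exp(pooled + 1.96 * se):.2f}, "
      f"I2 = {i2:.0f}%")
```

When the studies are homogeneous (Q below its degrees of freedom), tau² and I² collapse to zero and the estimate reduces to the fixed-effect pool, mirroring the I² = 0% results quoted above.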
Abstract:
Recently, several anonymization algorithms have appeared for privacy preservation on graphs. Some of them are based on randomization techniques and others on k-anonymity concepts. Both can be used to obtain an anonymized graph with a given k-anonymity value. In this paper we compare algorithms based on both techniques in order to obtain an anonymized graph with a desired k-anonymity value. We analyze the complexity of these methods for generating anonymized graphs and the quality of the resulting graphs.
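Under the common degree-based definition (a graph is k-degree-anonymous when every occurring degree value is shared by at least k vertices), the k-anonymity value of a graph can be computed directly; a minimal sketch with a hypothetical toy graph:

```python
from collections import Counter

def k_degree_anonymity(adjacency):
    """Return the k-degree-anonymity of a graph given as an adjacency list:
    the minimum, over all occurring degree values, of the number of
    vertices that share that degree."""
    degrees = [len(neigh) for neigh in adjacency.values()]
    counts = Counter(degrees)
    return min(counts.values())

# hypothetical toy graph on vertices 0-3
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
# degrees are 2, 2, 3, 1: degrees 3 and 1 each occur once, so k = 1
print(k_degree_anonymity(g))  # 1
```

An anonymization algorithm of either family would then modify edges (by randomization or degree adjustment) until this value reaches the desired k.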
Abstract:
PURPOSE: Subclinical hypothyroidism has been associated with elevated cholesterol and increased risk for atherosclerosis, but data on the risk of coronary heart disease (CHD) are conflicting. We performed a systematic review to determine whether subclinical hypothyroidism is associated with CHD in adults. METHODS: We searched MEDLINE from 1966 to April 2005, and the bibliographies of key articles to identify studies that provided risk estimates for CHD or cardiovascular mortality associated with subclinical hypothyroidism. Two authors independently reviewed each potential study for eligibility, assessed methodologic quality, and extracted the data. RESULTS: We identified 14 observational studies that met eligibility criteria. Subclinical hypothyroidism increased the risk of CHD (summary odds ratio [OR]: 1.65, 95% confidence interval [CI], 1.28-2.12). The summary OR for CHD was 1.81 (CI, 1.38-2.39) in 9 studies adjusted or matched for demographic characteristics, and 2.38 (CI, 1.53-3.69) after pooling the studies that adjusted for most cardiovascular risk factors. Sensitivity analyses including only population-based studies and those with formal outcome adjudication procedures yielded similar results. Subgroup analyses by type of study design showed a similar trend, but lower risk, in the 5 prospective cohort studies (OR 1.42, CI, 0.91-2.21), compared with the case-control and cross-sectional studies (OR 1.72, CI, 1.25-2.38). CONCLUSION: Our systematic review indicates that subclinical hypothyroidism is associated with an increased risk of CHD. Clinical trials are needed to assess whether thyroxine replacement reduces the risk of CHD in subjects with subclinical hypothyroidism.
Abstract:
Résumé: This thesis is devoted to the analysis, modeling and visualization of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences.
The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organizing maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both through the traditional geostatistical approach, with experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool of the geostatistical analysis of anisotropic spatial correlations that detects the presence of spatial patterns describable by a two-point statistic. The machine learning approach to ESDA is presented through the application of the k-nearest-neighbors method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office. This collection was developed over the last 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualization were also developed, with care taken to create a user-friendly, easy-to-use interface.
Machine Learning for geospatial data: algorithms, software tools and case studies
Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense machine learning can be considered as a subfield of artificial intelligence. It is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning?
In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to being implemented as predictive engines in decision support systems for the purposes of environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. They have competitive efficiency with geostatistical models in low-dimensional geographical spaces but are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to detect the presence of spatial patterns, at least those described by two-point statistics.
A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where the GRNN significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, soil type and hydro-geological unit classification, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
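The GRNN proposed for automatic mapping is, in essence, Nadaraya-Watson kernel regression: each prediction is a Gaussian-kernel weighted average of the training values, governed by a single smoothing parameter. A minimal sketch, with invented 2-D coordinates and measurements (not the SIC 2004 data):

```python
import math

def grnn_predict(train_xy, train_z, query, sigma):
    """GRNN (Nadaraya-Watson) prediction at a query location:
    Gaussian-kernel weighted average of the training values."""
    num = den = 0.0
    for (x, y), z in zip(train_xy, train_z):
        d2 = (x - query[0])**2 + (y - query[1])**2
        w = math.exp(-d2 / (2 * sigma**2))   # Gaussian kernel weight
        num += w * z
        den += w
    return num / den

# hypothetical measurements at four locations
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [1.0, 2.0, 3.0, 4.0]
print(grnn_predict(pts, vals, (0.5, 0.5), sigma=0.5))  # equidistant -> mean 2.5
```

In practice the single smoothing parameter sigma is tuned by cross-validation, which is what makes the model attractive for automatic (operator-free) mapping.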
Abstract:
The incidence of hepatocellular carcinoma (HCC) is increasing in Western countries. Although several clinical factors have been identified, many individuals never develop HCC, suggesting a genetic susceptibility. However, to date, only a few single-nucleotide polymorphisms have been reproducibly shown to be linked to HCC onset. A variant (rs738409 C>G, encoding for p.I148M) in the PNPLA3 gene is associated with liver damage in chronic liver diseases. Interestingly, several studies have reported that the minor rs738409[G] allele is more represented in HCC cases in chronic hepatitis C (CHC) and alcoholic liver disease (ALD). However, a significant association with HCC related to CHC has not been consistently observed, and the strength of the association between rs738409 and HCC remains unclear. We performed a meta-analysis of individual participant data including 2,503 European patients with cirrhosis to assess the association between rs738409 and HCC, particularly in ALD and CHC. We found that rs738409 was strongly associated with overall HCC (odds ratio [OR] per G allele, additive model = 1.77; 95% confidence interval [CI]: 1.42-2.19; P = 2.78 × 10⁻⁷). This association was more pronounced in ALD (OR = 2.20; 95% CI: 1.80-2.67; P = 4.71 × 10⁻¹⁵) than in CHC patients (OR = 1.55; 95% CI: 1.03-2.34; P = 3.52 × 10⁻²). After adjustment for age, sex, and body mass index, the variant remained strongly associated with HCC. Conclusion: Overall, these results suggest that rs738409 exerts a marked influence on hepatocarcinogenesis in patients with cirrhosis of European descent and provide a strong argument for performing further mechanistic studies to better understand the role of PNPLA3 in HCC development.
Abstract:
SYNTHESIS REPORT: Background: Secondary cardiovascular prevention programs after an acute coronary event have demonstrated their efficacy in the ambulatory care setting. Hospitalization for an acute illness can be regarded as a "teachable moment," particularly suited to a change in health behavior, during which secondary prevention interventions such as patient education could be especially effective. Moreover, prescribing cardiovascular prevention medications during hospitalization appears to increase the proportion of patients treated according to guidelines over the long term. Recently, several studies have evaluated the efficacy of prevention programs aimed at educating patients and/or at increasing the rate at which treating physicians prescribe drugs of proven efficacy. The article forming this thesis synthesizes the existing literature on the efficacy, in terms of mortality, of multidimensional cardiovascular prevention interventions after an acute coronary syndrome that are initiated in hospital, centered on the patient and targeting several cardiovascular risk factors. METHODS AND RESULTS: Using a predefined search strategy, we included controlled clinical trials and before-after studies that were initiated in hospital and reported clinical follow-up outcomes in terms of mortality, readmission rates and/or recurrence of acute coronary syndrome. We categorized the studies according to whether they targeted patients (for example, a nurse-led patient education intervention), caregivers (for example, courses teaching resident physicians how to deliver educational interventions) or the healthcare system (for example, the institution-wide implementation of clinical pathways).
Overall, the interventions reported in the 14 studies meeting the criteria showed a relative risk (RR) reduction in mortality after one year (RR = 0.79; 95% confidence interval (CI), 0.69-0.92; n = 37,585). However, the benefit appeared to depend on the study type and the level of intervention. Before-after studies suggested a reduced risk of mortality (RR, 0.77; 95% CI, 0.66-0.90; n = 3,680 deaths), whereas the RR was 0.96 (95% CI, 0.64-1.44; n = 99 deaths) for controlled clinical trials. Only the before-after studies, and the studies that targeted caregivers and the system in addition to patients, appeared to show a mortality benefit at one year. CONCLUSIONS AND PERSPECTIVES: The evidence for the efficacy of hospital-initiated, patient-targeted secondary prevention interventions is promising but not definitive, since only the before-after studies show a mortality benefit. Future research in this field should formally test which components of these interventions bring the greatest benefit to patients.
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior assumption and relies on well-tested heuristics.
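The core mechanism described, evaluating a candidate schedule by sampling the random processing times and averaging deterministic makespans, can be sketched as follows; the instance, the noise model, and all parameter values are hypothetical illustrations, not the authors' setup:

```python
import random

def makespan(perm, proc):
    """Completion time of the last job on the last machine of a
    permutation flow shop (standard forward recurrence)."""
    m = len(proc[0])
    finish = [0.0] * m          # finish[k] = completion time of last job on machine k
    for j in perm:
        for k in range(m):
            start = max(finish[k], finish[k - 1] if k > 0 else 0.0)
            finish[k] = start + proc[j][k]
    return finish[-1]

def expected_makespan(perm, mean_times, n_samples=1000, cv=0.2, rng=None):
    """Monte Carlo estimate of the expected makespan: sample processing
    times around their means, evaluate deterministically, and average."""
    rng = rng or random.Random(42)
    total = 0.0
    for _ in range(n_samples):
        sample = [[max(0.01, rng.gauss(t, cv * t)) for t in job]
                  for job in mean_times]
        total += makespan(perm, sample)
    return total / n_samples

# hypothetical 3-job, 2-machine instance (mean processing times)
means = [[3.0, 2.0], [1.0, 4.0], [2.0, 2.0]]
det = makespan(range(3), means)            # deterministic makespan at the means
est = expected_makespan(range(3), means)   # Monte Carlo expected makespan
print(det, round(est, 2))
```

A deterministic heuristic (e.g. a NEH-type constructive procedure) would generate candidate permutations, and this Monte Carlo evaluation would rank them by estimated expected makespan.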
Abstract:
The objective of this work was to perform a meta-analysis of the association of conjugated linoleic acid (CLA) with performance and with carcass and meat quality in pigs (Sus scrofa domesticus). The database comprised 15 articles published between 1999 and 2006, totaling 216 diets and 5,223 animals. The meta-analysis was carried out through graphical analyses (to check the biological coherence of the data), correlation analyses (to identify correlated variables) and variance-covariance analyses. The analysis-of-variance model included only the meat and carcass variables most correlated with the animals' CLA intake, in addition to codings for inter- and intra-experiment effects. The inclusion of linoleic acid showed a negative correlation with feed efficiency and positive correlations with feed intake and weight gain. Feed intake, weight gain and feed efficiency of the pigs were not altered. Conjugated linoleic acid increased the lean meat content of the carcass by 9%, and its intake changed the average backfat thickness. Conjugated linoleic acid increases lean meat content and reduces backfat thickness in the carcass without influencing performance or meat quality in pigs.
Meta-analysis: subclinical thyroid dysfunction and the risk for coronary heart disease and mortality
Abstract:
BACKGROUND: Data on the association between subclinical thyroid dysfunction and coronary heart disease (CHD) and mortality are conflicting. PURPOSE: To summarize prospective evidence about the relationship between subclinical thyroid dysfunction and CHD and mortality. DATA SOURCES: MEDLINE (1950 to January 2008) without language restrictions and reference lists of retrieved articles were searched. STUDY SELECTION: Two reviewers screened and selected cohort studies that measured thyroid function and then followed persons prospectively to assess CHD or mortality. DATA EXTRACTION: By using a standardized protocol and forms, 2 reviewers independently abstracted and assessed studies. DATA SYNTHESIS: Ten of 12 identified studies involved population-based cohorts that included 14,449 participants. All 10 population-based cohort studies examined risks associated with subclinical hypothyroidism (2134 CHD events and 2822 deaths), whereas only 5 examined risks associated with subclinical hyperthyroidism (1392 CHD events and 1993 deaths). In a random-effects model, the relative risk (RR) for subclinical hypothyroidism for CHD was 1.20 (95% CI, 0.97 to 1.49; P for heterogeneity = 0.14; I² = 33.4%). Risk estimates were lower when higher-quality studies were pooled (RR, 1.02 to 1.08) and were higher among participants younger than 65 years (RR, 1.51 [CI, 1.09 to 2.09] for studies with mean participant age <65 years and 1.05 [CI, 0.90 to 1.22] for studies with mean participant age ≥65 years). The RR was 1.18 (CI, 0.98 to 1.42) for cardiovascular mortality and 1.12 (CI, 0.99 to 1.26) for total mortality. For subclinical hyperthyroidism, the RR was 1.21 (CI, 0.88 to 1.68) for CHD, 1.19 (CI, 0.81 to 1.76) for cardiovascular mortality, and 1.12 (CI, 0.89 to 1.42) for total mortality (P for heterogeneity >0.50; I² = 0% for all studies). LIMITATIONS: Individual studies adjusted for different potential confounders, and 1 study provided only unadjusted data.
Publication bias or selective reporting of outcomes could not be excluded. CONCLUSION: Subclinical hypothyroidism and hyperthyroidism may be associated with a modestly increased risk for CHD and mortality, with lower risk estimates when higher-quality studies are pooled and larger CIs for subclinical hyperthyroidism.
Abstract:
The purpose of this meta-analysis was to examine the efficacy of maintenance treatments for bipolar disorder. Placebo-controlled or active comparator bipolar maintenance clinical trials of ≥6 months' duration with at least 15 patients/treatment group were identified using Medline, EMBASE, clinicaltrials.gov, and Cochrane databases (1993 to July 2010). The main outcome measure was relative risk for relapse for patients in remission. Twenty trials (5,364 patients) were identified. Overall, lithium and quetiapine were the most studied agents (eight and five trials, respectively). The majority of studies included patients who had previously responded to treatment for an acute episode. All interventions, with the exception of perphenazine + mood stabilizer, showed a relative risk for manic/mixed or depressive relapse below 1.0, although there was variation in the statistical significance of the findings vs. placebo. No monotherapy was associated with a significantly reduced risk for both manic/mixed and depressed relapse. Of the combination treatments, only quetiapine + lithium/divalproex was associated with a significantly reduced risk vs. comparator (placebo + lithium/valproate) for relapse at both the manic/mixed and depressed poles of bipolar illness. Limitations of the analysis include differences in study durations and definitions of relapse. In conclusion, available maintenance therapies show considerable variation in efficacy. The efficacy of lithium and divalproex has been confirmed, but newer therapies, such as a number of atypical antipsychotics, were also shown to be effective in bipolar disorder. The efficacy of all maintenance interventions needs to be balanced against the safety and tolerability profiles of individual agents.
Abstract:
BACKGROUND: Pharmacists can play a decisive role in the management of ambulatory patients with depression who have poor adherence to antidepressant drugs. OBJECTIVE: To systematically evaluate the effectiveness of pharmacist care in improving adherence of depressed outpatients to antidepressants. METHODS: A systematic review and meta-analysis of randomized controlled trials (RCTs) was conducted. RCTs were identified through electronic databases (MEDLINE, Cochrane Central Register of Controlled Trials, Institute for Scientific Information Web of Knowledge, and Spanish National Research Council) from inception to April 2010, reference lists were checked, and experts were consulted. RCTs that evaluated the impact of pharmacist interventions on improving adherence to antidepressants in depressed patients in an outpatient setting (community pharmacy or pharmacy service) were included. Methodologic quality was assessed and methodologic details and outcomes were extracted in duplicate. RESULTS: Six RCTs were identified. A total of 887 patients with an established diagnosis of depression who were initiating or maintaining pharmacologic treatment with antidepressant drugs and who received pharmacist care (459 patients) or usual care (428 patients) were included in the review. The most commonly reported interventions were patient education and monitoring, monitoring and management of toxicity and adverse effects, adherence promotion, provision of written or visual information, and recommendation or implementation of changes or adjustments in medication. Overall, no statistical heterogeneity or publication bias was detected. The pooled odds ratio, using a random effects model, was 1.64 (95% CI 1.24 to 2.17). Subgroup analysis showed no statistically significant differences in results by type of pharmacist involved, adherence measure, diagnostic tool, or analysis strategy. 
CONCLUSIONS: These results suggest that pharmacist intervention is effective in the improvement of patient adherence to antidepressants. However, data are still limited and we would recommend more research in this area, specifically outside of the US.
Abstract:
This work focuses on the prediction of the two main nitrogenous variables that describe the water quality at the effluent of a wastewater treatment plant. We have developed two kinds of neural network architectures, based on considering either only one output or, on the other hand, the usual five effluent variables that define water quality: suspended solids, biochemical organic matter, chemical organic matter, total nitrogen and total Kjeldahl nitrogen. Two learning techniques, based on a classical adaptive gradient and on a Kalman filter, have been implemented. In order to improve generalization and performance, we have selected variables by means of genetic algorithms and fuzzy systems. The training, testing and validation sets show that the final networks are able to learn the simulated available data well enough, especially for total nitrogen.
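A single-output network of the kind described, a one-hidden-layer perceptron trained by gradient descent, can be sketched on synthetic data; the two input variables and the "nitrogen" target below are invented stand-ins for the plant variables, and plain stochastic gradient descent stands in for the adaptive-gradient and Kalman-filter schemes:

```python
import math
import random

def train_mlp(X, y, hidden=4, lr=0.05, epochs=1000, seed=0):
    """Train a one-hidden-layer MLP (tanh hidden units, linear output)
    with per-sample (stochastic) gradient descent on squared error."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
                 for j in range(hidden)]
            out = sum(w * hj for w, hj in zip(W2, h)) + b2
            err = out - t
            for j in range(hidden):
                gh = err * W2[j] * (1 - h[j] ** 2)   # backpropagated gradient
                W2[j] -= lr * err * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * gh * x[i]
                b1[j] -= lr * gh
            b2 -= lr * err

    def predict(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
             for j in range(hidden)]
        return sum(w * hj for w, hj in zip(W2, h)) + b2
    return predict

# synthetic "effluent nitrogen" target driven by two input variables
data = [((a / 10, b / 10), 0.5 * (a / 10) + 0.3 * (b / 10) ** 2)
        for a in range(10) for b in range(10)]
X = [x for x, _ in data]
y = [t for _, t in data]
predict = train_mlp(X, y)
mse = sum((predict(x) - t) ** 2 for x, t in zip(X, y)) / len(y)
print(round(mse, 4))
```

The multi-output variant the abstract mentions would simply replace the scalar output layer with five linear output units, one per effluent variable.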
Abstract:
Purpose: More than five hundred million direct dental restorations are placed each year worldwide. In about 55% of the cases, resin composites or compomers are used, and in 45% amalgam. The longevity of posterior resin restorations is well documented. However, data on resin composites that are placed without enamel/dentin conditioning and resin composites placed with self-etching adhesive systems are missing. Material and Methods: The database SCOPUS was searched for clinical trials on posterior resin composites without restricting the search to the year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) minimum number of restorations at last recall = 20; (3) report on dropout rate; (4) report of operative technique and materials used; (5) utilization of Ryge or modified Ryge evaluation criteria. For amalgam, only those studies were included that directly compared composite resin restorations with amalgam. For the statistical analysis, a linear mixed model was used with random effects to account for the heterogeneity between the studies. P-values under 0.05 were considered significant. Results: Of the 373 clinical trials, 59 studies met the inclusion criteria. In 70% of the studies, Class II and Class I restorations had been placed. The overall success rate of composite resin restorations was about 90% after 10 years, which was not different from that of amalgam. Restorations with compomers had a significantly lower longevity. The main reasons for replacement were bulk fractures and caries adjacent to restorations. Both of these incidents were infrequent in most studies and accounted only for about 6% of all replaced restorations after 10 years. Restorations with macrofilled composites and compomer suffered significantly more loss of anatomical form than restorations with other types of material.
Restorations that were placed without enamel acid etching and a dentin bonding agent showed significantly more marginal staining and detectable margins compared to those restorations placed using the enamel-etch or etch-and-rinse technique; restorations with self-etching systems fell between the other groups. Restorations with compomer suffered significantly more chippings (repairable fractures) than restorations with other materials, which did not statistically differ among each other. Restorations that were placed with a rubber-dam showed significantly fewer material fractures that needed replacement, and this also had a significant effect on the overall longevity. Conclusion: Restorations with hybrid and microfilled composites that were placed with the enamel-etching technique and rubber-dam showed the best overall performance; the longevity of these restorations was similar to that of amalgam restorations. Compomer restorations, restorations placed with macrofilled composites, and resin restorations with no-etching or self-etching adhesives demonstrated significant shortcomings and shorter longevity.