41 results for Weapons of Mass Destruction
Abstract:
Integrated approaches using different in vitro methods in combination with bioinformatics can (i) increase the success rate and speed of drug development; (ii) improve the accuracy of toxicological risk assessment; and (iii) increase our understanding of disease. Three-dimensional (3D) cell culture models are important building blocks of this strategy, which has emerged in recent years. The majority of these models are organotypic, i.e., they aim to reproduce major functions of an organ or organ system. In many cases this implies that more than one cell type forms the 3D structure, and matrix elements often play an important role. This review summarizes the state of the art concerning commonalities of the different models. For instance, the theory of mass transport and metabolite exchange in 3D systems and the special analytical requirements for test endpoints in organotypic cultures are discussed in detail. In the next part, 3D model systems for selected organs (liver, lung, skin, brain) are presented and characterized in dedicated chapters. 3D approaches to the modeling of tumors are also presented and discussed. All chapters give a historical background, illustrate the large variety of approaches, and highlight advantages and drawbacks as well as specific requirements. Moreover, they discuss applications in disease modeling, drug discovery and safety assessment. Finally, consensus recommendations indicate a roadmap for the successful implementation of 3D models in routine screening. The use of such models is expected to accelerate progress by reducing error rates and incorrect predictions in compound testing.
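To make the mass-transport point concrete, the following is a minimal sketch (not taken from the review; all parameter values are illustrative, order-of-magnitude numbers) of the classic zero-order diffusion-reaction estimate for when the core of a spherical 3D culture becomes anoxic:

```python
import math

# Minimal sketch of steady-state oxygen transport in a spherical spheroid,
# assuming zero-order consumption (a standard textbook simplification, not
# a model taken from the review). All parameter values are illustrative.
D = 2.0e-9   # oxygen diffusion coefficient in tissue (m^2/s)
c0 = 0.2     # oxygen concentration at the spheroid surface (mol/m^3)
q = 2.0e-2   # volumetric consumption rate (mol/(m^3*s))

# For zero-order uptake, C(r) = c0 - q/(6*D) * (R^2 - r^2), so the center
# of a sphere of radius R becomes anoxic when R exceeds sqrt(6*D*c0/q).
r_crit = math.sqrt(6 * D * c0 / q)
print(f"critical spheroid radius ~ {r_crit * 1e6:.0f} um")
```

With these illustrative values the critical radius is a few hundred micrometres, on the order of the size at which avascular spheroids are commonly reported to develop hypoxic cores.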
Abstract:
General Summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank, and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries that can be classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First, we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects).
We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. In a first experiment, we next construct a hypothetical world where no trade happens, i.e., each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, the effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter agree with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to formally write present productivity as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new estimates of orders of magnitude for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
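The center-of-gravity computation in the fifth chapter can be sketched as follows: each city is weighted by an economic mass, positions are averaged in 3D Cartesian space, and the resulting sub-surface point is projected back onto the globe. The city data below are invented for illustration; the thesis uses a large set of the world's largest cities.

```python
import math

# Hedged sketch of the "physical center of mass" idea: weight each city's
# position on the globe by its economic mass, average in 3D, and project
# the result back onto the sphere.
cities = [
    # (latitude, longitude, economic weight, e.g. city GDP) - illustrative
    (40.7, -74.0, 1.0),   # New York
    (51.5,  -0.1, 0.8),   # London
    (35.7, 139.7, 0.9),   # Tokyo
    (31.2, 121.5, 0.7),   # Shanghai
]

x = y = z = w_tot = 0.0
for lat, lon, w in cities:
    phi, lam = math.radians(lat), math.radians(lon)
    x += w * math.cos(phi) * math.cos(lam)
    y += w * math.cos(phi) * math.sin(lam)
    z += w * math.sin(phi)
    w_tot += w

x, y, z = x / w_tot, y / w_tot, z / w_tot
# The weighted mean lies below the surface; project it back out.
lat_cg = math.degrees(math.atan2(z, math.hypot(x, y)))
lon_cg = math.degrees(math.atan2(y, x))
print(f"center of gravity: {lat_cg:.1f}N, {lon_cg:.1f}E")
```

Tracking this point year by year, with weights updated from city-level production data, gives the shifting path the chapter visualizes in Google Earth.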
Abstract:
BACKGROUND: Whole pelvis intensity modulated radiotherapy (IMRT) is increasingly being used to treat cervical cancer, aiming to reduce side effects. Encouraged by this, some groups have proposed the use of a simultaneous integrated boost (SIB) to target the tumor, either to achieve a higher tumoricidal effect or to replace brachytherapy. Nevertheless, physiological organ movement and rapid tumor regression throughout treatment might substantially reduce any benefit of this approach. PURPOSE: To evaluate clinical target volume - simultaneous integrated boost (CTV-SIB) regression and motion during chemo-radiotherapy (CRT) for cervical cancer, and to monitor treatment progress dosimetrically and volumetrically to ensure treatment goals are met. METHODS AND MATERIALS: Ten patients treated with standard doses of CRT and brachytherapy were retrospectively re-planned with a helical tomotherapy SIB technique for the hypothetical scenario of this feasibility study. Target and organs at risk (OAR) were contoured on deformably fused planning computed tomography and megavoltage computed tomography images. The CTV-SIB volume regression was determined. The center of mass (CM) was used to evaluate the degree of motion. The Dice similarity coefficient (DSC) was used to assess the spatial overlap of CTV-SIBs between scans. A cumulative dose-volume histogram was used to model the estimated delivered doses. RESULTS: The CTV-SIB relative reduction was between 31% and 70%. The mean maximum CM change was 12.5, 9, and 3 mm in the superior-inferior, antero-posterior, and right-left dimensions, respectively. The CTV-SIB-DSC approached 1 in the first week of treatment, indicating almost perfect overlap. The CTV-SIB-DSC decreased linearly during therapy and was 0.5 by the end of treatment, indicating 50% discordance. Two patients received less than 95% of the prescribed dose. Much higher doses to the OAR were observed. A multiple regression analysis showed a significant interaction between CTV-SIB reduction and OAR dose increase. CONCLUSIONS: The CTV-SIB showed substantial regression and motion during CRT, receiving lower therapeutic doses than expected. The OAR had unpredictable shifts and received higher doses. The use of SIB without frequent adaptation of the treatment plan exposes cervical cancer patients to an unpredictable risk of under-dosing the target and/or overdosing adjacent critical structures. In that scenario, brachytherapy continues to be the gold standard approach.
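For reference, the Dice similarity coefficient used above is DSC = 2|A ∩ B| / (|A| + |B|) for two segmented volumes A and B. A minimal sketch on toy binary masks (not the study's data):

```python
import numpy as np

# Dice similarity coefficient between two binary segmentation masks:
# DSC = 2|A & B| / (|A| + |B|); 1.0 = perfect overlap, 0.0 = none.
def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy masks standing in for CTV-SIB contours on two scans; the smaller
# second mask mimics target regression during treatment.
week1 = np.zeros((10, 10), dtype=bool); week1[2:8, 2:8] = True
week5 = np.zeros((10, 10), dtype=bool); week5[3:7, 3:7] = True
print(f"DSC = {dice(week1, week5):.2f}")
```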
Abstract:
Profiles of carbon isotopes were studied in marine limestones of Late Permian and Early Triassic age of the Tethyan region, from 20 sections in Yugoslavia, Greece, Turkey, the Armenian SSR, Iran, Pakistan, India, Nepal, and China. The Upper Permian sections continue the high positive values of δ13C previously found in Upper Permian basins in NW Europe and the western USA. In the more complete sections of the Tethys it can now be demonstrated that the values of δ13C drop from the Murgabian to the Dzhulfian Stages of the Upper Permian, then sharply to values near zero during the last two biozones of the Dorashamian. These levels of δ13C sample the Tethys Sea and the world ocean, and equal values from deep-water sediments at Salamis, Greece, indicate that they apply to the whole water column. We hypothesize that the high values of δ13C are a consequence of Late Paleozoic storage of organic carbon, and that the declines represent an episodic cessation of this organic deposition and a partial oxidation of the organic reservoir, extending over a period of several million years. The carbon isotope profile may reflect a parallel complexity in the pattern of mass extinction in Late Permian time.
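For reference, the δ13C values discussed above express the 13C/12C ratio of a sample relative to a standard, in per mil:

```latex
\delta^{13}\mathrm{C} \;=\; \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000\ \text{‰}
```

Positive carbonate values thus indicate seawater enriched in 13C, consistent with enhanced burial of isotopically light organic carbon; the drop toward zero marks the return of that light carbon to the ocean-atmosphere system.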
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the αVβ3 integrin, starting far away from the binding pocket.
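A minimal sketch of the RMSD success criterion used in this validation (coordinates are invented; a fixed atom correspondence and a common reference frame are assumed, as is usual when comparing docking poses against the crystal ligand):

```python
import numpy as np

# A docking pose counts as "correct" when its RMSD to the crystallographic
# ligand is below 2 A. Both coordinate sets are (n_atoms, 3) arrays in the
# same frame, with matching atom order.
def rmsd(pose: np.ndarray, reference: np.ndarray) -> float:
    return float(np.sqrt(np.mean(np.sum((pose - reference) ** 2, axis=1))))

crystal = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
pose = crystal + np.array([0.3, -0.2, 0.1])   # rigidly shifted toy pose
print(f"RMSD = {rmsd(pose, crystal):.2f} A")  # < 2 A -> counted as a success
```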
Abstract:
Following the success of the first round table in 2001, the Swiss Proteomic Society has organized two additional specific events during its last two meetings: a proteomic application exercise in 2002 and a round table in 2003. The main objective of these events is to bring together, around a challenging topic in mass spectrometry, two groups of specialists: those who develop and commercialize mass spectrometry equipment and software, and expert MS users in peptidomics and proteomics. The first round table (Geneva, 2001), entitled "Challenges in Mass Spectrometry", was supported by brief oral presentations that stressed critical questions in the field of MS development or applications (Stöcklin and Binz, Proteomics 2002, 2, 825-827). Topics discussed included (i) direct analysis of complex biological samples; (ii) status and perspectives for MS investigations of noncovalent peptide-ligand interactions; (iii) whether it is more appropriate to have complementary instruments rather than one universal instrument; (iv) standardization and improvement of MS signals for protein identification; (v) what the next generation of equipment would be; and finally (vi) how to keep MS hardware and software up to date and accessible to all. For the SPS'02 meeting (Lausanne, 2002), a full-session alternative event, the "Proteomic Application Exercise", was proposed. Two different samples were prepared and sent to the participants: 100 µg of snake venom (a complex mixture of peptides and proteins) and 10-20 µg of an almost pure recombinant polypeptide derived from the shrimp Penaeus vannamei carrying a heterogeneous post-translational modification (PTM). Among the 15 participants that received the samples blind, eight returned results, and most of them were asked to present their results at the congress, emphasizing the strategy, manpower and instrumentation used (Binz et al., Proteomics 2003, 3, 1562-1566). It appeared that for the snake venom extract, the quality of the results was not particularly dependent on the strategy used, as all approaches allowed identification of a certain number of protein families. The genus of the snake was identified in most cases, but the species remained ambiguous. Surprisingly, the precise identification of the almost pure recombinant polypeptide turned out to be much more complicated than expected, as only one group reported the full sequence. Finally, the SPS'03 meeting reported here included a round table on the difficult and challenging task of "Quantification by Mass Spectrometry", a discussion sustained by four selected oral presentations: on the use of stable isotopes, on electrospray ionization versus matrix-assisted laser desorption/ionization approaches to quantify peptides and proteins in biological fluids, on the handling of differential two-dimensional liquid chromatography tandem mass spectrometry data from high-throughput experiments, and on the quantitative analysis of PTMs. During these three events at SPS meetings, the impressive quality and quantity of exchanges between the developers and providers of mass spectrometry equipment and software, expert users and the audience were a key element of these fruitful events and have definitely paved the way for future round tables and challenging exercises at SPS meetings.
Abstract:
Research into the biomechanical manifestation of fatigue during exhaustive runs is increasingly popular, but a better understanding of how spring-mass behaviour adapts over the course of strenuous, self-paced exercise is still needed to develop optimized training and injury-prevention programs. This study investigated continuous changes in running mechanics and spring-mass behaviour during a 5-km run. Twelve competitive triathletes performed a 5-km running time trial (mean performance: ~17 min 30 s) on a 200-m indoor track. Vertical and anterior-posterior ground reaction forces were measured every 200 m by a 5-m-long force platform system and used to determine spring-mass model characteristics. After a fast start, running velocity progressively decreased (-11.6%; P<0.001) in the middle part of the race before an end spurt in the final 400-600 m. Stride length (-7.4%; P<0.001) and stride frequency (-4.1%; P=0.001) decreased over the 25 laps, while contact time (+8.9%; P<0.001) and total stride duration (+4.1%; P<0.001) progressively lengthened. Peak vertical forces (-2.0%; P<0.01) and leg compression (-4.3%; P<0.05), but not centre-of-mass vertical displacement (+3.2%; P>0.05), decreased with time. As a result, vertical stiffness decreased (-6.0%; P<0.001) during the run, whereas leg stiffness changes were not significant (+1.3%; P>0.05). Spring-mass behaviour thus progressively changes during a 5-km time trial towards a deteriorated vertical stiffness, which alters impact and force production characteristics.
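For readers unfamiliar with the spring-mass variables reported above, the sketch below shows the standard calculations: vertical stiffness as peak vertical force over centre-of-mass displacement, and leg stiffness as peak force over leg compression estimated from contact time, velocity and leg length. The formulation follows common practice in the running literature, not necessarily the authors' exact implementation, and the input values are invented:

```python
import math

# Illustrative spring-mass calculations for one running step.
f_max = 2000.0   # peak vertical ground reaction force (N)
dy = 0.055       # CoM vertical displacement during contact (m)
t_c = 0.24       # contact time (s)
v = 4.8          # running velocity (m/s)
L0 = 0.92        # leg length (m)

k_vert = f_max / dy                              # vertical stiffness (N/m)
half_sweep = v * t_c / 2.0                       # half the contact sweep (m)
dL = L0 - math.sqrt(L0**2 - half_sweep**2) + dy  # leg compression (m)
k_leg = f_max / dL                               # leg stiffness (N/m)
print(f"k_vert = {k_vert/1000:.1f} kN/m, k_leg = {k_leg/1000:.1f} kN/m")
```

With these invented inputs the sketch yields roughly 36 kN/m vertical and 8 kN/m leg stiffness, within the range typically reported for distance running.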
Abstract:
Summary: The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
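As a small illustration of the search-space definition used in this validation (a 15 Å sphere around the ligand's crystallographic center of mass), the sketch below samples candidate starting positions uniformly within such a sphere. This is illustrative only; EADock's actual seeding and optimization differ in detail:

```python
import numpy as np

# Sample points uniformly within a sphere of given radius around a center,
# mimicking how candidate ligand placements can be confined to a 15 A
# search sphere. Coordinates below are invented.
rng = np.random.default_rng(0)

def random_point_in_sphere(center: np.ndarray, radius: float) -> np.ndarray:
    direction = rng.normal(size=3)
    direction /= np.linalg.norm(direction)
    r = radius * rng.random() ** (1.0 / 3.0)  # uniform in volume, not radius
    return center + r * direction

ligand_com = np.array([12.3, 4.7, -8.1])  # hypothetical crystal ligand CoM
seeds = np.array([random_point_in_sphere(ligand_com, 15.0) for _ in range(5)])
print(np.linalg.norm(seeds - ligand_com, axis=1))  # all distances < 15 A
```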
Abstract:
This study investigated behavioral adaptability, defined as a blend of stability and flexibility of limb movements and inter-limb coordination, when individuals received informational constraints. Seven expert breaststroke swimmers performed three 200-m breaststroke trials at constant submaximal intensity. Each trial was performed, in randomized order, with a different coordination pattern: 'freely chosen', 'maximal glide' or 'minimal glide'. Two underwater and four aerial cameras enabled 3D movement analysis to assess elbow and knee angles, elbow-knee pair coordination, intra-cyclic velocity variations of the center of mass, stroke rate, stroke length and inter-limb coordination. The energy cost of locomotion was calculated from gas exchange and blood lactate concentration. The results showed significantly longer glide, higher intra-cyclic velocity variations and higher energy cost under the 'maximal glide' condition compared to the 'freely chosen' condition, as well as greater reorganization of limb movement and inter-limb coordination (p<0.05). In the 'minimal glide' condition, the swimmers did not show significantly shorter glide or lower energy cost, but they exhibited significantly lower deceleration of the center of mass, as well as modified limb movement and inter-limb coordination (p<0.05). These results highlight that a variety of structural adaptations can functionally satisfy the task goal.
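One common way to quantify the intra-cyclic velocity variations (IVV) reported above is the coefficient of variation of the instantaneous centre-of-mass velocity over a stroke cycle. A sketch with a synthetic velocity trace follows (the study derived CoM kinematics from 3D video analysis, and its exact IVV definition may differ):

```python
import numpy as np

# Coefficient-of-variation estimate of intra-cyclic velocity variation (IVV)
# for one stroke cycle. The trace below is synthetic: a mean swimming speed
# with two in-cycle velocity peaks, loosely mimicking breaststroke.
t = np.linspace(0.0, 1.0, 100, endpoint=False)  # one stroke cycle (s)
v = 1.3 + 0.35 * np.sin(2 * np.pi * 2 * t)      # CoM velocity (m/s)
ivv = np.std(v) / np.mean(v) * 100.0
print(f"IVV = {ivv:.1f}% of mean velocity")
```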
Abstract:
This article draws on empirical material to reflect on what drives rapid change in flood risk management practice, reflecting wider interest in the way that scientific practices make risk landscapes and a specific focus on extreme events as drivers of rapid change. Such events are commonly referred to as a form of creative destruction: they both reveal the composition of socioenvironmental assemblages and provide a creative opportunity to remake those assemblages in alternate ways, thereby rapidly changing policy and practice. Drawing on wider thinking in complexity theory, we argue that what happens between events might be as important as, if not more important than, the events themselves. We use two empirical examples concerned with flood risk management practice: a rapid shift in the dominant technologies used to map flood risk in the United Kingdom, and an experimental approach to public participation tested in two different locations with dramatically different consequences. Both show that the state of the socioenvironmental assemblage in which events take place matters as much as the magnitude of the events themselves. The periods between rapid changes are not simply periods of discursive consolidation but involve the ongoing mutation of such assemblages, which can either sensitize or desensitize them to rapid change. Understanding these intervening periods matters as much as the events themselves. If events matter, it is because of the ways in which they bring into sharp focus the coding or framing of a socioenvironmental assemblage in policy or scientific practice, irrespective of whether those events evolve the assemblage in subtle or more radical ways.
Abstract:
AIM: To provide insight into cancer registration coverage, data access and use in Europe. This contributes to data and infrastructure harmonisation and will foster a more prominent role for cancer registries (CRs) within public health, clinical policy and cancer research, whether within or outside the European Research Area. METHODS: During 2010-12, an extensive survey of cancer registration practices and data use was conducted among 161 population-based CRs across Europe. Responding registries (66%) operated in 33 countries, including 23 with national coverage. RESULTS: Population-based oncological surveillance started during the 1940s-50s in the northwest of Europe and from the 1970s to the 1990s in other regions. European Union (EU) data protection regulations affected data access, especially in Germany and France, but less so in the Netherlands or Belgium. Regular reports were produced by CRs on incidence rates (95%), survival (60%) and stage for selected tumours (80%). Evaluation of cancer control and quality of care remained modest, except in a few dedicated CRs. Activities evaluated included support of clinical audits, monitoring of adherence to clinical guidelines, improvement of cancer care and evaluation of mass cancer screening. Evaluation of diagnostic imaging tools was only occasional. CONCLUSION: Most population-based CRs are well equipped to strengthen cancer surveillance across Europe. Data quality and intensity of use depend on the role the cancer registry plays in the political, oncomedical and public health setting within its country. Standard registration methodology has therefore not translated into equivalent advances in cancer prevention and mass screening, quality of care, and translational research on prognosis and survivorship across Europe. Further European collaboration remains essential to ensure access to data and comparability of results.