75 results for socialcommerce, makers, blomming, artigiani

at Université de Lausanne, Switzerland


Relevance:

20.00%

Publisher:

Abstract:

This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in such a way as to be appropriate to decision makers with little or no background in economics and operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach, allowing decision makers to conduct their own efficiency analysis and to interpret the results easily. DEA helps decision makers for the following reasons:

- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much input must be decreased or output increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize the average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes a firm needs to analyse in order to improve its own practices.
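The efficiency score described above can be sketched as the standard input-oriented CCR linear program (minimise the factor theta by which a firm's inputs could be scaled down while a weighted combination of peers still matches its outputs). This is a minimal illustration with invented two-firm data, assuming NumPy and SciPy are available; it is not code from the guide itself.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, dmu):
    """Input-oriented CCR efficiency score for one decision-making unit.

    inputs:  (n_units, n_inputs) array of inputs used
    outputs: (n_units, n_outputs) array of outputs produced
    dmu:     index of the unit being evaluated

    Solves: min theta  s.t.  sum_j lam_j * x_j <= theta * x_dmu,
                             sum_j lam_j * y_j >= y_dmu,  lam >= 0.
    """
    X = np.asarray(inputs, dtype=float)
    Y = np.asarray(outputs, dtype=float)
    n = X.shape[0]
    # Decision variables: [theta, lam_1, ..., lam_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lam_j*x_j - theta*x_dmu <= 0
    A_in = np.c_[-X[dmu][:, None], X.T]
    # Output constraints: -sum_j lam_j*y_j <= -y_dmu
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[dmu]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Hypothetical data: two firms, one input, one output.
X = [[2.0], [4.0]]   # input used
Y = [[1.0], [1.0]]   # output produced
print(round(dea_efficiency(X, Y, 0), 2))  # -> 1.0 (firm 0 is efficient)
print(round(dea_efficiency(X, Y, 1), 2))  # -> 0.5 (firm 1 could halve its input)
```

A score below 1 directly gives the target value mentioned above: firm 1's input target is 0.5 times its current input, and firm 0 is its benchmark.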

Relevance:

10.00%

Publisher:

Abstract:

The Layout of My Thesis

This thesis contains three chapters in Industrial Organization that build on the work outlined above. The first two chapters combine leniency programs with multimarket contact and provide a thorough analysis of the potential effects of Amnesty Plus and Penalty Plus. The third chapter puts the whole discussion on leniency programs into perspective by examining other enforcement tools available to an antitrust authority. The main argument in that last chapter is that a specific instrument can only be as effective as the policy in which it is embedded. It is therefore important for an antitrust authority to know how best to accompany the introduction or modification of a policy instrument that helps deterrence.

INTRODUCTION

Chapter 1 examines the effect of Amnesty Plus and Penalty Plus on the incentives of firms to report cartel activities. The main question is whether the inclusion of these policies in a leniency program undermines the effectiveness of the latter by discouraging firms from applying for amnesty. The model is static and focuses on the ex post incentives of firms to desist from collusion. The results suggest that, because Amnesty Plus and Penalty Plus encourage the reporting of a second cartel after a first detection, a firm, anticipating this, may be reluctant to seek leniency and to report in the first place. However, the effect may also go in the opposite direction, and Amnesty Plus and Penalty Plus may encourage the simultaneous reporting of two cartels. Chapter 2 takes this idea further to the stage of cartel formation. This chapter provides a complete characterization of the potential anticompetitive and procompetitive effects of Amnesty Plus in an infinitely repeated game framework in which the firms use their multimarket contact to sustain harsher punishments.
I suggest a clear-cut policy rule that prevents potential adverse effects and thereby show that, if policy makers follow this rule, a leniency program with Amnesty Plus performs better than one without. Chapter 3 characterizes the socially optimal enforcement effort of an antitrust authority and shows how this effort changes with the introduction or modification of specific policy instruments. The intuition is that the policy instrument may increase the marginal benefit of conducting investigations. If this effect is strong enough, a more rigorous detection policy becomes socially desirable.

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women, and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost effective than "screen women at risk." For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratio of these two strategies compared with the reference was 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. 
Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on the cost-effectiveness ratios of the usual screening strategies for osteoporosis.
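The incremental cost-effectiveness ratios reported above follow the standard definition: extra cost divided by extra effect relative to the reference strategy. A minimal sketch, using invented cost and effect figures rather than the study's data:

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of effect (here, per year gained without hip
    fracture) of a strategy versus a reference such as 'no screening'."""
    d_effect = effect_new - effect_ref
    if d_effect <= 0:
        raise ValueError("new strategy is not more effective than reference")
    return (cost_new - cost_ref) / d_effect

# Invented numbers for illustration only (per woman screened):
# screening costs 800 euros more and yields 0.2 extra fracture-free years.
ratio = icer(cost_new=1200.0, cost_ref=400.0,
             effect_new=9.5, effect_ref=9.3)
print(round(ratio))  # euros per fracture-free year gained
```

Strategies are then compared by ranking such ratios, as the paper does for "screen all" versus "screen women at risk" against "no screening".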

Relevância:

10.00% 10.00%

Publicador:

Resumo:

OBJECTIVE: The purpose of this article is to present the specific public health indicators recently developed by EUROCAT that aim to summarize important aspects of the public health impact of congenital anomalies in a few quantitative measures. METHODS: The six indicators are: (1) congenital anomaly perinatal mortality, (2) congenital anomaly prenatal diagnosis prevalence, (3) congenital anomaly termination of pregnancy, (4) Down syndrome livebirth prevalence, (5) congenital anomaly pediatric surgery, and (6) neural tube defects (NTD) total prevalence. Data presented for this report pertained to all cases (livebirths, fetal deaths, or stillbirths after 20 weeks of gestation and terminations of pregnancy for fetal anomaly [TOPFA]) of congenital anomaly from 27 full member registries of EUROCAT that could provide data for at least 3 years during the period 2004 to 2008. Prevalence of anomalies, prenatal diagnosis, TOPFA, pediatric surgery, and perinatal mortality were calculated per 1000 births. RESULTS: The overall perinatal mortality was approximately 1.0 per 1000 births for EUROCAT registries, with almost half due to fetal deaths and the other half to first-week deaths. There were wide variations in perinatal mortality across the registries, with the highest rates observed in Dublin and Malta, registries in countries where TOPFA are illegal, and in Ukraine. The overall perinatal mortality across EUROCAT registries slightly decreased between 2004 and 2008 due to a decrease in first-week deaths. The prevalence of TOPFA was fairly stable at about 4 per 1000 births. There were variations in livebirth prevalence of cases typically requiring surgery across the registries; however, for most registries this prevalence was between 3 and 5 per 1000 births. Prevalence of NTD decreased by about 10% from 1.05 in 2004 to 0.94 per 1000 in 2008.
CONCLUSION: It is hoped that by publishing the data on EUROCAT indicators, the public health importance of congenital anomalies can be clearly summarized to policy makers, the need for accurate data from registries emphasized, the need for primary prevention and treatment services highlighted, and the impact of current services measured.

Relevance:

10.00%

Publisher:

Abstract:

1. Statistical modelling is often used to relate sparse biological survey data to remotely derived environmental predictors, thereby providing a basis for predictively mapping biodiversity across an entire region of interest. The most popular strategy for such modelling has been to model distributions of individual species one at a time. Spatial modelling of biodiversity at the community level may, however, confer significant benefits for applications involving very large numbers of species, particularly if many of these species are recorded infrequently. 2. Community-level modelling combines data from multiple species and produces information on spatial pattern in the distribution of biodiversity at a collective community level instead of, or in addition to, the level of individual species. Spatial outputs from community-level modelling include predictive mapping of community types (groups of locations with similar species composition), species groups (groups of species with similar distributions), axes or gradients of compositional variation, levels of compositional dissimilarity between pairs of locations, and various macro-ecological properties (e.g. species richness). 3. Three broad modelling strategies can be used to generate these outputs: (i) 'assemble first, predict later', in which biological survey data are first classified, ordinated or aggregated to produce community-level entities or attributes that are then modelled in relation to environmental predictors; (ii) 'predict first, assemble later', in which individual species are modelled one at a time as a function of environmental variables, to produce a stack of species distribution maps that is then subjected to classification, ordination or aggregation; and (iii) 'assemble and predict together', in which all species are modelled simultaneously, within a single integrated modelling process. 
These strategies each have particular strengths and weaknesses, depending on the intended purpose of modelling and the type, quality and quantity of data involved. 4. Synthesis and applications. The potential benefits of modelling large multispecies data sets using community-level, as opposed to species-level, approaches include faster processing, increased power to detect shared patterns of environmental response across rarely recorded species, and enhanced capacity to synthesize complex data into a form more readily interpretable by scientists and decision-makers. Community-level modelling therefore deserves to be considered more often, and more widely, as a potential alternative or supplement to modelling individual species.

Relevance:

10.00%

Publisher:

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- Includes self-contained introductions to probability and decision theory.
- Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models.
- Features implementation of the methodology with reference to commercial and academically available software.
- Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases.
- Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning.
- Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them.
- Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background.
- Includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.

Relevance:

10.00%

Publisher:

Abstract:

The Tiwi people of northern Australia have managed natural resources continuously for 6000-8000 years. Tiwi management objectives and outcomes may reflect how they gather information about the environment. We qualitatively analyzed Tiwi documents and management techniques to examine the relation between the social and physical environment of decision makers and their decision-making strategies. We hypothesized that principles of bounded rationality, namely, the use of efficient rules to navigate complex decision problems, explain how Tiwi managers use simple decision strategies (i.e., heuristics) to make robust decisions. Tiwi natural resource managers reduced complexity in decision making through a process that gathers incomplete and uncertain information to quickly guide decisions toward effective outcomes. They used management feedback to validate decisions through an information loop that resulted in long-term sustainability of environmental use. We examined the Tiwi decision-making processes relative to management of barramundi (Lates calcarifer) fisheries and contrasted their management with the state government's management of barramundi. Decisions that enhanced the status of individual people and their attainment of aspiration levels resulted in reliable resource availability for Tiwi consumers. Different decision processes adopted by the state for management of barramundi may not secure similarly sustainable outcomes.

Relevance:

10.00%

Publisher:

Abstract:

Epidemiological methods have become useful tools for assessing the effectiveness and safety of health care technologies. Experimental methods, namely randomized controlled trials (RCTs), provide the best evidence of the effect of a technology. However, ethical issues and the very nature of the intervention under study sometimes make it difficult to carry out an RCT. Therefore, quasi-experimental and non-experimental study designs are also applied. The critical issues concerning these designs are discussed. The results of evaluative studies are of importance for decision-makers in health policy. Measurement of the impact of a medical technology should go beyond a statement of its effectiveness, because the essential outcome of an intervention or programme is the health status and quality of life of the individuals and populations concerned.

Relevance:

10.00%

Publisher:

Abstract:

In less than half a century, allergy, originally perceived as a rare disease, has become a major public health threat, today affecting the lives of more than 60 million people in Europe, and probably close to one billion worldwide, thereby heavily impacting the budgets of public health systems. More disturbingly, its prevalence and impact are on the rise, a development that has been associated with environmental and lifestyle changes accompanying the continuous process of urbanization and globalization. Therefore, there is an urgent need to prioritize and concert research efforts in the field of allergy, in order to achieve sustainable results on prevention, diagnosis and treatment of this most prevalent chronic disease of the 21st century. The European Academy of Allergy and Clinical Immunology (EAACI) is the leading professional organization in the field of allergy, promoting excellence in clinical care, education, training and basic and translational research, all with the ultimate goal of improving the health of allergic patients. The European Federation of Allergy and Airways Diseases Patients' Associations (EFA) is a non-profit network of allergy, asthma and Chronic Obstructive Pulmonary Disorder (COPD) patients' organizations. In support of their missions, the present EAACI Position Paper, in collaboration with EFA, highlights the most important research needs in the field of allergy to serve as key recommendations for future research funding at the national and European levels. Although allergies may involve almost every organ of the body and an array of diverse external factors act as triggers, there are several common themes that need to be prioritized in research efforts. As in many other chronic diseases, effective prevention, curative treatment and accurate, rapid diagnosis represent major unmet needs.
Detailed phenotyping/endotyping stands out as widely required in order to arrange or re-categorize clinical syndromes into more coherent, uniform and treatment-responsive groups. Research efforts to unveil the basic pathophysiologic pathways and mechanisms, thus leading to the comprehension and resolution of the pathophysiologic complexity of allergies, will allow for the design of novel patient-oriented diagnostic and treatment protocols. Several allergic diseases require well-controlled epidemiological description and surveillance, using disease registries, pharmacoeconomic evaluation, as well as large biobanks. Additionally, there is a need for extensive studies to bring promising new biotechnological innovations, such as biological agents, vaccines of modified allergen molecules and engineered components for allergy diagnosis, closer to clinical practice. Finally, particular attention should be paid to the difficult-to-manage, precarious and costly severe disease forms and/or exacerbations. Nonetheless, currently arising treatments, mainly in the fields of immunotherapy and biologicals, hold great promise for targeted and causal management of allergic conditions. Active involvement of all stakeholders, including Patient Organizations and policy makers, is necessary to achieve the aims emphasized herein.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation focuses on the practice of regulatory governance, throughout the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded on a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). Following an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996). 
Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up for regulating very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve a special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, some hard questions about their role as political actors, and about their performance, remain unaddressed, even though, together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions.

Relevance:

10.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: potential toxic and safety hazards of nanomaterials throughout their lifecycles; fate and persistence of nanoparticles in humans, animals and the environment; risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; development of best practice guidelines; voluntary schemes on responsibility; databases of materials, research topics and themes. Findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to be active members. These survey findings will be used to improve NIN's communication tools to further build on interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

10.00%

Publisher:

Abstract:

Executive Summary The first essay of this dissertation investigates whether greater exchange rate uncertainty (i.e., variation over time in the exchange rate) fosters or depresses the foreign investment of multinational firms. In addition to the direct capital financing it supplies, foreign investment can be a source of valuable technology and know-how, which can have substantial positive effects on a host country's economic growth. Thus, it is critically important for policy makers and central bankers, among others, to understand how multinationals base their investment decisions on the characteristics of foreign exchange markets. In this essay, I first develop a theoretical framework to improve our knowledge regarding how the aggregate level of foreign investment responds to exchange rate uncertainty when an economy consists of many firms, each of which is making decisions. The analysis predicts a U-shaped effect of exchange rate uncertainty on the total level of foreign investment of the economy. That is, the effect is negative for low levels of uncertainty and positive for higher levels of uncertainty. This pattern emerges because the relationship between exchange rate volatility and the probability of investment is negative for firms with low productivity at home (i.e., firms that find it profitable to invest abroad) and the relationship is positive for firms with high productivity at home (i.e., firms that prefer exporting their product). This finding stands in sharp contrast to predictions in the existing literature that consider a single firm's decision to invest in a unique project. The main contribution of this research is to show that the aggregation over many firms produces a U-shaped pattern between exchange rate uncertainty and the probability of investment. Using data from industrialized countries for the period of 1982-2002, this essay offers a comprehensive empirical analysis that provides evidence in support of the theoretical prediction.
In the second essay, I aim to explain the time variation in sovereign credit risk, which captures the risk that a government may be unable to repay its debt. The importance of correctly evaluating such a risk is illustrated by the central role of sovereign debt in previous international lending crises. In addition, sovereign debt is the largest asset class in emerging markets. In this essay, I provide a pricing formula for the evaluation of sovereign credit risk in which the decision to default on sovereign debt is made by the government. The pricing formula explains the variation across time in daily credit spreads - a widely used measure of credit risk - to a degree not offered by existing theoretical and empirical models. I use information on a country's stock market to compute the prevailing sovereign credit spread in that country. The pricing formula explains a substantial fraction of the time variation in daily credit spread changes for Brazil, Mexico, Peru, and Russia for the 1998-2008 period, particularly during the recent subprime crisis. I also show that when a government incentive to default is allowed to depend on current economic conditions, one can best explain the level of credit spreads, especially during the recent period of financial distress. In the third essay, I show that the risk of sovereign default abroad can produce adverse consequences for the U.S. equity market through a decrease in returns and an increase in volatility. The risk of sovereign default, which is no longer limited to emerging economies, has recently become a major concern for financial markets. While sovereign debt plays an increasing role in today's financial environment, the effects of sovereign credit risk on the U.S. financial markets have been largely ignored in the literature. In this essay, I develop a theoretical framework that explores how the risk of sovereign default abroad helps explain the level and the volatility of U.S. equity returns. 
The intuition for this effect is that negative economic shocks deteriorate the fiscal situation of foreign governments, thereby increasing the risk of a sovereign default that would trigger a local contraction in economic growth. The increased risk of an economic slowdown abroad amplifies the direct effect of these shocks on the level and the volatility of equity returns in the U.S. through two channels. The first channel involves a decrease in the future earnings of U.S. exporters resulting from unfavorable adjustments to the exchange rate. The second channel involves investors' incentives to rebalance their portfolios toward safer assets, which depresses U.S. equity prices. An empirical estimation of the model with monthly data for the 1994-2008 period provides evidence that the risk of sovereign default abroad generates a strong leverage effect during economic downturns, which helps to substantially explain the level and the volatility of U.S. equity returns.

Relevance:

10.00%

Publisher:

Abstract:

Background: In February 2005, the canton of Geneva in Switzerland prohibited the off-premise sale of alcoholic beverages between 9 p.m. and 7 a.m., and banned their sale in gas stations and video stores. The aim of this study is to assess the impact of this policy change on hospital admission rates for alcoholic intoxication.

Methods: An interrupted time series analysis of this natural experiment was performed with data on hospitalisations for acute alcoholic intoxication during the 2002-2007 period. The canton of Geneva was treated as the experimental group, while all other Swiss cantons were used as the control group.

Results: In the experimental site, the policy change was found to have a significant effect on admission rates among adolescents and young adults. Depending on the age group, hospitalisation rates for alcoholic intoxication fell by an estimated 25-40% as the result of restricted alcohol availability.

Conclusions: Modest restrictions on opening hours and the density of off-premise outlets were found to be of relevance for public health in the canton of Geneva. In light of this finding, policy makers should consider such action as a promising approach to alcohol prevention. (C) 2011 Elsevier Ireland Ltd. All rights reserved.
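An interrupted time series of this kind is commonly estimated as a segmented regression: a pre-intervention level and trend, plus coefficients for the level change and trend change at the intervention date. The sketch below fits such a model by ordinary least squares on synthetic, noise-free data, not the Geneva admissions series; the month-36 intervention point and the size of the drop are invented.

```python
import numpy as np

# Synthetic monthly admission rates: a stable level of 10, then a drop
# of 3 units when a hypothetical policy takes effect at month 36.
t = np.arange(72)
policy = (t >= 36).astype(float)
y = 10.0 + 0.0 * t - 3.0 * policy

# Segmented-regression design matrix: intercept, pre-existing trend,
# level change at the intervention, post-intervention trend change.
X = np.column_stack([np.ones_like(t, dtype=float), t,
                     policy, policy * (t - 36)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # the third coefficient recovers the level drop of -3
```

With real data, the same design is extended with a control-series term (here, the other Swiss cantons) and noise terms, but the level-change coefficient remains the quantity of interest.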

Relevance:

10.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:

• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.

These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, a workshop was organised by NIN focused on building a sustainable multi-stakeholder dialogue. Specific questions were put to different stakeholder groups to encourage discussion and open communication. 1.
What information do stakeholders need from researchers and why? The discussions about this question confirmed the needs identified in the targeted phone calls. 2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise was needed in the areas of commercial law and economics for a well-informed treatment of this communication issue. 3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced. Thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling of products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since the issue of nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole. 4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed, even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.