864 results for Assessment and certification systems
Abstract:
A scientific forum on “The Future Science of Exoplanets and Their Systems,” sponsored by Europlanet and the International Space Science Institute (ISSI) and co-organized by the Center for Space and Habitability (CSH) of the University of Bern, was held on December 5–6, 2012, in Bern, Switzerland. It gathered 24 well-known specialists in exoplanetary, Solar System, and stellar science to discuss the future of the fast-expanding field of exoplanetary research, which now has nearly 1000 objects to analyze and compare and will develop even more quickly over the coming years. The forum discussions included a review of current observational knowledge, efforts to characterize exoplanetary atmospheres, planet formation, water formation, atmospheric evolution, habitability, and our understanding of how exoplanets interact with their stellar and galactic environment throughout their history. The forum participants identified several important and timely areas of focus for further research in the field. These scientific topics relate to the origin and formation of water and its delivery to planetary bodies; the role of the disk in planet formation, including constraints from observations; and star-planet interaction processes and their consequences for atmosphere-magnetosphere environments, evolution, and habitability. The relevance of these research areas is outlined in this report, and possible themes are identified for future ISSI workshops that may be proposed by the international research community over the coming 2–3 years.
Abstract:
Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev’s toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
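The statement that Kitaev’s toric code is a Z(2) lattice gauge theory can be made concrete with a minimal numpy sketch. The four-qubit layout and operator names below are illustrative assumptions, not taken from the abstract: a star operator of Pauli X’s and a plaquette operator of Pauli Z’s overlap on an even number of edges, so the sign factors from anticommuting X and Z pairs cancel and the two operators commute.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])    # Pauli X
Z = np.array([[1, 0], [0, -1]])   # Pauli Z
I = np.eye(2)

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Four edge qubits: a star operator and a plaquette operator that share
# exactly two edges (qubits 1 and 2), as on the toric-code lattice.
A_star = kron_all([X, X, X, I])   # X on edges 0, 1, 2
B_plaq = kron_all([I, Z, Z, Z])   # Z on edges 1, 2, 3

# Each shared edge contributes a factor -1 when X and Z are exchanged;
# an even number of shared edges means the two factors cancel.
comm = A_star @ B_plaq - B_plaq @ A_star
assert np.allclose(comm, 0)
```

Because all star and plaquette operators commute, they can be diagonalized simultaneously, which is what makes the toric code’s ground-state sector exactly solvable.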
Abstract:
The considerable search for synergistic agents in cancer research is motivated by the therapeutic benefits achieved by combining anti-cancer agents. Synergistic agents make it possible to reduce dosage while maintaining or enhancing a desired effect. Other favorable outcomes of synergistic agents include reduction in toxicity and minimizing or delaying drug resistance. Dose-response assessment and drug-drug interaction analysis play an important part in the drug discovery process; however, these analyses are often poorly done. This dissertation is an effort to notably improve dose-response assessment and drug-drug interaction analysis. The most commonly used method in published analyses is the Median-Effect Principle/Combination Index method (Chou and Talalay, 1984). The Median-Effect Principle/Combination Index method leads to inefficiency by ignoring important sources of variation inherent in dose-response data and discarding data points that do not fit the Median-Effect Principle. Previous work has shown that the conventional method yields a high rate of false positives (Boik, Boik, Newman, 2008; Hennessey, Rosner, Bast, Chen, 2010) and, in some cases, low power to detect synergy. There is a great need for improving the current methodology. We developed a Bayesian framework for dose-response modeling and drug-drug interaction analysis. First, we developed a hierarchical meta-regression dose-response model that accounts for various sources of variation and uncertainty and allows one to incorporate knowledge from prior studies into the current analysis, thus offering more efficient and reliable inference. Second, in the case that parametric dose-response models do not fit the data, we developed a practical and flexible nonparametric regression method for meta-analysis of independently repeated dose-response experiments.
Third, and lastly, we developed a method, based on Loewe additivity, that allows one to quantitatively assess the interaction between two agents combined at a fixed dose ratio. The proposed method provides a comprehensive and honest accounting of uncertainty in drug interaction assessment. Extensive simulation studies show that the novel methodology improves the screening process for effective/synergistic agents and reduces the incidence of type I error. We consider an ovarian cancer cell line study that investigates the combined effect of DNA methylation inhibitors and histone deacetylation inhibitors in human ovarian cancer cell lines. The hypothesis is that the combination of DNA methylation inhibitors and histone deacetylation inhibitors will enhance antiproliferative activity in human ovarian cancer cell lines compared to treatment with each inhibitor alone. By applying the proposed Bayesian methodology, in vitro synergy was declared for the DNA methylation inhibitor 5-AZA-2'-deoxycytidine combined with a histone deacetylation inhibitor (suberoylanilide hydroxamic acid or trichostatin A) in the cell lines HEY and SKOV3. This suggests potential new epigenetic therapies for cell growth inhibition in ovarian cancer cells.
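As a rough illustration of the Loewe-additivity interaction index discussed above, the sketch below computes the classical combination index from median-effect (Hill-type) dose-response curves. The parametrization and function names are assumptions for illustration only, not the dissertation’s Bayesian model.

```python
import numpy as np

def hill_inverse(effect, ec50, h):
    """Dose that produces a given fractional effect under a Hill (median-effect) curve."""
    return ec50 * (effect / (1.0 - effect)) ** (1.0 / h)

def combination_index(d1, d2, ec50_1, h1, ec50_2, h2, effect):
    """Loewe/Chou-Talalay combination index at a given effect level.
    CI < 1 suggests synergy, CI = 1 additivity, CI > 1 antagonism."""
    D1 = hill_inverse(effect, ec50_1, h1)  # dose of drug 1 alone for this effect
    D2 = hill_inverse(effect, ec50_2, h2)  # dose of drug 2 alone for this effect
    return d1 / D1 + d2 / D2

# Two drugs with EC50 = 1.0 and Hill slope 1: each alone needs dose 1.0 for 50% effect,
# so half-doses of both in combination are exactly Loewe-additive.
ci = combination_index(d1=0.5, d2=0.5, ec50_1=1.0, h1=1.0, ec50_2=1.0, h2=1.0, effect=0.5)
print(ci)  # 1.0
```

A CI below 1 at the effect levels of interest is the classical indication of synergy; the dissertation’s contribution is to replace such point estimates with a full Bayesian account of their uncertainty.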
Abstract:
Ore-forming and geoenvironmental systems commonly involve coupled fluid flow and chemical reaction processes. Advanced numerical methods and computational modeling have become indispensable tools for simulating such processes in recent years. This enables many hitherto unsolvable geoscience problems to be addressed using numerical methods and computational modeling approaches. For example, computational modeling has been successfully used to solve ore-forming and mine site contamination/remediation problems, in which fluid flow and geochemical processes play important roles in the controlling dynamic mechanisms. The main purpose of this paper is to present a generalized overview of: (1) the various classes and models associated with fluid flow/chemically reacting systems, in order to highlight possible opportunities and developments for the future; (2) some more general issues that need attention in the development of computational models and codes for simulating ore-forming and geoenvironmental systems; (3) the related progress achieved in geochemical modeling over the past 50 years or so; (4) the general methodology for modeling of ore-forming and geoenvironmental systems; and (5) the future development directions associated with modeling of ore-forming and geoenvironmental systems.
Abstract:
Global environmental change includes changes in a wide range of global-scale phenomena, which are expected to affect a number of physical processes, as well as the vulnerability of the communities that will experience their impact. Decision-makers need tools that will enable them to assess the losses caused by such processes under different future scenarios and to design risk reduction strategies. In this paper, a tool is presented that can be used by a range of end-users (e.g. local authorities and decision-makers) for the assessment of the monetary loss from future landslide events, with a particular focus on torrential processes. The toolbox includes three functions: a) enhancement of the post-event damage data collection process, b) assessment of the monetary loss from future events and c) continuous updating and improvement of an existing vulnerability curve by adding data from recent events. All functions of the tool are demonstrated through examples of its application.
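A minimal sketch of how functions b) and c) above could fit together: a vulnerability curve maps process intensity to degree of loss, expected monetary loss is exposed value times that degree, and the curve is re-fitted as new post-event records arrive. The logistic form, variable names, and numbers are illustrative assumptions, not the tool’s actual implementation.

```python
import numpy as np

def vulnerability(intensity, a, b):
    """Logistic vulnerability curve: degree of loss (0..1) vs. process intensity."""
    return 1.0 / (1.0 + np.exp(-(a * intensity + b)))

def expected_loss(intensity, element_value, a, b):
    """Monetary loss = exposed element value x degree of loss at the given intensity."""
    return element_value * vulnerability(intensity, a, b)

def refit(intensities, loss_fractions):
    """Re-estimate (a, b) by least squares on the logit scale, mimicking the
    'continuous updating' of the vulnerability curve with recent event data."""
    y = np.clip(loss_fractions, 1e-3, 1 - 1e-3)
    logit = np.log(y / (1 - y))
    a, b = np.polyfit(intensities, logit, 1)  # slope, intercept
    return a, b

# Hypothetical post-event records: deposit depth (m) vs. observed fraction of value lost.
depths = np.array([0.2, 0.5, 1.0, 1.5, 2.5])
loss_frac = np.array([0.02, 0.10, 0.35, 0.60, 0.95])
a, b = refit(depths, loss_frac)
print(expected_loss(1.2, element_value=250_000, a=a, b=b))
```

In practice, each new damaged-building record from function a) would be appended to `depths`/`loss_frac` before re-fitting, so the curve improves as the event database grows.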
Abstract:
We present an image quality assessment and enhancement method for high-resolution Fourier-domain OCT imaging, such as in sub-threshold retina therapy. A maximum-likelihood deconvolution algorithm as well as a histogram-based quality assessment method are evaluated.
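Maximum-likelihood deconvolution of this kind is commonly realized with the Richardson-Lucy iteration. The 1-D sketch below shows the basic update step; the kernel, signal, and iteration count are made-up illustrations, not the paper’s actual algorithm or data.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=20):
    """Richardson-Lucy maximum-likelihood deconvolution (1-D, Poisson noise model)."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_flipped = psf[::-1]  # mirrored PSF for the correlation step
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # guard against division by zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Blur a sparse test signal with a small symmetric kernel, then deconvolve.
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
signal = np.zeros(32)
signal[10] = 1.0
signal[20] = 0.5
blurred = np.convolve(signal, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=50)
```

Each iteration multiplies the current estimate by the back-projected ratio of observed to predicted data, so the estimate stays non-negative and sharpens toward the underlying peaks.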
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects. Thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a defined period of time. Then a dentist can determine the erosive potential of the diet. A table with common beverages and foodstuffs is presented for judging the erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk for erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion which has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patients. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimes, stimulation of salivary flow rate, use of buffering medicaments and particular motivation for nondestructive toothbrushing habits with an erosion-protecting toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all of the causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for an individual should be communicated to the patient.
Abstract:
Both obesity and asthma are highly prevalent, complex diseases modified by multiple factors. Genetic, developmental, lung mechanical, immunological and behavioural factors have all been suggested as playing a causal role between the two entities; however, their complex mechanistic interactions are still poorly understood and evidence of causality in children remains scant. Equally lacking is evidence of effective treatment strategies, despite the fact that imbalances at vulnerable phases in childhood can impact long-term health. This review is targeted at both clinicians frequently faced with the dilemma of how to investigate and treat the obese asthmatic child and researchers interested in the topic. Highlighting the breadth of the spectrum of factors involved, this review collates evidence regarding the investigation and treatment of asthma in obese children, particularly in comparison with current approaches in 'difficult-to-treat' childhood asthma. Finally, the authors propose hypotheses for future research from a systems-based perspective.
Abstract:
Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem was expounded in a number of cases. Panels and the Appellate Body adopted different philosophies in interpreting the agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science. Different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence and differences of positivism and relativism in philosophy and the natural sciences. He clarifies the relationship of fundamental concepts such as risk, hazards and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law – often unconsciously – is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of implied and assumed underlying philosophies and perceptions as to the relationship of law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodologies of falsification developed by Karl R. Popper. Critical rationalism allows combining discourse in science and law and helps prepare the ground for a new approach to risk assessment and risk management.
Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving the matter of risk management to national, democratically accountable governments. While the author throughout the thesis questions the possibility of separating risk assessment and risk management, the thesis offers new avenues which may assist in structuring a complex and difficult problem.