852 results for Critical Point Theory
Abstract:
PURPOSE Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability, there is a need for methods that match different PET/CT systems by eliminating this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph into the image of the same object as it would have been seen by a different tomograph. The proposed method, termed Transconvolution, compensates for the differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. METHODS To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle the point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows a Transconvolution function to be determined that converts one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function, which, under certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of the point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga-filled spheres was developed. To iteratively determine and represent these point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function of the virtual PET. The Hann window's apodization properties suppress high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems into nearly identical data sets, as they would be imaged by the virtual PET system. RESULTS The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The largest difference in measured activity concentration between the two PET systems, 18.2%, was found in spheres of 2 ml volume; Transconvolution reduced this difference to 1.6%. In addition to reestablishing comparability, the new method with its parameterization of point spread functions allowed a full characterization of the imaging properties of the examined tomographs.
CONCLUSIONS By matching different tomographs to a virtual standardized imaging system, Transconvolution offers a new, comprehensive method for cross-calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and well-defined partial volume effect.
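A minimal 1-D sketch of the Fourier-domain construction described above, assuming a Gaussian point spread function for the measured tomograph and a Hann-window modulation transfer function for the virtual PET; the resolution, cutoff frequency, and grid are hypothetical placeholders rather than values from the study.

```python
import numpy as np

# --- hypothetical 1-D setup (all numbers are placeholders) ---
n, dx = 256, 1.0                      # samples, mm per sample
f = np.fft.fftfreq(n, d=dx)           # spatial frequencies (1/mm)

# Gaussian PSF of the measured tomograph -> Gaussian MTF
fwhm_a = 6.0                          # mm, assumed system resolution
sigma_a = fwhm_a / (2.0 * np.sqrt(2.0 * np.log(2.0)))
mtf_a = np.exp(-2.0 * (np.pi * sigma_a * f) ** 2)

# Hann-window MTF of the virtual (target) PET: zero above a cutoff, so the
# division below never amplifies unsupported high spatial frequencies.
f_cut = 0.08                          # 1/mm, assumed critical frequency
mtf_virtual = np.where(np.abs(f) < f_cut,
                       0.5 * (1.0 + np.cos(np.pi * f / f_cut)),
                       0.0)

# Transconvolution transfer function: target MTF divided by source MTF,
# i.e. convolution with the "inverse" PSF of the source system.
trans = np.zeros_like(mtf_a)
nonzero = mtf_a > 1e-6
trans[nonzero] = mtf_virtual[nonzero] / mtf_a[nonzero]

def transconvolve(profile):
    """Map a 1-D activity profile imaged by system A onto the virtual PET."""
    return np.real(np.fft.ifft(np.fft.fft(profile) * trans))

# narrow toy "sphere" imaged by system A, then normalized to the virtual PET
obj = np.zeros(n); obj[126:130] = 1.0
img_a = np.real(np.fft.ifft(np.fft.fft(obj) * mtf_a))
img_virtual = transconvolve(img_a)
```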
Abstract:
Due to its scope and depth, Moore's Causation and Responsibility is probably the most important publication in the philosophy of law since Hart and Honoré's Causation in the Law in 1959. This volume offers, for the first time, a detailed exchange between legal and philosophical scholars over Moore's most recent work. In particular, it pioneers the dialogue between English-speaking and German philosophy of law on a broad range of pressing foundational questions concerning causation in the law. It thereby fulfills the need for a comprehensive, international, and critical discussion of Moore's influential arguments. The 15 contributors span the whole interdisciplinary field from law and morals to metaphysics, and include distinguished criminal and tort lawyers as well as prominent theoretical and practical philosophers from four nations. In addition, young researchers take brand-new approaches in the field. The collection is essential reading for anyone interested in legal and moral theory.
Abstract:
PURPOSE To assess the extent of early recoil in patients with critical limb ischemia (CLI) undergoing conventional tibial balloon angioplasty. METHODS Our hypothesis was that early recoil, defined as lumen compromise >10%, is frequent and accounts for considerable luminal narrowing after tibial angioplasty, promoting restenosis. To test this hypothesis, 30 consecutive CLI patients (18 men; mean age 76.2±12.1 years) were angiographically evaluated immediately after tibial balloon angioplasty and 15 minutes later. Half the patients were diabetic. Target lesions included the anterior and posterior tibial arteries and the peroneal artery, with or without the tibioperoneal trunk. Mean tibial lesion length was 83.8 mm. Early elastic recoil was determined on the basis of minimal lumen diameter (MLD) measurements at baseline (MLDbaseline), immediately after tibial balloon angioplasty (MLDpostdilation), and 15 minutes thereafter (MLD15min). RESULTS Elastic recoil was observed in 29 (97%) patients, with a mean luminal compromise of 29% according to the MLD measurements (MLDbaseline 0.23 mm, MLDpostdilation 2.0 mm, MLD15min 1.47 mm). CONCLUSION Early recoil is frequently observed in CLI patients undergoing tibial angioplasty and may contribute significantly to restenosis. These findings support the role of dedicated mechanical scaffolding approaches for the prevention of restenosis in tibial arteries.
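The abstract quotes group-mean MLD values but not the exact recoil formula; the snippet below evaluates the two conventions most often used for elastic recoil (loss relative to the post-dilation lumen, and loss relative to the acute gain) from those means, purely to illustrate the order of magnitude and not as the study's actual calculation.

```python
# Group-mean MLDs quoted in the abstract (mm)
mld_baseline = 0.23
mld_postdilation = 2.0
mld_15min = 1.47

acute_gain = mld_postdilation - mld_baseline      # lumen gained by angioplasty
recoil_abs = mld_postdilation - mld_15min         # lumen lost within 15 minutes

# Two common (here assumed) ways of expressing early elastic recoil:
recoil_vs_postdilation = recoil_abs / mld_postdilation   # ~26.5 %
recoil_vs_acute_gain = recoil_abs / acute_gain           # ~29.9 %

print(f"recoil vs post-dilation MLD: {recoil_vs_postdilation:.1%}")
print(f"recoil vs acute gain:        {recoil_vs_acute_gain:.1%}")
# Note: per-patient percentages averaged over 29 patients need not equal a
# ratio of group means, so neither figure has to reproduce the quoted 29%.
```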
Abstract:
Consistent with social role theory's assumption that the role behavior of men and women shapes gender stereotypes, earlier experiments have found that men's and women's occupancy of the same role eliminated gender-stereotypical judgments of greater agency and lower communion in men than women. The shifting standards model raises the question of whether a shift to within-sex standards in judgments of men and women in roles could have masked underlying gender stereotypes. To examine this possibility, two experiments obtained judgments of men and women using measures that do or do not restrain shifts to within-sex standards. This measure variation did not affect the social role pattern of smaller perceived sex differences in the presence of role information. These findings thus support the social role theory claim that designations of identical roles for subgroups of men and women eliminate or reduce perceived sex differences.
Abstract:
In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set and the development of consistent estimators and inference procedures for it with desirable properties are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework to analyze random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to reach the goals of partial identification analysis.
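As a concrete, textbook-style illustration of a partially identified parameter (not an example taken from the review itself): with bounded outcomes and missing data, the population mean is identified only up to an interval, i.e., the identified object is a set rather than a point.

```python
import numpy as np

# Toy example: outcomes known to lie in [0, 1], some observations missing.
rng = np.random.default_rng(0)
y = rng.uniform(0.0, 1.0, size=1000)        # latent outcomes
observed = rng.random(1000) < 0.7            # ~30% of outcomes are missing

p_obs = observed.mean()
mean_obs = y[observed].mean()

# Worst-case bounds: missing outcomes could all be 0 or all be 1, so the
# identified set for E[y] is an interval, not a single number.
lower = mean_obs * p_obs + 0.0 * (1 - p_obs)
upper = mean_obs * p_obs + 1.0 * (1 - p_obs)
print(f"identified set for E[y]: [{lower:.3f}, {upper:.3f}]")
```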
Abstract:
Although prior research on new venture creation has identified several antecedents that differentiate entrepreneurs from non-entrepreneurs, scholars still have an incomplete understanding of the factors and decision processes that lead an individual to become an entrepreneur. By applying prospect theory, we introduce the reference point as an important antecedent of new venture creation. Testing our research model and hypotheses with samples of entrepreneurs and employees, we find that entrepreneurs set more aspiring reference points and therefore more often find themselves in a perceived loss situation. Results are also robust when testing for entrepreneurial intention among business graduate students. According to prospect theory, the perceived loss triggers more risk-seeking behavior. In sum, the reference point has a positive effect on new venture creation and differentiates entrepreneurs from non-entrepreneurs. We discuss theoretical and managerial implications of the findings and develop avenues for future research.
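A toy sketch of the prospect-theory mechanism invoked above, using the standard Kahneman-Tversky value function with its usual illustrative parameter estimates; the income and reference-point figures are hypothetical and only show how a more aspiring reference point recodes the same outcome as a loss.

```python
def prospect_value(outcome, reference, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function, evaluated relative to a reference point."""
    x = outcome - reference
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# Same realized outcome, two different reference points (numbers hypothetical):
outcome = 60_000                       # e.g. current annual income
modest_reference = 50_000              # employee-style aspiration -> gain frame
aspiring_reference = 90_000            # entrepreneur-style aspiration -> loss frame

print(prospect_value(outcome, modest_reference))     # positive: coded as a gain
print(prospect_value(outcome, aspiring_reference))   # negative: coded as a loss,
                                                     # which favors risk seeking
```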
Abstract:
We investigate reductions of M-theory beyond twisted tori by allowing the presence of KK6 monopoles (KKO6-planes) compatible with N = 4 supersymmetry in four dimensions. The presence of KKO6-planes proves crucial to achieve full moduli stabilisation as they generate new universal moduli powers in the scalar potential. The resulting gauged supergravities turn out to be compatible with a weak G2 holonomy at N = 1 as well as at some non-supersymmetric AdS4 vacua. The M-theory flux vacua we present here cannot be obtained from ordinary type IIA orientifold reductions including background fluxes, D6-branes (O6-planes) and/or KK5 (KKO5) sources. However, from a four-dimensional point of view, they still admit a description in terms of so-called non-geometric fluxes. In this sense we provide the M-theory interpretation for such non-geometric type IIA flux vacua.
Abstract:
The analytic continuation needed for the extraction of transport coefficients requires, in principle, the correlator as a continuous function of the Euclidean time variable. We report on progress towards achieving the continuum limit for 2-point correlator measurements in thermal SU(3) gauge theory, with specific attention paid to scale setting. In particular, we improve upon the determination of the critical lattice coupling and the critical temperature of pure SU(3) gauge theory, estimating r0 Tc ≃ 0.7470(7) after a continuum extrapolation. As an application, the determination of the heavy quark momentum diffusion coefficient from a correlator of colour-electric fields attached to a Polyakov loop is discussed.
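For readers unfamiliar with the scale setting involved, the relation behind a number like r0 Tc is simply Tc = 1/(Nt a(βc)), so r0 Tc = (r0/a)(βc)/Nt, followed by a continuum extrapolation. The sketch below uses made-up (r0/a, Nt) pairs solely to illustrate the usual extrapolation, linear in 1/Nt², not the paper's data.

```python
import numpy as np

# On a lattice with Nt time slices, T_c = 1 / (Nt * a(beta_c)), hence
# r0 * T_c = (r0 / a)(beta_c) / Nt.  The values below are placeholders.
nt = np.array([6, 8, 10, 12])
r0_over_a_at_betac = np.array([4.45, 5.95, 7.45, 8.95])   # hypothetical
r0_tc = r0_over_a_at_betac / nt

# Continuum limit: leading cutoff effects scale like a^2 ~ 1/Nt^2,
# so fit linearly in 1/Nt^2 and read off the intercept at a -> 0.
slope, intercept = np.polyfit(1.0 / nt**2, r0_tc, 1)
print("continuum estimate of r0*Tc:", intercept)
```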
Abstract:
We consider the Schrödinger equation for a relativistic point particle in an external one-dimensional δ-function potential. Using dimensional regularization, we investigate both bound and scattering states, and we obtain results that are consistent with the abstract mathematical theory of self-adjoint extensions of the pseudodifferential operator H = √(p² + m²). Interestingly, this relatively simple system is asymptotically free. In the massless limit, it undergoes dimensional transmutation and it possesses an infrared conformal fixed point. Thus it can be used to illustrate nontrivial concepts of quantum field theory in the simpler framework of relativistic quantum mechanics.
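A sketch of why such a system is asymptotically free, using a simple momentum cutoff instead of the paper's dimensional regularization: writing the bound-state condition for H = √(p² + m²) − g δ(x) in momentum space gives 1/g = ∫ dp/(2π) 1/(√(p² + m²) − E), which diverges logarithmically, so the coupling g needed to hold a bound state at fixed E shrinks as the cutoff grows. All numbers below are illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Momentum-space bound-state condition:
#   psi(p) = g * psi(0) / (sqrt(p^2 + m^2) - E)
#   => 1/g = Integral dp/(2*pi) of 1 / (sqrt(p^2 + m^2) - E)
# The integrand falls off like 1/p, so the integral grows logarithmically
# with the momentum cutoff Lambda.
m, E = 1.0, 0.5                        # units where m = 1; any E < m is bound

def coupling(cutoff):
    """Coupling g(Lambda) that keeps the bound state at energy E."""
    integral, _ = quad(lambda p: 1.0 / (np.sqrt(p * p + m * m) - E),
                       0.0, cutoff, limit=200)
    return np.pi / integral            # factor 2 from +/-p combined with 1/(2*pi)

for cutoff in (1e2, 1e4, 1e6):
    print(f"Lambda = {cutoff:.0e}:  g = {coupling(cutoff):.4f}")   # g decreases
```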
Abstract:
We present three methods for the distortion-free enhancement of THz signals measured by electro-optic sampling in zinc blende-type detector crystals, e.g., ZnTe or GaP. A technique commonly used in optically heterodyne-detected optical Kerr effect spectroscopy is introduced, which is based on two measurements at opposite optical biases near the zero-transmission point in a crossed-polarizer detection geometry. In contrast to other techniques for undistorted THz signal enhancement, it also works in a balanced detection scheme and does not require an elaborate procedure for reconstructing the true signal, as the two measured waveforms are simply subtracted to remove distortions. We study three different approaches for setting the optical bias using the Jones matrix formalism and also discuss them in the framework of optical heterodyne detection. We show that there is an optimal bias point in realistic situations where a small fraction of the probe light is scattered by optical components. The experimental demonstration will be given in the second part of this two-paper series [J. Opt. Soc. Am. B, doc. ID 204877 (2014, posted online)].
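A toy scalar model of the bias-subtraction idea described above (not the paper's full Jones-matrix treatment): near the zero-transmission point the detected probe intensity is roughly quadratic in the sum of the optical bias and the THz-induced signal, so subtracting measurements taken at opposite biases cancels the distortion terms and leaves a contribution linear in the true waveform. All waveform parameters are made up.

```python
import numpy as np

# Detected intensity at optical bias +/-b with THz-induced signal s and
# bias-independent scattered-light background c:  I(+/-b) ~ (s +/- b)^2 + c
t = np.linspace(-2, 2, 400)                    # time delay (arb. units)
s = 0.05 * np.exp(-t**2) * np.cos(6 * t)       # mock THz waveform (small)
b = 0.5                                        # optical bias
c = 0.02                                       # scattered-light background

i_plus = (s + b) ** 2 + c
i_minus = (s - b) ** 2 + c

# Subtracting the two biased measurements removes the quadratic distortion
# and the background, leaving an enhanced, undistorted replica of the signal:
recovered = (i_plus - i_minus) / (4 * b)       # equals s exactly in this model
print(np.allclose(recovered, s))               # True
```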
Abstract:
This paper addresses two major topics concerning the role of expectations in the formation of reference points. First, we show that when expectations are present, they have a significant impact on reference point formation. Second, we find that decision-makers employ expected values when forming reference points (an integrated mechanism) rather than single possible outcomes (a segregated mechanism). Despite the importance of reference points in prospect theory, to date there is no standard method for examining them. We develop a new experimental design that employs an indirect approach and extends an existing direct approach. Our findings are consistent across the two approaches.
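A minimal numerical illustration (with hypothetical payoffs, not the paper's stimuli) of the two mechanisms compared above: an integrated reference point set at the expected value of a prospect versus segregated reference points set at single possible outcomes, and how the same realized payoff is then coded as a gain or a loss.

```python
# Simple lottery with hypothetical payoffs and probabilities
outcomes = [0.0, 100.0]
probs = [0.5, 0.5]

integrated_ref = sum(p * x for p, x in zip(probs, outcomes))   # EV = 50
segregated_refs = outcomes                                      # 0 or 100

realized = 40.0
print("vs EV reference:", "gain" if realized >= integrated_ref else "loss")
for ref in segregated_refs:
    print(f"vs single-outcome reference {ref}:",
          "gain" if realized >= ref else "loss")
```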
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic, and strong force. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, such as the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
Among resummation techniques for perturbative QCD in the context of collider and flavor physics, soft-collinear effective theory (SCET) has emerged as a powerful and versatile tool, having been applied to a large variety of processes, from B-meson decays to jet production at the LHC. This book provides a concise, pedagogical introduction to the technique. It discusses the expansion of Feynman diagrams around the high-energy limit, followed by the explicit construction of the effective Lagrangian, first for a scalar theory and then for QCD. The underlying concepts are illustrated with the quark vector form factor at large momentum transfer, and the formalism is applied to compute soft-gluon resummation and to perform transverse-momentum resummation for the Drell-Yan process using renormalization group evolution in SCET. Finally, the infrared structure of n-point gauge-theory amplitudes is analyzed by relating them to effective-theory operators. This text is suitable for graduate students and non-specialist researchers alike, as it requires only basic knowledge of perturbative QCD.