903 results for Closed Convex Sets


Relevance:

20.00%

Publisher:

Abstract:

Images of an object under different illumination are known to provide strong cues about the object surface. A mathematical formalization of how to recover the normal map of such a surface leads to the so-called uncalibrated photometric stereo problem. In the simplest instance, this problem can be reduced to the task of identifying only three parameters: the so-called generalized bas-relief (GBR) ambiguity. The challenge is to find additional general assumptions about the object that identify these parameters uniquely. Current approaches are not consistent, i.e., they provide different solutions when run multiple times on the same data. To address this limitation, we propose exploiting local diffuse reflectance (LDR) maxima, i.e., points in the scene where the normal vector is parallel to the illumination direction (see Fig. 1). We demonstrate several noteworthy properties of these maxima: a closed-form solution, computational efficiency, and GBR consistency. An LDR maximum yields a simple closed-form solution corresponding to a semi-circle in the GBR parameter space (see Fig. 2); because as few as two diffuse maxima in different images identify a unique solution, the identification of the GBR parameters can be achieved very efficiently; finally, the algorithm is consistent, as it always returns the same solution given the same data. Our algorithm is also remarkably robust: it can obtain an accurate estimate of the GBR parameters even with extremely high levels of outliers in the detected maxima (up to 80% of the observations). The method is validated on real data and achieves state-of-the-art results.
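As a rough illustration of the ambiguity discussed above (a sketch with invented toy data, not the paper's algorithm), the following snippet shows that applying a GBR transform to the albedo-scaled normals and to the lights leaves the rendered Lambertian images unchanged, which is why extra assumptions such as LDR maxima are needed to pin down the three parameters.

```python
# Minimal sketch of the generalized bas-relief (GBR) ambiguity in
# uncalibrated photometric stereo. Names and toy data are illustrative.
import numpy as np

def gbr_matrix(mu, nu, lam):
    """GBR transform G; pseudo-normals map as B -> G^{-T} B, lights as S -> G S."""
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [mu,  nu,  lam]])

rng = np.random.default_rng(0)
B = rng.normal(size=(3, 50))           # albedo-scaled normals, one column per pixel
S = rng.normal(size=(3, 4))            # one column per illumination direction

I = np.clip(B.T @ S, 0.0, None)        # Lambertian images (pixels x lights)

G = gbr_matrix(mu=0.3, nu=-0.2, lam=1.5)
B_gbr = np.linalg.inv(G).T @ B         # transformed pseudo-normals
S_gbr = G @ S                          # transformed lights

I_gbr = np.clip(B_gbr.T @ S_gbr, 0.0, None)
print(np.allclose(I, I_gbr))           # True: the images alone cannot disambiguate G
```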

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise and/or gross errors. We pose this problem as a non-convex optimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean and self-expressive dictionary plus a matrix of noise and/or gross errors. By self-expressive we mean a dictionary whose atoms can be expressed as linear combinations of themselves with low-rank coefficients. In the case of noisy data, our key contribution is to show that this non-convex matrix decomposition problem can be solved in closed form from the SVD of the noisy data matrix. The solution involves a novel polynomial thresholding operator on the singular values of the data matrix, which requires minimal shrinkage. For one subspace, a particular case of our framework leads to classical PCA, which requires no shrinkage. For multiple subspaces, the low-rank coefficients obtained by our framework can be used to construct a data affinity matrix from which the clustering of the data according to the subspaces can be obtained by spectral clustering. In the case of data corrupted by gross errors, we solve the problem using an alternating minimization approach, which combines our polynomial thresholding operator with the more traditional shrinkage-thresholding operator. Experiments on motion segmentation and face clustering show that our framework performs on par with state-of-the-art techniques at a reduced computational cost.
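The last step mentioned above, going from low-rank self-expressive coefficients to a segmentation, can be sketched as follows. This is a hedged illustration with a made-up coefficient matrix; in the paper the coefficients come from the closed-form polynomial thresholding of the singular values of the noisy data.

```python
# Hedged sketch of the final clustering step: given low-rank self-expressive
# coefficients C (so the clean dictionary D satisfies D ~ D C), build a
# symmetric affinity and cluster it spectrally.
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_from_coefficients(C, n_subspaces):
    W = np.abs(C) + np.abs(C).T              # symmetric affinity from coefficients
    model = SpectralClustering(n_clusters=n_subspaces,
                               affinity="precomputed",
                               assign_labels="kmeans",
                               random_state=0)
    return model.fit_predict(W)

# toy usage with a block-diagonal coefficient matrix (two subspaces)
C = np.zeros((20, 20))
C[:10, :10] = 0.5
C[10:, 10:] = 0.5
labels = cluster_from_coefficients(C, n_subspaces=2)
print(labels)                                # two clean clusters of ten points each
```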

Relevance:

20.00%

Publisher:

Abstract:

The motion of lung tumors during respiration makes the accurate delivery of radiation therapy to the thorax difficult because it increases the uncertainty of the target position. The adoption of four-dimensional computed tomography (4D-CT) has allowed us to determine how a tumor moves with respiration for each individual patient. Using information acquired during a 4D-CT scan, we can define the target, visualize motion, and calculate dose during the planning phase of the radiotherapy process. One image data set that can be created from the 4D-CT acquisition is the maximum-intensity projection (MIP). The MIP can be used as a starting point to define the volume that encompasses the motion envelope of the moving gross target volume (GTV). Because of the close relationship between the MIP and the final target volume, we investigated four MIP data sets created with different methodologies (three using various 4D-CT sorting implementations, and one using all available cine CT images) to compare target delineation. It has been observed that changing the 4D-CT sorting method leads to the selection of a different collection of images; however, the clinical implications of changing the constituent images of the resultant MIP data set are not clear. There has not been a comprehensive study comparing target delineation based on different 4D-CT sorting methodologies in a patient population. We selected a collection of patients who had previously undergone thoracic 4D-CT scans at our institution and who had lung tumors that moved at least 1 cm. We then generated the four MIP data sets and automatically contoured the target volumes. In doing so, we identified cases in which the MIP generated from a 4D-CT sorting process under-represented the motion envelope of the target volume by more than 10% compared with the MIP generated from all of the cine CT images. The 4D-CT sorting methods suffered from duplicate image selection and might not select the images showing the maximum tumor extent. Based on our results, we suggest using a MIP generated from the full cine CT data set to ensure a representative, inclusive tumor extent and to avoid geometric miss.
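For readers unfamiliar with the MIP construction, a minimal sketch follows: each voxel keeps its maximum value over the respiratory phase volumes, so a moving high-density target traces out its motion envelope. The array shapes and intensity values are illustrative only.

```python
# Minimal sketch of building a maximum-intensity projection (MIP) across the
# respiratory phases of a 4D-CT acquisition.
import numpy as np

def max_intensity_projection(phase_volumes):
    """phase_volumes: array of shape (n_phases, z, y, x) in Hounsfield units."""
    return np.max(phase_volumes, axis=0)

# toy 4D-CT: 10 phases of a 32^3 volume with a small bright "tumor" moving in z
phases = np.full((10, 32, 32, 32), -1000.0)
for t in range(10):
    phases[t, 10 + t, 14:18, 14:18] = 50.0   # tumor shifts one slice per phase

mip = max_intensity_projection(phases)
print((mip > -500).sum())                    # voxels covered by the motion envelope
```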

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present local stereological estimators of Minkowski tensors defined on convex bodies in ℝ^d. Special cases cover a number of well-known local stereological estimators of volume and surface area in ℝ^3, but the general set-up also provides new local stereological estimators of various types of centres of gravity and tensors of rank two. Rank-two tensors can be represented as ellipsoids and contain information about shape and orientation. The performance of some of the estimators of centres of gravity and volume tensors of rank two is investigated by simulation.
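As a loose numerical illustration of the rank-two volume tensor mentioned above (a plain Monte Carlo computation in ℝ^3 with normalization constants ignored, not one of the paper's local stereological estimators), the following sketch estimates ∫_K x xᵀ dx for an ellipsoid and reads off shape and orientation from its eigendecomposition.

```python
# Hedged sketch: Monte Carlo estimate of a rank-two volume tensor of a convex
# body K in R^3, here int_K x x^T dx up to normalization. Its eigendecomposition
# gives the ellipsoid-like summary of shape and orientation mentioned above.
import numpy as np

def volume_tensor_rank2(in_body, box_min, box_max, n_samples=200_000, seed=0):
    """in_body maps an (n, 3) array of points to a boolean mask of membership in K."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(box_min, box_max, size=(n_samples, 3))
    w = in_body(pts).astype(float)[:, None, None]
    box_vol = float(np.prod(np.asarray(box_max) - np.asarray(box_min)))
    return box_vol * (w * pts[:, :, None] * pts[:, None, :]).mean(axis=0)

# example body: an axis-aligned ellipsoid with semi-axes (2, 1, 0.5)
in_ellipsoid = lambda p: (p[:, 0] / 2.0) ** 2 + p[:, 1] ** 2 + (p[:, 2] / 0.5) ** 2 <= 1.0
T = volume_tensor_rank2(in_ellipsoid, box_min=(-2, -1, -0.5), box_max=(2, 1, 0.5))
eigvals, eigvecs = np.linalg.eigh(T)
print(np.sqrt(eigvals / eigvals.max()))   # relative semi-axis lengths of the summarizing ellipsoid
```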

Relevance:

20.00%

Publisher:

Abstract:

In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set, and the development of consistent estimators and inference procedures for it with desirable properties, are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework for analyzing random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to achieve the goals of partial identification analysis.

Relevance:

20.00%

Publisher:

Abstract:

Europeanization challenges national democratic systems. As part and parcel of the broader internationalization of politics, Europeanization is associated with a shift from policymaking within majoritarian, elected representative bodies towards technocratic decisions among non-majoritarian and non-elected bodies (Kohler-Koch and Rittberger 2008, Lavenex 2013). It is thus said to weaken the influence of citizens and parliaments on the making of policies and to undermine democratic collective identity (Lavenex 2013, Schimmelfennig 2010). The weakening of national parliaments has been referred to as “de-parliamentarisation” (Goetz and Meyer-Sahling 2008) and has nurtured a broader debate regarding the democratic deficit in the EU. While not a member of the EU, Switzerland has not remained unaffected by these changes. As discussed in the contribution by Fischer and Sciarini, state executive actors take the lead in Switzerland's European policy. They are responsible for the conduct of international negotiations, they hold the treaty-making power, and it is up to them to decide whether they wish to launch a negotiation with the EU. In addition, the strong take-it-or-leave-it character of Europeanized acts also limits the room for manoeuvre of the parliamentary body in the ratification phase. Among the public, the rejection of the treaty on the European constitution has definitively closed the era of “permissive consensus” (Hooghe and Marks 2009). However, the process of European unification remains far removed from the European public. In Switzerland, the strongly administrative character of international legislation hinders public discussion (Vögeli 2007). In such a context, the media may serve as a cue for the public: by delivering information about the extent and nature of Europeanized policymaking, the media enable citizens to form their own opinions and to hold their representatives accountable. In this sense, media coverage may not only be considered an indicator of the information delivered to the public; it may also enhance the democratic legitimacy of Europeanized policymaking (for a similar argument, see Tresch and Jochum 2005). While the previous contributions to this debate have examined the Europeanization of Swiss (primary and secondary) legislation, we take a closer look at two additional domestic arenas that are both supposed to be under pressure due to Europeanization: the parliament and the media. To that end, we rely on data gathered in a research project that two of us carried out in the context of the NCCR Democracy.1 While this project was primarily interested in the mediatization of decision-making processes in Switzerland, it also investigated the conditional role played by internationalization/Europeanization. For our present purposes, we shall exploit the two data sets that were developed as part of a study of the political agenda-setting power of the media (Sciarini and Tresch 2012, 2013, Tresch et al. 2013): a data set on issue attention in parliamentary interventions (initiatives, motions, postulates,2 interpellations and questions) and a data set on issue attention in articles from the Neue Zürcher Zeitung (NZZ). The data cover the years 1995 to 2003, and the coding of issues followed the classification system developed in the “Policy Agendas Project” (Baumgartner and Jones 1993).

Relevance:

20.00%

Publisher:

Abstract:

We present a novel approach to the reconstruction of depth from light field data. Our method uses dictionary representations and group sparsity constraints to derive a convex formulation. Although our solution results in an increase of the problem dimensionality, we keep the numerical complexity at bay by restricting the space of solutions and by exploiting an efficient primal-dual formulation. Comparisons with state-of-the-art techniques, on both synthetic and real data, show promising performance.
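As an illustrative aside (not the paper's solver), group sparsity is typically enforced in proximal and primal-dual schemes through block soft-thresholding of the dictionary coefficients; the sketch below shows that operator on a made-up grouping and threshold.

```python
# Proximal operator of a group-sparsity (group-lasso) penalty: each coefficient
# group is either zeroed (weak evidence) or shrunk toward zero as a block.
import numpy as np

def prox_group_l2(x, groups, tau):
    """Block soft-thresholding: shrink each coefficient group toward zero by tau."""
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > tau:
            out[g] = (1.0 - tau / norm) * x[g]
    return out

coeffs = np.array([0.1, -0.2, 0.05, 2.0, 1.5, -1.0])
groups = [slice(0, 3), slice(3, 6)]      # two coefficient groups
print(prox_group_l2(coeffs, groups, tau=0.5))
# first group is zeroed, second group is kept but shrunk
```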

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes an automated strategy for 3D lumbar intervertebral disc (IVD) segmentation from MRI data. Starting from two user-supplied landmarks, the geometrical parameters of all lumbar vertebral bodies and intervertebral discs are automatically extracted from a mid-sagittal slice using a graphical-model-based approach. After that, a three-dimensional (3D) variable-radius soft tube model of the lumbar spine column is built to guide the 3D disc segmentation. The disc segmentation is achieved as a multi-kernel diffeomorphic registration between a 3D template of the disc and the observed MRI data. Experiments on 15 patient data sets demonstrated the robustness and accuracy of the proposed algorithm.
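A hedged geometric sketch of what a variable-radius tube model can look like (illustrative values, not the paper's construction): centerline samples each carry a radius, and a point lies inside the tube if it falls within the radius of its nearest centerline sample.

```python
# Toy variable-radius tube: centerline points with per-point radii, plus an
# inside/outside test for query points. All numbers are illustrative.
import numpy as np

centerline = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 100.0, 21)])  # mm
radii = np.linspace(15.0, 22.0, 21)        # radius varies along the spine column

def inside_tube(points, centerline, radii):
    """Boolean mask: is each query point within the locally defined tube radius?"""
    d = np.linalg.norm(points[:, None, :] - centerline[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return d[np.arange(len(points)), nearest] <= radii[nearest]

query = np.array([[5.0, 0.0, 50.0], [30.0, 0.0, 50.0]])
print(inside_tube(query, centerline, radii))   # [ True False]
```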

Relevance:

20.00%

Publisher:

Abstract:

This article offers an overview of the development and interrelations of the individual elements of fuzzy logic, of which fuzzy set theory forms the foundation. The core problem is the handling of linguistic information, which is often characterized by imprecision. The various technical applications of fuzzy logic offer a way to construct more intelligent computer systems that can deal with imprecise information. Such systems are indications of the emergence of a new era of cognitive computing, which is also discussed in this article. For better understanding, the article is accompanied by an example from meteorology (i.e., snow in Adelboden).
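As a toy illustration of the fuzzy-set idea (the breakpoints are invented for this example and are not taken from the article), a linguistic term such as "heavy snow" can be given a graded membership function instead of a crisp threshold:

```python
# A linguistic term gets a membership degree in [0, 1] rather than a hard cutoff.
def membership_heavy_snow(depth_cm: float) -> float:
    """Degree (0..1) to which a snow depth counts as 'heavy snow'."""
    if depth_cm <= 10.0:
        return 0.0
    if depth_cm >= 30.0:
        return 1.0
    return (depth_cm - 10.0) / 20.0      # linear ramp between the breakpoints

for d in (5, 15, 25, 40):
    print(d, "cm ->", membership_heavy_snow(d))
```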

Relevance:

20.00%

Publisher:

Abstract:

This study evaluated the feasibility of documenting patterned injuries in three dimensions and true colour without complex 3D surface documentation methods. The method is based on a 3D surface model generated from radiologic slice images (CT), while the colour information is derived from photographs taken with commercially available cameras. The external patterned injuries were documented in 16 cases using digital photography as well as highly precise photogrammetry-supported 3D structured light scanning. The internal findings of the deceased were recorded using CT and MRI. For registration of the internal with the external data, two different types of radiographic markers were used and compared. The 3D surface model generated from CT slice images was linked with the photographs, and thereby digital true-colour 3D models of the patterned injuries could be created (Image projection onto CT/IprojeCT). In addition, these external models were merged with the models of the somatic interior. We demonstrated that 3D documentation and visualization of external injury findings by integrating digital photography with CT/MRI data sets is suitable for the 3D documentation of individual patterned injuries to a body. Nevertheless, this documentation method is not a substitute for photogrammetry and surface scanning, especially when the entire body surface is to be recorded in three dimensions including all external findings, and when precise data are required for comparing highly detailed injury features with the injury-inflicting tool.
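In general terms (an assumed pinhole-camera sketch with invented parameters, not the IprojeCT implementation), projecting photographic colour onto a CT-derived surface amounts to projecting each mesh vertex into a calibrated photograph and sampling the pixel it lands on:

```python
# Pinhole projection of CT-derived mesh vertices into a photograph; each vertex
# would be assigned the colour of the pixel it projects to. Camera values are toy.
import numpy as np

def project_vertices(vertices, K, R, t):
    """Pinhole projection of Nx3 world points to pixel coordinates."""
    cam = R @ vertices.T + t[:, None]            # world -> camera frame
    uv = K @ cam                                 # camera -> image plane
    return (uv[:2] / uv[2]).T                    # perspective divide -> (u, v)

K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 500.0])    # camera 0.5 m in front of the mesh
vertices = np.array([[0.0, 0.0, 0.0], [10.0, -5.0, 2.0]])

pixels = project_vertices(vertices, K, R, t)
print(pixels)    # pixel locations whose colours would be assigned to the vertices
```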