63 results for data analysis: algorithms and implementation
Abstract:
Screening for malignant disease aims to reduce the population risk of impaired health due to the tumor in question. Screening does not only entail testing but covers all steps required to achieve the intended reduction in risk, from appropriately informing the population to providing suitable therapy. Screening tests are performed in individuals who are free of, or unaware of, any symptoms associated with the tumor. An essential condition is a recognizable pathological abnormality that occurs without symptoms and represents a pre-clinical, early stage of the tumor. Overdiagnosis and overtreatment have only recently been recognized as important problems of screening for malignant disease. Overdiagnosis is defined as a screening-detected tumor that would never have led to symptoms. In prostate-specific antigen (PSA) screening for prostate cancer, 50%–70% of screening-detected cancers represent such overdiagnoses. Similarly, in mammography screening, 20%–30% of screening-detected breast cancers are overdiagnoses. The evaluation of screening interventions is often affected by biases such as healthy-screenee effects or length-time and lead-time bias. Randomized controlled trials are therefore needed to examine the efficacy and effectiveness of screening interventions and to define the rate of adverse outcomes such as unnecessary diagnostic evaluations, overdiagnosis and overtreatment. Unfortunately, there is no independent Swiss body comparable to the National Screening Committee in the United Kingdom or the United States Preventive Services Task Force that examines screening tests and programs and develops recommendations. Clearly defined goals, a central organization responsible for inviting eligible individuals, documentation and quality assurance, and balanced information of the public are important attributes of successful screening programs. In Switzerland, the establishment of such programs is hampered by the highly fragmented, federal health system, which allows patients to access specialists directly.
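As a brief illustration of one of the biases mentioned above, the following sketch simulates how lead-time bias inflates apparent survival after a screening-detected diagnosis even when the date of death is unchanged; all numbers are hypothetical and not taken from the abstract.

# Illustrative sketch (hypothetical numbers): lead-time bias inflates apparent
# survival measured from diagnosis, even if screening does not delay death at all.

import random

random.seed(0)

def simulate_patient():
    """Return (symptomatic_diagnosis_age, death_age) for one hypothetical patient."""
    diagnosis_age = random.uniform(60, 70)             # age at symptomatic diagnosis
    death_age = diagnosis_age + random.uniform(2, 6)   # death unaffected by screening
    return diagnosis_age, death_age

LEAD_TIME = 3.0  # years by which screening advances the diagnosis (assumed)

patients = [simulate_patient() for _ in range(10_000)]

# Screening moves the diagnosis earlier by the lead time, so the same deaths
# yield a longer apparent survival from diagnosis.
survival_no_screen = sum(d - dx for dx, d in patients) / len(patients)
survival_screen = sum(d - (dx - LEAD_TIME) for dx, d in patients) / len(patients)

print(f"mean survival without screening: {survival_no_screen:.1f} years")
print(f"mean survival with screening:    {survival_screen:.1f} years (death dates unchanged)")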
Abstract:
The past 1500 years provide a valuable opportunity to study the response of the climate system to external forcings. However, the integration of paleoclimate proxies with climate modeling is critical to improving the understanding of climate dynamics. In this paper, a climate system model and proxy records are therefore used to study the role of natural and anthropogenic forcings in driving the global climate. The inverse and forward approaches to paleoclimate data–model comparison are applied, and sources of uncertainty are identified and discussed. In the first of two case studies, the climate model simulations are compared with multiproxy temperature reconstructions. Robust solar and volcanic signals are detected in Southern Hemisphere temperatures, with a possible volcanic signal detected in the Northern Hemisphere. The anthropogenic signal dominates during the industrial period. It is also found that seasonal and geographical biases may cause multiproxy reconstructions to overestimate the magnitude of the long-term preindustrial cooling trend. In the second case study, the model simulations are compared with a coral δ18O record from the central Pacific Ocean. It is found that greenhouse gases, solar irradiance, and volcanic eruptions all influence the mean state of the central Pacific, but there is no evidence that natural or anthropogenic forcings have any systematic impact on El Niño–Southern Oscillation. The proxy–climate relationship is found to change over time, challenging the assumption of stationarity that underlies the interpretation of paleoclimate proxies. These case studies demonstrate the value of paleoclimate data–model comparison but also highlight the limitations of current techniques and demonstrate the need to develop alternative approaches.
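To make the forward approach mentioned above concrete, the following sketch converts synthetic model output into a pseudoproxy (here a coral δ18O stand-in depending linearly on temperature and salinity) and compares it with a synthetic observed record; the coefficients and series are assumptions, not values from the paper.

# A minimal sketch of the "forward" approach: simulated climate is converted
# into proxy units (a pseudoproxy) and compared directly with the proxy record.
# All values below are placeholders.

import numpy as np

rng = np.random.default_rng(42)
n_years = 500

# Stand-ins for model output at the proxy site (annual means).
sst = 27.0 + 0.5 * np.sin(2 * np.pi * np.arange(n_years) / 60) + rng.normal(0, 0.3, n_years)
sss = 35.0 + rng.normal(0, 0.2, n_years)  # sea-surface salinity

# Hypothetical forward model: coral d18O responds to both temperature and salinity.
a_t, a_s = -0.21, 0.27  # assumed sensitivities (per-mil per degC / per psu)
pseudo_d18o = a_t * (sst - sst.mean()) + a_s * (sss - sss.mean())

# Stand-in for the observed coral record (here: pseudoproxy plus noise).
observed_d18o = pseudo_d18o + rng.normal(0, 0.1, n_years)

# Simple comparison metric; windowed correlations could probe non-stationarity.
r = np.corrcoef(pseudo_d18o, observed_d18o)[0, 1]
print(f"correlation between pseudoproxy and observed record: r = {r:.2f}")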
Abstract:
The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as that used by T2K, exposed to a proton beam with the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper, details of the experiment, the data taking, the data analysis method, and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.
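As a rough illustration of how such hadron-production measurements typically feed into a flux prediction, the sketch below re-weights simulated parent hadrons by the measured-to-simulated yield ratio in bins of momentum and angle; the binning and yields are placeholders, not NA61/SHINE or T2K numbers.

# Hedged sketch: each simulated neutrino inherits a weight equal to the
# measured/simulated ratio of the parent-hadron yield in its (p, theta) bin.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binned yields for hadrons emitted from the target.
p_edges = np.linspace(0, 20, 11)        # GeV/c
theta_edges = np.linspace(0, 0.3, 7)    # rad
sim_yield = rng.uniform(0.8, 1.2, (10, 6))
meas_yield = sim_yield * rng.normal(1.05, 0.05, (10, 6))  # stand-in "data"

weights_table = meas_yield / sim_yield

def flux_weight(p, theta):
    """Weight for simulated neutrinos whose parent hadrons have (p, theta)."""
    ip = np.clip(np.digitize(p, p_edges) - 1, 0, 9)
    it = np.clip(np.digitize(theta, theta_edges) - 1, 0, 5)
    return weights_table[ip, it]

# Example: re-weight a toy sample of parent hadrons.
parents_p = rng.uniform(1, 19, 1000)
parents_theta = rng.uniform(0.0, 0.29, 1000)
w = flux_weight(parents_p, parents_theta)
print(f"mean flux re-weighting factor: {w.mean():.3f}")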
Abstract:
The central assumption in the literature on collaborative networks and policy networks is that political outcomes are affected by a variety of state and nonstate actors. Some of these actors are more powerful than others and can therefore have a considerable effect on decision making. In this article, we seek to provide a structural and institutional explanation for these power differentials in policy networks and support the explanation with empirical evidence. We use a dyadic measure of influence reputation as a proxy for power, and posit that influence reputation over the political outcome is related to vertical integration into the political system by means of formal decision-making authority, and to horizontal integration by means of being well embedded into the policy network. Hence, we argue that actors are perceived as influential because of two complementary factors: (a) their institutional roles and (b) their structural positions in the policy network. Based on temporal and cross-sectional exponential random graph models, we compare five cases about climate, telecommunications, flood prevention, and toxic chemicals politics in Switzerland and Germany. The five networks cover national and local networks at different stages of the policy cycle. The results confirm that institutional and structural drivers have a crucial impact on how an actor is perceived in decision making and implementation and, therefore, on its ability to significantly shape outputs and service delivery.
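The sketch below is a simplified stand-in for the analysis described above (not the exponential random graph models themselves): influence reputation is represented as a directed network, reputational power as in-degree, and the comparison checks whether actors with formal authority and central positions score higher; all data are synthetic.

# Synthetic illustration: an edge i -> j means "actor i names actor j as influential".

import random
import networkx as nx

random.seed(3)

actors = [f"actor_{k}" for k in range(30)]
has_authority = {a: (k < 8) for k, a in enumerate(actors)}  # first 8 hold formal authority

# Synthetic reputation network: actors with authority are named more often.
G = nx.DiGraph()
G.add_nodes_from(actors)
for i in actors:
    for j in actors:
        if i == j:
            continue
        p = 0.35 if has_authority[j] else 0.10
        if random.random() < p:
            G.add_edge(i, j)

reputation = dict(G.in_degree())             # vertical proxy: being named influential
embeddedness = nx.betweenness_centrality(G)  # horizontal proxy: structural position

mean_rep_auth = sum(reputation[a] for a in actors if has_authority[a]) / 8
mean_rep_other = sum(reputation[a] for a in actors if not has_authority[a]) / 22
mean_emb_auth = sum(embeddedness[a] for a in actors if has_authority[a]) / 8
mean_emb_other = sum(embeddedness[a] for a in actors if not has_authority[a]) / 22

print(f"mean reputation (in-degree), with / without formal authority: "
      f"{mean_rep_auth:.1f} / {mean_rep_other:.1f}")
print(f"mean betweenness, with / without formal authority:            "
      f"{mean_emb_auth:.3f} / {mean_emb_other:.3f}")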
Abstract:
In this paper we propose a new fully-automatic method for localizing and segmenting 3D intervertebral discs from MR images, where the two problems are solved in a unified data-driven regression and classification framework. We estimate the output (image displacements for localization, or foreground/background labels for segmentation) of image points by exploiting both training data and geometric constraints simultaneously. The problem is formulated in a unified objective function which is then solved globally and efficiently. We validate our method on MR images of 25 patients. Taking manually labeled data as the ground truth, our method achieves a mean localization error of 1.3 mm, a mean Dice metric of 87%, and a mean surface distance of 1.3 mm. Our method can be applied to other localization and segmentation tasks.
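For reference, the two evaluation metrics quoted above can be computed as in the following sketch, applied here to toy arrays rather than patient data.

# Dice overlap between predicted and manual masks, and Euclidean localization
# error between predicted and manual disc centres (toy example).

import numpy as np

def dice(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """Dice = 2|A∩B| / (|A| + |B|) for boolean masks."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    denom = pred.sum() + gt.sum()
    return 2.0 * np.logical_and(pred, gt).sum() / denom if denom else 1.0

def localization_error_mm(pred_centres, gt_centres, voxel_size_mm):
    """Mean Euclidean distance between matched disc centres, in millimetres."""
    diff = (np.asarray(pred_centres) - np.asarray(gt_centres)) * np.asarray(voxel_size_mm)
    return float(np.linalg.norm(diff, axis=1).mean())

# Toy 3D example: a small cube and a slightly shifted prediction.
gt = np.zeros((20, 20, 20), dtype=bool)
gt[8:12, 8:12, 8:12] = True
pred = np.roll(gt, shift=1, axis=0)

print(f"Dice: {dice(pred, gt):.2f}")
print(f"mean localization error: "
      f"{localization_error_mm([[10.5, 10, 10]], [[10, 10, 10]], [2.0, 1.25, 1.25]):.2f} mm")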
Abstract:
A feasibility study by Pail et al. (Can GOCE help to improve temporal gravity field estimates? In: Ouwehand L (ed) Proceedings of the 4th International GOCE User Workshop, ESA Publication SP-696, 2011b) shows that GOCE (‘Gravity field and steady-state Ocean Circulation Explorer’) satellite gravity gradiometer (SGG) data, in combination with GPS-derived orbit data (high-low satellite-to-satellite tracking: SST-hl), can be used to stabilize a bi-monthly GRACE (‘Gravity Recovery and Climate Experiment’) gravity field estimate and to reduce its striping pattern. In this study, several monthly (and bi-monthly) combinations of GRACE with GOCE SGG and GOCE SST-hl data on the basis of normal equations are investigated. Our aim is to assess the role of the gradients alone in the combination and whether a single month of GOCE observations already provides sufficient data to have an impact on the combination. Estimating clean and stable monthly GOCE SGG normal equations at high resolution (above degree/order 150) proves difficult, and the SGG component alone does not add significant value to monthly and bi-monthly GRACE gravity fields. Comparisons of GRACE-only and combined monthly and bi-monthly solutions show that the striping pattern can only be reduced when both GOCE observation types (SGG, SST-hl) are used, and mainly between degree/order 45 and 60.
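The combination on the basis of normal equations described above follows the standard least-squares scheme sketched below; the dimensions, noise levels, and weights are placeholders rather than the actual spherical-harmonic setup.

# Hedged sketch: each observation type contributes N = A^T A and b = A^T y,
# and the combined estimate solves (sum of weighted N) x = (sum of weighted b).

import numpy as np

rng = np.random.default_rng(7)
n_params = 50                      # stand-in for spherical-harmonic coefficients
x_true = rng.normal(0, 1, n_params)

def normal_equations(n_obs, noise):
    """Build N and b for one synthetic observation type."""
    A = rng.normal(0, 1, (n_obs, n_params))
    y = A @ x_true + rng.normal(0, noise, n_obs)
    return A.T @ A, A.T @ y

N_grace, b_grace = normal_equations(400, noise=0.5)
N_sgg,   b_sgg   = normal_equations(300, noise=2.0)   # gravity gradients (noisier here)
N_sst,   b_sst   = normal_equations(300, noise=1.0)   # GPS satellite-to-satellite tracking

w_sgg, w_sst = 1.0, 1.0            # relative weights (assumed, normally determined separately)

x_grace_only = np.linalg.solve(N_grace, b_grace)
x_combined = np.linalg.solve(N_grace + w_sgg * N_sgg + w_sst * N_sst,
                             b_grace + w_sgg * b_sgg + w_sst * b_sst)

print(f"GRACE-only parameter error: {np.linalg.norm(x_grace_only - x_true):.3f}")
print(f"combined parameter error:   {np.linalg.norm(x_combined - x_true):.3f}")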
Abstract:
New pollen-based reconstructions of summer (May-to-August) and winter (December-to-February) temperatures between 15 and 8 ka BP along a S-N transect in the Baltic-Belarus (BB) area display trends in temporal and spatial changes in climate variability. These results are complemented by two chironomid-based July mean temperature reconstructions. The magnitude of change compared with modern temperatures was more prominent in the northern part of the BB area. The 4 °C winter and 2 °C summer warming at the start of GI-1 was delayed in the BB area, and Lateglacial maximum temperatures were reached at ca 13.6 ka BP, being 4 °C colder than the modern mean. During the Younger Dryas, temperatures in the area were 5 °C colder than at present, as inferred from all proxies. In addition, our analyses show an early Holocene divergence in winter temperature trends, with modern values reached 1 ka earlier (10 ka BP) in southern BB compared to the northern part of the region (9 ka BP).
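One common way such proxy-based reconstructions are computed is the modern analogue technique sketched below; this generic example uses synthetic data and is not necessarily the transfer function applied in the paper.

# Modern analogue technique: for each fossil pollen assemblage, find the most
# similar modern assemblages (squared-chord distance) and average the
# temperatures observed at those modern sites. All data below are synthetic.

import numpy as np

rng = np.random.default_rng(11)
n_modern, n_taxa = 200, 12

# Synthetic modern calibration set: pollen proportions plus each site's summer temperature.
modern_pollen = rng.dirichlet(np.ones(n_taxa), size=n_modern)
modern_temp = rng.uniform(8, 18, n_modern)

def squared_chord(a, b):
    """Squared-chord distance between two pollen assemblages (proportions)."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2, axis=-1)

def mat_reconstruct(fossil_sample, k=5):
    """Mean temperature of the k closest modern analogues."""
    d = squared_chord(modern_pollen, fossil_sample)
    nearest = np.argsort(d)[:k]
    return modern_temp[nearest].mean()

fossil = rng.dirichlet(np.ones(n_taxa))       # one synthetic fossil assemblage
print(f"reconstructed summer temperature: {mat_reconstruct(fossil):.1f} °C")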
Abstract:
Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that first identifies promising subsets by employing data-mining techniques, then determines the stock weights in the subsets using mixed-binary linear programming, and finally evaluates the subsets based on cross-validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
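The sketch below illustrates only the weighting and evaluation stages in simplified form: an ordinary least-squares fit on synthetic returns stands in for the mixed-binary linear program and the data-mining stage described above.

# Given a candidate subset of constituents, fit weights that track the index
# returns in-sample and measure the out-of-sample tracking error.

import numpy as np

rng = np.random.default_rng(5)
n_days, n_stocks, subset_size = 500, 100, 10

# Synthetic constituent returns and an index built from them.
returns = rng.normal(0.0004, 0.01, (n_days, n_stocks))
index_w = rng.dirichlet(np.ones(n_stocks))
index_ret = returns @ index_w

# A candidate subset (in the paper this would come from the data-mining stage).
subset = rng.choice(n_stocks, size=subset_size, replace=False)

train, test = slice(0, 350), slice(350, None)

# Weighting-stage stand-in: least-squares weights on the training window.
w, *_ = np.linalg.lstsq(returns[train][:, subset], index_ret[train], rcond=None)
w = np.clip(w, 0, None)
w /= w.sum()                       # long-only, fully invested (assumed constraints)

# Evaluation-stage stand-in: out-of-sample tracking error (std of return differences).
diff = returns[test][:, subset] @ w - index_ret[test]
print(f"annualised out-of-sample tracking error: {diff.std() * np.sqrt(252):.2%}")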