48 results for spatial data analysis


Relevance:

100.00%

Publisher:

Abstract:

Cloud computing provides a promising solution to the genomics data deluge resulting from the advent of next-generation sequencing (NGS) technology. Under the “resources-on-demand” and “pay-as-you-go” models, scientists with no or limited infrastructure can access scalable and cost-effective computational resources. However, the large size of NGS data causes significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, in which the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
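
A minimal sketch of the streaming idea described in this abstract, assuming a read-independent analysis task: reads are consumed from an incoming FASTQ stream and processed as they arrive, so computation overlaps with transfer instead of waiting for the full upload. This illustrates the general scheme only, not the authors' elastream code; fastq_records and process_read are hypothetical names.

```python
import sys

def fastq_records(stream):
    """Yield (header, sequence, plus, quality) FASTQ records as they arrive."""
    while True:
        lines = [stream.readline() for _ in range(4)]
        if not lines[0]:
            return  # end of stream
        yield tuple(line.rstrip("\n") for line in lines)

def process_read(record):
    # Placeholder for any per-read analysis step (filtering, mapping, counting).
    header, seq, _, qual = record
    return len(seq)

if __name__ == "__main__":
    total_bases = 0
    # e.g. piped from a network transfer: per-read work overlaps with it
    for record in fastq_records(sys.stdin):
        total_bases += process_read(record)
    print(f"processed {total_bases} bases")
```

Run as, for example, `cat sample.fastq | python stream_reads.py`; in a cloud setting the pipe would come from the upload channel rather than a local file.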

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE: Tumor stage and nuclear grade are the most important prognostic parameters of clear cell renal cell carcinoma (ccRCC). The progression risk of ccRCC remains difficult to predict, particularly for tumors with organ-confined stage and intermediate differentiation grade. Elucidating molecular pathways deregulated in ccRCC may point to novel prognostic parameters that facilitate planning of therapeutic approaches. EXPERIMENTAL DESIGN: Using tissue microarrays, expression patterns of 15 different proteins were evaluated in over 800 ccRCC patients to analyze pathways reported to be physiologically controlled by the tumor suppressors von Hippel-Lindau protein and phosphatase and tensin homologue (PTEN). Tumor staging and grading were improved by performing variable selection using Cox regression and a recursive bootstrap elimination scheme. RESULTS: Patients with pT2 and pT3 tumors that were p27 and CAIX positive had a better outcome than those with all remaining marker combinations. Prolonged survival among patients with intermediate grade (grade 2) correlated with both nuclear p27 and cytoplasmic PTEN expression, as well as with inactive, nonphosphorylated ribosomal protein S6. Graphical log-linear modeling applied to over 700 ccRCCs for which the molecular parameters were available revealed only a weak conditional dependence between the expression of p27, PTEN, CAIX, and p-S6, suggesting that the dysregulation of several independent pathways is crucial for tumor progression. CONCLUSIONS: The use of recursive bootstrap elimination and graphical log-linear modeling for comprehensive tissue microarray (TMA) data analysis allows the unraveling of complex molecular contexts and may improve predictive evaluations for patients with advanced renal cancer.
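
The recursive bootstrap elimination is described only at a high level in the abstract; the following is a hedged sketch of one plausible variant, using the lifelines package. The stopping rule and significance threshold are assumptions, not the paper's exact scheme; df is assumed to contain a survival time column, an event indicator, and binary marker columns.

```python
import pandas as pd
from lifelines import CoxPHFitter

def bootstrap_eliminate(df, duration_col, event_col, n_boot=200, alpha=0.05):
    """Recursively drop the marker that is least often significant across
    bootstrap Cox fits (an assumed variant of the paper's scheme)."""
    markers = [c for c in df.columns if c not in (duration_col, event_col)]
    while len(markers) > 1:
        hits = pd.Series(0.0, index=markers)
        for _ in range(n_boot):
            sample = df.sample(len(df), replace=True)
            try:
                cph = CoxPHFitter().fit(
                    sample[[duration_col, event_col] + markers],
                    duration_col, event_col)
            except Exception:
                continue  # skip degenerate resamples
            hits += (cph.summary.loc[markers, "p"] < alpha).astype(float)
        hits /= n_boot
        weakest = hits.idxmin()
        if hits[weakest] >= 0.5:  # all remaining markers are mostly stable
            break
        markers.remove(weakest)
    return markers
```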

Relevance:

90.00%

Publisher:

Abstract:

In this article, we will link neuroimaging, data analysis, and intervention methods in an important psychiatric condition: auditory verbal hallucinations (AVH). The clinical and phenomenological background as well as neurophysiological findings will be covered and discussed with respect to noninvasive brain stimulation. Additionally, methods of noninvasive brain stimulation will be presented as ways to intervene with AVH. Finally, preliminary conclusions and possible future perspectives will be proposed.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis to assure quality when more than one coder is involved in data analysis. The literature lacks standardized procedures for assessing ICR in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of 0.67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme: low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
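
Since the abstract centers on kappa as the ICR measure, a minimal sketch of the computation may help; it uses scikit-learn's cohen_kappa_score on two coders' hypothetical code assignments, overall and per code (low per-code values flag codes that are defined too broadly).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by two coders for the same six segments.
coder_a = ["pain", "coping", "pain", "work", "coping", "pain"]
coder_b = ["pain", "coping", "work", "work", "pain", "pain"]

print("overall kappa:", round(cohen_kappa_score(coder_a, coder_b), 2))

# Per-code agreement: treat each code as present/absent and compare.
for code in sorted(set(coder_a) | set(coder_b)):
    a = [c == code for c in coder_a]
    b = [c == code for c in coder_b]
    print(code, round(cohen_kappa_score(a, b), 2))
```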

Relevance:

90.00%

Publisher:

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
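
A minimal, hedged sketch of the ROS idea for a single detection limit (the study's robust ROS handles the general case, e.g. as implemented in the NADA literature): log-transformed detects are regressed on their normal scores, and non-detects are imputed from the fitted lognormal before computing summary statistics.

```python
import numpy as np
from scipy import stats

def simple_ros(values, detection_limit):
    """Single-detection-limit ROS under a lognormal assumption."""
    detects = np.sort(values[values >= detection_limit])
    n_total = len(values)
    n_nd = n_total - len(detects)

    # Blom plotting positions for all ranks; detects occupy the upper ranks.
    pp = (np.arange(1, n_total + 1) - 0.375) / (n_total + 0.25)
    z = stats.norm.ppf(pp)

    # Regress log(detects) on their normal scores.
    slope, intercept, *_ = stats.linregress(z[n_nd:], np.log(detects))

    # Impute non-detects from the fitted line; keep detects as observed.
    imputed = np.exp(intercept + slope * z[:n_nd])
    return np.concatenate([imputed, detects])

rng = np.random.default_rng(0)
x = rng.lognormal(mean=-1.0, sigma=1.0, size=100)
dl = 0.2
censored = np.where(x < dl, 0.0, x)  # non-detects stored as 0 here
full = simple_ros(censored, dl)
print("ROS mean:", full.mean(), "naive mean:", np.where(x < dl, dl, x).mean())
```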

Relevance:

90.00%

Publisher:

Abstract:

This study presents a proxy-based, quantitative reconstruction of cold-season (mean October to May, T_Oct–May) air temperatures covering nearly the entire last millennium (AD 1060–2003, with some hiatuses). The reconstruction was based on subfossil chrysophyte stomatocyst remains in the varved sediments of the high-Alpine Lake Silvaplana, eastern Swiss Alps (46°27′N, 9°48′E, 1791 m a.s.l.). Previous studies have demonstrated the reliability of this proxy by comparison with meteorological data. Cold-season air temperatures could therefore be reconstructed quantitatively, at high resolution (5-yr) and with high chronological accuracy. Spatial correlation analysis suggests that the reconstruction reflects cold-season climate variability over the high-Alpine region and substantial parts of central and western Europe. Cold-season temperatures were characterized by a relatively stable first part of the millennium until AD 1440 (2σ of 5-yr mean values = 0.7 °C) and highly variable T_Oct–May thereafter (AD 1440–1900, 2σ of 5-yr mean values = 1.3 °C). Recent decades (AD 1991–present) were unusually warm in the context of the last millennium (exceeding the 2σ range of the mean decadal T_Oct–May), but this warmth was not unprecedented. The coolest decades occurred in AD 1510–1520 and AD 1880–1890. The timing of extremely warm and cold decades is generally in good agreement with documentary data representing Switzerland and the central European lowlands. The transition from relatively stable to highly variable T_Oct–May coincided with large changes in atmospheric circulation patterns in the North Atlantic region. Comparison of reconstructed cold-season temperatures to the North Atlantic Oscillation (NAO) index during the past 1000 years showed that the relatively stable and warm conditions at the study site until AD 1440 coincided with a persistent positive mode of the NAO. We propose that the transition to large T_Oct–May variability around AD 1440 was linked to the subsequent absence of this persistent zonal flow pattern, which would allow other climatic drivers to gain importance in the study area. From AD 1440–1900, the similarity of reconstructed T_Oct–May to reconstructed air pressure in the Siberian High suggests a relatively strong influence of continental anticyclonic systems on Alpine cold-season climate during periods when westerly airflow was subdued. A more continental type of atmospheric circulation thus seems to be characteristic of the Little Ice Age in Europe. Comparison of T_Oct–May to summer temperature reconstructions from the same study site shows that, as expected, summer and cold-season temperature trends and variability differed throughout nearly the entire last 1000 years. Since AD 1980, however, summer and cold-season temperatures show a simultaneous, strong increase that is unprecedented in the context of the last millennium. We suggest that the most likely explanation for this recent trend is anthropogenic greenhouse gas (GHG) forcing.

Relevance:

90.00%

Publisher:

Abstract:

Frequency-transformed resting EEG data have been widely used to describe normal and abnormal brain functional states as a function of spectral power in different frequency bands, yielding a series of clinically relevant findings. However, by transforming the EEG into the frequency domain, the initially excellent time resolution of time-domain EEG is lost. Topographic time-frequency decomposition is a novel computerized EEG analysis method that combines previously available techniques from time-domain spatial EEG analysis and time-frequency decomposition of single-channel time series. It yields a new, physiologically and statistically plausible topographic time-frequency representation of human multichannel EEG. The original EEG is accounted for by the coefficients of a large set of user-defined, EEG-like time series, which are optimized for maximal spatial smoothness and minimal norm. These coefficients are then reduced to a small number of model scalp field configurations, which vary in intensity as a function of time and frequency. The result is thus a small number of EEG field configurations, each with a corresponding time-frequency (Wigner) plot. The method has several advantages: it does not assume that the data are composed of orthogonal elements, it does not assume stationarity, it produces topographical maps, and it allows the inclusion of user-defined, specific EEG elements such as spike-and-wave patterns. After a formal introduction of the method, several examples are given, including artificial data and multichannel EEG recorded during different physiological and pathological conditions.
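
The exact optimization (maximal spatial smoothness, minimal norm) is specific to the paper; as a simplified stand-in, the sketch below only illustrates the general shape of the output: a per-channel time-frequency transform is factored by SVD into a few scalp field configurations, each paired with a time-frequency intensity plot.

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(1)
fs, n_ch, n_s = 250, 19, 2500            # 10 s of hypothetical 19-channel EEG
eeg = rng.standard_normal((n_ch, n_s))

# Per-channel STFT: Z has shape (n_ch, n_freq, n_time).
f, t, Z = stft(eeg, fs=fs, nperseg=256)

# Stack time-frequency points as rows, channels as columns, and factor.
X = np.abs(Z).transpose(1, 2, 0).reshape(-1, n_ch)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 3
maps = Vt[:k]                                      # k scalp field configurations
tf_plots = (U[:, :k] * s[:k]).reshape(len(f), len(t), k)
print(maps.shape, tf_plots.shape)                  # (3, 19), (n_freq, n_time, 3)
```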

Relevance:

90.00%

Publisher:

Abstract:

Cluster randomized trials (CRTs) use clusters, rather than individuals, as the unit of randomization; a cluster is usually defined as a collection of individuals sharing some common characteristics. Common examples of clusters include entire dental practices, hospitals, schools, school classes, villages, and towns. Repeated measurements taken on the same individual at different time points can also be treated as clusters. In dentistry, CRTs are applicable because patients may be treated as clusters containing several individual teeth. CRTs require specific methodological procedures during sample size calculation, randomization, data analysis, and reporting, which are often ignored in dental research publications. In general, because of the similarity of observations within clusters, each individual within a cluster provides less information than an individual in a non-clustered trial. Clustered designs therefore require larger sample sizes than non-clustered randomized designs, as well as special statistical analyses that account for the correlation of observations within clusters. The purpose of this article is to highlight, with relevant examples, the important methodological characteristics of cluster randomized designs as they may be applied in orthodontics, and to explain the problems that may arise if clustered observations are erroneously treated and analysed as independent (non-clustered).
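
The sample size inflation mentioned above is conventionally quantified by the design effect 1 + (m − 1)·ICC, where m is the average cluster size and ICC the intracluster correlation coefficient; a quick numeric sketch with hypothetical numbers:

```python
def clustered_n(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design effect."""
    design_effect = 1 + (cluster_size - 1) * icc
    return n_individual * design_effect

n_ind = 200  # n from a standard two-arm, individually randomized calculation
for icc in (0.01, 0.05, 0.2):
    # e.g. ICC 0.05 with clusters of 20 nearly doubles the required n (390)
    print(icc, round(clustered_n(n_ind, cluster_size=20, icc=icc)))
```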

Relevance:

90.00%

Publisher:

Abstract:

This paper presents an overview of the Mobile Data Challenge (MDC), a large-scale research initiative aimed at generating innovations in smartphone-based research, as well as community-based evaluation of mobile data analysis methodologies. First, we review the Lausanne Data Collection Campaign (LDCC), an initiative to collect a unique longitudinal smartphone dataset for the MDC. Then, we introduce the Open and Dedicated Tracks of the MDC, describe the specific datasets used in each of them, discuss the key design and implementation aspects introduced in order to generate privacy-preserving and scientifically relevant mobile data resources for wider use by the research community, and summarize the main research trends found among the 100+ challenge submissions. We conclude by discussing the main lessons learned from the participation of several hundred researchers worldwide in the MDC Tracks.

Relevance:

90.00%

Publisher:

Abstract:

In the present study, the challenge of analyzing complex micro X-ray diffraction (microXRD) patterns from cement–clay interfaces has been addressed. In order to extract the maximum information on both the spatial distribution and the crystal structure type associated with each of the many diffracting grains in heterogeneous, polycrystalline samples, an approach was developed in which microXRD was applied to thin sections that were rotated in the X-ray beam. The data analysis, performed on microXRD patterns collected from a filled vein of a cement–clay interface from the natural analogue in Maqarin (Jordan) and from a sample of a two-year-old altered interface between cement and argillaceous rock, demonstrates the potential of this method.

Relevance:

90.00%

Publisher:

Abstract:

Water stable isotope ratios and net snow accumulation in ice cores are commonly interpreted as temperature or precipitation proxies. However, only in a few cases has a direct calibration with instrumental data been attempted. In this study, we took advantage of the dense network of observations in the European Alpine region to rigorously test the relationship of annually and seasonally resolved proxy data from two highly resolved ice cores with local temperature and precipitation. We focused on the period 1961–2001, which offers the largest amount and highest quality of meteorological data and minimal uncertainty in ice core dating (±1 year). The two ice cores were retrieved from the Fiescherhorn glacier (northern Alps, 3900 m a.s.l.) and Grenzgletscher (southern Alps, 4200 m a.s.l.). A parallel core from the Fiescherhorn glacier allowed us to assess the reproducibility of the ice core proxy data. Because of the orographic barrier, the two flanks of the Alpine chain are affected by distinct patterns of precipitation. The different locations of the two glaciers therefore offer a unique opportunity to test whether such a specific setting is reflected in the proxy data. On a seasonal scale, a high fraction of δ18O variability was explained by the seasonal cycle of temperature (~60% for the ice cores, ~70% for the nearby stations of the Global Network of Isotopes in Precipitation, GNIP). When the seasonality is removed, the correlations decrease for all sites, indicating that factors other than temperature, such as changing moisture sources and/or precipitation regimes, affect the isotopic signal on this timescale. Post-depositional phenomena may additionally modify the ice core data. On an annual scale, the δ18O/temperature relationship was significant at the Fiescherhorn, whereas for Grenzgletscher this was the case only when weighting the temperature with precipitation. In both cases the fraction of interannual temperature variability explained was ~20%, comparable to the values obtained from the GNIP station data. Consistent with previous studies, we found an altitude effect for δ18O of −0.17‰ per 100 m for an extended elevation range combining data from the two ice core sites and four GNIP stations. Significant correlations between net accumulation and precipitation were observed for Grenzgletscher during the entire period of investigation, whereas for Fiescherhorn this was the case only for the earlier period (1961–1977). Local phenomena, probably related to wind, seem to partly disturb the Fiescherhorn accumulation record. Spatial correlation analysis shows the two glaciers to be influenced by different precipitation regimes, with Grenzgletscher reflecting the characteristic precipitation regime south of the Alps and the Fiescherhorn accumulation showing a pattern more closely linked to northern Alpine stations.
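
A small sketch of the precipitation-weighted comparison described above, with hypothetical stand-in arrays for the station and ice core series: annual temperature is computed both unweighted and weighted by monthly precipitation before correlating with annual δ18O, and r² gives the fraction of variability explained.

```python
import numpy as np

rng = np.random.default_rng(2)
years, months = 41, 12                           # e.g. 1961-2001
temp = rng.normal(0, 5, (years, months))         # monthly temperature (synthetic)
precip = rng.gamma(2.0, 50.0, (years, months))   # monthly precipitation (synthetic)
d18o = 0.05 * temp.mean(axis=1) + rng.normal(0, 0.1, years)  # synthetic proxy

t_annual = temp.mean(axis=1)
t_weighted = (temp * precip).sum(axis=1) / precip.sum(axis=1)

for name, series in [("unweighted", t_annual), ("precip-weighted", t_weighted)]:
    r = np.corrcoef(series, d18o)[0, 1]
    print(name, "r =", round(r, 2), "r^2 =", round(r * r, 2))
```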

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVES: To identify the timing of significant arch dimensional increases during orthodontic alignment involving round and rectangular nickel-titanium (NiTi) wires and rectangular stainless steel (SS) wires. A secondary aim was to compare the timing of changes occurring with conventional and self-ligating fixed appliance systems. METHODS: In this non-primary publication, additional data from a multicenter randomised trial initially involving 96 patients, aged 16 years and above, were analysed. The main pre-specified outcome measures were the magnitude and timing of changes in maxillary intercanine, interpremolar, and intermolar dimensions. Each participant underwent alignment with a standard Damon (Ormco, Orange, CA) wire sequence for a minimum of 34 weeks. Blinding of clinicians and patients was not possible; however, outcome assessors and data analysts were kept blind to the appliance type during data analysis. RESULTS: Complete data were obtained for 71 subjects. Significant arch dimensional changes were observed relatively early in treatment. In particular, changes in maxillary inter-first and second premolar dimensions occurred after alignment with a 0.014-in NiTi wire (P<0.05). No statistically significant differences were found between rectangular NiTi and working SS wires for any transverse dimension (P>0.05). Bracket type had no significant effect on the timing of the transverse dimensional changes. CONCLUSIONS: Arch dimensional changes occurred relatively early in treatment, irrespective of appliance type, and nickel-titanium wires may have a more profound effect on transverse dimensions than previously believed. CLINICAL SIGNIFICANCE: On the basis of this research, orthodontic expansion may occur relatively early in treatment.

Relevance:

90.00%

Publisher:

Abstract:

PRINCIPLES: Over a million people worldwide die each year from road traffic injuries and more than 10 million sustain permanent disabilities. Many of these victims are pedestrians. The present retrospective study analyzes the severity and mortality of injuries suffered by adult pedestrians, depending on whether they used a zebra crosswalk. METHODS: Our retrospective data analysis covered adult patients admitted to our emergency department (ED) between 1 January 2000 and 31 December 2012 after being hit by a vehicle while crossing the road as pedestrians. Patients were identified using a search string, and medical, police, and ambulance records were reviewed for data extraction. RESULTS: A total of 347 patients were eligible for study inclusion. Two hundred and three (58.5%) patients were on a zebra crosswalk and 144 (41.5%) were not. The mean ISS (Injury Severity Score) was 12.1 (SD 14.7, range 1-75). Vehicles were faster in non-zebra-crosswalk accidents (47.7 km/h versus 41.4 km/h, p<0.027). The mean ISS was higher in patients with non-zebra-crosswalk accidents: 14.4 (SD 16.5, range 1-75) versus 10.5 (SD 13.14, range 1-75) (p<0.019). Zebra crosswalk accidents were associated with a lower risk of severe injury (OR 0.61, 95% CI 0.38-0.98, p<0.042), whereas accidents involving a truck were associated with an increased risk of severe injury (OR 3.53, 95% CI 1.21-10.26, p<0.02). CONCLUSION: Accidents on zebra crosswalks are more common than those not on zebra crosswalks, and the injury severity of non-zebra-crosswalk accidents is significantly higher. Accidents involving large vehicles are associated with an increased risk of severe injury. Further prospective studies are needed, with detailed assessment of motor vehicle types and speed.
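
For reference, odds ratios like those reported above are computed from a 2×2 table with a Wald 95% confidence interval; the sketch below uses hypothetical counts, not the study data.

```python
import math

# Hypothetical 2x2 table: severe vs. non-severe injury by crosswalk use.
a, b = 40, 163   # zebra crosswalk:     severe / not severe
c, d = 50, 94    # non-zebra crosswalk: severe / not severe

or_ = (a * d) / (b * c)                      # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```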

Relevance:

90.00%

Publisher:

Abstract:

The Swiss Consultant Trust Fund (CTF) support covered the period from July to December 2007 and comprised four main tasks: (1) analysis of historic land degradation trends in the four watersheds of Zerafshan, Surkhob, Toirsu, and Vanj; (2) translation of standard CDE GIS training materials into Russian and Tajik to enable local government staff and other specialists to use geospatial data and tools; (3) demonstration of geospatial tools that show land degradation trends associated with land use and vegetative cover data in the project areas; and (4) preliminary training of government staff in using appropriate data, including existing information, global datasets, inexpensive satellite imagery, and web-based visualization tools such as spatial data viewers. The project built local awareness of, and skills in, up-to-date, inexpensive, easy-to-use GIS technologies, data sources, and applications relevant to natural resource management, especially sustainable land management. In addition to supporting the implementation of the World Bank technical assistance activity to build capacity in the use of geospatial tools for natural resource management, the Swiss CTF support also aimed at complementing the Bank's supervision work on the ongoing Community Agriculture and Watershed Management Project (CAWMP).