69 results for High-dimensional data visualization
Abstract:
Background: Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person vs. remote or simultaneous vs. sequential abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants and the participants' evaluation of webinar technology for abstraction training. Findings: A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via Survey Monkey©. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 respondents had previous medical record abstraction experience and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, among whom three had participated in a webinar for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, inter-rater agreement within sites ranged from 89 to 98%, with a weighted average of 95% agreement across sites. Conclusions: Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.
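As a rough illustration of the agreement summary reported above, the sketch below computes within-site percent agreement and a record-count-weighted average across sites; the site names, record counts and agreement values are hypothetical placeholders chosen to echo the reported 89-98% range, not the BOWII data.

```python
# Hypothetical per-site inter-rater results:
# site -> (records compared, fraction of fields in agreement).
site_results = {
    "site_A": (120, 0.98),
    "site_B": (80, 0.89),
    "site_C": (150, 0.95),
}

total = sum(n for n, _ in site_results.values())
weighted = sum(n * a for n, a in site_results.values()) / total
print(f"Weighted average agreement across sites: {weighted:.1%}")
```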
Abstract:
The immune system exhibits enormous complexity. High-throughput methods such as the "-omic" technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher-level processes) also control and regulate the molecular (lower-level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand its construction principles. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher- and lower-level processes. Moving from the molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central to such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses, as discussed in this review.
Abstract:
Early and Mid-Pleistocene climate, ocean hydrography and ice sheet dynamics have been reconstructed using a high-resolution data set (planktonic and benthic δ18O time series, faunal-based sea surface temperature (SST) reconstructions and an ice-rafted debris (IRD) record) from a high-deposition-rate sedimentary succession recovered at the Gardar Drift formation in the subpolar North Atlantic (Integrated Ocean Drilling Program Leg 306, Site U1314). Our sedimentary record spans from late in Marine Isotope Stage (MIS) 31 to MIS 19 (1069–779 ka). Different trends of the benthic and planktonic oxygen isotope, SST and IRD records before and after MIS 25 (∼940 ka) evidence the large increase in Northern Hemisphere ice volume linked to the change from 41-kyr to 100-kyr cyclicity that occurred during the Mid-Pleistocene Transition (MPT). Besides the longer glacial-interglacial (G-IG) variability, millennial-scale fluctuations were a pervasive feature across our study interval. Negative excursions in the benthic δ18O time series observed at the times of IRD events may be related to glacio-eustatic changes due to ice sheet retreats and/or to changes in deep hydrography. Time series analysis on surface water proxies (IRD, SST and planktonic δ18O) for the interval between MIS 31 and MIS 26 shows that the timing of these millennial-scale climate changes is related to half-precessional (10 kyr) components of the insolation forcing, which are interpreted as cross-equatorial heat transport toward high latitudes during both equinox insolation maxima at the equator.
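As a hedged sketch of the kind of time series analysis mentioned above, the snippet below runs a simple FFT periodogram on a synthetic, evenly sampled proxy series containing a precession-band (~21 kyr) and a half-precessional (10 kyr) cycle; it illustrates the method only and is not an analysis of the Site U1314 records.

```python
import numpy as np

dt = 0.5                        # sample spacing, kyr
t = np.arange(0.0, 210.0, dt)   # synthetic 210-kyr window
rng = np.random.default_rng(0)
proxy = (np.sin(2 * np.pi * t / 21.0)           # precession-band cycle
         + 0.5 * np.sin(2 * np.pi * t / 10.0)   # half-precessional cycle
         + 0.3 * rng.standard_normal(t.size))   # noise

freqs = np.fft.rfftfreq(t.size, d=dt)           # cycles per kyr
power = np.abs(np.fft.rfft(proxy - proxy.mean())) ** 2
top = np.argsort(power[1:])[::-1][:2] + 1       # two strongest non-zero bins
print(f"Strongest periods: {1/freqs[top[0]]:.1f} and {1/freqs[top[1]]:.1f} kyr")
```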
Abstract:
One of the major challenges for a mission to the Jovian system is the radiation tolerance of the spacecraft (S/C) and the payload. Moreover, being able to achieve science observations with high signal-to-noise ratios (SNR) while passing through the high-flux radiation zones requires additional ingenuity on the part of the instrument provider. Consequently, radiation mitigation is closely intertwined with the payload, spacecraft and trajectory design, and requires a systems-level approach. This paper presents a design for the Io Volcano Observer (IVO), a Discovery mission concept that makes multiple close encounters with Io while orbiting Jupiter. The mission aims to answer key outstanding questions about Io, especially the nature of its intense active volcanism and the internal processes that drive it. The payload includes narrow-angle and wide-angle cameras (NAC and WAC), dual fluxgate magnetometers (FGM), a thermal mapper (ThM), dual ion and neutral mass spectrometers (INMS), and dual plasma ion analyzers (PIA). The radiation mitigation is implemented by drawing upon experiences from designs and studies for missions such as the Radiation Belt Storm Probes (RBSP) and Jupiter Europa Orbiter (JEO). At the core of the radiation mitigation is IVO's inclined and highly elliptical orbit, which leads to rapid passes through the most intense radiation near Io, minimizing the total ionizing dose (177 krad behind 100 mils of Aluminum with a radiation design margin (RDM) of 2 after 7 encounters). The payload and the spacecraft are designed specifically to accommodate the fast flyby velocities (e.g. the spacecraft is radioisotope powered, remaining small and agile without any flexible appendages). The science instruments, which collect the majority of the high-priority data when close to Io and thus near the peak flux, also have to mitigate transient noise in their detectors. The cameras use a combination of shielding and CMOS detectors with extremely fast readout to minimize noise. The INMS microchannel plate detectors and PIA channel electron multipliers require additional shielding. The FGM is not sensitive to noise induced by energetic particles and the ThM microbolometer detector is nearly insensitive. Detailed SNR calculations are presented. To facilitate targeting agility, all of the spacecraft components are shielded separately, since this approach is more mass efficient than using a radiation vault. IVO uses proven radiation-hardened parts (rated at 100 krad behind equivalent shielding of 280 mils of Aluminum with an RDM of 2) and is expected to have ample mass margin to increase shielding if needed.
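The dose and margin figures quoted above lend themselves to simple design bookkeeping. The sketch below shows generic radiation-design-margin arithmetic; how the 177 krad figure and the RDM of 2 combine is not fully spelled out in the abstract, so the interpretation here (margin applied on top of the mission dose, equal per-encounter doses) is an assumption for illustration only.

```python
# Generic RDM bookkeeping; dose, margin and encounter count are from
# the abstract, but the interpretation (raw dose x margin, uniform
# per-encounter doses) is assumed for illustration.
mission_dose_krad = 177.0      # TID behind 100 mils Al
rdm = 2                        # radiation design margin
encounters = 7

design_to_dose = mission_dose_krad * rdm        # capability parts must meet
per_encounter = mission_dose_krad / encounters  # assumed uniform passes
print(f"Design-to dose: {design_to_dose:.0f} krad; "
      f"~{per_encounter:.1f} krad per encounter before margin")
```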
Abstract:
BACKGROUND: Chinese herbal medicine (CHM) is increasingly used in the West, but the evidence on its effectiveness is a matter of debate. We compared the characteristics, study quality and results of clinical trials of CHM and conventional medicine. METHODS: Comparative study of placebo-controlled trials of CHM and conventional medicine. We searched eleven bibliographic databases and hand-searched 48 Chinese-language journals. Conventional medicine trials matched for condition and type of outcome were randomly selected from the Cochrane Controlled Trials Register (issue 1, 2003). Trials described as double-blind, with adequate generation of allocation sequence and adequate concealment of allocation, were assumed to be of high quality. Data were analysed using funnel plots and multivariable meta-regression models. RESULTS: 136 CHM trials (119 published in Chinese, 17 published in English) and 136 matched conventional medicine trials (125 published in English) were analysed. The quality of Chinese-language CHM trials tended to be lower than that of English-language CHM trials and conventional medicine trials. Three (2%) CHM trials and 10 (7%) conventional medicine trials were of high quality. In all groups, smaller trials showed more beneficial treatment effects than larger trials. CHM trials published in Chinese showed considerably larger effects than CHM trials published in English (adjusted ratio of ORs 0.29, 95% confidence interval 0.17-0.52). CONCLUSIONS: Biases are present both in placebo-controlled trials of CHM and conventional medicine, but may be most pronounced in CHM trials published in Chinese-language journals. Only a few CHM trials of adequate methodology exist, and the effectiveness of CHM therefore remains poorly documented.
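To make the meta-regression idea concrete, here is a minimal sketch regressing trial effect sizes (log odds ratios) on trial size and publication language using unweighted least squares over synthetic placeholder trials; a real meta-regression would weight trials by inverse variance, and none of the numbers below come from the analysed studies.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 40
log_n = np.log(rng.integers(20, 500, n_trials))       # log trial size
chinese = rng.integers(0, 2, n_trials).astype(float)  # 1 = Chinese-language
# Synthetic effects: smaller trials and Chinese-language trials get
# more beneficial (more negative) log ORs, echoing the pattern above.
log_or = (-0.2 - 0.15 * (np.log(500) - log_n)
          + np.log(0.29) * chinese
          + 0.3 * rng.standard_normal(n_trials))

X = np.column_stack([np.ones(n_trials), log_n, chinese])
beta, *_ = np.linalg.lstsq(X, log_or, rcond=None)
print(f"Language coefficient (log-OR scale): {beta[2]:.2f}")
print(f"Implied ratio of ORs (Chinese vs English): {np.exp(beta[2]):.2f}")
```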
Abstract:
Recent brain imaging work has expanded our understanding of the mechanisms of perceptual, cognitive, and motor functions in human subjects, but research into the cerebral control of emotional and motivational function is at a much earlier stage. Important concepts and theories of emotion are briefly introduced, as are research designs and multimodal approaches to answering the central questions in the field. We provide a detailed inspection of the methodological and technical challenges in assessing the cerebral correlates of emotional activation, perception, learning, memory, and emotional regulation behavior in healthy humans. fMRI is particularly challenging in structures such as the amygdala as it is affected by susceptibility-related signal loss, image distortion, physiological and motion artifacts and colocalized Resting State Networks (RSNs). We review how these problems can be mitigated by using optimized echo-planar imaging (EPI) parameters, alternative MR sequences, and correction schemes. High-quality data can be acquired rapidly in these problematic regions with gradient compensated multiecho EPI or high resolution EPI with parallel imaging and optimum gradient directions, combined with distortion correction. Although neuroimaging studies of emotion encounter many difficulties regarding the limitations of measurement precision, research design, and strategies of validating neuropsychological emotion constructs, considerable improvement in data quality and sensitivity to subtle effects can be achieved. The methods outlined offer the prospect for fMRI studies of emotion to provide more sensitive, reliable, and representative models of measurement that systematically relate the dynamics of emotional regulation behavior with topographically distinct patterns of activity in the brain. This will provide additional information as an aid to assessment, categorization, and treatment of patients with emotional and personality disorders.
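One concrete mitigation named above is multi-echo EPI. The snippet below sketches a standard T2*-weighted combination of echoes (weights proportional to TE·exp(−TE/T2*)); the echo times, T2* value and signal are synthetic placeholders rather than parameters from any cited protocol.

```python
import numpy as np

te = np.array([12.0, 28.0, 44.0])   # echo times, ms (assumed values)
t2star = 30.0                       # assumed regional T2*, ms

# Simulated voxel signal at each echo (mono-exponential decay).
s0 = 1000.0
signal = s0 * np.exp(-te / t2star)

# T2*-weighted combination: weight each echo by TE * exp(-TE/T2*),
# normalised so the weights sum to one.
w = te * np.exp(-te / t2star)
w /= w.sum()
combined = np.sum(w * signal)
print(f"Weights: {np.round(w, 3)}, combined signal: {combined:.1f}")
```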
Abstract:
Meteorological or climatological extremes are rare and hence studying them requires long meteorological data sets. Moreover, for addressing the underlying atmospheric processes, detailed three-dimensional data are desired. Until recently the two requirements were incompatible, as long meteorological series were only available for a few locations, whereas detailed three-dimensional data sets such as reanalyses were limited to the past few decades. In 2011, the “Twentieth Century Reanalysis” (20CR) was released, a 6-hourly global atmospheric data set covering the past 140 years, thus combining the two properties. The collection of short papers in this volume contains case studies of individual extreme events in the 20CR data set. In this overview paper we introduce the first six cases and summarise some common findings. All of the events are represented in 20CR in a physically consistent way, allowing further meteorological interpretations and process studies. Also, for most of the events, the magnitudes are underestimated in the ensemble mean. Possible causes are addressed. For interpreting extrema it may be necessary to address individual ensemble members. Also, the density of observations underlying 20CR should be considered. Finally, we point to problems in wind speeds over the Arctic and the northern North Pacific in 20CR prior to the 1950s.
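The underestimation of event magnitudes in the ensemble mean has a simple mechanical explanation: members place the event at slightly different times and positions, so averaging smears the peak. The toy example below illustrates this with 56 synthetic members (20CR's ensemble size); everything else is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100)
# 56 synthetic 'members': the same 40-unit peak, jittered in time.
members = np.array([
    40.0 * np.exp(-0.5 * ((t - 50 - rng.normal(0, 5)) / 4.0) ** 2)
    for _ in range(56)
])

print(f"Mean of member maxima:    {members.max(axis=1).mean():.1f}")
print(f"Maximum of ensemble mean: {members.mean(axis=0).max():.1f}")
```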
Abstract:
Analysing historical weather extremes such as the tropical cyclone in Samoa in March 1889 could add to our understanding of extreme events. However, up to now the availability of suitable data has limited the analysis of historical extremes, particularly in remote regions. The new “Twentieth Century Reanalysis” (20CR), which provides six-hourly, three-dimensional data for the entire globe back to 1871, might provide the means to study this and other early events. While its suitability for studying historical extremes has been analysed for events in the northern extratropics (see other papers in this volume), the representation of tropical cyclones, especially in early times, remains unknown. The aim of this paper is to study the hurricane that struck Samoa on 15-16 March 1889. We analyse the event in 20CR as well as in contemporary observations. We find that the event is not reproduced in the ensemble mean of 20CR, nor is it within the ensemble spread. We argue that this is due to the paucity of data assimilated into 20CR. A preliminary compilation of historical observations from ships for that period, in contrast, provides a relatively consistent picture of the event. This shows that more observations would be available and implies that future versions of surface-based reanalyses might profit from digitizing further observations in the tropical region.
Abstract:
Guided tissue regeneration (GTR) with bioabsorbable collagen membranes (CM) is commonly used for the treatment of periodontal defects. The objective of this systematic review of randomized clinical trials was to assess the clinical efficacy of GTR procedures with CM, with or without bone substitutes, in periodontal infrabony defects compared with that of open flap debridement (OFD) alone. Primary outcomes were tooth loss and gain in clinical attachment level (CAL). Screening of records, data extraction, and risk-of-bias assessments were performed by two reviewers. Weighted mean differences were estimated by random effects meta-analysis. We included 21 reports on 17 trials. Risk of bias was generally high. No data were available for the primary outcome tooth loss. The summary treatment effect for change in CAL for GTR with CM compared with OFD was 1.58 mm (95% CI, 1.27 to 1.88). Despite large between-trial heterogeneity (I2 = 75%, p < .001), all trials favored GTR over OFD. No differences in treatment effects were detected between trials of GTR with CM alone and trials of GTR with CM in combination with bone substitutes (p for interaction, .31). GTR with CM, with or without substitutes, may result in improved clinical outcomes compared with those achieved with OFD alone. Our findings support GTR with CM for the treatment of infrabony periodontal defects.
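For readers unfamiliar with the pooling behind the 1.58 mm summary effect, here is a minimal DerSimonian-Laird random-effects sketch, including the I² heterogeneity statistic; the per-trial estimates and standard errors are illustrative placeholders, not the 17 included trials.

```python
import numpy as np

yi = np.array([1.2, 1.9, 1.4, 2.3, 1.1, 1.8])   # CAL-gain differences, mm
se = np.array([0.30, 0.40, 0.20, 0.50, 0.30, 0.40])

w = 1.0 / se**2                                 # fixed-effect weights
fixed = np.sum(w * yi) / w.sum()
q = np.sum(w * (yi - fixed) ** 2)               # Cochran's Q
df = len(yi) - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)                   # between-trial variance (DL)
i2 = max(0.0, (q - df) / q) * 100.0             # I^2, percent

w_re = 1.0 / (se**2 + tau2)                     # random-effects weights
pooled = np.sum(w_re * yi) / w_re.sum()
se_p = np.sqrt(1.0 / w_re.sum())
print(f"Pooled difference: {pooled:.2f} mm "
      f"(95% CI {pooled - 1.96*se_p:.2f} to {pooled + 1.96*se_p:.2f}), "
      f"I2 = {i2:.0f}%")
```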
Abstract:
This paper introduces and analyzes a stochastic search method for parameter estimation in linear regression models in the spirit of Beran and Millar [Ann. Statist. 15(3) (1987) 1131–1154]. The idea is to generate a random finite subset of a parameter space which will automatically contain points which are very close to an unknown true parameter. The motivation for this procedure comes from recent work of Dümbgen et al. [Ann. Statist. 39(2) (2011) 702–730] on regression models with log-concave error distributions.
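A stripped-down version of the idea reads as follows: sample a finite random subset of the parameter space and keep the candidate that best fits the data. The sketch below uses a plain residual-sum-of-squares criterion on synthetic data; the paper's actual criterion and theoretical guarantees are more refined.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
true_beta = np.array([1.5, -2.0])
X = rng.standard_normal((n, 2))
y = X @ true_beta + rng.standard_normal(n)

# Random finite subset of the parameter space: 10,000 uniform draws.
candidates = rng.uniform(-5.0, 5.0, size=(10_000, 2))
rss = ((y[:, None] - X @ candidates.T) ** 2).sum(axis=0)
best = candidates[np.argmin(rss)]
print(f"Best candidate: {best}, true parameter: {true_beta}")
```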
Abstract:
In the setting of high-dimensional linear models with Gaussian noise, we investigate the possibility of confidence statements connected to model selection. Although there exist numerous procedures for adaptive (point) estimation, the construction of adaptive confidence regions is severely limited (cf. Li in Ann Stat 17:1001–1008, 1989). The present paper sheds new light on this gap. We develop exact and adaptive confidence regions for the best approximating model in terms of risk. One of our constructions is based on a multiscale procedure and a particular coupling argument. Utilizing exponential inequalities for noncentral χ2-distributions, we show that the risk and quadratic loss of all models within our confidence region are uniformly bounded by the minimal risk times a factor close to one.
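A toy version of the flavour of such a region (not the paper's multiscale construction): in a Gaussian sequence model, form an unbiased risk estimate for every projection model and keep all models whose estimate lies within a chi-square-scale tolerance of the best; both the model family and the tolerance below are ad hoc illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 100, 1.0
theta = np.concatenate([np.linspace(2.0, 0.1, 10), np.zeros(n - 10)])
y = theta + sigma * rng.standard_normal(n)

# Candidate model k keeps the first k coordinates. Unbiased risk
# estimate: (discarded energy - its noise expectation) + k * sigma^2.
tail = np.concatenate([np.cumsum((y**2)[::-1])[::-1], [0.0]])
ks = np.arange(n + 1)
risk_est = (tail - (n - ks) * sigma**2) + ks * sigma**2

tol = 2 * sigma**2 * np.sqrt(2 * n)   # ad hoc chi-square-scale slack
region = ks[risk_est <= risk_est.min() + tol]
print(f"Models in the region: k = {region.min()} ... {region.max()}")
```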
Abstract:
In recent years, high-accuracy data for pionic hydrogen and deuterium have become the primary source of information on the pion–nucleon scattering lengths. Matching the experimental precision requires, in particular, the study of isospin-breaking corrections both in pion–nucleon and pion–deuteron scattering. We review the mechanisms that lead to the cancellation of potentially enhanced virtual-photon corrections in the pion–deuteron system, and discuss the subtleties regarding the definition of the pion–nucleon scattering lengths in the presence of electromagnetic interactions by comparing to nucleon–nucleon scattering. Based on the π±p channels, we find for the virtual-photon-subtracted scattering lengths in the isospin basis ã^{1/2} = (170.5 ± 2.0) · 10^{-3} M_π^{-1} and ã^{3/2} = (−86.5 ± 1.8) · 10^{-3} M_π^{-1}.
Abstract:
BACKGROUND: Although brucellosis (Brucella spp.) and Q Fever (Coxiella burnetii) are zoonoses of global importance, very little high-quality data are available from West Africa. METHODS/PRINCIPAL FINDINGS: A serosurvey was conducted in Togo's main livestock-raising zone in 2011 in 25 randomly selected villages, including 683 people, 596 cattle, 465 sheep and 221 goats. Additionally, 464 transhumant cattle from Burkina Faso were sampled in 2012. The serological analyses performed were the Rose Bengal Test and ELISA for brucellosis, and ELISA and the immunofluorescence assay (IFA) for Q Fever. Brucellosis did not appear to pose a major human health problem in the study zone, with only 7 seropositive participants. B. abortus was isolated from 3 bovine hygroma samples, and is likely to be the predominant circulating strain. This may explain the observed seropositivity amongst village cattle (9.2%, 95%CI: 4.3-18.6%) and transhumant cattle (7.3%, 95%CI: 3.5-14.7%), with an absence of seropositive small ruminants. Exposure of livestock and people to C. burnetii was common, potentially influenced by cultural factors. People of Fulani ethnicity had greater livestock contact and a significantly higher seroprevalence than other ethnic groups (Fulani: 45.5%, 95%CI: 37.7-53.6%; non-Fulani: 27.1%, 95%CI: 20.6-34.7%). Appropriate diagnostic test cut-off values in endemic settings require further investigation. Both brucellosis and Q Fever appeared to impact on livestock production. Seropositive cows were more likely than seronegative cows to have aborted a foetus during the previous year, when adjusted for age: the odds were 3.8 times higher (95%CI: 1.2-12.1) for brucellosis and 6.7 times higher (95%CI: 1.3-34.8) for Q Fever. CONCLUSIONS: This is the first epidemiological study of zoonoses in Togo in linked human and animal populations, providing much-needed data for West Africa. Exposure to Brucella and C. burnetii is common, but further research is needed into the clinical and economic impact.
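The age-adjusted odds ratios quoted above come from the usual logistic-regression route. Below is a hedged sketch on simulated cow-level data (the true adjusted OR is set to 3.8 so the fit has something to recover); none of these records are from the Togo survey.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
age = rng.uniform(2.0, 12.0, n)               # cow age, years (simulated)
sero = rng.integers(0, 2, n).astype(float)    # 1 = seropositive (simulated)
lin = -3.0 + 0.1 * age + np.log(3.8) * sero   # true adjusted OR = 3.8
aborted = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

# Logistic regression of abortion on serostatus, adjusted for age.
X = sm.add_constant(np.column_stack([sero, age]))
fit = sm.Logit(aborted, X).fit(disp=0)
or_sero = np.exp(fit.params[1])
lo, hi = np.exp(fit.conf_int()[1])
print(f"Age-adjusted OR for abortion: {or_sero:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```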
Abstract:
One of the current challenges in evolutionary ecology is understanding the long-term persistence of contemporary-evolving predator–prey interactions across space and time. To address this, we developed an extension of a multi-locus, multi-trait eco-evolutionary individual-based model that incorporates several interacting species in explicit landscapes. We simulated eco-evolutionary dynamics of multiple species food webs with different degrees of connectance across soil-moisture islands. A broad set of parameter combinations led to the local extinction of species, but some species persisted, and this was associated with (1) high connectance and omnivory and (2) ongoing evolution, due to multi-trait genetic variability of the embedded species. Furthermore, persistence was highest at intermediate island distances, likely because of a balance between predation-induced extinction (strongest at short island distances) and the coupling of island diversity by top predators, which by travelling among islands exert global top-down control of biodiversity. In the simulations with high genetic variation, we also found widespread trait evolutionary changes indicative of eco-evolutionary dynamics. We discuss how the ever-increasing computing power and high-resolution data availability will soon allow researchers to start bridging the in vivo–in silico gap.
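Connectance, one of the factors associated with persistence above, has a compact definition: for S species and L realised feeding links, directed connectance is C = L / S². A minimal sketch on a random placeholder adjacency matrix (not a web from the model):

```python
import numpy as np

rng = np.random.default_rng(6)
S = 12                                  # number of species
links = rng.random((S, S)) < 0.15       # True where species i eats j
np.fill_diagonal(links, False)          # ignore cannibalistic links
L = int(links.sum())
print(f"S = {S}, L = {L}, connectance C = L / S^2 = {L / S**2:.2f}")
```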
Abstract:
OBJECTIVE: This Short Communication presents a clinical case in which a novel procedure, the "Individualized Scanbody Technique" (IST), was applied, starting with an intraoral digital impression and using a CAD/CAM process for the fabrication of ceramic reconstructions on bone-level implants. MATERIAL AND METHODS: A standardized scanbody was individually modified in accordance with the created emergence profile of the provisional implant-supported restoration. Due to the specific adaptation of the scanbody, the conditioned supra-implant soft tissue complex was stabilized for the intraoral optical scan process. Then, the implant platform position and the supra-implant mucosa outline were transferred into the three-dimensional data set with a digital impression system. Within the technical workflow, the ZrO2 implant-abutment substructure could be designed virtually with predictable margins of the supra-implant mucosa. RESULTS: After finalization of the one-piece, screw-retained, full-ceramic implant crown, the restoration demonstrated an appealing treatment outcome with harmonious soft tissue architecture. CONCLUSIONS: The IST facilitates a simple and fast approach for supra-implant mucosal outline transfer in the digital workflow. Moreover, the IST closes the interfaces in the fully digital pathway.