Abstract:
Changes in oceanic heat storage (HS) can reveal important evidence of climate variability related to ocean heat fluxes. In particular, long-term variations in HS are a powerful indicator of climate change, as HS represents the balance between the net surface energy flux and the poleward heat transport by ocean currents. HS is estimated from the sea surface height anomaly measured by the TOPEX/Poseidon and Jason-1 altimeters from 1993 to 2006. To characterize and validate the altimeter-based HS in the Atlantic, we used data from the Pilot Research Moored Array in the Tropical Atlantic (PIRATA). Correlations and rms differences are used as statistical figures of merit to compare the HS estimates. The correlations range from 0.50 to 0.87 at the buoys located on the equator and in the southern part of the array. In that region the rms differences range between 0.40 and 0.51 × 10⁹ J m⁻². These results are encouraging and indicate that the altimeter has the precision necessary to capture interannual trends in HS in the Atlantic. Albeit relatively small, salinity changes can also affect the sea surface height anomaly. To account for this effect, NCEP/GODAS reanalysis data are used to estimate the haline contraction. To understand which dynamical processes are involved in the HS variability, the total signal is decomposed into a nonpropagating basin-scale and seasonal component (HS_l), planetary waves, mesoscale eddies, and a small-scale residual. In general, HS_l is the dominant signal in the tropical region. Results show a warming trend of HS_l over the past 13 years almost everywhere in the Atlantic basin, with the most prominent slopes found at high latitudes. Positive interannual trends are found in the halosteric component at high latitudes of the South Atlantic and near the Labrador Sea. This could be an indication that the salinity anomaly increased in the upper layers during this period. The dynamics of the South Atlantic subtropical gyre could also be subject to low-frequency changes caused by a trend in the halosteric component on each side of the South Atlantic Current.
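As an illustrative aside (not from the study; data hypothetical), the two figures of merit used above, Pearson correlation and the rms difference between an altimeter-derived and a buoy-derived HS series, can be computed in a few lines:

```python
import numpy as np

def compare_hs(hs_altimeter, hs_buoy):
    """Pearson correlation and rms difference between two HS time series."""
    hs_altimeter = np.asarray(hs_altimeter, dtype=float)
    hs_buoy = np.asarray(hs_buoy, dtype=float)
    corr = np.corrcoef(hs_altimeter, hs_buoy)[0, 1]
    rms = np.sqrt(np.mean((hs_altimeter - hs_buoy) ** 2))
    return corr, rms

# Hypothetical monthly HS anomalies (10^9 J m^-2) at one PIRATA buoy site.
rng = np.random.default_rng(0)
buoy = np.sin(np.linspace(0, 8 * np.pi, 168))        # 14 years of months
altimeter = buoy + 0.2 * rng.standard_normal(168)    # noisy altimeter estimate
corr, rms = compare_hs(altimeter, buoy)
print(f"correlation = {corr:.2f}, rms difference = {rms:.2f}")
```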
Abstract:
The objective of the present study was to evaluate the plasticity of the hunting behavior of the spider Nephilengys cruentata (Araneae: Nephilidae) when facing different species of social wasps. Considering that wasps consume various species of spiders and that their venom can serve as a defense against many predators, we evaluated the effect of prey body size on the behavior of N. cruentata. Predation experiments were conducted using three species of social wasps of different sizes, and the data were compiled through notes and filming of the hunting behavior of each spider toward the offered prey. The results revealed that both the size of the wasp and the sequential offering of prey change the hunting behavior of the spider, with large prey having the strongest influence on this behavior.
Abstract:
Background: Malaria caused by Plasmodium vivax is an experimentally neglected severe disease with a substantial burden on human health. Because of technical limitations, little is known about the biology of this important human pathogen. Whole genome analysis methods applied to patient-derived material are thus likely to have a substantial impact on our understanding of P. vivax pathogenesis and epidemiology. For example, they will allow the evolution and population biology of the parasite to be studied, allow parasite transmission patterns to be characterized, and may facilitate the identification of new drug resistance genes. Because parasitemias are typically low and the parasite cannot be readily cultured, on-site leukocyte depletion of blood samples is typically needed to remove human DNA, which may be 1000X more abundant than parasite DNA. These requirements have precluded the analysis of archived blood samples and demand laboratories in close proximity to the field collection sites for optimal pre-cryopreservation sample preparation. Results: Here we show that in-solution hybridization capture can be used in the laboratory to separate P. vivax DNA from contaminating human DNA without the need for on-site leukocyte filtration. Using a whole genome capture method, we were able to enrich P. vivax DNA in bulk genomic DNA from less than 0.5% to a median of 55% (range 20%-80%). This level of enrichment allows efficient analysis of the samples by whole genome sequencing and does not introduce gross biases into the data. With this method, we obtained greater than 5X coverage across 93% of the P. vivax genome for four P. vivax strains from Iquitos, Peru, similar to our results using leukocyte filtration (greater than 5X coverage across 96% of the genome). Conclusion: The whole genome capture technique will enable more efficient whole genome analysis of P. vivax from a larger geographic region and from valuable archived sample collections.
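As a hedged sketch (not part of the study), a coverage statistic of the ">5X across 93% of the genome" kind can be derived from per-base depths such as those emitted by `samtools depth -a`; the file name below is hypothetical:

```python
# Fraction of genome positions covered at >= 5X, from `samtools depth -a`
# output (tab-separated: contig, position, depth). File name is hypothetical.
MIN_DEPTH = 5

total = covered = 0
with open("pvivax_sample.depth") as fh:
    for line in fh:
        depth = int(line.rstrip("\n").split("\t")[2])
        total += 1
        covered += depth >= MIN_DEPTH

print(f"{100.0 * covered / total:.1f}% of positions at >= {MIN_DEPTH}X")
```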
Abstract:
Information on B-10 distribution in normal tissues is crucial to any further development of boron neutron capture therapy (BNCT). The goal of this study was to investigate the in vitro and in vivo boron biodistribution in B16F10 murine melanoma and normal tissues as a model for human melanoma treatment, using a simple and rapid colorimetric method validated by HR-ICP-MS. The B16F10 melanoma cell line showed a higher melanin content than human melanocytes, indicating a greater potential for boronophenylalanine uptake. The melanocytes showed a moderate decrease in viability during the first few minutes after BNCT application, stabilizing after 75 min, whereas the B16F10 melanoma showed the greatest intracellular boron concentration at 150 min after application, indicating a boron uptake in melanoma cells different from that of normal melanocytes. Moreover, at this time point, boron uptake in melanoma cells was approximately 1.6 times that of normal melanocytes. The B-10 concentration in the blood of mice bearing B16F10 melanoma increased until 90 min after BNCT application, decreased by 120 min, and remained low until 240 min. The B-10 concentration in tumors, in contrast, increased from 90 min onward and peaked at 150 min after application, confirming the in vitro results. Therefore, the present in vitro and in vivo study of B-10 uptake in normal and tumor cells provides important data that could enable the use of BNCT as a treatment for melanoma, a chemoresistant cancer associated with high mortality.
Abstract:
In the era of the Internet of Everything, a user with a handheld or wearable device equipped with sensing capability has become a producer as well as a consumer of information and services. The more powerful these devices get, the more likely it is that they will generate and share content locally, leading to the presence of distributed information sources and a diminishing role for centralized servers. In current practice, we rely on infrastructure acting as an intermediary and providing access to the data. However, infrastructure-based connectivity might not always be available or be the best alternative. Moreover, the data and the processes acting upon them are often of local scope. Queries about a nearby object, an information source, a process, an experience, an ability, etc. could be answered locally without reliance on infrastructure-based platforms. The data might have limited temporal validity and be bound to a geographical area and/or to the social context in which the user is immersed. In this envisioned scenario, users could interact locally without the need for a central authority, hence the claim of an infrastructure-less, provider-less platform. The data are owned by the users and consulted locally, as opposed to the current approach of making them available globally and persisting indefinitely. From a technical viewpoint, this network resembles a Delay/Disruption Tolerant Network, where consumers and producers might be spatially and temporally decoupled, exchanging information with each other in an ad hoc fashion. To this end, we propose several novel data gathering and dissemination strategies for use in urban-wide environments that do not rely on strict infrastructure mediation. While preserving the generality of our study, we focus our attention on practical application scenarios that help us capture the characteristics of opportunistic communication networks.
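The dissemination strategies themselves are not detailed in this summary; as a hedged sketch, the following shows epidemic forwarding, a common baseline for opportunistic networks of this kind, in which two nodes simply synchronize their message buffers whenever they meet. All names are illustrative.

```python
class Node:
    """An opportunistic-network node that carries messages between contacts."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.buffer = set()  # message ids currently carried by this node

    def publish(self, message_id):
        self.buffer.add(message_id)

    def on_contact(self, other):
        # Epidemic baseline: every encounter synchronizes both buffers,
        # so content spreads hop by hop without any infrastructure.
        merged = self.buffer | other.buffer
        self.buffer, other.buffer = set(merged), set(merged)

# Hypothetical encounter trace, in time order.
a, b, c = Node("a"), Node("b"), Node("c")
a.publish("query:nearby-landmark")
for left, right in [(a, b), (b, c)]:
    left.on_contact(right)
print(sorted(c.buffer))  # the query reached c via b, with no server involved
```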
Abstract:
Boron Neutron Capture Therapy (BNCT) is an indirect form of radiotherapy that destroys tumor cells through the targeted release of densely ionizing radiation. The released ions are fission fragments of a nuclear reaction in which the isotope 10B captures a low-energy (thermal) neutron. The 10B is accumulated in the tumor cells by means of a special boron compound, which is itself not radioactive. At the Johannes Gutenberg University Mainz, research toward a clinical treatment protocol was prompted by two compassionate-use treatments of patients with colorectal liver metastases at the University of Pavia, Italy, in which the liver was irradiated outside the body in a research reactor. As a first step, a clinical study was initiated in cooperation between several university institutes to determine clinically relevant parameters such as the boron distribution in various tissues and the pharmacokinetic uptake behavior of the boron compound. The boron concentration in the tissue samples was determined with respect to its spatial distribution across different cell areas, in order to learn more about the cellular uptake of BPA in relation to the cells' biological characteristics. Boron quantification was performed by quantitative neutron capture radiography, prompt gamma activation analysis, and inductively coupled plasma mass spectrometry, in parallel with histological analysis of the tissue. It could be shown that samples from tumor tissue and from tumor-free tissue with different morphological properties exhibit a very heterogeneous boron distribution. The results from the blood samples are being used to construct a pharmacokinetic model and agree with existing pharmacokinetic models. In addition, the boron quantification methods were cross-compared using specially prepared reference standards; good agreement between the results was found, and standard analysis protocols were established for all biological samples. The results of the clinical study obtained so far are promising, but do not yet permit final conclusions regarding the efficacy of BNCT for malignant liver disease.
Abstract:
The aim of this work is to provide a precise and accurate measurement of the 238U(n,gamma) reaction cross-section. This reaction is of fundamental importance for the design calculations of nuclear reactors, since it governs the behaviour of the reactor core. In particular, fast neutron reactors, which are attracting growing interest for their ability to burn radioactive waste, operate in the high-energy region of the neutron spectrum. In this energy region, inconsistencies of up to 15% exist between the available measurements, and the most recent evaluations disagree with each other. In addition, the assessment of nuclear data uncertainty performed for innovative reactor systems shows that the uncertainty in the radiative capture cross-section of 238U should be further reduced to 1-3% in the energy region from 20 eV to 25 keV. To this purpose, identified by the Nuclear Energy Agency as a priority nuclear data need, complementary experiments, one at GELINA and two at the n_TOF facility, were scheduled within the ANDES project of the 7th Framework Programme of the European Commission. This work presents the results of one of the 238U(n,gamma) measurements performed at the n_TOF facility at CERN, carried out with a detection system consisting of two liquid scintillators. The highly accurate cross section from this work is compared with the results of the other measurement performed at n_TOF, which exploited a different and complementary detection technique. The excellent agreement between the two data sets indicates that they can contribute to reducing the cross-section uncertainty down to the required 1-3%.
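As a worked illustration (numbers hypothetical, not from the measurement), two consistent data sets reduce the combined uncertainty when merged point by point with an inverse-variance weighted mean:

```python
import math

def weighted_mean(x1, s1, x2, s2):
    """Inverse-variance weighted mean of two measurements and its uncertainty."""
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    mean = (w1 * x1 + w2 * x2) / (w1 + w2)
    sigma = math.sqrt(1.0 / (w1 + w2))
    return mean, sigma

# Hypothetical capture cross section (barns) at one energy point, two setups.
mean, sigma = weighted_mean(0.520, 0.010, 0.514, 0.012)
print(f"combined: {mean:.3f} +/- {sigma:.3f} b")  # sigma below either input
```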
Abstract:
This paper describes a method for DRR generation and for the projection of volume gradients using hardware-accelerated 2D texture mapping and accumulation buffering, and demonstrates its application to 2D-3D registration of X-ray fluoroscopy to CT images. The robustness of the registration scheme is ensured by a coarse-to-fine processing of the volume/image pyramids based on cubic B-splines. A human cadaveric spine specimen with known ground truth was used to compare the present scheme with a purely software-based scheme in three respects: accuracy, speed, and capture range. Our experiments revealed equivalent accuracy and capture ranges but a much shorter registration time for the present scheme. More specifically, the results showed a 0.8 mm average target registration error, a 55-second average execution time per registration, and 10 mm and 10° capture ranges for the present scheme when tested on a 3.0 GHz Pentium 4 computer.
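For context, the target registration error quoted above measures how far known landmarks land under the estimated transform compared with the ground truth; a minimal numpy sketch with hypothetical values:

```python
import numpy as np

def target_registration_error(points, T_est, T_true):
    """Mean distance between landmarks mapped by estimated vs. true 4x4 transforms."""
    homog = np.hstack([points, np.ones((len(points), 1))])  # N x 4 homogeneous
    diff = (homog @ T_est.T)[:, :3] - (homog @ T_true.T)[:, :3]
    return np.linalg.norm(diff, axis=1).mean()

# Hypothetical vertebra landmarks (mm) and a slightly mistranslated estimate.
pts = np.array([[0.0, 0.0, 0.0], [10.0, 5.0, 2.0], [-4.0, 8.0, 1.0]])
T_true = np.eye(4)
T_est = np.eye(4)
T_est[:3, 3] = [0.5, -0.3, 0.4]  # ~0.7 mm translation error
print(f"TRE = {target_registration_error(pts, T_est, T_true):.2f} mm")
```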
Abstract:
Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially in dynamic VR environments with real-time requirements. To address these problems, we present two additional fast and automatic processing stages based on the motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method restricts neither the placement of marker bodies nor the recording setup, and requires only a short calibration phase.
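The extraction of main movement axes is not spelled out here; one standard way to obtain dominant movement directions from marker trajectories is principal component analysis, sketched below on hypothetical data:

```python
import numpy as np

def main_movement_axes(trajectories, k=2):
    """Top-k principal movement axes of marker trajectories.

    trajectories: array of shape (frames, markers * 3).
    Returns the axes and the per-frame coordinates along them,
    i.e. a compact parameterization of the motion.
    """
    centered = trajectories - trajectories.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axes = vt[:k]
    return axes, centered @ axes.T

# Hypothetical recording: 120 frames, 5 markers, dominated by one oscillation.
frames = np.linspace(0, 2 * np.pi, 120)
motion = np.outer(np.sin(frames), np.random.default_rng(1).standard_normal(15))
axes, params = main_movement_axes(motion, k=1)
print(axes.shape, params.shape)  # (1, 15) (120, 1)
```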
Abstract:
In this paper, we investigate how a multilinear model can be used to represent human motion data. Based on technical modes (referring to degrees of freedom and number of frames) and natural modes that typically arise in the context of a motion capture session (referring to actor, style, and repetition), the motion data is encoded in the form of a high-order tensor. This tensor is then reduced using N-mode singular value decomposition. Our experiments show that the reduced model approximates the original motion better than previously introduced PCA-based approaches. Furthermore, we discuss how the tensor representation may be used as a valuable tool for the synthesis of new motions.
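As a hedged sketch of the reduction step (not the authors' implementation), an N-mode SVD computes one factor matrix per tensor mode from the singular vectors of the corresponding unfolding, plus a reduced core tensor; the mode layout below (DOFs × frames × actors) is an assumption for illustration:

```python
import numpy as np

def unfold(t, mode):
    """Matricize tensor t along the given mode."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def hosvd(t, ranks):
    """Truncated N-mode SVD: per-mode factor matrices and reduced core."""
    factors = [np.linalg.svd(unfold(t, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = t
    for m, u in enumerate(factors):
        # Project mode m onto its reduced basis, keeping the mode order.
        core = np.moveaxis(np.tensordot(u.T, core, axes=(1, m)), 0, m)
    return core, factors

# Hypothetical motion tensor: 60 DOFs x 100 frames x 4 actors.
motion = np.random.default_rng(2).standard_normal((60, 100, 4))
core, factors = hosvd(motion, ranks=(10, 20, 4))

# Reconstruct the reduced-rank approximation of the original motion.
approx = core
for m, u in enumerate(factors):
    approx = np.moveaxis(np.tensordot(u, approx, axes=(1, m)), 0, m)
print(approx.shape)  # (60, 100, 4)
```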
Abstract:
The current state of health and biomedicine includes an enormous number of heterogeneous data 'silos', collected for different purposes and represented differently, that presently cannot be shared or analyzed in toto. The greatest challenge for large-scale and meaningful analyses of health-related data is to achieve a uniform representation for data extracted from heterogeneous sources. Based upon an analysis and categorization of these heterogeneities, a process for achieving comparable data content through a uniform terminological representation is developed. This process addresses the types of representational heterogeneity that commonly arise in healthcare data integration. Specifically, it uses a reference terminology and associated 'maps' to transform heterogeneous data into a standard representation for comparability and secondary use. Capturing the quality and precision of the 'maps' between local terms and reference terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for integrating data from heterogeneous source representations, one that can be applied and extended to other problems where heterogeneous data need to be merged.
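As a toy sketch of the mapping step (all terms, codes, and quality labels hypothetical), each local term carries a 'map' to a reference-terminology concept together with a quality annotation that downstream queries can filter on:

```python
from dataclasses import dataclass

@dataclass
class TermMap:
    """Link from a local source term to a reference-terminology concept."""
    local_term: str
    concept_id: str  # reference terminology concept identifier
    quality: str     # e.g. "exact", "broader", "narrower"

# Hypothetical maps from two pediatric-asthma data sources.
MAPS = {
    ("siteA", "wheezing episode"): TermMap("wheezing episode", "REF:0001", "exact"),
    ("siteB", "asthma attack"): TermMap("asthma attack", "REF:0001", "broader"),
}

def to_reference(source, record):
    """Rewrite a record's local term into the uniform representation."""
    m = MAPS[(source, record["term"])]
    return {**record, "concept_id": m.concept_id, "map_quality": m.quality}

print(to_reference("siteB", {"patient": 17, "term": "asthma attack"}))
```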
Abstract:
Neutron capture effects in meteorites and lunar surface samples have been used successfully in the past to study exposure histories and shielding conditions. In recent years, however, it has turned out that neutron capture effects are a nuisance for some of the short-lived radionuclide systems. The most prominent example is the 182Hf-182W system in iron meteorites, for which neutron capture lowers the 182W/184W ratio, thereby producing apparent ages that are too old. Here we present a thorough study of neutron capture effects in iron meteorites, ordinary chondrites, and carbonaceous chondrites, with the focus on iron meteorites. We study in detail the processes responsible for neutron production, neutron transport, and neutron slowing down, and find that neutron capture in all studied meteorite types is not, as usually assumed, exclusively via thermal neutrons. On the contrary, most of the neutron capture in iron meteorites occurs in the epithermal energy range, and there is a significant contribution from epithermal neutron capture even in stony meteorites. Using sophisticated particle spectra and evaluated cross-section data files for neutron capture reactions, we calculate the neutron capture effects for Sm, Gd, Cd, Pd, Pt, and Os isotopes, which can all serve as neutron-dose proxies, either in stony or in iron meteorites. In addition, we model neutron capture effects in W and Ag isotopes. For W isotopes, the GCR-induced shifts correlate perfectly with Os and Pt isotope shifts, which can therefore be used as neutron-dose proxies and permit a reliable correction. We also find that GCR-induced effects for the 107Pd-107Ag system can be significant and need to be corrected, a result that contrasts with earlier studies.
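As a worked illustration (spectrum and cross section hypothetical), the capture-rate calculation described above amounts to integrating the neutron flux spectrum against the energy-dependent capture cross section, which makes the epithermal share explicit:

```python
import numpy as np

# Hypothetical energy grid (eV), 1/E-like epithermal flux, and a 1/v cross
# section normalized at thermal energy (0.0253 eV).
energy = np.logspace(-3, 6, 500)            # 1 meV .. 1 MeV
flux = 1.0 / energy                         # arbitrary units
sigma = 10.0 / np.sqrt(energy / 0.0253)     # barns

rate = np.trapz(flux * sigma, energy)       # total capture rate (arb. units)
thermal = energy < 0.5                      # conventional cadmium cutoff
thermal_rate = np.trapz((flux * sigma)[thermal], energy[thermal])
print(f"epithermal and fast share: {1 - thermal_rate / rate:.0%}")
```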
Abstract:
Whole exome sequencing (WES) is increasingly used in research and diagnostics. WES users expect coverage of the entire coding region of known genes as well as sufficient read depth for the covered regions. It is, however, unknown which recent WES platform best meets these expectations. We present insights into the performance of the most recent standard exome enrichment platforms from Agilent, NimbleGen and Illumina, applied to six different DNA samples by two sequencing vendors per platform. Our results suggest that both Agilent and NimbleGen overall perform better than Illumina, and that the high enrichment performance of Agilent is stable across samples and between vendors, whereas NimbleGen achieves its best exome coverage only in a vendor- and sample-specific manner. Moreover, the recent Agilent platform overall captures more coding exons with sufficient read depth than NimbleGen and Illumina. Due to considerable gaps in effective exome coverage, however, the three platforms cannot capture all known coding exons, alone or in combination, and thus require improvement. Our data emphasize the importance of evaluating updated platform versions and suggest that enrichment-free whole genome sequencing can overcome the limitations of WES in sufficiently covering coding exons, especially GC-rich regions, and in characterizing structural variants.
Abstract:
Current methods for the detection of copy number variants (CNVs) and aberrations (CNAs) from targeted sequencing data are based on the depth of coverage of captured exons. Accurate CNA determination is complicated by the uneven genomic distribution and non-uniform capture efficiency of targeted exons. Here we present CopywriteR, which eludes these problems by exploiting 'off-target' sequence reads. CopywriteR extracts uniformly distributed copy number information, can be used without a reference, and can be applied to sequencing data obtained from various techniques, including chromatin immunoprecipitation and target enrichment on small gene panels. CopywriteR outperforms existing methods and constitutes a widely applicable alternative to available tools.
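For intuition only (this is not the CopywriteR code or API), the off-target idea boils down to counting reads in fixed genome-wide bins while masking the capture targets; a minimal pysam sketch with hypothetical file names and coordinates:

```python
import pysam  # assumes an indexed BAM file; names below are hypothetical

BIN_SIZE = 20_000

def offtarget_bin_counts(bam_path, contig, length, targets):
    """Read counts per genomic bin, skipping bins that overlap capture targets."""
    bam = pysam.AlignmentFile(bam_path, "rb")
    counts = {}
    for start in range(0, length, BIN_SIZE):
        end = min(start + BIN_SIZE, length)
        if any(t_start < end and start < t_end for t_start, t_end in targets):
            continue  # masked: this bin overlaps a targeted exon
        counts[start] = bam.count(contig, start, end)
    return counts

# Hypothetical targeted region on chr1, scanned over a 1 Mb window.
print(offtarget_bin_counts("tumor.bam", "chr1", 1_000_000,
                           targets=[(150_000, 152_000)]))
```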
Abstract:
Aberrant antigens expressed by tumor cells, such as in melanoma, are often associated with humoral immune responses, which may in turn influence tumor progression. Despite recent data showing the central role of adaptive immune responses in cancer spread or control, it remains poorly understood where and how tumor-derived antigen (TDA) induces a humoral immune response in tumor-bearing hosts. Based on our observation of TDA accumulation in B cell areas of lymph nodes (LNs) from melanoma patients, we developed a pre-metastatic B16.F10 melanoma model expressing a fluorescent fusion protein, tandem dimer tomato, as a surrogate TDA. Using intravital two-photon microscopy (2PM) and whole-mount 3D LN imaging of tumor-draining LNs in immunocompetent mice, we report an unexpectedly widespread accumulation of TDA on follicular dendritic cells (FDCs), which were dynamically scanned by circulating B cells. Furthermore, 2PM imaging identified macrophages located in the subcapsular sinus of tumor-draining LNs that capture subcellular TDA-containing particles arriving in the afferent lymph. Consequently, depletion of macrophages or genetic ablation of B cells and FDCs dramatically reduced TDA capture in tumor-draining LNs. In sum, we identified a major pathway for the induction of humoral responses in a melanoma model, which may be exploitable to manipulate anti-TDA antibody production during cancer immunotherapy.