832 results for Deep Belief Network, Deep Learning, Gaze, Head Pose, Surveillance, Unsupervised Learning
Abstract:
BACKGROUND: Epidermal growth factor receptor (EGFR) and its downstream factors KRAS and BRAF are mutated in several types of cancer, affecting the clinical response to EGFR inhibitors. Mutations in the EGFR kinase domain predict sensitivity to the tyrosine kinase inhibitors gefitinib and erlotinib in lung adenocarcinoma, while activating point mutations in KRAS and BRAF confer resistance to the anti-EGFR monoclonal antibody cetuximab in colorectal cancer. The development of new-generation methods for systematic mutation screening of these genes will allow more appropriate therapeutic choices. METHODS: We describe a high resolution melting (HRM) assay for mutation detection in EGFR exons 19-21, KRAS codons 12/13 and BRAF V600 using formalin-fixed paraffin-embedded samples. Somatic variation of KRAS exon 2 was also analysed by massively parallel pyrosequencing of amplicons on the GS Junior 454 platform. RESULTS: We tested 120 routine diagnostic specimens from patients with colorectal or lung cancer. Mutations in KRAS, BRAF and EGFR were observed in 41.9%, 13.0% and 11.1% of the overall samples, respectively, and were mutually exclusive. For KRAS, six types of substitutions were detected (17 G12D, 9 G13D, 7 G12C, 2 G12A, 2 G12V, 2 G12S), while V600E accounted for all the BRAF activating mutations. Regarding EGFR, two cases showed exon 19 deletions (delE746-A750 and delE746-T751insA) and another two showed substitutions in exon 21 (one had L858R together with the resistance mutation T790M in exon 20, and the other had a P848L mutation). Consistent with earlier reports, our results show that KRAS and BRAF mutation frequencies in colorectal cancer were 44.3% and 13.0%, respectively, while EGFR mutations were detected in 11.1% of the lung cancer specimens. Ultra-deep amplicon pyrosequencing successfully validated the HRM results and allowed detection and quantitation of KRAS somatic mutations. CONCLUSIONS: HRM is a rapid and sensitive method for moderate-throughput, cost-effective screening of oncogene mutations in clinical samples. Compared with Sanger sequencing for validation, next-generation sequencing yields more accurate quantitative results for somatic variation and can be run at a higher throughput.
Abstract:
Deeply incised river networks are generally regarded as robust features that are not easily modified by erosion or tectonics. Although the reorganization of deeply incised drainage systems has been documented, its importance for the overall landscape evolution of mountain ranges and the factors that permit such reorganizations are poorly understood. To address this problem, we have explored the rapid drainage reorganization that affected the Cahabon River in Guatemala during the Quaternary. Sediment-provenance analysis, field mapping, and electrical resistivity tomography (ERT) imaging are used to reconstruct the geometry of the valley before the river was captured. Dating of the abandoned valley sediments by the ¹⁰Be-²⁶Al burial method and geomagnetic polarity analysis allows us to determine the age of the capture events and then to quantify several processes, such as the rate of tectonic deformation of the paleovalley, the rate of propagation of post-capture drainage reversal, and the rate at which canyons that formed at the capture sites have propagated along the paleovalley. Transtensional faulting started 1 to 3 million years ago, produced ground tilting and ground faulting along the Cahabon River, and thus generated differential uplift rates of 0.3 ± 0.1 up to 0.7 ± 0.4 mm·y⁻¹ along the river's course. The river responded to faulting by incising the areas of relative uplift and depositing a few tens of meters of sediment above the areas of relative subsidence. The river then experienced two captures and one avulsion between 700 ky and 100 ky. The captures breached high-standing ridges that separate the Cahabon River from its captors. Captures occurred at specific points where the ridges are made permeable by fault damage zones and/or soluble rocks. Groundwater flow from the Cahabon River down to its captors likely increased the erosive power of the captors, thus promoting focused erosion of the ridges. Valley-fill formation and capture occurred in close temporal succession, suggesting a genetic link between the two. We suggest that aquifers accumulated within the valley fills, increased the head along the subterranean system connecting the Cahabon River to its captors, and promoted their development. Upon capture, the breached valley experienced widespread drainage reversal toward the capture sites. We attribute the generalized reversal to the combined effects of groundwater sapping in the valley fill, axial drainage obstruction by lateral fans, and tectonic tilting. Drainage reversal increased the size of the captured areas by a factor of 4 to 6. At the capture sites, 500-m-deep canyons have been incised into the bedrock and are propagating upstream at a rate of 3 to 11 mm·y⁻¹ while deepening at a rate of 0.7 to 1.5 mm·y⁻¹. At this rate, 1 to 2 million years will be necessary for headward erosion to completely erase the topographic expression of the paleovalley. We conclude that the rapid reorganization of this drainage system was made possible by the way the river adjusted to the new tectonic strain field, which involved transient sedimentation along the river's course. If the river had escaped its early reorganization and had been given the time necessary to reach a new dynamic equilibrium, the transient conditions that promoted capture would have vanished and its vulnerability to capture would have been strongly reduced.
Abstract:
Major coastal storms, associated with strong winds, high waves and intensified currents, and occasionally with heavy rains and flash floods, are known mostly for the serious damage they can cause along the shoreline and the threats they pose to navigation. However, there is a profound lack of knowledge about the deep-sea impacts of severe coastal storms, and concurrent measurements of key parameters along the coast and in the deep sea are extremely rare. Here we present a unique data set showing how one of the most extreme coastal storms of recent decades to lash the Western Mediterranean Sea rapidly impacted the deep-sea ecosystem. The storm peaked on 26 December 2008, leading to the remobilization of a shallow-water reservoir of marine organic carbon associated with fine particles and to its redistribution across the deep basin. The storm also initiated the movement of large amounts of coarse shelf sediment, which abraded and buried benthic communities. Our findings demonstrate, first, that severe coastal storms are highly efficient in transporting organic carbon from shallow to deep water, thus contributing to its sequestration, and, second, that natural, intermittent atmospheric drivers sensitive to global climate change have the potential to tremendously impact the largest and least known ecosystem on Earth, the deep-sea ecosystem.
Abstract:
A geophysical and geochemical study has been conducted in a fractured carbonate aquifer at Combioula in the southwestern Swiss Alps, with the objective of detecting and characterizing hydraulically active fractures along a 260-m-deep borehole. Hydrochemical analyses and borehole diameter, temperature and fluid electrical conductivity logging data were integrated in order to relate electrokinetic self-potential signals to groundwater flow inside the fracture network. The results show a generally good, albeit locally variable, correlation between variations in the self-potential signals and variations in temperature, fluid electrical conductivity and borehole diameter. Together with the hydrochemical evidence, which proved critical for the interpretation of the self-potential data, these measurements made it possible not only to detect the hydraulically active fractures but also to characterize them as zones of fluid gain or fluid loss. The results complement the information available from the corresponding litholog and illustrate the potential of electrokinetic self-potential signals, in conjunction with temperature, fluid electrical conductivity and hydrochemical analyses, for the characterization of fractured aquifers, and thus may offer a perspective for the effective quantitative characterization of this increasingly important class of aquifers and geothermal reservoirs.
Abstract:
Electrical deep brain stimulation (DBS) is an efficient method to treat movement disorders. Many models of DBS, based mostly on finite elements, have recently been proposed to better understand the interaction between the electrical stimulation and the brain tissues. In monopolar DBS, which is widely used clinically, the implanted pulse generator (IPG) serves as the reference electrode (RE). In this paper, the influence of the RE model on monopolar DBS is investigated. For that purpose, a finite element model of the full electric loop is used, in which the head, neck and upper chest are represented by simple structures such as parallelepipeds and cylinders. The tissues surrounding the electrode are accurately modelled from diffusion tensor magnetic resonance imaging (DT-MRI) data. Three different RE configurations are compared with a commonly used model of reduced size. The electrical impedance seen by the DBS system and the potential distribution are computed for each model. Moreover, axons are modelled to compute the area of tissue activated by stimulation. Results show that these indicators are influenced by the surface area and position of the RE. Using a RE model corresponding to the implanted device rather than the usual simplified model leads to an increase in system impedance (+48%) and a reduction in the area of activated tissue (-15%).
Abstract:
This thesis presents a perceptual system for a humanoid robot that integrates abilities such as object localization and recognition with the deeper developmental machinery required to forge those competences out of raw physical experiences. It shows that a robotic platform can build up and maintain a system for object localization, segmentation, and recognition, starting from very little. What the robot starts with is a direct solution to achieving figure/ground separation: it simply 'pokes around' in a region of visual ambiguity and watches what happens. If the arm passes through an area, that area is recognized as free space. If the arm collides with an object, causing it to move, the robot can use that motion to segment the object from the background. Once the robot can acquire reliable segmented views of objects, it learns from them, and from then on recognizes and segments those objects without further contact. Both low-level and high-level visual features can also be learned in this way, and examples are presented for both: orientation detection and affordance recognition, respectively. The motivation for this work is simple. Training on large corpora of annotated real-world data has proven crucial for creating robust solutions to perceptual problems such as speech recognition and face detection. But the powerful tools used during training of such systems are typically stripped away at deployment. Ideally they should remain, particularly for unstable tasks such as object detection, where the set of objects needed in a task tomorrow might be different from the set of objects needed today. The key limiting factor is access to training data, but as this thesis shows, that need not be a problem on a robotic platform that can actively probe its environment, and carry out experiments to resolve ambiguity. This work is an instance of a general approach to learning a new perceptual judgment: find special situations in which the perceptual judgment is easy and study these situations to find correlated features that can be observed more generally.
Abstract:
Sigmoid-type belief networks, a class of probabilistic neural networks, provide a natural framework for compactly representing probabilistic information in a variety of unsupervised and supervised learning problems. Often the parameters used in these networks need to be learned from examples. Unfortunately, estimating the parameters via exact probabilistic calculations (i.e., the EM algorithm) is intractable even for networks with fairly small numbers of hidden units. We propose to avoid the infeasibility of the E step by bounding likelihoods instead of computing them exactly. We introduce extended and complementary representations for these networks and show that estimation of the network parameters can be made fast (reduced to quadratic optimization) by performing the estimation in either of the alternative domains. The complementary networks can also be used for continuous density estimation.
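For reference, here is a minimal sketch of the standard sigmoid belief network parameterization and the generic Jensen (variational) lower bound that motivates bounding likelihoods rather than computing them exactly; the weight/bias notation J_ij, h_i is ours, and the extended and complementary representations mentioned above refine this generic bound rather than reproduce it.

```latex
% Binary units s_i \in \{0,1\}, topologically ordered so parents precede children.
P\big(s_i = 1 \mid \mathrm{pa}(s_i)\big)
  = \sigma\Big(\textstyle\sum_{j \in \mathrm{pa}(i)} J_{ij}\, s_j + h_i\Big),
\qquad \sigma(z) = \frac{1}{1 + e^{-z}}

% Likelihood of the visible units V with hidden units H, bounded for any
% auxiliary distribution Q(H) via Jensen's inequality; maximizing the bound
% over Q and the parameters replaces the intractable exact E step.
\log P(V) = \log \sum_{H} P(H, V)
\;\ge\; \sum_{H} Q(H)\, \log \frac{P(H, V)}{Q(H)}
```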
Abstract:
Automatic summarization of texts is now crucial for several information retrieval tasks owing to the huge amount of information available in digital media, which has increased the demand for simple, language-independent extractive summarization strategies. In this paper, we employ concepts and metrics of complex networks to select sentences for an extractive summary. The graph or network representing a piece of text consists of nodes corresponding to sentences, while edges connect sentences that share common meaningful nouns. Because various metrics could be used, we developed a set of 14 summarizers, generically referred to as CN-Summ, employing network concepts such as node degree, length of shortest paths, d-rings and k-cores. An additional summarizer selects the sentences ranked highest across the 14 systems, as in a voting scheme. When applied to a corpus of Brazilian Portuguese texts, some CN-Summ versions performed better than summarizers that do not employ deep linguistic knowledge, with results comparable to state-of-the-art summarizers based on expensive linguistic resources. The use of complex networks to represent texts therefore appears suitable for automatic summarization, consistent with the belief that the metrics of such networks may capture important text features.
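To make the graph construction concrete, here is a minimal degree-based summarizer sketch in this spirit, using networkx. Sentence splitting and the identification of "meaningful nouns" are crudely approximated by a length/stopword heuristic (our assumption, standing in for the POS tagging and lemmatization a real system would use):

```python
# Minimal sketch of a degree-based, CN-Summ-style extractive summarizer.
import itertools
import networkx as nx

STOPWORDS = {"this", "that", "with", "from", "were", "have", "been"}

def content_words(sentence):
    """Crude stand-in for 'meaningful nouns': long, non-stopword tokens."""
    words = {w.strip(".,;:!?").lower() for w in sentence.split()}
    return {w for w in words if len(w) > 3 and w not in STOPWORDS}

def summarize(text, n_sentences=2):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    g = nx.Graph()
    g.add_nodes_from(range(len(sentences)))
    # Edge between two sentences iff they share at least one content word.
    for i, j in itertools.combinations(range(len(sentences)), 2):
        if content_words(sentences[i]) & content_words(sentences[j]):
            g.add_edge(i, j)
    # Rank sentences by node degree (one of the 14 metrics named above)
    # and emit the top-ranked sentences in their original order.
    top = sorted(g.nodes, key=g.degree, reverse=True)[:n_sentences]
    return ". ".join(sentences[i] for i in sorted(top)) + "."
```

Degree is only one of the 14 metrics; shortest-path length or k-core membership would slot into the same ranking step.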
Abstract:
Background: The molecular phylogenetic relationships and population structure of the species of the Anopheles triannulatus complex (Anopheles triannulatus s.s., Anopheles halophylus and the putative species Anopheles triannulatus C) were investigated. Methods: The mitochondrial COI gene, the nuclear white gene and rDNA ITS2 of samples covering the known geographic distribution of these taxa were analyzed. Phylogenetic analyses were performed using Bayesian inference, maximum parsimony and maximum likelihood approaches. Results: Each data set analyzed separately yielded a different topology, but none provided evidence for the separation of An. halophylus and An. triannulatus C, consistent with the hypothesis that the two are undergoing incipient speciation. The phylogenetic analyses of the white gene found three main clades, whereas the statistical parsimony network detected only a single metapopulation of Anopheles triannulatus s.l. Seven COI lineages were detected by phylogenetic and network analysis. In contrast, the network, but not the phylogenetic analyses, strongly supported three ITS2 groups. Combined data analyses provided the best resolution of the trees, with two major clades, Amazonian (clade I) and trans-Andean + Amazon Delta (clade II). Clade I consists of multiple subclades: An. halophylus + An. triannulatus C; trans-Andean Venezuela; central Amazonia + central Bolivia; Atlantic coastal lowland; and Amazon delta. Clade II includes three subclades: Panama; cis-Andean Colombia; and cis-Venezuela. The Amazon delta specimens appear in both clades, likely indicating local sympatry. Spatial and molecular variance analyses detected nine groups, corroborating some of the subclades obtained in the combined data analysis. Conclusion: The combination of the three molecular markers provided the best resolution for differentiation within An. triannulatus s.s. and An. halophylus and C. The latter two species seem to be very closely related, and the analyses performed were not conclusive regarding species differentiation. Further studies including new molecular markers would be desirable to resolve this species-status question. In addition, the results of the study indicate a trans-Andean origin for An. triannulatus s.l. The potential implications for malaria epidemiology remain to be investigated.
Abstract:
Ground-based Earth troposphere calibration systems play an important role in planetary exploration, especially in radio science experiments aimed at the estimation of planetary gravity fields. In these experiments, the main observable is the spacecraft (S/C) range rate, measured from the Doppler shift of an electromagnetic wave transmitted from the ground, received by the spacecraft and coherently retransmitted back to the ground. Once solar corona and interplanetary plasma noise has been removed from the Doppler data, the Earth troposphere remains one of the main error sources in the tracking observables. Current Earth media calibration systems at NASA's Deep Space Network (DSN) stations are based upon a combination of weather data and multidirectional, dual-frequency GPS measurements acquired at each station complex. In order to support Cassini's cruise radio science experiments, a new generation of media calibration systems was developed, driven by the goal of an end-to-end Allan deviation of the radio link on the order of 3×10⁻¹⁵ at 1000 s integration time. ESA's future BepiColombo mission to Mercury carries scientific instrumentation for radio science experiments (a Ka-band transponder and a three-axis accelerometer) which, in combination with the S/C telecommunication system (an X/X/Ka transponder), will provide the most advanced tracking system ever flown on an interplanetary probe. The current error budget for MORE (Mercury Orbiter Radioscience Experiment) allows the residual uncalibrated troposphere to contribute 8×10⁻¹⁵ to the two-way Allan deviation at 1000 s integration time. The current standard ESA/ESTRACK calibration system is based on a combination of surface meteorological measurements and mathematical algorithms capable of reconstructing the Earth troposphere path delay, leaving an uncalibrated component of about 1-2% of the total delay. In order to satisfy the stringent MORE requirements, the short time-scale variations of the Earth's tropospheric water vapor content must be calibrated at the ESA deep space antennas (DSA) with more precise and stable instruments (microwave radiometers). In parallel with these high-performance instruments, ESA ground stations should be upgraded with media calibration systems at least capable of calibrating both troposphere path delay components (dry and wet) at the sub-centimetre level, in order to reduce S/C navigation uncertainties. The natural choice is to provide continuous troposphere calibration by processing GNSS data acquired at each complex by the dual-frequency receivers already installed for station location purposes. The work presented here outlines the troposphere calibration technique supporting both deep space probe navigation and radio science experiments. After an introduction to deep space tracking techniques, observables and error sources, Chapter 2 investigates the troposphere path delay in detail, reporting the estimation techniques and the state of the art of ESA and NASA troposphere calibrations. Chapter 3 analyses the status and performance of the NASA Advanced Media Calibration (AMC) system with reference to the Cassini data analysis. Chapter 4 describes the current release of the GNSS software (S/W) developed to estimate troposphere calibrations for ESA S/C navigation purposes. During the development phase of the S/W, a test campaign was undertaken to evaluate its performance.
A description of the campaign and its main results are reported in Chapter 5. Chapter 6 presents a preliminary analysis of the microwave radiometers to be used to support radio science experiments; the analysis was carried out on radiometric measurements from the ESA/ESTEC instruments installed at Cabauw (NL) and compared against the MORE requirements. Finally, Chapter 7 summarizes the results obtained and defines some key technical aspects to be evaluated and taken into account during the development phase of future instrumentation.
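Since the stability requirements above are expressed as Allan deviations at 1000 s, a minimal sketch of the underlying two-sample statistic may help. It assumes fractional-frequency residuals sampled at a fixed interval tau0 and uses the simple non-overlapping estimator; operational analyses typically use overlapping variants:

```python
# Rough sketch: non-overlapping Allan deviation of fractional-frequency
# residuals, the stability figure quoted above (e.g. 3e-15 at 1000 s).
import numpy as np

def allan_deviation(y, m):
    """Allan deviation at averaging time tau = m * tau0,
    given fractional-frequency samples y taken every tau0 seconds."""
    k = len(y) // m                                         # complete blocks
    ybar = np.mean(np.reshape(y[:k * m], (k, m)), axis=1)   # block averages
    # sigma_y^2(tau) = (1/2) * <(ybar_{k+1} - ybar_k)^2>
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

# Example: synthetic white frequency noise at the 1e-14 level, tau0 = 1 s;
# the deviation should fall roughly as 1/sqrt(tau).
y = 1e-14 * np.random.randn(100_000)
for m in (1, 10, 100, 1000):
    print(m, allan_deviation(y, m))
```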
Abstract:
In this study, new tomographic models of Colombia were computed using the seismicity recorded by the Colombian seismic network during the period 2006-2009. In this time period, the improvement of the seismic network yields more stable hypocentral results with respect to older data sets and allows us to compute new 3D Vp and Vp/Vs models. The final dataset consists of 10813 P- and 8614 S-arrival times associated with 1405 earthquakes. Tests with synthetic data and resolution analysis indicate that the velocity models are well constrained in central, western and southwestern Colombia down to a depth of 160 km; the resolution is poor in northern Colombia and close to Venezuela owing to a lack of seismic stations and seismicity. The tomographic models and the relocated seismicity indicate the existence of E-SE subducting Nazca lithosphere beneath central and southern Colombia. The north-south changes in the Wadati-Benioff zone, the Vp and Vp/Vs patterns and the volcanism show that the downgoing plate is segmented by E-W-directed slab tears, suggesting the presence of three sectors. Earthquakes in the northernmost sector represent most of the Colombian seismicity and are concentrated in the 100-170 km depth interval beneath the Eastern Cordillera. Here massive dehydration is inferred, resulting from a delay in the eclogitization of a thickened oceanic crust in a flat-subduction geometry. In this sector, a cluster of intermediate-depth seismicity (the Bucaramanga Nest) is present beneath the elbow of the Eastern Cordillera, interpreted as the result of a massive and highly localized dehydration phenomenon caused by a hyper-hydrous oceanic crust. The central and southern sectors, although different in Vp pattern, conversely show a continuous, steep and more homogeneous Wadati-Benioff zone with overlying volcanic areas. Here an oceanic crust of normal thickness is inferred, allowing gradual and continuous metamorphic reactions to take place with depth and enabling fluid migration towards the mantle wedge.
Abstract:
This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while fulfilling the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of the language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures, with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1) The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2) Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximum use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. One of the central elements in this work is therefore the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora. To illustrate four major challenges in constructing such a system: a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system; b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment; c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input; and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars and learning by unification. Then the postulation of the Learn-Alpha design rule is presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha.
The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a review of the findings, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
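As an illustration of formulas (1) and (2) above, here is a toy sketch of the acquisition loop in Python; every name in it (parse, extract_lexical_facts, the evidence threshold) is a hypothetical placeholder of ours, not part of the thesis' actual ANALYZE-LEARN-REDUCE implementation:

```python
# Toy sketch of the acquisition loop behind formulas (1) and (2):
# parse with the current lexicon, harvest lexical observations from the
# resulting structures, and fold them back into an improved lexicon.
from collections import defaultdict

EVIDENCE_THRESHOLD = 5   # hypothetical "enough input" criterion, cf. challenge (c)

def acquire(grammar, lexicon, corpus, parse, extract_lexical_facts):
    """parse and extract_lexical_facts are caller-supplied stand-ins for
    the HPSG parser and the structure-to-fact extraction, respectively."""
    evidence = defaultdict(lambda: defaultdict(int))
    for utterance in corpus:
        for structure in parse(grammar, lexicon, utterance):      # (1)
            for lexeme, feature in extract_lexical_facts(structure):
                evidence[lexeme][feature] += 1
    # (2): keep a feature only once it is sufficiently attested; dropping
    # previously stored features models revision of falsely acquired knowledge.
    return {
        lexeme: {f for f, n in feats.items() if n >= EVIDENCE_THRESHOLD}
        for lexeme, feats in evidence.items()
    }
```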
Abstract:
Peatlands deform elastically during precipitation cycles by small (±3 cm) oscillations in surface elevation. In contrast, we used a Global Positioning System network to measure larger oscillations that exceeded 20 cm over periods of 4-12 hours during two seasonal droughts at a bog and a fen site in northern Minnesota. The second summer drought also triggered 19 depressuring cycles in an overpressured stratum under the bog site. The synchronicity between the largest surface deformations and the depressuring cycles indicates that both phenomena are produced by the episodic release of large volumes of gas from deep semi-elastic compartments confined by dense wood layers. We calculate that the three largest surface deformations were associated with the release of 136 g CH₄ m⁻², which exceeds by an order of magnitude the annual average chamber fluxes measured at this site. Ebullition of gas from the deep peat may therefore be a large and previously unrecognized source of radiocarbon-depleted methane emissions from northern peatlands.
Abstract:
The study assessed the brain electric mechanisms of light and deep hypnotic conditions in the framework of EEG temporal microstates. Multichannel EEG of healthy volunteers during initial resting, light hypnosis, deep hypnosis, and eventual recovery was analyzed into temporal EEG microstates of four classes. Microstates are defined by the spatial configuration of their potential distribution maps ('potential landscapes') on the head surface. Because different potential landscapes must have been generated by different active neural assemblies, it is reasonable to assume that they also incorporate different brain functions. The observed four microstate classes were very similar to the four standard microstate classes A, B, C, D [Koenig, T. et al. Neuroimage, 2002;16:41-8] and were labeled correspondingly. We expected a progression of microstate characteristics from initial resting to light to deep hypnosis. However, all three microstate parameters (duration, occurrence/second and %time coverage) yielded values for initial resting and final recovery that were between those of the two hypnotic conditions. Microstates of classes B and D showed decreased duration, occurrence/second and %time coverage in deep hypnosis compared to light hypnosis, whereas microstates of classes A and C showed increased values of all three parameters. Reviewing the available information about microstates in other conditions, the changes from resting to light hypnosis are in certain respects reminiscent of changes seen in meditation states, and the changes to deep hypnosis of those seen in schizophrenic states.
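For readers unfamiliar with the three microstate parameters reported above, here is a small sketch of how they can be computed from a per-sample label sequence. The segmentation that produces the labels from the multichannel EEG is assumed to have been done already, and the example labels and sampling rate are ours:

```python
# Sketch: the three microstate parameters (mean duration, occurrence/second,
# %time coverage) computed from a per-sample class-label sequence.
from itertools import groupby

def microstate_parameters(labels, fs):
    """labels: per-sample class labels, e.g. 'AABBBBCCDD...'; fs: Hz."""
    total_s = len(labels) / fs
    # Contiguous runs of the same label = microstate segments.
    runs = [(cls, len(list(grp))) for cls, grp in groupby(labels)]
    params = {}
    for cls in sorted(set(labels)):
        seg = [n for c, n in runs if c == cls]
        params[cls] = {
            "duration_s": sum(seg) / len(seg) / fs,       # mean segment duration
            "occurrence_per_s": len(seg) / total_s,       # segments per second
            "coverage_pct": 100 * sum(seg) / len(labels)  # %time in this class
        }
    return params

print(microstate_parameters("AAABBBBCCDDDDAA", fs=250.0))
```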