934 results for bottom-up learning
Abstract:
The role of low-level stimulus-driven control in the guidance of overt visual attention has been difficult to establish because low- and high-level visual content are spatially correlated within natural visual stimuli. Here we show that impairment of parietal cortical areas, either permanently by a lesion or reversibly by repetitive transcranial magnetic stimulation (rTMS), leads to fixation of locations with higher values of low-level features compared with control subjects or with a no-rTMS condition. Moreover, this unmasking of stimulus-driven control crucially depends on the intrahemispheric balance between top-down and bottom-up cortical areas. This result suggests that although high-level features might exert a strong influence in normal behavior, low-level features do contribute to guiding visual selection during the exploration of complex natural stimuli.
Abstract:
This paper proposes a sequential coupling of a Hidden Markov Model (HMM) recognizer for offline handwritten English sentences with a probabilistic bottom-up chart parser using Stochastic Context-Free Grammars (SCFG) extracted from a text corpus. Based on extensive experiments, we conclude that syntax analysis helps to improve recognition rates significantly.
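The paper's recognizer and grammar are only summarized above; as a rough, hypothetical illustration of what a probabilistic bottom-up chart parser for an SCFG does, the sketch below implements the CYK variant for a toy grammar in Chomsky normal form and scores a candidate word sequence. All rules, probabilities, and words are invented for illustration and are not taken from the paper.

```python
from collections import defaultdict
import math

# Hypothetical SCFG in Chomsky normal form: binary rules A -> B C and
# lexical rules A -> w, each with an illustrative probability.
BINARY = {("S", ("NP", "VP")): 1.0,
          ("NP", ("DT", "NN")): 0.6,
          ("VP", ("VBZ", "NP")): 1.0}
LEXICAL = {("DT", "the"): 0.7, ("NN", "dog"): 0.3,
           ("NN", "ball"): 0.2, ("VBZ", "sees"): 0.4}

def cyk_log_prob(words, start="S"):
    """Bottom-up CYK chart parsing: return the log-probability of the best
    parse of `words` rooted in `start`, or None if no parse exists."""
    n = len(words)
    # chart[(i, j)] maps a non-terminal to its best log-prob over words[i:j]
    chart = defaultdict(dict)
    for i, w in enumerate(words):                      # lexical cells first
        for (a, word), p in LEXICAL.items():
            if word == w:
                best = chart[(i, i + 1)].get(a, -math.inf)
                chart[(i, i + 1)][a] = max(best, math.log(p))
    for span in range(2, n + 1):                       # grow spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                  # every split point
                for (a, (b, c)), p in BINARY.items():
                    if b in chart[(i, k)] and c in chart[(k, j)]:
                        score = (math.log(p) + chart[(i, k)][b]
                                 + chart[(k, j)][c])
                        if score > chart[(i, j)].get(a, -math.inf):
                            chart[(i, j)][a] = score
    return chart[(0, n)].get(start)

print(cyk_log_prob(["the", "dog", "sees", "the", "ball"]))
```

In a sequential coupling of the kind described, parse scores like this one would be combined with the HMM recognizer's word hypotheses to rerank candidate sentences.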
Abstract:
Unraveling intra- and inter-cellular signaling networks managing cell-fate control, coordinating complex differentiation regulatory circuits and shaping tissues and organs in living systems remain major challenges in the post-genomic era. Resting on the laurels of past-century monolayer culture technologies, the cell culture community has only recently begun to appreciate the potential of three-dimensional mammalian cell culture systems to reveal the full scope of mechanisms orchestrating the tissue-like cell quorum in space and time. Capitalizing on gravity-enforced self-assembly of monodispersed primary embryonic mouse cells in hanging drops, we designed and characterized a three-dimensional cell culture model for ganglion-like structures. Within 24 h, a mixture of mouse embryonic fibroblasts (MEF) and dorsal root ganglion (DRG)-derived cells (sensory neurons and Schwann cells) grown in hanging drops assembled into coherent spherical microtissues characterized by a MEF feeder core and a peripheral layer of DRG-derived cells. In a time-dependent manner, sensory neurons formed a polar ganglion-like cap structure, which coordinated guided axonal outgrowth and innervation of the distal pole of the MEF feeder spheroid. Schwann cells, present in embryonic DRG isolates, tended to align along axonal structures and myelinate them in an in vivo-like manner. Whenever cultivation exceeded 10 days, DRG:MEF-based microtissues disintegrated due to an as yet unknown mechanism. Using a transgenic MEF feeder spheroid, engineered for gaseous acetaldehyde-inducible interferon-beta (ifn-beta) production by cotransduction of retro-/lentiviral particles, a short 6-h ifn-beta induction was sufficient to rescue the integrity of DRG:MEF spheroids and enable long-term cultivation of these microtissues. In hanging drops, such microtissues fused into higher-order macrotissue-like structures, which may pave the way for sophisticated bottom-up tissue engineering strategies. DRG:MEF-based artificial micro- and macrotissue design accurately reproduced key morphological aspects of ganglia and exemplified the potential of self-assembled, scaffold-free multicellular micro-/macrotissues to provide new insight into organogenesis.
Abstract:
As environmental problems have become more complex, policy and regulatory decisions have become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA's Council for Regulatory Environmental Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What have been the impediments that the CREM has faced, and why did these impediments occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining a history of modeling at the EPA, as well as a history of the CREM, provides insight into the many challenges that are faced when implementing science policy and science policy programs. After examining the many impediments that the CREM has faced in implementing modeling policies, it was clear that the impediments fall into two separate categories: classic and paradoxical. The classic impediments include the more standard impediments to science policy implementation that might be found in any regulatory environment, such as lack of resources and changes in administration. Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. These impediments, when not properly addressed, severely hinder the ability of organizations to implement science policy successfully.
Abstract:
The remarkable advances in nanoscience and nanotechnology over the last two decades allow one to manipulate individual atoms, molecules and nanostructures, make it possible to build devices only a few nanometers in size, and enhance the nano-bio fusion in tackling biological and medical problems. This complies with the ever-increasing need for device miniaturization, from magnetic storage devices and electronic building blocks for computers to chemical and biological sensors. Despite continuing efforts, conventional methods are likely to reach the fundamental limit of miniaturization in the next decade, when feature lengths shrink below 100 nm. On the one hand, quantum mechanical effects of the underlying material structure dominate device characteristics; on the other hand, one faces the technical difficulty of fabricating uniform devices. This has posed a great challenge for both the scientific and technical communities. The proposal of using a single or a few organic molecules in electronic devices has not only opened an alternative route to miniaturization in electronics, but also brought up brand-new concepts and physical working mechanisms for electronic devices. This thesis work stands as one of the efforts in understanding and building electronic functional units at the molecular and atomic levels. We have explored the possibility of having molecules work in a wide spectrum of electronic devices, including molecular wires, spin valves/switches, diodes, transistors, and sensors. More specifically, we have observed a significant magnetoresistive effect in a spin-valve structure in which the non-magnetic spacer sandwiched between two magnetic conducting materials is replaced by a self-assembled monolayer of organic molecules or a single molecule (such as a carbon fullerene). The diode behavior in donor(D)-bridge(B)-acceptor(A) type single molecules is then discussed, and a unimolecular transistor is designed. Lastly, we have proposed and preliminarily tested the idea of using functionalized electrodes for rapid nanopore DNA sequencing. In these studies, the fundamental roles of molecules and molecule-electrode interfaces in quantum electron transport have been investigated based on first-principles calculations of the electronic structure. Both the intrinsic properties of the molecules themselves and the detailed interfacial features are found to play critical roles in electron transport at the molecular scale. The flexibility and tailorability of the properties of molecules have opened great opportunities for a purpose-driven design of electronic devices from the bottom up. The results gained from this work have helped in understanding the underlying physics, developing the fundamental mechanisms and providing guidance for future experimental efforts.
Abstract:
The developmental processes and functions of an organism are controlled by its genes and the proteins derived from those genes. The identification of key genes and the reconstruction of gene networks can provide a model to help us understand the regulatory mechanisms for the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I have developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and also evaluated some existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, the dissertation has two major parts, together forming six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks using gene expression data (chapters 2, 3, and 6). For the first part, I have developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted sum method (WS) in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to GAW 17 data and was able to find several disease risk genes. For the second part, I have worked on three problems. The first problem involved the evaluation of eight gene association methods; a comprehensive comparison with further analysis clearly demonstrates the distinct and shared performance characteristics of these eight methods. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks. This algorithm has produced very significant results, and it is the first report of such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, called the top-down graphical Gaussian model, that identifies the network governed by a specific TF. The network produced by the algorithm was shown to be highly accurate.
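The dissertation's bottom-up and top-down algorithms are not specified above; as a generic, hypothetical illustration of the graphical Gaussian modelling idea they build on, the sketch below estimates partial correlations from simulated expression data and thresholds them into an undirected TF-gene network. The gene names, simulated data, and threshold are all illustrative, not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated expression matrix: 200 samples x 6 genes (toy data only).
n_samples, genes = 200, ["TF1", "G1", "G2", "G3", "G4", "G5"]
tf = rng.normal(size=n_samples)
data = np.column_stack([
    tf,
    0.8 * tf + rng.normal(scale=0.5, size=n_samples),   # G1 regulated by TF1
    0.7 * tf + rng.normal(scale=0.5, size=n_samples),   # G2 regulated by TF1
    rng.normal(size=n_samples),                          # G3 independent
    rng.normal(size=n_samples),                          # G4 independent
    rng.normal(size=n_samples),                          # G5 independent
])

# Graphical Gaussian model: partial correlations come from the inverse covariance.
precision = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Keep an edge wherever the partial correlation exceeds an (arbitrary) threshold.
threshold = 0.2
edges = [(genes[i], genes[j], round(partial_corr[i, j], 2))
         for i in range(len(genes)) for j in range(i + 1, len(genes))
         if abs(partial_corr[i, j]) > threshold]
print(edges)   # expected: strong TF1-G1 and TF1-G2 edges
```

Bottom-up and top-down variants differ in how such conditional-dependence estimates are assembled into a hierarchy, but the partial-correlation core shown here is the shared ingredient.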
Abstract:
Attempts to strengthen a chromium-modified titanium trialuminide by a combination of grain size refinement and dispersoid strengthening led to a new means of synthesizing such materials. This Reactive Mechanical Alloying/Milling process uses in situ reactions between the metallic powders and elements from a process control agent and/or a gaseous environment to assemble a dispersed phase of small hard particles within the matrix by a bottom-up approach. In the current research, milled powders of the trialuminide alloy together with titanium carbide were produced. The amount of the carbide can be varied widely with simple processing changes, and in this case the milling process created trialuminide grain sizes and carbide particles that are the smallest known from such a process. Characterization of these materials required the development of x-ray diffraction methods to determine particle sizes, by deconvoluting and synthesizing components of the complex multiphase diffraction patterns, and to carry out whole-pattern analysis of the diffuse scattering that developed from larger-than-usual, highly defective grain boundary regions. These regions provide an important mass-transport capability in the processing and not only facilitate the alloy development but also add to the understanding of the mechanical alloying process. Consolidation of the milled powder, which consisted of small crystallites of the alloy and dispersed carbide particles two nanometers in size, formed a unique, somewhat coarsened microstructure, producing an ultra-high-strength solid material composed of the chromium-modified titanium trialuminide alloy matrix with small platelets of the complex carbides Ti2AlC and Ti3AlC2. This synthesis process provides the unique ability to nano-engineer a wide variety of composite materials, or special alloys, and has been shown to extend to a wide variety of metallic materials.
Abstract:
Approximately 90% of fine aerosol in the Midwestern United States has a regional component with a sizable fraction attributed to secondary production of organic aerosol (SOA). The Ozark Forest is an important source of biogenic SOA precursors like isoprene (> 150 mg m-2 d-1), monoterpenes (10-40 mg m-2 d-1), and sesquiterpenes (10-40 mg m-2 d-1). Anthropogenic sources include secondary sulfate and nitrate and biomass burning (51-60%), vehicle emissions (17-26%), and industrial emissions (16-18%). Vehicle emissions are an important source of volatile and vapor-phase, semivolatile aliphatic and aromatic hydrocarbons, which are significant anthropogenic sources of SOA precursors. The short lifetime of SOA precursors and the complex mixture of functionalized oxidation products make rapid sampling, quantitative processing methods, and comprehensive organic molecular analysis essential elements of a comprehensive strategy to advance understanding of SOA formation pathways. Uncertainties in forecasting SOA production on regional scales are large and related to uncertainties in biogenic emission inventories and measurement of SOA yields under ambient conditions. This work presents a bottom-up approach to develop a conifer emission inventory based on foliar and cortical oleoresin composition, development of a model to estimate terpene and terpenoid signatures of foliar and bole emissions from conifers, development of processing and analytic techniques for comprehensive organic molecular characterization of SOA precursors and oxidation products, implementation of the high-volume sampling technique to measure OA and vapor-phase organic matter, and results from a 5-day field experiment conducted to evaluate temporal and diurnal trends in SOA precursors and oxidation products. A total of 98, 115, and 87 terpene and terpenoid species were identified and quantified in commercially available essential oils of Pinus sylvestris, Picea mariana, and Thuja occidentalis, respectively, by comprehensive, two-dimensional gas chromatography with time-of-flight mass spectrometric detection (GC × GC-ToF-MS). Analysis of the literature showed that cortical oleoresin composition was similar to foliar composition of the oldest branches. Our proposed conceptual model for estimation of signatures of terpene and terpenoid emissions from foliar and cortical oleoresin showed that emission potentials of the foliar and bole release pathways are dissimilar and should be considered for conifer species that develop resin blisters or are infested with herbivores or pathogens. Average derivatization efficiencies for Methods 1 and 2 were 87.9% and 114%, respectively. Despite the lower average derivatization efficiency of Method 1, distinct advantages included a greater certainty of derivatization yield for the entire suite of multi- and poly-functional species and fewer processing steps for sequential derivatization. Detection limits for Method 1 using GC × GC-ToF-MS were 0.09-1.89 ng μL-1. A theoretical retention index diagram was developed for a hypothetical GC × 2GC analysis of the complex mixture of SOA precursors and derivatized oxidation products. In general, species eluted (relative to the alkyl diester reference compounds) from the primary column (DB-210) in bands according to n and from the secondary columns (BPX90, SolGel-WAX) according to functionality, essentially making the GC × 2GC retention diagram a carbon number-functionality grid. The species clustered into 35 groups by functionality, and species within each group exhibited good separation by n. Average recoveries of n-alkanes and polyaromatic hydrocarbons (PAHs) by Soxhlet extraction of XAD-2 resin with dichloromethane were 80.1 ± 16.1% and 76.1 ± 17.5%, respectively. Vehicle emissions were the common source for HSVOCs [i.e., resolved alkanes, the unresolved complex mixture (UCM), alkylbenzenes, and 2- and 3-ring PAHs]. An absence of monoterpenes at 0600-1000 and high concentrations of monoterpenoids during the same period were indicative of substantial losses of monoterpenes overnight and during the early morning hours. Post-collection, comprehensive organic molecular characterization of SOA precursors and products by GC × GC-ToF-MS in ambient air collected with ~2 hr resolution is a promising method for determining biogenic and anthropogenic SOA yields that can be used to evaluate SOA formation models.
Abstract:
The emergence of the state in Europe is a topic that has engaged historians since the establishment of the discipline of history. Yet the primary focus has nearly always been a top-down approach, whereby the formation and consolidation of public institutions is viewed as the outcome of activities by princes and other social elites. As the essays in this collection show, however, such an approach does not provide a complete picture. By investigating the importance of local and individual initiatives that contributed to state building from the late middle ages through to the nineteenth century, this volume shows how popular pressure could influence those in power to develop new institutional structures. By not privileging the role of warfare and of elite coercion in state building, it is possible to question the traditional top-down model and explore the degree to which central agencies might have been more important for state representation than for state practice. The studies included in this collection treat many parts of Europe and deal with different phases in the period between the late middle ages and the nineteenth century. Beginning with a critical review of state historiography, the introduction then sets out the concept of 'empowering interactions', which is explored in the subsequent case studies and a number of historiographical, methodological and theoretical essays. Taken as a whole, this collection provides a fascinating platform from which to reconsider the relationships between top-down and bottom-up processes in the history of the European state.
Abstract:
Few studies have addressed the interaction between instruction content and saccadic eye movement control. To assess the impact of instructions on top-down control, we instructed 20 healthy volunteers to deliberately delay saccade triggering, to make inaccurate saccades, or to redirect saccades, i.e. to glance towards the target and then immediately look in the opposite direction. Regular pro- and antisaccade tasks were used for comparison. Bottom-up visual input remained unchanged, with a gap paradigm used for all instructions. In the inaccuracy and delay tasks, both latencies and accuracies were detrimentally affected by either type of instruction, and the variability of latency and accuracy was increased. The intersaccadic interval (ISI) required to correct erroneous antisaccades was shorter than the ISI for instructed direction changes in the redirection task. The word-by-word instruction content interferes with top-down saccade control. Top-down control is a time-consuming process, which may override bottom-up processing only during a limited time period. It is questionable whether parallel processing is possible in top-down control, since the long ISI for instructed direction changes suggests sequential planning.
Abstract:
Traditionally, ontologies describe knowledge representation in a denotational, formalized, and deductive way. In addition, in this paper, we propose a semiotic, inductive, and approximate approach to ontology creation. We define a conceptual framework, a semantics extraction algorithm, and a first proof of concept applying the algorithm to a small set of Wikipedia documents. Intended as an extension to the prevailing top-down ontologies, we introduce an inductive fuzzy grassroots ontology, which organizes itself organically from existing natural language Web content. Using inductive and approximate reasoning to reflect the natural way in which knowledge is processed, the ontology's bottom-up build process creates emergent semantics learned from the Web. By this means, the ontology acts as a hub for computing with words described in natural language. For Web users, the structural semantics are visualized as inductive fuzzy cognitive maps, allowing an initial form of intelligence amplification. Finally, we present an implementation of our inductive fuzzy grassroots ontology. Thus, this paper contributes an algorithm for the extraction of fuzzy grassroots ontologies from Web data by inductive fuzzy classification.
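The paper's semantics extraction algorithm is only named above, not specified; as a toy, hypothetical illustration of inducing graded (fuzzy) term associations bottom-up from Web-like text, the sketch below computes normalized co-occurrence degrees over a tiny corpus. The corpus, tokenizer, and association measure are illustrative choices, not the authors' method.

```python
from collections import Counter, defaultdict
from itertools import combinations
import re

# Toy corpus standing in for Wikipedia documents (illustrative only).
documents = [
    "fuzzy logic extends classical logic with degrees of truth",
    "ontology engineering builds formal knowledge representations",
    "fuzzy ontology learning induces knowledge from web content",
    "web content provides natural language for knowledge extraction",
]

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

term_freq = Counter()
pair_freq = Counter()
for doc in documents:
    terms = set(tokenize(doc))
    term_freq.update(terms)
    pair_freq.update(combinations(sorted(terms), 2))

# Fuzzy association degree: co-occurrence normalized by the rarer term's
# frequency, giving a membership value in [0, 1] (one simple choice of many).
association = defaultdict(dict)
for (a, b), n_ab in pair_freq.items():
    degree = n_ab / min(term_freq[a], term_freq[b])
    association[a][b] = association[b][a] = round(degree, 2)

print(association["knowledge"])   # graded, bottom-up associations for one term
```

A grassroots ontology built this way starts from such graded associations and only later imposes structure, which is the inverse of the top-down, formal-first approach the paper contrasts itself with.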
Abstract:
In his influential article about the evolution of the Web, Berners-Lee [1] envisions a Semantic Web in which humans and computers alike are capable of understanding and processing information. This vision is yet to materialize. The main obstacle to the Semantic Web vision is that in today's Web, meaning is most often rooted not in formal semantics but in natural language and, in the sense of semiology, emerges only upon interpretation and processing. Yet an automated form of interpretation and processing can be tackled by precisiating raw natural language. To do that, Web agents extract fuzzy grassroots ontologies through induction from existing Web content. Inductive fuzzy grassroots ontologies thus constitute organically evolved knowledge bases that resemble automated gradual thesauri, which allow precisiating natural language [2]. The Web agents' underlying dynamic, self-organizing, and best-effort induction enables a sub-syntactical, bottom-up learning of semiotic associations. Thus, knowledge is induced from the users' natural use of language in mutual Web interactions and stored in a gradual, thesaurus-like lexical world-knowledge database as a top-level ontology, eventually allowing a form of computing with words [3]. Since, when computing with words, the objects of computation are words, phrases and propositions drawn from natural languages, this proves to be a practical notion for yielding emergent semantics for the Semantic Web. In the end, an improved understanding by computers should, on the one hand, upgrade human-computer interaction on the Web and, on the other hand, allow an initial version of human-intelligence amplification through the Web.
Abstract:
A bottom-up approach is introduced to fabricate two-dimensional self-assembled layers of molecular spin-systems containing Mn and Fe ions arranged in a chessboard lattice. We demonstrate that the Mn and Fe spin states can be reversibly operated by their selective response to coordination/decoordination of volatile ligands like ammonia (NH3).
Abstract:
Species extinctions are biased towards higher trophic levels, and primary extinctions are often followed by unexpected secondary extinctions. Currently, predictions on the vulnerability of ecological communities to extinction cascades are based on models that focus on bottom-up effects, which cannot capture the effects of extinctions at higher trophic levels. We show, in experimental insect communities, that harvesting of single carnivorous parasitoid species led to a significant increase in extinction rate of other parasitoid species, separated by four trophic links. Harvesting resulted in the release of prey from top-down control, leading to increased interspecific competition at the herbivore trophic level. This resulted in increased extinction rates of non-harvested parasitoid species when their host had become rare relative to other herbivores. The results demonstrate a mechanism for horizontal extinction cascades, and illustrate that altering the relationship between a predator and its prey can cause wide-ranging ripple effects through ecosystems, including unexpected extinctions.
Abstract:
Rapid changes in atmospheric methane (CH4), temperature and precipitation are documented by Greenland ice core data both for glacial times (the so-called Dansgaard-Oeschger (D-O) events) and for a cooling event in the early Holocene (the 8.2 kyr event). The onsets of D-O warm events are paralleled by abrupt increases in CH4 of up to 250 ppb within a few decades. Conversely, the 8.2 kyr event is accompanied by an intermittent decrease in CH4 of about 80 ppb over 150 yr. The abrupt CH4 changes are thought to originate mainly from source emission variations in tropical and boreal wet ecosystems, but complex, process-oriented, bottom-up model estimates of the changes in these ecosystems during rapid climate changes are still missing. Here we present simulations of CH4 emissions from northern peatlands with the LPJ-Bern dynamic global vegetation model. The model represents CH4 production and oxidation in soils and transport by ebullition, through plant aerenchyma, and by diffusion. Parameters are tuned to match site emission data as well as inversion-based estimates of northern wetland emissions. The model is forced with climate input data from freshwater hosing experiments with the NCAR CSM1.4 climate model to simulate an abrupt cooling event. A concentration reduction of ~10 ppb is simulated per kelvin of change in mean northern hemispheric surface temperature over the peatlands. Peatland emissions are equally sensitive to changes in temperature and in precipitation. If the simulated changes are taken as an analogy to the 8.2 kyr event, boreal peatland emissions alone could explain only 23 ppb of the 80 ppb decline in atmospheric methane concentration. This points to a significant contribution from low-latitude and tropical wetlands to the source changes during this event.
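As a back-of-the-envelope reading of the numbers quoted above, assuming a simple linear scaling that the abstract itself does not state, the snippet below relates the ~10 ppb per kelvin sensitivity, the 23 ppb boreal contribution, and the 80 ppb observed decline.

```python
# Rough linear scaling implied by the quoted figures (not an LPJ-Bern result).
sensitivity_ppb_per_K = 10.0      # ~10 ppb CH4 reduction per K of NH cooling
boreal_contribution_ppb = 23.0    # simulated boreal peatland share of the drop
observed_drop_ppb = 80.0          # ice-core CH4 decrease during the 8.2 kyr event

implied_cooling_K = boreal_contribution_ppb / sensitivity_ppb_per_K
residual_ppb = observed_drop_ppb - boreal_contribution_ppb
print(f"implied NH cooling under linearity: ~{implied_cooling_K:.1f} K; "
      f"{residual_ppb:.0f} ppb left to other (e.g. tropical) sources")
```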