863 results for Bottom-up learning
Abstract:
In this work, the synthesis of polyarylated cycloparaphenylenes (CPPs) is described with the aim of forming structurally defined carbon nanotube (CNT) segments by the Scholl reaction. To this end, polyphenylene macrocycles of different sizes and substitution patterns were synthesized, and the influence of ring strain on the oxidative cyclodehydrogenation of these macrocycles towards CNT segments was investigated. It was demonstrated that a selective, solution-based bottom-up synthesis of CNT segments can be accomplished when the polyarylated CPPs are sufficiently large and carry the right substituents at the critical positions. These findings mark an important step towards the bottom-up synthesis of length- and diameter-defined ultrashort CNTs.

In the second part of this work, novel non-precious metal catalysts (NPMCs) based on phenanthroline-indole macrocycles were synthesized and their electrocatalytic performance in the cathodic oxygen reduction reaction (ORR) was investigated. All catalysts were shown to promote the direct 4-electron reduction of oxygen to water in alkaline media and exhibited superior long-term stability. Since these NPMCs are not heat-treated, the catalytically active site remains structurally well defined, allowing the structure-property relationship to be investigated. Moreover, these novel NPMCs were shown to act as efficient ORR catalysts and could replace expensive and scarce platinum in fuel cell applications.
Abstract:
In the present work, one top-down (TD) and two bottom-up (BU) MALDI/ESI mass spectrometry/HPLC methods were developed with the aim of analyzing ocular surface components, i.e. tear film and conjunctival cells. A detailed insight into the development steps is provided, and the approaches are examined with regard to their suitability and methodological limitations. While the TD approach proved suitable mainly for the analysis of crude, largely unprocessed cell samples, the BU approach allowed processed conjunctival cells as well as tear film to be analyzed proteomically with high sensitivity and accuracy. More than 200 tear proteins were cataloged with the LC MALDI BU method, and more than 1000 tear and conjunctival cell proteins with the LC ESI method. The ESI and MALDI methods differed markedly in the quantity and quality of their results, which is why different proteomic fields of application were proposed for the two methods. Furthermore, building on its advantages over the TD approach, the developed LC MALDI/ESI BU platform was used to investigate therapeutic influences on the ocular surface, with a focus on the topical application of taurine and Taflotan® sine. For taurine, anti-inflammatory effects were documented, evidenced by dynamic changes in the tear film; beneficial, concentration-dependent modes of action were also shown in studies on conjunctival cells. For preservative-free Taflotan® sine, LC ESI BU analysis demonstrated, based on dynamic changes in the tear proteome, a regeneration of the ocular surface in patients with primary open-angle glaucoma (POAG) suffering from dry eye after a therapeutic switch from Xalatan®. These results were confirmed by microarray (MA) analyses. In both the taurine studies and the Taflotan® sine study, characteristic ocular surface proteins were documented that allow an objective assessment of ocular surface health. A combination of Taflotan® sine and taurine was proposed and discussed as a possible strategy for treating dry eye in POAG patients.
Abstract:
Modern ESI-LC-MS/MS techniques, combined with bottom-up approaches, allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and its ion-mobility variants HDMSE and UDMSE are particularly well suited for label-free protein quantification. Owing to their high complexity, the data acquired in this way place special demands on the analysis software, and quantitative analysis of MSE/HDMSE/UDMSE data has so far been restricted to a few commercial solutions.

In the present work, a strategy and a series of new methods for the cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The first steps of the data analysis (feature detection, peptide and protein identification) are performed with the commercial software PLGS. The independent PLGS results of all runs of an experiment are then merged in a relational database and reprocessed with dedicated algorithms (retention time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of shared peptide intensities, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results.

To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of exactly defined hybrid proteome samples was developed. The samples were acquired with the MSE and UDMSE methods and analyzed and compared with Progenesis QIP, synapter, and ISOQuant. In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and accuracy of protein quantification.

In conclusion, the presented algorithms and the analysis workflow enable reliable and reproducible quantitative data analyses. The ISOQuant software provides a simple and efficient tool for routine high-throughput analyses of label-free MSE/HDMSE/UDMSE data, and the hybrid proteome samples together with the evaluation metrics constitute a comprehensive system for evaluating quantitative acquisition and data analysis systems.
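The cross-run post-processing chain described above can be illustrated with a short sketch. The following Python code shows one possible organization of such a label-free workflow (retention time alignment against a reference run, cross-run feature clustering, run-level intensity normalization, protein-level aggregation); the data layout, function names, and tolerances are illustrative assumptions and do not reproduce the ISOQuant implementation.

```python
# Minimal sketch of a cross-run label-free quantification post-processing chain.
# Data layout, function names, and tolerances are illustrative assumptions,
# not the ISOQuant implementation.
from collections import defaultdict
from dataclasses import dataclass
from statistics import median

@dataclass
class Feature:
    run: str          # run identifier
    mz: float         # measured m/z
    rt: float         # retention time (min)
    intensity: float  # integrated feature intensity
    peptide: str      # assigned peptide sequence
    protein: str      # inferred protein accession

def align_retention_times(features, reference_run):
    """Shift each run's RT scale by the median offset of peptides shared with the reference run."""
    ref_rt = {f.peptide: f.rt for f in features if f.run == reference_run}
    by_run = defaultdict(list)
    for f in features:
        by_run[f.run].append(f)
    for feats in by_run.values():
        offsets = [f.rt - ref_rt[f.peptide] for f in feats if f.peptide in ref_rt]
        shift = median(offsets) if offsets else 0.0
        for f in feats:
            f.rt -= shift
    return features

def cluster_features(features, rt_tol=0.5, mz_tol=0.01):
    """Group features from different runs that match in m/z and aligned RT."""
    clusters = []
    for f in sorted(features, key=lambda x: (x.mz, x.rt)):
        for cluster in clusters:
            ref = cluster[0]
            if abs(f.mz - ref.mz) <= mz_tol and abs(f.rt - ref.rt) <= rt_tol:
                cluster.append(f)
                break
        else:
            clusters.append([f])
    return clusters

def normalize_intensities(features):
    """Scale each run so its median feature intensity matches the global median."""
    by_run = defaultdict(list)
    for f in features:
        by_run[f.run].append(f.intensity)
    global_median = median(f.intensity for f in features)
    for f in features:
        f.intensity *= global_median / median(by_run[f.run])
    return features

def quantify_proteins(features):
    """Sum normalized peptide intensities per protein and run."""
    table = defaultdict(float)
    for f in features:
        table[(f.protein, f.run)] += f.intensity
    return dict(table)

if __name__ == "__main__":
    # Toy example: the same peptide observed in two runs with a small RT offset.
    feats = [
        Feature("run_01", 523.28, 30.0, 1.0e5, "PEPTIDEK", "P12345"),
        Feature("run_02", 523.28, 30.4, 2.0e5, "PEPTIDEK", "P12345"),
    ]
    feats = align_retention_times(feats, reference_run="run_01")
    clusters = cluster_features(feats)      # cross-run feature groups
    feats = normalize_intensities(feats)
    print(quantify_proteins(feats))
```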
Abstract:
Internet of Things (IoT): three words that best summarize how technology has pervaded almost every area of our lives. In this thesis I explore the hardware and, above all, the software solutions behind the development of this new technological frontier, whose combination with the web gives rise to the Web of Things: a global view, accessible to any user through ordinary browsing tools, of the services that each individual smart device can offer. A bottom-up path is followed, starting from the physical description of the devices, the technologies enabling thing-to-thing communication, and the protocols that establish connections between devices. Continuing with the introduction of concepts such as middleware and smart gateways, the integration of these devices into Web 2.0 is illustrated, mentioning along the way the application scenarios and the desirable development prospects.
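The smart-gateway idea in this bottom-up path can be made concrete with a few lines of code. The sketch below is a deliberately minimal example rather than any specific WoT stack: a gateway exposes a (simulated) device reading as a plain HTTP/JSON resource, so that any browser or web client can consume it. The endpoint path, port, and the fake temperature sensor are invented for illustration.

```python
# Minimal Web-of-Things-style gateway sketch: expose a simulated device reading
# as an HTTP resource. Endpoint path, port, and the fake sensor are assumptions.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_temperature_sensor():
    """Stand-in for a real thing-to-thing read (e.g. over a low-power radio link)."""
    return round(20 + random.random() * 5, 2)

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/things/temperature":
            body = json.dumps({"unit": "celsius",
                               "value": read_temperature_sensor()}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Any browser can now fetch http://localhost:8080/things/temperature
    HTTPServer(("", 8080), GatewayHandler).serve_forever()
```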
Abstract:
Climate change is a phenomenon underway at the global level, now scientifically demonstrated and irreversible in the short term, whose effects have already caused substantial social, economic, and ecosystem losses around the world. The remaining uncertainty concerns how the climate will evolve in the future, which in turn depends on the quantities of greenhouse gases that will continue to be released into the atmosphere, and consequently the type and magnitude of the impacts that may occur. Faced with the inevitability of the problem and the risks that follow from it, humans can adapt, as by nature they have always done in the face of adverse external conditions, including climatic ones. Climate change adaptation strategies, following a bottom-up approach, aim to reduce the vulnerability of systems exposed to climate variations, making them better prepared to face the future. Beyond climatic factors, other elements decisively affect vulnerability: all the internal, system-specific variables that define a system's degree of sensitivity to potential harm. The study focused on three municipalities of the Faenza Apennines in order to understand how climate change affects existing natural hazards as well as human life and activities, and consequently which actions can be taken to limit the danger and the potential damage.
Abstract:
Low back pain (LBP) is the most prevalent health problem in Switzerland and a leading cause of reduced work performance and disability. This study estimated the total cost of LBP in Switzerland in 2005 from a societal perspective using a bottom-up prevalence-based cost-of-illness approach. The study considers more cost categories than are typically investigated and includes the costs associated with a multitude of LBP sufferers who are not under medical care. The findings are based on a questionnaire completed by a sample of 2,507 German-speaking respondents, of whom 1,253 suffered from LBP in the last 4 weeks; 346 of them were receiving medical treatment for their LBP. Direct costs of LBP were estimated at
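As a schematic illustration of the bottom-up, prevalence-based approach (generic notation, not the study's own symbols or figures), per-sufferer costs observed in the sample are scaled up to the population:

\[
C_{\text{total}} = N_{\text{pop}} \cdot p_{\text{LBP}} \cdot \left(\bar{c}_{\text{direct}} + \bar{c}_{\text{indirect}}\right),
\]

where $N_{\text{pop}}$ is the size of the reference population, $p_{\text{LBP}}$ the LBP prevalence estimated from the questionnaire, and $\bar{c}_{\text{direct}}$ and $\bar{c}_{\text{indirect}}$ the mean direct and indirect costs per sufferer over the costing period; this contrasts with top-down approaches that apportion aggregate health expenditures.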
Abstract:
We present a multistage strategy to define the scale and geographic distribution of 'local' ceramic production at Lydian Sardis based on geochemical analysis (NAA) of a large, diverse ceramic sample (n = 281). Within the sphere of local ceramic production, our results demonstrate an unusual pattern of reliance on a single resource relative to other contemporary Iron Age centers. When our NAA results are combined with legacy NAA provenience data for production centers in Western Anatolia, we can differentiate ceramic emulation from exchange, establish probable proveniences for the non-local component of the dataset, and define new non-local groups with as yet no known provenience.
Abstract:
Changes in resource use over time can provide insight into technological choice and the extent of long-term stability in cultural practices. In this paper we re-evaluate the evidence for a marked demographic shift at the inception of the Early Iron Age at Troy by applying a robust macroscale analysis of changing ceramic resource use over the Late Bronze and Iron Age. We use a combination of new and legacy analytical datasets (NAA and XRF), from excavated ceramics, to evaluate the potential compositional range of local resources (based on comparisons with sediments from within a 10 km site radius). Results show a clear distinction between sediment-defined local and non-local ceramic compositional groups. Two discrete local ceramic resources have been previously identified and we confirm a third local resource for a major class of EIA handmade wares and cooking pots. This third source appears to derive from a residual resource on the Troy peninsula (rather than adjacent alluvial valleys). The presence of a group of large and heavy pithoi among the non-local groups raises questions about their regional or maritime origin.
Abstract:
Active head turns to the left and right have recently been shown to influence numerical cognition by shifting attention along the mental number line. In the present study, we found that passive whole-body motion influences numerical cognition. In a random-number generation task (Experiment 1), leftward and downward displacement of participants facilitated small-number generation, whereas rightward and upward displacement facilitated the generation of large numbers. Influences of leftward and rightward motion were also found for the processing of auditorily presented numbers in a magnitude-judgment task (Experiment 2). Additionally, we investigated the reverse effect of the number-space association (Experiment 3). Participants were displaced leftward or rightward and asked to detect the motion direction as fast as possible while small or large numbers were presented auditorily. When motion detection was difficult, leftward motion was detected faster when hearing small numbers, and rightward motion was detected faster when hearing large numbers. We provide new evidence that bottom-up vestibular activation is sufficient to interact with the higher-order spatial representation underlying numerical cognition. The results show that action planning or motor activity is not necessary to influence spatial attention. Moreover, our results suggest that self-motion perception and numerical cognition can mutually influence each other.
Abstract:
The role of low-level stimulus-driven control in the guidance of overt visual attention has been difficult to establish because low- and high-level visual content are spatially correlated within natural visual stimuli. Here we show that impairment of parietal cortical areas, either permanently by a lesion or reversibly by repetitive transcranial magnetic stimulation (rTMS), leads to fixation of locations with higher values of low-level features as compared to control subjects or to a no-rTMS condition. Moreover, this unmasking of stimulus-driven control crucially depends on the intrahemispheric balance between top-down and bottom-up cortical areas. This result suggests that although high-level features might exert a strong influence in normal behavior, low-level features do contribute to guiding visual selection during the exploration of complex natural stimuli.
Abstract:
This paper proposes a sequential coupling of a Hidden Markov Model (HMM) recognizer for offline handwritten English sentences with a probabilistic bottom-up chart parser using Stochastic Context-Free Grammars (SCFG) extracted from a text corpus. Based on extensive experiments, we conclude that syntax analysis helps to improve recognition rates significantly.
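To make the bottom-up parsing component concrete, the following sketch (a toy example, not the paper's recognizer or grammar) implements probabilistic CKY chart parsing over a small SCFG in Chomsky normal form; a score of this kind could be used to re-rank the word-sequence hypotheses produced by an HMM recognizer.

```python
# Probabilistic CKY chart parser over a toy SCFG in Chomsky normal form.
# Illustrative only: grammar, probabilities, and sentences are invented.
from collections import defaultdict

# Rules: (lhs, rhs) -> probability; rhs is a 1-tuple (terminal) or 2-tuple (nonterminals).
GRAMMAR = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("the",)): 0.4, ("NP", ("dogs",)): 0.3, ("NP", ("DT", "NN")): 0.3,
    ("DT", ("the",)): 1.0,
    ("NN", ("dogs",)): 0.6, ("NN", ("bark",)): 0.4,
    ("VP", ("bark",)): 0.7, ("VP", ("VP", "NP")): 0.3,
}

def cky_probability(words):
    """Return the probability of the best parse of `words` rooted in S (0.0 if none)."""
    n = len(words)
    # chart[i][j] maps a nonterminal to the best probability of deriving words[i:j].
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):                       # lexical (bottom) cells
        for (lhs, rhs), p in GRAMMAR.items():
            if rhs == (w,):
                chart[i][i + 1][lhs] = max(chart[i][i + 1][lhs], p)
    for span in range(2, n + 1):                        # combine cells bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                   # split point
                for (lhs, rhs), p in GRAMMAR.items():
                    if len(rhs) == 2:
                        left, right = rhs
                        score = p * chart[i][k][left] * chart[k][j][right]
                        chart[i][j][lhs] = max(chart[i][j][lhs], score)
    return chart[0][n]["S"]

if __name__ == "__main__":
    # A syntactically plausible hypothesis scores higher than an implausible one.
    print(cky_probability(["the", "dogs", "bark"]))   # positive probability (about 0.126)
    print(cky_probability(["bark", "the", "dogs"]))   # 0.0, no parse rooted in S
```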
Abstract:
Unraveling the intra- and inter-cellular signaling networks that manage cell-fate control, coordinate complex differentiation regulatory circuits, and shape tissues and organs in living systems remains a major challenge in the post-genomic era. Resting on the laurels of past-century monolayer culture technologies, the cell culture community has only recently begun to appreciate the potential of three-dimensional mammalian cell culture systems to reveal the full scope of mechanisms orchestrating the tissue-like cell quorum in space and time. Capitalizing on gravity-enforced self-assembly of monodispersed primary embryonic mouse cells in hanging drops, we designed and characterized a three-dimensional cell culture model for ganglion-like structures. Within 24 h, a mixture of mouse embryonic fibroblasts (MEF) and cells derived from the dorsal root ganglion (DRG) (sensory neurons and Schwann cells), grown in hanging drops, assembled into coherent spherical microtissues characterized by a MEF feeder core and a peripheral layer of DRG-derived cells. In a time-dependent manner, sensory neurons formed a polar ganglion-like cap structure, which coordinated guided axonal outgrowth and innervation of the distal pole of the MEF feeder spheroid. Schwann cells, present in embryonic DRG isolates, tended to align along axonal structures and myelinate them in an in vivo-like manner. Whenever cultivation exceeded 10 days, DRG:MEF-based microtissues disintegrated due to an as yet unknown mechanism. Using a transgenic MEF feeder spheroid engineered for gaseous acetaldehyde-inducible interferon-beta (ifn-beta) production by cotransduction of retro-/lentiviral particles, a short 6-h ifn-beta induction was sufficient to rescue the integrity of DRG:MEF spheroids and enable long-term cultivation of these microtissues. In hanging drops, such microtissues fused into higher-order macrotissue-like structures, which may pave the way for sophisticated bottom-up tissue engineering strategies. DRG:MEF-based artificial micro- and macrotissue design reproduced key morphological aspects of ganglia and exemplified the potential of self-assembled, scaffold-free multicellular micro-/macrotissues to provide new insight into organogenesis.
Abstract:
As environmental problems become more complex, policy and regulatory decisions become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA’s Council for Regulatory Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What impediments has the CREM faced, and why did these impediments occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining the history of modeling at the EPA, as well as the history of the CREM, provides insight into the many challenges that are faced when implementing science policy and science policy programs. After examining the many impediments that the CREM has faced in implementing modeling policies, it was clear that the impediments fall into two categories: classic and paradoxical. The classic impediments include the more standard impediments to science policy implementation that might be found in any regulatory environment, such as a lack of resources and changes in administration. Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. These impediments, when not properly addressed, severely hinder the ability of organizations to successfully implement science policy.
Abstract:
The remarkable advances in nanoscience and nanotechnology over the last two decades allow one to manipulate individual atoms, molecules, and nanostructures, make it possible to build devices only a few nanometers in size, and enhance the nano-bio fusion in tackling biological and medical problems. This meets the ever-increasing need for device miniaturization, from magnetic storage devices and electronic building blocks for computers to chemical and biological sensors. Despite continuing efforts based on conventional methods, these are likely to reach the fundamental limit of miniaturization in the next decade, when feature lengths shrink below 100 nm. On the one hand, quantum mechanical effects of the underlying material structure dominate device characteristics. On the other hand, one faces the technical difficulty of fabricating uniform devices. This has posed a great challenge for both the scientific and the technical communities. The proposal of using a single or a few organic molecules in electronic devices has not only opened an alternative route to miniaturization in electronics, but also brought up brand-new concepts and physical working mechanisms in electronic devices. This thesis stands as one of the efforts in understanding and building electronic functional units at the molecular and atomic levels. We have explored the possibility of having molecules work in a wide spectrum of electronic devices, ranging from molecular wires, spin valves/switches, diodes, and transistors to sensors. More specifically, we have observed a significant magnetoresistive effect in a spin-valve structure where the non-magnetic spacer sandwiched between two magnetic conducting materials is replaced by a self-assembled monolayer of organic molecules or a single molecule (such as a carbon fullerene). The diode behavior in donor(D)-bridge(B)-acceptor(A) type single molecules is then discussed, and a unimolecular transistor is designed. Lastly, we have proposed and preliminarily tested the idea of using functionalized electrodes for rapid nanopore DNA sequencing. In these studies, the fundamental roles of molecules and molecule-electrode interfaces in quantum electron transport have been investigated based on first-principles calculations of the electronic structure. Both the intrinsic properties of the molecules themselves and the detailed interfacial features are found to play critical roles in electron transport at the molecular scale. The flexibility and tailorability of molecular properties open great opportunities for the purpose-driven design of electronic devices from the bottom up. The results gained from this work have helped in understanding the underlying physics, developing the fundamental mechanisms, and providing guidance for future experimental efforts.
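For context, such first-principles transport studies are commonly framed in the Landauer picture (the standard textbook relations, quoted here as background rather than from the thesis itself):

\[
I(V) = \frac{2e}{h}\int T(E,V)\,\bigl[f_L(E) - f_R(E)\bigr]\,\mathrm{d}E,
\qquad
G = \frac{2e^{2}}{h}\,T(E_F)\ \ \text{(linear response)},
\]

where $T(E,V)$ is the electron transmission through the molecule-electrode junction and $f_{L}$, $f_{R}$ are the Fermi distributions of the left and right electrodes.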
Abstract:
Attempts to strengthen a chromium-modified titanium trialuminide by a combination of grain size refinement and dispersoid strengthening led to a new means of synthesizing such materials. This Reactive Mechanical Alloying/Milling process uses in situ reactions between the metallic powders and elements from a process control agent and/or a gaseous environment to assemble a dispersed phase of small hard particles within the matrix by a bottom-up approach. In the current research, milled powders of the trialuminide alloy along with titanium carbide were produced. The amount of the carbide can be varied widely with simple processing changes, and in this case the milling process created trialuminide grain sizes and carbide particles that are the smallest known from such a process. Characterization of these materials required the development of X-ray diffraction methods to determine particle sizes, by deconvoluting and synthesizing components of the complex multiphase diffraction patterns, and to carry out whole-pattern analysis of the diffuse scattering that developed from unusually large, highly defective grain boundary regions. These regions provide an important mass transport capability during processing and not only facilitate the alloy development but also add to the understanding of the mechanical alloying process. Consolidation of the milled powder, which consisted of small crystallites of the alloy and dispersed carbide particles two nanometers in size, formed a unique, somewhat coarsened microstructure, producing an ultra-high-strength solid material composed of the chromium-modified titanium trialuminide alloy matrix with small platelets of the complex carbides Ti2AlC and Ti3AlC2. This synthesis process provides the unique ability to nano-engineer a wide variety of composite materials, or special alloys, and has been shown to extend to a wide variety of metallic materials.
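As background to the line-broadening analysis mentioned above (the standard relation, not the specific whole-pattern procedure developed in this work), crystallite size is commonly estimated from X-ray diffraction peak broadening via the Scherrer equation:

\[
D = \frac{K\,\lambda}{\beta\,\cos\theta},
\]

where $D$ is the mean crystallite size, $K \approx 0.9$ a shape factor, $\lambda$ the X-ray wavelength, $\beta$ the peak broadening (FWHM in radians, after instrumental correction), and $\theta$ the Bragg angle.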