109 results for Taps de suro
Photoproduction of neutral pions on the proton with linearly polarized photons in the threshold region
Abstract:
This thesis describes an experiment on the photoproduction of neutral pions on the proton in the threshold region. By using linearly polarized photons, the photon asymmetry near threshold could be measured for the first time, in addition to the total and differential cross sections. Particular interest was devoted to the s-wave multipole E0+, which can be determined from these physical observables, and to the first determination of all three p-wave combinations P1, P2, and P3 in the threshold region. The experiment was carried out in 1995/1996 at the electron accelerator MAMI (Mainz Microtron) of the University of Mainz. Using a diamond as bremsstrahlung target for the electrons, linearly polarized photons were produced via the process of coherent bremsstrahlung. The photon energy was determined by measuring the energy of the scattered electrons in the Mainz photon tagging facility. The detector TAPS, an arrangement of 504 BaF2 modules, was set up around a liquid hydrogen target. The neutral pions produced in the target were detected in the modules via their decay into two photons. The total and differential cross sections were measured in the energy range between the threshold at 144.7 MeV and 168 MeV. The photon asymmetry, measured for the first time at 159.5 MeV, is positive, with a value of +0.217 +/- 0.046 at a polar angle of 90 degrees. The multipole E0+ and the three p-wave combinations were fitted to the physical observables using two different methods, which yielded consistent results. The predictions of the low-energy theorems of chiral perturbation theory for P1 and P2 agree with the experimental values once statistical and systematic uncertainties are taken into account.
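For orientation, the s- and p-wave parametrization commonly used to extract E0+ and the combinations P1, P2, P3 from the unpolarized cross section and the photon asymmetry can be sketched as follows (standard convention; signs and normalization may differ from those adopted in the thesis itself):

```latex
% Unpolarized differential cross section and photon asymmetry near threshold,
% keeping only s- and p-waves. q, k: pion and photon c.m. momenta; theta: pion
% polar angle. Conventions may differ from those used in the work above.
\begin{align}
  \frac{d\sigma}{d\Omega} &= \frac{q}{k}\left(A + B\cos\theta + C\cos^{2}\theta\right), \\
  A &= |E_{0+}|^{2} + \tfrac{1}{2}\left(|P_{2}|^{2}+|P_{3}|^{2}\right), \quad
  B = 2\,\mathrm{Re}\!\left(E_{0+}P_{1}^{*}\right), \quad
  C = |P_{1}|^{2} - \tfrac{1}{2}\left(|P_{2}|^{2}+|P_{3}|^{2}\right), \\
  \Sigma(\theta)\,\frac{d\sigma}{d\Omega} &= \frac{q}{2k}\,\sin^{2}\theta\,\left(|P_{3}|^{2}-|P_{2}|^{2}\right),
\end{align}
% with the usual p-wave combinations
% P_1 = 3E_{1+} + M_{1+} - M_{1-},  P_2 = 3E_{1+} - M_{1+} + M_{1-},  P_3 = 2M_{1+} + M_{1-}.
```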
Abstract:
In 1997, Tatischeff et al. found resonance-like states in the spectrum of the invariant mass of the missing nucleon X at M = 1004, 1044, and 1094 MeV in the reaction p p -> X p pi+. In a second experiment, Filkov et al. observed resonance structures at M = 966, 986, and 1003 MeV in the reaction p d -> p p X. Such exotic resonances contradict established nucleon models, which describe the Delta(1232) resonance as the first excited state. To interpret the observed structures, quark-cluster models with and without colour-magnetic interactions were developed. Lvov et al. questioned the experimental results, since no structures had been found in data on real Compton scattering. As a counter-argument, Kobushkin proposed that these resonances have a totally antisymmetric spin-flavour wave function and that only the N-2gamma decay would be allowed. In this work, the reaction g p -> X pi+ -> n g g pi+ was used to search for these exotic resonances. The data were taken in parallel with the measurement of the pion polarizabilities at the Mainz accelerator MAMI. Real photons were produced by bremsstrahlung of the electrons on a radiator foil, and their energy was determined by the A2 photon tagging facility (Glasgow tagger). A 10 cm long liquid hydrogen target served as the proton target. Charged reaction products were detected at forward angles Theta < 20 degrees with respect to the beam axis in a multi-wire proportional chamber, while photons were detected in the TAPS spectrometer, consisting of 526 BaF2 crystals, at polar angles Theta > 60 degrees. A time-of-flight detector with a total of 111 individual modules was available for neutron detection. To test the analysis software and the experimental setup, the reaction channels g p -> p pi0 and g p -> n pi0 pi+ were also analysed. For single-pion production, differential cross sections at backward angles were determined and compared with theoretical models and experimental values. For the channel g p -> n pi0 pi+, invariant-mass spectra for different particle combinations were determined and compared with a simulation. The data suggest that the reaction proceeds mainly via excitation of the Delta0(1232) resonance. In the search for exotic resonances, no statistically significant structures were found, and upper limits for the differential cross section were determined.
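As an illustration of the missing-mass technique underlying this kind of search, the sketch below reconstructs the invariant mass of the unobserved system X from the tagged beam photon and a detected pi+ four-momentum, using M_X^2 = (p_gamma + p_target - p_pi)^2. The event values and helper names are hypothetical; only the kinematic relation is shown, not the actual analysis code.

```python
import numpy as np

M_PROTON = 938.272  # MeV
M_PION = 139.570    # MeV (charged pion)

def four_vector(E, px, py, pz):
    """Four-vector (E, px, py, pz) in MeV as a numpy array."""
    return np.array([E, px, py, pz], dtype=float)

def invariant_mass(p):
    """Invariant mass sqrt(E^2 - |p|^2) with metric (+,-,-,-)."""
    m2 = p[0] ** 2 - np.dot(p[1:], p[1:])
    return np.sqrt(max(m2, 0.0))

def missing_mass(E_gamma, p_pion):
    """Missing mass of X in gamma p -> X pi+ for a proton target at rest."""
    p_beam = four_vector(E_gamma, 0.0, 0.0, E_gamma)   # real photon along z
    p_target = four_vector(M_PROTON, 0.0, 0.0, 0.0)    # proton at rest
    return invariant_mass(p_beam + p_target - p_pion)

# Hypothetical event: a 550 MeV tagged photon and a measured pi+ with |p| = 300 MeV.
E_pi = np.hypot(300.0, M_PION)
p_pi = four_vector(E_pi, 80.0, 30.0, np.sqrt(300.0**2 - 80.0**2 - 30.0**2))
print(f"M_X = {missing_mass(550.0, p_pi):.1f} MeV")
```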
Abstract:
In July 2009, an experiment was carried out for the first time at the Mainz Microtron (MAMI) in which a polarized 3He target was studied with photons in the energy range from 200 to 800 MeV. The aim of this experiment was to test the Gerasimov-Drell-Hearn sum rule on the neutron. Owing to the spin structure of 3He, the data taken with the polarized 3He target provide, in comparison with the already existing deuteron data, a complementary and more direct access to the neutron. The total helicity-dependent photoabsorption cross section was measured with an energy-tagged beam of circularly polarized photons incident on the longitudinally polarized 3He target. The product detectors were the Crystal Ball (4π solid-angle coverage), TAPS (as a forward wall), and a threshold Cherenkov detector (online veto for the suppression of electromagnetic events). The planning and construction of the various components of the 3He experimental setup was a central part of this dissertation and is described in detail in this thesis. Both the detector system and the analysis methods were tested by measuring the unpolarized, total, inclusive photoabsorption cross section on liquid hydrogen; the results showed good agreement with previously published data. Preliminary results for the unpolarized total photoabsorption cross section and for the helicity-dependent difference of the photoabsorption cross sections on 3He are presented and compared with various theoretical models.
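For reference, the Gerasimov-Drell-Hearn sum rule that the helicity-dependent cross-section difference is meant to test relates an integral over the photoabsorption cross sections to static properties of the nucleon. The standard form is:

```latex
% GDH sum rule: kappa is the anomalous magnetic moment, M the nucleon mass,
% nu the photon energy, nu_0 the pion-production threshold, and
% sigma_{3/2}, sigma_{1/2} the total photoabsorption cross sections for
% total helicity 3/2 and 1/2.
\int_{\nu_0}^{\infty} \frac{\sigma_{3/2}(\nu) - \sigma_{1/2}(\nu)}{\nu}\, d\nu
  \;=\; \frac{2\pi^{2}\alpha}{M^{2}}\,\kappa^{2}
```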
Abstract:
The aim of the present study was to analyse the impact of using a translation memory (TM) and of post-editing (PE) raw machine-translation output on the perceived level of difficulty and on the time needed to obtain a high-quality final text. The experiment involved six native Italian-speaking students of the Master's degree course in Specialized Translation at the University of Bologna (Vicepresidenza di Forlì). The participants were divided into three pairs, each of which was assigned an excerpt of a press release in English. Within each pair, one participant was asked to translate the text into Italian using the TM within SDL Trados Studio 2011; the other was asked to carry out full PE into Italian of the raw output produced by Google Translate. In cases where the TM or the raw output contained no (correct) translations, the participants were allowed to consult the Internet. Using Think-aloud Protocols (TAPs), they were asked to think aloud while carrying out the tasks. This made it possible to identify the translation problems encountered and the cases in which the TM and the raw output provided correct solutions; it was also possible to observe the translation strategies employed and then, through follow-up interviews, to ask the participants to rate their difficulty. The time taken by each participant was also measured. The data on perceived difficulty and on time were related to the number of correct solutions provided by the TM and by the raw output, respectively. It was observed that using the TM led to greater time savings and, unlike PE, reduced the perceived difficulty. The present study aims to help future professional translators choose technological tools that allow them to save time and resources.
Abstract:
This thesis deals with the photoinduced production of neutral pions very close to the threshold energy. Two goals are pursued: first, to test the predictions of effective theories and models; second, to determine for the first time all relevant partial-wave amplitudes model-independently from measured observables. In the future, this method is also intended to be applied at higher energies, in the region of the nucleon resonances.

Specifically, the performance and analysis of an experiment is presented that took place at the Mainz Microtron (MAMI) in the years 2010 to 2013 with a circularly polarized photon beam. The photon beam was obtained from the MAMI electron beam at a facility for producing energy-tagged bremsstrahlung. The hermetic 4pi CB/TAPS detector system was used to detect the reaction products. For the first time in such measurements, transversely polarized protons were also available. For this purpose, butanol is dynamically polarized in a dedicated apparatus; molecular hydrogen cannot be polarized because of its para configuration. Since butanol was used as target material, in which less than 5% of all produced pions were produced on polarized protons, the treatment of the background is a central task.

Two methods of background separation are presented, the better of which was applied in the analysis. Finally, a detailed assessment of systematic uncertainties is given.

The first-time use of transversely polarized protons gives access to previously unmeasured spin degrees of freedom. In combination with a complementary precursor experiment from 2008 with a linearly polarized photon beam, all complex s- and p-wave partial-wave amplitudes could be determined model-independently from the data for the first time.

In addition, substantial improvements to the experimental setup were achieved within this work. Examples are an electron-beam polarimeter, a cellular CB multiplicity trigger, and significant improvements to the data-acquisition electronics and the trigger system, some of which are presented in this thesis.
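To illustrate what an amplitude extraction of this kind amounts to in practice, the sketch below fits a small set of real parameters of a toy angular distribution to pseudo-data with scipy.optimize.least_squares. It is a generic chi-square fit with an invented parametrization, not the actual analysis code or observable set used in the thesis.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
theta = np.radians(np.arange(10.0, 171.0, 10.0))

def observable(params, theta):
    """Toy angular distribution a + b*cos(theta) + c*cos^2(theta);
    stands in for the real set of polarization observables."""
    a, b, c = params
    return a + b * np.cos(theta) + c * np.cos(theta) ** 2

# Pseudo-data generated from assumed 'true' parameters with Gaussian noise.
true_params = np.array([1.0, 0.4, -0.2])
sigma = 0.03
data = observable(true_params, theta) + rng.normal(0.0, sigma, theta.size)

def residuals(params):
    return (observable(params, theta) - data) / sigma

fit = least_squares(residuals, x0=np.array([0.5, 0.0, 0.0]))
print("fitted parameters:", fit.x)
print("chi^2 / ndf      :", np.sum(fit.fun ** 2) / (theta.size - fit.x.size))
```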
Abstract:
The excitation spectrum is one of the fundamental properties of every spatially extended system. The excitations of the building blocks of normal matter, i.e., protons and neutrons (nucleons), play an important role in our understanding of the low-energy regime of the strong interaction. Due to the large coupling, perturbative solutions of quantum chromodynamics (QCD) are not appropriate for calculating long-range phenomena of hadrons. For many years, constituent quark models were used to understand the excitation spectra. Recently, calculations in lattice QCD have made first connections between excited nucleons and the fundamental field quanta (quarks and gluons). Due to their short lifetime and large decay width, excited nucleons appear as resonances in scattering processes such as pion-nucleon scattering or meson photoproduction. In order to disentangle individual resonances with definite spin and parity in experimental data, partial wave analyses are necessary. Unique solutions in these analyses can only be expected if sufficient empirical information about spin degrees of freedom is available. The measurement of spin observables in pion photoproduction is the focus of this thesis. The polarized electron beam of the Mainz Microtron (MAMI) was used to produce high-intensity, polarized photon beams with tagged energies up to 1.47 GeV. A "frozen-spin" butanol target in combination with an almost 4π detector setup, consisting of the Crystal Ball and the TAPS calorimeters, allowed the precise determination of the helicity dependence of the γp → π0p reaction. In this thesis, as an improvement of the target setup, an internal polarizing solenoid has been constructed and tested. A magnetic field of 2.32 T and a homogeneity of 1.22×10−3 in the target volume have been achieved. The helicity asymmetry E, i.e., the difference of events with total helicity 1/2 and 3/2 divided by their sum, was determined from data taken in the years 2013-14. The subtraction of background events arising from nucleons bound in carbon and oxygen was an important part of the analysis. The results for the asymmetry E are compared to existing data and predictions from various models. They show reasonable agreement with the models in the energy region of the ∆(1232) resonance, but large discrepancies are observed for energies above 600 MeV. The expansion of the present data in terms of Legendre polynomials shows the sensitivity of the data to partial wave amplitudes up to F-waves. Additionally, a first, preliminary multipole analysis of the present data together with other results from the Crystal Ball experiment has been performed.
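As a sketch of the kind of Legendre decomposition mentioned here, the example below fits Legendre polynomials in cos(theta) to a hypothetical angular distribution of the asymmetry E using numpy.polynomial.legendre. The data values and uncertainties are invented; only the fitting procedure is illustrated.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical angular distribution of the helicity asymmetry E at one energy:
# values at the bin centres in cos(theta), with invented errors.
cos_theta = np.linspace(-0.9, 0.9, 10)
E_vals = np.array([0.55, 0.48, 0.40, 0.30, 0.22, 0.18, 0.20, 0.28, 0.37, 0.45])
E_errs = np.full_like(E_vals, 0.05)

# Weighted fit of Legendre coefficients A_0 ... A_4; sensitivity to higher
# partial waves would show up as a need for higher-order terms.
coeffs = legendre.legfit(cos_theta, E_vals, deg=4, w=1.0 / E_errs)
print("Legendre coefficients A_0..A_4:", np.round(coeffs, 3))

# Evaluate the fitted expansion on a fine grid, e.g. for plotting.
fit_curve = legendre.legval(np.linspace(-1.0, 1.0, 200), coeffs)
```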
Abstract:
Neglect is defined as the failure to attend and to orient to the contralesional side of space. A horizontal bias towards the right visual field is a classical finding in patients who have suffered a right-hemispheric stroke. The vertical dimension of spatial attention orienting has so far been only sparsely investigated. The aim of this study was to investigate the specificity of this vertical bias by means of a search task, which taps a more pronounced top-down attentional component. Eye movements and behavioural search performance were measured in thirteen patients with left-sided neglect after right-hemispheric stroke and in thirteen age-matched controls. Concerning behavioural performance, patients found significantly fewer targets than healthy controls in both the upper and the lower left quadrant. However, when targets were located in the lower left quadrant, patients needed more visual fixations (and therefore longer search times) to find them, suggesting a time-dependent vertical bias.
Abstract:
The object of this trip and report was to familiarize the students of the Montana State School of Mines with methods of taking and mapping surface and underground geology. All surface geology was mapped by means of plane table and alidade, and underground work by means of Brunton compass and taps. The senior class of the Montana State School of Mines, under the supervision of Dr. E.S. Perry, performed the work, which covered an area in Madison County including South Boulder Creek, near Jefferson Island, the Silver Star Mining District, and the Alameda Mine, near Virginia City.
Abstract:
More than eighteen percent of the world's population lives without reliable access to clean water, forced to walk long distances to get small amounts of contaminated surface water. Carrying heavy loads of water long distances and ingesting contaminated water can lead to long-term health problems and even death. These problems affect the most vulnerable populations, women, children, and the elderly, more than anyone else. Water access is one of the most pressing issues in development today. Boajibu, a small village in Sierra Leone where the author served in the Peace Corps for two years, lacks access to clean water. Construction of a water distribution system was halted when a civil war broke out in 1992 and has not been continued since. The community currently relies on hand-dug and borehole wells that can become dirty during the dry season, which forces people to drink contaminated water or to travel far to collect clean water. This report is intended to provide a design for the system as it was meant to be built. The water system design was completed based on the taps present, interviews with local community leaders, local surveying, and points taken with a GPS. The design is a gravity-fed branched water system supplied by a natural spring on a hill above Boajibu; the flow rate of the spring is unknown. There has to be enough flow from the spring over a 24-hour period to meet the daily demands of the users, i.e., to provide continuous flow. If the spring has less than this amount of flow, the system must provide intermittent flow, flow that is restricted to a few hours a day. A minimum flow rate of 2.1 liters per second was found to be necessary to provide continuous flow to the users of Boajibu; if this flow is not met, intermittent flow can be provided instead. In order to aid the construction of a distribution system in the absence of someone with formal engineering training, a table was created detailing water storage tank sizing based on possible source flow rates. A builder can interpolate in this table using the measured source flow rate to obtain the tank size. However, any flow rate below 2.1 liters per second cannot be used in the table; in this case, the builder should size the tank so that it can take in the water supplied overnight, since all of the stored water will be drained during the day because the users demand more than the spring can supply through the night. In the developing world there is often a problem collecting enough money to fund large infrastructure projects, such as a water distribution system; frequently there is only enough money to add one or two loops to the system, so it is helpful to know where these loops can be placed most effectively. Various possible loops were designated for the Boajibu water distribution system, and the Adaptive Greedy Heuristic Loop Addition Selection Algorithm (AGHLASA) was used to rank the effectiveness of the possible loops. Loop 1, the furthest upstream, was selected because it benefited the most people for the least cost, while loops further downstream were found to be less effective because they would benefit fewer people. Further studies should be conducted on the water use habits of the people of Boajibu to more accurately predict the demands that will be placed on the system. Further population surveying should also be conducted to predict population change over time, so that the appropriate capacity can be built into the system to accommodate future growth. The flow at the spring should be measured using a V-notch weir and the system adjusted accordingly. Future studies could also adjust the loop-ranking method so that users who use the water system for different lengths of time are not counted the same, and so that vulnerable users are weighted more heavily than more robust users.
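The tank-sizing logic described in this report can be summarized in a short calculation: if the source supplies less than the continuous-flow requirement (2.1 L/s for Boajibu), the tank should hold roughly the volume the spring delivers during the hours when there is no draw-off. The sketch below implements that rule of thumb with hypothetical values for the overnight storage period and the balancing volume; it is not the report's actual sizing table.

```python
CONTINUOUS_FLOW_LPS = 2.1  # minimum source flow for continuous supply (from the report)

def tank_volume_liters(source_flow_lps, storage_hours=12.0):
    """Rough storage-tank volume for a gravity-fed system.

    If the spring meets the continuous-flow requirement, only a nominal
    balancing volume is assumed; otherwise the tank is sized to capture the
    water delivered overnight (storage_hours is a hypothetical assumption).
    """
    if source_flow_lps >= CONTINUOUS_FLOW_LPS:
        # Continuous flow possible: assume a few hours of balancing storage.
        return CONTINUOUS_FLOW_LPS * 3600 * 4
    return source_flow_lps * 3600 * storage_hours  # capture the overnight supply

for q in (0.8, 1.5, 2.5):  # example source flow rates in L/s
    print(f"source {q:.1f} L/s -> tank ~ {tank_volume_liters(q) / 1000:.1f} m^3")
```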
Abstract:
Enriching knowledge bases with multimedia information makes it possible to complement textual descriptions with visual and audio information. Such complementary information can help users to understand the meaning of assertions, and in general improve the user experience with the knowledge base. In this paper we address the problem of how to enrich ontology instances with candidate images retrieved from existing Web search engines. DBpedia has evolved into a major hub in the Linked Data cloud, interconnecting millions of entities organized under a consistent ontology. Our approach taps into the Wikipedia corpus to gather context information for DBpedia instances and takes advantage of image tagging information when this is available to calculate semantic relatedness between instances and candidate images. We performed experiments with focus on the particularly challenging problem of highly ambiguous names. Both methods presented in this work outperformed the baseline. Our best method leveraged context words from Wikipedia, tags from Flickr and type information from DBpedia to achieve an average precision of 80%.
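A minimal sketch of the kind of bag-of-words relatedness scoring such an approach relies on (context words gathered for a DBpedia instance versus tags attached to a candidate image) is shown below. The tokens and the similarity measure are illustrative assumptions; the paper's actual weighting and thresholds are not reproduced.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(bag_a, bag_b):
    """Cosine similarity between two bags of words given as Counters."""
    common = set(bag_a) & set(bag_b)
    dot = sum(bag_a[t] * bag_b[t] for t in common)
    norm_a = sqrt(sum(v * v for v in bag_a.values()))
    norm_b = sqrt(sum(v * v for v in bag_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical context words for an ambiguous DBpedia instance and tags of two candidate images.
instance_context = Counter("jaguar car british manufacturer coventry vehicle".split())
image_tags = {
    "img_1": Counter("jaguar car classic coventry show".split()),
    "img_2": Counter("jaguar cat wildlife rainforest".split()),
}

for name, tags in image_tags.items():
    print(name, round(cosine_similarity(instance_context, tags), 3))
```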
Abstract:
We propose and experimentally demonstrate a potentially integrable optical scheme to generate high-order UWB pulses. The technique is based on exploiting the cross-phase modulation generated in an InGaAsP Mach-Zehnder interferometer containing integrated semiconductor optical amplifiers, and it is also adaptable to different pulse modulation formats through an optical processing unit which allows control of the amplitude, polarity, and time delay of the generated taps.
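The role of the amplitude, polarity, and delay of the taps can be illustrated numerically: a high-order UWB pulse can be modelled as a weighted sum of time-shifted copies of a basic Gaussian pulse. The sketch below is only a signal-model illustration with made-up tap values, not a model of the InGaAsP Mach-Zehnder device itself.

```python
import numpy as np

def gaussian_pulse(t, t0, width):
    """Basic Gaussian pulse centred at t0 (seconds)."""
    return np.exp(-((t - t0) ** 2) / (2.0 * width ** 2))

# Time axis: 1 ns window with 1 ps resolution.
t = np.arange(0.0, 1e-9, 1e-12)

# Hypothetical taps: (amplitude with polarity, delay). Weights that mimic
# successive derivatives of the Gaussian yield higher-order UWB pulses.
taps = [(+1.0, 100e-12), (-2.0, 150e-12), (+1.0, 200e-12)]
width = 25e-12

pulse = sum(a * gaussian_pulse(t, d, width) for a, d in taps)

# Magnitude spectrum, e.g. to check against a spectral mask.
spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
print("peak spectral component near", freqs[np.argmax(spectrum[1:]) + 1] / 1e9, "GHz")
```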
Abstract:
The effect of an upstream building on the suction forces on the flat roof of a low-rise building placed in the wake of the former is analyzed. The analysis has been performed by wind tunnel testing of a flat roof, low-rise building model equipped with pressure taps on the roof and different block-type buildings (only configurations where the upstream building is as high or higher than the downstream one are considered in this paper). The influence of the distance between both buildings on the wind loads on the downstream building roof is analyzed, as well as the height of the upstream one and the wind angle of incidence. Experimental results reveal that the wind load increases as the relative height of the upstream building increases, the wind load being highest for intermediate distances between buildings, when a passage between them is formed.
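For reference, pressure-tap measurements of this kind are usually reduced to pressure coefficients and then integrated over the tributary area of each tap to obtain the net suction load on the roof. The sketch below shows that standard reduction with invented readings and an assumed reference speed; it is not the authors' data-processing code.

```python
import numpy as np

RHO_AIR = 1.225                       # kg/m^3
U_REF = 20.0                          # m/s, assumed reference wind-tunnel speed
Q_REF = 0.5 * RHO_AIR * U_REF ** 2    # dynamic pressure, Pa

# Hypothetical mean gauge pressures (Pa) at four roof taps and the plan
# area of roof each tap represents (m^2, model scale).
p_taps = np.array([-180.0, -220.0, -150.0, -90.0])
areas = np.array([0.010, 0.010, 0.015, 0.015])

cp = p_taps / Q_REF                   # pressure coefficients (negative = suction)
force = np.sum(p_taps * areas)        # net vertical load on the roof, N (negative = uplift)

print("Cp per tap:", np.round(cp, 2))
print(f"net roof load: {force:.2f} N (uplift if negative)")
```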
Abstract:
The objective of the present work is the investigation and development of continuous and discrete optimization strategies for Optimal Power Flow (OPF) problems in which the control variables associated with the taps of in-phase transformers and with the switching of shunt capacitor and reactor banks must be treated as discrete variables, and in which the number of control actions must be limited or even minimized. In this work, the OPF problem is approached through three strategies. In the first proposal, the OPF problem is modelled as a Nonlinear Programming problem with Continuous and Discrete Variables (NLPCD) for the minimization of active transmission losses; three approaches using discretization functions are proposed for the treatment of the discrete variables. In the second proposal, the OPF problem, with discrete transformer taps and fixed shunt capacitor and reactor banks, is subject to a limit on the number of control actions; the binary variables associated with the number of control actions are handled by a quadratic function. In the third proposal, the OPF problem is modelled as a multi-objective optimization problem; the weighted-sum method and the ε-constraint method are used to transform the proposed multi-objective problems into single-objective problems, and the binary variables associated with the control actions are handled by two functions, one sigmoidal and one polynomial. To verify the effectiveness and robustness of the developed models and algorithms, tests are carried out with the IEEE 14-, 30-, 57-, 118-, and 300-bus systems. All algorithms and models were implemented in the General Algebraic Modeling System (GAMS), and the solvers CONOPT, IPOPT, KNITRO, and DICOPT were used to solve the problems. The results confirm that the discretization strategies are efficient and that the proposed models for the binary variables make it possible to find feasible solutions for the problems involving control actions, whereas the solvers DICOPT and KNITRO, used to model the binary variables directly, do not find solutions.
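The discretization and sigmoid functions mentioned above can be illustrated with a simple penalty idea: a smooth function that vanishes only when a tap variable sits on one of its discrete steps is added to the objective, so a continuous NLP solver is nudged toward discrete values, while a sigmoid approximates a 0/1 control-action indicator. The sketch below shows one common form of each; it mirrors the idea, not the exact functions implemented in GAMS for this thesis.

```python
import numpy as np

def discretization_penalty(x, step=0.0125, weight=100.0):
    """Smooth penalty that is zero whenever x is an integer multiple of `step`
    (e.g. a transformer tap-ratio step) and positive in between."""
    return weight * np.sin(np.pi * x / step) ** 2

def sigmoid_binary(u, sharpness=50.0):
    """Smooth approximation of a 0/1 control-action indicator."""
    return 1.0 / (1.0 + np.exp(-sharpness * u))

taps = np.array([0.9875, 0.9910, 1.0000, 1.0062])  # candidate tap ratios (hypothetical)
print("penalty per tap :", np.round(discretization_penalty(taps - 1.0), 4))
print("binary indicator:", np.round(sigmoid_binary(np.array([-0.1, 0.0, 0.1])), 3))
```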
Abstract:
The Alaska Natural Gas Pipeline (ANGP) is proposed for construction on the North Slope in 2016. It will be aligned through Arctic caribou habitat, and evidence shows that caribou are negatively affected by human development. This capstone identifies potential adverse effects of the ANGP on Arctic caribou using interviews with expert caribou biologists and the 1977 Trans-Alaska Pipeline System (TAPS) as a model. Based on a synthesis of the interviews and the TAPS analysis, it proposes and examines a set of seventeen conservation measures to be implemented during construction and operation of the ANGP to minimize adverse impacts on caribou herds. These conservation measures can be used as a baseline for future developments on the North Slope to promote caribou herd management.
Abstract:
The General Reforestation Reports (Memorias Generales de Repoblación) arose from the provisions implementing the Law of 11 July 1877 on the reforestation, development, and improvement of public forests. Prepared by the staff of the forest districts, they were sent to the Ministerio de Fomento between 1878 and 1884. In addition to setting out the objectives and reforestation actions proposed by each district, they also reflect, from the point of view of the forestry engineers, the state of the Spanish forests, and they occasionally include, within this descriptive logic, basic cartography with forestry, agronomic, and geological content. In a few cases this cartography is known through originals published at a larger scale; in others its existence was known only through documentary references. Both kinds of examples form part of the Spanish cartographic effort that characterized the second half of the nineteenth century.