926 results for Dipole Array


Relevance: 20.00%

Abstract:

The ocean bottom pressure records from eight stations of the Cascadia array are used to investigate the properties of short surface gravity waves with frequencies ranging from 0.2 to 5 Hz. The pressure spectrum at all sites is found to be a well-defined function of the wind speed U10 and frequency f, with only a minor shift of a few dB from one site to another that can be attributed to variations in bottom properties. This observation can be combined with the theoretical prediction that the ocean bottom pressure spectrum is proportional to the square of the surface gravity wave spectrum E(f), times the overlap integral I(f), which is given by the directional wave spectrum at each frequency. This combination, using E(f) estimated from modeled or parametric spectra, yields an overlap integral I(f) that is a function of the local wave age, expressed through the ratio f/fPM. This function is maximum for f/fPM = 8 and decreases by 10 dB at f/fPM = 2 and f/fPM = 30. This shape of I(f) can be interpreted as a maximum width of the directional wave spectrum at f/fPM = 8, possibly equivalent to an isotropic directional spectrum, with a narrower directional distribution toward both the dominant low frequencies and the higher capillary-gravity wave frequencies.
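A minimal numerical sketch of the stated relation, assuming a schematic overlap integral I(f) that peaks at f/fPM = 8 and falls by about 10 dB at f/fPM = 2 and 30 (the functional form and constants below are invented for illustration, not the paper's fitted shape):

```python
import numpy as np

def overlap_integral(f, f_pm):
    """Schematic I(f): peaks at f/f_pm = 8 and drops ~10 dB by f/f_pm = 30
    (roughly symmetric in log frequency). Shape is illustrative only."""
    x = np.log10(f / f_pm) - np.log10(8.0)
    return 10.0 ** (-(x / np.log10(30.0 / 8.0)) ** 2)

def bottom_pressure_spectrum(E, f, f_pm, C=1.0):
    """Stated proportionality: pressure spectrum ~ E(f)^2 * I(f).
    C lumps in the few-dB site-to-site shift from bottom properties."""
    return C * E ** 2 * overlap_integral(f, f_pm)
```

With this shape, doubling or halving f relative to the 8·fPM peak moves I(f) down the 10 dB flanks, mimicking the narrowing of the directional distribution described above.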

Relevance: 20.00%

Abstract:

Crossing the Franco-Swiss border, the Large Hadron Collider (LHC), designed to collide 7 TeV proton beams, is the world's largest and most powerful particle accelerator, and its operation was originally intended to commence in 2008. Unfortunately, due to an interconnect discontinuity in one of the main dipole circuit's 13 kA superconducting busbars, a catastrophic quench event occurred during initial magnet training, causing significant physical damage to the system. Investigation into the cause found that such discontinuities were present not only in the circuit in question but throughout the entire LHC. This prevented further magnet training and ultimately limited the maximum sustainable beam energy to approximately half the design nominal, 3.5-4 TeV, for the first three years of operation (Run 1, 2009-2012), and led to a major consolidation campaign being scheduled for the first long shutdown (LS 1, 2012-2014). Throughout Run 1, a series of studies attempted to predict the number of post-installation training quenches still required to qualify each circuit to nominal-energy current levels. With predictions in excess of 80 quenches (each with a recovery time of 8-12+ hours) just to achieve 6.5 TeV, and close to 1000 quenches for 7 TeV, it was decided that for Run 2 all systems would be qualified for at least 6.5 TeV operation. However, even with all interconnect discontinuities scheduled for repair during LS 1, numerous other concerns regarding circuit stability arose: in particular, observations of erratic behaviour in the magnet bypass diodes, of degradation in other potentially weak busbar sections, and of seemingly random millisecond spikes in beam losses, known as unidentified falling object (UFO) events, which, if they persist at 6.5 TeV, may eventually deposit sufficient energy to quench adjacent magnets.
In light of the above, the thesis hypothesis states that, even with the observed issues, the LHC main dipole circuits can safely support and sustain near-nominal proton beam energies of at least 6.5 TeV. Research into minimising the risk of magnet training led to the development and implementation of a new qualification method, capable of providing conclusive evidence that all aspects of all circuits, other than the magnets and their internal joints, can safely withstand a quench event at near-nominal current levels, allowing magnet training to be carried out both systematically and without risk. This method has become known as the Copper Stabiliser Continuity Measurement (CSCM). Results were a success, with all circuits eventually being subjected to a full current decay from 6.5 TeV-equivalent current levels with no measurable damage. Research into UFO events led to the development of a numerical model capable of simulating typical UFO events, reproducing the entire set of measured Run 1 events and extrapolating to 6.5 TeV to predict the likelihood of UFO-induced magnet quenches. The results provided interesting insights into the phenomena involved and confirmed the possibility of UFO-induced magnet quenches. The model was also capable of predicting whether such events, if left unaccounted for, are likely to be commonplace, resulting in significant long-term issues for 6.5+ TeV operation. Addressing the thesis hypothesis, the following written works detail the development and results of all CSCM qualification tests and subsequent magnet training, as well as the development and simulation results of both 4 TeV and 6.5 TeV UFO event modelling. The thesis concludes, post-LS 1, with the LHC successfully sustaining 6.5 TeV proton beams, but with UFO events, as predicted, resulting in otherwise uninitiated magnet quenches and remaining at the forefront of system availability issues.

Relevance: 20.00%

Abstract:

Title of dissertation: MAGNETIC AND ACOUSTIC INVESTIGATIONS OF TURBULENT SPHERICAL COUETTE FLOW. Matthew M. Adams, Doctor of Philosophy, 2016. Dissertation directed by: Professor Daniel Lathrop, Department of Physics. This dissertation describes experiments in spherical Couette devices, using both gas and liquid sodium. The experimental geometry is motivated by the Earth's outer core, the seat of the geodynamo, and consists of an outer spherical shell and an inner sphere, both of which can be rotated independently to drive a shear flow in the fluid lying between them. In the experiments with liquid sodium, we apply DC axial magnetic fields, with a dominant dipole or quadrupole component, to the system. We measure the magnetic field induced by the flow of liquid sodium using an external array of Hall-effect magnetic field probes, as well as two probes inserted into the fluid volume. This gives information about the velocity patterns possibly present, and we extend previous work categorizing flow states, noting further information that can be extracted from the induced field measurements. The limitations due to a lack of direct velocity measurements prompted us to develop the technique of using acoustic modes to measure zonal flows. Using gas as the working fluid in our 60 cm diameter spherical Couette experiment, we identified acoustic modes of the container and obtained excellent agreement with theoretical predictions. For the case of uniform rotation of the system, we compared the acoustic mode frequency splittings with theoretical predictions for solid-body flow, again with excellent agreement. This gave us confidence in extending this work to the case of differential rotation, with a turbulent flow state. Using the measured splittings for this case, our colleagues performed an inversion to infer the pattern of zonal velocities within the flow, the first such inversion in a rotating laboratory experiment.
This technique holds promise for use in liquid sodium experiments, for which zonal flow measurements have historically been challenging.
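As a rough illustration of the uniform-rotation comparison, one can sketch first-order rotational splitting in the simple form f_m = f0 + m·f_rot·(1 − C), with a mode-dependent Coriolis/geometry factor C. Both the form and the numbers below are assumptions for illustration, not the dissertation's actual calculation:

```python
def split_frequencies(f0, l, f_rot, C=0.0):
    """Frequencies of the 2l+1 azimuthal components (m = -l..l) of an
    acoustic mode of degree l under solid-body rotation at f_rot (Hz).
    First-order splitting only; C is a mode-dependent correction factor."""
    return [f0 + m * f_rot * (1.0 - C) for m in range(-l, l + 1)]

# Toy numbers: a 1000 Hz, l = 1 mode in a container rotating at 2 Hz
# splits into components near 998, 1000 and 1002 Hz (with C = 0).
components = split_frequencies(1000.0, 1, 2.0)
```

Measuring such splittings mode by mode is what makes an inversion for the underlying zonal flow possible.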

Relevance: 20.00%

Abstract:

Biogeochemical-Argo is the extension of the Argo array of profiling floats to include floats that are equipped with biogeochemical sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance. Argo is a highly regarded international program that measures the changing ocean temperature (heat content) and salinity with profiling floats distributed throughout the ocean. Newly developed sensors now allow profiling floats to also observe biogeochemical properties with sufficient accuracy for climate studies. This extension of Argo will enable an observing system that can determine the seasonal to decadal-scale variability in biological productivity, the supply of essential plant nutrients from deep waters to the sunlit surface layer, ocean acidification, hypoxia, and ocean uptake of CO2. Biogeochemical-Argo will drive a transformative shift in our ability to observe and predict the effects of climate change on ocean metabolism, carbon uptake, and living marine resource management. Presently, vast areas of the open ocean are sampled only once per decade or less, with sampling occurring mainly in summer. Our ability to detect changes in biogeochemical processes that may occur due to the warming and acidification driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by this undersampling. In close synergy with satellite systems (which are effective at detecting global patterns for a few biogeochemical parameters, but only very close to the sea surface and in the absence of clouds), a global array of biogeochemical sensors would revolutionize our understanding of ocean carbon uptake, productivity, and deoxygenation. The array would reveal the biological, chemical, and physical events that control these processes.
Such a system would enable a new generation of global ocean prediction systems in support of studies of carbon cycling, acidification, hypoxia, and harmful algal blooms, as well as the management of living marine resources. In order to prepare for a global Biogeochemical-Argo array, several prototype profiling float arrays have been developed at the regional scale by various countries and are now operating. Examples include regional arrays in the Southern Ocean (SOCCOM), the North Atlantic Sub-polar Gyre (remOcean), the Mediterranean Sea (NAOS), the Kuroshio region of the North Pacific (INBOX), and the Indian Ocean (IOBioArgo). For example, the SOCCOM program is deploying 200 profiling floats with biogeochemical sensors throughout the Southern Ocean, including areas covered seasonally with ice. The resulting data, which are publicly available in real time, are being linked with computer models to better understand the role of the Southern Ocean in influencing CO2 uptake, biological productivity, and nutrient supply to distant regions of the world ocean. The success of these regional projects motivated a planning meeting to discuss the requirements for and applications of a global-scale Biogeochemical-Argo program. The meeting was held 11-13 January 2016 in Villefranche-sur-Mer, France, with attendees from the eight nations now deploying Argo floats with biogeochemical sensors present to discuss this topic. In preparation, computer simulations and a variety of analyses were conducted to assess the resources required for the transition to a global-scale array. Based on these analyses and simulations, it was concluded that an array of about 1000 biogeochemical profiling floats would provide the needed resolution to greatly improve our understanding of biogeochemical processes and to enable significant improvement in ecosystem models.
With an endurance of four years for a Biogeochemical-Argo float, this system would require the procurement and deployment of 250 new floats per year to maintain a 1000-float array. The lifetime cost for a Biogeochemical-Argo float, including capital expense, calibration, data management, and data transmission, is about $100,000. A global Biogeochemical-Argo system would thus cost about $25,000,000 annually. In the present Argo paradigm, the US provides half of the profiling floats in the array, while the EU, Austral/Asia, and Canada share most of the remaining half. If this approach is adopted, the US cost for the Biogeochemical-Argo system would be ~$12,500,000 annually, and ~$6,250,000 each for the EU and for Austral/Asia and Canada. This includes no direct costs for ship time and presumes that float deployments can be carried out from future research cruises of opportunity, including, for example, the international GO-SHIP program (http://www.go-ship.org). The full-scale implementation of a global Biogeochemical-Argo system with 1000 floats is feasible within a decade. The successful, ongoing pilot projects have provided the foundation and start for such a system.
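The cost arithmetic above can be checked directly (figures exactly as stated in the text; the halving of the non-US share between the EU and the Austral/Asia-plus-Canada group follows the split described):

```python
# All figures as stated in the text.
ARRAY_SIZE = 1000                 # target number of floats in the array
ENDURANCE_YEARS = 4               # stated endurance of one float
FLOAT_LIFETIME_COST = 100_000     # USD: capital, calibration, data management/transmission

floats_per_year = ARRAY_SIZE // ENDURANCE_YEARS        # replacements needed annually
annual_cost = floats_per_year * FLOAT_LIFETIME_COST    # global programme cost per year
us_share = annual_cost // 2                            # US provides half of the array
partner_share = (annual_cost - us_share) // 2          # EU; Austral/Asia with Canada

print(floats_per_year, annual_cost, us_share, partner_share)
# → 250 25000000 12500000 6250000
```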

Relevance: 20.00%

Abstract:

This document introduces the planned new search for the neutron electric dipole moment at the Spallation Neutron Source at Oak Ridge National Laboratory. A spin-precession measurement is to be carried out using ultracold neutrons diluted in a superfluid helium bath at T = 0.5 K, where spin-polarized 3He atoms act as a detector of the neutron spin polarization. This manuscript describes some of the key aspects of the planned experiment, together with the contributions from Caltech to the development of the project.

Techniques used in the design of magnet coils for nuclear magnetic resonance were adapted to the geometry of the experiment. An initial design approach is described, using a pair of coils tuned to shield outer conductive elements from resistive heat loads while inducing an oscillating field in the measurement volume. A small prototype was constructed to test the model of the field at room temperature.

A large-scale test of the high-voltage system was carried out in a collaborative effort at Los Alamos National Laboratory. The application and amplification of high voltage on polished steel electrodes immersed in a superfluid helium bath was studied, as well as the electrical breakdown properties of the electrodes at low temperatures. A suite of Monte Carlo simulation tools modelling the interaction of neutrons, 3He atoms, and their spins with the experimental magnetic and electric fields was developed and used to study the expected systematic effects of the measurement, with particular focus on the false electric dipole moment induced by a geometric phase akin to Berry's phase.

An analysis framework based on an unbinned likelihood fit of the time-modulated signal expected from the measurement data was developed and implemented. A collaborative Monte Carlo data set was used to test the analysis methods.
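A minimal, self-contained sketch of the idea behind an unbinned likelihood fit to a time-modulated signal (the modulation frequency, phase, amplitude, and event counts here are toy values invented for illustration, not the collaboration's analysis):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
OMEGA, PHI, T = 2 * np.pi, 0.3, 10.0   # 1 Hz toy modulation over 10 whole periods

def sample(a_true, n=20_000):
    """Accept-reject draw of event times from p(t) ∝ 1 + a*cos(OMEGA*t + PHI)."""
    t = rng.uniform(0.0, T, size=4 * n)
    u = rng.uniform(0.0, 1.0 + a_true, size=t.size)
    return t[u < 1.0 + a_true * np.cos(OMEGA * t + PHI)][:n]

def nll(a, t):
    # T covers whole modulation periods, so the pdf normalisation is just T
    return -np.sum(np.log((1.0 + a * np.cos(OMEGA * t + PHI)) / T))

events = sample(0.4)
fit = minimize_scalar(nll, args=(events,), bounds=(-0.9, 0.9), method="bounded")
# fit.x recovers the modulation amplitude (~0.4) from the unbinned event times
```

The fit operates on individual event times rather than a histogram, which is what makes the likelihood "unbinned" and avoids any binning-induced loss of information.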

Relevance: 20.00%

Abstract:

Nowadays, photovoltaic (PV) technology is consolidated as a source of renewable energy, and maximising the energy efficiency of PV plants is a major research challenge. The main requirement for this purpose is to know, in real time, the performance of each of the PV modules that make up the PV field. To this end, a PLC-communications-based Smart Monitoring and Communications Module, able to monitor the operating parameters at the PV panel level, has been developed at the University of Malaga. With this device it is possible to check whether any of the panels is underperforming, due to a malfunction or partial shading of its surface. Since these fluctuations in the electricity production of a single panel affect the overall output of the string it belongs to, it is necessary to isolate the problem and reroute the energy through alternative paths in PV panel array configurations.
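A toy example of the kind of panel-level check that per-module monitoring makes possible (the function name and threshold are invented for illustration, not part of the described system):

```python
from statistics import median

def flag_underperformers(panel_watts, tolerance=0.2):
    """Return indices of panels whose output falls more than `tolerance`
    below the string median, e.g. due to a fault or partial shading."""
    m = median(panel_watts)
    return [i for i, p in enumerate(panel_watts) if p < (1.0 - tolerance) * m]

# A shaded third panel in an otherwise healthy string:
flag_underperformers([240, 238, 150, 242])   # → [2]
```

Comparing each panel against the string median rather than a fixed rating makes the check robust to irradiance changes that affect the whole string equally.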

Relevance: 20.00%

Abstract:

Colloid self-assembly under external control is a new route to the fabrication of advanced materials with novel microstructures and appealing functionalities. The kinetic processes of colloidal self-assembly have also attracted great interest because they are similar to many atomic-level kinetic processes in materials. In the past decades, rapid technological progress has been achieved in producing shape-anisotropic, patchy, and core-shell structured particles, and particles with electric/magnetic charges/dipoles, which has greatly enriched the range of self-assembled structures. Multi-phase carrier liquids offer a further route to controlling colloidal self-assembly. Heterogeneity is thus an essential characteristic of colloidal systems, yet a model able to efficiently incorporate these possible heterogeneities has so far been lacking. This thesis is mainly devoted to the development of such a model and to a computational study of complex colloidal systems through a diffuse-interface field approach (DIFA), recently developed by Wang et al. This meso-scale model is able to describe arbitrary particle shapes and arbitrary charge/dipole distributions on the surface or in the body of particles. Within the framework of DIFA, a Gibbs-Duhem-type formula is introduced to treat the Laplace pressure in multi-liquid-phase colloidal systems, and it obeys the Young-Laplace equation. The model is thus capable of quantitatively studying important capillarity-related phenomena. Extensive computer simulations are performed to study the fundamental behavior of heterogeneous colloidal systems. The role of the Laplace pressure is revealed in determining the mechanical equilibrium of shape-anisotropic particles at fluid interfaces. In particular, it is found that the Laplace pressure plays a critical role in maintaining the stability of capillary bridges between close particles, which sheds light on a novel route to firming compact but fragile colloidal microstructures in situ via capillary bridges.
Simulation results also show that competition between like-charge repulsion, dipole-dipole interaction, and Brownian motion dictates the degree of aggregation of heterogeneously charged particles. The assembly and alignment of particles with magnetic dipoles under an external field are studied. Finally, extended studies on the role of dipole-dipole interaction are performed for ferromagnetic and ferroelectric domain phenomena. The results reveal that the internal field generated by the dipoles competes with the external field to determine the dipole-domain evolution in ferroic materials.
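For reference, the Young-Laplace relation that the Gibbs-Duhem-type treatment is said to obey can be written down in a few lines (the droplet example is illustrative):

```python
def laplace_pressure(gamma, r1, r2):
    """Young-Laplace pressure jump across a curved fluid interface:
    Δp = γ(1/R1 + 1/R2), with γ in N/m and radii in m, giving Pa."""
    return gamma * (1.0 / r1 + 1.0 / r2)

# Spherical water droplet (γ ≈ 0.072 N/m), radius 1 µm: Δp = 2γ/R ≈ 1.44e5 Pa
dp = laplace_pressure(0.072, 1e-6, 1e-6)
```

The strong 1/R scaling is what makes the Laplace pressure dominant for the micron-scale capillary bridges between close particles discussed above.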

Relevance: 10.00%

Abstract:

Networks have come to occupy a key position in the strategic armoury of the government, business and community sectors, and now have impact on a broad array of policy and management arenas. An emphasis on relationships, trust and mutuality means that networks function on a different operating logic from the conventional processes of government and business. It is therefore important that organizational members of networks are able to adopt the skills and culture necessary to operate successfully under these distinctive kinds of arrangements. Because networks function on a different operational logic to traditional bureaucracies, public sector organizations may experience difficulties in adapting to networked arrangements. Networks are formed to address a variety of social problems or to meet capability gaps within organizations. As such, they are often under pressure to produce measurable outcomes quickly, and need to form and come to full operation rapidly. This paper presents a theoretical exploration of how diverse types of networks are required for different management and policy situations, and draws on a set of public sector case studies to demonstrate how these various types of networked arrangements may be ‘turbo-charged’ so that they more quickly adopt the characteristics necessary to deliver required outcomes.

Relevance: 10.00%

Abstract:

This is the first volume to capture the essence of the burgeoning field of cultural studies in a concise and accessible manner. Other books have explored the British and North American traditions, but this is the first guide to the ideas, purposes and controversies that have shaped the subject. The author sheds new light on neglected pioneers and provides a clear route map through the terrain. He offers lively critical narratives on a dazzling array of key figures including Arnold, Barrell, Bennett, Carey, Fiske, Foucault, Grossberg, Hall, Hawkes, hooks, Hoggart, Leadbeater, Lissitzky, Malevich, Marx, McLuhan, McRobbie, D Miller, T Miller, Morris, Quiller-Couch, Ross, Shaw, Urry, Williams, Wilson, Wolfe and Woolf. Hartley also examines a host of central themes in the subject, including literary and political writing, publishing, civic humanism, political economy and Marxism, sociology, feminism, anthropology and the pedagogy of cultural studies.