917 results for Charged System Search
Abstract:
A battery-powered air-conditioning device was developed to provide an improved level of thermal comfort for individuals in inadequately cooled environments. The device is a battery-powered air-conditioning system that uses a phase change material (PCM) for heat storage: condenser heat is stored in the PCM during cooling operation and is discharged while the battery is being charged, by running the vapor compression cycle as a thermosiphon loop. The main focus of the current research was the development of the cooling system. The measured cooling capacity of the vapor compression cycle was 165.6 W, with a system COP of 2.85. The device was able to provide 2 hours of cooling without discharging heat to the ambient, and the PCM was recharged in nearly 8 hours in thermosiphon mode.
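As a quick consistency check on the figures quoted above, the coefficient of performance relates the cooling capacity to the electrical input power. A minimal sketch follows; the implied compressor input power and the assumption that the full condenser heat goes into the PCM are inferences for illustration, not values stated in the abstract.

```python
# Relation between cooling capacity, COP, and input power for a vapor
# compression cycle: COP = Q_cooling / W_input.
cooling_capacity_w = 165.6   # measured cooling capacity [W]
cop = 2.85                   # measured system COP

# Implied electrical input power of the compressor (inferred, not quoted).
w_input = cooling_capacity_w / cop
print(f"Implied input power: {w_input:.1f} W")        # ~58.1 W

# Heat stored in the PCM over a 2-hour cooling run, assuming the full
# condenser heat (cooling load + input power) is absorbed by the PCM.
runtime_h = 2.0
q_condenser_w = cooling_capacity_w + w_input
energy_to_pcm_kj = q_condenser_w * runtime_h * 3600 / 1000
print(f"Condenser heat stored in PCM over {runtime_h} h: {energy_to_pcm_kj:.0f} kJ")
```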
Abstract:
Odour impacts and concerns are an impediment to the growth of the Australian chicken meat industry. To manage these, the industry must be able to demonstrate the efficacy of its odour reduction strategies scientifically and defensibly; however, it currently lacks reliable, cost-effective and objective tools to do so. This report describes the development of an artificial olfaction system (AOS) to measure meat chicken farm odour, covering the market research undertaken to determine the demand for such a tool, the development and evaluation of three AOS prototypes, data analysis and odour prediction modelling, and the development of two complementary odour measurement tools, namely a volatile organic compound (VOC) pre-concentrator and a field olfactometer. The report is aimed at investors in poultry odour research and those charged with, or interested in, the assessment of odour on chicken farms, including farm managers, integrators, their consultants, regulators and researchers. The findings will influence the focus of future environmental odour measurement research.
Abstract:
We present new radial velocity measurements of eight stars that were secured with the SOPHIE spectrograph at the 193 cm telescope of the Haute-Provence Observatory. The measurements allow new giant extrasolar planets to be detected and characterized. The host stars are dwarfs of spectral types between F5 and K0 with magnitudes between 6.7 and 9.6; the planets have minimum masses Mp sin i between 0.4 and 3.8 MJup and orbital periods of several days to several months. The data allow only single planets to be discovered around the first six stars (HD 143105, HIP 109600, HD 35759, HIP 109384, HD 220842, and HD 12484), but one of them shows the signature of an additional substellar companion in the system. The seventh star, HIP 65407, allows the discovery of two giant planets that orbit just outside the 12:5 resonance in weak mutual interaction. The last star, HD 141399, was already known to host a four-planet system; our additional data and analyses allow new constraints to be set on it. We present Keplerian orbits of all systems, together with dynamical analyses of the two multi-planet systems. HD 143105 is one of the brightest stars known to host a hot Jupiter, which could allow numerous follow-up studies to be conducted even though this is not a transiting system. The giant planets HIP 109600b, HIP 109384b, and HD 141399c are located in the habitable zones of their host stars.
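For context on how such Keplerian orbits are fitted to radial velocity data, a minimal single-planet model is sketched below. These are the generic textbook formulae, not the SOPHIE reduction or fitting pipeline, and the parameter values are placeholders.

```python
import numpy as np

def solve_kepler(M, e, tol=1e-10):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E
    using Newton-Raphson iteration."""
    E = M.copy()
    for _ in range(100):
        dE = (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
        E -= dE
        if np.max(np.abs(dE)) < tol:
            break
    return E

def radial_velocity(t, P, K, e, omega, t_peri, gamma=0.0):
    """Stellar radial velocity induced by one planet on a Keplerian orbit:
    v(t) = gamma + K * [cos(nu + omega) + e*cos(omega)]."""
    M = np.mod(2.0 * np.pi * (t - t_peri) / P, 2.0 * np.pi)  # mean anomaly
    E = solve_kepler(M, e)                                   # eccentric anomaly
    # True anomaly from the eccentric anomaly.
    nu = 2.0 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                          np.sqrt(1 - e) * np.cos(E / 2))
    return gamma + K * (np.cos(nu + omega) + e * np.cos(omega))

# Placeholder parameters (illustrative only): a hot Jupiter on a 2.2-day orbit.
t = np.linspace(0.0, 10.0, 200)       # days
rv = radial_velocity(t, P=2.2, K=140.0, e=0.07, omega=1.2, t_peri=0.0)
# rv now holds the model radial velocity curve in the same units as K.
```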
Abstract:
Context. With about 2000 extrasolar planets confirmed, the results show that planetary systems have a whole range of unexpected properties. This wide diversity provides fundamental clues to the processes of planet formation and evolution. Aims: We present a full investigation of the HD 219828 system, a bright metal-rich star for which a hot Neptune has previously been detected. Methods: We used a set of HARPS, SOPHIE, and ELODIE radial velocities to search for companions orbiting HD 219828. The spectra were used to characterise the star and its chemical abundances, as well as to check for spurious, activity-induced signals. A dynamical analysis was also performed to study the stability of the system and to constrain the orbital parameters and planet masses. Results: We announce the discovery of a long-period (P = 13.1 yr), massive (m sin i = 15.1 MJup) companion (HD 219828 c) in a very eccentric orbit (e = 0.81). The same data confirm the existence of a hot Neptune, HD 219828 b, with a minimum mass of 21 M⊕ and a period of 3.83 days. The dynamical analysis shows that the system is stable and that the equilibrium eccentricity of planet b is close to zero. Conclusions: The HD 219828 system is extreme and unique in several aspects. First, among all known exoplanet systems it presents an unusually high mass ratio. We also show that systems like HD 219828, with a hot Neptune and a long-period massive companion, are more frequent than similar systems with a hot Jupiter instead. This suggests that the formation of hot Neptunes follows a different path than the formation of their hot Jovian counterparts. The high mass, long period, and eccentricity of HD 219828 c also make it a good target for Gaia astrometry, as well as a potential target for atmospheric characterisation using direct imaging or high-resolution spectroscopy. Astrometric observations will allow us to derive its true mass and orbital configuration. If a transit of HD 219828 b is detected, we will be able to fully characterise the system, including the relative orbital inclinations. With a clearly known mass, HD 219828 c may become a benchmark object for the mass range between giant planets and brown dwarfs.
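The "unusually high mass ratio" mentioned above follows directly from the two quoted minimum masses; a one-line check, using the standard conversion 1 MJup ≈ 317.8 M⊕:

```python
# Minimum masses quoted for HD 219828 b and c, and their ratio.
M_JUP_IN_EARTH = 317.8             # Jupiter mass expressed in Earth masses
m_b_earth = 21.0                   # hot Neptune, in Earth masses
m_c_earth = 15.1 * M_JUP_IN_EARTH  # outer companion, converted to Earth masses
print(f"Minimum-mass ratio c/b ~ {m_c_earth / m_b_earth:.0f}")   # ~229
```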
Abstract:
The origin of observed ultra-high energy cosmic rays (UHECRs, energies in excess of $10^{18.5}$ eV) remains unknown, as extragalactic magnetic fields deflect these charged particles from their true origin. Interactions of these UHECRs at their source would invariably produce high energy neutrinos. As these neutrinos are chargeless and nearly massless, their propagation through the universe is unimpeded, and their detection can be correlated with the origin of UHECRs. Gamma-ray bursts (GRBs), observed as short, immensely bright outbursts of gamma-rays at cosmological distances, are one of the few possible origins for UHECRs. The energy density of GRBs in the universe is capable of explaining the measured UHECR flux, making them promising UHECR sources. Interactions between UHECRs and the prompt gamma-ray emission of a GRB would produce neutrinos that would be detected in coincidence with the GRB's gamma-ray emission. The IceCube Neutrino Observatory can be used to search for these neutrinos in coincidence with GRBs, detecting neutrinos through the Cherenkov radiation emitted by secondary charged particles produced in neutrino interactions in the South Pole glacial ice. Restricting these searches to coincidence with GRB gamma-ray emission allows analyses with very little atmospheric background. Previous searches have focused on detecting muon tracks from muon neutrino interactions from the Northern Hemisphere, where the Earth shields IceCube's primary background of atmospheric muons, or spherical cascade events from neutrinos of all flavors from the entire sky, with no compelling neutrino signal found. In this dissertation, neutrino searches from GRBs with IceCube are extended to a search for muon tracks in the Southern Hemisphere in coincidence with 664 GRBs over five years of IceCube data. Though this region of the sky contains IceCube's primary background of atmospheric muons, it is also where IceCube is most sensitive to neutrinos at the very highest energies, as Earth absorption in the Northern Hemisphere becomes relevant. As previous neutrino searches have strongly constrained neutrino production in GRBs, a new per-GRB analysis is introduced for the first time to discover neutrinos in coincidence with possibly rare neutrino-bright GRBs. A stacked analysis is also performed to discover a weak neutrino signal distributed over many GRBs. Results of this search are found to be consistent with atmospheric muon backgrounds. Combining this result with previously published searches for muon neutrino tracks in the Northern Hemisphere, cascade event searches over the entire sky, and an extension of the Northern Hemisphere track search to three additional years of IceCube data that is consistent with atmospheric backgrounds, the most stringent limits yet can be placed on prompt neutrino production in GRBs, which increasingly disfavor GRBs as primary sources of UHECRs in current GRB models.
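The counting logic behind a stacked coincidence search of this kind can be illustrated with a simple Poisson model. The sketch below uses made-up event counts and a classical limit construction that ignores background uncertainty; it is a schematic illustration, not the IceCube likelihood analysis.

```python
from scipy.stats import poisson
from scipy.optimize import brentq

# Toy stacked counting experiment: events observed inside the summed on-time
# windows of many GRBs, compared with the expected atmospheric background.
n_obs = 3        # events observed in coincidence (illustrative number)
b_exp = 2.7      # expected background in the stacked windows (illustrative)

# Pre-trial p-value: probability of >= n_obs events from background alone.
p_value = poisson.sf(n_obs - 1, b_exp)
print(f"p-value = {p_value:.3f}")

# Classical 90% CL upper limit on the signal mean s, ignoring background
# uncertainty: find s such that P(N <= n_obs | s + b) = 0.10.
def coverage(s):
    return poisson.cdf(n_obs, s + b_exp) - 0.10

s_90 = brentq(coverage, 0.0, 50.0)
print(f"90% CL upper limit on signal events: {s_90:.2f}")
```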
Abstract:
Temporally growing frontal meandering and occasional eddy shedding are observed in the Brazil Current (BC) as it flows adjacent to the Brazilian coast. No study of the dynamics of this phenomenon has been conducted to date in the region between 22°S and 25°S. Within this latitude range, the flow over the intermediate continental slope is marked by a current inversion at depth that is associated with the Intermediate Western Boundary Current (IWBC). A time series analysis of data from a 10-current-meter mooring was used to describe a mean vertical profile for the BC-IWBC jet and a typical meander vertical structure. The latter was obtained by an empirical orthogonal function (EOF) analysis that showed a single mode explaining 82% of the total variance. This mode structure decayed sharply with depth, revealing that the meandering is much more vigorous within the BC domain than in the IWBC region. As the spectral analysis of the mode amplitude time series revealed no significant periods, we searched for dominant wavelengths. This search was done via a spatial EOF analysis on 51 thermal front patterns derived from digitized AVHRR images. Four modes were statistically significant at the 95% confidence level. Modes 3 and 4, which together explained 18% of the total variance, are associated with 266- and 338-km vorticity waves, respectively. With this new information derived from the data, the one-dimensional quasi-geostrophic model of [Johns, W.E., 1988. One-dimensional baroclinically unstable waves on the Gulf Stream potential vorticity gradient near Cape Hatteras. Dyn. Atmos. Oceans 11, 323-350] was applied to the interpolated mean BC-IWBC jet. The results indicated that the BC system is indeed baroclinically unstable and that the wavelengths depicted in the thermal front analysis are associated with the most unstable waves produced by the model. Growth rates were about 0.06 (0.05) days^-1 for the 266-km (338-km) wave. Moreover, phase speeds for these waves were low compared to the surface BC velocity and may account for remarks in the literature about growing standing or stationary meanders off southeast Brazil. The theoretical vertical structure modes associated with these waves closely resembled the one obtained from the current-meter mooring EOF analysis. We interpret this agreement as confirmation that baroclinic instability is an important mechanism of meander growth in the BC system.
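An empirical orthogonal function decomposition of the kind applied to the mooring records above amounts to a principal component analysis of the anomaly time series. A minimal sketch with synthetic data (not the paper's dataset) is shown below.

```python
import numpy as np

# Synthetic stand-in for current-meter records: n_time samples at n_depth levels.
rng = np.random.default_rng(0)
n_time, n_depth = 500, 10
data = rng.standard_normal((n_time, n_depth))

# Remove the time mean at each depth so the EOFs describe the variability.
anomaly = data - data.mean(axis=0)

# Singular value decomposition: rows of Vt are the spatial EOF patterns,
# U*S are the corresponding amplitude (principal component) time series.
U, S, Vt = np.linalg.svd(anomaly, full_matrices=False)
eofs = Vt                      # shape (n_modes, n_depth)
amplitudes = U * S             # shape (n_time, n_modes)

# Fraction of total variance explained by each mode.
explained = S**2 / np.sum(S**2)
print(f"Variance explained by mode 1: {100 * explained[0]:.1f}%")
```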
Abstract:
This document introduces the planned new search for the neutron electric dipole moment at the Spallation Neutron Source at Oak Ridge National Laboratory. A spin precession measurement is to be carried out using ultracold neutrons diluted in a superfluid helium bath at T = 0.5 K, where spin-polarized 3He atoms act as a detector of the neutron spin polarization. This manuscript describes some of the key aspects of the planned experiment, with the contributions from Caltech to the development of the project.
Techniques used in the design of magnet coils for nuclear magnetic resonance were adapted to the geometry of the experiment. An initial design approach is described, using a pair of coils tuned to shield outer conductive elements from resistive heat loads while inducing an oscillating field in the measurement volume. A small prototype was constructed to test the model of the field at room temperature.
A large-scale test of the high-voltage system was carried out in a collaborative effort at Los Alamos National Laboratory. The application and amplification of high voltage to polished steel electrodes immersed in a superfluid helium bath was studied, as well as the electrical breakdown properties of the electrodes at low temperatures. A suite of Monte Carlo simulation software tools was developed and implemented to model the interaction of neutrons, 3He atoms, and their spins with the experimental magnetic and electric fields, in order to further the study of expected systematic effects of the measurement, with particular focus on the false electric dipole moment induced by a geometric phase akin to Berry's phase.
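The classical ingredient of such spin-tracking simulations is precession of a spin in the local magnetic field, dS/dt = γ S × B. A minimal integrator is sketched below as a generic illustration, not the collaboration's simulation suite; field values and step sizes are placeholders.

```python
import numpy as np

GAMMA_N = -1.832e8   # neutron gyromagnetic ratio [rad s^-1 T^-1]

def precess(S, B, gamma, dt):
    """Advance a classical spin one step by rotating it about the local field.
    dS/dt = gamma * S x B corresponds to rotation about B by -gamma*|B|*dt
    (exact for a field that is constant over the step)."""
    Bmag = np.linalg.norm(B)
    if Bmag == 0.0:
        return S
    axis = B / Bmag
    theta = -gamma * Bmag * dt
    # Rodrigues rotation of S about `axis` by angle theta.
    return (S * np.cos(theta)
            + np.cross(axis, S) * np.sin(theta)
            + axis * np.dot(axis, S) * (1.0 - np.cos(theta)))

# Spin initially along x, holding field B0 along z (illustrative values).
S = np.array([1.0, 0.0, 0.0])
B0 = np.array([0.0, 0.0, 1.0e-6])      # ~1 uT holding field
dt, n_steps = 1.0e-4, 10000
for _ in range(n_steps):
    S = precess(S, B0, GAMMA_N, dt)    # add motional/gradient field terms here
                                       # to explore geometric-phase effects
print(S)
```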
An analysis framework based on an unbinned likelihood was developed and implemented to fit the time-modulated signal expected from the measurement data. A collaborative Monte Carlo data set was used to test the analysis methods.
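To illustrate what an unbinned likelihood fit of a time-modulated signal looks like in practice, here is a schematic sketch using a generic decaying, modulated event-rate model with invented parameters; it is not the collaboration's analysis framework.

```python
import numpy as np
from scipy.optimize import minimize

# Toy model: event times follow a decaying, modulated probability density
#   pdf(t) ∝ exp(-t / tau) * (1 - A * cos(omega * t + phi))   on [0, T_MAX].
T_MAX = 1000.0

def shape(t, tau, A, omega, phi):
    return np.exp(-t / tau) * (1.0 - A * np.cos(omega * t + phi))

def neg_log_likelihood(params, times):
    """Unbinned negative log-likelihood with numerical normalisation."""
    tau, A, omega, phi = params
    if tau <= 0 or not (0.0 <= A < 1.0):
        return np.inf
    grid = np.linspace(0.0, T_MAX, 4000)
    norm = np.sum(shape(grid, *params)) * (grid[1] - grid[0])
    return -np.sum(np.log(shape(times, *params) / norm))

# Generate toy event times by rejection sampling (illustrative true values).
rng = np.random.default_rng(1)
true = (400.0, 0.3, 2.0 * np.pi / 20.0, 0.5)
cand = rng.uniform(0.0, T_MAX, 100000)
keep = rng.uniform(0.0, 1.3, cand.size) < shape(cand, *true)
times = cand[keep]

# Fit: minimise the negative log-likelihood, starting near the true values.
fit = minimize(neg_log_likelihood, x0=(300.0, 0.2, 2.0 * np.pi / 20.0, 0.0),
               args=(times,), method="Nelder-Mead")
print(fit.x)
```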
Abstract:
In the first part of this thesis we search for physics beyond the Standard Model through a search for anomalous production of the Higgs boson using the razor kinematic variables. We search for anomalous Higgs boson production in proton-proton collisions at a center-of-mass energy of √s = 8 TeV collected by the Compact Muon Solenoid experiment at the Large Hadron Collider, corresponding to an integrated luminosity of 19.8 fb^-1.
In the second part we present a novel method for using a quantum annealer to train a classifier to recognize events containing a Higgs boson decaying to two photons. We train the classifier using simulated proton-proton collisions at √s = 8 TeV producing either a Standard Model Higgs boson decaying to two photons or a non-resonant Standard Model process that produces a two-photon final state.
The production mechanisms of the Higgs boson are precisely predicted by the Standard Model based on its association with the mechanism of electroweak symmetry breaking. We measure the yield of Higgs bosons decaying to two photons in kinematic regions predicted to have very little contribution from a Standard Model Higgs boson and search for an excess of events, which would be evidence of either non-standard production or non-standard properties of the Higgs boson. We divide the events into disjoint categories based on kinematic properties and the presence of additional b-quarks produced in the collisions. In each of these disjoint categories, we use the razor kinematic variables to characterize events with topological configurations incompatible with typical configurations found from Standard Model production of the Higgs boson.
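For reference, the razor variables used above are built from two "megajets" and the missing transverse momentum. A minimal sketch of the standard definitions follows; these are the generic formulae with made-up momenta, not the CMS analysis code.

```python
import numpy as np

def razor_variables(j1, j2, met):
    """Compute the razor variables (M_R, R^2) from two massless megajet
    three-momenta j1, j2 = (px, py, pz) and the missing transverse momentum
    vector met = (mex, mey), using the standard definitions."""
    p1, p2 = np.asarray(j1, float), np.asarray(j2, float)
    met = np.asarray(met, float)
    e1, e2 = np.linalg.norm(p1), np.linalg.norm(p2)   # massless approximation
    # Longitudinally boost-invariant mass scale.
    m_r = np.sqrt((e1 + e2) ** 2 - (p1[2] + p2[2]) ** 2)
    # Transverse mass built from the megajets and the missing momentum.
    pt1, pt2 = p1[:2], p2[:2]
    met_mag = np.linalg.norm(met)
    m_t_r = np.sqrt((met_mag * (np.linalg.norm(pt1) + np.linalg.norm(pt2))
                     - np.dot(met, pt1 + pt2)) / 2.0)
    r2 = (m_t_r / m_r) ** 2
    return m_r, r2

# Example with made-up momenta in GeV.
print(razor_variables(j1=(100.0, 20.0, 50.0), j2=(-80.0, -10.0, -30.0),
                      met=(-15.0, -8.0)))
```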
We observe an excess of events with di-photon invariant mass compatible with the Higgs boson mass, localized in a small region of the razor plane. We observe 5 events with a predicted background of 0.54 ± 0.28, an observation that has a p-value of 10^-3 and a local significance of 3.35σ. This background prediction comprises 0.48 predicted non-resonant background events and 0.07 predicted SM Higgs boson events. We proceed to investigate the properties of this excess, finding that it produces a very compelling peak in the di-photon invariant mass distribution and is physically separated in the razor plane from the predicted background. Using another method of measuring the background and the significance of the excess, we find a 2.5σ deviation from the Standard Model hypothesis over a broader range of the razor plane.
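The quoted p-value can be reproduced approximately from the counts alone. The sketch below computes the naive Poisson tail for 5 observed events on a 0.54-event background and then marginalises the ±0.28 background uncertainty with toy experiments; this is a standard recipe, not necessarily the exact procedure used in the thesis.

```python
import numpy as np
from scipy.stats import poisson

n_obs, b, sigma_b = 5, 0.54, 0.28   # observed events, background and its uncertainty

# Naive p-value ignoring the background uncertainty: P(N >= 5 | b).
p_naive = poisson.sf(n_obs - 1, b)
print(f"naive Poisson p-value: {p_naive:.1e}")        # ~3e-4

# Marginalise the background uncertainty with toy experiments: draw a
# background mean from a (truncated) Gaussian, then a Poisson count from it.
rng = np.random.default_rng(0)
n_toys = 2_000_000
b_toys = np.clip(rng.normal(b, sigma_b, n_toys), 0.0, None)
counts = rng.poisson(b_toys)
p_marg = np.mean(counts >= n_obs)
print(f"marginalised p-value: {p_marg:.1e}")   # inflated toward the quoted ~1e-3
```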
In the second part of the thesis we transform the problem of training a classifier to distinguish events with a Higgs boson decaying to two photons from events with other sources of photon pairs into the Hamiltonian of a spin system, the ground state of which is the best classifier. We then use a quantum annealer to find the ground state of this Hamiltonian and thereby train the classifier. We find that we are able to do this successfully in fewer than 400 annealing runs for a problem of median difficulty at the largest problem size considered. The networks trained in this manner exhibit good classification performance, competitive with more complicated machine learning techniques, and are highly resistant to overtraining. We also find that the nature of the training gives access to additional solutions that can be used to improve the classification performance by up to 1.2% in some regions.
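The mapping from classifier training to a spin Hamiltonian can be illustrated as follows: binary weights over a set of weak classifiers define a quadratic binary objective whose ground state is the trained ensemble. The sketch below builds such a QUBO from toy weak-classifier outputs and solves it by exhaustive search as a classical stand-in for the quantum annealer; it follows the general weak-classifier-ensemble idea rather than the thesis code, and all data are synthetic.

```python
import itertools
import numpy as np

# Toy data: weak classifier outputs c[i, x] in {-1, +1} for N weak classifiers
# evaluated on M training events with labels y[x] in {-1, +1}.
rng = np.random.default_rng(2)
N, M = 8, 200
y = rng.choice([-1, 1], size=M)
c = np.where(rng.random((N, M)) < 0.6, y, -y)   # weakly label-correlated outputs

# Quadratic objective over binary weights w in {0,1}^N:
#   E(w) = sum_ij C_ij w_i w_j - 2 * sum_i C_iy w_i + lam * sum_i w_i
# with C_ij = sum_x c_i(x) c_j(x) / M and C_iy = sum_x c_i(x) y(x) / M.
C = c @ c.T / M
Cy = c @ y / M
lam = 0.1          # regularisation penalising the number of active classifiers

def energy(w):
    w = np.asarray(w, float)
    return w @ C @ w - 2.0 * Cy @ w + lam * w.sum()

# Classical stand-in for the annealer: brute-force the 2^N weight vectors.
best_w = min(itertools.product([0, 1], repeat=N), key=energy)

# The trained ensemble classifies an event by the sign of the weighted vote.
scores = np.asarray(best_w) @ c
accuracy = np.mean(np.sign(scores) == y)
print(best_w, f"training accuracy = {accuracy:.2f}")
```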
Abstract:
Metaheuristics are widely used in discrete optimization. They make it possible to obtain a good-quality solution in a reasonable time for problems that are large, complex, and difficult to solve. Metaheuristics often have many parameters that the user must tune manually for a given problem. The goal of an adaptive metaheuristic is to let the method automatically adjust some of these parameters based on the instance being solved. By drawing on prior knowledge of the problem, notions from machine learning, and related fields, an adaptive metaheuristic yields a more general and automatic way of solving problems. Global optimization of mining complexes aims to determine the movement of materials through the mines and the processing streams so as to maximize the economic value of the system. Because of the large number of integer variables in the model and the presence of complex and non-linear constraints, it is often prohibitive to solve these models with the optimizers available in industry; metaheuristics are therefore frequently used to optimize mining complexes. This thesis improves a simulated annealing procedure developed by Goodfellow & Dimitrakopoulos (2016) for the stochastic optimization of mining complexes. The method developed by those authors requires many parameters to operate; one of them governs how the simulated annealing method searches the local neighbourhood of solutions. This thesis implements an adaptive neighbourhood-search method to improve solution quality. Numerical results show an increase of up to 10% in the value of the economic objective function.
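As a schematic of what adaptive neighbourhood selection inside simulated annealing can look like, the sketch below keeps per-neighbourhood success statistics and biases the choice of move toward neighbourhoods that have recently improved the solution. This is a generic adaptive scheme on a toy problem, not the method of the thesis or of Goodfellow & Dimitrakopoulos (2016).

```python
import math
import random

# Toy objective: maximise a simple function of a binary vector.
random.seed(0)
n = 40
weights = [random.uniform(-1.0, 2.0) for _ in range(n)]

def objective(x):
    return sum(w for w, b in zip(weights, x) if b)

# Two neighbourhoods: flip one bit, or swap two positions.
def flip_one(x):
    y = x[:]; i = random.randrange(n); y[i] = 1 - y[i]; return y

def swap_two(x):
    y = x[:]; i, j = random.sample(range(n), 2); y[i], y[j] = y[j], y[i]; return y

neighbourhoods = [flip_one, swap_two]
scores = [1.0 for _ in neighbourhoods]     # adaptive credit per neighbourhood

x = [random.randint(0, 1) for _ in range(n)]
best_val = objective(x)
T = 1.0
for it in range(20000):
    # Choose a neighbourhood with probability proportional to its credit.
    k = random.choices(range(len(neighbourhoods)), weights=scores)[0]
    cand = neighbourhoods[k](x)
    delta = objective(cand) - objective(x)
    if delta > 0 or random.random() < math.exp(delta / T):
        x = cand
        best_val = max(best_val, objective(x))
    # Update the credit: reward improving moves, slowly forget old information.
    scores[k] = max(0.99 * scores[k] + (0.1 if delta > 0 else 0.0), 0.05)
    T *= 0.9997                            # geometric cooling schedule

print(best_val)
```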
Abstract:
The MARS (Media Asset Retrieval System) Project is the collaborative effort of public broadcasters, libraries and schools in the Puget Sound region to create a digital online resource that provides access to content produced by public broadcasters via the public libraries.
Convergence Consortium: The Convergence Consortium is a model for community collaboration, including organizations such as public broadcasters, libraries, museums, and schools in the Puget Sound region, to assess the needs of their constituents and pool resources to develop solutions to meet those needs. Specifically, the archives of public broadcasters have been identified as significant resources for the local communities and nationally. These resources can be accessed on the broadcasters' websites and through libraries, used by schools, and integrated with text and photographic archives from other partners.
MARS' goal: Create an online resource that provides effective access to the content produced locally by KCTS (Seattle PBS affiliate) and KUOW (Seattle NPR affiliate). The broadcasts will be made searchable using the CPB Metadata Element Set (under development) and controlled vocabularies (to be developed). This will ensure a user-friendly search and navigation mechanism and user satisfaction. Furthermore, the resource can search the local public library's catalog concurrently and provide the user with relevant TV material, radio material, and books on a given subject. The ultimate goal is to produce a model that can be used in cities around the country. The current phase of the project assesses the community's need, analyzes the current operational systems, and makes recommendations for the design of the resource.
Deliverables:
• Literature review of the issues surrounding the organization, description and representation of media assets
• Needs assessment report of internal and external stakeholders
• Profile of the systems in the area of managing and organizing media assets for public broadcasting nationwide
Activities:
• Analysis of information seeking behavior
• Analysis of collaboration within the respective organizations
• Analysis of the scope and context of the proposed system
• Examining the availability of information resources and exchange of resources among users
Abstract:
In this article, we describe the development of an extension to the Simple Knowledge Organization System (SKOS) to accommodate the needs of vocabulary development applications (VDA) managing metadata schemes and requiring close tracking of change to both those schemes and their member concepts. We take a neo-pragmatic epistemic stance in asserting the need for an entity in SKOS modeling to mediate between the abstract concept and the concrete scheme. While the SKOS model sufficiently describes entities for modeling the current state of a scheme in support of indexing and search on the Semantic Web, it lacks the expressive power to serve the needs of VDA needing to maintain scheme historical continuity. We demonstrate preliminarily that conceptualizations drawn from empirical work in modeling entities in the bibliographic universe, such as works, texts, and exemplars, can provide the basis for SKOS extension in ways that support more rigorous demands of capturing concept evolution in VDA.
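For readers unfamiliar with the baseline being extended, the current SKOS model captures a concept and its scheme membership but has no native construct for recording how a concept has changed over time. Below is a minimal sketch of that baseline using the standard SKOS vocabulary via rdflib; the namespace and concept identifiers are hypothetical, and the snippet does not show the proposed extension itself.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

# A hypothetical vocabulary namespace used purely for illustration.
EX = Namespace("http://example.org/vocab/")

g = Graph()
g.bind("skos", SKOS)
g.bind("ex", EX)

# The current state of a scheme and one of its concepts, as plain SKOS.
g.add((EX.scheme, RDF.type, SKOS.ConceptScheme))
g.add((EX.c42, RDF.type, SKOS.Concept))
g.add((EX.c42, SKOS.inScheme, EX.scheme))
g.add((EX.c42, SKOS.prefLabel, Literal("Charged system search", lang="en")))
g.add((EX.c42, SKOS.altLabel, Literal("CSS algorithm", lang="en")))

# Note: if the preferred label or scheme membership changes, plain SKOS simply
# replaces the triple; the earlier state of the concept is not retained, which
# is the gap a versioning-oriented extension would need to address.
print(g.serialize(format="turtle"))
```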
Abstract:
An integrated mathematical model for the simulation of offshore wind system performance is presented in this paper. The model considers an offshore variable-speed turbine in deep water, equipped with a permanent magnet synchronous generator and a full-power, multiple-point clamped three-level converter, converting the energy of a variable-frequency source into energy injected into the electric network at constant frequency through an HVDC submarine transmission cable. The mathematical model for the drive train is a concentrated two-mass model which incorporates the dynamics of the wind turbine blades, tower and generator, needed to emulate the effects of the wind and of the floating motion. The control strategy considered is a proportional-integral one. Also, pulse width modulation using space vector modulation supplemented with sliding mode is used to trigger the transistors of the converter. Finally, a case study is presented to assess the system performance.
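A concentrated two-mass drive-train model of the kind referred to above couples the turbine and generator inertias through a shaft stiffness and damping. The sketch below integrates the generic textbook form with a forward Euler step; the parameter values are illustrative placeholders, not those of the paper's model.

```python
# Generic two-mass drive-train parameters (illustrative values only).
J_t = 4.0e6      # turbine-side inertia [kg m^2]
J_g = 4.0e5      # generator-side inertia [kg m^2]
K_s = 1.0e8      # shaft stiffness [N m / rad]
D_s = 1.0e6      # shaft damping [N m s / rad]

def two_mass_step(state, T_aero, T_em, dt):
    """One forward-Euler step of the two-mass model.
    state = (omega_t, omega_g, theta), with theta the shaft twist angle:
      J_t * d(omega_t)/dt = T_aero - K_s*theta - D_s*(omega_t - omega_g)
      J_g * d(omega_g)/dt = K_s*theta + D_s*(omega_t - omega_g) - T_em
      d(theta)/dt         = omega_t - omega_g
    """
    omega_t, omega_g, theta = state
    shaft_torque = K_s * theta + D_s * (omega_t - omega_g)
    omega_t += dt * (T_aero - shaft_torque) / J_t
    omega_g += dt * (shaft_torque - T_em) / J_g
    theta += dt * (omega_t - omega_g)
    return omega_t, omega_g, theta

# Example: spin-up from rest under constant aerodynamic and generator torque.
state = (0.0, 0.0, 0.0)
for _ in range(20000):
    state = two_mass_step(state, T_aero=1.0e6, T_em=9.0e5, dt=1.0e-3)
print(state)   # (turbine speed, generator speed, shaft twist) after 20 s
```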