948 results for: Performance of High Energy Physics detectors


Relevance: 100.00%

Publisher:

Abstract:

The first part of this thesis focuses on the construction of a twelve-phase asynchronous machine for More Electric Aircraft (MEA) applications. The aerospace industry has identified electrification as a way to improve the efficiency, reliability and maintainability of an aircraft. This approach entails a new management and distribution of electrical services on board, making it possible to remove or reduce the hydraulic, mechanical and pneumatic systems inside the aircraft. The second part of this dissertation is dedicated to enhancing the control range of matrix converters (MCs) operating with non-unity input power factor while simultaneously reducing the switching power losses. The analysis leads to a closed-form modulation strategy whose control range, in terms of output voltage and input power factor, is greater than that of the traditional strategies under the same operating conditions, and which also reduces the switching power losses.

Abstract:

In 2013, the IceCube collaboration announced the detection of a diffuse astrophysical neutrino flux with the IceCube neutrino telescope, constructed at the geographic South Pole. However, the origin of these neutrinos is still unknown, as no sources have been identified to this day. Promising neutrino source candidates are blazars, a subclass of active galactic nuclei with radio jets pointing towards the Earth. In this thesis, the neutrino flux from blazars is tested with a maximum-likelihood stacking approach, analyzing the combined emission from uniform groups of objects. The stacking enhances the sensitivity with respect to the still-unsuccessful single-source searches. The analysis utilizes four years of IceCube data, including one year from the completed detector. As all results presented in this work are compatible with background, upper limits on the neutrino flux are given. It is shown that, under certain conditions, some hadronic blazar models can be challenged or even rejected. Moreover, the sensitivity of this analysis, and of any future IceCube point-source search, was enhanced by the development of a new angular reconstruction method based on a detailed simulation of photon propagation in the Antarctic ice. The median resolution for muon tracks induced by high-energy neutrinos is improved for all neutrino energies above IceCube's lower threshold of 0.1 TeV. By reprocessing the detector data and simulation from the year 2010, it is shown that the new method improves IceCube's discovery potential by 20% to 30%, depending on the declination.
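Point-source stacking searches of this kind typically use a likelihood ratio in the number of signal events. The following is a minimal toy sketch of that assumed form, not IceCube's actual analysis code; `S` and `B` are hypothetical per-event signal and background PDF values:

```python
import numpy as np

def stacking_test_statistic(S, B):
    """Toy stacked point-source likelihood-ratio test statistic.

    S: per-event signal PDF values, summed over the stacked sources
    B: per-event background PDF values (all values assumed > 0)
    Returns TS = 2 * log( L(ns_hat) / L(ns=0) ).
    """
    N = len(S)
    ns_grid = np.linspace(0.0, N, 1000)  # scan the signal strength ns
    # log-likelihood: sum over events of log of the signal/background mixture
    logL = np.array([
        np.sum(np.log(ns / N * S + (1.0 - ns / N) * B))
        for ns in ns_grid
    ])
    # TS compares the best-fit ns against the background-only hypothesis
    return max(2.0 * (logL.max() - logL[0]), 0.0)
```

When the signal and background PDFs agree for every event, the likelihood is flat in ns and the test statistic is zero, as expected for a pure-background sky.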

Abstract:

Active Galactic Nuclei (AGN) are luminous, compact sources powered by the accretion of matter onto the supermassive black hole at the centre of a galaxy. A fraction of AGN, called "radio-loud", emit strongly in the radio band thanks to relativistic jets accelerated by the black hole. Misaligned AGN (MAGN) are radio-loud sources whose jet is not aligned with our line of sight (radio galaxies and SSRQs). The large majority of extragalactic sources observed in the gamma-ray band are blazars, whereas, in the TeV band in particular, only 4 MAGN have been observed. The aim of this thesis is to evaluate the impact of the Cherenkov Telescope Array (CTA), the new TeV instrument, on MAGN studies. After studying the properties of the 4 TeV MAGN using MeV-GeV data from the Fermi telescope and TeV data from the literature, we took the MAGN observed by Fermi as TeV candidates. We then simulated 50 hours of CTA observations for each source and computed their significance. Assuming a direct extrapolation of the Fermi spectrum, we predict the discovery of 9 new TeV MAGN with the CTA, all local FR I sources. Applying an exponential cutoff at 100 GeV, the more realistic spectral shape according to the observational data, we predict the discovery of 2-3 new TeV MAGN. As for spectral analysis with the CTA, our studies show that it will be possible to obtain a spectrum for 5 new sources with observing times of the order of 250 hours. In both cases, the best candidates are always local sources (z<0.1) with a flat Fermi spectrum (Gamma<2.2). The best observing strategy to obtain these results does not match the current plans for the CTA, which foresee an unpointed survey: these sources are faint and require long pointed observations to be detected (at least 50 hours for integrated-flux studies and 250 for spectral studies).
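Detection significance for a Cherenkov-telescope observation is conventionally computed from on-source and off-source counts with the Li & Ma formula. The abstract does not state the exact prescription used, so the following is a sketch of the standard approach:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983, Eq. 17) significance for an on/off counting analysis.

    n_on:  counts in the source (on) region, assumed > 0
    n_off: counts in the background (off) region, assumed > 0
    alpha: ratio of on-source to off-source exposure
    """
    n_tot = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * (n_on / n_tot))
    term_off = n_off * math.log((1.0 + alpha) * (n_off / n_tot))
    sig = math.sqrt(2.0 * (term_on + term_off))
    # sign convention: positive for an excess, negative for a deficit
    return math.copysign(sig, n_on - alpha * n_off)
```

When the on-source counts exactly match the scaled background (n_on = alpha * n_off), both logarithmic terms vanish and the significance is zero.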

Abstract:

The objective of our study was to compare the effect of dual-energy subtraction and bone suppression software alone and in combination with computer-aided detection (CAD) on the performance of human observers in lung nodule detection.

Abstract:

Anaerobic digestion of food scraps has the potential to accomplish waste minimization, energy production, and compost or humus production. At Bucknell University, removal of food scraps from the waste stream could reduce municipal solid waste transportation costs and landfill tipping fees, and provide methane and humus for use on campus. To determine the suitability of food waste produced at Bucknell for high-solids anaerobic digestion (HSAD), a year-long characterization study was conducted. Physical and chemical properties, waste biodegradability, and annual production of biodegradable waste were assessed. Bucknell University food and landscape waste was digested at pilot scale for over a year to test performance at low and high loading rates, ease of operation at 20% solids, and benefits of codigestion of food and landscape waste, and to provide digestate for studies to assess the curing needs of HSAD digestate. A laboratory-scale curing study was conducted to assess the curing duration required to reduce microbial activity, phytotoxicity, and odors to acceptable levels for subsequent use of humus.

The characteristics of Bucknell University food and landscape waste were tested approximately weekly for one year to determine chemical oxygen demand (COD), total solids (TS), volatile solids (VS), and biodegradability (from batch digestion studies). Fats, oil, and grease and total Kjeldahl nitrogen were also tested for some food waste samples. Based on the characterization and biodegradability studies, Bucknell University dining hall food waste is a good candidate for HSAD. During batch digestion studies, Bucknell University food waste produced a mean of 288 mL CH4/g COD with a 95% confidence interval of 0.06 mL CH4/g COD. The addition of landscape waste for digestion increased methane production from both food and landscape waste; however, because the landscape waste biodegradability was extremely low, the increase was small. Based on an informal waste audit, Bucknell could collect up to 100 tons of food waste from dining facilities each year.

The pilot-scale high-solids anaerobic digestion study confirmed that digestion of Bucknell University food waste combined with landscape waste at a low organic loading rate (OLR) of 2 g COD/L reactor volume-day is feasible. During low-OLR operation, stable reactor performance was demonstrated through monitoring of biogas production and composition, reactor total and volatile solids, total and soluble chemical oxygen demand, volatile fatty acid content, pH, and bicarbonate alkalinity. Low-OLR HSAD of Bucknell University food waste and landscape waste combined produced 232 L CH4/kg COD and 229 L CH4/kg VS. When the OLR was increased to high loading (15 g COD/L reactor volume-day) to assess maximum loading conditions, reactor performance became unstable due to ammonia accumulation and subsequent inhibition. The methane production per unit COD also decreased (to 211 L CH4/kg COD fed), although methane production per unit VS increased (to 272 L CH4/kg VS fed). The degree of ammonia inhibition was investigated through respirometry, in which reactor digestate was diluted and exposed to varying concentrations of ammonia. Treatments with low ammonia concentrations recovered quickly from ammonia inhibition within the reactor.

The post-digestion curing process was studied at laboratory scale to provide a preliminary assessment of curing duration. Digestate was mixed with woodchips and incubated in an insulated container at 35 °C to simulate full-scale curing self-heating conditions. The degree of digestate stabilization was determined through oxygen uptake rates, percent O2, temperature, volatile solids, and Solvita Maturity Index. Phytotoxicity was determined through observation of volatile fatty acid and ammonia concentrations. Stabilization of organics and elimination of phytotoxic compounds (after 10–15 days of curing) preceded significant reductions of volatile sulfur compounds (hydrogen sulfide, methanethiol, and dimethyl sulfide) after 15–20 days of curing.

Bucknell University food waste has high biodegradability and is suitable for high-solids anaerobic digestion; however, it has a low C:N ratio, which can result in ammonia accumulation under some operating conditions. The low biodegradability of Bucknell University landscape waste limits the amount of bioavailable carbon that it can contribute, making it unsuitable for use as a cosubstrate to increase the C:N ratio of food waste. Additional research is indicated to determine other cosubstrates with higher biodegradabilities that may allow successful HSAD of Bucknell University food waste at high OLRs; candidates include office paper, field residues, and grease trap waste. A brief curing period of less than 3 weeks was sufficient to produce viable humus from digestate produced by low-OLR HSAD of food and landscape waste.
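For context, the reported specific yields can be compared against the commonly cited theoretical maximum of about 350 L CH4 per kg COD destroyed at standard conditions. That benchmark is a textbook value, not a figure from the study, and the study reports yields per kg COD fed, so the comparison is only indicative:

```python
# Theoretical methane potential of COD at 0 degrees C and 1 atm
# (standard textbook value; an assumption, not taken from the study)
THEORETICAL_L_CH4_PER_KG_COD = 350.0

def fraction_of_theoretical(observed_l_ch4_per_kg_cod):
    """Fraction of the theoretical methane potential achieved."""
    return observed_l_ch4_per_kg_cod / THEORETICAL_L_CH4_PER_KG_COD

# Yields reported in the text, per kg COD *fed*:
low_olr = fraction_of_theoretical(232.0)   # low-loading codigestion, ~66%
high_olr = fraction_of_theoretical(211.0)  # high-loading operation, ~60%
```

The gap between fed and destroyed COD means the true conversion efficiency of the degraded fraction would be higher than these ratios suggest.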

Abstract:

Project-based education and portfolio assessments are at the forefront of educational research. This research follows the implementation of a project-based unit in a high school physics class. Students played the role of an engineering firm that designed, built and tested file folder bridges. The purpose was to determine if project-based learning could improve student attitude toward science and related careers like engineering. Teams of students presented their work in a portfolio for a final assessment of the process of designing, building and testing their bridges.

Abstract:

A building energy meter network, based on a per-appliance monitoring system, will be an important part of the Advanced Metering Infrastructure. Two key issues exist in designing such networks. One is the network structure to be used. The other is the implementation of that structure on a large number of small, low-power devices, and the maintenance of high-quality communication when the devices are electrically connected to the high-voltage AC line. Recent advances in low-power wireless communication make it the right candidate for house and building energy networks. Among the available wireless solutions, the low-speed but highly reliable 802.15.4 radio was chosen for this design. While many network-layer solutions have been built on top of 802.15.4, an IPv6-based approach is used here: 6LoWPAN is the protocol that adapts IP to low-power personal area network radios. To extend the network across a building area, a specific network-layer routing mechanism, RPL, is included in the design. The fundamental unit of the building energy monitoring system is a smart wall plug, consisting of an electricity energy meter, an RF communication module and a low-power CPU. The real challenge in designing such a device is its network firmware. In this design, IPv6 is implemented through the Contiki operating system; custom hardware drivers and the meter application program have been developed on top of Contiki OS. Experiments were performed to verify the networking capability of this system.
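Because 802.15.4 frames are small (127 bytes at the physical layer), meter reports over 6LoWPAN must be compact. A hypothetical payload layout for a smart-plug report over UDP might look like the following sketch; the message format, field names and scalings are illustrative assumptions, not taken from the thesis:

```python
import struct

# Hypothetical compact report: device id (u16), sequence number (u16),
# RMS voltage in 0.1 V units (u16), RMS current in mA (u16),
# cumulative energy in Wh (u32) -- 12 bytes, network byte order.
REPORT_FMT = "!HHHHI"

def pack_report(dev_id, seq, volts, amps, energy_wh):
    """Serialize one meter reading into a fixed-size binary payload."""
    return struct.pack(REPORT_FMT, dev_id, seq,
                       int(round(volts * 10)), int(round(amps * 1000)),
                       energy_wh)

def unpack_report(payload):
    """Recover the reading, converting scaled integers back to SI units."""
    dev_id, seq, dv, ma, wh = struct.unpack(REPORT_FMT, payload)
    return dev_id, seq, dv / 10.0, ma / 1000.0, wh
```

Fixed-point scaling avoids floating-point on the constrained CPU while keeping the whole report well inside a single unfragmented 6LoWPAN datagram.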

Abstract:

Steers were sorted into four groups based on hip height and fat cover at the start of the finishing period. Each group of sorted steers was fed a diet containing 0.59 or 0.64 Mcal NEg per pound of diet. Steers with less initial fat cover (.08 in.) gained slightly faster, consumed less feed, and therefore tended to be more efficient than steers with greater finish (.16 in.). Steers fed the lower-energy diet consumed more feed, gained similarly, and were less efficient than steers fed the higher-energy diet. The NRC computer model to evaluate beef cattle diets underpredicted performance of cattle in this experiment, but accurately predicted the differences in gain and feed efficiency observed between the leaner and fatter steers and between the two diets. In this study, the shorter steers (49.4 vs 52.2 in. initial height at the hip) gained faster with slightly greater feed intake and the same feed conversion.

Abstract:

Rolled high-oil corn in comparison with rolled isogenetic control corn was fed to finishing steers as 33%, 66% and 100% of the corn grain in their diet in a 134-day feeding trial. During the first 75 days of the trial, steers fed high-oil corn had numerically lower rates of gain and tended to have poorer feed conversions compared with the control corn. At the end of the trial, there were no statistically significant differences in performance or carcass measurements of the steers fed the different amounts of high-oil or control corn. The results of this study indicated that the steers did not respond to the higher energy content of high-oil corn.

Abstract:

OBJECTIVE: We sought to evaluate the performance of the human papillomavirus high-risk DNA test in patients 30 years and older. MATERIALS AND METHODS: Screening (n=835) and diagnosis (n=518) groups were defined based on prior Papanicolaou smear results as part of a clinical trial for cervical cancer detection. We compared the Hybrid Capture II (HCII) test result with the worst histologic report. We used cervical intraepithelial neoplasia (CIN) 2/3 or worse as the reference standard for disease. We calculated sensitivities, specificities, positive and negative likelihood ratios (LR+ and LR-), receiver operating characteristic (ROC) curves, and areas under the ROC curves for the HCII test. We also considered alternative strategies, including Papanicolaou smear, a combination of Papanicolaou smear and the HCII test, a sequence of Papanicolaou smear followed by the HCII test, and a sequence of the HCII test followed by Papanicolaou smear. RESULTS: For the screening group, the sensitivity was 0.69 and the specificity was 0.93; the area under the ROC curve was 0.81. The LR+ and LR- were 10.24 and 0.34, respectively. For the diagnosis group, the sensitivity was 0.88 and the specificity was 0.78; the area under the ROC curve was 0.83. The LR+ and LR- were 4.06 and 0.14, respectively. Sequential testing showed little or no improvement over the combination testing. CONCLUSIONS: The HCII test in the screening group had a greater LR+ for the detection of CIN 2/3 or worse. HCII testing may be an additional screening tool for cervical cancer in women 30 years and older.
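The likelihood ratios follow directly from sensitivity and specificity by their standard definitions. A small sketch; with the rounded values reported above it approximately reproduces the paper's LR figures, which were presumably computed from unrounded counts:

```python
def likelihood_ratios(sensitivity, specificity):
    """Standard diagnostic likelihood ratios.

    LR+ = sensitivity / (1 - specificity): odds multiplier for a positive test
    LR- = (1 - sensitivity) / specificity: odds multiplier for a negative test
    """
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Diagnosis group from the abstract: sens 0.88, spec 0.78 -> LR+ ~ 4.0
lr_pos, lr_neg = likelihood_ratios(0.88, 0.78)
```

An LR+ of 10, as in the screening group, means a positive HCII result raises the pre-test odds of CIN 2/3 or worse by a factor of about ten.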

Abstract:

The main goal of the AEgIS experiment at CERN is to test the weak equivalence principle for antimatter. AEgIS will measure the free fall of an antihydrogen beam traversing a moiré deflectometer. The goal is to determine the gravitational acceleration with an initial relative accuracy of 1% by using an emulsion detector combined with a silicon μ-strip detector to measure the time of flight. Nuclear emulsions can measure the annihilation vertex of antihydrogen atoms with a precision of ~ 1–2 μm r.m.s. We present here results for emulsion detectors operated in vacuum using low energy antiprotons from the CERN antiproton decelerator. We compare with Monte Carlo simulations, and discuss the impact on the AEgIS project.

Abstract:

Impairment of cognitive performance during and after high-altitude climbing has been described in numerous studies and has mostly been attributed to cerebral hypoxia and resulting functional and structural cerebral alterations. To investigate the hypothesis that high-altitude climbing leads to cognitive impairment, we used neuropsychological tests and measurements of eye movement (EM) performance during different stimulus conditions. The study was conducted in 32 mountaineers participating in an expedition to Muztagh Ata (7,546 m). Neuropsychological tests comprised figural fluency, line bisection, letter and number cancellation, and a modified pegboard task. Saccadic performance was evaluated under three stimulus conditions with varying degrees of cortical involvement: visually guided pro- and anti-saccades, and visuo-visual interaction. Typical saccade parameters (latency, mean sequence, post-saccadic stability, and error rate) were computed off-line. Measurements were taken at a baseline level of 440 m and at altitudes of 4,497, 5,533, and 6,265 m, and again at 440 m. All subjects reached 5,533 m, and 28 reached 6,265 m. The neuropsychological test results did not reveal any cognitive impairment. Complete eye movement recordings for all stimulus conditions were obtained in 24 subjects at baseline and at least two altitudes, and in 10 subjects at baseline and all altitudes. Measurements of saccade performance showed no dependence on any altitude-related parameter and were well within normal limits. Our data indicate that acclimatized climbers do not seem to suffer from significant cognitive deficits during or after climbs to altitudes above 7,500 m. We demonstrated that investigation of EMs is feasible during high-altitude expeditions.

Abstract:

A search has been performed for the experimental signature of an isolated photon with high transverse momentum, at least one jet identified as originating from a bottom quark, and high missing transverse momentum. Such a final state may originate from supersymmetric models with gauge-mediated supersymmetry breaking in events in which one of a pair of higgsino-like neutralinos decays into a photon and a gravitino while the other decays into a Higgs boson and a gravitino. The search is performed using the full dataset of 7 TeV proton-proton collisions recorded with the ATLAS detector at the LHC in 2011, corresponding to an integrated luminosity of 4.7 fb−1. A total of 7 candidate events are observed while 7.5 +/- 2.2 events are expected from the Standard Model background. The results of the search are interpreted in the context of general gauge mediation to exclude certain regions of a benchmark plane for higgsino-like neutralino production.

Abstract:

This paper presents a measurement of the cross-section for high transverse momentum W and Z bosons produced in pp collisions and decaying to all-hadronic final states. The data used in the analysis were recorded by the ATLAS detector at the CERN Large Hadron Collider at a centre-of-mass energy of √s = 7 TeV and correspond to an integrated luminosity of 4.6 fb−1. The measurement is performed by reconstructing the boosted W or Z bosons in single jets. The reconstructed jet mass is used to identify the W and Z bosons, and a jet substructure method based on energy cluster information in the jet centre-of-mass frame is used to suppress the large multi-jet background. The cross-section for events with a hadronically decaying W or Z boson, with transverse momentum pT > 320 GeV and pseudorapidity |η| < 1.9, is measured to be σ(W+Z) = 8.5 ± 1.7 pb and is compared to next-to-leading-order calculations. The selected events are further used to study jet grooming techniques.