58 results for CMS detectors


Relevance: 10.00%

Abstract:

Photoacoustic/thermoacoustic tomography is an emerging hybrid imaging modality combining optical/microwave imaging with ultrasound imaging. Here, the k-Wave MATLAB toolbox was used to simulate various configurations of excitation pulse shape, width, transducer type, and target object size to study their effect on the photoacoustic/thermoacoustic signals. A numerical blood-vessel phantom was also used to demonstrate the effect of various excitation pulse waveforms and pulse widths on the reconstructed images. Reconstructed images were blurred because the pressure waves are broadened both by the excitation pulse width and by the limited transducer bandwidth, and the blurring increases with increasing pulse width. A deconvolution approach with Tikhonov regularization is presented here to correct the photoacoustic/thermoacoustic signals, which improves the reconstructed images by reducing the blurring. After applying the deconvolution technique, the reconstructed images remain unaffected by changes in pulse width or pulse shape, as well as by the limited bandwidth of the ultrasound detectors. (C) 2013 Optical Society of America
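
As a rough illustration of the kind of Tikhonov-regularized deconvolution described above, the sketch below applies a regularized inverse filter in the frequency domain, assuming the recorded A-line is the (circular) convolution of the true pressure signal with a known system kernel (excitation pulse combined with the transducer impulse response). All names and numbers are illustrative, not taken from the paper.

```python
import numpy as np

def tikhonov_deconvolve(measured, kernel, alpha=1e-3):
    """Tikhonov-regularized inverse filter in the frequency domain.

    measured : 1-D recorded photoacoustic A-line
    kernel   : 1-D system kernel (excitation pulse * transducer response),
               sampled at the same rate as `measured`
    alpha    : regularization weight (larger -> smoother estimate)
    """
    n = len(measured)
    H = np.fft.rfft(kernel, n)                     # system transfer function
    Y = np.fft.rfft(measured, n)                   # spectrum of the measurement
    X = np.conj(H) * Y / (np.abs(H) ** 2 + alpha)  # regularized inversion
    return np.fft.irfft(X, n)

# Toy example: a sharp pressure spike blurred by a ~100 ns Gaussian pulse.
fs = 100e6                                         # 100 MHz sampling rate
t = np.arange(2048) / fs
true_signal = np.zeros_like(t); true_signal[500] = 1.0
pulse = np.exp(-((t - t[50]) / 100e-9) ** 2)
blurred = np.fft.irfft(np.fft.rfft(true_signal) * np.fft.rfft(pulse), len(t))
recovered = tikhonov_deconvolve(blurred, pulse, alpha=1e-3)
```

Increasing `alpha` suppresses noise amplification at frequencies where the transducer response is weak, at the cost of some residual blurring; the approach in the abstract is analogous, correcting the measured signals before image reconstruction.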

Relevance: 10.00%

Abstract:

Light neutralino dark matter can be achieved in the Minimal Supersymmetric Standard Model if staus are rather light, with masses around 100 GeV. We perform a detailed analysis of the relevant supersymmetric parameter space, including also the possibility of light selectrons and smuons, and of light higgsino- or wino-like charginos. In addition to the latest limits from direct and indirect detection of dark matter, ATLAS and CMS constraints on electroweak-inos and on sleptons are taken into account using a "simplified models" framework. Measurements of the properties of the Higgs boson at 125 GeV, which constrain, among other things, the invisible decay of the Higgs boson into a pair of neutralinos, are also implemented in the analysis. We show that viable neutralino dark matter can be achieved for masses as low as 15 GeV; in this case, light charginos close to the LEP bound are required in addition to light right-chiral staus. Significant deviations are observed in the couplings of the 125 GeV Higgs boson, which constitute a promising way to probe the light neutralino dark matter scenario in the next run of the LHC. (C) 2013 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Spatial resolution in photoacoustic and thermoacoustic tomography is ultrasound transducer (detector) bandwidth limited. For a circular scanning geometry the axial (radial) resolution is not affected by the detector aperture, but the tangential (lateral) resolution is highly dependent on the aperture size, and it is also spatially varying (depending on the location relative to the scanning center). Several approaches have been reported to counter this problem by physically attaching a negative acoustic lens in front of the nonfocused transducer or by using virtual point detectors. Here, we have implemented a modified delay-and-sum reconstruction method, which takes into account the large aperture of the detector, leading to more than fivefold improvement in the tangential resolution in photoacoustic (and thermoacoustic) tomography. Three different types of numerical phantoms were used to validate our reconstruction method. It is also shown that we were able to preserve the shape of the reconstructed objects with the modified algorithm. (C) 2014 Optical Society of America
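
The baseline (unmodified) delay-and-sum back-projection for a circular scanning geometry can be sketched as below; the modified algorithm of the abstract additionally accounts for the finite detector aperture, for example by treating each large detector as a set of sub-elements and summing delays over them. Array shapes and names here are assumptions for illustration.

```python
import numpy as np

def delay_and_sum(signals, det_xy, grid_x, grid_y, c, fs):
    """Plain delay-and-sum back-projection onto a 2-D image grid.

    signals : (num_detectors, num_samples) recorded pressure signals
    det_xy  : (num_detectors, 2) detector centre positions [m]
    grid_x, grid_y : 1-D pixel coordinate arrays [m]
    c, fs   : speed of sound [m/s] and sampling rate [Hz]
    """
    X, Y = np.meshgrid(grid_x, grid_y)
    img = np.zeros_like(X)
    for d, (dx, dy) in enumerate(det_xy):
        dist = np.hypot(X - dx, Y - dy)            # pixel-to-detector distance
        idx = np.round(dist / c * fs).astype(int)  # time-of-flight sample index
        idx = np.clip(idx, 0, signals.shape[1] - 1)
        img += signals[d][idx]                     # back-project the delayed sample
    return img
```

In an aperture-aware variant, the inner step would additionally loop over sub-elements of each physical detector and average their delayed samples, which is what restores tangential resolution away from the scanning centre.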

Relevance: 10.00%

Abstract:

The Large Hadron Collider (LHC) has completed its run at 8 TeV, with the ATLAS and CMS experiments having collected about 25 fb⁻¹ of data each. The discovery of a light Higgs boson, coupled with the lack of evidence for supersymmetry at the LHC so far, has motivated studies of supersymmetry in the context of naturalness, with the principal focus being the third-generation squarks. In this work, we analyze the prospects of the flavor-violating decay mode t̃₁ → c χ̃⁰₁ at 8 and 13 TeV center-of-mass energy at the LHC. This channel is also relevant in the dark matter context for the stop-coannihilation scenario, where the relic density depends on the mass difference between the lighter top squark (t̃₁) and the lightest neutralino (χ̃⁰₁). The channel is extremely challenging to probe, especially when this mass difference is small. Using certain kinematical properties of the signal events, we find that the level of backgrounds can be reduced substantially. The prospects for this channel at 8 TeV are limited by the low top-squark production cross section and the limited luminosity, but at the 13 TeV LHC with 100 fb⁻¹ of luminosity it is possible to probe top squarks with masses up to ~450 GeV. We also discuss how the sensitivity could be significantly improved by tagging charm jets.
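
A toy cut-and-count estimate illustrates the reasoning behind the luminosity statement: the expected significance scales with the signal cross section, the selection efficiency after the kinematic cuts, and the integrated luminosity. The numbers below are placeholders, not values from the analysis.

```python
import math

def expected_significance(sigma_sig_fb, sigma_bkg_fb, lumi_fb, eff_sig, eff_bkg):
    """Toy cut-and-count sensitivity: S / sqrt(S + B) after selection cuts."""
    S = sigma_sig_fb * lumi_fb * eff_sig     # expected signal events
    B = sigma_bkg_fb * lumi_fb * eff_bkg     # expected background events
    return S / math.sqrt(S + B)

# Placeholder inputs only -- illustrative of the scaling, not the paper's numbers.
print(expected_significance(sigma_sig_fb=50.0, sigma_bkg_fb=2.0e4,
                            lumi_fb=100.0, eff_sig=0.05, eff_bkg=1.0e-4))
```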

Relevance: 10.00%

Abstract:

We develop iterative diffraction tomography algorithms, which are similar to the distorted Born algorithms, for inverting scattered intensity data. Within the Born approximation, the unknown scattered field is expressed as a multiplicative perturbation to the incident field. With this, the forward equation becomes stable, which helps us compute nearly oscillation-free solutions that have immediate bearing on the accuracy of the Jacobian computed for use in a deterministic Gauss-Newton (GN) reconstruction. However, since the data are inherently noisy and the sensitivity of measurement to refractive index away from the detectors is poor, we report a derivative-free evolutionary stochastic scheme, providing strictly additive updates in order to bridge the measurement-prediction misfit, to arrive at the refractive index distribution from intensity transport data. The superiority of the stochastic algorithm over the GN scheme for similar settings is demonstrated by the reconstruction of the refractive index profile from simulated and experimentally acquired intensity data. (C) 2014 Optical Society of America
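
For context, a single damped Gauss-Newton update of the kind used in the deterministic GN reconstruction mentioned above can be written as follows; the derivative-free stochastic scheme preferred in the abstract is not shown. The forward model and Jacobian are abstract callables, i.e. assumptions for illustration.

```python
import numpy as np

def gauss_newton_step(x, forward, jacobian, y_meas, damping=1e-3):
    """One damped Gauss-Newton update for a nonlinear inverse problem y = F(x).

    x        : current parameter estimate (e.g. refractive-index values)
    forward  : callable F(x) -> predicted data vector
    jacobian : callable J(x) -> (num_data, num_params) sensitivity matrix
    """
    r = y_meas - forward(x)                 # measurement-prediction misfit
    J = jacobian(x)
    # Solve the damped normal equations (J^T J + damping * I) dx = J^T r
    A = J.T @ J + damping * np.eye(J.shape[1])
    dx = np.linalg.solve(A, J.T @ r)
    return x + dx
```

The damping term plays the same stabilizing role as the regularization discussed in the abstract: it keeps the update well-conditioned when the sensitivity to the refractive index far from the detectors is poor.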

Relevance: 10.00%

Abstract:

The ever-increasing number of diseases worldwide requires comprehensive, efficient, and cost-effective modes of treatment. Among various strategies, nanomaterials fulfill most of these criteria. The unique physicochemical properties of nanoparticles have made them a premier choice as drugs or drug delivery systems for treatment, and as bio-detectors for disease prognosis. The main challenge, however, is to properly account for the physical properties of these nanomaterials while developing them as tools for therapeutics and/or diagnostics. In this review, we focus mainly on the characteristics of nanoparticles needed to develop an effective and sensitive system for clinical purposes. The review presents an overview of the important properties of nanoparticles along their journey from the route of administration to clearance from the human body after accomplishing their targeted function. We have chosen cancer as our model disease to illustrate the potential of nano-systems for therapeutics and diagnostics in relation to several organs (intestine, lung, brain, etc.). Furthermore, we discuss their biodegradability and their probability of accumulation, which can cause unfavorable side effects in healthy human subjects.

Relevance: 10.00%

Abstract:

The detection efficiency of a gaseous photomultiplier depends on the photocathode quantum efficiency and on the extraction efficiency of photoelectrons into the gas. In this paper we study the performance of a UV photon detector operated with P10 gas, in which the extraction efficiency can reach values close to those of vacuum-operated devices. Simulations have been carried out to compare the fraction of photoelectrons backscattered in P10 gas and in the widely used neon-based gas mixture. The performance study has been carried out using a single-stage thick gas electron multiplier (THGEM). The electron pulses and electron spectra are recorded under various operating conditions. Secondary effects prevailing in UV photon detectors, such as photon feedback, are discussed, and their effect on the electron spectrum under different operating conditions is analyzed. (C) 2014 Chinese Laser Press

Relevance: 10.00%

Abstract:

Morphological changes in cells associated with disease states are often assessed using clinical microscopy. However, changes in the chemical composition of cells can also be used to detect disease conditions. Optical absorption measurements carried out on single cells using inexpensive sources and detectors can help assess the chemical composition of cells and thereby enable detection of diseases. In this article, we present a novel technique capable of simultaneously detecting changes in the morphology and chemical composition of cells. The technique enables optical absorbance-based methods to be characterized against microscopy for the detection of disease states. Using this technique, we have achieved a throughput of about 1000 cells per second. We demonstrate the proof of principle by detecting malaria in a given blood sample. The technique is capable of detecting very low levels of parasitemia within time scales comparable to antigen-based rapid diagnostic tests.

Relevance: 10.00%

Abstract:

Subtle concurrency errors in multithreaded libraries that arise because of incorrect or inadequate synchronization are often difficult to pinpoint precisely using only static techniques. On the other hand, the effectiveness of dynamic race detectors is critically dependent on multithreaded test suites whose execution can be used to identify and trigger races. Usually, such multithreaded tests need to invoke a specific combination of methods with objects involved in the invocations being shared appropriately to expose a race. Without a priori knowledge of the race, construction of such tests can be challenging. In this paper, we present a lightweight and scalable technique for synthesizing precisely these kinds of tests. Given a multithreaded library and a sequential test suite, we describe a fully automated analysis that examines sequential execution traces, and produces as its output a concurrent client program that drives shared objects via library method calls to states conducive for triggering a race. Experimental results on a variety of well-tested Java libraries yield 101 synthesized multithreaded tests in less than four minutes. Analyzing the execution of these tests using an off-the-shelf race detector reveals 187 harmful races, including several previously unreported ones.
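
The paper targets Java libraries and uses an off-the-shelf dynamic race detector; the sketch below only illustrates, in Python, the general shape of such a synthesized concurrent client: two threads driving the same shared object through method calls observed in sequential runs. The `Counter` class is hypothetical.

```python
import threading

class Counter:
    """Hypothetical library class with an unsynchronized read-modify-write."""
    def __init__(self):
        self.value = 0

    def increment(self):
        tmp = self.value      # read
        tmp += 1              # modify
        self.value = tmp      # write -- racy without a lock

def synthesized_concurrent_test(iterations=100_000):
    """Two threads invoke the same library method on a shared object,
    steering it toward a state where a race detector can flag the conflict."""
    shared = Counter()
    worker = lambda: [shared.increment() for _ in range(iterations)]
    t1, t2 = threading.Thread(target=worker), threading.Thread(target=worker)
    t1.start(); t2.start()
    t1.join(); t2.join()
    # Under a racy interleaving the final count can fall below 2 * iterations.
    print("expected", 2 * iterations, "observed", shared.value)

if __name__ == "__main__":
    synthesized_concurrent_test()
```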

Relevance: 10.00%

Abstract:

The ATLAS and CMS collaborations at the LHC have performed analyses on the existing data sets studying the case of one vector-like fermion or multiplet coupling to the standard model Yukawa sector. In the near future, with more data available, these collaborations will start to investigate more realistic cases. The presence of more than one extra vector-like multiplet is indeed a common situation in many extensions of the standard model. The interplay of these vector-like multiplets with precision electroweak bounds, flavour constraints, and collider phenomenology is an important question in view of establishing bounds on, or discovering, physics beyond the standard model. In this work we study the phenomenological consequences of the presence of two vector-like multiplets. We analyse the constraints on such scenarios from tree-level data and oblique corrections for the case of mixing with each of the SM generations. In the present work we limit ourselves to scenarios with two top-like partners and no mixing in the down sector.

Relevance: 10.00%

Abstract:

The irradiation of selected regions in a polymer gel dosimeter results in an increase in optical density and refractive index (RI) in those regions. An optical tomography-based dosimeter depends on the ray path through the dosimeter to estimate and reconstruct the dose distribution. The refraction of light passing through a dose region results in artefacts in the reconstructed images; these refraction errors depend on the scanning geometry and collection optics. We have developed a fully 3D image reconstruction algorithm, algebraic reconstruction technique with refraction correction (ART-rc), that corrects for the refractive-index mismatches present in a gel dosimeter scanner not only at the boundary, but also for any ray refraction due to multiple dose regions inside the dosimeter. In this study, simulation and experimental studies have been carried out to reconstruct a 3D dose volume from 2D CCD measurements taken at various views. The study also examines the effectiveness of different refractive-index-matching media surrounding the gel dosimeter. Since the optical density of a dosimeter is assumed to be low, filtered backprojection is routinely used for reconstruction; here we carry out reconstructions using the conventional algebraic reconstruction technique (ART) and the refraction-corrected ART-rc algorithm, and, for comparison, reconstructions based on the FDK algorithm for cone-beam tomography. With line scanners and point detectors, reconstructions are obtained plane by plane; rays passing through a dose region with an RI mismatch do not reach the detector in the same plane, depending on the angle of incidence and the RI. In a fully 3D scanning setup using 2D array detectors, light rays that undergo refraction are still collected and hence can be accounted for in the reconstruction algorithm. It is found that, for the central region of the dosimeter, the usable radius obtained with the ART-rc algorithm and water as the RI-matched medium is 71.8%, an increase of 6.4% compared with the conventional ART algorithm. Smaller-diameter dosimeters are scanned in dry air using a wide-angle lens that collects refracted light. Images reconstructed using cone-beam geometry deteriorate in some planes because those regions are not scanned. Refraction correction is therefore important and must be taken into consideration to achieve quantitatively accurate dose reconstructions; refraction modelling is crucial in array-based scanners, since it is not possible to identify refracted rays in sinogram space.
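
A minimal sketch of the ART (Kaczmarz) update that underlies both ART and ART-rc is given below; in ART-rc the rows of the system matrix would be built from refraction-corrected ray paths rather than straight raylines. The matrix and vector names are assumptions for illustration.

```python
import numpy as np

def art_sweep(x, A, b, relaxation=0.2):
    """One sweep of the algebraic reconstruction technique (Kaczmarz updates).

    x : flattened image estimate (num_pixels,)
    A : system matrix (num_rays, num_pixels); row i holds the path length of
        ray i through each voxel.  ART-rc would trace refraction-corrected
        rays when filling these rows.
    b : measured projection data (num_rays,)
    """
    for i in range(A.shape[0]):
        a_i = A[i]
        norm = a_i @ a_i
        if norm == 0.0:
            continue
        # Project the current estimate onto the hyperplane a_i . x = b_i
        x = x + relaxation * (b[i] - a_i @ x) / norm * a_i
    return x
```

The relaxation factor trades convergence speed against noise amplification; sweeps are repeated until the residual between measured and predicted projections stops decreasing.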

Relevance: 10.00%

Abstract:

Two-dimensional magnetic recording (TDMR) is a promising technology for next-generation magnetic storage systems, built on a systems-level framework with sophisticated signal processing at its core. The TDMR channel suffers from severe jitter noise, along with electronic noise, that needs to be mitigated during signal detection and recovery. Recently, we developed noise-prediction-based techniques coupled with advanced signal detectors for these systems. However, it is important to understand the role of harmful patterns that can be avoided during the encoding process. In this paper, we use a Voronoi-based media model to study harmful patterns in multitrack shingled recording systems. Through realistic quasi-micromagnetic simulation studies, we identify 2-D data patterns that contribute to high media noise. We examine the generic Voronoi model and present our analysis of multitrack detection with constrained coded data. We show that the 2-D constraints imposed on input patterns result in an order-of-magnitude improvement in the bit-error rate of TDMR systems. The use of constrained codes can also reduce the complexity of 2-D intersymbol interference (ISI) signal detection, since a smaller 2-D ISI span can be accommodated at the cost of a nominal code-rate loss. However, the system must be designed carefully so that the rate loss incurred by a 2-D constraint does not offset the detector performance gain from the more distinguishable readback signals.
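
As a toy illustration of constrained coding in this setting, the check below scans a 2-D data page for a simple "no isolated bit" pattern. The actual harmful patterns in the paper are identified from the Voronoi media simulations, so the constraint used here is only an illustrative stand-in.

```python
import numpy as np

def has_isolated_bit(page):
    """Return True if some interior bit differs from all four of its
    horizontal/vertical neighbours -- a stand-in for a 'harmful' 2-D pattern."""
    rows, cols = page.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            centre = page[r, c]
            neighbours = (page[r - 1, c], page[r + 1, c],
                          page[r, c - 1], page[r, c + 1])
            if all(n != centre for n in neighbours):
                return True
    return False

# A constrained encoder would only emit pages for which has_isolated_bit(...)
# is False, trading a small code-rate loss for lower media noise.
page = np.random.randint(0, 2, size=(8, 32))
print("violates constraint:", has_isolated_bit(page))
```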

Relevance: 10.00%

Abstract:

If the recent indications of a possible state Φ with mass ~750 GeV decaying into two photons, reported by ATLAS and CMS in LHC collisions at 13 TeV, were to be confirmed, the prospects for future collider physics at the LHC and beyond would be affected radically, as we explore in this paper. Even minimal scenarios for the Φ resonance and its γγ decays require additional particles with masses … We consider here two benchmark scenarios that exemplify the range of possibilities: one in which Φ is a singlet scalar or pseudoscalar boson whose production and γγ decays are due to loops of coloured and charged fermions, and another in which Φ is a superposition of (nearly) degenerate CP-even and CP-odd Higgs bosons in a (possibly supersymmetric) two-Higgs-doublet model, also with additional fermions to account for the γγ decay rate. We explore the implications of these benchmark scenarios for the production of Φ and its new partners at colliders in future runs of the LHC and beyond, at higher-energy pp colliders and at e⁺e⁻ and γγ colliders, with emphasis on the bosonic partners expected in the doublet scenario and the fermionic partners expected in both scenarios.