369 results for PELLETRON ACCELERATORS


Relevance:

20.00%

Publisher:

Abstract:

After a decade evolving in the High Performance Computing arena, GPU-equipped supercomputers have conquered the Top500 and Green500 lists, providing unprecedented levels of computational power and memory bandwidth. This year, major vendors have introduced new accelerators based on 3D memory, such as Intel's Xeon Phi Knights Landing and Nvidia's Pascal architecture. This paper reviews the hardware features of these new HPC accelerators and assesses their potential performance for scientific applications, with an emphasis on the Hybrid Memory Cube (HMC) and High Bandwidth Memory (HBM) technologies adopted by commercial products according to already-announced roadmaps.
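As a rough illustration of how the sustained memory bandwidth promised by such 3D-memory parts can be probed from application code, the following minimal Python/NumPy sketch times a STREAM-style triad. It is not tied to any vendor API; the array size and repeat count are arbitrary choices, and NumPy's temporaries mean the measured figure is only a lower bound on the real traffic.

```python
import time
import numpy as np

def stream_triad_bandwidth(n=50_000_000, a=3.0, repeats=5):
    """Estimate sustained memory bandwidth (GB/s) with a STREAM-style triad.

    The triad c = b + a * x reads two arrays and writes one, so at least
    3 * 8 bytes per element move for float64 data; NumPy may allocate a
    temporary for a * x, so treat the result as a lower bound.
    """
    x = np.random.rand(n)
    b = np.random.rand(n)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        c = b + a * x                  # one read-read-write sweep
        best = min(best, time.perf_counter() - t0)
    del c
    bytes_moved = 3 * 8 * n            # two loads + one store per element
    return bytes_moved / best / 1e9

if __name__ == "__main__":
    print(f"sustained triad bandwidth: {stream_triad_bandwidth():.1f} GB/s")
```

On HMC- or HBM-equipped hardware, the same sweep written in a lower-level language is a common way to check how close an application can get to the advertised peak.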

Relevance:

10.00%

Publisher:

Abstract:

The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone at the calcaneus was first described in 1984. The assessment of osteoporosis by BUA has recently been recognized by Universities UK, within its EurekaUK book, as one of the "100 discoveries and developments in UK Universities that have changed the world" over the past 50 years, covering the whole academic spectrum from the arts and humanities to science and technology. Indeed, the BUA technique has been clinically validated and is utilized worldwide, with at least seven commercial systems providing calcaneal BUA measurement. However, a fundamental understanding of the dependence of BUA upon the material and structural properties of cancellous bone is still lacking. This review aims to provide a science- and technology-orientated perspective on the application of BUA to osteoporosis.
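For readers unfamiliar with how BUA is computed, the following Python sketch shows the standard recipe: the insertion loss between a reference pulse (e.g. through water) and a pulse transmitted through the calcaneus is taken in the frequency domain, and BUA is the slope of a linear fit to that loss over the roughly linear band. The 0.2–0.6 MHz band and all names are illustrative choices, not taken from the review.

```python
import numpy as np

def broadband_ultrasonic_attenuation(ref_signal, sample_signal, fs,
                                     band=(0.2e6, 0.6e6)):
    """Illustrative BUA estimate (dB/MHz) from through-transmission pulses.

    ref_signal    : pulse received through a reference medium (e.g. water)
    sample_signal : pulse received through the calcaneus (same length)
    fs            : sampling frequency in Hz
    band          : band over which attenuation is approximately linear
    """
    freqs = np.fft.rfftfreq(len(ref_signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    a_ref = np.abs(np.fft.rfft(ref_signal))[mask]
    a_sam = np.abs(np.fft.rfft(sample_signal))[mask]

    # Insertion loss in dB as a function of frequency
    attenuation_db = 20.0 * np.log10(a_ref / a_sam)

    # BUA is the slope of a linear fit within the analysis band
    slope, _ = np.polyfit(freqs[mask] / 1e6, attenuation_db, 1)
    return slope  # dB/MHz
```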

Relevance:

10.00%

Publisher:

Abstract:

Preterm infants have an increased risk of low bone mass and subsequent fracture due to limited bone mass accretion in utero and a greater need for bone nutrients. The diagnosis of osteopenia of prematurity remains difficult, as there is no screening test which is both sensitive and specific.

Relevance:

10.00%

Publisher:

Abstract:

Vertebroplasty involves injecting cement into a fractured vertebra to provide stabilisation. There is clinical evidence to suggest, however, that vertebroplasty may be associated with a higher risk of adjacent vertebral fracture, which may be due to the change in material properties of the post-procedure vertebra modifying the transmission of mechanical stresses to adjacent vertebrae.

Relevance:

10.00%

Publisher:

Abstract:

Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is a FEA technique whose analysis is derived from a single 2D radiographic image. This ex-vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD.
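As a toy illustration of the stiffness measure that FEA provides (applied force divided by resulting displacement, in N mm⁻¹), the sketch below assembles a one-dimensional chain of bar elements and solves K u = F. It is a deliberately minimal stand-in for the image-derived 3D-FEXI models; all names and inputs are hypothetical.

```python
import numpy as np

def axial_stiffness(element_moduli, areas, lengths, applied_force=1.0):
    """Minimal 1-D bar-element FEA: returns structure stiffness in N/mm.

    A chain of n bar elements is fixed at node 0 and loaded axially at the
    free end; stiffness is the applied force over the tip displacement.
    element_moduli : Young's modulus of each element (N/mm^2)
    areas          : cross-sectional area of each element (mm^2)
    lengths        : length of each element (mm)
    """
    n = len(lengths)
    k_global = np.zeros((n + 1, n + 1))
    for i, (e, a, l) in enumerate(zip(element_moduli, areas, lengths)):
        k = e * a / l                     # element stiffness
        k_global[i:i + 2, i:i + 2] += [[k, -k], [-k, k]]

    # Apply the boundary condition u[0] = 0 by removing row/column 0
    f = np.zeros(n)
    f[-1] = applied_force                 # load at the free end
    u = np.linalg.solve(k_global[1:, 1:], f)
    return applied_force / u[-1]          # stiffness in N/mm

# Example: three elements with hypothetical bone-like properties
print(axial_stiffness([17000.0, 12000.0, 17000.0],
                      [100.0, 100.0, 100.0],
                      [10.0, 10.0, 10.0]))
```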

Relevance:

10.00%

Publisher:

Abstract:

The rapid growth of online social media networks like Facebook and Twitter is strongly influencing news media to engage with such networks for generating newsworthy content, accessing mass audiences for news consumption and using the platforms for news distribution. While the two media complement each other as sources of news and information, they also compete against each other as news repositories and are observed vying for the same audiences. We call this phenomenon the competing-complementarity (C-C) engagement. To investigate the C-C relationship we use Fidler's "mediamorphosis" concept to explain the metamorphosis of news media in the online domain. We make two contributions to Fidler's concept by offering an additional principle, "mass user migration", to address the characteristics of metamorphosis, and an additional driver, "transcended social engagement", to show the force that propels it. We also propose four accelerators that influence metamorphosis. Theoretical analysis of news media's metamorphosis indicates its affinity to online social media. We apply niche and gratification theories to explain complementarity, and displacement effects on media consumption habits to trace competition between the two media.

Relevance:

10.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators and operating environment indicators, and their failure-generating mechanisms, using a covariate-based hazard model. The literature indicates that a number of covariate-based hazard models have been developed, all of them based on the theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response variables (dependent variables), whereas operating environment indicators act as explanatory variables (independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach to addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and the condition indicators.
Condition indicators provide information about the health condition of an asset; therefore they update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators can be nought in EHM, condition indicators are always present, because they are observed and measured for as long as an asset is operational. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is developed in two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industry applications, due to sparse failure event data, the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, which is a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the other existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, regarding the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
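The thesis's exact equations are not reproduced in this abstract, so the following Python sketch only illustrates the general shape of such a model under stated assumptions: a Weibull baseline hazard that is additionally a function of a condition indicator, multiplied by an exponential covariate function of the operating environment indicators. All parameter names and values are hypothetical.

```python
import numpy as np

def explicit_style_hazard(t, condition, environment, beta_env,
                          shape=2.0, scale=1000.0, gamma_cond=0.5):
    """Illustrative covariate-based hazard in the spirit of EHM.

    Hypothetical form (not the thesis's equations): a Weibull baseline
    hazard is modulated by a condition indicator (a response variable),
    while operating environment covariates act multiplicatively through
    an exponential link, as in PHM-type models.

    t           : operating time
    condition   : condition indicator at time t (e.g. vibration level)
    environment : operating environment covariates at time t
    beta_env    : regression coefficients for the environment covariates
    """
    # Baseline hazard depends on both time and the condition indicator
    baseline = (shape / scale) * (t / scale) ** (shape - 1)
    baseline *= np.exp(gamma_cond * condition)

    # Environment covariates accelerate/decelerate failure multiplicatively
    return baseline * np.exp(np.dot(beta_env, environment))

# Example: hazard under elevated vibration and two environment stresses
h = explicit_style_hazard(t=500.0, condition=1.2,
                          environment=np.array([0.3, -0.1]),
                          beta_env=np.array([0.8, 0.4]))
print(h)
```

Note how a zero environment vector leaves the baseline unchanged, matching the abstract's remark that the effects of operating environment indicators can be nought while condition indicators remain present.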

Relevance:

10.00%

Publisher:

Abstract:

The "standard" procedure for calibrating the Vesuvio eV neutron spectrometer at the ISIS neutron source, forming the basis for data analysis over at least the last decade, was recently documented in considerable detail by the instrument’s scientists. Additionally, we recently derived analytic expressions of the sensitivity of recoil peak positions with respect to fight-path parameters and presented neutron–proton scattering results that together called in to question the validity of the "standard" calibration. These investigations should contribute significantly to the assessment of the experimental results obtained with Vesuvio. Here we present new results of neutron–deuteron scattering from D2 in the backscattering angular range (theata > 90 degrees) which are accompanied by a striking energy increase that violates the Impulse Approximation, thus leading unequivocally the following dilemma: (A) either the "standard" calibration is correct and then the experimental results represent a novel quantum dynamical effect of D which stands in blatant contradiction of conventional theoretical expectations; (B) or the present "standard" calibration procedure is seriously deficient and leads to artificial outcomes. For Case(A), we allude to the topic of attosecond quantumdynamical phenomena and our recent neutron scattering experiments from H2 molecules. For Case(B),some suggestions as to how the "standard" calibration could be considerably improved are made.

Relevance:

10.00%

Publisher:

Abstract:

The effects of small changes in flight-path parameters (primary and secondary flight paths, detector angles), and of displacement of the sample along the beam axis away from its ideal position, are examined for an inelastic time-of-flight (TOF) neutron spectrometer, emphasising the deep-inelastic regime. The aim was to develop a rational basis for deciding what measured shifts in the positions of spectral peaks can be regarded as reliable in the light of the uncertainties in the calibrated flight-path parameters. Uncertainty in the length of the primary or secondary flight path has the least effect on the positions of the peaks of H, D and He, which are dominated by the accuracy of the calibration of the detector angles. This aspect of the calibration of a TOF spectrometer therefore demands close attention to achieve reliable outcomes where the position of the peaks is of significant scientific interest, and is discussed in detail. The corresponding sensitivities of the position of the peak of the Compton profile, J(y), to flight-path parameters and sample position are also examined, focusing on the comparability across experiments of results for H, D and He. We show that positioning the sample to within a few mm of the ideal position is required to ensure good comparability between experiments if data from detectors at high forward angles are to be reliably interpreted.
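A numerical illustration of this kind of sensitivity analysis is sketched below for an inverse-geometry instrument with fixed final energy: the recoil-peak time of flight follows from free-particle elastic kinematics, and finite differences give its sensitivity to the primary flight path and the detector angle. The geometry and final-energy values are nominal, Vesuvio-like figures assumed purely for illustration.

```python
import numpy as np

M_N = 1.674927e-27      # neutron mass (kg)
EV = 1.602177e-19       # joules per eV

def recoil_peak_tof(mass_ratio, theta_deg, l0=11.0, l1=0.7, e1_ev=4.897):
    """Time of flight (s) of the free-recoil peak for a nucleus of mass
    mass_ratio (in neutron masses), with primary/secondary flight paths
    l0, l1 (m) and fixed final energy e1_ev (eV); all values illustrative.
    """
    mu, th = mass_ratio, np.radians(theta_deg)
    # Free-particle elastic kinematics: E1/E0 at scattering angle theta
    ratio = ((np.cos(th) + np.sqrt(mu**2 - np.sin(th)**2)) / (1 + mu)) ** 2
    e0 = e1_ev / ratio
    v0 = np.sqrt(2 * e0 * EV / M_N)
    v1 = np.sqrt(2 * e1_ev * EV / M_N)
    return l0 / v0 + l1 / v1

# Finite-difference sensitivity of the H recoil peak at a forward angle
base = recoil_peak_tof(1.0, 60.0)
dt_dl0 = (recoil_peak_tof(1.0, 60.0, l0=11.001) - base) / 0.001   # s per m
dt_dtheta = (recoil_peak_tof(1.0, 60.01) - base) / 0.01           # s per degree
print(f"t = {base*1e6:.1f} us, dt/dL0 = {dt_dl0*1e6:.2f} us/m, "
      f"dt/dtheta = {dt_dtheta*1e6:.3f} us/deg")
```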

Relevance:

10.00%

Publisher:

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To resolve this large memory problem, parallelization with OpenMP was used to optimally harness the shared memory infrastructure on cache coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 Supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
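As a small illustration of the percolation step, the sketch below computes porosity and checks for a spanning pore cluster in a 3-D binary image. Here scipy.ndimage.label stands in for a bespoke Hoshen-Kopelman implementation (both perform connected-component labelling), and the synthetic test volume is purely illustrative.

```python
import numpy as np
from scipy import ndimage

def porosity_and_percolation(pore_mask):
    """Porosity and z-axis percolation of a 3-D binary pore image.

    pore_mask : boolean array, True where a voxel is pore space.
    ndimage.label plays the role of the Hoshen-Kopelman labelling
    step, using the default 6-connected neighbourhood.
    """
    porosity = pore_mask.mean()

    labels, _ = ndimage.label(pore_mask)          # cluster labelling
    top = np.unique(labels[:, :, 0])
    bottom = np.unique(labels[:, :, -1])
    # A cluster percolates if it touches both faces (label 0 is background)
    spanning = (set(top) & set(bottom)) - {0}
    return porosity, bool(spanning)

# Example on a synthetic 100^3 volume with ~40% pore fraction
rng = np.random.default_rng(0)
pores = rng.random((100, 100, 100)) > 0.6
phi, percolates = porosity_and_percolation(pores)
print(f"porosity = {phi:.3f}, percolates in z: {percolates}")
```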

Relevance:

10.00%

Publisher:

Abstract:

A procedure for the evaluation of multiple scattering contributions is described for deep inelastic neutron scattering (DINS) studies using an inverse geometry time-of-flight spectrometer. The accuracy of the Monte Carlo code DINSMS, used to calculate the multiple scattering, is tested by comparison with analytic expressions and with experimental data collected from polythene, polycrystalline graphite and tin samples. It is shown that the Monte Carlo code gives an accurate representation of the measured data and can therefore be used to reliably correct DINS data.
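To convey the idea behind such a correction (though not the DINSMS algorithm itself, which this abstract does not describe), here is a toy Monte Carlo that tracks neutrons through a slab with isotropic scattering and no absorption, and reports what fraction of scattered neutrons scattered more than once. All parameters are invented for illustration.

```python
import numpy as np

def multiple_scattering_fraction(mfp=20.0, thickness=2.0, n=100_000, seed=1):
    """Toy Monte Carlo estimate of the multiple-scattering fraction in a
    slab sample (isotropic scattering, no absorption); a schematic
    stand-in for a full code such as DINSMS, not a reproduction of it.

    mfp       : scattering mean free path (mm)
    thickness : slab thickness (mm)
    """
    rng = np.random.default_rng(seed)
    orders = np.zeros(n, dtype=int)
    for i in range(n):
        z, w = 0.0, 1.0                  # depth and z-direction cosine
        while True:
            z += w * rng.exponential(mfp)
            if z < 0.0 or z > thickness:
                break                    # neutron escapes the slab
            orders[i] += 1               # a scattering event occurred
            w = rng.uniform(-1.0, 1.0)   # isotropic: uniform cos(angle)
    scattered = orders[orders > 0]
    return np.mean(scattered > 1)        # fraction scattered 2+ times

print(f"multiple-scattering fraction: {multiple_scattering_fraction():.3f}")
```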

Relevance:

10.00%

Publisher:

Abstract:

The electron Volt Spectrometer (eVS) is an inverse geometry filter difference spectrometer that has been optimised to measure the single-atom properties of condensed matter systems using a technique known as Neutron Compton Scattering (NCS) or Deep Inelastic Neutron Scattering (DINS). The spectrometer utilises the high flux of epithermal neutrons produced by the ISIS neutron spallation source, enabling the direct measurement of atomic momentum distributions and ground state kinetic energies. In this paper the procedure used to calibrate the spectrometer is described, including details of the method used to determine detector positions and neutron flight path lengths, as well as the determination of the instrument resolution. Examples of measurements on three different samples, ZrH2, 4He and Sn, are shown, which demonstrate the self-consistency of the calibration procedure.
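One common idea behind flight-path calibration, sketched here under assumed values, is that neutrons arriving at known resonance energies satisfy t = L/v + t0, which is linear in 1/v, so a least-squares fit over several peaks recovers the flight path and the time offset. The resonance energies and geometry below are hypothetical examples, not the eVS calibration constants.

```python
import numpy as np

M_N = 1.674927e-27   # neutron mass (kg)
EV = 1.602177e-19    # joules per eV

def fit_flight_path(resonance_energies_ev, measured_tof_s):
    """Schematic flight-path calibration: t = L / v + t0 is linear in
    1/v, so an ordinary least-squares fit recovers the total flight
    path L (m) and the time offset t0 (s)."""
    v = np.sqrt(2 * np.asarray(resonance_energies_ev) * EV / M_N)
    length, t0 = np.polyfit(1.0 / v, measured_tof_s, 1)
    return length, t0

# Hypothetical absorption-resonance energies (eV) and synthetic arrival
# times generated from an assumed L = 11.7 m and t0 = 5 microseconds
energies = [4.91, 60.2, 208.0]
tof = [11.7 / np.sqrt(2 * e * EV / M_N) + 5e-6 for e in energies]
L, t0 = fit_flight_path(energies, tof)
print(f"L = {L:.3f} m, t0 = {t0*1e6:.2f} us")
```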

Relevance:

10.00%

Publisher:

Abstract:

Purpose: Electronic Portal Imaging Devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat panel imagers. EPIDs are currently used routinely in patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves using the EPID panel to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in its limited proliferation in a clinical environment. In this paper, we present a technique for simulating IMRT fields using Monte Carlo methods to predict the dose in an EPID, which can then be compared to the dose measured in the EPID. Materials: Measurements were made using an iView GT flat panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data. Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med. Phys., 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, 92(Suppl. 1), August 2009, S71.
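For context, a gamma analysis in the spirit of Low et al. (1998) can be sketched in a few lines: each measured point passes if some nearby reference point agrees within combined dose-difference and distance-to-agreement criteria. The brute-force 2-D Python implementation below uses illustrative 3%/3 mm criteria and is not the in-house IDL tool described above.

```python
import numpy as np

def gamma_index_2d(reference, measured, spacing_mm=1.0,
                   dose_tol=0.03, dta_mm=3.0):
    """Brute-force 2-D gamma analysis in the spirit of Low et al. (1998).

    reference, measured : dose arrays on the same grid
    dose_tol            : dose criterion, fraction of the maximum dose
    dta_mm              : distance-to-agreement criterion (mm)
    Returns the gamma map; gamma <= 1 means the point passes.
    """
    dd = dose_tol * reference.max()
    search = int(np.ceil(dta_mm / spacing_mm))   # half-width of search window
    pad = np.pad(reference, search, mode="edge")
    gamma2 = np.full(measured.shape, np.inf)

    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            dist2 = ((dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2) / dta_mm**2
            if dist2 > 1.0:
                continue                          # outside the DTA radius
            shifted = pad[search + dy: search + dy + measured.shape[0],
                          search + dx: search + dx + measured.shape[1]]
            dose2 = ((measured - shifted) / dd) ** 2
            gamma2 = np.minimum(gamma2, dist2 + dose2)
    return np.sqrt(gamma2)

# Example: a measured map shifted slightly against its reference
ref = np.tile(np.linspace(0.0, 2.0, 64), (64, 1))
meas = np.roll(ref, 1, axis=1)
print(f"pass rate: {(gamma_index_2d(ref, meas) <= 1.0).mean():.2%}")
```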