952 results for SCINTILLATION COUNTING


Relevance:

20.00%

Publisher:

Abstract:

In order to reconstruct regional vegetation changes and local conditions during the fen-bog transition in the Borsteler Moor (northwestern Germany), a sediment core covering the period between 7.1 and 4.5 cal kyrs BP was palynologically investigated. The pollen diagram demonstrates the dominance of oak forests and a gradual replacement of trees by raised bog vegetation under the wetter conditions of the Late Atlantic. At ~6 cal kyrs BP, the non-pollen palynomorphs (NPP) demonstrate the succession from mesotrophic conditions, clearly indicated by a number of fungal spore types, to oligotrophic conditions, indicated by Sphagnum spores, Bryophytomyces sphagni, and the testate amoebae Amphitrema, Assulina and Arcella, among others. Four relatively dry phases during the transition from fen to bog are clearly indicated by the dominance of Calluna and associated fungi as well as by an increase in microcharcoal. Several new NPP types are described, and known NPP types are identified. All NPP are discussed in the context of their palaeoecological indicator values.

Relevance:

20.00%

Publisher:

Abstract:

Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy-dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables full-spectrum CT, in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and are very noisy due to photon starvation. In this work, we propose two methods based on machine learning to address the spectral distortion issue and to improve the material decomposition. The first approach is to model distortions using an artificial neural network (ANN) and compensate for the distortion in a statistical reconstruction. The second approach is to directly correct for the distortion in the projections. Both techniques can be applied as a calibration process in which the neural network is trained on data from 3D-printed phantoms to learn the distortion model or the correction model of the spectral distortion. This replaces the need for the synchrotron measurements required by conventional techniques to derive the distortion model parametrically, which can be costly and time consuming. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion modeling and correction for more accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
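A minimal sketch of the projection-correction idea (the second approach), assuming a small fully connected regressor and hypothetical array shapes; the layer sizes, the stand-in calibration data, and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: learn a mapping from distorted PCXD threshold counts to the
# expected counts using a small neural network, then apply it per detector
# pixel and projection angle. Shapes and calibration data are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

n_bins = 5            # energy thresholds per detector element (assumed)
n_calib = 20000       # calibration samples from 3D-printed phantom scans

# Hypothetical calibration data: measured (distorted) counts and the counts
# expected from the known phantom composition (stand-in values here).
measured = np.random.poisson(200, size=(n_calib, n_bins)).astype(float)
expected = measured * np.linspace(0.8, 1.2, n_bins)

ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500)
ann.fit(measured, expected)

def correct_projection(distorted):
    """Apply the learned correction to one projection.

    distorted: (n_pixels, n_bins) array of threshold counts.
    Returns an array of the same shape with the distortion compensated.
    """
    return ann.predict(distorted)

# Example: correct a single (hypothetical) projection before reconstruction.
projection = np.random.poisson(200, size=(512, n_bins)).astype(float)
corrected = correct_projection(projection)
```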

Relevance:

20.00%

Publisher:

Abstract:

This research project falls within the field of scintillation dosimetry in radiotherapy, more specifically in high-dose-rate (HDR) brachytherapy. In this type of treatment, the dose is delivered locally, which implies steep dose gradients around the source. The goal of this work is to obtain a detector that measures the dose at two distinct points and is optimized for dose measurement in HDR brachytherapy. To this end, the research project is divided into two studies: the spectral characterization of the 2-point detector and the characterization of the photodetector system used to measure the dose. First, the optical chain of a 2-point scintillation detector is characterized with a spectrometer in order to determine the optimal scintillating components. This study makes it possible to build several detectors from the chosen components and then test them with the multi-point photodetector system. The photodetector system is also characterized in order to evaluate the sensitivity limits for the 2-point detector selected previously. The final objective is to measure the dose rate with precision and accuracy at the two measurement points of the multi-point detector during an HDR brachytherapy treatment.

Relevance:

20.00%

Publisher:

Abstract:

The current crime decrease is defying traditional criminological theories such as those espoused by Bonger (1916), who researched the relationship between crime and economic conditions and stated that when unemployment rises, so does crime. Both the USA and the UK have been in a deep recession since 2008, yet crime has dropped dramatically in both countries while unemployment has risen; over the past 20 years crime has halved in England and Wales. So how do we explain this phenomenon? Crime is down across the West, but more so in Britain (see Figure 1). In England and Wales crime decreased by 8% in a single year (2013). Vandalism is down by 14%, and burglaries and vehicle crime by 11%. The murder rate in the UK is at its lowest since 1978; in 2013, 540 people were killed. Some less serious offences are vanishing too: antisocial behaviour has fallen from just under 4 million incidents in 2007-08 to 2.4 million (The Economist, 20/4/13). According to the most recent annual results from the Crime Survey for England and Wales (CSEW), crime is at its lowest level since the survey began in 1981: the latest figures show an estimated 7.3 million incidents of crime against households and resident adults (aged 16 and over) in England and Wales for the year ending March 2014, a 14% decrease compared with the previous year's survey and the lowest estimate since the survey began.

Relevance:

20.00%

Publisher:

Abstract:

Presentation at the CRIS2016 conference in St Andrews, June 10, 2016

Relevance:

20.00%

Publisher:

Abstract:

Starting with an evaluator for a language, an abstract machine for the same language can be mechanically derived using successive program transformations. This has relevance to studying both the space and time properties of programs because these can be estimated by counting transitions of the abstract machine and measuring the size of the additional data structures needed, such as environments and stacks. In this article we use this process to derive a function that accurately counts the number of steps required to evaluate expressions in a simple language.
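A minimal sketch of the kind of step-counting function described, assuming a tiny expression language of integer literals and additions; the language, the cost model (one step per evaluator transition, rather than per transition of a derived abstract machine), and the names are illustrative assumptions only.

```python
# Hedged sketch: an evaluator for a tiny expression language that also counts
# evaluation steps. The language and the cost model are illustrative; they are
# not the language or the abstract machine derived in the article.
from dataclasses import dataclass

@dataclass
class Lit:
    value: int

@dataclass
class Add:
    left: object
    right: object

def eval_count(expr):
    """Return (value, steps) for an expression, counting one step per node."""
    if isinstance(expr, Lit):
        return expr.value, 1
    if isinstance(expr, Add):
        lv, ls = eval_count(expr.left)
        rv, rs = eval_count(expr.right)
        return lv + rv, ls + rs + 1
    raise TypeError(f"unknown expression: {expr!r}")

# Example: (1 + 2) + 3 evaluates to 6 in 5 steps under this cost model.
print(eval_count(Add(Add(Lit(1), Lit(2)), Lit(3))))
```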

Relevance:

20.00%

Publisher:

Abstract:

Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. Rainflow counting was originally developed to assess fatigue damage associated with mechanical cycling where creep of the material under load was not considered a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling, where fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell time estimation for use with time-dependent fatigue damage models. To demonstrate its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to reduce the volume of the rainflow counting results. This algorithm uses a damage model and a statistical test to determine which of the resultant cycles are statistically insignificant at a given confidence level, making the resulting data file smaller and allowing a simplified load history to be reconstructed.
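As background for the range-based counting the abstract builds on, the following is a minimal sketch of a simplified three-point rainflow count. It includes no dwell-time or multi-parameter handling and omits the ASTM starting-point rule; the function names and the convention of counting leftover reversals as half cycles are illustrative assumptions, not the study's algorithm.

```python
def extract_reversals(series):
    """Keep only the turning points (peaks and valleys) of a load history."""
    rev = [series[0]]
    for x in series[1:]:
        if x == rev[-1]:
            continue
        if len(rev) >= 2 and (rev[-1] - rev[-2]) * (x - rev[-1]) > 0:
            rev[-1] = x          # same direction: extend the current excursion
        else:
            rev.append(x)
    return rev

def rainflow_cycles(series):
    """Simplified three-point rainflow count.

    Returns (range, mean, count) tuples, with count 1.0 for closed cycles and
    0.5 for the residual half cycles left on the stack.
    """
    stack, cycles = [], []
    for point in extract_reversals(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # most recent range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break
            # the inner excursion stack[-3]..stack[-2] forms a closed cycle
            cycles.append((y, (stack[-2] + stack[-3]) / 2.0, 1.0))
            del stack[-3:-1]                 # remove the two closed points
    for a, b in zip(stack, stack[1:]):       # residuals as half cycles
        cycles.append((abs(a - b), (a + b) / 2.0, 0.5))
    return cycles

# Example: a short load history with one closed inner cycle.
for rng, mean, count in rainflow_cycles([0, 5, 1, 4, -2, 3]):
    print(f"range={rng}, mean={mean}, count={count}")
```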

Relevance:

20.00%

Publisher:

Abstract:

Proliferation of microglial cells has been considered a sign of glial activation and a hallmark of ongoing neurodegenerative diseases. Microglial activation is analyzed in animal models of different eye diseases, and numerous retinal samples are required for each of these studies to obtain data of statistical significance. Because manual quantification of microglial cells is time consuming, the aim of this study was to develop an algorithm for automatic identification of retinal microglia. Two groups of adult male Swiss mice were used: age-matched controls (naïve, n = 6) and mice subjected to unilateral laser-induced ocular hypertension (lasered; n = 9). In the latter group, both hypertensive eyes and contralateral untreated retinas were analyzed. Retinal whole mounts were immunostained with anti-Iba-1 for detecting microglial cell populations. A new algorithm was developed in MATLAB for microglial quantification; it quantifies microglial cells in the inner and outer plexiform layers and evaluates the area of the retina occupied by Iba-1+ microglia in the nerve fiber-ganglion cell layer. The automatic method was applied to a set of 6,000 images. To validate the algorithm, mouse retinas were evaluated both manually and computationally; the program correctly assessed the number of cells (Pearson correlation R = 0.94 and R = 0.98 for the inner and outer plexiform layers, respectively). Statistically significant differences in glial cell number were found between naïve, lasered, and contralateral eyes (P<0.05, naïve versus contralateral eyes; P<0.001, naïve versus lasered eyes and contralateral versus lasered eyes). The algorithm developed is a reliable and fast tool that can evaluate the number of microglial cells in naïve mouse retinas and in retinas exhibiting proliferation. The implementation of this new automatic method can enable faster quantification of microglial cells in retinal pathologies.
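The published algorithm was implemented in MATLAB; as a hedged illustration of the general idea (threshold the Iba-1 channel, label connected components, count objects within a plausible soma size range, and report the Iba-1+ area fraction), the following Python sketch uses assumed filter choices, threshold method, and size limits that are not taken from the paper.

```python
# Hedged sketch of threshold-plus-connected-components cell counting.
# Parameters below are illustrative assumptions, not the published values.
import numpy as np
from skimage import filters, measure, morphology

def count_microglia(image, min_area=50, max_area=2000):
    """Count putative microglial somata in a single-channel Iba-1 image.

    image: 2D array of fluorescence intensities.
    min_area / max_area: assumed soma size limits in pixels.
    Returns (cell_count, occupied_area_fraction).
    """
    smoothed = filters.gaussian(image, sigma=2)          # suppress noise
    mask = smoothed > filters.threshold_otsu(smoothed)   # global threshold
    mask = morphology.remove_small_objects(mask, min_area)
    labels = measure.label(mask)
    regions = [r for r in measure.regionprops(labels)
               if min_area <= r.area <= max_area]
    occupied = mask.sum() / mask.size                    # Iba-1+ area fraction
    return len(regions), occupied

# Example on a synthetic image (stand-in for an Iba-1 immunostained field).
img = np.zeros((512, 512))
img[100:115, 100:115] = 1.0
img[300:320, 200:220] = 1.0
print(count_microglia(img))
```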

Relevance:

20.00%

Publisher:

Abstract:

The performance of scintillation detectors, composed of a scintillating crystal coupled to a photodetector, depends critically on the efficiency with which scintillation photons are collected and extracted from the crystal to the sensor. In highly pixelated imaging systems (e.g. PET, CT), the scintillators must be arranged in compact arrays with form factors that are unfavourable for photon transport, to the detriment of detector performance. The goal of this project is to optimize the performance of these pixel detectors by identifying the sources of light loss related to the spectral, spatial and angular characteristics of the scintillation photons incident on the faces of the scintillators. Such information, obtained by Monte Carlo simulation, allows an appropriate weighting for evaluating the gains achievable through scintillator structuring methods aimed at improved light extraction towards the photodetector. A factorial design was used to evaluate the magnitude of the parameters affecting light collection, notably the absorption of the adhesive materials ensuring the mechanical integrity of the crystal arrays and the optical performance of reflectors, both of which have a considerable impact on light output. Moreover, a reflector widely used because of its exceptional optical performance was characterized under conditions more realistic than immersion in air, in which its reflectivity is always reported. A significant loss of reflectivity when it is inserted within scintillator arrays was demonstrated by simulation and then confirmed experimentally. This explains the high crosstalk rates observed and opens the way to array assembly methods that, depending on the application, either limit or exploit this unsuspected transparency.

Relevance:

10.00%

Publisher:

Abstract:

Size distributions of expiratory droplets expelled during coughing and speaking, and the velocities of the expiration air jets of healthy volunteers, were measured. Droplet size was measured using the Interferometric Mie Imaging (IMI) technique, while the Particle Image Velocimetry (PIV) technique was used for measuring air velocity. These techniques allowed measurements in close proximity to the mouth and avoided air sampling losses. The average expiration air velocity was 11.7 m/s for coughing and 3.9 m/s for speaking. Under the experimental setting, evaporation and condensation effects had a negligible impact on the measured droplet size. The geometric mean diameter of droplets from coughing was 13.5 µm, and it was 16.0 µm for speaking (counting droplets from 1 to 100 µm). The estimated total number of droplets expelled ranged from 947 to 2085 per cough and from 112 to 6720 for speaking. The estimated droplet concentrations ranged from 2.4 to 5.2 cm⁻³ per cough and from 0.004 to 0.223 cm⁻³ for speaking.
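As a hedged consistency check relating the two reported quantities (concentration is droplet count divided by sampled air volume), the short calculation below assumes a nominal expelled air volume per cough; that volume is an illustrative assumption, not a figure from the abstract.

```python
# Hedged arithmetic check: concentration = droplet count / sampled air volume.
# The ~400 cm^3 volume per cough is an assumed nominal value, not from the study.
droplets_per_cough = (947, 2085)       # reported range of counts per cough
assumed_volume_cm3 = 400.0             # illustrative expelled volume per cough

for n in droplets_per_cough:
    print(f"{n} droplets / {assumed_volume_cm3:.0f} cm^3 "
          f"= {n / assumed_volume_cm3:.1f} cm^-3")
# Prints roughly 2.4 and 5.2 cm^-3, the order of the reported concentrations.
```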

Relevance:

10.00%

Publisher:

Abstract:

This document provides a review of international and national practices in investment decision support tools for road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies, and criteria adopted by current tools, with particular attention to how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies supporting decision-making in road asset management, and the complexity of their applications shows significant differences in international practice. There is continuing discussion amongst practitioners and researchers as to which is more appropriate for supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competing: Multiple Criteria Analysis may be particularly helpful in the early stages of project development, such as strategic planning, while Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives.

The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective, and an extension of the approach that includes social and environmental externalities is currently used to support Triple Bottom Line decision-making in the road sector. However, several issues in its application deserve attention. First, there is a need to reach a degree of commonality in the treatment of social and environmental externalities, which may be achieved by aggregating best practices; the detail with which externalities are considered should vary with the decision-making level. It is intended to develop a generic framework to coordinate the range of existing practices, and such a standard framework would also help reduce the double counting that appears in some current practices. Caution should also be applied to the methods used to value social and environmental externalities. A number of methods, such as market prices, resource costs, and willingness to pay, were found in the review, and the use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision-makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practices owing to the lack of information and credible models; it may be appropriate to consider these externalities in qualitative form within a Multiple Criteria Analysis. Consensus has been reached internationally on considering noise and air pollution, but Australian practices generally omit these externalities. Equity is an important consideration in road asset management, whether between regions or between social groups defined by income, age, gender, disability, and so on; current practice lacks a well-developed quantitative measure for equity, and more research is needed to address this issue.

Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for choosing modelling methods and externalities, so different analysts are unlikely to reach consistent conclusions about a policy measure. Some current practices favour methods that can prioritise alternatives, such as Goal Programming, the Goal Achievement Matrix, and the Analytic Hierarchy Process; others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses, yet the processes of assigning weights and scores have been criticised as highly arbitrary and subjective, so it is essential that the process be as transparent as possible. Obtaining weights and scores by consulting local communities is common practice, but it is likely to result in bias towards local interests. Interactive approaches have the advantage of helping decision-makers elaborate their preferences; however, the computational burden may cause decision-makers to lose interest during the solution of a large-scale problem, such as a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities, and distorted valuations can occur when variables measured in physical units are converted to such scales. For example, if decibels of noise are converted linearly to a scale of -4 to +4, the difference between 3 and 4 represents a far greater increase in discomfort than the increase from 0 to 1; it is therefore suggested that different weights be assigned to individual scores. Because goals overlap, double counting also appears in some Multiple Criteria Analyses; the situation can be improved by carefully selecting and defining investment goals and criteria.

Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have been given scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
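To make the weighting-and-scoring discussion concrete, the following is a minimal weighted-sum sketch under assumed criteria, weights, and project scores (all values are illustrative only; real practice would also need score normalisation, sensitivity analysis, and safeguards against the double counting noted above).

```python
# Hedged sketch of a weighted-sum Multiple Criteria Analysis score. The
# criteria, weights, and project scores below are illustrative assumptions.
criteria_weights = {"cost": 0.4, "safety": 0.3, "noise": 0.2, "equity": 0.1}

projects = {
    "Upgrade A": {"cost": 3, "safety": 4, "noise": 2, "equity": 3},
    "Upgrade B": {"cost": 4, "safety": 2, "noise": 4, "equity": 2},
}

def weighted_score(scores, weights):
    """Aggregate criterion scores (e.g. on a 0-5 scale) into a single figure."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank the alternatives by aggregate score (higher is better here).
ranked = sorted(projects.items(),
                key=lambda kv: weighted_score(kv[1], criteria_weights),
                reverse=True)
for name, scores in ranked:
    print(name, round(weighted_score(scores, criteria_weights), 2))
```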