15 results for Carlo Felice, King of Sardinia.

at Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool.

A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers.

To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques.

Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements,
2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique,
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral, and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
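
As a concrete illustration of the dual-energy decomposition framework described above, the sketch below solves the two-energy transmission equations for bone and soft-tissue thickness along a single ray. It is a minimal numerical example, not code from the thesis; the attenuation coefficients are illustrative placeholders.

```python
import numpy as np

# Hypothetical linear attenuation coefficients (cm^-1) for bone mineral and
# soft tissue at two beam energies; the values are illustrative only.
MU = np.array([[0.60, 0.25],   # low energy:  [bone, soft]
               [0.35, 0.20]])  # high energy: [bone, soft]

def dexa_decompose(I_low, I_high, I0=1.0):
    """Solve ln(I0/I_E) = mu_bone(E)*t_bone + mu_soft(E)*t_soft for E = low, high."""
    rhs = np.log(I0 / np.array([I_low, I_high]))
    t_bone, t_soft = np.linalg.solve(MU, rhs)
    return t_bone, t_soft

# Forward-model a phantom (1 cm bone, 10 cm soft tissue), then invert.
t_true = np.array([1.0, 10.0])
I = np.exp(-MU @ t_true)
print(dexa_decompose(I[0], I[1]))  # recovers ~(1.0, 10.0)
```

Two energies determine only two unknowns, which is the two-component limitation noted above; the DPA(+) technique adds the path-length measurement needed to resolve a third (fat) component.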

Relevance:

100.00%

Publisher:

Abstract:

The angular distribution of microscopic ion fluxes around nanotubes arranged into a dense ordered pattern on the surface of the substrate is studied by means of multiscale numerical simulation. The Monte Carlo technique was used to show that the ion current density is distributed nonuniformly around carbon nanotubes arranged into a dense rectangular array. The nonuniformity factor of the ion current flux reaches 7 in dense (5 × 10^18 m^-3) plasmas for a nanotube radius of 25 nm, and tends to 1 at plasma densities below 1 × 10^17 m^-3. The results obtained suggest that the local density of carbon adatoms on the nanotube side surface, at areas facing the adjacent nanotubes of the pattern, can be high enough to lead to additional wall formation and thus cause the single- to multiwall structural transition and other as yet unexplained nanoscience phenomena.
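
A toy 2-D Monte Carlo can make the shadowing mechanism behind this nonuniformity concrete: neighbouring tubes intercept part of the incoming ion flux, so the azimuthal flux distribution around a tube in the array becomes peaked toward the gaps. Everything here other than the 25 nm tube radius (array pitch, angular spread, ray-launching scheme) is an assumed toy parameter, not the paper's multiscale plasma model.

```python
import numpy as np

rng = np.random.default_rng(0)
R_TUBE = 25e-9          # nanotube radius, from the abstract
PITCH = 100e-9          # hypothetical array pitch (not given in the abstract)
centers = np.array([(i * PITCH, j * PITCH)
                    for i in (-1, 0, 1) for j in (-1, 0, 1)])

def first_hit(p, d):
    """Return (tube index, t) of the first tube the ray p + t*d strikes, else None."""
    best = None
    for k, c in enumerate(centers):
        f = p - c
        b = f @ d
        disc = b * b - (f @ f - R_TUBE**2)
        if disc >= 0:
            t = -b - np.sqrt(disc)       # nearer intersection
            if t > 0 and (best is None or t < best[1]):
                best = (k, t)
    return best

hits = []
for _ in range(50_000):
    phi = rng.uniform(0, 2 * np.pi)      # launch point on a large circle
    p = 5 * PITCH * np.array([np.cos(phi), np.sin(phi)])
    d = -p / np.linalg.norm(p)           # aim roughly at the array centre
    d += 0.3 * rng.standard_normal(2)    # assumed angular spread
    d /= np.linalg.norm(d)
    hit = first_hit(p, d)
    if hit and np.allclose(centers[hit[0]], 0):   # landed on the central tube
        x, y = p + hit[1] * d
        hits.append(np.arctan2(y, x))    # impact azimuth on the tube surface

counts, _ = np.histogram(hits, bins=24, range=(-np.pi, np.pi))
print("toy nonuniformity factor ~", counts.max() / max(counts.min(), 1))
```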

Relevance:

100.00%

Publisher:

Relevance:

100.00%

Publisher:

Abstract:

We used Monte Carlo simulations of Brownian dynamics of water to study anisotropic water diffusion in an idealised model of articular cartilage. The main aim was to use the simulations as a tool for translation of the fractional anisotropy of the water diffusion tensor in cartilage into quantitative characteristics of its collagen fibre network. The key finding was a linear empirical relationship between the collagen volume fraction and the fractional anisotropy of the diffusion tensor. Fractional anisotropy of the diffusion tensor is potentially a robust indicator of the microstructure of the tissue because, in the first approximation, it is invariant to the inclusion of proteoglycans or chemical exchange between free and collagen-bound water in the model. We discuss potential applications of Monte Carlo diffusion-tensor simulations for quantitative biophysical interpretation of MRI diffusion-tensor images of cartilage. Extension of the model to include collagen fibre disorder is also discussed.
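
A minimal random-walk sketch of the idea follows: water molecules diffuse among impermeable fibres aligned along one axis, and the fractional anisotropy (FA) of the resulting displacement tensor grows with the fibre volume fraction. All parameters (lattice pitch, step size, walker counts) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def fa_for_fibre_radius(radius, pitch=1.0, n_walkers=1000, n_steps=1000, step=0.02):
    """Random-walk Monte Carlo of water among parallel, impermeable fibres.

    Fibres are cylinders along z, on a square lattice in the x-y plane.
    Returns the fractional anisotropy of the apparent diffusion tensor.
    """
    def inside_fibre(xy):
        # squared distance to the nearest fibre axis on the periodic lattice
        frac = xy - np.round(xy / pitch) * pitch
        return np.einsum('ij,ij->i', frac, frac) < radius**2

    pos = rng.uniform(0, pitch, size=(n_walkers, 3))
    while inside_fibre(pos[:, :2]).any():          # don't start inside a fibre
        bad = inside_fibre(pos[:, :2])
        pos[bad, :2] = rng.uniform(0, pitch, size=(bad.sum(), 2))
    start = pos.copy()
    for _ in range(n_steps):
        trial = pos + step * rng.standard_normal(pos.shape)
        blocked = inside_fibre(trial[:, :2])       # reject steps into fibres
        pos[~blocked] = trial[~blocked]
    disp = pos - start
    D = disp.T @ disp / n_walkers                  # diffusion tensor up to 1/(2t)
    lam = np.linalg.eigvalsh(D)                    # FA is scale-invariant
    return np.sqrt(1.5 * ((lam - lam.mean())**2).sum() / (lam**2).sum())

for r in (0.1, 0.2, 0.3):
    vf = np.pi * r**2                              # fibre volume fraction, unit pitch
    print(f"volume fraction {vf:.2f}: FA ~ {fa_for_fibre_radius(r):.2f}")
```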

Relevance:

100.00%

Publisher:

Abstract:

A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ iX Clinac. The computer-aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic™ film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of treatment plans in radiotherapy. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.
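
The verification step described above amounts to comparing simulated dose distributions against measured ones. The sketch below implements a simple global 1-D gamma index, a standard comparison metric in radiotherapy QA; the profiles are synthetic stand-ins, since the actual tool is a Geant4 (C++) application verified against commissioning and film data.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dose_tol=0.02, dist_tol=0.2):
    """Global 1-D gamma index (here 2% / 2 mm) between two dose profiles.

    dose_ref, dose_eval: doses sampled on the common axis x (cm);
    dose_tol is a fraction of the reference maximum, dist_tol is in cm.
    """
    d_norm = dose_tol * dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # capital Gamma over all evaluation points; gamma is its minimum
        cap = np.sqrt(((x - xi) / dist_tol) ** 2
                      + ((dose_eval - di) / d_norm) ** 2)
        gamma[i] = cap.min()
    return gamma

# Toy check with stand-in profiles (a real test would load commissioning data).
x = np.linspace(-5, 5, 201)                      # off-axis position, cm
ref = np.where(np.abs(x) < 3, 1.0, 0.05)         # idealised measured profile
ev = np.where(np.abs(x - 0.05) < 3, 1.01, 0.05)  # slightly shifted simulation
g = gamma_1d(ref, ev, x)
print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")
```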

Relevance:

100.00%

Publisher:

Abstract:

Discrete stochastic simulations, via techniques such as the Stochastic Simulation Algorithm (SSA), are a powerful tool for understanding the dynamics of chemical kinetics when there are low numbers of certain molecular species. However, an important constraint is the assumption of well-mixedness and homogeneity. In this paper, we show how to use Monte Carlo simulations to estimate an anomalous diffusion parameter that encapsulates the crowdedness of the spatial environment. We then use this parameter to replace the rate constants of bimolecular reactions by a time-dependent power law, producing an SSA (the ASSA) that is valid in cases where anomalous diffusion occurs or the system is not well mixed. Simulations then show that the ASSA can successfully predict the temporal dynamics of chemical kinetics in a spatially constrained environment.
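
A minimal sketch of the central idea follows for a single bimolecular reaction A + B -> C: the rate constant is replaced by a time-dependent power law k(t) = k0 * t^-h, and the next-event time is sampled exactly by inverting the integrated propensity. The exponent h stands in for the anomalous-diffusion parameter that the paper estimates from Monte Carlo simulations of crowded media; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def assa_bimolecular(nA, nB, k0, h, t0=1e-3, t_end=10.0):
    """SSA for A + B -> C with time-dependent rate k(t) = k0 * t**-h.

    Between events the propensity is a(t) = k0 * t**-h * nA * nB, and the
    next event time solves integral_{t}^{t'} a(s) ds = -ln(u), which has a
    closed-form inverse for the power law (0 <= h < 1; h = 0 is classical SSA).
    """
    t, trajectory = t0, [(t0, nA)]
    while nA > 0 and nB > 0:
        c = k0 * nA * nB
        u = rng.random()
        t_new = (t**(1 - h) - np.log(u) * (1 - h) / c) ** (1 / (1 - h))
        if t_new > t_end:
            break
        t, nA, nB = t_new, nA - 1, nB - 1
        trajectory.append((t, nA))
    return trajectory

classical = assa_bimolecular(100, 100, k0=0.05, h=0.0)
crowded = assa_bimolecular(100, 100, k0=0.05, h=0.5)
print(f"A remaining at t_end: classical {classical[-1][1]}, crowded {crowded[-1][1]}")
```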

Relevance:

100.00%

Publisher:

Abstract:

A novel in-cylinder pressure method for determining ignition delay has been proposed and demonstrated. This method uses a new Bayesian statistical model to resolve the start of combustion, defined as being the point at which the band-pass in-cylinder pressure deviates from background noise and the combustion resonance begins. Further, it is demonstrated that this method remains accurate in situations where noise is present. The start of combustion can be resolved for each cycle without the need for ad hoc methods such as cycle averaging. Therefore, this method allows for analysis of consecutive cycles and inter-cycle variability studies. Ignition delay obtained by this method and by the net rate of heat release have been shown to give good agreement. However, the use of combustion resonance to determine the start of combustion is preferable over the net rate of heat release method because it does not rely on knowledge of heat losses and still functions accurately in the presence of noise. Results for a six-cylinder turbo-charged common-rail diesel engine run with neat diesel fuel at full, three-quarter and half load have been presented. Under these conditions the ignition delay was shown to increase as the load was decreased, with a significant increase in ignition delay at half load when compared with three-quarter and full loads.
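
The sketch below illustrates the signal-processing idea on synthetic data: band-pass the in-cylinder pressure to isolate the combustion resonance, then find where it first rises out of the background noise. The paper resolves this point with a Bayesian statistical model; the fixed noise-threshold rule used here is a simplified stand-in, and all signal parameters are invented.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def start_of_combustion(pressure, fs, band=(4e3, 20e3), noise_window=0.002, k=4.0):
    """Estimate start of combustion from one cycle of in-cylinder pressure.

    Band-pass the signal to isolate the combustion resonance, estimate the
    background noise level from an early pre-combustion window, and return
    the first time the residual exceeds k standard deviations of that noise.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    resid = filtfilt(b, a, pressure)
    sigma = resid[:int(noise_window * fs)].std()
    above = np.flatnonzero(np.abs(resid) > k * sigma)
    return above[0] / fs if above.size else None

# Toy cycle: smooth bulk pressure plus a resonance burst starting at 5 ms.
fs = 200_000
t = np.arange(0, 0.01, 1 / fs)
bulk = 40e5 * np.exp(-((t - 0.005) / 0.004) ** 2)   # low-frequency pressure (Pa)
burst = (t > 0.005) * 2e4 * np.sin(2 * np.pi * 8e3 * t) \
        * np.exp(-np.maximum(t - 0.005, 0) / 1e-3)
noise = 2e3 * np.random.default_rng(4).standard_normal(t.size)
print("estimated start of combustion:",
      start_of_combustion(bulk + burst + noise, fs), "s")   # ~0.005 s
```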

Relevance:

100.00%

Publisher:

Abstract:

Throughout a long and occasionally distinguished career first as a television sports correspondent, then chat show host (dramatically ended by the accidental homicide of a guest live on air), then rebirth as a radio presenter at North Norfolk Digital, Alan Partridge has navigated the stormy waters of the British media landscape, now achieving mainstream success on the big screen with a starring role in Steve Coogan’s Alpha Papa (Declan Lowney, 2013). A man who in his desperation for a television series of his own once sank so low as to pitch a show called Monkey Tennis to the BBC finally finds his inner hero in a film which, while presenting mainly as comedy, also contains a biting critique of trends in the British media with which all journalists and media practitioners in general will be familiar. Alpha Papa is a nostalgic, elegiac riff on the pleasures and values of local radio the way it used to be, exemplified by North Norfolk Digital’s stable of flawed but endearing jocks – Wally Banter, Bruno Brooks, Dave Clifton (who in one scene recounts the depths to which he sank as an alcoholic, drug-addicted wreck—“I woke up in a skip with someone else’s underpants in my mouth. I can laugh about it now …”), and Pat Farrell. 50-something Pat is sacked by the new owners of North Norfolk Digital, who, in their efforts to transform the station into a “multiplatform content provider” going by the more Gen Y-friendly name of Shape (“the way you want it to be”), wish to replace him with a younger, brattish model lacking in taste and manners. Out go records by the likes of Glen Campbell and Neil Diamond (“You can keep Jesus Christ”, observes Partridge after playing Diamond’s Sweet Caroline in a demonstration of the crackling radio repartee for which he is by now renowned, “that was the king of the Jews”), in comes Roachford. Pat, grieving his dead wife Molly, finally snaps and turns the glitzy media launch of Shape into a hostage siege. Only Alan Partridge, it seems, can step in and talk Pat out of a looming catastrophe.

Relevance:

100.00%

Publisher:

Abstract:

We use Bayesian model selection techniques to test extensions of the standard flat LambdaCDM paradigm. Dark-energy and curvature scenarios, and primordial perturbation models are considered. To that end, we calculate the Bayesian evidence in favour of each model using Population Monte Carlo (PMC), a new adaptive sampling technique which was recently applied in a cosmological context. The Bayesian evidence is immediately available from the PMC sample used for parameter estimation without further computational effort, and it comes with an associated error evaluation. Moreover, it provides an unbiased estimator of the evidence after any fixed number of iterations and is naturally parallelizable, in contrast with MCMC and nested sampling methods. By comparison with analytical predictions for simulated data, we show that our results obtained with PMC are reliable and robust. The variability in the evidence evaluation and the stability for various cases are estimated both from simulations and from data. For the cases we consider, the log-evidence is calculated with a precision of better than 0.08. Using a combined set of recent CMB, SNIa and BAO data, we find inconclusive evidence between flat LambdaCDM and simple dark-energy models. A curved Universe is moderately to strongly disfavoured with respect to a flat cosmology. Using physically well-motivated priors within the slow-roll approximation of inflation, we find a weak preference for a running spectral index. A Harrison-Zel'dovich spectrum is weakly disfavoured. With the current data, tensor modes are not detected; the large prior volume on the tensor-to-scalar ratio r results in moderate evidence in favour of r=0.
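
A minimal Population Monte Carlo sketch follows, on a two-dimensional Gaussian toy problem where the evidence is known analytically: the proposal is adapted over a few importance-sampling iterations, and the evidence estimate is simply the mean unnormalised weight, available at no extra cost exactly as the abstract notes. The model and all settings are illustrative, not those of the cosmological analysis.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(5)

# Toy model: prior N(0, 4I) on theta, Gaussian likelihood N(theta_obs | theta, I).
theta_obs = np.array([1.0, -0.5])
prior = mvn(mean=np.zeros(2), cov=4 * np.eye(2))
like = mvn(mean=theta_obs, cov=np.eye(2))   # symmetric in theta and theta_obs

# PMC: iterated importance sampling, adapting the proposal to the weighted sample.
mean, cov, n = np.zeros(2), 9 * np.eye(2), 5000
for _ in range(5):
    q = mvn(mean=mean, cov=cov)
    th = q.rvs(n, random_state=rng)
    logw = prior.logpdf(th) + like.logpdf(th) - q.logpdf(th)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mean = w @ th                                         # adapt proposal moments
    cov = (th - mean).T @ (w[:, None] * (th - mean)) + 1e-9 * np.eye(2)

# Evidence = E_q[unnormalised weight], computed stably from the final sample.
log_evidence = logw.max() + np.log(np.exp(logw - logw.max()).mean())
print("PMC log-evidence ~", log_evidence)
print("analytic         ", mvn(mean=np.zeros(2), cov=5 * np.eye(2)).logpdf(theta_obs))
```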

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses challenges that form part of the paradigm shift taking place in the way we produce, transmit and use power, related to what are known as smart grids. The aim of this paper is to explore present initiatives to establish smart grids as a sustainable and reliable power supply system. We argue that smart grids are not isolated to abstract conceptual models alone. We suggest that establishing sustainable and reliable smart grids depends on a series of contributions, including modeling and simulation projects, technological infrastructure pilots, and systemic methods and training, and not least on how these and other elements must interact to add reality to the conceptual models. We present and discuss three initiatives that illuminate smart grids from three very different positions: first, the new power grid simulator project in the electrical engineering PhD program at Queensland University of Technology (QUT); second, the new smart grid infrastructure pilot run by the Norwegian Centers of Expertise Smart Energy Markets (NCE SMART); and third, the new systemic Master's program on next-generation energy technology at Østfold University College (HiØ). These initiatives represent future threads in a mesh embedding smart grids in models, technology, infrastructure, education, skills and people.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Due to their high spatial resolution, diodes are often used for small-field relative output factor measurements. However, a field-size-specific correction factor [1] is required to correct for diode detector over-response at small field sizes. A recent Monte Carlo based study has shown that it is possible to design a diode detector that produces measured relative output factors equivalent to those in water. This is accomplished by introducing an air gap at the upstream end of the diode [2]. The aim of this study was to physically construct this diode by placing an ‘air cap’ on the end of a commercially available diode (the PTW 60016 electron diode). The output factors subsequently measured with the new diode design were compared to current benchmark small field output factor measurements.

Methods: A water-tight ‘cap’ was constructed so that it could be placed over the upstream end of the diode. The cap could be offset from the end of the diode, thus creating an air gap. The air gap width was the same as the diode width (7 mm) and the thickness of the air gap could be varied. Output factor measurements were made using square field sizes of side length from 5 to 50 mm, using a 6 MV photon beam. The set of output factor measurements was repeated with the air gap thickness set to 0, 0.5, 1.0 and 1.5 mm. The optimal air gap thickness was found in a similar manner to that proposed by Charles et al. [2]. An IBA stereotactic field diode, corrected using Monte Carlo calculated kQclin,Qmsr values [3], was used as the gold standard.

Results: The optimal air thickness required for the PTW 60016 electron diode was 1.0 mm, close to the Monte Carlo predicted value of 1.15 mm [2]. The sensitivity of the new diode design was independent of field size (kQclin,Qmsr = 1.000 at all field sizes) to within 1%.

Discussion and conclusions: The work of Charles et al. [2] has been proven experimentally. An existing commercial diode has been converted into a correction-less small field diode by the simple addition of an ‘air cap’. The method of applying a cap to create the new diode makes the diode dual purpose, as without the cap it remains an unmodified electron diode.
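
The correction-factor logic can be made concrete with a short sketch: a diode is "correction-less" when the field-size correction kQclin,Qmsr, here taken as the ratio of the gold-standard output factor to the diode's raw relative reading, is unity at every field size. The readings below are invented for illustration and are not the measured data of this study.

```python
import numpy as np

# Hypothetical relative readings, normalised to the 50 mm reference field.
field_mm   = np.array([5, 10, 20, 30, 50])
ref_OF     = np.array([0.65, 0.80, 0.90, 0.95, 1.00])  # gold-standard output factors
diode_raw  = np.array([0.67, 0.81, 0.90, 0.95, 1.00])  # bare diode (over-responds)
capped_raw = np.array([0.65, 0.80, 0.90, 0.95, 1.00])  # diode with 1.0 mm air cap

for name, raw in (("bare", diode_raw), ("air-capped", capped_raw)):
    k = ref_OF / raw          # field-size correction kQclin,Qmsr for this diode
    flat = np.all(np.abs(k - 1) < 0.01)
    print(f"{name}: k at {field_mm} mm = {np.round(k, 3)} -> correction-less: {flat}")
```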

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
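
The sketch below illustrates the indirect-inference construction on the paper's simple validating example, a tractable pure death process: simulations from the generative model estimate a map from its parameter to an auxiliary summary, and an II posterior is then sampled by a plain Metropolis MCMC. The auxiliary model, its noise variance and the prior bounds are assumptions made for illustration, and the design-optimisation layer (the utility maximisation and the Müller sampler) is omitted.

```python
import numpy as np

rng = np.random.default_rng(6)
N0, T = 100, 1.0

def simulate_death(theta, n_rep=200):
    """Pure death process at rate theta per individual: survivors at time T
    are Binomial(N0, exp(-theta*T))."""
    return rng.binomial(N0, np.exp(-theta * T), size=n_rep)

# Step 1: estimate the map theta -> auxiliary summary (mean log survival).
thetas = np.linspace(0.1, 3.0, 30)
phi = np.array([np.log(simulate_death(t).mean() / N0) for t in thetas])
coef = np.polyfit(thetas, phi, 1)          # map is ~linear here: phi ~ -T*theta

# Step 2: II posterior via Metropolis, with a Gaussian auxiliary likelihood
# centred on the mapped summary (assumed noise variance; illustrative only).
y_obs = simulate_death(1.2, n_rep=1)[0]    # "observed" data from theta = 1.2
phi_obs = np.log(y_obs / N0)
sigma2 = 0.02

def log_post(theta):
    if not 0.05 < theta < 5.0:             # uniform prior bounds (assumed)
        return -np.inf
    return -(phi_obs - np.polyval(coef, theta))**2 / (2 * sigma2)

chain, theta = [], 1.0
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
print("II posterior mean for theta ~", np.mean(chain[1000:]))  # near 1.2
```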

Relevance:

100.00%

Publisher:

Abstract:

Aim: A recent Monte Carlo based study has shown that it is possible to design a diode that measures small field output factors equivalent to those in water. This is accomplished by placing an appropriately sized air gap above the silicon chip (1), with experimental results subsequently confirming that a particular Monte Carlo design was accurate (2). The aim of this work was to test whether a new correction-less diode could be designed using an entirely experimental methodology.

Method: All measurements were performed on a Varian iX at a depth of 5 cm, SSD of 95 cm and field sizes of 5, 6, 8, 10, 20 and 30 mm. Firstly, the experimental transfer of kQclin,Qmsr from a commonly used diode detector (IBA stereotactic field diode, SFD) to another diode detector (Sun Nuclear unshielded diode, EDGEe) was tested. These results were compared to Monte Carlo calculated values for the EDGEe. Secondly, the air gap above the EDGEe silicon chip was optimised empirically. Nine different air gap "tops" were placed above the EDGEe (air depth = 0.3, 0.6, 0.9 mm; air width = 3.06, 4.59, 6.13 mm). The sensitivity of the EDGEe was plotted as a function of air gap thickness for the field sizes measured.

Results: The transfer of kQclin,Qmsr from the SFD to the EDGEe was correct to within the simulation and measurement uncertainties. The EDGEe detector can be made "correction-less" for field sizes of 5 and 6 mm, but was ∼2% from being "correction-less" at field sizes of 8 and 10 mm.

Conclusion: Different materials perturb small fields in different ways, and a detector is only "correction-less" if all of these perturbations happen to cancel out. Designing a "correction-less" diode is therefore a complicated process, and it is reasonable to expect that Monte Carlo simulations should play an important role.
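
A sketch of the empirical optimisation step follows: for each candidate air-gap "top", compute the correction kQclin,Qmsr at every field size and choose the gap whose corrections stay closest to unity across all of them. The numbers are invented for illustration only.

```python
import numpy as np

# Hypothetical corrections kQclin,Qmsr for the EDGEe with each air-gap "top"
# (rows) at each small field size (columns); illustrative values only.
gaps_mm = np.array([0.0, 0.3, 0.6, 0.9])
fields_mm = np.array([5, 6, 8, 10])
k = np.array([[0.95, 0.96, 0.97, 0.98],    # no gap: diode over-responds
              [0.97, 0.98, 0.99, 0.99],
              [0.99, 1.00, 1.00, 1.01],
              [1.01, 1.01, 1.02, 1.02]])

# "Correction-less" means k ~ 1 at every field size, so pick the gap that
# minimises the worst-case deviation of k from unity.
spread = np.abs(k - 1).max(axis=1)
best = gaps_mm[spread.argmin()]
print(f"worst-case |k - 1| per gap: {np.round(spread, 3)}; best gap = {best} mm")
```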

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an efficient noniterative method for distribution state estimation using the conditional multivariate complex Gaussian distribution (CMCGD). In the proposed method, the mean and standard deviation (SD) of the state variables are obtained in one step, considering load uncertainties, measurement errors, and load correlations. First, the bus voltages, branch currents, and injection currents are represented by a multivariate complex Gaussian distribution using direct load flow and a linear transformation. Then, the mean and SD of the bus voltages, or other states, are calculated using the CMCGD and the estimation-of-variance method. The mean and SD of pseudo measurements, as well as spatial correlations between pseudo measurements, are modelled from historical data for different levels of the load duration curve. The proposed method can handle load uncertainties without resorting to time-consuming approaches such as Monte Carlo simulation. Simulation results for two case studies, a six-bus network and a realistic 747-bus distribution network, show the effectiveness of the proposed method in terms of speed, accuracy, and quality against the conventional approach.
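
The conditioning step at the heart of such a method can be sketched directly: given a joint Gaussian over states and measurements, the conditional mean and covariance of the states follow in closed form, with no sampling. The sketch below uses a real-valued Gaussian on a toy three-bus example for simplicity, whereas the paper works with complex Gaussians over voltage phasors; all numbers are illustrative.

```python
import numpy as np

def conditional_gaussian(mu, Sigma, idx_meas, z):
    """Condition a joint Gaussian on measured components.

    mu, Sigma: joint mean/covariance over all variables; idx_meas: indices of
    measured variables; z: their measured values. Returns the conditional
    mean and covariance of the remaining (state) variables.
    """
    n = len(mu)
    s = np.setdiff1d(np.arange(n), idx_meas)        # unmeasured state indices
    m = np.asarray(idx_meas)
    S_ss, S_sm = Sigma[np.ix_(s, s)], Sigma[np.ix_(s, m)]
    S_mm_inv = np.linalg.inv(Sigma[np.ix_(m, m)])
    mean = mu[s] + S_sm @ S_mm_inv @ (z - mu[m])
    cov = S_ss - S_sm @ S_mm_inv @ S_sm.T
    return mean, cov                                # SD = sqrt(diag(cov))

# Toy example: correlated voltage magnitudes at 3 buses, bus 2 measured.
mu = np.array([1.00, 0.99, 0.98])
Sigma = np.array([[4e-4, 2e-4, 1e-4],
                  [2e-4, 4e-4, 2e-4],
                  [1e-4, 2e-4, 4e-4]])
mean, cov = conditional_gaussian(mu, Sigma, [2], np.array([0.975]))
print("estimated means:", mean, "SDs:", np.sqrt(np.diag(cov)))
```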

Relevance:

100.00%

Publisher:

Abstract:

A right of resale, or droit de suite (a right to follow), is a legislative instrument under intellectual property law which enables artists to receive a percentage of the sale price whenever artistic works are resold. A French legal scholar, Albert Vaunois, first articulated the need for a droit de suite in connection with fine art back in 1893. The French Government introduced a scheme to protect the right of resale in 1920, after controversy over artists living in poverty while public auction houses were profiting from the resale of their artistic creations. In the United States, there has been less support for a right of resale amongst legislatures. After lobbying from artists such as the king of pop art, Robert Rauschenberg, the state of California passed the Resale Royalties Act in 1977. At a Federal level, the United States Congress has shown some reluctance to provide national recognition for a right of resale. A number of other European countries have also established a right of resale. In 2001, the European Council adopted the Artists' Resale Right Directive and recognised that the 'artist's resale right forms an integral part of copyright and is an essential prerogative for authors.' In 2006, the United Kingdom promulgated regulations giving effect to a right of resale in that jurisdiction. A number of Latin American and African countries have likewise established a right of resale, and the New Zealand Parliament has debated a bill on a right of resale.