923 results for Acquired MRSA Bacteremia


Relevance:

10.00%

Publisher:

Abstract:

Purpose: Eccentric exercise has become the treatment of choice for Achilles tendinopathy. However, little is known about the acute response of tendons to eccentric exercise or the mechanisms underlying its clinical benefit. This research evaluated the sonographic characteristics and acute anteroposterior (AP) strain response of control (healthy), asymptomatic, and symptomatic Achilles tendons to eccentric exercise. Methods: Eleven male adults with unilateral midportion Achilles tendinopathy and nine control male adults without tendinopathy participated in the research. Sagittal sonograms of the Achilles tendon were acquired immediately before and after completion of a common eccentric rehabilitation exercise protocol and again 24 h later. Tendon thickness, echogenicity, and AP strain were determined 40 mm proximal to the calcaneal insertion. Results: Compared with the control tendon, both the asymptomatic and symptomatic tendons were thicker (P < 0.05) and hypoechoic (P < 0.05) at baseline. All tendons decreased in thickness immediately after eccentric exercise (P < 0.05). The symptomatic tendon was characterized by a significantly lower AP strain response to eccentric exercise compared with both the asymptomatic and control tendons (P < 0.05). AP strains did not differ in the control and asymptomatic tendons. For all tendons, preexercise thickness was restored 24 h after exercise completion. Conclusions: These observations support the concept that Achilles tendinopathy is a bilateral or systemic process and that structural changes associated with symptomatic tendinopathy alter fluid movement within the tendon matrix. Altered fluid movement may disrupt remodeling and homeostatic processes and represents a plausible mechanism underlying the progression of tendinopathy.
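
As an illustration of the strain measure described above, the following minimal Python sketch computes AP strain as the relative change in tendon thickness between pre- and post-exercise sonograms; the exact formula and the variable names are assumptions for illustration, not taken from the paper.

```python
def ap_strain(pre_thickness_mm: float, post_thickness_mm: float) -> float:
    """Anteroposterior (AP) strain as the relative change in tendon
    thickness after exercise (hypothetical definition; negative values
    indicate a reduction in AP thickness)."""
    return (post_thickness_mm - pre_thickness_mm) / pre_thickness_mm

# Example: a tendon measuring 6.0 mm before and 5.4 mm immediately after
# eccentric exercise shows a -10% AP strain response.
print(f"{ap_strain(6.0, 5.4):.2%}")
```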

Relevance:

10.00%

Publisher:

Abstract:

Proteoglycans (PGs) are crucial extracellular matrix (ECM) components that are present in all tissues and organs. Pathological remodeling of these macromolecules can lead to severe diseases such as osteoarthritis or rheumatoid arthritis. To date, PG-associated ECM alterations are routinely diagnosed by invasive analytical methods. Here, we employed Raman microspectroscopy, a laser-based, marker-free and non-destructive technique that allows the generation of spectra with peaks originating from molecular vibrations within a sample, to identify specific Raman bands that can be assigned to PGs within human and porcine cartilage samples and chondrocytes. Based on the non-invasively acquired Raman spectra, we further revealed that a prolonged in vitro culture leads to phenotypic alterations of chondrocytes, resulting in a decreased PG synthesis rate and loss of lipid content. Our results are the first to demonstrate the applicability of Raman microspectroscopy as an analytical and potential diagnostic tool for non-invasive cell and tissue state monitoring of cartilage in biomedical research. (© 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim).
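
A minimal sketch of the kind of band identification described here, assuming a generic peak-finding approach on a noisy spectrum; the band positions and all numbers below are placeholders, not the PG assignments reported in the study.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic Raman spectrum: intensity versus wavenumber (cm^-1), with two
# hypothetical Gaussian bands plus noise standing in for measured data.
wavenumbers = np.linspace(600, 1800, 1200)
intensity = (np.exp(-((wavenumbers - 1063) / 12) ** 2)
             + 0.8 * np.exp(-((wavenumbers - 1378) / 15) ** 2)
             + 0.05 * np.random.default_rng(0).normal(size=wavenumbers.size))

# Candidate bands are local maxima exceeding a prominence threshold.
peaks, _ = find_peaks(intensity, prominence=0.3)
print(wavenumbers[peaks].round(1))  # candidate band positions in cm^-1
```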

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Inherent and acquired cisplatin resistance reduces the effectiveness of this agent in the management of non-small cell lung cancer (NSCLC). Understanding the molecular mechanisms underlying this process may result in the development of novel agents to enhance sensitivity to cisplatin. Methods: An isogenic model of cisplatin resistance was generated in a panel of NSCLC cell lines (A549, SKMES-1, MOR, H460). Over a period of twelve months, cisplatin resistant (CisR) cell lines were derived from original, age-matched parent cells (PT) and subsequently characterized. Proliferation (MTT) and clonogenic survival assays (crystal violet) were carried out between PT and CisR cells. Cellular response to cisplatin-induced apoptosis and cell cycle distribution were examined by FACS analysis. A panel of cancer stem cell and pluripotent markers was examined in addition to the EMT proteins, c-Met and β-catenin. Cisplatin-DNA adduct formation, DNA damage (γH2AX) and cellular platinum uptake (ICP-MS) were also assessed. Results: Characterisation studies demonstrated a decreased proliferative capacity of lung tumour cells in response to cisplatin, increased resistance to cisplatin-induced cell death, accumulation of resistant cells in the G0/G1 phase of the cell cycle and enhanced clonogenic survival ability. Moreover, resistant cells displayed a putative stem-like signature with increased expression of CD133+/CD44+ cells and increased ALDH activity relative to their corresponding parental cells. The stem cell markers, Nanog, Oct-4 and SOX-2, were significantly upregulated as were the EMT markers, c-Met and β-catenin. While resistant sublines demonstrated decreased uptake of cisplatin in response to treatment, reduced cisplatin-GpG DNA adduct formation and significantly decreased γH2AX foci were observed compared to parental cell lines. Conclusion: Our results identified cisplatin resistant subpopulations of NSCLC cells with a putative stem-like signature, providing a further understanding of the cellular events associated with the cisplatin resistance phenotype in lung cancer. © 2013 Barr et al.

Relevance:

10.00%

Publisher:

Abstract:

The launch of the Centre of Research Excellence in Reducing Healthcare Associated Infection (CRE-RHAI) took place in Sydney on Friday 12 October 2012. The mission of the CRE-RHAI is to generate new knowledge about strategies to reduce healthcare associated infections and to provide data on the cost-effectiveness of infection control programs. In addition to the launch itself, an important part of this event was a stakeholder Consultation Workshop, which brought together several experts in the Australian infection control community. The aims of this workshop were to establish the research and clinical priorities in Australian infection control, to assess the importance of various multi-resistant organisms, and to gather information about decision making in infection control. We present here a summary and discussion of the responses we received.

Relevance:

10.00%

Publisher:

Abstract:

The preparedness theory of classical conditioning proposed by Seligman (1970, 1971) has been applied extensively over the past 40 years to explain the nature and "source" of human fear and phobias. In this review we examine the formative studies that tested the four defining characteristics of prepared learning with animal fear-relevant stimuli (typically snakes and spiders) and consider claims that fear of social stimuli, such as angry faces or faces of racial out-group members, may also be acquired via the same preferential learning mechanism. Exposition of critical differences between fear learning to animal and social stimuli suggests that a single account cannot adequately explain fear learning with both classes of stimuli. We demonstrate that fear conditioned to social stimuli is less robust than fear conditioned to animal stimuli, as it is susceptible to cognitive influence, and propose that it may instead reflect negative stereotypes and social norms. Thus, a theoretical model that can accommodate the influence of both biological and cultural factors is likely to have broader utility in the explanation of fear and avoidance responses than accounts based on a single mechanism.

Relevance:

10.00%

Publisher:

Abstract:

Located within the Creative Industries Faculty, the Animation team at the Queensland University of Technology (QUT) recently acquired a full-body inertial motion capture system. Our research to date has been predominantly concerned with interdisciplinary practice and the benefits this could bring to undergraduate teaching. From early experimental tests it was identified that there was a need to develop a framework for best practice and an efficient production workflow to ensure the system was being used to its full potential. Through our ongoing investigation we have identified at least three areas that stand to have long-term benefits from universities engaging in motion capture related research activity. This includes interdisciplinary collaborative research, undergraduate teaching and improved production processes. The following paper reports the early stages of our research, which explores the use of a full-body inertial motion capture (MoCap) solution in collaboration with performing artists.

Relevance:

10.00%

Publisher:

Abstract:

The current gold standard for the design of orthopaedic implants is 3D models of long bones obtained using computed tomography (CT). However, high-resolution CT imaging involves high radiation exposure, which limits its use in healthy human volunteers. Magnetic resonance imaging (MRI) is an attractive alternative for the scanning of healthy human volunteers for research purposes. Current limitations of MRI include difficulties of tissue segmentation within joints and long scanning times. In this work, we explore the possibility of overcoming these limitations through the use of MRI scanners operating at a higher field strength. We quantitatively compare the quality of anatomical MR images of long bones obtained at 1.5 T and 3 T and optimise the scanning protocol of 3 T MRI. FLASH images of the right leg of five human volunteers acquired at 1.5 T and 3 T were compared in terms of signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR). The comparison showed a relatively high CNR and SNR at 3 T for most regions of the femur and tibia, with the exception of the distal diaphyseal region of the femur and the mid diaphyseal region of the tibia. This was accompanied by an ~65% increase in the longitudinal spin relaxation time (T1) of the muscle at 3 T compared to 1.5 T. The results suggest that MRI at 3 T may be able to enhance the segmentability and potentially improve the accuracy of 3D anatomical models of long bones, compared to 1.5 T. We discuss how the total imaging times at 3 T can be kept short while maximising the CNR and SNR of the images obtained.
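
The SNR and CNR comparison can be illustrated with the standard region-of-interest definitions (mean signal over the standard deviation of background noise); the sketch below uses hypothetical intensity values and may differ from the exact estimators used in the study.

```python
import numpy as np

def snr(roi: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio: mean ROI signal over noise standard deviation."""
    return roi.mean() / noise.std()

def cnr(roi_a: np.ndarray, roi_b: np.ndarray, noise: np.ndarray) -> float:
    """Contrast-to-noise ratio between two tissues, e.g. bone vs. muscle."""
    return abs(roi_a.mean() - roi_b.mean()) / noise.std()

rng = np.random.default_rng(1)
bone = rng.normal(400.0, 20.0, 500)       # hypothetical ROI intensities
muscle = rng.normal(250.0, 20.0, 500)
background = rng.normal(0.0, 10.0, 500)   # air region used as the noise estimate
print(f"SNR(bone) = {snr(bone, background):.1f}, "
      f"CNR(bone/muscle) = {cnr(bone, muscle, background):.1f}")
```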

Relevance:

10.00%

Publisher:

Abstract:

Background: There are strong logical reasons why energy expended in metabolism should influence the energy acquired in food-intake behavior. However, the relation has never been established, and it is not known why certain people experience hunger in the presence of large amounts of body energy. Objective: We investigated the effect of the resting metabolic rate (RMR) on objective measures of whole-day food intake and hunger. Design: We carried out a 12-wk intervention that involved 41 overweight and obese men and women [mean ± SD age: 43.1 ± 7.5 y; BMI (in kg/m²): 30.7 ± 3.9] who were tested under conditions of physical activity (sedentary or active) and dietary energy density (17 or 10 kJ/g). RMR, daily energy intake, meal size, and hunger were assessed within the same day and across each condition. Results: We obtained evidence that RMR is correlated with meal size and daily energy intake in overweight and obese individuals. Participants with high RMRs showed increased levels of hunger across the day (P < 0.0001) and greater food intake (P < 0.00001) than did individuals with lower RMRs. These effects were independent of sex and food energy density. The change in RMR was also related to energy intake (P < 0.0001). Conclusions: We propose that RMR (largely determined by fat-free mass) may be a marker of energy intake and could represent a physiologic signal for hunger. These results may have implications for additional research possibilities in appetite, energy homeostasis, and obesity. This trial was registered under international standard identification for controlled trials as ISRCTN47291569.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Studies have shown that nurse staffing levels, among many other factors in the hospital setting, contribute to adverse patient outcomes. Concerns about patient safety and quality of care have resulted in numerous studies being conducted to examine the relationship between nurse staffing levels and the incidence of adverse patient events in both general wards and intensive care units. AIM: The aim of this paper is to review literature published in the previous 10 years which examines the relationship between nurse staffing levels and the incidence of mortality and morbidity in adult intensive care unit patients. METHODS: A literature search from 2002 to 2011 using the MEDLINE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, and Australian digital thesis databases was undertaken. The keywords used were: intensive care; critical care; staffing; nurse staffing; understaffing; nurse-patient ratios; adverse outcomes; mortality; ventilator-associated pneumonia; ventilator-acquired pneumonia; infection; length of stay; pressure ulcer/injury; unplanned extubation; medication error; readmission; myocardial infarction; and renal failure. A total of 19 articles were included in the review. Outcomes of interest are patient mortality and morbidity, particularly infection and pressure ulcers. RESULTS: Most of the studies were observational in nature, with variables obtained retrospectively from large hospital databases. Nurse staffing measures and patient outcomes varied widely across the studies. While an overall statistical association between increased nurse staffing levels and decreased adverse patient outcomes was not found, most studies concluded that a trend exists between increased nurse staffing levels and decreased adverse events. CONCLUSION: Most studies demonstrated a trend between increased nurse staffing levels and decreased adverse patient outcomes in the intensive care unit, which is consistent with previous literature. More robust research methodologies need to be tested in order to demonstrate this association more confidently and to reduce the influence of the many other confounders on patient outcomes, although this would be difficult to achieve in this field of research.

Relevance:

10.00%

Publisher:

Abstract:

Determining the properties and integrity of subchondral bone in the developmental stages of osteoarthritis, especially in a form that can facilitate real-time characterization for diagnostic and decision-making purposes, is still a matter for research and development. This paper presents relationships between near infrared absorption spectra and properties of subchondral bone obtained from 3 models of osteoarthritic degeneration induced in laboratory rats via: (i) meniscectomy (MSX); (ii) anterior cruciate ligament transection (ACL); and (iii) intra-articular injection of monoiodoacetate (1 mg) (MIA), in the right knee joint, with 12 rats per model group (N = 36). After 8 weeks, the animals were sacrificed and knee joints were collected. A custom-made diffuse reflectance NIR probe of diameter 5 mm was placed on the tibial surface and spectral data were acquired from each specimen in the wavenumber range 4000–12 500 cm−1. After spectral acquisition, micro computed tomography (micro-CT) was performed on the samples and subchondral bone parameters, namely bone volume (BV) and bone mineral density (BMD), were extracted from the micro-CT data. Statistical correlation analysis was then conducted between these parameters and regions of the near infrared spectra using multivariate techniques including principal component analysis (PCA), discriminant analysis (DA), and partial least squares (PLS) regression. Statistically significant linear correlations were found between the near infrared absorption spectra and subchondral bone BMD (R2 = 98.84%) and BV (R2 = 97.87%). In conclusion, near infrared spectroscopic probing can be used to detect, qualify and quantify changes in the composition of the subchondral bone, and could potentially assist in distinguishing healthy from OA bone as demonstrated with our laboratory rat models.
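
A minimal sketch of a PLS calibration of the kind reported above, using scikit-learn on synthetic data; the array shapes, the number of latent components, and the synthetic relationship between spectra and BMD are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(36, 200))   # 36 specimens x 200 NIR absorbance channels (synthetic)
bmd = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=36)  # synthetic BMD target

# Partial least squares regression; in practice the number of latent
# components would be chosen by cross-validation.
pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, X, bmd, cv=6, scoring="r2")
print(f"mean cross-validated R^2: {r2.mean():.2f}")
```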

Relevance:

10.00%

Publisher:

Abstract:

This paper is based on a PhD thesis that investigated how Hollywood's dominance of the movie industry arose and how it has been maintained over time. Major studio dominance and the global popularity of Hollywood movies have been the subject of numerous studies. An interdisciplinary literature review of the economics, management, marketing, film, media and culture literatures identified twenty different single- or multiple-factor explanations that try to account for major studio dominance at different time periods but cannot comprehensively explain how Hollywood acquired and maintained global dominance for nine decades. Existing strategic management and marketing theories were integrated into a ‘theoretical lens’ that enabled a historical analysis of Hollywood's longstanding dominance of the movie business to be undertaken from a strategic business perspective. This paper concludes that the major studios' rise to market leadership and enduring dominance can primarily be explained because they developed and maintained a set of strategic marketing management capabilities that were superior to rival firms and rival film industries. It is argued that a marketing orientation and effective strategic marketing management capabilities also provide a unifying theory for Hollywood's enduring dominance because they can account for each of the twenty previously identified explanations for that dominance.

Relevance:

10.00%

Publisher:

Abstract:

With the disintermediation of the financial markets, credit rating agencies filled investors' informational needs regarding the creditworthiness of borrowers. They acquired their privileged position in the financial market through their intellectual technology and reputational capital. To a large extent, they have gradually dissipated the authority of state regulators and supervisory authorities, whose reliance on credit ratings for regulatory purposes has steadily increased. But the recent credit crisis revives the question of whether states should retake their authority, and how far rating agencies should be subjected to the competition, transparency and accountability constraints imposed by the public and the market on state regulators and supervisory authorities. Against this backdrop, this article critically explores, for further assessment, the key concerns with credit rating agencies' functions in regulating the financial market.

Relevance:

10.00%

Publisher:

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. As it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so do we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.

Relevance:

10.00%

Publisher:

Abstract:

The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, and on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design be combined with distributed lag non-linear models? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects of temperature changes between neighbouring days on mortality be assessed? 4. Is there any change in temperature effects on mortality over time? To combine the case-crossover design and the distributed lag non-linear model, datasets of deaths, weather conditions (minimum temperature, mean temperature, maximum temperature, and relative humidity), and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days, and persisted for 10 days. Hot effects were acute and lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature. It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk compared with time series models that use a single site's temperature or averaged temperature from a network of sites. Daily mortality data were obtained from 163 locations across Brisbane city, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects of a single site's temperature, and of averaged temperature from 3 monitoring sites, on mortality. Squared Pearson scaled residuals were used to check the model fit. The results of this study show that even though spatiotemporal models gave a better model fit than time series models, spatiotemporal and time series models gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average temperature of multiple sites, were as good at estimating the association between temperature and mortality as a spatiotemporal model. A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000.
Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for mean temperature. I examined the variation in the effects of high temperatures on elderly mortality (age ≥ 75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into: a "main effect" due to high temperatures using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions and overall effects at both regional and national levels. The effects of high temperature (both main and added effects) on elderly mortality varied greatly by year, city and region. The years with higher heat-related mortality were often followed by those with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems. In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin. This allows the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. A time series model using a single site's temperature, or temperature averaged across several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a substantial drop or a substantial increase, raises the risk of mortality. The effect of high temperature on mortality is highly variable from year to year.
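
The temperature-change exposure described above (current day's mean minus the previous day's mean, with ±3 °C thresholds) can be sketched in a few lines of pandas; the data values and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical daily mean temperatures (°C) for a summer period.
tmean = pd.Series([28.0, 30.5, 26.9, 27.2, 31.1],
                  index=pd.date_range("2003-01-10", periods=5, freq="D"))

change = tmean.diff()  # current day's mean minus previous day's mean
exposure = pd.DataFrame({
    "tmean": tmean,
    "change": change,
    "drop_gt_3C": change < -3.0,  # exposure flag: drop of more than 3 °C
    "rise_gt_3C": change > 3.0,   # exposure flag: rise of more than 3 °C
})
print(exposure)
```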

Relevance:

10.00%

Publisher:

Abstract:

Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results when compared to the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results acquired using Monte Carlo simulation, however, often require orders of magnitude more calculation time to attain high precision, thereby reducing its utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and simpler, alternative, yet equivalent representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with increasing numbers n of cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to 3 orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like the ones made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
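
The 1/n parallel scaling claimed for the cloud environment can be illustrated with a toy, embarrassingly parallel Monte Carlo job. The Python sketch below estimates π rather than dose and says nothing about GEANT4 or radiotherapy physics; it only shows the wall-time behaviour as the worker count grows.

```python
import random
import time
from multiprocessing import Pool

def simulate(args):
    """Toy Monte Carlo kernel standing in for a dose calculation:
    estimate pi by sampling points in the unit square."""
    seed, n_histories = args
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 < 1.0
               for _ in range(n_histories))
    return 4.0 * hits / n_histories

if __name__ == "__main__":
    total = 2_000_000
    for n_workers in (1, 2, 4):  # wall time should fall roughly as 1/n
        start = time.perf_counter()
        with Pool(n_workers) as pool:
            parts = pool.map(simulate,
                             [(i, total // n_workers) for i in range(n_workers)])
        print(f"{n_workers} workers: pi ~ {sum(parts) / len(parts):.4f}, "
              f"{time.perf_counter() - start:.2f} s")
```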