Abstract:
A Monte Carlo model of an Elekta iViewGT amorphous silicon electronic portal imaging device (a-Si EPID) has been validated for pre-treatment verification of clinical IMRT treatment plans. The simulations used the BEAMnrc and DOSXYZnrc Monte Carlo codes to predict the response of the iViewGT a-Si EPID model. The predicted EPID images were compared to measured images obtained by delivering a photon beam from an Elekta Synergy linac to the Elekta iViewGT a-Si EPID, which was used with no additional build-up material. Frame-averaged EPID images were acquired and processed using in-house software. The agreement between the predicted and measured images was analyzed using the gamma analysis technique with acceptance criteria of 3% / 3 mm. The results show that the predicted EPID images for four clinical IMRT treatment plans agree well with the measured EPID signal: three prostate IMRT plans had average gamma pass rates above 95.0% and a spinal IMRT plan had an average gamma pass rate of 94.3%. While this work was being performed, a routine MLC calibration was carried out and one of the IMRT treatments was re-measured with the EPID; a change in the gamma pass rate for one field was observed. This motivated a series of experiments to investigate the sensitivity of the method by introducing delivery errors (MLC position offsets and dosimetric overshoot) into the simulated EPID images. The method was found to be sensitive to 1 mm leaf position errors and 10% overshoot errors.
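The gamma analysis referred to above combines a dose-difference criterion with a distance-to-agreement criterion at each pixel. The following is a minimal sketch of a brute-force global 2-D gamma pass-rate calculation under the 3% / 3 mm criterion, assuming the measured and predicted images are co-registered NumPy arrays on a common pixel grid; the function name, low-dose cutoff and search strategy are illustrative rather than the implementation used in the study.

```python
# Minimal sketch of a global 2-D gamma analysis (3% / 3 mm), assuming both
# images are co-registered arrays on the same pixel grid. Names and the
# brute-force search are illustrative, not the authors' implementation.
import numpy as np

def gamma_pass_rate(measured, predicted, pixel_mm, dose_tol=0.03, dist_tol_mm=3.0,
                    low_dose_cutoff=0.1):
    """Return the percentage of evaluated pixels with gamma <= 1."""
    ref_max = measured.max()
    dose_crit = dose_tol * ref_max                 # global dose criterion
    search_px = int(np.ceil(dist_tol_mm / pixel_mm))
    ny, nx = measured.shape
    gammas = []
    for iy in range(ny):
        for ix in range(nx):
            d_ref = measured[iy, ix]
            if d_ref < low_dose_cutoff * ref_max:  # skip the low-dose region
                continue
            best = np.inf
            for jy in range(max(0, iy - search_px), min(ny, iy + search_px + 1)):
                for jx in range(max(0, ix - search_px), min(nx, ix + search_px + 1)):
                    r_mm = pixel_mm * np.hypot(jy - iy, jx - ix)
                    dd = predicted[jy, jx] - d_ref
                    g2 = (r_mm / dist_tol_mm) ** 2 + (dd / dose_crit) ** 2
                    best = min(best, g2)
            gammas.append(np.sqrt(best))
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)
```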
Abstract:
This article assesses the extent to which the recently formulated Chinese concept of “Responsible Protection” (RP) offers a valuable contribution to the normative debate over R2P’s third pillar following the controversy over military intervention in Libya. While RP draws heavily on previous proposals such as the original 2001 ICISS report and Brazil’s “Responsibility while Protecting” (RwP), by amalgamating and re-packaging these earlier ideas in a more restrictive form the initiative represents a new and distinctive interpretation of R2P. However, some aspects of RP are framed too narrowly to provide workable guidelines for determining the permissibility of military intervention for civilian protection purposes, and should therefore be clarified and refined. Nevertheless, the Chinese proposal remains significant because it offers important insights into Beijing’s current stance on R2P. More broadly, China’s RP and Brazil’s RwP initiatives illustrate the growing willingness of rising, non-Western powers to assert their own normative preferences on sovereignty, intervention and global governance.
Abstract:
Background Chronic respiratory illnesses are the most common group of childhood chronic health conditions and are overrepresented in socially isolated groups. Objective To conduct a randomized controlled pilot trial to evaluate the efficacy of Breathe Easier Online (BEO), an Internet-based problem-solving program with minimal facilitator involvement, to improve psychosocial well-being in children and adolescents with a chronic respiratory condition. Methods We randomly assigned 42 socially isolated children and adolescents (18 males), aged between 10 and 17 years, to either a BEO (final n = 19) or a wait-list control (final n = 20) condition. In total, 3 participants (2 from BEO and 1 from control) did not complete the intervention. Psychosocial well-being was operationalized through self-reported scores on depression symptoms and social problem solving. Secondary outcome measures included self-reported attitudes toward their illness and spirometry results. Paper-and-pencil questionnaires were completed at the hospital when participants attended a briefing session at baseline (time 1), and in their homes after the intervention for the BEO group or after a matched 9-week period for the wait-list group (time 2). Results The two groups were comparable at baseline across all demographic measures (all F < 1). For the primary outcome measures, there were no significant group differences on depression (P = .17) or social problem solving (P = .61). However, following the online intervention, those in the BEO group reported significantly lower depression (P = .04), less impulsive/careless problem solving (P = .01), and an improvement in positive attitude toward their illness (P = .04) compared with baseline. The wait-list group did not show these differences. Children in the BEO group and their parents rated the online modules very favorably. Conclusions Although there were no significant group differences on primary outcome measures, our pilot data provide tentative support for the feasibility (acceptability and user satisfaction) and initial efficacy of an Internet-based intervention for improving well-being in children and adolescents with a chronic respiratory condition. Trial registration Australian New Zealand Clinical Trials Registry number: ACTRN12610000214033;
Abstract:
Driver sleepiness is a major contributor to road crashes. The current study examined the association between perceptions of the effectiveness of six sleepiness countermeasures and self-reports of continuing to drive while sleepy among 309 drivers, after controlling for the influence of age, sex, motivation for driving sleepy, and risk perception of sleepy driving. The results demonstrate that age, sex, motivation, and risk perception were significantly associated with self-reports of continuing to drive while sleepy, and that only one countermeasure was associated with this outcome. Further, age differences in self-reports of continuing to drive while sleepy were mediated by participants’ motivation and risk perception. These findings highlight modifiable factors that interventions could target in order to change drivers’ attitudes and behaviours around driving while sleepy.
Abstract:
This paper presents a numerical model for understanding particle transport and deposition in metal foam heat exchangers. Two-dimensional steady and unsteady numerical simulations of a standard single-row metal foam-wrapped tube bundle are performed for different particle size distributions, i.e. uniform and normal distributions. Effects of different particle sizes and fluid inlet velocities on the overall particle transport inside and outside the foam layer are also investigated. It was noted that a simplification made in previously published numerical works, e.g. uniform particle deposition in the foam, is not necessarily accurate, at least for the cases considered here. The results highlight the preferential particle deposition areas both along the tube walls and inside the foam using a particle deposition likelihood matrix developed in this work. This likelihood matrix is based on three criteria: particle local velocity, time spent in the foam, and volume fraction. It was noted that particles tend to deposit near both the front and rear stagnation points. The former is explained by the higher momentum and direct exposure of the particles to the foam, while the latter only accommodates small particles that can be entrained in the recirculation region formed behind the foam-wrapped tubes.
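The deposition likelihood matrix described above scores each location by combining the three stated criteria. A minimal sketch of one way to build such a matrix is given below, assuming the local particle velocity, residence time and particle volume fraction have already been averaged onto a common 2-D grid; the min-max normalisation and equal weighting are assumptions, since the abstract does not specify how the criteria are combined.

```python
# Illustrative construction of a per-cell deposition "likelihood matrix" from
# three field quantities on the same 2-D grid. The equal weighting and min-max
# normalisation are assumptions, not the paper's formulation.
import numpy as np

def normalise(field):
    """Scale a field to [0, 1]; constant fields map to zero."""
    span = field.max() - field.min()
    return np.zeros_like(field) if span == 0 else (field - field.min()) / span

def deposition_likelihood(local_velocity, residence_time, volume_fraction,
                          weights=(1 / 3, 1 / 3, 1 / 3)):
    """Higher score where velocity is low, residence time is long and the
    local particle volume fraction is high."""
    slow = 1.0 - normalise(local_velocity)   # slower particles deposit more readily
    dwell = normalise(residence_time)        # longer time spent in the foam
    load = normalise(volume_fraction)        # more particles present locally
    w_v, w_t, w_f = weights
    return w_v * slow + w_t * dwell + w_f * load
```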
Abstract:
Cold-formed steel sections are commonly used in low-rise commercial and residential buildings. During fire events, cold-formed steel structural elements in these buildings are exposed to elevated temperatures, hence there is a need to determine the residual strength of these structural elements after such events. However, only limited information is available on the residual strength of fire-exposed cold-formed steel members. This research investigates the residual distortional buckling capacities of fire-exposed cold-formed steel lipped channel sections. A series of compression tests of fire-exposed, short lipped channel columns made of varying steel grades and thicknesses was undertaken. Test columns were exposed to elevated temperatures up to 800 °C and were then allowed to cool down to ambient temperature before being tested to failure. Suitable finite element models of the tested columns were also developed and validated using the test results. The residual compression capacities of the tested columns were predicted using ambient temperature cold-formed steel design rules (AS/NZS 4600, AISI S100 and the Direct Strength Method). Post-fire mechanical properties obtained from a previous study were used. Comparison of the results showed that ambient temperature design rules for compression members can be used to predict the residual compression capacities of fire-exposed short or laterally restrained cold-formed steel columns, provided the maximum temperature experienced by the columns can be estimated after a fire event. Such residual capacity assessments will allow structural and fire engineers to make an accurate prediction of the safety of buildings after fire events. This paper presents the details of these experimental and numerical studies and the results.
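For the Direct Strength Method check mentioned above, the distortional buckling capacity of a column is obtained from its squash load and elastic distortional buckling load. The sketch below implements the nominal-capacity form of that check as given in AISI S100; the post-fire inputs (reduced yield stress, section area, buckling load) are placeholder values, not data from the tests.

```python
# Sketch of the Direct Strength Method distortional buckling check: nominal
# capacity Pnd as a function of the squash load Py and the elastic
# distortional buckling load Pcrd (AISI S100 form). Inputs are placeholders.
import math

def dsm_distortional_capacity(py, pcrd):
    """Nominal distortional buckling capacity Pnd (same units as py, pcrd)."""
    lam_d = math.sqrt(py / pcrd)                 # distortional slenderness
    if lam_d <= 0.561:
        return py                                # no distortional reduction
    ratio = (pcrd / py) ** 0.6
    return (1.0 - 0.25 * ratio) * ratio * py

# Example with assumed post-fire properties of a lipped channel section:
area_mm2 = 320.0                                 # gross section area
fy_postfire_mpa = 380.0                          # reduced yield stress after cooling
pcrd_kn = 95.0                                   # elastic distortional buckling load
py_kn = area_mm2 * fy_postfire_mpa / 1000.0      # squash load in kN
print(dsm_distortional_capacity(py_kn, pcrd_kn))
```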
Abstract:
Plant food materials are in very high demand in the consumer market, and improved food products and efficient processing techniques are therefore concurrently being researched in food engineering. In this context, numerical modelling and simulation techniques have a very high potential to reveal the fundamentals of the underlying mechanisms. However, numerical modelling of plant food materials during drying is quite challenging, mainly due to the complexity of the multiphase microstructure of the material, which undergoes excessive deformations during drying. Conventional grid-based modelling techniques have limited applicability here because of the fundamental limitations of their fixed grids. As a result, meshfree methods have recently been developed which offer a more adaptable, grid-free approach to problem domains of this nature. In this work, a recently developed meshfree two-dimensional plant tissue model is used for a comparative study of microscale morphological changes of several food materials during drying. The model couples Smoothed Particle Hydrodynamics (SPH) and the Discrete Element Method (DEM) to represent the fluid and solid phases of the cellular structure. Simulations were conducted on apple, potato, carrot and grape tissues, and the results are qualitatively and quantitatively compared with experimental findings from the literature. The study revealed that cellular deformations are highly sensitive to cell dimensions, cell wall physical and mechanical properties, middle lamella properties and turgor pressure. In particular, the meshfree model is well suited to simulating critically dried tissues at low moisture content and turgor pressure, which lead to cell wall wrinkling. The findings further highlight the potential applicability of the meshfree approach to model large deformations of the plant tissue microstructure during drying, providing a distinct advantage over state-of-the-art grid-based approaches.
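At the core of the SPH side of such a coupled model is a kernel-weighted summation over neighbouring particles. The sketch below shows the standard 2-D cubic spline kernel and a summation-density estimate, assuming an illustrative particle layout, mass and smoothing length; it is not the authors' tissue model, only the basic SPH building block it rests on.

```python
# Minimal sketch of the SPH density summation underlying the fluid phase of a
# meshfree cell model: each particle's density is a kernel-weighted sum over
# its neighbours. Particle layout, mass and smoothing length are illustrative.
import numpy as np

def cubic_spline_w(r, h):
    """Standard 2-D cubic spline smoothing kernel W(r, h)."""
    sigma = 10.0 / (7.0 * np.pi * h ** 2)
    q = r / h
    w = np.zeros_like(q)
    inner = q <= 1.0
    outer = (q > 1.0) & (q <= 2.0)
    w[inner] = sigma * (1.0 - 1.5 * q[inner] ** 2 + 0.75 * q[inner] ** 3)
    w[outer] = sigma * 0.25 * (2.0 - q[outer]) ** 3
    return w

def sph_density(positions, mass, h):
    """Summation density rho_i = sum_j m_j W(|x_i - x_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (mass * cubic_spline_w(r, h)).sum(axis=1)

# Toy example: a small square patch of "cell fluid" particles.
xy = np.array([[x, y] for x in np.linspace(0, 1, 5) for y in np.linspace(0, 1, 5)])
print(sph_density(xy, mass=0.04, h=0.3))
```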
Abstract:
The effects of estrogen deficiency on bone characteristics are site-dependent, with the most commonly studied sites being appendicular long bones (proximal femur and tibia) and axial bones (vertebrae). The effect on the maxillary and mandibular bones is still inconsistent and requires further investigation. This study was designed to evaluate bone quality in the posterior maxilla of ovariectomized rats in order to validate this site as an appropriate model for studying osteoporotic changes. Methods: Forty-eight 3-month-old female Sprague-Dawley rats were randomly divided into two groups: an ovariectomized group (OVX, n=24) and a sham-operated group (SHAM, n=24). Six rats from each group were randomly sacrificed at 8, 12, 16 and 20 weeks. Samples from the tibia and maxilla were collected for micro-CT and histological analysis. For the maxilla, the volume of interest (VOI) focused on the furcation areas of the first and second molars. Trabecular bone volume fraction (BV/TV, %), trabecular thickness (Tb.Th.), trabecular number (Tb.N.), trabecular separation (Tb.Sp.), and connectivity density (Conn.Dens) were analysed after micro-CT scanning. Results: At 8 weeks, BV/TV, Tb.Sp, Tb.N and Conn.Dens showed significant differences (P<0.05) between the OVX and SHAM groups in the tibia. Compared with the tibia, the maxilla developed osteoporosis at a later stage, with significant changes in maxillary bone density only occurring after 12 weeks. Compared with the SHAM group, both the first and second molars of the OVX group showed significantly decreased BV/TV values from 12 weeks, and these changes were sustained through 16 and 20 weeks. For Tb.Sp, there were significant increases in the OVX group compared with the SHAM group at 12, 16 and 20 weeks. Histological changes were highly consistent with the micro-CT results. Conclusion: This study established a method to quantify changes of intra-radicular alveolar bone in the posterior maxilla in an accepted rat osteoporosis model. The degree of osteoporotic change to trabecular bone architecture is site-dependent, and at least 3 months are required for osteoporotic effects to become apparent in the posterior maxilla following rat OVX.
Abstract:
Firstly, we would like to thank Ms. Alison Brough and her colleagues for their positive commentary on our published work [1] and their appraisal of our use of the “off-set plane” protocol for anthropometric analysis. The standardized protocols described in our manuscript have wide applications, ranging from forensic anthropology and paleodemographic research to clinical settings such as paediatric practice and orthopaedic surgical design. We affirm that the geometrically based reference tools commonly found in computer-aided design (CAD) programs such as Geomagic Design X® are imperative for more automated and precise measurement protocols for quantitative skeletal analysis. Therefore, we stand by our recommendation of software such as Amira and Geomagic Design X® in the contexts described in our manuscript...
Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have predominantly been applied to parameter estimation problems and less to model choice problems, due to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending the approach of Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), in which the posterior means of the model parameters, estimated through regression, form the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. The algorithm was applied to a validating example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates its utility in inferring the preferred transmission model for each pathogen.
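A stripped-down sketch of the model-choice idea described above follows: model probabilities estimated by a logistic regression fitted to pilot simulations (multinomial in general; binary here, with only two candidate models) serve as the summary statistic in a simple rejection-ABC step. The two toy models, priors, tolerance and reuse of the pilot table are assumptions for illustration, and the stepwise regression and reversible jump MCMC refinements of the proposed algorithm are omitted.

```python
# Rejection-ABC model choice using regression-estimated model probabilities as
# the summary statistic. Toy models, priors and tolerance are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
N_PILOT, N_OBS = 5000, 50

def simulate(model, size):
    """Model 0: Normal noise; model 1: heavier-tailed Laplace noise."""
    mu = rng.normal(0.0, 2.0)                    # shared prior on the location
    noise = rng.normal(0, 1, size) if model == 0 else rng.laplace(0, 1, size)
    return mu + noise

def raw_summaries(x):
    return np.array([x.mean(), x.std(), np.mean(np.abs(x - np.median(x)))])

# Pilot stage: learn a mapping from raw summaries to model probabilities.
models = rng.integers(0, 2, N_PILOT)
stats = np.array([raw_summaries(simulate(m, N_OBS)) for m in models])
clf = LogisticRegression(max_iter=1000).fit(stats, models)

# ABC stage: accept simulations whose estimated model probability is close to
# that of the observed data; the acceptance frequency estimates P(model | data).
observed = simulate(1, N_OBS)                    # stand-in for the real data
p_obs = clf.predict_proba(raw_summaries(observed)[None, :])[0, 1]
p_sim = clf.predict_proba(stats)[:, 1]
accepted = models[np.abs(p_sim - p_obs) < 0.05]
print("Posterior P(model 1 | data) ~", accepted.mean())
```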
Abstract:
Studies examining the ability of motivational enhancement therapy (MET) to augment education provision among ecstasy users have produced mixed results, and none have examined whether treatment fidelity was related to ecstasy use outcomes. The primary objectives of this multi-site, parallel, two-group randomized controlled trial were to determine whether a single session of MET could instill greater commitment to change and reduce ecstasy use and related problems more than an education-only intervention, and whether MET sessions delivered with higher treatment fidelity are associated with better outcomes. The secondary objective was to assess participants’ satisfaction with their assigned interventions. Participants (N = 174; mean age = 23.62 years) at two Australian universities were randomly allocated to receive a 15-minute educational session on ecstasy use (n = 85) or a 50-minute session of MET that included an educational component (n = 89). Primary outcomes were assessed at baseline and at 4, 16, and 24 weeks post-baseline, while the secondary outcome measure was assessed 4 weeks post-baseline by researchers blind to treatment allocation. Overall, treatment fidelity was acceptable to good in the MET condition. There were no statistical differences between the groups at follow-up on the primary outcomes of ecstasy use, ecstasy-related problems, and commitment to change. Both intervention groups reported a 50% reduction in their ecstasy use and a 20% reduction in the severity of their ecstasy-related problems at the 24-week follow-up. Commitment to change improved slightly for both groups (9%–17%). Despite the lack of between-group statistical differences on the primary outcomes, participants who received a single session of MET were slightly more satisfied with their intervention than those who received education only. MI fidelity was not associated with ecstasy use outcomes. Given these findings, future research should focus on examining mechanisms of change; such work may suggest new methods for enhancing outcomes.
Abstract:
Media architecture’s combination of the digital and the physical can trigger, enhance, and amplify urban experiences. In this paper, we examine how to bring about and foster more open and participatory approaches to engaging communities through media architecture by identifying novel ways to put some of the creative process into the hands of laypeople. We review technical, spatial, and social aspects of DIY phenomena with a view to better understanding maker cultures, communities, and practices. We synthesise our findings and ask if and how media architects, as a community of practice, can encourage the ‘open-sourcing’ of information and tools, allowing laypeople not only to participate but to become active instigators of change in their own right. We argue that enabling true DIY practices in media architecture may increase citizen control. Seeking design strategies that foster DIY approaches, we propose five areas for further work and investigation. The paper raises many questions, indicating ample room for further research into DIY Media Architecture.
Abstract:
The conventional approach to setting a milling unit is essentially based on the desire to achieve a particular bagasse moisture content or fibre fill in each nip of the mill. This approach relies on selecting the speed at which the mill will operate for the selected fibre rate. There is rarely any checking that the selected speed or the selected fibre fill is achieved, and the same set of assumptions is generally carried over for use again in the next year. The conventional approach largely ignores the fact that the selection of mill settings actually determines the speed at which the mill will operate: making an adjustment with the intent of changing the performance of the mill often also changes the speed of the mill as an unintended consequence. This paper presents an alternative approach to mill setting. The approach makes use of mill feeding theory to define the relationship between fibre rate, mill speed and mill settings, and uses that theory to provide an alternative means of determining the settings in some nips of the mill. Mill feeding theory shows that, as the feed work opening reduces, roll speed increases. The theory also shows that there is an optimal underfeed opening and Donnelly chute exit opening that minimise roll speed, and that the current South African guidelines appear to be well away from those optimal values.
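As a purely illustrative example of how a chosen setting implies a roll speed, one can assume a simple volumetric feeding relation in which fibre rate equals fibre fill density times work opening times roll length times roll surface speed. The sketch below uses that assumed relation, with placeholder numbers, to show that reducing the feed work opening at a fixed fibre rate raises the implied roll speed, in line with the trend stated above; it is not the mill feeding theory used in the paper.

```python
# Illustrative (much simplified) volumetric feeding relation:
#   fibre rate = fibre fill density * work opening * roll length * surface speed
# The relation and all numbers below are assumptions for illustration only.
def implied_roll_surface_speed(fibre_rate_kg_s, fibre_fill_kg_m3,
                               work_opening_m, roll_length_m):
    """Roll surface speed (m/s) needed to pass the fibre rate through the nip."""
    return fibre_rate_kg_s / (fibre_fill_kg_m3 * work_opening_m * roll_length_m)

# Halving the feed work opening at the same fibre rate doubles the implied speed.
for opening in (0.40, 0.20):
    v = implied_roll_surface_speed(fibre_rate_kg_s=12.0, fibre_fill_kg_m3=90.0,
                                   work_opening_m=opening, roll_length_m=2.1)
    print(f"work opening {opening:.2f} m -> surface speed {v:.2f} m/s")
```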
Abstract:
Product Ecosystem theory is an emerging framework which holds that disruptive, “game changing” innovation is only possible when the entire ecosystem is considered. When environmental variables change faster than products or services can adapt, disruptive innovation is required to keep pace. This has many parallels with natural ecosystems, where species that cannot keep up with changes to the environment will struggle or become extinct. In this case the environment is the city, the environmental pressures are pollution and congestion, the product is the car, and the product ecosystem comprises roads, bridges, traffic lights, legislation, refuelling facilities, etc. Each of these components is the responsibility of a different organisation, so any change that affects the whole ecosystem requires a transdisciplinary approach. As a simple example, cars that communicate wirelessly with traffic lights are only of value if wireless-enabled traffic lights exist, and vice versa. Cars that drive themselves are technically possible, but legislation in most places does not allow their use. According to innovation theory, incremental innovation tends to chase ever-diminishing returns and becomes increasingly unable to tackle the “big issues.” Eventually, “game changing” disruptive innovation comes along and solves the “big issues” and/or provides new opportunities. Seen through this lens, the environmental pressures of urban traffic congestion and pollution are the “big issues,” and it can be argued that the design of cars and the other components of the product ecosystem follows an incremental innovation approach; that is why the “big issues” remain unresolved. This paper explores the problems of pollution and congestion in urban environments from a Product Ecosystem perspective, and from this a strategy is proposed for a transdisciplinary approach to developing and implementing solutions.
Abstract:
Migraine is a common neurological disorder classified by the World Health Organisation (WHO) as one of the top twenty most debilitating diseases in the developed world. Current therapies are effective for only a proportion of sufferers, and new therapeutic targets are desperately needed to alleviate this burden. The role of epigenetics in the development of many complex diseases, including migraine, has recently become an emerging topic. By understanding the importance of acetylation, methylation and other epigenetic modifications, it follows that this modification process is a potential target for manipulating epigenetic status with the goal of treating disease. Bisulphite sequencing and methylated DNA immunoprecipitation have been used to demonstrate the presence of methylated cytosines in the human D-loop of mitochondrial DNA (mtDNA), showing that the mitochondrial genome is methylated. For the first time, it has been shown that mtDNA epigenetic status differs between healthy controls and those with disease, especially for neurodegenerative and age-related conditions. Given the co-morbidities with migraine and the suggested link between mitochondrial dysfunction and a lowered threshold for triggering a migraine attack, mitochondrial methylation may be a new avenue to pursue. Creative thinking and new approaches are needed to solve complex problems, and a systems biology approach, in which multiple layers of information are integrated, is becoming more important in complex disease modelling.