235 results for analogy calculation
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of program code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
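As a hedged illustration of the kind of commutator expression involved (a textbook identity, not the paper's actual ECF construction): for a single rotation generated by an operator A, the derivative of the density matrix with respect to the rotation angle is itself a commutator.

```latex
% Standard identity (not the ECF itself): for a propagator
%   \rho(\theta) = e^{-i\theta A}\,\rho_0\,e^{+i\theta A},
% differentiating with respect to the parameter \theta gives
\frac{\partial \rho(\theta)}{\partial \theta} = -\,i\,\bigl[A,\ \rho(\theta)\bigr]
```

Analytic derivatives of this form are what make finite-difference error estimates unnecessary.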
Abstract:
The Monte Carlo DICOM Tool-Kit (MCDTK) is a software suite designed for treatment plan dose verification, using the BEAMnrc and DOSXYZnrc Monte Carlo codes. MCDTK converts DICOM-format treatment plan information into Monte Carlo input files and compares the results of Monte Carlo treatment simulations with conventional treatment planning dose calculations. In this study, a treatment is planned using a commercial treatment planning system, delivered to a pelvis phantom containing ten thermoluminescent dosimeters and simulated using BEAMnrc and DOSXYZnrc with inputs derived from MCDTK. The dosimetric accuracy of the Monte Carlo data is then evaluated via comparisons with the dose distribution obtained from the treatment planning system as well as the in-phantom point dose measurements. The simulated beam arrangement produced by MCDTK is found to be in geometric agreement with the planned treatment. An isodose display generated from the Monte Carlo data by MCDTK shows general agreement with the isodose display obtained from the treatment planning system, except for small regions around density heterogeneities in the phantom, where the pencil-beam dose calculation performed by the treatment planning system is likely to be less accurate. All point dose measurements agree with the Monte Carlo data obtained using MCDTK, within confidence limits, and all except one of these point dose measurements show closer agreement with the Monte Carlo data than with the doses calculated by the treatment planning system. This study provides a simple demonstration of the geometric and dosimetric accuracy of Monte Carlo simulations based on information from MCDTK.
Abstract:
Objective: We investigated to what extent changes in metabolic rate and the composition of weight loss explained the less-than-expected weight loss in obese men and women during a diet-plus-exercise intervention. Design: Sixteen obese men and women (41 ± 9 years; BMI 39 ± 6 kg/m²) were investigated in energy balance before, after and twice during a 12-week VLED (565–650 kcal/day) plus exercise (aerobic plus resistance training) intervention. The relative energy deficit (EDef) from baseline requirements was severe (74–87%). Body composition was measured by deuterium dilution and DXA, and resting metabolic rate (RMR) by indirect calorimetry. Fat mass (FM) and fat-free mass (FFM) were converted into energy equivalents using the constants 9.45 kcal/g FM and 1.13 kcal/g FFM. Predicted weight loss was calculated from the energy deficit using the '7700 kcal/kg rule'. Results: Changes in weight (-18.6 ± 5.0 kg), FM (-15.5 ± 4.3 kg), and FFM (-3.1 ± 1.9 kg) did not differ between genders. Measured weight loss was on average 67% of the predicted value, but ranged from 39 to 94%. Relative EDef was correlated with the decrease in RMR (R=0.70, P<0.01), and the decrease in RMR correlated with the difference between actual and expected weight loss (R=0.51, P<0.01). Changes in metabolic rate explained on average 67% of the less-than-expected weight loss, and variability in the proportion of weight lost as FM accounted for a further 5%. On average, after adjustment for changes in metabolic rate and the composition of weight lost, actual weight loss reached 90% of predicted values. Conclusion: Although weight loss was 33% lower than predicted at baseline from standard energy equivalents, the majority of this differential was explained by physiological variables. While lower-than-expected weight loss is often attributed to incomplete adherence to prescribed interventions, the influence of baseline calculation errors and metabolic down-regulation should not be discounted.
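The arithmetic behind the comparison can be sketched as follows. The constants (9.45 kcal/g FM, 1.13 kcal/g FFM, the 7700 kcal/kg rule) and the mean changes in FM and FFM come from the abstract; the code itself is only an illustrative check, not the study's analysis.

```python
# Energy bookkeeping sketch: constants and mean changes from the abstract,
# arithmetic is illustrative only.

KCAL_PER_G_FM = 9.45   # energy equivalent of fat mass (kcal/g)
KCAL_PER_G_FFM = 1.13  # energy equivalent of fat-free mass (kcal/g)

def energy_of_loss(delta_fm_kg, delta_ffm_kg):
    """Energy content (kcal) of a given composition of weight lost."""
    return delta_fm_kg * 1000 * KCAL_PER_G_FM + delta_ffm_kg * 1000 * KCAL_PER_G_FFM

def predicted_loss_kg(deficit_kcal):
    """Weight loss (kg) predicted by the generic '7700 kcal/kg rule'."""
    return deficit_kcal / 7700.0

# Mean changes reported in the abstract: FM -15.5 kg, FFM -3.1 kg.
e = energy_of_loss(15.5, 3.1)        # kcal actually "paid for" the loss
density = e / ((15.5 + 3.1) * 1000)  # kcal per g of tissue actually lost
```

Because the tissue actually lost is mostly fat, its energy density (about 8 kcal/g here) exceeds the generic 7.7 kcal/g, so the same deficit buys less weight loss than the rule predicts.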
Abstract:
This paper presents two novel concepts to enhance the accuracy of damage detection using the Modal Strain Energy based Damage Index (MSEDI) in the presence of noise in the mode shape data. Firstly, the paper presents a sequential curve-fitting technique that reduces the effect of noise on the calculation of the MSEDI more effectively than the two commonly used curve-fitting techniques, namely polynomial and Fourier series. Secondly, a probability-based Generalized Damage Localization Index (GDLI) is proposed as a viable improvement to the damage detection process. The study uses a validated ABAQUS finite-element model of a reinforced concrete beam to obtain mode shape data in the undamaged and damaged states. Noise is simulated by adding three levels of random noise (1%, 3%, and 5%) to the mode shape data. Results show that damage detection is enhanced as the number of modes and samples used with the GDLI increases.
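A minimal sketch of the smoothing idea: fit a noisy mode shape with a polynomial before differentiating twice for curvature, the quantity modal strain energy methods are built on. The mode shape, noise level, and fit degree below are illustrative choices; the paper's MSEDI/GDLI formulas are not reproduced.

```python
# Smooth a noisy mode shape by polynomial fitting, then compute curvature.
# Shape, noise level, and degree are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 101)
phi = np.sin(np.pi * x)                      # first mode of a simply supported beam
noisy = phi + 0.03 * phi.max() * rng.standard_normal(x.size)  # ~3% noise

coeffs = np.polyfit(x, noisy, deg=7)         # smoothing fit
curv = np.polyval(np.polyder(coeffs, 2), x)  # second derivative ~ curvature
# Modal strain energy is proportional to the integral of curvature squared.
mse = float(np.sum(curv**2) * (x[1] - x[0]))
```

Differentiating the raw noisy samples directly would amplify the noise; fitting first is what keeps the curvature, and hence the damage index, usable.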
Abstract:
With the emergence of Unmanned Aircraft Systems (UAS) there is a growing need for safety standards and regulatory frameworks to manage the risks associated with their operations. The primary driver for airworthiness regulations (i.e., those governing the design, manufacture, maintenance and operation of UAS) is the risk presented to people in the regions overflown by the aircraft. Models characterising the nature of these risks are needed to inform the development of airworthiness regulations. The output from these models should include measures of the collective, individual and societal risk. A brief review of these measures is provided. Based on the review, it was determined that a model of the operation of a UAS over inhabited areas must be capable of describing the distribution of possible impact locations, given a failure at a particular point in the flight plan. Existing models either do not take the impact distribution into consideration, or propose complex and computationally expensive methods for its calculation. A computationally efficient approach for estimating the boundary (and in turn the area) of the impact distribution for fixed-wing unmanned aircraft is proposed. A series of geometric templates that approximate the impact distributions is derived using an empirical analysis of the results obtained from a 6-Degree-of-Freedom (6DoF) simulation. The impact distributions can be aggregated to provide impact footprint distributions for a range of generic phases of flight and missions. The maximum impact footprint areas obtained from the geometric templates are shown to have a relative error of typically less than 1% compared with the areas calculated using the computationally more expensive 6DoF simulation. Computation times for the geometric models are on the order of one second or less using a standard desktop computer. Future work includes characterising the distribution of impact locations within the footprint boundaries.
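A toy geometric template conveys the idea of trading 6DoF fidelity for speed. The disc-shaped footprint below, with radius equal to the maximum glide distance h·(L/D), is an assumption for illustration only; the paper derives its templates empirically from 6DoF results.

```python
# Toy footprint template (an illustrative assumption, not the paper's
# empirically derived templates): model the reachable impact region after
# a failure at altitude h as a disc of radius h * (L/D).
import math

def footprint_area(altitude_m, glide_ratio):
    """Area (m^2) of a circular footprint with radius altitude * glide ratio."""
    radius = altitude_m * glide_ratio
    return math.pi * radius**2
```

Evaluating a closed-form template like this takes microseconds, which is why a template fitted to match 6DoF-derived footprints to within ~1% can replace the simulation in risk models.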
Abstract:
Most unsignalised intersection capacity calculation procedures are based on gap acceptance models. The accuracy of critical gap estimation affects the accuracy of capacity and delay estimation. Several methods have been published to estimate drivers’ sample mean critical gap, with the Maximum Likelihood Estimation (MLE) technique regarded as the most accurate. This study assesses three novel methods, the Average Central Gap (ACG), Strength Weighted Central Gap (SWCG), and Mode Central Gap (MCG) methods, against MLE for their fidelity in rendering true sample mean critical gaps. A Monte Carlo event-based simulation model was used to draw the maximum rejected gap and the accepted gap for each of a sample of 300 drivers across 32 simulation runs. The simulated mean critical gap was varied between 3 s and 8 s, while the offered gap rate was varied between 0.05 veh/s and 0.55 veh/s. This study affirms that MLE provides a close-to-perfect fit to the simulated mean critical gaps across a broad range of conditions. The MCG method also provides an almost perfect fit and has superior computational simplicity and efficiency to MLE. The SWCG method performs robustly under high flows but poorly under low to moderate flows. Further research using field traffic data is recommended, under a variety of minor-stream and major-stream flow conditions and for a variety of minor-stream movement types, to compare critical gap estimates from MLE and MCG. Should the MCG method prove as robust as MLE, serious consideration should be given to its adoption for estimating critical gap parameters in guidelines.
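The event-based draw can be sketched as below. The exponential gap stream, the normal critical-gap distribution, and the simple midpoint estimator at the end are illustrative stand-ins; they are not the paper's ACG/SWCG/MCG definitions or the MLE.

```python
# Event-based gap-acceptance draw: for each driver, record the largest
# rejected gap and the accepted gap. Distributions and the midpoint
# estimator are illustrative assumptions, not the paper's methods.
import random

random.seed(1)

def draw_driver(critical_gap, gap_rate):
    """Offer exponentially distributed gaps until one >= critical_gap is accepted."""
    max_rejected = 0.0
    while True:
        gap = random.expovariate(gap_rate)
        if gap >= critical_gap:
            return max_rejected, gap
        max_rejected = max(max_rejected, gap)

true_mean, rate = 4.0, 0.2  # seconds, veh/s (within the abstract's ranges)
drivers = [draw_driver(random.gauss(true_mean, 0.5), rate) for _ in range(300)]
estimate = sum((r + a) / 2 for r, a in drivers) / len(drivers)
```

The naive midpoint of (max rejected, accepted) is biased, since the accepted gap overshoots the critical gap by the exponential's memoryless mean; correcting for exactly this kind of bias is what the MLE and the central-gap methods compete on.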
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise over a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles mainly run in stop-and-go conditions, vehicle sound power levels (for accelerating, decelerating, cruising and idling vehicles) are incorporated in DRONE. The calculation performance of DRONE is improved by evaluating the noise in two steps: first estimating a unit-noise database and then integrating it with the traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the implementation of DRONE for Tsukuba City is presented.
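Combining stored per-source "unit noise" contributions into a receiver level rests on standard energetic summation of sound levels, which can be sketched as follows (the helper and values are illustrative, not DRONE's actual interface):

```python
# Energetic summation of sound levels: convert each dB level to power,
# sum, and convert back. A standard acoustics identity; the function name
# and inputs are illustrative.
import math

def combine_levels(levels_db):
    """Energetically sum a list of sound levels given in dB."""
    return 10.0 * math.log10(sum(10.0 ** (level / 10.0) for level in levels_db))
```

Two equal 60 dB contributions combine to about 63 dB, not 120 dB, which is why per-segment levels can be precomputed once and cheaply aggregated per receiver in the second step.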
Abstract:
A number of groups around the world are working in the field of three-dimensional (3D) ultrasound (US) in order to obtain higher-quality diagnostic information. 3D US, in general, involves collecting a sequence of conventional 2D US images along with information on the position and orientation of each image plane. A transformation matrix is calculated relating image space to real-world space. This allows image pixels and region-of-interest (ROI) points drawn on the image to be displayed in 3D. The 3D data can be used for the production of volume- or surface-rendered images, or for the direct calculation of ROI volumes.
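The image-to-world mapping can be sketched with a homogeneous transformation: a pixel is scaled to physical units on the image plane, then carried through a 4x4 matrix. The scale factors and matrix below are illustrative assumptions, not values from a calibrated probe.

```python
# Map a 2D image pixel into 3D world space via a 4x4 homogeneous
# transformation. Scale factors and the matrix are illustrative.
import numpy as np

def image_to_world(T, u, v, sx=0.2, sy=0.2):
    """Map pixel (u, v), with scales sx, sy in mm/pixel, through matrix T."""
    p_image = np.array([u * sx, v * sy, 0.0, 1.0])  # image plane is z = 0
    return (T @ p_image)[:3]

T = np.eye(4)
T[:3, 3] = [10.0, 0.0, 5.0]  # pure translation as a toy example
```

In practice T comes from the position sensor attached to the probe plus a calibration; once known, every pixel and ROI point in every frame can be placed in the common 3D coordinate system.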
Abstract:
If the current discourses of progress are to be believed, the new or social media promise a kaleidoscope of opportunity for connecting and informing citizens, allegedly by revitalizing the fading legitimacy and practice of institutions and providing an agent for social interaction. However, as social media adoption has increased, it has revealed a wealth of contradictions, both of its own making and in its reproduction of past action. This has created a crisis for traditional media as well as for public relations. For example, social media such as WikiLeaks have bypassed official channels for government information. In other cases, social media such as Facebook and Twitter informed BBC coverage of the Rio Olympics. Although old media are unlikely to go away, social media have had an impact, with several large family-based media companies collapsing or being reintegrated into the new paradigm. To use Walter Lippmann’s analogy of the phantom public, the social media contradictorily serve both to dissipate the phantom in part and to reinforce it...
Abstract:
Cloud computing allows vast computational resources to be leveraged quickly and easily in bursts, as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware. Funding source: Cancer Australia (Department of Health and Ageing) Research Grant 614217.
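The cost argument can be sketched under an assumed one-hour billing granularity: n machines running a T-hour job are billed n·⌈T/n⌉ machine-hours, which equals the serial cost T only when n divides T. The functions below are an illustration of that arithmetic, not the paper's tooling.

```python
# Parallel Monte Carlo cost sketch under assumed per-hour billing.
import math

def completion_time(total_hours, n):
    """Wall-clock time for a perfectly parallel total_hours job on n machines."""
    return total_hours / n

def relative_cost(total_hours, n):
    """Machine-hours billed on n machines, relative to one machine."""
    return n * math.ceil(total_hours / n) / total_hours
```

For a 12-hour job, n = 4 finishes in 3 hours at relative cost 1.0, while n = 5 is faster but bills 15 machine-hours for 12 hours of work, matching the observation that cost is optimal when n divides the total simulation time.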
Abstract:
This spreadsheet calculates carbonate speciation using carbonate equilibrium equations at standard conditions (T = 25°C) with ionic strength corrections. The user will typically be able to calculate the different carbonate species by entering total alkalinity and pH. The spreadsheet contains additional tools to calculate the Langelier Index for calcium and the SAR (sodium adsorption ratio) of the water. Note that in this last calculation the potential for calcium precipitation is not taken into account. The last tool presented here is a carbonate speciation tool for open systems (i.e., open to the atmosphere) which takes atmospheric pressure into account.
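A minimal speciation sketch at 25°C, ignoring the ionic-strength corrections the spreadsheet applies: the pK values below are standard freshwater constants (pK1 = 6.35, pK2 = 10.33, pKw = 14.0), and the alkalinity balance neglects non-carbonate contributions.

```python
# Carbonate speciation from total alkalinity and pH at 25 C, no ionic
# strength correction. Constants are standard freshwater values.
K1, K2, KW = 10**-6.35, 10**-10.33, 10**-14.0

def carbonate_species(alk_eq_per_l, ph):
    """Return (H2CO3*, HCO3-, CO3--) molar concentrations.

    Uses Alk = [HCO3-] + 2[CO3--] + [OH-] - [H+] and the K1, K2 equilibria.
    """
    h = 10**-ph
    oh = KW / h
    hco3 = (alk_eq_per_l - oh + h) / (1 + 2 * K2 / h)
    co3 = hco3 * K2 / h
    h2co3 = hco3 * h / K1
    return h2co3, hco3, co3

h2co3, hco3, co3 = carbonate_species(2e-3, 8.3)  # typical surface water
```

At a typical pH of 8.3, bicarbonate dominates, with carbonate and dissolved CO2 both two orders of magnitude smaller, which is the balance the spreadsheet's closed-system tool computes before applying activity corrections.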
Abstract:
The civil liability provisions relating to the assessment of damages for past and future economic loss have abrogated the common law principle of full compensation by imposing restrictions on the damages award, most commonly by a “three times average weekly earnings” cap. This consideration of the impact of those provisions is informed by a case study of the Supreme Court of Victoria Court of Appeal decision, Tuohey v Freemasons Hospital (Tuohey), which addressed the construction and arithmetic operation of the Victorian cap for high income earners. While conclusions as to the operation of the cap outside Victoria can be drawn from Tuohey, a number of issues await judicial determination. These issues, which include the impact of the damages caps on the calculation of damages for economic loss in circumstances of fluctuating income; vicissitudes; contributory negligence; claims per quod servitium amisit; and claims by dependants, are identified and potential resolutions discussed.
Abstract:
Over the past 30 years the nature of airport precincts has changed significantly, from purely aviation services to a full range of retail, commercial, industrial and other non-aviation uses. Most major airports in Australia are owned and operated by the private sector but are subject to long-term head leases to the Federal Government, with subsequent subleases in place to users of the land. The lease term available for both aviation and non-aviation tenants is subject to the head lease term, and in a number of Australian airport locations these head leases are now two-thirds of the way through their initial 50-year term, which raises a number of issues from a valuation and ongoing development perspective. For our airport precincts to continue to offer levels of infrastructure and services that are comparable to or better than many commercial centres in the same location, policy makers need to understand the impact of the uncertainty that exists as the current lease term nears expiration, especially in relation to the renewed lease term and rental payments. This paper reviews the changes in airport precinct ownership, management and development in Australia and highlights the valuation and rental assessment issues currently facing this property sector.
Abstract:
It is widely recognized that the quality of design is crucial to the success of the construction or production process, and that fairly minor changes in design can have major effects on the cost and efficiency of production and construction, as well as on the usefulness, constructability and marketability of the product, especially in high-rise residential property development. The purpose of this study is to suggest a framework model for property managers that considers the sustainability and building quality of property development in high-rise residential complexes. The paper evaluates and ranks the importance and frequency of the building quality factors that affect sustainability and comfort of living for residents in selected high-rise residential complexes in Malaysia. A total of 500 respondents, among them 20 property managers, participated in this study. The respondents were asked to indicate how important each item of building equipment was in giving them comfort of living in the selected high-rise residential complexes. The data were then used to calculate importance indices, which enabled the factors to be ranked. A framework model was then developed to guide property managers in preparing their properties for residents of the complexes. Accordingly, the living satisfaction captured by the framework model plays a meaningful role in planning and developing sustainable, good-quality buildings in Malaysian high-rise residential complexes.