953 results for DFT calculation
Abstract:
Most unsignalised intersection capacity calculation procedures are based on gap acceptance models. The accuracy of critical gap estimation affects the accuracy of capacity and delay estimation. Several methods have been published to estimate drivers' sample mean critical gap, with the Maximum Likelihood Estimation (MLE) technique regarded as the most accurate. This study assesses three novel methods: the Average Central Gap (ACG) method, the Strength Weighted Central Gap (SWCG) method, and the Mode Central Gap (MCG) method, against MLE for their fidelity in reproducing true sample mean critical gaps. A Monte Carlo event-based simulation model was used to draw the maximum rejected gap and the accepted gap for each of a sample of 300 drivers across 32 simulation runs. The simulated mean critical gap was varied between 3 s and 8 s, while the offered gap rate was varied between 0.05 veh/s and 0.55 veh/s. This study affirms that MLE provides a close to perfect fit to simulated mean critical gaps across a broad range of conditions. The MCG method also provides an almost perfect fit and has superior computational simplicity and efficiency to MLE. The SWCG method performs robustly under high flows but poorly under low to moderate flows. Further research is recommended using field traffic data, under a variety of minor stream and major stream flow conditions and for a variety of minor stream movement types, to compare critical gap estimates from MLE against MCG. Should the MCG method prove as robust as MLE, serious consideration should be given to its adoption for estimating critical gap parameters in guidelines.
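A minimal sketch of the kind of Monte Carlo draw described above, assuming exponentially distributed major-stream gaps at the offered gap rate and a log-normal distribution of driver critical gaps; both distributional choices and all parameter values are illustrative assumptions, not the authors' specification:

```python
import math
import random

def draw_driver_observation(critical_gap, gap_rate, rng):
    """Offer exponentially distributed major-stream gaps to one driver and
    return (max_rejected_gap, accepted_gap); the first gap at least as long
    as the driver's critical gap is accepted, all earlier gaps are rejected."""
    max_rejected = 0.0
    while True:
        gap = rng.expovariate(gap_rate)          # mean gap = 1 / gap_rate (s)
        if gap >= critical_gap:
            return max_rejected, gap
        max_rejected = max(max_rejected, gap)

def simulate_sample(n_drivers=300, mean_tc=5.0, sd_tc=1.0, gap_rate=0.3, seed=1):
    """Draw one sample of (max rejected gap, accepted gap) pairs."""
    rng = random.Random(seed)
    # Convert the assumed mean/sd of the critical gap to log-normal parameters.
    mu = math.log(mean_tc ** 2 / math.sqrt(mean_tc ** 2 + sd_tc ** 2))
    sigma = math.sqrt(math.log(1.0 + (sd_tc / mean_tc) ** 2))
    return [draw_driver_observation(rng.lognormvariate(mu, sigma), gap_rate, rng)
            for _ in range(n_drivers)]

if __name__ == "__main__":
    print(simulate_sample()[:3])
```

Estimators such as MLE or the MCG method would then be applied to these (maximum rejected gap, accepted gap) pairs to recover the sample mean critical gap.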
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise over a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles mainly operate in stop-and-go conditions, vehicle sound power levels (for accelerating, decelerating, cruising and idling vehicles) are incorporated in DRONE. The computational performance of DRONE is improved by evaluating noise in two steps: first estimating a unit noise database and then integrating it with the traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the application of DRONE to Tsukuba city is presented.
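The abstract does not give DRONE's equations; as context, the standard way to aggregate the sound power levels of several vehicles from a traffic simulation into a single level is energetic summation, sketched here with purely illustrative values:

```python
import math

def combine_levels(levels_db):
    """Energetically combine individual sound (power) levels, in dB."""
    return 10.0 * math.log10(sum(10.0 ** (level / 10.0) for level in levels_db))

# Three vehicles in different running states (illustrative levels only).
print(round(combine_levels([95.0, 92.0, 88.0]), 1))
```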
Abstract:
A number of groups around the world are working in the field of three-dimensional (3D) ultrasound (US) in order to obtain higher quality diagnostic information. 3D US, in general, involves collecting a sequence of conventional 2D US images along with information on the position and orientation of each image plane. A transformation matrix is calculated relating image space to real-world space. This allows image pixels, and region of interest (ROI) points drawn on the image, to be displayed in 3D. The 3D data can be used for the production of volume- or surface-rendered images, or for the direct calculation of ROI volumes.
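A minimal illustration of the mapping described above, using a 4x4 homogeneous transformation to take a pixel position in image space to real-world coordinates; the matrix entries and pixel spacing below are placeholder values, not calibration results from the paper:

```python
import numpy as np

# Hypothetical calibration: rotation and translation of the image plane in world space (mm).
T_image_to_world = np.array([
    [0.0, -1.0, 0.0, 120.0],
    [1.0,  0.0, 0.0,  35.0],
    [0.0,  0.0, 1.0,  10.0],
    [0.0,  0.0, 0.0,   1.0],
])

def pixel_to_world(col, row, spacing_mm=(0.2, 0.2)):
    """Map an image pixel (col, row) to 3D world coordinates in mm."""
    p_image = np.array([col * spacing_mm[0], row * spacing_mm[1], 0.0, 1.0])
    return (T_image_to_world @ p_image)[:3]

print(pixel_to_world(256, 128))
```

Applying the same transform to each acquired image plane places all pixels and ROI points in a common 3D coordinate system for rendering or volume calculation.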
Abstract:
Cloud computing allows vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware. Funding source: Cancer Australia (Department of Health and Ageing) Research Grant 614217.
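A small sketch of the cost and time relationship implied above, assuming ideal parallel scaling and per-machine hourly billing so that each of the n machines is billed for whole hours; the billing model and the 12-hour example workload are assumptions for illustration only:

```python
import math

def completion_time_hours(total_hours, n_machines):
    """Ideal parallel completion time: the work divides evenly across machines."""
    return total_hours / n_machines

def relative_cost(total_hours, n_machines):
    """Machine-hours billed relative to one machine, with whole-hour billing."""
    billed = n_machines * math.ceil(total_hours / n_machines)
    return billed / total_hours

# Relative cost returns to 1.0 exactly when n divides the total simulation hours.
for n in (1, 2, 3, 4, 5, 6, 8, 12):
    print(n, completion_time_hours(12, n), round(relative_cost(12, n), 2))
```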
Abstract:
This spreadsheet calculates carbonate speciation using carbonate equilibrium equations at standard conditions (T = 25°C) with ionic strength corrections. The user will typically be able to calculate the different carbonate species by entering total alkalinity and pH. The spreadsheet contains additional tools to calculate the Langelier Index for calcium and the SAR of the water; note that this last calculation does not take the potential for calcium precipitation into account. The final tool is a carbonate speciation tool for open systems (i.e. open to the atmosphere) which takes atmospheric pressure into account.
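A minimal sketch of the type of calculation the spreadsheet performs, assuming the standard freshwater carbonate equilibria at 25°C and omitting the ionic-strength corrections the spreadsheet includes; the constants and example inputs are illustrative:

```python
K1, K2, KW = 10 ** -6.35, 10 ** -10.33, 10 ** -14.0   # equilibrium constants, 25 deg C

def carbonate_speciation(alkalinity_eq_per_L, pH):
    """Return (H2CO3*, HCO3-, CO3--) molar concentrations from total
    alkalinity (eq/L) and pH, ignoring ionic-strength corrections."""
    h = 10.0 ** -pH
    oh = KW / h
    carb_alk = alkalinity_eq_per_L - oh + h        # [HCO3-] + 2[CO3--]
    hco3 = carb_alk / (1.0 + 2.0 * K2 / h)
    co3 = K2 * hco3 / h
    h2co3 = h * hco3 / K1
    return h2co3, hco3, co3

# e.g. total alkalinity of 2 meq/L at pH 7.5 (illustrative input)
print([f"{c:.2e}" for c in carbonate_speciation(2e-3, 7.5)])
```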
Abstract:
The civil liability provisions relating to the assessment of damages for past and future economic loss have abrogated the common law principle of full compensation by imposing restrictions on the damages award, most commonly by a “three times average weekly earnings” cap. This consideration of the impact of those provisions is informed by a case study of the Supreme Court of Victoria Court of Appeal decision, Tuohey v Freemasons Hospital (Tuohey), which addressed the construction and arithmetic operation of the Victorian cap for high income earners. While conclusions as to the operation of the cap outside Victoria can be drawn from Tuohey, a number of issues await judicial determination. These issues, which include the impact of the damages caps on the calculation of damages for economic loss in circumstances of fluctuating income; vicissitudes; contributory negligence; claims per quod servitium amisit; and claims by dependants, are identified and potential resolutions discussed.
Abstract:
Over the past 30 years the nature of airport precincts has changed significantly, from purely aviation services to a full range of retail, commercial, industrial and other non-aviation uses. Most major airports in Australia are owned and operated by the private sector but are subject to long-term head leases to the Federal Government, with subsequent sub-leases in place to users of the land. The lease term available to both aviation and non-aviation tenants is subject to the head lease term, and in a number of Australian airport locations these head leases are now two-thirds of the way through their initial 50 year lease term, which raises a number of issues from a valuation and ongoing development perspective. For our airport precincts to continue to offer levels of infrastructure and services that are comparable to or better than many commercial centres in the same location, policy makers need to understand the impact of the uncertainty that exists when the current lease term is nearing expiration, especially in relation to the renewed lease term and rental payments. This paper reviews the changes in airport precinct ownership, management and development in Australia and highlights the valuation and rental assessment issues currently facing this property sector.
Abstract:
It is widely recognized that the quality of design is crucial to the success of the construction or production process, and that fairly minor changes in design can often have major effects on the cost and efficiency of production and construction, as well as on the usefulness, constructability and marketability of the product, especially in high rise residential property development. The purpose of this study is to suggest a framework model for property managers that considers the sustainability and building quality of property development in high rise residential complexes. The paper evaluates and ranks the importance and frequency of the building quality factors that affect sustainability and comfort of living for residents in selected high rise residential complexes in Malaysia. A total of 500 respondents, including 20 property managers, participated in this study. The respondents were asked to indicate how important each item of building equipment is in providing comfort of living in the selected high rise residential complexes. The data were then used to calculate importance indices, which enabled the factors to be ranked. A framework model will then be developed to guide property managers in preparing their properties for residents of the complexes. Accordingly, living satisfaction as captured by the framework model plays a meaningful role in preparing and developing sustainable, good quality high rise residential complexes in Malaysia.
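The abstract does not state which importance index is used; one index commonly applied to Likert-scale survey ratings is the relative importance index, sketched below as an assumption with hypothetical factor names and ratings:

```python
def relative_importance_index(ratings, max_rating=5):
    """Relative importance index for one factor: sum of respondent ratings
    divided by (highest possible rating x number of respondents)."""
    return sum(ratings) / (max_rating * len(ratings))

# Illustrative ratings for two building-quality factors on a 1-5 scale.
factors = {"lift reliability": [5, 4, 5, 3, 4], "corridor lighting": [3, 3, 4, 2, 3]}
ranked = sorted(factors, key=lambda f: relative_importance_index(factors[f]), reverse=True)
print([(f, round(relative_importance_index(factors[f]), 2)) for f in ranked])
```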
Abstract:
Residents commonly dispute their satisfaction with the sustainability and quality of their high rise residential property. This paper aims to maintain the best quality satisfaction with door hardware by introducing the whole life cycle costing approach to the property managers of public housing in Johor. The paper examines the current condition of ironmongery (door hardware) in two public housing complexes in Johor, Malaysia, and tests the whole life cycle costing approach on it. Calculations and a literature review were conducted. Questionnaire surveys of the two public housing complexes were conducted to clarify the occupants' evaluation of the actual quality condition of the ironmongery in their homes. As a result, assessing door hardware quality with the whole life cycle costing approach proved to be among the best of the decision-making tools previously applied. Practitioners can benefit from this paper as it provides information on calculating whole life costs and making decisions about ironmongery selection for their properties.
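A minimal sketch of a whole life cycle cost comparison of the kind described, assuming a simple present-value model with an initial cost, annual maintenance, and periodic replacement; the discount rate, lifetimes and cost figures are illustrative assumptions, not data from the study:

```python
def whole_life_cost(initial, annual_maintenance, replacement_cost,
                    replacement_interval_years, horizon_years, discount_rate):
    """Present value of owning one item of door hardware over the horizon."""
    pv = initial
    for year in range(1, horizon_years + 1):
        factor = (1 + discount_rate) ** -year
        pv += annual_maintenance * factor
        if year % replacement_interval_years == 0 and year < horizon_years:
            pv += replacement_cost * factor
    return pv

# Compare a cheap and a durable door closer over 20 years (illustrative figures).
cheap = whole_life_cost(80, 10, 80, 5, 20, 0.05)
durable = whole_life_cost(200, 5, 200, 15, 20, 0.05)
print(round(cheap, 2), round(durable, 2))
```

The option with the lower present value over the analysis horizon would be preferred, which is how whole life cycle costing can overturn a decision based on initial price alone.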
Abstract:
Residents commonly dispute their satisfaction with the sustainability and quality of their high rise residential property. This paper aims to maintain the best quality satisfaction with floor materials by introducing the whole life cycle costing approach to the property managers of public housing in Johor. The paper examines the current condition of the floor materials in two public housing complexes in Johor, Malaysia, and tests the whole life cycle costing approach on them. The resulting cost figures may be used to justify higher investments, for example in the quality or flexibility of building solutions, through a long-term cost reduction. Calculations and a literature review were conducted. Questionnaire surveys of the two public housing complexes were conducted to clarify the occupants' evaluation of the actual quality condition of the floor materials in their homes. As a result, assessing floor material quality with the whole life cycle costing approach proved to be among the best of the decision-making tools previously applied. Practitioners can benefit from this paper as it provides information on calculating whole life costs and making decisions about floor material selection for their properties.
Abstract:
miRDeep and its variants are widely used to quantify known and novel microRNAs (miRNAs) from small RNA sequencing (RNAseq) data. This article describes miRDeep*, our integrated miRNA identification tool, which is modeled on miRDeep but improves the precision of novel miRNA detection by introducing new strategies to identify precursor miRNAs. miRDeep* has a user-friendly graphical interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format. Known and novel miRNA expression levels, as measured by the number of reads, are displayed in an interface which shows each RNAseq read relative to the pre-miRNA hairpin. The secondary pre-miRNA structure and read locations for each predicted miRNA are shown and kept in a separate figure file. Moreover, the target genes of known and novel miRNAs are predicted using the TargetScan algorithm, and the targets are ranked according to the confidence score. miRDeep* is an integrated standalone application in which sequence alignment, pre-miRNA secondary structure calculation and graphical display are coded purely in Java. The tool can be executed on a normal personal computer with 1.5 GB of memory. Further, we show that miRDeep* outperformed existing miRNA prediction tools on our LNCaP and other small RNAseq datasets. miRDeep* is freely available online at http://www.australianprostatecentre.org/research/software/mirdeep-star
Abstract:
High resolution TEM images of boron carbide (B13C2) have been recorded and compared with images calculated using the multislice method as implemented by M. A. O'Keefe in the SHRLI programs. Images calculated for the [010] zone, using machine parameters for the JEOL 2000FX AEM operating at 200 keV, indicate that for the structure model of Will et al. the optimum defocus image can be interpreted such that white spots correspond to B12 icosahedra for thin specimens, and to low density channels through the structure adjacent to the direct inter-icosahedral bonds for specimens of intermediate thickness (approximately 40 to 100 nm). With this information, and from the symmetry observed in the TEM images, it is likely that the (101) twin plane passes through the center of the icosahedron located at the origin. This model was tested using the method of periodic continuation. The resulting images compare favorably with experimental images, thus supporting the structural model. The introduction of a (101) twin plane through the origin creates distortions of the icosahedral linkages as well as of the intra-icosahedral bonding. This increases the inequivalence of adjacent icosahedral sites along the twin plane, and thereby increases the likelihood of bipolaron hopping.
Abstract:
Process mining is the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered highly valuable for the development and assessment of both process discovery and conformance checking techniques.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the "gold standard" for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that can be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo DICOM ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files ready for submission to a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
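A brief sketch of one step in the pipeline described above: reading the monitor units per beam from an exported DICOM RT plan so that per-beam Monte Carlo dose components can be weighted before summation. This assumes the pydicom library and standard RT Plan attributes; it is illustrative only and is not part of MCDTK, which is written in Java.

```python
import pydicom

def beam_monitor_units(rtplan_path):
    """Map beam name -> monitor units for each beam in a DICOM RT plan."""
    plan = pydicom.dcmread(rtplan_path)
    names = {b.BeamNumber: b.BeamName for b in plan.BeamSequence}
    refs = plan.FractionGroupSequence[0].ReferencedBeamSequence
    return {names[r.ReferencedBeamNumber]: float(r.BeamMeterset) for r in refs}

def weighted_dose_sum(per_beam_dose, monitor_units):
    """Combine per-beam dose grids (dose per MU, e.g. numpy arrays) by their MUs."""
    total = None
    for name, mu in monitor_units.items():
        contribution = per_beam_dose[name] * mu
        total = contribution if total is None else total + contribution
    return total
```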
Abstract:
Background: Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs).
Methods: Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis.
Findings: Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa.
Interpretation: Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world.
Funding: Bill & Melinda Gates Foundation.