130 results for Calculation methodology


Relevance: 20.00%

Abstract:

Exposure control or case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The proposed methodology is illustrated using a set of police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration, and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high when the posted speed limits along the approaches are greater than 60 km/h, and also when such intersections are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that this combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and is transferable to other road users.
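
As a rough illustration of the quasi-induced exposure idea described above, the sketch below computes a relative crash involvement ratio from hypothetical crash counts: in multi-vehicle crashes, not-at-fault involvements are treated as a proxy for exposure. This is a minimal sketch, not the authors' code; the paper's log-linear modelling step is omitted, and the column names and counts are invented.

```python
# Quasi-induced exposure sketch: a group's relative crash risk is its share
# of at-fault involvements divided by its share of not-at-fault involvements
# (the exposure proxy). All counts below are hypothetical placeholders.
import pandas as pd

crashes = pd.DataFrame({
    "road_user": ["motorcycle", "motorcycle", "car", "car"],
    "role":      ["at_fault", "not_at_fault", "at_fault", "not_at_fault"],
    "count":     [120, 60, 880, 940],
})

totals = crashes.groupby("role")["count"].sum()
shares = crashes.set_index(["road_user", "role"])["count"].div(totals, level="role")

def relative_risk(group: str) -> float:
    """At-fault share divided by not-at-fault (exposure proxy) share."""
    return shares[(group, "at_fault")] / shares[(group, "not_at_fault")]

print(f"motorcycle relative risk: {relative_risk('motorcycle'):.2f}")  # 2.00
```

A ratio above 1 indicates over-involvement of the group relative to its estimated exposure.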

Relevance: 20.00%

Abstract:

The methoxyamine group represents an ideal protecting group for the nitroxide moiety. It can be easily and selectively introduced in high yield (typically >90%) into a range of functionalised nitroxides using FeSO4·7H2O and H2O2 in DMSO. Its removal is readily achieved under mild conditions in high yield (70-90%) using mCPBA in a Cope-type elimination process.

Relevance: 20.00%

Abstract:

A multi-resource multi-stage scheduling methodology is developed that treats short-term open-pit mine production scheduling as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations, such as drilling, sampling, blasting and excavating, under the capacity constraints of the mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible, near-optimal short-term open-pit mine production schedules. The proposed methodology and its solution quality are verified and validated using a real mining case study.
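
To make the disjunctive graph model concrete, here is a minimal sketch of the schedule-evaluation step: once equipment processing orders are fixed, the disjunctive arcs become directed and the makespan is the longest path through the resulting acyclic graph. The operations, durations and arcs below are hypothetical, and the shifting bottleneck procedure that actually chooses the equipment orders is not shown.

```python
# Longest-path (makespan) evaluation on a disjunctive graph with fixed
# equipment orders. Conjunctive arcs encode stage precedence (drill -> blast
# -> dig within a block); disjunctive arcs encode the chosen equipment order.
# All operations, durations and arcs are hypothetical placeholders.
from collections import defaultdict, deque

durations = {"drill1": 3, "blast1": 2, "dig1": 4, "drill2": 2, "dig2": 5}
arcs = [
    ("drill1", "blast1"), ("blast1", "dig1"),   # stage (conjunctive) arcs
    ("drill2", "dig2"),
    ("drill1", "drill2"), ("dig1", "dig2"),     # fixed equipment (disjunctive) arcs
]

succ = defaultdict(list)
indeg = {op: 0 for op in durations}
for u, v in arcs:
    succ[u].append(v)
    indeg[v] += 1

# Process operations in topological order, propagating earliest start times.
earliest_start = {op: 0 for op in durations}
earliest_finish = {}
queue = deque(op for op in durations if indeg[op] == 0)
while queue:
    u = queue.popleft()
    earliest_finish[u] = earliest_start[u] + durations[u]
    for v in succ[u]:
        earliest_start[v] = max(earliest_start[v], earliest_finish[u])
        indeg[v] -= 1
        if indeg[v] == 0:
            queue.append(v)

print("makespan:", max(earliest_finish.values()))  # 14 for this toy instance
```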

Relevance: 20.00%

Abstract:

Hot spot identification (HSID) aims to identify potential sites (roadway segments, intersections, crosswalks, interchanges, ramps, etc.) with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor injury and property damage only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that are often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than on the population mean as in most methods in practice, which corresponds more closely with how black spots are identified in the field. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional empirical Bayes (EB) method with negative binomial (NB) regression. Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to society, simultaneously reduces the influence of under-reported PDO and minor injury crashes, and overcomes the limitation of the traditional NB model in dealing with the preponderance-of-zeros problem and right-skewed datasets.
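
As a sketch of how the two ingredients fit together, the hypothetical fragment below weights crashes into equivalent PDO counts and then fits a conditional-quantile model with statsmodels; sites whose observed equivalent PDO frequency exceeds the predicted high quantile are flagged. The file name, column names and severity weights are illustrative assumptions, not the paper's values.

```python
# Hot spot screening sketch: equivalent-PDO weighting followed by quantile
# regression. Sites far above their predicted 90th conditional quantile are
# flagged as hot spots. Inputs and weights are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

sites = pd.read_csv("segments.csv")  # hypothetical input file

# Equivalent-PDO crashes: weight severe crashes more heavily than PDO ones.
weights = {"fatal": 12.0, "injury": 3.0, "pdo": 1.0}  # illustrative weights
sites["epdo"] = sum(sites[col] * w for col, w in weights.items())

# Model the 90th conditional quantile of EPDO given traffic and geometry.
model = smf.quantreg("epdo ~ aadt + length + lanes", data=sites)
fit = model.fit(q=0.90)

# Flag sites exceeding their predicted 90th-percentile EPDO as hot spots.
sites["hot_spot"] = sites["epdo"] > fit.predict(sites)
print(sites.loc[sites["hot_spot"]].head())
```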

Relevance: 20.00%

Abstract:

The analysis of content and metadata has long been the subject of most Twitter studies; however, such research only tells part of the story of the development of Twitter as a platform. In this work, we introduce a methodology for determining the growth patterns of individual users of the platform, a technique we refer to as follower accession, and through a number of case studies we consider the factors that lead to follower growth and the identification of non-authentic followers. Finally, we consider what such an approach tells us about the history of the platform itself, and the way in which changes to the new-user signup process have affected users.
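
One way to picture the follower accession technique (a plausible reconstruction, not the authors' released code) is sketched below: because a follower list preserves follow order and an account cannot follow before it was created, a running maximum over follower account-creation dates gives a lower-bound timeline of follower growth.

```python
# Follower accession sketch: reconstruct an approximate growth timeline from
# the follow-ordered follower list. Each follower's account creation date is
# a lower bound on when the follow occurred. Data below are invented.
from datetime import date

# (follower_id, account_creation_date), ordered oldest follow -> newest.
followers = [
    (101, date(2008, 3, 1)),
    (102, date(2007, 6, 12)),
    (103, date(2009, 1, 20)),
    (104, date(2009, 5, 2)),
]

timeline = []
bound = date.min
for rank, (fid, created) in enumerate(followers, start=1):
    bound = max(bound, created)        # a follow cannot predate the account
    timeline.append((rank, bound))     # follower count vs. earliest date

for rank, when in timeline:
    print(f"follower #{rank} gained no earlier than {when}")
```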

Relevance: 20.00%

Abstract:

Background: Timely diagnosis and reporting of patient symptoms in hospital emergency departments (ED) is a critical component of health services delivery. However, due to dispersed information resources and a vast amount of manual processing of unstructured information, accurate point-of-care diagnosis is often difficult.

Aims: The aim of this research is to report an initial experimental evaluation of a clinician-informed automated method for addressing initial misdiagnoses associated with delayed receipt of unstructured radiology reports.

Method: A method was developed that resembles clinical reasoning for identifying limb abnormalities. The method consists of a gazetteer of keywords related to radiological findings, and it classifies an X-ray report as abnormal if the report contains evidence matching the gazetteer. A set of 99 narrative reports of radiological findings was sourced from a tertiary hospital. Reports were manually assessed by two clinicians and discrepancies were validated by a third expert ED clinician; the final manual classification generated by the expert ED clinician was used as the ground truth to empirically evaluate the approach.

Results: The automated method, which identifies limb abnormalities by searching for keywords supplied by clinicians, achieved an F-measure of 0.80 and an accuracy of 0.80.

Conclusion: While the automated clinician-driven method achieved promising performance, a number of avenues for improvement were identified involving advanced natural language processing (NLP) and machine learning techniques.
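
A minimal sketch of such a gazetteer classifier and its evaluation is given below; the keywords, reports and labels are invented, whereas the real system's gazetteer was supplied by clinicians.

```python
# Gazetteer-based classification sketch: a report is labelled abnormal if it
# mentions any clinician-supplied keyword, and the classifier is scored
# against manual ground truth. Keywords and reports are hypothetical.
import re

gazetteer = {"fracture", "dislocation", "lucency", "displacement"}  # illustrative

def is_abnormal(report: str) -> bool:
    tokens = set(re.findall(r"[a-z]+", report.lower()))
    return bool(tokens & gazetteer)

reports = [  # (report text, ground-truth abnormal?)
    ("Transverse fracture of the distal radius.", True),
    ("Normal alignment. No acute bony abnormality.", False),
]

tp = fp = fn = 0
for text, truth in reports:
    pred = is_abnormal(text)
    tp += pred and truth
    fp += pred and not truth
    fn += (not pred) and truth

precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f_measure = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
print(f"precision={precision:.2f} recall={recall:.2f} F={f_measure:.2f}")
```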

Relevance: 20.00%

Abstract:

Introduction: Given the known challenges of obtaining accurate measurements of small radiation fields, and the increasing use of small field segments in IMRT beams, this study examined the possible effects of referencing inaccurate field output factors in the planning of IMRT treatments.

Methods: This study used the Brainlab iPlan treatment planning system to devise IMRT treatment plans for delivery using the Brainlab m3 microMLC (Brainlab, Feldkirchen, Germany). Four pairs of sample IMRT treatments were planned using volumes, beams and prescriptions based on a set of test plans described in AAPM TG 119's recommendations for the commissioning of IMRT treatment planning systems [1]:
• C1, a set of three 4 cm volumes with different prescription doses, was modified to reduce the size of the PTV to 2 cm across and to include an OAR dose constraint for one of the other volumes.
• C2, a prostate treatment, was planned as described in the TG 119 report [1].
• C3, a head-and-neck treatment with a PTV larger than 10 cm across, was excluded from the study.
• C4, an 8 cm long C-shaped PTV surrounding a cylindrical OAR, was planned as described in the TG 119 report [1] and then replanned with the length of the PTV reduced to 4 cm.
Both plans in each pair used the same beam angles, collimator angles, dose reference points, prescriptions and constraints. However, one plan in each pair had its beam modulation optimisation and dose calculation completed with reference to existing iPlan beam data, and the other with reference to revised beam data. The beam data revisions consisted of increasing the field output factor for a 0.6 × 0.6 cm² field by 17% and increasing the field output factor for a 1.2 × 1.2 cm² field by 3%.

Results: The use of different beam data produced different optimisation results, with different microMLC apertures and segment weightings between the two plans for each treatment, which led to large differences (up to 30%, with an average of 5%) between reference point doses in each pair of plans. These point dose differences are more indicative of the modulation of the plans than of any clinically relevant changes to the overall PTV or OAR doses. By contrast, the differences in the maximum, minimum and mean doses to the PTVs and OARs were smaller (less than 1% for all beams in three out of four pairs of treatment plans) but are more clinically important. Of the four test cases, only the shortened (4 cm) version of TG 119's C4 plan showed substantial differences between the overall doses calculated in the volumes of interest using the different sets of beam data, thereby suggesting that treatment doses could be affected by changes to small field output factors. An analysis of the complexity of this pair of plans, using Crowe et al.'s TADA code [2], indicated that iPlan's optimiser had produced IMRT segments comprising larger numbers of small microMLC leaf separations than in the other three test cases.

Conclusion: The use of altered small field output factors can result in substantially altered doses when large numbers of small leaf apertures are used to modulate the beams, even when treating relatively large volumes.
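
The comparison arithmetic is straightforward; the sketch below shows how percentage differences between paired plan doses might be tabulated. All dose values are hypothetical placeholders rather than measured results.

```python
# Plan-pair comparison sketch: percentage differences between reference point
# doses and PTV dose metrics for plans optimised with original versus revised
# small-field output factors. Dose values are hypothetical placeholders.
import statistics

original = {"ref1": 2.00, "ref2": 1.85, "ptv_mean": 2.02}   # Gy, hypothetical
revised  = {"ref1": 2.45, "ref2": 1.88, "ptv_mean": 2.03}

def pct_diff(a: float, b: float) -> float:
    return 100.0 * (b - a) / a

diffs = {k: pct_diff(original[k], revised[k]) for k in original}
point_diffs = [abs(diffs[k]) for k in ("ref1", "ref2")]
print(f"max point-dose difference:  {max(point_diffs):.1f}%")
print(f"mean point-dose difference: {statistics.mean(point_diffs):.1f}%")
print(f"PTV mean-dose difference:   {diffs['ptv_mean']:.1f}%")
```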

Relevance: 20.00%

Abstract:

Most existing motorway traffic safety studies using disaggregate traffic flow data aim at developing models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. One serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be properly comparable with the pre-crash data, and the non-crash/pre-crash ratio is arbitrarily chosen, neglecting the abundance of non-crash over pre-crash conditions. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Thereafter, pre-crash data are classified into regimes to match them with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) were developed for the regimes with sufficient pre-crash data. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can choose MyTRIM's memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors in differentiating pre-crash and non-crash conditions are identified and can be used for developing preventive measures. MyTRIM can be used by practitioners in real time, either as an independent tool for online decision making or integrated with existing traffic management systems.
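
A minimal sketch of the regime-matching idea (not MyTRIM itself) is shown below, using k-means as a stand-in clustering method: non-crash observations define the traffic regimes, and pre-crash observations are then assigned to their nearest regime so that risk models compare like with like. Feature names, parameters and data are invented.

```python
# Regime-matching sketch: cluster non-crash traffic conditions into regimes,
# then classify pre-crash observations into those regimes so that each is
# compared only with comparable non-crash data. All data are simulated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical features per observation: [speed (km/h), flow (veh/h/lane)]
non_crash = rng.normal([90, 1200], [15, 400], size=(5000, 2))
pre_crash = rng.normal([70, 1800], [15, 400], size=(40, 2))

# Cluster non-crash conditions into traffic regimes (8, as in the paper).
regimes = KMeans(n_clusters=8, n_init=10, random_state=0).fit(non_crash)

# Classify each pre-crash observation into its regime for matched comparison.
pre_crash_regime = regimes.predict(pre_crash)
counts = np.bincount(pre_crash_regime, minlength=8)
for k, n in enumerate(counts):
    print(f"regime {k}: {n} pre-crash observations")
```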

Relevance: 20.00%

Abstract:

Background: Heatwaves can cause excess deaths in a local area ranging from tens to thousands within a couple of weeks. Excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the expected mortality under 'normal' conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in the mean level (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not previously been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data.

Methods: Special R functions were developed to implement the HHT algorithm, decomposing the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series.

Results: The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as the input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors.

Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
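
As an illustration of the decomposition step, the sketch below uses the open-source PyEMD package as a stand-in for the authors' custom R functions (an assumption; the paper's implementation is in R): the slowest intrinsic mode functions are treated as the trend, and excess deaths over an event window are summed from the non-trend remainder. The mortality series is simulated.

```python
# Trend/non-trend decomposition sketch via empirical mode decomposition
# (PyEMD stands in for the paper's R code). Slow IMFs plus the residue are
# treated as trend; excess deaths are summed from the non-trend component
# over the event window. All data here are simulated, not real records.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(1)
t = np.arange(3650)                             # ten years of daily counts
mortality = (40 + 0.002 * t                     # slowly drifting mean level
             + 5 * np.sin(2 * np.pi * t / 365)  # seasonal cycle
             + rng.normal(0, 3, t.size))        # day-to-day variation
mortality[2000:2010] += 15                      # injected heatwave spike

imfs = EMD().emd(mortality)
trend = imfs[-2:].sum(axis=0)                   # slowest IMF + residue as trend
non_trend = mortality - trend

event = slice(2000, 2010)
print(f"estimated excess deaths: {non_trend[event].sum():.0f}")
```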

Relevance: 20.00%

Abstract:

The 3′ UTRs of eukaryotic genes participate in a variety of post-transcriptional (and some transcriptional) regulatory interactions. Some of these interactions are well characterised, but an undetermined number remain to be discovered. While some regulatory sequences in 3′ UTRs may be conserved over long evolutionary time scales, others may have only ephemeral functional significance as regulatory profiles respond to changing selective pressures. Here we propose a sensitive segmentation methodology for investigating patterns of composition and conservation in 3′ UTRs based on comparison of closely related species. We describe encodings of pairwise and three-way alignments integrating information about conservation, GC content and transition/transversion ratios and apply the method to three closely related Drosophila species: D. melanogaster, D. simulans and D. yakuba. Incorporating multiple data types greatly increased the number of segment classes identified compared to similar methods based on conservation or GC content alone. We propose that the number of segments and number of types of segment identified by the method can be used as proxies for functional complexity. Our main finding is that the number of segments and segment classes identified in 3′ UTRs is greater than in the same length of protein-coding sequence, suggesting greater functional complexity in 3′ UTRs. There is thus a need for sustained and extensive efforts by bioinformaticians to delineate functional elements in this important genomic fraction. C code, data and results are available upon request.
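
To make the encoding idea concrete, the sketch below maps each column of a pairwise alignment to a symbol combining conservation, GC content and transition/transversion status. The symbol alphabet is an illustrative assumption; the paper's actual encodings are richer and also cover three-way alignments.

```python
# Alignment-column encoding sketch: one symbol per column, combining
# conservation, GC content and transition/transversion status. The alphabet
# (g, S, W, i, v) is an illustrative assumption, not the paper's encoding.
PURINES = {"A", "G"}

def encode_column(a: str, b: str) -> str:
    """Encode one aligned base pair as a single symbol."""
    a, b = a.upper(), b.upper()
    if a == "-" or b == "-":
        return "g"                                # gap in either species
    if a == b:
        return "S" if a in {"G", "C"} else "W"    # conserved: GC vs AT
    if (a in PURINES) == (b in PURINES):
        return "i"                                # transition
    return "v"                                    # transversion

seq1 = "ATGCGT-ACGT"
seq2 = "ATACGTTACGA"
encoded = "".join(encode_column(x, y) for x, y in zip(seq1, seq2))
print(encoded)  # WWiSSWgWSSv
```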

Relevance: 20.00%

Abstract:

Monitoring gases for environmental, industrial and agricultural applications is a demanding task that requires long periods of observation, large numbers of sensors, data management, high temporal and spatial resolution, long-term stability, recalibration procedures, computational resources, and energy availability. Wireless Sensor Networks (WSNs) and Unmanned Aerial Vehicles (UAVs) currently represent the best alternative for monitoring large, remote, and difficult-to-access areas, as these technologies can carry specialised gas sensing systems and offer geo-located, time-stamped samples. However, these technologies are not yet fully functional for scientific and commercial applications, as their development and availability are limited by a number of factors: the cost of the sensors required to cover large areas, their stability over long periods, their power consumption, and the weight of the system to be used on small UAVs. Energy availability is a serious challenge when WSNs are deployed in remote areas without easy access to the grid, while small UAVs are limited by the energy in their reservoir tank or batteries. Another important challenge is the management of the data produced by the sensor nodes, which require a large amount of resources to be stored, analysed and displayed after long periods of operation. In response to these challenges, this research proposes the following solutions aimed at improving the availability and development of these technologies for gas sensing monitoring: first, the integration of WSNs and UAVs for environmental gas sensing in order to monitor large volumes at ground and aerial levels with a minimum of sensor nodes for effective 3D monitoring; second, the use of solar energy as the main power source to allow continuous monitoring; and lastly, the creation of a data management platform to store, analyse and share the information with operators and external users. The principal outcome of this research is the creation of a gas sensing system suitable for monitoring any kind of gas, which has been installed and tested for CH4 and CO2 in a sensor network (WSN) and on a UAV. Using the same gas sensing system in a WSN and a UAV significantly reduces the complexity and cost of the application, as it allows: a) the standardisation of signal acquisition and data processing, thereby reducing the required computational resources; b) the standardisation of calibration and operational procedures, reducing systematic errors and complexity; c) the reduction of weight and energy consumption, leading to improved power management and weight balance in the case of UAVs; and d) the simplification of the sensor node architecture, which is easily replicated in all the nodes. I evaluated two different sensor modules through laboratory, bench, and field tests: a non-dispersive infrared module (NDIR) and a metal-oxide resistive nano-sensor module (MOX nano-sensor). The tests revealed advantages and disadvantages of the two modules when used for static nodes at ground level and mobile nodes on board a UAV. Commercial NDIR modules for CO2 were successfully tested and evaluated in the WSN and on board the UAV; their advantages are precision and stability, but their application is limited to a few gases. The advantages of the MOX nano-sensors are their small size, low weight, low power consumption and sensitivity to a broad range of gases; however, selectivity remains a concern that needs to be addressed in further studies.
An electronic board to interface sensors across a large range of resistivity was successfully designed, built and adapted to operate on ground nodes and on board the UAV. The WSN and UAV were powered with solar energy in order to facilitate outdoor deployment, data collection and continuous monitoring over large and remote volumes. The gas sensing, solar power, transmission and data management systems of the WSN and UAV were fully evaluated through laboratory, bench and field testing, and the methodology created to design, develop, integrate and test these systems is described in detail and experimentally validated. The sampling and transmission capabilities of the WSN and UAV were successfully tested in an emulated mission involving the detection and measurement of CO2 concentrations in a field originating from a contaminant source; the data collected during the mission were transmitted in real time to a central node for data analysis and 3D mapping of the target gas. The major outcome of this research is the accomplishment of the first flight mission, never before reported in the literature, of a solar-powered UAV equipped with a CO2 sensing system operating in conjunction with a network of ground sensor nodes for effective 3D monitoring of the target gas. A data management platform was created using an external internet server, which manages, stores, and shares the collected data through two web pages showing statistics and static graph images for internal and external users as requested. The system was bench tested with real data produced by the sensor nodes, and the architecture of the platform is described and illustrated in detail in order to provide guidance and support on how to replicate the system. In conclusion, the overall results of the project provide guidance on how to create a gas sensing system integrating WSNs and UAVs, how to power the system with solar energy, and how to manage the data produced by the sensor nodes. This system can be used in a wide range of outdoor applications, especially in agriculture, bushfire research, mining studies, zoology, and botanical studies, opening the way to ubiquitous low-cost environmental monitoring, which may help to decrease our carbon footprint and improve the health of the planet.

Relevance: 20.00%

Abstract:

This article outlines the research approach used in the international 1000 Voices Project, an interdisciplinary research and public awareness project that uses a customised online multimodal storytelling platform to explore the lives of people with disability internationally. Through the project, researchers and partners have encouraged diverse participants to select the modes of storytelling (e.g. images, text, videos and combinations thereof) that suit them best and to self-define what both 'disability' and 'life story' mean to them. The online reflective component of the approach encourages participants to organically and reflectively develop story events and revisions over time in ways that suit them and their emerging lives. This article provides a detailed summary of the project's theoretical and methodological development alongside suggestions for future development in social work and qualitative research.

Relevance: 20.00%

Abstract:

The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. Each sequence may be millions of bases long and there may be thousands of such sequences to compare, so not all sequences may fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so some sequences will generally need to be paged in and out more than once; to minimize execution time, this I/O must be minimized. This paper develops an approach for faster and scalable computation of large correlation matrices through maximal exploitation of available memory and a reduction in the number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to different bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
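
A minimal sketch of the underlying blocking idea (not the paper's exact algorithm) is given below: sequences are grouped into blocks sized to the available memory and loaded two blocks at a time, so every pair is still compared exactly once while each block is paged in far fewer times than once per pair. The toy loader and similarity score are placeholders.

```python
# Blocked pairwise comparison sketch: with n sequences in blocks of size B,
# each block is read once per block pair (O((n/B)^2) reads in total) instead
# of once per sequence pair. The "sequences" and score are placeholders.
import itertools

def load_block(block: range) -> dict[int, str]:
    """Stand-in for paging a block of sequences in from disk."""
    return {i: f"ACGT{i % 7}" * 3 for i in block}  # toy data

def score(a: str, b: str) -> float:
    """Placeholder pairwise similarity (fraction of matching positions)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n

n, block_size = 10, 4
blocks = [range(s, min(s + block_size, n)) for s in range(0, n, block_size)]
corr = {}

for bi, bj in itertools.combinations_with_replacement(range(len(blocks)), 2):
    left = load_block(blocks[bi])                 # one disk read per block pair
    right = left if bi == bj else load_block(blocks[bj])
    for i in left:
        for j in right:
            if i < j:                             # each pair computed once
                corr[(i, j)] = score(left[i], right[j])

print(f"computed {len(corr)} pairwise scores for {n} sequences")  # 45
```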

Relevance: 20.00%

Abstract:

Measurement of discrimination against 18O during dark respiration in plants is currently accepted as the only reliable method of estimating the partitioning of electrons between the cytochrome and alternative pathways. In this paper, we review the theory of the technique and its application to a gas-phase system. We extend the theory to include sampling effects and show that the isotope discrimination factor D is calculated as D = –d ln(1 + δ)/d ln O*, where δ is the isotopic composition of the substrate oxygen and O* = [O2]/[N2] in a closed chamber containing tissue respiring in the dark. It is not necessary to integrate the expression; but if the integrated form is used, the resulting regression should not be constrained through the origin. This is important, since any error in D will have significant effects on the estimation of the flux of electrons through the two pathways.
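
A short simulated example of the estimation this implies: regressing ln(1 + δ) on ln O* recovers D as the negative of the slope, without constraining the fit through the origin. The numbers below are simulated, not measured data.

```python
# Estimating D from closed-chamber data: since D = -d ln(1 + delta)/d ln O*,
# a regression of ln(1 + delta) on ln O* has slope -D. Following the paper's
# caution, the fit is NOT forced through the origin. Data are simulated.
import numpy as np

true_D = 0.020                                   # 20 per mil, illustrative
o_star = np.linspace(0.268, 0.200, 12)           # [O2]/[N2] falling as tissue respires
delta = np.expm1(-true_D * np.log(o_star / o_star[0]))  # simulated enrichment

slope, intercept = np.polyfit(np.log(o_star), np.log1p(delta), 1)
print(f"estimated D = {-slope:.4f}")             # ~0.0200
```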

Relevance: 20.00%

Abstract:

In recent years, the beauty leaf plant (Calophyllum inophyllum) has been considered a potential second-generation biodiesel source due to its high seed oil content, high fruit production rate, simple cultivation and ability to grow in a wide range of climatic conditions. However, due to the high free fatty acid (FFA) content of this oil, the potential of this biodiesel feedstock remains unrealized, and little research has been undertaken on it. In this study, transesterification of beauty leaf oil to produce biodiesel was investigated. A two-step conversion method consisting of acid-catalysed pre-esterification and alkali-catalysed transesterification was used. The three main factors driving the conversion of vegetable oil (triglycerides) to biodiesel (fatty acid methyl ester, FAME) were studied using response surface methodology (RSM) based on a Box-Behnken experimental design. The factors considered were catalyst concentration, methanol-to-oil molar ratio and reaction temperature. Linear and full quadratic regression models were developed to predict FFA and FAME concentrations and to optimize the reaction conditions. The significance of these factors and their interactions in both stages was determined using analysis of variance (ANOVA). The reaction conditions giving the largest reduction in FFA concentration in the acid-catalysed pre-esterification were a 30:1 methanol-to-oil molar ratio, 10% (w/w) sulfuric acid catalyst loading and a 75 °C reaction temperature. In the alkali-catalysed transesterification process, a 7.5:1 methanol-to-oil molar ratio, 1% (w/w) sodium methoxide catalyst loading and a 55 °C reaction temperature were found to give the highest FAME conversion. The good agreement between model outputs and experimental results demonstrates that this methodology may be useful for optimizing industrial biodiesel production from beauty leaf oil, and possibly for other industrial processes as well.
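
As a sketch of the RSM step, the fragment below fits a full quadratic model to a three-factor Box-Behnken design using statsmodels. The design points are the standard coded Box-Behnken runs, but the response values and fitted coefficients are invented placeholders, not the paper's data.

```python
# RSM sketch: full quadratic fit (linear, interaction and squared terms) to a
# 3-factor Box-Behnken design in coded units. Factors stand in for catalyst
# concentration, methanol:oil molar ratio and temperature; the response
# values are simulated placeholders, not experimental measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# 12 edge-midpoint runs (+/-1 in two factors, 0 in the third) + 3 centre points.
design = [(-1, -1, 0), (1, -1, 0), (-1, 1, 0), (1, 1, 0),
          (-1, 0, -1), (1, 0, -1), (-1, 0, 1), (1, 0, 1),
          (0, -1, -1), (0, 1, -1), (0, -1, 1), (0, 1, 1),
          (0, 0, 0), (0, 0, 0), (0, 0, 0)]
df = pd.DataFrame(design, columns=["cat", "ratio", "temp"])

rng = np.random.default_rng(2)
df["fame"] = (90 - 5 * df["cat"]**2 - 4 * df["ratio"]**2 - 3 * df["temp"]**2
              + 2 * df["cat"] + rng.normal(0, 0.5, len(df)))  # placeholder response

# Full quadratic model: linear, two-way interaction and squared terms.
formula = ("fame ~ cat + ratio + temp + cat:ratio + cat:temp + ratio:temp"
           " + I(cat**2) + I(ratio**2) + I(temp**2)")
fit = smf.ols(formula, data=df).fit()
print(fit.params.round(2))  # fitted surface coefficients
```

ANOVA on the fitted model (e.g. via statsmodels' anova_lm) would then rank the significance of each factor and interaction, as the study describes.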