Abstract:
With the increasing number of XML documents in varied domains, it has become essential to identify ways of finding interesting information in these documents, and data mining techniques have been used to derive such information. Mining of XML documents is strongly influenced by the model used to represent them, owing to their semi-structured nature. Hence, in this chapter we present an overview of the various models of XML documents, how these models are used for mining, and some of the issues and challenges these models raise. The chapter also provides some insights into future models of XML documents that could effectively capture the two key features of XML documents for mining, namely structure and content.
Abstract:
Finite Element Modeling (FEM) has become a vital tool in automotive design and development processes. FEM of the human body is a technique capable of estimating parameters that are difficult to measure in experimental studies, with the human body segments modeled as complex and dynamic entities. Several studies have sought to attain close-to-real FEMs of the human body (Pankoke and Siefert 2007; Amann, Huschenbeth et al. 2009; ESI 2010). The aim of this paper is to identify and appraise the state-of-the-art models of the human body which incorporate detailed pelvis and/or lower extremity models. Six databases and search engines were used to obtain literature, and the search was limited to studies published in English since 2000. The initial search identified 636 pelvis-related papers, 834 buttocks-related papers, 505 thigh-related papers, 927 femur-related papers, 2039 knee-related papers, 655 shank-related papers, 292 tibia-related papers, 110 fibula-related papers, 644 ankle-related papers and 5660 foot-related papers. A refined search returned 100 pelvis-related papers, 45 buttocks-related papers, 65 thigh-related papers, 162 femur-related papers, 195 knee-related papers, 37 shank-related papers, 80 tibia-related papers, 30 fibula-related papers, 102 ankle-related papers and 246 foot-related papers. The refined literature list was further restricted by appraisal against modified LOW appraisal criteria. Studies with unclear methodologies, with a focus on populations with pathology, or with sport-related dynamic motion modeling were excluded. The final literature list included fifteen models, and each was assessed against the percentile the model represents, the gender the model was based on, the human body segment or segments included in the model, the sample size used to develop the model, the source of geometric/anthropometric values used to develop the model, the posture the model represents and the finite element solver used for the model. The results of this literature review indicate a bias in the available models towards 50th percentile male modeling, with a notable concentration on the pelvis, femur and buttocks segments.
Abstract:
The driver response (reaction) time (tr) of the second queuing vehicle at signalized intersections is generally longer than that of the other vehicles. Although this phenomenon was first reported in 1972, it is still ignored in conventional departure models. This paper highlights the need for quantitative measurement and analysis of queuing vehicle performance in the spontaneous discharge pattern, because such data can improve microsimulation. Video recordings from major cities in Australia, together with twenty-two sets of vehicle trajectories extracted from the Next Generation Simulation (NGSIM) Peachtree Street dataset, were analyzed to better understand queuing vehicle performance in the discharge process. Findings from this research will improve the representation of driver response time and can also be used for the calibration of microscopic traffic simulation models.
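A minimal sketch of how start-of-motion times and vehicle-to-vehicle response lags might be extracted from such queue-discharge trajectories (illustrative only; the function names and the simple speed-threshold criterion for start of motion are assumptions, not the paper's method):

```python
def start_times(trajectories, v_thresh=0.5):
    """Return the time each queued vehicle first exceeds v_thresh (m/s).

    trajectories: one list of (time_s, speed_mps) samples per queue position,
    ordered from the first vehicle at the stop line backwards.
    """
    return [next(t for t, v in samples if v > v_thresh)
            for samples in trajectories]

def response_lags(starts):
    """Lag between consecutive vehicles' start-of-motion times; per the
    abstract, the second vehicle's lag is typically the largest."""
    return [t1 - t0 for t0, t1 in zip(starts, starts[1:])]
```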
Abstract:
In this paper, a class of fractional advection–dispersion models (FADMs) is considered. This class includes five models: the time FADM, the mobile/immobile time FADM with a Caputo time-fractional derivative of order 0 < γ < 1, the space FADM with two-sided Riemann–Liouville derivatives, the time–space FADM, and the time-fractional advection–diffusion-wave model with damping, with index 1 < γ < 2. These equations can be used to simulate regional-scale anomalous dispersion with heavy tails. We propose computationally effective implicit numerical methods for these FADMs. The stability and convergence of the implicit numerical methods are analysed and compared systematically. Finally, numerical results are given to demonstrate the effectiveness of the theoretical analysis.
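For orientation, a representative member of this class is the time–space FADM, written here in a standard form (notation assumed, not quoted from the paper):

\[
{}^{C}_{0}D^{\gamma}_{t}\,u(x,t) = -v\,\frac{\partial u(x,t)}{\partial x} + D\left(p\,{}_{a}D^{\alpha}_{x} + q\,{}_{x}D^{\alpha}_{b}\right)u(x,t) + f(x,t),
\]

where \({}^{C}_{0}D^{\gamma}_{t}\) is the Caputo time-fractional derivative of order \(0 < \gamma \le 1\), \({}_{a}D^{\alpha}_{x}\) and \({}_{x}D^{\alpha}_{b}\) are the left- and right-sided Riemann–Liouville space derivatives of order \(1 < \alpha \le 2\) with weights \(p + q = 1\), \(v\) is the advective velocity and \(D\) the dispersion coefficient.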
Abstract:
Even though titanium dioxide photocatalysis has been promoted as a leading green technology for water purification, many issues have hindered its application on a large commercial scale. For the materials scientist the main issues have centred on the synthesis of more efficient materials and the investigation of degradation mechanisms, whereas for the engineer the main issues have been the development of appropriate models and the evaluation of intrinsic kinetic parameters that allow the scale-up or re-design of efficient large-scale photocatalytic reactors. To obtain intrinsic kinetic parameters, the reaction must be analysed and modelled considering the influence of the radiation field, pollutant concentrations and fluid dynamics. In this way the kinetic parameters obtained are independent of reactor size and configuration and can subsequently be used for scale-up or for the development of entirely new reactor designs. This work investigates the intrinsic kinetics of phenol degradation over a titania film, chosen for the practicality of a fixed-film configuration over a slurry. A flat plate reactor was designed to allow control of reaction parameters including UV irradiance, flow rate, pollutant concentration and temperature. Particular attention was paid to the investigation of the radiation field over the reactive surface and to the issue of mass-transfer-limited reactions. The ability of different emission models to describe the radiation field was investigated and compared to actinometric measurements; the RAD-LSI model was found to give the best predictions over the conditions tested. Mass transfer often limits fixed-film reactors. The influence of this phenomenon was investigated with specifically planned sets of benzoic acid experiments and with the adoption of the stagnant film model. The phenol mass transfer coefficient in the system was calculated to be k_m,phenol = 8.5815 × 10⁻⁷ Re^0.65 (m s⁻¹). The data obtained from a wide range of experimental conditions, together with an appropriate model of the system, enabled determination of the intrinsic kinetic parameters. The experiments were performed at four irradiance levels (70.7, 57.9, 37.1 and 20.4 W m⁻²) combined with three initial phenol concentrations (20, 40 and 80 ppm) to give a wide range of final pollutant conversions (from 22% to 85%). The simple model adopted was able to fit this wide range of conditions with only four kinetic parameters: two reaction rate constants (one for phenol and one for the family of intermediates) and their corresponding adsorption constants. The intrinsic kinetic parameter values were determined as k_ph = 0.5226 mmol m⁻¹ s⁻¹ W⁻¹, k_I = 0.120 mmol m⁻¹ s⁻¹ W⁻¹, K_ph = 8.5 × 10⁻⁴ m³ mmol⁻¹ and K_I = 2.2 × 10⁻³ m³ mmol⁻¹. The flat plate reactor allowed the investigation of the reaction under two light configurations: liquid-side and substrate-side illumination. The latter is of particular interest for real-world applications, where light absorption by turbidity and pollutants in the water stream to be treated could represent a significant issue. The two light configurations allowed the investigation of the effects of film thickness and the determination of the optimal catalyst thickness.
The experimental investigation confirmed the predictions of a porous medium model developed to investigate the influence of diffusion, advection and photocatalytic phenomena inside the porous titania film, with the optimal thickness identified as 5 µm. The model used the intrinsic kinetic parameters obtained from the flat plate reactor to predict the influence of thickness and transport phenomena on the final observed phenol conversion without using any correction factor; the excellent match between predictions and experimental results provided further proof of the quality of the parameters obtained with the proposed method.
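As an illustration of how the reported parameters fit together dimensionally, the sketch below evaluates the mass transfer correlation and a Langmuir–Hinshelwood rate of the competitive-adsorption form commonly used for phenol over TiO₂ (the abstract does not give the exact rate law, so that form, the function names and the example values are assumptions):

```python
# Reported intrinsic parameters (units as given in the abstract)
K_PH = 8.5e-4    # phenol adsorption constant, m^3 mmol^-1
K_I  = 2.2e-3    # intermediates adsorption constant, m^3 mmol^-1
k_PH = 0.5226    # phenol rate constant, mmol m^-1 s^-1 W^-1
k_I  = 0.120     # intermediates rate constant, mmol m^-1 s^-1 W^-1

def mass_transfer_coeff(reynolds: float) -> float:
    """Phenol mass transfer coefficient (m s^-1) from the reported correlation."""
    return 8.5815e-7 * reynolds ** 0.65

def phenol_rate(irradiance: float, c_ph: float, c_i: float) -> float:
    """Assumed competitive Langmuir-Hinshelwood phenol degradation rate.

    irradiance in W m^-2; concentrations in mmol m^-3, so K*c is
    dimensionless and the rate comes out in mmol m^-3 s^-1.
    """
    return k_PH * irradiance * K_PH * c_ph / (1.0 + K_PH * c_ph + K_I * c_i)

# Example: highest irradiance level, ~20 ppm phenol (~212 mmol m^-3), no intermediates
print(phenol_rate(70.7, 212.0, 0.0))
```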
Abstract:
Polysulphone (PS) dosimetry has been widely used for more than 30 years to quantify the erythemally effective UV dose received by anatomic sites (personal exposure). The calibration of PS dosimeters is an important issue, as their spectral response differs from the erythemal action spectrum. Calibration is performed by exposing a set of PS dosimeters on a horizontal plane and measuring the UV doses they receive using calibrated spectroradiometers or radiometers. In this study, data collected during PS field campaigns (from 2004 to 2006), using horizontal and differently inclined dosimeters, were analyzed to provide some considerations on the transfer of the horizontal calibration to differently inclined dosimeters, as anatomic sites usually are. The roles of sky conditions, of the angle of incidence between the sun and the normal to the slope, and of the type of surrounding surface were investigated. It was concluded that PS horizontal calibrations apply to differently inclined dosimeters for incidence angles up to approximately 70 degrees and for surfaces other than those with high albedo. Caution should be used in applying horizontal calibrations at high incidence angles and/or over high-albedo surfaces.
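For reference, the incidence angle on an inclined dosimeter can be obtained from standard solar geometry (a textbook relation, not quoted from the paper):

\[
\cos\theta_i = \cos\beta\,\cos\theta_z + \sin\beta\,\sin\theta_z\,\cos(\gamma_s - \gamma),
\]

where \(\theta_z\) is the solar zenith angle, \(\beta\) the tilt of the dosimeter from the horizontal, and \(\gamma_s\) and \(\gamma\) the solar and surface azimuth angles; by the study's conclusion, the horizontal calibration can be transferred whenever \(\theta_i\) stays below roughly 70° and the surrounding surfaces are not highly reflective.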
Abstract:
A pressing cost issue facing construction is the procurement of off-site pre-manufactured assemblies. In order to encourage Australian adoption of off-site manufacture (OSM), a new approach to the underlying processes is required. The advent of object-oriented digital models for construction design assumes intelligent use of data. However, the construction production system relies on traditional methods and data sources, and is expected to benefit from the application of well-established business process management techniques. Integrating the old and new data sources allows for the development of business process models which, by capturing typical construction processes involving OSM, provide insights into such processes. This integrative approach is the foundation of research into the use of OSM to increase construction productivity in Australia. The purpose of this study is to develop business process models capturing the procurement, resources and information flow of construction projects. For each stage of the construction value chain, a number of sub-processes are identified. Business Process Modelling Notation (BPMN), a mainstream business process modelling standard, is used to create baseline generic construction process models. These models identify OSM decision-making points that could provide cost reductions in procurement workflow and management systems. This paper reports on phase one of ongoing research aiming to develop a prototype workflow application that can provide semi-automated support to construction processes involving OSM and assist in decision-making on the adoption of OSM, thus contributing to a sustainable built environment.
Abstract:
The building sector is the dominant consumer of energy and therefore a major contributor to anthropogenic climate change. The rapid generation of photorealistic 3D environment models with incorporated surface temperature data has the potential to improve thermographic monitoring of building energy efficiency. In pursuit of this goal, we propose a system which combines a range sensor with a thermal-infrared camera. Our proposed system can generate dense 3D models of environments with both appearance and temperature information, and is the first such system to be developed using a low-cost RGB-D camera. The proposed pipeline processes depth maps successively, forming an ongoing pose estimate of the depth camera and optimizing a voxel occupancy map. Voxels are assigned four channels representing estimates of their true RGB and thermal-infrared intensity values. Poses corresponding to each RGB and thermal-infrared image are estimated through a combination of timestamp-based interpolation and pre-determined knowledge of the extrinsic calibration of the system. Raycasting is then used to color the voxels to represent both visual appearance using RGB and an estimate of the surface temperature. The output of the system is a dense 3D model which can simultaneously represent both RGB and thermal-infrared data using one of two alternative representation schemes. Experimental results demonstrate that the system is capable of accurately mapping difficult environments, even in complete darkness.
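A minimal sketch of the timestamp-based pose interpolation step described above (illustrative; the pose representation and function names are assumptions rather than the system's actual code):

```python
import numpy as np

def slerp(q0: np.ndarray, q1: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation between two unit quaternions."""
    dot = float(np.dot(q0, q1))
    if dot < 0.0:          # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:       # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def interpolate_pose(t_img, t0, pose0, t1, pose1):
    """Estimate the camera pose at an image timestamp from the two bracketing
    depth-frame poses, each given as (translation vector, unit quaternion)."""
    s = (t_img - t0) / (t1 - t0)
    trans0, quat0 = pose0
    trans1, quat1 = pose1
    return (1 - s) * trans0 + s * trans1, slerp(quat0, quat1, s)
```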
Abstract:
The determinants and key mechanisms of cancer cell osteotropism have not been identified, mainly due to the lack of reproducible animal models representing the biological, genetic and clinical features seen in humans. An ideal model should be capable of recapitulating as many steps of the metastatic cascade as possible, thus facilitating the development of prognostic markers and novel therapeutic strategies. Most animal models of bone metastasis still have to be derived experimentally, as most syngeneic and transgenic approaches do not provide a robust skeletal phenotype and do not recapitulate the biological processes seen in humans. The xenotransplantation of human cancer cells or tumour tissue into immunocompromised murine hosts provides the possibility to simulate early and late stages of the human disease. Human bone or tissue-engineered human bone constructs can be implanted into the animal to recapitulate more subtle, species-specific aspects of the mutual interaction between human cancer cells and the human bone microenvironment. Moreover, the replication of the entire "organ" bone makes it possible to analyse the interaction between cancer cells and the haematopoietic niche and to confer at least partial human immunity on the murine host. This process of humanisation is facilitated by novel immunocompromised mouse strains that allow a high engraftment rate of human cells or tissue. These humanised xenograft models provide an important research tool for studying human biological processes of bone metastasis.
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the 'gold standard' for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS and MC patient dose distributions in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS and MC dose distributions. These implementations are independent of spatial resolution and can interpolate between grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
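A minimal sketch of the gamma evaluation idea used in the dose comparison step (a brute-force 1D illustration under common 3%/3 mm criteria; this is not the project's resolution-independent implementation, and the example values are assumptions):

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Global gamma index for two dose arrays on the same 1D grid.

    dose_tol is a fraction of the maximum reference dose; a point passes
    the test when its gamma value is <= 1.
    """
    positions = np.arange(len(dose_ref)) * spacing_mm
    dd = dose_tol * dose_ref.max()              # absolute dose criterion
    gamma = np.empty(len(dose_ref))
    for i in range(len(dose_ref)):
        dist_term = ((positions - positions[i]) / dist_tol_mm) ** 2
        dose_term = ((dose_eval - dose_ref[i]) / dd) ** 2
        gamma[i] = np.sqrt(np.min(dist_term + dose_term))
    return gamma

# Example: a step profile shifted by one 2 mm voxel
ref = np.array([1.0] * 10 + [0.2] * 10)
print(gamma_index(ref, np.roll(ref, 1), spacing_mm=2.0))
```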
Abstract:
The traditional hospital-based model of cardiac rehabilitation faces substantial challenges, such as cost and accessibility. These challenges have led to the development of alternative models of cardiac rehabilitation in recent years. The aim of this study was to identify and critique evidence for the effectiveness of these alternative models. A total of 22 databases were searched to identify quantitative studies or systematic reviews of quantitative studies regarding the effectiveness of alternative models of cardiac rehabilitation. Included studies were appraised using a Critical Appraisal Skills Programme tool and the National Health and Medical Research Council's designations for Level of Evidence. The 83 included articles described interventions in the following broad categories of alternative models of care: multifactorial individualized telehealth, internet-based, telehealth focused on exercise, telehealth focused on recovery, community- or home-based, and complementary therapies. Multifactorial individualized telehealth and community- or home-based cardiac rehabilitation are effective alternative models, as they have produced reductions in cardiovascular disease risk factors similar to those of hospital-based programmes. While further research is required to address the paucity of data on the effectiveness of alternative models of cardiac rehabilitation in rural, remote, and culturally and linguistically diverse populations, our review indicates there is no need to rely on hospital-based strategies alone to deliver effective cardiac rehabilitation. Local healthcare systems should strive to integrate alternative models of cardiac rehabilitation, such as brief telehealth interventions tailored to individuals' risk factor profiles as well as community- or home-based programmes, in order to ensure there are choices available for patients that best fit their needs, risk factor profile and preferences.
Abstract:
Background: Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult owing to species biology and behavioural characteristics. The design of robust sampling programmes should be based on an underlying statistical distribution that is sufficiently flexible to capture variations in the spatial distribution of the target species. Results: Comparisons are made of the accuracy of four probability-of-detection sampling models - the negative binomial model [1], the Poisson model [1], the double logarithmic model [2] and the compound model [3] - for detection of insects over a broad range of insect densities. Although the double logarithmic and negative binomial models performed well under specific conditions, it is shown that, of the four models examined, the compound model performed best over a broad range of insect spatial distributions and densities. In particular, this model predicted well the number of samples required when insect density was high and clumped within experimental storages. Conclusions: This paper reinforces the need for effective sampling programmes designed to detect insects over a broad range of spatial distributions. The compound model is robust over a broad range of insect densities and leads to substantial improvement in detection probabilities within highly variable systems such as grain storage.
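As a worked illustration of the probability-of-detection logic behind such models, the sketch below uses the simplest of the four, the negative binomial model (the clumping parameter k, sample size and function names are illustrative assumptions, not values from the study):

```python
import math

def p_no_insects(mean_density: float, k: float, sample_kg: float) -> float:
    """Probability that a single sample of `sample_kg` contains no insects,
    assuming a negative binomial distribution with clumping parameter k."""
    m = mean_density * sample_kg          # expected insects per sample
    return (1.0 + m / k) ** (-k)

def samples_for_detection(mean_density: float, k: float,
                          sample_kg: float, confidence: float = 0.95) -> int:
    """Smallest number of samples giving at least `confidence` probability
    of detecting the infestation (at least one insect found)."""
    p0 = p_no_insects(mean_density, k, sample_kg)
    return math.ceil(math.log(1.0 - confidence) / math.log(p0))

# Example: 0.5 insects/kg, strongly clumped (k = 0.5), 1 kg samples -> 9 samples
print(samples_for_detection(0.5, 0.5, 1.0))
```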
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
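For readers unfamiliar with the baseline, the Ross–Macdonald model's central quantity is the basic reproduction number, conventionally written (a standard textbook form, not quoted from this paper) as:

\[
R_0 = \frac{m\,a^{2}\,b\,c\,e^{-g n}}{g\,r},
\]

where \(m\) is the ratio of mosquitoes to hosts, \(a\) the mosquito biting rate, \(b\) and \(c\) the mosquito-to-host and host-to-mosquito transmission efficiencies, \(g\) the mosquito death rate, \(n\) the extrinsic incubation period and \(r\) the host recovery rate. The homogeneous-mixing assumptions built into this expression are precisely those the survey's concluding recommendations would relax.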
Abstract:
The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures depends closely on the design of the specific treatment units. As stormwater quality can be influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigation of four urban residential catchments and computer modelling, this paper details a technically robust approach to the selection of rainfall events for stormwater treatment design using a three-component model. The modelling outcomes indicate that selecting smaller average recurrence interval (ARI) events of high intensity and short duration as the threshold for treatment system design is the most feasible option, since these events cumulatively generate a major portion of the annual pollutant load compared with the other types of rainfall events, despite producing a relatively smaller runoff volume. This implies that designs based on small, frequent rainfall events rather than larger rainfall events would be appropriate in terms of treatment performance, cost-effectiveness and possible savings in the land area needed.
Abstract:
Carrying capacity assessments model a population's potential self-sufficiency. A crucial first step in the development of such modelling is to examine the basic resource-based parameters defining the population's production and consumption habits. These parameters include basic human needs such as food, water, shelter and energy, together with climatic, environmental and behavioural characteristics. Each of these parameters imposes land-usage requirements in different ways and to varying degrees, so their incorporation into carrying capacity modelling also differs. Given that the availability and values of production parameters may differ between locations, no two carrying capacity models are likely to be exactly alike. However, the essential parameters themselves can remain consistent, so one example, the Carrying Capacity Dashboard, is offered as a case study to highlight one way in which these parameters are utilised. While findings from carrying capacity assessment modelling have been published, guidelines for replicating such studies at other regions and scales have to date largely been overlooked. This paper addresses this shortcoming by describing a process for the inclusion and calibration of the most important resource-based parameters in a way that could be repeated elsewhere.