907 results for rate equation model
Abstract:
It is well known that, although a uniform magnetic field inhibits the onset of small amplitude thermal convection in a layer of fluid heated from below, isolated convection cells may persist if the fluid motion within them is sufficiently vigorous to expel magnetic flux. Such fully nonlinear(‘‘convecton’’) solutions for magnetoconvection have been investigated by several authors. Here we explore a model amplitude equation describing this separation of a fluid layer into a vigorously convecting part and a magnetically-dominated part at rest. Our analysis elucidates the origin of the scaling laws observed numerically to form the boundaries in parameter space of the region of existence of these localised states, and importantly, for the lowest thermal forcing required to sustain them.
Abstract:
A Monte Carlo model of an Elekta iViewGT amorphous silicon electronic portal imaging device (a-Si EPID) has been validated for pre-treatment verification of clinical IMRT treatment plans. The simulations used the BEAMnrc and DOSXYZnrc Monte Carlo codes to predict the response of the iViewGT a-Si EPID model. The predicted EPID images were compared with measured images obtained by delivering a photon beam from an Elekta Synergy linac to the Elekta iViewGT a-Si EPID, used with no additional build-up material. Frame-averaged EPID images were acquired and processed using in-house software. The agreement between the predicted and measured images was analyzed using the gamma analysis technique with acceptance criteria of 3%/3 mm. The results show that the predicted EPID images for four clinical IMRT treatment plans agree well with the measured EPID signal: three prostate IMRT plans had average gamma pass rates above 95.0%, and a spinal IMRT plan had an average gamma pass rate of 94.3%. While this work was being performed, a routine MLC calibration was carried out and one of the IMRT treatments was re-measured with the EPID; a change in the gamma pass rate for one field was observed. This motivated a series of experiments to investigate the sensitivity of the method by introducing delivery errors, MLC position and dosimetric overshoot, into the simulated EPID images. The method was found to be sensitive to 1 mm leaf position errors and 10% overshoot errors.
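The 3%/3 mm gamma analysis mentioned above follows a standard formulation (dose difference and distance-to-agreement combined into one index). A minimal one-dimensional sketch, with invented example profiles rather than the paper's data, might look like this:

```python
import numpy as np

def gamma_index(ref, eval_, spacing, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index.

    ref, eval_ : reference and evaluated dose profiles on the same grid
    spacing    : grid spacing in mm
    dose_tol   : dose criterion as a fraction of the maximum reference dose
    dist_tol   : distance-to-agreement criterion in mm
    """
    dd = dose_tol * ref.max()              # absolute dose tolerance (global)
    x = np.arange(len(ref)) * spacing      # positions in mm
    gamma = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        # generalised distance from reference point i to every evaluated point
        g2 = ((x - xi) / dist_tol) ** 2 + ((eval_ - di) / dd) ** 2
        gamma[i] = np.sqrt(g2.min())
    return gamma

# Invented toy profiles for illustration only
ref = np.array([1.0, 2.0, 5.0, 2.0, 1.0])
ev = np.array([1.0, 2.1, 4.9, 2.0, 1.0])
g = gamma_index(ref, ev, spacing=1.0)
pass_rate = (g <= 1).mean() * 100  # percentage of points with gamma <= 1
```

A point passes when its gamma value is at most 1, and the pass rate is the fraction of passing points, as in the 95.0% and 94.3% figures above.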
Abstract:
Trade flows of commodities are generally governed by the principles of comparative advantage under free trade. However, trade flows may be enhanced or distorted not only by various government interventions but also by exchange rate fluctuations, among other factors. This study applies a commodity-specific gravity model to selected vegetable trade flows among Organization for Economic Co-operation and Development (OECD) countries to determine the effects of exchange rate uncertainty on trade. Using data from 1996 to 2002, the results show that, while exchange rate uncertainty significantly reduces trade in the majority of commodity flows, there is evidence that both short- and long-term volatility have a positive effect on trade flows of specific commodities. This study also tests regional preferential trade agreements, such as the North American Free Trade Agreement (NAFTA), the Asia-Pacific Economic Cooperation (APEC) and the EU, and their differing effects across commodities.
Abstract:
This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was used to collect data from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. The hypothesis testing disproved 7 hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals' perceived intention to use AeH systems. A validation of the model with a wider survey cohort would be useful to confirm the current findings.
Abstract:
Rating systems are used by many websites, allowing customers to rate available items according to their own experience. Reputation models are then used to aggregate the available ratings into reputation scores for items. A problem with current reputation models is that they focus on improving accuracy on sparse datasets without considering their performance on dense datasets. In this paper, we propose a novel reputation model that generates more accurate reputation scores for items on any dataset, whether dense or sparse. The proposed model is a weighted average method in which the weights are generated using the normal distribution. Experiments show promising results for the proposed model over state-of-the-art ones on both sparse and dense datasets.
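The abstract does not specify exactly how the normal distribution generates the weights. One illustrative reading — Gaussian weights centred on the consensus rating, so that outlying ratings count less — can be sketched as follows; the `sigma` parameter and the centring on the mean are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def reputation_score(ratings, sigma=1.0):
    """Weighted-average reputation with normal-density weights.

    Each rating is weighted by a Gaussian density centred on the mean
    rating, so ratings close to consensus contribute more. This is one
    plausible reading of the abstract, not the paper's exact scheme.
    """
    r = np.asarray(ratings, dtype=float)
    mu = r.mean()
    w = np.exp(-0.5 * ((r - mu) / sigma) ** 2)  # unnormalised Gaussian weights
    return float((w * r).sum() / w.sum())
```

For unanimous ratings the score equals the plain mean; with one low outlier among high ratings, the score sits above the plain mean because the outlier is down-weighted.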
Abstract:
Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error=1.4 ppb), while the best monthly model explained 76% (absolute RMS error=1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models were capable of capturing within-urban variability in NO2, and offer the ability to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006–2011. We are making our model predictions freely available for research.
Abstract:
‘Approximate Bayesian Computation’ (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (Mgal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The ‘Sequential Monte Carlo’ implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. We also highlight, through our chosen case study, the value of careful summary statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
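The paper uses a Sequential Monte Carlo implementation of ABC; the simplest member of the same family, plain rejection ABC, conveys the core idea of accepting parameter draws whose simulated summary statistic falls close to the observed one. The toy Gaussian-mean example below is invented purely for illustration:

```python
import random

def abc_rejection(observed_summary, simulate, prior_sample, eps, n_accept):
    """Basic rejection ABC: draw theta from the prior, simulate data,
    and keep theta whenever the simulated summary statistic lies within
    eps of the observed one. (The paper uses a far more efficient
    Sequential Monte Carlo variant of this scheme.)"""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if abs(simulate(theta) - observed_summary) <= eps:
            accepted.append(theta)
    return accepted

# Toy example: infer the mean of a Gaussian from its sample mean.
random.seed(1)
obs = 2.0
post = abc_rejection(
    observed_summary=obs,
    simulate=lambda th: sum(random.gauss(th, 1.0) for _ in range(50)) / 50,
    prior_sample=lambda: random.uniform(-5, 5),
    eps=0.2,
    n_accept=200,
)
est = sum(post) / len(post)  # posterior mean estimate, close to obs
```

The accepted draws approximate the posterior; tightening `eps` improves the approximation at the cost of more rejections, which is precisely the inefficiency that SMC-ABC is designed to mitigate.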
Abstract:
Objective. Leconotide (CVID, AM336, CNSB004) is an omega conopeptide similar to ziconotide, which blocks voltage-sensitive calcium channels. However, unlike ziconotide, which must be administered intrathecally, leconotide can be given intravenously because it is less toxic. This study investigated the antihyperalgesic potency of leconotide given intravenously, alone and in combination with morphine administered intraperitoneally, in a rat model of bone cancer pain. Design. Syngeneic rat prostate cancer cells (AT3B-1) were injected into one tibia of male Wistar rats. The tumor expanded within the bone, causing hyperalgesia to heat applied to the ipsilateral hind paw. The maximum dose (MD) of morphine and leconotide, given alone and in combination, that caused no effect in an open-field activity monitor, on the rotarod, or on blood pressure and heart rate was determined. Paw withdrawal thresholds from noxious heat were measured. Dose-response curves for morphine (0.312–5.0 mg/kg intraperitoneal) and leconotide (0.002–200 µg/kg intravenous) given alone were plotted, and responses were compared with those caused by morphine and leconotide in combination. Results. Leconotide caused minimal antihyperalgesic effects when administered alone. Morphine given alone intraperitoneally caused dose-related antihyperalgesic effects (ED50 = 2.40 ± 1.24 mg/kg), which were increased by coadministration of leconotide at 20 µg/kg (morphine ED50 = 0.16 ± 1.30 mg/kg), 0.2 µg/kg (morphine ED50 = 0.39 ± 1.27 mg/kg), and 0.02 µg/kg (morphine ED50 = 1.24 ± 1.30 mg/kg). Conclusions. Leconotide caused a significant increase in the reversal by morphine of bone cancer-induced hyperalgesia without increasing the side effect profile of either drug. Clinical Implication. Translation into clinical practice of the method of analgesia described here will improve the quantity and quality of analgesia in patients with bone metastases.
The use of an ordinary parenteral route for administration of the calcium channel blocker (leconotide) at low dose opens up the technique to large numbers of patients who could not have an intrathecal catheter for drug administration. Furthermore, the potentiating synergistic effect with morphine on hyperalgesia without increased side effects will lead to greater analgesia with improved quality of life.
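As a rough illustration of how ED50 values like those reported above are obtained, an ED50 can be read off a dose-response curve by interpolating on log-dose at the half-maximal effect. The doses and effect values below are invented for illustration and are not the paper's data, which were fitted with full dose-response models:

```python
import math

def ed50(doses, effects):
    """Crude ED50 estimate: linear interpolation on log-dose at the
    point where the effect first crosses 50% of its maximum. A sketch
    of the quantity only; proper analyses fit a full curve."""
    target = 0.5 * max(effects)
    pairs = list(zip(doses, effects))
    for (d0, e0), (d1, e1) in zip(pairs, pairs[1:]):
        if e0 <= target <= e1:
            f = (target - e0) / (e1 - e0)  # fractional position in the interval
            return math.exp(math.log(d0) + f * (math.log(d1) - math.log(d0)))
    raise ValueError("effect never crosses 50% of maximum")

# Invented dose (mg/kg) and % effect values for illustration
doses = [0.3125, 0.625, 1.25, 2.5, 5.0]
reversal = [5.0, 15.0, 40.0, 60.0, 85.0]
est_ed50 = ed50(doses, reversal)
```

A leftward shift of the whole curve, as leconotide coadministration produces for morphine, lowers the ED50 without changing its definition.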
Abstract:
Solid-extracellular fluid interaction is believed to play an important role in the strain-rate dependent mechanical behavior of shoulder articular cartilage. The kangaroo shoulder joint is anatomically and biomechanically similar to the human shoulder joint and is readily available in Australia; kangaroo humeral head cartilage was therefore used as the tissue for this study. Indentation tests from quasi-static (10⁻⁴/sec) to moderately high strain-rate (10⁻²/sec) loading were conducted on kangaroo humeral head cartilage tissues to investigate their strain-rate dependent behavior. A finite element (FE) model was then developed in which cartilage was conceptualized as a porous solid matrix filled with incompressible fluid. In this model, the solid matrix was modeled as an isotropic hyperelastic material and the percolating fluid follows Darcy's law. Using an inverse FE procedure, the constitutive parameters related to stiffness, compressibility of the solid matrix and permeability were obtained from the experimental results. The effect of solid-extracellular fluid interaction and drag force (the resistance to fluid movement) on strain-rate dependent behavior was investigated by comparing the influence of constant, strain-dependent and strain-rate dependent permeability on the FE model prediction. The newly developed porohyperelastic cartilage model with strain-rate dependent permeability was found to be able to predict the strain-rate dependent behavior of cartilage.
Abstract:
Structural equation modeling (SEM) is a versatile multivariate statistical technique whose applications have been increasing since its introduction in the 1980s. This paper provides a critical review of 84 articles that used SEM to address construction-related problems over the period 1998–2012, drawn from, but not limited to, seven top construction research journals. A yearly publication trend analysis shows that SEM applications have been accelerating over time. However, there are inconsistencies across the recorded applications and several recurring problems. The important issues to consider in research design, model development and model evaluation are examined and discussed in detail with reference to current applications. A particularly important issue concerns construct validity. Relevant topics for efficient research design also include longitudinal versus cross-sectional studies, mediation and moderation effects, sample size issues and software selection. A guideline framework is provided to help future researchers apply SEM in construction.
Abstract:
The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling and approximation errors are analysed with simulation data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error, making the fixed failure rate threshold determination method feasible for real-time applications.
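The integer bootstrapping (IB) success rate that the approximation procedure relies on has a well-known closed form, P_IB = ∏ᵢ (2Φ(1/(2σᵢ)) − 1), where Φ is the standard normal CDF and the σᵢ are the conditional standard deviations of the (ideally decorrelated) ambiguities. It is indeed cheap to evaluate:

```python
import math

def ib_success_rate(cond_std):
    """Integer bootstrapping success rate from the conditional standard
    deviations of the decorrelated ambiguities:
        P_IB = prod_i ( 2*Phi(1/(2*sigma_i)) - 1 )
    where Phi is the standard normal CDF."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p = 1.0
    for s in cond_std:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p
```

Smaller conditional standard deviations push each factor toward 1 and hence the product toward certainty, which is why the IB rate serves as a convenient, easily computed stand-in for the ILS success rate.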
Abstract:
A fractional FitzHugh–Nagumo monodomain model with zero Dirichlet boundary conditions is presented, generalising the standard monodomain model that describes the propagation of the electrical potential in heterogeneous cardiac tissue. The model consists of a coupled fractional Riesz space nonlinear reaction-diffusion model and a system of ordinary differential equations describing the ionic fluxes as a function of the membrane potential. We solve this model by decoupling the space-fractional partial differential equation from the system of ordinary differential equations at each time step; this amounts to treating the nonlinear source term of the fractional Riesz space reaction-diffusion model as only locally Lipschitz. The fractional Riesz space nonlinear reaction-diffusion model is solved using an implicit numerical method with the shifted Grünwald–Letnikov approximation, and the stability and convergence are discussed in detail in the context of the local Lipschitz property. Numerical examples are given to show the consistency of our computational approach.
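The shifted Grünwald–Letnikov approximation rests on the coefficients g_k = (−1)^k · C(α, k), which weight the shifted differences approximating a space-fractional derivative of order α. They are cheap to generate by recurrence; a sketch (for α = 2 they reduce to the classical second-difference stencil [1, −2, 1]):

```python
def gl_weights(alpha, n):
    """Grünwald–Letnikov coefficients g_k = (-1)^k * C(alpha, k),
    computed via the standard recurrence
        g_0 = 1,   g_k = g_{k-1} * (1 - (alpha + 1) / k).
    Returns [g_0, ..., g_n]."""
    g = [1.0]
    for k in range(1, n + 1):
        g.append(g[-1] * (1.0 - (alpha + 1.0) / k))
    return g
```

In an implicit scheme, these weights fill the bands of the discretisation matrix for the fractional diffusion operator, with the "shift" placing the stencil one node off-centre to restore stability for 1 < α ≤ 2.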
Abstract:
One of the Department of Defense's most pressing environmental problems is the efficient detection and identification of unexploded ordnance (UXO). In regions of highly magnetic soils, magnetic and electromagnetic sensors often detect anomalies that are of geologic origin, adding significantly to remediation costs. In order to develop predictive models for magnetic susceptibility, it is crucial to understand the modes of formation and the spatial distribution of different iron oxides. Most rock types contain iron, and their magnetic susceptibility is determined by the amount and form of the iron oxides present. When rocks weather, the amount and form of the oxides change, producing concomitant changes in magnetic susceptibility. The type of iron oxide found in the weathered rock or regolith is a function of the duration and intensity of weathering, as well as the original iron content of the parent material. The rate of weathering is controlled by rainfall and temperature; thus, knowing the climate zone, the amount of iron in the lithology and the age of the surface helps predict the amount and forms of iron oxide. We have compiled analyses of the types, amounts, and magnetic properties of iron oxides from soils over a wide climate range, from semi-arid grasslands to temperate regions and tropical forests. We find there is a predictable range of iron oxide type and magnetic susceptibility according to the climate zone, the age of the soil and the amount of iron in the unweathered regolith.
An external field prior for the hidden Potts model with application to cone-beam computed tomography
Abstract:
In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
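One way to picture the external field prior is as a per-pixel map built from a mixture of Gaussian fields centred on the objects' positions in the earlier image, which then biases the latent Potts labels toward those regions. The sketch below illustrates that idea only; the positions, bandwidth, and normalisation are invented, not the paper's implementation:

```python
import numpy as np

def external_field(shape, centres, sigma):
    """Per-pixel prior weight for one object label, built as a mixture of
    isotropic Gaussian fields centred on the object positions taken from
    a previously segmented image (illustrative sketch)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    field = np.zeros(shape)
    for cy, cx in centres:
        field += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * sigma ** 2))
    return field / field.max()  # scale the mixture into [0, 1]

# Two hypothetical object centres from a prior (manual) segmentation
prior = external_field((64, 64), centres=[(20, 20), (40, 45)], sigma=5.0)
```

Pixels near a previous object position receive prior weight near 1, while distant pixels receive almost none, so low-CNR observations alone no longer have to carry the segmentation.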
Abstract:
We describe the development and parameterization of a grid-based model of African savanna vegetation processes. The model was developed with the objective of exploring elephant effects on the diversity of savanna species and structure, and in this formulation concentrates on the relative cover of grass and woody plants, the vertical structure of the woody plant community, and the distribution of these over space. Grid cells are linked by seed dispersal and fire, and environmental variability is included in the form of stochastic rainfall and fire events. The model was parameterized from an extensive review of the African savanna literature; where available, parameter values varied widely. The most plausible set of parameters produced long-term coexistence between woody plants and grass, with the tree-grass balance being more sensitive to changes in parameters influencing demographic processes and drought incidence and response, and less sensitive to the fire regime. There was considerable diversity in the woody structure of savanna systems within the range of uncertainty in tree growth rate parameters. Thus, given the paucity of height growth data for woody plant species in southern African savannas, managers of natural areas should be cognizant of different tree species' growth and damage response attributes when considering whether to act on perceived elephant threats to vegetation. © 2007 Springer Science+Business Media B.V.
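A toy version of such a grid model — woody cover growing with stochastic rainfall and knocked back by stochastic fire whose severity rises with grass fuel — can be sketched as follows. Every rate and probability here is invented for illustration and is not taken from the paper's parameterization:

```python
import random

def step(grid, p_fire=0.05, growth=0.1, rain=None):
    """One annual update of a toy tree-grass grid.

    Each cell holds a woody-cover fraction w in [0, 1]. Cover grows with
    rainfall (rain-limited logistic growth) and is reduced by stochastic
    fire, which burns harder where grass fuel (1 - w) is higher.
    All rates are illustrative only."""
    rain = random.random() if rain is None else rain  # stochastic rainfall
    new_grid = []
    for row in grid:
        new_row = []
        for w in row:
            w = min(1.0, w + growth * rain * (1.0 - w))  # rain-limited growth
            if random.random() < p_fire:
                # fire multiplier 1 - 0.5*(1 - w): grassier cells lose more
                w *= 0.5 * (1.0 + w)
            new_row.append(w)
        new_grid.append(new_row)
    return new_grid

# Run a small grid for a few decades of annual steps
random.seed(0)
grid = [[0.2, 0.6], [0.4, 0.8]]
for _ in range(50):
    grid = step(grid)
```

Even this caricature exhibits the qualitative point of the abstract: long-run woody cover is far more sensitive to the growth (demographic) parameter than to the fire probability.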