929 results for Marginal structural model
Abstract:
Steady and pulsed-flow stationary impinging jets have been employed to simulate the wind field produced by a thunderstorm microburst. The effect of jet inclination with respect to the impingement surface on the low-level wind field has been studied. A single-point velocity time history has been compared to the full-scale Andrews AFB microburst for model validation. For steady flow, jet inclination increased the radial extent of high winds but did not increase their magnitude relative to the perpendicular impingement case. For inclined pulsed flow, in contrast, the design wind conditions could increase compared to perpendicular impingement. The location of peak winds was also affected by varying the outlet conditions.
Abstract:
Cells are the fundamental building blocks of plant-based food materials, and many of the structural changes arising during food processing can fundamentally be derived as a function of the deformations of the cellular structure. In food dehydration, bulk-level changes in porosity, density and shrinkage can be better explained using cellular-level deformations initiated by moisture removal from the cellular fluid. A novel approach is used in this research to model the cell fluid with Smoothed Particle Hydrodynamics (SPH) and the cell walls with the Discrete Element Method (DEM), techniques known to be robust in treating complex fluid and solid mechanics. High Performance Computing (HPC) is used for the computations owing to its computational advantages. Compared with state-of-the-art drying models and their deficiencies, the current model is found to be robust in replicating the microscale drying mechanics of plant-based food materials.
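The SPH side of such a coupled model rests on kernel-weighted summation: a particle's density is the kernel-weighted sum of its neighbours' masses. A minimal pure-Python 1-D sketch of this summation follows; the particle spacing, smoothing length, and reference density are illustrative values, not those of the paper's cell model:

```python
def cubic_spline_w(r, h):
    # Standard 1-D cubic spline SPH kernel with support radius 2h.
    q, sigma = abs(r) / h, 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

dx, h, rho0 = 0.05, 0.1, 1000.0      # spacing, smoothing length, fluid density
xs = [i * dx for i in range(11)]     # a short line of fluid particles
m = rho0 * dx                        # particle mass consistent with rho0
# SPH density summation evaluated at the middle particle
rho_mid = sum(m * cubic_spline_w(xs[5] - xj, h) for xj in xs)
print(rho_mid)
```

For uniformly spaced interior particles the summation reproduces the reference density almost exactly, which is a common sanity check before moving to deforming configurations.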
Abstract:
We have developed a Hierarchical Look-Ahead Trajectory Model (HiLAM) that incorporates the firing pattern of medial entorhinal grid cells in a planning circuit that includes interactions with the hippocampus and prefrontal cortex. We show the model’s flexibility in representing large real-world environments using odometry information obtained from challenging video sequences. We acquire the visual data from a camera mounted on a small tele-operated vehicle. The camera has a panoramic field of view with its focal point approximately 5 cm above ground level, similar to what would be expected from a rat’s point of view. Using established algorithms for calculating perceptual speed from the apparent rate of visual change over time, we generate raw dead-reckoning information, which loses spatial fidelity over time due to error accumulation. We rectify this loss of fidelity by exploiting the loop-closure detection ability of a biologically inspired robot navigation model termed RatSLAM. The rectified motion information serves as a velocity input to HiLAM to encode the environment in the form of grid cell and place cell maps. Finally, we show goal-directed path planning results of HiLAM in two different environments: an indoor square maze used in rodent experiments and an outdoor arena more than two orders of magnitude larger than the indoor maze. Together these results bridge for the first time the gap between higher-fidelity bio-inspired navigation models (HiLAM) and more abstracted but highly functional bio-inspired robotic mapping systems (RatSLAM), and move from simulated environments into real-world studies in rodent-sized arenas and beyond.
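The dead-reckoning step described above amounts to integrating per-frame speed and heading-rate estimates into a pose, with error accumulating over time. A minimal sketch, using invented speed and turn-rate samples rather than values derived from the paper's video data:

```python
import math

def integrate_odometry(samples, dt=0.1):
    # samples: (speed m/s, yaw rate rad/s) per frame; returns (x, y, heading).
    x = y = theta = 0.0
    for v, omega in samples:
        theta += omega * dt                 # update heading first
        x += v * math.cos(theta) * dt       # then advance the position
        y += v * math.sin(theta) * dt
    return x, y, theta

# ten frames of straight-line motion at 1 m/s
pose = integrate_odometry([(1.0, 0.0)] * 10)
print(pose)
```

Because every sample's noise is summed into the pose, drift grows without bound; loop-closure corrections (the RatSLAM step) are what pull this drifting estimate back toward consistency.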
Abstract:
Poor compliance with speed limits is a serious safety concern in work zones. Most studies of work zone speeds have focused on descriptive analyses and statistical testing without systematically capturing the effects of vehicle and traffic characteristics. Consequently, little is known about how the characteristics of surrounding traffic and platoons influence speeds. This paper develops a Tobit regression technique for modeling both the probability and the magnitude of non-compliance with speed limits at various locations in work zones. Speed data are transformed into two groups—continuous for non-compliant drivers and left-censored for compliant drivers—for modeling within a Tobit framework. The technique is illustrated using speed data from three long-term highway work zones in Queensland, Australia. Consistent and plausible model estimates across the three work zones support the appropriateness and validity of the technique. The results show that the probability and magnitude of speeding were higher for leaders of platoons with larger front gaps, during late afternoon and early morning, when traffic volumes were higher, and when higher proportions of surrounding vehicles were non-compliant. Light vehicles and their followers were also more likely to speed than others. Speeding was more common and greater in magnitude upstream than in the activity area, with higher compliance rates close to the end of the activity area and close to stop/slow traffic controllers. The modeling technique and results can assist in deploying appropriate countermeasures by better identifying the traffic characteristics associated with speeding and the locations of lower compliance.
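The censoring idea can be sketched with a toy left-censored (Tobit) likelihood: compliant speeds contribute only the probability mass at or below the limit, while non-compliant speeds contribute a normal density term for the observed magnitude. All numbers below are invented, and the paper's model additionally includes traffic and vehicle covariates rather than a single latent mean:

```python
import math

def tobit_loglik(speeds, limit, mu, sigma):
    # Left-censored Tobit log-likelihood with latent speed ~ N(mu, sigma).
    ll = 0.0
    z = (limit - mu) / sigma
    for y in speeds:
        if y <= limit:                       # compliant: censored at the limit
            ll += math.log(0.5 * (1 + math.erf(z / math.sqrt(2))))
        else:                                # non-compliant: observed magnitude
            u = (y - mu) / sigma
            ll += -math.log(sigma * math.sqrt(2 * math.pi)) - 0.5 * u * u
    return ll

speeds = [58, 60, 60, 63, 67, 72, 60, 59]    # km/h, hypothetical work-zone data
limit = 60
# crude grid search for the latent mean at a fixed sigma
best = max(range(55, 76), key=lambda m: tobit_loglik(speeds, limit, m, 5.0))
print(best)
```

The censored observations pull the estimated latent mean down while the observed speeding magnitudes pull it up, which is exactly the trade-off a descriptive analysis of speeds above the limit would miss.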
Abstract:
This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing disproved 7 of the hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals’ perceived intention to use AeH systems. A validation of the model with a wider survey cohort would be useful to confirm the current findings.
Abstract:
Braking is a crucial driving task with a direct relationship to crash risk, as both excessive and inadequate braking can lead to collisions. The objective of this study was to compare the braking profile of young drivers distracted by mobile phone conversations to non-distracted braking. In particular, the braking behaviour of drivers in response to a pedestrian entering a zebra crossing was examined using the CARRS-Q Advanced Driving Simulator. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free, and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The drivers were 18–26 years old and split evenly by gender. A linear mixed model analysis of braking profiles along the roadway before the pedestrian crossing revealed comparatively increased decelerations among distracted drivers, particularly during the initial 20 kph of deceleration. Drivers’ initial 20 kph deceleration time was modelled using a parametric accelerated failure time (AFT) hazard-based duration model with a Weibull distribution with clustered heterogeneity to account for the repeated-measures experimental design. Factors found to significantly influence the braking task included vehicle dynamics variables such as initial speed and maximum deceleration, phone condition, and driver-specific variables such as licence type, crash involvement history, and self-reported experience of using a mobile phone whilst driving. Distracted drivers on average appear to reduce the speed of their vehicle faster and more abruptly than non-distracted drivers, exhibiting comparatively excessive braking and perhaps revealing risk compensation. Braking appears to be more aggressive for distracted drivers with provisional licences than for drivers with open licences. Abrupt or excessive braking by distracted drivers might pose significant safety concerns to following vehicles in a traffic stream.
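A Weibull AFT duration model of the kind described treats the log of the deceleration time as linear in covariates with an extreme-value error. The sketch below evaluates that log-likelihood for invented deceleration times and a single distraction indicator; the paper's specification has more covariates plus clustered heterogeneity, which is omitted here:

```python
import math

def weibull_aft_loglik(times, xs, b0, b1, sigma):
    # Weibull AFT: log(T) = b0 + b1*x + sigma*W, W ~ standard extreme-value.
    ll = 0.0
    for t, x in zip(times, xs):
        z = (math.log(t) - b0 - b1 * x) / sigma
        # log density of T under the extreme-value error on log-time
        ll += z - math.exp(z) - math.log(sigma) - math.log(t)
    return ll

# hypothetical initial-deceleration times (s) and distraction indicator
times = [2.1, 2.4, 1.8, 1.6, 2.9, 2.2]
xs    = [0,   0,   1,   1,   0,   1]
print(weibull_aft_loglik(times, xs, 0.8, -0.2, 0.3))
```

A negative coefficient on the distraction indicator shortens the predicted log-time, matching the finding that distracted drivers completed the initial deceleration faster.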
Abstract:
Stormwater pollution is linked to stream ecosystem degradation. Various types of modelling techniques are adopted to predict stormwater pollution. The accuracy of the predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective for model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
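The two validation procedures differ only in how the hold-out sets are formed: LOO holds out each observation exactly once, while MCCV repeatedly draws random train/test splits. A minimal sketch on invented data, with a simple straight-line fit standing in for a wash-off model:

```python
import random
import statistics

def fit_line(xs, ys):
    # ordinary least-squares fit of y = a + b*x
    n = len(xs); mx = sum(xs) / n; my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loo_errors(xs, ys):
    # leave-one-out: each point is held out exactly once
    errs = []
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return errs

def mccv_errors(xs, ys, splits=200, test_frac=0.3, seed=1):
    # Monte Carlo CV: repeated random train/test partitions
    rng = random.Random(seed); errs = []
    n = len(xs); k = max(1, int(test_frac * n))
    for _ in range(splits):
        idx = list(range(n)); rng.shuffle(idx)
        test, train = idx[:k], idx[k:]
        a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
        errs += [(ys[i] - (a + b * xs[i])) ** 2 for i in test]
    return errs

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.1, 3.9, 6.2, 8.0, 9.8, 12.3, 13.9, 16.1]   # roughly y = 2x
loo_mse = statistics.mean(loo_errors(xs, ys))
mccv_mse = statistics.mean(mccv_errors(xs, ys))
print(loo_mse, mccv_mse)
```

Because MCCV can leave out larger fractions of the data, the spread of its error estimates reflects small-sample instability more honestly than a single LOO pass, which is consistent with the paper's preference for MCCV with limited datasets.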
Abstract:
Bone morphogenetic proteins (BMPs) are distributed along a dorsal-ventral (DV) gradient in many developing embryos. The spatial distribution of this signaling ligand is critical for correct DV axis specification. In various species, BMP expression is spatially localized, and BMP gradient formation relies on BMP transport, which in turn requires interactions with the extracellular proteins Short gastrulation/Chordin (Chd) and Twisted gastrulation (Tsg). These binding interactions promote BMP movement and concomitantly inhibit BMP signaling. The protease Tolloid (Tld) cleaves Chd, which releases BMP from the complex and permits it to bind the BMP receptor and signal. In sea urchin embryos, BMP is produced in the ventral ectoderm, but signals in the dorsal ectoderm. The transport of BMP from the ventral ectoderm to the dorsal ectoderm in sea urchin embryos is not understood. Therefore, using information from a series of experiments, we adapt the mathematical model of Mizutani et al. (2005) and embed it as the reaction part of a one-dimensional reaction–diffusion model. We use it to study aspects of this transport process in sea urchin embryos. We demonstrate that the receptor-bound BMP concentration exhibits dorsally centered peaks of the same type as those observed experimentally when the ternary transport complex (Chd-Tsg-BMP) forms relatively quickly and BMP receptor binding is relatively slow. Similarly, dorsally centered peaks are created when the diffusivities of BMP, Chd, and Chd-Tsg are relatively low and that of Chd-Tsg-BMP is relatively high, and the model dynamics also suggest that Tld is a principal regulator of the system. At the end of this paper, we briefly compare the observed dynamics in the sea urchin model to a version that applies to the fly embryo, and we find that the same conditions can account for BMP transport in the two types of embryos only if Tld levels are reduced in the sea urchin compared to the fly.
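The shuttling mechanism can be caricatured with just two species: slow-diffusing free BMP produced ventrally, and a fast-diffusing transport complex that releases BMP where cleavage is active. The explicit finite-difference sketch below is a drastically simplified stand-in for the paper's multi-species model; all rate constants and diffusivities are illustrative, not fitted values:

```python
def simulate(nx=30, nt=4000, dt=0.01, dx=1.0, d_free=0.1, d_cx=2.0):
    bmp = [0.0] * nx          # free BMP; ventral end at i = 0
    cx = [0.0] * nx           # Chd-Tsg-BMP transport complex
    for _ in range(nt):
        nb, ncx = bmp[:], cx[:]
        nb[0] += 1.0 * dt                         # ventral BMP production
        for i in range(nx):
            k_on = 0.5                            # complex formation
            k_off = 0.5 if i > nx // 2 else 0.01  # dorsal Tld-mediated release
            flux = (k_on * bmp[i] - k_off * cx[i]) * dt
            nb[i] -= flux
            ncx[i] += flux
        for i in range(1, nx - 1):                # interior diffusion update
            nb[i] += d_free * dt / dx**2 * (bmp[i+1] - 2*bmp[i] + bmp[i-1])
            ncx[i] += d_cx * dt / dx**2 * (cx[i+1] - 2*cx[i] + cx[i-1])
        bmp, cx = nb, ncx
    return bmp

profile = simulate()
dorsal_mass = sum(profile[len(profile) // 2:])
print(dorsal_mass)
```

Even this caricature reproduces the qualitative point: free BMP appears on the dorsal half only because the fast complex carries it there and spatially biased release (the Tld step) deposits it, mirroring the paper's conclusion that Tld is a principal regulator.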
Abstract:
A nonlinear interface element modelling method is formulated for predicting the deformation and failure of high-adhesive, thin-layer polymer mortared masonry exhibiting failure of both units and mortar. Plastic flow vectors are explicitly integrated within the implicit finite element framework instead of relying on predictor–corrector-type approaches. The method is calibrated using experimental data from uniaxial compression, shear triplet and flexural beam tests. The model is validated against a thin-layer mortared masonry shear wall whose experimental datasets are reported in the literature, and is used to examine the behaviour of thin-layer mortared masonry under biaxial loading.
Abstract:
Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error=1.4 ppb), while the best monthly model explained 76% (absolute RMS error=1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models were capable of capturing within-urban variability in NO2, and offer the ability to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006–2011. We are making our model predictions freely available for research.
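In spirit, the regression step fits measured NO2 against spatial predictors. The sketch below uses ordinary least squares on invented data as a simplified stand-in for the paper's GEE model, which additionally accounts for correlation among repeated monthly observations at each monitor; the predictor names and values are hypothetical:

```python
def solve(a, b):
    # Gaussian elimination with partial pivoting for a small linear system.
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    x = [0.0] * n
    for c in range(n - 1, -1, -1):
        x[c] = (m[c][n] - sum(m[c][j] * x[j] for j in range(c + 1, n))) / m[c][c]
    return x

def ols(rows, y):
    # Normal equations (X'X) beta = X'y with an intercept column.
    X = [[1.0] + r for r in rows]
    k = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(xtx, xty)

# predictors per site: satellite NO2 column, road density (invented units)
rows = [[1.0, 0.2], [2.0, 0.5], [3.0, 0.1], [4.0, 0.9], [5.0, 0.4], [2.5, 0.7]]
y = [2 + 3 * c + 1.5 * r for c, r in rows]   # noiseless toy surface
beta = ols(rows, y)
print(beta)
```

With noiseless toy data the coefficients are recovered exactly; with real monitoring data the satellite column typically carries most of the explanatory power, which is why a simple column-based model performed nearly as well as the surface-to-column approach.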
Abstract:
We present a machine learning model that predicts a structural disruption score from a protein's primary structure. SCHEMA was introduced by Frances Arnold and colleagues as a method for determining putative recombination sites of a protein on the basis of the full Protein Data Bank (PDB) description of its structure. The present method provides an alternative to SCHEMA that is able to determine the same score from sequence data only. Circumventing the need to resolve the full structure enables the exploration of as-yet-unresolved and even hypothetical sequences for protein design efforts. Deriving the SCHEMA score from a primary structure is achieved using a two-step approach: first predicting a secondary structure from the sequence, and then predicting the SCHEMA score from the predicted secondary structure. The correlation coefficient for the prediction is 0.88, indicating the feasibility of replacing SCHEMA with little loss of precision.
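The two-step idea can be illustrated with a deliberately toy pipeline: a rule-based stand-in for the secondary-structure predictor, followed by a stand-in disruption score computed on the predicted structure. Everything here (the residue classes, the scoring rule) is hypothetical; the paper's stages are trained models, and SCHEMA proper counts broken residue contacts:

```python
# Step 1: toy secondary-structure predictor (hypothetical helix-former set)
HELIX_FORMERS = set("AELM")

def predict_ss(seq):
    return "".join("H" if aa in HELIX_FORMERS else "C" for aa in seq)

# Step 2: toy disruption score on the predicted structure: a crossover
# falling inside a contiguous structural element counts as disruptive
def disruption_score(ss, crossover):
    return 1 if 0 < crossover < len(ss) and ss[crossover - 1] == ss[crossover] else 0

seq = "MAELKGV"          # hypothetical chimera parent fragment
ss = predict_ss(seq)
score = disruption_score(ss, 3)
print(ss, score)
```

The point of the architecture is that step 2 never touches 3-D coordinates: once secondary structure is predicted from sequence alone, the score becomes available for unresolved or hypothetical sequences.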
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one using quasi-Monte Carlo methods and one using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
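The intractability arises because the block likelihood integrates over the random effects; an unbiased estimate replaces that integral with a Monte Carlo average. The sketch below does this for a toy random-intercept normal model (all parameters invented; the paper uses quasi-Monte Carlo or Laplace-with-importance-sampling estimators rather than plain Monte Carlo):

```python
import math
import random

def mc_likelihood(y_block, mu=0.0, sigma_b=1.0, sigma_e=0.5, draws=500, seed=7):
    # Unbiased Monte Carlo estimate of the marginal likelihood of one block:
    # p(y) = E_b[ prod_j N(y_j | mu + b, sigma_e^2) ], with b ~ N(0, sigma_b^2).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        b = rng.gauss(0.0, sigma_b)          # draw a random intercept
        p = 1.0
        for y in y_block:
            z = (y - mu - b) / sigma_e
            p *= math.exp(-0.5 * z * z) / (sigma_e * math.sqrt(2 * math.pi))
        total += p
    return total / draws

est = mc_likelihood([0.2, -0.1, 0.3])        # one block of hypothetical data
print(est)
```

Plugging such an unbiased estimate into the particle weights is what makes the resulting algorithm "exact-approximate": despite the noisy likelihood, the weighted particles still target the correct posterior.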
Abstract:
The chubby baby who eats well is desirable in our culture. Perceived low weight gains and feeding concerns are common reasons mothers seek advice in the early years. In contrast, childhood obesity is a global public health concern. Use of coercive feeding practices, prompted by maternal concern about weight, may disrupt a child’s innate self-regulation of energy intake, promoting overeating and overweight. This study describes predictors of maternal concern about her child undereating or becoming underweight, and associated feeding practices. Mothers in the control group of the NOURISH and South Australian Infants Dietary Intake studies (n = 332) completed a self-administered questionnaire when the child was aged 12–16 months. Weight-for-age z-score (WAZ) was derived from weight measured by study staff. Mean age (SD) was 13.8 (1.3) months, mean WAZ (SD) was 0.58 (0.86), and 49% of the children were male. WAZ and two questions describing food refusal were combined in a structural equation model with four items from the Infant Feeding Questionnaire (IFQ) to form the factor ‘Concern about undereating/weight’. Structural relationships were drawn between concern and the IFQ factors ‘awareness of infant’s hunger and satiety cues’, ‘use of food to calm infant’s fussiness’ and ‘feeding infant on a schedule’, resulting in a model of acceptable fit. Lower WAZ and higher frequency of food refusal predicted higher maternal concern. Higher maternal concern was associated with lower awareness of infant cues (r = −.17, p = .01) and greater use of food to calm (r = .13, p = .03). In a cohort of healthy children, maternal concern about undereating and underweight was associated with practices that have the potential to disrupt self-regulation.
Abstract:
Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises for example when the repository covers multiple variants of the same processes or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
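Both algorithms only need pairwise distances between process fragments, not the fragments themselves. A minimal DBSCAN sketch over an invented distance matrix follows; the five "fragments" and their graph-edit-style distances are hypothetical, standing in for the model fragments the article clusters:

```python
def dbscan(dist, eps, min_pts):
    # Minimal DBSCAN over a precomputed distance matrix.
    n = len(dist)
    labels = [None] * n        # None = unvisited, -1 = noise, 1.. = cluster id
    cid = 0
    neighbours = lambda i: [j for j in range(n) if dist[i][j] <= eps]
    for i in range(n):
        if labels[i] is not None:
            continue
        nb = neighbours(i)
        if len(nb) < min_pts:
            labels[i] = -1     # noise; may later be absorbed as a border point
            continue
        cid += 1
        labels[i] = cid
        queue = [j for j in nb if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid            # border point of this cluster
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb_j = neighbours(j)
            if len(nb_j) >= min_pts:       # core point: keep expanding
                queue.extend(nb_j)
    return labels

# invented pairwise distances between five process fragments:
# fragments 0-2 are near-duplicates, as are fragments 3-4
D = [[0.0, 0.1, 0.1, 0.9, 0.9],
     [0.1, 0.0, 0.1, 0.9, 0.9],
     [0.1, 0.1, 0.0, 0.9, 0.9],
     [0.9, 0.9, 0.9, 0.0, 0.2],
     [0.9, 0.9, 0.9, 0.2, 0.0]]
clusters = dbscan(D, eps=0.3, min_pts=2)
print(clusters)
```

Each returned cluster is a candidate set of approximate clones; the article's standardizability measure would then rank such clusters by the benefit of replacing them with one shared subprocess.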