115 results for Random effects
Abstract:
Objectives. Intrusive memories of extreme trauma can disrupt a stepwise approach to imaginal exposure. Concurrent tasks that load the visuospatial sketchpad (VSSP) of working memory reduce the vividness of recalled images. This study tested whether the relief of distress from competing VSSP tasks during imaginal exposure comes at the cost of impaired desensitization. Design. This study examined repeated exposure to emotive memories using 18 unselected undergraduates and a within-subjects design with three exposure conditions (Eye Movement, Visual Noise, Exposure Alone) in random, counterbalanced order. Method. At baseline, participants recalled positive and negative experiences and rated the vividness and emotiveness of each image. A different positive and negative recollection was then used for each condition. Vividness and emotiveness were rated after each of eight exposure trials. At a post-exposure session 1 week later, participants rated each image without any concurrent task. Results. Consistent with previous research, vividness and distress during imaging were lower during Eye Movements than in Exposure Alone, with passive visual interference giving intermediate results. The reduction in emotional responses from baseline to post-exposure was of similar size for the three conditions. Conclusion. Visuospatial tasks may offer a temporary response aid for imaginal exposure without affecting desensitization.
Abstract:
The advance of rapid prototyping techniques has significantly improved control over the pore network architecture of tissue engineering scaffolds. In this work we assessed the influence of scaffold pore architecture on cell seeding and static culturing, by comparing a computer‐designed gyroid architecture fabricated by stereolithography to a random‐pore architecture resulting from salt‐leaching. The scaffold types showed comparable porosity and pore size values, but the gyroid type showed a more than tenfold higher permeability due to the absence of size‐limiting pore interconnections. The higher permeability significantly improved the wetting properties of the hydrophobic scaffolds, and increased the settling speed of cells upon static seeding of immortalised mesenchymal stem cells. After dynamic seeding followed by 5 days of static culture, gyroid scaffolds showed large cell populations in the centre of the scaffold, while salt‐leached scaffolds were covered with a cell‐sheet on the outside and no cells were found in the scaffold centre. It was shown that interconnectivity of the pores and permeability of the scaffold prolongs the time of static culture before overgrowth of cells at the scaffold periphery occurs. Furthermore, novel scaffold designs are proposed to further improve the transport of oxygen and nutrients throughout the scaffolds, and to create tissue engineering grafts with designed, pre‐fabricated vasculature.
Abstract:
Velocity jump processes are discrete random walk models that have many applications including the study of biological and ecological collective motion. In particular, velocity jump models are often used to represent a type of persistent motion, known as a “run and tumble”, which is exhibited by some isolated bacteria cells. All previous velocity jump processes are non-interacting, which means that crowding effects and agent-to-agent interactions are neglected. By neglecting these agent-to-agent interactions, traditional velocity jump models are only applicable to very dilute systems. Our work is motivated by the fact that many applications in cell biology, such as wound healing, cancer invasion and development, often involve tissues that are densely packed with cells where cell-to-cell contact and crowding effects can be important. To describe these kinds of high cell density problems using a velocity jump process we introduce three different classes of crowding interactions into a one-dimensional model. Simulation data and averaging arguments lead to a suite of continuum descriptions of the interacting velocity jump processes. We show that the resulting systems of hyperbolic partial differential equations predict the mean behavior of the stochastic simulations very well.
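The non-interacting baseline that the abstract above extends can be sketched in a few lines. The following is an illustrative simulation of a 1D velocity jump ("run and tumble") process; all function names and parameter values are assumptions for the sketch, not taken from the paper.

```python
import numpy as np

# Each agent runs at constant speed +v or -v and reverses direction
# ("tumbles") as a Poisson process with rate lam. This is the classical
# non-interacting model; a crowding rule would additionally reject moves
# into occupied space.

def simulate_velocity_jump(n_agents=200, v=1.0, lam=0.5,
                           dt=0.01, n_steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(n_agents)                      # agent positions
    s = rng.choice([-1.0, 1.0], size=n_agents)  # run directions
    for _ in range(n_steps):
        tumble = rng.random(n_agents) < lam * dt  # tumbling events this step
        s[tumble] *= -1.0                         # reverse direction
        x += s * v * dt                           # run at constant speed
    return x

positions = simulate_velocity_jump()
```

At long times the non-interacting process behaves diffusively with effective diffusivity v²/(2λ), which is the kind of mean behavior the continuum descriptions in the paper generalize to the interacting case.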
Abstract:
Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment is used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as with lattice-based models, but an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
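A minimal sketch can illustrate how a confluent density emerges in a lattice-free model rather than being predefined. Here cells live on a 1D interval and a proliferation event is aborted if the daughter would overlap an existing cell (a hard-core crowding rule); names and parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np

def simulate_lattice_free(L=50.0, radius=0.5, p_prolif=0.2,
                          n_steps=200, seed=1):
    rng = np.random.default_rng(seed)
    cells = list(np.linspace(5.0, 45.0, 5))  # initial cell centres
    for _ in range(n_steps):
        for parent in list(cells):
            if rng.random() < p_prolif:
                new = parent + rng.choice([-1.0, 1.0]) * 2 * radius
                # crowding: abort the event if the target site would
                # overlap any existing cell (centres closer than 2*radius)
                if 0.0 <= new <= L and all(abs(new - c) >= 2 * radius - 1e-9
                                           for c in cells):
                    cells.append(new)
    return np.sort(cells)

final_positions = simulate_lattice_free()
```

Because placement positions are continuous, the population saturates at an emergent packing that depends on the history of aborted events, mirroring the slower high-density growth the abstract describes.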
Abstract:
The health effects of environmental hazards are often examined using time series of the association between a daily response variable (e.g., deaths) and a daily level of exposure (e.g., temperature). Exposures are usually averaged over a network of stations. This gives each station equal importance and removes the opportunity for some stations to be better measures of exposure. We used a Bayesian hierarchical model that weighted stations using random variables between zero and one. We compared the weighted estimates to the standard model using data on health outcomes (deaths and hospital admissions) and exposures (air pollution and temperature) in Brisbane, Australia. The improvements in model fit were relatively small, and the estimated health effects of pollution were similar using either the standard or weighted estimates. Spatially weighted exposures would probably be more worthwhile when there is either greater spatial detail in the health outcome or greater spatial variation in exposure.
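The weighting idea can be illustrated without the full Bayesian machinery. This sketch picks a single station weight w in [0, 1] by grid search, scoring each weighted exposure by the fit of a log-linear Poisson model; the simulated data, the two-station setup, and the least-squares shortcut for the coefficients are all assumptions made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 365
station1 = rng.normal(25.0, 4.0, n_days)            # informative station
station2 = station1 + rng.normal(0.0, 6.0, n_days)  # noisier station
deaths = rng.poisson(np.exp(1.0 + 0.05 * station1)) # outcome tracks station1

def poisson_score(w):
    """Poisson log-likelihood of a log-linear model using the w-weighted
    station average as exposure (coefficients fit crudely by least
    squares on the log scale, a stand-in for a proper GLM fit)."""
    x = w * station1 + (1.0 - w) * station2
    A = np.column_stack([np.ones(n_days), x])
    beta, *_ = np.linalg.lstsq(A, np.log(deaths + 0.5), rcond=None)
    mu = np.exp(A @ beta)
    return np.sum(deaths * np.log(mu) - mu)

grid = np.linspace(0.0, 1.0, 21)
best_w = grid[np.argmax([poisson_score(w) for w in grid])]
```

In the paper the weights are random variables in a hierarchical model fitted jointly with the health effect, rather than a single weight chosen by grid search.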
Abstract:
Molecular dynamics simulations were carried out on single-chain models of linear low-density polyethylene in vacuum to study the effects of branch length, branch content, and branch distribution on the polymer's crystalline structure at 300 K. The trans/gauche (t/g) ratios of the backbones of the modeled molecules were calculated and used to characterize their degree of crystallinity. The results show that the t/g ratio decreases with increasing branch content regardless of branch length and branch distribution, indicating that branch content is the key molecular parameter controlling the degree of crystallinity. Although the t/g ratios of models with the same branch content vary, these differences are of secondary importance. However, our data suggest that branch distribution (regular or random) has a significant effect on the degree of crystallinity for models containing 10 hexyl branches/1,000 backbone carbons. The fractions of branches residing in the equilibrium crystalline structures of the models were also calculated. On average, 9.8% and 2.5% of the branches were found in the crystallites of the molecules with ethyl and hexyl branches, respectively, while 13C NMR experiments showed that the respective probabilities of branch inclusion for ethyl and hexyl branches are 10% and 6% [Hosoda et al., Polymer 1990, 31, 1999–2005]. However, the degree of branch inclusion seems to be insensitive to branch content and branch distribution.
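The t/g ratio used above as a crystallinity measure can be computed by classifying backbone torsion angles. This sketch folds torsions to [0, 180] degrees and labels them trans above an assumed 120-degree cutoff and gauche below it; real analyses work from MD trajectory frames rather than a flat list of angles.

```python
import numpy as np

def trans_gauche_ratio(dihedrals_deg):
    # fold any torsion angle to its magnitude in [0, 180] degrees
    folded = np.abs((np.asarray(dihedrals_deg) + 180.0) % 360.0 - 180.0)
    n_trans = np.count_nonzero(folded > 120.0)    # near 180: trans
    n_gauche = np.count_nonzero(folded <= 120.0)  # near +/-60: gauche
    return n_trans / n_gauche

# 180, 175, -170 are trans; 60, -60 are gauche -> ratio 3/2
ratio = trans_gauche_ratio([180.0, 175.0, -170.0, 60.0, -60.0])
```

A higher ratio means a more extended, all-trans backbone, which is why it tracks the degree of crystallinity.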
Abstract:
This paper was designed to study the metabonomic characters of hepatotoxicity induced by alcohol and the intervention effects of Yin Chen Hao Tang (YCHT), a classic traditional Chinese medicine formula for the treatment of jaundice and liver disorders in China. Urinary samples from control, alcohol- and YCHT-treated rats were analyzed by ultra-performance liquid chromatography/electrospray ionization quadrupole time-of-flight mass spectrometry (UPLC/ESI-QTOF-MS) in positive ionization mode. The total ion chromatograms obtained from the control, alcohol- and YCHT-treated rats were easily distinguishable using a multivariate statistical analysis method such as principal components analysis (PCA). The greatest difference in metabolic profiling was observed in alcohol-treated rats compared with the control and YCHT-treated rats. The positive ion m/z 664.3126 (9.00 min) was elevated in the urine of alcohol-treated rats, whereas the ions m/z 155.3547 (10.96 min) and 708.2932 (9.01 min) were at lower concentrations than in the urine of control rats; however, these ions did not differ significantly between control and YCHT-treated rats. The ion m/z 664.3126 was found to correspond to ceramide (d18:1/25:0), providing further support for an involvement of the sphingomyelin signaling pathway in alcohol hepatotoxicity and the intervention effects of YCHT.
Abstract:
Robust facial expression recognition (FER) under occluded face conditions is challenging. It requires robust algorithms for feature extraction and investigations into the effects of different types of occlusion on recognition performance. Previous FER studies in this area have been limited, focusing on recovery strategies for the loss of local texture information and testing only a few types of occlusion, predominantly with a matched train-test strategy. This paper proposes a robust approach that employs a Monte Carlo algorithm to extract a set of Gabor-based part-face templates from gallery images and converts these templates into template match distance features. The resulting feature vectors are robust to occlusion because occluded parts are covered by some, but not all, of the random templates. The method is evaluated using facial images with occluded regions around the eyes and the mouth, randomly placed occlusion patches of different sizes, and near-realistic occlusion of the eyes with clear and solid glasses. Both matched and mismatched train and test strategies are adopted to analyze the effects of such occlusion. Overall recognition performance and the performance for each facial expression are investigated. Experimental results on the Cohn-Kanade and JAFFE databases demonstrate the high robustness and fast processing speed of our approach, and provide useful insight into the effects of occlusion on FER. The results on parameter sensitivity demonstrate a certain level of robustness of the approach to changes in the orientation and scale of the Gabor filters, the size of templates, and occlusion ratios. Performance comparisons with previous approaches show that the proposed method is more robust to occlusion, with lower reductions in accuracy from occlusion of the eyes or mouth.
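The template match distance idea can be sketched as follows: part-face templates are sampled at random (Monte Carlo) from a gallery image, and a probe image is then described by each template's minimum match distance over all placements. The Gabor filtering step is omitted here for brevity, and all names and sizes are illustrative assumptions.

```python
import numpy as np

def sample_templates(gallery, n_templates=8, size=8, seed=0):
    """Randomly sample square part-face templates from a gallery image."""
    rng = np.random.default_rng(seed)
    h, w = gallery.shape
    rows = rng.integers(0, h - size, n_templates)
    cols = rng.integers(0, w - size, n_templates)
    return [gallery[r:r + size, c:c + size].copy()
            for r, c in zip(rows, cols)]

def match_distance_features(probe, templates):
    """Feature vector: each template's minimum sum-of-squared-differences
    over all placements in the probe image."""
    h, w = probe.shape
    feats = []
    for t in templates:
        s = t.shape[0]
        best = min(np.sum((probe[r:r + s, c:c + s] - t) ** 2)
                   for r in range(h - s + 1)
                   for c in range(w - s + 1))
        feats.append(best)
    return np.array(feats)

gallery = np.random.default_rng(3).random((24, 24))
templates = sample_templates(gallery)
# matching the gallery against its own templates gives zero distances
features = match_distance_features(gallery, templates)
```

An occluded probe only corrupts the features of templates that land on the occluded region, which is why the full feature vector stays informative.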
Abstract:
Business planning is at the core of entrepreneurship as it has implications for opportunity discovery and exploitation. This thesis' objectives are to disentangle the relationships between business planning and venture emergence to reconcile previous inconsistent findings. It reveals that the formalization of planning, the effort invested in the venture and the revision of the plan influence success for entrepreneurs in the process of launching their firm. This thesis provides generalizable results about the phenomenon of business planning by using a longitudinal random sample of emerging firms.
Abstract:
Objects in an environment are often encountered sequentially during spatial learning, forming a path along which object locations are experienced. The present study investigated the effect of spatial information conveyed through the path in visual and proprioceptive learning of a room-sized spatial layout, exploring whether different modalities differentially depend on the integrity of the path. Learning object locations along a coherent path was compared with learning them in a spatially random manner. Path integrity had little effect on visual learning, whereas learning with the coherent path produced better memory performance than random order learning for proprioceptive learning. These results suggest that path information has differential effects in visual and proprioceptive spatial learning, perhaps due to a difference in the way one establishes a reference frame for representing relative locations of objects.
Abstract:
Background. Random Breath Testing (RBT) has proven to be a cornerstone of enforcement attempts to deter (as well as apprehend) drink drivers in Queensland (Australia) for decades. However, scant published research has examined the relationship between the frequency of RBT activities and subsequent drink driving apprehension rates over time. Aim. This study aimed to examine the prevalence of apprehending drink drivers in Queensland over a 12-year period. It was hypothesised that an increase in breath testing rates would produce a corresponding decrease in drink driving apprehension rates over time, reflecting general deterrent effects. Method. The Queensland Police Service provided the RBT data that were analysed. Results. Between 1 January 2000 and 31 December 2011, 35,082,386 random breath tests (both mobile and stationary) were conducted in Queensland, resulting in 248,173 individuals being apprehended for drink driving offences. A total of 342,801 offences were recorded during this period, representing an intercept rate of .96. Of these offences, 276,711 (80.72%) were recorded against males and 66,024 (19.28%) against females. The most common drink driving offence involved a BAC between the 0.05 and 0.08 limits. The largest proportion of offences was detected on weekends, with Saturday (27.60%) the most common drink driving night, followed by Sunday (21.41%). The prevalence of drink driving detection rose steadily over time, peaking in 2008 and 2009 before declining slightly. This decline was observed across all Queensland regions, and any increase in annual figures was due to new offence types being introduced. Discussion. This paper further outlines the major findings of the study with regard to tailoring RBT operations to increase detection rates as well as improving the general deterrent effect of the initiative.
Abstract:
Modal flexibility is a widely accepted technique to detect structural damage using vibration characteristics. Its application to detect damage in long span large diameter cables such as those used in suspension bridge main cables has not received much attention. This paper uses the modal flexibility method incorporating two damage indices (DIs) based on lateral and vertical modes to localize damage in such cables. The competency of those DIs in damage detection is tested by the numerically obtained vibration characteristics of a suspended cable in both intact and damaged states. Three single damage cases and one multiple damage case are considered. The impact of random measurement noise in the modal data on the damage localization capability of these two DIs is next examined. Long span large diameter cables are characterized by the two critical cable parameters named bending stiffness and sag-extensibility. The influence of these parameters in the damage localization capability of the two DIs is evaluated by a parametric study with two single damage cases. Results confirm that the damage index based on lateral vibration modes has the ability to successfully detect and locate damage in suspended cables with 5% noise in modal data for a range of cable parameters. This simple approach therefore can be extended for timely damage detection in cables of suspension bridges and thereby enhance their service during their life spans.
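The modal flexibility construction behind the damage indices above can be sketched directly: the flexibility matrix is assembled from mass-normalised mode shapes and natural frequencies as F = Σᵢ φᵢφᵢᵀ/ωᵢ², and one simple damage index is the column-wise change in flexibility between intact and damaged states. The two-mode, three-DOF data below are invented for illustration; the paper's DIs for cables are more elaborate.

```python
import numpy as np

def modal_flexibility(mode_shapes, omegas):
    """mode_shapes: (n_modes, n_dof) mass-normalised shapes;
    omegas: natural frequencies in rad/s."""
    n_dof = mode_shapes.shape[1]
    F = np.zeros((n_dof, n_dof))
    for phi, w in zip(mode_shapes, omegas):
        F += np.outer(phi, phi) / w**2  # low modes dominate flexibility
    return F

def damage_index(F_intact, F_damaged):
    # per-DOF (column-wise) maximum change in flexibility
    return np.max(np.abs(F_damaged - F_intact), axis=0)

phis = np.array([[0.5, 1.0, 0.5],
                 [1.0, 0.0, -1.0]])
F_intact = modal_flexibility(phis, np.array([10.0, 25.0]))
F_damaged = modal_flexibility(phis, np.array([9.0, 25.0]))  # softened 1st mode
di = damage_index(F_intact, F_damaged)
```

Because flexibility weights each mode by 1/ω², the index converges with only the few lowest measured modes, which is what makes it practical for large cables.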
Abstract:
Submarine groundwater discharge (SGD) is an integral part of the hydrological cycle and represents an important aspect of land-ocean interactions. We used a numerical model to simulate flow and salt transport in a nearshore groundwater aquifer under varying wave conditions based on yearlong random wave data sets, including storm surge events. The results showed significant flow asymmetry, with rapid response of influxes and retarded response of effluxes across the seabed to the irregular wave conditions. While a storm surge immediately intensified seawater influx to the aquifer, the subsequent return of intruded seawater to the sea, as part of an increased SGD, was gradual. Using functional data analysis, we revealed and quantified retarded, cumulative effects of past wave conditions on SGD, including the fresh groundwater and recirculating seawater discharge components. The retardation was characterized well by a gamma distribution function regardless of wave conditions. The relationships between discharge rates and wave parameters were quantifiable by a regression model in a functional form independent of the actual irregular wave conditions. This statistical model provides a useful method for analyzing and predicting SGD from nearshore unconfined aquifers affected by random waves.
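The retarded, cumulative wave effect can be sketched as a convolution of past wave heights with a gamma distribution kernel, the functional form the study found to characterize the retardation. The shape and scale values and the wave series below are illustrative assumptions, not the study's fitted parameters.

```python
import math
import numpy as np

def gamma_kernel(n_lags, shape=2.0, scale=3.0):
    """Discretised gamma density over lags 0..n_lags-1, normalised so
    the lag weights sum to one."""
    tau = np.arange(n_lags, dtype=float)
    k = (tau**(shape - 1) * np.exp(-tau / scale)
         / (math.gamma(shape) * scale**shape))
    return k / k.sum()

def lagged_response(wave_height, kernel):
    # each day's response is a weighted sum of current and past waves
    return np.convolve(wave_height, kernel)[:len(wave_height)]

rng = np.random.default_rng(7)
waves = rng.gamma(2.0, 1.0, 200)   # irregular daily wave heights
kernel = gamma_kernel(30)          # lag weights peak at (shape-1)*scale
sgd_proxy = lagged_response(waves, kernel)
```

The peak of the kernel at lag (shape − 1) × scale is what encodes the delayed return of intruded seawater after a surge.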
Abstract:
The effects of fish density distribution and effort distribution on the overall catchability coefficient are examined. Emphasis is also placed on how aggregation and effort distribution interact to affect the overall catch rate [catch per unit effort (cpue)]. In particular, three indices are proposed, the catchability index, the knowledge parameter, and the aggregation index, to describe the effectiveness of targeting and its effects on overall catchability in the stock area. Analytical expressions are provided so that these indices can easily be calculated. The average of the cpue calculated from small units where fishing is random is a better index for measuring stock abundance. The overall cpue, the ratio of lumped catch to lumped effort, together with the average cpue, can be used to assess the effectiveness of targeting. The proposed methods are applied to commercial catch and effort data from the Australian northern prawn fishery. The indices are obtained assuming a power law for the effort distribution as an approximation of targeting during the fishing operation. Targeting increased catchability in some areas by 10%, which may have important implications for management advice.
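The distinction between the two cpue measures discussed above can be made concrete with a small worked example. When effort is concentrated on high-density units (targeting), the overall cpue (lumped catch over lumped effort) exceeds the average of the per-unit cpue values, and their ratio is one simple indicator of targeting effectiveness; the numbers are invented for illustration.

```python
import numpy as np

catch = np.array([10.0, 2.0])   # catch taken in two spatial units
effort = np.array([5.0, 4.0])   # effort concentrated on the denser unit

overall_cpue = catch.sum() / effort.sum()      # lumped ratio: 12/9
average_cpue = np.mean(catch / effort)         # mean of unit cpue: 1.25
targeting_ratio = overall_cpue / average_cpue  # > 1 suggests targeting
```

Under random (non-targeted) fishing the two measures coincide in expectation, which is why the average cpue from small randomly fished units is the better abundance index.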