142 results for Random process
Abstract:
In this paper, a phenomenologically motivated, magneto-mechanically coupled finite strain elastic framework for simulating the curing process of polymers in the presence of a magnetic load is proposed. This approach is in line with previous work by Hossain and co-workers on a finite strain modelling framework for purely mechanical polymer curing (Hossain et al., 2009b). The proposed thermodynamically consistent approach is independent of any particular free energy function that may be used to model the fully-cured magneto-sensitive polymer, i.e. any phenomenological or micromechanically inspired free energy can be inserted into the main modelling framework. For the fabrication of magneto-sensitive polymers, micron-sized ferromagnetic particles are mixed with the liquid matrix material in the uncured stage. The particles align in a preferred direction when a magnetic field is applied during the curing process. The polymer curing process is a complex (visco)elastic process that transforms a fluid into a solid over time. Such a transformation is modelled by an appropriate constitutive relation that takes into account the temporal evolution of the material parameters appearing in a particular energy function. For demonstration in this work, a frequently used energy function is chosen, namely the classical Mooney-Rivlin free energy enhanced by coupling terms. Several representative numerical examples demonstrate the capability of our approach to correctly capture common features of polymers undergoing curing in the presence of a coupled magneto-mechanical load.
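As a purely illustrative sketch (not the paper's exact coupling terms or evolution law), a Mooney-Rivlin energy augmented with one simple magneto-mechanical invariant and curing-dependent parameters might take the form

$$
\Psi(\mathbf{C}, \mathbf{H}, t) = c_1(t)\,(I_1 - 3) + c_2(t)\,(I_2 - 3) + \alpha(t)\,(\mathbf{C}\mathbf{H})\cdot\mathbf{H},
$$

where $I_1, I_2$ are the principal invariants of the right Cauchy-Green tensor $\mathbf{C}$, $\mathbf{H}$ is the referential magnetic field, and the parameters $c_1(t), c_2(t), \alpha(t)$ evolve from their uncured to their fully-cured values, e.g. via a hypothetical saturation law such as $c_1(t) = c_1^{\infty} + (c_1^{0} - c_1^{\infty})\,e^{-\kappa t}$ with an assumed curing rate $\kappa$.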
Abstract:
Linking the structural connectivity of brain circuits to their cooperative dynamics and emergent functions is a central aim of neuroscience research. Graph theory has recently been applied to study the structure-function relationship of networks, where dynamical similarity of different nodes has been turned into a "static" functional connection. However, the capability of the brain to adapt, learn and process external stimuli requires a constant dynamical functional rewiring between circuitries and cell assemblies. Hence, we must capture the changes of network functional connectivity over time. Multi-electrode array data present a unique challenge within this framework. We study the dynamics of gamma oscillations in acute slices of the somatosensory cortex from juvenile mice recorded by planar multi-electrode arrays. Bursts of gamma oscillatory activity lasting a few hundred milliseconds could be initiated only by brief trains of electrical stimulations applied at the deepest cortical layers and simultaneously delivered at multiple locations. Local field potentials were used to study the spatio-temporal properties and the instantaneous synchronization profile of the gamma oscillatory activity, combined with current source density (CSD) analysis. Pair-wise differences in the oscillation phase were used to determine the presence of instantaneous synchronization between the different sites of the circuitry during the oscillatory period. Despite variation in the duration of the oscillatory response over successive trials, the bursts showed a constant average power, suggesting that the rate of expenditure of energy during the gamma bursts is consistent across repeated stimulations. Within each gamma burst, the functional connectivity map reflected the columnar organization of the neocortex. Over successive trials, an apparently random rearrangement of the functional connectivity was observed, with a more stable columnar than horizontal organization. This work reveals new features of evoked gamma oscillations in developing cortex.
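A minimal sketch of one common way to quantify instantaneous phase synchronization between electrode pairs is shown below; the gamma band limits, filter order, and the phase-locking value metric are assumptions for illustration, not necessarily the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_phase(lfp, fs, band=(30.0, 80.0)):
    # Band-pass the LFP in an assumed gamma range and extract the
    # instantaneous phase via the analytic signal (Hilbert transform).
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, lfp)))

def pairwise_plv(phases):
    # phases: (n_channels, n_samples) array of instantaneous phases.
    # Returns the phase-locking value for every channel pair; values near 1
    # indicate a stable phase difference, i.e. instantaneous synchronization.
    n = phases.shape[0]
    plv = np.ones((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dphi = phases[i] - phases[j]
            plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * dphi)))
    return plv
```

Applying `pairwise_plv` to the phases of all channels within a single gamma burst yields a functional connectivity matrix that can then be compared across successive trials.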
Abstract:
Natural selection is typically exerted at some specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to incorrect inference about population parameters. When the missing data process relates to the trait of interest, valid inference requires explicit modeling of the missingness process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missingness process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls Tyto alba, our model indicates that the missing individuals would display large black spots; we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool to correctly estimate the magnitude of both natural selection and additive genetic variance.
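A minimal sketch of such a shared parameter model (the notation here is generic, not the authors' exact specification) links the animal model and the missingness model through the additive genetic effect $a_i$:

$$
y_i = \mu + a_i + e_i, \qquad \mathbf{a} \sim \mathcal{N}(\mathbf{0}, \sigma_a^2 \mathbf{A}), \qquad e_i \sim \mathcal{N}(0, \sigma_e^2),
$$
$$
\operatorname{logit}\,\Pr(y_i \text{ observed}) = \beta_0 + \beta_1 a_i,
$$

where $\mathbf{A}$ is the pedigree-based additive relationship matrix. A non-zero $\beta_1$ makes the missingness informative about the genetic value, which is exactly what biases a naive "missing at random" analysis.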
Abstract:
Introduction: In the mid-1990s, the discovery of endogenous ligands for cannabinoid receptors opened a new era in this research field. Amides and esters of arachidonic acid have been identified as these endogenous ligands. Arachidonoylethanolamide (anandamide or AEA) and 2-arachidonoylglycerol (2-AG) seem to be the most important of these lipid messengers. In addition, virodhamine (VA), noladin ether (2-AGE), and N-arachidonoyl dopamine (NADA) have been shown to bind to CB receptors with varying affinities. During recent years, it has become more evident that the EC system is part of fundamental regulatory mechanisms in many physiological processes such as stress and anxiety responses, depression, anorexia and bulimia, schizophrenia, neuroprotection, Parkinson's disease, anti-proliferative effects on cancer cells, drug addiction, and atherosclerosis. Aims: This work presents the challenges of EC analysis and the contribution of Information Dependent Acquisition on a hybrid triple quadrupole linear ion trap (QqQLIT) system for the profiling of these lipid mediators. Methods: The method was developed on an LC Ultimate 3000 series (Dionex, Sunnyvale, CA, USA) coupled to a QTrap 4000 system (Applied Biosystems, Concord, ON, Canada). The ECs were separated on an XTerra C18 MS column (50 × 3.0 mm i.d., 3.5 μm) with a 5 min gradient elution. For confirmatory analysis, an information-dependent acquisition experiment was performed with selected reaction monitoring (SRM) as the survey scan and enhanced product ion (EPI) as the dependent scan. Results: The assay was found to be linear in the concentration range of 0.1-5 ng/mL for AEA, 0.3-5 ng/mL for VA, 2-AGE, and NADA, and 1-20 ng/mL for 2-AG using 0.5 mL of plasma. Repeatability and intermediate precision were found to be less than 15% over the tested concentration ranges. Under non-pathophysiological conditions, only AEA and 2-AG were actually detected in plasma, with concentrations ranging from 104 to 537 pg/mL and from 2160 to 3990 pg/mL, respectively. We have particularly focused on the evaluation of EC level changes in biological matrices during drug addiction and atherosclerosis processes. We will present preliminary data obtained during a pilot study after administration of cannabis to human patients. Conclusion: ECs have been shown to play a key role in the regulation of many pathophysiological processes. Medical research in these different fields continues to grow in order to understand and highlight the predominant role of ECs in CNS and peripheral tissue signalling. Profiling these lipids requires the development of rapid, highly sensitive and selective analytical methods.
Abstract:
Cryo-electron microscopy of vitreous sections (CEMOVIS) has recently been shown to provide images of biological specimens with unprecedented quality and resolution. Cutting the sections, however, remains the major difficulty. Here, we examine the parameters influencing the quality of the sections and analyse the resulting artefacts, in particular knife marks, compression, crevasses, and chatter. We propose a model taking into account the interplay between viscous flow and fracture. We confirm that crevasses are formed on only one side of the section, and define conditions by which they can be avoided. Chatter is an effect of irregular compression due to friction of the section on the knife edge, and conditions to prevent it are also explored. In the absence of crevasses and chatter, the bulk of the section is compressed approximately homogeneously. Within this approximation, it is possible to correct for compression of the bulk of the section by a simple linear transformation. A research program is proposed to test and refine our understanding of the sectioning process.
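As an illustration of what such a correction could look like (the abstract does not specify the exact transformation, so the form below is an assumption), a homogeneous compression of the section by a factor $\kappa < 1$ along the cutting direction $x$ can be undone by a simple linear rescaling of the image coordinates:

$$
\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} 1/\kappa & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix},
$$

where $y$ is the transverse, knife-edge-parallel coordinate. The in-plane shortening is typically accompanied by a thickness increase, which such a two-dimensional correction does not restore.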
Abstract:
To assess the effectiveness of a multidisciplinary evaluation and referral process in a prospective cohort of general hospital patients with alcohol dependence. Alcohol-dependent patients were identified in the wards of the general hospital and its primary care center. They were evaluated and then referred to treatment by a multidisciplinary team; those patients who agreed to participate in this cohort study were consecutively included and followed for 6 months. Patients who were not included were lost to follow-up, whereas all included patients were assessed at the time of inclusion and 2 and 6 months later by a research psychologist in order to collect standardized baseline patient characteristics, salient process features, and patient outcomes (defined as treatment adherence and abstinence). Multidisciplinary evaluation and therapeutic referral was feasible and effective, with a success rate of 43% for treatment adherence and 28% for abstinence at 6 months. Among patient characteristics, predictors of success were age over 45, not living alone, being employed, and being motivated for treatment (RAATE-A score < 18), whereas successful process characteristics included detoxification of the patient at the time of referral and a full multidisciplinary referral meeting. This multidisciplinary model of evaluation and referral of alcohol-dependent patients in a general hospital had a satisfactory level of effectiveness. Predictors of success and failure make it possible to identify subsets of patients for whom new strategies of motivation and treatment referral should be designed.
Abstract:
Over the last decade, the development of statistical models in support of forensic fingerprint identification has been the subject of increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process within, or in addition to, the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and theoretical perspective. This includes analysis of models of two different methodologies: Probability of Random Correspondence (PRC) models, which focus on calculating probabilities of the occurrence of fingerprint configurations in a given population, and Likelihood Ratio (LR) models, which use analysis of corresponding features of fingerprints to derive a likelihood value representing the evidential weight for a potential source.
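For orientation, the quantity at the heart of LR models generically takes the form below (the symbols are generic, not those of any specific model reviewed in the paper):

$$
\mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)},
$$

where $E$ is the observed correspondence between the crime mark and the reference print, $H_p$ is the proposition that they share a common source, and $H_d$ the proposition that they originate from different sources; an LR greater than 1 shifts the evidential weight towards the common-source proposition.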
Abstract:
The stable insertion of a copy of their genome into the host cell genome is an essential step of the life cycle of retroviruses. The site of viral DNA integration, mediated by the viral-encoded integrase enzyme, has important consequences for both the virus and the host cell. The analysis of retroviral integration site distribution was facilitated by the availability of the human genome sequence, revealing the non-random nature of integration site selection and identifying different favored and disfavored genomic locations for individual retroviruses. This review summarizes the current knowledge about retroviral differences in integration site preferences, as well as the mechanisms involved in this process.
Abstract:
We advocate the use of a novel compressed sensing technique for accelerating the magnetic resonance image acquisition process, coined spread spectrum MR imaging or simply s2MRI. The method relies on pre-modulating the signal of interest with a linear chirp, resulting from the application of quadratic phase profiles, before random k-space under-sampling with uniform average density. The effectiveness of the procedure is theoretically underpinned by the optimization of the coherence between the sparsity and sensing bases. The application of the technique for single coil acquisitions is thoroughly studied by means of numerical simulations as well as phantom and in vivo experiments on a 7T scanner. The corresponding results suggest a favorable comparison with state-of-the-art variable density k-space under-sampling approaches.
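A toy numerical sketch of the idea follows; the chirp rate `w`, the separable 2D quadratic phase, and the sampling fraction are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def chirp_modulate(image, w):
    # Multiply a (square) image by a quadratic phase profile, i.e. a linear
    # chirp, which spreads its spectrum before k-space under-sampling.
    n = image.shape[0]
    x = np.arange(n) - n / 2
    quad = np.exp(1j * np.pi * w * np.add.outer(x**2, x**2) / n)
    return image * quad

def undersample_kspace(image, fraction, rng=np.random.default_rng(0)):
    # Uniform-average-density random selection of k-space samples.
    k = np.fft.fftshift(np.fft.fft2(image))
    mask = rng.random(k.shape) < fraction
    return k * mask, mask
```

In a full reconstruction pipeline, the retained k-space samples would then be fed to a sparsity-promoting solver; the chirp pre-modulation is what lowers the coherence between the sensing and sparsity bases.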
Abstract:
In this paper, we study the average inter-crossing number between two random walks and two random polygons in three-dimensional space. The random walks and polygons considered are the so-called equilateral random walks and polygons, in which each segment of the walk or polygon is of unit length. We show that the mean average inter-crossing number ICN between two equilateral random walks of the same length n is approximately linear in n, and we determine the prefactor of the linear term, a = (3 ln 2)/8 ≈ 0.2599. In the case of two random polygons of length n, the mean average inter-crossing number ICN is also linear, but the prefactor of the linear term differs from that of the random walks. These approximations apply when the starting points of the random walks and polygons are a distance p apart and p is small compared to n. We propose a fitting model to capture the theoretical asymptotic behaviour of the mean average ICN for large values of p. Our simulation results show that the model in fact works very well for the entire range of p. We also study the mean ICN between two equilateral random walks and polygons of different lengths. An interesting result is that even if one random walk (polygon) has a fixed length, the mean average ICN between the two random walks (polygons) still approaches infinity as the length of the other random walk (polygon) approaches infinity. The data provided by our simulations match our theoretical predictions very well.
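As a quick numerical sanity check of the quoted prefactor (the linear scaling is the paper's result; the sample walk lengths below are arbitrary):

```python
import math

# Prefactor of the linear growth of the mean average inter-crossing number
# between two equilateral random walks of length n with nearby starting points.
a = 3 * math.log(2) / 8
print(f"a = {a:.4f}")          # 0.2599, matching the value quoted in the abstract

# Predicted mean average ICN ~ a * n for a few (arbitrary) walk lengths.
for n in (100, 500, 1000):
    print(n, round(a * n, 1))
```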