983 results for GAUSSIAN GENERATOR FUNCTIONS
Abstract:
The putative role of the N-terminal region of rhodopsin-like 7 transmembrane biogenic amine receptors in agonist-induced signaling has not yet been clarified despite recent advances in 7 transmembrane receptor structural biology. Given the existence of N-terminal nonsynonymous polymorphisms (R6G;E42G) within the HTR2B gene in a drug-abusing population, we assessed whether these polymorphisms affect the in vitro pharmacologic and coupling properties of the 5-hydroxytryptamine 2B (5-HT2B) receptor in transfected COS-7 cells. Modification of the 5-HT2B receptor N terminus by the R6G;E42G polymorphisms increases agonist-induced signaling, such as inositol phosphate accumulation, as assessed by either classic or operational models. The N-terminal R6G;E42G mutations of the 5-HT2B receptor also increase cell proliferation and slow its desensitization kinetics compared with the wild-type receptor, further supporting a role for the N terminus in transduction efficacy. Furthermore, by coexpressing a tethered wild-type 5-HT2B receptor N terminus with a 5-HT2B receptor bearing an N-terminal deletion, we were able to restore the original coupling. This reversion to normal activity of a truncated 5-HT2B receptor by coexpression of the membrane-tethered wild-type 5-HT2B receptor N terminus was not observed using a membrane-tethered 5-HT2B receptor R6G;E42G N terminus. These data suggest that the N terminus exerts a negative control over basal as well as agonist-stimulated receptor activity that is lost in the R6G;E42G mutant. Our findings reveal a new and unanticipated role of the 5-HT2B receptor N terminus as a negative modulator, affecting both constitutive and agonist-stimulated activity. Moreover, our data caution against excluding the N terminus and extracellular loops in structural studies of this 7 transmembrane receptor family.
Abstract:
Spatial data are now prevalent in a wide range of fields including environmental and health science. This has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data via discretization of the study region, which yields grid-based spatial data. The approaches considered include two parametric models and a semiparametric model. We highlight the methodology and computation for each approach. Two simulation studies are undertaken to compare the performance of these models for various structures of simulated point-based data which resemble environmental data. A case study of a real dataset is also conducted to demonstrate a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions. The deviance information criterion is also considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data where there are large variations or clustering across the space, whereas the discretized log Gaussian Cox process produces a good fit for dense, clustered point-based data. One should generally consider the nature and structure of the point-based data in order to choose an appropriate method for modelling discretized spatial point-based data.
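To make the discretization step and the log Gaussian Cox process concrete, here is a minimal Python sketch that samples a latent Gaussian field on a regular grid and draws Poisson counts per cell; the grid size, exponential covariance and parameter values are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch: discretized log Gaussian Cox process on a regular grid.
# Grid size, covariance model and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 20                                    # 20 x 20 grid over the unit square
xs = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(xs, xs)
coords = np.column_stack([X.ravel(), Y.ravel()])

# Exponential covariance for the latent Gaussian field (jitter for stability)
sigma2, length = 1.0, 0.2
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
K = sigma2 * np.exp(-d / length) + 1e-8 * np.eye(n * n)

# Sample the latent field and form the cell-wise Poisson intensity
z = np.linalg.cholesky(K) @ rng.standard_normal(n * n)
mu = 3.0                                  # log baseline intensity
cell_area = 1.0 / (n * n)
lam = np.exp(mu + z) * cell_area          # expected count per grid cell

counts = rng.poisson(lam).reshape(n, n)   # grid-based point-count data
print(counts.sum(), "points over", n * n, "cells")
```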
Abstract:
This paper translates the concept of sustainable production into three dimensions of economic, environmental and ecological sustainability, and analyzes optimal production scales by solving the corresponding optimization problems. Economic optimization seeks input-output combinations that maximize profits. Environmental optimization searches for input-output combinations that minimize the polluting effects of materials balance on the surrounding environment. Ecological optimization looks for input-output combinations that minimize the cumulative destruction of the entire ecosystem. Using an aggregate space, the framework illustrates that these optimal scales are often not identical because markets fail to account for all negative externalities. Profit-maximizing firms normally operate at scales larger than those that are optimal from the viewpoints of environmental and ecological sustainability; hence policy interventions are favoured. The framework offers a useful tool for efficiency studies and policy implication analysis. The paper provides an empirical investigation using a data set of rice farms in South Korea.
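The divergence between economically and environmentally optimal scales can be illustrated with a stylized one-input example; the functional forms below (square-root production, materials-balance emissions) and all parameter values are assumptions made purely for illustration, not the paper's empirical model.

```python
# Stylized sketch of diverging optimal scales under assumed functional forms.
import numpy as np

x = np.linspace(0.01, 10, 2000)          # input scale
f = 4.0 * np.sqrt(x)                     # assumed production function y = f(x)
p, w = 1.0, 1.0                          # output and input prices

profit = p * f - w * x                   # economic objective
emissions = x - 0.2 * f                  # materials balance: input mass minus
                                         # mass embodied in marketable output

x_econ = x[np.argmax(profit)]            # profit-maximizing scale
x_env = x[np.argmin(emissions)]          # emission-minimizing scale
print(f"economic optimum x = {x_econ:.2f}, environmental optimum x = {x_env:.2f}")
# Under these assumptions the profit-maximizing scale exceeds the
# environmentally optimal scale, as the framework predicts.
```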
Abstract:
The role of exosomes in cancer development has become the focus of much research, owing to the many emerging roles possessed by these vesicles. These microvesicles, which are ubiquitously released into the extracellular milieu, have been found to regulate immune system function, particularly in tumorigenesis, as well as to condition future metastatic sites for the attachment and growth of tumor tissue. Through interactions with a range of host tissues, exosomes are able to generate a pro-tumor environment that is essential for carcinogenesis. Herein, we discuss the contents of exosomes and their contribution to tumorigenesis, as well as their role in chemotherapeutic resistance and the development of novel cancer treatments and the identification of cancer biomarkers.
Abstract:
In this chapter we describe a critical fairytales unit taught to 4.5- to 5.5-year-olds in a context of intensifying pressure to raise literacy achievement. The unit was infused with lessons on reinterpreted fairytales, followed by process drama activities built around a sophisticated picture book, Beware of the Bears (MacDonald, 2004). The latter entailed a text-analytic approach to critical literacy derived from systemic functional linguistics (Halliday, 1978; Halliday & Matthiessen, 2004). This approach provides a way of analysing how words and discourse are used to represent the world in a particular way and shape reader relations with the author in a particular field (Janks, 2010).
Abstract:
The functions of the volunteer functions inventory were combined with the constructs of the theory of planned behaviour (i.e., attitudes, subjective norms, and perceived behavioural control) to establish whether a stronger, single explanatory model prevailed. Undertaken in the context of episodic, skilled volunteering by individuals who were retired or approaching retirement (N = 186), the research advances prior studies, which either examined the predictive capacity of each model independently or compared their explanatory value. Using hierarchical regression analysis, the functions of the volunteer functions inventory (when controlling for demographic variables) explained an additional 7.0% of the variability in individuals’ willingness to volunteer over and above that accounted for by the theory of planned behaviour. Significant predictors in the final model included attitudes, subjective norms and perceived behavioural control from the theory of planned behaviour, and the understanding function from the volunteer functions inventory. It is proposed that the items comprising the understanding function may represent a deeper psychological construct (e.g., self-actualisation) not accounted for by the theory of planned behaviour. The findings highlight the potential benefit of combining these two prominent models in terms of improving understanding of volunteerism and providing a single parsimonious model for raising rates of this important behaviour.
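A minimal sketch of the two-block hierarchical regression logic, on synthetic data: fit the theory of planned behaviour predictors first, then measure the additional R-squared contributed by a volunteer functions inventory variable. All variable names and effect sizes are placeholders, not the study's data.

```python
# Sketch of a two-block hierarchical regression: R-squared gain from
# adding a VFI function over the TPB predictors. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 186
att, sn, pbc = rng.standard_normal((3, n))    # TPB: attitudes, norms, control
understanding = rng.standard_normal(n)        # VFI: understanding function
willingness = (0.5 * att + 0.3 * sn + 0.3 * pbc
               + 0.4 * understanding + rng.standard_normal(n))

def r_squared(X, y):
    # OLS fit with intercept; return coefficient of determination
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_tpb = r_squared(np.column_stack([att, sn, pbc]), willingness)
r2_full = r_squared(np.column_stack([att, sn, pbc, understanding]), willingness)
print(f"TPB block R^2 = {r2_tpb:.3f}, added by VFI block = {r2_full - r2_tpb:.3f}")
```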
Abstract:
To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
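The structure-selection idea can be sketched as follows: score candidate working correlation matrices with a Gaussian log pseudolikelihood on repeated-measures residuals and keep the best-scoring one. This is a simplified stand-in on synthetic data, not the proposed estimator itself (in particular, it omits the induced smoothing step).

```python
# Simplified sketch: choose a working correlation structure by maximizing
# a Gaussian pseudolikelihood over synthetic repeated-measures residuals.
import numpy as np

rng = np.random.default_rng(2)
m, t = 100, 5                              # subjects, repeated measurements
true_R = 0.6 ** np.abs(np.subtract.outer(np.arange(t), np.arange(t)))  # AR(1)
resid = rng.multivariate_normal(np.zeros(t), true_R, size=m)

def loglik(R, z):
    # Gaussian log pseudolikelihood summed over subjects
    Rinv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    return -0.5 * (len(z) * logdet + np.einsum('it,ts,is->', z, Rinv, z))

def ar1(rho):  return rho ** np.abs(np.subtract.outer(np.arange(t), np.arange(t)))
def exch(rho): return np.where(np.eye(t, dtype=bool), 1.0, rho)

grid = np.linspace(0.01, 0.95, 95)
best = {name: max((loglik(make(r), resid), r) for r in grid)
        for name, make in [("AR(1)", ar1), ("exchangeable", exch)]}
for name, (ll, rho) in best.items():
    print(f"{name}: max log-PL = {ll:.1f} at rho = {rho:.2f}")
```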
Abstract:
Purpose – This paper aims to recognise the importance of informal processes within corporate governance and to complement existing research in this area by investigating factors associated with the existence of informal interactions between audit committees and internal audit functions, and by providing directions for future research. Design/methodology/approach – To examine the existence and drivers of informal interactions between audit committees and internal audit functions, this paper relies on a questionnaire survey of chief audit executives (CAEs) in the UK from listed and non-listed, as well as financial and non-financial, companies. While prior qualitative research suggests that informal interactions do take place, most of the evidence is based on particular organisational settings or on a very small number of interviews. The use of a questionnaire enabled the examination of the existence of informal interactions across a relatively large number of entities. Findings – The paper finds evidence of audit committees and internal audit functions engaging in informal interactions in addition to formal pre-scheduled regular meetings. Informal interactions complement formal meetings with the audit committee and as such represent additional opportunities for audit committees to monitor internal audit functions. Audit committees’ informal interactions are significantly and positively associated with audit committee independence, the audit committee chair’s knowledge and experience, and internal audit quality. Originality/value – The results demonstrate the importance of the background of the audit committee chair for the effectiveness of the governance process. This is possibly the first paper to examine the relationship between audit committee quality and internal audit in terms of the existence and drivers of informal interactions. Policy makers should recognise that in addition to formal mechanisms, informal processes, such as communication outside of formal pre-scheduled meetings, play a significant role in corporate governance.
Abstract:
Circos plots are graphical outputs that display three-dimensional chromosomal interactions and fusion transcripts. However, the Circos plot tool is not an interactive visualization tool, but rather a figure generator. For example, it does not enable data to be added dynamically, nor does it provide information for specific data points interactively. Recently, an R-based Circos tool (RCircos) has been developed to integrate Circos into R, but similarly, RCircos can only be used to generate plots. Thus, we have developed J-Circos, an interactive visualization tool that can plot Circos figures, dynamically add data to a figure, and provide information for specific data points using mouse-hover display and zoom in/out functions. J-Circos is written in Java, enabling it to run on most operating systems (Windows, MacOS, Linux). Users can input data into J-Circos using flat data formats, as well as from the GUI. J-Circos will enable biologists to better study more complex chromosomal interactions and fusion transcripts that are otherwise difficult to visualize from next-generation sequencing data.
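As an illustration of preparing flat input for such a tool, the snippet below writes a tab-delimited list of fusion pairs; the column layout and coordinates are hypothetical placeholders and should be checked against the J-Circos documentation rather than read as its actual input format.

```python
# Hypothetical example of a flat, tab-delimited fusion list; the column
# layout and positions are placeholders, not J-Circos's documented format.
fusions = [
    ("chr9",  1000, "chr22", 2000, "FUSION_A"),   # placeholder coordinates
    ("chr21", 3000, "chr21", 4000, "FUSION_B"),
]
with open("fusions.tsv", "w") as fh:
    for chr_a, pos_a, chr_b, pos_b, name in fusions:
        fh.write(f"{chr_a}\t{pos_a}\t{chr_b}\t{pos_b}\t{name}\n")
```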
Abstract:
Preneel, Govaerts and Vandewalle (PGV) analysed the security of single-block-length block cipher based compression functions assuming that the underlying block cipher has no weaknesses. They showed that 12 out of 64 possible compression functions are collision and (second) preimage resistant. Black, Rogaway and Shrimpton formally proved this result in the ideal cipher model. However, in the indifferentiability security framework introduced by Maurer, Renner and Holenstein, all these 12 schemes are easily differentiable from a fixed input-length random oracle (FIL-RO) even when their underlying block cipher is ideal. We address the problem of building indifferentiable compression functions from the PGV compression functions. We consider a general form of 64 PGV compression functions and replace the linear feed-forward operation in this generic PGV compression function with an ideal block cipher independent of the one used in the generic PGV construction. This modified construction is called a generic modified PGV (MPGV). We analyse indifferentiability of the generic MPGV construction in the ideal cipher model and show that 12 out of 64 MPGV compression functions in this framework are indifferentiable from a FIL-RO. To our knowledge, this is the first result showing that two independent block ciphers are sufficient to design indifferentiable single-block-length compression functions.
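For concreteness, here is a minimal sketch of one classical PGV instance, Davies-Meyer f(h, m) = E_m(h) XOR h, with AES-128 standing in for the ideal cipher, plus an MPGV-style variant in which the XOR feed-forward is replaced by a call to a second, independent cipher. The variant is one plausible reading of the construction described above, not the paper's exact scheme.

```python
# Sketch of a PGV compression function (Davies-Meyer) and an MPGV-style
# variant, with AES-128 (pycryptodome) standing in for ideal ciphers.
from Crypto.Cipher import AES

BLOCK = 16

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def pgv_davies_meyer(h, m):
    # f(h, m) = E_m(h) XOR h; the trailing XOR of h is the linear feed-forward
    return xor(AES.new(m, AES.MODE_ECB).encrypt(h), h)

def mpgv_variant(h, m, k2=bytes(BLOCK)):
    # Illustrative reading of MPGV: the feed-forward XOR is replaced by a
    # second, independent block cipher E2 (here AES under an assumed fixed
    # key k2); this is not the paper's exact construction.
    inner = AES.new(m, AES.MODE_ECB).encrypt(h)
    return AES.new(k2, AES.MODE_ECB).encrypt(inner)

h0 = bytes(BLOCK)                       # all-zero initial chaining value
msg_block = b"sixteen byte msg"         # exactly one 16-byte message block
print(pgv_davies_meyer(h0, msg_block).hex())
print(mpgv_variant(h0, msg_block).hex())
```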
Abstract:
Objectives Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method of estimating, at a population level, the number of incident genital chlamydia infections and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes over consecutive years and across adjacent age cohorts. The model outcomes were validated by comparing them with independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponds to an estimated annual chlamydia incidence of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions We developed a statistical method which uses routine surveillance (notifications and testing) data to produce estimates of the extent of and trends in chlamydia incidence.
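The smoothing role of the Matérn prior can be sketched in a few lines: build a Matérn-3/2 covariance over consecutive years, sample the logistic-transformed parameter from the resulting Gaussian prior, and map it back to a probability. The smoothness order and hyperparameters below are illustrative assumptions, not the calibrated values.

```python
# Sketch: Matern-3/2 prior over consecutive years for a time-dependent
# probability parameter, on the logistic scale. Hyperparameters assumed.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(2001, 2014)
d = np.abs(np.subtract.outer(years, years)).astype(float)

sigma2, length = 0.5, 3.0                      # assumed hyperparameters
a = np.sqrt(3.0) * d / length
K = sigma2 * (1.0 + a) * np.exp(-a) + 1e-9 * np.eye(len(years))  # Matern-3/2

mu = -2.0                                      # assumed prior mean (logit scale)
eta = mu + np.linalg.cholesky(K) @ rng.standard_normal(len(years))
p = 1.0 / (1.0 + np.exp(-eta))                 # back to the probability scale

for y, pi in zip(years, p):                    # smooth year-to-year trajectory
    print(y, f"{pi:.3f}")
```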
Abstract:
Hydraulic conductivity (K) fields are used to parameterize groundwater flow and transport models. Numerical simulations require a detailed representation of the K field, synthesized to interpolate between available data. Several recent studies introduced high-resolution K data (HRK) at the Macro Dispersion Experiment (MADE) site, and used ground-penetrating radar (GPR) to delineate the main structural features of the aquifer. This paper describes a statistical analysis of these data, and the implications for K field modeling in alluvial aquifers. Two striking observations have emerged from this analysis. The first is that a simple fractional difference filter can have a profound effect on data histograms, organizing non-Gaussian ln K data into a coherent distribution. The second is that using GPR facies allows us to reproduce the significantly non-Gaussian shape seen in real HRK data profiles, using a simulated Gaussian ln K field in each facies. This illuminates a current controversy in the literature, between those who favor Gaussian ln K models, and those who observe non-Gaussian ln K fields. Both camps are correct, but at different scales.
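The fractional difference filter (1 - B)^d can be applied with a short convolution using the standard recurrence for its binomial weights; the order d and the synthetic ln K series below are illustrative assumptions, not the MADE data.

```python
# Sketch: apply a fractional difference filter (1 - B)^d to a ln K series.
# The order d and the synthetic series are illustrative assumptions.
import numpy as np

def frac_diff_weights(d, n):
    # Binomial weights of (1 - B)^d: w_0 = 1, w_k = w_{k-1} (k - 1 - d) / k
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

rng = np.random.default_rng(4)
lnK = np.cumsum(rng.standard_normal(500) * 0.3)   # synthetic ln K profile

d = 0.4                                           # assumed memory parameter
w = frac_diff_weights(d, len(lnK))
filtered = np.convolve(lnK, w)[:len(lnK)]         # causal filter output

print("raw std:", lnK.std().round(3), "filtered std:", filtered.std().round(3))
```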
Abstract:
Structural damage detection using measured dynamic data for pattern recognition is a promising approach. These pattern recognition techniques utilize artificial neural networks and genetic algorithms to match pattern features. In this study, an artificial neural network–based damage detection method using frequency response functions is presented, which can effectively detect nonlinear damage for a given level of excitation. The main objective of this article is to present a feasible method for structural vibration–based health monitoring that reduces the dimension of the initial frequency response function data, transforms it into new damage indices, and employs artificial neural networks to detect different levels of nonlinearity using the damage patterns recognized by the proposed algorithm. Experimental data from the three-story bookshelf structure at Los Alamos National Laboratory are used to validate the proposed method. Results showed that levels of nonlinear damage can be identified precisely by the developed artificial neural networks. Moreover, artificial neural networks trained with summation frequency response functions gave more precise damage detection results than those trained with individual frequency response functions. The proposed method is therefore a promising tool for assessment of real structures, because it shows reliable results with experimental data for nonlinear damage detection, which renders the frequency response function–based method convenient for structural health monitoring.
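The described pipeline (reduce frequency response function data to low-dimensional damage indices, then classify damage levels with a neural network) can be sketched on synthetic data as below; the PCA feature reduction, network size and simulated FRFs are assumptions, not the article's exact algorithm.

```python
# Sketch of the described pipeline on synthetic data: reduce FRF magnitude
# spectra to low-dimensional indices, then classify damage levels with a
# small neural network. Features and architecture are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_per_level, n_freq, levels = 60, 256, [0, 1, 2]   # assumed damage levels

freqs = np.linspace(1, 50, n_freq)
X, y = [], []
for lvl in levels:
    peak = 20.0 - 1.5 * lvl                        # damage shifts the resonance
    for _ in range(n_per_level):
        frf = 1.0 / np.abs(peak**2 - freqs**2 + 1j * 2.0 * freqs)
        X.append(np.log(frf) + 0.05 * rng.standard_normal(n_freq))
        y.append(lvl)
X, y = np.array(X), np.array(y)

indices = PCA(n_components=5).fit_transform(X)     # low-dimensional indices
Xtr, Xte, ytr, yte = train_test_split(indices, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", net.score(Xte, yte))
```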
Abstract:
Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message to a fixed-length output, referred to as the hash value. As a security requirement, a hash value should not serve as an image for two distinct input messages, and it should be difficult to find the input message from a given hash value. Secure hash functions serve data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), serve data integrity and data origin authentication in the secret-key setting. The building blocks of hash functions can be designed using block ciphers, using modular arithmetic, or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
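The Merkle–Damgård iteration itself is short: pad the message with a single 1 bit, zeros and the 64-bit message length, then chain a compression function over fixed-size blocks. In the sketch below the compression function is a stand-in (truncated SHA-256 of the concatenated inputs), assumed only so the example is self-contained.

```python
# Sketch of the Merkle-Damgard construction with length-strengthening
# padding; the compression function is an assumed stand-in so the
# example runs, not a real primitive.
import hashlib

BLOCK = 64                                    # block size in bytes
H_LEN = 16                                    # chaining value size in bytes

def compress(h, block):
    # Stand-in compression function f(h, m): truncated SHA-256 of h || m
    return hashlib.sha256(h + block).digest()[:H_LEN]

def md_pad(msg):
    # Append a 1 bit, zero padding, then the 64-bit message bit length
    bit_len = 8 * len(msg)
    msg += b"\x80"
    msg += b"\x00" * ((-len(msg) - 8) % BLOCK)
    return msg + bit_len.to_bytes(8, "big")

def md_hash(msg, iv=bytes(H_LEN)):
    h, padded = iv, md_pad(msg)
    for i in range(0, len(padded), BLOCK):
        h = compress(h, padded[i:i + BLOCK])  # iterate over fixed-size blocks
    return h

print(md_hash(b"abc").hex())
```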