90 results for Pseudo-Philoxenus.
Abstract:
Membrane proteins play important roles in many biochemical processes and are also attractive targets for drug discovery in various diseases. Elucidating membrane protein types provides clues for understanding protein structure and function. Recently, we developed a novel system for predicting protein subnuclear localizations. In this paper, we propose a simplified version of that system for predicting membrane protein types directly from the primary protein structure, incorporating amino acid classifications and physicochemical properties into a general form of pseudo-amino acid composition. In this simplified system, we design a two-stage multi-class support vector machine combined with a two-step optimal feature selection process, which proves very effective in our experiments. The performance of the present method is evaluated on two benchmark datasets consisting of five types of membrane proteins. The overall prediction accuracies for the five types are 93.25% and 96.61% via the jackknife test and the independent dataset test, respectively. These results indicate that our method is effective and valuable for predicting membrane protein types. A web server for the proposed method is available at http://www.juemengt.com/jcc/memty_page.php
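As a rough illustration of the kind of features and classifier this abstract names (not the authors' actual two-stage system or their selected properties), the sketch below builds a pseudo-amino acid composition vector from a single placeholder physicochemical scale and trains a multi-class SVM on toy sequences; the property values, sequences, and labels are invented for demonstration.

```python
# Minimal sketch (not the published pipeline): pseudo-amino acid composition
# features built from one placeholder hydrophobicity-like scale, fed to a
# multi-class SVM. Real systems combine several properties and add a
# two-stage classifier with feature selection.
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
HYDRO = dict(zip(AA, np.linspace(-1.0, 1.0, 20)))  # placeholder property scale

def pseaac(seq, lam=5, w=0.05):
    """20 composition terms + lam sequence-order correlation terms."""
    comp = np.array([seq.count(a) for a in AA], float) / len(seq)
    h = np.array([HYDRO[a] for a in seq])
    theta = np.array([np.mean((h[:-k] - h[k:]) ** 2) for k in range(1, lam + 1)])
    feats = np.concatenate([comp, w * theta])
    return feats / feats.sum()

# toy sequences and made-up membrane-type labels
X = np.array([pseaac(s) for s in ["MKTAYIAKQR", "GAVLIPFMWC", "STYNQDEKRH"]])
y = np.array([0, 1, 2])
clf = SVC(kernel="rbf", C=10).fit(X, y)   # multi-class SVM with RBF kernel
print(clf.predict(X))
```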
Abstract:
Additive manufacturing forms a potential route towards economically viable production of cellular constructs for tissue engineering. Hydrogels are a suitable class of materials for cell delivery and 3D culture, but are generally unsuitable as construction materials. Gelatine-methacrylamide is an example of such a hydrogel system widely used in the field of tissue engineering, e.g. for cartilage and cardiovascular applications. Here we show that by adding gellan gum to gelatine-methacrylamide and tailoring the salt concentrations, rheological properties such as pseudo-plasticity and yield stress can be optimised for gel dispensing in additive manufacturing processes. In the hydrogel formulation, salt is partly substituted by mannose to obtain isotonicity and prevent a reduction in cell viability. We thereby demonstrate the potential of this new bioink for additive tissue manufacturing.
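The abstract reports tuning pseudo-plasticity and yield stress but does not give a constitutive model; one common way to describe such behaviour is the Herschel-Bulkley flow curve, sketched below with made-up parameter values purely for illustration.

```python
# Illustrative only: Herschel-Bulkley flow curve, a common description of a
# yield-stress, shear-thinning (pseudo-plastic) gel; parameters are invented
# and not taken from the paper.
import numpy as np

def herschel_bulkley(shear_rate, tau_y=50.0, K=10.0, n=0.4):
    """Shear stress (Pa) vs shear rate (1/s): tau = tau_y + K * rate**n."""
    return tau_y + K * shear_rate ** n

rates = np.logspace(-2, 2, 5)
for g, tau in zip(rates, herschel_bulkley(rates)):
    # apparent viscosity tau/rate falls as shear rate rises (shear thinning)
    print(f"rate={g:8.2f} 1/s  stress={tau:7.1f} Pa  app.visc={tau/g:9.1f} Pa.s")
```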
Abstract:
Mixed convection laminar two-dimensional boundary-layer flow of non-Newtonian pseudo-plastic fluids over a horizontal circular cylinder with uniform surface heat flux is investigated using a modified power-law viscosity model, which contains no unrealistic limits of zero or infinite viscosity; consequently, no irremovable singularities are introduced into the boundary-layer formulation for such fluids. The governing boundary-layer equations are transformed into non-dimensional form, and the resulting nonlinear system of partial differential equations is solved numerically by applying a marching-order implicit finite difference method with the double-sweep technique. Numerical results are presented for shear-thinning fluids in terms of the fluid temperature distributions and the rate of heat transfer expressed as the local Nusselt number.
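As a hedged sketch of what a "modified power-law" viscosity can look like (the exact functional form and cut-offs used in the paper are not given here), the snippet below applies ordinary power-law shear thinning at intermediate shear rates and clamps the viscosity to finite plateau values at very low and very high shear rates, so neither zero nor infinite viscosity arises.

```python
# Sketch of the idea behind a modified power-law viscosity: power-law behaviour
# at intermediate shear rates, constant plateaus outside the cut-offs.
# Cut-off values and constants below are illustrative, not from the paper.
import numpy as np

def modified_power_law_viscosity(gamma_dot, K=1.0, n=0.6, g_lo=1e-3, g_hi=1e3):
    g = np.clip(gamma_dot, g_lo, g_hi)   # clamp shear rate between the two cut-offs
    return K * g ** (n - 1.0)            # shear-thinning for n < 1

for gd in [1e-6, 1e-2, 1.0, 1e2, 1e6]:
    print(gd, modified_power_law_viscosity(gd))
```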
Abstract:
The richness of the iris texture and its variability across individuals make it a useful biometric trait for personal authentication. One of the key stages in classical iris recognition is the normalization process, where the annular iris region is mapped to a dimensionless pseudo-polar coordinate system. This process results in a rectangular structure that can be used to compensate for differences in scale and variations in pupil size. Most iris recognition methods in the literature adopt linear sampling in the radial and angular directions when performing iris normalization. In this paper, a biomechanical model of the iris is used to define a novel nonlinear normalization scheme that improves iris recognition accuracy under different degrees of pupil dilation. The proposed biomechanical model is used to predict the radial displacement of any point in the iris at a given dilation level, and this information is incorporated in the normalization process. Experimental results on the WVU pupil light reflex database (WVU-PLR) indicate the efficacy of the proposed technique, especially when matching iris images with large differences in pupil size.
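For context, the sketch below implements the classical linear "rubber sheet" mapping of the annular iris to a pseudo-polar grid; the paper's contribution is to replace the linear radial sampling with displacements predicted by its biomechanical model, which is not reproduced here. The image, centre, and radius values are arbitrary placeholders.

```python
# Minimal sketch of linear pseudo-polar iris normalisation (nearest-neighbour
# sampling); a biomechanical scheme would replace the linear radial sampling line.
import numpy as np

def normalize_iris(img, cx, cy, r_pupil, r_iris, radial_res=64, angular_res=256):
    thetas = np.linspace(0, 2 * np.pi, angular_res, endpoint=False)
    radii = np.linspace(0, 1, radial_res)                 # 0 = pupil edge, 1 = limbus
    out = np.zeros((radial_res, angular_res), dtype=img.dtype)
    for j, th in enumerate(thetas):
        for i, r in enumerate(radii):
            rad = r_pupil + r * (r_iris - r_pupil)        # linear radial sampling
            x = int(round(cx + rad * np.cos(th)))
            y = int(round(cy + rad * np.sin(th)))
            out[i, j] = img[np.clip(y, 0, img.shape[0] - 1),
                            np.clip(x, 0, img.shape[1] - 1)]
    return out

iris = normalize_iris(np.random.rand(240, 320), cx=160, cy=120, r_pupil=30, r_iris=100)
print(iris.shape)   # (64, 256): rectangular, size-invariant representation
```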
Abstract:
Gene expression is arguably the most important indicator of biological function. Thus, identifying differentially expressed genes is one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge in this area is estimating gene expression variance, owing to the high amount of ‘background noise’ generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we show that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used Cyber-T tool, which successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly, web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance estimates caused by small numbers of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy for selecting pseudo replicates improves the performance of the analysis.
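In the spirit of the approach described (not this tool's actual code), the sketch below stabilises a per-gene variance estimated from few replicates by shrinking it towards a prior variance derived from pseudo replicates, using a Baldi-and-Long-style weighted combination as in Cyber-T; the weighting constant and the data are illustrative assumptions.

```python
# Sketch of Bayesian variance moderation: combine the observed variance from a
# few replicates with a prior variance taken from "pseudo replicates"
# (here, genes with similar mean expression). n0 is a tuning choice.
import numpy as np

def moderated_variance(x, prior_var, n0=5):
    """Posterior-style variance blending prior and observed variance."""
    n = len(x)
    s2 = np.var(x, ddof=1)
    return (n0 * prior_var + (n - 1) * s2) / (n0 + n - 2)

gene = np.array([7.9, 8.4, 8.1])                          # three replicates (log scale)
neighbours = np.array([8.0, 8.3, 7.8, 8.5, 8.2, 7.7])     # pseudo replicates
prior = np.var(neighbours, ddof=1)
print(np.var(gene, ddof=1), moderated_variance(gene, prior))
```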
Abstract:
A 'pseudo-Bayesian' interpretation of standard errors yields a natural induced smoothing of statistical estimating functions. When applied to rank estimation, the lack of smoothness which prevents standard error estimation is remedied. Efficiency and robustness are preserved, while the smoothed estimation has excellent computational properties. In particular, convergence of the iterative equation for standard error is fast, and standard error calculation becomes asymptotically a one-step procedure. This property also extends to covariance matrix calculation for rank estimates in multi-parameter problems. Examples, and some simple explanations, are given.
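A toy, one-sample illustration of the induced smoothing idea (not the paper's multi-parameter rank-regression setting): the step-function score built from indicators is replaced by a normal-CDF-smoothed version whose bandwidth is of the order of the estimator's standard error.

```python
# Non-smooth sign-type estimating function vs its induced-smoothed surrogate.
import numpy as np
from scipy.stats import norm

x = np.array([1.2, -0.3, 0.8, 2.1, 0.4, -0.1, 1.5])

def sign_score(beta):                    # jumps as beta crosses data points
    return np.mean(np.sign(x - beta))

def smoothed_score(beta, h):             # smooth and differentiable in beta
    return np.mean(2 * norm.cdf((x - beta) / h) - 1)

h = np.std(x, ddof=1) / np.sqrt(len(x))  # bandwidth of the order of the std. error
for b in (0.3, 0.4, 0.5):
    print(b, sign_score(b), round(smoothed_score(b, h), 4))
```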
Abstract:
This 'project' investigates Janet Cardiff's Whispering Room. It examines how Cardiff deconstructs the privileging of the visual over all other corporeal senses in her work, the Whispering Room. Using sound as a fulcrum, Cardiff explores the links between subjects, collective narratives, memories, experiences and performances. Janet Cardiff destabilizes time and space and fractures the continuum through the use of sound. My 'project' celebrates sound as a transgressive medium: sound not as a gendered medium but as a vehicle in which to speak (to) gender. It explores how sound can destabilize notions of perception and reception and question art and museal practices. In the process this 'project' reveals the complexity of interpreting and representing art as an object. My aim is to reflect in my own text the very intertextual and expressionist collage that Cardiff has created in Whispering Room. Cardiff solicits the viewer's intimacy and participation. Whispering Room is a physical yet metonymic space in which Cardiff creates a place for performativity, experience, memory, desire and speech; she thus opens up a space for the utterance and performance of the viewer. Viewers construct and create meaning/s for themselves within this mnemonic space by digging up their own memories, desires and reveries. The strength of Cardiff's work is that it relies on a viewer to perform, a body to trigger the pseudo-spectacle and a voice to interrupt the whispers. One might ask of Whispering Room where the illusionistic space begins and where the physical space ends. This 'project' investigates how in Whispering Room there is no one experience but many experiences.
Abstract:
Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally absences comprise observations or are inferred from comprehensive sampling. When such information is not available, pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves the analysis highly susceptible to ‘naughty noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-stage approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fitted separately within each envelope, and for this stage we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve the treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the Receiver Operating Characteristic (ROC) curve (AUC), but focus on its constituent measures, the false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas a zero or low FOR is desirable since it indicates that none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species previously used as an exemplar species for MaxEnt by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence falling in the same range (0.00-0.20) for both true absences and presences.
In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet clearly visible in FOR, FNR, and the comparison of the distributions of predicted probability of presence for presences versus absences.
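The error rates discussed above follow directly from a 2x2 confusion matrix of predicted versus observed presence/absence at a chosen probability threshold; the sketch below computes FPR, FNR, FOR, and FDR on made-up data.

```python
# FPR, FNR, FOR and FDR from predicted probabilities of presence.
import numpy as np

def sdm_error_rates(obs, prob, threshold=0.5):
    pred = prob >= threshold
    tp = np.sum(pred & obs);   fp = np.sum(pred & ~obs)
    fn = np.sum(~pred & obs);  tn = np.sum(~pred & ~obs)
    return {
        "FPR": fp / (fp + tn),   # absences wrongly predicted present
        "FNR": fn / (fn + tp),   # presences wrongly predicted absent
        "FOR": fn / (fn + tn),   # predicted absences that were really presences
        "FDR": fp / (fp + tp),   # predicted presences that were really absences
    }

obs = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], bool)            # toy observations
prob = np.array([0.9, 0.7, 0.3, 0.6, 0.4, 0.2, 0.1, 0.15, 0.05, 0.5])
print(sdm_error_rates(obs, prob))
```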
Abstract:
Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison with a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
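The core synthetic likelihood computation described above can be sketched as follows: simulate the model repeatedly at a proposed parameter, summarise each simulation, fit a multivariate normal to the simulated summaries, and evaluate the observed summary under it. The toy model, summaries, and simulation counts below are assumptions for illustration; in BSL this noisy estimate would sit inside an MCMC sampler, which is not shown.

```python
# Minimal synthetic likelihood evaluation on a stand-in model.
import numpy as np
from scipy.stats import multivariate_normal

def simulate(theta, rng, n_obs=100):
    return rng.normal(theta, 1.0, size=n_obs)      # stand-in for an intractable model

def summary(x):
    return np.array([x.mean(), x.std(ddof=1)])

def synthetic_loglik(theta, s_obs, n_sim=200, seed=0):
    rng = np.random.default_rng(seed)
    S = np.array([summary(simulate(theta, rng)) for _ in range(n_sim)])
    mu, Sigma = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)

rng = np.random.default_rng(1)
s_obs = summary(simulate(0.5, rng))                 # "observed" summary statistic
for theta in (0.0, 0.5, 1.0):
    print(theta, round(synthetic_loglik(theta, s_obs), 2))
```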
Abstract:
This study reports an investigation of the ion exchange treatment of sodium chloride solutions in relation to the use of resin technology for applications such as desalination of brackish water. In particular, a strong acid cation (SAC) resin (DOW Marathon C) was studied to determine its capacity for sodium uptake and to evaluate the fundamentals of the ion exchange process involved. Key questions included: the impact of resin identity; the best models to simulate the kinetics and equilibrium exchange behaviour of sodium ions; the difference between using linear least squares (LLS) and non-linear least squares (NLLS) methods for data interpretation; and the effect of changing the type of anion in solution which accompanied the sodium species. Kinetic studies suggested that the exchange process was best described by a pseudo-first-order rate expression based upon non-linear least squares analysis of the test data. Application of the Langmuir-Vageler isotherm model was recommended, as it allowed confirmation that experimental conditions were sufficient for maximum loading of sodium ions to occur. The Freundlich expression best fitted the equilibrium data when analysing the information by an NLLS approach. In contrast, LLS methods suggested that the Langmuir model was optimal for describing the equilibrium process. The Competitive Langmuir model, which considered the stoichiometric nature of the ion exchange process, estimated the maximum loading of sodium ions to be 64.7 g Na/kg resin. This latter value was comparable to sodium ion capacities for SAC resins published previously. The inherent discrepancies involved in using linearized versions of kinetic and isotherm equations were illustrated, and despite their widespread use, the value of this latter approach was questionable. The equilibrium behaviour of sodium ions from sodium fluoride solution revealed that sodium ions were more preferred by the resin than in the sodium chloride case. The solution chemistry of hydrofluoric acid was suggested as promoting the affinity of the sodium ions for the resin.
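To make the NLLS-versus-LLS comparison concrete, the sketch below fits the pseudo-first-order model q(t) = qe(1 − exp(−k1·t)) by non-linear least squares and contrasts it with the usual linearised form fitted by ordinary least squares; the uptake data and the fixed qe used for linearisation are invented for illustration.

```python
# Pseudo-first-order kinetics: NLLS fit vs linearised (LLS) fit on made-up data.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([1, 2, 5, 10, 20, 40, 60], float)       # minutes (illustrative)
q = np.array([12, 21, 38, 50, 58, 62, 63], float)     # g Na/kg resin (illustrative)

def pfo(t, qe, k1):
    return qe * (1.0 - np.exp(-k1 * t))

(qe_nlls, k1_nlls), _ = curve_fit(pfo, t, q, p0=[60.0, 0.1])   # non-linear fit

qe_guess = q.max() * 1.05                                      # linearisation needs a fixed qe
slope, intercept = np.polyfit(t, np.log(qe_guess - q), 1)      # ln(qe - q) = ln(qe) - k1*t
print("NLLS:", qe_nlls, k1_nlls)
print("LLS :", np.exp(intercept), -slope)
```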
Abstract:
Bisphenol-A (BPA) adsorption onto inorganic-organic clays (IOCs) was investigated. For this purpose, IOCs synthesised using octadecyltrimethylammonium bromide (ODTMA, organic modifier) and hydroxy aluminium (Al13, inorganic modifier) were used. Three intercalation methods were employed with varying ODTMA concentrations in the synthesis of the IOCs. Molecular interactions of the clay surfaces with ODTMA and Al13, and their arrangements within the interlayers, were determined using Fourier transform infrared spectroscopy (FTIR). Surface area and porous structure of the IOCs were determined by applying the Brunauer-Emmett-Teller (BET) method to N2 adsorption-desorption isotherms. Surface area decreased upon ODTMA intercalation while it increased with Al13 pillaring. As a result, the BET specific surface area of the IOCs was considerably higher than that of organoclays. Initial concentration of BPA, contact time and adsorbent dose significantly affected BPA adsorption onto the IOCs. A pseudo-second-order kinetic model gave the best fit for BPA adsorption onto the IOCs. Both the Langmuir and Freundlich adsorption isotherms were applicable to BPA adsorption onto the IOCs (R2 > 0.91). The Langmuir maximum adsorption capacity for the IOCs was as high as 109.89 mg g−1 and was closely related to the amount of ODTMA loaded into the clay. Hydrophobic interactions between the long alkyl chains of ODTMA and BPA are responsible for BPA adsorption onto the IOCs.
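As a worked illustration of the two models named above (with invented data, not the study's measurements), the sketch below fits the pseudo-second-order kinetic model and the Langmuir isotherm by non-linear least squares.

```python
# Pseudo-second-order kinetics q(t) = qe^2*k2*t/(1 + qe*k2*t) and Langmuir
# isotherm qe(Ce) = qmax*b*Ce/(1 + b*Ce) fitted to illustrative data.
import numpy as np
from scipy.optimize import curve_fit

def pso(t, qe, k2):
    return qe**2 * k2 * t / (1.0 + qe * k2 * t)

def langmuir(Ce, qmax, b):
    return qmax * b * Ce / (1.0 + b * Ce)

t  = np.array([5, 10, 20, 40, 80, 160], float)        # min
qt = np.array([30, 48, 68, 84, 95, 101], float)       # mg/g adsorbed over time
Ce = np.array([2, 5, 10, 20, 40, 80], float)          # mg/L at equilibrium
qe = np.array([25, 48, 70, 90, 102, 108], float)      # mg/g at equilibrium

(kin_qe, k2), _ = curve_fit(pso, t, qt, p0=[100.0, 0.001])
(qmax, b), _ = curve_fit(langmuir, Ce, qe, p0=[110.0, 0.1])
print("PSO qe, k2:", kin_qe, k2, " Langmuir qmax:", qmax)
```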
Abstract:
Coal seam gas operations produce significant quantities of associated water which often require demineralization. Ion exchange with natural zeolites has been proposed as a possible approach. The interaction of natural zeolites with solutions of sodium chloride and sodium bicarbonate, in addition to coal seam gas water, is not clear. Hence, we investigated the ion exchange kinetics, equilibrium, and column behaviour of an Australian natural zeolite. Kinetic tests suggested that the pseudo-first-order equation best simulated the data. Intraparticle diffusion was part of the rate-limiting step, and more than one diffusion process controlled the overall rate of sodium ion uptake. Using a constant mass of zeolite and variable concentration of either sodium chloride or sodium bicarbonate resulted in a convex isotherm which was fitted by a Langmuir model. However, using a variable mass of zeolite and a constant concentration of sodium ions revealed that the exchange of sodium ions with the zeolite surface sites was in fact unfavourable. Sodium ion exchange from bicarbonate solutions (10.3 g Na/kg zeolite) was preferred relative to exchange from sodium chloride solutions (6.4 g Na/kg zeolite). The formation of calcium carbonate species was proposed to explain the observed behaviour. Column studies of coal seam gas water showed that the natural zeolite had limited ability to reduce the concentration of sodium ions (loading 2.1 g Na/kg zeolite), with rapid breakthrough observed. It was concluded that natural zeolites may not be suitable for the removal of cations from coal seam gas water without improvement of their physical properties.
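One common way to probe whether intraparticle diffusion contributes to the rate-limiting step, consistent with the statement above, is a Weber-Morris plot of uptake against the square root of time; the sketch below regresses two time windows separately on made-up data, where differing slopes would indicate more than one controlling diffusion process. This is an illustrative assumption, not the paper's analysis.

```python
# Weber-Morris intraparticle diffusion check: q(t) vs sqrt(t), piecewise regression.
import numpy as np

t = np.array([1, 4, 9, 16, 36, 64, 100], float)      # min
q = np.array([0.8, 1.7, 2.6, 3.3, 4.0, 4.4, 4.6])    # g Na/kg zeolite (illustrative)

root_t = np.sqrt(t)
windows = {"early": slice(0, 4),    # initial, diffusion-controlled stage
           "late":  slice(3, None)} # later, near-equilibrium stage
for name, win in windows.items():
    k_id, C = np.polyfit(root_t[win], q[win], 1)
    print(f"{name}: k_id={k_id:.3f} g/kg/min^0.5, intercept C={C:.3f}")
```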
Abstract:
The increased availability of image capturing devices has enabled collections of digital images to rapidly expand in both size and diversity. This has created a constantly growing need for efficient and effective image browsing, searching, and retrieval tools. Pseudo-relevance feedback (PRF) has proven to be an effective mechanism for improving retrieval accuracy. An original, simple yet effective rank-based PRF mechanism (RB-PRF), which takes the initial rank order of each image into account to improve retrieval accuracy, is proposed. This RB-PRF mechanism innovates by using binary image signatures to improve retrieval precision, promoting images similar to highly ranked images and demoting images similar to lower-ranked images. Empirical evaluations based on standard benchmarks, namely the Wang, Oliva & Torralba, and Corel datasets, demonstrate the effectiveness of the proposed RB-PRF mechanism in image retrieval.
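The abstract does not give the RB-PRF scoring formula, so the sketch below is only a guess at the general mechanism it describes: each candidate's initial score is nudged up by its Hamming similarity to the binary signatures of highly ranked images and nudged down by its similarity to low-ranked ones. The signature length, weights, and data are placeholders.

```python
# Toy rank-based pseudo-relevance feedback using binary signatures (hypothetical scoring).
import numpy as np

rng = np.random.default_rng(0)
signatures = rng.integers(0, 2, size=(100, 256), dtype=np.uint8)  # binary image signatures
initial_scores = rng.random(100)                                   # first-pass retrieval scores

def rank_based_prf(scores, sigs, top_k=5, bottom_k=5, alpha=0.3):
    order = np.argsort(-scores)
    top, bottom = sigs[order[:top_k]], sigs[order[-bottom_k:]]
    # mean Hamming similarity of every image to the top- and bottom-ranked sets
    sim_top = 1.0 - (sigs[:, None, :] != top[None]).mean(axis=(1, 2))
    sim_bot = 1.0 - (sigs[:, None, :] != bottom[None]).mean(axis=(1, 2))
    return scores + alpha * (sim_top - sim_bot)

new_scores = rank_based_prf(initial_scores, signatures)
print(np.argsort(-new_scores)[:10])    # re-ranked top 10
```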
Abstract:
This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two components were evaluated: whether Quantum Inspired models of Target Activation would provide a better framework than their classical psychological counterparts, and how robust these models are across the different sources of Free Association data. In previous work, no formal model of cued-target recall existed, and as such Target Activation could not be assessed directly. Further to that, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], and alternative sources of data have also become available. This allowed us to directly model target activation in cued-target recall with human cued-target recall pairs and to use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that might explain variations in performance across the models. Each of the activation models was used in the PIER3 memory model for each of the data sources and was benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance. The first measured the divergence between the sets of results using the Kullback-Leibler (KL) divergence, while the second utilized a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects: the USF Free Association Norms and the University of Leuven (UL) Free Association Networks. The third was sourced from a new method put forward by Galea and Bruza (2015), in which pseudo Free Association Networks (Corpus-Based Association Networks, CANs) are built using co-occurrence statistics on a large text corpus. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
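For reference, the Kullback-Leibler divergence used in the first evaluation method can be computed as below; the two toy distributions stand in for model-derived and human cued-target recall probabilities and are not taken from the paper.

```python
# KL divergence between two discrete distributions (with small-value smoothing).
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

human   = [0.40, 0.25, 0.20, 0.10, 0.05]   # e.g. recall probabilities over candidate targets
model_a = [0.38, 0.27, 0.18, 0.12, 0.05]   # closer to human data -> smaller divergence
model_b = [0.20, 0.20, 0.20, 0.20, 0.20]   # uniform baseline -> larger divergence
print(kl_divergence(human, model_a), kl_divergence(human, model_b))
```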