176 results for "Models validation"


Relevance: 20.00%

Publisher:

Abstract:

We study the potential of the CERN Large Hadron Collider (LHC) to probe the spin of new massive vector boson resonances predicted by Higgsless models. We consider their production via weak boson fusion, which relies only on the coupling between the new resonances and the weak gauge bosons. We show that the LHC will be able to unravel the spin of the particles associated with the partial restoration of unitarity in vector boson scattering for integrated luminosities of 150-560 fb⁻¹, depending on the mass of the new state and on the method used in the analysis.

Relevance: 20.00%

Publisher:

Abstract:

We show that the common singularities present in generic modified gravity models governed by actions of the type $S = \int d^4x \sqrt{-g}\, f(R, \phi, X)$, with $X = -\frac{1}{2} g^{ab} \partial_a \phi\, \partial_b \phi$, are essentially the same anisotropic instabilities associated with the hypersurface $F(\phi) = 0$ in the case of a nonminimal coupling of the type $F(\phi)R$, clarifying the physical origin of such singularities, which typically arise in rather complex and cumbersome inhomogeneous perturbation analyses. We show, moreover, that such anisotropic instabilities typically give rise to dynamically unavoidable singularities, completely precluding the possibility of physically viable models for which the hypersurface $\partial f/\partial R = 0$ is attained. Some examples are discussed explicitly.

Relevance: 20.00%

Publisher:

Abstract:

In one-component Abelian sandpile models, the toppling probabilities are independent quantities. This is not the case in multicomponent models. The condition of associativity of the underlying Abelian algebras imposes nonlinear relations among the toppling probabilities. These relations are derived for the case of two-component quadratic Abelian algebras. We show that Abelian sandpile models with two conservation laws have only trivial avalanches.

Relevance: 20.00%

Publisher:

Abstract:

With each directed acyclic graph (this includes some D-dimensional lattices) one can associate certain Abelian algebras that we call directed Abelian algebras (DAAs). On each site of the graph one attaches a generator of the algebra. These algebras depend on several parameters and are semisimple. Using any DAA, one can define a family of Hamiltonians that give the continuous-time evolution of a stochastic process. The calculation of the spectra and ground-state wave functions (stationary-state probability distributions) is an easy algebraic exercise. If one considers D-dimensional lattices and chooses Hamiltonians linear in the generators, in finite-size scaling the Hamiltonian spectrum is gapless with a critical dynamic exponent z = D. One possible application of the DAA is to sandpile models. In the paper we present this application, considering one- and two-dimensional lattices. In the one-dimensional case, when the DAA conserves the number of particles, the avalanches belong to the random-walker universality class (critical exponent σ_τ = 3/2). We study the local density of particles inside large avalanches, showing a depletion of particles at the source of the avalanche and an enrichment at its end. In two dimensions we performed extensive Monte Carlo simulations and found σ_τ = 1.780 ± 0.005.
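
The random-walker scaling quoted above can be checked with a few lines of Python. The sketch below is not the DAA construction itself; it only illustrates the universality-class argument under the assumption that the number of active particles in a one-dimensional, particle-conserving avalanche performs an unbiased random walk, so that the avalanche size is a first-return time with distribution P(s) ~ s^(-3/2):

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(max_steps=10**6):
    """Size of one avalanche in the random-walker picture: the number of
    'active' grains performs an unbiased random walk started at 1, and the
    avalanche stops when activity dies out (first return to zero)."""
    active, size = 1, 0
    while active > 0 and size < max_steps:
        active += rng.choice((-1, 1))   # one toppling creates or absorbs activity
        size += 1
    return size

sizes = np.array([avalanche_size() for _ in range(20000)])

# Crude estimate of the exponent from a log-log histogram of avalanche sizes.
bins = np.logspace(0, 4, 25)
hist, edges = np.histogram(sizes, bins=bins, density=True)
centers = np.sqrt(edges[1:] * edges[:-1])
mask = hist > 0
tau = -np.polyfit(np.log(centers[mask]), np.log(hist[mask]), 1)[0]
print(f"estimated avalanche-size exponent tau ~ {tau:.2f} (expected 3/2)")
```

With this sample size the fitted exponent should land near 1.5, up to binning and finite-size corrections.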

Relevance: 20.00%

Publisher:

Abstract:

The efficacy of fluorescence spectroscopy to detect squamous cell carcinoma is evaluated in an animal model following laser excitation at 442 and 532 nm. Lesions are chemically induced by topical DMBA application on the left lateral tongue of Golden Syrian hamsters. The animals are investigated every 2 weeks from the 4th week of induction up to a total of 26 weeks. The right lateral tongue of each animal is used as a control site (normal contralateral tissue), and the induced lesions are analyzed as a set of points covering the entire clinically detectable area. Based on fluorescence spectral differences, four indices are determined via intraspectral analysis to discriminate normal from carcinoma tissue. The spectral data are also analyzed using multivariate data analysis, and the results are compared with histology as the diagnostic gold standard. The best result is achieved for blue excitation using the KNN (K-nearest neighbor, an interspectral analysis) algorithm, with a sensitivity of 95.7% and a specificity of 91.6%. These high indices indicate that fluorescence spectroscopy may constitute a fast, noninvasive auxiliary tool for the diagnosis of cancer within the oral cavity. (C) 2008 Society of Photo-Optical Instrumentation Engineers.
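
For readers unfamiliar with the interspectral step, the following sketch reproduces the kind of KNN classification and sensitivity/specificity bookkeeping described above, using scikit-learn on synthetic Gaussian-band "spectra" (the band positions, widths, and class sizes are illustrative assumptions, not the hamster data):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)

# Synthetic stand-in for fluorescence spectra: 200 "normal" and 200 "lesion"
# spectra over 300 wavelength channels, differing slightly in band shape.
wavelengths = np.linspace(450, 700, 300)
normal = np.exp(-((wavelengths - 520) / 40) ** 2) + 0.05 * rng.normal(size=(200, 300))
lesion = 0.8 * np.exp(-((wavelengths - 540) / 45) ** 2) + 0.05 * rng.normal(size=(200, 300))
X = np.vstack([normal, lesion])
y = np.array([0] * 200 + [1] * 200)          # 0 = normal, 1 = carcinoma

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```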

Relevance: 20.00%

Publisher:

Abstract:

The structure of laser glasses in the system (Y2O3)0.2{(Al2O3)x(B2O3)0.8−x} (0.15 ≤ x ≤ 0.40) has been investigated by means of ¹¹B, ²⁷Al, and ⁸⁹Y solid-state NMR as well as electron spin echo envelope modulation (ESEEM) of Yb-doped samples. The latter technique has been applied for the first time to an aluminoborate glass system. ¹¹B magic-angle spinning (MAS) NMR spectra reveal that, while the majority of the boron atoms are three-coordinated over the entire composition region, the fraction of three-coordinated boron atoms increases significantly with increasing x. Charge-balance considerations as well as ¹¹B NMR lineshape analyses suggest that the dominant borate species are singly charged metaborate (BO2/2O⁻), doubly charged pyroborate (BO1/2(O⁻)2), and (at x = 0.40) triply charged orthoborate groups. As x increases along this series, the average anionic charge per trigonal borate group increases from 1.38 to 2.91. ²⁷Al MAS-NMR spectra show that the alumina species are present in the coordination states four, five, and six, and that the fraction of four-coordinated Al increases markedly with increasing x. All of the Al coordination states are in intimate contact with both the three- and the four-coordinated boron species and vice versa, as indicated by ¹¹B/²⁷Al rotational echo double resonance (REDOR) data. These results are consistent with the formation of a homogeneous, non-segregated glass structure. ⁸⁹Y solid-state NMR spectra show a significant chemical shift trend, reflecting that the second coordination sphere becomes increasingly "aluminate-like" with increasing x. This conclusion is supported by ESEEM data of Yb-doped glasses, which indicate that both borate and aluminate species participate in the medium-range structure of the rare-earth ions, consistent with a random spatial distribution of the glass components.

Relevance: 20.00%

Publisher:

Abstract:

The objective of this work was to develop and validate a rapid reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of 3,5,3′-triiodothyroacetic acid (TRIAC) in nanoparticle delivery systems prepared from different polymeric matrices. Special attention was given to developing a reliable, reproducible technique for the pretreatment of the samples. Chromatographic runs were performed on an Agilent 1200 Series HPLC with a Phenomenex® Gemini C18 column (150 × 4.6 mm i.d., 5 μm) using acetonitrile and 0.1% triethylamine (TEA) buffer (40:60 v/v) as the mobile phase in an isocratic elution at pH 5.6 and a flow rate of 1 mL min⁻¹. TRIAC was detected at a wavelength of 220 nm. The injection volume was 20 μL and the column temperature was maintained at 35 °C. The validation characteristics included accuracy, precision, specificity, linearity, recovery, and robustness. The standard curve was found to be linear (r² = 0.9996) over the analytical range of 5-100 μg mL⁻¹. The detection and quantitation limits were 1.3 and 3.8 μg mL⁻¹, respectively. The recovery and the TRIAC loading in the colloidal delivery system were nearly 100% and 98%, respectively. The method was successfully applied to polycaprolactone, polyhydroxybutyrate, and polymethylmethacrylate nanoparticles.
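
A minimal sketch of the linearity and limit calculations behind figures like these is given below, assuming the common ICH-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S from the residual standard deviation σ and slope S of the calibration curve; the concentrations and peak areas are invented for illustration, not the authors' data:

```python
import numpy as np
from scipy import stats

# Illustrative calibration data: concentration (ug/mL) vs. peak area (a.u.)
conc = np.array([5, 10, 25, 50, 75, 100], dtype=float)
area = np.array([52.1, 103.8, 260.4, 519.9, 781.2, 1040.5])

fit = stats.linregress(conc, area)
residual_sd = np.std(area - (fit.intercept + fit.slope * conc), ddof=2)

lod = 3.3 * residual_sd / fit.slope      # limit of detection
loq = 10.0 * residual_sd / fit.slope     # limit of quantitation

print(f"r^2 = {fit.rvalue**2:.4f}")
print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```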

Relevance: 20.00%

Publisher:

Abstract:

Thanks to recent advances in molecular biology, allied to an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem regards how to validate such approaches and their results. This work presents an objective approach for the validation of gene network modeling and identification which comprises three main aspects: (1) Artificial Gene Network (AGN) model generation through theoretical models of complex networks, which is used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, founded on a feature selection approach in which a target gene is fixed and the expression profiles of all other genes are observed in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used to simulate temporal expression data. The results of the network identification method can then be compared to the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks have been assessed: the uniformly random Erdős-Rényi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabási-Albert (BA), and geographical (GG) networks. The experimental results indicate that the inference method was sensitive to variation of the average degree k, its network recovery rate decreasing as k increases. The signal size was important for the inference method to achieve better accuracy in the network identification rate, with very good results obtained from small expression profiles. However, the adopted inference method was not able to distinguish different structures of interaction among genes, presenting similar behavior when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for the validation of the inferred networks, identifying some properties of the evaluated method, and it can be extended to other inference methods.
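
A minimal sketch of steps (1) and (3) of this framework is given below, using networkx to build ER, WS, and BA topologies and a precision/recall comparison against the ground truth; the inference step (2) is replaced here by a random corruption of the true network purely as a stand-in, since the actual feature-selection method is not reproduced:

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(2)
n, k = 100, 4   # number of genes and average degree

# Step 1: artificial gene networks with different theoretical topologies
models = {
    "ER": nx.gnm_random_graph(n, n * k // 2, seed=2),
    "WS": nx.watts_strogatz_graph(n, k, 0.1, seed=2),
    "BA": nx.barabasi_albert_graph(n, k // 2, seed=2),
}

def score(true_g, inferred_edges):
    """Step 3: compare an inferred edge set against the ground-truth AGN."""
    true_edges = {frozenset(e) for e in true_g.edges()}
    inferred = {frozenset(e) for e in inferred_edges}
    tp = len(true_edges & inferred)
    precision = tp / max(len(inferred), 1)
    recall = tp / max(len(true_edges), 1)
    return precision, recall

for name, g in models.items():
    # Stand-in for step 2 (the actual inference from simulated expression data):
    # here we just corrupt the true network by dropping and adding random edges.
    kept = [e for e in g.edges() if rng.random() > 0.3]               # missed edges
    spurious = [tuple(rng.choice(n, 2, replace=False)) for _ in range(20)]
    p, r = score(g, kept + spurious)
    print(f"{name}: precision = {p:.2f}, recall = {r:.2f}")
```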

Relevance: 20.00%

Publisher:

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is currently one of the most challenging problems in Systems Biology. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly because of the short time series available in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is proposed here. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expression data generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are obtained by random draws from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, on the other hand, cover a range of network sizes, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions, and network sizes. Conclusions: A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The best value of the free parameter of the Tsallis entropy was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for the investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
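
The criterion itself is easy to prototype. The sketch below (my own simplified reading, not the DimReduction implementation) computes a frequency-weighted Tsallis conditional entropy of a discretized target gene given a candidate predictor set, the quantity a feature-selection search would minimize:

```python
import numpy as np
from collections import Counter

def tsallis_entropy(probs, q):
    """Generalized (Tsallis) entropy; reduces to Shannon entropy as q -> 1."""
    probs = np.asarray([p for p in probs if p > 0])
    if abs(q - 1.0) < 1e-9:
        return float(-(probs * np.log(probs)).sum())
    return float((1.0 - (probs ** q).sum()) / (q - 1.0))

def conditional_tsallis(target, predictors, q=2.5):
    """Mean Tsallis entropy of `target` given each joint predictor state,
    weighted by the state frequency (criterion to minimize)."""
    states = list(map(tuple, np.asarray(predictors).T))
    total, n = 0.0, len(target)
    for s, cnt in Counter(states).items():
        idx = [i for i, st in enumerate(states) if st == s]
        dist = np.bincount(np.asarray(target)[idx])
        total += (cnt / n) * tsallis_entropy(dist / cnt, q)
    return total

# Toy binarized time series: the target gene is (noisily) the negation of g1,
# while g2 is unrelated; the informative predictor should score lower.
rng = np.random.default_rng(3)
g1 = rng.integers(0, 2, 200)
g2 = rng.integers(0, 2, 200)
target = np.where(rng.random(200) < 0.9, 1 - g1, g1)

print("H_q(target | g1) =", round(conditional_tsallis(target, [g1]), 3))
print("H_q(target | g2) =", round(conditional_tsallis(target, [g2]), 3))
```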

Relevance: 20.00%

Publisher:

Abstract:

Based on solvation studies of polymers, the sum (1:1) of the electron acceptor (AN) and electron donor (DN) values of solvents has been proposed as an alternative polarity scale. To test this, the electron paramagnetic resonance isotropic hyperfine splitting constant, a parameter known to depend on the polarity/proticity of the medium, was correlated with the (AN + DN) term using three paramagnetic probes. The linear regression coefficient calculated for 15 different solvents was approximately 0.9, quite similar to those of other well-known polarity parameters, attesting to the validity of the (AN + DN) term as a novel "two-parameter" solvent polarity scale.
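
The correlation test amounts to a simple linear regression; a sketch with scipy is shown below, where both the (AN + DN) sums and the hyperfine splitting constants are invented illustrative numbers rather than the measured values for the 15 solvents and three probes:

```python
import numpy as np
from scipy import stats

# Illustrative (not the paper's) values: solvent (AN + DN) sums and the
# isotropic hyperfine splitting constant of a spin probe (arbitrary units).
an_plus_dn = np.array([20.1, 28.3, 33.5, 39.8, 44.0, 51.2, 55.7, 63.4])
a_iso      = np.array([15.1, 15.3, 15.4, 15.6, 15.7, 15.9, 16.0, 16.2])

fit = stats.linregress(an_plus_dn, a_iso)
print(f"linear correlation coefficient r = {fit.rvalue:.3f}")
```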

Relevance: 20.00%

Publisher:

Abstract:

Background: Mutations in TP53 are common events during carcinogenesis. In addition to gene mutations, several reports have focused on TP53 polymorphisms as risk factors for malignant disease. Many studies have highlighted that the status of the TP53 codon 72 polymorphism could influence cancer susceptibility. However, the results have been inconsistent, and various methodological features can contribute to departures from Hardy-Weinberg equilibrium, a condition that may influence disease risk estimates. The most widely accepted way of detecting genotyping error is to confirm genotypes by sequencing and/or by a separate method. Results: We developed two new genotyping methods for detection of the TP53 codon 72 polymorphism: denaturing high-performance liquid chromatography (DHPLC) and dot blot hybridization. These methods were compared with restriction fragment length polymorphism (RFLP) analysis using two different restriction enzymes. We observed high agreement among all methodologies assayed. Dot blot hybridization and DHPLC results were more concordant with each other than either of them was with RFLP. Conclusions: Although variations may occur, our results indicate that DHPLC and dot blot hybridization can be used as reliable screening methods for detection of the TP53 codon 72 polymorphism, especially in molecular epidemiologic studies, where high-throughput methodologies are required.
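
Two of the quality checks mentioned above, testing genotype counts for Hardy-Weinberg equilibrium and measuring agreement between genotyping methods, can be sketched as follows; the counts and genotype calls are illustrative, not the study's data:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import cohen_kappa_score

# Illustrative genotype counts for codon 72 (Arg/Arg, Arg/Pro, Pro/Pro)
obs = np.array([120, 95, 25])
n = obs.sum()
p = (2 * obs[0] + obs[1]) / (2 * n)                 # Arg allele frequency
exp = n * np.array([p**2, 2 * p * (1 - p), (1 - p) ** 2])
chi2_stat = ((obs - exp) ** 2 / exp).sum()
print(f"HWE chi2 = {chi2_stat:.2f}, p = {chi2.sf(chi2_stat, df=1):.3f}")

# Concordance between two genotyping methods on the same samples (illustrative)
dhplc = ["AA", "AP", "AA", "PP", "AP", "AA", "AP", "PP", "AA", "AP"]
rflp  = ["AA", "AP", "AA", "PP", "AA", "AA", "AP", "PP", "AA", "AP"]
print(f"Cohen's kappa (DHPLC vs RFLP) = {cohen_kappa_score(dhplc, rflp):.2f}")
```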

Relevance: 20.00%

Publisher:

Abstract:

The application of laser-induced breakdown spectrometry (LIBS) aiming at the direct analysis of plant materials is a great challenge that still requires efforts for its development and validation. To this end, a series of experimental approaches has been carried out in order to show that LIBS can be used as an alternative to methods based on wet acid digestion for the analysis of agricultural and environmental samples. The large amount of information provided by LIBS spectra of these complex samples increases the difficulty of selecting the most appropriate wavelengths for each analyte. Some applications have suggested that improvements in both accuracy and precision can be achieved by applying multivariate calibration to LIBS data, compared with univariate regression on line emission intensities. In the present work, the performance of univariate and multivariate calibration, the latter based on partial least squares regression (PLSR), was compared for the analysis of pellets of plant materials made from an appropriate mixture of cryogenically ground samples with cellulose as the binding agent. The development of a specific PLSR model for each analyte and the selection of spectral regions containing only lines of the analyte of interest gave the best conditions for the analysis. In this particular application, the two models showed similar performance, but PLSR seemed to be more robust owing to a lower occurrence of outliers in comparison with the univariate method. The data suggest that efforts addressing sample presentation and the fitness of standards for LIBS analysis must be made in order to fulfill the boundary conditions for matrix-independent development and validation. (C) 2009 Elsevier B.V. All rights reserved.
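
A minimal sketch of the univariate-versus-PLSR comparison is given below, using scikit-learn on synthetic "spectra" in which one channel plays the role of the analyte emission line and a neighbouring band carries an overlapping interference; the data and the chosen number of latent variables are illustrative assumptions, not the LIBS measurements described above:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Synthetic stand-in for LIBS spectra of 60 pellets over 200 channels:
# an analyte band around channel 80 overlapped by an interfering matrix band.
conc = rng.uniform(1, 50, 60)                     # analyte "concentration"
matrix = rng.uniform(0, 30, 60)                   # independently varying interferent
spectra = 0.02 * rng.normal(size=(60, 200))
spectra[:, 78:83] += 0.08 * conc[:, None]         # analyte emission band
spectra[:, 80:88] += 0.05 * matrix[:, None]       # overlapping interfering band

def cv_rmse(model, X):
    scores = cross_val_score(model, X, conc, cv=5,
                             scoring="neg_root_mean_squared_error")
    return -scores.mean()

print("univariate (peak channel) RMSE:",
      round(cv_rmse(LinearRegression(), spectra[:, [80]]), 2))
print("PLSR (spectral window)    RMSE:",
      round(cv_rmse(PLSRegression(n_components=3), spectra[:, 70:95]), 2))
```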

Relevance: 20.00%

Publisher:

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. Then, we provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems less relevant. The statistical values calculated over the Lyapunov exponents were also good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
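
The kernel-parameter sensitivity analysis can be sketched compactly with scikit-learn; in the example below the EEG feature vectors are replaced by a synthetic classification problem, so only the mechanics of sweeping the RBF kernel parameter and reading off cross-validated accuracy are illustrated:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the EEG feature vectors (e.g. wavelet statistics)
X, y = make_classification(n_samples=400, n_features=12, n_informative=6,
                           random_state=0)

# Sensitivity of the cross-validated accuracy to the RBF kernel parameter
for gamma in np.logspace(-3, 1, 9):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma=gamma, C=1.0))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"gamma = {gamma:8.4f}  mean CV accuracy = {acc:.3f}")
```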

Relevance: 20.00%

Publisher:

Abstract:

Here, I investigate the use of Bayesian updating rules for modeling how social agents change their minds in the case of continuous opinion models. Given another agent's statement about the continuous value of a variable, interesting dynamics emerge when an agent assigns to that value a likelihood that is a mixture of a Gaussian and a uniform distribution. This represents the idea that the other agent might have no idea about what is being discussed. The effect of updating only the first moment of the distribution is studied, and this generates results similar to those of the bounded confidence models. When the second moment is also updated, several different opinions always survive in the long run, as agents become more stubborn with time. However, depending on the probability of error and the initial uncertainty, those opinions might be clustered around a central value.
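
A minimal sketch of one such update step is given below; it is my own simplified reading of the scheme (Gaussian belief, Gaussian-plus-uniform likelihood, moment matching of the posterior mixture), with the probability of an informative statement, the noise scale, and the opinion support chosen arbitrarily:

```python
import numpy as np
from scipy.stats import norm

def update(mu, var, x, p=0.8, noise=0.1, support=(0.0, 1.0)):
    """One Bayesian update of an agent's Gaussian belief N(mu, var) after
    hearing another agent state the value x.  Likelihood: with probability p
    the statement is informative (Gaussian around the true value, variance
    noise**2); otherwise it is uniform noise on `support`."""
    eps2 = noise ** 2
    # Posterior probability that the statement was informative, given x
    like_inf = p * norm.pdf(x, mu, np.sqrt(var + eps2))
    like_noise = (1 - p) / (support[1] - support[0])
    w = like_inf / (like_inf + like_noise)
    # Moments of the informative (conjugate Gaussian) component
    mu_inf = (var * x + eps2 * mu) / (var + eps2)
    var_inf = var * eps2 / (var + eps2)
    # First two moments of the two-component posterior mixture
    new_mu = w * mu_inf + (1 - w) * mu
    second = w * (var_inf + mu_inf ** 2) + (1 - w) * (var + mu ** 2)
    return new_mu, second - new_mu ** 2

mu, var = 0.5, 0.05
for x in (0.55, 0.62, 0.10, 0.58):     # the far-off statement (0.10) is largely discounted
    mu, var = update(mu, var, x)
    print(f"after hearing {x:4.2f}: mu = {mu:.3f}, sigma = {np.sqrt(var):.3f}")
```

Note how the shrinking variance makes the agent increasingly stubborn, which is the mechanism behind the survival of several opinions mentioned above.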

Relevance: 20.00%

Publisher:

Abstract:

Today, several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image data applications, self-organizing methods for unsupervised classification have been successfully applied to cluster pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named Enhanced Independent Component Analysis Mixture Model (EICAMM), built by proposing modifications to the Independent Component Analysis Mixture Model (ICAMM). Such improvements were proposed after considering some of the model's limitations and analyzing how it could be made more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) for image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segment images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results for the proposals presented herein. (C) 2008 Published by Elsevier B.V.
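
A rough sketch of such a pipeline is shown below, with deliberate stand-ins: scikit-image's Sobel filter for the edge-enhancement step, a plain Gaussian mixture in place of the (E)ICAMM clustering, and no SCS denoising; it only illustrates the pre-process-then-cluster structure, not the paper's models:

```python
import numpy as np
from skimage import data, filters
from sklearn.mixture import GaussianMixture

# Test image and a Sobel edge map used as an extra feature channel
image = data.camera().astype(float) / 255.0
edges = filters.sobel(image)

# Pixel-wise features: intensity + edge strength (stand-in for the paper's
# pre-processing; the ICA-mixture clustering is replaced by a Gaussian mixture)
features = np.column_stack([image.ravel(), edges.ravel()])
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(features)
segmentation = labels.reshape(image.shape)

for k in range(3):
    print(f"segment {k}: {np.mean(segmentation == k):.1%} of pixels")
```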