915 results for Automated estimator
Abstract:
Gaussianity and statistical isotropy of the Universe are modern cosmology's minimal set of hypotheses. In this work we introduce a new statistical test to detect observational deviations from this minimal set. By defining the temperature correlation function over the whole celestial sphere, we are able to quantify independently both angular and planar dependence (modulations) of the CMB temperature power spectrum over different slices of this sphere. Since planar dependence leads to further modulations of the usual angular power spectrum C(l), this test can potentially reveal richer structures in the morphology of the primordial temperature field. We have also constructed an unbiased estimator for this angular-planar power spectrum which naturally generalizes the estimator for the usual C(l)'s. With the help of a chi-square analysis, we have used this estimator to search for observational deviations from statistical isotropy in WMAP's 5-year data release (ILC5), where we found only slight anomalies at the angular scales l = 7 and l = 8. Since this angular-planar statistic is model-independent, it is ideal for searches for statistical anisotropy (e.g., contamination from the galactic plane) and for characterizing non-Gaussianities.
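For reference, the unbiased full-sky estimator for the usual angular power spectrum, which the angular-planar estimator generalizes, is C(l) = 1/(2l+1) * sum_m |a_lm|^2. A minimal sketch in Python, assuming the harmonic coefficients a_lm are available in a simple (l, m)-indexed dictionary (a hypothetical layout, not the authors' pipeline):

    import numpy as np

    def estimate_cl(alm, lmax):
        # Unbiased full-sky estimator: C(l) = 1/(2l+1) * sum_m |a_lm|^2.
        # alm: dict mapping (l, m), with m in [-l, l], to complex coefficients.
        cl = np.zeros(lmax + 1)
        for l in range(lmax + 1):
            cl[l] = sum(abs(alm[(l, m)]) ** 2 for m in range(-l, l + 1)) / (2 * l + 1)
        return cl

The angular-planar statistic modulates this construction over slices of the sphere; only the familiar isotropic limit is shown here.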
Abstract:
Finite-size scaling analysis is a powerful tool for calculating the phase diagram as well as the critical properties of two-dimensional classical statistical mechanics models and of quantum Hamiltonians in one dimension. The most widely used method for locating quantum critical points is the so-called crossing method, where the estimates are obtained by comparing the mass gaps of two distinct lattice sizes. The success of this method is due to its simplicity and its ability to provide accurate results even for relatively small lattice sizes. In this paper, we introduce an estimator that locates quantum critical points by exploiting the known distinct behavior of the entanglement entropy in critical and noncritical systems. As a benchmark test, we use this new estimator to locate the critical point of the quantum Ising chain and the critical line of the spin-1 Blume-Capel quantum chain; the tricritical point of the latter model is also obtained, and a comparison with the standard crossing method is presented. The method we propose is simple to implement in practice, particularly in density matrix renormalization group calculations, and, like the crossing method, provides remarkably accurate results for quite small lattice sizes. Our applications show that the proposed method has several advantages over the standard crossing method, and we believe it will become popular in future numerical studies.
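For comparison, the crossing method admits a compact statement: since the gap at criticality scales as 1/L, the estimate of the critical coupling solves L1 * Delta(L1, g) = L2 * Delta(L2, g) for two sizes L1 < L2. A minimal sketch, assuming a user-supplied routine gap(L, g) (e.g., from exact diagonalization or DMRG) and a bracketing interval on which the scaled-gap difference changes sign:

    from scipy.optimize import brentq

    def crossing_estimate(gap, L1, L2, g_lo, g_hi):
        # Root of f(g) = L1*gap(L1, g) - L2*gap(L2, g) on [g_lo, g_hi].
        # At criticality the gap scales as 1/L, so the scaled gaps of two
        # sizes cross at (an estimate of) g_c.
        f = lambda g: L1 * gap(L1, g) - L2 * gap(L2, g)
        return brentq(f, g_lo, g_hi)

The entanglement-entropy estimator proposed in the paper replaces the mass gap with a quantity whose critical behavior is qualitatively different; its precise construction is not reproduced here.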
Abstract:
Background: There are several studies in the literature describing measurement error in gene expression data, and several others about regulatory network models; however, only a small fraction combine measurement error with mathematical regulatory networks and show how to identify these networks under different noise levels. Results: This article investigates the effects of measurement error on the estimation of the parameters of regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time series (independent) data, measurement error strongly affects the estimated parameters of regulatory network models, biasing them as predicted by theory. Moreover, when testing the parameters of regulatory network models, p-values computed by ignoring the measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. To overcome these problems, we present an improved version of the Ordinary Least Squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Measurement error estimation procedures for microarrays are also described. Simulation results show that both corrected methods perform better than the standard ones (i.e., ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error severely affects the identification of regulatory network models; it must therefore be reduced or taken into account in order to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates found in actual regulatory network models.
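In the simple-regression case, the flavour of such a correction can be illustrated with the textbook errors-in-variables adjustment: classical measurement error shrinks the naive OLS slope by the reliability ratio Var(x_true)/Var(x_obs), which can be undone when the noise variance is known, e.g. estimated from technical replicates (a generic sketch, not the paper's exact estimator):

    import numpy as np

    def corrected_ols_slope(x_obs, y, sigma_u2):
        # x_obs = x_true + u with Var(u) = sigma_u2 known. Naive OLS
        # estimates beta * Var(x_true) / (Var(x_true) + sigma_u2);
        # dividing by the reliability ratio undoes the attenuation.
        sxx = np.var(x_obs, ddof=1)            # estimates Var(x_true) + sigma_u2
        sxy = np.cov(x_obs, y)[0, 1]
        beta_naive = sxy / sxx
        reliability = (sxx - sigma_u2) / sxx   # estimates Var(x_true) / Var(x_obs)
        return beta_naive / reliability

The autoregressive (time series) case corrects the lag-covariance denominator in the same spirit.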
Abstract:
We consider the problem of estimating the interaction neighborhood from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interacting neighborhoods: the selection rule selects a small prior set of possible interacting points, and a cutting step removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, on the structure of the interaction graph, or on the range of the interactions; this provides a large class of applications for our results. We give a computationally efficient procedure for these models and, finally, demonstrate the practical efficiency of our approach in a simulation study.
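The basic objects under selection, empirical conditional probabilities given a candidate neighborhood, can be sketched as follows for binary fields (an illustrative fragment only; the penalized selection rule, oracle inequality and cutting step of the paper are not reproduced):

    from collections import Counter

    def conditional_prob(samples, site, neighborhood):
        # Empirical estimate of P(x_site = 1 | pattern on the neighborhood).
        # samples: independent realizations as 2D 0/1 arrays (partial
        # observation is handled by restricting sites/offsets accordingly).
        # site: (i, j); neighborhood: list of (di, dj) offsets.
        counts, ones = Counter(), Counter()
        i, j = site
        for x in samples:
            pattern = tuple(int(x[i + di, j + dj]) for di, dj in neighborhood)
            counts[pattern] += 1
            ones[pattern] += int(x[i, j])
        return {p: ones[p] / counts[p] for p in counts}

Candidate neighborhoods are then compared through such estimates, with a selection rule of the oracle-inequality type rewarding fit and penalizing complexity.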
Abstract:
The aim of this paper was to study a method based on the gas production technique to measure the biological effects of tannins on rumen fermentation. Six feeds were used as fermentation substrates in a semi-automated gas method: feed A - aroeira (Astronium urundeuva); feed B - jurema preta (Mimosa hostilis); feed C - sorghum grains (Sorghum bicolor); feed D - Tifton-85 (Cynodon sp.); and two others prepared by mixing 450 g sorghum leaves, 450 g concentrate (maize and soybean meal) and 100 g of either acacia (Acacia mearnsii) tannin extract (feed E) or quebracho (Schinopsis lorentzii) tannin extract (feed F) per kg (w/w). Three assays were carried out to standardize the bioassay for tannins. The first assay compared two binding agents (polyethylene glycol - PEG - and polyvinyl polypyrrolidone - PVPP) for attenuating the tannin effects; the complex formed between PEG and tannins proved more stable than that formed with PVPP. PEG was therefore used as the binding agent in the second assay, which evaluated levels of PEG (0, 500, 750, 1000 and 1250 mg/g DM) to minimize the tannin effect. All tested levels of PEG produced a usable response, but the best response was obtained at a dose of 1000 mg/g DM. Using this dose of PEG, the final assay tested three compounds (tannic acid, quebracho extract and acacia extract) to establish a curve of biologically equivalent tannin effect; five levels of each compound were added to 1 g of a standard feed (lucerne hay). The equivalent effect proved not to be directly related to the chemical analysis for tannins, showing that different sources of tannins have different activities or reactivities. The curves of biological equivalence can provide information about tannin reactivity, and their use seems important as a complement to chemical analysis.
Abstract:
A fully automated methodology was developed for the determination of the thyroid hormones levothyroxine (T4) and liothyronine (T3). The proposed method exploits the formation of highly coloured charge-transfer (CT) complexes between these compounds, acting as electron donors, and pi-acceptors such as chloranilic acid (ClA) and 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ). For automation of the analytical procedure, a simple, fast and versatile single interface flow system (SIFA) was implemented, guaranteeing simplified performance optimisation, low maintenance and cost-effective operation. Moreover, the single reaction interface provided a convenient and straightforward means of implementing Job's method of continuous variations, used to establish the stoichiometry of the formed CT complexes. Linear calibration plots were obtained for levothyroxine and liothyronine concentrations ranging from 5.0 x 10^(-5) to 2.5 x 10^(-4) mol L^(-1) and from 1.0 x 10^(-5) to 1.0 x 10^(-4) mol L^(-1), respectively, with good precision (R.S.D. < 4.6% and < 3.9%) and a determination frequency of 26 h^(-1) for both drugs. The results obtained for pharmaceutical formulations were statistically comparable to the declared hormone amounts, with relative deviations lower than 2.1%. The accuracy was confirmed by recovery studies, which furnished recovery values ranging from 96.3% to 103.7% for levothyroxine and of 100.1% for liothyronine.
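Job's method itself reduces to locating the mole fraction at which the corrected complex signal peaks: a maximum at donor mole fraction x indicates a donor:acceptor ratio of x : (1 - x), so a 1:1 complex peaks at x = 0.5. A minimal numeric sketch of this data-analysis step (illustrative only, not the SIFA instrument control):

    import numpy as np

    def job_stoichiometry(mole_fraction, signal):
        # Job plot: total concentration held fixed, donor mole fraction varied.
        # Returns the donor:acceptor ratio implied by the signal maximum.
        x = np.asarray(mole_fraction)[np.argmax(signal)]
        return x / (1.0 - x)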
Abstract:
A fully automated multipumping flow system (MPFS) using water-soluble CdTe quantum dots (QDs) as sensitizers is proposed for the chemiluminometric determination of the anti-diabetic drugs gliclazide and glipizide in pharmaceutical formulations. The nanocrystals acted as enhancers of the weak CL emission produced upon oxidation of sulphite by Ce(IV) in acidic medium, thus improving sensitivity and expanding the dynamic analytical concentration range. By interacting with the QDs, the two analytes suppressed this sensitizing effect, quenching the chemiluminescence of the Ce(IV)-SO3^(2-)-CdTe QD system. The pulsed flow inherent to MPFS assured fast and efficient mixing of all solutions inside the flow cell, circumventing the need for a reaction coil and facilitating the monitoring of the short-lived chemiluminescent species generated. QD crystal size, concentration and the spectral region for measurement were investigated.
Abstract:
Introduction: Internet users increasingly use the worldwide web to search for information relating to their health. This situation makes it necessary to create specialized tools capable of supporting users in their searches. Objective: To apply and compare strategies developed to investigate the use of the Portuguese version of the Medical Subject Headings (MeSH) for constructing an automated classifier of Brazilian Portuguese-language web-based content as within or outside the field of healthcare, focusing on the lay public. Methods: 3658 Brazilian web pages were used to train the classifier and 606 Brazilian web pages were used to validate it. The proposed strategies were constructed using content-based vector methods for text classification, with Naive Bayes used for the task of classifying the vector patterns whose features were obtained through the proposed strategies. Results: A strategy named InDeCS was developed specifically to adapt MeSH to the problem at hand. This approach achieved the best accuracy for this pattern classification task (sensitivity, specificity and area under the ROC curve all equal to 0.94). Conclusions: Owing to the significant results achieved by InDeCS, the tool has been successfully applied to the Brazilian healthcare search portal Busca Saude. Furthermore, it was shown that MeSH yields important results when used for classifying web-based content aimed at the lay public, and that MeSH was able to map mutable, non-deterministic characteristics of the web.
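The classification core described, Naive Bayes over content-based term vectors, follows a standard pattern; the toy corpus and plain bag-of-words vectorizer below are stand-ins (the actual InDeCS features derived from MeSH are not reproduced here):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Hypothetical toy corpus: pages labeled 1 (healthcare) or 0 (other).
    pages = ["sintomas de diabetes tipo 2", "resultados do campeonato"]
    labels = [1, 0]

    clf = make_pipeline(CountVectorizer(), MultinomialNB())
    clf.fit(pages, labels)
    print(clf.predict(["tratamento de diabetes"]))  # -> [1]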
Abstract:
Transplantation of pancreatic islets constitutes a promising alternative treatment for type 1 diabetes; however, it is limited by the shortage of organ donors. Previous results from our laboratory demonstrated beneficial effects of recombinant human prolactin (rhPRL) treatment on beta cell cultures. We therefore investigated the role of rhPRL in human beta cell survival, focusing on the molecular mechanisms involved in this process. Human pancreatic islets were isolated using an automated method. Islet cultures were pre-treated in the absence or presence of rhPRL and then subjected to serum starvation or cytokine treatment. Beta cells were labelled with Newport Green and apoptosis was evaluated by flow cytometry. Levels of BCL2 gene family members were studied by quantitative RT-PCR and western blot, while caspase-8, -9 and -3 activity, as well as nitric oxide production, were evaluated by fluorimetric assays. The proportion of apoptotic beta cells was significantly lowered in the presence of rhPRL under both cell-death-inducing conditions. We also demonstrated that the cytoprotection may involve an increase in the BCL2/BAX ratio, as well as inhibition of caspases 8, 9 and 3. Our study provides relevant evidence for a protective effect of lactogens against human beta cell apoptosis. The results also suggest that the improvement in cell survival may involve, at least in part, inhibition of cell death pathways controlled by BCL2 gene family members. These findings are highly relevant for the improvement of the islet isolation procedure and for clinical islet transplantation.
Abstract:
Sequential injection analysis (SIA) is proposed for managing microvolumes of sample and arsenic species solutions for speciation analysis by capillary electrophoresis, focusing on the reduction of hazardous waste residues. An electronically controlled hydrodynamic injector was designed to introduce microvolumes of the solutions prepared by SIA into the CE capillary with precision better than 2%. The determination of arsenite, arsenate, monomethylarsonic acid, dimethylarsinic acid, and arsenobetaine was performed on 50 µL volumes of lyophilized urine and shrimp extract with the system hyphenated to inductively coupled plasma sector field mass spectrometry (CE-ICP-SFMS).
Abstract:
This paper proposes a three-stage offline approach to detect, identify, and correct series and shunt branch parameter errors. In Stage 1, the branches suspected of having parameter errors are identified through an Identification Index (II). The II of a branch is the ratio between the number of measurements adjacent to that branch whose normalized residuals are higher than a specified threshold value and the total number of measurements adjacent to that branch. Using several measurement snapshots, Stage 2 estimates the suspicious parameters in a simultaneous multiple-state-and-parameter estimation, via an augmented state and parameter estimator that extends the V-theta state vector to include the suspicious parameters. Stage 3 validates the estimates obtained in Stage 2 and is performed via a conventional weighted least squares estimator. Several simulation results (with IEEE bus systems) demonstrate the reliability of the proposed approach in dealing with single and multiple parameter errors in adjacent and non-adjacent branches, as well as in parallel transmission lines with series compensation. Finally, the proposed approach is confirmed by tests performed on the Hydro-Quebec TransEnergie network.
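Stage 1 can be coded directly from the definition above; a minimal sketch, assuming the normalized residuals of the measurements adjacent to a given branch have already been collected (a hypothetical data layout):

    def identification_index(adjacent_residuals, threshold=3.0):
        # II = (# adjacent measurements with |r_N| > threshold)
        #      / (total # of measurements adjacent to the branch).
        # Branches with a high II are flagged as suspected of parameter errors.
        n = len(adjacent_residuals)
        flagged = sum(1 for r in adjacent_residuals if abs(r) > threshold)
        return flagged / n if n else 0.0

The threshold of 3.0 is a placeholder; the paper specifies its own threshold for the normalized residuals.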
Abstract:
This paper deals with the H(infinity) recursive estimation problem for general rectangular time-variant descriptor systems in discrete time. Riccati-equation-based recursions for filtered and predicted estimates are developed based on a data-fitting approach and game theory. In this approach, nature determines a state sequence seeking to maximize the estimation cost, whereas the estimator tries to find an estimate that brings the estimation cost to a minimum. A solution exists for a specified gamma-level if the resulting cost is positive. In order to provide computational alternatives to the H(infinity) filters developed, they are also rewritten in information form, along with the respective array algorithms.
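In generic notation, the game-theoretic criterion alluded to is the standard disturbance-attenuation bound (the usual textbook form, stated here as an illustration rather than the paper's exact cost functional):

    \sup_{x_0,\, \{w_k\},\, \{v_k\}}
      \frac{\sum_k \lVert \hat{z}_k - z_k \rVert^2}
           {\lVert x_0 - \hat{x}_0 \rVert^2_{P_0^{-1}}
            + \sum_k \left( \lVert w_k \rVert^2 + \lVert v_k \rVert^2 \right)}
      \;<\; \gamma^2

Here w_k and v_k play the role of the maximizing "nature" (process and measurement disturbances), z_k is the quantity to be estimated, and P_0 weights the initial-state uncertainty; the filter guarantees the bound for a specified gamma-level whenever the associated Riccati recursion keeps the resulting cost positive.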
Abstract:
The productivity associated with commonly available disassembly methods today seldom makes disassembly the preferred end-of-life solution for massive take-back product streams: systematic reuse of parts or components, or recycling of pure material fractions, is often not achievable in an economically sustainable way. In this paper, a case-based review of current disassembly practices is used to analyse the factors influencing disassembly feasibility. Data mining techniques were used to identify the major factors influencing the profitability of disassembly operations; case characteristics such as the involvement of the product manufacturer in the end-of-life treatment and continuous ownership are among the important dimensions. Economic models demonstrate that the efficiency of disassembly operations should be increased by an order of magnitude to assure the competitiveness of ecologically preferred, disassembly-oriented end-of-life scenarios for large waste electric and electronic equipment (WEEE) streams. Technological means available to increase the productivity of disassembly operations are summarized. Automated disassembly techniques can contribute to the robustness of the process, but cannot close the efficiency gap unless combined with appropriate product design measures. Innovative reversible joints, collectively activated by external trigger signals, form a promising approach to low-cost mass disassembly in this context. A short overview of the state of the art in the development of such self-disassembling joints is included.
Abstract:
The activity of validating identified requirements for an information system helps to improve the quality of the requirements specification document and, consequently, the success of the project. Although various support tools for requirements engineering exist in the market, there is still a lack of automated support for the validation activity. In this context, the purpose of this paper is to address that deficiency by providing, through an automated tool, the resources needed to carry out an adequate validation activity. The contribution of this study is to enable an agile and effective follow-up of the scope established for the requirements, so as to lead the development toward a solution that satisfies the real needs of the users, as well as to supply project managers with relevant information about the maturity of the analysts involved in the requirements specification.