968 results for Bi-level approaches


Relevance:

30.00%

Publisher:

Abstract:

Iridium-, Ru-, and W-coated platforms were prepared by thermal treatment of the transversely heated graphite atomizer and investigated for the simultaneous determination of As, Bi, Pb, Sb, and Se in tap water by electrothermal atomic absorption spectrometry (ETAAS). The maximum pyrolysis temperature for As and Bi increased in the modifier sequence W < Ru < Ir. For Pb, Sb, and Se, this sequence was W < Ru, Ir. Calculated characteristic masses in the presence of Ir, Ru, and W were 35, 33, and 35 pg for As; 63, 51, and 52 pg for Bi; 50, 32, and 34 pg for Pb; 40, 35, and 31 pg for Sb; and 39, 39, and 93 pg for Se, respectively. Ruthenium was selected as the optimum modifier. Repeatability of the measurements was typically < 6%. Recoveries of As, Bi, Pb, Sb, and Se added to tap water samples varied from 79 to 109%. Accuracy was also checked by analysis of five certified reference materials (CRMs) from the National Institute of Standards and Technology (NIST 1640 - Trace Elements in Natural Water; NIST 1643d - Trace Elements in Water) and High Purity Standards (Trace Metals in Drinking Water Standards, lots #812708, #591107, and #710710). A paired t-test showed that the results for the CRMs were in agreement at the 95% confidence level with the certified values. The graphite tube lifetime was about 650 firings. Multi-element determination is particularly challenging due to the necessity of carefully optimizing compromise conditions. Based on the considerations listed above, the aim of this paper was to evaluate the behavior of Ir, Ru, and W as permanent modifiers for the simultaneous determination of As, Bi, Pb, Sb, and Se. The performance of the proposed procedure was also verified by the ETAAS analysis of tap waters and reference materials.

Relevance:

30.00%

Publisher:

Abstract:

Regulatory authorities in many countries, in order to maintain an acceptable balance between appropriate customer service quality and costs, are introducing performance-based regulation. These regulations impose penalties (and, in some cases, rewards) that introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the maximum continuous interruption duration (MCID) per customer. This parameter is responsible for the majority of penalties in many electric distribution utilities. This paper describes analytical and Monte Carlo simulation approaches to evaluate probability distributions of interruption duration indices. More emphasis is given to the development of an analytical method to assess the probability distribution associated with the MCID parameter and the corresponding penalties. Case studies on a simple distribution network and on a real Brazilian distribution system are presented and discussed.
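The Monte Carlo side of such an evaluation can be sketched as follows; the failure and repair models (Poisson interruption counts, exponential repair times) and all parameter values are illustrative assumptions, not the paper's actual system model.

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method: count draws until the running product drops below e^-lam.
    l = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def simulate_mcid(failure_rate, repair_mean, n_years, seed=0):
    """Simulate the maximum continuous interruption duration (MCID), in
    hours, seen by one customer in each of n_years simulated years.
    Interruption counts are Poisson(failure_rate) per year; durations are
    exponential with mean repair_mean (illustrative assumptions)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_years):
        k = poisson(rng, failure_rate)
        durations = [rng.expovariate(1.0 / repair_mean) for _ in range(k)]
        samples.append(max(durations, default=0.0))
    return samples

def penalty_probability(samples, limit_hours):
    """Estimated probability that the MCID exceeds the regulatory limit."""
    return sum(s > limit_hours for s in samples) / len(samples)
```

From the simulated MCID distribution, the expected regulatory penalty follows by weighting each exceedance by the applicable fine.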

Relevance:

30.00%

Publisher:

Abstract:

Bismuth was evaluated as an internal standard for the direct determination of Pb in vinegar by graphite furnace atomic absorption spectrometry using Ru as a permanent modifier with co-injection of Pd/Mg(NO3)2. The correlation coefficient of the graph plotted from the normalized absorbance signals of Bi versus Pb was r = 0.989. Matrix effects were evaluated by analyzing the slope ratios between the analytical curve obtained from reference solutions prepared in 0.2% (v/v) HNO3 and the analytical curves obtained from Pb additions to red and white wine vinegar samples. The calculated ratios were around 1.04 and 1.02 for analytical curves established with an internal standard and 1.3 and 1.5 for analytical curves without one. Analytical curves in the 2.5-15 µg L-1 Pb concentration interval were established using the ratio of Pb absorbance to Bi absorbance versus analyte concentration, and typical linear correlations of r = 0.999 were obtained. The proposed method was applied to the direct determination of Pb in 18 commercial vinegar samples, in which the Pb concentration varied from 2.6 to 31 µg L-1. Results were in agreement at a 95% confidence level (paired t-test) with those obtained for digested samples. Recoveries of Pb added to vinegars varied from 96 to 108% with and from 72 to 86% without an internal standard. Two water standard reference materials diluted in vinegar sample were also analyzed, and the results were in agreement with certified values at a 95% confidence level. The characteristic mass was 40 pg Pb and the useful lifetime of the tube was around 1600 firings. The limit of detection was 0.3 µg L-1, and the relative standard deviation (n = 12) for a sample containing 10 µg L-1 Pb was <= 3.8% with and <= 8.3% without an internal standard. (C) 2007 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

A new method was developed for the simultaneous determination of As, Bi, Sb, and Se by flow injection hydride generation graphite furnace atomic absorption spectrometry. An alternative two-step sample treatment procedure was used. The sample was heated (80 °C) for 10 min in 6 M HCl to reduce Se(VI) to Se(IV), followed by the addition of 1% (m/v) thiourea solution to reduce arsenic and antimony from the pentavalent to the trivalent states. With this procedure, all analytes were converted to their most favorable and sensitive oxidation states to generate the corresponding hydrides. The pre-treated sample solution was then processed in the flow system for in situ trapping and atomization in a graphite tube coated with iridium. The permanent modifier remained stable for up to 300 firings, and new coatings were possible without significant changes in the analytical performance. The accuracy was checked for As, Bi, Sb, and Se determination in water standard reference materials NIST 1640 and 1643d, and the results were in agreement with the certified values at a 95% confidence level. Good recoveries (94-104%) were also found for spiked mineral waters and synthetic mixtures of As(III), As(V), Sb(III), Sb(V), Se(IV), and Se(VI). Calculated characteristic masses were 32 pg As, 79 pg Bi, 35 pg Sb, and 130 pg Se, and the corresponding limits of detection were 0.06, 0.16, 0.19, and 0.59 µg L-1, respectively. The repeatability for a typical solution containing 5 µg L-1 As, Bi, Sb, and Se was in the 1-3% range.

Relevance:

30.00%

Publisher:

Abstract:

Software Transactional Memory (STM) systems have poor performance under high-contention scenarios. Since many transactions compete for the same data, most of them are aborted, wasting processor runtime. Contention management policies are typically used to avoid this, but they are passive approaches, as they wait for an abort to happen before taking action. More proactive approaches have emerged that try to predict when a transaction is likely to abort so that its execution can be delayed. Such techniques are limited, as they do not replace the doomed transaction with another or, when they do, they rely on the operating system for that, having little or no control over which transaction should run. In this paper we propose LUTS, a Lightweight User-Level Transaction Scheduler, which is based on an execution-context-record mechanism. Unlike other techniques, LUTS provides the means for selecting another transaction to run in parallel, thus improving system throughput. Moreover, it avoids most of the issues caused by pseudo-parallelism, as it launches only as many system-level threads as there are available processor cores. We discuss the LUTS design and present three conflict-avoidance heuristics built around its scheduling capabilities. Experimental results, obtained with the STMBench7 and STAMP benchmark suites, show LUTS's efficiency when running high-contention applications and how conflict-avoidance heuristics can improve STM performance even further. In fact, our transaction scheduling techniques are capable of improving program performance even in overloaded scenarios. © 2011 Springer-Verlag.
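The conflict-avoidance idea can be illustrated with a toy user-level scheduler; the class names, the read/write-set conflict predictor, and the rotation policy below are hypothetical simplifications, not the actual LUTS implementation.

```python
from collections import deque

class Transaction:
    def __init__(self, tid, read_set, write_set):
        self.tid = tid
        self.read_set = frozenset(read_set)
        self.write_set = frozenset(write_set)

    def conflicts_with(self, other):
        # Two transactions conflict if one writes what the other reads or writes.
        return bool(self.write_set & (other.read_set | other.write_set)
                    or other.write_set & self.read_set)

class UserLevelScheduler:
    """Sketch of a LUTS-like policy: instead of delaying a transaction
    predicted to conflict, pick another ready transaction to run."""

    def __init__(self):
        self.ready = deque()
        self.running = []

    def submit(self, tx):
        self.ready.append(tx)

    def next_transaction(self):
        # Scan the ready queue for the first transaction that does not
        # conflict with any currently running one; fall back to the head.
        for _ in range(len(self.ready)):
            tx = self.ready.popleft()
            if not any(tx.conflicts_with(r) for r in self.running):
                self.running.append(tx)
                return tx
            self.ready.append(tx)  # rotate: try this one again later
        if self.ready:
            tx = self.ready.popleft()
            self.running.append(tx)
            return tx
        return None

    def finished(self, tx):
        self.running.remove(tx)
```

The key point mirrors the abstract: a doomed transaction is replaced by a non-conflicting one at user level, without involving the operating system.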

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the potential benefits and challenges of regionally managed e-government development initiatives. It examines the current state of e-government in four Caribbean countries – Barbados, Jamaica, Saint Vincent and the Grenadines, and Trinidad and Tobago – in order to establish a broader understanding of the challenges that face e-government initiatives in the region. It also reviews a number of e-government initiatives that have been undertaken through projects managed at a regional level. Based on this analysis, it presents a set of best practices that are recommended to agencies engaged in the task of coordinating the implementation of regionally-based e-government initiatives.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Objective: To test six variations of the Goldberg equation for evaluating the underreporting of energy intake (EI) among obese women on the waiting list for bariatric surgery, considering variations in resting metabolic rate (RMR), physical activity, and food intake levels in group and individual approaches.
Methods: One hundred obese women aged 20 to 45 years (33.3 ± 6.08) recruited from a bariatric surgery waiting list participated in the study. The underreporting assessment was based on whether the ratio between reported energy intake and the RMR measured by indirect calorimetry (rEI:RMR) was compatible with the predicted physical activity level (PAL). Six approaches were used for defining the cutoff points. The approaches took into account variances in the components of the rEI:RMR = PAL equation as a function of the assumed PAL, sample size (n), and measured or estimated RMR.
Results: The underreporting percentage varied from 55% to 97%, depending on the approach used for generating the cutoff points. The rEI:RMR ratio and the estimated PAL of the sample were significantly different (p = 0.001). Sixty-one percent of the women reported an EI lower than their RMR. The PAL variable significantly affected the cutoff point, leading to different proportions of underreporting. Using the measured or the estimated RMR in the equation did not result in differences in the proportion of underreporting. The individual approach was less sensitive than the group approach.
Conclusion: RMR did not interfere with underreporting estimates. However, PAL variations were responsible for significant differences in the cutoff point. Thus, PAL should be considered when estimating underreporting, and even though the individual approach is less sensitive than the group approach, it may be a useful tool for clinical practice.
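A minimal sketch of how Goldberg cutoff points respond to the assumed PAL and the sample size, using the commonly cited form of the confidence limits; the default coefficients of variation are assumptions taken from the general literature, not this study's values.

```python
import math

def goldberg_cutoffs(pal, n, d, cv_ei=23.0, cv_rmr=8.5, cv_pal=15.0, z=1.96):
    """Lower/upper 95% cutoffs for the rEI:RMR ratio under the Goldberg
    approach.

    pal    : assumed physical activity level
    n      : number of subjects (n = 1 for the individual approach)
    d      : number of days of dietary assessment
    cv_ei  : within-subject CV (%) of energy intake (assumed default)
    cv_rmr : CV (%) of the RMR (about 8.5 if estimated, 4 if measured)
    cv_pal : CV (%) of the PAL (assumed default)
    """
    s = math.sqrt(cv_ei ** 2 / d + cv_rmr ** 2 + cv_pal ** 2)
    factor = z * s / 100.0 / math.sqrt(n)
    # A reported rEI:RMR below the lower cutoff flags underreporting.
    return pal * math.exp(-factor), pal * math.exp(factor)
```

With n = 1 the interval widens considerably, which is one way to see why the individual approach is less sensitive than the group approach.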

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Neurodegenerative disorders are undoubtedly an increasing problem in the health sciences, given the increase in life expectancy and, in some cases, unhealthy lifestyles. Despite the fact that the mechanisms of such diseases are far from being completely understood, a large number of studies deriving from both basic science and clinical approaches have contributed substantial data in that direction. In this review, we discuss several frontiers of basic research on Parkinson's and Alzheimer's diseases in which research groups from three departments of the Institute of Biomedical Sciences of the University of Sao Paulo have been involved in a multidisciplinary effort. The main focus of the review is the animal models that have been developed to study cellular and molecular aspects of those neurodegenerative diseases, including oxidative stress, insulin signaling, and proteomic analyses, among others. We anticipate that this review will help the group determine future directions of joint research in the field and, more importantly, set the level of cooperation we plan to develop in collaboration with colleagues of the Nucleus for Applied Neuroscience Research who are mostly involved with clinical research in the same field.

Relevance:

30.00%

Publisher:

Abstract:

Background and Objective: Muscle regeneration is a complex phenomenon involving the coordinated activation of several cellular responses. During this process, oxidative stress and consequent tissue damage occur with a severity that may depend on the intensity and duration of the inflammatory response. Among the therapeutic approaches to attenuate inflammation and increase tissue repair, low-level laser therapy (LLLT) may be a safe and effective clinical procedure. The aim of this study was to evaluate the effects of LLLT on oxidative/nitrative stress and inflammatory mediators produced during a cryolesion of the tibialis anterior (TA) muscle in rats.
Material and Methods: Sixty Wistar rats were randomly divided into three groups (n = 20): control (BC), injured TA muscle without LLLT (IC), and injured TA muscle submitted to LLLT (IRI). The injured region was irradiated daily for 4 consecutive days, starting immediately after the lesion, using an AlGaAs laser (continuous wave, 808 nm, tip area 0.00785 cm2, power 30 mW, application time 47 seconds, fluence 180 J/cm2, 3.8 W/cm2, total energy 1.4 J). The animals were sacrificed on the fourth day after injury.
Results: LLLT reduced oxidative and nitrative stress in injured muscle, decreasing lipid peroxidation, nitrotyrosine formation, and NO production, probably due to a reduction in iNOS protein expression. Moreover, LLLT increased SOD gene expression and decreased the inflammatory response as measured by the gene expression of NF-κB and COX-2 and by TNF-α and IL-1β concentrations.
Conclusion: These results suggest that LLLT could be an effective therapeutic approach for modulating oxidative and nitrative stress and reducing inflammation in injured muscle. Lasers Surg. Med. 44:726-735, 2012. (c) 2012 Wiley Periodicals, Inc.

Relevance:

30.00%

Publisher:

Abstract:

Background: The generalized odds ratio (GOR) was recently suggested as a genetic model-free measure for association studies. However, its properties have not been extensively investigated. We used Monte Carlo simulations to investigate type-I error rates, power, and bias in both effect-size and between-study variance estimates of meta-analyses using the GOR as a summary effect, and compared these results to those obtained by the usual approaches of model specification. We further applied the GOR in a real meta-analysis of three genome-wide association studies in Alzheimer's disease.
Findings: For bi-allelic polymorphisms, the GOR performs virtually identically to a standard multiplicative model of analysis (e.g. the per-allele odds ratio) for variants acting multiplicatively, but slightly augments the power to detect variants with a dominant mode of action, while reducing the probability of detecting recessive variants. Although there were differences between the GOR and the usual approaches in terms of bias and type-I error rates, both simulation- and real-data-based results provided little indication that these differences will be substantial in practice for meta-analyses involving bi-allelic polymorphisms. However, the use of the GOR may be slightly more powerful for the synthesis of data from tri-allelic variants, particularly when susceptibility alleles are less common in the populations (≤10%). This gain in power may depend on knowledge of the direction of the effects.
Conclusions: For the synthesis of data from bi-allelic variants, the GOR may be regarded as a multiplicative-like model of analysis. The use of the GOR may be slightly more powerful in the tri-allelic case, particularly when susceptibility alleles are less common in the populations.
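For ordered genotype counts, the GOR can be computed directly from a 2 x 3 table; this is a sketch of the concordance/discordance form of the statistic, ignoring ties, and is not necessarily the exact estimator used in the paper.

```python
def generalized_odds_ratio(cases, controls):
    """Genetic model-free GOR for ordered genotype counts, e.g.
    [aa, Aa, AA]: the probability that a random case carries a higher
    mutational load than a random control, divided by the reverse
    probability (ties ignored; sketch, not the paper's exact estimator)."""
    higher = sum(c * k
                 for i, c in enumerate(cases)
                 for j, k in enumerate(controls) if i > j)
    lower = sum(c * k
                for i, c in enumerate(cases)
                for j, k in enumerate(controls) if i < j)
    return higher / lower
```

Because the counts are simply ordered, the same function applies unchanged to tri-allelic (or longer) exposure scales, which is where the abstract notes a possible power gain.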

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows quantifying the expression of thousands of genes simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the group of individuals. Even though methods to analyze the data are now well developed and close to reaching a standard organization (through the effort of proposed international projects like the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble upon a clinician's question for which no compelling statistical method yet exists. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1 starts from a necessary biological introduction and reviews microarray technologies and all the important steps of an experiment, from the production of the array to quality controls, ending with the preprocessing steps used in the data analyses in the rest of the dissertation.
Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, the experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for each of the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles in real unbalanced data. Chapter 4 describes a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4].
In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only thing that can play a crucial role. In some cases similarities can give useful, and sometimes even more important, information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [4] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a tumor-grade three-class problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3 and then evaluate the third class, G2, as a test set to obtain the probability for samples of G2 to be members of class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to that of breast cancer samples of grade 1. This result had been guessed in the literature, but no measure of significance was given before.
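The MultiSAM resampling loop described above can be sketched as follows; a plain standardized mean difference stands in for the real SAM statistic, and the data layout (each class as a list of samples, each sample a list of probe intensities) is an assumption.

```python
import random
import statistics

def multisam_scores(lpc, mpc, n_iter=1000, threshold=2.0, seed=0):
    """Score each probe by how often it appears differentially expressed
    when the less populated class (lpc) is compared against random
    same-size subsets of the more populated class (mpc). A simple
    standardized mean difference replaces the SAM statistic (assumption)."""
    rng = random.Random(seed)
    n_probes = len(lpc[0])
    scores = [0] * n_probes
    for _ in range(n_iter):
        subset = rng.sample(mpc, len(lpc))  # balanced random subset of the MPC
        for p in range(n_probes):
            a = [sample[p] for sample in lpc]
            b = [sample[p] for sample in subset]
            spread = statistics.pstdev(a + b) or 1e-9
            stat = (statistics.mean(a) - statistics.mean(b)) / spread
            if abs(stat) > threshold:
                scores[p] += 1  # probe recurs as differentially expressed
    return scores
```

Probes with a score near n_iter are stably different regardless of which MPC subset is drawn, which is the robustness argument behind the 1,000 reiterations.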

Relevance:

30.00%

Publisher:

Abstract:

The focus of this research is to develop and apply an analytical framework for evaluating the effectiveness and practicability of sustainability certification schemes for biofuels, especially from a developing country's perspective. The main question driving the research analysis is: "Which are the main elements of, and how does one develop, sustainability certification schemes that would be effective and practicable in certifying the contribution of biofuels to meeting the goals that Governments and other stakeholders have set up?". Biofuels have been identified as a promising tool for reaching a variety of goals: climate change protection, energy security, agricultural development, and, especially in developing countries, economic development. Once the goals had been identified and ambitious mandatory targets for biofuel use agreed at the national level, concerns were raised by the scientific community about the negative externalities that biofuel production and use can have at the environmental, social, and economic levels. Certification schemes have therefore been recognized as necessary processes to measure these externalities, and examples of such schemes are in effect, or in a negotiating phase, at both mandatory and voluntary levels. The research focus emerged from the concern that the ongoing examples are very demanding in terms of compliance, both for those that are subject to certification and those that have to certify, regarding the quantity and quality of information to be reported. A certification system, for reasons linked to costs, lack of expertise, inadequate infrastructure, and the absence of administrative and legislative support, can represent a heavy burden and act as a serious impediment to the industrial and agricultural development of developing countries, going against the principles of equity and a level playing field.
While this research recognizes the importance of comprehensiveness and ambition in designing an important tool for measuring the sustainability effects of biofuel production and use, it stresses the need to focus on the effectiveness and practicability of this tool in measuring compliance with the goal. This research, which falls under the rationale of the Sustainability Science Program housed at the Harvard Kennedy School, has as its main objective to close the gap between the research and policy-making worlds in the field of sustainability certification schemes for biofuels.