985 results for "Partial oxalate method"
Abstract:
We show that if performance measures in a stochastic scheduling problem satisfy a set of so-called partial conservation laws (PCL), which extend previously studied generalized conservation laws (GCL), then the problem is solved optimally by a priority-index policy for an appropriate range of linear performance objectives, where the optimal indices are computed by a one-pass adaptive-greedy algorithm based on Klimov's. We further apply this framework to investigate the indexability property of restless bandits introduced by Whittle, obtaining the following results: (1) we identify a class of restless bandits (PCL-indexable) which are indexable; membership in this class is tested through a single run of the adaptive-greedy algorithm, which also computes the Whittle indices when the test is positive; this provides a tractable sufficient condition for indexability; (2) we further identify the class of GCL-indexable bandits, which includes classical bandits, having the property that they are indexable under any linear reward objective. The analysis is based on the so-called achievable region method, as the results follow from new linear programming formulations for the problems investigated.
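To make the notion of a priority-index policy concrete, here is a minimal, hypothetical Python sketch: once indices have been computed (for example, Whittle indices produced by the adaptive-greedy algorithm), the policy simply activates the project whose current state carries the largest index at every decision epoch. The project names, states, and index values below are illustrative placeholders, not outputs of the paper's algorithm.

```python
def priority_index_policy(project_states, index_table):
    """Pick the project to activate under a priority-index policy.

    project_states: dict mapping project -> current state.
    index_table: dict mapping (project, state) -> precomputed priority index.
    Returns the project whose current state has the largest index."""
    return max(project_states, key=lambda proj: index_table[(proj, project_states[proj])])

# Illustrative usage with placeholder indices (not Whittle indices from any real model):
index_table = {("A", 0): 0.3, ("A", 1): 0.9, ("B", 0): 0.5, ("B", 1): 0.7}
states = {"A": 1, "B": 0}
print(priority_index_policy(states, index_table))  # -> "A"
```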
Abstract:
BACKGROUND: Hyperoxaluria is a major risk factor for kidney stone formation. Although urinary oxalate measurement is part of every basic stone risk assessment, there is no standardized method for this measurement. METHODS: Urine samples from 24-h urine collections covering a broad range of oxalate concentrations were aliquoted and sent, in duplicate, to six blinded international laboratories for oxalate, sodium and creatinine measurement. In a second set of experiments, ten pairs of native urine and urine spiked with 10 mg/L of oxalate were sent for oxalate measurement. Three laboratories used a commercially available oxalate oxidase kit, two laboratories used a high-performance liquid chromatography (HPLC)-based method and one laboratory used both methods. RESULTS: Intra-laboratory reliability for oxalate measurement, expressed as the intraclass correlation coefficient (ICC), varied between 0.808 [95% confidence interval (CI): 0.427-0.948] and 0.998 (95% CI: 0.994-1.000), with lower values for HPLC-based methods. Acidification of urine samples prior to analysis led to significantly higher oxalate concentrations. The ICC for inter-laboratory reliability varied between 0.745 (95% CI: 0.468-0.890) and 0.986 (95% CI: 0.967-0.995). Recovery of the 10 mg/L oxalate-spiked samples varied between 8.7 ± 2.3 and 10.7 ± 0.5 mg/L. Overall, HPLC-based methods showed more variability than the oxalate oxidase kit-based methods. CONCLUSIONS: Significant variability was noted in the quantification of urinary oxalate concentration by different laboratories, which may partially explain the differences in hyperoxaluria prevalence reported in the literature. Our data stress the need for standardization of the method of oxalate measurement.
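For readers unfamiliar with the reliability metric reported, the sketch below computes a single-measure, two-way random-effects ICC (Shrout-Fleiss ICC(2,1)) from a samples-by-replicates matrix using NumPy; the numbers are hypothetical and the abstract does not state which ICC variant the laboratories used.

```python
import numpy as np

def icc_2_1(y):
    """Two-way random-effects, absolute-agreement, single-measure ICC (Shrout-Fleiss ICC(2,1)).
    y: array of shape (n_samples, n_raters), e.g. duplicate oxalate measurements per laboratory."""
    y = np.asarray(y, dtype=float)
    n, k = y.shape
    grand = y.mean()
    msr = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-samples mean square
    msc = n * ((y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-replicates mean square
    sse = ((y - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))                              # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical duplicate oxalate measurements (mg/L) for four urine samples:
print(round(icc_2_1([[12.1, 12.4], [30.2, 29.8], [55.0, 54.1], [8.9, 9.3]]), 3))
```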
Abstract:
The objective of this study was to extract and concentrate calcium oxalate (CaOx) crystals from the leaves of plants that form such crystals. Chemical and physical studies of plant CaOx depend on an adequate amount of the crystals being available. The plant used in this study was croton (Codiaeum variegatum). The leaves were ground in a heavy-duty blender and sieved through a 0.20 mm sieve, and the sieved material was suspended in distilled water. The crystals were concentrated at the bottom of a test tube, and washing was repeated until the supernatant was free of plant pigments and other organic substances. The biogenic CaOx crystals showed well-defined and sharp peaks, indicating very high crystallinity. Moreover, the CaOx crystals were not damaged during the extraction procedure, as can be seen in the scanning electron microscope images. The proposed method can therefore be considered efficient for extracting and concentrating biogenic calcium oxalate.
Abstract:
The development of liquid-crystal panels for use in commercial equipment has been aimed at improving pixel resolution and display efficiency. Among other outcomes, these improvements have led to a reduction in the thickness of such devices, which entails a loss of phase modulation. We propose a modification of the classical phase-only filter that permits its display on VGA liquid-crystal panels with constant amplitude modulation and less than 2π of phase modulation. The method was tested experimentally in an optical setup.
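As background for the modification proposed above, the classical phase-only filter retains only the (conjugate) phase of the reference image's Fourier spectrum; a minimal NumPy sketch is given below. The paper's specific adaptation to panels with constant amplitude modulation and less than 2π of phase modulation is not reproduced here.

```python
import numpy as np

def classical_phase_only_filter(reference):
    """Classical phase-only filter: unit-amplitude conjugate phase of the
    reference spectrum, H = exp(-i * arg F), where F = FFT2(reference)."""
    F = np.fft.fft2(reference)
    return np.exp(-1j * np.angle(F))

def correlate(scene, pof):
    """Correlation of a scene with the filter; a sharp peak marks the reference location."""
    return np.abs(np.fft.ifft2(np.fft.fft2(scene) * pof))
```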
Abstract:
OBJECTIVES: To assess the long-term outcome, safety, and efficacy of partial cricotracheal resection (PCTR) for subglottic stenosis in a group of children and infants weighing less than 10 kg at the time of surgery. STUDY DESIGN: Historical cohort study. SETTING: Academic tertiary medical center. SUBJECTS AND METHODS: Thirty-six children weighing less than 10 kg at the time of surgery were compared to a group of 65 children who weighed more than 10 kg. Kaplan-Meier analysis and Cox regression were carried out to detect differences in decannulation times and rates and to examine the influence of various parameters (i.e., comorbidities, type of surgery, and complications requiring revision surgery) on time to decannulation. Evaluation of the long-term outcome was based on questionnaires assessing breathing, voice, and swallowing. RESULTS: The decannulation rate was 92 percent (33/36) in the group of children weighing less than 10 kg. No significant differences were found between the two body weight groups with respect to the aforementioned covariates. The median follow-up period was nine years (range, 1-23 years). Questionnaire responses revealed completely normal breathing and swallowing in 72 percent and 90 percent of the children, respectively. Seventy-one percent of the patients considered their voice to be rough or weak. CONCLUSION: PCTR in infants and children weighing less than 10 kg is a safe and efficient technique with long-term results similar to those seen in older and heavier children.
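A hypothetical sketch of the type of time-to-decannulation analysis described (Kaplan-Meier estimation and a Cox proportional hazards model), using the Python lifelines package; all data values and column names are invented.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Invented example data; 0 in "decannulated" means still cannulated (censored).
df = pd.DataFrame({
    "months_to_decannulation": [3, 7, 12, 5, 30, 9, 14, 6],
    "decannulated":            [1, 1,  1, 1,  0, 1,  1, 1],
    "weight_under_10kg":       [1, 1,  1, 0,  0, 0,  1, 0],
    "comorbidity":             [1, 0,  1, 0,  1, 0,  0, 1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months_to_decannulation"], event_observed=df["decannulated"])
print(kmf.survival_function_)   # estimated probability of remaining cannulated over time

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_decannulation", event_col="decannulated")
cph.print_summary()             # hazard ratios for weight group and comorbidity
```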
Abstract:
The ability of a PCR-based restriction fragment length polymorphism (RFLP) analysis of the cytochrome b gene (mtDNA) to distinguish Apodemus alpicola from two other Apodemus species was investigated. Partial sequencing of cytochrome b allowed the identification of one restriction enzyme as potentially diagnostic. This was supported by an analysis of 131 specimens previously identified using morphometric and/or allozymic data, indicating that the PCR-based RFLP method provides a rapid and reliable tool for distinguishing A. alpicola from its two co-occurring congeners. The method is applicable to samples taken in the field for ecological studies, and could easily be adapted to the identification of museum samples.
Abstract:
PURPOSE: Effective cancer treatment generally requires combination therapy. The combination of external beam therapy (XRT) with radiopharmaceutical therapy (RPT) requires accurate three-dimensional dose calculations to avoid toxicity and evaluate efficacy. We have developed and tested a treatment planning method, using the patient-specific three-dimensional dosimetry package 3D-RD, for sequentially combined RPT/XRT therapy designed to limit toxicity to organs at risk. METHODS AND MATERIALS: The biologic effective dose (BED) was used to translate voxelized RPT absorbed dose (D(RPT)) values into a normalized total dose (or equivalent 2-Gy-fraction XRT absorbed dose), NTD(RPT), map. The BED was calculated numerically using an algorithmic approach, which enabled a more accurate calculation of BED and NTD(RPT). A combined samarium-153 RPT and external beam treatment plan was designed to deliver a tumoricidal dose while delivering no more than 50 Gy of NTD(sum) to the spinal cord of a patient with a paraspinal tumor. RESULTS: The average voxel NTD(RPT) to tumor from RPT was 22.6 Gy (range, 1-85 Gy); the maximum spinal cord voxel NTD(RPT) from RPT was 6.8 Gy. The combined-therapy NTD(sum) to tumor was 71.5 Gy (range, 40-135 Gy) for a maximum voxel spinal cord NTD(sum) equal to the maximum tolerated dose of 50 Gy. CONCLUSIONS: A method that enables real-time treatment planning of combined RPT-XRT has been developed. By implementing a more generalized conversion between the dose values from the two modalities and an activity-based treatment of partial-volume effects, the reliability of combination-therapy treatment planning has been improved.
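For orientation, the standard linear-quadratic relations that underlie a conversion of this kind are sketched below in their textbook fractionated-XRT form; this does not reproduce the paper's voxel-wise numerical algorithm or its treatment of the RPT dose-rate pattern, and the alpha/beta value in the example is an assumed placeholder.

```python
def bed_fractionated(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose of a fractionated XRT course (LQ model):
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

def ntd_2gy(bed, alpha_beta):
    """Normalized total dose in 2-Gy fractions (EQD2): NTD = BED / (1 + 2 / (alpha/beta))."""
    return bed / (1.0 + 2.0 / alpha_beta)

# Example with an assumed alpha/beta of 3 Gy (placeholder value for late spinal cord effects):
bed = bed_fractionated(n_fractions=25, dose_per_fraction=2.0, alpha_beta=3.0)
print(ntd_2gy(bed, alpha_beta=3.0))  # 50 Gy, since this course is already in 2-Gy fractions
```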
Abstract:
The reconstruction of three-dimensional objects is one of the most challenging problems in computer vision, because the three-dimensional distances of objects cannot be determined from a single two-dimensional image. The problem can be solved with stereo vision, in which the three-dimensional structure of a scene is inferred from several images. However, this approach only allows reconstruction of those parts of the objects that are visible in at least two images; reconstruction of occluded parts is not possible with stereo vision alone. In this work, a new method has been developed for the reconstruction of partially occluded three-dimensional planar objects. Using two images of an object composed of planar surfaces, the method can determine the shape and position of the object with good accuracy. The method is based on epipolar geometry, which is used to determine the parts of the objects that are visible in both images. The reconstruction of partially occluded features is then performed using stereo vision together with knowledge of the object's structure. The proposed solution could be used, for example, for the visualization of three-dimensional objects, robot navigation, or object recognition.
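As a rough illustration of the generic two-view machinery the work builds on (not the thesis's own algorithm for occluded planar parts), the OpenCV sketch below estimates the fundamental matrix from matched points and triangulates the points visible in both images; the matched point arrays and the projection matrices are assumed to be given.

```python
import numpy as np
import cv2

def reconstruct_visible_points(pts1, pts2, P1, P2):
    """pts1, pts2: Nx2 arrays of matched pixel coordinates in images 1 and 2.
    P1, P2: 3x4 camera projection matrices (assumed known from calibration).
    Returns the estimated fundamental matrix and the triangulated 3-D points."""
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
    good1 = pts1[inlier_mask.ravel() == 1]
    good2 = pts2[inlier_mask.ravel() == 1]
    # Triangulation expects points of shape (2, N); the result is homogeneous (4, N).
    X_h = cv2.triangulatePoints(P1, P2, good1.T.astype(float), good2.T.astype(float))
    X = (X_h[:3] / X_h[3]).T   # back to Euclidean 3-D coordinates
    return F, X
```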
Abstract:
In this work we analyze the behavior of complex information in the Fresnel domain, taking into account the limited capability of current liquid-crystal devices to display complex transmittance values when used as holographic displays. To carry out this analysis we compute the reconstruction of Fresnel holograms at several distances using the different parts of the complex distribution (real and imaginary parts, amplitude and phase), as well as using the full complex information adjusted with a method that combines two configurations of the devices in an adding architecture. The RMS error between the amplitude of these reconstructions and the original amplitude is used to evaluate the quality of the information displayed. The results of the error analysis show different behavior for the reconstructions using the different parts of the complex distribution and using the combined method with two devices. Better reconstructions are obtained when using two devices whose configurations densely cover the complex plane when they are added. Simulated and experimental results are also presented.
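As a rough numerical companion to the analysis described (a simplified sketch, not the authors' exact sampling or normalization), the NumPy code below performs a single-FFT Fresnel propagation of a hologram field and computes the RMS error between the reconstructed and the original amplitudes.

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, dx):
    """Single-FFT Fresnel propagation of a sampled complex field over distance z.
    Assumes a square n x n field with sample pitch dx; constant phase factors are dropped."""
    n = field.shape[0]
    x = (np.arange(n) - n // 2) * dx
    X, Y = np.meshgrid(x, x)
    k = 2 * np.pi / wavelength
    chirp = np.exp(1j * k * (X**2 + Y**2) / (2 * z))
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field * chirp)))

def rms_amplitude_error(reconstruction, original):
    """RMS error between the peak-normalized amplitudes of reconstruction and original."""
    a = np.abs(reconstruction); a = a / a.max()
    b = np.abs(original);       b = b / b.max()
    return np.sqrt(np.mean((a - b) ** 2))
```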
Abstract:
The partial least squares technique (PLS) has been touted as a viable alternative to latent variable structural equation modeling (SEM) for evaluating theoretical models in the differential psychology domain. We bring some balance to the discussion by reviewing the broader methodological literature to highlight: (1) the misleading characterization of PLS as an SEM method; (2) limitations of PLS for global model testing; (3) problems in testing the significance of path coefficients; (4) extremely high false positive rates when using empirical confidence intervals in conjunction with a new "sign change correction" for path coefficients; (5) misconceptions surrounding the supposedly superior ability of PLS to handle small sample sizes and non-normality; and (6) conceptual and statistical problems with formative measurement and the application of PLS to such models. Additionally, we reanalyze the dataset provided by Willaby et al. (2015; doi:10.1016/j.paid.2014.09.008) to highlight the limitations of PLS. Our broader review and analysis of the available evidence make it clear that PLS is not useful for statistical estimation and testing.
Abstract:
In order to develop a molecular method for detection and identification of Xanthomonas campestris pv. viticola (Xcv), the causal agent of grapevine bacterial canker, primers were designed based on the partial sequence of the hrpB gene. Primer pairs Xcv1F/Xcv3R and RST2/Xcv3R, which amplified 243- and 340-bp fragments, respectively, were tested for specificity and sensitivity in detecting DNA from Xcv. Amplification was positive with DNA from 44 Xcv strains and with DNA from four strains of X. campestris pv. mangiferaeindicae and five strains of X. axonopodis pv. passiflorae, with both primer pairs. However, enzymatic digestion of the PCR products could differentiate Xcv strains from the others. Neither primer pair amplified DNA from grapevine, from 20 strains of nonpathogenic bacteria from grape leaves, or from 10 strains belonging to six representative genera of plant-pathogenic bacteria. The sensitivity of primers Xcv1F/Xcv3R and RST2/Xcv3R was 10 pg and 1 pg of purified Xcv DNA, respectively. The detection limit of primers RST2/Xcv3R was 10⁴ CFU/ml, but this limit could be lowered to 10² CFU/ml with a second round of amplification using the internal primer Xcv1F. The presence of Xcv in tissues of grapevine petioles previously inoculated with Xcv could not be detected by PCR using macerated extract added directly to the reaction. However, amplification was positive with the introduction of an agar plating step prior to PCR. Xcv could be detected in 1 µl of the plate wash and in a cell suspension obtained from a single colony. The identity of the bacterium was confirmed by RFLP analysis of the RST2/Xcv3R amplification products digested with HaeIII.
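To illustrate how enzymatic digestion of the PCR products can differentiate strains, the plain-Python sketch below performs an in-silico HaeIII digest (recognition site GGCC, blunt cut between GG and CC) and reports fragment lengths; the example sequence is invented and much shorter than the real 243- and 340-bp amplicons.

```python
def haeiii_fragments(seq):
    """Return fragment lengths after an in-silico HaeIII digest.
    HaeIII recognizes GGCC and cuts bluntly between GG and CC (GG^CC)."""
    seq = seq.upper()
    cuts, start = [], 0
    while True:
        i = seq.find("GGCC", start)
        if i == -1:
            break
        cuts.append(i + 2)   # cut position between GG and CC
        start = i + 1
    edges = [0] + cuts + [len(seq)]
    return [edges[j + 1] - edges[j] for j in range(len(edges) - 1)]

# Invented 20-bp example; a real analysis would use the amplified fragments.
print(haeiii_fragments("ATGGCCTTAAGGCCAGGTCA"))  # -> [4, 8, 8]
```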
Abstract:
The background and inspiration for the present study is earlier research on applications of boundary identification in the metal industry. Effective boundary identification allows smaller safety margins and longer service intervals for the equipment in industrial high-temperature processes, without an increased risk of equipment failure. Ideally, a boundary identification method would be based on monitoring some indirect variable that can be measured routinely or at low cost. For smelting furnaces, one such variable is the temperature at different positions in the wall, which can be used as the input signal to a boundary identification method for monitoring the wall thickness of the furnace. We give background and motivation for choosing the geometrically one-dimensional dynamic model for boundary identification, discussed in the later part of the work, over a multi-dimensional geometric description. In the industrial applications in question, the dynamics and the advantages of a simple model structure are more important than an exact geometric description. Solution methods for the so-called sideways heat conduction equation have much in common with boundary identification. We therefore study properties of the solutions of this equation, the influence of measurement errors and what is usually called contamination by measurement noise, regularization, and more general consequences of the ill-posedness of the sideways heat conduction equation. We study a set of three different methods for boundary identification, of which the first two are developed from a strictly mathematical starting point and the third from a more applied one. The methods have different properties with specific advantages and disadvantages. The purely mathematically based methods are characterized by good accuracy and low numerical cost, but at the price of low flexibility in the formulation of the partial differential equation describing the model. The third, more applied, method is characterized by poorer accuracy caused by a higher degree of ill-posedness of the more flexible model. For this method an error estimate was also attempted, which was later observed to agree with practical computations using the method. The study can be regarded as a good starting point and mathematical basis for developing industrial applications of boundary identification, especially toward handling nonlinear and discontinuous material properties and sudden changes caused by wall material that falls off. With the methods treated here, it appears possible to achieve a robust, fast and sufficiently accurate boundary identification method of limited complexity.
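For reference, below is a standard formulation of the sideways heat conduction problem mentioned above, written with notation chosen here for illustration rather than taken from the thesis: temperature and heat flux are measured at an accessible interior position and the solution is continued toward the inaccessible boundary (e.g., the eroding furnace wall), which is what makes the problem ill-posed.

```latex
% Sideways (non-characteristic Cauchy) heat conduction problem:
% temperature g(t) and heat flux h(t) are measured at the accessible position x_m,
% and u(x,t) is sought for x_m < x <= L, up to the inaccessible boundary x = L.
\begin{aligned}
  u_t &= a\,u_{xx}, && x_m < x < L,\ t > 0,\\
  u(x_m, t) &= g(t), \qquad u_x(x_m, t) = h(t), && t > 0.
\end{aligned}
```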
Abstract:
Oxidized starch is a key component in the paper industry, where it is used as both a surface sizing agent and a filler. Large quantities are used annually for this purpose; however, the existing oxidation methods are not environmentally friendly. In our research, we have studied the possibility of replacing harmful oxidation agents, such as hypochlorite or iodates and transition metal catalysts, with a more environmentally friendly oxidant, hydrogen peroxide (H2O2), and a special metal complex catalyst (FePcS), of which only a small amount is needed. The work comprised batch and semi-batch oxidation studies with H2O2, ultrasound studies of starch particles, determination of low-molecular by-products and determination of the decomposition kinetics of H2O2 in the presence of starch and the catalyst. This resulted in a waste-free oxidation method, which produces only water and oxygen as side products. The starch oxidation was studied in both semi-batch and batch modes with respect to the oxidant (H2O2) addition. The semi-batch mode proved to yield a degree of substitution (COOH groups) sufficient for industrial purposes. Treatment of starch granules by ultrasound was found to improve the reactivity of the starch. The kinetic results were found to follow a rather complex pattern: several oxidation phases were observed, apparently because at the beginning the oxidation reaction took place only on the surface, whereas after a prolonged reaction time, partial degradation of the solid starch granules allowed further reaction in the interior parts. Batch-mode experiments enabled a more detailed study of the oxidation mechanisms of starch in the presence of H2O2 and the catalyst, but yielded less-oxidized starch owing to the rapid decomposition of H2O2 at high concentrations. The effect of the solid-liquid (S/L) ratio in the reaction system was studied in batch experiments. These studies revealed that the presence of the catalyst and the starch enhances the H2O2 decomposition.
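As a minimal illustration of how decomposition kinetics of the kind mentioned can be quantified, the sketch below fits a pseudo-first-order rate constant to hypothetical H2O2 concentration data with SciPy; the study itself reports a more complex, multi-phase pattern, so this is only a baseline model and the data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, k):
    """Pseudo-first-order decay: C(t) = C0 * exp(-k t)."""
    return c0 * np.exp(-k * t)

# Invented H2O2 concentrations (mol/L) versus time (min):
t = np.array([0, 10, 20, 40, 60, 90], dtype=float)
c = np.array([0.50, 0.41, 0.33, 0.22, 0.15, 0.08])

(c0_fit, k_fit), _ = curve_fit(first_order, t, c, p0=(0.5, 0.01))
print(f"fitted C0 = {c0_fit:.3f} mol/L, k = {k_fit:.4f} 1/min")
```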
Abstract:
The Mathematica system (version 4.0) is employed in the solution of nonlinear diffusion and convection-diffusion problems, formulated as transient one-dimensional partial differential equations with potential-dependent equation coefficients. The Generalized Integral Transform Technique (GITT) is first implemented for the hybrid numerical-analytical solution of such classes of problems, through the symbolic integral transformation and elimination of the space variable, followed by the use of the built-in Mathematica function NDSolve to handle the resulting transformed ODE system. This approach offers an error-controlled final numerical solution, through the simultaneous control of local errors in this reliable ODE solver and of the truncation order of the proposed eigenfunction expansion. For co-validation purposes, the same built-in function NDSolve is employed in the direct solution of these partial differential equations, as made possible by the algorithms implemented in Mathematica (versions 3.0 and up) based on the method of lines. Various numerical experiments are performed and the relative merits of each approach are critically pointed out.
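To sketch the flavor of the hybrid eigenfunction-expansion approach in a general-purpose language (Python with SciPy rather than Mathematica, and not the authors' implementation): a nonlinear one-dimensional diffusion equation with homogeneous Dirichlet conditions is projected onto a sine eigenfunction basis, and the resulting truncated ODE system is handed to an error-controlled ODE solver, scipy.integrate.solve_ivp, in place of NDSolve. The diffusivity law, initial condition, and truncation order are placeholder choices.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Problem (placeholder): u_t = d/dx( D(u) u_x ) on 0 < x < 1, u(0,t) = u(1,t) = 0,
# u(x,0) = sin(pi x), with a potential-dependent diffusivity D(u).
N = 12                                 # truncation order of the eigenfunction expansion
x = np.linspace(0.0, 1.0, 201)         # quadrature grid on [0, 1]
n = np.arange(1, N + 1)
phi  = np.sqrt(2.0) * np.sin(np.outer(n, np.pi * x))                  # orthonormal phi_n(x)
dphi = np.sqrt(2.0) * np.pi * n[:, None] * np.cos(np.outer(n, np.pi * x))

def D(u):
    """Placeholder potential-dependent diffusivity, D(u) = 1 + 0.5 u."""
    return 1.0 + 0.5 * u

def rhs(t, a):
    """Transformed ODE system: da_n/dt = -int phi_n'(x) D(u) u_x dx (after integration
    by parts; boundary terms vanish for the Dirichlet eigenfunctions)."""
    u  = a @ phi                       # reconstruct u(x) from expansion coefficients
    ux = a @ dphi
    return -np.trapz(dphi * (D(u) * ux), x, axis=1)

a0 = np.trapz(phi * np.sin(np.pi * x), x, axis=1)    # integral transform of u(x,0)
sol = solve_ivp(rhs, (0.0, 0.5), a0, rtol=1e-8, atol=1e-10)
u_final = sol.y[:, -1] @ phi           # inverse transform at the final time
print(float(u_final.max()))
```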
Abstract:
The objective of this work was to study the effects of partial removal of wood hemicelluloses on the properties of kraft pulp. The work was conducted by extracting hemicelluloses (1) by a softwood chip pretreatment process prior to kraft pulping, (2) by alkaline extraction from bleached birch kraft pulp, and (3) by enzymatic treatment, xylanase treatment in particular, of bleached birch kraft pulp. The qualitative and quantitative changes in fiber and paper properties were evaluated. In addition, the applicability of the extraction concepts and of hemicellulose-extracted birch kraft pulp as a raw material in papermaking was evaluated in a pilot-scale papermaking environment. The results showed that each examined hemicellulose extraction method has its characteristic effects on fiber properties, seen as differences in both the physical and chemical nature of the fibers. A prehydrolysis process prior to kraft pulping offered reductions in cooking time and bleaching chemical consumption, and produced fibers with low hemicellulose content that are more susceptible to mechanically induced damage and dislocations. Softwood chip pretreatment for hemicellulose recovery prior to cooking, whether acidic or alkaline, had an impact on the physical properties of both non-refined and refined pulp. In addition, all the pretreated pulps exhibited a slower beating response than the unhydrolyzed reference pulp. Both alkaline extraction and enzymatic (xylanase) treatment of bleached birch kraft pulp fibers resulted in very selective hemicellulose removal, particularly xylan removal. Furthermore, these two hemicellulose-extracted birch kraft pulps were utilized in a pilot-scale papermaking environment in order to evaluate the upscalability of the extraction concepts. Pilot paper machine trials revealed that a certain amount of alkaline-extracted birch kraft pulp, with a 24.9% reduction in the total amount of xylan, could be used in the papermaking stock as a mixture with non-extracted pulp when producing 75 g/m2 paper. For xylanase-treated fibers there were no reductions in the mechanical properties of the 180 g/m2 paper produced compared to paper made from the control pulp, although there was a 14.2% reduction in the total amount of xylan in the xylanase-treated pulp compared to the control birch kraft pulp. This work emphasized the importance of the hemicellulose extraction method in providing new solutions for creating functional fibers and a valuable hemicellulose co-product stream. The hemicellulose removal concept therefore plays an important role in the integrated forest biorefinery scenario, where the target is the co-production of hemicellulose-extracted pulp and hemicellulose-based chemicals or fuels.