898 results for Residual-Based Panel Cointegration Test
Abstract:
This work presents a study of the elimination of anticancer drugs, a group of pollutants considered recalcitrant to conventional activated sludge wastewater treatment, using a biological treatment based on the fungus Trametes versicolor. A 10-L fluidized bed bioreactor inoculated with this fungus was set up to evaluate the removal of 10 selected anticancer drugs from real hospital wastewater. Almost all the tested anticancer drugs were completely removed from the wastewater by the end of the batch experiment (8 d), with the exception of Ifosfamide and Tamoxifen. These two recalcitrant compounds, together with Cyclophosphamide, were selected for further studies to test their degradability by T. versicolor under optimal growth conditions. Cyclophosphamide and Ifosfamide remained unaltered during batch experiments at both high and low concentrations, whereas the concentration of Tamoxifen decreased over the course of the treatment. Two positional isomers of a hydroxylated form of Tamoxifen were identified during this experiment using high-resolution mass spectrometry based on ultra-high performance liquid chromatography coupled to an Orbitrap detector (LTQ-Velos Orbitrap). Finally, the identified transformation products of Tamoxifen were monitored in the bioreactor run with real hospital wastewater.
Abstract:
In this work we describe both a chromatographic purification procedure and a spot test for the enzyme peroxidase (POD; EC 1.11.1.7). The enzyme was obtained from crude extracts of sweet potatoes, and the chromatographic purification procedure resulted in several fractions. Therefore, a simple, fast and economical spot test for monitoring peroxidase during the purification procedure was developed. The spot test is based on the reaction of hydrogen peroxide with guaiacol, which is catalyzed by peroxidase and yields the colored product tetraguaiacol.
Abstract:
The main objective of this thesis is to show that plate strips subjected to transverse line loads can be analysed using the beam on elastic foundation (BEF) approach. It is shown that the elastic behaviour of both the centre-line section of a semi-infinite plate supported along two edges, and the free edge of a cantilever plate strip, can be accurately predicted by calculations based on the two-parameter BEF theory. The transverse bending stiffness of the plate strip forms the foundation. The foundation modulus is shown, mathematically and physically, to be the zero-order term of the fourth-order differential equation governing the behaviour of a BEF, whereas the torsion rigidity of the plate acts like pre-tension in the second-order term. Direct equivalence is obtained for harmonic line loading by comparing the differential equations of Levy's method (a simply supported plate) with those of the BEF method. By equating the second- and zero-order terms of the semi-infinite BEF model for each harmonic component, two parameters are obtained for a simply supported plate of width B: the characteristic length, 1/λ, and the normalized sum, nlin, of the effect of axial loading and the stiffening resulting from the torsion stiffness. For the first mode, assuming a uniaxial stress field (ν = 0), this procedure gives 1/λ = √2B/π and nlin = 1. For constant line loading, which is the superposition of harmonic components, slightly different foundation parameters are obtained when the maximum deflection and bending moment values of the theoretical plate (ν = 0) and the BEF solutions are equated: 1/λ = 1.47B/π and nlin = 0.59 for a simply supported plate; and 1/λ = 0.99B/π and nlin = 0.25 for a fixed plate. The BEF parameters of the plate strip with a free edge are determined based solely on finite element analysis (FEA) results: 1/λ = 1.29B/π and nlin = 0.65, where B is twice the width of the cantilever plate strip. Stress biaxiality, ν > 0, is shown not to affect the values of the BEF parameters significantly. The effect of the geometric nonlinearity caused by in-plane, axial and biaxial loading is studied theoretically by comparing the differential equations of Levy's method with the BEF approach. The BEF model is generalised to take into account the elastic rotation stiffness of the longitudinal edges. Finally, formulae are presented that take into account the effect of Poisson's ratio, and of geometric nonlinearity resulting from axial and transverse in-plane loading, on bending behaviour. It is also shown that the BEF parameters of the semi-infinite model are valid for linear elastic analysis of a plate strip of finite length. The BEF model was verified by applying it to the analysis of bending stresses caused by misalignments in a laboratory test panel. In summary, it can be concluded that the advantages of the BEF theory are that it is a simple tool, and that it is accurate enough for specific stress analysis of semi-infinite and finite plate bending problems.
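For reference, the governing equation of the two-parameter BEF model discussed above can be written in its conventional textbook form (a sketch with standard symbols, not taken verbatim from the thesis; the thesis's own normalization may differ):

    % Two-parameter beam on elastic foundation (conventional form):
    % EI = bending stiffness, N = pre-tension (from torsion rigidity),
    % k = foundation modulus (transverse bending stiffness of the strip)
    \[
      EI\,\frac{\mathrm{d}^{4}w}{\mathrm{d}x^{4}}
      \;-\; N\,\frac{\mathrm{d}^{2}w}{\mathrm{d}x^{2}}
      \;+\; k\,w \;=\; q(x),
      \qquad
      \frac{1}{\lambda} \;=\; \sqrt[4]{\frac{4\,EI}{k}}
    \]

Here k w is the zero-order term carrying the foundation modulus, N w'' is the second-order pre-tension term, and 1/λ is the characteristic length under its usual definition.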
Abstract:
Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization. The topic has received increasing interest, especially in the evolutionary computation community. Several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms so that they are effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to determine the properties of different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varying problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists. As a part of this thesis, such a framework for generating tunable test functions for evaluating different methods of multimodal optimization experimentally is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may negate the advantages gained from the global phase.
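Since the thesis centres on variants of Differential Evolution, a minimal sketch of the canonical DE/rand/1/bin generation step may help fix ideas (illustrative only; the parameter values and function names are my own, and the thesis's hybrid and niching variants build well beyond this):

    import numpy as np

    def de_rand_1_bin(pop, f, F=0.8, CR=0.9, rng=None):
        """One generation of canonical DE/rand/1/bin for minimization.
        pop: (NP, D) array of candidate solutions; f: objective function."""
        rng = rng or np.random.default_rng()
        NP, D = pop.shape
        new_pop = pop.copy()
        for i in range(NP):
            # three mutually distinct donors, all different from the target i
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])   # differential mutation
            cross = rng.random(D) < CR                   # binomial crossover mask
            cross[rng.integers(D)] = True                # force at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            if f(trial) <= f(pop[i]):                    # greedy one-to-one selection
                new_pop[i] = trial
        return new_pop

A niching scheme would typically alter the selection step (for example, letting the trial compete against its nearest neighbour rather than its parent) so that several optima can be retained in the population.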
Abstract:
Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice in Spain or internationally. The aim of this paper was to assess the hospital cost associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death. Both aggregate and mean costs per patient were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages. Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major components of cost. This cost analysis is a baseline study that will provide a useful source of information for future studies on cost-effectiveness and on the budget impact of different therapeutic innovations in Spain.
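A between-stage comparison of mean cost per patient of the kind reported above could be run with a Kruskal-Wallis test along these lines (a sketch with made-up cost figures, not the study's data):

    from scipy.stats import kruskal

    # hypothetical per-patient costs (Euros) grouped by stage at diagnosis
    stage_i   = [14100, 15800, 13900, 15100]
    stage_ii  = [16500, 15200, 16900, 15900]
    stage_iii = [13000, 13500, 12800, 13600]

    h, p = kruskal(stage_i, stage_ii, stage_iii)
    print(f"H = {h:.2f}, p = {p:.3f}")  # p > 0.05 would mirror the study's finding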
Abstract:
Cefdinir has a broad spectrum of activity and high prescription rates; hence its counterfeiting seems imminent. We have proposed a simple, fast, selective and non-extractive spectrophotometric method for the content assay of cefdinir in formulations. The method is based on the complexation of cefdinir and Fe under reducing conditions in a buffered medium (pH 11) to form a magenta-colored donor-acceptor complex (λmax = 550 nm; apparent molar absorptivity = 3720 L mol-1 cm-1). No other cephalosporins, penicillins or common excipients interfere under the test conditions. Beer's law is obeyed over the concentration range 8-160 µg mL-1.
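For a sense of the arithmetic behind such an assay, a concentration read-back from absorbance via the Beer-Lambert law might look as follows (a sketch: the path length and the approximate molar mass of cefdinir are my assumptions, not stated in the abstract):

    # Beer-Lambert law: A = epsilon * l * c
    EPSILON = 3720.0   # L mol^-1 cm^-1, apparent molar absorptivity at 550 nm (reported)
    PATH = 1.0         # cm, cuvette path length (assumed)
    MW = 395.4         # g mol^-1, approximate molar mass of cefdinir (assumed)

    def cefdinir_ug_per_ml(absorbance):
        molar = absorbance / (EPSILON * PATH)  # mol/L
        return molar * MW * 1000.0             # g/L -> ug/mL (1 g/L = 1000 ug/mL)

    print(f"{cefdinir_ug_per_ml(0.5):.1f} ug/mL")  # ~53 ug/mL, inside the 8-160 range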
Abstract:
Difenoconazole residues in strawberry fruit cultivated in pots were determined using solid-liquid extraction with low-temperature partition (SLE/LTP) for sample preparation and gas chromatography with electron capture detection (GC/ECD) for analysis. The optimized method showed excellent recovery from fortified samples and good reproducibility (average recovery ≥ 98%; CV < 15%). Linearity of response was demonstrated (r = 0.995), with a detection limit of 9 µg kg-1. The method was successfully applied to the determination of difenoconazole residues in strawberries. Based on these results, the fungicide dissipates quickly, but the residual concentration increases after multiple applications.
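Linearity and detection-limit figures of the kind quoted above come from a calibration of the sort sketched below (all numbers are made up for illustration; the 3.3·s/slope rule is one common LOD convention, not necessarily the one used in this work):

    import numpy as np

    # hypothetical GC/ECD calibration: concentration (ug/kg) vs. peak area
    conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
    area = np.array([120.0, 290.0, 610.0, 1190.0, 2420.0])

    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]               # correlation coefficient
    resid_sd = (area - (slope * conc + intercept)).std(ddof=2)
    lod = 3.3 * resid_sd / slope                    # LOD from residual scatter
    print(f"r = {r:.3f}, LOD = {lod:.1f} ug/kg")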
Abstract:
Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim of selecting only the most promising ones for experimental testing. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. A further criterion set for the virtual screening tools was their applicability to scaffold-hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that the two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method for use with two or more tools. In the second part, five ligand-based virtual screening tools were evaluated, and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the same binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluating the scaffold-hopping ability of virtual screening tools.
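One simple way to realize such a score-to-probability mapping (a sketch only; the thesis's actual model and fusion scheme may differ) is to fit a logistic model on molecule pairs with known outcomes:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # hypothetical training data: similarity score per molecule pair,
    # label 1 = biologically similar, 0 = not
    scores = np.array([[0.21], [0.35], [0.48], [0.55], [0.62], [0.74], [0.81], [0.93]])
    labels = np.array([0, 0, 0, 1, 0, 1, 1, 1])

    model = LogisticRegression().fit(scores, labels)
    p = model.predict_proba([[0.70]])[0, 1]   # P(similar | score = 0.70)
    print(f"{p:.2f}")

With several tools, the per-tool probabilities can then be fused, for example by taking their maximum or by feeding all scores into one multivariate model.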
Abstract:
Modern, sophisticated telecommunication devices require ever more comprehensive testing to ensure quality. The number of test cases needed to ensure sufficient test coverage has grown rapidly, and this demand can no longer be met by manual testing alone. In addition, new agile development models require the execution of all test cases in every iteration. This has led manufacturers to use test automation more than ever to achieve adequate testing coverage and quality. This thesis is divided into three parts. The first part begins by describing the evolution of cellular networks; software testing, test automation and the influence of the development model on testing are also examined. The second part describes the process used to implement a test automation scheme for functional testing of the LTE core network MME element. The scheme was implemented using agile development models and the Robot Framework test automation tool. In the third part, two alternative models are presented for integrating this test automation scheme into a continuous integration process. As a result, the test automation scheme for functional testing was implemented. Almost all new functional-level test cases can now be automated with this scheme. In addition, two models for integrating this scheme into a wider continuous integration pipeline were introduced. The shift in testing from a traditional waterfall model to a new agile development based model also proved successful.
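As an illustration of how such a scheme can hook into a continuous integration job, Robot Framework suites can be launched through the framework's Python entry point (a sketch; the suite path, tags and option values here are hypothetical):

    # Launch a Robot Framework suite from a CI step and propagate the result.
    from robot import run

    rc = run(
        "tests/functional",         # hypothetical suite directory
        include=["mme", "smoke"],   # hypothetical tags selecting the cases to run
        outputdir="results",
        xunit="xunit.xml",          # xUnit report most CI servers can consume
    )
    raise SystemExit(rc)            # non-zero return code fails the CI stage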
Abstract:
Two simple, sensitive and cost-effective spectrophotometric methods are described for the determination of lansoprazole (LPZ) in bulk drug and in capsules, using ceric ammonium sulphate (CAS), iron(II), orthophenanthroline and thiocyanate as reagents. In both methods, an acidic solution of lansoprazole is treated with a measured excess of CAS, followed by determination of the unreacted oxidant by two procedures involving different reaction schemes. The first method involves the reduction of the residual oxidant by a known amount of iron(II); the unreacted iron(II) is then complexed with orthophenanthroline at a raised pH, and the absorbance of the resulting complex is measured at 510 nm (method A). In the second method, the unreacted CAS is reduced by an excess of iron(II), and the resulting iron(III) is complexed with thiocyanate in the acid medium, the absorbance of the complex being measured at 470 nm (method B). In both methods, the amount of CAS reacted corresponds to the amount of LPZ. In method A the absorbance increases linearly with the concentration of LPZ, whereas in method B the absorbance decreases linearly. The systems obey Beer's law over 2.5-30 and 2.5-25 µg mL-1 for methods A and B, respectively, and the corresponding molar absorptivity values are 8.1×10³ and 1.5×10⁴ L mol-1 cm-1. The methods were successfully applied to the determination of LPZ in capsules, and the results tallied well with the label claim. No interference was observed from the concomitant substances normally added to capsules.
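The indirect logic shared by both procedures can be summarized in one line (a schematic restatement of the abstract's own quantities, not a formula from the paper):

    % Back-titration relation common to methods A and B:
    \[
      n_{\mathrm{CAS,\,reacted}}
      \;=\; n_{\mathrm{CAS,\,total}} - n_{\mathrm{CAS,\,unreacted}}
      \;\propto\; n_{\mathrm{LPZ}}
    \]
    % Method A: more LPZ leaves more Fe(II) free to form the phenanthroline
    % complex, so A_510 rises with LPZ concentration.
    % Method B: more LPZ leaves less CAS to generate Fe(III), so the
    % thiocyanate complex and A_470 fall with LPZ concentration.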
Lanthanum-based high-surface-area perovskite-type oxide and application in CO and propane combustion
Abstract:
Perovskite-type oxides containing transition metals show promising potential as catalysts in total oxidation reactions. The present work investigates the effect of synthesis by oxidant co-precipitation on the catalytic activity of the perovskite-type oxides LaBO3 (B = Co, Ni, Mn) in the total oxidation of propane and CO. The perovskite-type oxides were characterized by X-ray diffraction, nitrogen adsorption (BET method), thermogravimetric and differential thermal analysis (TGA-DTA) and X-ray photoelectron spectroscopy (XPS). The oxidant co-precipitation method makes it possible to obtain catalysts with different BET surface areas, of 33-44 m²/g, according to the metal salts used. The characterization results showed that the catalysts contain a perovskite phase as well as lanthanum oxide, except for LaMnO3, which presents cationic vacancies generated by its known oxygen excess. The catalytic tests showed that all the oxides have specific catalytic activity for the total oxidation of CO and propane, although the temperature for total conversion varies with the transition metal and the substance to be oxidized.
Abstract:
The aim of the study was to create an easily upgradable product costing model for laser-welded hollow core steel panels to support pricing decisions. The theory section includes a literature review identifying the traditional and modern cost accounting methodologies used by manufacturing companies. It also presents the basics of steel panel structures, their manufacturing methods and their manufacturing costs based on previous research. Activity-based costing turned out to be the most appropriate methodology for the costing model because of the wide product variation. Activity analysis and the determination of cost drivers, based on observations and interviews, were the key steps in the creation of the model. The model was used to test how panel parameters affect the costs caused by the main manufacturing stages and materials. By comparing cost structures, it was possible to identify the panel types that are the most economic and uneconomic to manufacture. A sensitivity analysis showed that the model gives sufficiently reliable cost information to support pricing decisions; more reliable cost information could be achieved by determining the cost drivers more accurately. Alternative methods for manufacturing the cores were compared with the model. The comparison showed that roll forming can be more advantageous and flexible than press brake bending. However, more extensive research showed that roll forming is possible only when the cores are designed to be manufactured by roll forming. Consequently, when new panels are designed, consideration should be given to the possibility of using roll forming.
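The core of an activity-based costing model of this kind reduces to activity rates (cost pool divided by driver volume) applied to each panel's driver consumption. A minimal sketch follows; all activities, rates and quantities below are hypothetical, not figures from the study:

    # Minimal activity-based costing sketch; every figure below is made up.
    activities = {                    # activity -> (annual cost pool EUR, driver volume)
        "laser_welding": (120_000, 4_000),    # driver: weld metres
        "core_forming":  (80_000, 10_000),    # driver: number of cores formed
        "assembly":      (60_000,  1_500),    # driver: assembly hours
    }
    rates = {a: pool / vol for a, (pool, vol) in activities.items()}

    def panel_cost(weld_m, cores, asm_h, material_eur):
        """Material cost plus activity-based overhead for one panel."""
        overhead = (rates["laser_welding"] * weld_m
                    + rates["core_forming"] * cores
                    + rates["assembly"] * asm_h)
        return material_eur + overhead

    print(f"{panel_cost(weld_m=35, cores=12, asm_h=1.5, material_eur=420):.2f} EUR")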
Abstract:
New luminometric particle-based methods were developed to quantify protein and to count cells. The developed methods rely on the interaction of the sample with nano- or microparticles and on different principles of detection. In the fluorescence quenching, time-resolved luminescence resonance energy transfer (TR-LRET) and two-photon excitation fluorescence (TPX) methods, the sample prevents the adsorption of labeled protein to the particles. Depending on the system, the addition of the analyte increases or decreases the luminescence. In the dissociation method, the adsorbed protein protects the Eu(III) chelate on the surface of the particles from dissociation at low pH. The experimental setups are user-friendly and rapid, and require neither hazardous test compounds nor elevated temperatures. The sensitivity of protein quantification (40 to 500 pg of bovine serum albumin per sample) was 20-500-fold better than that of the most sensitive commercial methods. The quenching method exhibited low protein-to-protein variability, and the dissociation method was insensitive to the assay contaminants commonly found in biological samples. Fewer than ten eukaryotic cells were detected and quantified with all the developed methods under optimized assay conditions. Furthermore, two applications, a method for detecting protein aggregation and a cell viability test, were developed utilizing the TR-LRET method. Protein aggregation could be detected at a concentration more than 10,000 times lower (30 μg/L) than with the known methods of UV240 absorbance and dynamic light scattering. The TR-LRET method was combined with a nucleic acid assay using a cell-impermeable dye to measure the percentage of dead cells in a single-tube test at cell counts below 1000 cells/tube.
Abstract:
To identify formulations of biological agents that enable survival, stability and a good surface distribution of the antagonistic agent, studies testing different application vehicles are necessary. The efficiency of two killer yeasts, Wickerhamomyces anomalus (strain 422) and Meyerozyma guilliermondii (strain 443), associated with five different application vehicles, was assessed for the protection of postharvest papayas. In this study, after 90 days of incubation at 4 °C, W. anomalus (strain 422) and M. guilliermondii (strain 443) remained viable with all application vehicles tested. Fruits treated with the different formulations (yeasts + application vehicles) showed decreased disease severity (by at least 30%) compared with untreated fruits. Treatment with W. anomalus (strain 422) + 2% starch lowered disease occurrence by 48.3%. The most efficient treatments using M. guilliermondii (strain 443) were those with 2% gelatin or 2% liquid carnauba wax, both of which reduced anthracnose by 50% in postharvest papayas. Electron micrographs of the surface tissues of the treated fruits showed that all application vehicles provided excellent adhesion of the yeast to the fruit surface. Formulations based on starch (2%), gelatin (2%) and carnauba wax (2%) were the most efficient at controlling fungal diseases in postharvest papayas.
Abstract:
Early identification of beginning readers at risk of developing reading and writing difficulties plays an important role in prevention and in the provision of appropriate intervention. In Tanzania, as in other countries, there are children in schools who are at risk of developing reading and writing difficulties. Many of these children complete school without being identified and without proper and relevant support. The main language in Tanzania is Kiswahili, a transparent language. Contextually relevant, reliable and valid instruments of identification are needed in Tanzanian schools. This study aimed at the construction and validation of a group-based screening instrument in the Kiswahili language for identifying beginning readers at risk of reading and writing difficulties. In studying the function of the test, there was special interest in analyzing the explanatory power of certain contextual factors related to the home and school. Halfway through grade one, 337 children from four purposively selected primary schools in Morogoro municipality were screened with a group test consisting of 7 subscales measuring phonological awareness, word and letter knowledge, and spelling. A questionnaire about background factors and the home and school environments related to literacy was also used. The schools were chosen based on performance status (i.e. high, good, average and low performing schools) in order to include variation. For validation, 64 children were chosen from the original sample to take an individual test measuring nonsense word reading, word reading, actual text reading, one-minute reading and writing. School marks from grade one and a follow-up test halfway through grade two were also used for validation. The correlations between the results from the group test and the three measures used for validation were very high (.83-.95). Content validity of the group test was established by using items drawn from authorized textbooks for reading in grade one. Construct validity was analyzed through item analysis and principal component analysis. The difficulty level of most items in both the group test and the follow-up test was good, and the items discriminated well. Principal component analysis revealed one powerful latent dimension (an initial literacy factor) accounting for 93% of the variance. This implies that any subset of the subtests of the group test could be used for screening and prediction. A K-means cluster analysis revealed four clusters: at-risk children, strugglers, readers and good readers. The main concern in this study was with the groups of at-risk children (24%) and strugglers (22%), who need the most assistance. The predictive validity of the group test was analyzed by correlating the measures from the two school years and by cross-tabulating grade one and grade two clusters. All the correlations were positive and very high, and 94% of the at-risk children in grade two had already been identified by the group test in grade one. The explanatory power of some of the home and school factors was very strong. The number of books at home accounted for 38% of the variance in reading and writing ability as measured by the group test. Parents' reading ability and the support children received at home for schoolwork were also influential factors. Among the studied school factors, school attendance had the strongest explanatory power, accounting for 21% of the variance in reading and writing ability. Having been to nursery school was also of importance.
Based on the findings of the study, a short version of the group test was created. It is suggested for use in grade-one screening aimed at identifying children at risk of reading and writing difficulties in the Tanzanian context. Suggestions for further research, as well as for actions to improve the literacy skills of Tanzanian children, are presented.
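The validation pipeline described above (one dominant principal component, then four performance clusters) can be reproduced in outline as follows (a sketch on synthetic data; the study's actual scores and settings are not available here):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # synthetic stand-in for the 337-children x 7-subscale score matrix
    rng = np.random.default_rng(0)
    g = rng.normal(size=(337, 1))                 # one latent literacy factor
    scores = g + 0.3 * rng.normal(size=(337, 7))  # 7 correlated subscales

    pca = PCA(n_components=1).fit(scores)
    print(f"variance explained by first component: {pca.explained_variance_ratio_[0]:.0%}")

    # four clusters, interpreted as at-risk / strugglers / readers / good readers
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print(np.bincount(labels))                    # cluster sizes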