885 results for "Methods for Multi-criteria Evaluation"
Abstract:
The purpose of this study was to develop and validate a dissolution test for fluconazole, an antifungal used to treat superficial, cutaneous, and cutaneomucous infections caused by Candida species, in capsule dosage form. HPLC and UV first-derivative spectrophotometry (UV-FDS) were selected for quantitative evaluation. Several conditions were evaluated during development of the release profile, and the dissolution test parameters were considered appropriate when they yielded the most discriminative release profile for fluconazole capsules. The selected conditions were 900 mL of 0.1 M HCl at 37 ± 0.5 °C, using baskets at 50 rpm for a 30 min test. The developed HPLC and UV-FDS methods were selective and met the requirements for a validated method according to ICH and USP guidelines. Both methods can be useful in the registration of new drugs or their renewal. For routine analysis, cost, simplicity, equipment, solvents, speed, and suitability for large or small workloads should be considered.
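Discriminative power between release profiles is conventionally quantified by comparing dissolution curves; the abstract does not name a metric, so the FDA/EMA f2 similarity factor below is purely illustrative:

```python
import math

def f2_similarity(reference, test):
    """Compute the f2 similarity factor between two dissolution
    profiles (percent dissolved at matched time points). By
    convention, f2 >= 50 indicates similar profiles."""
    if len(reference) != len(test):
        raise ValueError("profiles must share the same time points")
    n = len(reference)
    mean_sq_diff = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + mean_sq_diff))

# Identical profiles give the maximum f2 of 100.
ref = [25, 50, 75, 90]
print(round(f2_similarity(ref, ref), 1))  # 100.0
```

A lower f2 against a reference formulation is what makes a set of test conditions "discriminative".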
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The purpose of this study was to examine the reliability, validity and classification accuracy of the South Oaks Gambling Screen (SOGS) in a sample of the Brazilian population. Participants were drawn from three sources: 71 men and women from the general population interviewed at a metropolitan train station; 116 men and women encountered at a bingo venue; and 54 men and women undergoing treatment for gambling. The SOGS and a DSM-IV-based instrument were administered by trained researchers. The internal consistency of the SOGS was 0.75 according to Cronbach's alpha, and construct validity was good. A significant difference among groups was demonstrated by ANOVA (F(2,238) = 221.3, P < 0.001). The SOGS items and DSM-IV symptoms were highly correlated (r = 0.854, P < 0.01). The SOGS also presented satisfactory psychometric properties: sensitivity (100%), specificity (74.7%), positive predictive rate (60.7%), negative predictive rate (100%) and misclassification rate (0.18). However, a cut-off score of eight improved classification accuracy and reduced the rate of false positives: sensitivity (95.4%), specificity (89.8%), positive predictive rate (78.5%), negative predictive rate (98%) and misclassification rate (0.09). Thus, the SOGS was found to be reliable and valid in the Brazilian population.
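All of the reported screening statistics follow from a 2×2 table of screen results against the DSM-IV reference diagnosis. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
def screen_metrics(tp, fp, fn, tn):
    """Classification accuracy metrics for a screening test, computed
    from a 2x2 table of true/false positives and negatives."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "positive_predictive": tp / (tp + fp),
        "negative_predictive": tn / (tn + fn),
        "misclassification": (fp + fn) / total,
    }

# Hypothetical counts: a screen that misses no true cases
# (sensitivity 1.0) can still have a modest positive predictive
# value when false positives are common.
m = screen_metrics(tp=54, fp=35, fn=0, tn=149)
print(m["sensitivity"])                    # 1.0
print(round(m["misclassification"], 2))    # 0.15
```

Raising the cut-off score trades a few false negatives for many fewer false positives, which is exactly the shift the abstract reports.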
Abstract:
Introduction: The aim of this work was to determine whether possible lymphatic filariasis foci in the western Brazilian Amazon could still be established from the reports of Rachou in the 1950s. The study was conducted in three cities of the western Brazilian Amazon region: Porto Velho and Guajará-Mirim (State of Rondônia) and Humaitá (State of Amazonas). Methods: For evaluation of human infection, thick blood smears stained with Giemsa were used to analyze samples collected from 10 pm to 1 am. Polymerase chain reaction (PCR) was used to examine mosquito vectors for the presence of Wuchereria bancrofti DNA. Humans were randomly sampled from night-school students and from inhabitants of neighborhoods lacking sanitation; mosquitoes were collected from residences only. Results: A total of 2,709 night students enrolled in the Program for Education of Young Adults (EJA) and 935 people registered in residences near the schools were examined: 641 from Porto Velho, 214 from Guajará-Mirim and 80 from Humaitá. No individual examined was positive for microfilariae in the bloodstream, and all 7,860 female Culex quinquefasciatus specimens examined were negative by PCR. Conclusions: This survey of humans and mosquitoes indicates that the western Amazon region of Brazil is not a focus of Bancroftian filariasis infection or transmission and therefore does not need to be included in the Brazilian lymphatic filariasis control program.
Abstract:
Doctoral program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería, Instituto Universitario (SIANI)
Abstract:
The proposed research aims to define and test a method for an articulated and systematic reading of the rural territory which, beyond broadening knowledge of the territory itself, supports landscape and urban planning processes and the implementation of agricultural and rural development policies. A thorough review of the state of the art concerning the evolution of the urbanization process and its consequences in Italy and Europe, together with the framework of local territorial policies on the specific theme of rural and peri-urban space, made it possible, along with a detailed analysis of the main territorial analysis methodologies in the literature, to determine the concept underlying the research. A multi-criteria, multi-level methodology for reading the rural territory was developed and tested in a GIS environment; it employs clustering algorithms (such as the IsoCluster algorithm) and maximum likelihood classification, focusing on peri-urban agricultural spaces. The method describes the territory through several of its components, such as the agro-environmental and socio-economic ones, and synthesizes them through an interpretative key devised for the purpose, the Agro-environmental Footprint (AEF), which aims to quantify the potential impact of rural spaces on the urban system. In particular, the tool is intended to identify homogeneous areas within the extra-urban territory by reading the territory at different scales (from the territorial to the farm scale), so as to classify it and thus delimit the areas that can be classified as peri-urban agricultural land.
The thesis presents the overall architecture of the methodology and describes its constituent levels of analysis, followed by its testing and validation through a representative case study located in the Po Valley (Italy).
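The clustering step can be illustrated with a minimal k-means routine standing in for the ISODATA-style IsoCluster algorithm (which runs inside a GIS on raster layers); the indicator vectors below are hypothetical:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means as a stand-in for ISODATA-style clustering
    (e.g. ArcGIS IsoCluster): group land units by their
    agro-environmental indicator values."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Move each centroid to the mean of its members.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(sum(vals) / len(members)
                                     for vals in zip(*members))
    return centroids

# Hypothetical per-unit indicators (e.g. building density, cropland share):
units = [(0.1, 0.9), (0.2, 0.8), (0.8, 0.2), (0.9, 0.1)]
c = sorted(kmeans(units, k=2))
print(c)  # two centroids, one per group of units
```

In the actual methodology the clusters would then feed a maximum likelihood classification of the remaining territory.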
Abstract:
The last decade has witnessed the establishment of a Standard Cosmological Model, based on two fundamental assumptions: the first is the existence of a new, non-relativistic kind of particle, the Dark Matter (DM), which provides the potential wells in which structures form; the second is the presence of Dark Energy (DE), the simplest form of which is the Cosmological Constant Λ, which sources the accelerated expansion of our Universe. These two features are summarized by the acronym ΛCDM, used to refer to the present Standard Cosmological Model. Although the Standard Cosmological Model agrees remarkably well with most available observations, it presents some longstanding unsolved problems. A possible way to address them is the introduction of a dynamical Dark Energy in the form of a scalar field φ. In coupled DE models, the scalar field features a direct interaction with matter in different regimes. Cosmic voids are large under-dense regions of the Universe almost devoid of matter; being nearly empty, their dynamics is expected to be dominated by DE, to whose nature the properties of cosmic voids should therefore be very sensitive. This thesis is devoted to the statistical and geometrical analysis of cosmic voids in large N-body simulations of structure formation in the context of alternative, competing cosmological models. In particular, we used the ZOBOV code (Neyrinck 2008), a publicly available void-finder algorithm, to identify voids in the halo catalogues extracted from the CoDECS simulations (Baldi 2012), the largest N-body simulations of interacting Dark Energy models to date. We identify suitable criteria for producing void catalogues with the aim of comparing the properties of these objects in interacting DE scenarios to those in the standard ΛCDM model at different redshifts.
This thesis is organized as follows: in chapter 1, the Standard Cosmological Model and the main properties of cosmic voids are introduced. In chapter 2 we present the scalar field scenario. Chapter 3 describes the tools, methods and criteria by which a void catalogue is created, while chapter 4 discusses the statistical properties of the cosmic voids included in our catalogues. In chapter 5 the geometrical properties of the catalogued voids are presented by means of their stacked profiles. In chapter 6 we summarize our results and propose further developments of this work.
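The idea behind any void finder, that voids are regions of anomalously low tracer density, can be sketched without ZOBOV's Voronoi tessellation and watershed machinery by using a crude k-nearest-neighbour density estimate (an illustrative stand-in, not ZOBOV's actual algorithm):

```python
def knn_density(points, k=3):
    """Crude density estimate: inverse of the distance to the k-th
    nearest neighbour. ZOBOV instead derives densities from a Voronoi
    tessellation, but the principle -- low local density marks void
    regions -- is the same."""
    dens = []
    for p in points:
        d = sorted(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   for q in points if q is not p)
        dens.append(1.0 / d[k - 1])
    return dens

def void_candidates(points, k=3, threshold=0.5):
    """Flag tracers whose density falls below `threshold` times the mean."""
    dens = knn_density(points, k)
    mean = sum(dens) / len(dens)
    return [i for i, d in enumerate(dens) if d < threshold * mean]

# A tight group of tracers plus one isolated tracer in an empty region;
# only the isolated one is flagged as a void candidate.
points = [(0, 0), (0, 1), (1, 0), (1, 1), (0.5, 0.5), (10, 10)]
print(void_candidates(points))  # [5]
```

A production void finder additionally merges low-density zones into void regions and handles survey geometry, which is where the cataloguing criteria of chapter 3 come in.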
Abstract:
An increased or disturbed activation and aggregation of platelets plays a major role in the pathophysiology of thrombosis and haemostasis and is related to cardiovascular disease processes. In addition to qualitative disturbances of platelet function, changes in thrombopoiesis or an increased elimination of platelets (e.g., in autoimmune thrombocytopenia) are also of major clinical relevance. Flow cytometry is increasingly used for the specific characterisation of phenotypic alterations of platelets related to cellular activation, haemostatic function and maturation of precursor cells. These techniques also allow study of the in vitro response of platelets to stimuli and its modification under platelet-targeted therapy, as well as the characterisation of platelet-specific antibodies. In this protocol, specific flow cytometric techniques for platelet analysis are recommended, based on a description of the current state of flow cytometric methodology. These recommendations are an attempt to promote the use of these new techniques, which are at present being broadly evaluated for diagnostic purposes. Furthermore, defining the still open questions, primarily related to technical details of the method, should help to promote multi-center evaluation of procedures, with the goal of eventually developing standard operating procedures as the basis of interlaboratory reproducibility in diagnostic testing.
Abstract:
The procurement of transportation services via large-scale combinatorial auctions involves several complex decisions whose outcomes strongly influence the performance of the tender process. This paper examines the shipper's task of selecting a subset of the submitted bids that efficiently trades off total procurement cost against expected carrier performance. To solve this bi-objective winner determination problem, we propose a Pareto-based greedy randomized adaptive search procedure (GRASP). As a post-optimizer we use a path relinking procedure hybridized with branch-and-bound. Several variants of this algorithm are evaluated on artificial test instances that comply with important real-world characteristics. The two best variants prove superior to a previously published Pareto-based evolutionary algorithm.
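The greedy randomized construction phase of a GRASP can be sketched as follows. The bid structure, the scalarized cost/performance score, and all parameter values are illustrative assumptions, not the paper's design (which is Pareto-based rather than scalarized):

```python
import random

def grasp_construct(bids, lanes, weight, alpha=0.3, seed=0):
    """One greedy randomized construction pass for a winner
    determination problem: repeatedly pick a bid from a restricted
    candidate list (RCL) until every lane is covered. Assumes every
    lane is covered by at least one bid.
    bids  : list of (covered_lanes frozenset, cost, performance)
    weight: scalarizes the objectives (cost minimized, performance
            maximized); alpha controls RCL greediness vs randomness."""
    rng = random.Random(seed)
    uncovered = set(lanes)
    chosen = []
    while uncovered:
        candidates = []
        for i, (cov, cost, perf) in enumerate(bids):
            if i in chosen or not (cov & uncovered):
                continue
            newly = len(cov & uncovered)
            # Lower score is better; dividing by newly covered lanes
            # favours bids that cover more of the remaining demand.
            score = (weight * cost - (1 - weight) * perf) / newly
            candidates.append((score, i))
        candidates.sort()
        lo, hi = candidates[0][0], candidates[-1][0]
        rcl = [i for s, i in candidates if s <= lo + alpha * (hi - lo)]
        pick = rng.choice(rcl)
        chosen.append(pick)
        uncovered -= bids[pick][0]
    return chosen

bids = [(frozenset({1, 2}), 10.0, 0.9),
        (frozenset({2, 3}), 8.0, 0.5),
        (frozenset({3}), 3.0, 0.8)]
winners = grasp_construct(bids, lanes={1, 2, 3}, weight=0.5)
print(winners)  # a bid subset covering every lane
```

In a full GRASP this construction is repeated many times, each solution is locally improved, and here the non-dominated solutions would be kept as the Pareto front before post-optimization by path relinking.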
Abstract:
Successful computer-supported distance education requires that its enabling technologies are accessible and usable anywhere. They should work seamlessly inside and outside the information superhighway, wherever the target learners are located, without obtruding on the learning activity. It has long been recognised that the usability of interactive computer systems is inversely related to the visibility of the implementing technologies. Reducing the visibility of technology is especially challenging in the area of online language learning systems, which require high levels of interactivity and communication along multiple dimensions such as speaking, listening, reading and writing. In this article, the authors review the concept of invisibility as it applies to the design of interactive technologies and appliances. They describe a specialised appliance matched to the requirements for distance second language learning, and report on a successful multi-phase evaluation process, including initial field testing at a Thai open university.
Abstract:
OBJECTIVE To present the anatomical and functional results of the inside-out technique in pediatric cholesteatoma surgery and to weigh functional preservation with good hearing results against radicality with a lower recurrence rate. METHODS Retrospective analysis and evaluation of the postoperative outcome in a consecutive series of 126 children (130 ears) operated on between 1992 and 2008. With the inside-out technique, the cholesteatoma is eradicated from the epitympanum toward the mastoid and, in a single-stage procedure, functional reconstruction of the middle ear is achieved by tympano-ossiculoplasty. RESULTS In 89.2% of all cases the ear was dry postoperatively. 80.9% of the ears reached a postoperative air-bone gap of 30 dB or less, the median air-conduction hearing threshold was 29 dB, and in 60.9% of all cases hearing was improved postoperatively. The recurrence rate was 16.2% over a mean postoperative follow-up of 8.5 years. Altogether, 48 ears (36.9%) underwent revision surgery. The complication rate was 3.1% and involved only minor complications. CONCLUSION The inside-out technique allows safe removal of cholesteatoma from the epitympanum toward the mastoid with single-stage reconstruction of the ossicular chain. For this reason we support our individualized approach, which allows creation of the smallest possible cavity for the size of the cholesteatoma. Our results confirm that the inside-out technique is effective in the treatment of pediatric cholesteatoma.
Abstract:
Introduction Commercial treatment planning systems employ a variety of dose calculation algorithms to plan and predict the dose distributions a patient receives during external beam radiation therapy. Traditionally, the Radiological Physics Center (RPC) has relied on measurements to assure that institutions participating in National Cancer Institute sponsored clinical trials administer radiation in doses that are clinically comparable to those of other participating institutions. To complement the effort of the RPC, an independent dose calculation tool needs to be developed that will enable a generic method to determine patient dose distributions in three dimensions and to perform retrospective analysis of radiation delivered to patients enrolled in past clinical trials. Methods A multi-source model representing output for Varian 6 MV and 10 MV photon beams was developed and evaluated. The Monte Carlo algorithm known as the Dose Planning Method (DPM) was used to perform the dose calculations, which were compared to measurements made in a water phantom and in anthropomorphic phantoms. Intensity modulated radiation therapy and stereotactic body radiation therapy techniques were used with the anthropomorphic phantoms. Finally, past patient treatment plans were selected, recalculated using DPM, and contrasted against a commercial dose calculation algorithm. Results The multi-source model was validated for the Varian 6 MV and 10 MV photon beams. The benchmark evaluations demonstrated the ability of the model to accurately calculate dose for both source models, and the patient calculations showed that the model was reproducible in determining dose under conditions similar to those of the benchmark tests. Conclusions A dose calculation tool that relies on a multi-source model approach and uses the DPM code to calculate dose was developed, validated, and benchmarked for the Varian 6 MV and 10 MV photon beams. Several patient dose distributions were contrasted against a commercial algorithm as a proof of principle for its application in monitoring clinical trial activity.
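The Monte Carlo principle behind codes such as DPM, sampling individual particle histories and tallying energy deposition, can be illustrated with a deliberately simplified 1D model (primary photons only, no scatter or electron transport; this is not DPM's physics):

```python
import math
import random

def mc_depth_dose(mu, depth, n_voxels, n_photons, seed=0):
    """Toy 1D Monte Carlo: photons enter a slab with attenuation
    coefficient mu (1/cm); each photon's free path is sampled from an
    exponential distribution and all its energy is tallied in the
    voxel of its first interaction."""
    rng = random.Random(seed)
    voxel = depth / n_voxels
    dose = [0] * n_voxels
    for _ in range(n_photons):
        # Inverse-transform sampling of the exponential free path.
        x = -math.log(1.0 - rng.random()) / mu
        if x < depth:
            dose[int(x / voxel)] += 1
    return dose

# Interaction counts thin out roughly exponentially with depth.
d = mc_depth_dose(mu=0.2, depth=10.0, n_voxels=10, n_photons=20000)
print(d[0] > d[-1])  # True
```

A real multi-source beam model adds energy spectra, off-axis fluence, and head-scatter sources on top of such particle transport.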
Abstract:
PURPOSE This study assessed whether a cycle of "routine" therapeutic drug monitoring (TDM) for imatinib dosage individualization, targeting an imatinib trough plasma concentration (Cmin) of 1,000 ng/ml (tolerance: 750-1,500 ng/ml), could improve clinical outcomes in chronic myelogenous leukemia (CML) patients, compared with TDM used only in case of problems ("rescue" TDM). METHODS The imatinib concentration monitoring evaluation was a multicenter randomized controlled trial including adult patients in chronic or accelerated phase CML who had been receiving imatinib for less than 5 years. Patients were allocated 1:1 to "routine TDM" or "rescue TDM." The primary endpoint was a combined outcome (failure- and toxicity-free survival with continuation on imatinib) over 1-year follow-up, analyzed by intention-to-treat (ISRCTN31181395). RESULTS Among 56 patients (55 evaluable), 14/27 (52%) receiving "routine TDM" remained event-free versus 16/28 (57%) "rescue TDM" controls (P = 0.69). In the "routine TDM" arm, dosage recommendations were correctly adopted in 14 patients (median Cmin: 895 ng/ml), who had fewer unfavorable events (28%) than the 13 patients not receiving the advised dosage (77%; P = 0.03; median Cmin: 648 ng/ml). CONCLUSIONS This first target concentration intervention trial could not formally demonstrate a benefit of "routine TDM," owing to the small number of patients and surprisingly limited prescriber adherence to the dosage recommendations. Favorable outcomes were, however, found in the patients who actually received target dosing. This study thus provides a first prospective indication that TDM is a useful tool to guide drug dosage and treatment-shift decisions, and its design and analysis provide an interesting paradigm for future randomized TDM trials of targeted anticancer agents.
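Target concentration intervention of this kind typically rests on approximately dose-proportional pharmacokinetics, so that Cmin scales roughly linearly with dose. The sketch below illustrates such a proportional adjustment rule; the trial's actual protocol, step size, and dose limits are not given in the abstract, so every parameter here is a hypothetical assumption:

```python
def adjust_dose(current_dose_mg, measured_cmin, target_cmin=1000.0,
                step_mg=100.0, min_mg=200.0, max_mg=800.0):
    """Proportional dose adjustment under a linear-pharmacokinetics
    assumption (Cmin scales with dose), rounded to a practical dosing
    step and capped to a plausible range. Illustrative only."""
    proposed = current_dose_mg * target_cmin / measured_cmin
    rounded = round(proposed / step_mg) * step_mg
    return min(max(rounded, min_mg), max_mg)

# A hypothetical patient on 400 mg with a measured Cmin of 648 ng/ml
# would be advised roughly 600 mg to approach the 1,000 ng/ml target.
print(adjust_dose(400, 648))  # 600.0
```

The trial's observation that adherent patients clustered near the target (median 895 ng/ml) while non-adherent patients did not (median 648 ng/ml) is exactly what such a rule is meant to correct.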