960 results for new methods


Relevance:

100.00%

Publisher:

Abstract:

In this Ph.D. project, original and innovative approaches for the quali-quantitative analysis of substances of abuse, as well as therapeutic agents with abuse potential and related compounds, were designed, developed, and validated for application to different fields such as forensic, clinical, and pharmaceutical analysis. All the parameters involved in the developed analytical workflows were properly and accurately optimised, from sample collection to sample pretreatment up to the instrumental analysis. Advanced dried blood microsampling technologies were developed, capable of bringing several advantages to the method as a whole, such as a significant reduction in solvent use, feasible storage and transportation conditions, and enhanced analyte stability. At the same time, the use of capillary blood increases subject compliance and overall method applicability when exploiting such innovative technologies. Both the biological and non-biological samples involved in this project were subjected to optimised pretreatment techniques developed ad hoc for each target analyte, also making use of advanced microextraction techniques. Finally, original and advanced instrumental analytical methods were developed based on high- and ultra-high-performance liquid chromatography (HPLC, UHPLC) coupled to different detection means (mainly mass spectrometry, but also electrochemical and spectrophotometric detection for screening purposes), and on attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR) for solid-state analysis. Each method was designed to obtain highly selective, sensitive, yet sustainable systems and was validated according to international guidelines. All the methods developed herein proved suitable for the analysis of the compounds under investigation and may be useful tools in medicinal chemistry, pharmaceutical analysis, clinical studies, and forensic investigations.

Relevance:

80.00%

Publisher:

Abstract:

Background. Case-control studies are widely used by epidemiologists to evaluate the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Indeed, logistic regression, the conventional method for analysing case-control data, does not directly account for changes in covariate values over time. By contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets, because of the over-sampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of risk sets for the analysis of case-control data remains to be elucidated, and to be investigated for time-dependent variables. Objective. The general objective is to propose and investigate new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods. I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple Weighted Cox model), in which different weights are assigned to cases and controls so as to reflect the proportions of cases and non-cases in the source population. The properties of the exposure-effect estimators were investigated by simulation.
Different aspects of exposure were generated (intensity, duration, cumulative exposure). The generated case-control data were then analysed with different versions of the Cox model, including the existing and the new risk-set definitions, as well as with conventional logistic regression for comparison. The different regression models were then applied to real case-control data on lung cancer. The estimated effects of various smoking variables obtained with the different methods were compared with each other, and with the simulation results. Results. The simulation results show that the estimates from the proposed new weighted Cox models, especially those from the Weighted Cox model, are much less biased than the estimates from existing Cox models that simply include or exclude future cases from each risk set. Moreover, the estimates from the Weighted Cox model were slightly, but systematically, less biased than those from logistic regression. The application to real data shows larger differences between the estimates from logistic regression and from the weighted Cox models for some time-dependent smoking variables. Conclusions. The results suggest that the proposed new weighted Cox model could be an interesting alternative to logistic regression for estimating the effects of time-dependent exposures in case-control studies.
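The weighted risk-set idea can be sketched in a few lines. The code below fits a one-covariate Cox model by Newton's method on a weighted partial likelihood; the data, censoring scheme, and weights are invented for illustration and do not reproduce the thesis's Weighted Cox estimators.

```python
# Minimal sketch of a weighted Cox partial likelihood with one binary
# covariate; weights enter both the risk-set sums and the per-case terms.
# All data below are simulated, not from the thesis.
import numpy as np

def weighted_cox_beta(time, event, x, w, iters=25):
    """Newton-Raphson for the weighted Cox partial likelihood (1 covariate)."""
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0
        for i in np.where(event == 1)[0]:
            at_risk = time >= time[i]                 # risk set at this event
            ew = w[at_risk] * np.exp(beta * x[at_risk])
            xbar = np.sum(ew * x[at_risk]) / np.sum(ew)
            x2bar = np.sum(ew * x[at_risk] ** 2) / np.sum(ew)
            score += w[i] * (x[i] - xbar)             # weighted score
            info += w[i] * (x2bar - xbar ** 2)        # weighted information
        beta += score / info
    return beta

rng = np.random.default_rng(0)
n = 200
x = rng.integers(0, 2, n)                             # binary exposure
t0 = rng.exponential(1.0 / np.exp(0.7 * x))           # true log-HR = 0.7
event = (t0 <= 1.0).astype(int)                       # administrative censoring
time = np.minimum(t0, 1.0)
w = np.where(event == 1, 1.0, 10.0)                   # up-weight non-cases
print(weighted_cox_beta(time, event, x, w))
```

A real analysis would derive the weights from the sampling fractions of cases and controls and use robust variance estimation; this sketch only shows where the weights enter the estimating equations.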

Relevance:

80.00%

Publisher:

Abstract:

In this paper, the classic oscillator design methods are reviewed, and their strengths and weaknesses are shown. Provisos for avoiding the misuse of the classic methods are also proposed. If the required provisos are satisfied, the solutions provided by the classic methods (linear approximation of oscillator start-up) will be correct. Verifying the provisos requires the NDF (Network Determinant Function). The use of the NDF, or of the most suitable RRT (Return Relation Transpose), which is directly related to the NDF, as a tool to analyze oscillators leads to a new oscillator design method. The RRT is the "true" loop gain of oscillators. The use of the new method is demonstrated with examples. Finally, a comparison of the NDF/RRT results with HB (Harmonic Balance) simulations and measurements of practical implementations proves the universal applicability of the new methods.

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

70.00%

Publisher:

Abstract:

Objective: To compare the cancer knowledge and skills of interns in 2001 who graduated from graduate medical program (GMP) courses with those from non-GMP courses, and to compare the cancer knowledge and skills of interns in 2001 with those who completed a similar survey in 1990. Design: Questionnaire survey of recently graduated interns in a random sample of Australian and New Zealand hospitals. The questionnaire was designed to allow direct comparison with the 1990 survey, and was guided by the Australian Cancer Society's Ideal Oncology Curriculum for Medical Schools. Results: 443 interns completed the survey (response rate, 62%; 42 were excluded, leaving 401 surveys for analysis: 118 from GMP courses and 283 from non-GMP courses). Interns from GMP courses felt more competent than those from non-GMP courses at discussing death (P = 0.02), breaking bad news (P = 0.04) and advising on smoking cessation (P = 0.02), but less competent at preparing a patient for a hazardous procedure (P = 0.02). More GMP interns would refer a breast cancer patient to a multidisciplinary clinic (83% versus 70%; P = 0.03). Knowledge about cancer risks and prognosis was significantly lower among GMP interns, but GMP interns rated their clinical skills, such as taking a Pap smear, higher than non-GMP interns did. The GMP and non-GMP groups did not differ in their exposure to cancer patients but, compared with 1990 interns, recent graduates had less exposure to patients with cancer. Conclusions: GMP curricula appear to have successfully introduced new course material and new methods of teaching, but have not always succeeded in producing doctors with better knowledge about cancer. Recent graduates have less exposure to cancer patients than those who trained 10 years ago.

Relevance:

70.00%

Publisher:

Abstract:

Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition, and the latter when in-depth studies of the characteristics, profiles and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low, less than 0.5% for example, or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users, owing to selection bias and a lack of information concerning the sampling frame. New methods have been developed that aim to overcome some of these difficulties, for example social network analysis, snowball sampling, capture-recapture techniques, the privileged access interviewer method and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use given, drawn from both the Brazilian and the international drug misuse literature.
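As a concrete illustration of one of the methods listed above, the simplest two-sample capture-recapture estimate (the Lincoln-Petersen estimator) can be computed directly from two overlapping data sources; the figures below are hypothetical.

```python
# Two-sample capture-recapture (Lincoln-Petersen) estimate of a hidden
# population's size. The counts are invented for illustration.
def lincoln_petersen(n1, n2, m):
    """n1: users known to the first source (e.g. treatment records),
    n2: users known to the second source (e.g. police records),
    m:  users appearing in both lists.
    Returns the estimated total population size N ~ n1 * n2 / m."""
    if m == 0:
        raise ValueError("no overlap between sources; estimate undefined")
    return n1 * n2 / m

# 500 users known to treatment services, 400 known to police, 80 in both:
print(lincoln_petersen(500, 400, 80))  # -> 2500.0
```

Real applications must also address the method's strong assumptions (a closed population and independence of the two sources), which is why the richer variants mentioned in the abstract exist.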

Relevance:

70.00%

Publisher:

Abstract:

Three different treatments were applied to several specimens of dolomitic and calcitic marble, deliberately stained with rust to mimic real situations (the stone specimens were exposed to the natural environment for about six months in contact with rusted iron). Thirty-six marble specimens, eighteen calcitic and eighteen dolomitic, were characterized before and after treatment and monitored throughout the cleaning tests. The specimens were characterized by SEM-EDS (Scanning Electron Microscopy coupled with Energy Dispersive Spectroscopy), XRD (X-Ray Diffraction), XRF (X-Ray Fluorescence), FTIR (Fourier Transform Infrared Spectroscopy) and color measurements. Microscopic and macroscopic analyses of the stone surface were also carried out, along with short- and long-term capillary absorption tests. A series of trials was conducted to determine which concentrations and contact times best suit this purpose, and to confirm what had been reported in the literature to date. We sought to develop new methods of treatment application, going beyond the usual practice of applying chemical treatments to stone substrates with a cellulose poultice, by resorting to agar, a gel already used in many other fields but new to this one, which has great applicability in the conservation of stone materials. After applying the best cleaning methodology, the specimens were characterized again in order to determine which treatment was more effective and less harmful, both for the operator and for the stone material. Very briefly, the conclusions were that for very intense staining with deep penetration into the stone, a 3.5% solution of SDT buffered with ammonium carbonate to a pH of around 7, applied with an agar support, would be indicated. For rust stains in their initial state, ammonium citrate at a concentration of 5%, buffered with ammonium to pH 7, could be applied more than once until satisfactory results appear.

Relevance:

70.00%

Publisher:

Abstract:

Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
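One widely used simple strategy for handling genotype uncertainty, regressing the trait on the expected allele dosage, can be sketched as follows. This illustrates the general problem setting only, not the exact maximum-likelihood mixture method proposed in the paper; all data below are simulated.

```python
# Dosage-based association test for a quantitative trait under genotype
# uncertainty: regress the trait on the expected allele count
# E[dosage] = 0*P(AA) + 1*P(Aa) + 2*P(aa). Data are simulated.
import numpy as np

def dosage_test(post, y):
    """post: (n, 3) posterior genotype probabilities per individual,
    y: quantitative trait values.
    Returns (beta, z) from simple linear regression of y on dosage."""
    d = post @ np.array([0.0, 1.0, 2.0])       # expected dosage per subject
    d_c = d - d.mean()
    y_c = y - y.mean()
    beta = np.dot(d_c, y_c) / np.dot(d_c, d_c)
    resid = y_c - beta * d_c
    se = np.sqrt(np.sum(resid ** 2) / (len(y) - 2) / np.dot(d_c, d_c))
    return beta, beta / se

rng = np.random.default_rng(1)
n = 1000
g = rng.integers(0, 3, n)                      # true (unobserved) genotypes
post = np.full((n, 3), 0.05)
post[np.arange(n), g] = 0.90                   # imperfectly imputed genotypes
y = 0.3 * g + rng.normal(size=n)               # trait with a real effect
beta, z = dosage_test(post, y)
print(round(beta, 2), round(z, 1))
```

Replacing the uncertain dosage with a point estimate in this way is exactly the kind of shortcut whose false-positive-rate behaviour the paper scrutinises, which motivates its exact maximum-likelihood alternative.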

Relevance:

70.00%

Publisher:

Abstract:

Flow cytometry (FCM) is emerging as an important tool in environmental microbiology. Although flow cytometry applications have to date largely been restricted to certain specialized fields of microbiology, such as the bacterial cell cycle and marine phytoplankton communities, technical advances in instrumentation and methodology are leading to its increased popularity and extending its range of applications. Here we focus on a number of recent flow cytometry developments important for addressing questions in environmental microbiology. These include (i) the study of microbial physiology under environmentally relevant conditions, (ii) new methods to identify active microbial populations and to isolate previously uncultured microorganisms, and (iii) the development of high-throughput autofluorescence bioreporter assays.

Relevance:

70.00%

Publisher:

Abstract:

Local conditions in the past often limited opportunities for scholarly exchange, but these limits are now gone and the global workplace has replaced them. It is important to react to these changes: every academic department must now adopt new methods and rethink its processes. Another driver of change is the intense national and international debate about open access to scholarly knowledge. The Open Access Initiative shows that a change is taking place in the communication process. This change is also important for service departments within research institutions. Libraries, computer centers and related units have to ask themselves how to react appropriately to the new conditions: which services must be changed or redeveloped, and in what quality and quantity should they be offered? This article focuses on changes in the scholarly publication process. It describes both technological changes and the changes needed in people's attitudes.

Relevance:

70.00%

Publisher:

Abstract:

There has been relatively little change over recent decades in the methods used in research on self-reported delinquency. Face-to-face interviews and self-administered questionnaires in the classroom are still the predominant alternatives envisaged. Recent computer technology, the Internet, and the increasing availability of computer equipment and Internet access in schools have brought new methods into the picture. In the autumn of 2004, a controlled experiment was conducted with 1,203 students in Lausanne (Switzerland), in which "paper-and-pencil" questionnaires were compared with computer-assisted interviews over the Internet. The experiment included a test of two different definitions of the (same) reference period. After the introductory question ("Did you ever..."), students were asked how many times they had done it (or experienced it), if ever, "over the last 12 months" or "since the October 2003 vacation". Few significant differences were found between the results obtained by the two methods, or for the two definitions of the reference period, in the answers concerning victimisation, self-reported delinquency, drug use, and failure to respond (missing data). Students were found to be more motivated to respond through the Internet and to take less time to fill out the questionnaire, and they were apparently more confident of privacy, while school principals were less reluctant to allow classes to be interviewed through the Internet. The Internet method also involves considerable cost reductions, which is a critical advantage if self-reported delinquency surveys are to become a routinely applied evaluation method, particularly in countries with limited resources. On balance, the Internet may be instrumental in making research on self-reported delinquency far more feasible in situations where limited resources have so far prevented its implementation.

Relevance:

70.00%

Publisher:

Abstract:

Although prosthetic joint infection (PJI) is a rare event after arthroplasty, it represents a significant complication that is associated with high morbidity, the need for complex treatment, and substantial healthcare costs. An accurate and rapid diagnosis of PJI is crucial for treatment success. Current diagnostic methods in PJI are insufficient, with 10-30% false-negative cultures. Consequently, there is a need for research and development into new methods aimed at improving diagnostic accuracy and speed of detection. In this article, we review available conventional diagnostic methods for the diagnosis of PJI (laboratory markers, histopathology, synovial fluid and periprosthetic tissue cultures), new diagnostic methods (sonication of implants, specific and multiplex PCR, mass spectrometry) and innovative techniques under development (new laboratory markers, microcalorimetry, electrical methods, reverse transcription [RT]-PCR, fluorescence in situ hybridization [FISH], biofilm microscopy, microarray identification, and serological tests). The results of highly sensitive diagnostic techniques with unknown specificity should be interpreted with caution. An organism identified by a new method may represent a real pathogen that was unrecognized by conventional diagnostic methods, or contamination during specimen sampling, transportation, or processing. For accurate interpretation, additional studies are needed that evaluate the long-term outcome (usually >2 years) with or without antimicrobial treatment. It is expected that new rapid, accurate, and fully automatic diagnostic tests will be developed soon.

Relevance:

70.00%

Publisher:

Abstract:

New methods and devices for pursuing performance enhancement through altitude training were developed in Scandinavia and the USA in the early 1990s. At present, several forms of hypoxic training and/or altitude exposure exist: traditional 'live high-train high' (LHTH), contemporary 'live high-train low' (LHTL), intermittent hypoxic exposure during rest (IHE) and intermittent hypoxic exposure during training sessions (IHT). Although substantial differences exist between these methods of hypoxic training and/or exposure, all have the same goal: to induce an improvement in athletic performance at sea level. They are also used in preparation for competition at altitude and/or for the acclimatization of mountaineers. The underlying mechanisms behind the effects of hypoxic training are widely debated. Although the popular view is that altitude training may lead to an increase in haematological capacity, this may not be the main, or the only, factor involved in the improvement of performance. Other central (such as ventilatory, haemodynamic or neural adaptation) or peripheral (such as muscle buffering capacity or economy) factors also play an important role. LHTL has been shown to be an efficient method. The optimal altitude for living high has been defined as 2200-2500 m to provide an optimal erythropoietic effect, and up to 3100 m for non-haematological parameters. The optimal duration at altitude appears to be 4 weeks for inducing accelerated erythropoiesis, whereas <3 weeks (i.e. 18 days) is long enough for beneficial changes in economy, muscle buffering capacity, the hypoxic ventilatory response or Na(+)/K(+)-ATPase activity. One critical point is the daily dose of altitude. A natural altitude of 2500 m for 20-22 h/day (in fact, travelling down to the valley only for training) appears sufficient to increase erythropoiesis and improve sea-level performance. 'Longer is better' as regards haematological changes, since additional benefits have been shown as hypoxic exposure increases beyond 16 h/day. The minimum daily dose for stimulating erythropoiesis seems to be 12 h/day. For non-haematological changes, a much shorter duration of exposure seems feasible. Athletes could take advantage of IHT, which seems more beneficial than IHE for performance enhancement. The intensity of hypoxic exercise might play a role in adaptations at the molecular level in skeletal muscle tissue. There is clear evidence that intense exercise at high altitude stimulates muscle adaptations to a greater extent for both aerobic and anaerobic exercise and limits the decrease in power. So although IHT induces no increase in VO(2max) owing to the low 'altitude dose', improvement in athletic performance is likely with high-intensity exercise (i.e. above the ventilatory threshold) owing to an increase in mitochondrial efficiency and pH/lactate regulation. We propose a new combination of hypoxic methods (which we suggest naming Living High-Training Low and High, interspersed; LHTLHi) combining LHTL (five nights at 3000 m and two nights at sea level) with training at sea level except for a few (2-3 per week) IHT sessions of supra-threshold training. This review also provides a rationale for combining the different hypoxic methods and suggests advances in both their implementation and their periodization during the yearly training programme of athletes competing in endurance, glycolytic or intermittent sports.
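The "daily dose of altitude" discussed above can be made concrete with a simple calculation. The sketch below uses the common kilometre-hours convention (altitude in km x hours of exposure), which is one way of quantifying hypoxic dose rather than a formula from this review; the altitude and hours figures are taken from the text.

```python
# Illustrative arithmetic for weekly hypoxic dose in kilometre-hours
# (km.h), a common convention; not a formula from this review.
def weekly_dose_km_h(altitude_m, hours_per_day, days_per_week=7):
    """Weekly hypoxic dose in km.h for a constant-altitude protocol."""
    return altitude_m / 1000.0 * hours_per_day * days_per_week

# LHTL at a natural altitude of 2500 m for ~21 h/day (text: 20-22 h/day):
print(weekly_dose_km_h(2500, 21))   # -> 367.5 km.h per week
# Same altitude at the suggested minimum erythropoietic dose of 12 h/day:
print(weekly_dose_km_h(2500, 12))   # -> 210.0 km.h per week
```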

Relevance:

70.00%

Publisher:

Abstract:

Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze such situations and to give recommendations. Among the numerous MCDM methods, two large families are the multi-attribute utility theory based methods and the outranking methods. Traditionally, both families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in this type of situation, where exact values are not available. Different variants of SMAA allow handling all types of MCDM problems. They support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI to sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods has been developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of the SMAA methodology, including the definition of a unified framework.
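The core simulation idea behind SMAA can be sketched compactly: draw criterion weights from an uninformed (uniform) distribution on the weight simplex and count how often each alternative comes out best, giving its rank-1 acceptability index. The alternatives and utility values below are invented for illustration, and the sketch omits the richer features of the SMAA variants described in the thesis.

```python
# Monte Carlo sketch of a SMAA-style rank-1 acceptability index with
# uniformly distributed weights. Alternatives and utilities are invented.
import numpy as np

def rank1_acceptability(values, n_sim=20000, seed=0):
    """values: (alternatives, criteria) matrix of partial utilities in [0, 1].
    Returns, per alternative, the share of weight draws for which a linear
    (additive) utility ranks it first."""
    rng = np.random.default_rng(seed)
    m, k = values.shape
    w = rng.dirichlet(np.ones(k), size=n_sim)     # uniform on the simplex
    winners = np.argmax(values @ w.T, axis=0)     # best alternative per draw
    return np.bincount(winners, minlength=m) / n_sim

vals = np.array([[0.9, 0.1],    # A: strong on criterion 1 only
                 [0.6, 0.6],    # B: balanced
                 [0.1, 0.9]])   # C: strong on criterion 2 only
print(rank1_acceptability(vals))
```

Real SMAA variants also simulate uncertainty in the criteria measurements themselves and report further indices (central weights, confidence factors); this sketch shows only the weight-sampling core.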

Relevance:

70.00%

Publisher:

Abstract:

Etiological diagnosis of community-acquired pneumonia in adult patients using rapid microbiological methods. Background. Pneumonia is a serious illness that affects about 60,000 adults in Finland each year. Although its treatment has improved, it is still associated with a significant mortality of 6-15%. Identifying the causative microbes of lower respiratory tract infections also remains challenging. Aims. The aim of this work was to study the etiology of pneumonia in adult patients treated at Turku University Hospital and to assess the usefulness of new rapid microbiological methods in identifying the causative agent. Material. The material for Studies I and III consisted of 384 pneumonia patients treated on the infectious diseases ward of Turku University Hospital. In Study I, the causative microbes of pneumonia were investigated using, in addition to conventional methods, rapid methods based on antigen detection and PCR techniques. Study II comprised a subgroup of 231 patients, whose pharyngeal secretion samples were examined for the presence of rhinoviruses and enteroviruses. In Study III, the patients' plasma C-reactive protein (CRP) concentration was measured during the first five days of hospitalization. Extensive statistical analyses were used to evaluate the usefulness of CRP in assessing disease severity and in predicting the development of complications. In Study IV, the expression of neutrophil surface receptors was determined from samples taken from 68 pneumonia patients on admission. Study V analysed the laboratory results of bronchoalveolar lavage (BAL) samples taken from pneumonia patients on internal medicine wards in 1996-2000. Results. A causative agent was found in 209 patients, and a total of 230 causative microbes were identified. Of these, 135 (58.7%) were detected by antigen detection or PCR methods. The majority, 95 (70.4%), were detected by these rapid methods only.
A respiratory virus was detected by antigen detection in 11.1% of the pneumonia patients. Respiratory viruses were most frequent in patients with severe pneumonia (20.3%). In the subgroup of 231 pneumonia patients, a picornavirus was detected by PCR in 19 (8.2%) patients. Overall, a respiratory virus was found in 47 (20%) patients in this group; 17 of these (36%) had a concomitant bacterial infection. On admission, CRP levels were significantly higher in patients with severe pneumonia (PSI classes III-V) than in those with mild pneumonia (PSI classes I-II) (p < 0.001). A CRP level above 100 mg/l four days after admission predicted a complication of pneumonia or a poor response to treatment. Neutrophil complement receptor expression was significantly higher in patients with pneumococcal pneumonia than in those with influenza pneumonia. Only one of 71 BAL samples (1.3%) showed diagnostic bacterial growth in quantitative culture. Even with the new methods, the causative agent of pneumonia was found in only 9.8% of the BAL samples. Conclusions. With the new antigen-detection and PCR methods, the etiology of pneumonia can be established rapidly. Moreover, using these methods, the causative microbe was found in a considerably larger proportion of patients than with conventional methods alone. The usefulness of the rapid methods varied with disease severity. Respiratory viruses were found remarkably often in pneumonia patients, and the clinical picture in these patients was often severe. A high CRP level on admission can be used as an additional tool in assessing the severity of pneumonia. CRP is particularly useful in assessing the response to treatment and the risk of developing complications.
The study of neutrophil complement receptor expression appears to be a promising rapid method for differentiating bacterial from viral infections. In patients receiving antimicrobial therapy, the findings of BAL examinations were scanty and only rarely influenced treatment.