875 results for Direct-Search Methods


Relevance: 40.00%

Abstract:

Dose kernel convolution (DK) methods have been proposed to speed up absorbed dose calculations in molecular radionuclide therapy. Our aim was to evaluate the impact of tissue density heterogeneities (TDH) on dosimetry when using a DK method and to propose a simple density-correction method. METHODS: This study was conducted on 3 clinical cases: case 1, non-Hodgkin lymphoma treated with ¹³¹I-tositumomab; case 2, a neuroendocrine tumor treatment simulated with ¹⁷⁷Lu-peptides; and case 3, hepatocellular carcinoma treated with ⁹⁰Y-microspheres. Absorbed dose calculations were performed using a direct Monte Carlo approach accounting for TDH (3D-RD) and a DK approach (VoxelDose, or VD). For each individual voxel, the VD absorbed dose, D(VD), calculated assuming uniform density, was corrected for density, giving D(VDd). The average 3D-RD absorbed dose values, D(3DRD), were compared with D(VD) and D(VDd) using the relative difference Δ(VD/3DRD). At the voxel level, density-binned Δ(VD/3DRD) and Δ(VDd/3DRD) were plotted against density ρ and fitted with a linear regression. RESULTS: The D(VD) calculations showed good agreement with D(3DRD). Δ(VD/3DRD) was less than 3.5%, except for the tumor of case 1 (5.9%) and the renal cortex of case 2 (5.6%). At the voxel level, the Δ(VD/3DRD) range was 0%-14% for cases 1 and 2, and -3% to 7% for case 3. All 3 cases showed a linear relationship between voxel bin-averaged Δ(VD/3DRD) and density ρ: case 1 (Δ = -0.56ρ + 0.62, R² = 0.93), case 2 (Δ = -0.91ρ + 0.96, R² = 0.99), and case 3 (Δ = -0.69ρ + 0.72, R² = 0.91). The density correction improved the agreement of the DK method with the Monte Carlo approach (Δ(VDd/3DRD) < 1.1%), though to a lesser extent for the tumor of case 1 (3.1%). At the voxel level, the Δ(VDd/3DRD) range decreased for all 3 clinical cases (case 1, -1% to 4%; case 2, -0.5% to 1.5%; case 3, -1.5% to 2%). After correction, the linear relationship disappeared for cases 2 and 3; it persisted for case 1 (Δ = 0.41ρ - 0.38, R² = 0.88), although with a less pronounced slope. CONCLUSION: This study shows a small influence of TDH in the abdominal region for 3 representative clinical cases. A simple density-correction method was proposed and improved the agreement of the absorbed dose calculations obtained with our voxel S value implementation.
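
The density correction and the bin-averaged linear fits reported above can be sketched as follows. This is a minimal illustration with synthetic numbers: the simple multiplicative correction D(VDd) = D(VD)·ρ_water/ρ and the function names are assumptions for the sketch, not the paper's exact implementation.

```python
import numpy as np

RHO_WATER = 1.0  # g/cm^3, reference density assumed by the uniform-density kernel

def density_corrected_dose(d_vd, rho):
    """Hypothetical first-order correction: scale the uniform-density
    kernel dose by the ratio of reference to local density."""
    return d_vd * RHO_WATER / np.asarray(rho)

def fit_delta_vs_density(rho_bins, delta_binned):
    """Fit bin-averaged relative difference Delta against density rho,
    i.e. Delta = a*rho + b (cf. case 1: Delta = -0.56*rho + 0.62)."""
    a, b = np.polyfit(rho_bins, delta_binned, 1)
    return a, b

# Synthetic density bins whose Delta follows case 1's reported regression exactly.
rho = np.linspace(0.3, 1.5, 50)
delta = -0.56 * rho + 0.62
a, b = fit_delta_vs_density(rho, delta)
```

Recovering the slope and intercept from the binned points is all the regression step involves; the correction itself is applied voxel by voxel before re-averaging.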


Purpose: The purpose of our multidisciplinary study was to define a pragmatic and secure alternative to the creation of a national centralised medical record, one that could gather together the parts of a patient's medical record scattered across the different hospitals where the patient was hospitalised, without any risk of breaching confidentiality. Methods: We first analyse the reasons for the failure and the dangers of centralisation (i.e. the difficulty of defining a European patient identifier, of reaching a common standard for the contents of the medical record, and of ensuring data protection), and then propose an alternative that uses the existing available data, on the basis that setting up a safe though imperfect system may be better than continuing the quest for a mythical perfect information system that has still not been found after a search lasting two decades. Results: We describe the functioning of Medical Record Search Engines (MRSEs), which use pseudonymisation of patients' identities. An MRSE will be able to retrieve and provide, upon an MD's request, all the available information concerning a patient who has been hospitalised in different hospitals, without ever having access to the patient's identity. The drawback of this system is that the medical practitioner then has to read all of the information, create his or her own synthesis, and possibly discard extraneous data. Conclusions: Faced with the difficulties and risks of setting up a centralised medical record system, a system that gathers all of the available information concerning a patient could be of great interest. This low-cost, pragmatic alternative, which could be developed quickly, should be taken into consideration by health authorities.
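
The pseudonymisation that MRSEs rely on can be illustrated with a keyed hash. This is a minimal sketch under our own assumptions; the paper does not specify the mechanism, and a real deployment would need proper key management and regulatory review.

```python
import hmac
import hashlib

def pseudonymise(patient_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a patient identifier.

    A keyed hash (HMAC-SHA256) is deterministic, so the same patient
    always maps to the same pseudonym, enabling record linkage across
    hospitals, yet the identity cannot be recovered without the secret
    key held by the trusted party.
    """
    return hmac.new(secret_key, patient_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

key = b"secret-held-by-trusted-third-party"  # illustrative key, not a real secret
p1 = pseudonymise("FR-1234567890", key)      # hypothetical identifier
p2 = pseudonymise("FR-1234567890", key)
# p1 == p2: the MRSE can link the records without ever seeing the identity.
```

Using a keyed hash rather than a plain hash matters: without the key, an attacker cannot enumerate candidate identifiers and compare digests.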


OBJECTIVES: This is the first meta-analysis on the efficacy of composite resin restorations in anterior teeth. The objective of the present meta-analysis was to verify whether specific material classes, tooth conditioning methods and operative procedures influence the outcome of Class III and Class IV restorations. MATERIALS AND METHODS: The SCOPUS and PubMed databases were searched for clinical trials on anterior resin composites, without restricting the search by year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) a minimum of 20 restorations at the last recall; (3) report of the drop-out rate; (4) report of the operative technique and materials used in the trial; and (5) use of Ryge or modified Ryge evaluation criteria. For the statistical analysis, a linear mixed model with random effects was used to account for heterogeneity between the studies. p-values smaller than 0.05 were considered significant. RESULTS: Of the 84 clinical trials, 21 studies met the inclusion criteria: 14 for Class III restorations, 6 for Class IV restorations and 1 for closure of diastemata; the latter was included in the Class IV group. Twelve of the 21 studies started before 1991 and 18 before 2001. The estimated median overall success rate (without replacement) after 10 years was 95% for Class III composite resin restorations and 90% for Class IV restorations. The main reason for the replacement of Class IV restorations was bulk fracture, which occurred significantly more frequently with microfilled composites than with hybrid and macrofilled composites. Caries adjacent to restorations was infrequent in most studies and accounted for only about 2.5% of all replaced restorations after 10 years, irrespective of cavity class. Class III restorations with glass ionomer derivatives suffered significantly more loss of anatomical form than did fillings with other types of material. When the enamel was acid-etched and no bonding agent was applied, significantly more restorations showed marginal staining and detectable margins compared with enamel etching combined with enamel bonding or the total-etch technique; fillings with self-etching systems fell between these two groups on both outcome variables. Bevelling of the enamel was associated with significantly less deterioration of anatomical form than no bevelling, but not with less marginal staining or fewer detectable margins. The type of isolation (absolute/relative) had a statistically significant influence on marginal caries which, however, might be a random finding.


Search engine optimization (SEO) and marketing is a set of processes widely used on websites to improve search engine rankings, generate quality web traffic and increase ROI. Content is the most important part of any website, and CMS-based web development has become essential for most organizations and online businesses building their online systems and websites. Every online business using a CMS wants to attract users (customers) in order to make a profit and a return on investment. This thesis comprises a brief study of existing SEO methods, tools and techniques, and of how they can be implemented to optimize a content-based website. As its result, the study provides recommendations on how to use SEO methods, tools and techniques to optimize CMS-based websites for the major search engines. It compares the SEO features of popular CMSs such as Drupal, WordPress and Joomla, and examines how SEO implementation can be improved on these systems. Knowledge of search engine indexing and of how search engines work is essential for a successful SEO campaign. This work is intended as a complete guideline for web developers and SEO experts who want to optimize a CMS-based website for all major search engines.
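
As an illustration of the kind of on-page checks such a guideline covers, the sketch below inspects a page's title, meta description and headings with Python's standard-library HTML parser. The thresholds and the `OnPageSEOChecker` name are illustrative assumptions, not recommendations taken from the thesis.

```python
from html.parser import HTMLParser

class OnPageSEOChecker(HTMLParser):
    """Collect a few on-page elements commonly audited in SEO:
    <title>, <meta name="description">, and <h1> headings."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

checker = OnPageSEOChecker()
checker.feed("<html><head><title>Shop</title>"
             "<meta name='description' content='Buy things online.'>"
             "</head><body><h1>Welcome</h1></body></html>")

issues = []
if len(checker.title) < 10:          # illustrative threshold
    issues.append("title too short")
if not checker.description:
    issues.append("missing meta description")
if checker.h1_count != 1:
    issues.append("expected exactly one <h1>")
```

A CMS plugin typically automates exactly these checks per page; running them across a site is the mechanical core of an on-page SEO audit.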


Yandex is the dominant search engine in Russia, followed by the world leader Google. This study focuses on the performance differences between the two in search advertising in the context of tourism, by running two identical campaigns and measuring KPIs such as CPA (cost per action) on both. Search engine advertising is a new and fast-changing form of advertising, which should be studied frequently in order to keep up with the changes. The research was conducted as an experimental study in cooperation with a Finnish tourism company, and the data were gathered from the clickstream rather than from questionnaires, which is the method recommended by the literature. The results of the study suggest that Yandex.Direct performed better in the selected niche and that individual campaign planning for Yandex.Direct and Google AdWords is an important part of optimizing search advertising in Russia.
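
The CPA comparison at the heart of the experiment is simple arithmetic; a sketch with invented figures (the numbers below are illustrative, not the study's data):

```python
def cpa(cost: float, actions: int) -> float:
    """Cost per action: total ad spend divided by the number of conversions."""
    return cost / actions

# Hypothetical campaign figures for the two platforms.
yandex_cpa = cpa(cost=120.0, actions=30)   # 4.0 per action
google_cpa = cpa(cost=150.0, actions=25)   # 6.0 per action
better = "Yandex.Direct" if yandex_cpa < google_cpa else "Google AdWords"
```

Because CPA normalizes spend by outcome, it lets two campaigns with different budgets be compared directly, which is why the study uses it as a headline KPI.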


Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking, and there are some problems related to their efficacy. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: what is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easily the method can be applied correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method

In order to answer the main research question, the following supporting research questions were answered first: what kinds of interviewing and interrogation techniques exist and how could they be used in the intelligence interview context; what kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof; and what kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing studies and other papers on the topic. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process (AHP) was then conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of the different methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability: they were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings; however, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible, and there is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach; there are no quick gains if high accuracy and reliability are desired. Since most current lie detection studies are built around a scenario in which roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones still under development.
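
The AHP step can be sketched as follows: criteria are compared pairwise, and priority weights are derived from the comparison matrix, here via the common geometric-mean method. The matrix entries below are invented for illustration and are not the study's actual judgements.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Derive AHP priority weights from a pairwise comparison matrix
    using the geometric-mean (logarithmic least squares) method."""
    m = np.asarray(pairwise, dtype=float)
    geo = m.prod(axis=1) ** (1.0 / m.shape[0])  # row geometric means
    return geo / geo.sum()                       # normalize to sum to 1

# Hypothetical 3-criterion example (accuracy vs. ease of use vs. time):
# accuracy judged 3x as important as ease of use, 5x as important as time.
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 0.5, 1.0]]
w = ahp_priorities(A)  # weights sum to 1, accuracy ranks highest
```

With the criterion weights in hand, each candidate method is scored per criterion and the weighted sum gives the overall applicability ranking.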


Losartan potassium is a non-peptide antihypertensive agent that acts by specific blockade of angiotensin II receptors. This work proposed the validation and application of analytical methods for the quality control of losartan potassium 50 mg in the capsule pharmaceutical form, using direct and first-derivative UV spectrophotometry. Based on the spectrophotometric characteristics of losartan potassium, a signal at 205 nm in the zero-order spectrum and a signal at 234 nm in the first-derivative spectrum were suitable for quantification, and the results were used to compare the two instrumental techniques. The correlation coefficient (r) between response and losartan potassium concentration was 0.9999 in both cases, over the ranges 3.0-7.0 mg L⁻¹ for direct spectrophotometry and 6.0-14.0 mg L⁻¹ for first-derivative spectrophotometry in aqueous solution. The methods were applied to the quantification of losartan potassium in capsules obtained from local compounding pharmacies and proved to be efficient, easy to apply and low in cost. Furthermore, they require no polluting reagents and only economically viable equipment.
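
Quantification in both spectrophotometric methods rests on a linear calibration curve fitted over the stated concentration range and then inverted for the sample reading. A minimal sketch with synthetic absorbance values follows; the slope, intercept and readings are invented for illustration.

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares line signal = a*conc + b for a calibration curve."""
    a, b = np.polyfit(conc, signal, 1)
    return a, b

def quantify(signal, a, b):
    """Invert the calibration line to get concentration from a reading."""
    return (signal - b) / a

# Hypothetical calibration points in the 3.0-7.0 mg/L range (direct UV).
conc = np.array([3.0, 4.0, 5.0, 6.0, 7.0])   # mg/L
absorbance = 0.12 * conc + 0.005             # synthetic, perfectly linear
a, b = fit_calibration(conc, absorbance)
c_sample = quantify(0.605, a, b)             # ≈ 5.0 mg/L
```

The correlation coefficient reported in the abstract (r = 0.9999) is simply a measure of how tightly the calibration points hug this fitted line.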


Aiming to improve the diagnosis of canine leishmaniasis (CanL) in an endemic area of the northwest region of São Paulo State, Brazil, the efficacy of parasitological, immunological and molecular diagnostic methods was studied. Dogs with and without clinical signs of the disease, positive for Leishmania by direct parasite identification on lymph node smears and/or specific antibody detection by ELISA, were selected for the study. According to their clinical signs, 89 dogs attending the Veterinary Hospital of UNESP in Araçatuba (SP, Brazil) were divided into three groups: symptomatic (36%), oligosymptomatic (22%) and asymptomatic (22%). Twenty-six dogs from an area non-endemic for CanL were used as negative controls (20%). Fine-needle aspiration biopsies (FNA) of popliteal lymph nodes were collected and Diff-Quick®-stained for optical microscopy. Direct immunofluorescence, immunocytochemistry and parasite DNA amplification by PCR were also performed. After euthanasia, fragments of popliteal lymph nodes, spleen, bone marrow and liver were collected and processed for HE and immunohistochemistry. Parasite detection by both HE and immunohistochemistry was more effective in lymph nodes than in the other organs, and immunolabeling provided higher sensitivity for parasite detection in the tissues. In the symptomatic group, assay sensitivity was 75.61% for direct parasite search on Diff-Quick®-stained FNAs, 92.68% for direct immunofluorescence, 92.68% for immunocytochemistry and 100% for PCR; the corresponding values in the other clinical groups were 32, 60, 76 and 96% (oligosymptomatic), and 39.13, 73.91, 100 and 95.65% (asymptomatic). Results for the control animals from the CanL non-endemic area were all negative, indicating that the methods used were 100% specific. (C) 2006 Elsevier B.V. All rights reserved.
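
The reported figures follow the standard definitions of sensitivity and specificity. A small sketch follows; the true/false counts are reconstructed for illustration (e.g. 31 of 41 gives 75.61%) and are not taken from the paper.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly infected animals that the assay detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of uninfected controls that the assay correctly clears."""
    return tn / (tn + fp)

# Hypothetical counts consistent with the symptomatic group's 75.61%
# cytology sensitivity: 31 of 41 positives detected.
sens = sensitivity(tp=31, fn=10)
# All 26 non-endemic controls negative -> 100% specificity.
spec = specificity(tn=26, fp=0)
```

Note that sensitivity is computed within each clinical group against the reference standard (smear and/or ELISA positivity), while specificity comes entirely from the non-endemic control group.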


In this paper we report on a search for short-duration gravitational wave bursts in the frequency range 64 Hz to 1792 Hz associated with gamma-ray bursts (GRBs), using data from GEO 600 and one of the LIGO or Virgo detectors. We introduce the method of a linear search grid to analyze GRB events with large sky-localization uncertainties, for example the localizations provided by the Fermi Gamma-ray Burst Monitor (GBM). Coherent searches for gravitational waves (GWs) can be computationally intensive when the GRB sky position is not well localized, due to the corrections required for the difference in arrival time between detectors. Using a linear search grid we are able to reduce the computational cost of the analysis by a factor of O(10) for GBM events. Furthermore, we demonstrate that our analysis pipeline can improve upon the sky localization of GRBs detected by the GBM, if a high-frequency GW signal is observed in coincidence. We use the method of the linear grid in a search for GWs associated with 129 GRBs observed by satellite-based gamma-ray experiments between 2006 and 2011. The GRBs in our sample had not been previously analyzed for GW counterparts. A fraction of our GRB events are analyzed using data from GEO 600 taken while the detector was using squeezed-light states to improve its sensitivity; this is the first search for GWs using data from a squeezed-light interferometric observatory. We find no evidence for GW signals, either from any individual GRB in this sample or from the population as a whole. For each GRB we place lower bounds on the distance to the progenitor, under the assumption of a fixed GW emission energy of 10⁻² M⊙c², with a median exclusion distance of 0.8 Mpc for emission at 500 Hz and 0.3 Mpc at 1 kHz. The reduced computational cost associated with a linear search grid will enable rapid searches for GWs associated with Fermi GBM events once the advanced LIGO and Virgo detectors begin operation.


Objective. The general aim of this article is to describe the state of the art in biocompatibility testing for dental materials, and to present new strategies for improving operative dentistry techniques and the biocompatibility of dental materials as they interact with the dentin-pulp complex. Methods. The literature was reviewed with a focus on articles related to biocompatibility testing, the dentin-pulp complex, and new strategies and materials for operative dentistry. For this purpose, the PubMed database was searched and 118 articles published in English from 1939 to 2014 were reviewed. Data concerning the types of biological tests and the standardization of in vitro and in vivo protocols employed to evaluate the cytotoxicity and biocompatibility of dental materials were also obtained from the US Food and Drug Administration (FDA), the International Organization for Standardization (ISO) and the American National Standards Institute (ANSI). Results. While the search continues for feasible molecular strategies to direct the repair or regeneration of the structures that form the oral tissues, professionals must master the clinical therapies available at present. In turn, these techniques must be applied based on knowledge of the morphological and physiological characteristics of the tissues involved, as well as the physical, mechanical and biological properties of the biomaterials recommended for each specific situation. Thus, particularly within modern esthetic restorative dentistry, the routine use of minimally invasive operative techniques, together with dental materials whose excellent properties have been scientifically proven by clinical and laboratory studies, must be the rule for dentists. This professional and responsible attitude will certainly increase the likelihood of clinical success, benefiting patients and dentists alike. Significance. This article provides a general and critical view of the interactions between dental materials and the dentin-pulp complex, and establishes realistic possibilities and strategies that favor the biocompatibility of present and future products used in dentistry, which will certainly benefit clinicians and their patients. (C) 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.


Coagulation factor VIII (FVIII) concentrates are used in the treatment of patients with Hemophilia A. Human FVIII was purified directly from plasma using anion exchange chromatography followed by gel filtration. Three Q-Sepharose resins were tested: recovery of FVIII activity was 40% with Q-Sepharose XL, about 80% with Q-Sepharose Fast Flow and 70% with Q-Sepharose Big Beads. The vitamin K-dependent coagulation factors co-eluted with FVIII from the anion exchange columns. In the second step of purification, using Sepharose 6FF, 70% of FVIII activity was recovered free from vitamin K-dependent factors.


The Large Hadron Collider (LHC), located at the CERN laboratory in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the latest particle discovered, at the ATLAS and CMS experiments. Due to the small production cross-section of the Higgs boson, only substantial statistics offer the chance to study this particle's properties. To perform these searches it is desirable to limit the contamination of the signal signature by the numerous and varied background processes produced in pp collisions at the LHC. Much of this work concerns the study of multivariate methods which, compared with a standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top quark pair through a dileptonic final state (the ttH channel). The statistics collected up to 2012 are not sufficient to supply a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the increasing statistics that will be collected during the next LHC data taking.
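
To illustrate why a multivariate method can outperform rectangular cuts, the sketch below separates synthetic "signal" and "background" events with a Fisher linear discriminant, one of the simplest multivariate classifiers. The variables and numbers are invented; the thesis's actual multivariate analysis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-variable events: stand-ins for kinematic discriminating
# variables, with signal and background drawn from shifted Gaussians.
sig = rng.normal(loc=[1.0, 1.0], scale=0.8, size=(5000, 2))
bkg = rng.normal(loc=[-1.0, -1.0], scale=0.8, size=(5000, 2))

def fisher_direction(sig, bkg):
    """Fisher linear discriminant: w proportional to S_w^-1 (mu_sig - mu_bkg),
    the linear combination of variables that best separates the classes."""
    sw = np.cov(sig.T) + np.cov(bkg.T)       # within-class scatter
    return np.linalg.solve(sw, sig.mean(axis=0) - bkg.mean(axis=0))

w = fisher_direction(sig, bkg)
scores_sig = sig @ w
scores_bkg = bkg @ w
cut = 0.0  # single decision threshold on the combined discriminant

efficiency = (scores_sig > cut).mean()       # fraction of signal kept
bkg_rejection = (scores_bkg <= cut).mean()   # fraction of background removed
```

A single threshold on the combined discriminant uses the correlation between variables, whereas independent cuts on each variable cannot, which is the basic advantage multivariate selections bring to a low-cross-section channel like ttH.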