929 results for Pattern-search methods


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: To identify patients with spontaneous subarachnoid hemorrhage for whom CT angiography alone can exclude ruptured aneurysms. METHODS: A retrospective observational review was carried out of all consecutive patients with non-traumatic subarachnoid hemorrhage who underwent both CT angiography and catheter angiography to exclude an aneurysm. CT angiography-negative cases (no aneurysm) were classified according to their CT hemorrhage pattern as "aneurysmal", "perimesencephalic" or "no-hemorrhage". RESULTS: Two hundred and forty-one patients were included. CT angiography detected aneurysms with a sensitivity of 96.4% and a specificity of 96.0%. Of the 78 CT angiography-negative cases, all 35 with a perimesencephalic or no-hemorrhage pattern also had negative catheter angiography findings. CONCLUSIONS: CT angiography alone is sufficient to exclude a ruptured aneurysm when either a perimesencephalic hemorrhage or a no-hemorrhage pattern is identified on CT within a week of symptom onset.
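
The reported accuracy figures follow directly from confusion-matrix counts. As a minimal illustration, here is a sketch in Python; the cell counts below are hypothetical (the abstract does not give the full confusion matrix), merely chosen to be consistent with 241 patients and the reported 96.4%/96.0%:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return sensitivity, specificity

# Hypothetical cell counts (160 + 6 + 72 + 3 = 241 patients).
sens, spec = sensitivity_specificity(tp=160, fn=6, tn=72, fp=3)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # 96.4%, 96.0%
```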

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To identify the disease-causing mutation in three generations of a Swiss family with pattern dystrophy and high intrafamilial variability of phenotype, and to assess the effect of intravitreal ranibizumab injections in the treatment of subfoveal choroidal neovascularization associated with pattern dystrophy in one patient. METHODS: Affected family members were ascertained for phenotypic and genotypic characterization. Ophthalmic evaluations included fundus photography, autofluorescence imaging, optical coherence tomography, and International Society for Clinical Electrophysiology of Vision standard full-field electroretinography. When possible, family members underwent genetic testing. The proband presented with choroidal neovascularization and received intravitreal injections as needed according to visual acuity and optical coherence tomography. RESULTS: The proband had a multifocal-type pattern dystrophy, and the choroidal neovascularization regressed after four intravitreal injections. Visual acuity improved from 0.8 to 1.0, and optical coherence tomography showed complete anatomical restoration. A butterfly-shaped pattern was observed in a cousin, whereas a fundus pulverulentus pattern was seen in a second cousin. An aunt had a multifocal atrophic appearance simulating the geographic atrophy of age-related macular degeneration. The Y141C mutation was identified in the peripherin/RDS gene and segregated with the disease in the family. CONCLUSION: This is the first report of marked intrafamilial variation of pattern dystrophy caused by the peripherin/RDS Y141C mutation. Intravitreal ranibizumab injections might be a valuable treatment for the associated subfoveal choroidal neovascularization.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Ghrelin stimulates GH release and causes weight gain through increased food intake and reduced fat utilization. Ghrelin levels have been shown to rise in the preprandial period and decrease shortly after meal consumption, suggesting a role as a possible meal initiator. However, ghrelin secretion in fasting subjects has not yet been studied in detail. DESIGN: 24-h ghrelin profiles were studied in six healthy volunteers (three females, three males; 25.5 years; body mass index 22.8 kg/m2) and compared with GH, insulin and glucose levels. METHODS: Blood samples were taken every 20 min during a 24-h fasting period and total ghrelin levels were measured by RIA using a polyclonal rabbit antibody. The circadian pattern of ghrelin secretion and its pulsatility (Cluster analysis) were evaluated. RESULTS: An increase and spontaneous decrease in ghrelin were seen at the time points of customary meals. Ghrelin was secreted in a pulsatile manner, with approximately 8 peaks per 24 h. An overall decrease in ghrelin levels was observed during the study period. There was no correlation of ghrelin with GH, insulin or blood glucose levels. CONCLUSIONS: This pilot study indicates that fasting ghrelin profiles display a circadian pattern similar to that described in people eating three times per day. In a fasting condition, GH, insulin and glucose do not appear to be involved in ghrelin regulation. In addition, we found that ghrelin is secreted in a pulsatile pattern. The variation in ghrelin independently of meals in fasting subjects supports previous observations that the brain is primarily involved in the regulation of meal initiation.
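
The pulsatility analysis named here is the Cluster algorithm of Veldhuis and Johnson; as a rough, illustrative stand-in, pulse counting on a 24-h profile sampled every 20 min can be sketched with generic peak detection (synthetic data, not the actual Cluster method):

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic 24-h ghrelin profile: 73 samples at 20-min intervals.
rng = np.random.default_rng(0)
t = np.arange(0, 24 * 60 + 1, 20)               # minutes
baseline = 800 - 3 * (t / 60)                   # overall daily decline
pulses = 150 * np.maximum(0, np.sin(2 * np.pi * t / 180)) ** 4
ghrelin = baseline + pulses + rng.normal(0, 20, t.size)

# Count secretory peaks: require minimum prominence and >=2 h spacing.
peaks, _ = find_peaks(ghrelin, prominence=60, distance=6)
print(f"{peaks.size} peaks detected over 24 h")
```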

Relevance:

30.00%

Publisher:

Abstract:

AIMS: Surgical ablation procedures for treating atrial fibrillation have been shown to be highly successful. However, the ideal ablation pattern remains to be determined. This article reports a systematic study of the effectiveness of different ablation line patterns. METHODS AND RESULTS: The study was performed in a biophysical model of human atria by combining basic lines: (i) in the right atrium: an isthmus line, a line between the venae cavae, and an appendage line; and (ii) in the left atrium: several versions of pulmonary vein isolation, connection of the pulmonary veins, an isthmus line, and an appendage line. Success rates and the presence of residual atrial flutter were documented. Basic patterns yielded conversion rates of only 10-25% and 10-55% in the right and left atria, respectively. The best result for pulmonary vein isolation was obtained when a single closed line encompassed all veins (55%). Combining lines within the right or left atrium alone led to success rates of 65% and 80%, respectively. Higher rates, up to 90-100%, could be obtained when right and left lines were combined. The inclusion of a left isthmus line was found to be essential for avoiding uncommon left atrial flutter. CONCLUSION: Several of the patterns studied achieved a high conversion rate while using fewer lines than the Maze III procedure. The biophysical atrial model thus proves effective in the search for promising alternative ablation strategies.
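
The systematic evaluation described here amounts to enumerating combinations of basic lines and measuring conversion rates in the simulator. A schematic sketch of that outer loop, with a random stub standing in for the biophysical atrial model (the stub, line names, and rates are illustrative assumptions):

```python
import random
from itertools import chain, combinations

RIGHT_LINES = ["RA isthmus", "intercaval", "RA appendage"]
LEFT_LINES = ["PV isolation", "PV connection", "LA isthmus", "LA appendage"]

def powerset(items):
    return chain.from_iterable(combinations(items, k)
                               for k in range(len(items) + 1))

def simulate(pattern):
    """Hypothetical stub for one run of the biophysical atrial model:
    True if the fibrillation episode converts under this line set.
    Random here so the script runs; a real simulator would replace it."""
    return random.random() < min(1.0, 0.1 + 0.12 * len(pattern))

def conversion_rate(pattern, n_episodes=20):
    return sum(simulate(pattern) for _ in range(n_episodes)) / n_episodes

# Exhaustively evaluate every combination of right- and left-atrial lines.
results = {}
for right in powerset(RIGHT_LINES):
    for left in powerset(LEFT_LINES):
        pattern = right + left
        if pattern:
            results[pattern] = conversion_rate(pattern)

best = max(results, key=results.get)
print(f"best pattern: {best} ({results[best]:.0%} conversion)")
```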

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a validation study of statistical unsupervised brain tissue classification techniques for magnetic resonance (MR) images. Several image models, assuming different hypotheses regarding the intensity distribution model, the spatial model, and the number of classes, are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different levels of noise and intensity nonuniformity are added to simulate real imaging conditions. No enhancement of image quality is applied either before or during the classification process, so that the accuracy of the methods and their robustness against image artifacts can be tested. Classification is also performed on real data, where a quantitative validation compares the methods' results with a ground truth estimated from manual segmentations by experts. The validity of the various classification methods, both in labeling the image and in estimating tissue volumes, is assessed with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that consider only pure Gaussian classes. Finally, we show that results on simulated data can also be extended to real data.
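
A minimal example of the simplest model family assessed here, intensity-only classification with a three-class Gaussian mixture and no spatial prior (scikit-learn, with synthetic intensities standing in for CSF, grey and white matter):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic voxel intensities for three tissue classes.
rng = np.random.default_rng(1)
intensities = np.concatenate([
    rng.normal(40, 8, 5000),     # CSF-like
    rng.normal(100, 10, 8000),   # grey-matter-like
    rng.normal(150, 9, 7000),    # white-matter-like
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, covariance_type="full",
                      random_state=0)
labels = gmm.fit_predict(intensities)

# Order classes by mean intensity so label 0 is the darkest tissue.
order = np.argsort(gmm.means_.ravel())
print("estimated class means:", gmm.means_.ravel()[order])
```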

Relevance:

30.00%

Publisher:

Abstract:

Disasters are often perceived as fast, random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Since each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, such analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, unlike exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard itself. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru, based on satellite imagery and ground data collection, revealed rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and generated data are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. Characterizing risk at the global level and identifying the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. It is clear, however, that the level of development, and moreover unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
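
The risk model sketched in this kind of work is commonly written multiplicatively, risk = hazard × exposure × vulnerability. An illustrative gridded version (layers, weights and distributions are assumptions for the sketch, not the thesis's actual model):

```python
import numpy as np

# Illustrative per-cell layers for one hazard type (e.g., floods):
# hazard frequency, exposed population, composite vulnerability proxy.
rng = np.random.default_rng(2)
hazard = rng.gamma(2.0, 0.5, size=(100, 100))           # events per year
exposure = rng.lognormal(6.0, 1.5, size=(100, 100))     # people per cell
vulnerability = rng.uniform(0.1, 1.0, size=(100, 100))  # 0..1 proxy

# Multiplicative risk index; the exponents weight each component and
# would normally be calibrated against recorded losses.
alpha, beta, gamma = 1.0, 1.0, 0.75
risk = hazard**alpha * exposure**beta * vulnerability**gamma

print("total expected risk:", risk.sum())
```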

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on the entire body of relevant research evidence. This can rarely be achieved, however, because a considerable proportion of research findings are never published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches that modify effect sizes to adjust the pooled estimate when publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows: (1) to systematically review methodological articles that focus on non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses; (2) to appraise the strengths and weaknesses of these methods, the resources they require, and the conditions under which they can be used, based on the findings of the included studies. We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form has been developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings), will serve as a basis for the development of future policies and guidelines on the assessment and handling of publication bias in meta-analyses.
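
Among the formal statistical tests such a review would cover, a classic example is Egger's regression test for funnel-plot asymmetry. A compact, simplified sketch on synthetic effect sizes (real practice would use a dedicated meta-analysis package):

```python
import numpy as np
from scipy import stats

# Synthetic effect sizes (log odds ratios) and standard errors, k studies.
effects = np.array([0.42, 0.51, 0.30, 0.65, 0.22, 0.80, 0.15, 0.70])
se = np.array([0.10, 0.15, 0.12, 0.25, 0.08, 0.30, 0.07, 0.28])

# Egger's test: regress the standardized effect (z) on precision (1/se);
# an intercept far from zero suggests small-study effects/publication bias.
precision = 1.0 / se
z = effects / se
slope, intercept, _, _, _ = stats.linregress(precision, z)

# t-test for the intercept of a simple linear regression.
n = len(effects)
resid = z - (intercept + slope * precision)
s2 = (resid ** 2).sum() / (n - 2)
sxx = ((precision - precision.mean()) ** 2).sum()
se_int = np.sqrt(s2 * (1 / n + precision.mean() ** 2 / sxx))
p_int = 2 * stats.t.sf(abs(intercept / se_int), df=n - 2)
print(f"Egger intercept = {intercept:.2f}, p = {p_int:.3f}")
```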

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Classical disease phenotypes are mainly based on descriptions of symptoms and the hypothesis that a given pattern of symptoms provides a diagnosis. With refined technologies there is growing evidence that disease expression in patients is much more diverse, and subtypes need to be defined to allow better targeted treatment. One of the aims of the Mechanisms of the Development of Allergy project (MeDALL, FP7) is to re-define the classical phenotypes of IgE-associated allergic diseases from birth to adolescence, by consensus among experts using a systematic review of the literature, and to identify possible gaps in research for new disease markers. This paper describes the methods to be used for the systematic review of the classical IgE-associated phenotypes, applicable in general to other systematic reviews that also address phenotype definitions based on evidence. METHODS/DESIGN: Eligible papers were identified by PubMed search (complete database through April 2011). This search yielded 12,043 citations. The review includes intervention studies (randomized and clinical controlled trials) and observational studies (cohort studies including birth cohorts, case-control studies) as well as case series. Systematic and non-systematic reviews, guidelines, position papers and editorials are not excluded but are dealt with separately. Two independent reviewers conducted consecutive title and abstract filtering scans in parallel. For publications where the title and abstract fulfilled the inclusion criteria, the full text was assessed. In the final step, two independent reviewers abstracted data using a pre-designed data extraction form, with disagreements resolved by discussion among investigators. DISCUSSION: The systematic review protocol described here makes it possible to generate broad, multi-phenotype reviews and consensus phenotype definitions. The in-depth analysis of the existing literature on the classification of IgE-associated allergic diseases through such a systematic review will (1) provide relevant information on the current epidemiologic definitions of allergic diseases, (2) address heterogeneity and interrelationships, and (3) identify gaps in knowledge.
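
The dual independent screening step described in the protocol generalizes well beyond this review; a minimal sketch of the bookkeeping, where the reviewer functions are placeholders for human decisions:

```python
def screen(records, reviewer_a, reviewer_b):
    """Dual independent screening: keep records both reviewers include,
    flag disagreements for resolution by discussion."""
    included, conflicts = [], []
    for rec in records:
        a, b = reviewer_a(rec), reviewer_b(rec)
        if a and b:
            included.append(rec)
        elif a != b:
            conflicts.append(rec)
    return included, conflicts

# Toy stand-ins for the title/abstract decisions of two reviewers.
records = ["paper on asthma phenotypes", "editorial", "birth cohort study"]
rev_a = lambda r: "editorial" not in r
rev_b = lambda r: "phenotype" in r
included, conflicts = screen(records, rev_a, rev_b)
print(included, conflicts)
```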

Relevance:

30.00%

Publisher:

Abstract:

Osteoporosis is a systemic bone disease characterized by a generalized reduction of bone mass. It is the main cause of fractures in elderly women. Bone densitometry of the lumbar spine and hip is used to detect osteoporosis in its early stages. Several studies have observed a correlation between the bone mineral density (BMD) of the jaw and that of the lumbar spine and/or hip. Other studies evaluate findings in orthopantomograms and periapical X-rays, correlating them with the early diagnosis of osteoporosis and highlighting the role of the dentist in the early diagnosis of this disease. Materials and methods: A search was carried out in the Medline-PubMed database to identify articles dealing with the association between X-ray findings observed in orthopantomograms and the diagnosis of osteoporosis, as well as those dealing with the bone mineral density of the jaw. Results: 406 articles were found; with the limits established, this number was reduced to 21. Almost all of the articles indicate that, when examining oral X-rays, it is possible to detect signs indicative of osteoporosis. Discussion: The radiomorphometric indices use measurements taken on orthopantomograms to evaluate possible loss of bone mineral density. They can be analyzed alone or together with the visual indices. In periapical X-rays, photodensitometric analysis and the trabecular pattern appear to be the most useful. Seven studies analyze the densitometry of the jaw, but only three do so independently of photodensitometric analysis. Conclusions: The combination of mandibular indices with fracture-risk questionnaires can be useful for the early diagnosis of osteoporosis. Visual and morphometric indices appear to be especially relevant in orthopantomograms, while photodensitometric indices and the trabecular pattern are used in periapical X-rays. Studies on mandibular dual-energy X-ray absorptiometry remain inconclusive.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of our multidisciplinary study was to define a pragmatic and secure alternative to the creation of a national centralised medical record, one which could gather together the different parts of a patient's medical record scattered across the different hospitals where the patient has been treated, without any risk of breaching confidentiality. Methods: We first analyse the reasons for the failure, and the dangers, of centralisation (i.e. the difficulty of defining a European patient identifier, of agreeing on a common standard for the contents of the medical record, and of ensuring data protection), and then propose an alternative that uses the existing available data, on the basis that setting up a safe though imperfect system may be better than continuing the quest for a mythical perfect information system that has still not been found after two decades of searching. Results: We describe the functioning of Medical Record Search Engines (MRSEs), which use pseudonymisation of the patient's identity. The MRSE will be able to retrieve and provide, upon an MD's request, all the available information concerning a patient who has been hospitalised in different hospitals, without ever having access to the patient's identity. The drawback of this system is that the practitioner then has to read all of the information, build a synthesis, and possibly discard irrelevant data. Conclusions: Given the difficulties and risks of setting up a centralised medical record system, a system that gathers all of the available information concerning a patient could be of great interest. This low-cost, pragmatic alternative, which could be developed quickly, should be taken into consideration by health authorities.
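
The pseudonymisation the MRSE relies on can be illustrated with a keyed hash: the same patient always maps to the same pseudonym, yet the identity cannot be recovered without the secret key. A minimal sketch (the paper's actual scheme may differ; the identifiers below are invented):

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-a-trusted-third-party"  # never stored with the data

def pseudonymise(national_id: str, birth_date: str) -> str:
    """Derive a stable pseudonym from identifying traits via HMAC-SHA256."""
    material = f"{national_id}|{birth_date}".encode()
    return hmac.new(SECRET_KEY, material, hashlib.sha256).hexdigest()

# Records from two hospitals link on the pseudonym, never on the identity.
p1 = pseudonymise("756.1234.5678.97", "1970-03-14")
p2 = pseudonymise("756.1234.5678.97", "1970-03-14")
assert p1 == p2  # same patient -> same pseudonym
```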

Relevance:

30.00%

Publisher:

Abstract:

Background: Although randomized clinical trials (RCTs) are considered the gold standard of evidence, their reporting is often suboptimal. Trial registries have the potential to contribute important methodologic information for critical appraisal of study results. Methods and Findings: The objective of the study was to evaluate the reporting of key methodologic study characteristics in trial registries. We identified a random sample (n = 265) of actively recruiting RCTs using the World Health Organization International Clinical Trials Registry Platform (ICTRP) search portal in 2008. We assessed the reporting of relevant domains from the Cochrane Collaboration’s ‘Risk of bias’ tool and other key methodological aspects. Our primary outcomes were the proportion of registry records with adequate reporting of random sequence generation, allocation concealment, blinding, and trial outcomes. Two reviewers independently assessed each record. Weighted overall proportions in the ICTRP search portal for adequate reporting were 5.7% (95% CI 3.0–8.4%) for sequence generation, 1.4% (0–2.8%) for allocation concealment, 41% (35–47%) and 8.4% (4.1–13%) for blinding (including and excluding open-label RCTs, respectively), and 66% (60–72%) for primary outcomes. The proportion of adequately reported RCTs was higher for registries that used specific methodological fields for describing the methods of randomization and allocation concealment than for registries that did not. Concerning other key methodological aspects, the weighted overall proportions of RCTs with adequately reported items were as follows: eligibility criteria (81%), secondary outcomes (46%), harms (5%), follow-up duration (62%), description of the interventions (53%), and sample size calculation (1%). Conclusions: Trial registries currently contain limited methodologic information about registered RCTs. To permit adequate critical appraisal of trial results reported in journals and registries, trial registries should consider requesting details on key RCT methods to complement journal publications. Full protocols remain the most comprehensive source of methodologic information and should be made publicly available.
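
For one of the reported domains, the proportion and its 95% confidence interval can be reproduced in outline; a sketch using the Wilson score interval, with counts chosen to match the reported 5.7% (the study's exact weighting across registries is not reproduced):

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# e.g. 15 of 265 records adequately reporting sequence generation
# (illustrative counts: 15/265 = 5.7%).
lo, hi = wilson_ci(15, 265)
print(f"5.7% (95% CI {lo:.1%}-{hi:.1%})")
```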

Relevance:

30.00%

Publisher:

Abstract:

This master’s thesis aims to study and present, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the natural evolution process. An artificial evolution process evaluates the fitness of each individual, where the individuals are candidate solutions; the next population of candidate solutions is then formed by exploiting the good properties of the current population through mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were sought in the literature, then classified and presented, along with the necessary basics of evolutionary algorithms. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling related to parallel computing, allocating modules to subsystems, N-version programming, test data generation, and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
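
The basic loop that all such applications share can be stated in a few lines; a minimal generic genetic algorithm over bit strings, with a toy fitness function (illustrative only):

```python
import random

def evolve(fitness, genome_len=20, pop_size=50, generations=100,
           mutation_rate=0.02):
    """Minimal genetic algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]               # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve(fitness=sum)  # toy objective: maximize the number of 1-bits
print(sum(best), best)
```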

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Signal detection in 3D medical images depends on many factors, such as foveal and peripheral vision, the type of signal, background complexity, and the speed at which the frames are displayed. In this paper, the authors focus on the speed with which radiologists and naïve observers search through medical images. Prior to the study, the authors asked the radiologists to estimate the speed at which they scrolled through CT sets; they gave a subjective estimate of 5 frames per second (fps). The aim of this paper is to measure and analyze the speed with which humans scroll through image stacks, presenting a method to visually display observer behavior during the search as well as measuring the accuracy of the decisions. This information will be useful in the development of model observers, mathematical algorithms that can be used to evaluate diagnostic imaging systems. METHODS: The authors performed a series of 3D 4-alternative forced-choice lung nodule detection tasks on volumetric stacks of chest CT images iteratively reconstructed with a lung algorithm. The strategy used by three radiologists and three naïve observers was assessed using an eye-tracker in order to establish where their gaze was fixed during the experiment and to verify that correct answers were not due merely to chance. In a first set of experiments, the observers were restricted to reading the images at three fixed scrolling speeds and were allowed to see each alternative once. In the second set of experiments, the subjects were allowed to scroll through the image stacks at will, with no time or gaze limits. In both the fixed-speed and free-scrolling conditions, the four image stacks were displayed simultaneously, and all trials were shown at two different image contrasts. RESULTS: The authors determined a histogram of scrolling speeds in frames per second. The scrolling speed of both the naïve observers and the radiologists at the moment the signal was detected was 25-30 fps. For the task chosen, performance was affected neither by image contrast nor by observer experience. However, the naïve observers exhibited a different scrolling pattern than the radiologists, including a tendency toward more direction changes and more slices viewed. CONCLUSIONS: The authors have determined a distribution of speeds for volumetric detection tasks. The speed at detection was higher than that subjectively estimated by the radiologists before the experiment. The measured speed information will be useful in the development of 3D model observers, especially anthropomorphic model observers, which try to mimic human behavior.
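
Measuring scrolling speed reduces to differencing the timestamps at which the displayed slice changes; a sketch of the histogram computation on a hypothetical display log:

```python
import numpy as np

# Hypothetical log: times (s) at which the displayed CT slice changed.
timestamps = np.array([0.00, 0.03, 0.06, 0.10, 0.13, 0.17, 0.20, 0.25])

# Instantaneous scrolling speed in frames per second.
fps = 1.0 / np.diff(timestamps)
print("median speed: %.1f fps" % np.median(fps))

# Histogram of speeds in 5-fps bins up to 50 fps.
counts, edges = np.histogram(fps, bins=np.arange(0, 55, 5))
for lo, hi, c in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:>2.0f}-{hi:<2.0f} fps: {'#' * c}")
```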

Relevance:

30.00%

Publisher:

Abstract:

Recent advances in machine learning methods increasingly enable the automatic construction of various types of computer-assisted tools that have been difficult or laborious to program by human experts. The tasks for which such tools are needed arise in many areas, here especially in the fields of bioinformatics and natural language processing. Machine learning methods may not work satisfactorily if they are not appropriately tailored to the task in question, but their learning performance can often be improved by taking advantage of deeper insight into the application domain or the learning problem at hand. This thesis considers the development of kernel-based learning algorithms that incorporate this kind of prior knowledge about the task in an advantageous way. Moreover, computationally efficient algorithms for training the learning machines for specific tasks are presented. In the context of kernel-based learning methods, prior knowledge is often incorporated by designing appropriate kernel functions. Another well-known way is to develop cost functions that fit the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take into account the positional information and the mutual similarities of words. It is shown that the use of this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function that is better suited to the task of information retrieval, and to more general ranking problems, than the cost functions designed for regression and classification. We also consider other applications of kernel-based learning algorithms, such as text categorization and pattern recognition in differential display. We develop computationally efficient algorithms for training the considered learning machines with the proposed kernel functions. We also design a fast cross-validation algorithm for regularized least-squares type learning algorithms. Further, an efficient version of the regularized least-squares algorithm that can be used together with the new cost function for preference learning and ranking tasks is proposed. In summary, we demonstrate that the incorporation of prior knowledge is possible and beneficial, and that novel advanced kernels and cost functions can be used in algorithms efficiently.
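
The common core of the thesis's methods, regularized least-squares with an interchangeable kernel, fits in a short sketch; a plain RBF kernel is used here as a stand-in for the specialized word-position kernels the thesis develops:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix; a task-specific kernel would go here."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_rls(K, y, lam=1.0):
    """Regularized least-squares: solve (K + lam*I) alpha = y."""
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

# Toy regression problem.
rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 40)

alpha = fit_rls(rbf_kernel(X, X), y, lam=0.1)

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
y_pred = rbf_kernel(X_test, X) @ alpha
print(np.round(y_pred, 2))
```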