890 results for Computer-assisted composition


Relevance: 80.00%

Abstract:

The advent of multiparametric MRI has made it possible to change the way prostate biopsy is performed, allowing biopsies to be directed at suspicious lesions rather than taken randomly. This review concerns a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared with other sampling methods. Different devices, using different methods to register MR images to live TRUS, are currently available for software-based targeted biopsy. Its main clinical indications are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy, and the follow-up of patients under active surveillance. Several studies have compared MRI/US fusion software-based targeted biopsy with standard biopsy. In at-risk men with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy does; some studies have also shown decreased detection of insignificant disease. Only two studies have directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and diagnostic ability seems to favor the software approach. To date, no study comparing software-based targeted biopsy with in-bore MRI biopsy is available. The new software-based targeted approach appears suitable for inclusion in the standard pathway for achieving accurate risk stratification. Once its reproducibility and cost-effectiveness are verified, the remaining issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.

Relevance: 80.00%

Abstract:

Recent advances in machine learning increasingly enable the automatic construction of computer-assisted methods that have been difficult or laborious to program by hand. Tasks calling for such tools arise in many areas, notably bioinformatics and natural language processing. Machine learning methods may not perform satisfactorily unless they are appropriately tailored to the task in question; their performance can often be improved by exploiting deeper insight into the application domain or the learning problem at hand. This thesis develops kernel-based learning algorithms that incorporate such prior knowledge of the task in an advantageous way, together with computationally efficient algorithms for training the learning machines for specific tasks. In kernel-based learning, prior knowledge is often incorporated by designing appropriate kernel functions; another well-known way is to develop cost functions suited to the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take into account positional information and the mutual similarities of words, and show that this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function better suited to information retrieval and to more general ranking problems than the cost functions designed for regression and classification. We also consider other applications of kernel-based learning, such as text categorization and pattern recognition in differential display, and develop computationally efficient algorithms for training the learning machines with the proposed kernel functions.
We also design a fast cross-validation algorithm for regularized least-squares learning, and propose an efficient version of the regularized least-squares algorithm that can be used with the new cost function for preference learning and ranking tasks. In summary, we demonstrate that incorporating prior knowledge is both possible and beneficial, and that the novel kernels and cost functions can be used in efficient algorithms.
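The fast cross-validation idea for regularized least-squares can be made concrete: for ridge-type RLS, every leave-one-out residual follows from a single fit via the diagonal of the hat matrix, with no retraining. The sketch below uses hypothetical random data and is only an illustration of that identity, not the thesis's actual algorithm.

```python
import numpy as np

def rls_loo_residuals(X, y, lam):
    # Leave-one-out residuals for regularized least-squares from ONE fit,
    # via the identity  e_i = (y_i - yhat_i) / (1 - H_ii),
    # where H = X (X^T X + lam*I)^{-1} X^T is the hat matrix.
    n, d = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    yhat = H @ y                              # in-sample predictions
    return (y - yhat) / (1.0 - np.diag(H))    # exact LOO residuals

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)
fast = rls_loo_residuals(X, y, lam=1.0)

# Check against the naive approach: retrain with each point held out.
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    w = np.linalg.solve(X[mask].T @ X[mask] + np.eye(3), X[mask].T @ y[mask])
    assert abs((y[i] - X[i] @ w) - fast[i]) < 1e-8
```

The naive loop refits the model n times at O(n d²) each; the hat-matrix shortcut gets the same numbers from one factorization.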

Relevance: 80.00%

Abstract:

INTERMED is a three-week course integrated into the "primary care module" for first-year master's medical students at the school of medicine in Lausanne. It uses an innovative teaching method based on repeated sequences of individual e-learning followed by collaborative learning activities in teams, known as team-based learning (TBL). The e-learning takes place in a web-based virtual learning environment built on a series of interactive multimedia virtual patients. Using INTERMED, students go through a complete medical encounter, applying clinical reasoning and choosing the diagnostic and therapeutic approach. INTERMED offers an authentic experience in an engaging and safe environment where errors are allowed and have no consequences.

Relevance: 80.00%

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image from a small set of low-resolution images. It is usually modeled as an inverse problem in which the regularization term plays a central role in reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy [1], Total Variation (TV)-based energies [2,3] and, more recently, non-local means [4]. Although TV energies are attractive because of their edge-preserving ability, only standard explicit steepest-gradient techniques have so far been applied to optimize fetal-based TV energies. The main contribution of this work is the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), whereas existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
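The O(1/n) versus O(1/n²) gap the abstract refers to is the classic difference between plain explicit gradient descent and a Nesterov-accelerated (FISTA-style) scheme on smooth convex problems. The toy sketch below runs both on a least-squares surrogate rather than an actual TV energy; it only illustrates the rate difference and is not the paper's reconstruction algorithm.

```python
import numpy as np

def gd(grad, x0, step, iters):
    # Plain explicit gradient descent: O(1/n) rate on smooth convex f.
    x = x0.copy()
    for _ in range(iters):
        x -= step * grad(x)
    return x

def nesterov(grad, x0, step, iters):
    # Accelerated (Nesterov/FISTA-style) scheme: O(1/n^2) rate.
    x, z, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = z - step * grad(z)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

# Toy least-squares objective standing in for a smoothed TV energy.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

err_gd = np.linalg.norm(gd(grad, np.zeros(20), 1 / L, 200) - x_star)
err_acc = np.linalg.norm(nesterov(grad, np.zeros(20), 1 / L, 200) - x_star)
```

With the same step size and iteration budget, the accelerated iterate ends up markedly closer to the minimizer, which is exactly the practical content of the improved rate.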

Relevance: 80.00%

Abstract:

Focal epilepsy is increasingly recognized as the result of an altered brain network, at both the structural and functional levels. Characterizing these widespread brain alterations is crucial for understanding the clinical manifestations of seizures and cognitive deficits, as well as for the management of candidates for epilepsy surgery. Tractography based on diffusion tensor imaging allows non-invasive mapping of white-matter tracts in vivo. Recently, diffusion spectrum imaging (DSI), based on an increased number of diffusion directions and intensities, has improved the sensitivity of tractography, notably with respect to the problem of fiber crossing, and recent developments allow acquisition times compatible with clinical application. We used DSI and parcellation of the gray matter into regions of interest to build whole-brain connectivity matrices describing the mutual connections between cortical and subcortical regions in patients with focal epilepsy and in healthy controls. In addition, the high angular and radial resolution of DSI allowed us to evaluate some of the biophysical compartment models, to better understand the cause of the changes in diffusion anisotropy. Global connectivity, hub architecture and regional connectivity patterns were altered in temporal lobe epilepsy (TLE) patients and showed different characteristics in right (RTLE) versus left (LTLE) TLE, with stronger abnormalities in RTLE. The microstructural analysis suggested that disturbed axonal density contributed more than fiber orientation to the connectivity changes affecting the temporal lobes, whereas fiber-orientation changes were more involved in extratemporal changes. Our study provides further structural evidence that RTLE and LTLE are not symmetrical entities, and DSI-based imaging could help investigate the microstructural correlates of these imaging abnormalities.
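A whole-brain connectivity matrix of this kind is, at its simplest, a region-by-region count of reconstructed streamlines linking pairs of gray-matter parcels. The sketch below uses hypothetical endpoint labels rather than real tractography output, and only shows the basic bookkeeping behind such matrices.

```python
import numpy as np

def connectivity_matrix(endpoint_labels, n_regions):
    # endpoint_labels: (n_streamlines, 2) array giving the gray-matter
    # parcel id at each end of every reconstructed streamline.
    C = np.zeros((n_regions, n_regions), dtype=int)
    for a, b in endpoint_labels:
        if a != b:              # skip streamlines looping within a region
            C[a, b] += 1
            C[b, a] += 1        # structural connectivity is undirected
    return C

# Hypothetical endpoints: four streamlines over three parcels.
endpoints = np.array([[0, 1], [0, 1], [1, 2], [2, 2]])
C = connectivity_matrix(endpoints, n_regions=3)
# C[0, 1] == 2: two streamlines link parcels 0 and 1; C is symmetric.
```

Real pipelines weight the counts (e.g. by streamline length or fiber density) and feed the resulting matrix to graph measures such as global efficiency and hub centrality, as in the study above.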

Relevance: 80.00%

Abstract:

The ACME platform (Avaluació Continuada i Millora de l'Ensenyament) was created in 1998 by a group of professors from the Department of Computer Science and Applied Mathematics. ACME was conceived as an e-learning platform, that is, a system that uses the Internet to support learning by enabling interaction between student and teacher. The platform was originally created to reduce student failure in mathematics courses, but given its success there, the ACME methodology was extended to other disciplines such as programming, databases, chemistry and economics, so that ACME activities can now be developed in many fields. ACME is currently used as a complement to face-to-face classes: the teacher presents the concepts in lectures and solves a few exercises as examples, and the students then use the ACME platform to try to solve the exercises the teacher proposes. The goal of this project is the analysis, design and implementation of the modifications needed in the ACME platform to improve the group manager and the Excel exercises, and to support group work. The project consists of three parts: improving the teacher and student interfaces, improving the Excel exercises, and enabling the solving of exercises in groups.

Relevance: 80.00%

Abstract:

CoSpace is a web platform designed to provide a virtual space for interaction and collaboration among teachers in virtual communities. It originated from the additional needs that arose when teachers working on educational topics in contexts of diversity (in the areas of language, mathematics and science) needed a tool to support interaction and collaboration among themselves, in order to share ideas, experiences and virtual learning objects, among other things, from a current perspective on the use of technology and a vision closer to the end user. This paradigm promotes the idea that applications should be designed for non-expert users, taking into account the principles of usability and accessibility. The scope of the project is defined by the teachers' need to manage the information in their personal profiles, allowing users to create, edit, consult and delete this information; the need to share files and publish news among communities of practice; and, finally, the need to manage user permissions for administering modules of the platform.

Relevance: 80.00%

Abstract:

Computer tools open up a wide range of pedagogical possibilities for language courses. This article proposes a model for combining digital resources (electronic portfolios and computer-assisted translation) that reinforce language-teaching projects from a socio-constructivist pedagogical approach. In some cases, the activities can be integrated into real projects. Moreover, projects based on these tools can take a multidisciplinary approach involving both foreign-language and mother-tongue departments.

Relevance: 80.00%

Abstract:

The molecular basis of modern therapeutics consists in the modulation of cell function through the interaction of bioactive small molecules, acting as drugs, with cellular macromolecular structures. Molecular modeling is a computational technique developed to study chemical structure. Through the paradigm of molecular similarity and complementarity, this methodology is the basis for the computer-assisted drug design universally employed in pharmaceutical research laboratories to obtain more efficient, more selective and safer drugs. In this work, we discuss some methods of molecular modeling and some approaches used to evaluate new bioactive structures under development by our research group.

Relevance: 80.00%

Abstract:

In this paper, we reflect on the broadening of the field of application of CRM from the business domain to a wider context of relationships, in which the inclusion of non-profit organizations seems natural. In particular, we focus on analyzing the suitability of adopting CRM processes in universities and higher-education institutions dedicated to e-learning. This is an issue that, in our opinion, has much potential but has so far received little attention in research.

Relevance: 80.00%

Abstract:

This paper attempts to shed light on the competencies a teacher must have in order to teach in online university environments. We will relate a teacher training experience, which was designed taking into account the methodological criteria established in line with previous theoretical principles. The main objective of our analysis is to identify the achievements and difficulties of a specific formative experience, with the ultimate goal of assessing the suitability of this conceptual-methodological framework for the design of formative proposals aiming to contribute to the development of teacher competencies for virtual environments.

Relevance: 80.00%

Abstract:

Recent standardization efforts in e-learning technology have resulted in a number of specifications; however, the automation process that is considered essential in a learning management system (LMS) remains less explored. As learning technology becomes more widespread and more heterogeneous, there is a growing need to specify processes that cross the boundaries of a single LMS or learning-resource repository. This article proposes a specification oriented toward automation that takes on board the heterogeneity of systems and formats and provides a language for specifying complex and generic interactions. With this goal in mind, a three-step technique is suggested: semantic conformance profiles, a business process management (BPM) diagram, and its translation into the Business Process Execution Language (BPEL) appear suitable for achieving it.