907 results for computer-based diagnostics
Abstract:
Language extinction as a consequence of language shifts is a widespread social phenomenon that affects several million people all over the world today. An important task for social science research should therefore be to gain an understanding of language shifts, especially as a way of forecasting the extinction or survival of threatened languages, i.e., determining whether or not the subordinate language will survive in communities with a dominant and a subordinate language. Modeling is generally a difficult task in the social sciences, particularly when it comes to forecasting the values of variables. However, cellular automata theory can help us overcome this traditional difficulty. The purpose of this article is to investigate language shifts in the speech behavior of individuals using the methodology of cellular automata theory. The findings on the dynamics of social impacts in the field of social psychology and the empirical data from language surveys on the use of Catalan in Valencia allowed us to define a cellular automaton and carry out a set of simulations using that automaton. The simulation results highlighted the key factors in the progression or reversal of a language shift, and these factors allowed us to forecast the future of a threatened language in a bilingual community.
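As a rough illustration of the kind of model the abstract describes (a sketch only, not the authors' published automaton), each cell of a grid can hold a speaker of the dominant or the subordinate language and update its state from the weighted influence of its neighbours; the grid size, prestige weight, and update rule below are illustrative assumptions.

import random

# Illustrative cellular automaton for a two-language community.
# State 1 = uses the subordinate language, 0 = uses the dominant language.
# The prestige parameter and neighbourhood rule are assumptions for this sketch.

SIZE = 50          # grid side length
PRESTIGE = 0.6     # relative social impact of the dominant language (assumed)
STEPS = 100

grid = [[random.choice([0, 1]) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbours(i, j):
    """Moore neighbourhood with wrap-around borders."""
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                yield grid[(i + di) % SIZE][(j + dj) % SIZE]

def step():
    new = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            nb = list(neighbours(i, j))
            # Social impact of each language: number of neighbours using it,
            # weighted by the assumed prestige of the dominant language.
            impact_dom = PRESTIGE * (len(nb) - sum(nb))
            impact_sub = (1 - PRESTIGE) * sum(nb)
            new[i][j] = 1 if impact_sub > impact_dom else 0
    grid[:] = new

for _ in range(STEPS):
    step()

share = sum(map(sum, grid)) / SIZE ** 2
print(f"share of subordinate-language speakers after {STEPS} steps: {share:.2f}")

Sweeping the prestige weight in such a sketch is one way to see how a shift can progress toward extinction or be reversed.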
Abstract:
Brain-computer interfaces (BCIs) are becoming more and more popular as an input device for virtual worlds and computer games. Depending on their function, a major drawback is the mental workload associated with their use, and significant effort and training are required to control them effectively. In this paper, we present two studies assessing how the mental workload of a P300-based BCI affects participants' reported sense of presence in a virtual environment (VE). In the first study, we employ a BCI exploiting the P300 event-related potential (ERP) that allows control of over 200 items in a virtual apartment. In the second study, the BCI is replaced by a gaze-based selection method coupled with wand navigation. In both studies, overall performance is measured and individual presence scores are assessed by means of a short questionnaire. The results suggest that there is no immediate benefit to visualizing events in the VE triggered by the BCI and that no learning about the layout of the virtual space takes place. In order to alleviate this, we propose that future P300-based BCIs in VR be set up so as to require users to make some inference about the virtual space so that they become aware of it, which is likely to lead to higher reported presence.
Abstract:
This master's thesis surveys the literature on how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the natural evolution process. An artificial evolution process evaluates the fitness of each individual, where the individuals are candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying different mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were identified in the literature, classified, and presented, together with the necessary basics of evolutionary algorithms. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling in parallel computing, allocating modules to subsystems, N-version programming, test data generation, and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
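A minimal sketch of the evolutionary loop summarized above (fitness evaluation, selection, crossover, mutation); the bit-string encoding, one-max fitness function, and parameter values are illustrative assumptions rather than anything taken from the thesis.

import random

# Illustrative genetic algorithm: evolve bit strings toward all ones (one-max).
# Encoding, fitness function, and parameter values are assumptions for this sketch.

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 40, 60, 0.02

def fitness(genome):
    return sum(genome)                        # number of ones in the bit string

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)     # single-point crossover
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

def tournament(pop):
    """Tournament selection: keep the better of two random individuals."""
    return max(random.sample(pop, 2), key=fitness)

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", GENOME_LEN)

In the software engineering applications mentioned above, the genome would instead encode, for example, a test input, a task schedule, or a module allocation, and the fitness function would score that artifact.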
Abstract:
This thesis presents a new way of providing location-dependent information to users of wireless networks. The information is delivered to every user without knowing anything about the user's identity. HTTP was chosen as the application-level protocol, which allows the system to deliver information to most users regardless of the widely varying terminal devices they use. The system operates as an extension of an intercepting web proxy server. Based on the contents of various databases, the system decides whether or not information is delivered. The system also includes simple software for locating users with the accuracy of a single access point. Although the presented solution aims at providing location-based advertisements, it can easily be adapted to deliver any type of information to users.
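A minimal sketch of the decision step described above: an extension hook for an intercepting proxy that looks up the access point a client is using and decides from a simple database whether location-dependent content is attached to the response. All names and data structures here are illustrative assumptions, not the thesis implementation.

# Illustrative proxy-extension hook: decide, per request, whether to attach
# location-dependent content. Data structures and names are assumptions.

# Maps a client's IP address to the access point it is currently using
# (the positioning software would maintain this mapping).
CLIENT_ACCESS_POINT = {"10.0.0.17": "ap-lobby"}

# Location-dependent content keyed by access point.
CONTENT_BY_LOCATION = {"ap-lobby": "<div>Today's offers in the lobby cafe...</div>"}

def location_content_for(client_ip):
    """Return content for the client's current location, or None."""
    ap = CLIENT_ACCESS_POINT.get(client_ip)
    return CONTENT_BY_LOCATION.get(ap) if ap else None

def process_response(client_ip, html_body):
    """Called by the intercepting proxy for each HTML response."""
    extra = location_content_for(client_ip)
    if extra is None:
        return html_body                      # nothing known for this location
    # Inject the location-dependent block just before the closing body tag.
    return html_body.replace("</body>", extra + "</body>")

if __name__ == "__main__":
    print(process_response("10.0.0.17", "<html><body>page</body></html>"))

Because the lookup is keyed only by network attachment point, no user identity is needed, matching the anonymity property stated above.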
Abstract:
Market segmentation first emerged in the 1950s and has since been one of the basic concepts of marketing. Most research on segmentation has, however, focused on consumer markets, while the segmentation of business and industrial markets has received less attention. The objective of this study is to create a segmentation model for industrial markets from the perspective of a provider of information technology products and services. The aim is to determine whether the case company's current customer databases enable effective segmentation, to identify suitable segmentation criteria, and to assess whether and how the databases should be developed to support more effective segmentation. The intention is to create a single model shared by the different business units, so the objectives of the different units must be taken into account to avoid conflicts of interest. The research methodology is a case study. The study draws on secondary sources as well as primary sources such as the case company's own databases and interviews. The starting point of the study was the research problem: can database-driven segmentation be used for profitable customer relationship management in the SME sector? The goal is to create a segmentation model that exploits the data in the databases without compromising the requirements of effective and profitable segmentation. The theoretical part examines segmentation in general, with an emphasis on industrial market segmentation, in order to give a clear picture of the different approaches to the topic and to deepen the discussion of the most important theories. The analysis of the databases revealed clear shortcomings in the customer data. Basic contact information is available, but data usable for segmentation is very limited. The flow of data from resellers and wholesalers should be improved in order to obtain information about end customers. Segmentation based on the current data relies mainly on secondary information such as industry and company size, and even this information is not available for all companies in the databases.
Abstract:
PURPOSE OF REVIEW: Current computational neuroanatomy based on MRI focuses on morphological measures of the brain. We present recent methodological developments in quantitative MRI (qMRI) that provide standardized measures of the brain, which go beyond morphology. We show how biophysical modelling of qMRI data can provide quantitative histological measures of brain tissue, leading to the emerging field of in-vivo histology using MRI (hMRI). RECENT FINDINGS: qMRI has greatly improved the sensitivity and specificity of computational neuroanatomy studies. qMRI metrics can also be used as direct indicators of the mechanisms driving observed morphological findings. For hMRI, biophysical models of the MRI signal are being developed to directly access histological information such as cortical myelination, axonal diameters or axonal g-ratio in white matter. Emerging results indicate promising prospects for the combined study of brain microstructure and function. SUMMARY: Non-invasive brain tissue characterization using qMRI or hMRI has significant implications for both research and clinical practice. Both approaches improve comparability across sites and time points, facilitating multicentre/longitudinal studies and standardized diagnostics. hMRI is expected to shed new light on the relationship between brain microstructure, function and behaviour, both in health and disease, and to become an indispensable addition to computational neuroanatomy.
Abstract:
We have studied how leaders emerge in a group as a consequence of interactions among its members. We propose that leaders can emerge through a self-organized process based on local rules of dyadic interactions among individuals. Flocks are an example of self-organized behaviour in a group, and properties similar to those observed in flocks might also explain some of the dynamics and organization of human groups. We developed an agent-based model that generated flocks in a virtual world and implemented it in a multi-agent simulation computer program that computed indices at each time step of the simulation to quantify the degree to which a group moved in a coordinated way (index of flocking behaviour) and the degree to which specific individuals led the group (index of hierarchical leadership). We ran several series of simulations in order to test our model and determine how these indices behaved under specific agent and world conditions. We identified the agent, world, and model parameters that made stable, compact flocks emerge, and explored possible environmental properties that predicted the probability of becoming a leader.
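A rough sketch of a boids-style update of the kind described above, together with a crude coordination measure (the length of the mean heading vector); the local rule, the parameters, and this index are illustrative assumptions, not the authors' published indices of flocking behaviour or hierarchical leadership.

import math, random

# Illustrative flocking update: each agent aligns its heading with nearby agents.
# Rules, weights, and the coordination index below are assumptions for this sketch.

N_AGENTS, RADIUS, STEPS = 30, 10.0, 200

agents = [{"x": random.uniform(0, 50), "y": random.uniform(0, 50),
           "heading": random.uniform(0, 2 * math.pi)} for _ in range(N_AGENTS)]

def step():
    new_headings = []
    for a in agents:
        nearby = [b for b in agents
                  if math.hypot(a["x"] - b["x"], a["y"] - b["y"]) < RADIUS]
        # Alignment rule: steer toward the mean heading of nearby agents.
        mean_x = sum(math.cos(b["heading"]) for b in nearby)
        mean_y = sum(math.sin(b["heading"]) for b in nearby)
        new_headings.append(math.atan2(mean_y, mean_x))
    for a, h in zip(agents, new_headings):
        a["heading"] = h
        a["x"] += math.cos(h)
        a["y"] += math.sin(h)

def flocking_index():
    """Length of the mean heading vector: 1 = fully aligned, 0 = disordered."""
    vx = sum(math.cos(a["heading"]) for a in agents) / N_AGENTS
    vy = sum(math.sin(a["heading"]) for a in agents) / N_AGENTS
    return math.hypot(vx, vy)

for _ in range(STEPS):
    step()
print(f"flocking index after {STEPS} steps: {flocking_index():.2f}")

A leadership index would additionally track, over time, how strongly each individual's heading changes precede those of the rest of the group.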
Abstract:
We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimensions larger than 1. It is based on a quadratically convergent scheme that approximates, at the same time, the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm presents a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons, it is a very good option for computing quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method on a parallel computer. In these flows we compute invariant tori of dimensions up to 5, by taking suitable sections.
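As background (stated in common notation rather than quoted from the paper), an invariant torus of a map F with rotation vector ω is a parameterization K satisfying the invariance equation, and reducibility means the linearized dynamics around the torus can be brought to a constant Floquet matrix:

\[
F\bigl(K(\theta)\bigr) = K(\theta + \omega), \qquad \theta \in \mathbb{T}^{d},
\]
\[
P(\theta + \omega)^{-1}\, DF\bigl(K(\theta)\bigr)\, P(\theta) = \Lambda,
\]

where K is approximated by a truncated Fourier series, P is the Floquet transformation, and the eigenvalues of the Floquet matrix \Lambda determine the linear stability of the torus; a quadratically convergent scheme of the kind described above corrects K, P, and \Lambda simultaneously at each Newton-like step.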
Abstract:
The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. The subject of this review is a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared with other sampling methods. Different devices, with different methods of registering MR images to live TRUS, are currently in use to allow software-based targeted biopsy. The main clinical indications of MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted biopsy with standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to favor the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added to the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the actual issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.
Abstract:
Adoptive cell transfer using engineered T cells is emerging as a promising treatment for metastatic melanoma. Such an approach allows one to introduce T cell receptor (TCR) modifications that, while maintaining the specificity for the targeted antigen, can enhance the binding and kinetic parameters for the interaction with peptides (p) bound to major histocompatibility complexes (MHC). Using the well-characterized 2C TCR/SIYR/H-2K(b) structure as a model system, we demonstrated that a binding free energy decomposition based on the MM-GBSA approach provides a detailed and reliable description of the TCR/pMHC interactions at the structural and thermodynamic levels. Starting from this result, we developed a new structure-based approach to rationally design new TCR sequences, and applied it to the BC1 TCR targeting the HLA-A2-restricted NY-ESO-1 (157-165) cancer-testis epitope. Fifty-four percent of the designed sequence replacements exhibited improved pMHC binding as compared to the native TCR, with up to a 150-fold increase in affinity, while preserving specificity. Genetically engineered CD8(+) T cells expressing these modified TCRs showed an improved functional activity compared to those expressing the BC1 TCR. We measured maximum levels of activities for TCRs within the upper limit of natural affinity, K_D ≈ 1-5 μM. Beyond the affinity threshold at K_D < 1 μM we observed an attenuation in cellular function, in line with the "half-life" model of T cell activation. Our computer-aided protein-engineering approach requires the 3D structure of the TCR-pMHC complex of interest, which can be obtained from X-ray crystallography. We have also developed a homology modeling-based approach, TCRep 3D, to obtain accurate structural models of any TCR-pMHC complex when experimental data is not available. Since the accuracy of the models depends on the prediction of the TCR orientation over pMHC, we have complemented the approach with a simplified rigid method to predict this orientation and successfully assessed it using all non-redundant TCR-pMHC crystal structures available. These methods potentially extend the use of our TCR engineering method to entire TCR repertoires for which no X-ray structure is available. We have also performed a steered molecular dynamics study of the unbinding of the TCR-pMHC complex to get a better understanding of how TCRs interact with pMHCs. This entire rational TCR design pipeline is now being used to produce rationally optimized TCRs for adoptive cell therapies of stage IV melanoma.
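For reference, the MM-GBSA decomposition mentioned above is usually written as follows (standard bookkeeping, not a quotation of the paper's own equations):

\[
\Delta G_{\mathrm{bind}} \approx \Delta E_{\mathrm{MM}} + \Delta G_{\mathrm{GB}} + \Delta G_{\mathrm{SA}} - T\,\Delta S,
\]

where \Delta E_{\mathrm{MM}} collects the molecular-mechanics bonded, electrostatic, and van der Waals terms, \Delta G_{\mathrm{GB}} and \Delta G_{\mathrm{SA}} are the polar (generalized Born) and non-polar (surface-area) solvation contributions, and each difference is taken as complex minus receptor minus ligand; decomposing these contributions, for example per residue, is the kind of analysis that yields the detailed structural and thermodynamic description of the TCR/pMHC interface referred to above.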
Abstract:
The extensional theory of arrays is one of the most important ones for applications of SAT Modulo Theories (SMT) to hardware and software verification. Here we present a new T-solver for arrays in the context of the DPLL(T) approach to SMT. The main characteristics of our solver are: (i) no translation of writes into reads is needed, (ii) there is no axiom instantiation, and (iii) the T-solver interacts with the Boolean engine by asking it to split on equality literals between indices. As far as we know, this is the first accurate description of an array solver integrated in a state-of-the-art SMT solver and, unlike most state-of-the-art solvers, it is not based on a lazy instantiation of the array axioms. Moreover, it is very competitive in practice, especially on problems that require heavy reasoning on array literals.
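For context, the extensional theory of arrays referred to above is usually axiomatized by the read-over-write axioms together with extensionality (standard background, not reproduced from the paper):

\[
\forall a, i, v.\;\; \mathit{read}(\mathit{write}(a, i, v), i) = v
\]
\[
\forall a, i, j, v.\;\; i \neq j \;\rightarrow\; \mathit{read}(\mathit{write}(a, i, v), j) = \mathit{read}(a, j)
\]
\[
\forall a, b.\;\; \bigl(\forall i.\; \mathit{read}(a, i) = \mathit{read}(b, i)\bigr) \;\rightarrow\; a = b
\]

Rather than instantiating these axioms lazily, the solver described above asks the Boolean engine to split on equalities between index terms (the i = j versus i ≠ j cases of the second axiom), so array reasoning is driven by those case splits.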
Abstract:
Objective: To perform a comparative dosimetric analysis, based on computer simulations, of temporary balloon implants with 99mTc and balloon brachytherapy with high-dose-rate (HDR) 192Ir, as boosts to radiotherapy. We hypothesized that the two techniques would produce equivalent doses under pre-established conditions of activity and exposure time. Materials and Methods: Simulations of implants with 99mTc-filled and HDR 192Ir-filled balloons were performed with Siscodes/MCNP5, using a voxel model built from a magnetic resonance imaging set of a young female. Spatial dose rate distributions were determined. In the dosimetric analysis of the protocols, the exposure time and the level of activity required were specified. Results: The 99mTc balloon presented a weighted dose rate in the tumor bed of 0.428 cGy.h-1.mCi-1 at the balloon surface and 0.190 cGy.h-1.mCi-1 at 8-10 mm from the surface, compared with 0.499 and 0.150 cGy.h-1.mCi-1, respectively, for the HDR 192Ir balloon. An exposure time of 24 hours was required for the 99mTc balloon to produce a boost of 10.14 Gy with 1.0 Ci, whereas only 24 minutes with 10.0 Ci segments were required for the HDR 192Ir balloon to produce a boost of 5.14 Gy at the same reference point, or 10.28 Gy in two 24-minute fractions. Conclusion: Temporary 99mTc balloon implantation is an attractive option for adjuvant radiotherapy in breast cancer, because of its availability, economic viability, and similar dosimetry in comparison with HDR 192Ir balloon implantation, which is the current standard in clinical practice.
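As background on how activity and exposure time enter such estimates (a general relation only, not a re-derivation of the specific values above), the dose accumulated at a reference point from a decaying source is

\[
D = \dot{D}_{w}\, A_{0} \int_{0}^{t} e^{-\lambda t'}\, dt' \;=\; \frac{\dot{D}_{w}\, A_{0}}{\lambda}\left(1 - e^{-\lambda t}\right),
\]

where \dot{D}_{w} is the weighted dose rate per unit activity at the reference point, A_{0} is the initial activity, and \lambda is the decay constant; decay matters for 99mTc (half-life about 6 h) over a 24-hour implant, whereas it is negligible for 192Ir over a 24-minute exposure.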
Abstract:
The paper presents the results of a pilot test of an e-portfolio in a virtual classroom, carried out in the 2005-2006 academic year with students of the Doctorate in Information Society at the Open University of Catalonia. The electronic portfolio is a strategy for competence-based assessment. The experience contrasts e-portfolios in which students merely display their work without interaction with an interactive portfolio system that applies competence-based learning theories. It is in such a competency-based system that the real process of learning develops: the portfolio is not only a basic biographical document but becomes a real space for learning within a competence model. The paper brings out new ideas and possibilities: competence-based learning promotes closer relationships between universities and companies and redesigns the pedagogic act.