28 results for Computer Diagnostics


Relevance:

20.00%

Publisher:

Abstract:

CHARGE syndrome, Sotos syndrome and 3p deletion syndrome are examples of rare inherited syndromes that have been recognized for decades but whose molecular diagnostics have only been made possible by recent advances in genomic research. Despite these advances, the development of diagnostic tests for rare syndromes has been hindered by diagnostic laboratories having limited funds for test development and by their prioritization of tests for which a (relatively) high demand can be expected. In this study, molecular diagnostic tests for CHARGE syndrome and Sotos syndrome were developed and successfully translated into routine diagnostic testing in the laboratory of Medical Genetics (UTUlab). A mutation was identified in 40.5% of the patients in the CHARGE syndrome group and in 34% of the Sotos syndrome group, reflecting the use of the tests in differential diagnostics. In CHARGE syndrome, the low prevalence of structural aberrations was also confirmed. In 3p deletion syndrome, it was shown that small terminal deletions are not causative for the syndrome, and that array-based analysis provides a reliable estimate of the deletion size, although benign copy number variants complicate result interpretation. During test development it became clear that an optimal molecular diagnostic strategy for a given syndrome is always a compromise between the sensitivity, specificity and feasibility of applying a new method. In addition, the clinical utility of a test should be considered before development: sometimes a test that performs well in the laboratory has limited utility for the patient, whereas a test that performs poorly in the laboratory may have a great impact on the patient and their family. At present, the development of next-generation sequencing methods is shifting the molecular diagnostics of rare diseases from single tests towards whole-genome analysis.
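For reference, the sensitivity and specificity weighed in this compromise are the standard diagnostic test measures, expressed in terms of true/false positives (TP, FP) and true/false negatives (TN, FN); this is a textbook reminder rather than notation taken from the thesis:

    \text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP}

A highly sensitive test rarely misses a mutation that is present, while a highly specific test rarely reports one that is absent; feasibility then determines whether such a test can be run routinely in a diagnostic laboratory.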

Relevance:

20.00%

Publisher:

Abstract:

The question of the trainability of executive functions and the impact of such training on related cognitive skills has stirred considerable research interest. Despite a number of studies investigating this, the question remains unresolved. The general aim of this thesis was to investigate two very different types of training of executive functions: laboratory-based computerized training (Studies I-III) and real-world training through bilingualism (Studies IV-V). Bilingualism as a kind of training of executive functions is based on the idea that managing two languages requires executive resources, and previous studies have suggested a bilingual advantage in executive functions. Three executive functions were studied in the present thesis: updating of working memory (WM) contents, inhibition of irrelevant information, and shifting between tasks and mental sets. Studies I-III investigated the effects of computer-based training of WM updating (Study I), inhibition (Study II), and set shifting (Study III) in healthy young adults. All studies showed increased performance on the trained task. More importantly, improvement on an untrained task tapping the trained executive function (near transfer) was seen in Studies I and II. None of the three studies showed improvement on untrained tasks tapping some other cognitive function (far transfer) as a result of training. Study I also used PET to investigate the effects of WM updating training on a neurotransmitter closely linked to WM, namely dopamine. The PET results revealed increased striatal dopamine release during WM updating performance as a result of training. Study IV investigated the ability to inhibit task-irrelevant stimuli in bilinguals and monolinguals by using a dichotic listening task. The results showed that the bilinguals exceeded the monolinguals in inhibiting task-irrelevant information. Study V introduced a new, complementary research approach to study the bilingual executive advantage and its underlying mechanisms. To circumvent the methodological problems related to natural groups designs, this approach focuses only on bilinguals and examines whether individual differences in bilingual behavior correlate with executive task performance. Using measures that tap the three above-mentioned executive functions, the results suggested that more frequent language switching was associated with better set shifting skills, and earlier acquisition of the second language was related to better inhibition skills. In conclusion, the present behavioral results showed that computer-based training of executive functions can improve performance on the trained task and on closely related tasks, but does not yield a more general improvement of cognitive skills. Moreover, the functional neuroimaging results reveal that WM training modulates striatal dopaminergic function, speaking for training-induced neural plasticity in this important neurotransmitter system. With regard to bilingualism, the results provide further support for the idea that bilingualism can enhance executive functions. In addition, the new complementary research approach proposed here provides some clues as to which aspects of everyday bilingual behavior may be related to the advantage in executive functions in bilingual individuals.

Relevance:

20.00%

Publisher:

Abstract:

Communication, the flow of ideas and information between individuals in a social context, is the heart of educational experience. Constructivism and constructivist theories form the foundation for the collaborative learning processes of creating and sharing meaning in online educational contexts. The Learning and Collaboration in Technology-enhanced Contexts (LeCoTec) course comprised 66 participants drawn from four European universities (Oulu, Turku, Ghent and Ramon Llull). These participants were split into 15 groups with the express aim of learning about computer-supported collaborative learning (CSCL). The Community of Inquiry model (social, cognitive and teaching presences) provided the content and tools for learning and for researching the collaborative interactions in this environment. The sampled comments from the collaborative phase were collected and analyzed at chain level and group level, with the aim of identifying the message types that sustained high learning outcomes. Furthermore, Social Network Analysis was used to examine the density of whole-group interactions and to identify the popular and active members within the highly collaborating groups. It was observed that long chains occurred in groups with high-quality outcomes. These chains were also characterized by Social, Interactivity, Administrative and Content comment types. In addition, high outcomes were associated with the highly interactive cases and high-density groups. In groups with low interactivity, commenting centered on one or two central group members. In conclusion, future online environments should support higher-order learning and develop greater metacognition and self-regulation. Moreover, such an environment, with a wide variety of problem-solving tools, would enhance interactivity.
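The group density referred to above is a standard Social Network Analysis measure: the share of possible member-to-member ties that actually occur. A minimal sketch, assuming hypothetical comment data (the thesis's own interaction records are not reproduced here):

    from typing import Set, Tuple

    def density(n_members: int, ties: Set[Tuple[int, int]]) -> float:
        """Density of an undirected interaction network:
        realized ties divided by all possible ties, n*(n-1)/2."""
        possible = n_members * (n_members - 1) / 2
        return len(ties) / possible if possible else 0.0

    # Hypothetical 5-member group in which comments connect 6 member pairs.
    ties = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)}
    print(density(5, ties))  # 6 / 10 = 0.6, a fairly dense group

A group in which commenting centers on one or two members realizes far fewer of the possible ties and therefore scores a much lower density.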

Relevance:

20.00%

Publisher:

Abstract:

The number of molecular diagnostic assays has increased tremendously in recent years. Nucleic acid diagnostic assays have been developed especially for the detection of human pathogenic microbes and of genetic markers predisposing to certain diseases. Closed-tube methods are preferred because they are usually faster and easier to perform than heterogeneous methods; in addition, target nucleic acids are commonly amplified, so opening the reactions risks contaminating subsequent reactions with the amplification product. The present study introduces a new closed-tube PCR assay concept based on switchable complementation probes, in which two non-fluorescent probes form a fluorescent lanthanide chelate complex in the presence of the target DNA. In this dual-probe PCR assay, one oligonucleotide probe carries a non-fluorescent lanthanide chelate and the other a light-absorbing antenna ligand. The fluorescent lanthanide chelate complex is formed only when the non-fluorescent probes hybridize to adjacent positions on the target DNA, bringing the reporter moieties into close proximity. The complex forms by self-assembled lanthanide chelate complementation, in which the antenna ligand coordinates to the lanthanide ion captured in the chelate. The complementation-probe-based assays with time-resolved fluorescence measurement showed a low background signal level and hence relatively high nucleic acid detection sensitivity (low picomolar target concentrations). Different lanthanide chelate structures were explored, and a new cyclic seven-dentate lanthanide chelate was found suitable for the complementation probe method. It was also found to resist the relatively high PCR reaction temperatures, which was essential for the PCR assay applications. A seven-dentate chelate with two unoccupied coordination sites must be used instead of a more stable eight- or nine-dentate chelate because the antenna ligand needs to coordinate to the free coordination sites of the lanthanide ion. The previously used linear seven-dentate lanthanide chelate proved unstable under PCR conditions, hence the need for the new cyclic chelate. The complementation probe PCR assay method showed a high signal-to-background ratio, up to 300, owing to the low background fluorescence level, and the results (threshold cycles) in real-time PCR were reached approximately 6 amplification cycles earlier than with the commonly used FRET-based closed-tube PCR method. The suitability of the complementation probe method for different nucleic acid assay applications was studied. 1) A duplex complementation probe C. trachomatis PCR assay with a simple 10-minute urine sample preparation was developed to study the suitability of the method for clinical diagnostics. The performance of the C. trachomatis assay was equal to that of the commercial C. trachomatis nucleic acid amplification assay, which relies on a more complex, DNA extraction-based sample preparation. 2) A PCR assay for the detection of the HLA-DQA1*05 allele, which is used to predict the risk of type 1 diabetes, was developed to study the performance of the method in genotyping. A simple blood sample preparation was used in which the nucleic acids were released from dried blood sample punches using high temperature and alkaline reaction conditions. The complementation probe HLA-DQA1*05 PCR assay showed good genotyping performance, correlating 100% with the routinely used heterogeneous reference assay.
3) To study the suitability of the complementation probe method for direct measurement of a target organism, e.g., in culture media, the complementation probes were applied to amplification-free closed-tube bacteriophage quantification by measuring M13 bacteriophage ssDNA. A low picomolar bacteriophage concentration was detected in a rapid 20-minute assay. The assay provides a quick and reliable alternative to the commonly used but relatively unreliable UV photometry and to time-consuming culture-based bacteriophage detection methods, and it indicates that the method could also be used for direct measurement of other micro-organisms. The complementation probe PCR method has a low background signal level, leading to a high signal-to-background ratio and relatively sensitive nucleic acid detection. The method is compatible with simple sample preparation and was shown to tolerate residues of urine, blood, bacteria and bacterial culture media. The common trend in nucleic acid diagnostics is to create easy-to-use assays suitable for rapid near-patient analysis. The complementation probe PCR assays with brief sample preparation should be relatively easy to automate and would thus allow the development of high-performance nucleic acid amplification assays with a short overall assay time.
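To put the roughly six-cycle advantage in perspective: under ideal PCR kinetics the amplicon count doubles every cycle, so reaching the detection threshold ΔCt cycles earlier corresponds to a 2^ΔCt-fold difference in effective signal. This is a back-of-the-envelope illustration assuming 100% amplification efficiency, not a figure from the thesis:

    N_n = N_0 \cdot 2^{n} \qquad\Rightarrow\qquad 2^{\Delta C_t} = 2^{6} = 64

In other words, under this idealization the low-background complementation probe assay crosses its threshold with roughly 64-fold less accumulated product than the FRET-based comparison method needs.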

Relevance:

20.00%

Publisher:

Abstract:

Conventional diagnostic tests and technologies typically allow only a single analysis and result per test. The aim of this study was to propose robust and multiplex array-in-well test platforms based on oligonucleotide and protein arrays, combining the advantages of simple instrumentation and upconverting phosphor (UCP) reporter technology. UCPs are luminescent lanthanide-doped crystals that have the unique capability to convert infrared radiation into visible light. No autofluorescence is produced from the sample under infrared excitation, enabling the development of highly sensitive assays. In this study, an oligonucleotide array-in-well hybridization assay was developed for the detection and genotyping of human adenoviruses. The study verified the advantages and potential of the UCP-based reporter technology in multiplex assays, as well as anti-Stokes photoluminescence detection with a new anti-Stokes photoluminescence imager. The developed assay was technically improved and used to detect and genotype adenovirus types from clinical specimens. Based on the results of the epidemiological study, an outbreak of adenovirus type B03 was observed in the autumn of 2010. A quantitative array-in-well immunoassay was developed for three target analytes (prostate specific antigen, thyroid stimulating hormone, and luteinizing hormone). Quantitative results were obtained for each analyte, and the analytical sensitivities in buffer were in a clinically relevant range. Another protein-based array-in-well assay was developed for multiplex serodiagnostics. The developed assay was able to detect parvovirus B19 IgG and adenovirus IgG antibodies simultaneously from serum samples, in agreement with reference assays. The study demonstrated that UCP technology is a robust detection method for diverse multiplex imaging-based array-in-well assays.
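The absence of autofluorescence follows from photon energies: upconversion pools the energy of two or more absorbed infrared photons into one higher-energy visible photon, so emission occurs at a shorter wavelength than excitation (anti-Stokes). The wavelengths below are typical values for lanthanide-doped UCPs, given for illustration rather than taken from the thesis:

    E = \frac{hc}{\lambda}, \qquad 2 \times E(980\,\mathrm{nm}) \approx 2.5\,\mathrm{eV} > E(545\,\mathrm{nm}) \approx 2.3\,\mathrm{eV}

Ordinary autofluorescence is a Stokes process, emitting at wavelengths longer than the excitation, so under infrared excitation nothing else in the sample emits at the visible detection wavelength.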

Relevance:

20.00%

Publisher:

Abstract:

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design was examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics turned out to be the modelling of the horizontal steam generator; the major difficulty lies not in the code physics or numerics but in formulating a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that safety analysis inevitably involves significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which are usually conducted using statistical methods alone.
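For the statistically quantifiable part, such uncertainty assessment methods are commonly based on order statistics. As one standard illustration (not necessarily the procedure applied in this thesis), Wilks' formula gives the minimum number of code runs n needed so that the highest computed value bounds a fraction γ of the output population with confidence β:

    1 - \gamma^{n} \ge \beta \qquad\Rightarrow\qquad n \ge \frac{\ln(1 - \beta)}{\ln \gamma}

For the one-sided 95%/95% case (γ = β = 0.95) this gives n ≥ 59 runs. The thesis's point is that uncertainties lying outside this statistical machinery, such as those in the input model itself, must be addressed by other means.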

Relevance:

20.00%

Publisher:

Abstract:

How does a successful programmer work? The tasks of programming computer games and of programming industrial, safety-critical systems appear rather different. Through a careful empirical investigation, the thesis compares and contrasts these two forms of programming and shows that programming encompasses more than technical skill. Taking hermeneutic and rhetorical theory as its starting point, and drawing on both cultural studies and computer science, the thesis shows that programmers' tradition and values are fundamental to their work, and that both kinds of programming can be understood and analyzed through the classical tradition of text interpretation. Moreover, computer programs can be viewed and analyzed with the help of classical theories of speech production in practice; in this context, programs are seen as a kind of utterance. All in all, the thesis advocates a return to the foundations of science, which entail a constant and unceasing cyclical movement between experiencing and understanding. This stands in contrast to a reductionist view of science, which draws a sharp line between the subjective and the objective, and thus presupposes the possibility of attaining complete knowledge. Incomplete knowledge is the domain of interpretation and hermeneutics. The aim of the thesis is to demonstrate, by means of examples, the cultural, hermeneutic and rhetorical nature of programming.

Relevance:

20.00%

Publisher:

Abstract:

As the world becomes more technologically advanced and economies become globalized, computer science is evolving faster than ever before. With this evolution and globalization comes the need for sustainable university curricula that adequately prepare graduates for life in the industry. Additionally, behavioural or "soft" skills have become just as important as technical abilities and knowledge, the "hard" skills. The objective of this study was to investigate the current skill gap between computer science university graduates and actual industry needs, as well as the sustainability of current computer science university curricula, by conducting a systematic literature review of existing publications on the subject together with a survey of recently graduated computer science students and their work supervisors. A quantitative study was carried out with respondents from six countries, mainly Finland; 31 of the responses came from recently graduated computer science professionals and 18 from their employers. The observed trends suggest that a skill gap really does exist, particularly in "soft" skills, and that many companies are forced to provide additional training to newly graduated employees if these are to be successful at their jobs.

Relevance:

20.00%

Publisher:

Abstract:

The computer game industry has grown steadily for years, and in revenues it is comparable to the music and film industries. The industry has also been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industry practitioners and the scientific community; the significance of the business model concept has increased in the scientific literature recently, although the concept itself is still much debated. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers and organization leaders in 11 computer game companies were interviewed. The data was analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, as are technical skills in performing various activities. Marketing and customer relationships are likewise key elements of the computer game business model. Financing and partners are important especially for startups, where the organization depends on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.

Relevance:

20.00%

Publisher:

Abstract:

Today, user experience and usability are becoming major design issues in software applications as many processes are adapted to new technologies. Therefore, the study of user experience and usability should be part of every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available to evaluate these concepts, a non-expert may be unsure which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool for following the procedure of these user experience and usability testing methods.