916 results for information processing model
Abstract:
Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that produce huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that supports constant changes in human genome testing and provides patients with updated results based on the most recent, validated genetic knowledge. Our approach uses a common repository for process planning to ensure the reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we combine two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured by process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that can specify, validate, and perform genetic testing through simple end-user interfaces.
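To make the combination of process algebra and a relational repository concrete, here is a minimal sketch, assuming nothing about the actual CEGH schema: a testing workflow is written as an ACP-style term built from sequential and parallel composition, then flattened into parent-child rows of the kind a relational database could store. All task names are hypothetical.

```python
from dataclasses import dataclass
from typing import Union
import itertools

@dataclass(frozen=True)
class Task:
    name: str

@dataclass(frozen=True)
class Seq:          # sequential composition, ACP's p . q
    left: "Process"
    right: "Process"

@dataclass(frozen=True)
class Par:          # parallel composition, ACP's p || q
    left: "Process"
    right: "Process"

Process = Union[Task, Seq, Par]

def to_rows(p: Process, parent: int, ids, rows: list) -> list:
    """Flatten a process term into (id, parent_id, kind, name) rows."""
    pid = next(ids)
    if isinstance(p, Task):
        rows.append((pid, parent, "task", p.name))
    else:
        rows.append((pid, parent, "seq" if isinstance(p, Seq) else "par", None))
        to_rows(p.left, pid, ids, rows)
        to_rows(p.right, pid, ids, rows)
    return rows

# Hypothetical workflow: DNA extraction, then PCR and Sanger sequencing in parallel.
workflow = Seq(Task("extract_dna"), Par(Task("pcr"), Task("sanger_sequencing")))
for row in to_rows(workflow, 0, itertools.count(1), []):
    print(row)
```

Each stored row carries its operator kind, so the tree can be re-instantiated and checked against the same control-flow rules before execution.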
Abstract:
The continuous increase in genome sequencing projects has produced a huge amount of data over the last 10 years: currently more than 600 prokaryotic and 80 eukaryotic genomes are fully sequenced and publicly available. However, sequencing alone determines only raw nucleotide sequences. It is just the first step of the genome annotation process, which deals with assigning biological information to each sequence. Annotation takes place at every level of the biological information processing mechanism, from DNA to protein, and cannot be accomplished by in vitro analysis procedures alone, which become extremely expensive and time consuming when applied at such a large scale. Thus, in silico methods are needed to accomplish the task. The aim of this work was the implementation of predictive computational methods to allow fast, reliable, and automated annotation of genomes and proteins starting from amino acid sequences. The first part of the work focused on the implementation of a new machine learning based method for predicting the subcellular localization of soluble eukaryotic proteins. The method, called BaCelLo, was developed in 2006. Its main peculiarity is its independence from biases present in the training dataset, which cause the over-prediction of the most represented examples in all the other predictors developed so far. This result was achieved through my modification of the standard Support Vector Machine (SVM) algorithm, creating the so-called Balanced SVM. BaCelLo predicts the most important subcellular localizations in eukaryotic cells, and three kingdom-specific predictors were implemented. In two extensive comparisons, carried out in 2006 and 2008, BaCelLo was shown to outperform all the available state-of-the-art methods for this prediction task. BaCelLo was subsequently used to annotate 5 eukaryotic genomes in full, by integrating it into a pipeline of predictors developed at the Bologna Biocomputing group by Dr. Pier Luigi Martelli and Dr. Piero Fariselli. An online database, called eSLDB, was developed by integrating, for each amino acid sequence extracted from the genomes, the predicted subcellular localization merged with experimental and similarity-based annotations. In the second part of the work, a new machine learning based method was implemented for the prediction of GPI-anchored proteins. The method efficiently predicts, from the raw amino acid sequence, both the presence of the GPI anchor (by means of an SVM) and the position in the sequence of the post-translational modification event, the so-called ω-site (by means of a Hidden Markov Model, HMM). The method, called GPIPE, was shown to greatly improve prediction performance for GPI-anchored proteins over all previously developed methods. GPIPE predicted up to 88% of the experimentally annotated GPI-anchored proteins while keeping the false positive rate as low as 0.1%. GPIPE was used to annotate 81 eukaryotic genomes in full, and more than 15000 putative GPI-anchored proteins were predicted, 561 of which are found in H. sapiens. On average, 1% of a proteome is predicted to be GPI-anchored. A statistical analysis of the composition of the regions surrounding the ω-site defined specific amino acid abundances in the different regions considered. Furthermore, the hypothesis, proposed in the literature, that compositional biases are present among the four major eukaryotic kingdoms was tested and rejected. All the developed predictors and databases are freely available at: BaCelLo, http://gpcr.biocomp.unibo.it/bacello; eSLDB, http://gpcr.biocomp.unibo.it/esldb; GPIPE, http://gpcr.biocomp.unibo.it/gpipe
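The published Balanced SVM modifies the SVM optimisation itself; a rough functional analogue (not the BaCelLo code) is scikit-learn's per-class error weighting, which likewise counteracts over-prediction of the most represented class. The features below are a synthetic stand-in for sequence-derived descriptors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Toy stand-in for sequence features: 300 majority-class vs 30 minority-class proteins.
X = np.vstack([rng.normal(0.0, 1.0, (300, 20)), rng.normal(0.5, 1.0, (30, 20))])
y = np.array([0] * 300 + [1] * 30)

plain = SVC(kernel="rbf")                               # tends to favour the majority class
balanced = SVC(kernel="rbf", class_weight="balanced")   # penalises minority errors more
print(cross_val_score(plain, X, y, scoring="balanced_accuracy").mean())
print(cross_val_score(balanced, X, y, scoring="balanced_accuracy").mean())
```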
Abstract:
The title of my thesis is "The Role of Ideas and Their Change in Higher Education Policy-Making Processes from the Eighties to the Present Day: The Cases of England and New Zealand in Comparative Perspective". From a theoretical point of view, the aim of my work is to carry out research modelled on constructivist theory. It focuses on analysing the impact of ideas on policy-making processes by means of epistemic communities, think tanks and the various socioeconomic contexts that may have played a key role in the construction of the different paths. From my point of view, ideas constitute a priority research field worth analysing, since their role in policy-making processes has traditionally been rather unexplored. In this context, and with the aim of developing a research strand based on the role of ideas, I intend to carry on my study from the perspective of change. Depending on the data and information collected, I evaluated the weight of each of these variables, and possibly of others such as institutions and individual interests, which may have influenced the formation of policy-making processes. In this light, I planned to adopt a qualitative research methodology, which I believe to be very effective compared with the more difficult and possibly reductive application of quantitative data sets. I therefore consider that the most appropriate tools for information processing include content analysis and in-depth interviews with figures from the political panorama (élite or not) who have participated in the process of higher education reform from the eighties to the present day. The two cases taken into consideration set an example of radical reform processes which have occurred in quite different contexts, determined by socioeconomic characteristics and the traits of the élite. In New Zealand the described process took place at a steady pace and with a good degree of consequentiality, in line with reforms in other state divisions driven by the ideas of New Public Management. By contrast, in England the reforming action of Margaret Thatcher acquired a very radical connotation, as it brought into the ambit of higher education policy concepts such as efficiency, excellence and rationalization that contrasted with the generalistic and mass-oriented ideas fashionable during the seventies. The mission I intend to accomplish throughout my research is to investigate and analyse in more depth the differences that seem to emerge from two contexts which most of the literature regards as a single model: the Anglo-Saxon model. In this light, the close analysis of policy processes brought out both the controversial and contrasting aspects of the two realities compared, and the role and weight of variables such as ideas (the main variable), institutional settings and individual interests acting in each context. The cases I intend to examine present peculiar aspects worth developing in an in-depth analysis, an outline of which is provided in this abstract. England: from 1981 the Conservative government introduced radical changes in the higher education sector, first by cutting down on state funding and then by creating an institution for the planning and leadership of the polytechnics (the non-university sector).
Afterwards, the 1988 school reform by Margaret Thatcher caused a great stir all over Europe, due both to its considerable innovative imprint and to its strong attack on the pedagogy of "active" schooling and progressive education, until then recognized as a merit of the British public school. In the ambit of university education this reform, together with similar measures brought in during 1992, put the Conservative principles into practice through a series of actions that included: the suppression of the irremovability principle for university teachers; the introduction of student loans for low-income students; and the cancellation of the clear distinction between universities and polytechnics. The policies of Mr Blair's Labour majority did not diverge much from the Conservatives' position. In 2003, Blair's cabinet risked becoming a minority precisely on the occasion of an important university reform proposal. This proposal foresaw the autonomy for universities to raise enrolment fees for students up to 3,000 pounds (whereas formerly the ceiling was 1,125 pounds). Blair had to face internal opposition within his own party over a measure that, according to the 150 MPs promoting an adverse motion, had not been included in the electoral programme and risked creating income-based discrimination among students. As a matter of fact, the bill focused on the introduction of very low-interest student loans to be settled only once the student had found remunerated employment (a system already provided for by Australian legislation). New Zealand: contrary to many other countries, New Zealand has adopted a very wide vision of tertiary education. It includes, in fact, the full educational programme that is internationally recognized as post-secondary education. Should we spotlight one peculiarity of New Zealand tertiary education policy, it would be "change". Looking at the reform history of the tertiary education system, we can clearly identify four sub-periods from the eighties to the present day: 1. Before the 80s: an elitist system characterized by low participation rates. 2. Between the mid and late 80s: a trend towards the enlargement of participation associated with greater competition. 3. 1990-1999: a further step towards a competitive, market-oriented model. 4. From 2000 to today: a continuous evolution towards a more competitive, market-oriented model, together with growing attention to state control for the social and economic development of the nation. At present the government of New Zealand is working to strengthen this process, primarily in relation to the role of tertiary education as a steady factor of national welfare, where professional development contributes actively to the growth of the national economic system. The cases of England and New Zealand are the focus of an in-depth investigation that starts from an analysis of the policies of each nation and develops into a comparative study. At this point I attempt to draw some preliminary impressions from the facts described above. The university policies of England and New Zealand have both undergone a significant reform process since the early eighties; in both contexts the importance of the ideas that constituted the basis of politics until 1980 was quite relevant.
Generally speaking, in both cases the pre-reform policies were inspired by egalitarianism and the expansion of the student population, while those brought in by the reform pursued efficiency, quality and competitiveness. Undoubtedly, within this general tendency, which reflects the hypothesis proposed, the two university systems present several differences. The university system in New Zealand proceeded steadily towards the implementation of a managerial conception of tertiary education, especially from 1996 onwards, in accordance with the reform of the whole public sector. In the United Kingdom, as in the rest of Europe, the new approach to university policy-making had to confront a deep-rooted tradition of progressive education and the idea of education expansion that in fact dominated until the eighties. From this viewpoint, the governing action of Margaret Thatcher gave rise to a radical change that revolutionized the objectives and key values of the whole educational system, particularly in the higher education sector. Ideas such as efficiency, excellence and performance control became decisive. The Labour cabinets of Blair developed in the wake of the Conservative reforms. This appears to be a focal point of this study, which observes how in New Zealand, too, the reforming process occurred transversely across progressive and conservative administrations. The preliminary impression is therefore that ideas deeply mark reform processes: the aim of my research is to verify to what extent this statement is true. In order to build a comprehensive analysis, further significant factors will have to be investigated: the way ideas are perceived and implemented by the different political elites; how the various socioeconomic contexts influence the reform process; how institutional structures condition policy-making processes; and whether individual interests play a role and, if so, to what extent.
Abstract:
Theoretical models are developed for the continuous-wave and pulsed laser incision and cutting of thin single- and multi-layer films. A one-dimensional steady-state model establishes the theoretical foundations of the problem by combining a power-balance integral with heat flow in the direction of laser motion. In this approach, classical modelling methods for laser processing are extended by introducing multi-layer optical absorption and thermal properties. The calculation domain is correspondingly divided as individual layers are progressively removed. A second, time-domain numerical model for the short-pulse laser ablation of metals accounts for changes in optical and thermal properties during a single laser pulse. With sufficient fluence, the target surface is heated towards its critical temperature and homogeneous boiling, or "phase explosion", takes place. Improvements over previous work are obtained through more accurate calculation of the optical absorption and of the shielding of the incident beam by the ablation products. A third, general time-domain numerical laser processing model combines ablation depth and energy absorption data from the short-pulse model with two-dimensional heat flow in an arbitrary multi-layer structure. Layer removal results both from progressive short-pulse ablation and from classical vaporisation due to long-term heating of the sample. At low velocity, pulsed laser exposure of multi-layer films comprising aluminium-plastic and aluminium-paper is found to be characterised by short-pulse ablation of the metallic layer and vaporisation or degradation of the other layers due to thermal conduction from the former. At high velocity, all layers of the two films are ultimately removed by vaporisation or degradation as the average beam power is increased to achieve a complete cut. The transition velocity between the two characteristic removal regimes is shown to be a function of the pulse repetition rate. An experimental investigation validates the simulation results and provides new laser processing data for some typical packaging materials.
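For orientation, the kind of power balance a one-dimensional steady-state incision model builds on can be written in a textbook form (a lumped illustration, not the thesis's exact multi-layer formulation): absorbed beam power equals the enthalpy carried away by the removed material.

```latex
\[
  \eta P \;=\; \rho\, v\, w\, d \left[ c_p \,(T_v - T_0) + L_m + L_v \right]
\]
% eta: absorptivity, P: beam power, rho: density, v: traverse speed,
% w: kerf width, d: layer thickness, c_p: specific heat,
% T_v: vaporisation temperature, T_0: ambient temperature,
% L_m, L_v: latent heats of melting and vaporisation
```

In the multi-layer extension described above, such a balance is applied per layer, with layer-specific absorptivity and thermal properties.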
Abstract:
The rapid development of the computer industry through the continuous shrinking of transistors is approaching ever faster the limit of silicon technology, beyond which tunnelling processes in the transistors no longer permit further miniaturization or an increase in their density in processors. The future of computer technology lies in the processing of quantum information. For the development of quantum computers, the detection and targeted manipulation of individual spins in solids is of the greatest importance. Standard spin-detection methods such as ESR, however, only allow the detection of spin ensembles. The idea that should make the readout of single spins possible is to carry out the manipulation separately from the detection. The NV− centre is a special lattice defect in diamond that can be used as an atomic, optically readable magnetic-field sensor. By measuring its fluorescence, it should be possible to detect, via spin-spin coupling, the manipulation of other, optically undetectable "dark spins" in the immediate vicinity of the NV centre. The proposed quantum computer model is based on N@C60 enclosed in SWCNTs. The peapods, as the units of nitrogen-containing fullerenes packed into carbon nanotubes are called, are to form the basis for the computing units of a truly scalable quantum computer. The computations carried out in them with the nitrogen electron spin are to be read out optically via the near-surface NV centres (of diamond plates) above which they are to be positioned. The primary goal of the present work was to optically detect the coupling of near-surface single NV centres to the optically undetectable spins of radical molecules on the diamond surface by means of ODMR coupling experiments, and thereby to take decisive steps towards the realization of a quantum register. An ODMR setup still at the development stage was rebuilt, and its existing functionality was verified on commercial NV-centre-rich nanodiamonds. In the next step, the efficiency and mode of measurement were adapted to the detection and manipulation of single NV centres implanted near the surface (< 7 nm deep) in diamond plates. A very large part of the work, which can only partly be described here, consisted of adapting the existing control software to the demands of practical measurement. Subsequently, the correct functioning of all implemented pulse sequences and other software improvements was verified by measurements on near-surface implanted single NV centres. The measurement setup was also extended with the components required for double-resonance measurements, such as a controllable electromagnet and an RF signal source. Taking the thermal stability of N@C60 into account, an optical cryostat was also planned, built, integrated into the setup, and characterized for future experiments. The spin-spin coupling experiments were carried out with the oxygen-stable galvinoxyl radical as a model system for coupling. Via the coupling to an NV centre, the RF spectrum of the coupled radical spin was observed.
A Rabi nutation of the coupled spin could also be recorded. Further aspects of the peapod measurement and of surface implantation were also considered. It was investigated whether NV detection is disturbed by SWCNTs, peapods or fullerenes. It turned out that the components of the planned quantum computer, with the exception of the C60 clusters, are not detectable in an ODMR measurement arrangement and will not disturb the NV measurement. It was also considered which kinds of commercial diamond plates are suitable for surface implantation; a density of implanted NV centres suitable for the coupling measurements was estimated, and an implantation at the estimated density was examined.
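For orientation, the Rabi nutation mentioned above follows generic two-level dynamics: the spin-flip probability oscillates as sin²(Ωt/2) with pulse length t. A tiny sketch, where the Rabi frequency is an assumed placeholder, not the measured value:

```python
import numpy as np

omega = 2 * np.pi * 5e6        # assumed Rabi frequency Omega: 5 MHz (placeholder)
t = np.linspace(0, 200e-9, 9)  # pulse lengths up to 200 ns
p_flip = np.sin(omega * t / 2) ** 2   # two-level spin-flip probability
for ti, pi in zip(t, p_flip):
    print(f"{ti * 1e9:5.0f} ns -> P_flip = {pi:.3f}")
```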
Abstract:
In this paper we propose a new system that allows reliable acetabular cup placement when total hip arthroplasty (THA) is performed through a lateral approach. Conceptually, it combines the accuracy of computer-generated, patient-specific morphological information with an easy-to-use mechanical guide, which effectively uses natural gravity as the angular reference. The former is achieved by a statistical shape model-based 2D-3D reconstruction technique that can generate a scaled, patient-specific 3D shape model of the pelvis from a single conventional anteroposterior (AP) pelvic X-ray radiograph. The reconstructed 3D shape model facilitates reliable and accurate co-registration of the mechanical guide with the patient's anatomy in the operating theater. We validated the accuracy of our system in experiments placing seven cups in four pelvises with different morphologies. Taking the measurements from an image-free navigation system as the ground truth, our system showed an average accuracy of 2.1° ± 0.7° for inclination and 1.2° ± 1.4° for anteversion.
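To illustrate the two reported angles, here is a hedged sketch (not the paper's algorithm) converting a cup axis direction into radiographic inclination and anteversion using Murray's definitions, with assumed pelvic axes x = lateral, y = anterior, z = cranial:

```python
import numpy as np

def cup_angles(axis):
    """Radiographic inclination/anteversion (Murray) from a cup axis vector."""
    a = np.asarray(axis, dtype=float)
    a /= np.linalg.norm(a)
    anteversion = np.degrees(np.arcsin(a[1]))               # tilt out of the coronal plane
    inclination = np.degrees(np.arctan2(abs(a[0]), a[2]))   # from cranial axis, in coronal plane
    return inclination, anteversion

# Hypothetical cup axis: roughly 40 degrees inclination with moderate anteversion.
print(cup_angles([0.64, 0.25, 0.77]))
```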
Abstract:
The construction of a reliable, practically useful prediction rule for a future response depends heavily on the "adequacy" of the fitted regression model. In this article, we consider the absolute prediction error, the expected value of the absolute difference between the future and predicted responses, as the model evaluation criterion. This prediction error is easier to interpret than the average squared error and is equivalent to the misclassification error for binary outcomes. We show that the distributions of the apparent error and its cross-validation counterparts are approximately normal even under a misspecified fitted model. When the prediction rule is "unsmooth", the variance of the above normal distribution can be estimated well via a perturbation-resampling method. We also show how to approximate the distribution of the difference of the estimated prediction errors from two competing models. With two real examples, we demonstrate that the resulting interval estimates for prediction errors provide much more information about model adequacy than point estimates alone.
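As a concrete illustration of the criterion, the following sketch (a generic linear-regression example, not the article's data or its perturbation-resampling scheme) computes the apparent absolute prediction error and its cross-validation counterpart:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=200)

model = LinearRegression().fit(X, y)
apparent = np.mean(np.abs(y - model.predict(X)))            # in-sample, optimistic
cv_pred = cross_val_predict(LinearRegression(), X, y, cv=10)
cv_error = np.mean(np.abs(y - cv_pred))                     # out-of-sample estimate
print(f"apparent: {apparent:.3f}, cross-validated: {cv_error:.3f}")
```

The article's point is that interval estimates around such numbers, available because both errors are approximately normal, say far more about model adequacy than the point values printed here.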
Abstract:
Based on an integrative brain model which focuses on memory-driven and EEG state-dependent information processing for the organisation of behaviour, we used the developmental changes of the awake EEG to further investigate the hypothesis that neurodevelopmental abnormalities (deviations in the organisation and reorganisation of cortico-cortical connectivity during development) are involved in the pathogenesis of schizophrenia. First-episode, neuroleptic-naive schizophrenics, their matched controls, and three age groups of normal adolescents were studied (total: 70 subjects). 19-channel EEG delta-theta, alpha and beta spectral band centroid frequencies during resting (baseline) and after verbal stimuli were used as a measure of the attained complexity level and momentary excitability of the neuronal network (working memory). Schizophrenics compared with all control groups showed lower delta-theta centroids and higher alpha and beta centroids. Reactivity centroids (centroid after stimulus minus centroid during resting) were used as a measure of the updating of working memory. Schizophrenics showed partial similarities in delta-theta and beta reactivity centroids with the 11-year-olds and in alpha reactivity centroids with the 13-year-olds. Within the framework of our model, the results suggest multifactorially elicited imbalances in the excitability of neuronal networks in schizophrenia, resulting in network activation at dissociated complexity levels, partially regressed and partially prematurely developed. It is hypothesised that the activation of age- and/or state-inadequate representations for coping with reality becomes manifest as productive schizophrenic symptoms. Thus, the results support some aspects of the neurodevelopmental hypothesis.
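As an illustration of the centroid measure, a hedged sketch (a generic computation, not the study's exact pipeline): the band centroid is the power-weighted mean frequency within a spectral band.

```python
import numpy as np
from scipy.signal import welch

def band_centroid(signal, fs, lo, hi):
    """Power-weighted mean frequency of `signal` within the [lo, hi] Hz band."""
    f, pxx = welch(signal, fs=fs, nperseg=2 * fs)
    m = (f >= lo) & (f <= hi)
    return np.sum(f[m] * pxx[m]) / np.sum(pxx[m])

fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 20, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
print(band_centroid(eeg, fs, 8, 12))       # alpha-band centroid, near 10 Hz here
```

A reactivity centroid in the study's sense would then be the difference between this value computed after a stimulus and during resting.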
Abstract:
OBJECTIVE: To investigate whether autistic subjects show a different pattern of neural activity than healthy individuals during processing of faces and complex patterns. METHODS: Blood oxygen level-dependent (BOLD) signal changes accompanying visual processing of faces and complex patterns were analyzed in an autistic group (n = 7; 25.3 [6.9] years) and a control group (n = 7; 27.7 [7.8] years). RESULTS: Compared with unaffected subjects, autistic subjects demonstrated lower BOLD signals in the fusiform gyrus, most prominently during face processing, and higher signals in the more object-related medial occipital gyrus. Further signal increases in autistic subjects vs controls were found in regions highly important for visual search: the superior parietal lobule and the medial frontal gyrus, where the frontal eye fields are located. CONCLUSIONS: The cortical activation pattern during face processing indicates deficits in the face-specific regions, with higher activations in regions involved in visual search. These findings reflect different strategies for visual processing, supporting models that propose a predisposition to local rather than global modes of information processing in autism.
Abstract:
Artificial neural networks are based on computational units that resemble basic information processing properties of biological neurons in an abstract and simplified manner. Generally, these formal neurons model an input-output behaviour of the kind also often used to characterize biological neurons. The neuron is treated as a black box; the spatial extension and temporal dynamics present in biological neurons are most often neglected. Even though artificial neurons are simplified, they can show a variety of input-output relations, depending on the transfer functions they apply. This unit on transfer functions provides an overview of different transfer functions and offers a simulation that visualizes the input-output behaviour of an artificial neuron depending on the specific combination of transfer functions.
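The following small sketch mirrors what such a simulation shows: the same weighted-sum neuron produces qualitatively different input-output relations depending only on the transfer function applied (the weights and inputs here are illustrative).

```python
import numpy as np

transfer = {
    "threshold": lambda a: float(a >= 0),            # binary step
    "logistic":  lambda a: 1.0 / (1.0 + np.exp(-a)), # smooth, bounded (0, 1)
    "tanh":      np.tanh,                            # smooth, bounded (-1, 1)
    "linear":    lambda a: a,                        # identity
}

def neuron(x, w, b, f):
    """Formal neuron: weighted sum of inputs plus bias, passed through f."""
    return f(np.dot(w, x) + b)

x, w, b = np.array([0.5, -1.0]), np.array([1.2, 0.7]), 0.1
for name, f in transfer.items():
    print(f"{name:9s} -> {float(neuron(x, w, b, f)):+.3f}")
```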
Abstract:
The increasing practice of offshore outsourcing of software maintenance has posed the challenge of effectively transferring knowledge to the vendor's individual software engineers. In this theoretical paper, we discuss the implications of two learning theories, the model of work-based learning (MWBL) and cognitive load theory (CLT), for knowledge transfer during the transition phase. Taken together, the theories suggest that learning mechanisms need to be aligned with the type of knowledge (tacit versus explicit), task characteristics (complexity and recurrence), and the recipients' expertise. The MWBL proposes that learning mechanisms need to include conceptual and practical activities based on the relative importance of explicit and tacit knowledge. CLT explains how effective portfolios of learning mechanisms change over time. While job shadowing, completion tasks, and supportive information may prevail at the outset of transition, they may be replaced by work on conventional tasks towards the end of transition.
Abstract:
So far, social psychology in sport has primarily focused on team cohesion, and many studies and meta-analyses have tried to demonstrate a relation between the cohesiveness of a team and its performance. How a team really co-operates, and how individual actions are integrated into a team action, is a question that has received relatively little attention in research. This may, at least in part, be due to the lack of a theoretical framework for collective actions, a dearth that has only recently begun to challenge sport psychologists. In this presentation, a framework for a comprehensive theory of teams in sport is outlined and its potential to integrate the following presentations is put up for discussion. Based on a model developed by von Cranach, Ochsenbein and Valach (1986), teams are information processing organisms, and team actions need to be investigated on two levels: the individual team member and the group as an entity. The elements to be considered are the task, the social structure, the information processing structure and the execution structure. Obviously, different tasks require different social structures, communication and co-ordination. From a cognitivist point of view, internal representations (or mental models) guide behaviour mainly in situations requiring quick reactions and adaptations, where deliberate or contingency planning is difficult. In sport teams, the collective representation contains the elements of the team situation, that is, the team task and team members, and of the team processes, that is, communication and co-operation. Different meta-perspectives may be distinguished and have the potential to explain the actions of efficient teams. Cranach, M. von, Ochsenbein, G., & Valach, L. (1986). The group as a self-active system: Outline of a theory of group action. European Journal of Social Psychology, 16, 193-229.
Abstract:
BACKGROUND: Research on the comorbidity of psychiatric disorders identifies broad superordinate dimensions as the underlying structure of psychopathology. While a syndrome-level approach informs diagnostic systems, a symptom-level approach is more likely to represent the dimensional components within existing diagnostic categories. It may capture general emotional, cognitive or physiological processes as underlying liabilities of different disorders and thus further develop dimensional-spectrum models of psychopathology. METHODS: Exploratory and confirmatory factor analyses were used to examine the structure of psychopathological symptoms assessed with the Brief Symptom Inventory in two outpatient samples (n=3171), including several correlated-factors and bifactor models. The preferred models were correlated with DSM diagnoses. RESULTS: A model containing eight correlated factors for depressed mood, phobic fear, aggression, suicidal ideation, nervous tension, somatic symptoms, information processing deficits, and interpersonal insecurity, as well as a bifactor model, fit the data best. Distinct patterns of correlations with DSM diagnoses identified a) distress-related disorders, i.e., mood disorders, PTSD, and personality disorders, which were associated with all correlated factors as well as the underlying general distress factor; b) anxiety disorders, with more specific patterns of correlations; and c) disorders defined by behavioural or somatic dysfunctions, which were characterised by non-significant or negative correlations with most factors. CONCLUSIONS: This study identified emotional, somatic, cognitive, and interpersonal components of psychopathology as transdiagnostic psychopathological liabilities. These components can contribute to a more accurate description and taxonomy of psychopathology, may serve as phenotypic constructs for further aetiological research, and can inform the development of tailored general and specific interventions to treat mental disorders.
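scikit-learn offers neither confirmatory nor bifactor modelling, so the following is only a hedged sketch of the exploratory step on synthetic symptom scores (all numbers hypothetical): fit a latent-factor model and inspect the loadings.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
latent = rng.normal(size=(500, 2))                  # two hypothetical liabilities
loadings = rng.normal(size=(2, 12))                 # 12 symptom items
items = latent @ loadings + rng.normal(scale=0.5, size=(500, 12))

fa = FactorAnalysis(n_components=2).fit(items)
print(np.round(fa.components_, 2))                  # estimated loadings (2 x 12)
```

Dedicated packages would be needed for the correlated-factors and bifactor comparisons reported above.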
Abstract:
The task of encoding and processing complex sensory input requires many types of transsynaptic signals. This requirement is served in part by an extensive group of neurotransmitter substances, which may include thirty or more different compounds. At the next level of information processing, the existence of multiple receptors for a given neurotransmitter appears to be a widely used mechanism for generating multiple responses to a given first messenger (Snyder and Goodman, 1980). Despite the wealth of published data on GABA receptors, the existence of more than one GABA receptor was in doubt until the mid-1980s. Presently there is still disagreement on the number of types of GABA receptors, estimates of which range from two to four (DeFeudis, 1983; Johnston, 1985). Part of the problem in evaluating data concerning multiple receptor types is the lack of information on the number of gene products and their subsequent supramolecular organization in different neurons. In order to evaluate the question of the diversity of GABA receptors in the nervous system, we must rely on indirect information derived from a wide variety of experimental techniques. These include pharmacological binding studies on membrane fractions, electrophysiological studies, localization studies, purification studies, and functional assays. Almost all parts of the central and peripheral nervous system use GABA as a neurotransmitter, and these experimental techniques have therefore been applied to many different parts of the nervous system for the analysis of GABA receptor characteristics. We are left with a large amount of data from a wide variety of techniques derived from many parts of the nervous system. When this project was initiated in 1983, there were only a handful of pharmacological tools with which to assess the question of multiple GABA receptors. The approach adopted was to focus on a single model system, using a variety of experimental techniques, in order to evaluate the existence of multiple forms of GABA receptors. Using the in vitro rabbit retina, a combination of pharmacological binding studies, functional release studies and partial purification studies was undertaken to examine the GABA receptor composition of this tissue. Three types of GABA receptors were observed: A1 receptors coupled to benzodiazepine and barbiturate modulation, A2 or uncoupled GABA-A receptors, and GABA-B receptors. These results are evaluated and discussed in light of recent findings by others concerning the number and subtypes of GABA receptors in the nervous system.
Abstract:
Currently, dramatic changes are happening in the IS development industry. The incumbent system developers (hubs) are embracing partnerships with less well established companies (spokes) acting in specific niches. This paper seeks to establish a better understanding of the motives for this strategy. Relying on existing work on strategic alliance formation, it is argued that partnering is particularly attractive if these small companies possess certain capabilities that are difficult to obtain through arrangements other than partnering. Again drawing on the literature, three categories of capabilities are identified: the capability to innovate within their niche, the capability to provide specific functionality that can be integrated with the incumbents' systems, and the capability to address novel markets. These factors are analyzed through a case study. The case represents a market leader in the global IS development industry that fosters a network of smaller partner firms. The study reveals that temporal dynamics between the identified factors play a dominant role in these networks. A cyclical partnership model is developed that attempts to explain the life cycle of a partnership within such a network.