28 results for computer-based instruction
Abstract:
ABSTRACT Adult neuronal plasticity is a term that refers to the set of biological mechanisms allowing a neuronal circuit to respond and adapt to modifications of the inputs it receives. The mystacial whiskers of the mouse are the starting point of a major sensory pathway that provides the animal with information about its immediate environment. Through whisking, the animal gathers information that allows it to orient itself and to recognize objects. This sensory system is crucial for nocturnal behaviour, during which vision is of little use. Sensory information from the whiskers is sent via the brainstem and thalamus to the primary somatosensory area (S1) of the cerebral cortex in a strictly topological manner. Cell bodies in layer IV of S1 are arranged in rings forming structures called barrels. As such, each barrel corresponds to the cortical representation in layer IV of a single whisker follicle. This histological feature makes it possible to identify with utmost precision the part of the cortex devoted to a given whisker and to study the modifications induced by different experimental conditions. The condition used in the studies of my thesis is the passive stimulation of one whisker in the adult mouse for a period of 24 hours. It is performed by gluing a piece of metal onto one whisker and placing the awake animal in a cage surrounded by an electromagnetic coil that generates magnetic field bursts, inducing whisker movement at a given frequency for 24 hours. I analysed the ultrastructure of the barrel corresponding to the stimulated whisker using serial-section electron microscopy and computer-based three-dimensional reconstructions; analysis of neighbouring, unstimulated barrels, as well as of barrels from unstimulated mice, served as control. The following elements were structurally analysed: the spiny dendrites, the axons of excitatory as well as inhibitory cells, their connections via synapses, and the astrocytic processes.
The density of synapses and spines is upregulated in a barrel corresponding to a stimulated whisker. This upregulation is absent in BDNF heterozygous mice, indicating that a certain level of activity-dependent BDNF release is required for synaptogenesis in the adult cerebral cortex. Synaptogenesis is correlated with a modification of the astrocytes, which move into closer vicinity of the excitatory synapses on spines. Biochemical analysis revealed that the astrocytes upregulate the expression of the transporters by which they internalise glutamate, the neurotransmitter responsible for the excitatory response of cortical neurons. In the final part of my thesis, I show that synaptogenesis in the stimulated barrel is due to an increase in the size of excitatory axonal boutons, which become more frequently multisynaptic, whereas the inhibitory axons do not change their morphology but form more synapses with the spines apposed to them. Taken together, my thesis demonstrates that all the cellular elements present in the neuronal tissue of the adult brain contribute to activity-dependent cortical plasticity and form part of a mechanism by which the animal responds to a modified sensory experience. Throughout life, the neuronal circuit keeps the faculty to adapt its function. These adaptations are partially transitory, but some aspects remain and could be the structural basis of a memory trace in the cortical circuit. RESUME (translated from the French): Adult neuronal plasticity designates a set of biological mechanisms that allow neuronal circuits to respond and adapt to modifications of the stimulation they receive. The whiskers of mice are a crucial system providing sensory information about the animal's environment. The sensory information collected by the whiskers is sent via the brainstem and the thalamus to the primary somatosensory area (S1) of the cerebral cortex, strictly respecting somatotopy.
Cell bodies in layer IV of S1 are organised in rings delimiting structures called barrels. Each barrel receives the information of a single whisker, and the arrangement of the barrels in the cortex matches the arrangement of the whiskers on the mouse's snout. This histological feature makes it possible to identify with certainty the part of the cortex devoted to a given whisker and to study it under various conditions. The experimental paradigm used in this thesis is the passive stimulation of a single whisker for 24 hours. To this end, a small piece of metal is glued onto one whisker and the mouse is placed in a cage surrounded by an electromagnetic coil generating a field that makes the piece of metal vibrate for 24 hours. We analyse the ultrastructure of the cerebral cortex using electron microscopy of serial sections, which allows computer-based three-dimensional reconstruction. We examine the modifications of the structures present: the spiny dendrites, the axons of excitatory and inhibitory cells, their connections via synapses, and the astrocytes. The number of synapses and spines is increased in a barrel corresponding to a whisker stimulated for 24 hours. Building on this, we show in this work that this response is not observed in heterozygous BDNF+/- mice. This neurotrophin, released in an activity-dependent manner, is therefore required for synaptogenesis. Synaptogenesis is accompanied by a modification of the astrocytes, which move closer to the excitatory synapses at the level of the dendritic spines. They also express more of the transporters responsible for internalising glutamate, the neurotransmitter responsible for the excitatory response of neurons.
We also show that following stimulation the excitatory axons become larger and form more multi-synaptic boutons, whereas the inhibitory axons do not change their morphology but form more synapses with the spines apposed to their membrane. All the elements analysed in the adult brain have retained the capacity to react to modifications of neuronal activity, allowing constant adaptation to new environments throughout life. Neuronal circuits keep the capacity to create new synapses. These adaptations can be transient responses to stimuli but can also leave a memory trace in the circuits.
Abstract:
How communication systems emerge and remain stable is an important question in both cognitive science and evolutionary biology. For communication to arise, not only must individuals cooperate by signaling reliable information, but they must also coordinate and perpetuate signals. Most studies on the emergence of communication in humans consider scenarios where individuals implicitly share the same interests. Likewise, most studies on human cooperation consider scenarios where shared conventions of signals and meanings cannot be developed de novo. Here, we combined both approaches in an economic experiment where participants could develop a common language, but under different conditions fostering or hindering cooperation. Participants endeavored to acquire a resource through a learning task in a computer-based environment. After this task, participants had the option to transmit a signal (a color) to a fellow group member, who would subsequently play the same learning task. We varied the scale at which participants competed with each other (global or local) and the cost of transmitting a signal (costly or noncostly), and tracked the way in which signals were used for communication among players. Under global competition, players signaled more often and more consistently, scored higher individual payoffs, and established shared associations of signals and meanings. In addition, costly signals were more likely to be used under global competition, whereas under local competition fewer signals were sent and no effective communication system developed. Our results demonstrate that communication involves both a coordination and a cooperative dilemma, and they show the importance of studying language evolution under different conditions influencing human cooperation.
Abstract:
In recent years, Business Model Canvas design has evolved from a paper-based activity to one that involves dedicated computer-aided business model design tools. We propose a set of guidelines to help design more coherent business models. When combined with the functionalities offered by CAD tools, they show great potential to improve business model design as an ongoing activity. However, before more complex solutions are created, it is necessary to compare how basic business model design tasks are performed with a CAD system versus its paper-based counterpart. To this end, we carried out an experiment to measure user perceptions of both solutions. Performance was evaluated by applying our guidelines to both solutions and then comparing the resulting business model designs. Although CAD did not outperform paper-based design, the results are very encouraging for the future of computer-aided business model design.
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or from the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions were fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics were evaluated, and the performance of the method was assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that using Hue and Edge filters, or their combination, to extract profiles from the images, and then comparing the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first-line triage that may help target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
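The Canberra distance named above as the best-performing comparison metric can be sketched as follows; the profile values are invented for illustration and are not taken from the study's document images.

```python
def canberra_distance(p, q):
    """Canberra distance between two equal-length image profiles.

    Each term |p_i - q_i| / (|p_i| + |q_i|) lies in [0, 1], so the metric
    weights relative rather than absolute differences, which suits the
    comparison of normalised intensity profiles extracted from scans.
    """
    if len(p) != len(q):
        raise ValueError("profiles must have the same length")
    total = 0.0
    for a, b in zip(p, q):
        denom = abs(a) + abs(b)
        if denom > 0:          # skip terms where both values are zero
            total += abs(a - b) / denom
    return total

# Two hypothetical hue profiles extracted from document images:
profile_a = [0.10, 0.40, 0.30, 0.20]
profile_b = [0.12, 0.38, 0.31, 0.19]
print(canberra_distance(profile_a, profile_b))
```

A small distance suggests two documents may share a source; the actual decision thresholds would come from the calibration sets described in the abstract.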
Abstract:
BACKGROUND: This study describes the prevalence, associated anomalies, and demographic characteristics of cases of multiple congenital anomalies (MCA) in 19 population-based European registries (EUROCAT) covering 959,446 births in 2004 and 2010. METHODS: EUROCAT implemented a computer algorithm for the classification of congenital anomaly cases, followed by manual review of potential MCA cases by geneticists. MCA cases are defined as cases with two or more major anomalies of different organ systems, excluding sequences, chromosomal and monogenic syndromes. RESULTS: The combination of an epidemiological and a clinical approach to the classification of cases has improved the quality and accuracy of the MCA data. The total prevalence of MCA cases was 15.8 per 10,000 births. Fetal deaths and terminations of pregnancy were significantly more frequent in MCA cases than in isolated cases (p < 0.001), and MCA cases were more frequently prenatally diagnosed (p < 0.001). Live born infants with MCA were more often born preterm (p < 0.01) and with a birth weight < 2500 grams (p < 0.01). Respiratory and ear, face, and neck anomalies were the most likely to occur with other anomalies (34% and 32%), and congenital heart defects and limb anomalies were the least likely to occur with other anomalies (13%) (p < 0.01). However, due to their high prevalence, congenital heart defects were present in half of all MCA cases. Among males with MCA, the frequency of genital anomalies was significantly greater than among females with MCA (p < 0.001). CONCLUSION: Although rare, MCA cases are an important public health issue because of their severity. The EUROCAT database of MCA cases will allow future investigation of the epidemiology of these conditions and related clinical and diagnostic problems.
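The algorithmic part of the classification rule (two or more major anomalies in different organ systems, excluding sequences, chromosomal and monogenic syndromes) can be sketched as a simple filter. The record layout and organ-system names below are illustrative assumptions, not the EUROCAT schema, and the manual geneticist review that follows the algorithm is not modeled.

```python
def is_potential_mca(anomalies, excluded=("sequence", "chromosomal", "monogenic")):
    """Flag a case as a potential multiple congenital anomaly (MCA) case.

    `anomalies` is a list of (organ_system, anomaly_type) tuples for one case.
    A case qualifies when, after dropping excluded anomaly types, anomalies
    remain in two or more distinct organ systems.  In EUROCAT this algorithmic
    pass was followed by manual review by geneticists; only the first,
    algorithmic step is sketched here.
    """
    systems = {organ for organ, kind in anomalies if kind not in excluded}
    return len(systems) >= 2

# Hypothetical case records:
case_1 = [("heart", "major"), ("limb", "major")]
case_2 = [("heart", "major"), ("heart", "major")]        # one organ system only
case_3 = [("heart", "major"), ("limb", "chromosomal")]   # second anomaly excluded
print(is_potential_mca(case_1), is_potential_mca(case_2), is_potential_mca(case_3))
```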
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. 
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
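The weighted scoring grid described above (a score per criterion, each multiplied by a weighting factor reflecting its relative importance) can be illustrated as follows; the criterion weights and raw scores are hypothetical, chosen only to show the mechanics, not values from the survey.

```python
def weighted_score(scores, weights):
    """Aggregate per-criterion scores into one weighted total.

    `scores` maps criterion -> raw score; `weights` maps criterion -> weighting
    factor.  The result is normalised by the total weight so that grids with
    different weight scales remain comparable across programs.
    """
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in weights) / total_weight

# Hypothetical grid for one TDM program (criteria follow the abstract's list):
weights = {"pharmacokinetic relevance": 3, "user friendliness": 2,
           "computing aspects": 1, "interfacing": 1, "storage": 1}
scores = {"pharmacokinetic relevance": 9, "user friendliness": 6,
          "computing aspects": 7, "interfacing": 5, "storage": 8}
print(round(weighted_score(scores, weights), 2))
```

Ranking the 12 tools would then amount to sorting them by this aggregate score, with the clinical vignettes serving as a separate robustness check.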
Abstract:
Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose node counts can vary from tens to hundreds, are characterized by neurobiologically meaningful graph theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with regard to a number of graph metrics such as clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on the network size. Larger networks had higher efficiency, higher assortativity and lower modularity than smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
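The pipeline from correlation matrix to binary connectome to graph metric can be sketched in a few lines. The thresholding rule and the toy correlation values below are illustrative assumptions; the study's actual state-space correlation estimates and metric implementations are not reproduced here.

```python
def binarize(corr, threshold):
    """Turn a cross-correlation matrix into a binary adjacency matrix.

    An undirected, self-loop-free edge is kept wherever the absolute
    correlation between two regions exceeds the threshold.
    """
    n = len(corr)
    return [[1 if i != j and abs(corr[i][j]) > threshold else 0
             for j in range(n)] for i in range(n)]

def clustering_coefficient(adj):
    """Average clustering coefficient of a binary undirected graph:
    for each node with at least two neighbours, the fraction of its
    neighbour pairs that are themselves connected, averaged over such nodes.
    """
    n = len(adj)
    coeffs = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(adj[u][v] for a, u in enumerate(nbrs) for v in nbrs[a + 1:])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs) if coeffs else 0.0

# Toy 4-region cross-correlation matrix (values are illustrative):
corr = [[1.0, 0.8, 0.7, 0.1],
        [0.8, 1.0, 0.6, 0.2],
        [0.7, 0.6, 1.0, 0.1],
        [0.1, 0.2, 0.1, 1.0]]
adj = binarize(corr, threshold=0.5)
print(clustering_coefficient(adj))
```

Running the same metric on networks of different sizes, at matched density, is exactly the comparison the study performs.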
Abstract:
Carotenoid-based yellowish to red plumage colors are widespread visual signals used in sexual and social communication. To understand their ultimate signaling functions, it is important to identify the proximate mechanism promoting variation in coloration. Carotenoid-based colors combine structural and pigmentary components, but the importance of the contribution of structural components to variation in pigment-based colors (i.e., carotenoid-based colors) has been undervalued. In a field experiment with great tits (Parus major), we combined a brood size manipulation with a simultaneous carotenoid supplementation in order to disentangle the effects of carotenoid availability and early growth condition on different components of the yellow breast feathers. By defining independent measures of feather carotenoid content (absolute carotenoid chroma) and background structure (background reflectance), we demonstrate that environmental factors experienced during the nestling period, namely, early growth conditions and carotenoid availability, contribute independently to variation in yellow plumage coloration. While early growth conditions affected the background reflectance of the plumage, the availability of carotenoids affected the absolute carotenoid chroma, the peak of maximum ultraviolet reflectance, and the overall shape, that is, chromatic information of the reflectance curves. These findings demonstrate that environment-induced variation in background structure contributes significantly to intraspecific variation in yellow carotenoid-based plumage coloration.
Abstract:
To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head scatter fluence (>40%), which is not adequately modeled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and the hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared with conventional techniques and may offer the optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).
Abstract:
Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models predicting metabolic stability and metabolites. Such models are well known for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES2. The study involved first a homology modeling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (≅73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock4.0 engine which can perform efficient and easy virtual screening analyses of large molecular databases exploiting multi-core architectures. Useful statistical models (e.g., r² = 0.91 for substrates in their unprotonated state) were calculated by correlating experimental pKm values with the distance between the carbon atom of the substrate's ester group and the hydroxy function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explained the preference of the enzyme for neutral substrates and, more generally, suggested that ligands which interact too strongly by ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of some CES2 complexes.
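The reported r² = 0.91 comes from correlating experimental pKm values with a docking-derived distance, i.e. a simple least-squares regression. A minimal sketch of that calculation follows; the distance and pKm values are invented for illustration and are not data from the study.

```python
def r_squared(xs, ys):
    """Coefficient of determination of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                      # slope of the fitted line
    b = my - a * mx                    # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Hypothetical ester-carbon-to-Ser228 distances (Å) and pKm values:
distance = [3.1, 3.5, 4.0, 4.6, 5.2]
pkm      = [4.9, 4.6, 4.1, 3.8, 3.2]
print(round(r_squared(distance, pkm), 3))
```

In the study this univariate fit was extended with additional terms for hydrophobic and electrostatic interactions, which a multiple regression would capture in the same way.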
Abstract:
AIM: The aim of this study was to evaluate a new pedagogical approach to teaching fluid, electrolyte and acid-base pathophysiology to undergraduate students. METHODS: This approach comprises traditional lectures, the study of clinical cases on the web and a final interactive discussion of these cases in the classroom. On the web, the students are asked to select the laboratory tests that seem most appropriate for understanding the pathophysiological condition underlying the clinical case. The percentage of students having chosen a given test is made available to the teacher, who uses it in an interactive session to stimulate discussion with the whole class. The same teacher used the same case studies for 2 consecutive years in the third year of the curriculum. RESULTS: The majority of students answered the questions on the web as requested and positively evaluated their experience with this form of teaching and learning. CONCLUSIONS: Complementing traditional lectures with online case-based studies and interactive group discussions therefore represents a simple means to promote the learning and understanding of complex pathophysiological mechanisms. This simple problem-based approach to teaching and learning may be implemented to cover all fields of medicine.
Abstract:
Positron emission tomography is a functional imaging technique that allows detection of the regional metabolic rate, and it is often coupled with another morphological imaging technique such as computed tomography. The rationale for its use is based on the clearly demonstrated fact that functional changes in tumor processes occur before morphological changes. Its introduction into clinical practice added a new dimension to conventional imaging techniques. This review presents the current and proposed indications for the use of positron emission tomography/computed tomography for the prostate, bladder and testes, and the potential role of this examination in radiotherapy planning.
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
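The sensitivity and specificity figures quoted above follow directly from confusion-matrix counts. A minimal sketch, with made-up detection counts chosen only to reproduce rates of the same order as those reported:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).

    TP/FN count epileptiform events detected/missed; TN/FP count
    non-events (e.g. eye blinks) correctly rejected/falsely flagged.
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one cross-validation fold:
sens, spec = sensitivity_specificity(tp=13, fn=7, tn=86, fp=14)
print(round(sens, 2), round(spec, 2))
```

The mean +/- SD values in the abstract would then come from averaging these per-fold rates across the cross-validation runs.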