970 results for Execution traces


Relevance: 10.00%

Publisher:

Abstract:

This research examines the situation of people who suffer from some form of mental disorder while serving a penal execution measure, whether imprisonment or a security measure. The study sample comprises the people who, during 2005, spent at least one day in the psychiatric units of the prison system or who served a security measure (within the territory of the Barcelona comarques). In the second part of the study, more than one hundred professionals and experts give their views on the main present and future needs of mental health intervention in the field of penal execution.

Relevance: 10.00%

Publisher:

Abstract:

Abstract: In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms make it possible, inter alia, to compare a fingermark to a fingerprint database and therefore to establish a link between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. At the same time, however, these data and these systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be used in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger and the between-variability of the mark with respect to a database, can then be based on representative data. These data allow an evaluation that may be more detailed than that obtained by the application of rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true, and they should support it more and more strongly as information is added in the form of additional minutiae. For the modelling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors that influence these distributions were also analysed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of the minutiae. The results show that the likelihood ratios derived from the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger, when they in fact came from different fingers, is 5.2% for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1,000 and 10,000 for 6, 7 and 8 minutiae when the two impressions come from the same finger.
These likelihood ratios can therefore be an important aid to decision-making. Both positive effects linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study. Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found and showed satisfactory results.
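
As an illustration of the score-based likelihood-ratio approach described above, the sketch below estimates a within-finger and a between-finger score distribution with simple kernel density estimates and takes their ratio at the observed mark-to-print score. The toy data, the Gaussian KDE model and the function name are assumptions for illustration only; they are not the estimators or the data used in the thesis.

```python
# Illustrative score-based likelihood ratio (not the thesis's actual estimators).
# within_scores: AFIS scores from comparisons known to come from the same finger.
# between_scores: AFIS scores of the mark compared against a database of other fingers.
import numpy as np
from scipy.stats import gaussian_kde

def likelihood_ratio(score, within_scores, between_scores):
    """LR = p(score | same finger) / p(score | different fingers)."""
    within_density = gaussian_kde(within_scores)
    between_density = gaussian_kde(between_scores)
    return float(within_density(score)[0] / between_density(score)[0])

# Toy scores standing in for real AFIS output.
rng = np.random.default_rng(0)
within = rng.normal(loc=800, scale=60, size=50)      # same-finger comparison scores
between = rng.normal(loc=300, scale=80, size=5000)   # database (different-finger) scores
print(likelihood_ratio(700.0, within, between))      # LR >> 1 supports "same finger"
```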

Relevance: 10.00%

Publisher:

Abstract:

Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive and inexpensive access to high-end computational resources. Grid enables access to the resources but does not guarantee any quality of service. Moreover, Grid does not provide performance isolation: the job of one user can influence the performance of another user's job. Another problem with Grid is that its users belong to the scientific community and their jobs require specific, customized software environments. Providing the perfect environment to the user is very difficult in Grid because of its dispersed and heterogeneous nature. Cloud computing provides full customization and control, but there is no procedure for submitting user jobs as simple as the one in Grid. Grid computing can provide customized resources and performance to the user by means of virtualization. A virtual machine can join the Grid as an execution node, and a virtual machine can also be submitted as a job with user jobs inside. Where the first method gives quality of service and performance isolation, the second additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, which provides performance isolation together with customization and administration; the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request, and two scenarios are described to achieve this goal. In the first scenario, users submit their customized virtual machine as a job; the virtual machine joins the Grid pool when it is powered on. In the second scenario, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server are used to deploy and test the scenarios. Condor supports virtual machine jobs: scenario 1 is deployed using the Condor VM universe, while the second scenario uses the VMware VIX API to script the powering on and off of the remote virtual machines. The experimental results show that, because scenario 2 does not need to transfer the virtual machine image, the virtual machine becomes live on the pool faster. In scenario 1, the virtual machine runs as a Condor job, so it is easy to administer. The only pitfall in scenario 1 is the network traffic.
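
A minimal sketch of how scenario 2 (powering preconfigured virtual machines on and off on user request) could be scripted, assuming the VMware vmrun command-line tool is available on the execution host; the thesis uses the VMware VIX API directly, and the host type, .vmx paths and the surrounding pool-membership step shown here are placeholders.

```python
# Hypothetical helper for "scenario 2": power preconfigured VMs on/off on user request.
# Assumes the VMware 'vmrun' CLI is installed; host type and paths are assumptions.
import subprocess

VMRUN = "vmrun"         # path to the vmrun binary (assumption)
HOST_TYPE = "server"    # -T server targets a VMware Server host (assumption)

def power_on(vmx_path: str) -> None:
    """Start a preconfigured virtual machine so it can join the Condor pool."""
    subprocess.run([VMRUN, "-T", HOST_TYPE, "start", vmx_path], check=True)

def power_off(vmx_path: str) -> None:
    """Stop the virtual machine once the user's jobs have finished."""
    subprocess.run([VMRUN, "-T", HOST_TYPE, "stop", vmx_path], check=True)

if __name__ == "__main__":
    vm = "/vmstore/user42/custom-env.vmx"   # placeholder .vmx path
    power_on(vm)
    # ... the VM now acts as a Grid execution node and runs the user's jobs ...
    power_off(vm)
```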

Relevance: 10.00%

Publisher:

Abstract:

This project carried out a study of the technology offered by graphics cards (GPUs) in the field of programming applications that were traditionally executed on the CPU, otherwise known as GPGPU. An in-depth analysis of the current technological landscape was performed, explaining part of the graphics-card hardware and what GPGPU is about. The different options available for running the performance tests used to evaluate the software were also studied, along with which software is designed to be executed with this technology and the procedure to follow in order to use it. Several tests were run to evaluate the performance of software designed for, or compatible with, execution on the GPU, producing comparative tables of computation times. Once the different software tests were completed, it can be concluded that not every application processed on the GPU yields a benefit. To see improvements, the application must meet a series of requirements: it must have a large number of operations that can be performed in parallel, there must be no conditions constraining the execution of those operations, and it must be an arithmetically compute-intensive process.
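
As an illustration of the kind of timing comparison the project describes, the sketch below runs the same arithmetic-intensive operation on the CPU (NumPy) and on the GPU (CuPy), assuming a CUDA-capable card and the cupy package are available; the matrix-multiply workload and sizes are illustrative, not the benchmarks used in the project.

```python
# Toy CPU-vs-GPU timing comparison; assumes numpy and cupy are installed
# and a CUDA-capable GPU is present. Workload and sizes are illustrative.
import time
import numpy as np
import cupy as cp

def time_cpu(n: int) -> float:
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    t0 = time.perf_counter()
    np.matmul(a, b)
    return time.perf_counter() - t0

def time_gpu(n: int) -> float:
    a = cp.random.rand(n, n, dtype=cp.float32)
    b = cp.random.rand(n, n, dtype=cp.float32)
    cp.cuda.Stream.null.synchronize()          # make sure setup has finished
    t0 = time.perf_counter()
    cp.matmul(a, b)
    cp.cuda.Stream.null.synchronize()          # wait for the kernel to complete
    return time.perf_counter() - t0

for n in (512, 1024, 2048):
    print(f"n={n}: CPU {time_cpu(n):.4f}s  GPU {time_gpu(n):.4f}s")
```

The comparison deliberately excludes the host-to-device transfer time; in practice, moving data to the GPU can dominate, which is one reason why, as the study concludes, not every application benefits.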

Relevance: 10.00%

Publisher:

Abstract:

Fault tolerance is a research area that has gained importance with the increase in computing capacity of today's supercomputers, because greater processing power means more components and, with them, a greater number of failures. Most current fault-tolerance strategies are centralised and do not scale when a large number of processes is used, since all of them must synchronise to carry out the fault-tolerance tasks. In addition, maintaining the performance of parallel programs is crucial, both in the presence and in the absence of failures. With this in mind, this work focuses on a decentralised fault-tolerant architecture (RADIC, Redundant Array of Distributed and Independent Controllers) that seeks to maintain the initial performance and to guarantee the lowest possible overhead when reconfiguring the system after a failure. The architecture has been implemented in the message-passing library Open MPI, currently one of the most widely used in the scientific community for running parallel programs on a message-passing platform. Initial tests show that the system introduces minimal overhead to carry out the fault-tolerance tasks. MPI is a fail-stop standard by default, and in the implementations that add some level of tolerance, the most common strategies are coordinated. In RADIC, when a failure occurs the process is recovered on another node, rolling back to a previous state that was stored beforehand using uncoordinated checkpoints and re-reading messages from the event log. During recovery, communications with the affected process must be delayed and redirected to the new location of the process. Restoring processes on a node that already hosts other processes overloads the execution and degrades performance, so this work proposes the use of spare nodes to recover failed processes, thereby avoiding overload on nodes that already have work. The thesis presents a design for managing recovery on spare nodes automatically and in a decentralised way in an Open MPI environment, together with an analysis of the impact of this design on performance. Initial results show a significant degradation when several failures occur during execution and no spares are used, whereas using spares restores the initial configuration and maintains performance.
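
A purely illustrative sketch of the spare-node recovery policy described above: a failed process is restarted on a free spare node when one exists, and only doubles up on an already-loaded node otherwise. The class and function names are hypothetical; the actual RADIC mechanisms inside Open MPI (protectors, observers, uncoordinated checkpoints and message logs) are far more involved.

```python
# Toy model of spare-node recovery (hypothetical names, not the RADIC implementation).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    spare: bool = False
    processes: list = field(default_factory=list)

def recover(failed_procs, nodes):
    """Restart each failed process on a free spare node if one exists,
    otherwise on the least-loaded working node (which degrades performance)."""
    for proc in failed_procs:
        spares = [n for n in nodes if n.spare and not n.processes]
        if spares:
            target = spares[0]
            target.spare = False          # the spare becomes a regular worker
        else:
            target = min((n for n in nodes if not n.spare),
                         key=lambda n: len(n.processes))
        target.processes.append(proc)
        print(f"restarted {proc} on {target.name}")

nodes = [Node("n0", processes=["rank0"]), Node("n1", processes=["rank1"]),
         Node("spare0", spare=True)]
recover(["rank1"], nodes)   # rank1's node failed; it is restarted on spare0
```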

Relevance: 10.00%

Publisher:

Abstract:

Dynamic generation of web interfaces based on descriptive XML files for controlling the complex parameterization and execution of command-line programs. The need arises from the application mlcoalsim, used by researchers at the UAB, whose parameterization requires manually editing a text file with a complicated and cumbersome syntax. The generated web interfaces are intended to help users parameterize and execute applications such as mlcoalsim correctly.
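
A minimal sketch of the idea: read an XML descriptor of a program's parameters, render simple HTML form fields from it, and build the corresponding command line. The <param> schema, the attribute names and the flag style used here are invented for illustration; the real descriptors and the way mlcoalsim is actually parameterized may differ.

```python
# Sketch: turn a hypothetical XML parameter descriptor into HTML form fields
# and a command line. The schema shown here is invented for illustration.
import xml.etree.ElementTree as ET

DESCRIPTOR = """
<program name="mlcoalsim">
  <param name="nloci"   type="int"    default="1"       label="Number of loci"/>
  <param name="seed"    type="int"    default="1234"    label="Random seed"/>
  <param name="outfile" type="string" default="out.txt" label="Output file"/>
</program>
"""

root = ET.fromstring(DESCRIPTOR)

# Generate a simple HTML form from the descriptor.
fields = [
    f'<label>{p.get("label")}: '
    f'<input name="{p.get("name")}" value="{p.get("default")}"></label>'
    for p in root.findall("param")
]
print("\n".join(fields))

# Build the command line from the submitted values (here, the defaults).
values = {p.get("name"): p.get("default") for p in root.findall("param")}
cmd = [root.get("name")] + [f"--{k}={v}" for k, v in values.items()]
print(" ".join(cmd))
```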

Relevance: 10.00%

Publisher:

Abstract:

Abstract: The contribution of ink evidence to forensic science is described and supported by an abundant literature and by two standards from the American Society for Testing and Materials (ASTM). The vast majority of the available literature is concerned with the physical and chemical analysis of ink evidence. The relevant ASTM standards mention some principles regarding the comparison of pairs of ink samples and the evaluation of their evidential value. The review of this literature and, more specifically, of the ASTM standards in the light of recent developments in the interpretation of forensic evidence has shown some potential improvements, which would maximise the benefits of the use of ink evidence in forensic science. This thesis proposes to interpret ink evidence within the widely accepted and recommended framework of Bayes' theorem. This proposition has required the development of a new quality assurance process for the analysis and comparison of ink samples, as well as the definition of a theoretical framework for ink evidence. The proposed technology has been extensively tested using a large dataset of ink samples and state-of-the-art tools commonly used in biometry. Overall, this research successfully answers a concrete problem generally encountered in forensic science, where scientists tend to self-limit the usefulness of the information that is present in various types of evidence by trying to answer the wrong questions.
The declaration of an explicit framework, which defines and formalises their goals and expected contributions to the criminal and civil justice system, enables the determination of their needs in terms of technology and data. The development of this technology and the collection of the data are then justified economically, structured scientifically and can proceed efficiently.
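
For reference, the standard Bayesian relation underlying the evaluative framework mentioned above can be written as follows; the symbols H_1 and H_2 for the competing propositions and E for the ink-comparison findings are generic notation, not taken from the thesis.

```latex
% Posterior odds = likelihood ratio x prior odds (standard Bayesian form; amsmath assumed).
% H_1, H_2: competing propositions; E: the ink-comparison findings.
\[
  \frac{\Pr(H_1 \mid E)}{\Pr(H_2 \mid E)}
  = \underbrace{\frac{\Pr(E \mid H_1)}{\Pr(E \mid H_2)}}_{\text{likelihood ratio}}
  \times \frac{\Pr(H_1)}{\Pr(H_2)}
\]
```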

Relevance: 10.00%

Publisher:

Abstract:

The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. The earlier steps of the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous one. The methodology is not linear, however, but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending the use of images as evidence and, more generally, as clues in investigation and crime reconstruction processes.

Relevance: 10.00%

Publisher:

Abstract:

The use of non-institutionalised drugs has an important epidemiological dimension. Cocaine and cannabis use is currently increasing, while opiate use is stabilising or decreasing. In most countries there is a relationship between drug addiction and crime. Objectives: 1) to obtain sociodemographic data on a population of detainees brought before the duty courts; 2) to obtain health data on HIV, HCV and HBV infection; 3) to obtain data on the use of cocaine, hashish, heroin, benzodiazepines and synthetic drugs; 4) to establish the clinical-analytical correlation or concordance; and 5) to assess the relationship between drug addiction and offending. Subjects and methods: study of a population of 151 users of illegal drugs detained at the disposal of the duty court of the city of Barcelona. Study period: 1.5 years. Methods: administration of a questionnaire covering sociodemographic, health and drug-use data, and collection of a urine sample analysed by immunoassay on the AsXym analyser (Abbott). Results were interpreted as positive or negative according to the cut-off point established for the method. Results: the typical profile is a single man, mean age 31.4 years, with primary education and no qualified profession. The most widely used illegal drug is cocaine (77.5%), followed by opiates (62.9%), cannabis (60.3%) and benzodiazepines (61.6%; self-medicated in 40.9%). The prevalence of HIV was 16.7%, of HCV 37.9% and of HBV 19.1%. The intravenous route is used by 46% of cocaine users and 53.8% of heroin users, and 45.5% show signs compatible with intravenous drug use. Only 14.2% showed signs of physical deterioration and 23.1% a clear withdrawal syndrome. There is a 74.3% clinical-analytical correlation or concordance. The offences most closely related to drug use were offences against property. Conclusions and discussion: a high level of cocaine use and a decrease in heroin use are detected; cannabis and benzodiazepine use is high. Offences against property are associated with the use of illegal drugs. Proposals: the study is applicable in forensic medicine. Routine urine screening of detainees is a useful measure: it provides objective data when drug-addiction reports are requested from expert witnesses; detainees who are drug users may benefit from circumstances modifying criminal responsibility, in accordance with current legislation; and lawyers and the judicial authority will have objective evidence on which to base their decisions in proceedings involving users of drugs of abuse.

Relevance: 10.00%

Publisher:

Abstract:

In fear conditioning, an animal learns to associate an unconditioned stimulus (US), such as a shock, with a conditioned stimulus (CS), such as a tone, so that the presentation of the CS alone can trigger conditioned responses. Recent research on the lateral amygdala (LA) has shown that, following cued fear conditioning, only a subset of more highly excitable neurons is recruited into the memory trace, and their selective deletion after fear conditioning results in a selective erasure of the fearful memory. I hypothesize that the recruitment of highly excitable neurons depends on responsiveness to stimuli, intrinsic excitability and local connectivity. In addition, I hypothesize that neurons recruited for an initial memory also participate in subsequent memories, and that changes in neuronal excitability affect secondary fear learning. To address these hypotheses, I will show that A) a rat can learn to associate two successive short-term fearful memories; and B) neuronal populations in the LA are competitively recruited into the memory traces depending on individual neuronal advantages, as well as advantages granted by the local network. By performing two successive cued fear conditioning experiments, I found that rats were able to learn and extinguish the two successive short-term memories when tested 1 hour after learning each memory. These rats were equipped with a system of stable extracellular recordings that I developed, which allowed neuronal activity to be monitored during fear learning. 233 individual putative pyramidal neurons could modulate their firing rate in response to the conditioned tone (conditioned neurons) and/or non-conditioned tones (generalizing neurons). Of these recorded putative pyramidal neurons, 86 (37%) were conditioned to one or both tones. More precisely, one population of neurons encoded a shared memory while another group of neurons likely encoded the memories' new features. Notably, in spite of successful behavioral extinction, the firing rate of the conditioned neurons in response to the conditioned tone remained unchanged throughout memory testing. Furthermore, by analyzing the pre-conditioning characteristics of the conditioned neurons, I determined that it was possible to predict neuronal recruitment based on three factors: 1) initial sensitivity to auditory inputs, with tone-sensitive neurons being more easily recruited than tone-insensitive neurons; 2) baseline excitability levels, with more highly excitable neurons being more likely to become conditioned; and 3) the number of afferent connections received from local neurons, with neurons destined to become conditioned receiving more connections than non-conditioned neurons.
Finally, consistent with the hypothesis that activation of these LA networks is in itself sufficient to induce a fearful memory, it was found that the US in fear conditioning could be satisfactorily replaced by bilateral injections of bicuculline, an antagonist of γ-aminobutyric acid (GABA) receptors.

Relevance: 10.00%

Publisher:

Abstract:

Good afternoon ladies and gentlemen. I am very pleased that you were all able to accept my invitation to join me here today on this landmark occasion for nursing education. It is fitting that all of the key stakeholders from the health and education sectors should be so well represented at the launch of an historic new development. Rapid and unpredictable change throughout society has been the hallmark of the twenty-first century, and healthcare is no exception. Regardless of what change occurs, no one doubts that nursing is intrinsic to the health of this nation. However, significant changes in nurse education are now needed if the profession is to deliver on its social mandate to promote people´s health by providing excellent and sensitive care. As science, technology and the demands of the public for sophisticated and responsive health care become increasingly complex, it is essential that the foundation of nursing education is redesigned. Pre-registration nursing education has already undergone radical change over the past eight years, during which time it has moved from an apprenticeship model of education and training to a diploma based programme firmly rooted in higher education. The Secretary General of my Department, Michael Kelly, played a leading role in bringing about this transformation, which has greatly enhanced the way students are prepared for entry to the nursing profession. The benefits of the revised model of education are clearly evident from the quality of the nurses graduating from the diploma programme. The Commission on Nursing examined the whole area of nursing education, and set out a very convincing case for educating nursing students to degree level. It argued that nurses of the future would be required to possess increased flexibility and the ability to work autonomously. A degree programme would provide nurses with a theoretical underpinning that would enable them to develop their clinical skills to a greater extent and to respond to future challenges in health care, for the benefit of patients and clients of the health services. The Commission has provided a solid framework for the professional development of nurses and midwives, including a process that is already underway for the creation of clinical nurse specialist and advanced nurse practitioner posts. This process will facilitate the transfer of skills across divisions of nursing. In this scenario, it is clearly desirable that the future benchmark qualification for registration as a nurse should be a degree in nursing studies. A Nursing Education Forum was established in early 1999 to prepare a strategic framework for the implementation of a nursing degree programme. When launching the Forum´s report last January, I indicated that the Government had agreed in principle to the introduction of the proposed degree programme next year. At the time two substantial outstanding issues had yet to be resolved, namely the basis on which nurse teachers would transfer from the health sector to the education sector and the amount of capital and revenue funding required to operate the degree programme. My Department has brokered agreements between the Nursing Alliance and the Higher Education Institutions for the assimilation of nurse teachers as lecturers into their affiliated institutions. The terms of these agreements have been accepted by all four nursing unions following a ballot of their nurse teacher members. 
I would like to pay particular tribute to all nurse teachers who have contributed to shaping the position, relevance and visibility of nursing through leadership, which embodies scholarship and excellence in the profession of nursing itself. In response to a recommendation of the Nursing Education Forum, I established an Inter-Departmental Steering Committee, chaired by Bernard Carey of my Department, to consider all the funding and policy issues. This Steering Committee includes representatives of the Department of Finance and the Department of Education and Science as well as the Higher Education Authority. The Steering Committee has been engaged in intensive negotiations with representatives of the Conference of Heads of Irish Universities and the Institutes of Technology in relation to their capital and revenue funding requirements. These negotiations were successfully concluded within the past few weeks. The satisfactory resolution of the industrial relations and funding issues cleared the way for me to go to the Government with concrete proposals for the implementation of degree level education for nursing students. I am delighted to announce here today that the Government has approved all of my proposals, and that a four-year undergraduate pre-registration nursing degree programme will be implemented on a nation-wide basis at the start of the next academic year, 2002/2003. The Government has approved the provision of capital funding totalling £176 million pounds for a major building and equipment programme to facilitate the full integration of nursing students into the higher education sector. This programme is due to be completed by September 2004, and will ensure that nursing students are accommodated in purpose built schools of nursing studies with state of the art clinical skills and human science laboratories at thirteen higher education sites throughout the country. The Government has also agreed to make available the substantial additional revenue funding required to support the nursing degree programme. By 2006, the full year cost of operating the programme will rise to some £43 million pounds. The scale of this investment in pre-registration nursing education is enormous by any yardstick. It demonstrates the firm commitment of myself and my Government colleagues to the full implementation of the recommendations of the Commission on Nursing, of which the introduction of pre-registration degree level education is arguably the most important. This historic decision, and it is truly historic, will finally put the education of nurses on a par with the education of other health care professionals. The nursing profession has long been striving for parity, and my own involvement in the achievement of it is a matter of deep personal satisfaction to me. I am also pleased to announce that the Government has approved my plans for increasing the number of nursing training places to coincide with the implementation of the degree programme next year. Ninety-three additional places in mental handicap and psychiatric nursing will be created at Athlone, Letterkenny, Tralee and Waterford Institutes of Technology. This will yield 392 extra places over the four years of the degree programme. A total of 1,640 places annually on the new degree programme will thus be available. 
This is an all-time record, and maintaining the annual student intake at this level for the foreseeable future is a key element of my overall strategy for ensuring that we produce sufficient “home-grown” nurses for our health services. I am aware that the Nursing Alliance were anxious that some funding would be provided for the further academic career development of nurse teachers who transfer to one of the six Universities that will be involved in the delivery of the degree programme. I am happy to confirm that up to £300,000 in total per year will be available for this purpose over the first four years of the degree programme. In line with a recommendation of the Commission on Nursing, my Department will have responsibility for the administration of the nursing degree budget until the programme has been bedded down in the higher education sector. A primary concern will be to ensure that the substantial capital and revenue funding involved is ring-fenced for nursing studies. It is intended that responsibility for the budget will be transferred to the Department of Education and Science after the first cohort of nursing degree students have graduated in 2006. In the context of today´s launch, it is relevant to refer to a special initiative that I introduced last year to assist registered nurses wishing to undertake part-time nursing degree courses. Under this initiative, nurses are entitled to have their course fees paid by their employers in return for a commitment to continue working in the public health service for a period following completion of the course. This initiative has proved extremely popular with large numbers of nurses availing of it. I want to confirm here today that the free fees initiative will continue in operation until 2005, at a total cost of at least £15 million pounds. I am giving this commitment in order to assure this year´s intake of nursing students to the final diploma programmes that fee support for a part-time nursing degree course will be available to them when they graduate in three years time. The focus of today´s celebration is rightly on the landmark Government decision to implement the nursing degree programme next year. As Minister for Health and Children, and as a former Minister for Education, I also have a particular interest in the educational opportunities available to other health service workers to upgrade their skills. I am pleased to announce that the Government has approved my proposals for the introduction of a sponsorship scheme for suitable, experienced health care assistants who wish to become nurses. This new scheme will commence next year and will be administered by the health boards. Successful applicants will be allowed to retain their existing salaries throughout the four years of the degree programme in return for a commitment to work as nurses for their health service employer for a period of five years following registration. Up to forty sponsorships will be available annually. The new scheme will enable suitable applicants to undertake nursing education and training without suffering financial hardship. The greatest advantage of the scheme will be the retention by the public health service of staff who are supported under it, since they will have had practical experience of working in the service and their own personal commitment to upgrading their skills will be informed by that experience. 
I am confident that the sponsorship scheme will be warmly welcomed by health service unions representing care assistants as providing an exciting new career development path for their members. Education and health are now the two pillars upon which the profession of nursing rests. We must continue to build bridges, and even tunnels where needed, to strengthen this partnership. We must all understand that partnerships don't just happen; they are designed and must be worked at. The changes outlined here today are powerful incentives for those in healthcare agencies, academic institutions and regulatory bodies to design revolutionary programmes capable of shaping a critical mass of excellent practitioners. You have an opportunity, greater perhaps than has been granted to any other generation in history, to make certain those changes are for the good. Ultimately, these are changes that will make the country a healthier and more equitable place to live. The challenge relates to building a seamless preparatory programme which equally respects both education and practice as an indivisible duo, whilst ensuring that high tech does not replace the human touch. This is a special day in the history of the development of the Irish nursing profession, and I would like to thank everybody for their contribution. I want to express my particular appreciation of two people who by this stage are well known to all of you – Bernard Carey of my Department and Siobhán O'Halloran of the National Implementation Committee. Bernard and Siobhán have devoted considerable time and energy to the project on my behalf over the past fourteen months or so. That we are here today celebrating the launch of degree level education is due in no small part to their successful execution of the mandate that I gave them. We live in a rapidly changing world, one in which nursing can no longer rely on systems of the past to guide it through the new millennium. In terms of contemporary healthcare, nursing is no longer just a reciprocal kindness but rather a highly complex set of professional behaviours, which require serious educational investment. Pre-registration nurse education will always need development and redesign to ensure our health care system meets the demands of modern society. Nothing is finite. Today more than ever the health system is dependent on the resourcefulness of nursing. I have no doubt that the new educational landscape painted here will ensure that nurses of the future will be increasingly innovative, independent and in demand. The unmistakable message from my Department is that nursing really matters. Thank you.

Relevance: 10.00%

Publisher:

Abstract:

Performance prediction and application behavior modeling have been the subject of extensive research that aims to estimate application performance with an acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of Parallel Application Signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One of the problems is that the performance of an application depends on the program's workload: each type of workload affects how an application performs on a given system differently, and therefore affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We create a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, in order to predict an application's performance on a target system for any type of workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications and well-known real scientific applications.
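
A minimal sketch of the regression idea, assuming we already have measured phase execution times and weights at a few workload sizes; the linear model, the toy data and the function names are illustrative stand-ins for the per-phase models described above, not the methodology's actual regression.

```python
# Illustrative regression-based prediction of total execution time from a signature.
# phase_times[p] and phase_weights[p] are measurements at the workload sizes in `sizes`.
import numpy as np

sizes = np.array([100, 200, 400, 800])                 # measured workload sizes (toy data)
phase_times = {"A": np.array([0.8, 1.6, 3.1, 6.3]),    # seconds per phase execution
               "B": np.array([0.2, 0.4, 0.9, 1.7])}
phase_weights = {"A": np.array([10, 20, 40, 80]),      # repetitions of each phase
                 "B": np.array([5, 10, 20, 40])}

def fit_linear(x, y):
    """Least-squares fit y ~ a*x + b; returns a prediction function."""
    a, b = np.polyfit(x, y, deg=1)
    return lambda w: a * w + b

time_models = {p: fit_linear(sizes, t) for p, t in phase_times.items()}
weight_models = {p: fit_linear(sizes, w) for p, w in phase_weights.items()}

def predict_total_time(workload):
    """Total time = sum over phases of predicted phase time x predicted weight."""
    return sum(time_models[p](workload) * weight_models[p](workload) for p in time_models)

print(f"predicted time for workload 600: {predict_total_time(600):.1f} s")
```

The linear fit is just the simplest choice; whatever regression model best fits the measured phase behaviour could be substituted for fit_linear without changing the overall scheme.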

Relevance: 10.00%

Publisher:

Abstract:

Energy consumption is an increasingly important aspect of microprocessor design. This work experiments with one power-management technique, dynamic voltage and frequency scaling (DVFS), to determine how effective it is when running programs with different workloads, whether compute-intensive or memory-intensive. The experiments were also extended to several execution cores, making it possible to check to what extent the characteristics of execution on a multicore architecture affect the performance of this technique.
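
A minimal sketch of how such a DVFS experiment can be driven on Linux through the cpufreq sysfs interface, assuming root privileges, the userspace governor and that the listed sysfs files exist on the test machine; the frequency value is an example and the available paths and governors vary across kernels and drivers.

```python
# Read and set a core's frequency via the Linux cpufreq sysfs interface.
# Assumes root privileges and the 'userspace' governor; paths/values may vary per system.
CPUFREQ = "/sys/devices/system/cpu/cpu{cpu}/cpufreq/{name}"

def read(cpu, name):
    with open(CPUFREQ.format(cpu=cpu, name=name)) as f:
        return f.read().strip()

def write(cpu, name, value):
    with open(CPUFREQ.format(cpu=cpu, name=name), "w") as f:
        f.write(str(value))

cpu = 0
print("available:", read(cpu, "scaling_available_frequencies"))   # values in kHz
write(cpu, "scaling_governor", "userspace")     # allow explicit frequency selection
write(cpu, "scaling_setspeed", 1600000)         # pin the core to 1.6 GHz (example value)
# ... run the compute- or memory-intensive benchmark and measure time/energy here ...
```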

Relevance: 10.00%

Publisher:

Abstract:

A growing number of applications in scientific fields such as bioinformatics or the geosciences are written under the MapReduce model, using open-source tools such as Apache Hadoop. This project arises from the need to integrate Hadoop into HPC environments so that applications developed under the MapReduce paradigm can be executed there. Two frameworks designed to ease this integration for developers are analysed: HoD and myHadoop. The project examines both the kinds of environments these frameworks offer for running MapReduce applications and the performance of the Hadoop clusters created with HoD or myHadoop compared with a physical Hadoop cluster.
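
To make the MapReduce model concrete, below is a minimal word-count mapper and reducer in the Hadoop Streaming style (reading from stdin, writing tab-separated key/value pairs to stdout). This is a generic example, not one of the applications benchmarked in the project, and how it is launched depends on the cluster set up by HoD or myHadoop.

```python
# Minimal Hadoop Streaming-style word count: mapper and reducer in one file.
# Hadoop runs the mapper over input splits and the reducer over the sorted mapper output.
import sys
from itertools import groupby

def mapper(lines):
    for line in lines:
        for word in line.split():
            print(f"{word}\t1")

def reducer(lines):
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{word}\t{sum(int(count) for _, count in group)}")

if __name__ == "__main__":
    # e.g. "python wordcount.py map < part" or "python wordcount.py reduce < sorted_pairs"
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```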

Relevance: 10.00%

Publisher:

Abstract:

This final-year project, as its title describes, consists of designing and implementing a project-control system. Like any project, it will meet the requirement of having a plan that allows the execution deadlines, the established milestones and the identified deliverables to be tracked. Since it involves the development of a software system, it will cover the stages of requirements specification, analysis, design, coding, unit testing and functional testing, and the relevant reports will be produced to serve as documentation and reference in later stages. From a technological point of view, it will deepen knowledge of how Oracle's PL/SQL works (calls to procedures and, especially, their handling).
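
As a small illustration of the "calls to procedures" aspect mentioned above, here is a sketch that invokes a hypothetical PL/SQL stored procedure from Python with the cx_Oracle driver; the connection details and the procedure's name and signature are placeholders, not part of the project.

```python
# Call a hypothetical PL/SQL procedure that records progress on a project milestone.
# Connection details and the procedure's name/signature are placeholders.
import cx_Oracle

conn = cx_Oracle.connect(user="pm", password="secret", dsn="localhost/XEPDB1")
cur = conn.cursor()

status = cur.var(cx_Oracle.STRING)                      # OUT parameter of the procedure
cur.callproc("update_milestone", [42, "design review", status])
print("procedure returned:", status.getvalue())

conn.commit()
cur.close()
conn.close()
```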