Abstract:
Summary: Forensic science - both as a source of and as a remedy for errors potentially leading to judicial error - has been studied empirically in this research. The tools used were a comprehensive literature review, experimental tests on the influence of observational biases in fingermark comparison, and semi-structured interviews with heads of forensic science laboratories/units in Switzerland and abroad. The literature review covered, among other areas, the quality of forensic science work in general, the complex interaction between science and law, and specific propositions as to error sources not directly related to that interaction. A list of potential error sources, all the way from the crime scene to the writing of the report, was also established. For the empirical tests, the ACE-V (Analysis, Comparison, Evaluation, and Verification) process of fingermark comparison was selected as an area of special interest for the study of observational biases, owing to its heavy reliance on visual observation and to recent cases of misidentification. Results of the tests performed with forensic science students tend to show that decision-making stages are the most vulnerable to stimuli inducing observational biases. In the semi-structured interviews, eleven senior forensic scientists answered questions on several subjects, including potential and existing error sources in their work, the limitations of what can be done with forensic science, and the possibilities and tools to minimise errors. Training and education to raise the quality of forensic science were discussed, together with possible solutions to minimise the risk of errors in forensic science. The length of time that samples of physical evidence are kept was also determined. Results tend to show considerable agreement on most subjects among the international participants.
Their opinions on possible explanations for the occurrence of such problems, and on the relative weight of such errors in the three stages of crime scene, laboratory, and report writing, nevertheless disagree with opinions widely represented in the existing literature. Through the present research it was therefore possible to obtain a better view of the interaction between forensic science and judicial error, and to propose practical recommendations to minimise its occurrence.
Abstract:
In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are: (a) develop a standard methodology for analysing ink samples by high-performance thin layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations, and by different examiners; (b) automatically and objectively compare ink samples; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to automatically and objectively compare ink samples. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometric studies. The results show that different algorithms are best suited to different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
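The comparison stage described above can be pictured with a minimal sketch: a similarity score between two HPTLC intensity profiles, evaluated biometrics-style with false-match and false-non-match rates at a decision threshold. The data representation and function names here are hypothetical illustrations, not the report's actual algorithms:

```python
import numpy as np

def profile_similarity(a, b):
    """Pearson correlation between two HPTLC densitometric intensity
    profiles, represented here (hypothetically) as equal-length 1-D
    arrays of band intensities along the plate."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    a = (a - a.mean()) / a.std()  # z-score normalisation
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def error_rates(genuine_scores, impostor_scores, threshold):
    """Biometric-style evaluation: false non-match rate on scores of
    same-ink pairs, false match rate on scores of different-ink pairs,
    at a given decision threshold."""
    fnmr = float(np.mean([s < threshold for s in genuine_scores]))
    fmr = float(np.mean([s >= threshold for s in impostor_scores]))
    return fnmr, fmr
```

Sweeping the threshold over such scores yields the detection/error trade-off curves that biometric performance tests are built on.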
Abstract:
The immune system has evolved to allow robust responses against pathogens while avoiding autoimmunity. This is notably enabled by stimulatory and inhibitory signals which contribute to the regulation of immune responses. In the presence of a pathogen, a specific and effective immune response must be induced, leading to antigen-specific T-cell proliferation, cytokine production, and induction of T-cell differentiation toward an effector phenotype. After clearance or control of the pathogen, the effector immune response must be terminated in order to avoid tissue damage and chronic inflammation, a process that involves coinhibitory molecules. When the immune system fails to eliminate or control the pathogen, continuous stimulation of T cells prevents full contraction and leads to the functional exhaustion of effector T cells. Several lines of evidence, both in vitro and in vivo, suggest that this anergic state can be reverted by blocking the interactions between coinhibitory molecules and their ligands. The potential to revert exhausted or inactivated T-cell responses through selective blocking of their function makes these markers interesting targets for therapeutic interventions in patients with persistent viral infections or cancer.
Abstract:
This paper addresses the migration behaviours of young university graduates from a rural region in Switzerland. Based on a questionnaire survey, it compares graduates' current place of residence (i.e. whether or not they returned to their home region) with characteristics related to their socio-familial, migration and professional trajectories. The propensity to return varies not only according to labour market variables (employment opportunities), but also to other factors, some of which have even more influence than job opportunities. The graduates' life course position (kind of household), their partners' characteristics (level of education and home region) and their family background (socio-economic status and history of migration) all play a central role. On the whole, results show that migration appears as a selective and complex process embedded in the life course of graduates.
Abstract:
Integration without cytotoxic effects and long-term expression of a transgene constitute a major challenge in gene therapy and biotechnology applications. In this context, transposons represent an attractive system for gene transfer because of their ability to promote efficient integration of a transgene in a variety of cell lines. However, transgene integration can lead to insertional mutagenesis and/or unstable transgene expression caused by epigenetic modifications. These unwanted events may be limited by the use of chromatin control elements called MARs (matrix attachment regions). Indeed, the insertion of these DNA elements next to the transgene usually results in higher and more stable expression, by maintaining the transgene's chromatin in an active configuration and preventing gene silencing. In this study, we tested whether the inclusion of the MAR 1-68 in the piggyBac transposon system may lead to efficient and safer transgene integration and ensure reliable, stable, long-term expression of a transgene. The MAR-containing transposon construct was tested in CHO cells, for biotechnology applications, and in mesoangioblast cells, which can differentiate into muscle cells and are important candidates for potential stem cell therapies of myopathies. We showed that the addition of the MAR 1-68 to the piggyBac transposon did not interfere with transposition, thereby maintaining a high frequency of transgene integration in these cells. Moreover, the MAR allowed higher transgene expression from fewer transposon integration events. We also found that enriched transgene-expressing cell populations could be obtained without the need for selection pressure. Since antibiotic-enforced selection protocols often result in a higher integrated copy number and mosaic expression patterns, this strategy could benefit many applications in which a low copy number of integrated transgenes and antibiotic-free conditions are desired.
In addition, the intramuscular transplantation of mesoangioblasts containing the transposon into mouse tibialis anterior muscles led to widespread and sustained myofiber transgene expression after differentiation of these cells in vivo. These findings indicate that piggyBac vectors may provide a viable approach to achieving stable gene transfer in the context of Duchenne muscular dystrophy therapy.
Abstract:
Reliable, long-term expression of transgenes remains a significant challenge for gene therapy and biotechnology applications, especially when antibiotic selection procedures are not applicable. In this context, transposons represent attractive gene transfer vectors because of their ability to promote efficient genomic integration in a variety of mammalian cell types. However, expression from genome-integrating vectors may be inhibited by variable gene transcription and/or silencing events. In this study, we assessed whether the inclusion of two epigenetic control elements, the human Matrix Attachment Regions (MAR) 1-68 and X-29, in a piggyBac transposon vector may lead to more reliable and efficient expression in CHO cells. We found that addition of the MAR 1-68 at the center of the transposon did not interfere with transposition frequency, and transgene-expressing cells could be readily detected in the total cell population without antibiotic selection. Inclusion of the MAR led to higher transgene expression per integrated copy, and reliable expression could be obtained from as few as 2-4 genomic copies of the MAR-containing transposon vector. The MAR X-29-containing transposon was found to mediate elevated expression of therapeutic proteins in polyclonal or monoclonal CHO cell populations using a transposable vector devoid of a selection gene. Overall, we conclude that MAR elements and transposable vectors can be used to improve transgene expression from a few genomic transposition events, which may be useful when expression from a low number of integrated transgene copies must be obtained and/or when antibiotic selection cannot be applied.
Abstract:
The InterPro database (http://www.ebi.ac.uk/interpro/) is a freely available resource that can be used to classify sequences into protein families and to predict the presence of important domains and sites. Central to the InterPro database are predictive models, known as signatures, from a range of different protein family databases that have different biological focuses and use different methodological approaches to classify protein families and domains. InterPro integrates these signatures, capitalizing on the respective strengths of the individual databases, to produce a powerful protein classification resource. Here, we report on the status of InterPro as it enters its 15th year of operation, and give an overview of new developments in the database and its associated Web interfaces and software. In particular, the new domain architecture search tool is described and the process of mapping Gene Ontology terms to InterPro entries is outlined. We also discuss the challenges faced by the resource given the explosive growth in sequence data in recent years. InterPro (version 48.0) contains 36 766 member database signatures integrated into 26 238 InterPro entries, an increase of 3993 entries (5081 signatures) since 2012.
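The integration model described above, where many member-database signatures map to one InterPro entry, can be pictured as a small lookup. The sketch below is a toy illustration with a handful of example identifiers; the actual mappings should always be checked against the live database:

```python
# Miniature of InterPro's integration model: signatures from member
# databases (Pfam, PROSITE, ...) are grouped into InterPro entries.
# The identifier mappings below are illustrative examples only.
SIGNATURE_TO_ENTRY = {
    "PF00069": "IPR000719",   # Pfam protein kinase domain signature
    "PS50011": "IPR000719",   # PROSITE signature for the same entry
    "PF07714": "IPR001245",   # Pfam tyrosine kinase signature
}

def classify(signature_hits):
    """Translate a protein's raw signature matches into the
    deduplicated set of integrated InterPro entries."""
    return {SIGNATURE_TO_ENTRY[s] for s in signature_hits
            if s in SIGNATURE_TO_ENTRY}
```

Deduplicating through the entry layer is what lets InterPro present one consistent annotation even when several member databases detect the same domain.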
Abstract:
BACKGROUND: Multimodality treatment suites for patients with cerebral arteriovenous malformations (AVM) have recently become available. This study was designed to evaluate the feasibility, safety and impact on treatment of a new intraoperative flat-panel (FP) based integrated surgical and imaging suite for combined endovascular and surgical treatment of cerebral AVM. METHODS: Twenty-five patients with AVMs to be treated with combined endovascular and surgical interventions were prospectively enrolled in this consecutive case series. The hybrid suite allows combined endovascular and surgical approaches with intraoperative scanner-like imaging (XperCT®) and intraoperative 3D rotational angiography (3D-RA). The impact of intraoperative multimodal imaging on feasibility, the workflow of combined interventions, surgery, and unexpected imaging findings was analyzed. RESULTS: Twenty-five patients (mean age 38 ± 18.6 years) with a median Spetzler-Martin grade 2 AVM (range 1-4) underwent combined endovascular and surgical procedures. Sixteen patients presented with a ruptured AVM and nine with an unruptured AVM. In 16% (n = 4) of cases, intraoperative imaging visualized AVM remnants ≤3 mm and allowed completion of the resection in the same session. Complete resection was confirmed in all 16 patients who have so far had follow-up angiography one year after surgery. All diagnostic and therapeutic steps, including angiographic control, were performed without having to move the patient. CONCLUSION: The hybrid neurointerventional suite was shown to be a safe and useful setup that allowed an unconstrained combined microsurgical and neuroradiological workflow. It reduces the need for extraoperative angiographic controls and subsequent potential surgical revisions, as small AVM remnants can be detected intraoperatively with high confidence.
Abstract:
There are only a few studies on the ontogeny and differentiation process of the hypothalamic supraoptic-paraventriculo-neurohypophysial neurosecretory system. In vitro neuron survival improves if cells are of embryonic origin; however, surviving hypothalamic neurons in culture were found to express small and minimal amounts of arginine-vasopressin (AVP) and oxytocin (OT), respectively. The aim of this study was to develop a primary neuronal culture design applicable to the study of magnocellular hypothalamic system functionality. For this purpose, a primary neuronal culture was set up after mechanical dissociation of sterile hypothalamic blocks from 17-day-old Sprague-Dawley rat embryos (E17) of both sexes. Isolated hypothalamic cells were cultured with supplemented (B27)-NeuroBasal medium containing an agent inhibiting non-neuron cell proliferation. The neurosecretory process was characterized by detecting AVP and OT secreted into the medium on different days of culture. Data indicate that spontaneous AVP and OT release occurred in a culture day-dependent fashion, being maximal on day 13 for AVP, and on day 10 for OT. Interestingly, brain-derived neurotrophic factor (BDNF) and Angiotensin II (A II) were able to positively modulate neuropeptide output. Furthermore, on day 17 of culture, non-specific (high-KCl) and specific (Angiotensin II) stimuli were able to significantly (P < 0.05) enhance the secretion of both neuropeptides over respective baselines. This study suggests that our experimental design is useful for the study of AVP- and OT-ergic neuron functionality and that BDNF and A II are positive modulators of embryonic hypothalamic cell development.
Abstract:
Under the influence of intelligence-led policing models, crime analysis methods have undergone important developments in recent years. Applications have been proposed in several fields of forensic science to exploit and manage various types of material evidence in a systematic and more efficient way. However, nothing has been suggested so far in the field of false identity documents. This study seeks to fill this gap by proposing a simple and general method for profiling false identity documents, which aims to establish links based on their visual forensic characteristics. A sample of more than 200 false identity documents, including French stolen blank passports, counterfeit driving licenses from Iraq and falsified Bulgarian driving licenses, was gathered from nine Swiss police departments and integrated into an ad hoc developed database called ProfID. Links detected automatically and systematically through this database were exploited and analyzed to produce strategic and tactical intelligence useful to the fight against identity document fraud. The profiling and intelligence process established for these three types of false identity documents confirmed its efficiency, with more than 30% of the documents being linked. Identity document fraud appears to be a structured and interregional form of criminality, against which material and forensic links detected between false identity documents might serve as a tool for investigation.
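The linkage step of such a profiling process can be sketched as an all-pairs comparison of coded visual characteristics. The feature names, threshold, and data structure below are hypothetical illustrations, not the ProfID implementation:

```python
def jaccard(features_a, features_b):
    """Similarity between two documents' visual-characteristic profiles,
    coded as sets of discrete features (feature names are hypothetical)."""
    a, b = set(features_a), set(features_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def detect_links(profiles, threshold=0.6):
    """All-pairs comparison of document profiles; returns the pairs
    whose similarity reaches the (illustrative) linkage threshold."""
    ids = sorted(profiles)
    links = []
    for i, d1 in enumerate(ids):
        for d2 in ids[i + 1:]:
            score = jaccard(profiles[d1], profiles[d2])
            if score >= threshold:
                links.append((d1, d2, score))
    return links

profiles = {  # hypothetical coded characteristics of seized documents
    "doc1": {"offset_background", "missing_uv_fibres"},
    "doc2": {"offset_background", "missing_uv_fibres", "wrong_font"},
    "doc3": {"inkjet_background"},
}
links = detect_links(profiles)  # doc1 and doc2 share 2 of 3 features
```

Aggregating such pairwise links over the whole collection is what turns individual seizures into series-level (strategic) intelligence.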
Abstract:
STUDY OBJECTIVES: Besides their well-established role in circadian rhythms, our findings that the forebrain expression of the clock genes Per2 and Dbp increases and decreases, respectively, in relation to time spent awake suggest that they also play a role in the homeostatic aspect of sleep regulation. Here, we determined whether time of day modulates the effects of elevated sleep pressure on clock-gene expression. Time-of-day effects were also assessed for recognized electrophysiological (EEG delta power) and molecular (Homer1a) markers of sleep homeostasis. DESIGN: EEG and qPCR data were obtained for baseline and for recovery from 6-h sleep deprivation starting at ZT0, -6, -12, or -18. SETTING: Mouse sleep laboratory. PARTICIPANTS: Male mice. INTERVENTIONS: Sleep deprivation. RESULTS: The sleep deprivation-induced changes in Per2 and Dbp expression varied markedly with time of day, such that Per2 could even decrease during sleep deprivation occurring at the decreasing phase of its baseline expression. Dbp showed similar, albeit opposite, dynamics. These unexpected results could be reliably predicted by assuming that these transcripts behave according to a driven damped harmonic oscillator. As expected, the sleep-wake distribution accounted for a large share of the changes in EEG delta power and Homer1a. Nevertheless, the sleep deprivation-induced increase in delta power also varied with time of day, with higher-than-expected levels when recovery sleep started at dark onset. CONCLUSIONS: Per2 and delta power are widely used as exclusive state variables of the circadian and homeostatic processes, respectively. Our findings demonstrate considerable cross-talk between these two processes. As Per2 in the brain responds to both sleep loss and time of day, this molecule is well positioned to keep track of, and to anticipate, homeostatic sleep need. CITATION: Curie T; Mongrain V; Dorsaz S; Mang GM; Emmenegger Y; Franken P.
Homeostatic and circadian contribution to EEG and molecular state variables of sleep regulation. SLEEP 2013;36(3):311-323.
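The driven damped harmonic oscillator assumption can be sketched numerically: a state variable with a ~24 h natural period is pushed by a sleep-wake drive. The parameter values and the square-wave drive below are invented for illustration and are not the study's fitted model:

```python
import math

def simulate_oscillator(force, omega=2 * math.pi / 24, zeta=0.3,
                        dt=0.1, hours=48):
    """Driven damped harmonic oscillator
    x'' + 2*zeta*omega*x' + omega**2 * x = F(t),
    integrated with semi-implicit Euler. F(t) stands in for the
    sleep-wake drive; omega gives a ~24 h natural period."""
    n = round(hours / dt)
    x, v, trace = 0.0, 0.0, []
    for i in range(n):
        t = i * dt
        a = force(t) - 2 * zeta * omega * v - omega ** 2 * x
        v += a * dt   # update velocity first (semi-implicit Euler)
        x += v * dt   # then position, using the updated velocity
        trace.append(x)
    return trace

# illustrative square-wave drive: "awake" for 12 h, "asleep" for 12 h
drive = lambda t: 1.0 if (t % 24) < 12 else -1.0
trace = simulate_oscillator(drive)
```

Because the response depends on the phase at which the drive is applied, such a model naturally produces the time-of-day-dependent, even paradoxical, responses to sleep deprivation reported above.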
'Toxic' and 'Nontoxic': confirming critical terminology concepts and context for clear communication
Abstract:
If 'the dose makes the poison', and if the context of an exposure to a hazard shapes the risk as much as the innate character of the hazard itself, then what is 'toxic' and what is 'nontoxic'? This article is intended to help readers and communicators: anticipate that concepts such as 'toxic' and 'nontoxic' may have different meanings to different stakeholders in different contexts of general use, commerce, science, and the law; recognize specific situations in which terms and related information could potentially be misperceived or misinterpreted; evaluate the relevance, reliability, and other attributes of information for a given situation; control actions, assumptions, interpretations, conclusions, and decisions to avoid flaws and achieve a desired outcome; and confirm that the desired outcome has been achieved. To meet those objectives, we provide some examples of differing toxicology terminology concepts and contexts; a comprehensive decision-making framework for understanding and managing risk; along with a communication and education message and audience-planning matrix to support the involvement of all relevant stakeholders; a set of CLEAR-communication assessment criteria for use by both readers and communicators; example flaws in decision-making; a suite of three tools to assign relevance vs reliability, align know vs show, and refine perception vs reality aspects of information; and four steps to foster effective community involvement and support. The framework and supporting process are generally applicable to meeting any objective.
Abstract:
The proportion of population living in or around cites is more important than ever. Urban sprawl and car dependence have taken over the pedestrian-friendly compact city. Environmental problems like air pollution, land waste or noise, and health problems are the result of this still continuing process. The urban planners have to find solutions to these complex problems, and at the same time insure the economic performance of the city and its surroundings. At the same time, an increasing quantity of socio-economic and environmental data is acquired. In order to get a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist and are still under development and can be exploited by the urban geographers for improving our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is showed using two examples at different scales. The problem of spatiotemporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics and more specifically the traffic due to commuting to work is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues on the importance of scale in urban analysis and clustering of urban phenomena are exposed. 
A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
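As a rough illustration of the self-organising maps used above for high-dimensional socio-economic data, here is a minimal, generic SOM training loop. It is a sketch only: the grid size, decay schedules, and data are invented and are not taken from the thesis.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Train a minimal rectangular self-organising map on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # Neuron weight vectors, initialised uniformly in the data range.
    w = rng.uniform(data.min(), data.max(), size=(rows, cols, data.shape[1]))
    # Grid coordinates of each neuron, used by the neighbourhood function.
    yy, xx = np.mgrid[0:rows, 0:cols]
    coords = np.stack([yy, xx], axis=-1).astype(float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit: the neuron closest to the sample.
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Linearly decay the learning rate and neighbourhood radius.
            frac = step / n_steps
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 1e-3
            # Gaussian neighbourhood around the BMU on the grid.
            g = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=-1)
                       / (2 * sigma ** 2))
            # Pull the BMU and its neighbours towards the sample.
            w += lr * g[..., None] * (x - w)
            step += 1
    return w
```

Each data vector is mapped to its best-matching unit and neighbouring units are pulled towards it, so similar high-dimensional profiles end up on nearby grid cells, which is what makes the map useful for visual cluster analysis.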
Abstract:
BACKGROUND: Clinical practice does not always reflect best practice and evidence, partly because of unconscious acts of omission, information overload, or inaccessible information. Reminders may help clinicians overcome these problems by prompting them to recall information that they already know or would be expected to know, and by providing information or guidance in a more accessible and relevant format, at a particularly appropriate time. OBJECTIVES: To evaluate the effects of reminders automatically generated through a computerized system and delivered on paper to healthcare professionals on processes of care (related to healthcare professionals' practice) and outcomes of care (related to patients' health condition). SEARCH METHODS: For this update, the EPOC Trials Search Co-ordinator searched the following databases between 11 and 19 June 2012: the Cochrane Central Register of Controlled Trials (CENTRAL) and Cochrane Library (Economics, Methods, and Health Technology Assessment sections), Issue 6, 2012; MEDLINE, OVID (1946- ), Daily Update, and In-process; EMBASE, Ovid (1947- ); CINAHL, EbscoHost (1980- ); EPOC Specialised Register, Reference Manager; and INSPEC, Engineering Village. The authors reviewed the reference lists of related reviews and studies. SELECTION CRITERIA: We included individual or cluster-randomized controlled trials (RCTs) and non-randomized controlled trials (NRCTs) that evaluated the impact of computer-generated reminders delivered on paper to healthcare professionals on processes and/or outcomes of care. DATA COLLECTION AND ANALYSIS: Review authors working in pairs independently screened studies for eligibility and abstracted data. We contacted authors to obtain important missing information for studies published within the last 10 years. For each study, we extracted the primary outcome when it was defined, or calculated the median effect size across all reported outcomes.
We then calculated the median absolute improvement and interquartile range (IQR) in process adherence across the included studies, using the primary outcome or median outcome as the representative outcome. MAIN RESULTS: In the 32 included studies, computer-generated reminders delivered on paper to healthcare professionals achieved moderate improvement in professional practices, with a median improvement in processes of care of 7.0% (IQR: 3.9% to 16.4%). Implementing reminders alone improved care by 11.2% (IQR 6.5% to 19.6%) compared with usual care, while implementing reminders in addition to another intervention improved care by only 4.0% (IQR 3.0% to 6.0%) compared with the other intervention. The quality of evidence for these comparisons was rated as moderate according to the GRADE approach. Two reminder features were associated with larger effect sizes: providing space on the reminder for the provider to enter a response (median 13.7% versus 4.3% for no response, P value = 0.01) and providing an explanation of the content or advice on the reminder (median 12.0% versus 4.2% for no explanation, P value = 0.02). Median improvement in processes of care also differed according to the behaviour the reminder targeted: for instance, reminders to vaccinate improved processes of care by 13.1% (IQR 12.2% to 20.7%) compared with other targeted behaviours. In the only study that had sufficient power to detect a clinically significant effect on outcomes of care, reminders were not associated with significant improvements. AUTHORS' CONCLUSIONS: There is moderate-quality evidence that computer-generated reminders delivered on paper to healthcare professionals achieve moderate improvement in processes of care. Two characteristics emerged as significant predictors of improvement: providing space on the reminder for a response from the clinician and providing an explanation of the reminder's content or advice.
The heterogeneity of the reminder interventions included in this review also suggests that reminders can improve care in various settings under various conditions.
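The review's summary statistic, a median absolute improvement with its interquartile range across studies, is straightforward to compute. The sketch below uses Python's standard library on invented per-study values; the numbers are illustrative only, not data from the review.

```python
import statistics

# Hypothetical absolute improvements in process adherence (%), one per study.
# These values are invented for illustration, not taken from the review.
study_improvements = [3.9, 5.2, 6.5, 7.0, 8.1, 11.2, 13.7, 16.4, 19.6]

def median_and_iqr(values):
    """Return the median and the (Q1, Q3) interquartile range of `values`."""
    # quantiles(n=4) returns the three quartile cut points Q1, median, Q3.
    q1, med, q3 = statistics.quantiles(sorted(values), n=4)
    return med, (q1, q3)

med, (q1, q3) = median_and_iqr(study_improvements)
print(f"median improvement: {med:.1f}% (IQR {q1:.2f}% to {q3:.2f}%)")
```

A real replication would substitute each study's primary outcome, or its median effect size across reported outcomes, exactly as the review's data-analysis section describes.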
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is based entirely on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
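The likelihood-ratio style of probabilistic evaluation alluded to above can be sketched generically. Everything here is an assumption for illustration: the Gaussian score model, the parameter values, and the function names are invented and are not the model published by Neumann and Margot.

```python
import math

# Hypothetical Gaussian distributions of an ink-similarity score under two
# propositions: Hp (same ink) and Hd (different inks). Parameters are invented.
MU_SAME, SD_SAME = 0.9, 0.05   # scores tend to be high and tight under Hp
MU_DIFF, SD_DIFF = 0.4, 0.15   # and lower, more spread out, under Hd

def gaussian_pdf(x, mu, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def likelihood_ratio(score):
    """LR = P(score | same ink) / P(score | different inks)."""
    return gaussian_pdf(score, MU_SAME, SD_SAME) / gaussian_pdf(score, MU_DIFF, SD_DIFF)

print(likelihood_ratio(0.85))  # high similarity score -> LR >> 1, supports Hp
print(likelihood_ratio(0.45))  # low similarity score  -> LR << 1, supports Hd
```

The transparency claimed for the probabilistic approach comes from exactly this structure: the score distributions can be published and challenged, and the resulting LR states the strength of the evidence rather than a bare expert opinion.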