988 results for Digital forensic
Abstract:
Objectives: to determine the detection rate of diabetic retinopathy using non-mydriatic retinography in Primary Care, and to describe the inclusion/exclusion criteria, the basic ophthalmological examination performed, the referral criteria, and the characteristics of the population group studied. Methodology: descriptive analysis; ophthalmological examination and interpretation of results in Primary Care. Patients: a sample of 800 patients meeting the inclusion criteria. Results: 94.63% presented normal intraocular pressure; 77.4% presented impaired visual acuity; 12.02% presented diabetic retinopathy. Other ocular pathologies were identified. 14.01% of patients were referred to Ophthalmology. Conclusions: Ophthalmological examination in Primary Care increases the detection of diabetic retinopathy.
Abstract:
Isopropyl alcohol (IPA) is widely used as an industrial solvent and cleaning fluid. After ingestion or absorption, IPA is converted into acetone by alcohol dehydrogenase. However, in ketosis, acetone can be reduced to IPA. The aim of this study was to investigate blood IPA and acetone concentrations in a series of 400 medico-legal autopsies, including cases of diabetic ketoacidosis, hypothermia and alcohol misuse-related deaths, to illustrate the extent of ketosis at the time of death. Vitreous glucose, blood 3-β-hydroxybutyrate (3HB) and acetoacetate (AcAc) concentrations were also determined systematically. Additionally, vitreous and urine IPA, acetone, 3HB and AcAc concentrations, as well as other biochemical markers including glycated hemoglobin and carbohydrate-deficient transferrin (CDT), were determined in selected cases. The results of this study indicate that ketosis is characterized by the presence of IPA resulting from acetone metabolism and that IPA can be detected in several substrates. These findings confirm the importance of systematically determining IPA and acetone levels in order to quantify biochemical disturbances and assess the extent of ketosis at the time of death.
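The two-way conversion described above can be summarized as a single NAD-coupled alcohol dehydrogenase equilibrium (a standard biochemistry relation, stated here for orientation rather than taken from the study):

```latex
\[
\underbrace{\mathrm{(CH_3)_2CHOH}}_{\text{isopropyl alcohol}} + \mathrm{NAD^{+}}
\;\rightleftharpoons\;
\underbrace{\mathrm{(CH_3)_2CO}}_{\text{acetone}} + \mathrm{NADH} + \mathrm{H^{+}}
\]
```

In ketosis, the elevated NADH/NAD+ ratio shifts this equilibrium to the left, which is why acetone can be reduced back to IPA postmortem.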
Abstract:
The effect of copper (Cu) filtration on image quality and dose in different digital X-ray systems was investigated. Two computed radiography systems and one digital radiography detector were used. Three different polymethylmethacrylate blocks simulated the pediatric body. The effects of Cu filters of 0.1, 0.2, and 0.3 mm thickness on the entrance surface dose (ESD) and the corresponding effective dose (ED) were measured at tube voltages of 60, 66, and 73 kV. Image quality was evaluated in a contrast-detail phantom with automated analyzer software. Cu filters of 0.1, 0.2, and 0.3 mm thickness decreased the ESD by 25-32%, 32-39%, and 40-44%, respectively, the ranges depending on the respective tube voltages. There was no consistent decline in image quality with increasing Cu filtration. The estimated ED of anterior-posterior (AP) chest projections was reduced by up to 23%. No relevant reduction in the ED was noted in AP radiographs of the abdomen and pelvis or in posterior-anterior radiographs of the chest. Cu filtration reduces the ESD but generally does not reduce the effective dose. Cu filters can help protect radiosensitive superficial organs, such as the mammary glands in AP chest projections.
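For intuition, the filter effect can be sketched with the Beer-Lambert law under a monoenergetic approximation; the attenuation coefficient below is an assumed, illustrative value for Cu near a ~40 keV mean beam energy and is not taken from the study:

```python
import math

# Beer-Lambert attenuation: I = I0 * exp(-mu * x).
# Assumed illustrative value: mass attenuation coefficient of Cu
# (~4.9 cm^2/g near 40 keV) times Cu density (8.96 g/cm^3).
MU_CU_PER_CM = 4.9 * 8.96  # ~43.9 1/cm

def esd_reduction(filter_mm: float) -> float:
    """Fractional ESD reduction for a Cu filter of given thickness (mm)."""
    transmitted = math.exp(-MU_CU_PER_CM * filter_mm / 10.0)  # mm -> cm
    return 1.0 - transmitted

for t in (0.1, 0.2, 0.3):
    print(f"{t} mm Cu: ~{esd_reduction(t):.0%} reduction (monoenergetic model)")
```

This simple model overestimates the reduction relative to the measured 25-44% because a real polychromatic beam hardens as it passes through the filter and exposure settings partly compensate for the attenuation.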
Abstract:
This work proposes applying Google Art Project technology at the Museu d'Art Contemporani de Barcelona (MACBA) as a way of bringing art closer to a more international audience. To this end, a digital communication strategy will be developed that takes this tool as its centerpiece and encompasses other interactive methods on social networks and other 2.0 socialization spaces. The strategy will be grounded in the real context of contemporary art in Barcelona and in the closest possible alignment with this innovative initiative.
Abstract:
OBJECTIVE: Quality assurance (QA) in clinical trials is essential to ensure treatment is safely and effectively delivered. As QA requirements have increased in complexity in parallel with the evolution of radiation therapy (RT) delivery, a need to facilitate digital data exchange emerged. Our objective is to present the platform developed for the integration and standardization of QA in RT (QART) activities across all EORTC trials involving RT. METHODS: The following essential requirements were identified: secure and easy access without on-site software installation; integration within the existing EORTC clinical remote data capture system; and the ability to both customize the platform to specific studies and adapt it to future needs. After retrospective testing within several clinical trials, the platform was introduced in phases to participating sites and QART study reviewers. RESULTS: The resulting QA platform, integrating RT analysis software installed at EORTC Headquarters, permits timely, secure, and fully digital central DICOM-RT-based data review. Participating sites submit data through a standard secure upload webpage. Supplemental information is submitted in parallel through web-based forms. An internal quality check by the QART office verifies data consistency, formatting, and anonymization. QART reviewers have remote access through a terminal server. Reviewers evaluate submissions for protocol compliance through an online evaluation matrix. Comments are collected by the coordinating centre and institutions are informed of the results. CONCLUSIONS: This web-based central review platform facilitates rapid, extensive, and prospective QART review. This reduces the risk that trial outcomes are compromised through inadequate radiotherapy and facilitates correlation of results with clinical outcomes.
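The internal quality check described above (consistency, formatting, anonymization) can be illustrated with a minimal sketch; the specific checks, field choices, and use of pydicom below are assumptions for illustration, not the EORTC implementation:

```python
import pydicom

def check_submission(path: str) -> list[str]:
    """Minimal illustrative pre-review checks for an uploaded DICOM-RT file."""
    problems = []
    ds = pydicom.dcmread(path)

    # Formatting: expect an RT object (plan, dose, structure set, image) or CT.
    if ds.get("Modality") not in {"RTPLAN", "RTDOSE", "RTSTRUCT", "RTIMAGE", "CT"}:
        problems.append(f"unexpected Modality: {ds.get('Modality')}")

    # Anonymization: patient identifiers must be blank or pseudonymized.
    name = str(ds.get("PatientName", ""))
    if name and name.upper() not in {"ANONYMOUS", "ANON"}:
        problems.append("PatientName not anonymized")
    if ds.get("PatientBirthDate"):
        problems.append("PatientBirthDate present")

    # Consistency: the file must reference the study it claims to belong to.
    if not ds.get("StudyInstanceUID"):
        problems.append("missing StudyInstanceUID")
    return problems
```

A submission with an empty problem list would pass to the reviewers' evaluation matrix; anything else would be returned to the submitting site.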
Abstract:
The aim of this work is to present some practical applications of postmortem biochemistry to illustrate the usefulness of this discipline and reassert the importance of carrying out biochemical investigations as an integral part of the autopsy process. Five case reports are presented, pertaining to: diabetic ketoacidosis in an adult not known to suffer from diabetes, in the presence of multiple psychotropic substances; fatal flecainide intoxication in a poor metabolizer who also presented impaired renal function; diabetic ketoacidosis showing severe postmortem changes; primary aldosteronism presenting with intracranial hemorrhage; and hypothermia showing severe postmortem changes. The cases presented here can be considered representative examples of the importance of postmortem biochemical investigations, which may provide significant information useful in determining the cause of death in routine forensic casework or contribute to understanding the pathophysiological mechanisms involved in the death process.
Abstract:
The contribution of ink evidence to forensic science is described and supported by an abundant literature and by two standards from the American Society for Testing and Materials (ASTM). The vast majority of the available literature is concerned with the physical and chemical analysis of ink evidence. The relevant ASTM standards mention some principles regarding the comparison of pairs of ink samples and the evaluation of their evidential value. A review of this literature, and more specifically of the ASTM standards, in the light of recent developments in the interpretation of forensic evidence reveals potential improvements that would maximise the benefits of ink evidence in forensic science. This thesis proposes to interpret ink evidence within the widely accepted and recommended Bayesian framework. This proposition required the development of a new quality assurance process for the analysis and comparison of ink samples, as well as the definition of a theoretical framework for ink evidence. The proposed methodology was extensively tested using a large, purpose-built dataset of ink samples and state-of-the-art tools commonly used in biometrics. Overall, this research successfully addresses a concrete problem frequently encountered in forensic science, where scientists tend to limit the usefulness of the information present in various types of evidence by trying to answer the wrong questions. The declaration of an explicit framework, which defines and formalises their goals and expected contributions to the criminal and civil justice system, enables the determination of their needs in terms of technology and data. The development of this technology and the collection of the relevant data can then be justified economically, structured scientifically, and carried out efficiently.
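The Bayesian framework referred to above is conventionally expressed through the likelihood ratio; a compact statement in standard forensic-interpretation notation (not reproduced from the thesis itself):

```latex
\[
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}}_{\text{likelihood ratio } LR}
\times
\underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
\]
```

Here E is the observed correspondence between two ink samples, H_p the proposition that they share a common source, and H_d the alternative; the examiner reports only the likelihood ratio, leaving the prior odds to the court.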
Abstract:
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. The earlier steps of the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach comprising a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step build on the previous one; the methodology is not linear, however, but a cyclic, iterative progression toward knowledge about an event. The preliminary analysis is a pre-evaluation phase wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step covers reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending the use of images as evidence and, more generally, as clues in investigation and crime reconstruction processes.
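A schematic rendering of the cyclic four-step progression described above; the step names follow the abstract, while the data structures and control flow are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Image:
    source: str
    relevant: bool          # preliminary analysis: potential relevance
    quality_ok: bool        # step 2: informative potential
    clues: tuple            # e.g. ("place:lobby", "time:03:12", "action:entry")

@dataclass
class Hypothesis:
    statement: str
    required_clues: set
    status: str = "open"    # open -> supported / eliminated

def reconstruct(images: list, hypotheses: list, max_cycles: int = 3) -> list:
    """Cyclic, iterative progression: each pass may refine the previous one."""
    for _ in range(max_cycles):
        # Step 1: detect and collect images judged pertinent.
        collected = [im for im in images if im.relevant]
        # Step 2: organise and keep those whose quality supports interpretation.
        usable = [im for im in collected if im.quality_ok]
        # Step 3: pool clues about space, time and actions.
        known = {c for im in usable for c in im.clues}
        # Step 4: evaluate propositions (hypothetico-deductive test).
        for h in hypotheses:
            if h.required_clues <= known:
                h.status = "supported"
            elif f"contradicts:{h.statement}" in known:
                h.status = "eliminated"
        if all(h.status != "open" for h in hypotheses):
            break  # no open propositions left; the reconstruction has stabilised
    return hypotheses
```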
Abstract:
Different interferometric techniques have been developed over the last decade to obtain full-field, quantitative, absolute phase imaging, such as phase shifting, Fourier phase microscopy, Hilbert phase microscopy and digital holographic microscopy (DHM). Although these techniques are very similar, DHM combines several advantages. In contrast to phase shifting, DHM is capable of single-shot hologram recording, allowing real-time absolute phase imaging. On the other hand, unlike Fourier phase or Hilbert phase microscopy, DHM does not require recording in-focus images of the specimen on the digital detector (CCD or CMOS camera), because focus can be adjusted numerically through wavefront propagation. Consequently, the depth of field of high-NA microscope objectives is numerically extended. For example, two biological cells floating at different depths in a liquid can be brought into focus numerically from the same digital hologram. Moreover, the numerical propagation, combined with digital optics and automatic fitting procedures, permits vibration-insensitive full-field phase imaging and the complete compensation of essentially any image distortion and/or phase aberration introduced, for example, by imperfections of holders or a perfusion chamber. Examples of real-time full-field phase images of biological cells have been demonstrated. ©2008 COPYRIGHT SPIE
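The numerical refocusing described above rests on numerical wavefront propagation; a minimal sketch of the standard angular spectrum method follows (a textbook algorithm with assumed parameter values, not the authors' code):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field by distance z via the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)   # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    # Keep propagating components only; evanescent ones are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = k * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: refocus a reconstructed hologram wavefront by 20 micrometres
# (wavelength and pixel size are assumed for illustration).
field = np.ones((512, 512), dtype=complex)
refocused = angular_spectrum_propagate(field, wavelength=658e-9,
                                       pixel_size=3.45e-6, z=20e-6)
phase = np.angle(refocused)  # quantitative phase image
```

Applying the same transfer function with different z values to one hologram is what lets two cells at different depths be focused from a single acquisition.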
Abstract:
Digital holographic microscopy (DHM) is a technique that yields, from a single recorded hologram, a quantitative phase image of a living cell with interferometric accuracy. Specifically, the optical phase shift induced by the specimen on the transmitted wavefront can be regarded as a powerful endogenous contrast agent, depending on both the thickness and the refractive index of the sample. Thanks to a decoupling procedure, cell thickness and intracellular refractive index can be measured separately. Consequently, mean corpuscular volume (MCV) and mean corpuscular hemoglobin concentration (MCHC), two highly relevant clinical parameters, have been measured non-invasively at the single-cell level. The nanometric axial and microsecond temporal sensitivities of DHM have made it possible to measure red blood cell membrane fluctuations (CMF) over the whole cell surface. ©2009 COPYRIGHT SPIE--The International Society for Optical Engineering.
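The decoupling mentioned above exploits the fact that the measured phase mixes thickness and refractive index. In the two-media variant of the procedure reported in the DHM literature (stated here in standard notation; the details are assumed from that literature rather than from this abstract), recording the same cell in two perfusion media of known indices gives two equations that can be solved for both unknowns:

```latex
\[
\varphi_i \;=\; \frac{2\pi}{\lambda}\, d \,\bigl(\bar{n}_c - n_{m_i}\bigr),
\qquad i = 1, 2
\]
\[
\Rightarrow\quad
d \;=\; \frac{\lambda\,(\varphi_1 - \varphi_2)}{2\pi\,(n_{m_2} - n_{m_1})},
\qquad
\bar{n}_c \;=\; n_{m_1} + \frac{\lambda\,\varphi_1}{2\pi\, d}
\]
```

Here d is the cell thickness, n̄_c the mean intracellular refractive index, and n_{m_1}, n_{m_2} the refractive indices of the two perfusion media.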
Abstract:
Digitalization empowers the Internet by allowing multiple virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium for securing digital identities. Identity functionality is increasingly delivered as sets of services rather than monolithic applications, so an identity layer in which identity and privacy management services are loosely coupled, publicly hosted, and available to on-demand calls is a more realistic and acceptable arrangement. Identity and privacy should be interoperable and distributed through the adoption of service orientation and implementation based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable, user-centric, digital identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards, and protocols are not enough to resolve the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason, they should be apprehended within a global perspective through an integrated, multidisciplinary approach, in which privacy law, policies, regulations, and technologies are crafted together from the start rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic, and business-specific privacy policies. The requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that would help an organization's security team turn business interoperability into technical interoperability in the form of a set of services that fit a service-oriented architecture (SOA): a privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for a shared understanding between business management and technical managers on digital identity-related privacy initiatives. The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of a DigIdeRP project roadmap; in practice, however, the process is iterative, to ensure that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team could follow to implement PaaSS successfully. Several blocks are described using the OMG SoaML modeling language and BPMN process descriptions. We identified, designed, and implemented the seven services that form PaaSS and described their consumption. The PaaSS Java (Java EE) project, WSDL, and XSD code are given and explained.
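A minimal sketch of the "privacy as a set of services" idea, with loosely coupled service contracts composed on demand around an identity transaction; the service names, behavior, and Python rendering are illustrative assumptions (the thesis itself specifies its seven services in SoaML/WSDL and Java EE):

```python
from abc import ABC, abstractmethod

class PrivacyService(ABC):
    """Common contract: loosely coupled, independently hosted, called on demand."""
    @abstractmethod
    def invoke(self, identity: str, context: dict) -> dict: ...

class ConsentService(PrivacyService):            # hypothetical service name
    def invoke(self, identity, context):
        # Toy policy: consent holds only for declared, permitted purposes.
        return {"consent": context.get("purpose") in {"billing", "shipping"}}

class MinimalDisclosureService(PrivacyService):  # hypothetical service name
    def invoke(self, identity, context):
        # Toy policy: release only the least-revealing claims.
        allowed = {"pseudonym", "age_over_18"}
        return {"claims": [c for c in context.get("claims", []) if c in allowed]}

def paass_pipeline(identity: str, context: dict, services: list) -> dict:
    """Compose privacy services around an identity transaction (SOA style)."""
    result = {}
    for svc in services:
        result.update(svc.invoke(identity, context))
    return result

decision = paass_pipeline("user-42",
                          {"purpose": "billing", "claims": ["pseudonym", "ssn"]},
                          [ConsentService(), MinimalDisclosureService()])
print(decision)  # {'consent': True, 'claims': ['pseudonym']}
```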