898 results for Election Counting and Reporting Software
Abstract:
The purpose of this research study is to discuss privacy and data protection-related regulatory and compliance challenges posed by digital transformation in healthcare in the wake of the COVID-19 pandemic. The public health crisis accelerated the development of patient-centred remote/hybrid healthcare delivery models that make increased use of telehealth services and related digital solutions. The large-scale uptake of IoT-enabled medical devices and wellness applications, and the offering of healthcare services via healthcare platforms (online doctor marketplaces), have catalysed these developments. However, the use of new enabling technologies (IoT, AI) and the platformisation of healthcare pose complex challenges to the protection of patients' privacy and personal data. This happens at a time when the EU is drawing up a new regulatory landscape for the use of data and digital technologies. Against this background, the study presents an interdisciplinary (normative and technology-oriented) critical assessment of how the new regulatory framework may affect privacy and data protection requirements regarding the deployment and use of Internet of Health Things (hardware) devices and interconnected software (AI systems). The study also assesses key privacy and data protection challenges that affect healthcare platforms (online doctor marketplaces) in their offering of video API-enabled teleconsultation services and their (anticipated) integration into the European Health Data Space. The overall conclusion of the study is that regulatory deficiencies may create integrity risks for the protection of privacy and personal data in telehealth, owing to uncertainties about the proper interplay, legal effects and effectiveness of (existing and proposed) EU legislation. The proliferation of normative measures may increase compliance costs, hinder innovation and, ultimately, deprive European patients of state-of-the-art digital health technologies, which is, paradoxically, the opposite of what the EU plans to achieve.
Abstract:
Vision systems are powerful tools that play an increasingly important role in modern industry to detect errors and maintain product standards. With the growing availability of affordable industrial cameras, computer vision algorithms have been increasingly applied to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies, with execution times compatible with the production specifications. Other constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of the trials conducted to reach the final performance. Transfer learning, which alleviates the need for large amounts of training data, combined with data augmentation methods consisting in the generation of synthetic images, was used to improve performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, devoted respectively to vial counting and discrepancy detection. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 test images were used for the second.
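A minimal sketch of the transfer-learning-plus-augmentation recipe described above, assuming a PyTorch/torchvision setup and loosely modelled on the discrepancy-detection subtask; the backbone, augmentations, optimiser and the binary "pack OK / anomaly" head are illustrative placeholders, not the pipeline actually deployed on the packaging line.

```python
# Illustrative sketch only: fine-tuning a pretrained backbone with augmentation
# for a small industrial inspection dataset (assumed PyTorch/torchvision setup).
import torch
import torch.nn as nn
from torchvision import models, transforms

# Data augmentation: geometric/photometric perturbations stand in here for the
# synthetic-image generation mentioned in the abstract.
train_transform = transforms.Compose([
    transforms.RandomRotation(5),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Transfer learning: reuse ImageNet features, retrain only the classification head
# (a hypothetical binary "pack OK / anomaly" output).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                     # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)   # new head, trained from scratch

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One optimisation step on a batch of already-transformed images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone and training only the new head mirrors the idea of alleviating the need for large annotated datasets; in practice some backbone layers may later be unfrozen for fine-tuning.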
Abstract:
This thesis, the result of an internship at the DIN laboratory in Montecuccolino, presents a software architecture containing all the control laws needed to carry out a Pick and Place task with a Delta Robot. The work is organised into a first phase of analysis of the robot, a second phase of calibration of the system in its structure, and a third phase of development of the control software, followed by experimental validation of the results obtained.
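A minimal sketch of the task-sequencing layer that a pick-and-place controller of this kind typically exposes, sitting above the actual control laws; the `robot` interface, waypoints and clearance value are hypothetical and do not reflect the software developed in the thesis.

```python
# Hypothetical sketch of a pick-and-place sequence for a delta robot.
# move_to() and the gripper calls are placeholders for the robot's real control API.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float  # task-space coordinates of the end effector [m]

def pick_and_place(robot, pick: Pose, place: Pose, clearance: float = 0.05):
    """Move above the pick point, grasp, and release above the place point."""
    above_pick = Pose(pick.x, pick.y, pick.z + clearance)
    above_place = Pose(place.x, place.y, place.z + clearance)

    robot.move_to(above_pick)     # approach from above to avoid collisions
    robot.move_to(pick)
    robot.close_gripper()         # grasp the object
    robot.move_to(above_pick)     # lift
    robot.move_to(above_place)    # transfer
    robot.move_to(place)
    robot.open_gripper()          # release
    robot.move_to(above_place)    # retract
```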
Abstract:
The increasing number of Resident Space Objects (RSOs) is a threat to spaceflight operations. Conjunction Data Messages (CDMs) are sent to satellite operators to warn of possible future collisions and their probabilities. The research project described herein developed an algorithm able to update the collision probability directly on board, starting from CDMs and the state vector of the host satellite, which is continuously refreshed by an onboard GNSS receiver. A large set of methods for computing the collision probability was analyzed in order to find the best ones for this application. The selected algorithm was then tested to assess and improve its performance. Finally, parts of the algorithm and external software were implemented on a Raspberry Pi 3B+ board to demonstrate the compatibility of this approach with computational resources similar to those typically available onboard modern spacecraft.
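As an illustration of the kind of computation involved, the sketch below numerically integrates a 2D Gaussian over the hard-body circle in the encounter plane, which is the basis of several of the short-term collision-probability methods such a study compares; the miss vector, covariance and radius are made-up placeholder values, not data from an actual CDM.

```python
# Illustrative 2D encounter-plane collision probability (numerical integration).
# Inputs (miss vector, combined covariance, hard-body radius) are made-up examples.
import numpy as np

def collision_probability(miss, cov, radius, n=400):
    """Integrate the relative-position Gaussian over the combined hard-body disk."""
    inv_cov = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))

    # Regular grid over the disk of radius `radius` centred on the origin
    xs = np.linspace(-radius, radius, n)
    ys = np.linspace(-radius, radius, n)
    X, Y = np.meshgrid(xs, ys)
    inside = X**2 + Y**2 <= radius**2

    # Gaussian centred on the predicted miss vector, evaluated at the disk points
    dx = X - miss[0]
    dy = Y - miss[1]
    quad = inv_cov[0, 0]*dx**2 + 2*inv_cov[0, 1]*dx*dy + inv_cov[1, 1]*dy**2
    pdf = norm * np.exp(-0.5 * quad)

    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    return float(np.sum(pdf[inside]) * cell)

# Example with placeholder values (metres): ~117 m miss distance, 20 m hard body
pc = collision_probability(miss=np.array([100.0, 60.0]),
                           cov=np.array([[2500.0, 300.0], [300.0, 900.0]]),
                           radius=20.0)
print(f"Pc ~ {pc:.2e}")
```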
Abstract:
Planning is an important sub-field of artificial intelligence (AI) focusing on letting intelligent agents deliberate on the most adequate course of action to attain their goals. Given the recent growth in the number of critical domains and systems that exploit planning for their internal procedures, there is an increasing need for planning systems to become more transparent and trustworthy. Along this line, planning systems are now required to produce not only plans but also explanations about those plans, or the way they were attained. To address this issue, a new research area is emerging in the AI panorama: eXplainable AI (XAI), within which explainable planning (XAIP) is a pivotal sub-field. As a recent domain, XAIP is far from mature. No consensus has been reached in the literature about what explanations are, how they should be computed, and what they should explain in the first place. Furthermore, existing contributions are mostly theoretical, and software implementations are rarely more than preliminary. To overcome such issues, in this thesis we design an explainable planning framework bridging the gap between theoretical contributions from the literature and software implementations. More precisely, taking inspiration from the state of the art, we develop a formal model for XAIP, and a software tool enabling its practical exploitation. Accordingly, the contribution of this thesis is four-fold. First, we review the state of the art of XAIP, supplying an outline of its most significant contributions from the literature. We then generalise the aforementioned contributions into a unified model for XAIP, aimed at supporting model-based contrastive explanations. Next, we design and implement an algorithm-agnostic library for XAIP based on our model. Finally, we validate our library from a technological perspective, via an extensive testing suite; furthermore, we assess its performance and usability through a set of benchmarks and end-to-end examples.
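A minimal sketch of the model-based contrastive scheme referred to above, under the usual reading in the XAIP literature: a "why action a rather than b?" question is answered by re-planning with the alternative (foil) forced and comparing the two plans. The planner interface and problem objects below are hypothetical and are not the API of the library developed in the thesis.

```python
# Hypothetical sketch of a contrastive "why a rather than b?" explanation loop.
# `planner.solve(problem)` is an assumed interface returning a plan with a cost.

def contrastive_explanation(planner, problem, step, fact_action, foil_action):
    """Compare the original plan with one forced to use the foil at `step`."""
    original = planner.solve(problem)

    # Build a contrastive problem: same model, but the foil action must be
    # applied at the questioned step (e.g. via an added constraint).
    constrained = problem.with_forced_action(step, foil_action)
    alternative = planner.solve(constrained)

    if alternative is None:
        return f"No valid plan uses {foil_action} at step {step}."

    delta = alternative.cost - original.cost
    return (f"Using {fact_action} at step {step} costs {original.cost}; "
            f"forcing {foil_action} instead yields cost {alternative.cost} "
            f"({'+' if delta >= 0 else ''}{delta}).")
```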
Abstract:
Lower levels of cytosine methylation have been found in the liver cell DNA from non-obese diabetic (NOD) mice under hyperglycemic conditions. Because the Fourier transform-infrared (FT-IR) profiles of dry DNA samples are differently affected by DNA base composition, single-stranded form and histone binding, it is expected that the methylation status of the DNA could also affect its FT-IR profile. The DNA FT-IR signatures obtained from the liver cell nuclei of hyperglycemic and normoglycemic NOD mice of the same age were compared. Dried DNA samples were examined in an IR microspectroscope equipped with an all-reflecting objective (ARO) and adequate software. Changes in DNA cytosine methylation levels induced by hyperglycemia in mouse liver cells produced changes in the respective DNA FT-IR profiles, revealing modifications to the vibrational intensities and frequencies of several chemical markers, including the νas(-CH3) stretching vibrations of the 5-methylcytosine methyl group. A smaller band area, reflecting lower energy absorbed in the DNA, was found in the hyperglycemic mice and assumed to be related to the lower levels of -CH3 groups. Other spectral differences were found at 1700-1500 cm⁻¹ and in the fingerprint region, and a slight change in DNA conformation at the lower DNA methylation levels was suggested for the hyperglycemic mice. The changes that affect cytosine methylation levels certainly affect DNA-protein interactions and, consequently, gene expression in liver cells from the hyperglycemic NOD mice.
Abstract:
To determine the most adequate number and size of tissue microarray (TMA) cores for pleomorphic adenoma immunohistochemical studies. Eighty-two pleomorphic adenoma cases were distributed in 3 TMA blocks assembled in triplicate containing 1.0-, 2.0-, and 3.0-mm cores. Immunohistochemical analyses against cytokeratin 7, Ki67, p63, and CD34 were performed and subsequently evaluated with PixelCount, nuclear, and microvessel software applications. The 1.0-mm TMA yielded lower results than the 2.0- and 3.0-mm TMAs when compared with conventional whole-section slides. Possibly because of an increased amount of stromal tissue, the 3.0-mm cores presented a higher microvessel density. Comparing the results obtained with one, two, and three 2.0-mm cores, there was no difference between triplicate or duplicate TMAs and a single-core TMA. Considering the possible loss of cylinders during immunohistochemical reactions, 2.0-mm TMAs in duplicate are a more reliable approach for pleomorphic adenoma immunohistochemical studies.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
In this article we present a Bayesian analysis of the stochastic volatility (SV) model and a generalised form of it, with the goal of estimating the volatility of financial time series. Considering some special cases of SV models, we use Markov Chain Monte Carlo algorithms and the WinBUGS software to obtain posterior summaries for the different forms of SV models. We introduce some Bayesian discrimination techniques for choosing the best model to be used to estimate the volatilities and to forecast financial series. An empirical application of the methodology is presented using the IBOVESPA financial series.
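For reference, the basic SV model discussed above is usually written as y_t = exp(h_t/2) eps_t with a latent AR(1) log-volatility h_t = mu + phi (h_{t-1} - mu) + sigma_eta eta_t. The sketch below only simulates that generative model with illustrative parameter values; the posterior summaries in the article itself are obtained by MCMC in WinBUGS, which is not reproduced here.

```python
# Simulation of the basic SV model: y_t = exp(h_t / 2) * eps_t,
# h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t,  eps_t, eta_t ~ N(0, 1).
# Parameter values are illustrative, not estimates from the IBOVESPA series.
import numpy as np

def simulate_sv(T=1000, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    h[0] = rng.normal(mu, sigma_eta / np.sqrt(1 - phi**2))  # stationary start
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
    y = np.exp(h / 2) * rng.normal(size=T)   # observed returns
    return y, h                              # returns and latent log-volatility

returns, log_vol = simulate_sv()
print(returns[:5], np.exp(log_vol[:5] / 2))  # volatility is exp(h_t / 2)
```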
Abstract:
This article describes the design, implementation, and experiences with AcMus, an open and integrated software platform for room acoustics research, which comprises tools for the measurement, analysis, and simulation of rooms for music listening and production. Through the use of affordable hardware, such as laptops, consumer audio interfaces and microphones, the software allows the evaluation of relevant acoustical parameters with stable and consistent results, thus providing valuable information for the diagnosis of acoustical problems, as well as the possibility of simulating modifications to the room through analytical models. The system is open-source and based on a flexible and extensible Java plug-in framework, allowing for cross-platform portability, accessibility and experimentation, thus fostering collaboration among users, developers and researchers in the field of room acoustics.
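As an example of the kind of acoustical parameter such a platform evaluates, the sketch below estimates reverberation time from a measured impulse response via Schroeder backward integration and a linear fit on the decay curve. This is a generic textbook procedure written in Python for illustration, not AcMus code (AcMus itself is a Java plug-in framework), and the synthetic impulse response is a placeholder for a real measurement.

```python
# Generic reverberation-time estimate (T20 extrapolated to RT60) from an
# impulse response, via Schroeder backward integration. Not AcMus code.
import numpy as np

def reverberation_time(ir, fs, lo_db=-5.0, hi_db=-25.0):
    """Estimate RT60 from the -5 dB to -25 dB portion of the decay curve."""
    energy = ir.astype(float) ** 2
    edc = np.cumsum(energy[::-1])[::-1]           # Schroeder backward integral
    edc_db = 10.0 * np.log10(edc / edc[0])        # normalised decay curve [dB]

    t = np.arange(len(ir)) / fs
    mask = (edc_db <= lo_db) & (edc_db >= hi_db)  # evaluation range of the fit
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    return -60.0 / slope                          # time to decay by 60 dB

# Synthetic example: exponentially decaying noise standing in for a measured IR
fs = 48000
t = np.arange(int(1.5 * fs)) / fs
ir = np.random.default_rng(0).normal(size=t.size) * np.exp(-t / 0.25)
print(f"RT60 ~ {reverberation_time(ir, fs):.2f} s")
```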
Abstract:
INTRODUCTION: Tuberculosis indicators in the Republic of Guinea-Bissau are poor, as in most developing countries. OBJECTIVE: To study the epidemiological situation of the disease in the Republic of Guinea-Bissau and its Provinces in the period from 2000 to 2005. METHOD: Secondary data were collected from the National Programme for the Fight Against Leprosy and Tuberculosis for the period 2000-2005, and the annual reports of the Capital and of the Provinces of Guinea-Bissau were analysed to calculate the coefficients and rates of the indicators. RESULTS: The number of tuberculosis cases remained stable from 2000 (1,959 cases) to 2005 (1,888 cases). The percentage of pulmonary cases ranged from 96.0% to 98.8%, of which 55% were smear-positive. In 2005 the prevalence coefficient was 142.4/100,000, the incidence 131.3/100,000 and the mortality 16.8/100,000 inhabitants. The highest concentration of cases occurred in the Capital region. The cure rate ranged from 46.5% in 2000 to 69.6% in 2005, and the treatment default rate fell from 29.8% in 2000 to 12.1% in 2005. CONCLUSION: The indicators of the National Strategic Plan must be improved, especially with regard to active case finding, the decentralisation of patient care, the implementation of the DOTS strategy, and the need for an efficient information and notification system.
Abstract:
The objective was to analyse the evolution of the profile of health service utilisation between 2003 and 2008 in Brazil and its macro-regions. Data from the PNAD (National Household Sample Survey) were used. Health service utilisation was measured by the proportion of people who sought and received care in the two preceding weeks and by those who reported hospitalisation in the previous 12 months, according to SUS and non-SUS coverage. The socioeconomic characteristics of users, the type of care and of service, and the reasons for seeking care were analysed. The proportion of individuals who sought health services did not change between 2003 and 2008, nor did the share of those who obtained care (96%). The SUS accounted for 56.7% of the care provided, performing most hospitalisations, vaccinations and consultations, but only one third of dental consultations. In 2008, the gradient of decreasing use of SUS services with increasing income and education persisted. There was a decrease in the proportion of those who sought health services for preventive actions and an increase in demand related to dental problems, accidents and injuries, and rehabilitation. The pattern of SUS utilisation by region was inversely related to the proportion of individuals holding private health plans.
Abstract:
Background: Papillary thyroid carcinoma (PTC) is frequently associated with a RET gene rearrangement that generates a RET/PTC oncogene. RET/PTC is a fusion of the tyrosine kinase domain of RET to the 5′ portion of a different gene. This fusion results in a constitutively active MAPK pathway, which plays a key role in PTC development. The RET/PTC3 fusion is primarily associated with radiation-related PTC. Epidemiological studies show a lower incidence of PTC in radiation-exposed regions that are associated with an iodine-rich diet. Since the influence of excess iodine on the development of thyroid cancer is still unclear, the aim of this study is to evaluate the effect of high iodine concentrations on RET/PTC3-activated thyroid cells. Methods: PTC3-5 cells, a rat thyroid cell lineage harboring doxycycline-inducible RET/PTC3, were treated with 10⁻³ M NaI. Cell growth was analyzed by cell counting and the MTT assay. The expression and phosphorylation state of MAPK pathway-related (Braf, Erk, pErk, and pRet) and thyroid-specific (sodium-iodide symporter [Nis] and thyroid-stimulating hormone receptor [Tshr]) proteins were analyzed by Western blotting. Thyroid-specific gene expression was further analyzed by quantitative reverse transcription (RT)-polymerase chain reaction. Results: A significant inhibition of proliferation was observed, along with no significant variation in cell death rate, in the iodine-treated cells. Further, iodine treatment attenuated the loss of Nis and Tshr gene and protein expression induced by RET/PTC3 oncogene induction. Finally, iodine treatment reduced Ret and Erk phosphorylation, without altering Braf and Erk expression. Conclusion: Our results indicate an antioncogenic role for excess iodine during thyroid oncogenic activation. These findings contribute to a better understanding of the effect of iodine on thyroid follicular cells, particularly how it may play a protective role during RET/PTC3 oncogene activation.
Abstract:
When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated containing one linkage group and 21 markers with a fixed distance of 3 cM between them. In all, 700 F2 populations were randomly simulated, with sample sizes of up to 400 individuals, different combinations of dominant and co-dominant markers, and 10 or 20% missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criterion may be used, even for a reduced population size. In the case of a smaller proportion of dominant markers, any of the algorithms and criteria (except SALOD) investigated may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between markers increases and, in this case, the algorithms TRY and SER associated with RIPPLE under the LHMC criterion provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
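To make the ordering criteria concrete, the sketch below computes SARF and PARF for a candidate marker order from a matrix of pairwise recombination fractions; it is a generic illustration of the definitions with a made-up matrix, not the simulation code used in the study.

```python
# Illustrative computation of SARF and PARF for a candidate marker order.
# `rf` is a symmetric matrix of pairwise recombination fractions (made-up values).
import numpy as np

def sarf(order, rf):
    """Sum of adjacent recombination fractions (smaller is better)."""
    return sum(rf[order[i], order[i + 1]] for i in range(len(order) - 1))

def parf(order, rf):
    """Product of adjacent recombination fractions (also minimised)."""
    return float(np.prod([rf[order[i], order[i + 1]]
                          for i in range(len(order) - 1)]))

rf = np.array([[0.00, 0.03, 0.08, 0.12],
               [0.03, 0.00, 0.04, 0.09],
               [0.08, 0.04, 0.00, 0.05],
               [0.12, 0.09, 0.05, 0.00]])

correct = [0, 1, 2, 3]
shuffled = [0, 2, 1, 3]
print(sarf(correct, rf), sarf(shuffled, rf))  # the true order has the lower SARF
```

Searching over candidate orders for the one minimising such a criterion is the role played by the ordering algorithms and by RIPPLE-based verification compared in the study.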
Abstract:
The objective of this study was to compare the results of an on-farm test, named Somaticell, with the results of electronic cell counting, and to compare milk somatic cell count (SCC) readings among readers. The Somaticell test correctly determined the SCC in fresh quarter milk samples. The correlation between Somaticell and electronic enumeration of somatic cells was 0.92, with a coefficient of 0.82. Using a threshold of 205,000 cells/mL, the sensitivity and specificity for the determination of intramammary infections were 91.3% and 96.0%, respectively. The SCC was greater for milk samples from which major mastitis pathogens were recovered. Minor variation among readers was observed, most likely associated with the mixing procedure. However, the final analysis indicated that this variation was not significant and did not affect the number of samples classified as having subclinical mastitis. The on-farm test evaluated in this study showed adequate capacity to determine SCC in quarter milk samples and may be considered as an alternative for on-farm detection of subclinical mastitis.
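A small sketch of how agreement figures of this kind are typically derived: pairing on-farm SCC readings with a reference infection status, applying the 205,000 cells/mL threshold, and computing sensitivity and specificity. The paired data below are invented for illustration and are not the study's measurements.

```python
# Illustrative sensitivity/specificity computation for a threshold-based test
# (e.g. SCC at 205,000 cells/mL). The paired readings below are made up.
import numpy as np

THRESHOLD = 205_000  # cells/mL

def sensitivity_specificity(test_scc, reference_positive, threshold=THRESHOLD):
    """Classify samples by the test SCC and compare with the reference status."""
    test_positive = np.asarray(test_scc) >= threshold
    reference_positive = np.asarray(reference_positive, dtype=bool)

    tp = np.sum(test_positive & reference_positive)
    tn = np.sum(~test_positive & ~reference_positive)
    fn = np.sum(~test_positive & reference_positive)
    fp = np.sum(test_positive & ~reference_positive)

    sensitivity = tp / (tp + fn)   # infected quarters flagged by the test
    specificity = tn / (tn + fp)   # healthy quarters correctly left unflagged
    return sensitivity, specificity

# Invented example: on-farm readings and reference infection status
scc = [120_000, 350_000, 90_000, 500_000, 210_000, 180_000]
infected = [False, True, False, True, True, False]
print(sensitivity_specificity(scc, infected))
```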