852 results for 080109 Pattern Recognition and Data Mining


Relevance: 100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

The research described in this thesis was motivated by the need for a robust model capable of representing 3D data obtained with 3D sensors, which are inherently noisy. In addition, time constraints have to be considered, as these sensors can provide a 3D data stream in real time. This thesis proposed the use of Self-Organizing Maps (SOMs) as a 3D representation model. In particular, we proposed the use of the Growing Neural Gas (GNG) network, which has been successfully used for clustering, pattern recognition and topology representation of multi-dimensional data. Until now, Self-Organizing Maps have been computed primarily offline, and their application to 3D data has mainly focused on noise-free models, without considering time constraints. A hardware implementation is proposed that leverages the computing power of modern GPUs, taking advantage of the paradigm known as General-Purpose Computing on Graphics Processing Units (GPGPU). The proposed methods were applied to different problems and applications in the area of computer vision, such as the recognition and localization of objects, visual surveillance, and 3D reconstruction.
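As a rough illustration of the GNG network mentioned above, the following is a minimal pure-Python sketch of the classical algorithm in Fritzke's formulation. The hyperparameter values, class name and interface are illustrative choices rather than the thesis's settings, and the GPGPU parallelization that is the thesis's actual contribution is omitted:

```python
import random

class GrowingNeuralGas:
    """Minimal Growing Neural Gas sketch (Fritzke, 1995) for learning the
    topology of noisy point samples. Hyperparameter names follow the
    original paper; values here are illustrative only."""

    def __init__(self, dim, eps_b=0.2, eps_n=0.006, age_max=50,
                 lam=100, alpha=0.5, d=0.995, max_nodes=100, seed=0):
        rng = random.Random(seed)
        self.w = [[rng.random() for _ in range(dim)] for _ in range(2)]
        self.err = [0.0, 0.0]        # accumulated squared error per node
        self.edges = {}              # (i, j) with i < j  ->  edge age
        self.eps_b, self.eps_n, self.age_max = eps_b, eps_n, age_max
        self.lam, self.alpha, self.d = lam, alpha, d
        self.max_nodes, self.step = max_nodes, 0

    def _neighbors(self, i):
        return [a if b == i else b for (a, b) in self.edges if i in (a, b)]

    @staticmethod
    def _toward(w, x, eps):
        return [wi + eps * (xi - wi) for wi, xi in zip(w, x)]

    def fit_sample(self, x):
        self.step += 1
        d2 = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in self.w]
        order = sorted(range(len(self.w)), key=d2.__getitem__)
        s1, s2 = order[0], order[1]          # winner and runner-up
        for e in self.edges:                 # age edges incident to the winner
            if s1 in e:
                self.edges[e] += 1
        self.err[s1] += d2[s1]
        self.w[s1] = self._toward(self.w[s1], x, self.eps_b)
        for n in self._neighbors(s1):        # drag topological neighbors slightly
            self.w[n] = self._toward(self.w[n], x, self.eps_n)
        self.edges[tuple(sorted((s1, s2)))] = 0   # refresh winner-runner-up edge
        self.edges = {e: a for e, a in self.edges.items() if a <= self.age_max}
        if self.step % self.lam == 0 and len(self.w) < self.max_nodes:
            q = max(range(len(self.w)), key=self.err.__getitem__)
            nbrs = self._neighbors(q)
            if nbrs:                         # insert a node where error is largest
                f = max(nbrs, key=self.err.__getitem__)
                self.w.append([(a + b) / 2 for a, b in zip(self.w[q], self.w[f])])
                r = len(self.w) - 1
                self.edges.pop(tuple(sorted((q, f))), None)
                self.edges[tuple(sorted((q, r)))] = 0
                self.edges[tuple(sorted((f, r)))] = 0
                self.err[q] *= self.alpha
                self.err[f] *= self.alpha
                self.err.append(self.err[q])
        self.err = [e * self.d for e in self.err]  # global error decay
```

Feeding the stream of sensor points one at a time through `fit_sample` grows a graph whose nodes and edges approximate the data topology; the thesis's GPU version parallelizes exactly the per-sample distance and update steps shown here.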

Relevance: 100.00%

Abstract:

Ultrasound is the first-line examination for the identification and characterization of adnexal tumors. Several methods of differential diagnosis have been described, including the observer's subjective assessment, simple descriptive indices, and mathematically derived indices such as logistic regression models; subjective assessment by an experienced examiner remains the best method for discriminating between malignant and benign tumors. However, given the subjectivity inherent in this assessment, it became necessary to establish a standardized nomenclature and a classification that would facilitate the reporting of results and the corresponding surveillance recommendations. The aim of this article is to summarize and compare different methods for the assessment and classification of adnexal tumors, namely the models of the International Ovarian Tumor Analysis (IOTA) group and the Gynecologic Imaging Report and Data System (GI-RADS) classification, in terms of diagnostic performance and usefulness in clinical practice.

Relevance: 100.00%

Abstract:

Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. Rainflow counting methods were originally developed to assess fatigue damage associated with mechanical cycling where creep of the material under load was not considered a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling. In this case, fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell time estimation for use with time-dependent fatigue damage models. To demonstrate its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to provide data reduction of the rainflow counting results. This algorithm uses a damage model and a statistical test to determine which of the resulting cycles are statistically insignificant at a given confidence level. This makes the resulting data file smaller and allows a simplified load history to be reconstructed.
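The study's range-based, dwell-aware variant is not reproduced here, but the classical three-point rainflow rule it extends can be sketched as follows. This is a simplified illustration: residual (unclosed) half-cycles are weighted 0.5, and the function name and output tuple layout are my own choices:

```python
def rainflow_cycles(series):
    """Count closed cycles in a load history using the three-point rainflow
    rule. Returns (range, mean, count) tuples; residual half-cycles from the
    unclosed stack are counted with weight 0.5."""
    # 1. keep only turning points (local extrema) of the history
    pts = [series[0]]
    for x in series[1:]:
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x          # same direction: extend the excursion
        else:
            pts.append(x)
    # 2. scan turning points, closing inner cycles as they appear
    stack, cycles = [], []
    for p in pts:
        stack.append(p)
        while len(stack) >= 3:
            x1, x2, x3 = stack[-3], stack[-2], stack[-1]
            y = abs(x2 - x1)     # earlier (inner) range
            x = abs(x3 - x2)     # most recent range
            if y <= x:
                # inner range is enclosed: count it as one full cycle
                cycles.append((y, (x1 + x2) / 2.0, 1.0))
                del stack[-3:-1]
            else:
                break
    # 3. the remaining stack is the residue, counted as half-cycles
    for a, b in zip(stack, stack[1:]):
        cycles.append((abs(b - a), (a + b) / 2.0, 0.5))
    return cycles
```

Each counted cycle's range and mean feed a fatigue damage model; the study's extension additionally attaches a dwell-time estimate to each extracted range so that time-dependent (creep) damage models can be applied.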

Relevance: 100.00%

Abstract:

This work aimed to systematize and analyze the information available in the literature on seedling production techniques for six native and exotic forest species in the Amazon Biome.

Relevance: 100.00%

Abstract:

Nowadays, one of the most important areas of interest in archaeology is the characterization of submerged cultural heritage. The Mediterranean Sea is rich in archaeological findings due to storms, accidents and naval battles since prehistoric times. Chemical analysis of submerged materials is an extremely valuable source of information on the origin and provenance of the wrecks, as well as on the raw materials employed in the manufacture of the objects found at these sites. Nevertheless, it is sometimes not possible to extract the archaeological material from the marine environment, due to the size of the sample, legislation, or preservation concerns. In these cases, in-situ analysis becomes the only alternative for obtaining information. Despite this demand, no analytical techniques have been available for the in-situ chemical characterization of underwater materials. The versatility of laser-induced breakdown spectroscopy (LIBS) has been successfully tested in oceanography [1]. Advantages such as rapid, in-situ analysis with no sample preparation make LIBS a suitable alternative for field measurements. To further exploit the inherent advantages of the technology, a mobile fiber-based LIBS platform capable of performing remote measurements at ranges of up to 50 meters has been designed for the recognition and identification of artworks in underwater archaeological shipwrecks. The LIBS prototype featured both single-pulse (SP-LIBS) and multi-pulse (MP-LIBS) excitation [2]. Multi-pulse excitation increased the laser beam energy transmitted through the optical fiber (up to 95 mJ). This excitation mode improves the performance of the equipment in terms of an extended range of analysis (to a depth of 50 m) and a broader variety of samples that can be analyzed (e.g., rocks, marble, ceramics and concrete).
In the present work, the design and construction considerations of the instrument are reported, and its performance is discussed on the basis of the spectral response, the remote irradiance achieved over the range of analysis and its influence on plasma properties, as well as the effects of the laser pulse duration and the purge gas on the LIBS signal. In addition, to check the reliability and reproducibility of the instrument for field analysis, several robustness tests were performed outside the lab. Finally, the capability of this instrument was successfully demonstrated at an underwater archaeological shipwreck (San Pedro de Alcántara, Málaga).

Relevance: 100.00%

Abstract:

The Cliff Mine, an archaeological site on the Keweenaw Peninsula of Michigan, is the location of the first successful attempt to mine native copper in North America. Under the management of the Pittsburgh & Boston Mining Company from 1845 to 1879, two-thirds of the Cliff's mineral output was in the form of mass copper, some pieces of which weighed over 5 tons when removed from the ground. The unique nature of mass copper, and the Cliff Mine's handling of it, make it one of the best examples of early mining processes in the Keweenaw District. Mass copper constituted only 2% of the entire product of the Lake Superior copper districts, and the story of early mining on the Peninsula is generally overshadowed by later, longer-running mines such as the Calumet & Hecla and Quincy Mining Companies. Operating into the mid-twentieth century, the size and duration of these later mines would come to define the region, though they would not have been possible without the Cliff's early success. Research on the Cliff Mine has previously focused on social and popular history, neglecting the structural remains. However, these remains are physical clues to the technical processes that defined early mining on the Keweenaw. Through archaeological investigations, these processes and their associated networks were documented as part of the 2010 Michigan Technological Archaeology Field School's curriculum. The project will create a visual representation of these processes using Geographic Information Systems software. This map will be a useful aid in future research, community engagement and possible future interpretive planning.

Relevance: 100.00%

Abstract:

Oxytocin (OT) plays a key role in the mediation of social and stress behaviors across many species; however, the mechanism is still unclear. The present study investigated the influence of prenatal levels of mesotocin (MT; avian homologue of OT) on postnatal social and stress behavior in Northern bobwhite quail. Experiment one determined endogenous levels of MT during prenatal development using an enzyme-linked immunoassay kit. Experiment two examined the influence of increased MT during prenatal development on chicks' individual recognition ability and stress response to a novel environment. Experiment one showed MT levels increased significantly throughout embryonic development. Experiment two showed significant differences in stress behavior for chicks with increased MT during prenatal development; however, no significant differences were found for social behavior. This study suggests MT serves different functions depending on the stage of embryonic development and that increasing MT levels affects postnatal stress behavior, but not social behavior.

Relevance: 100.00%

Abstract:

Background: Diagnostic decision-making combines System 1 (intuitive or pattern-recognition) and System 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the levels of System 1 and System 2 thinking among medical students in pre-clinical and clinical programs. Methods: The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 and 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree were studied. Results: Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking), while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3-13% gave incorrect answers that were neither the analytical nor the intuitive responses. Non-native English-speaking students (n = 11) had a lower mean number of correct answers than native English speakers (n = 117; 1.0 vs 2.12, respectively; p < 0.01). As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students. Conclusions: Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While CRT performance makes no claim about their future expertise as clinicians, the test may be used to help students understand the importance of awareness and regulation of their thinking processes in clinical practice.

Relevance: 100.00%

Abstract:

The key functional operability in the pre-Lisbon PJCCM pillar of the EU is the exchange of intelligence and information amongst the law enforcement bodies of the EU. The twin issues of data protection and data security within what was the EU's third-pillar legal framework therefore come to the fore. With the Lisbon Treaty reform of the EU, the increased role of the Commission in PJCCM policy areas, and the integration of the PJCCM provisions with what have traditionally been the Pillar I activities of Frontex, the opportunity arises for streamlining the data protection and data security provisions of the law enforcement bodies of the post-Lisbon EU. This is recognised by the Commission in their drafting of an amending regulation for Frontex, when they say that they would prefer "to return to the question of personal data in the context of the overall strategy for information exchange to be presented later this year and also taking into account the reflection to be carried out on how to further develop cooperation between agencies in the justice and home affairs field as requested by the Stockholm programme." The literature published on this topic has, for the most part, focused on the data protection provisions in Pillar I, EC. While the focus of research has recently shifted to the previously Pillar III PJCCM provisions on data protection, a more focused analysis of the interlocking issues of data protection and data security needs to be made in the context of the law enforcement bodies, particularly those based in the pre-Lisbon third pillar. This paper contributes to that debate, arguing that a post-Lisbon review of both the data protection and data security provisions is required, not only to reinforce individual rights but also to support inter-agency operability in combating cross-border EU crime.
The EC's provisions on data protection, as enshrined in Directive 95/46/EC, do not apply to the legal frameworks covering developments within the third pillar of the EU. Even Council Framework Decision 2008/977/JHA, which is supposed to cover data protection provisions within PJCCM, expressly states that its provisions do not apply to "Europol, Eurojust, the Schengen Information System (SIS)" or to the Customs Information System (CIS). In addition, the post-Treaty of Prüm provisions covering the sharing of DNA profiles, dactyloscopic data and vehicle registration data pursuant to Council Decision 2008/615/JHA are not covered by the provisions of the 2008 Framework Decision. As stated by Hijmans and Scirocco, the regime is "best defined as a patchwork of data protection regimes", with "no legal framework which is stable and unequivocal, like Directive 95/46/EC in the First pillar". Data security issues are also key to the sharing of data in organised crime and counter-terrorism situations. This article critically analyses the current legal framework for data protection and security within the third pillar of the EU.

Relevance: 100.00%

Abstract:

It is widely accepted that edema occurs early in the ischemic zone and persists in stable form for at least 1 week after myocardial ischemia/reperfusion. However, there are no longitudinal studies spanning the very early (minutes) to late (1 week) reperfusion stages that confirm this phenomenon. This study sought to perform a comprehensive longitudinal imaging and histological characterization of the edematous reaction after experimental myocardial ischemia/reperfusion. The study population consisted of 25 instrumented Large White pigs (30 kg to 40 kg). Closed-chest 40-min ischemia/reperfusion was performed in 20 pigs, which were sacrificed at 120 min (n = 5), 24 h (n = 5), 4 days (n = 5), and 7 days (n = 5) after reperfusion and processed for histological quantification of myocardial water content. Cardiac magnetic resonance (CMR) scans with T2-weighted short-tau inversion recovery and T2-mapping sequences were performed at every follow-up stage until sacrifice. Five additional pigs sacrificed after baseline CMR served as controls. In all pigs, reperfusion was associated with a significant increase in T2 relaxation times in the ischemic region. On 24-h CMR, ischemic-myocardium T2 times returned to normal values (similar to those seen pre-infarction). Thereafter, ischemic-myocardium T2 times in CMR performed on days 4 and 7 after reperfusion progressively and systematically increased. On day-7 CMR, T2 relaxation times were as high as those observed at reperfusion. Myocardial water content analysis in the ischemic region showed a parallel bimodal pattern: two high-water-content peaks at reperfusion and at day 7, and a significant decrease at 24 h. Contrary to the accepted view, myocardial edema during the first week after ischemia/reperfusion follows a bimodal pattern. The initial wave appears abruptly upon reperfusion and dissipates at 24 h. Conversely, the deferred wave of edema appears progressively over the days after ischemia/reperfusion and is maximal around day 7.

Relevance: 100.00%

Abstract:

The research activities involved the application of Geomatic techniques in the Cultural Heritage field, along two themes. First, the application of high-precision surveying techniques for the restoration and interpretation of relevant monuments and archaeological finds. The main case concerns the activities for the generation of a high-fidelity 3D model of the Fountain of Neptune in Bologna. In this work, aimed at the restoration of the artifact, both the geometric and radiometric aspects were crucial. The final product was the basis of a 3D information system, a shared tool through which the different figures involved in the restoration activities contributed in a multidisciplinary approach. Second, the arrangement of 3D databases for a Building Information Modeling (BIM) approach, in a process which involves the generation and management of digital representations of the physical and functional characteristics of historical buildings, towards a so-called Historical Building Information Model (HBIM). A first application was conducted for the church of San Michele in Acerboli in Santarcangelo di Romagna. The survey was performed by integrating classical and modern Geomatic techniques, and the point cloud representing the church was used for the development of an HBIM model, in which the relevant information connected to the building could be stored and georeferenced. A second application concerns the domus of Obellio Firmo in Pompeii, also surveyed by integrating classical and modern Geomatic techniques. A historical analysis permitted the definition of construction phases and the organization of a database of materials and constructive elements. The goal is to obtain a federated model able to manage the different aspects: documental, analytical and reconstructive.

Relevance: 100.00%

Abstract:

Hadrontherapy employs high-energy beams of charged particles (protons and heavier ions) to treat deep-seated tumours: these particles have a favourable depth-dose distribution in tissue, characterized by a low dose in the entrance channel and a sharp maximum (Bragg peak) near the end of their path. In these treatments nuclear interactions have to be considered: beam particles can fragment in the human body, releasing a non-zero dose beyond the Bragg peak, while fragments of human-body nuclei can modify the dose released in healthy tissues. These effects remain open questions given the lack of relevant cross-section data. Space radioprotection can also profit from fragmentation cross-section measurements: interest in long-term manned space missions beyond Low Earth Orbit has been growing in recent years, but such missions must cope with major health risks due to space radiation. To this end, risk models are under study; however, large gaps in fragmentation cross-section data currently prevent an accurate benchmark of deterministic and Monte Carlo codes. To fill these gaps, the FOOT (FragmentatiOn Of Target) experiment was proposed. It is composed of two independent and complementary setups: an Emulsion Cloud Chamber and an electronic setup composed of several subdetectors providing redundant measurements of the kinematic properties of the fragments produced in nuclear interactions between a beam and a target. FOOT aims to measure double-differential cross sections in both angle and kinetic energy, which is the most complete information for addressing the existing questions. In this Ph.D. thesis, the development of the Trigger and Data Acquisition system for the FOOT electronic setup and a first analysis of 400 MeV/u 16O beam on Carbon target data, acquired in July 2021 at GSI (Darmstadt, Germany), are presented. Where possible, a comparison with other available measurements is also reported.
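Schematically, a double-differential cross section of the kind FOOT targets is extracted from fragment counts per (angle, energy) bin, normalized by the number of beam particles, the target areal density, the bin widths and the detection efficiency. The sketch below shows this standard normalization only; all numerical values (graphite target, counts, efficiency) are illustrative placeholders, not FOOT data:

```python
# Schematic extraction of a double-differential fragmentation cross section
# d^2(sigma) / (dOmega dE) from binned fragment counts.

N_A = 6.02214076e23   # Avogadro's number [1/mol]

def targets_per_cm2(density_g_cm3, thickness_cm, molar_mass_g_mol):
    """Areal density of target nuclei [nuclei/cm^2] for a thin target."""
    return density_g_cm3 * thickness_cm * N_A / molar_mass_g_mol

def ddxs(counts, n_beam, n_target_cm2, d_omega_sr, d_e_mev, efficiency):
    """Cross section per (solid-angle, energy) bin [cm^2 / (sr * MeV)]:
    counts corrected for efficiency, normalized to beam, target and bin size."""
    return counts / (n_beam * n_target_cm2 * d_omega_sr * d_e_mev * efficiency)

# Toy example: a graphite target and a single (angle, energy) bin.
n_t = targets_per_cm2(density_g_cm3=1.83, thickness_cm=0.5,
                      molar_mass_g_mol=12.011)
xs = ddxs(counts=1200, n_beam=1e9, n_target_cm2=n_t,
          d_omega_sr=1e-3, d_e_mev=10.0, efficiency=0.8)
```

In practice the efficiency itself depends on angle and energy and is obtained from detector simulation, and thin-target corrections, background subtraction and unfolding are applied on top of this bare normalization.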

Relevance: 100.00%

Abstract:

The term Artificial Intelligence has acquired a lot of baggage since its introduction, and in its current incarnation it is synonymous with Deep Learning. The sudden availability of data and computing resources has opened the gates to myriad applications. Not all are created equal, though, and problems can arise, especially in fields not closely related to the tasks of the tech companies that spearheaded DL. The perspective of practitioners seems to be changing, however. Human-Centric AI has emerged in the last few years as a new way of thinking about DL and AI applications from the ground up, with special attention to their relationship with humans. The goal is to design systems that can gracefully integrate into established workflows, as in many real-world scenarios AI may not be good enough to completely replace humans; often this replacement may even be unneeded or undesirable. Another important perspective comes from Andrew Ng, a DL pioneer, who recently started shifting the focus of development from "better models" towards better, and smaller, data. He calls his approach Data-Centric AI. Without downplaying the importance of pushing the state of the art in DL, we must recognize that if the goal is creating a tool for humans to use, more raw performance may not align with more utility for the final user. A Human-Centric approach is compatible with a Data-Centric one, and we find that the two overlap nicely when human expertise is used as the driving force behind data quality. This thesis documents a series of case studies where these approaches were employed, to different extents, to guide the design and implementation of intelligent systems. We found that human expertise proved crucial in improving datasets and models. The last chapter includes a slight deviation, with studies on the pandemic, still preserving the human- and data-centric perspective.