948 results for crystallization screening data
Abstract:
Introduction: The multimodality environment demands a greater understanding of the imaging technologies used, their limitations, and how best to interpret the results, together with dose optimization, the introduction of new techniques, and attention to current and best practice. Incidental findings in the low-dose CT images obtained as part of the hybrid imaging process are an increasingly common phenomenon with advancing CT technology, with resulting ethical and medico-legal dilemmas; understanding the limitations of these procedures is therefore important when reporting images and recommending follow-up. A free-response observer performance study was used to evaluate lesion detection in low-dose CT images obtained during attenuation correction acquisitions for myocardial perfusion imaging on two hybrid imaging systems.
Abstract:
A detailed analysis of fabrics of the chilled margin of a thick dolerite dyke (Foum Zguid dyke, Southern Morocco) was performed in order to better understand the development of sub-fabrics during dyke emplacement and cooling. Anisotropy of magnetic susceptibility (AMS) data were complemented with measurements of paramagnetic and ferrimagnetic fabrics (measured with a high-field torque magnetometer), neutron texture, and microstructural analyses. The ferrimagnetic and AMS fabrics are similar, indicating that the ferrimagnetic minerals dominate the AMS signal. The paramagnetic fabric differs from the previous ones. Based on the crystallization timing of the different mineralogical phases, the paramagnetic fabric appears related to the upward flow, while the ferrimagnetic fabric rather reflects the late stage of dyke emplacement and cooling stresses. (C) 2014 Elsevier B.V. All rights reserved.
Abstract:
OBJECTIVE To evaluate the prevalence of self-medication in Brazil's adult population. METHODS Systematic review of cross-sectional population-based studies. The following databases were used: Medline, Embase, Scopus, ISI, CINAHL, Cochrane Library, CRD, Lilacs, SciELO, the Banco de Teses Brasileiras (Brazilian theses database, Capes) and files from the Portal Domínio Público (Brazilian Public Domain). In addition, the reference lists of relevant studies were examined to identify potentially eligible articles. No restrictions were applied in terms of publication date, language or publication status. Data related to publication, population, methods and prevalence of self-medication were extracted by three independent researchers. Methodological quality was assessed following eight criteria related to sampling, measurement and presentation of results. Prevalences were calculated from participants who had used at least one medication during the studies' recall periods. RESULTS The literature screening identified 2,778 records, of which 12 were included for analysis. Most studies were conducted in the Southeastern region of Brazil, after 2000, and with a 15-day recall period. Only five studies achieved high methodological quality; of these, one study had a 7-day recall period, in which the prevalence of self-medication was 22.9% (95%CI 14.6;33.9). The prevalence of self-medication in the three studies of high methodological quality with a 15-day recall period was 35.0% (95%CI 29.0;40.0, I2 = 83.9%) in the adult Brazilian population. CONCLUSIONS Despite differences in the methodologies of the included studies, the results of this systematic review indicate that a significant proportion of the adult Brazilian population self-medicates. It is suggested that future research projects assessing self-medication in Brazil standardize their methods.
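For context on how a pooled prevalence such as the 35.0% figure above is typically obtained, the following is a minimal Python sketch of an inverse-variance random-effects meta-analysis (DerSimonian-Laird estimator) on hypothetical study data; the abstract does not state which pooling method was actually used, so this is an illustrative assumption rather than the review's computation.

    import math

    # Hypothetical (prevalence, sample size) pairs -- not the reviewed studies' data.
    studies = [(0.33, 800), (0.38, 1200), (0.34, 950)]

    y = [p for p, n in studies]                    # observed prevalences
    v = [p * (1 - p) / n for p, n in studies]      # per-study variance of a proportion
    w = [1.0 / vi for vi in v]                     # fixed-effect (inverse-variance) weights

    # Fixed-effect pooled estimate and Cochran's Q heterogeneity statistic.
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))

    # DerSimonian-Laird between-study variance (tau^2) and I^2.
    k = len(studies)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0

    # Random-effects pooled prevalence with a 95% confidence interval.
    w_re = [1.0 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    print(f"pooled {y_re:.3f} (95%CI {y_re - 1.96 * se:.3f};{y_re + 1.96 * se:.3f}), I2 = {100 * i2:.1f}%")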
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
Patients scheduled for a magnetic resonance imaging (MRI) scan sometimes require screening for ferromagnetic Intra-Orbital Foreign Bodies (IOFBs). To assess this, they are required to fill out a screening protocol questionnaire before their scan. If it is established that a patient is at high risk, radiographic imaging is necessary. This review examines the literature to evaluate which imaging modality should be used to screen for IOFBs, considering that the eye is highly sensitive to ionising radiation and any dose should be minimised. Method: Several websites and books were searched for information, as follows: PubMed, Science Direct, Web of Knowledge and Google Scholar. The terms searched related to IOFB, ionising radiation, magnetic resonance imaging safety, image quality, effective dose, orbits and X-ray. Thirty-five articles were found; several were rejected due to age or irrelevance, and twenty-eight were eventually accepted. Results: There are several imaging techniques that can be used. Some articles investigated the use of ultrasound for the investigation of ferromagnetic IOFBs of the eye, and others discussed using Computed Tomography (CT) and X-ray. Some gaps in the literature were identified, mainly that there are no articles which discuss the lowest effective dose while maintaining adequate image quality for orbital imaging. Conclusion: X-ray is the best method to identify IOFBs. The only problem is that there is no research which highlights exposure factors that maintain sufficient image quality for viewing IOFBs while keeping the effective dose to the eye As Low As Reasonably Achievable (ALARA).
Abstract:
Purpose: To investigate whether standard X-ray acquisition factors for orbital radiographs are suitable for the detection of ferromagnetic intra-ocular foreign bodies in patients undergoing MRI. Method: 35 observers, at varied levels of education in radiography, attending a European Dose Optimisation EURASMUS Summer School were asked to score 24 images of varying acquisition factors against a clinical standard (reference image) using a two-alternative forced choice method. The observers were provided with 12 questions and a 5-point Likert scale. Statistical tests were used to validate the scale, and scale reliability was also measured. The images which scored equal to, or better than, the reference image (36) were ranked alongside their corresponding effective dose (E); the image with the lowest dose that scored equal to or better than the reference was considered to define the new optimum acquisition factors. Results: Four images emerged as equal to, or better than, the reference in terms of image quality. The images were then ranked in order of E. Only one image that scored the same as the reference had a lower dose. The reference image had a mean E of 3.31 μSv; the image that scored the same had an E of 1.8 μSv. Conclusion: Against the current clinical standard exposure factors of 70 kVp, 20 mAs and the use of an anti-scatter grid, one image proved to have a lower E whilst maintaining the same level of image quality and lesion visibility. It is suggested that the new exposure factors should be 60 kVp, 20 mAs, still including the use of an anti-scatter grid.
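The selection rule described above (retain only images whose visual grading score equals or exceeds the reference, then take the one with the lowest effective dose) can be expressed in a few lines; the Python sketch below uses hypothetical scores and doses purely to illustrate that logic, not the study's data.

    # Hypothetical (label, image-quality score, effective dose in microsieverts).
    REF_SCORE = 36      # assumed score of the reference image
    REF_DOSE = 3.31     # mean effective dose of the reference image (from the abstract)
    images = [("A", 34, 1.20), ("B", 36, 1.80), ("C", 38, 4.00), ("D", 36, 3.90)]

    # Keep images scoring equal to or better than the reference ...
    candidates = [img for img in images if img[1] >= REF_SCORE]
    # ... and rank them by effective dose, lowest first.
    candidates.sort(key=lambda img: img[2])

    label, score, dose = candidates[0]
    print(f"optimum acquisition: image {label} at {dose} uSv (reference: {REF_DOSE} uSv)")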
Abstract:
ABSTRACT OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, creates a best practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, financing budget) to analyze frontier shift in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best performance year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits; 34.0% in admissions; 12.0% in undergraduate students; 13.0% in multi-professional residents; 48.0% in graduate students; 7.0% in research projects; besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best practice frontier.
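For reference, the output-oriented, variable-returns-to-scale (BCC) envelopment problem that underlies this type of efficiency score can be written as below; this is only the textbook single-stage formulation, not the paper's full network/dynamic model with link and carry-over variables.

    \begin{aligned}
    \max_{\varphi,\,\lambda}\quad & \varphi \\
    \text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io}, \qquad i = 1,\dots,m \;\text{(inputs)}\\
    & \sum_{j=1}^{n} \lambda_j y_{rj} \ge \varphi\, y_{ro}, \qquad r = 1,\dots,s \;\text{(outputs)}\\
    & \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0 \;\text{(variable returns to scale)}
    \end{aligned}

Unit o lies on the best practice frontier when the optimal value of φ equals 1; a value above 1 gives the proportional output expansion needed to reach it.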
Abstract:
This paper addresses the calculation of derivatives of fractional order for non-smooth data. The effect of noise is avoided by adopting an optimization formulation using genetic algorithms (GAs). Given the flexibility of evolutionary schemes, a hierarchical GA, composed of two GAs in series, each with a distinct fitness function, is established.
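As background to what is being computed, a common discrete definition of the fractional-order derivative is the Grünwald-Letnikov approximation, sketched below in Python on a hypothetical noise-free signal; the GA-based optimization scheme that the paper proposes for handling noisy data is not reproduced here.

    def gl_fractional_derivative(f, alpha, h):
        # Grunwald-Letnikov approximation of the order-alpha derivative of the
        # uniformly sampled values f (spacing h); returns one value per sample.
        n = len(f)
        # Weights w_k = (-1)^k * C(alpha, k), built with the recurrence
        # w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k).
        w = [1.0]
        for k in range(1, n):
            w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
        # D^alpha f(t_i) ~ h^(-alpha) * sum_{k=0..i} w_k * f(t_{i-k})
        return [sum(w[k] * f[i - k] for k in range(i + 1)) / h ** alpha
                for i in range(n)]

    # Hypothetical test signal f(t) = t^2 on [0, 1].
    h = 0.01
    f = [(i * h) ** 2 for i in range(101)]
    d_half = gl_fractional_derivative(f, 0.5, h)
    print(d_half[-1])  # close to Gamma(3)/Gamma(2.5) ~ 1.505, the exact half-derivative at t = 1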
Abstract:
The morpho-structural evolution of oceanic islands results from competition between volcano growth and partial destruction by mass-wasting processes. We present here a multi-disciplinary study of the successive stages of development of Faial (Azores) during the last 1 Myr. Using a high-resolution digital elevation model (DEM) and new K/Ar, tectonic, and magnetic data, we reconstruct the rapidly evolving topography at successive stages, in response to complex interactions between volcanic construction and mass wasting, including the development of a graben. We show that: (1) sub-aerial evolution of the island first involved the rapid growth of a large elongated volcano at ca. 0.85 Ma, followed by its partial destruction over half a million years; (2) beginning at about 360 ka, a new small edifice grew on the NE of the island, and was subsequently cut by normal faults responsible for initiation of the graben; (3) after an apparent pause of ca. 250 kyr, the large Central Volcano (CV) developed on the western side of the island at ca. 120 ka, accumulating a thick pile of lava flows in less than 20 kyr, which were partly channelized within the graben; (4) the period between 120 ka and 40 ka is marked by widespread deformation at the island scale, including westward propagation of faulting and associated erosion of the graben walls, which produced sedimentary deposits; subsequent growth of the CV at 40 ka was then constrained within the graben, with lava flowing onto the sediments up to the eastern shore; (5) the island's evolution during the Holocene involves basaltic volcanic activity along the main southern faults and pyroclastic eruptions associated with the formation of a caldera volcano-tectonic depression. We conclude that the whole evolution of Faial Island has been characterized by successive short volcanic pulses probably controlled by brief episodes of regional deformation. Each pulse has been separated by considerable periods of volcanic inactivity during which the Faial graben gradually developed. We propose that the volume loss associated with sudden magma extraction from a shallow reservoir in different episodes triggered incremental downward graben movement, as observed historically, when an immediate vertical collapse of up to 2 m occurred along the western segments of the graben at the end of the Capelinhos eruptive crisis (1957-58).
Abstract:
Conference: CONTROLO’2012 - 16-18 July 2012 - Funchal
Abstract:
Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, there is a need to install/configure/access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to provide more flexibility in adjusting such configurations according to the application characteristics. Furthermore, the composition of the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results from using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
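To make the user-facing part of the model concrete, a MapReduce application reduces to a pair of functions such as the word-count example below in Python, together with a trivial single-process driver standing in for the framework's shuffle phase; this is a generic illustration, not the AWARD framework's actual API.

    from collections import defaultdict

    # User-defined map function: emit (word, 1) for every word in an input line.
    def map_fn(line):
        for word in line.lower().split():
            yield word, 1

    # User-defined reduce function: sum the counts emitted for one key.
    def reduce_fn(word, counts):
        return word, sum(counts)

    # Minimal single-process driver: group map outputs by key, then reduce each group.
    def run_mapreduce(lines):
        groups = defaultdict(list)
        for line in lines:
            for key, value in map_fn(line):
                groups[key].append(value)
        return dict(reduce_fn(k, v) for k, v in groups.items())

    print(run_mapreduce(["map reduce on text", "text mining with map reduce"]))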
Abstract:
Feature selection is a central problem in machine learning and pattern recognition. On large datasets (in terms of dimension and/or number of instances), using search-based or wrapper techniques can be computationally prohibitive. Moreover, many filter methods based on relevance/redundancy assessment also take a prohibitively long time on high-dimensional datasets. In this paper, we propose efficient unsupervised and supervised feature selection/ranking filters for high-dimensional datasets. These methods use low-complexity relevance and redundancy criteria, applicable to supervised, semi-supervised, and unsupervised learning, and are able to act as pre-processors for computationally intensive methods, focusing their attention on smaller subsets of promising features. The experimental results, with up to 10^5 features, show the time efficiency of our methods, with lower generalization error than state-of-the-art techniques, while being dramatically simpler and faster.
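One simple instance of a low-complexity relevance/redundancy filter is a greedy ranking that rewards correlation with the target and penalizes correlation with already selected features, as in the Python sketch below; the criteria actually proposed in the paper are not given in this abstract, so this is an illustrative stand-in rather than the authors' method.

    import numpy as np

    def greedy_relevance_redundancy(X, y, k):
        # Rank k features by (relevance to y) minus (mean redundancy with the
        # already selected features), both measured with absolute Pearson correlation.
        n_features = X.shape[1]
        relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)])
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            scores = []
            for j in range(n_features):
                if j in selected:
                    scores.append(-np.inf)
                    continue
                redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected])
                scores.append(relevance[j] - redundancy)
            selected.append(int(np.argmax(scores)))
        return selected

    # Tiny synthetic example: feature 0 drives the target, feature 1 is a noisy copy of it.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)
    y = X[:, 0] + 0.5 * rng.normal(size=200)
    print(greedy_relevance_redundancy(X, y, 3))  # feature 0 first; feature 1 penalized as redundant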
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
Most of the traditional software and database development approaches tend to be serial, not evolutionary and certainly not agile, especially in data-oriented aspects. Most of the more commonly used methodologies are strict, meaning they are composed of several stages, each with very specific associated tasks. A clear example is the Rational Unified Process (RUP), divided into Business Modeling, Requirements, Analysis & Design, Implementation, Testing and Deployment. But what happens when the need for a well-designed and structured plan meets the reality of a small starting company that aims to build an entire user experience solution? Here resource control and time productivity are vital, requirements are in constant change, and so is the product itself. In order to succeed in this environment, a highly collaborative and evolutionary development approach is mandatory. Constantly changing requirements imply an iterative development process. The project focus is on Data Warehouse development and business modeling, which is usually a tricky area: business knowledge belongs to the enterprise, and how it works, its goals, and what is relevant for analysis are internal business processes. Throughout this document it is explained why Agile Modeling development was chosen, and how an iterative and evolutionary methodology allowed for reasonable planning and documentation while permitting development flexibility, from idea to product. More importantly, it shows how this was applied to the development of a Retail-Focused Data Warehouse: a productized Data Warehouse built on the knowledge of not one but several clients' needs, one that aims not just to store the usual business areas but to create an innovative set of business metrics by joining them with store environment analysis, converting Business Intelligence into Actionable Business Intelligence.
Abstract:
Dissertation presented as a partial requirement for obtaining the Master's degree in Statistics and Information Management