978 results for Open Access Data
Abstract:
Purpose: Traditional patient-specific IMRT QA measurements are labor intensive and consume machine time. Calculation-based IMRT QA methods are typically not comprehensive. We have developed a comprehensive calculation-based IMRT QA method to detect uncertainties introduced by the initial dose calculation, the data transfer through the Record-and-Verify (R&V) system, and various aspects of the physical delivery. Methods: We recomputed the treatment plans in the patient geometry for 48 cases using data from the R&V system and from the delivery unit to calculate the “as-transferred” and “as-delivered” doses, respectively. These data were sent to the original TPS to verify transfer and delivery, or to a second TPS to verify the original calculation. For each dataset we examined the dose computed from the R&V record (RV) and from the delivery records (Tx), and the dose computed with a second verification TPS (vTPS). Each verification dose was compared to the clinical dose distribution using 3D gamma analysis and by comparison of mean doses and ROI-specific dose levels in the target volumes. Plans were also compared to absolute and relative IMRT QA dose measurements. Results: The average 3D gamma passing percentages using 3%-3mm, 2%-2mm, and 1%-1mm criteria were 100.0 (σ=0.0), 100.0 (σ=0.0), and 100.0 (σ=0.1) for the RV plan; 100.0 (σ=0.0), 100.0 (σ=0.0), and 99.0 (σ=1.4) for the Tx plan; and 99.3 (σ=0.6), 97.2 (σ=1.5), and 79.0 (σ=8.6) for the vTPS plan. When comparing target volume doses in the RV, Tx, and vTPS plans to the clinical plans, the average ratios of ROI mean doses were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.990 (σ=0.009), and of ROI-specific dose levels were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.980 (σ=0.043), respectively. 
Comparing the clinical, RV, Tx, and vTPS calculated doses to the IMRT QA measurements for all 48 patients, the average ratios for absolute doses were 0.999 (σ=0.013), 0.998 (σ=0.013), 0.999 (σ=0.015), and 0.990 (σ=0.012), respectively, and the average 2D gamma (5%-3mm) passing percentages for relative doses for 9 patients were 99.36 (σ=0.68), 99.50 (σ=0.49), 99.13 (σ=0.84), and 98.76 (σ=1.66), respectively. Conclusions: Together with mechanical and dosimetric QA, our calculation-based IMRT QA method promises to minimize the need for patient-specific QA measurements by identifying outliers in need of further review.
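The gamma analysis used above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion: a reference point passes if some nearby evaluated point agrees within both tolerances. A minimal brute-force sketch for a 1D dose profile, with global normalization to the reference maximum; the function name and normalization choice are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_pct=3.0, dta_mm=3.0):
    """Brute-force global gamma passing rate for 1D dose profiles.

    ref, ev: dose arrays sampled on the same grid; spacing_mm: grid spacing.
    dose_pct: dose criterion as a percent of the reference maximum (global
    normalization); dta_mm: distance-to-agreement criterion.
    """
    ref = np.asarray(ref, dtype=float)
    ev = np.asarray(ev, dtype=float)
    dose_crit = dose_pct / 100.0 * ref.max()
    x = np.arange(len(ref)) * spacing_mm  # spatial positions in mm
    gammas = []
    for i, d_ref in enumerate(ref):
        # Gamma at a reference point is the minimum combined
        # dose/distance metric over all evaluated points.
        dist_term = (x - x[i]) / dta_mm
        dose_term = (ev - d_ref) / dose_crit
        gammas.append(np.sqrt(dist_term**2 + dose_term**2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

A 3%-3mm criterion corresponds to dose_pct=3.0, dta_mm=3.0; identical distributions score 100%.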
Abstract:
The overarching goal of the Pathway Semantics Algorithm (PSA) is to improve the in silico identification of clinically useful hypotheses about molecular patterns in disease progression. By framing biomedical questions within a variety of matrix representations, PSA has the flexibility to analyze combined quantitative and qualitative data over a wide range of stratifications. The resulting hypothetical answers can then move to in vitro and in vivo verification, research assay optimization, clinical validation, and commercialization. Herein PSA is shown to generate novel hypotheses about the significant biological pathways in two disease domains, shock/trauma and hemophilia A, and these hypotheses are validated experimentally in the latter. The PSA matrix algebra approach identified differential molecular patterns in biological networks over time and outcome that would not be easily found through direct assays, literature searches, or database searches. In this dissertation, Chapter 1 provides a broad overview of the background and motivation for the study, followed by Chapter 2 with a literature review of relevant computational methods. Chapters 3 and 4 describe PSA for node and edge analysis, respectively, and apply the method to disease progression in shock/trauma. Chapter 5 demonstrates the application of PSA to hemophilia A and its validation with experimental results. The work is summarized in Chapter 6, followed by extensive references and an Appendix with additional material.
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis tasks such as de novo assembly, mapping, and variant detection are far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct the errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM’s performance does not rely on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake. For uniform data, MTM showed better performance than Quake and comparable results to Hammer. By making better error corrections with MTM, the quality of downstream analyses, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (
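The abstract does not detail MTM's algorithm. As background, the k-mer-spectrum idea behind read correctors such as Quake can be sketched as follows: count k-mers across all reads, treat sufficiently frequent k-mers as trusted, and repair a read by trying single-base substitutions until all of its k-mers are trusted. Function names, k, and the count threshold here are illustrative assumptions:

```python
from collections import Counter

def kmers(seq, k):
    """All overlapping k-mers of a sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def trusted_kmers(reads, k, min_count=2):
    """K-mers seen at least min_count times are assumed error-free."""
    counts = Counter(km for r in reads for km in kmers(r, k))
    return {km for km, c in counts.items() if c >= min_count}

def correct_read(read, trusted, k):
    """Apply at most one substitution that makes every k-mer trusted."""
    if all(km in trusted for km in kmers(read, k)):
        return read  # already consistent with the trusted spectrum
    for i in range(len(read)):
        for base in "ACGT":
            if base == read[i]:
                continue
            cand = read[:i] + base + read[i + 1:]
            if all(km in trusted for km in kmers(cand, k)):
                return cand
    return read  # uncorrectable with a single substitution
```

Real correctors handle multiple errors, quality scores, and memory-efficient counting; this sketch only shows the trusted-spectrum principle.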
Abstract:
These three manuscripts are presented as a PhD dissertation on the use of a GeoVis application to evaluate telehealth programs. The primary aim of this research was to understand how GeoVis applications can be designed and developed by combining a human-centered (HC) approach with cognitive fit theory (CFT), and in turn used to evaluate a telehealth program in Brazil. First manuscript: The first manuscript in this dissertation presents background on the use of GeoVisualization to facilitate visual exploration of public health data. The manuscript covers the existing challenges associated with the adoption of existing GeoVis applications. It combines the principles of the HC approach and Cognitive Fit Theory, and develops a framework from these approaches that lays the foundation of this research. The framework is then utilized to propose the design, development, and evaluation of “the SanaViz” for telehealth data in Brazil, as a proof of concept. Second manuscript: The second manuscript is a methods paper that describes the approaches that can be employed to design and develop “the SanaViz” based on the proposed framework. Drawing on the elements of the HC approach and CFT, a mixed-methods approach employing card sorting and sketching techniques is utilized. A representative sample of 20 study participants currently involved in the telehealth program at the NUTES telehealth center at UFPE, Recife, Brazil was enrolled. The findings of this manuscript helped us understand the needs of the diverse group of telehealth users and the tasks that they perform, and helped us determine the essential features to include in the proposed GeoVis application “the SanaViz”. 
Third manuscript: The third manuscript used a mixed-methods approach to compare the effectiveness and usefulness of the HC GeoVis application “the SanaViz” against a conventional GeoVis application, “Instant Atlas”. The same group of 20 study participants who had earlier participated during Aim 2 was enrolled, and a combination of quantitative and qualitative assessments was done. Effectiveness was gauged by the time the participants took to complete the tasks using both GeoVis applications, the ease with which they completed the tasks, and the number of attempts taken to complete each task. Usefulness was assessed by the System Usability Scale (SUS), a validated questionnaire tested in prior studies. In-depth interviews were conducted to gather opinions about both GeoVis applications. This manuscript helped us demonstrate the usefulness and effectiveness of HC GeoVis applications to facilitate visual exploration of telehealth data, as a proof of concept. Together, these three manuscripts address the challenges of combining the principles of the Human Centered approach and Cognitive Fit Theory to design and develop GeoVis applications as a method to evaluate telehealth data. To our knowledge, this is the first study to explore the usefulness and effectiveness of GeoVis to facilitate visual exploration of telehealth data. The results of the research enabled us to develop a framework for the design and development of GeoVis applications in public health, and especially telehealth. The results of our study showed which varied users were involved with the telehealth program and which tasks they performed. Further, it enabled us to identify the components that might be essential to include in these GeoVis applications. 
The results of our research yielded the following findings: (a) telehealth users vary in their level of understanding of GeoVis; (b) interaction features such as zooming, sorting, and linking and multiple views, and representation features such as bar charts and choropleth maps, were considered the most essential features of GeoVis applications; (c) comparing and sorting were two important tasks that telehealth users would perform for exploratory data analysis; (d) a HC GeoVis prototype application is more effective and useful for the exploration of telehealth data than a conventional GeoVis application. Future studies should incorporate the proposed HC GeoVis framework to enable comprehensive assessment of the users and the tasks they perform, to identify the features that should be part of GeoVis applications. The results of this study demonstrate a novel approach to comprehensively and systematically enhance the evaluation of telehealth programs using the proposed GeoVis Framework.
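The System Usability Scale used in the third manuscript has a fixed published scoring rule: ten 1-5 Likert items, where odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is scaled by 2.5 to a 0-100 score. A minimal sketch of that scoring:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The sum is multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative item scores 100; uniformly neutral responses score 50.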
Abstract:
In 2008, the 50th anniversary of the IGY (International Geophysical Year), WDC-MARE presents with this CD publication 3632 data sets in Open Access as part of the most important results from 73 cruises of the research vessel METEOR between 1964 and 1985. The archive is a coherently organized collection of published and unpublished data sets produced by scientists of all marine research disciplines who participated in Meteor expeditions, measured environmental parameters during cruises, and investigated sample material post-cruise in the labs of the participating institutions. In most cases, the data were gathered from the Meteor Forschungsergebnisse, published by the Deutsche Forschungsgemeinschaft (DFG). A second important data source is the time series and radiosonde ascents from more than 20 years of ship weather observations, which were provided by the Deutscher Wetterdienst, Hamburg. The final inclusion of all data into the PANGAEA information system ensures secure archiving, future updates, and widespread distribution in electronic, machine-readable form with long-term access via the Internet. To produce this publication, all data sets with metadata were extracted from PANGAEA and organized in a directory structure on a CD together with a search capability.
Abstract:
The first International Polar Year (IPY) was an international effort to perform continuous meteorological and geophysical observations over a period of two years (1882-1883). Eleven nations established twelve research stations in the Arctic along with thirteen auxiliary stations. Two stations were operated in the southern hemisphere (South Georgia and Tierra del Fuego). The data were published in 26 volumes comprising more than 8700 pages of reports, descriptions, tables, and graphs in total. The list of meteorological parameters includes temperature, wind, pressure, clouds, precipitation, evaporation, humidity, and radiation. In the light of global change and the intensification of observations and continuous measurements in both polar regions, long time series are increasing in importance. The observations of the first IPY from the 19th century enable us to extend the data from the 20th century even further back into the past. On the occasion of the fourth IPY (2007-2009), WDC-MARE decided to digitize the complete set of meteorological data in full hourly resolution, publish it in its reports, and make it available in Open Access via the data library PANGAEA.
Abstract:
This work was financially supported by the German Federal Ministry of Food and Agriculture (BMEL) through the Federal Office for Agriculture and Food (BLE), (2851ERA01J). FT and RPR were supported by FACCE MACSUR (3200009600) through the Finnish Ministry of Agriculture and Forestry (MMM). EC, HE and EL were supported by The Swedish Research Council for Environment, Agricultural Sciences and Spatial Planning (220-2007-1218) and by the strategic funding ‘Soil-Water-Landscape’ from the faculty of Natural Resources and Agricultural Sciences (Swedish University of Agricultural Sciences), and thank Professor P-E Jansson (Royal Institute of Technology, Stockholm) for support. JC, HR and DW thank the INRA ACCAF metaprogramme for funding and Eric Casellas from UR MIAT INRA for support. CB was funded by the Helmholtz project “REKLIM—Regional Climate Change”. CK was funded by the HGF Alliance “Remote Sensing and Earth System Dynamics” (EDA). FH was funded by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) under the Grant FOR1695. FE and SS acknowledge support by the German Science Foundation (project EW 119/5-1). HH, GZ, SS, TG and FE thank Andreas Enders and Gunther Krauss (INRES, University of Bonn) for support. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Abstract:
Acknowledgements We thank Andrew Spink (Noldus Information Technology) and the Blogging Birds team members Peter Kindness and Abdul Adeniyi for their valuable contributions to this paper. John Fryxell, Chris Thaxter and Arjun Amar provided valuable comments on an earlier version. The study was part of the Digital Conservation project of dot.rural, the University of Aberdeen’s Digital Economy Research Hub, funded by RCUK (grant reference EP/G066051/1).
Abstract:
Funded by COST (European Cooperation in Science and Technology) CEH projects. Grant Numbers: NEC05264, NEC05100 Natural Environment Research Council UK. Grant Number: NE/J008001/1 © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Abstract:
Phase equilibrium data regression is an unavoidable task in obtaining appropriate parameter values for any model to be used in separation equipment design for chemical process simulation and optimization. The accuracy of this process depends on factors such as the quality of the experimental data, the selected model, and the calculation algorithm. The present paper summarizes the results and conclusions of our research on the capabilities and limitations of existing GE models and on strategies that can be included in correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and some examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper aims to present all this work as a whole in order to reveal the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
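For reference, the standard (unmodified) binary NRTL activity-coefficient expressions can be sketched as below; the paper's modified equation is not reproduced here, and the function name and default non-randomness parameter are assumptions:

```python
import math

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """ln(gamma1), ln(gamma2) from the standard binary NRTL model.

    tau_ij are the dimensionless interaction parameters
    (tau_ij = (g_ij - g_jj) / RT); alpha is the non-randomness
    parameter, with G_ij = exp(-alpha * tau_ij).
    """
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return ln_g1, ln_g2
```

With both tau parameters zero the mixture is ideal (both activity coefficients equal 1), and at infinite dilution ln(gamma1) reduces to tau21 + tau12*exp(-alpha*tau12), the usual consistency checks for a regression routine.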
Abstract:
Femicide, defined as the killing of females by males because they are female, is becoming recognized worldwide as an important ongoing manifestation of gender inequality. Despite its widespread prevalence, only a few countries have specific registries on this issue. This study aims to assemble expert opinion regarding the strategies that might feasibly be employed to promote, develop, and implement an integrated and differentiated femicide data collection system in Europe at both the national and international levels. Concept mapping methodology was followed, involving 28 experts from 16 countries in generating strategies and sorting and rating them with respect to relevance and feasibility. The experts involved were all members of the EU COST Action on femicide, a scientific network of experts on femicide and violence against women across Europe. As a result, a conceptual map emerged, consisting of 69 strategies organized in 10 clusters, which fit into two domains: “Political action” and “Technical steps”. There was consensus among participants regarding the high relevance of strategies to institutionalize national databases and raise public awareness through different stakeholders, while strategies to promote media involvement were identified as the most feasible. Differences in perceived priorities according to the human development index level of the experts’ countries were also observed.
Abstract:
Sensing techniques are important for solving the problems of uncertainty inherent in intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks to support a robot controller when other sensor systems, such as tactile and force, are not able to obtain data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and the object when neither force nor pressure data are available. This new approach is also used to measure changes in the shape of an object’s surfaces, allowing us to find deformations caused by inappropriate pressure applied by the hand’s fingers. Tests were carried out on grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the obtained results reveal that our visual pipeline does not require deformation models of objects and materials, and that the approach works well with both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed by recognizing a pattern located at the robot forearm. The presented experiments demonstrate that the proposed method accomplishes good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments.
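The event-generation idea above can be illustrated with a simplified depth-difference check over the grasped object's region: compare the current depth frame against a reference frame and raise an event when the surface has moved more than a tolerance. This is a hedged stand-in for the paper's deformation-detection method; the function name, the mean-absolute-change statistic, and the threshold are all chosen for illustration:

```python
import numpy as np

def deformation_event(ref_depth, cur_depth, mask, threshold_mm=5.0):
    """Flag a surface deformation between two depth frames.

    ref_depth, cur_depth: depth maps in mm from an RGBD sensor;
    mask: boolean array selecting the grasped object's pixels.
    Returns (event, mean_change_mm): event is True when the mean
    absolute depth change within the mask exceeds the threshold,
    i.e. when an event message would be sent to the controller.
    """
    diff = np.abs(cur_depth.astype(float) - ref_depth.astype(float))
    mean_change = float(diff[mask].mean())
    return mean_change > threshold_mm, mean_change
```

In a real pipeline the mask would come from object segmentation and the check would run per frame, emitting a pulse each time the threshold is crossed.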
Abstract:
Information technologies (IT) currently account for 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed that focus on increasing the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision making. However, today’s monitoring systems (MSs) are partial, specific, and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy-saving systems, offered as a value-added service embedded in a low-cost, low-power device. The proposal is general in nature, comprehensive, scalable, and focused on heterogeneous environments, and it allows quick adaptation to the needs of changing and dynamic environments. Further, a prototype of the system has been implemented on several devices, which has allowed validation of the proposal as well as identification of the minimum hardware profile required to support the model.
Abstract:
The revelation of the top-secret US intelligence-led PRISM Programme has triggered wide-ranging debates across Europe. Press reports have shed new light on the electronic surveillance ‘fishing expeditions’ of the US National Security Agency and the FBI into the world’s largest electronic communications companies. This Policy Brief by a team of legal specialists and political scientists addresses the main controversies raised by the PRISM affair and the policy challenges that it poses for the EU. Two main arguments are presented: First, the leaks over the PRISM programme have undermined the trust that EU citizens have in their governments and the European institutions to safeguard and protect their privacy; and second, the PRISM affair raises questions regarding the capacity of EU institutions to draw lessons from the past and to protect the data of its citizens and residents in the context of transatlantic relations. The Policy Brief puts forward a set of policy recommendations for the EU to follow and implement a robust data protection strategy in response to the affair.
Abstract:
v. 1. System and program description.--v. 2. Error Messages.--v. 3. Summary of control cards.--v. 4. Sample jobs.--v. 5. Formulas and statistical references.--v. 6. Primer.