946 results for 08 Information and Computing Sciences


Relevance:

100.00%

Publisher:

Abstract:

Cancer is the second leading cause of death in the United States. With the advent of new technologies, changes in health care delivery, and the multiplicity of provider types that patients must see, cancer care management has become increasingly complex. The availability of cancer health information has been shown to help cancer patients cope with the management and effects of their cancers. As a result, more cancer patients are using the internet to find resources that can aid in decision-making and recovery.

The Health Information National Trends Survey (HINTS) is a nationally representative survey designed to collect information about the experiences of cancer and non-cancer adults with health information sources. The HINTS survey covered both conventional sources and newer technologies, particularly the internet. This study is a descriptive analysis of the HINTS 2003 and HINTS 2005 survey data. The purpose of the research is to explore general trends in health information seeking and use by US adults, and especially by cancer patients.

From 2003 to 2005, internet use for various health-related activities appears to have increased among adults with and without cancer. Differences were found between the groups in general trust in information media, particularly the internet: non-cancer respondents tended to have greater trust in information media than cancer respondents.

The latter portion of this work examined characteristics of HINTS respondents thought to be relevant to how much trust individuals placed in the internet as a source of health information. Trust in health information from the internet was significantly greater among younger adults, higher-earning households, internet users, online seekers of health or cancer information, and those who found online cancer information useful.

Relevance:

100.00%

Publisher:

Abstract:

Problem: Medical and veterinary students memorize facts but then have difficulty applying those facts in clinical problem solving. Cognitive engineering research suggests that the inability of medical and veterinary students to infer concepts from facts may be due in part to specific features of how information is represented and organized in educational materials. First, physical separation of pieces of information may increase the cognitive load on the student. Second, information that is necessary but not explicitly stated may also contribute to the student's cognitive load. Finally, the types of representations, textual or graphical, may also support or hinder the student's learning process. This may explain why students have difficulty applying biomedical facts in clinical problem solving.

Purpose: To test the hypothesis that three specific aspects of expository text affected the ability of students to solve clinical problems: the spatial distance between the facts needed to infer a rule, the explicitness of information, and the format of representation.

Setting: The study was conducted in the parasitology laboratory of a college of veterinary medicine in Texas.

Sample: The study subjects were a convenience sample of 132 second-year veterinary students who matriculated in 2007. The age of this class upon admission ranged from 20 to 52, and its gender makeup was approximately 75% female and 25% male.

Results: No statistically significant difference in student ability to solve clinical problems was found when relevant facts were placed in proximity, nor when an explicit rule was stated. Further, no statistically significant difference was found when students were given different representations of the material, including tables and concept maps.

Findings: The three properties investigated (proximity, explicitness, and representation) had no statistically significant effect on student learning as it relates to clinical problem-solving ability. However, ad hoc observations as well as findings from other researchers suggest that the subjects were probably using rote learning techniques such as memorization, and therefore were not attempting to infer relationships from the factual material in the interventions unless specifically prompted to look for patterns. A serendipitous finding unrelated to the study hypothesis was that subjects who correctly answered questions regarding functional (non-morphologic) properties, such as mode of transmission and intermediate host, at the family taxonomic level were significantly more likely to correctly answer clinical case scenarios than subjects who did not. These findings suggest a strong relationship (p < .001) between well-organized knowledge of taxonomic functional properties and clinical problem-solving ability.

Recommendations: Further study should investigate the relationship between knowledge of functional taxonomic properties and clinical problem-solving ability. In addition, the effect of prompting students to look for patterns in instructional material, followed by the effect of factors that influence cognitive load such as proximity, explicitness, and representation, should be explored.

Relevance:

100.00%

Publisher:

Abstract:

Cryoablation for small renal tumors has demonstrated sufficient clinical efficacy over the past decade as a non-surgical, nephron-sparing approach for treating renal masses in patients who are not surgical candidates. Minimally invasive percutaneous cryoablations have been performed with image guidance from CT, ultrasound, and MRI. During the MRI-guided cryoablation procedure, the interventional radiologist visually compares the iceball size on monitoring images with the original tumor on separate planning images. The comparisons made during the monitoring step are time consuming and inefficient, and sometimes lack the precision needed for decision making, requiring the radiologist to make further changes later in the procedure. This study sought to mitigate uncertainty in these visual comparisons by quantifying tissue response to cryoablation and providing visualization of the response during the procedure. Based on retrospective analysis of MR-guided cryoablation patient data, registration and segmentation algorithms were investigated and implemented for periprocedural visualization, delivering iceball position and size with respect to planning images registered within 3.3 mm and with at least 70% overlap, and a quantitative logit model was developed to relate perfusion deficit in renal parenchyma, visualized in verification images, to iceball size visualized in monitoring images. Through retrospective study of 20 patient cases, the relationship between likelihood of perfusion loss in renal parenchyma and distance within the iceball was quantified and iteratively fit to a logit curve. Using the parameters from the logit fit, the margin for 95% perfusion-loss likelihood was found to be 4.28 mm within the iceball, which corresponds well with the clinically accepted margin of 3-5 mm within the iceball. To display the iceball position and perfusion-loss likelihood to the radiologist, algorithms were implemented in a fast segmentation and registration module that executed in under 2 minutes, within the clinically relevant 3-minute monitoring period. Using 16 patient cases, the average Hausdorff distance was reduced from 10.1 mm (before registration) to 3.21 mm (after registration), and the average DSC increased from 46.6% to 82.6%.
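The logit model described above lends itself to a compact illustration. The following is a minimal sketch, assuming synthetic placeholder data and SciPy's curve_fit (the study's actual fitting procedure and fitted parameter values are not reproduced here), of fitting a logit curve of perfusion-loss likelihood against distance within the iceball and inverting it for the 95%-likelihood margin:

```python
# Hedged sketch: fit a logit curve of perfusion-loss likelihood vs. distance
# within the iceball, then invert it for the 95%-likelihood margin.
# The observations below are synthetic placeholders, not the study's data.
import numpy as np
from scipy.optimize import curve_fit

def logit_model(d, b0, b1):
    """Logistic probability of perfusion loss at distance d (mm) inside the iceball edge."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * d)))

distance_mm = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
loss_fraction = np.array([0.30, 0.52, 0.71, 0.85, 0.93, 0.97, 0.99])

(b0, b1), _ = curve_fit(logit_model, distance_mm, loss_fraction, p0=[0.0, 1.0])

# Invert the fitted curve: the distance where loss likelihood reaches 95%.
p = 0.95
margin_95 = (np.log(p / (1 - p)) - b0) / b1
print(f"95% perfusion-loss margin: {margin_95:.2f} mm inside the iceball")
```

Reading the margin off the inverted fit in this way mirrors how a value such as the reported 4.28 mm would be obtained from the logit parameters.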

Relevance:

100.00%

Publisher:

Abstract:

The hierarchical linear growth model (HLGM), as a flexible and powerful analytic method, has played an increasingly important role in psychology, public health, and the medical sciences in recent decades. Mostly, researchers who conduct HLGM analyses are interested in the treatment effect on individual trajectories, which is indicated by the cross-level interaction effects. However, the statistical hypothesis test for a cross-level interaction in HLGM shows only whether there is a significant group difference in the average rate of change, rate of acceleration, or higher polynomial effect; it fails to convey information about the magnitude of the difference between the group trajectories at specific time points. Thus, reporting and interpreting effect sizes have received increasing emphasis in HLGM in recent years, owing to the limitations of, and growing criticism of, statistical hypothesis testing. Nonetheless, most researchers fail to report these model-implied effect sizes for group trajectory comparisons and their corresponding confidence intervals in HLGM analyses, because of the lack of appropriate, standard functions for estimating effect sizes associated with the model-implied difference between group trajectories in HLGM, and the lack of computing packages in popular statistical software to calculate them automatically.

The present project is the first to establish appropriate computing functions to assess the standardized difference between group trajectories in HLGM. We proposed two functions to estimate effect sizes for the model-based difference between group trajectories at a specific time, and we also suggested robust effect sizes to reduce the bias of the estimated effect sizes. We then applied the proposed functions to estimate population effect sizes (d) and robust effect sizes (du) for the cross-level interaction in HLGM using three simulated datasets, compared three methods of constructing confidence intervals around d and du, and recommended the best one for application. Finally, we constructed 95% confidence intervals, using the most suitable method, for the effect sizes obtained from the three simulated datasets.

The effect sizes between group trajectories for the three simulated longitudinal datasets indicated that even when the statistical hypothesis test shows no significant difference between group trajectories, effect sizes between those trajectories can still be large at some time points. Therefore, effect sizes between group trajectories in HLGM analysis provide additional, meaningful information for assessing the group effect on individual trajectories. In addition, we compared three methods of constructing 95% confidence intervals around the corresponding effect sizes, which address the uncertainty of the estimated effect sizes relative to the population parameter. We suggest the noncentral t-distribution-based method when its assumptions hold, and the bootstrap bias-corrected and accelerated method when the assumptions are not met.
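As an illustration of the kind of model-implied effect size discussed above, here is a minimal sketch; the function name, the fixed-effect parameterization, and the choice of standardizer are assumptions for illustration, not the project's actual computing functions:

```python
# Hedged sketch of a model-implied effect size at a specific time point in an
# HLGM with a group effect on intercept and a group-by-time (cross-level)
# interaction on slope.
def trajectory_effect_size(gamma_group, gamma_interaction, t, sd_pooled):
    """d(t): standardized model-implied group difference at time t.

    gamma_group       -- fixed effect of group on the intercept
    gamma_interaction -- cross-level interaction (group x time slope)
    t                 -- time point of interest
    sd_pooled         -- standardizer, e.g. sqrt(intercept + residual variance)
    """
    difference = gamma_group + gamma_interaction * t
    return difference / sd_pooled

# Example: a modest interaction implies a growing standardized difference.
for t in [0, 1, 2, 3, 4]:
    d = trajectory_effect_size(gamma_group=0.10, gamma_interaction=0.15,
                               t=t, sd_pooled=1.0)
    print(f"t = {t}: d = {d:.2f}")
```

The loop makes the abstract's point concrete: an interaction that is not statistically significant can still imply a large standardized group difference at later time points.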

Relevance:

100.00%

Publisher:

Abstract:

At marine seeps, methane is microbially oxidized, resulting in the precipitation of carbonates close to the seafloor. Methane oxidation leads to sulfate depletion in sediment pore water, which induces a change in redox conditions. Rare earth element (REE) patterns of authigenic carbonate phases collected from modern seeps of the Gulf of Mexico, the Black Sea, and the Congo Fan were analyzed. Different carbonate minerals, including aragonite and calcite with different crystal habits, were selected for analysis. The total REE content (SumREE) of seep carbonates varies widely, from 0.1 ppm to 42.5 ppm, but a common trend is that SumREE in microcrystalline phases is higher than that of the associated later phases, including microspar, sparite, and blocky cement, suggesting that SumREE may be a function of diagenesis. The shale-normalized REE patterns of the seep carbonates often show different Ce anomalies even in samples from a single site, suggesting that the formation conditions of seep carbonates are variable and complex. Overall, our results show that, in addition to anoxic conditions, oxic conditions are at least temporarily common in seep environments.
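The Ce anomalies mentioned above are conventionally computed from shale-normalized concentrations. The snippet below is a hedged sketch assuming PAAS normalization and a linear La-Pr interpolation for Ce*; conventions differ between studies, and the paper's exact normalization scheme is not specified here:

```python
# Hedged sketch: shale-normalized Ce anomaly (Ce/Ce*) for a carbonate sample.
# PAAS reference values (Taylor & McLennan 1985, approximate, in ppm).
PAAS = {"La": 38.2, "Ce": 79.6, "Pr": 8.83}

def ce_anomaly(sample_ppm):
    """Ce/Ce* using linear interpolation between neighbouring La and Pr."""
    la = sample_ppm["La"] / PAAS["La"]
    ce = sample_ppm["Ce"] / PAAS["Ce"]
    pr = sample_ppm["Pr"] / PAAS["Pr"]
    return ce / (0.5 * la + 0.5 * pr)

# Example carbonate: Ce/Ce* < 1 (a negative anomaly) is commonly read as
# oxic formation conditions, values near or above 1 as anoxic.
print(ce_anomaly({"La": 2.1, "Ce": 3.0, "Pr": 0.55}))  # ~0.64, negative anomaly
```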

Relevance:

100.00%

Publisher:

Abstract:

Ubiquitous computing software needs to be autonomous, so that essential decisions, such as how to configure its particular execution, are self-determined. Moreover, data mining serves an important role for ubiquitous computing by providing intelligence to several types of ubiquitous computing applications. Thus, automating ubiquitous data mining is also crucial. We focus on the problem of automatically configuring the execution of a ubiquitous data mining algorithm. In our solution, we generate configuration decisions in a resource-aware and context-aware manner, since the algorithm executes in an environment in which the context often changes and computing resources are often severely limited. We propose to analyze the execution behavior of the data mining algorithm by mining its past executions. By doing so, we discover the effects of resource and context states, as well as parameter settings, on data mining quality. We argue that a classification model is appropriate for predicting the behavior of an algorithm's execution, and we concentrate on the decision tree classifier. We also define a taxonomy of data mining quality so that the tradeoff between prediction accuracy and classification specificity of each behavior model, each of which classifies by a different abstraction of quality, is scored for model selection. Behavior model constituents and class label transformations are formally defined, and experimental validation of the proposed approach is performed.
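A minimal sketch of the behavior-model idea described above, assuming scikit-learn's DecisionTreeClassifier and invented feature names and quality labels (the paper's actual feature set and quality taxonomy are richer):

```python
# Hedged sketch: a decision tree trained on logged past executions
# (resource state, context state, parameter settings) predicts a
# discretized data mining quality label for a candidate configuration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [free_memory_mb, battery_pct, context_change_rate, param_window_size]
past_executions = [
    [512, 80, 0.1, 100],
    [128, 20, 0.9, 500],
    [256, 55, 0.4, 250],
    [ 64, 10, 0.8, 500],
    [512, 90, 0.2, 250],
]
# Quality labels at one abstraction level of the quality taxonomy.
quality = ["high", "low", "medium", "low", "high"]

behavior_model = DecisionTreeClassifier(max_depth=3).fit(past_executions, quality)

# Configuration decision: predict the quality of a candidate parameter setting
# under the current resource/context state before committing to it.
print(behavior_model.predict([[300, 60, 0.3, 250]]))
```

In the approach described, several such models, each classifying by a different abstraction of quality, would be scored on the accuracy-specificity tradeoff before one is selected.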

Relevance:

100.00%

Publisher:

Abstract:

Currently, student dropout rates are a matter of concern among universities. Many research studies aimed at discovering the causes have been carried out. However, few solutions that could serve all students and related problems have been proposed so far. One such problem is caused by the break in the "knowledge chain of educational links" that occurs when students move on to higher studies without mastering their basic studies. Most regulated degree programs at universities are designed so that some basic subjects serve as support for other, more complicated subjects, thus forming a complex knowledge network. When a link in this chain fails, student frustration occurs, as it prevents the student from fully understanding the following educational links. In this proposal we try to mitigate these failures, which for the most part carry the student's frustration beyond his college stay. On one hand, we discuss the student's learning process, which we divide into a series of phases that amount to what we call the "learning lifecycle." We also analyze at which stage action by the stakeholders involved in this scenario, teachers and students, is most important. On the other hand, we consider that Information and Communication Technologies (ICT), such as Cloud Computing, can help develop new ways of delivering higher education while easing and facilitating the student's learning process. However, methods and processes need to be defined to direct the use of such technologies, in the teaching process in general and within higher education in particular, in order to achieve optimum results. Our methodology integrates ICT as another actor in the "learning lifecycle." We encourage students to stop being mere spectators of their own education and to take an active part in their training process. To do this, we offer a set of tools useful not only for determining the causes of academic failure (for self-assessment) but also for remedying those failures (with corrective actions): once the causes are discovered, it is easier to determine solutions. We believe this study will be useful for both students and teachers. Students learn from their own experience and improve their learning process, while acquiring all of the "knowledge chain educational links" required in their studies. We stand by the motto "studying to learn instead of studying to pass." Teachers will also benefit by detecting where and how to strengthen their teaching proposals. All of this will also result in decreasing dropout rates.

Relevance:

100.00%

Publisher:

Abstract:

The extraordinary growth of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity theft and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal world and the business world with respect to access to information. In both worlds, Cryptography has played a key role, providing the tools needed to ensure the confidentiality, integrity, and availability both of personal data and of information in general. Biometrics, for its part, has proposed and offered different techniques to authenticate individuals through personal characteristics such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these two sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, which would be greatly strengthened if certain characteristics of both were combined toward common objectives. It is therefore imperative to intensify research in these areas, combining the mathematical algorithms and primitives of Cryptography with Biometrics, to meet the growing demand for new solutions that are more technical, secure, and easy to use, and that simultaneously strengthen data protection and user identification. In this combination, the concept of cancelable biometrics has become a cornerstone of the user authentication and identification process, since it provides revocation and cancellation properties to biometric traits.

The contribution of this thesis addresses the main aspect of Biometrics, namely the secure and efficient authentication of users through their biometric templates, using three different approaches: (1) the design of a fuzzy crypto-biometric scheme that implements the principles of cancelable biometrics to identify users while dealing with intra- and inter-user variability, without compromising the biometric templates extracted from legitimate users; (2) the design of a new Similarity Preserving Hash Function (SPHF), a class of function currently used in the Digital Forensics field to find similarities in the content of distinct but similar files and so quantify to what extent those files could be considered equal; the function defined in this research work, besides improving on the results of the main functions developed to date, extends their use to iris template comparison; and (3) the development of a new mechanism for comparing iris templates that treats the templates as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent in view of the security and privacy requirements mentioned above.

Each of the three schemes designed has been implemented to run experiments and test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in the Java language, while the process based on the Walsh-Hadamard transform was implemented in Matlab. The experiments used a database of iris images (CASIA-IrisV2) to simulate a population of system users. The new SPHF was additionally tested, before being applied to the Biometrics field, to determine its applicability in the Digital Forensics field by comparing similar and dissimilar files and images. For each scheme, the user authentication error rates, i.e., the False Non-Match Rate and False Match Rate, were calculated with different parameters and cases to analyse their behaviour.
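As a rough illustration of the third approach, the following sketch treats binary iris codes as ±1 signals and compares their Walsh-Hadamard spectra; the code length, the distance measure, and the test data are assumptions for illustration, and the thesis complements the transform with three further algorithms not shown here:

```python
# Hedged sketch: compare iris bit strings via their Walsh-Hadamard spectra.
import numpy as np
from scipy.linalg import hadamard

def wht_spectrum(bits):
    """Walsh-Hadamard transform of a +/-1 signal built from an iris bit string."""
    n = len(bits)
    assert n & (n - 1) == 0, "length must be a power of two"
    signal = 2 * np.asarray(bits, dtype=float) - 1  # map {0,1} -> {-1,+1}
    return hadamard(n) @ signal / np.sqrt(n)

def spectral_distance(bits_a, bits_b):
    """Euclidean distance between the two transformed templates."""
    return float(np.linalg.norm(wht_spectrum(bits_a) - wht_spectrum(bits_b)))

rng = np.random.default_rng(0)
template = rng.integers(0, 2, 256)
noisy = template.copy()
noisy[rng.choice(256, 10, replace=False)] ^= 1   # same eye, 10 flipped bits
impostor = rng.integers(0, 2, 256)

print(spectral_distance(template, noisy))     # small: likely a genuine match
print(spectral_distance(template, impostor))  # large: likely an impostor
```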

Relevance:

100.00%

Publisher:

Abstract:

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. Acknowledgements: We thank Ms Margaret Fraser, Ms Samantha Flannigan, and Dr Wing Yee Kwong for their expert assistance. The staff at Grampian NHS Pregnancy Counselling Service were essential for collecting fetuses. We thank Professor Geoffrey Hammond and Dr Marc Simard, University of British Columbia, for helpful comments on the manuscript. Supported by grants as follows: Scottish Senior Clinical Fellowship (AJD); Chief Scientist Office (Scottish Executive, CZG/1/109 to PAF, and CZG/4/742 to PAF and PJOS); NHS Grampian Endowments 08/02 (PAF, SB, and PJOS); the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 212885 (PAF and SMR); and the Medical Research Council grants MR/L010011/1 (PAF and PJOS) and MR/K018310/1 (AJD). None of the funding bodies played any role in the design, collection, analysis, and interpretation of data, in the writing of the manuscript, or in the decision to submit the manuscript for publication.

Relevance:

100.00%

Publisher:

Abstract:

Acknowledgements: We thank Andrew Spink (Noldus Information Technology) and the Blogging Birds team members Peter Kindness and Abdul Adeniyi for their valuable contributions to this paper. John Fryxell, Chris Thaxter, and Arjun Amar provided valuable comments on an earlier version. The study was part of the Digital Conservation project of dot.rural, the University of Aberdeen's Digital Economy Research Hub, funded by RCUK (grant reference EP/G066051/1).

Relevance:

100.00%

Publisher:

Abstract:

Funding: A Health Systems Research Initiative Development Grant from the UK Department for International Development (DFID), the Economic and Social Research Council (ESRC), the Medical Research Council (MRC), and the Wellcome Trust (MR/N005597/1) funds the research presented in this paper. Support for the Agincourt HDSS, including verbal autopsies, was provided by the Wellcome Trust, UK (grants 058893/Z/99/A; 069683/Z/02/Z; 085477/Z/08/Z; 085477/B/08/Z), and the University of the Witwatersrand and the Medical Research Council, South Africa.

Relevance:

100.00%

Publisher:

Abstract:

The current massive degradation of habitat and extinction of species are taking place on a catastrophically short timescale, and their effects will fundamentally reset the future evolution of the planet's biota. The fossil record suggests that recovery of global ecosystems has required millions or even tens of millions of years. Thus, intervention by humans, the very agents of the current environmental crisis, is required for any possibility of short-term recovery or maintenance of the biota. Many current recovery efforts have deficiencies, including insufficient information on the diversity and distribution of species, ecological processes, and the magnitude and interaction of threats to biodiversity (pollution, overharvesting, climate change, disruption of biogeochemical cycles, introduced or invasive species, habitat loss and fragmentation through land use, disruption of community structure in habitats, and others). A much greater and more urgently applied investment to address these deficiencies is obviously warranted. Conservation and restoration in human-dominated ecosystems must strengthen connections between human activities, such as agricultural or harvesting practices, and relevant research generated in the biological, earth, and atmospheric sciences. Certain threats to biodiversity, including climate change and alteration of global biogeochemical cycles, require intensive international cooperation and input from the scientific community to mitigate their harmful effects. In a world already transformed by human activity, the connection between humans and the ecosystems they depend on must frame any strategy for the recovery of the biota.

Relevance:

100.00%

Publisher:

Abstract:

A central role of elections is the aggregation of information dispersed within a population. This article surveys recent work on elections as mechanisms for aggregating information and on the incentives for voters to vote strategically in such elections.

Relevance:

100.00%

Publisher:

Abstract:

Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand, and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges, some of the most prominent being the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities, and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not helped by the fact that digital forensics today still involves manual, time-consuming tasks in the processes of identifying evidence, performing evidence acquisition, and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, the industry-standard tools developed are largely evidence-oriented, have limited support for evidence integration, and automate only certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed in order to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture by harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition improves efficiency (the time and effort involved) in digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition, and that a multi-TCP-connection paradigm is required. The automated integration, correlation, and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation.
Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
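The multi-TCP-connection paradigm that the acquisition experiments favoured can be sketched as follows; the endpoint, the RANGE request line, and the chunking scheme are hypothetical placeholders, not the LEIA prototype's actual protocol:

```python
# Hedged sketch: pull an evidence image as byte ranges over several parallel
# TCP connections instead of one long-lived client-server connection.
import socket
from concurrent.futures import ThreadPoolExecutor

HOST, PORT = "device.example.org", 9000   # hypothetical acquisition endpoint
CHUNK = 4 * 1024 * 1024                   # 4 MiB per connection

def fetch_range(offset):
    """Open a dedicated TCP connection and read one byte range of the image."""
    with socket.create_connection((HOST, PORT), timeout=30) as sock:
        sock.sendall(f"RANGE {offset} {CHUNK}\n".encode())
        buf = bytearray()
        while len(buf) < CHUNK:
            data = sock.recv(65536)
            if not data:
                break
            buf.extend(data)
        return offset, bytes(buf)

def acquire(image_size, workers=8):
    """Acquire all ranges in parallel and reassemble them in offset order."""
    offsets = range(0, image_size, CHUNK)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = dict(pool.map(fetch_range, offsets))
    return b"".join(parts[o] for o in sorted(parts))
```

Pulling byte ranges over parallel connections is what gives the paradigm its scalability and resilience: a stalled or dropped connection costs one chunk rather than the whole acquisition.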