41 results for Context information


Relevance: 30.00%

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system.

This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce.

The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence a bound on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns; it constructs the graph efficiently through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
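Since the abstract notes that, for deterministic programs, maximum leakage reduces to counting feasible outputs, a toy sketch can make the bound concrete: the capacity is log2 of the number of feasible outputs, so counting the assignments that satisfy a set of two-bit patterns yields an upper bound on the leakage. The patterns, bit-width, and brute-force counting below are illustrative only; they are not the dissertation's implication-graph/STP implementation.

```python
# Toy illustration (not the dissertation's STP/implication-graph implementation):
# count the outputs allowed by a set of two-bit patterns and turn that count into
# an upper bound on leakage in bits.
from itertools import product
from math import log2

def count_outputs(num_bits, patterns):
    """patterns: predicates over a tuple of output bits, e.g. lambda b: b[0] != b[1]."""
    return sum(1 for bits in product((0, 1), repeat=num_bits)
               if all(p(bits) for p in patterns))

# Hypothetical patterns for a 4-bit output: bit0 != bit1, and bit2 implies bit3.
patterns = [lambda b: b[0] != b[1],
            lambda b: (not b[2]) or b[3]]
feasible = count_outputs(4, patterns)   # 2 * 3 = 6 feasible outputs
print(feasible, log2(feasible))         # capacity bound: log2(6), about 2.585 bits
```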

Relevance: 30.00%

Abstract:

This thesis research describes the design and implementation of a Semantic Geographic Information System (GIS) and the creation of its spatial database. The database schema is designed and created, and all textual and spatial data are loaded into the database with the help of the Semantic DBMS's Binary Database Interface, currently being developed at FIU's High Performance Database Research Center (HPDRC). A friendly graphical user interface is created together with the system's other main areas: the display process, data animation, and data retrieval. All these components are tightly integrated to form a novel and practical semantic GIS that has facilitated the interpretation, manipulation, analysis, and display of spatial data such as ocean temperature, ozone (TOMS), and simulated SeaWiFS data. At the same time, this system has played a major role in testing HPDRC's high-performance, efficient parallel Semantic DBMS.

Relevance: 30.00%

Abstract:

The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search an enormous corpus of data to encourage its global adoption. Current techniques to store semistructured documents either map them to relational databases or use a combination of flat files and indexes. These two approaches result in a mismatch between the tree structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed down the large-scale adoption of XML into actual system implementations. The recent development of lazy parsing techniques is a major step towards improving this situation, but lazy parsers still have significant drawbacks that undermine the widespread adoption of XML. Once the processing (storage and parsing) issues for semistructured data have been addressed, another key challenge in leveraging semistructured data is to perform effective information discovery on it. Previous work has addressed this problem in a generic (i.e., domain-independent) way, but the process can be improved if knowledge about the specific domain is taken into consideration. This dissertation had two general goals. The first goal was to devise novel techniques to efficiently store and process semistructured documents, with two specific aims: we proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives, and we developed a Double-Lazy Parser for semistructured documents that introduces lazy behavior in both the pre-parsing and progressive-parsing phases of the standard Document Object Model's parsing mechanism. The second goal was to construct a user-friendly and efficient engine for performing information discovery over domain-specific semistructured documents, also with two aims: we presented a framework that exploits domain-specific knowledge to improve the quality of the information discovery process by incorporating domain ontologies, and we proposed meaningful evaluation metrics to compare the results of search systems over semistructured documents.
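As a point of reference for the parsing discussion above, the sketch below shows the standard lazy-style idiom available through Python's xml.etree.ElementTree.iterparse, which processes a document event by event instead of materializing a full DOM tree. It is not the dissertation's Double-Lazy Parser; the file name and tag are hypothetical.

```python
# Minimal illustration of progressive (event-driven) XML processing with the
# standard library; it avoids building the full DOM tree, the inefficiency the
# dissertation's Double-Lazy Parser targets.  Not the Double-Lazy Parser itself.
import xml.etree.ElementTree as ET

def stream_titles(path):
    """Yield <title> text from a large document without keeping the whole tree."""
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == "title":
            yield (elem.text or "").strip()
        elem.clear()  # discard processed subtrees to keep memory usage flat

# Hypothetical usage:
# for title in stream_titles("catalog.xml"):
#     print(title)
```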

Relevance: 30.00%

Abstract:

This poster presentation features three route planning applications developed by the Florida International University GIS Center and the Geomatics program at the University of Florida, and outlines their context-based differences. The first route planner was developed for cyclists in three Florida counties: Miami-Dade, Broward, and Palm Beach. The second route planner computes safe pedestrian routes to schools and was developed for Miami-Dade County. The third route planner combines pre-compiled cultural/eco routes with point-to-point route planning for the City of Coral Gables. The poster highlights the differences in design (user interface) and implementation (routing options) among the three route planners that result from their different application contexts and target audiences.
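The abstract does not describe the planners' routing engines, so the following generic sketch only illustrates how an application context can enter a route computation through edge weights (for example, penalizing high-stress roads for cyclists or school children) before running a standard shortest-path search. The graph, weights, and penalty factors are hypothetical.

```python
# Generic illustration (not the actual FIU/UF planners): context-dependent routing
# modeled by re-weighting the same street graph, then running Dijkstra.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, weight), ...]}; returns (cost, path)."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical edge weights: distance multiplied by a context penalty (traffic stress).
bike_graph = {"A": [("B", 1.0 * 3.0), ("C", 1.4 * 1.0)],  # A-B is short but stressful
              "B": [("D", 1.0 * 1.0)],
              "C": [("D", 1.2 * 1.0)]}
print(dijkstra(bike_graph, "A", "D"))  # prefers the calmer A-C-D route (cost 2.6)
```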

Relevance: 30.00%

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals with moderate to severe visual aberrations for whom conventional means of compensation, such as glasses or contact lenses, do not improve vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. To provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems, and the data collected were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected in the human-subject tests. Further analysis suggests that, even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
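One common way such pre-compensation is formulated in the literature is as a regularized inverse (Wiener-style) filter built from the eye's point spread function, itself derived from the measured wavefront aberration. The sketch below shows only that baseline idea under a shift-invariant blur assumption; the dissertation's spatial-information integration and physiological refinements are not modeled, and the function name and regularization constant are illustrative.

```python
# Wiener-style pre-compensation sketch: pre-distort the image so the eye's blur
# approximately cancels.  Assumes `psf` has the same shape as `image` and is
# centered; `k` regularizes frequencies where the blur has little energy.
import numpy as np

def precompensate(image, psf, k=0.01):
    H = np.fft.fft2(np.fft.ifftshift(psf))        # optical transfer function
    G = np.conj(H) / (np.abs(H) ** 2 + k)         # regularized inverse filter
    pre = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(pre, 0.0, 1.0)                 # keep within displayable range
```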

Relevance: 30.00%

Abstract:

With the growing commercial importance of the Internet and the development of new real-time, connection-oriented services such as IP telephony and electronic commerce, resilience is becoming a key issue in the design of IP-based networks. Two emerging technologies that can accomplish the task of efficient information transfer are Multiprotocol Label Switching (MPLS) and Differentiated Services. A main benefit of MPLS is the ability to introduce traffic-engineering concepts due to its connection-oriented nature: with MPLS it is possible to assign different paths to packets through the network. Differentiated Services divides traffic into different classes and treats them differently, especially when there is a shortage of network resources. In this thesis, a framework was proposed to integrate these two technologies, and its performance in providing load balancing and improving QoS was evaluated. Simulation and analysis of this framework demonstrated that the combination of MPLS and Differentiated Services is a powerful tool for QoS provisioning in IP networks.
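A minimal sketch of the integration idea, assuming per-class treatment in the spirit of Differentiated Services and explicit MPLS label-switched paths for traffic engineering: premium traffic is pinned to one path while best-effort traffic is balanced across paths by remaining headroom. The path names, classes, and capacities are invented for illustration and are not taken from the thesis.

```python
# Toy sketch: classes treated differently, and traffic engineered across explicit
# label-switched paths; policies and numbers are illustrative only.
LSPS = {"lsp_short": {"capacity": 100, "load": 0},
        "lsp_long":  {"capacity": 150, "load": 0}}

def assign(flow_class, demand):
    if flow_class == "EF":                       # expedited forwarding: pinned path
        lsp = "lsp_short"
    else:                                        # best effort: balance by headroom
        lsp = max(LSPS, key=lambda l: LSPS[l]["capacity"] - LSPS[l]["load"])
    LSPS[lsp]["load"] += demand
    return lsp

print(assign("EF", 20), assign("BE", 60), assign("BE", 60))
# -> lsp_short lsp_long lsp_long
```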

Relevance: 30.00%

Abstract:

In a post-Cold War, post-9/11 world, the advent of US global supremacy resulted in the installation, perpetuation, and dissemination of an Absolutist Security Agenda (hereinafter, ASA). The US ASA explicitly and aggressively articulates and equates US national security interests with the security of all states in the international system, and it replaced the bipolar, Cold War framework that defined international affairs from 1945 to 1992. Since the collapse of the USSR and the 11 September 2001 terrorist attacks, the US has unilaterally defined, implemented, and managed systemic security policy. The US ASA is indicative of a systemic category of knowledge (security) anchored in variegated conceptual and material components, such as morality, philosophy, and political rubrics. The US ASA is based on a logic that involves the following security components: (1) hyper-militarization, (2) intimidation, (3) coercion, (4) criminalization, (5) panoptic surveillance, (6) plenary security measures, and (7) unabashed US interference in the domestic affairs of select states. Such interference has produced destabilizing tensions and conflicts that have, in turn, produced resistance, revolutions, proliferation, cults of personality, and militarization. This is the case because the US ASA rests on the notion that the international system of states is an extension and instrument of US power, rather than a system and/or society of states composed of functionally sovereign entities. To analyze the US ASA, this study utilizes (1) official government statements, legal doctrines, treaties, and policies pertaining to US foreign policy; (2) militarization rationales, budgets, and expenditures; and (3) case studies of rogue states. The data used in this study are drawn from publicly available information (academic journals, think-tank publications, government publications, and information provided by international organizations). The data support the contention that global security is effectuated via a discrete set of hegemonic/imperialistic US values and interests, finding empirical expression in legal acts (the USA PATRIOT Act of 2001) and in the concept of rogue states. Rogue states therefore provide test cases to clarify the breadth, depth, and consequence of the US ASA in world affairs vis-à-vis the relationship between US security and global security.

Relevance: 30.00%

Abstract:

A nuclear waste stream is the complete flow of waste material from origin to treatment facility to final disposal. The objective of this study was to design and develop a Geographic Information System (GIS) module, using the Google Maps Application Programming Interface (API), for better visualization of nuclear waste streams; the module identifies and displays various nuclear waste stream parameters. A proper display of these parameters enables managers at Department of Energy waste sites to visualize the information needed for proper planning of waste transport. The study also developed an algorithm using quadratic Bézier curves to make the map more understandable and usable. Microsoft Visual Studio 2012 and Microsoft SQL Server 2012 were used for the implementation of the project. The study has shown that the combination of several technologies can successfully provide dynamic mapping functionality. Future work should explore additional Google Maps API functionality to further enhance the visualization of nuclear waste streams.
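The quadratic Bézier curve mentioned above has the closed form B(t) = (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2 for t in [0, 1]. The sketch below simply evaluates that formula to generate the points of an arc between two map locations; the coordinates and the choice of control point are hypothetical, and the study's specific offset scheme is not reproduced.

```python
# Evaluate a quadratic Bézier curve: P0 and P2 are the endpoints (e.g. origin and
# treatment facility); P1 is a control point offset from the midpoint so that
# overlapping routes do not collapse into a single straight line.
def quad_bezier(p0, p1, p2, steps=32):
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts

# Hypothetical (lat, lng) endpoints with the control point nudged sideways.
arc = quad_bezier((25.76, -80.19), (27.5, -82.5), (30.44, -84.28))
```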

Relevance: 30.00%

Abstract:

The purpose of this study was to test Lotka's law of scientific publication productivity, using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka's law has been tested sporadically in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers. A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%), respectively. The Kolmogorov-Smirnov (K-S) one-sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value was 0.022758 and the calculated critical value was 0.026562, so the null hypothesis, stating that there is no difference between the observed distribution of publications and the distribution obtained using Lotka's and Pao's procedure, could not be rejected. This study finds that the literature in the field of Library and Information Studies does conform to Lotka's law, with reliable results. As a result, Lotka's law can be used in LIS as a standardized means of measuring author publication productivity, which will lead to findings that are comparable on many levels (e.g., departmental, institutional, national). Lotka's law can be employed as an empirically proven analytical tool to establish publication-productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure, and promotion process of academic librarians.
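For reference, Lotka's law predicts that the proportion of authors contributing exactly x publications is f(x) = c / x^n, with c chosen so the proportions sum to one. The sketch below reproduces the normalization for the study's reported exponent n = 2.1 by plain summation; Pao's procedure estimates c with a truncated sum plus correction terms, so the result differs slightly from the reported 0.6418.

```python
# Normalizing constant and predicted author proportions under Lotka's law with
# the study's exponent n = 2.1 (plain summation rather than Pao's truncated-sum
# approximation, which the study itself followed).
n = 2.1
c = 1.0 / sum(1.0 / x ** n for x in range(1, 100_000))
print(round(c, 4))                      # about 0.641, close to the reported 0.6418

predicted = {x: c / x ** n for x in range(1, 6)}
# e.g. roughly 64% of authors contribute one paper, about 15% contribute two, etc.
```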

Relevance: 30.00%

Abstract:

Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. Under this accuracy-oriented conception, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony. This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free-narrative response was elicited or a series of yes/no or cued questions was asked. Instructions provided by the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and less precise responses. Participants in the free-narrative condition were the most accurate. Participants in the cued-questions condition provided the most precise responses. Participants in the yes/no-questions condition were most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said “I don’t know” to maintain accuracy. However, when both withholding responses and adjusting precision were possible, people used both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.

Relevance: 30.00%

Abstract:

The integrated project delivery (IPD) method has recently emerged as an alternative to traditional delivery methods. It has the potential to overcome the inefficiencies of traditional delivery methods by enhancing collaboration among project participants. Information and communication technology (ICT) facilitates IPD through effective management, processing, and communication of information within and among organizations. While the benefits of IPD, and the role of ICT in realizing them, have been generally acknowledged, the US public construction sector has been very slow to adopt IPD. The reasons are a lack of experience with and an inadequate understanding of IPD among public owners, as confirmed by the results of the questionnaire survey conducted under this research study. The public construction sector should be aware of the value of IPD and should know the essentials for effective implementation of IPD principles; in particular, it should be cognizant of the opportunities offered by advancements in ICT to realize them. To address this need, an IPD Readiness Assessment Model (IPD-RAM) was developed in this research study. The model was designed to determine the IPD readiness of a public owner organization based on selected IPD principles and the ICT levels at which project functions were carried out. Subsequent analysis led to the identification of possible improvements in ICT that have the potential to increase IPD readiness scores. Termed gap identification, this process was used to formulate improvement strategies. The model was applied to six Florida International University (FIU) construction projects (case studies). The results showed that the IPD readiness of the organization was considerably low and that several project functions could be improved by using higher-level and/or more advanced ICT tools and methods. Feedback from a focus group composed of FIU officials and an independent group of experts was received at various stages of this research and was used during the development and implementation of the model. The focus group input was also helpful for validating the model and its results. It is hoped that the model will be useful to construction owner organizations in assessing their IPD readiness and identifying appropriate ICT improvement strategies.
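The abstract does not specify how IPD-RAM combines principles and ICT levels into a score, so the following is only a hypothetical illustration of the gap-identification idea: compare the current ICT level of each project function with a target level, flag the shortfalls, and aggregate a weighted readiness score. All functions, levels, and weights below are invented.

```python
# Hypothetical gap-identification sketch (not the actual IPD-RAM scoring scheme):
# per-function ICT levels are compared against targets and rolled up into a score.
project_functions = {
    # function: (current ICT level, target ICT level, weight)
    "design review":         (2, 4, 0.4),
    "cost estimating":       (3, 4, 0.3),
    "schedule coordination": (1, 3, 0.3),
}

def readiness_and_gaps(functions):
    score = sum(w * cur / tgt for cur, tgt, w in functions.values())
    gaps = {f: tgt - cur for f, (cur, tgt, w) in functions.items() if cur < tgt}
    return round(score, 2), gaps

print(readiness_and_gaps(project_functions))
# -> a low overall score plus a per-function list of ICT upgrades to prioritize
```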