2 results for Sub-seafloor modeling
in DRUM (Digital Repository at the University of Maryland)
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, and the web. Authentication aims to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. As a result, malicious or unauthorized users may gain access to the system. It is therefore prudent to have a mechanism in place that detects whether the logged-in user is still the one in control of the session. Highly secure authentication methods must therefore be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make users unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" may indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as captured in the web logs generated in response to those actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
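The abstract does not include the Intruder Detector implementation, so the following is only a minimal sketch of the n-gram idea it describes: a bigram model is trained on one user's historical action sequence, and new sessions are scored by average negative log-probability, with high scores flagging atypical behavior. The action names, the add-one smoothing choice, and the scoring function are illustrative assumptions, not the dissertation's method.

```python
import math
from collections import defaultdict

class ActionNGramModel:
    """Bigram model over a user's action sequence, with add-one smoothing.
    Illustrative sketch only; not the Intruder Detector implementation."""

    def __init__(self, n=2):
        self.n = n
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> next action -> count
        self.context_totals = defaultdict(int)               # context -> total count
        self.vocab = set()

    def train(self, actions):
        # Count each n-gram: (context of n-1 actions) -> next action.
        for i in range(len(actions) - self.n + 1):
            context = tuple(actions[i:i + self.n - 1])
            nxt = actions[i + self.n - 1]
            self.counts[context][nxt] += 1
            self.context_totals[context] += 1
        self.vocab.update(actions)

    def prob(self, context, action):
        # Add-one (Laplace) smoothing so unseen actions keep non-zero mass.
        v = max(len(self.vocab), 1)
        return (self.counts[context][action] + 1) / (self.context_totals[context] + v)

    def anomaly_score(self, actions):
        """Average negative log-probability of a session; higher = more atypical."""
        if len(actions) < self.n:
            return 0.0
        total, steps = 0.0, 0
        for i in range(len(actions) - self.n + 1):
            context = tuple(actions[i:i + self.n - 1])
            total += -math.log(self.prob(context, actions[i + self.n - 1]))
            steps += 1
        return total / steps

# Hypothetical usage: train on one user's past web-log actions, then score a
# live session; a score above a tuned threshold would trigger review.
model = ActionNGramModel(n=2)
model.train(["login", "open_report", "filter", "export", "logout"])
print(model.anomaly_score(["login", "delete_user", "delete_user", "logout"]))
```

In practice such a threshold would be tuned per user or per role, which mirrors the abstract's coarse-grain (role) versus fine-grain (individual) distinction.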
Abstract:
Is fairness in process and outcome a generalizable driver of police legitimacy? In many industrialized nations, studies have demonstrated that police legitimacy is largely a function of whether citizens perceive treatment as normatively fair and respectful. Questions remain whether this model holds in less-industrialized contexts, where corruption and security challenges favor instrumental preferences for effective crime control and prevention. Evidence both for and against the normative model of legitimacy has been found in less-industrialized countries, yet few studies have simultaneously compared these models across multiple industrializing countries. Using a multilevel framework and data from respondents in 27 countries in sub-Saharan Africa (n~43,000), I find evidence that both instrumental and normative influences shape perceptions of police legitimacy. More importantly, the internal consistency of legitimacy (defined as obligation to obey, moral alignment, and perceived legality of the police) varies considerably from country to country, suggesting that the relationships between legality, morality, and obligation operate differently across contexts. Results are robust to a number of different modeling assumptions and alternative explanations. Overall, the results indicate that both fairness and effectiveness matter, though not in all places, and in some cases in ways contrary to theoretical expectations.
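The abstract does not give the estimation details, so the sketch below is only a loose illustration of what a multilevel framework with respondents nested in countries can look like: a random-intercept mixed model fit with statsmodels on synthetic stand-in data. The variable names (fairness, effectiveness, legitimacy), the coefficients, and the data itself are all assumptions for illustration, not the study's measures or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic stand-in data: respondents nested in 27 countries.
n_countries, n_per = 27, 200
country = np.repeat(np.arange(n_countries), n_per)
country_intercept = rng.normal(0, 0.5, n_countries)[country]  # country-level variation
fairness = rng.normal(0, 1, n_countries * n_per)
effectiveness = rng.normal(0, 1, n_countries * n_per)
legitimacy = (0.5 * fairness + 0.3 * effectiveness
              + country_intercept + rng.normal(0, 1, n_countries * n_per))

df = pd.DataFrame({"country": country, "fairness": fairness,
                   "effectiveness": effectiveness, "legitimacy": legitimacy})

# Random-intercept model: fixed effects for fairness and effectiveness,
# a random intercept per country. Adding re_formula="~fairness" would let
# the fairness effect itself vary by country, which is the kind of
# cross-country variation the abstract emphasizes.
model = smf.mixedlm("legitimacy ~ fairness + effectiveness",
                    df, groups=df["country"])
result = model.fit()
print(result.summary())
```

Comparing the fixed effects against the country-level variance components is one way such a design can separate "fairness and effectiveness matter on average" from "their effects differ across contexts."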