6 results for Human-computer systems
in Digital Commons - Michigan Tech
Abstract:
Geospatial information is now demanded in near real time, requiring data to be processed and made available to users faster than ever. To keep pace with this ever-increasing speed, analysts must find ways to raise their productivity. At the same time, demand for new analysts is high, and current training methods are lengthy and can be costly. Through the use of human-computer interaction and basic networking systems, this paper explores new ways to increase efficiency in data processing and analyst training.
Abstract:
Among proficient daily computer users, some can flexibly accomplish unfamiliar tasks on their own while others have difficulty. Software designers and evaluators involved with Human-Computer Interaction (HCI) should account for any group of proficient daily users shown to stumble over unfamiliar tasks. We define "Just Enough" (JE) users as proficient daily computer users with a predominantly extrinsic motivation style who know just enough to get what they want or need from the computer. We hypothesize that JE users have difficulty with unfamiliar computer tasks and skill transfer, whereas intrinsically motivated daily users accomplish unfamiliar tasks readily. Intrinsic motivation is characterized by interest, enjoyment, and choice, whereas extrinsic motivation is externally regulated. In our study we identified users by motivation style and then conducted ethnographic observations. Our results confirm that JE users had difficulty accomplishing unfamiliar tasks on their own but had fewer problems with near skill transfer. In contrast, intrinsically motivated users had no trouble with unfamiliar tasks or with near skill transfer. This supports our assertion that JE users know enough to get routine tasks done and can transfer that knowledge, but become unproductive when faced with unfamiliar tasks. The study combines quantitative and qualitative methods. We identified 66 daily users by motivation style using an inventory adapted from Deci and Ryan (Ryan and Deci 2000) and from Guay, Vallerand, and Blanchard (Guay et al. 2000). We then used qualitative ethnographic methods with a think-aloud protocol to observe nine extrinsic users and seven intrinsic users. Observation sessions had three customized phases in which the researcher directed the participant in order to: 1) confirm the participant's proficiency; 2) observe the participant accomplishing unfamiliar tasks; and 3) test transfer of existing skills to unfamiliar software.
Abstract:
In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students gain experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software, which inspired the addition of unit testing experiences to the course. However, we learned that significant preparation must be made to apply the ELM when students are resistant. The research presented through these courses was driven by recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests that students need hands-on experience when being introduced to testing concepts. We learned that experiential learning, when appropriately implemented, can benefit the Computer Science classroom. Examined together, these course-based research projects provide insight into building strong testing practices into a curriculum.
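As an illustration of the kind of unit test such reports summarize, the sketch below shows a minimal JUnit 5 test; the Calculator class and its add method are hypothetical stand-ins for a student submission, not part of the JUG tool itself.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    // Hypothetical class under test; stands in for a student's submission.
    class Calculator {
        int add(int a, int b) {
            return a + b;
        }
    }

    // A minimal JUnit 5 test of the kind a generated unit test report summarizes.
    class CalculatorTest {
        @Test
        void addCombinesPositiveOperands() {
            assertEquals(5, new Calculator().add(2, 3));
        }

        @Test
        void addHandlesNegativeOperands() {
            assertEquals(-1, new Calculator().add(2, -3));
        }
    }

Each passing or failing assertion becomes one line of feedback in the report, which is what makes automated re-runs cheap enough to support repeated ELM iterations.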
Abstract:
Planning, navigation, and search are fundamental human cognitive abilities central to spatial problem solving in search and rescue, law enforcement, and military operations. Despite a wealth of literature on naturalistic spatial problem solving in animals, research on naturalistic spatial problem solving in humans is comparatively sparse and is generally conducted by separate camps with little crosstalk between them. Addressing this deficiency will allow us to predict spatial decision making in operational environments and to understand the factors leading to those decisions. The present dissertation comprises two related efforts: (1) a set of empirical research studies intended to identify characteristics of planning, execution, and memory in naturalistic spatial problem solving tasks, and (2) a computational modeling effort to develop a model of naturalistic spatial problem solving. The results of the behavioral studies indicate that hierarchical representations of the problem space are linear in shape and that human solutions are produced according to multiple optimization criteria. The Mixed Criteria Model presented in this dissertation accounts for global and local human performance in a traditional and a naturalistic Traveling Salesman Problem. The results of the empirical and modeling efforts hold implications for basic and applied science in domains such as problem solving, operations research, human-computer interaction, and artificial intelligence.
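For context on the Traveling Salesman Problem referenced above, the sketch below shows the classic nearest-neighbor construction heuristic, which optimizes a single greedy criterion (shortest next hop). It is an assumed textbook baseline for comparison, not the dissertation's Mixed Criteria Model.

    import java.util.ArrayList;
    import java.util.List;

    // Nearest-neighbor heuristic for the Traveling Salesman Problem:
    // from the current city, always visit the closest unvisited city.
    // A single-criterion baseline, not the Mixed Criteria Model.
    public class NearestNeighborTsp {

        static List<Integer> tour(double[][] dist) {
            int n = dist.length;
            boolean[] visited = new boolean[n];
            List<Integer> tour = new ArrayList<>();
            int current = 0;          // start at city 0
            visited[0] = true;
            tour.add(0);
            for (int step = 1; step < n; step++) {
                int next = -1;
                for (int city = 0; city < n; city++) {
                    if (!visited[city]
                            && (next == -1 || dist[current][city] < dist[current][next])) {
                        next = city;
                    }
                }
                visited[next] = true;
                tour.add(next);
                current = next;
            }
            return tour;              // visiting order; returning to city 0 closes the tour
        }

        public static void main(String[] args) {
            double[][] dist = {
                {0, 2, 9, 10},
                {2, 0, 6, 4},
                {9, 6, 0, 3},
                {10, 4, 3, 0}
            };
            System.out.println(tour(dist)); // prints [0, 1, 3, 2]
        }
    }

Human solvers, per the abstract, deviate from such single-criterion greedy behavior by trading off multiple optimization criteria at once.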
Abstract:
The dissertation titled "Driver Safety in Far-side and Far-oblique Crashes" presents a novel approach to assessing vehicle cockpit safety by integrating Human Factors and Applied Mechanics. The methodology is aimed at improving safety in compact mobile workspaces such as patrol vehicle cockpits. A statistical analysis of the State of Michigan's traffic crash data, performed to assess contributing factors affecting the risk of severe driver injury, showed that the risk was greater for unrestrained drivers (OR=3.38, p<0.0001) and for front and far-side crashes without seatbelts (OR=8.0 and 23.0 respectively, p<0.005). The statistics also showed that near-side and far-side crashes pose a similar threat to driver injury severity. A human factors survey was conducted to assess Human-Machine/Human-Computer Interaction aspects of patrol vehicle cockpits; results showed that tasks requiring manual operation, especially use of the laptop, demand more attention and can cause more distraction. A vehicle survey conducted to evaluate ergonomics-related issues revealed that some equipment was mounted in airbag deployment zones. In addition, experiments were conducted to assess the effects of changing the position of in-car accessories on driver distraction. A driving simulator study mimicked HMI/HCI in a patrol vehicle cockpit (20 subjects, average driving experience = 5.35 years, s.d. = 1.8). The mounting locations of manual tasks did not produce a significant change in response times; visual displays yielded response times under 1.5 s. The manual task was equally distracting regardless of mounting position (average response time 15 s), and average speeds and lane deviations showed no significant effects. Data from 13 full-scale sled tests simulating far-side impacts at 70 PDOF and 40 PDOF were used to analyze head injuries and HIC/AIS values. Accelerations generated by the vehicle deceleration alone were high enough to cause AIS 3 - AIS 6 injuries. Pretensioners could mitigate injuries only in 40 PDOF (oblique) impacts and were ineffective in 70 PDOF impacts, and seat belts were ineffective in protecting the driver's head from injury. The head would contact the laptop during a far-oblique (40 PDOF) crash and the far-side door in an angle-type (70 PDOF) crash. Finite element analysis of the head-laptop impact interaction showed that contact velocity was the most crucial factor in causing a severe (and potentially fatal) head injury. The results indicate that no equipment should be mounted in driver trajectory envelopes, leaving only a very narrow band of space in patrol vehicles where manual-task equipment can be installed both safely and ergonomically. In the event of contact, the material's stiffness and damping properties play a significant role in determining the injury outcome. Future work may improve the interior's material properties to better absorb and dissipate the kinetic energy of the head; the design of seat belts and pretensioners is another essential aspect for further improvement.
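For reference, odds ratios (OR) of the kind reported above are conventionally computed from a 2x2 contingency table; the counts a, b, c, d below are generic placeholders, not figures from the study.

    % 2x2 contingency table (generic counts, not study data):
    %                     severe injury    no severe injury
    %   factor present         a                  b
    %   factor absent          c                  d
    \[
      \mathrm{OR} \;=\; \frac{a/b}{c/d} \;=\; \frac{a\,d}{b\,c}
    \]

An OR above 1 (e.g., OR=3.38 for unrestrained drivers) means the odds of severe injury are that many times higher when the factor is present.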
Abstract:
File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything takes the form of a file. To protect system files and other sensitive user files from unauthorized access, organizations choose and deploy particular security schemes in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies that focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about “who” can access an object, but also about “how” a subject can access an object. To enforce the security policies, each access request is checked against the specified policies and either allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the system's basic access control scheme, which includes permission bits, the setuid and seteuid mechanisms, and the root account, other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), are supported and used in certain organizations. These models protect the confidentiality of data directly; the integrity of data is protected indirectly, by allowing only trusted users to operate on the objects. The access control decisions of these models depend on either the identity of the user or the attributes of the process the user can execute, together with the attributes of the objects. Adoption of these sophisticated models has been slow, likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model, the file system firewall, which adapts the familiar network firewall model, used to control data flowing between networked computers, to file system protection. This model can base access control decisions on any system-generated attribute of an access request, e.g., the time of day. Access control decisions are not tied to a single entity, such as the account in traditional discretionary access control or the domain in DTE; in the file system firewall, decisions are made on situations spanning multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system, and the file system firewall specifies the appropriate action for each situation. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the firewall model's ease of use: when the firewall is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This, together with the model's familiarity to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
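To make the situation-and-action structure concrete, the sketch below models a firewall-style access decision over subject, object, and system attributes; all names and the time-of-day rule are illustrative assumptions, not the SUSE Linux prototype's actual interface.

    import java.time.LocalTime;
    import java.util.List;
    import java.util.function.Predicate;

    // Illustrative model of a file-system-firewall rule: an access request is
    // allowed or denied based on predicates over subject (user), object (path),
    // and system (time of day) attributes. Names are hypothetical; this is not
    // the SUSE Linux prototype's interface.
    public class FsFirewallSketch {

        record Request(String user, String path, String op, LocalTime time) {}

        // A "situation" (predicate over request attributes) paired with an action.
        record Rule(Predicate<Request> situation, boolean allow) {}

        // First matching rule wins; default deny, as in network firewalls.
        static boolean decide(List<Rule> rules, Request req) {
            for (Rule rule : rules) {
                if (rule.situation().test(req)) {
                    return rule.allow();
                }
            }
            return false;
        }

        public static void main(String[] args) {
            List<Rule> rules = List.of(
                // Deny writes to /etc outside business hours, regardless of user.
                new Rule(r -> r.path().startsWith("/etc")
                                && r.op().equals("write")
                                && (r.time().isBefore(LocalTime.of(8, 0))
                                    || r.time().isAfter(LocalTime.of(18, 0))), false),
                // Allow admins to read anything.
                new Rule(r -> r.user().equals("admin") && r.op().equals("read"), true)
            );
            Request late = new Request("admin", "/etc/passwd", "write", LocalTime.of(23, 0));
            System.out.println(decide(rules, late)); // false: denied outside business hours
        }
    }

Note how the deny decision here depends on a system attribute (time of day) rather than on any one identity, which is the distinction the abstract draws against account-based DAC and domain-based DTE.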