16 results for Flow of information
at Digital Commons at Florida International University
Abstract:
The primary purpose of this research is to study the linkage between perceived job design characteristics and information system environment characteristics before and after the replacement of a legacy information system with a new type of information system (referred to as an Enterprise Resource Planning, or ERP, system). A public state university implementing an academic version of an ERP system was selected for the study. Three survey instruments were used to examine perceptions of the information system, the job characteristics, and the organizational culture before and after the system implementation. The research participants were drawn from two large departments, resulting in a sample of 130 workers. Research questions were analyzed using multivariate procedures including factor analysis, path analysis, stepwise regression, and matched-pair analysis. Results indicated that the ERP system introduced new elements into the working environment that have changed how workers perceive the job design characteristics and organizational culture dimensions. The understanding of how the perceived system characteristics align with an individual's perceived job design characteristics is supported by the finding that each of the system characteristics correlated significantly in the proposed direction. Stronger support for this relationship is visible in the causal flow of effects seen in the path diagram and in the stepwise regression. The alignment of perceived job design characteristics with dimensions of organizational culture is not as strong as the literature suggests. Although there are significant correlations between the job and culture variables, only one relationship can be seen in the causal flow. This research has demonstrated that the system characteristics of ERP do contribute to the perception of change in an organization and do support organizational culture behaviors and job characteristics.
Abstract:
Outline detailing the planning and implementation process for the establishment of the College of Medicine's Office of Information Technology. Includes a timeline of the Office of IT's development, a strategic roadmap, and a technology analysis.
Abstract:
The ultimate intent of this dissertation was to broaden and strengthen our understanding of IT implementation by emphasizing research efforts on the dynamic nature of the implementation process. More specifically, efforts were directed toward opening the "black box" and providing the story that explains how and why contextual conditions and implementation tactics interact to produce project outcomes. In pursuit of this objective, the dissertation was aimed at theory building and adopted a case study methodology combining qualitative and quantitative evidence. Specifically, it examined the implementation process, use, and consequences of three clinical information systems at Jackson Memorial Hospital, a large tertiary care teaching hospital. As a preliminary step toward the development of a more realistic model of system implementation, the study proposes a new set of research propositions reflecting the dynamic nature of the implementation process. Findings clearly reveal that successful implementation projects are likely to be those where key actors envision end goals, anticipate challenges ahead, and recognize and seize opportunities. It was also found that IT implementation is characterized by the systems-theory concept of equifinality; that is, there are likely several equally effective ways to achieve a given end goal. The selection of a particular implementation strategy appears to be a rational process in which actions and decisions are largely influenced by the degree to which key actors recognize the mediating role of each tactic and are motivated to action. The nature of the implementation process is also characterized by the concept of "duality of structure"; that is, context and actions mutually influence each other. Another key finding suggests that there is no underlying program that regulates the process of change and moves it from one given point toward a subsequent and already prefigured end.
For this reason, the implementation process cannot be thought of as a series of activities performed in a sequential manner, as conceived in stage models. Finally, it was found that IT implementation is punctuated by a certain indeterminacy. Results suggest that only when substantial efforts are focused on what to look for and think about is it less likely that unfavorable and undesirable consequences will occur.
Abstract:
The search-experience-credence framework from economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure), and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study of the role and effects of information in service physical environments. Using a 2 x 2 between-subjects experimental research procedure, undergraduate students were exposed to the informational signs in a simulated service physical environment.
The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk, and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.
Abstract:
A study was conducted to investigate the effectiveness, as measured by performance on course posttests, of mindmapping versus traditional notetaking in a corporate training class. The purpose of this study was to increase knowledge concerning the effectiveness of mindmapping as an information encoding tool to enhance the effectiveness of learning. Corporations invest billions of dollars annually in training programs. Given this increased demand for effective and efficient workplace learning, continual reliance on traditional notetaking is questionable for the high-speed and continual learning required of workers. An experimental, posttest-only control group design was used to test the following hypotheses: (1) there is no significant difference in posttest scores on an achievement test, administered immediately after the course, between adult learners using mindmapping versus traditional notetaking methods in a training lecture, and (2) there is no significant difference in posttest scores on an achievement test, administered 30 days after the course, between adult learners using mindmapping versus traditional notetaking methods in a training lecture. After a 1.5-hour instruction on mindmapping, the treatment group used mindmapping throughout the course. The control group used traditional notetaking. T-tests were used to determine whether mean posttest scores differed significantly between the two groups.
In addition, an attitudinal survey, a brain hemisphere dominance survey, course dynamics observations, and course evaluations were used to investigate preference for mindmapping, its perceived effect on test performance, and the effectiveness of mindmapping instruction. This study's principal finding was that although the mindmapping group did not score significantly higher than the traditional notetaking group on posttests administered immediately and 30 days after the course, the mindmapping group did score higher on both posttests and reported higher ratings of the course on every evaluation criterion. Less-educated, right-brain-dominant learners reported a significantly positive learning experience. These results suggest that mindmapping enhances and reinforces the preconditions of learning. Recommendations for future study are provided.
Abstract:
The complexity of many organizational tasks requires perspectives, expertise, and talents that are often not found in a single individual. Organizations have therefore been placing employees into groups, assigning them to tasks they would formerly have undertaken individually. The use of these groups, known as workgroups, has become an important strategy for managing this increased complexity. Empirical research on participative budgeting, however, has been limited almost exclusively to individuals. This dissertation empirically examines the effect of the information that management and workgroups have about group members' performance capabilities on the work standards that workgroups select during the participative budgeting process. A laboratory experiment was conducted in which two hundred forty undergraduate business students were randomly assigned to three-member groups. The study provides empirical evidence which suggests that when management is unaware of group members' performance capabilities, workgroups select higher work standards and have higher performance levels than when management is aware of their performance capabilities.
Abstract:
Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. When using this accuracy-oriented conception of memory, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes, and those who evaluate witness testimony. This research examined the amount of information provided, accuracy, and precision of responses provided during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that a single free narrative response was elicited, or a series of either yes/no or cued questions were asked. Instructions provided by the interviewer indicated to the participants that they should either stress being informative, or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, after a delay participants provided less information and responses that were less precise. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, such as in the yes/no condition, people said “I don’t know” to maintain accuracy.
However, when withholding responses and adjusting precision were both possible, people utilized both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.
Abstract:
In response to the recent wide-scale applications of Information Technology (IT) in the hospitality industry, this study analyzed articles in leading hospitality research journals, including the International Journal of Hospitality Management, Cornell Hotel and Restaurant Administration Quarterly, and the Journal of Hospitality & Tourism Research, published in the period 1985 to 2004. A total of 1,896 full-length papers were published in these journals during the study period. Excluding book reviews, research notes, and comments from editors and readers, 130 full-length IT-related papers were identified. These papers were then grouped into six defined categories of IT. The findings revealed that during the entire study period, the largest number of publications was in general business applications, whereas the highest growth rate from the first decade to the second was in articles on networking.
Abstract:
Successful introduction of information technology applications in various operations of hotel management is vital to most service firms. In recent decades, technologies of information, automation, and communication have been increasingly recognized as essential components of a hotel company’s strategic plan. In this study, 62 super-deluxe hotels (5 star), deluxe hotels (4 star), and tourist hotels (3 star) in Korea are examined for differences in the impact of information technology services on guests’ satisfaction, guest convenience, and operational efficiency. The findings generally suggest that the impacts of information technology-enhanced services vary according to the category of hotels in Korea. The results of the study are expected to assist managers in the selection and implementation of information technology systems in their hotels.
Abstract:
The technologies that empower biometrics have been around for a number of years, but until recently these technologies have been viewed as exotic. In the not too distant future biometrics will be used to regulate internal processes and to improve services in the hospitality and tourism industries. This paper provides an understanding of the current use of biometrics in general and its practical value for the future in hospitality and tourism. The study presents a review of current practices of biometrics with special reference to the hospitality and tourism businesses, addresses key issues imposed by this technology, and identifies business and marketing implications for these industries.
Abstract:
Near infrared spectroscopy (NIRS) is an emerging non-invasive optical neuroimaging technique that monitors the hemodynamic response to brain activation with ms-scale temporal resolution and sub-cm spatial resolution. The overall goal of my dissertation was to develop and apply NIRS toward investigation of the neurological response to language, joint attention, and planning and execution of motor skills in healthy adults. Language studies were performed to investigate the hemodynamic response, synchrony, and dominance features of the frontal and fronto-temporal cortex of healthy adults in response to language reception and expression. The mathematical model developed based on Granger causality explicated the directional flow of information during the processing of language stimuli by the fronto-temporal cortex. Joint attention and planning/execution of motor skill studies were performed to investigate the hemodynamic response, synchrony, and dominance features of the frontal cortex of healthy adults, as well as in children (5-8 years old) with autism (for the joint attention studies) and individuals with cerebral palsy (for the planning/execution of motor skills studies). The joint attention studies on healthy adults showed differences in activation as well as intensity- and phase-dependent connectivity in the frontal cortex during joint attention in comparison to rest. The joint attention studies on typically developing children showed differences in frontal cortical activation in comparison to that in children with autism. The planning and execution of motor skills studies on healthy adults and individuals with cerebral palsy (CP) showed differences in frontal cortical dominance, that is, bilateral and ipsilateral dominance, respectively.
The planning and execution of motor skills studies also demonstrated the plastic and learning behavior of the brain, wherein a correlation was found between the relative change in total hemoglobin in the frontal cortex and the kinematics of the activity performed by the participants. Thus, during my dissertation the NIRS neuroimaging technique was successfully implemented to investigate the neurological response to language, joint attention, and planning and execution of motor skills in healthy adults, as well as preliminarily in children with autism and individuals with cerebral palsy. These NIRS studies have long-term potential for the design of early-stage interventions in children with autism and customized rehabilitation in individuals with cerebral palsy.
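The directional "flow of information" estimated in the language studies above rests on the core idea of Granger causality: a signal x is said to Granger-cause y if x's past improves the prediction of y beyond what y's own past provides. A minimal numpy sketch of that idea, not the dissertation's actual model (the `granger_stat` function, the lag choice, and the synthetic signals are all illustrative assumptions):

```python
import numpy as np

def granger_stat(x, y, lag=2):
    """Granger-style statistic: log of the residual variance when predicting
    y from its own past, divided by the residual variance when x's past is
    added. Positive values suggest a directional flow of information x -> y."""
    n = len(y)
    target = y[lag:]
    own = np.array([y[t - lag:t] for t in range(lag, n)])    # y's own history
    both = np.hstack([own, np.array([x[t - lag:t] for t in range(lag, n)])])

    def resid_var(design):
        X = np.hstack([design, np.ones((len(design), 1))])   # intercept column
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        r = target - X @ beta
        return r @ r / len(r)

    return np.log(resid_var(own) / resid_var(both))

# Synthetic demo: y is driven by x's past, so x -> y should dominate y -> x.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_stat(x, y), granger_stat(y, x))
```

In practice, NIRS analyses fit such autoregressive models per channel pair and assess significance with an F-test; the log variance ratio above is only the simplest form of the statistic.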
Abstract:
Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output, given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel. This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system. We explore this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels. We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone.
These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when a channel is always at least as secure as another from a leakage point of view.
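The channel model described above can be made concrete. A channel is a stochastic matrix C whose entry C[x, y] gives the probability of output y on input x; min-entropy leakage compares the adversary's best single guess at the secret before and after observing the output; and a cascade is simply the matrix product of two channels. A numpy sketch of these standard definitions (the channel matrices and uniform prior here are arbitrary illustrations, not examples from the thesis):

```python
import numpy as np

def min_entropy_leakage(prior, C):
    """Min-entropy leakage of channel C (rows: inputs, cols: outputs) under
    a prior on inputs: log2(posterior vulnerability / prior vulnerability)."""
    prior = np.asarray(prior, dtype=float)
    v_prior = prior.max()                # best a-priori guess at the secret
    joint = prior[:, None] * C           # joint distribution p(x, y)
    v_post = joint.max(axis=0).sum()     # best guess per output, averaged
    return np.log2(v_post / v_prior)

# Hypothetical channel matrices (rows sum to 1), purely illustrative.
A = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
B = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9]])
prior = np.array([1/3, 1/3, 1/3])

cascade = A @ B                          # A's output fed as B's input
lA = min_entropy_leakage(prior, A)
lAB = min_entropy_leakage(prior, cascade)
print(lA, lAB)                           # the cascade leaks no more than A
```

The final comparison illustrates numerically, for this one example, the kind of cascade upper bound the thesis proves in general: post-processing A's output through B cannot increase the leakage.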
Abstract:
Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally-adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system. This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce. The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs. Hence, the maximum leakage can be bounded. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns.
It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
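The reduction mentioned above can be illustrated directly: for a deterministic program, the maximum leakage (the channel capacity, in bits) is log2 of the number of distinct outputs the program can produce. A small sketch of that counting step, where the toy `program` is a hypothetical stand-in rather than a case study from the dissertation (the dissertation bounds this count via two-bit patterns instead of brute-force enumeration):

```python
import math

def program(secret: int) -> int:
    """Hypothetical deterministic program over an 8-bit secret: its output
    reveals only the secret's top two bits and its lowest bit."""
    return ((secret >> 6) << 1) | (secret & 1)

# For a deterministic program, maximum leakage (capacity, in bits) equals
# log2 of the number of distinct outputs over all possible inputs.
outputs = {program(s) for s in range(256)}
capacity_bits = math.log2(len(outputs))
print(len(outputs), capacity_bits)
```

Here the program exposes three bits of the secret, so it has 8 distinct outputs and a capacity of 3.0 bits; the two-bit-pattern analysis yields an upper bound on this count without enumerating the input space.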