920 results for Computer System Management
Abstract:
In his article - Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors - William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: “The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools.” At the time of this writing, O’Brien would have you know that, contrary to what some people might think, the computer revolution is not over; it is just beginning, still an embryo. Computer technology will only continue to develop and expand, says O’Brien with citation. “A complacent few of us who feel ‘we have survived the computer revolution’ will miss opportunities as a new wave of technology moves through the hospitality industry,” says O’Brien. “Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation,” is his informed opinion. Property managers who embrace rather than eschew innovation - in this case computer technology - will benefit greatly from it, O’Brien says. “The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology,” he advises. On the vendor side of the equation, O’Brien observes, “Computer-wise hospitality managers want systems which are easier and more profitable to operate.
Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…” he says. O’Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has effectively morphed into a software revolution, O’Brien notes, “…recognizing that a computer is nothing but a box in which programs run.” Some of the empirical data in this article is dated by now, but its core philosophy - that technology keeps advancing and that properties should continually tap current knowledge - remains sound.
Abstract:
Because of their limited number of senior positions and fewer alternative career paths, small businesses have a more difficult time attracting and retaining skilled information systems (IS) staff and are thus dependent upon external expertise. Small businesses are particularly dependent on outside expertise when first computerizing. Because small businesses suffer from severe financial constraints, it is often difficult to justify the cost of custom software. Hence, for many small businesses, engaging a consultant to help identify suitable packaged software and related hardware is their first critical step toward computerization. This study explores the importance of proactive client involvement when engaging a consultant to assist with computer system selection in small businesses. Client involvement throughout the consultant engagement is found to be integral to project success, yet it is frequently lacking because small businesses misconceive their role. Small businesses often overestimate the impact of consultant and vendor support in achieving successful computer system selection and implementation. For consultant engagement to be successful, the process must be viewed as directed toward the achievement of specific organizational results, with the client accepting responsibility for the direction of the process.
Abstract:
This thesis is an investigation into the nature of data analysis and computer software systems which support this activity.
The first chapter develops the notion of data analysis as an experimental science which has two major components: data-gathering and theory-building. The basic role of language in determining the meaningfulness of theory is stressed, and the informativeness of a language and data base pair is studied. The static and dynamic aspects of data analysis are then considered from this conceptual vantage point. The second chapter surveys the available types of computer systems which may be useful for data analysis. Particular attention is paid to the questions raised in the first chapter about the language restrictions imposed by the computer system and its dynamic properties.
The third chapter discusses the REL data analysis system, which was designed to satisfy the needs of the data analyzer in an operational relational data system. The major limitation on the use of such systems is the amount of access to data stored on a relatively slow secondary memory. This problem of the paging of data is investigated and two classes of data structure representations are found, each of which has desirable paging characteristics for certain types of queries. One representation is used by most of the generalized data base management systems in existence today, but the other is clearly preferred in the data analysis environment, as conceptualized in Chapter I.
This data representation has strong implications for a fundamental process of data analysis -- the quantification of variables. Since quantification is one of the few means of summarizing and abstracting, data analysis systems are under strong pressure to facilitate the process. Two implementations of quantification are studied: one analogous to the form of the lower predicate calculus and another more closely attuned to the data representation. A comparison of these indicates that the use of the "label class" method results in an orders-of-magnitude improvement over the lower predicate calculus technique.
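The contrast drawn above can be illustrated in miniature. The following sketch (hypothetical names and data, not the thesis's REL implementation) compares a per-record predicate scan, in the spirit of the lower predicate calculus, with a "label class" index that maps each value directly to the set of record identifiers carrying it, so that quantified queries reduce to set sizes and set intersections:

```python
# Hypothetical illustration of two quantification strategies.
records = [
    {"id": 0, "region": "north", "outcome": "yes"},
    {"id": 1, "region": "south", "outcome": "no"},
    {"id": 2, "region": "north", "outcome": "yes"},
    {"id": 3, "region": "north", "outcome": "no"},
]

# Predicate-calculus style: evaluate the predicate on every record.
def count_scan(records, field, value):
    return sum(1 for r in records if r[field] == value)

# Label-class style: build an index once, then quantify by set size.
def build_label_classes(records, field):
    classes = {}
    for r in records:
        classes.setdefault(r[field], set()).add(r["id"])
    return classes

region_classes = build_label_classes(records, "region")
outcome_classes = build_label_classes(records, "outcome")

print(count_scan(records, "region", "north"))   # 3
print(len(region_classes["north"]))             # 3
# Conjunctive quantification becomes a set intersection on label classes.
print(len(region_classes["north"] & outcome_classes["yes"]))  # 2
```

On a paged secondary store, the scan touches every record while the index touches only the relevant label classes, which is the kind of access-pattern difference the chapter attributes to the two representations.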
Abstract:
A system for visual recognition is described, with implications for the general problem of representation of knowledge to assist control. The immediate objective is a computer system that will recognize objects in a visual scene, specifically hammers. The computer receives an array of light intensities from a device like a television camera. It is to locate and identify the hammer if one is present. The computer must produce from the numerical "sensory data" a symbolic description that constitutes its perception of the scene. Of primary concern is the control of the recognition process. Control decisions should be guided by the partial results obtained on the scene. If a hammer handle is observed this should suggest that the handle is part of a hammer and advise where to look for the hammer head. The particular knowledge that a handle has been found combines with general knowledge about hammers to influence the recognition process. This use of knowledge to direct control is denoted here by the term "active knowledge". A descriptive formalism is presented for visual knowledge which identifies the relationships relevant to the active use of the knowledge. A control structure is provided which can apply knowledge organized in this fashion actively to the processing of a given scene.
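The control idea described above can be caricatured in a few lines. This toy sketch (assumed rule names and regions, not the paper's formalism) shows a partial result both suggesting a hypothesis and directing where to look next:

```python
# Toy sketch of "active knowledge": an observed part triggers a rule
# that proposes a hypothesis and advises the next region to examine.
rules = {
    # observed part -> (hypothesis it suggests, region to examine next)
    "handle": ("hammer", "end of handle"),
    "head": ("hammer", "below the head"),
}

def interpret(observations):
    hypotheses = []
    for part in observations:
        if part in rules:
            hypothesis, next_region = rules[part]
            hypotheses.append((hypothesis, f"look at the {next_region}"))
    return hypotheses

print(interpret(["handle"]))  # [('hammer', 'look at the end of handle')]
print(interpret(["shadow"]))  # [] -- no rule fires, nothing is suggested
```

The point of the caricature is only the control flow: knowledge about hammers is consulted in response to partial results, rather than the scene being processed by a fixed pipeline.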
Abstract:
The use of barcode technology to capture data on pharmacists' clinical interventions is described.
Abstract:
The SAL system embodies a new kind of human-computer interaction, where a person and a computer carry out a fluent, emotionally coloured conversation. Because that kind of capability is new, evaluating systems that have it is a new challenge. This paper outlines techniques that have been developed to evaluate SAL interactions, and uses the case to highlight the range of variables that become relevant in dealing with systems of this order of complexity.
Abstract:
The Patient Protection and Affordable Care Act shook the foundations of the US health system, offering all Americans access to health care by changing the way the health insurance industry works. As President Obama signed the Act on 23 March 2010, he said that it stood for “the core principle that everybody should have some basic security when it comes to their health care”. Unlike the U.S., Portugal has provided, under Article 64 of its Constitution and since 1976, the right to universal access to health care. However, facing a severe economic crisis, Portugal has, under the supervision of the Troika, a tight schedule to implement measures to improve the efficiency of the National Health Service. Both countries, despite their different situations, are therefore in a conjuncture of reform and of new health management measures. The present work, using a qualitative research methodology, examines the Affordable Care Act in order to describe its principles and enforcement mechanisms. To describe the reality in Portugal, the Portuguese health system and the measures imposed by the Troika are also analyzed. The intention of this analysis is not only to present the innovative U.S. law, but also to find innovative measures that could serve health management in Portugal. We identified essentially the Exchanges and the Wellness Programs, described throughout this work, and we also raise the possibility of using them in the Portuguese national health system.
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to handle buffer overflow attacks. For protection, stack guards and heap guards are also in wide use. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by its starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
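The core mechanism described above can be sketched simply. This is an illustrative reduction (invented traces and a plain frequency-deviation score, not the dissertation's sequence sets or Bayesian model): profile a program by the relative frequency of each system call in a normal trace, then score a new trace by how far its frequencies deviate from that profile.

```python
# Illustrative sketch of frequency-based system-call anomaly scoring.
from collections import Counter

def frequency_profile(trace):
    """Map each system call to its relative frequency in the trace."""
    counts = Counter(trace)
    total = len(trace)
    return {call: n / total for call, n in counts.items()}

def anomaly_score(profile, trace):
    """Sum of absolute deviations between normal and observed frequencies."""
    observed = frequency_profile(trace)
    calls = set(profile) | set(observed)
    return sum(abs(profile.get(c, 0.0) - observed.get(c, 0.0)) for c in calls)

# Hypothetical normal behavior: a steady open/read/write/close cycle.
normal_trace = ["open", "read", "read", "write", "close"] * 20
profile = frequency_profile(normal_trace)

benign = ["open", "read", "read", "write", "close"] * 5
# A write-heavy burst, as an induced buffer overflow might produce.
suspicious = ["write"] * 20 + ["open", "read", "close"]

print(anomaly_score(profile, benign))      # 0.0 -- same frequency pattern
print(anomaly_score(profile, suspicious))  # noticeably larger
```

A real detector would, as the abstract describes, segment traces into sequence sets and feed the deviations to a Bayesian model to keep false positives low; the sketch only shows why frequency deviations are a usable signal.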
Abstract:
Production of mini vegetables in an organic system is a good alternative to improve profit, but there is no research on the optimum plant density for these cultivars under Brazilian conditions. Two experiments were conducted to evaluate the production of mini lettuce cultivars at different plant densities. Experiment 1 was conducted from January 1st (sowing) to February 10th (harvesting), 2012. The experimental design was randomized complete blocks, with six treatments in a factorial scheme, 3 mini lettuce cultivars (Tudela, Renoir and Sartre) x 2 spacings between plants (16 and 20 cm), with eight replications and plots (2.04 m²) with six rows spaced 15 cm apart. Experiment 2 was conducted from June 6th (sowing) to July 18th (harvesting), 2012. The cultivars Sartre and Renoir were evaluated under four plant densities (444,444; 333,333; 266,667 and 200,000 plants ha⁻¹, corresponding to spacings of 15x15, 15x20, 25x15 and 25x20 cm, respectively). Eight treatments were defined by a 2 (cultivars) x 4 (plant densities) factorial scheme and arranged in a randomized complete block design, with nine replications and plots of 2.04 m². The characteristics evaluated in both experiments were total and marketable fresh weight per plant, plant dry weight, plant diameter and height, marketable yield and discard percentage. In the first experiment, during the summer, cultivar Sartre showed the highest marketable fresh weight (72 g plant⁻¹). The heaviest plants (91.6 g plant⁻¹) were obtained with the wider plant spacing, but the highest yield (2.51 kg m⁻²) was obtained with the smaller spacing. In winter, plants with the highest total (190 g plant⁻¹) and marketable (146 g plant⁻¹) fresh weights were obtained with cultivar Sartre, and the same was observed at low plant density. However, the higher the plant density, the higher the yield.
Abstract:
A mail survey was conducted to assess current computer hardware use and perceived needs of potential users for software related to crop pest management in Nebraska. Surveys were sent to University of Nebraska-Lincoln agricultural extension agents, agribusiness personnel (including independent crop consultants), and crop producers identified by extension agents as computer users. There were no differences between the groups in several aspects of computer hardware use (percentage computer use, percentage IBM-compatible computer, amount of RAM, percentage with hard drive, hard drive size, or monitor graphics capability). Responses were similar among the three groups in several areas that are important to crop pest management (pest identification, pest biology, treatment decision making, control options, and pesticide selection), and a majority of each group expressed the need for additional sources of such information about insects, diseases, and weeds. However, agents mentioned vertebrate pest management information as a need more often than the other two groups did. Also, majorities of each group expressed an interest in using computer software, if available, to obtain information in these areas. Appropriate software to address these needs should find an audience among all three groups.
Abstract:
The objective of this study was to undertake a critical reflection on assessment as a managerial tool that promotes the inclusion of nurses in the health system management process. Nurses, because of their education and training, which encompasses knowledge in both the clinical and managerial fields and is centered on care, have the potential to assume a differentiated attitude in management, making decisions and proposing health policies. Nevertheless, it is first necessary to establish and consolidate a meaningful presence at the decision-making levels of management. Assessment is a component of management whose results may contribute to more objective decision making, allow healthcare interventions to be improved, and help reorganize health practice within its political, economic, social and professional context; it is also an area for the application of knowledge that has the potential to change the current panorama of the inclusion of nurses in management.