29 results for 190202 Computer Gaming and Animation

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

This research examines evolving issues in applied computer science and applies economic and business analyses as well. There are two main areas. The first is internetwork communications as embodied by the Internet. The goal of the research is to devise an efficient pricing, prioritization, and incentivization plan that could realistically be implemented on the existing infrastructure. Criteria include practical and economic efficiency and proper incentives for both users and providers. Background information on the evolution and functional operation of the Internet is given, and relevant literature is surveyed and analyzed. Economic analysis is performed on the incentive implications of the current pricing structure and organization. The problems are identified, and minimally disruptive solutions are proposed at all levels of implementation, down to the lowest-level protocol. Practical issues are considered and performance analyses are done. The second area of research is mass-market software engineering and how it differs from classical software engineering. Software life-cycle revenues are analyzed, and software pricing and timing implications are derived. A profit-maximizing methodology is developed to select or defer the development of software features for inclusion in a given release. An iterative model of the stages of the software development process is developed, taking into account new communications capabilities as well as profitability.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that acknowledge the role of advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. Phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations that was solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved after the numerical validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration; hence, the Gauss-Seidel code was selected to further develop the computer program and software. The model was then analyzed for its sensitivity. It was found that the model was very sensitive to wind speed but not to sediment settling velocity. Computer software was developed with the model code embedded. The software was provided with two major user-friendly visual forms, one to interface with the database files and the other to execute and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were over an order of magnitude lower than current drinking water standards.
It should be pointed out, however, that concentrations below these drinking water standards, although not harmful to humans, may be very harmful to organisms at various trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
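The two iterative solvers named in the abstract can be sketched as follows. This is an illustrative reimplementation, not the dissertation's actual code: the diagonally dominant tridiagonal system below is a hypothetical stand-in for the discretized steady-state transport equations, chosen only so that both iterations converge and their convergence rates can be compared.

```python
import numpy as np

def jacobi(A, b, tol=1e-8, max_iter=10_000):
    """Jacobi iteration: every component updated from the previous sweep."""
    x = np.zeros_like(b)
    D = np.diag(A)                     # diagonal entries
    R = A - np.diagflat(D)             # off-diagonal remainder
    for k in range(1, max_iter + 1):
        x_new = (b - R @ x) / D
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

def gauss_seidel(A, b, tol=1e-8, max_iter=10_000):
    """Gauss-Seidel iteration: updated components reused within a sweep."""
    n = len(b)
    x = np.zeros_like(b)
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:
            return x, k
    return x, max_iter

# Hypothetical diagonally dominant tridiagonal system (stable for both methods)
n = 20
A = (np.diag(np.full(n, 4.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
b = np.ones(n)

x_j, iters_j = jacobi(A, b)
x_gs, iters_gs = gauss_seidel(A, b)
```

On systems like this, Gauss-Seidel typically needs roughly half as many sweeps as Jacobi, consistent with the faster convergence the study reports.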

Relevance:

100.00%

Publisher:

Abstract:

The casino segment of the hospitality industry is experiencing unprecedented growth. As a result, many academics and practitioners alike cannot stay abreast of developments in the field. The author addresses the situation by providing an overview of casino development in the United States from an historical perspective, a review of current developments, and some predictions about the future.

Relevance:

100.00%

Publisher:

Abstract:

The Indian Gaming Regulatory Act of 1988 was intended to provide a statutory basis for the growth of Indian gaming. This article explains that the intentions of the act, when coupled with court decisions and a competitive economic environment, may be the basis for federal intervention in the gaming industry, specifically for Native American gaming. The author reviews the history of programs and promises, the magnitude of the total gaming industry, and the role of Native American gaming.

Relevance:

100.00%

Publisher:

Abstract:

Physiological signals, which are controlled by the autonomic nervous system (ANS), could be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not yet been fully investigated. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment ("relaxation" vs. "stress") are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Three features (PDmean, PDmax, and PDWalsh) are then extracted from the preprocessed PD signal for affective state classification. To select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and subject self-evaluation, respectively. In addition, five different classifiers are implemented on the selected data, achieving average accuracies up to 86.43% and 87.20%, respectively. Finally, the receiver operating characteristic (ROC) curve is used to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is first applied to remove eye blinks from the PD signal, and a moving-average window is then used to obtain a representative value, PDr, for every one-second interval of PD. The on-line affective assessment algorithm has three main steps: preparation, feature-based decision voting, and affective determination.
The final results show accuracies of 72.30% and 73.55% for the data subsets chosen with the two data selection methods (paired t-test and subject self-evaluation, respectively). To further analyze the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing was only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered one of the most powerful physiological signals to include in future automated real-time affective recognition systems, especially for detecting the "relaxation" vs. "stress" states.
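The on-line preprocessing described in this abstract (a hard threshold to reject blink samples, then a per-second moving average yielding the representative value PDr) can be sketched roughly as follows. The sampling rate and blink threshold are illustrative assumptions, not the study's actual values.

```python
import numpy as np

def preprocess_pd(pd_signal, fs=60, blink_threshold=1.5):
    """Return one representative PD value (PDr) per second of signal.

    fs: assumed samples per second; blink_threshold: assumed cutoff (mm)
    below which a sample is treated as a blink artifact and discarded.
    """
    sig = np.asarray(pd_signal, dtype=float)
    valid = sig > blink_threshold           # hard threshold: drop blink samples
    pdr = []
    for start in range(0, len(sig), fs):    # one-second windows
        window = sig[start:start + fs]
        mask = valid[start:start + fs]
        if mask.any():
            pdr.append(window[mask].mean()) # average over the valid samples
        else:
            pdr.append(np.nan)              # whole second lost to a blink
    return np.array(pdr)

# Example: 3 s of a constant 4 mm pupil with a simulated blink in second 2
signal = np.full(180, 4.0)
signal[70:90] = 0.2                         # pupil reading collapses during a blink
pdr = preprocess_pd(signal)
# pdr is close to [4.0, 4.0, 4.0]: blink samples are excluded before averaging
```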

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study is to identify the relationship between the characteristics of distance education students, their computer literacy and technology acceptance, and distance education course satisfaction. The theoretical framework applies Rogers's and Havelock's Innovation, Diffusion & Utilization theories to distance education. It is hypothesized that technology acceptance and computer competency influence student course satisfaction and explain the decision to adopt or reject distance education curriculum and technology. Distance education delivery, Institutional Support, Convenience, Interactivity, and five distance education technologies were studied. The data were collected by a survey questionnaire sent to four Florida universities; three hundred nineteen students returned the questionnaire. A factor and regression analysis on three measures of satisfaction revealed significant differences among the three main factors related to the overall satisfaction of distance education students and their adoption of distance education technology as a medium of learning. Computer literacy is significantly related to greater overall student satisfaction; however, when competing with other factors such as delivery, support, interactivity, and convenience, computer literacy is not significant. Results indicate that age and status are the only two student characteristics that are significant. Distance education technology acceptance is positively related to higher overall satisfaction. Innovativeness is also positively related to student overall satisfaction. Finally, the technology used relates positively to greater satisfaction levels within the educational experience. Additional research questions were investigated and provided insights into the innovation-decision process.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation describes research carried out in developing an MPS (Multipurpose Portable System), which consists of an instrument and many accessories. The instrument is portable, hand-held, and rechargeable-battery operated, and it measures the temperature, absorbance, and concentration of samples using optical principles. The system also performs auxiliary functions such as incubation and mixing. This system can be used in environmental, industrial, and medical applications. Research emphasis is on system modularity, easy configuration, accuracy of measurements, power management schemes, reliability, low cost, computer interfacing, and networking. The instrument can send the data to a computer for data analysis and presentation, or to a printer. This dissertation includes the presentation of a fully working system. This involved integration of hardware, firmware for the micro-controller written in assembly language, software written in C, and other application modules. The instrument contains the Optics, Transimpedance Amplifiers, Voltage-to-Frequency Converters, LCD display, Lamp Driver, Battery Charger, Battery Manager, Timer, Interface Port, and Micro-controller. The accessories are a Printer, a Data Acquisition Adapter (to transfer the measurements to a computer via the Printer Port and expand the Analog/Digital conversion capability), a Car Plug Adapter, and an AC Transformer. The system has been fully evaluated for fault tolerance, and those schemes are also presented.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is a study of customer relationship management theory and practice. Customer Relationship Management (CRM) is a business strategy whereby companies build strong relationships with existing and prospective customers with the goal of increasing organizational profitability. It is also a learning process involving managing change in processes, people, and technology. CRM implementation and its ramifications are also not completely understood, as evidenced by the high number of failed CRM implementations in organizations and the resulting disappointments. The goal of this dissertation is to study emerging issues and trends in CRM, including the effect of computer software and the accompanying new management processes on organizations, and the dynamics of the alignment of marketing, sales and services, and all other functions responsible for delivering customers a satisfying experience. To understand CRM better, a content analysis of more than one hundred articles and documents from academic and industry sources was undertaken, using a new methodological twist on the traditional method. An Internet domain (http://crm.fiu.edu) was created for this research by uploading an initial one hundred plus abstracts of articles and documents onto it to form a knowledge database. Once the database was formed, a search engine was developed to enable searching the abstracts with relevant CRM keywords to reveal emergent dominant CRM topics. The ultimate aim of this website is to serve as an information hub for CRM research, as well as a search engine where interested parties can enter CRM-relevant keywords or phrases to access abstracts, and submit abstracts to enrich the knowledge hub. Research questions were investigated and answered by content-analyzing the interpretation and discussion of dominant CRM topics and then amalgamating the findings.
This was supported by comparisons within and across individual, paired, and sets-of-three occurrences of CRM keywords in the article abstracts. Results show that there is a lack of holistic thinking and discussion of CRM in both academia and industry, which is required to understand how the people, process, and technology in CRM impact each other to affect successful implementation. Industry has to get its head around CRM and holistically understand how these important dimensions affect each other. Only then will organizational learning occur and, over time, result in superior processes leading to strong, profitable customer relationships and a hard-to-imitate competitive advantage.
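The keyword co-occurrence counting described above (individual, paired, and sets-of-three occurrences across abstracts) can be sketched as follows. The keyword list and sample abstracts are invented placeholders, not the study's data, and the tokenization is deliberately naive.

```python
from collections import Counter
from itertools import combinations

# Hypothetical CRM keyword vocabulary (assumption, not the study's list)
KEYWORDS = {"crm", "technology", "process", "people", "implementation"}

def cooccurrence_counts(abstracts, max_set_size=3):
    """Count how many abstracts contain each 1-, 2-, and 3-keyword set."""
    counts = {n: Counter() for n in range(1, max_set_size + 1)}
    for text in abstracts:
        present = sorted(KEYWORDS & set(text.lower().split()))
        for n in range(1, max_set_size + 1):
            for combo in combinations(present, n):  # sorted tuples as keys
                counts[n][combo] += 1
    return counts

abstracts = [
    "crm implementation requires people and technology alignment",
    "the process of crm implementation often fails",
    "technology alone does not deliver crm value",
]

counts = cooccurrence_counts(abstracts)
# counts[1][("crm",)] == 3: "crm" appears in all three sample abstracts
```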

Relevance:

100.00%

Publisher:

Abstract:

Today, over 15,000 Ion Mobility Spectrometry (IMS) analyzers are employed at security checkpoints worldwide to detect explosives and illicit drugs. Current portal IMS instruments and other electronic-nose technologies detect explosives and drugs by analyzing samples containing the headspace air and loose particles residing on a surface. Canines can outperform these systems at sampling and detecting low-vapor-pressure explosives and drugs, such as RDX, PETN, cocaine, and MDMA, because these biological detectors target the volatile signature compounds available in the headspace rather than the non-volatile parent compounds of explosives and drugs. In this dissertation research, volatile signature compounds available in the headspace over explosive and drug samples were detected using solid-phase microextraction (SPME) as a headspace sampling tool coupled to an IMS analyzer. A Genetic Algorithm (GA) technique was developed to optimize the operating conditions of a commercial IMS (GE Itemizer 2), leading to the successful detection of plastic explosives (Detasheet, Semtex H, and C-4) and illicit drugs (cocaine, MDMA, and marijuana). Short sampling times (10 sec to 5 min) were adequate to extract and preconcentrate sufficient analytes (> 20 ng) representing the volatile signatures in the headspace of a 15 mL glass vial or a quart-sized can containing ≤ 1 g of the bulk explosive or drug. Furthermore, a research-grade IMS with flexibility for changing operating conditions and physical configurations was designed and fabricated to accommodate future research into different analytes or physical configurations. The design and construction of the FIU-IMS were facilitated by computer modeling and simulation of ion behavior within an IMS. The simulation method developed uses SIMION/SDS and was evaluated with experimental data collected using a commercial IMS (PCP Phemto Chem 110).
The FIU-IMS instrument has performance comparable to the GE Itemizer 2 (average resolving power of 14, resolution of 3 between two drugs and two explosives, and LODs ranging from 0.7 to 9 ng). The results from this dissertation further advance the concept of targeting volatile components to presumptively detect the presence of concealed bulk explosives and drugs by SPME-IMS, and the new FIU-IMS provides a flexible platform for future IMS research projects.
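A genetic algorithm of the general kind used above to optimize IMS operating conditions can be sketched as follows. Everything here is an illustrative assumption: the two tuned parameters (drift-tube and desorption temperatures), their bounds, and the unimodal response surface are made-up stand-ins, not instrument data or the dissertation's actual GA.

```python
import random

random.seed(42)

# Hypothetical operating-condition bounds (deg C)
BOUNDS = {"drift_temp": (100.0, 250.0), "desorb_temp": (150.0, 300.0)}

def response(ind):
    # Toy detector-response surface peaking at (200, 250); higher is better
    return -((ind["drift_temp"] - 200.0) ** 2 + (ind["desorb_temp"] - 250.0) ** 2)

def random_individual():
    return {k: random.uniform(*v) for k, v in BOUNDS.items()}

def mutate(ind, scale=5.0):
    # Gaussian perturbation of one parameter, clipped to its bounds
    child = dict(ind)
    k = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[k]
    child[k] = min(hi, max(lo, child[k] + random.gauss(0.0, scale)))
    return child

def crossover(a, b):
    # Uniform crossover: each parameter inherited from either parent
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def genetic_search(pop_size=30, generations=60):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=response, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=response)

best = genetic_search()
```

Because the top half of each generation survives unchanged, the best individual never regresses, and after a few dozen generations the search settles near the response peak.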

Relevance:

100.00%

Publisher:

Abstract:

Given the importance of color processing in computer vision and computer graphics, estimating and rendering the illumination spectral reflectance of image scenes is important to advancing a large class of applications such as scene reconstruction, rendering, surface segmentation, object recognition, and reflectance estimation. Consequently, this dissertation proposes effective methods for reflection component separation and rendering in single scene images. Based on the dichromatic reflectance model, a novel decomposition technique, named the Mean-Shift Decomposition (MSD) method, is introduced to separate the specular from the diffuse reflectance components. This technique provides direct access to surface shape information through diffuse shading pixel isolation. More importantly, this process does not require any local color segmentation, which differs from traditional methods that operate by aggregating color information along each image plane. Exploiting the merits of the MSD method, a scene illumination rendering technique is designed to estimate the relative contributing specular reflectance attributes of a scene image. The targeted image feature subset provides direct access to the surface illumination information, while a newly introduced efficient rendering method reshapes the dynamic range distribution of the specular reflectance components over each image color channel. This image enhancement technique renders the scene illumination reflection effectively without altering the scene's surface diffuse attributes, contributing to realistic rendering effects. As an ancillary contribution, an effective color constancy algorithm based on the dichromatic reflectance model was also developed. This algorithm selects image highlights in order to extract the prominent surface reflectance that reproduces the exact illumination chromaticity. This evaluation is presented using a novel voting scheme based on histogram analysis.
In each of the three main contributions, empirical evaluations were performed on synthetic and real-world image scenes taken from three different color image datasets. The experimental results show over 90% accuracy in illumination estimation, contributing to near-real-world illumination rendering effects.

Relevance:

100.00%

Publisher:

Abstract:

Fueled by the increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging and cooling costs and adversely affects the performance and reliability of a computing system. In addition, it reduces the processor's life span and may even crash the entire computing system. Therefore, dynamic thermal management (DTM) is becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted to study the DTM problem; however, most of it is based on theoretically idealized assumptions or simplified models. While these models and assumptions help to greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many practical factors and details beyond these models and assumptions. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations under the practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems to optimize throughput under a peak temperature constraint.
We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors, based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the current extensive theoretical research in dealing with increasingly critical thermal problems and enabling the continuous evolution of high-performance computing systems.
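A reactive DTM policy of the general kind described above (throttle frequency when the core temperature crosses a peak constraint, recover throughput when there is thermal headroom) can be sketched with a toy thermal model. The frequency steps, peak constraint, and thermal constants are illustrative assumptions, not the dissertation's platform values.

```python
PEAK_TEMP = 85.0                     # assumed peak temperature constraint (deg C)
FREQS = [1.0, 1.5, 2.0, 2.5, 3.0]    # assumed available frequency steps (GHz)

def thermal_step(temp, freq, ambient=40.0, heat_coeff=18.0, cool_rate=0.1):
    """Toy first-order thermal model: temperature relaxes toward a
    frequency-dependent steady state (hotter at higher frequency)."""
    steady = ambient + heat_coeff * freq
    return temp + cool_rate * (steady - temp)

def reactive_dtm(steps=500):
    temp, level = 40.0, len(FREQS) - 1   # start cool, at maximum frequency
    history = []
    for _ in range(steps):
        temp = thermal_step(temp, FREQS[level])
        if temp > PEAK_TEMP and level > 0:
            level -= 1                   # reactive throttle on a violation
        elif temp < PEAK_TEMP - 5.0 and level < len(FREQS) - 1:
            level += 1                   # step back up when there is headroom
        history.append((temp, FREQS[level]))
    return history

history = reactive_dtm()
# The controller oscillates between frequency levels, keeping the
# temperature pinned near (and only briefly above) the constraint.
```

A proactive technique, by contrast, would predict the temperature trajectory and throttle or migrate tasks before the constraint is violated; this sketch shows only the reactive baseline.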

Relevance:

100.00%

Publisher:

Abstract:

In his essay - Regulating Casino Gaming: A Checklist for States Considering It – by Leonard E. Goodall, Professor of Management and Public Administration, College of Business and Economics, University of Nevada, Las Vegas, Professor Goodall initially states: “Since various states are likely to continue to debate the issue of the establishment of legal casinos, and since states considering legal casinos must also decide how best to regulate them, the author discusses the similarities and contrasts in the regulatory systems already in operation.” Certainly not all states have solicited casino gaming, or what people generally refer to as gambling, but many have, and the list is growing. If casinos are to be, and indications are that many more states will endorse gaming as a source of revenue, then regulating them must follow as a matter of course, says the author. Keep in mind this essay was written in 1988, and the actuality of casino gaming has indeed come to fruition in many states. “Nevada, having legalized casino gaming in 1931, has over a half-century of experience with the regulatory process,” Professor Goodall informs. “When New Jersey approved the establishment of casinos in Atlantic City in 1976, state officials studied the Nevada system carefully and adopted many of Nevada's procedures.” Professor Goodall bullet-points at least seven key elements that states wanting to pursue gaming should address - or, in the cases of Nevada and New Jersey, have already addressed - in regard to regulation of the industry. Goodall parses those essentials in more detail. The ultimate form of regulation is ownership, Goodall says; state-run or private are the logical options. “The arguments for private ownership have been both pragmatic and political,” Goodall says. “Legislators, like the general public, are skeptical of the ability of state bureaucracies to run big businesses in an efficient manner.
Many of them also believe regulation can be more effective if there is at least an arm's-length distance between regulation and ownership,” the professor opines. Additionally important to consider is the purpose of legalization, says Goodall. Are the proceeds earmarked for general funds, or to be used specifically? Geographic considerations are key, Goodall points out. “This decision will depend partly on a state's reasons for having casinos in the first place,” he expands. “New Jersey's policy, for example, is obviously consistent with its goal of using casinos to reinvigorate Atlantic City.” “In both states, one of the most important functions of the regulatory agencies is that of licensing, the process of investigating individuals or organizations and then authorizing them to participate in the gaming business,” Goodall provides. In closing, Goodall says there is no need for subsequent states to reinvent the wheel when it comes to casino gaming regulation; Nevada and New Jersey already provide two good designs to emulate and/or build upon.

Relevance:

100.00%

Publisher:

Abstract:

In the discussion - Ethics, Value Systems And The Professionalization Of Hoteliers by K. Michael Haywood, Associate Professor, School of Hotel and Food Administration, University of Guelph, Haywood initially presents: “Hoteliers and executives in other service industries should realize that the foundation of success in their businesses is based upon personal and corporate value systems and steady commitment to excellence. The author illustrates how ethical issues and manager morality are linked to, and shaped by the values of executives and the organization, and how improved professionalism can only be achieved through the adoption of a value system that rewards contributions rather than the mere attainment of results.” The bottom line of this discussion is, how does the hotel industry reconcile its behavior with that of public perception? “The time has come for hoteliers to examine their own standards of ethics, value systems, and professionalism,” Haywood says. And it is ethics that are at the center of this issue; Haywood holds that component in an estimable position. “Hoteliers must become value-driven,” advises Haywood. “They must be committed to excellence both in actualizing their best potentialities and in excelling in all they do. In other words, the professionalization of the hotelier can be achieved through a high degree of self-control, internalized values, codes of ethics, and related socialization processes,” he expands. “Serious ethical issues exist for hoteliers as well as for many business people and professionals in positions of responsibility,” Haywood alludes in defining some inter-industry problems. 
“The acceptance of kickbacks and gifts from suppliers, the hiding of income from taxation authorities, the lack of interest in installing and maintaining proper safety and security systems, and the raiding of competitors' staffs are common practices,” he offers, reasoning that if these problems can occur within the ranks, then there will be a negative backlash in the public/client arena as well. Haywood divides the key principles of his thesis statement - ethics, value systems, and professionalism – into specific elements, and then broadens the scope of each element. Promotion, product/service, and pricing are additional key components in Haywood’s discussion, and he addresses each with verve and vitality. Haywood references the four character types - craftsmen, jungle fighters, company men, and gamesmen – via a citation to Michael Maccoby, in the portion of the discussion dedicated to morality and success. Haywood closes with a series of questions derived from Lawrence Miller's American Spirit: Visions of a New Corporate Culture, each question designed to focus, shape, and organize management's attention to the values that Miller sets forth in his work.

Relevance:

100.00%

Publisher:

Abstract:

The local area network (LAN) interconnecting computer systems and software can make a significant contribution to the hospitality industry. The author discusses the advantages and disadvantages of such systems.