912 results for Business Administration, Management|Information Science|Engineering, System Science


Relevance:

100.00%

Publisher:

Abstract:

This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.

Relevance:

100.00%

Publisher:

Abstract:

Research objectives: Poker and responsible gambling both entail the use of the executive functions (EF), which are higher-level cognitive abilities. The main objective of this work was to assess whether online poker players of different ability show different performance in their EF and, if so, which functions are the most discriminating. The secondary objective was to assess whether EF performance can predict the quality of gambling, according to the Gambling Related Cognition Scale (GRCS), the South Oaks Gambling Screen (SOGS) and the Problem Gambling Severity Index (PGSI).

Sample and methods: The study design consisted of two stages: 46 Italian active players (41 m, 5 f; age 32±7.1 years; education 14.8±3 years) completed the PGSI in a secure IT web system and uploaded their own hand-history files, which were anonymized and then evaluated by two poker experts. 36 of these players (31 m, 5 f; age 33±7.3 years; education 15±3 years) agreed to take part in the second stage: the administration of an extensive neuropsychological test battery by a blinded, trained professional. To answer the main research question we collected all final and intermediate scores of the EF tests for each player, together with the playing ability score. To answer the secondary research question, we referred to the GRCS, PGSI and SOGS scores. We determined which variables are good predictors of the playing ability score using statistical techniques able to deal with many regressors and few observations (LASSO, best subset algorithms and CART). In this context, information criteria and cross-validation errors play a key role in the selection of the relevant regressors, while significance testing and goodness-of-fit measures can lead to wrong conclusions.

Preliminary findings: We found significant predictors of the poker ability score in various tests. In particular, there are good predictors 1) in some Wisconsin Card Sorting Test items that measure flexibility in choosing a problem-solving strategy, strategic planning, modulation of impulsive responding, goal setting and self-monitoring, 2) in those Cognitive Estimates Test variables related to deductive reasoning, problem solving, development of an appropriate strategy and self-monitoring, and 3) in the Emotional Quotient Inventory Short (EQ-i:S) Stress Management score, composed of the Stress Tolerance and Impulse Control scores, and in the Interpersonal score (Empathy, Social Responsibility, Interpersonal Relationship). As for the quality of gambling, some EQ-i:S scale scores provide the best predictors: General Mood for the PGSI; Intrapersonal (Self-Regard, Emotional Self-Awareness, Assertiveness, Independence, Self-Actualization) and Adaptability (Reality Testing, Flexibility, Problem Solving) for the SOGS; and Adaptability for the GRCS.

Implications for the field: Through PokerMapper we gathered knowledge about, and evaluated the feasibility of, constructing short tasks/card games in online poker environments for profiling users' executive functions. These card games will be part of an IT system able to dynamically profile EF and provide players with feedback on their expected performance and ability to gamble responsibly at that particular moment. The implementation of such a system in existing gambling platforms could lead to an effective proactive tool for supporting responsible gambling.
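
The regressor-selection step described above (LASSO, best subset and CART, with information criteria and cross-validation) can be illustrated with a minimal, self-contained sketch. The data, feature count and pipeline below are hypothetical stand-ins, not the study's actual data or code.

    # Minimal sketch of cross-validated LASSO for regressor selection with
    # many candidate regressors and few observations (hypothetical data).
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_players, n_scores = 36, 60                      # few observations, many regressors
    X = rng.normal(size=(n_players, n_scores))        # stand-in for EF test scores
    y = 1.5 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.5, size=n_players)  # stand-in ability score

    X_std = StandardScaler().fit_transform(X)         # put regressors on a common scale
    lasso = LassoCV(cv=5, random_state=0).fit(X_std, y)  # cross-validation picks the penalty

    selected = np.flatnonzero(lasso.coef_)            # regressors kept by the L1 penalty
    print("selected regressors:", selected)
    print("cross-validated penalty (alpha):", lasso.alpha_)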

Relevance:

100.00%

Publisher:

Abstract:

Building demolition has been undergoing evolutionary development in its technologies for several decades. In order to achieve a high level of demolition material reuse and recycling, new management approaches are also needed, in particular in conjunction with the application of information technologies. The development of an information system for demolition project management is an impactful strategy to support various demolition activities, including waste exchange, demolition visualization, and demolition method selection and evaluation. This paper aims to develop a framework for an integrated information system supporting decision-making and waste minimization in building demolition projects. The components of this information system and their interactions are demonstrated through a specific demolition project.

Relevance:

100.00%

Publisher:

Abstract:

Building demolition has been undergoing evolutionary development in its technologies for several decades. In order to achieve a high level of demolition material reuse and recycling, new management approaches are also needed. Several information systems have been proposed or developed, particularly to promote efficient project management, waste minimization and project safety. These information systems include waste exchange, 4D visualization, safety-aware scheduling, waste product scheduling, site arrangement optimization and so on. However, the fragmented information systems applied by the various parties involved in a demolition project can generate conflicts due to the lack of communication and standardization. This paper aims to develop a framework for an integrated information system for building demolition projects, which covers the major aspects of innovative management approaches as well as the conventional construction project management perspective. Practically, the system will serve as an information portal for all demolition project team members.

Relevance:

100.00%

Publisher:

Abstract:

For the past 15 years, governments in the developed, Western world have been contracting out, or outsourcing, services as a key part of public sector reforms. Outsourcing has been argued to lead to cost savings, improved discipline, better services, access to scarce skills, and the capacity for managers to focus more time on the core business of their organizations (Domberger, 1998). Government outsourcing initiatives have encompassed a range of services, but given the large sums of money invested in IT assets, the outsourcing of IT services (IT outsourcing, or ITO) has been a major initiative for many agencies. Lacity and Willcocks (1998, p. 3) defined ITO as "handing over to a third party [the] management of IS/IT assets, resources and/or activities for required results." For public-sector outsourcing, this handover is usually made by way of a competitive tender. Case studies have reported ITO successes and failures (e.g., Currie & Willcocks, 1998; Rouse & Corbitt, 2003; Willcocks & Currie, 1997; Willcocks & Lacity, 2001; Willcocks & Kern, 1998), but much of the evidence presented to public-sector decision makers to justify this reform is anecdotal and unsystematic, and when investigated in depth, does not necessarily support widespread conclusions.

Relevance:

100.00%

Publisher:

Abstract:

Distributed Shared Memory (DSM) provides programmers with a shared memory environment in systems where memory is not physically shared. Clusters of Workstations (COWs), an often untapped source of computing power, are characterised by a very low cost/performance ratio. Combining COWs with DSM provides an environment in which the programmer can use the well-known approaches and methods of programming for physically shared memory systems, and parallel processing can be carried out to make full use of the computing power and cost advantages of the COW. The aim of this research is to synthesise and develop a distributed shared memory system as an integral part of an operating system, in order to provide application programmers with a convenient environment in which parallel applications can be developed and executed easily and efficiently, and to do so in a transparent manner. Furthermore, in order to satisfy our challenging design requirements, we want to demonstrate that the operating system into which the DSM system is integrated should be a distributed operating system.

This thesis reports a study into the synthesis of a DSM system within a microkernel and client-server based distributed operating system, using both strict and weak consistency models with a write-invalidate and write-update based approach for consistency maintenance. It also reports a unique automatic initialisation system that allows the programmer to start the parallel execution of a group of processes with a single library call; the number and location of these processes are determined by the operating system based on system load information. The proposed DSM system takes a novel approach in that it provides programmers with a complete programming environment in which they can easily develop and run their own code, or indeed run existing shared memory code. A set of demanding DSM system design requirements is presented, together with the incentives for placing the DSM system within a distributed operating system and, in particular, within the memory management server. The new DSM system is built around an event-driven set of cooperating and distributed entities, and a detailed description of the events, and the reactions to these events, that make up the operation of the DSM system is then presented. This is followed by a pseudocode form of the detailed design of the main modules and activities of the primitives used in the proposed DSM system.

Quantitative results of performance tests and qualitative results showing the ease of programming and use of the RHODOS DSM system are reported. A study of five different applications is given, and the results of tests carried out on these applications are presented and discussed. A discussion of how RHODOS' DSM allows programmers to write shared memory code in an easy-to-use and familiar environment, and a comparative evaluation of RHODOS DSM against other DSM systems, is also presented. In particular, the ease of use and transparency of the DSM system have been demonstrated through the ease with which a moderately inexperienced undergraduate programmer was able to convert, write and run applications for testing the DSM system. Furthermore, the tests performed using physically shared memory show that it is indistinguishable from distributed shared memory; this is further evidence that the DSM system is fully transparent.

This study clearly demonstrates that the aim of the research has been achieved: it is possible to develop a programmer-friendly and efficient DSM system fully integrated within a distributed operating system. It is clear from this research that DSM integrated within a client-server and microkernel based distributed operating system makes shared memory operations transparent and almost completely removes the involvement of the programmer beyond the classical activities needed to deal with shared memory. The conclusion can be drawn that DSM, when implemented within a client-server and microkernel based distributed operating system, is one of the most encouraging approaches to parallel processing, since it guarantees performance improvements with minimal programmer involvement.
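
The write-invalidate approach to consistency maintenance mentioned above can be illustrated with a small, generic simulation of a single shared page. This is a textbook-style sketch of the protocol idea only; it is not the RHODOS DSM implementation or its API.

    # Generic write-invalidate sketch for one shared page: a write invalidates
    # every other cached copy, so subsequent reads re-fetch the new value.
    class Node:
        def __init__(self, name):
            self.name = name
            self.copy = None                 # locally cached copy (None = invalid)

    class WriteInvalidatePage:
        def __init__(self, initial=0):
            self.owner_value = initial       # authoritative copy
            self.holders = set()             # nodes holding a valid cached copy

        def read(self, node):
            if node.copy is None:            # miss: fetch from the owner
                node.copy = self.owner_value
                self.holders.add(node)
            return node.copy

        def write(self, node, value):
            for other in self.holders - {node}:
                other.copy = None            # invalidate all other copies
            self.holders = {node}
            node.copy = value
            self.owner_value = value

    a, b = Node("A"), Node("B")
    page = WriteInvalidatePage()
    print(page.read(a), page.read(b))        # both nodes read 0
    page.write(a, 42)                        # B's cached copy is invalidated
    print(page.read(b))                      # B re-fetches and sees 42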

Relevance:

100.00%

Publisher:

Abstract:

This thesis unveils an integrated system that, once applied, could standardise and simplify the processes used for high-quality water recovery and wastewater treatment. It foresees lower prices for desalinated and recovered water in a streamlined and more efficient water industry, by departing from today's conventional thinking about wastewater treatment.

Relevance:

100.00%

Publisher:

Abstract:

The thesis addresses the issue of the limited success of knowledge management systems in spite of substantial investments in their development and implementation. Through the research, core reasons for this situation were identified, and an innovative user-centered solution that focuses knowledge management on supporting professional activities is presented.

Relevance:

100.00%

Publisher:

Abstract:

Many organizations make use of information system development methodologies to guide their staff in developing computerised information systems. This thesis contributes to methodology engineering research by introducing a number of important innovations in methodology fragment architectures.

Relevance:

100.00%

Publisher:

Abstract:

Electronic commerce (e-commerce) offers enormous opportunities for online trading while at the same time presenting potential risks. Although various mechanisms have been developed to elevate trust in e-commerce, research shows that shoppers continue to be skeptical about buying online, and a lack of trust is often cited as the main reason. Thus, enhancing success in e-commerce requires eliminating or reducing the risks. In this chapter, we present a multi-attribute trust management model that incorporates trust, transaction costs and product warranties. The new trust management system enables potential buyers to determine the risk level of a product before committing to proceed with the transaction. This is useful to online buyers as it allows them to be aware of the risk level and subsequently take appropriate actions to minimize potential risks before engaging in risky business. Results of various simulation experiments show that the proposed multi-attribute trust management system can be highly effective in identifying risky transactions in electronic marketplaces.
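
As a purely illustrative sketch of how trust, transaction cost and warranty attributes might be combined into a single risk level for a buyer, consider the following. The weights, attribute scaling and thresholds are assumptions made for illustration; they are not the model proposed in the chapter.

    # Illustrative multi-attribute risk score (assumed weights and thresholds).
    def risk_level(seller_trust, transaction_cost, item_price, warranty_coverage,
                   weights=(0.5, 0.3, 0.2)):
        """Combine three attributes into a risk score in [0, 1] and a coarse label.

        seller_trust      -- aggregated trust rating in [0, 1] (1 = fully trusted)
        transaction_cost  -- cost of completing the transaction, in currency units
        warranty_coverage -- fraction of the price covered by warranty, in [0, 1]
        """
        w_trust, w_cost, w_warranty = weights
        cost_ratio = min(transaction_cost / item_price, 1.0)
        score = (w_trust * (1.0 - seller_trust)
                 + w_cost * cost_ratio
                 + w_warranty * (1.0 - warranty_coverage))
        label = "low" if score < 0.3 else "medium" if score < 0.6 else "high"
        return score, label

    # A buyer could inspect the estimated risk before committing to the purchase.
    print(risk_level(seller_trust=0.9, transaction_cost=5, item_price=100,
                     warranty_coverage=0.8))   # -> roughly (0.105, 'low')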

Relevance:

100.00%

Publisher:

Abstract:

This book focuses on network management and traffic engineering for Internet and distributed computing technologies, as well as presenting emerging technology trends and advanced platforms.

Relevance:

100.00%

Publisher:

Abstract:

Remote technologies are changing our way of life. The radio frequency identification (RFID) system is a new technology which uses the open air to transmit information. This information transmission needs to be protected to provide user safety and privacy. Businesses will look for a system that has fraud resilience to prevent the misuse of information to take dishonest advantage. The business and the user need to be assured that the transmitted information has no content which is capable of undertaking malicious activities. Public awareness of RFID security will help users and organizations to understand the need for security protection. Publishing a security guideline from the regulating body and monitoring implementation of that guideline in RFID systems will ensure that businesses and users are protected. This chapter explains the importance of security in an RFID system and outlines the protective measures. It also points out research directions for RFID systems.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores project management techniques that can support the development of novel product-service systems. Some observations from the development of an airborne earth properties measurement system are provided. The intellectual property and the data this system could potentially deliver were more important than the potential commercial value of the product itself. What was sought was a complete business service solution. A concurrent engineering approach was implemented, linking both product development and survey data/analysis services. The blend of product and service was integrated using a function modeling technique. It was observed that the implementation of some functions required radical innovation, whilst others could be implemented through incremental improvements to current practice. It is suggested in the paper that adapting production learning curve concepts to reflect the relative degrees of uncertainty involved in individual subsystems can enhance project management forecasting practice. © 2013 The Authors and IOS Press.
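
The production learning curve concept referred to above can be sketched with the classic Wright model, in which the effort for the n-th unit is a * n**b with b = log2(learning rate). How the relative uncertainty of a subsystem maps onto its learning rate is an assumption made purely for illustration here, not the forecasting approach proposed in the paper.

    # Wright learning curve per subsystem; learning rates chosen to reflect an
    # assumed level of uncertainty (illustrative values only).
    import math

    def unit_effort(first_unit_effort, unit_number, learning_rate):
        """Effort for the n-th unit: a * n**b, where b = log2(learning_rate)."""
        b = math.log(learning_rate, 2)
        return first_unit_effort * unit_number ** b

    # Hypothetical subsystems: (first-unit effort, learning rate).
    # Incremental improvement -> faster learning (80% curve) assumed;
    # radical innovation -> higher uncertainty, slower learning (95% curve) assumed.
    subsystems = {
        "packaging of existing sensors (incremental)": (120.0, 0.80),
        "airborne survey data analytics (radical)": (200.0, 0.95),
    }

    for name, (a, rate) in subsystems.items():
        forecast = [round(unit_effort(a, n, rate), 1) for n in (1, 2, 4, 8)]
        print(name, forecast)   # effort forecast for units 1, 2, 4 and 8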

Relevance:

100.00%

Publisher:

Abstract:

Big data analytics has shown great potential in optimizing operations, making decisions, spotting business trends, preventing threats, and capitalizing on new sources of revenue in various fields such as manufacturing, healthcare, finance, insurance, and retail. The management of various networks has become inefficient and difficult because of their high complexity and interdependency. Big data, in the form of device logs, software logs, media content, and sensed data, provide rich information and facilitate a fundamentally different and novel approach to exploring, designing, and developing reliable and scalable networks. This Special Issue covers the most recent research results that address the challenges of big data for networking. We received 45 submissions, and nine high-quality papers, organized into two groups, were ultimately selected for inclusion in this Special Issue.