836 results for Automating Hospitality Information: Network Technology and Systems Management
Abstract:
Margaret Woods discusses lessons from the UK's top organizations on integrating risk and performance management.
Abstract:
This paper discusses issues in Ukraine's new three-level pension system. First, it presents a mathematical model for calculating the optimal size of contributions to a non-state pension fund. Next, the non-state pension fund must choose an asset management company; for this, an approach based on Kohonen networks is proposed to classify the asset management companies operating in the Ukrainian market. Once an asset management company is chosen, it receives the pension contributions of the fund's participants and must invest them profitably. The paper therefore also proposes an approach for choosing the most profitable investment project using decision trees. Because the new pension system was ratified into law only four years ago and is still developing, these questions are timely.
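To make the classification step concrete, below is a minimal sketch of a Kohonen self-organizing map written from scratch in NumPy, grouping asset management companies by hypothetical feature vectors (e.g., assets under management, mean return, volatility). The features and data are invented for illustration and are not taken from the paper.

```python
import numpy as np
from collections import defaultdict

# Hypothetical, normalized AMC features: [assets under management,
# mean annual return, return volatility] -- illustrative values only.
rng = np.random.default_rng(0)
amcs = rng.random((20, 3))

# A tiny Kohonen map: a 3x3 grid of weight vectors in feature space.
grid_w, grid_h, dim = 3, 3, amcs.shape[1]
weights = rng.random((grid_w, grid_h, dim))
coords = np.array([[(i, j) for j in range(grid_h)]
                   for i in range(grid_w)], dtype=float)

def winner(x):
    """Index of the map node whose weight vector is closest to x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

for epoch in range(200):
    lr = 0.5 * np.exp(-epoch / 100)       # decaying learning rate
    sigma = 1.5 * np.exp(-epoch / 100)    # decaying neighbourhood radius
    for x in amcs:
        w = np.array(winner(x), dtype=float)
        # Gaussian neighbourhood pulls nearby nodes toward the sample.
        dist2 = ((coords - w) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)

# Each AMC is assigned to the cluster of its winning node.
clusters = defaultdict(list)
for k, x in enumerate(amcs):
    clusters[winner(x)].append(k)
print(dict(clusters))
```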
Abstract:
We overview our recent developments in the theory of dispersion-managed (DM) solitons in the context of optical applications. First, we present a class of localized solutions whose period is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced model based on ordinary differential equations, we discuss the key features of these structures, such as their smaller energy compared with traditional DM solitons of the same temporal width. Next, we present new results on dissipative DM solitons, which occur in mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
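As a generic illustration of the underlying model (not the authors' reduced ODE or variational models), here is a minimal split-step Fourier sketch of the normalized nonlinear Schrödinger equation i u_z + d(z)/2 u_tt + |u|^2 u = 0 with a piecewise-constant, periodically alternating dispersion map; all parameter values are arbitrary placeholders.

```python
import numpy as np

# Time grid and corresponding angular frequencies.
nt, t_max = 1024, 20.0
t = np.linspace(-t_max, t_max, nt, endpoint=False)
omega = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])

u = np.exp(-t**2)                # Gaussian input pulse (placeholder)
dz, n_steps = 0.01, 4000
L_map = 1.0                      # dispersion-map period (placeholder)
d_plus, d_minus = 10.0, -9.0     # segment dispersions (placeholders)

def d_of_z(z):
    """Piecewise-constant periodic dispersion map."""
    return d_plus if (z % L_map) < L_map / 2 else d_minus

for n in range(n_steps):
    d = d_of_z(n * dz)
    # Strang splitting: half linear (dispersive) step in Fourier space...
    u = np.fft.ifft(np.fft.fft(u) * np.exp(-0.25j * d * omega**2 * dz))
    # ...full nonlinear step in the time domain...
    u = u * np.exp(1j * np.abs(u)**2 * dz)
    # ...second half linear step.
    u = np.fft.ifft(np.fft.fft(u) * np.exp(-0.25j * d * omega**2 * dz))

print("pulse energy:", (np.abs(u)**2).sum() * (t[1] - t[0]))
```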
Abstract:
In this study, the authors investigate the outage-optimal relay strategy under outdated channel state information (CSI) in a decode-and-forward cooperative communication system. They first confirm mathematically that minimising the outage probability under outdated CSI is equivalent to minimising the conditional outage probability given the outdated CSI of all the decodable relays' links. They then propose a multiple-relay strategy with optimised transmitting power allocation (MRS-OTPA) that minimises this conditional outage probability, and show that it is a generalised relay approach achieving outage optimality under outdated CSI. To reduce complexity, they also propose an MRS with equal transmitting power allocation (MRS-ETPA) that achieves near-optimal outage performance. It is proved that full spatial diversity, achievable under ideal CSI, can still be achieved under outdated CSI through MRS-OTPA and MRS-ETPA. Finally, the outage performance and diversity order of MRS-OTPA and MRS-ETPA are evaluated by simulation.
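The effect of outdated CSI can be seen in a short Monte Carlo experiment. The sketch below is not the authors' MRS-OTPA/MRS-ETPA schemes; it models the common Gauss-Markov assumption in which the actual channel is correlated with the outdated estimate through a coefficient rho, selects a single relay on the outdated estimate, and estimates the resulting outage probability. All parameters are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200_000
rho = 0.9          # correlation between outdated and actual CSI (placeholder)
snr_avg = 10.0     # average SNR, linear scale (placeholder)
snr_th = 1.0       # outage threshold (placeholder)
n_relays = 4

# Outdated Rayleigh-fading channel estimates for each relay.
h_old = (rng.standard_normal((n_trials, n_relays)) +
         1j * rng.standard_normal((n_trials, n_relays))) / np.sqrt(2)

# Select the relay that looks best on the *outdated* estimate.
best = np.argmax(np.abs(h_old)**2, axis=1)
h_sel = h_old[np.arange(n_trials), best]

# Actual channel at transmission time: Gauss-Markov evolution.
e = (rng.standard_normal(n_trials) +
     1j * rng.standard_normal(n_trials)) / np.sqrt(2)
h_now = rho * h_sel + np.sqrt(1 - rho**2) * e

# Outage: instantaneous SNR of the selected link falls below threshold.
outage = np.mean(snr_avg * np.abs(h_now)**2 < snr_th)
print(f"estimated outage probability: {outage:.4f}")
```

Lowering rho in this toy model raises the outage probability, which is the degradation the paper's optimised power allocation is designed to combat.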
Abstract:
While most studies take a dyadic view when examining the environmental difference between the home country of a multinational enterprise (MNE) and a particular foreign country, they ignore that an MNE manages a network of subsidiaries embedded in diverse environments. Additionally, neither the impact of global environments on top executives nor the effect of top executives' capabilities to handle institutional complexity has been fully explored. Using a three-essay format, this dissertation fills these gaps by addressing the effects of institutional complexity and top management characteristics on top executive compensation and firm performance. Essay 1 investigated the impact of an MNE's institutional complexity, or the diversity of national institutions facing an MNE's network of subsidiaries, on top management team (TMT) compensation. It proposed that greater political and cultural complexity leads not only to greater TMT total compensation but also to a greater portion of TMT compensation linked with long-term performance. The arguments are supported using an unbalanced panel dataset of 296 U.S. firms with 1,340 observations. Essay 2 explored TMT social capital and its moderating role in value creation and appropriation by the chief executive officer (CEO). Using a sample of 548 U.S. firms and 2,010 observations, it found that greater TMT social capital facilitates the effects of CEO intellectual capital and social capital on firm growth. Finally, Essay 3 examined the performance implications of the fit between managerial information-processing capabilities and institutional complexity. It proposed that institutional complexity is associated with information-processing needs, while smaller TMT turnover and larger TMT size reflect larger managerial information-processing capabilities; superior performance is therefore achieved by the match among institutional complexity, TMT turnover, and TMT size. All hypotheses in Essay 3 are supported in a sample of 301 U.S. firms and 1,404 observations. To conclude, this dissertation advances and extends our knowledge of the roles of institutional environments and top executives in firm performance and top executive compensation.
Abstract:
A number of factors influence the information processing needs of organizations, particularly with respect to the coordination and control mechanisms within a hotel. The authors use a theoretical framework to illustrate alternative mechanisms that can be used to coordinate and control hotel operations.
Abstract:
Technology will play an increasingly large role in the education of students within the hospitality curriculum. A significant number of emerging educational technologies are aimed at changing the delivery of the entire curriculum. The development of technological platforms for multimedia instructional courseware, distance learning through audiographics, and virtual reality simulation is expected to alter and enhance the learning process while extending the boundaries of the traditional hospitality classroom.
Abstract:
In the wake of the 9/11 terrorist attacks, the U.S. Government has turned to information technology (IT) to address a lack of information sharing among law enforcement agencies. This research determined if and how information-sharing technology helps law enforcement by examining differences in the perceived value of IT between law enforcement officers who have access to automated regional information sharing and those who do not. It also examined the effect of potential intervening variables, such as user characteristics, training, and experience, on the officers' evaluation of IT. The sample was limited to 588 officers from two sheriff's offices; one (the study group) uses information-sharing technology, the other (the comparison group) does not. Triangulated methodologies included surveys, interviews, direct observation, and a review of agency records. Data analysis involved descriptive statistics, Chi-Square tests, factor analysis, principal component analysis, Cronbach's Alpha, Mann-Whitney tests, analysis of variance (ANOVA), and Scheffé post hoc analysis. Results indicated a significant difference between groups: the study group perceived information-sharing technology as a greater factor in solving crime and in increasing officer productivity, and was more satisfied with the data available to it. As to the number of arrests made, information-sharing technology did not make a difference. Analysis of the potential intervening variables revealed several notable results. The presence of a strong performance-management imperative (in the comparison sheriff's office) appeared to be a factor in case clearances and arrests, technology notwithstanding. As to user characteristics, level of education did not influence a user's satisfaction with technology, but user-satisfaction scores differed significantly by years of law enforcement experience and amount of computer training, suggesting a significant but weak relationship. This study therefore finds that information-sharing technology assists law enforcement officers in doing their jobs, and suggests that other variables, such as computer training, experience, and management climate, should be accounted for when assessing the impact of information technology.
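One of the listed methods, the Mann-Whitney test, compares two groups on ordinal scores without assuming normality. A minimal sketch with SciPy, using invented satisfaction scores rather than the study's data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# Invented 5-point user-satisfaction scores, for illustration only.
study_group = rng.integers(3, 6, size=40)       # officers with info sharing
comparison_group = rng.integers(1, 5, size=40)  # officers without it

# Two-sided test of whether the score distributions differ.
stat, p = mannwhitneyu(study_group, comparison_group,
                       alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```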
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events, and they must handle large volumes of unexpected events, often modified on the fly and containing conflicting information, in rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications, and this dissertation addresses that critical challenge. It establishes an effective scheme for complex-event semantic correlation that examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because event detection is distributed, time delays are considered: events are no longer instantaneous but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to converge faster and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected: a belief value is associated with the semantics and the detection of composite events, generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
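The belief-theoretic fusion described here can be illustrated with Dempster's rule of combination over a two-element frame {event, no event}. This is a generic textbook version of the rule, not the dissertation's scheme; the sensor mass values are invented.

```python
# Dempster's rule over the frame {E (event occurred), N (no event)},
# with mass also assignable to the whole frame EN (uncertainty).
def combine(m1, m2):
    """Combine two mass functions given as dicts over {'E', 'N', 'EN'}."""
    conflict = m1['E'] * m2['N'] + m1['N'] * m2['E']
    k = 1.0 - conflict                  # normalization constant
    m = {
        'E': (m1['E'] * m2['E'] + m1['E'] * m2['EN']
              + m1['EN'] * m2['E']) / k,
        'N': (m1['N'] * m2['N'] + m1['N'] * m2['EN']
              + m1['EN'] * m2['N']) / k,
    }
    m['EN'] = 1.0 - m['E'] - m['N']     # remaining mass on the whole frame
    return m

# Two network entities report the same event with different confidence.
sensor_a = {'E': 0.7, 'N': 0.1, 'EN': 0.2}
sensor_b = {'E': 0.6, 'N': 0.2, 'EN': 0.2}
fused = combine(sensor_a, sensor_b)
print(fused)   # belief in 'E' rises when independent reports agree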
Abstract:
A primary goal of context-aware systems is delivering the right information at the right place and right time so that users can make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the user's context (location, preferences, behavioral history, etc.), and delivering it in a timely manner without an explicit request. These requirements create a paradigm that we term "Proactive Context-aware Computing". Most existing context-aware systems fulfill only a subset of these requirements: many focus only on personalizing requested information based on the user's current context, they are often designed for specific domains, and most are reactive - the user requests some information and the system delivers it. Such systems cannot anticipate users' intent and behavior and act proactively without an explicit request. To overcome these limitations, we need a deeper analysis and understanding of context-aware systems that are generic, universal, proactive and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Clearly the most significant sources of information about users today are smartphones: a large amount of a user's context can be acquired through them, and they are an effective means of delivering information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform for mining users' interests, preferences and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated several approaches, including some as part of the Rover framework, a context-aware research platform that has been evolving for the last six years. Since location is one of the most important contexts for users, we have developed 'Locus', an indoor localization, tracking and navigation system for multi-story buildings. Another important dimension of a user's context is the activities they are engaged in; to this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors to perform multidimensional context and activity recognition. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks and other user concerns with smartphone-based personal sensing systems and applications. To determine what information is relevant to a user's situation, we have developed 'TellMe', a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. To personalize the relevant information, we have also developed an algorithm and system for mining a broad range of user preferences from social network profiles and activities.
To recommend new information to users based on their past behavior and context history (such as visited locations, activities and time), we have developed a recommender system and approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users' behavior. To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques to build diverse behavioral models of users; examples include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device-charging behavior. Finally, to enable proactivity in context-aware systems, we have developed a planning framework based on HTN planning. Together, these works provide a major push in the direction of proactive context-aware computing.
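As a generic illustration of multi-dimensional collaborative recommendation via tensor factorization (not the Rover system's implementation), the sketch below fits a rank-R CP decomposition to a small user x item x context tensor by stochastic gradient descent on the observed entries; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
n_users, n_items, n_ctx, rank = 8, 10, 4, 3

# Sparse synthetic ratings: (user, item, context) -> rating in [1, 5].
observed = {(rng.integers(n_users), rng.integers(n_items),
             rng.integers(n_ctx)): rng.uniform(1, 5) for _ in range(60)}

# CP factors, one matrix per tensor mode.
U = 0.1 * rng.standard_normal((n_users, rank))
V = 0.1 * rng.standard_normal((n_items, rank))
C = 0.1 * rng.standard_normal((n_ctx, rank))

lr, reg = 0.05, 0.01
for epoch in range(200):
    for (u, i, c), r in observed.items():
        # CP model: prediction = sum_k U[u,k] * V[i,k] * C[c,k]
        err = r - np.sum(U[u] * V[i] * C[c])
        gu = err * V[i] * C[c] - reg * U[u]
        gv = err * U[u] * C[c] - reg * V[i]
        gc = err * U[u] * V[i] - reg * C[c]
        U[u] += lr * gu
        V[i] += lr * gv
        C[c] += lr * gc

# Predicted score for an unseen (user, item) pair in a given context.
print(np.sum(U[0] * V[1] * C[2]))
```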
Abstract:
It has been less than thirty years since a group of graduate students and computer scientists working on a federal contract performed the first successful connection between two computers located at remote sites. This group, known as the Network Working Group (NWG), comprised highly creative geniuses who, as soon as they began meeting, started talking about things like intellectual graphics, cooperating processes, automation questions, email, and many other interesting possibilities. In 1968 the group's task was to design the NWG's first computer network; in October 1969 the first data exchange occurred, and by the end of that year a network of four computers was in operation. Since the invention of the telephone in 1876, no other technology has revolutionized the field of communications as much as the computer network. Many people have made great contributions to the creation and development of the Internet; the computer network, far more complex than the telephone, is the work of people of many nationalities and cultures. Some years later, in 1973, the computer scientists Robert Kahn and Vinton Cerf created a more sophisticated communication program, the Transmission Control Protocol / Internet Protocol (TCP/IP), which is still in force on the Internet today.
Abstract:
This study aimed to survey farmers' knowledge and practices on the management of pastures, stocking rates and markets of meat goat-producing enterprises within New South Wales and Queensland, Australia. An interview-based questionnaire was conducted on properties that derived a significant proportion of their income from goats. The survey covered 31 landholders with a total land area of 567 177 ha and a reported total of 160 010 goats. A total of 55% (17/31) of producers were involved in both opportunistic harvesting and commercial goat operations, and 45% (14/31) were specialised seedstock producers. Goats were the most important livestock enterprise on 55% (17/31) of surveyed properties. Stocking rate varied considerably (0.3 to 9.3 goats/ha) within and across surveyed properties and was negatively associated with property size and positively associated with rainfall. Overall, 81% (25/31) of producers reported that the purpose of running goats on their properties was to target international markets, and producers cited targeting markets as a way to increase profitability. Fifty-three percent of producers were located over 600 km from a processing plant, and the high cost of freight can limit the continuity of goats supplied to abattoirs. Fencing was an important issue for goat farmers, with many producers acknowledging it could add to the capital costs associated with better goat management and production. Producers in the pastoral regions appear to have a low investment in pasture development, and opportunistic goat harvesting appears to be an important source of income.
Abstract:
Despite its great potential as a low-to-medium-temperature waste heat recovery (WHR) solution, ORC technology presents open challenges that still prevent its diffusion in the market, and these differ depending on the application and the size at stake. In the micro power range with low-temperature heat sources, ORC technology is still not mature owing to the lack of appropriate machines and working fluids; in the medium to large size range, the technology is already available but the investment is still risky. This thesis addresses some topical themes in the ORC field, paying special attention to the development of reliable models based on realistic data and accounting for the off-design performance of the ORC system and of each of its components. Concerning the "Micro-generation" application, this work: i) explores the modelling methodology, performance and optimal parameters of reciprocating piston expanders; ii) investigates the performance of such an expander and of the whole micro-ORC system when using hydrofluorocarbons as working fluids, or their new low-GWP alternatives and mixtures; iii) analyzes an innovative reversible ORC architecture (conceived for energy storage), its optimal regulation strategy and its potential when inserted in typical small industrial frameworks. Regarding the "Industrial WHR" sector, this thesis examines the WHR opportunity of ORCs, with a focus on natural gas compressor stations. The work surveys the parameters that can influence the optimal sizing, the performance and thus the feasibility of installing an ORC system, and explores new WHR configurations: i) replacing a compressor prime mover with an ORC; ii) using a supercritical CO2 cycle as the heat recovery system.
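To give a feel for the basic thermodynamics underlying such models, here is a minimal first-law estimate of a subcritical, saturated ORC cycle using the CoolProp library. The working fluid, pressures and component efficiencies are placeholders, and the off-design behaviour central to the thesis is not modelled.

```python
from CoolProp.CoolProp import PropsSI

fluid = "R245fa"               # placeholder working fluid
p_evap, p_cond = 10e5, 2e5     # evaporating/condensing pressure [Pa], placeholders
eta_pump, eta_exp = 0.7, 0.75  # isentropic efficiencies, placeholders

# State 1: saturated liquid leaving the condenser.
h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)
rho1 = PropsSI("D", "P", p_cond, "Q", 0, fluid)

# 1 -> 2: pump, incompressible-liquid approximation.
h2 = h1 + (p_evap - p_cond) / rho1 / eta_pump

# State 3: saturated vapour leaving the evaporator.
h3 = PropsSI("H", "P", p_evap, "Q", 1, fluid)
s3 = PropsSI("S", "P", p_evap, "Q", 1, fluid)

# 3 -> 4: expander, via the isentropic reference state 4s.
h4s = PropsSI("H", "P", p_cond, "S", s3, fluid)
h4 = h3 - eta_exp * (h3 - h4s)

w_net = (h3 - h4) - (h2 - h1)  # specific net work [J/kg]
q_in = h3 - h2                 # specific heat input [J/kg]
print(f"thermal efficiency: {w_net / q_in:.3f}")
```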
Abstract:
Chagas disease is still a major public health problem in Latin America. Its causative agent, Trypanosoma cruzi, can be typed into three major groups, T. cruzi I, T. cruzi II and hybrids. These groups each have specific genetic characteristics and epidemiological distributions. Several highly virulent strains are found in the hybrid group; their origin is still a matter of debate. The null hypothesis is that the hybrids are of polyphyletic origin, evolving independently from various hybridization events. The alternative hypothesis is that all extant hybrid strains originated from a single hybridization event. We sequenced both alleles of genes encoding EF-1 alpha, actin and SSU rDNA of 26 T. cruzi strains and DHFR-TS and TR of 12 strains. This information was used for network genealogy analysis and Bayesian phylogenies. We found T. cruzi I and T. cruzi II to be monophyletic and that all hybrids had different combinations of T. cruzi I and T. cruzi II haplotypes plus hybrid-specific haplotypes. Bootstrap values (networks) and posterior probabilities (Bayesian phylogenies) of clades supporting the monophyly of hybrids were far below the 95% confidence interval, indicating that the hybrid group is polyphyletic. We hypothesize that T. cruzi I and T. cruzi II are two different species and that the hybrids are extant representatives of independent events of genome hybridization, which sporadically have sufficient fitness to impact on the epidemiology of Chagas disease.
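The monophyly test at the heart of this argument can be sketched with Biopython. The toy Newick topology below is invented purely for illustration: hybrid haplotypes placed inside both parental clades fail the monophyly check, mirroring the polyphyly conclusion.

```python
from io import StringIO
from Bio import Phylo

# Invented toy genealogy: one hybrid haplotype nests with T. cruzi I,
# another with T. cruzi II, so the hybrids do not form a single clade.
newick = "(((TcI_1,TcI_2),Hyb_Ilike),((TcII_1,TcII_2),Hyb_IIlike));"
tree = Phylo.read(StringIO(newick), "newick")

def monophyletic(prefix):
    """True if all tips whose names start with prefix form one clade."""
    tips = [t for t in tree.get_terminals() if t.name.startswith(prefix)]
    return bool(tree.is_monophyletic(tips))

for group in ("TcI_", "TcII_", "Hyb"):
    print(group, "monophyletic:", monophyletic(group))
# Expected: TcI_ True, TcII_ True, Hyb False (polyphyletic hybrids).
```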