Abstract:
This paper is a deductive theoretical enquiry into the flow of effects from the geometry of price bubbles/busts, to price indices, to pricing behaviours of sellers and buyers, and back to price bubbles/busts. The intent of the analysis is to suggest analytical approaches to identify the presence, maturity, and/or sustainability of a price bubble. We present a pricing model to emulate market behaviour, including numeric examples and charts of the interaction of supply and demand. The model extends into dynamic market solutions with myopic (single- and multi-period) backward-looking rational expectations to demonstrate how buyers and sellers interact to affect supply and demand, and to show how capital gain expectations can be a destabilising influence – i.e. the lagged effects of past price gains can drive the market price away from long-run market-worth. Investing based on the outputs of past price-based valuation models appears to be more of a game-of-chance than a sound investment strategy.
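The abstract does not give the model's equations, but the destabilising role of lagged capital-gain expectations can be illustrated with a minimal sketch. The dynamic below is an assumption for illustration, not the paper's actual model; all names and parameter values are hypothetical:

```python
import numpy as np

# Minimal sketch (not the paper's model): a price series in which demand
# responds to extrapolated past capital gains. `worth` is the long-run
# market-worth, `alpha` the strength of backward-looking capital-gain
# expectations, `beta` the pull back toward worth.
def simulate_prices(worth=100.0, alpha=1.2, beta=0.1, periods=60, shock=5.0):
    p = [worth, worth + shock]  # an initial price shock starts the feedback
    for _ in range(periods):
        expected_gain = p[-1] - p[-2]          # myopic single-period lookback
        next_p = p[-1] + alpha * expected_gain - beta * (p[-1] - worth)
        p.append(next_p)
    return np.array(p)

prices = simulate_prices()
print(prices[:10])  # with alpha > 1 the lagged gains push price away from worth
```

With `alpha` above 1, each period's extrapolated gain more than reproduces itself and price diverges from `worth`; with `alpha` well below 1 the reversion term dominates and price returns to worth, which is the bubble/bust geometry the abstract describes.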
Abstract:
As part of a Doctor of Business Administration degree programme jointly run by Curtin University, Perth, Australia and Lingnan University, Hong Kong, a research thesis relating organizational effectiveness to the organizational culture of Hong Kong construction firms involved in public housing is being undertaken. Organizational effectiveness is measured by the Housing Department (HD) Performance Assessment Scoring System (PASS), and organizational culture traits and strengths have been measured using the Denison Organizational Culture Survey (OCS), developed by Daniel Denison and William S. Neale and based on 16 years of research involving over 1,000 organizations. The PASS scores of building contractors are compared with the OCS scores to determine whether there is any significant correlation between highly effective companies and particular organizational strengths and traits. Profiles are then drawn using the Denison Model and can be compared against ‘norms’ for the industry sector in which the survey has been carried out. The next stage of the work is to present the results of the survey to individual companies, conduct focus group interviews to test the results, discover more detail about each company’s culture and discuss possible actions based on the results. It is in this latter stage that certain value management techniques may well prove very useful.
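As a purely illustrative sketch of the correlation step (the thesis's actual data and statistical treatment are not given in the abstract; every score below is hypothetical), a non-parametric test such as Spearman's rank correlation could relate PASS effectiveness scores to OCS culture scores:

```python
from scipy.stats import spearmanr

# Hypothetical paired scores for eight contractors (illustration only).
pass_scores = [72, 65, 88, 91, 58, 77, 83, 69]          # HD PASS effectiveness
ocs_scores = [3.1, 2.8, 3.9, 4.2, 2.5, 3.4, 3.7, 3.0]   # Denison OCS trait strength

rho, p_value = spearmanr(pass_scores, ocs_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A significant positive rho would suggest stronger culture traits
# accompany higher PASS effectiveness.
```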
Abstract:
This article explores the new realities of the permissions culture and “all rights reserved copyright” in the networked environment and poses the question: why is lending a copy of a book sharing, but emailing a PDF of it piracy? It explores new approaches to the publishing and distribution of books by highlighting two books in the Aduki Independent Press catalogue. It was modeled on a presentation delivered by Elliott Bledsoe at the Changing Climates in Arts Publishing forum run by Artlink and the Copyright Agency Limited in Adelaide, Australia on 9 May 2009 and in Sydney, Australia on 27 June 2009.
Abstract:
Business Process Management (BPM) has increased in popularity and maturity in recent years. Large enterprises use process management approaches to model, manage and refine repositories of process models that detail the whole enterprise. These process models can run into the thousands, and may contain large hierarchies of tasks and control structures that become cumbersome to maintain. Tools are therefore needed to traverse this process model space efficiently; otherwise the repositories remain hard to use and their effectiveness is diminished. In this paper we analyse a range of BPM tools for their effectiveness in handling large process models. We establish that the present set of commercial tools is lacking in key areas regarding visualisation of, and interaction with, large process models. We then propose six tool functionalities for the development of advanced business process visualisation and interaction, and present a design for a tool that will exploit the latest advances in 2D and 3D computer graphics to enable fast and efficient search, traversal and modification of process models.
Abstract:
The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice, since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; that deduces the non-linear relationship between measured condition data and actual asset health; and that involves minimal assumptions and requirements. This work presents a novel approach to addressing the above-mentioned challenges. The proposed model consists of a feed-forward neural network, the training targets of which are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator is able to model the actual survival status of individual failed units and estimate the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories instead of from reliability data. The estimated survival probability and the relevant condition histories are respectively presented as “training target” and “training input” to the neural network. The trained network is capable of estimating the future survival curve of a unit when a series of condition indices is input. Although the concept proposed may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model with that of four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model, but which neglected suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with a useful representation of survival probabilities. This work presents a compelling concept for non-parametric data-driven prognosis, and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds promise for increased asset availability, maintenance cost-effectiveness, operational safety and – ultimately – organisational competitiveness.
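A minimal sketch of the Kaplan-Meier step behind the training targets may help. This is the textbook estimator, a simplification of the adapted estimator described above, and the data are hypothetical:

```python
import numpy as np

# Minimal Kaplan-Meier sketch handling suspended (censored) units.
# `times` are ages at failure or suspension; `failed` is False for
# suspended units, which leave the risk set without a failure event.
def kaplan_meier(times, failed):
    order = np.argsort(times)
    times, failed = np.asarray(times)[order], np.asarray(failed)[order]
    n_at_risk = len(times)
    survival, s = [], 1.0
    for t, f in zip(times, failed):
        if f:                         # only failures reduce the estimate...
            s *= (n_at_risk - 1) / n_at_risk
        survival.append(s)
        n_at_risk -= 1                # ...but suspensions shrink the risk set
    return times, np.array(survival)

# Units suspended at 85 and 150 hours still inform the estimate via the risk set.
t, s = kaplan_meier([120, 85, 200, 150, 95], [True, False, True, False, True])
print(list(zip(t, s)))
```

The resulting survival probabilities, paired with the corresponding condition histories, are the kind of target/input pairs the abstract describes feeding to the feed-forward network.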
Abstract:
Introduction: The Google Online Marketing Challenge is a global competition in which student teams run advertising campaigns for small and medium-sized enterprises (SMEs) using AdWords, Google’s text-based advertisements. In 2008, its inaugural year, over 8,000 students and 300 instructors from 47 countries representing over 200 schools participated. The Challenge ran in undergraduate and graduate classes in disciplines such as marketing, tourism, advertising, communication and information systems. Combining advertising and education, the Challenge gives students hands-on experience in the increasingly important field of online marketing, engages them with local businesses and motivates them through the thrill of a global competition. Student teams receive US$200 in credits for AdWords, Google’s premier cost-per-click advertising product. The teams then recruit and work with a local business to devise an effective online marketing campaign. Students first outline a strategy, run a series of campaigns, and provide their business with recommendations to improve its online marketing. Teams submit two written reports for judging by 14 academics in eight countries. In addition, Google AdWords experts judge teams on campaign statistics such as success metrics and account management. Rather than a marketing simulation against a computer or hypothetical marketing plans for hypothetical businesses, the Challenge has student teams develop and manage real online advertising campaigns for their clients and compete against peers globally.
Abstract:
Hope is a word that has re-emerged in light of Obama's stunning win in the United States election. In this time of economic gloom, bleak recession and unprecedented job losses, the United States has embraced the hopeful message of Barack Obama. For many years 'hope' has been a word that has been lost, forgotten, and banished to the margins of romantic longing and wishful thinking. Hope is also a word that has been much discussed in relation to the iconic The Great Gatsby, but usually in a negative fashion, to demonstrate the unattainability of the American dream. Marcella Taylor called Gatsby "the unfinished American Epic", one focused on the "passing of the last utopian frontier", and suggested the significance of this passing for American society as a whole. In the last months, however, hope has made a return, and one gets the feeling that Fitzgerald's words "but that's no matter – to-morrow we will run faster, stretch out our arms farther . . . And one fine morning" are once again being heard.
Abstract:
Survey-based health research is in a boom phase following increased health spending in OECD countries and growing interest in ageing. A general characteristic of survey-based health research is its diversity. Different studies are based on different health questions in different datasets; they use different statistical techniques; they differ in whether they approach health from an ordinal or cardinal perspective; and they differ in whether they measure short-term or long-term effects. The question in this paper is simple: do these differences matter for the findings? We investigate the effects of life-style choices (drinking, smoking, exercise) and income on six measures of health in the US Health and Retirement Study (HRS) between 1992 and 2002: (1) self-assessed general health status, (2) problems with undertaking daily tasks and chores, (3) mental health indicators, (4) BMI, (5) the presence of serious long-term health conditions, and (6) mortality. We compare ordinal models with cardinal models; we compare models with fixed effects to models without fixed effects; and we compare short-term effects to long-term effects. We find considerable variation in the impact of different determinants on our chosen health outcome measures; we find that it matters whether ordinality or cardinality is assumed; we find substantial differences between estimates that account for fixed effects versus those that do not; and we find that short-run and long-run effects differ greatly. All this implies that health is an even more complicated notion than hitherto thought, defying generalizations from one measure to the others or one methodology to another.
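A toy sketch of the modelling choices being compared (synthetic data; not the paper's HRS specification or variable set) shows how cardinal, ordinal and fixed-effects treatments of the same outcome can yield different estimates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Synthetic panel: a 1-5 self-assessed health outcome, a smoking indicator
# and income, for 200 people observed over 5 waves (illustration only).
rng = np.random.default_rng(0)
n, t = 200, 5
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n), t),
    "smokes": rng.integers(0, 2, n * t),
    "income": rng.normal(50, 15, n * t),
})
latent = -0.5 * df["smokes"] + 0.02 * df["income"] + rng.normal(0, 1, n * t)
df["health"] = pd.cut(latent, bins=5, labels=False) + 1  # 1 (poor) .. 5 (excellent)

X = sm.add_constant(df[["smokes", "income"]])

# Cardinal treatment: OLS on the 1-5 scale as if it were an interval measure.
print(sm.OLS(df["health"], X).fit().params)

# Ordinal treatment: ordered logit on the same outcome (no constant;
# the thresholds absorb it).
print(OrderedModel(df["health"], df[["smokes", "income"]],
                   distr="logit").fit(method="bfgs", disp=False).params)

# Fixed effects (cardinal): within-person demeaning strips time-invariant traits.
dm = df.groupby("pid")[["health", "smokes", "income"]].transform(lambda s: s - s.mean())
print(sm.OLS(dm["health"], dm[["smokes", "income"]]).fit().params)
```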
Abstract:
Interactive development environments are making a resurgence. The traditional batch style of programming, edit -> compile -> run, is slowly being reevaluated by the development community at large. Languages such as Perl, Python and Ruby are at the heart of a new programming culture commonly described as extreme, agile or dynamic. Musicians are also beginning to embrace these environments and to investigate the opportunity to use dynamic programming tools in live performance. This paper provides an introduction to Impromptu, a new interactive development environment for musicians and sound artists.
Abstract:
This paper proposes a method for power flow control between a utility and a microgrid through back-to-back converters, which facilitates the desired real and reactive power flow between the utility and the microgrid. In the proposed control strategy, the system can run in two different modes depending on the power requirement in the microgrid. In mode-1, a specified amount of real and reactive power is shared between the utility and the microgrid through the back-to-back converters. Mode-2 is invoked when the power that can be supplied by the DGs in the microgrid reaches its maximum limit. In such a case, the rest of the power demand of the microgrid has to be supplied by the utility. An arrangement between the DGs in the microgrid is proposed to achieve load sharing in both grid-connected and islanded modes. The back-to-back converters also provide total frequency isolation between the utility and the microgrid. It is shown that voltage or frequency fluctuations on the utility side have no impact on voltage or power on the microgrid side. Proper relay-breaker operation coordination is proposed during faults, along with the blocking of the back-to-back converters, for seamless resynchronization. Both impedance and motor type loads are considered to verify system stability. The impact of dc-side voltage fluctuation of the DGs and of DG tripping on power sharing is also investigated. The efficacy of the proposed control arrangement has been validated through simulation for various operating conditions. The model of the microgrid power system is simulated in PSCAD.
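The paper's DG load-sharing arrangement is implemented and validated in PSCAD; as a rough illustration of the underlying idea only, a conventional P-f droop scheme (a standard technique, not necessarily the paper's exact arrangement; all names and values below are hypothetical) shares load among DGs in inverse proportion to their droop gains:

```python
# Illustrative sketch: conventional P-f droop load sharing among islanded DGs.
# Each DG i follows f = f_nom - m_i * P_i; at steady state all DGs sit at a
# common frequency, so load splits in inverse proportion to the gains m_i.
def droop_share(load_kw, ratings_kw, f_nom=50.0, df_max=0.5):
    # choose each droop gain so the DG spans df_max over its full rating
    m = [df_max / r for r in ratings_kw]
    # the common frequency deviation solves sum(df / m_i) = total load
    df = load_kw / sum(1.0 / mi for mi in m)
    return f_nom - df, [df / mi for mi in m]

f, shares = droop_share(load_kw=120.0, ratings_kw=[100.0, 50.0])
print(f, shares)  # DGs pick up load in proportion to their ratings
```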
Abstract:
Abandoned object detection (AOD) systems are required to run in high-traffic situations with high levels of occlusion. Such systems rely on background segmentation techniques to locate abandoned objects, detecting areas of motion that have stopped. This is often achieved by using a medium-term motion detection routine to detect long-term changes in the background. When AOD systems are integrated into person tracking systems, this often results in two separate motion detectors being used to handle the different requirements. We propose a motion detection system that is capable of detecting medium-term motion as well as regular motion. Multiple layers of medium-term (static) motion can be detected and segmented. We demonstrate the performance of this motion detection system both on its own and as part of an abandoned object detection system.
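One common way to realise a dual-requirement detector of this kind is to run one background model at a fast adaptation rate and another at a slow rate. The sketch below uses OpenCV's MOG2 subtractor to illustrate the idea; it is not the paper's own algorithm, and the input filename is hypothetical:

```python
import cv2

# Illustrative dual-rate sketch: a fast-adapting background model finds
# regular motion, a slow one retains stopped objects, and their difference
# exposes medium-term (static) motion.
fast_bg = cv2.createBackgroundSubtractorMOG2(history=100)
slow_bg = cv2.createBackgroundSubtractorMOG2(history=3000)

cap = cv2.VideoCapture("scene.mp4")  # hypothetical input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    moving = fast_bg.apply(frame)    # short-term motion
    changed = slow_bg.apply(frame)   # short- plus medium-term changes
    # Pixels changed in the slow model but quiet in the fast one have
    # stopped moving: candidate abandoned-object regions.
    static = cv2.bitwise_and(changed, cv2.bitwise_not(moving))
    cv2.imshow("static motion", static)
    if cv2.waitKey(1) == 27:         # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```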
Abstract:
Games and related virtual environments have been a much-hyped area of the entertainment industry. The classic quote is that games are now approaching the size of Hollywood box office sales [1]. Books are now appearing that talk up the influence of games on business [2], and games are one of the key drivers of present hardware development. Some of this 3D technology is now embedded right down at the operating system level via the Windows Presentation Foundation – hit Windows/Tab on your Vista box to find out... In addition to this continued growth in the area of games, there are a number of factors that support its uptake in the business community. Firstly, the average age of gamers is approaching the mid-thirties, so a number of people in management positions in large enterprises are experienced users of 3D entertainment environments. Secondly, thanks to the demand for more computational power in both CPUs and Graphical Processing Units (GPUs), the average desktop, and any decent laptop, can run a game or virtual environment. In fact, the demonstrations at the end of this paper were developed at the Queensland University of Technology (QUT) on a standard Software Operating Environment, with an Intel Dual Core CPU and a basic Intel graphics option. What this means is that the potential exists for the easy uptake of such technology because: 1. a broad range of workers has been regularly exposed to 3D virtual environment software via games; 2. present desktop computing power is now strong enough to roll out a virtual environment solution across an entire enterprise. We believe such visual simulation environments can have a great impact in the area of business process modeling. Accordingly, in this article we outline the communication capabilities of such environments, which open up fantastic possibilities for business process modeling applications, where enterprises need to create, manage, and improve their business processes, and then communicate those processes to stakeholders, both process and non-process cognizant. The article concludes with a demonstration of the work we are doing in this area at QUT.
Abstract:
Purpose: In 1970, Enright observed a distortion of perceived driving speed induced by monocular application of a neutral density (ND) filter. If a driver looks out of the right side of a vehicle with a filter over the right eye, the driver perceives a reduction of the vehicle’s apparent velocity, while applying an ND filter over the left eye increases the vehicle’s apparent velocity. The purpose of the current study was to provide the first empirical measurements of the Enright phenomenon. Methods: Ten experienced drivers were tested, each driving an automatic sedan on a closed road circuit. Filters (0.9 ND) were placed over the left, right or both eyes during a driving run, in addition to a control condition with no filters in place. Subjects were asked to look out of the right side of the car and adjust their driving speed to either 40 km/h or 60 km/h. Results: Without a filter, or with both eyes filtered, subjects showed good estimation of speed when asked to travel at 60 km/h but travelled a mean of 12 to 14 km/h faster than the requested 40 km/h. Subjects travelled faster than these baselines by a mean of 7 to 9 km/h (p < 0.001) with the filter over their right eye, and 3 to 5 km/h slower with the filter over their left eye (p < 0.05). Conclusions: The Enright phenomenon causes significant and measurable distortions of perceived driving speed under real-world driving conditions.
Abstract:
Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or breakouts of new automated malicious codes, such as worms. The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
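As a minimal illustration of the residual-space idea in the third outcome (a simplification of the thesis's method; the feature vectors and control limit below are synthetic and hypothetical):

```python
import numpy as np
from sklearn.decomposition import PCA

# Sketch of residual-space detection: traffic feature vectors are projected
# onto the first principal components; the squared prediction error (SPE,
# the Q statistic) of the reconstruction flags activity the model of normal
# honeypot traffic has not seen before.
rng = np.random.default_rng(1)
baseline = rng.normal(size=(500, 8))          # hypothetical honeypot features

pca = PCA(n_components=3).fit(baseline)

def spe(x):
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)

threshold = np.percentile(spe(baseline), 99)  # simple empirical control limit

new_traffic = rng.normal(size=(10, 8))
new_traffic[0] += 6                           # simulate a novel attack pattern
print(spe(new_traffic) > threshold)           # True marks candidate new attacks
```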