894 results for Averaging operators


Relevance:

10.00%

Publisher:

Abstract:

This chapter gives an overview of the smartphone app economy and its various constituent ecosystems. It examines the role of the app store model and the proliferation of mobile apps in the shift from value chains controlled by network operators and handset manufacturers to value networks, or ecosystems, centred on operating systems and apps. It outlines some of the benefits and disadvantages, for developers, of the app store model of remuneration and distribution. The chapter concludes with a discussion of recent research on the size and employment effects of the app economy.

Relevance:

10.00%

Publisher:

Abstract:

Airport efficiency is important because it has a direct impact on customer safety and satisfaction, and therefore on the financial performance and sustainability of airports, airlines, and affiliated service providers. This is especially so in a world characterized by an increasing volume of both domestic and international air travel; price and other forms of competition between rival airports, airport hubs, and airlines; and rapid and sometimes unexpected changes in airline routes and carriers. It also reflects expansion in the number of airports handling regional, national, and international traffic and the growth of complementary airport facilities, including industrial, commercial, and retail premises. These developments have fostered a steadily increasing volume of research aimed at modeling and providing best-practice measures and estimates of airport efficiency using mathematical and econometric frontiers. The purpose of this chapter is to review these various methods as they apply to airports throughout the world. Apart from discussing the strengths and weaknesses of the different approaches and their key findings, the chapter also examines the choices researchers face as they move through the modeling process, including the definition of airport inputs and outputs and the purported efficiency drivers. Accordingly, the chapter provides guidance to those conducting empirical research on airport efficiency and serves as an aid to aviation regulators, airport operators, and others in interpreting airport efficiency research outcomes.
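To make the frontier idea concrete, the sketch below solves a basic input-oriented CCR data envelopment analysis (DEA) program, one family of mathematical frontiers commonly used in this literature, for a toy set of airports. The data, the input/output choices, and the use of scipy.optimize.linprog are illustrative assumptions, not taken from the chapter.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 airports, inputs = (runways, staff in hundreds),
# outputs = (million passengers, thousand aircraft movements).
X = np.array([[2, 3], [3, 5], [4, 4], [2, 6]], dtype=float)
Y = np.array([[10, 80], [12, 90], [18, 120], [9, 70]], dtype=float)
n, m = X.shape           # number of airports, number of inputs
s = Y.shape[1]           # number of outputs

def ccr_efficiency(o):
    """Input-oriented CCR DEA score for airport o (1.0 = on the frontier)."""
    c = np.r_[1.0, np.zeros(n)]          # decision vars: [theta, lambda_1..n]
    # Input constraints: sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.c_[-X[o], X.T]
    # Output constraints: sum_j lambda_j * y_rj >= y_ro
    A_out = np.c_[np.zeros(s), -Y.T]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(n):
    print(f"airport {o}: efficiency = {ccr_efficiency(o):.3f}")
```

Econometric frontiers (e.g. stochastic frontier analysis) would instead fit a parametric production function with a one-sided inefficiency term; the linear-programming formulation above is the simplest non-parametric counterpart.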

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a generic, integrated solar-powered remote Unmanned Air Vehicle (UAV) and Wireless Sensor Network (WSN) gas sensing system. The system combines a generic gas sensing payload that measures CH4 and CO2 concentrations using metal oxide (MOX) and non-dispersive infrared sensors, a new solar cell encapsulation method to power the UAVs, and a data management platform to store, analyse and share the information with operators and external users. The system was successfully field tested on the ground and at low altitudes, collecting, storing and transmitting data in real time to a central node for analysis and 3D mapping. The system can be used in a wide range of outdoor applications, especially in agriculture, bushfire and mining studies, opening the way to ubiquitous, low-cost environmental monitoring. A video of the bench and flight tests can be seen at the following link: https://www.youtube.com/watch?v=Bwas7stYIxQ.

Relevance:

10.00%

Publisher:

Abstract:

The huge amount of CCTV footage available makes manual processing by human operators very burdensome, so automated processing of video footage through computer vision technologies has become necessary. Over the past several years there has been a large effort to detect abnormal activities using computer vision techniques. Typically, the problem is formulated as a novelty detection task, in which the system is trained on normal data and is required to detect events that do not fit the learned 'normal' model. There is no precise definition of an abnormal activity; it depends on the context of the scene. Hence different feature sets are required to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting abnormal objects in the scene: optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modeled using state-of-the-art models such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM) to analyse their performance. We further apply perspective normalization to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated on the publicly available UCSD datasets, and we demonstrate improved performance compared with other state-of-the-art methods.
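As a rough illustration of the novelty-detection formulation described above, the sketch below fits a Gaussian mixture model to feature vectors from 'normal' training data and flags test samples whose log-likelihood falls below a threshold. The random placeholder features, component count, and percentile threshold are assumptions for illustration, not the paper's actual pipeline of optical-flow and texture features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Train on features extracted from 'normal' footage only (novelty detection).
# train_feats / test_feats stand in for optical-flow or texture descriptors,
# one row per spatio-temporal block.
rng = np.random.default_rng(0)
train_feats = rng.normal(0.0, 1.0, size=(5000, 8))   # placeholder data
test_feats = rng.normal(0.0, 1.0, size=(200, 8))
test_feats[:10] += 4.0                               # injected anomalies

gmm = GaussianMixture(n_components=5, covariance_type='full',
                      random_state=0).fit(train_feats)

# Score test samples; low likelihood under the 'normal' model => abnormal.
log_lik = gmm.score_samples(test_feats)
threshold = np.percentile(gmm.score_samples(train_feats), 1)  # placeholder
is_abnormal = log_lik < threshold
print(f"{is_abnormal.sum()} of {len(test_feats)} blocks flagged abnormal")
```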

Relevance:

10.00%

Publisher:

Abstract:

Diagnostics of rolling element bearings involves a combination of different signal enhancement and analysis techniques. The most common procedure comprises a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been derived to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional pre-whitening step is needed before the envelope analysis. Different techniques have been proposed in the past for this purpose; the most widespread are linear prediction filters and spectral kurtosis. Recently, a new pre-whitening technique based on cepstral analysis has been proposed: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it is a good candidate for the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique is tested on data measured on a full-scale industrial bearing test rig able to reproduce harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques is made as a final verification of the potential of cepstrum pre-whitening.
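For readers unfamiliar with the technique, the sketch below shows a common formulation of cepstrum pre-whitening, setting the magnitude spectrum to unity while retaining the phase (equivalent to zeroing the real cepstrum, which is why the operation is so cheap), followed by a squared envelope spectrum. The synthetic signal and sampling rate are placeholders, not the test-rig data used in the paper.

```python
import numpy as np
from scipy.signal import hilbert

fs = 20000                          # sampling rate [Hz], placeholder
t = np.arange(0, 1.0, 1 / fs)
# Synthetic stand-in: a deterministic gear tone plus weak repetitive
# bursts mimicking a bearing fault, plus noise.
x = np.sin(2 * np.pi * 500 * t)
x += 0.2 * np.sin(2 * np.pi * 3000 * t) * (np.sin(2 * np.pi * 87 * t) > 0.99)
x += 0.1 * np.random.default_rng(0).normal(size=t.size)

# Cepstrum pre-whitening: unit magnitude spectrum, original phase.
X = np.fft.fft(x)
x_cpw = np.real(np.fft.ifft(X / np.abs(X)))

# Squared envelope spectrum of the whitened signal.
env2 = np.abs(hilbert(x_cpw)) ** 2
ses = np.abs(np.fft.rfft(env2 - env2.mean()))
freqs = np.fft.rfftfreq(env2.size, 1 / fs)
print("Dominant envelope line at %.1f Hz" % freqs[1:][ses[1:].argmax()])
```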

Relevance:

10.00%

Publisher:

Abstract:

The transmission path from the excitation to the measured vibration on the surface of a mechanical system introduces a distortion both in amplitude and in phase. Moreover, in variable speed conditions, the amplification/attenuation and the phase shift, due to the transfer function of the mechanical system, varies in time. This phenomenon reduces the effectiveness of the traditionally tachometer based order tracking, compromising the results of a discrete-random separation performed by a synchronous averaging. In this paper, for the first time, the extent of the distortion is identified both in the time domain and in the order spectrum of the signal, highlighting the consequences for the diagnostics of rotating machinery. A particular focus is given to gears, providing some indications on how to take advantage of the quantification of the disturbance to better tune the techniques developed for the compensation of the distortion. The full theoretical analysis is presented and the results are applied to an experimental case.
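As context for the distortion discussed above, the sketch below shows the conventional tachometer-based order-tracking step: resampling the vibration signal onto a uniform shaft-angle grid and synchronously averaging over revolutions. All signals here are synthetic placeholders; the paper's point is precisely that a time-varying transfer function distorts the output of this procedure.

```python
import numpy as np

fs = 10000                                  # sampling rate [Hz], placeholder
t = np.arange(0, 2.0, 1 / fs)
# Shaft speed ramps from 10 to 20 rev/s; shaft angle = integral of speed.
speed = 10 + 5 * t                          # [rev/s]
phase = np.cumsum(speed) / fs               # shaft angle [rev]
x = np.sin(2 * np.pi * 3 * phase)           # 3rd-order (shaft-synchronous) tone
x += 0.2 * np.random.default_rng(0).normal(size=t.size)

# Order tracking: resample onto a uniform angle grid (tachometer gives phase).
samples_per_rev = 256
n_revs = int(phase[-1])
angle_grid = np.arange(n_revs * samples_per_rev) / samples_per_rev
x_ang = np.interp(angle_grid, phase, x)

# Synchronous average over revolutions removes non-synchronous content.
sync_avg = x_ang.reshape(n_revs, samples_per_rev).mean(axis=0)
orders = np.fft.rfftfreq(samples_per_rev, 1 / samples_per_rev)
spectrum = np.abs(np.fft.rfft(sync_avg)) / samples_per_rev
print("Dominant order: %.1f" % orders[spectrum.argmax()])
```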

Relevance:

10.00%

Publisher:

Abstract:

Transit passenger market segmentation enables transit operators to target different classes of transit users and provide customized information and services. Smart Card (SC) data from automated fare collection systems facilitate the understanding of the multiday travel regularity of transit passengers and can be used to segment them into identifiable classes with similar behaviors and needs. However, the use of SC data for market segmentation has attracted very limited attention in the literature. This paper proposes a novel methodology for mining spatial and temporal travel regularity from each individual passenger's historical SC transactions and segmenting passengers into four classes of transit users. After reconstructing the travel itineraries from historical SC transactions, the paper adopts the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm to mine the travel regularity of each SC user. The travel regularity is then used to segment SC users by an a priori market segmentation approach. The methodology proposed in this paper helps transit operators to understand their passengers and provide them with targeted information and services.
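A minimal sketch of the clustering step, assuming each boarding record is reduced to a (hour of day, stop coordinates) triple: DBSCAN groups recurring time-and-place patterns while leaving one-off trips as noise. The placeholder data and the eps/min_samples values are illustrative, not the paper's calibrated settings.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Placeholder boarding records for one smart-card user:
# columns are (boarding hour of day, stop x [km], stop y [km]).
rng = np.random.default_rng(0)
commute_am = np.column_stack([rng.normal(8.0, 0.2, 40),
                              rng.normal(2.0, 0.05, 40),
                              rng.normal(5.0, 0.05, 40)])
commute_pm = np.column_stack([rng.normal(17.5, 0.3, 40),
                              rng.normal(6.0, 0.05, 40),
                              rng.normal(1.0, 0.05, 40)])
occasional = np.column_stack([rng.uniform(6, 23, 10),
                              rng.uniform(0, 8, 10),
                              rng.uniform(0, 8, 10)])
trips = np.vstack([commute_am, commute_pm, occasional])

# DBSCAN labels recurring (time, place) patterns 0, 1, ...; scattered
# one-off trips get the noise label -1. eps/min_samples are illustrative.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(trips)
n_regular = (labels >= 0).sum()
print(f"{labels.max() + 1} regular travel patterns; "
      f"{n_regular}/{len(trips)} trips are regular")
```

Per-user statistics such as the number of regular patterns and the share of regular trips could then feed an a priori segmentation into classes of users, as the abstract describes.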

Relevance:

10.00%

Publisher:

Abstract:

Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose the locations where measurements are taken so as to maximise the information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, to find a design that maximises the average utility. We use models for correlations of observations on the stream network that are based on stream network distances and described by moving average error models. The utility functions used reflect the needs of the experimenter, such as prediction of values at unobserved locations or estimation of parameters. We propose an algorithmic approach to design in which the mean utility of a design is estimated using Monte Carlo techniques and an exchange algorithm searches for optimal sampling designs. In particular, we focus on the problems of finding an optimal design from a set of fixed designs and of finding an optimal subset of a given set of sampling locations. As there are many different variables to measure at each location, such as chemical, physical and biological measurements, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and to the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space-filling to clustered designs and mixtures of these; given the utility function, however, designs are relatively robust to the type of response variable.
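A minimal sketch of the exchange-algorithm idea, under a toy space-filling utility that stands in for the prediction- or estimation-based utilities described above. Fixing the Monte Carlo draws once (common random numbers) keeps the search deterministic; all data and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
candidates = rng.uniform(0, 10, size=(40, 2))     # candidate sampling sites
pred_sites = rng.uniform(0, 10, size=(200, 2))    # fixed Monte Carlo draws

def utility(design):
    """Toy utility: negative mean distance from random 'prediction' sites
    to their nearest design point (a space-filling criterion)."""
    d = np.linalg.norm(pred_sites[:, None, :] - design[None, :, :], axis=2)
    return -d.min(axis=1).mean()

# Exchange algorithm: start from a random subset of the candidate sites,
# then repeatedly swap one design point for a candidate whenever the swap
# improves the (Monte Carlo estimated) utility.
n_points = 6
idx = list(rng.choice(len(candidates), n_points, replace=False))
best_u = utility(candidates[idx])
improved = True
while improved:
    improved = False
    for i in range(n_points):
        for j in range(len(candidates)):
            if j in idx:
                continue
            trial = idx.copy()
            trial[i] = j
            u = utility(candidates[trial])
            if u > best_u:
                idx, best_u, improved = trial, u, True

print("Selected sites:\n", candidates[idx].round(2))
```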

Relevance:

10.00%

Publisher:

Abstract:

There are a number of pressing issues facing contemporary online environments that are causing disputes among participants and platform operators and increasing the likelihood of external regulation. A number of solutions have been proposed, including industry self-governance, top-down regulation, and emergent self-governance such as EVE Online's "Council of Stellar Management". However, none of these solutions seems entirely satisfactory: they face challenges from developers, who fear regulators will not understand their platforms, and from players, who feel insufficiently empowered to influence the platform, while many authors have raised concerns over the implementation of top-down regulation and argued that the industry may be well served to pre-empt such action. This paper considers case studies of EVE Online and the offshore gambling industry, and asks whether a version of self-governance may be suitable for the future of the industry.

Relevance:

10.00%

Publisher:

Abstract:

Current governance challenges facing the global games industry are heavily dominated by online games. Whilst much academic and industry attention has been devoted to virtual worlds, the more pressing contemporary challenges may arise in casual games, especially those found on social networks. As authorities are faced with an increasing volume of disputes between participants and platform operators, the likelihood of external regulation increases, and the effect that such regulation would have on the industry, both internationally and within specific regions, is unclear. Kelly (2010) argues that "when you strip away the graphics of these [social] games, what you are left with is simply a button [...] You push it and then the game returns a value of either Win or Lose". He notes that while "every game developer wants their game to be played, preferably addictively, because it's so awesome", these mechanics lead not to "addiction of engagement through awesomeness" but "the addiction of compulsiveness", surmising that "the reality is that they've actually sort-of kind-of half-intentionally built a virtual slot machine industry". If such core elements of social game design are questioned, there is also cause to question the real-money options that circumvent them. With players able to purchase virtual currency and speed the completion of tasks, the money invested by the 20% of players purchasing in-game benefits (Zainwinger, 2012) may well be the result of compulsion. The decision by the Japanese Consumer Affairs Agency to investigate the 'Kompu Gacha' mechanic (in which players are rewarded for completing a set of items obtained by purchasing virtual goods such as mystery boxes), and the resulting verdict that such mechanics should be regulated under gambling legislation, demonstrates that politicians are beginning to look at the mechanics deployed in these environments. Purewal (2012) states that "there's a reasonable argument that complete gacha would be regulated under gambling law under at least some (if not most) Western jurisdictions". This paper explores the governance challenges within these games and platforms, their role in the global industry, and current practice amongst developers in Australia and the United States in addressing such challenges.

Relevance:

10.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information, be it information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted in response to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be placed immediately in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify users who either serve as amplifiers of information or are known as authoritative sources (a minimal version of such a scoring filter is sketched below). Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportspeople create Twitter accounts to communicate with their fans, information is being shared about injuries, form and emotions which has the potential to affect future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of them. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy for responding to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
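The content-analysis and user-profiling scoring described for the first paper can be sketched as a simple keyword- and author-weighted filter. Everything here (keywords, weights, threshold, tweet structure) is an illustrative assumption, not the panel's actual coding scheme.

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    author: str
    text: str

# Illustrative scoring tables; a real deployment would refine these
# iteratively as the event's keywords and hashtags drift over time.
TOPIC_TERMS = {"#flood": 3.0, "evacuate": 4.0, "road closed": 2.0, "help": 1.0}
TRUSTED_AUTHORS = {"emergency_svc": 5.0, "local_news": 2.0}

def score(tweet: Tweet) -> float:
    """Combine content analysis and user profiling into one urgency score."""
    text = tweet.text.lower()
    content = sum(w for term, w in TOPIC_TERMS.items() if term in text)
    authority = TRUSTED_AUTHORS.get(tweet.author, 0.0)
    return content + authority

stream = [
    Tweet("random_user", "Water rising fast, please help, road closed!"),
    Tweet("emergency_svc", "Evacuate zone B immediately #flood"),
    Tweet("random_user", "Great game last night"),
]
THRESHOLD = 3.0   # tuned to match responders' capacity
for t in sorted(stream, key=score, reverse=True):
    if score(t) >= THRESHOLD:
        print(f"[{score(t):.1f}] @{t.author}: {t.text}")
```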

Relevance:

10.00%

Publisher:

Abstract:

Many applications can benefit from the accurate surface temperature estimates that a passive thermal-infrared camera can provide. However, the process of radiometric calibration that enables this can be both expensive and time consuming. An ad hoc approach to radiometric calibration is proposed which does not require specialized equipment and can be completed in a fraction of the time of the conventional method. The proposed approach exploits the mechanical properties of the camera to estimate scene temperatures automatically, and uses these target temperatures to model the effect of sensor temperature on the digital output. A comparison with a conventional approach using a blackbody radiation source shows that the accuracy of the method is sufficient for many tasks requiring temperature estimation. Furthermore, a novel visualization method is proposed for displaying the radiometrically calibrated images to human operators. The representation employs an intuitive coloring scheme and allows the viewer to perceive a wide range of temperatures accurately.
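A minimal sketch of the kind of model fitting implied above: regressing digital counts on scene and sensor temperature so that scene temperature can be recovered from compensated counts. The linear form and all data are assumptions for illustration; the paper's actual model of sensor-temperature effects may differ.

```python
import numpy as np

# Placeholder calibration data: digital counts recorded while viewing
# targets of (roughly) known temperature, at varying sensor temperatures.
rng = np.random.default_rng(0)
scene_temp = rng.uniform(10, 60, 300)        # [deg C]
sensor_temp = rng.uniform(20, 40, 300)       # [deg C]
counts = 30 * scene_temp - 12 * sensor_temp + 8000 + rng.normal(0, 5, 300)

# Assume counts = a*scene + b*sensor + c and fit by least squares.
A = np.column_stack([scene_temp, sensor_temp, np.ones_like(counts)])
(a, b, c), *_ = np.linalg.lstsq(A, counts, rcond=None)

def estimate_scene_temp(count, t_sensor):
    """Invert the fitted model to estimate scene temperature [deg C]."""
    return (count - b * t_sensor - c) / a

print("Estimated: %.1f C" % estimate_scene_temp(30 * 35 - 12 * 25 + 8000, 25.0))
```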

Relevance:

10.00%

Publisher:

Abstract:

The performance of urban transit systems may be quantified and assessed using transit capacity and productive capacity in planning, design and operational management activities. Bunker (4) defines important productive performance measures of an individual transit service and transit line; this paper extends them to quantify the efficiency and operating fashion of transit services and lines. A comparison of a hypothetical bus line's operation during a morning peak hour and a daytime hour demonstrates the usefulness to the operator of productiveness efficiency, passenger transmission efficiency, passenger churn and average proportion of line length traveled in understanding services' and lines' productive performance, operating characteristics, and quality of service. Productiveness efficiency can flag potential pass-up activity under high load conditions, as well as ineffective resource deployment. Proportion of line length traveled can directly measure operating fashion. These measures can be used to compare lines/routes and, within a given line, various operating scenarios and time horizons, to target improvements. The next research stage is to investigate within-line variation using smart card passenger data and field observation of pass-ups. The insights will be used to further develop practical guidance for operators.
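Since the abstract does not give formulas, the sketch below shows one plausible formalization of load-based measures such as productiveness efficiency and average proportion of line length traveled, computed from stop-level boardings and alightings. These definitions are illustrative assumptions, not Bunker's.

```python
import numpy as np

# Illustrative stop-level data for one bus trip along a line:
# boardings/alightings per stop and inter-stop distances [km].
boardings  = np.array([12,  8,  5,  3,  0])
alightings = np.array([ 0,  2,  6,  9, 11])
seg_len_km = np.array([1.2, 0.8, 1.5, 1.0])   # between consecutive stops
capacity = 60                                  # passengers, placeholder

# Load on each segment = cumulative boardings minus alightings so far.
load = np.cumsum(boardings)[:-1] - np.cumsum(alightings)[:-1]

line_len = seg_len_km.sum()
passenger_km = (load * seg_len_km).sum()
offered_km = capacity * line_len

# Plausible formalizations (assumptions, not the paper's definitions):
productiveness_efficiency = passenger_km / offered_km   # used vs offered work
total_pax = boardings.sum()
avg_prop_line_traveled = passenger_km / (total_pax * line_len)
churn = total_pax / load.max()                # turnover relative to peak load

print(f"productiveness efficiency: {productiveness_efficiency:.2f}")
print(f"avg proportion of line traveled: {avg_prop_line_traveled:.2f}")
print(f"passenger churn: {churn:.2f}")
```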

Relevance:

10.00%

Publisher:

Abstract:

Aim: Facilities in retirement villages form a supportive environment for older residents. The purpose of this paper is to investigate the provision of these facilities in retirement villages, which are regarded as a viable accommodation option for the ever-increasing ageing population in Australia. Method: A content analysis of 124 retirement villages operated by 22 developers in Queensland and South Australia was conducted. Results: The most widely provided facilities are community centres, libraries, barbeque facilities, hairdressers/salons and billiards/snooker/pool tables. Commercial operators provide more facilities than not-for-profit organisations, and larger retirement villages normally have more facilities because of the economies of scale involved. Conclusions: The results of the study provide a useful reference for providing facilities within retirement villages that support quality lifestyles for older residents.

Relevance:

10.00%

Publisher:

Abstract:

This thesis considers and evaluates different approaches to regulating online gaming communities, including traditional top-down regulation as well as bottom-up and hybrid forms led by participants. I examine the regulatory environment in both the video game and gambling industries through case studies of the science-fiction massively multiplayer game EVE Online and of offshore gambling platforms and their community sites. I find that the participant-driven approach to regulation sometimes used in the offshore gambling industry depends on a number of factors, notably the strength of the community and the risks to platform operators of negative publicity. By comparing this with the video gaming industry, I suggest that participant-driven processes may be an appropriate way to resolve disputes in the games industry, and show how these are, to a limited extent, already being applied.