987 results for funding models
Abstract:
Many newspapers and magazines have added “social media features” to their web-based information services in order to allow users to participate in the production of content. This study examines the specific impact of firms’ investment in social media features on their online business models. We present a comparative case study of four Scandinavian print media firms that have added social media features to their online services. We show how social media features lead to online business model innovation, particularly linked to the firms’ value propositions. The paper discusses the repercussions of this transformation on the firms’ relationships with consumers and with traditional content contributors. The modified value proposition also requires firms to acquire new competences in order to reap the full benefit of their social media investments. We show that the firms have been unable to do so, since they have not allowed the social media features to affect their online revenue models.
Abstract:
A typology of music distribution models is proposed consisting of the ownership model, the access model, and the context model. These models are not substitutes for each other and may co-exist serving different market niches. The paper argues that increasingly the economic value created from recorded music is based on context rather than on ownership. During this process, access-based services temporarily generate economic value, but such services are destined to eventually become commoditised.
Abstract:
Fierce debates characterised 2013 as the implications of government mandates for open access were worked through and pathways for implementation worked out. There is no doubt that journals will move to a mix of gold and green, and that there will be an unsettled relationship between the two. But what of books? Is it conceivable that in those subjects, such as the humanities and social sciences, where something longer than the journal article is still the preferred form of scholarly communication, books will stay closed? Will it be acceptable to have some publicly funded research made available only in closed book form (regardless of whether print or digital) while other subjects where articles are favoured go open access? Frances Pinter is in the middle of these debates, having founded Knowledge Unlatched (see www.knowledgeunlatched.org). Knowledge Unlatched (KU) is a global library consortium enabling open access books, helping libraries work together for a sustainable open future for specialist academic books. Its vision is a healthy market that includes free access for end users. In this session she will review the different models being experimented with around the world, including author-side payments, institutional subsidies and research funding body approaches. She will compare and contrast these models with those already in place for journal articles. She will also review the policy landscape and report on how open access scholarly books are faring to date. Frances Pinter, Founder, Knowledge Unlatched, UK
Abstract:
Whole image descriptors have recently been shown to be remarkably robust to perceptual change especially compared to local features. However, whole-image-based localization systems typically rely on heuristic methods for determining appropriate matching thresholds in a particular environment. These environment-specific tuning requirements and the lack of a meaningful interpretation of these arbitrary thresholds limits the general applicability of these systems. In this paper we present a Bayesian model of probability for whole-image descriptors that can be seamlessly integrated into localization systems designed for probabilistic visual input. We demonstrate this method using CAT-Graph, an appearance-based visual localization system originally designed for a FAB-MAP-style probabilistic input. We show that using whole-image descriptors as visual input extends CAT-Graph’s functionality to environments that experience a greater amount of perceptual change. We also present a method of estimating whole-image probability models in an online manner, removing the need for a prior training phase. We show that this online, automated training method can perform comparably to pre-trained, manually tuned local descriptor methods.
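The core of the probabilistic treatment described above is a Bayes update that turns a raw descriptor distance into a match probability. The following is a minimal sketch of that idea; the Gaussian likelihood parameters, the prior, and the function names are illustrative assumptions, not values from the paper (which estimates its distributions online rather than fixing them by hand).

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian density, used here as a likelihood model for descriptor distances."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def match_posterior(distance, prior_match=0.01,
                    match_mu=0.2, match_sigma=0.1,
                    nonmatch_mu=0.8, nonmatch_sigma=0.15):
    """Posterior probability that two whole-image descriptors depict the same
    place, given their distance. All distribution parameters are illustrative."""
    l_match = normal_pdf(distance, match_mu, match_sigma)
    l_non = normal_pdf(distance, nonmatch_mu, nonmatch_sigma)
    evidence = l_match * prior_match + l_non * (1 - prior_match)
    return l_match * prior_match / evidence

# A small descriptor distance should yield a higher match probability than a large one.
p_close = match_posterior(0.15)
p_far = match_posterior(0.9)
```

The point of such a model is that the output is a calibrated probability that any probabilistic localization back-end (such as CAT-Graph) can consume directly, instead of an arbitrary, environment-specific matching threshold.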
Abstract:
The acceptance of broadband ultrasound attenuation for the assessment of osteoporosis suffers from a limited understanding of ultrasound wave propagation through cancellous bone. It has recently been proposed that the ultrasound wave propagation can be described by a concept of parallel sonic rays. This concept approximates the detected transmission signal to be the superposition of all sonic rays that travel directly from transmitting to receiving transducer. The transit time of each ray is defined by the proportion of bone and marrow propagated. An ultrasound transit time spectrum describes the proportion of sonic rays having a particular transit time, effectively describing lateral inhomogeneity of transit times over the surface of the receiving ultrasound transducer. The aim of this study was to provide a proof of concept that a transit time spectrum may be derived from digital deconvolution of input and output ultrasound signals. We have applied the active-set method deconvolution algorithm to determine the ultrasound transit time spectra in the three orthogonal directions of four cancellous bone replica samples and have compared experimental data with the prediction from the computer simulation. The agreement between experimental and predicted ultrasound transit time spectrum analyses derived from Bland–Altman analysis ranged from 92% to 99%, thereby supporting the concept of parallel sonic rays for ultrasound propagation in cancellous bone. In addition to further validation of the parallel sonic ray concept, this technique offers the opportunity to consider quantitative characterisation of the material and structural properties of cancellous bone, not previously available utilising ultrasound.
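The deconvolution step described above can be sketched with non-negative least squares, for which SciPy's `nnls` implements the classic Lawson–Hanson active-set algorithm. The pulse shape and the two-peak "spectrum" below are synthetic illustrative data, not the paper's bone-replica measurements; the sketch only shows how a non-negative transit-time spectrum can be recovered from an input/output signal pair.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic transmitted pulse and a sparse "transit time spectrum"
# (illustrative data, not from the paper's bone-replica measurements).
n = 60
t = np.arange(n)
input_signal = np.exp(-0.5 * ((t - 10) / 1.5) ** 2)   # transmitted pulse
true_spectrum = np.zeros(n)
true_spectrum[[15, 22]] = [1.0, 0.6]                   # two dominant transit times

# Forward model: the received signal is the input convolved with the spectrum,
# expressed as a Toeplitz (convolution) matrix acting on the spectrum.
A = np.array([[input_signal[i - j] if 0 <= i - j < n else 0.0
               for j in range(n)] for i in range(n)])
output_signal = A @ true_spectrum

# Non-negative least squares (a Lawson-Hanson active-set method) recovers the
# non-negative transit time spectrum from the input/output pair.
recovered, residual = nnls(A, output_signal)
```

With noiseless synthetic data the recovery is essentially exact; real measurements would require regularisation on top of the non-negativity constraint.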
Abstract:
This paper presents an approach for identifying a reduced model of coherent areas in power systems, using phasor measurement units to represent the inter-area oscillations of the system. The generators that remain coherent over a wide range of operating conditions form the areas of the power system, and the reduced model is obtained by representing each area by an equivalent machine. The reduced nonlinear model is then identified from the data obtained from the measurement units. The simulation is performed on three test systems, and the results show the high accuracy of the identification process.
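The "equivalent machine" step can be illustrated with the standard textbook aggregation of a coherent group's inertia onto a common system base. The machine ratings below are made-up illustrative values, and this simple rule is a conventional reduction, not the paper's measurement-based identification procedure.

```python
# Aggregating a coherent group of generators into one equivalent machine.
# Inertia constants H (s) and ratings S (MVA) are illustrative values.
generators = [
    {"H": 3.5, "S": 500.0},   # machine 1 of the coherent area
    {"H": 4.0, "S": 800.0},   # machine 2
    {"H": 2.8, "S": 300.0},   # machine 3
]
S_base = 1000.0  # common system base, MVA

# Equivalent inertia: total stored kinetic energy (sum of H_i * S_i)
# re-expressed on the common system base.
H_eq = sum(g["H"] * g["S"] for g in generators) / S_base

print(H_eq)  # equivalent inertia constant of the reduced machine
```

In the paper the dynamic parameters of each equivalent machine are instead identified from PMU data, which lets the reduced model track the actual inter-area behaviour rather than relying on nameplate values.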
Abstract:
Exact solutions of partial differential equation models describing the transport and decay of single and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, meaning that the initial condition is restricted in the sense that the choice of initial condition has an important impact on whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by using a different method to invert a Laplace transform which produces a power series solution. To demonstrate the utility of this technique, we apply it to two example problems with initial conditions that cannot be solved exactly using traditional transform techniques.
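The power-series inversion idea can be seen on the simplest possible transform. The toy problem below (first-order decay, chosen for illustration only; the paper applies the idea to reactive transport PDEs) has transform C(s) = c0/(s + k); expanding in powers of 1/s and inverting term by term (1/s^(n+1) maps to t^n/n!) gives a power series in t that converges to the exact solution.

```python
import math

# Power-series inversion of a Laplace transform, illustrated on first-order
# decay dc/dt = -k*c, c(0) = c0, whose transform is C(s) = c0 / (s + k).
# Expanding C(s) = c0 * sum_n (-k)^n / s^(n+1) and inverting term by term
# yields c(t) = c0 * sum_n (-k*t)^n / n!, the series for c0 * exp(-k*t).
def series_solution(t, k, c0, n_terms=30):
    return c0 * sum((-k * t) ** n / math.factorial(n) for n in range(n_terms))

k, c0, t = 0.7, 2.0, 1.5
approx = series_solution(t, k, c0)
exact = c0 * math.exp(-k * t)
```

The appeal of the approach is that the term-by-term inversion step stays mechanical even for initial conditions where a closed-form inverse transform is unavailable.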
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned with a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior to the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
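The purely statistical GASP baseline that the paper improves upon can be sketched in a few lines: condition an RBF-kernel Gaussian process on a small design set of simulator runs, then evaluate its posterior mean as a cheap interpolant. The stand-in "simulator" and all kernel settings below are illustrative assumptions.

```python
import numpy as np

# A minimal Gaussian-process emulator: an RBF-kernel GP conditioned on a small
# design set of (input, output) pairs from a deterministic simulator, then used
# to interpolate cheaply at new inputs.
def simulator(x):
    return np.sin(3 * x) + 0.5 * x            # stand-in for an expensive model

def rbf(a, b, length=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

x_design = np.linspace(0.0, 2.0, 8)            # design data set
y_design = simulator(x_design)

x_new = np.linspace(0.0, 2.0, 50)
K = rbf(x_design, x_design) + 1e-8 * np.eye(len(x_design))  # jitter for stability
weights = np.linalg.solve(K, y_design)
y_emulated = rbf(x_new, x_design) @ weights    # GP posterior mean at new inputs

max_err = np.max(np.abs(y_emulated - simulator(x_new)))
```

The paper's proposal replaces this generic prior with a linear state-space model of the dynamics plus GPs on the innovation terms, conditioned by Kalman smoothing, precisely because the generic GP ignores model structure and struggles with densely spaced time points.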
Abstract:
A predictive model of terrorist activity is developed by examining the daily number of terrorist attacks in Indonesia from 1994 through 2007. The dynamic model employs a shot noise process to explain the self-exciting nature of the terrorist activities. This estimates the probability of future attacks as a function of the times since the past attacks. In addition, the excess of nonattack days coupled with the presence of multiple coordinated attacks on the same day compelled the use of hurdle models to jointly model the probability of an attack day and corresponding number of attacks. A power law distribution with a shot noise driven parameter best modeled the number of attacks on an attack day. Interpretation of the model parameters is discussed and predictive performance of the models is evaluated.
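The self-exciting mechanism can be sketched as a shot-noise conditional intensity: a baseline rate plus exponentially decaying contributions from each past attack. The parameter values and function name below are illustrative, not the fitted values from the Indonesian data.

```python
import math

# Conditional intensity of a self-exciting (shot noise) process:
# lambda(t) = mu + sum over past attacks of alpha * exp(-beta * (t - t_i)).
def intensity(t, past_attacks, mu=0.05, alpha=0.4, beta=0.8):
    excitation = sum(alpha * math.exp(-beta * (t - ti))
                     for ti in past_attacks if ti < t)
    return mu + excitation

attacks = [10.0, 11.0, 30.0]
just_after_cluster = intensity(11.5, attacks)   # shortly after two close attacks
long_after = intensity(25.0, attacks)           # excitation has decayed to baseline
```

In the paper this intensity does not drive attack counts directly: a hurdle model first decides whether a day is an attack day at all, and a power-law distribution with a shot-noise-driven parameter then models how many attacks occur on it.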
Abstract:
Childhood autism falls under the umbrella of autism spectrum disorders and is generally identified in children over two years of age. There are of course variations in severity and clinical manifestations; however, the most common features are disinterest in social interaction and engagement in ritualistic and repetitive behaviours. In Singapore the incidence of autism is on the rise as parents become more aware of the early signs of autism and seek healthcare programmes to ensure that the quality of life for their child is optimised. Two such programmes, Applied Behaviour Analysis and the Floortime approach, have proven successful in alleviating some of the behavioural and social skills problems associated with autism. Using positive behaviour reinforcement, both Applied Behaviour Analysis and the Floortime approach reward behaviour associated with positive social responses.
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers could or would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen’s proposal of evoking complex valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (refuting its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
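What a complex-valued representation adds over a real-valued one can be shown in a toy scoring example: if documents and queries are complex vectors and relevance is the squared modulus of their inner product (in the style of van Rijsbergen's proposal), then the relative phases of the components change the score. The vectors below are made-up toy data chosen only to make that effect visible.

```python
import numpy as np

# Documents and queries as complex vectors; relevance scored as the squared
# modulus of the inner product. Same component magnitudes, different phases.
query = np.array([1 + 0j, 0.5 * np.exp(1j * np.pi / 4)])
doc_in_phase = np.array([1 + 0j, 0.5 * np.exp(1j * np.pi / 4)])     # same phases
doc_out_phase = np.array([1 + 0j, 0.5 * np.exp(-1j * np.pi / 4)])   # opposite phase

def score(q, d):
    # np.vdot conjugates its first argument, as the complex inner product requires.
    return abs(np.vdot(q, d)) ** 2

s_in = score(query, doc_in_phase)
s_out = score(query, doc_out_phase)
```

The two documents have identical component magnitudes, so any real-valued (magnitude-only) model would score them equally; only the phase information separates them, which is the representational power at issue in the paper.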
Abstract:
Objective To examine the impact of applying for funding on personal workloads, stress and family relationships. Design Qualitative study of researchers preparing grant proposals. Setting Web-based survey on applying for the annual National Health and Medical Research Council (NHMRC) Project Grant scheme. Participants Australian researchers (n=215). Results Almost all agreed that preparing their proposals always took top priority over other work (97%) and personal (87%) commitments. Almost all researchers agreed that they became stressed by the workload (93%) and restricted their holidays during the grant writing season (88%). Most researchers agreed that they submitted proposals because chance is involved in being successful (75%), due to performance requirements at their institution (60%) and pressure from their colleagues to submit proposals (53%). Almost all researchers supported changes to the current processes to submit proposals (95%) and peer review (90%). Most researchers (59%) provided extensive comments on the impact of writing proposals on their work life and home life. Six major work life themes were: (1) top priority; (2) career development; (3) stress at work; (4) benefits at work; (5) time spent at work and (6) pressure from colleagues. Six major home life themes were: (1) restricting family holidays; (2) time spent on work at home; (3) impact on children; (4) stress at home; (5) impact on family and friends and (6) impact on partner. Additional impacts on the mental health and well-being of researchers were identified. Conclusions The process of preparing grant proposals for a single annual deadline is stressful, time-consuming and conflicts with family responsibilities. The timing of the funding cycle could be shifted to minimise applicant burden and to give Australian researchers more time to work on actual research and to be with their families.
Abstract:
In today’s business world, where competition among companies is intense, quality in manufacturing and in the provision of products and services can be considered a means by which companies pursue excellence and success in the competitive arena. With the advent of e-commerce and the emergence of new production systems and new organizational structures, traditional management and quality assurance systems have been challenged. Consequently, the quality information system has gained a special place as one of the new tools of quality management. In this paper, the quality information system is studied through a review of the literature, and the role and position of the quality information system (QIS) among the other information systems of an organization is investigated. Existing QIS models are analyzed and assessed, and on that basis a conceptual, hierarchical model of a quality information system is proposed and studied. As a case study, the hierarchical model is developed by evaluating hierarchical models presented in the field of quality information systems, based on Shetabkar Co.
Abstract:
Non-profit organizations (NPOs) are major providers of services in many fields of endeavour, and often receive financial support from government. This article investigates different forms of government/nonprofit funding relationships, with the viewpoint being mainly, though not exclusively, from the perspective of the non-profit agencies. While there are a number of existing typologies of government/NPO relations, these are dated and in need of further empirical analysis and testing. The article advances an empirically derived extension to current models of government/NPO relations. A future research agenda is outlined based on the constructs that underpin typologies, rather than discrete categorization of relationships.
Abstract:
Multimedia communication capabilities are rapidly expanding, and visual information is easily shared electronically, yet funding bodies still rely on paper grant proposal submissions. Incorporating modern technologies will streamline the granting process by increasing the fidelity of grant communication, improving the efficiency of review, and reducing the cost of the process.