917 results for Sampling schemes
Abstract:
In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his/her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with other Bayesian paradigms obtained from optimal design (MacKay, 1992).
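The adaptive strategy can be made concrete for the monotone case: the learner repeatedly queries the midpoint of the interval where the remaining uncertainty is largest. Below is a minimal Python sketch under assumed details; the function name active_sample_monotone and the rectangle-area uncertainty criterion are illustrative choices, not necessarily the paper's exact rule.

```python
import numpy as np

def active_sample_monotone(f, n_samples, lo=0.0, hi=1.0):
    """Minimal sketch of adaptive sampling for a monotone increasing f.

    At each step, the learner samples the midpoint of the interval whose
    uncertainty area (width times value gap) is largest, one plausible
    instance of the sequential optimal-recovery idea in the abstract.
    """
    xs = [lo, hi]
    ys = [f(lo), f(hi)]
    for _ in range(n_samples):
        # A monotone function on [x_i, x_{i+1}] is confined to the
        # rectangle of area (x_{i+1} - x_i) * (y_{i+1} - y_i).
        gaps = [(xs[i + 1] - xs[i]) * (ys[i + 1] - ys[i])
                for i in range(len(xs) - 1)]
        i = int(np.argmax(gaps))
        x_new = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return np.array(xs), np.array(ys)

# Example: approximate sqrt on [0, 1] with 20 adaptively chosen samples.
xs, ys = active_sample_monotone(np.sqrt, 20)
```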
Abstract:
We present a new method for estimating the expected return of a POMDP from experience. The estimator does not assume any knowledge of the POMDP and allows the experience to be gathered with an arbitrary set of policies. The return is estimated for any new policy of the POMDP. We motivate the estimator from function-approximation and importance-sampling points of view and derive its theoretical properties. Although the estimator is biased, it has low variance, and the bias is often irrelevant when the estimator is used for pair-wise comparisons. We conclude by extending the estimator to policies with memory and compare its performance in a greedy search algorithm to the REINFORCE algorithm, showing an order of magnitude reduction in the number of trials required.
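As an illustration of the importance-sampling point of view, here is a minimal sketch of a self-normalized per-trajectory return estimator; the trajectory format and the names is_return_estimate and new_policy are assumptions, and this plain weighted estimator is not necessarily the paper's lower-variance construction.

```python
def is_return_estimate(trajectories, new_policy):
    """Sketch of an importance-sampling estimate of the expected return
    of `new_policy` from trajectories gathered under other policies.

    Each trajectory is a list of (observation, action, behavior_prob,
    reward) tuples; new_policy(obs, act) returns the new policy's
    probability of taking `act` after seeing `obs`.
    """
    weighted, weights = [], []
    for traj in trajectories:
        w = 1.0
        ret = 0.0
        for obs, act, behavior_prob, reward in traj:
            w *= new_policy(obs, act) / behavior_prob
            ret += reward
        weighted.append(w * ret)
        weights.append(w)
    # Self-normalized form: biased but lower variance, in the spirit of
    # the bias/variance trade-off discussed in the abstract.
    return sum(weighted) / max(sum(weights), 1e-12)
```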
Abstract:
This paper estimates the impact on municipal-level crime rates of a massive negative income shock caused by the simultaneous collapse of several Ponzi schemes (also known as financial "pyramids") in Colombia. Using novel data on the spatial incidence of the latest wave of Colombian pyramids and the date of their collapse, I estimate difference-in-differences models at both monthly and yearly frequency. I find that the negative income shock of the pyramids' collapse differentially exacerbates crime in affected municipalities compared to those with no presence of Ponzi schemes. This is true for minor offenses like commercial theft or residential burglary, but not for major crimes such as murder or terrorism.
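A difference-in-differences specification of the kind described could look roughly like the sketch below; the data file and all column names (crime_rate, affected, post, municipality, month) are hypothetical, not the paper's.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per municipality-month, with an `affected`
# dummy (Ponzi presence) and a `post` dummy (after the crash date).
df = pd.read_csv("municipal_crime_panel.csv")  # hypothetical file name

# Two-way fixed-effects DiD: the coefficient on affected:post is the
# differential effect of the crash on crime in affected municipalities.
did = smf.ols(
    "crime_rate ~ affected:post + C(municipality) + C(month)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["municipality"]})
print(did.params["affected:post"])
```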
Abstract:
Presentation
Abstract:
In populational sampling it is vitally important to clarify and discern: first, the design or sampling method used to solve the research problem; second, the sample size, taking into account its different components (precision, reliability, variance); third, random selection; and fourth, the precision estimate (sampling errors), so as to determine whether the obtained estimates can be generalized to the target population. The difficulty in using concepts from sampling theory lies in understanding them with absolute clarity, and to achieve this, didactic-pedagogical strategies arranged as conceptual "mentefactos" (simple hierarchical diagrams organized from propositions) may prove useful. This paper presents the conceptual definition, through conceptual "mentefactos", of the most important populational probabilistic sampling concepts, in order to obtain representative samples from populations in health research.
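As one concrete illustration of the sample-size component above (precision, reliability, variance), a standard formula for estimating a proportion p with absolute precision e and confidence level 1 - α, together with the finite population correction, is the following; the abstract itself does not commit to any particular formula.

```latex
% Sample size n_0 for estimating a proportion p with absolute precision e
% and confidence level 1 - alpha; n applies the finite population
% correction for a population of size N.
n_0 = \frac{z_{1-\alpha/2}^{2}\; p\,(1-p)}{e^{2}},
\qquad
n = \frac{n_0}{1 + \dfrac{n_0 - 1}{N}}
```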
Abstract:
This paper uses a two-sided market model of hospital competition to study the implications of different remuneration schemes on the physicians' side. The two-sided market approach is characterized by the concept of common network externality (CNE) introduced by Bardey et al. (2010). This type of externality occurs when both sides value, possibly with different intensities, the same network externality. We explicitly introduce effort exerted by doctors. By increasing the number of medical acts (which involves a costly effort), the doctor can increase the quality of service offered to patients (over and above the level implied by the CNE). We first consider pure salary, capitation, or fee-for-service schemes. Then, we study schemes that mix fee-for-service with either salary or capitation payments. We show that salary schemes (either pure or in combination with fee-for-service) are more patient friendly than (pure or mixed) capitation schemes. This comparison is exactly reversed on the providers' side. Quite surprisingly, patients always lose when a fee-for-service scheme is introduced (pure or mixed). This is true even though fee-for-service is the only way to induce the providers to exert effort, and it holds whatever the patients' valuation of this effort. In other words, the increase in quality brought about by fee-for-service is more than compensated by the increase in fees faced by patients.
Abstract:
In November 2008, Colombian authorities dismantled a network of Ponzi schemes, making hundreds of thousands of investors lose tens of millions of dollars throughout the country. Using original data on the geographical incidence of the Ponzi schemes, this paper estimates the impact of their breakdown on crime. We find that the crash of the Ponzi schemes differentially exacerbated crime in affected districts. Confirming the intuition of the standard economic model of crime, this effect is only present in places with relatively weak judicial and law enforcement institutions, and with little access to consumption-smoothing mechanisms such as microcredit. In addition, we show that, with the exception of economically motivated felonies such as robbery, violent crime is not affected by the negative shock.
Abstract:
A resource prepared for drama teachers working in schools, born of more than 20 years of daily teaching practice in city schools. It provides detailed schemes of work, some of which have been used time and again, while others are more recent; they are easy to use, with resource lists and suggestions for additional resources. Each of these schemes of work has been taught to young people in secondary schools, and each one developed has been adapted and revised according to what the children have revealed they need to know. The schemes draw on a range of sources and influences, and alongside each scheme of work are teacher's notes that explain some of the ideas behind the approach and highlight particular aspects of drama teaching.
Abstract:
Abstract based on that of the publication
Abstract:
One of the key aspects of 3D-image registration is the computation of the joint intensity histogram. We propose a new approach to computing this histogram using uniformly distributed random lines to stochastically sample the overlapping volume between two 3D images. The intensity values are captured from the lines at evenly spaced positions, taking a different initial random offset for each line. This method provides accurate, robust and fast mutual information-based registration. The interpolation effects are drastically reduced, owing to the stochastic nature of the line generation, and the alignment process is also accelerated. The results obtained show better performance of the introduced method than the classic computation of the joint histogram.
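A minimal sketch of the line-based joint histogram idea, under assumed details (8-bit intensities, nearest-voxel lookup, and a simple ray-marching heuristic rather than the paper's exact uniform-line sampler):

```python
import numpy as np

def mutual_information_random_lines(vol_a, vol_b, n_lines=1000,
                                    step=1.0, bins=32):
    """Estimate MI from intensities sampled along random lines.

    Assumes two pre-aligned 8-bit volumes of identical shape. Each line
    starts at a random interior point with a random direction and a
    random initial offset, and is sampled at evenly spaced positions.
    """
    shape = np.array(vol_a.shape, dtype=float)
    hist = np.zeros((bins, bins))
    rng = np.random.default_rng(0)
    for _ in range(n_lines):
        p0 = rng.random(3) * shape            # random starting point
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)                # random direction
        t = rng.random() * step               # random initial offset
        while True:
            q = p0 + t * d
            if np.any(q < 0) or np.any(q >= shape):
                break
            i, j, k = q.astype(int)           # nearest-voxel lookup
            a = int(vol_a[i, j, k] * bins / 256)  # 8-bit assumption
            b = int(vol_b[i, j, k] * bins / 256)
            hist[a, b] += 1
            t += step
    p = hist / hist.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())
```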
Abstract:
In this paper, we present view-dependent information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line and have a purely geometrical basis. Several algorithms exploiting them are presented and compare well with an existing one based on depth differences.
Abstract:
In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure of sampling quality and introduce new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
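A minimal sketch of sampling a triangle mesh with globally distributed random lines; the chord-of-a-bounding-sphere line generator and the function name sample_points_by_lines are illustrative assumptions, not the paper's specific algorithms.

```python
import numpy as np

def sample_points_by_lines(triangles, n_lines=2000, radius=1.0, seed=0):
    """Keep every line/triangle intersection as a sample point.

    `triangles` is an (n, 3, 3) array of vertex coordinates. Lines are
    chords between random point pairs on a bounding sphere, one common
    way to approximate a uniform global line distribution.
    """
    rng = np.random.default_rng(seed)

    def on_sphere():
        v = rng.normal(size=3)
        return radius * v / np.linalg.norm(v)

    points = []
    for _ in range(n_lines):
        o = on_sphere()
        d = on_sphere() - o            # chord of the bounding sphere
        for v0, v1, v2 in triangles:   # Moller-Trumbore intersection
            e1, e2 = v1 - v0, v2 - v0
            h = np.cross(d, e2)
            a = e1 @ h
            if abs(a) < 1e-12:         # line parallel to triangle
                continue
            s = o - v0
            u = (s @ h) / a
            if not 0.0 <= u <= 1.0:
                continue
            q = np.cross(s, e1)
            v = (d @ q) / a
            if v < 0.0 or u + v > 1.0:
                continue
            t = (e2 @ q) / a
            points.append(o + t * d)
    return np.array(points)
```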
Abstract:
In the present paper we discuss and compare two different energy decomposition schemes: Mayer's Hartree-Fock energy decomposition into diatomic and monoatomic contributions [Chem. Phys. Lett. 382, 265 (2003)], and the Ziegler-Rauk dissociation energy decomposition [Inorg. Chem. 18, 1558 (1979)]. The Ziegler-Rauk scheme is based on a separation of a molecule into fragments, while Mayer's scheme can be used in cases where a fragmentation of the system into clearly separable parts is not possible. In the Mayer scheme, the density of a free atom is deformed to give the one-atom Mulliken density, which subsequently interacts to give rise to the diatomic interaction energy. We give a detailed analysis of the diatomic energy contributions in the Mayer scheme and a close look at the one-atom Mulliken densities. The Mulliken density ρA has a single large maximum around the nuclear position of atom A, but exhibits slightly negative values in the vicinity of neighboring atoms. The main connecting point between the two analysis schemes is the electrostatic energy. Both decomposition schemes utilize the same electrostatic energy expression, but differ in how the fragment densities are defined. In the Mayer scheme, the electrostatic component originates from the interaction of the Mulliken densities, while in the Ziegler-Rauk scheme, the undisturbed fragment densities interact. The values of the electrostatic energy resulting from the two schemes differ significantly but typically have the same order of magnitude. Both methods are useful and complementary, since Mayer's decomposition focuses on the energy of the finally formed molecule, whereas the Ziegler-Rauk scheme describes the bond formation starting from undeformed fragment densities.
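The shared electrostatic expression referred to above presumably takes the standard Coulomb form for two fragment densities; the following is given only as an illustration, since the abstract does not spell out the expression.

```latex
% Coulomb interaction of two fragment densities \rho_A and \rho_B
% (Mulliken one-atom densities in Mayer's scheme, undisturbed fragment
% densities in the Ziegler-Rauk scheme) with nuclear charges Z_A, Z_B
% at positions \mathbf{R}_A, \mathbf{R}_B:
E_{\mathrm{elstat}}^{AB} =
  \int\!\!\int \frac{\rho_A(\mathbf{r}_1)\,\rho_B(\mathbf{r}_2)}
               {|\mathbf{r}_1 - \mathbf{r}_2|}\,
  d\mathbf{r}_1\, d\mathbf{r}_2
  - Z_B \int \frac{\rho_A(\mathbf{r})}{|\mathbf{r} - \mathbf{R}_B|}\,
        d\mathbf{r}
  - Z_A \int \frac{\rho_B(\mathbf{r})}{|\mathbf{r} - \mathbf{R}_A|}\,
        d\mathbf{r}
  + \frac{Z_A Z_B}{|\mathbf{R}_A - \mathbf{R}_B|}
```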
Abstract:
In this thesis we propose two network schemes with admission control for elastic TCP traffic based on simple mechanisms. Both schemes are able to provide differentiated throughputs and isolation between flows, where a "flow" is defined as a sequence of related packets within a TCP connection. Architecturally, both use packet classes with different drop priorities and an implicit, edge-to-edge, measurement-based admission control. In the first scheme, measurements are per flow, while in the second, measurements are per aggregate. The first scheme achieves good performance using a special modification of the TCP sources, while the second achieves good performance with standard TCP sources. Both schemes have been successfully evaluated through simulation on different network topologies and traffic loads.
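A minimal sketch of what an implicit, edge-to-edge, measurement-based admission decision might look like; the class name, the EWMA smoothing, and the thresholds are illustrative assumptions, and the thesis' actual mechanisms differ in detail.

```python
class EdgeAdmissionControl:
    """Toy edge node that admits flows from an aggregate measurement."""

    def __init__(self, capacity_bps, target_load=0.9):
        self.capacity = capacity_bps
        self.target_load = target_load
        self.measured_rate = 0.0  # edge-to-edge aggregate estimate (bps)

    def update_measurement(self, bytes_seen, interval_s):
        # Exponentially weighted moving average of the aggregate rate.
        rate = bytes_seen * 8 / interval_s
        self.measured_rate = 0.8 * self.measured_rate + 0.2 * rate

    def admit(self, requested_rate_bps):
        # Implicit admission: a new flow is accepted (its packets get the
        # protected, low drop-priority class) only if the measured
        # aggregate plus the request stays under the target load.
        return (self.measured_rate + requested_rate_bps
                <= self.target_load * self.capacity)
```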