999 results for serial method


Relevance:

80.00%

Abstract:

We offer an axiomatization of the serial cost-sharing method of Friedman and Moulin (1999). The key property in our axiom system is Group Demand Monotonicity, asking that when a group of agents raise their demands, not all of them should pay less.
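For readers unfamiliar with the rule being axiomatized, the serial idea is easiest to state for a single homogeneous good (the original Moulin-Shenker setting, which Friedman and Moulin extend to heterogeneous goods). The Python sketch below is illustrative only; the demand values, cost function, and function name are assumptions, not taken from the paper.

```python
def serial_cost_shares(demands, cost):
    """Serial cost-sharing for a homogeneous good (Moulin-Shenker):
    an agent's share depends on her own demand and on smaller demands only."""
    n = len(demands)
    order = sorted(range(n), key=lambda i: demands[i])   # agents ranked by demand
    q = [demands[i] for i in order]                      # sorted demands
    # Intermediate production levels q^k = q_1 + ... + q_k + (n - k) * q_k.
    levels = [0.0] + [sum(q[:k]) + (n - k) * q[k - 1] for k in range(1, n + 1)]
    shares_by_rank, running = [], 0.0
    for k in range(1, n + 1):
        # The k-th cost increment is split among the n - k + 1 largest demanders.
        running += (cost(levels[k]) - cost(levels[k - 1])) / (n - k + 1)
        shares_by_rank.append(running)
    shares = [0.0] * n
    for rank, agent in enumerate(order):
        shares[agent] = shares_by_rank[rank]
    return shares

# Example (hypothetical data): quadratic cost, demands 1, 2 and 3.
print(serial_cost_shares([1.0, 2.0, 3.0], lambda y: y ** 2))   # [3.0, 11.0, 22.0]
```

In the three-agent example, each successive cost increment is shared only among the agents still demanding at that level, which is why the largest demander absorbs the whole final increment.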

Relevance:

70.00%

Abstract:

A group of agents participate in a cooperative enterprise producing a single good. Each participant contributes a particular type of input; output is nondecreasing in these contributions. How should it be shared? We analyze the implications of the axiom of Group Monotonicity: if a group of agents simultaneously decrease their input contributions, not all of them should receive a higher share of output. We show that in combination with other more familiar axioms, this condition pins down a very small class of methods, which we dub nearly serial.

Relevance:

60.00%

Abstract:

We propose two axiomatic theories of cost sharing with the common premise that agents demand comparable (though perhaps different) commodities and are responsible for their own demand. Under partial responsibility the agents are not responsible for the asymmetries of the cost function: two agents consuming the same amount of output always pay the same price; this holds true under full responsibility only if the cost function is symmetric in all individual demands. If the cost function is additively separable, each agent pays her stand-alone cost under full responsibility; this holds true under partial responsibility only if, in addition, the cost function is symmetric. By generalizing Moulin and Shenker’s (1999) Distributivity axiom to cost-sharing methods for heterogeneous goods, we identify in each of our two theories a different serial method. The subsidy-free serial method (Moulin, 1995) is essentially the only distributive method meeting Ranking and Dummy. The cross-subsidizing serial method (Sprumont, 1998) is the only distributive method satisfying Separability and Strong Ranking. Finally, we propose an alternative characterization of the latter method based on a strengthening of Distributivity.

Relevance:

60.00%

Abstract:

We survey recent axiomatic results in the theory of cost sharing. In this literature, a method computes the individual cost shares assigned to the users of a facility for any profile of demands and any monotonic cost function. We discuss two theories taking radically different views of the asymmetries of the cost function. In the full responsibility theory, each agent is accountable for the part of the costs that can be unambiguously separated and attributed to her own demand. In the partial responsibility theory, the asymmetries of the cost function have no bearing on individual cost shares; only the differences in demand levels matter. We describe several invariance and monotonicity properties that reflect both normative and strategic concerns. We uncover a number of logical trade-offs between our axioms, and derive axiomatic characterizations of a handful of intuitive methods: in the full responsibility approach, the Shapley-Shubik, Aumann-Shapley, and subsidy-free serial methods, and in the partial responsibility approach, the cross-subsidizing serial method and the family of quasi-proportional methods.
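Of the methods named in the full responsibility approach, the Shapley-Shubik method is the quickest to sketch: it gives each agent her Shapley value in the stand-alone cost game obtained by zeroing out the demands of agents outside a coalition. The brute-force sketch below is illustrative only (exponential in the number of agents); the function name and the example cost function are assumptions.

```python
from itertools import combinations
from math import factorial

def shapley_shubik_shares(demands, cost):
    """Shapley-Shubik cost shares: the Shapley value of the stand-alone cost
    game c(S) = cost of the demand profile with agents outside S set to zero.
    Brute force over coalitions; intended for small n only."""
    n = len(demands)

    def stand_alone(coalition):
        profile = [demands[j] if j in coalition else 0.0 for j in range(n)]
        return cost(profile)

    shares = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for subset in combinations(others, r):
                weight = factorial(r) * factorial(n - r - 1) / factorial(n)
                s = set(subset)
                shares[i] += weight * (stand_alone(s | {i}) - stand_alone(s))
    return shares

# Example (hypothetical cost function): C(q) = (q1 + 2*q2)^2 with demands (1, 1).
print(shapley_shubik_shares([1.0, 1.0], lambda q: (q[0] + 2 * q[1]) ** 2))  # [3.0, 6.0]
```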

Relevance:

30.00%

Abstract:

There is increased interest in the use of UAVs for environmental research, such as tracking bush fires, volcanic eruptions, chemical accidents or pollution sources. The aim of this paper is to describe the theory and results of a bio-inspired plume tracking algorithm. A method for generating sparse plumes in a virtual environment was also developed. Results indicated the ability of the algorithms to track plumes in 2D and 3D. The system has been tested with hardware-in-the-loop (HIL) simulations and in flight using a CO2 gas sensor mounted on a multi-rotor UAV. The UAV is controlled by the plume tracking algorithm running on the ground control station (GCS).
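The abstract does not spell out the bio-inspired algorithm itself; a common moth-inspired strategy of this kind alternates between surging upwind while inside the plume and casting across the wind when contact is lost. The sketch below illustrates that general idea only; the detection threshold, wind input, and function name are assumptions, not the paper's implementation.

```python
import math

def plume_tracking_heading(co2_ppm, wind_from_rad, time_since_detection,
                           threshold_ppm=450.0, cast_rate=0.5):
    """One decision step of a moth-inspired cast-and-surge tracker (illustrative):
    surge upwind while the sensed concentration exceeds a detection threshold,
    otherwise cast from side to side across the wind line."""
    if co2_ppm >= threshold_ppm:
        # Inside the plume: head straight upwind, towards the likely source.
        return wind_from_rad
    # Plume lost: zig-zag roughly perpendicular to the wind direction.
    side = 1.0 if math.sin(cast_rate * time_since_detection) >= 0.0 else -1.0
    return wind_from_rad + side * math.pi / 2.0

# Example: a 600 ppm reading with wind from bearing 0 rad -> fly at bearing 0 (upwind).
print(plume_tracking_heading(600.0, 0.0, 0.0))
```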

Relevance:

30.00%

Abstract:

In this work, first a Fortran code is developed for three-dimensional linear elastostatics using constant boundary elements; the code is based on a MATLAB code developed by the author earlier. Next, the code is parallelized using BLACS, MPI, and ScaLAPACK. The parallelized code is then used to demonstrate the usefulness of the Boundary Element Method (BEM) as applied to the real-time computational simulation of biological organs, focusing on the speed and accuracy offered by BEM. A computer cluster is used in this part of the work. The commercial software package ANSYS is used to obtain the 'exact' solution against which the solution from BEM is compared; analytical solutions, wherever available, are also used to establish the accuracy of BEM. A pig liver is the biological organ considered. Next, instead of the computer cluster, a Graphics Processing Unit (GPU) is used as the parallel hardware. Results indicate that BEM is an interesting choice for the simulation of biological organs. Although the use of BEM for the simulation of biological organs is not new, the results presented in this study are not found elsewhere in the literature. A serial MATLAB code, and both serial and parallel versions of a Fortran code, which can solve three-dimensional (3D) linear elastostatic problems using constant boundary elements, are provided as supplementary files that can be freely downloaded.
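As background to the parallelization: a constant-element BEM discretisation leads to a fully populated linear system in the boundary unknowns, and the dense storage and cubic solve cost are what make ScaLAPACK on a cluster or a GPU attractive. The sketch below only shows the shape of that serial solve; the matrices are random placeholders, not assembled from Kelvin's fundamental solution as in the actual code.

```python
import numpy as np

# Illustrative only: with constant boundary elements the discretised boundary
# integral equation reduces to a dense linear system H u = G t linking unknown
# boundary displacements u to prescribed tractions t. The matrices here are
# placeholders; in the actual code their entries come from integrating the
# elastostatic fundamental solution over each element.
n = 600                                    # number of boundary elements (assumed)
rng = np.random.default_rng(0)
H = rng.random((n, n)) + n * np.eye(n)     # placeholder, kept well conditioned
G = rng.random((n, n))
t = rng.random(n)                          # placeholder traction data

u = np.linalg.solve(H, G @ t)              # serial dense solve: O(n^2) storage, O(n^3) work
```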

Relevance:

30.00%

Abstract:

Most pattern mining methods yield a large number of frequent patterns, and isolating a small relevant subset of patterns is a challenging problem of current interest. In this paper, we address this problem in the context of discovering frequent episodes from symbolic time-series data. Motivated by the Minimum Description Length principle, we formulate the problem of selecting a relevant subset of patterns as one of searching for a subset of patterns that achieves the best data compression. We present algorithms for discovering small sets of relevant non-redundant episodes that achieve good data compression. The algorithms employ a novel encoding scheme and use serial episodes with inter-event constraints as the patterns. We present extensive simulation studies with both synthetic and real data, comparing our method with existing schemes such as GoKrimp and SQS. We also demonstrate the effectiveness of these algorithms on event sequences from a composable conveyor system; this system represents a new application area where the use of frequent patterns for compressing the event sequence is likely to be important for decision support and control.
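The MDL-driven selection can be pictured as a greedy loop of the kind used by Krimp-style algorithms: a candidate episode is kept only if it shrinks the total encoded length. The sketch below is illustrative; the encoding function is a stand-in for the paper's novel scheme, and the candidate ordering is left to the caller.

```python
def greedy_mdl_selection(candidates, encoded_length):
    """Greedy MDL-style selection (illustrative): keep a candidate episode only
    if adding it to the model reduces the total description length
    L(model) + L(data | model). The encoded_length callable is a stand-in for
    the paper's encoding scheme."""
    model, best = [], encoded_length([])
    for pattern in candidates:              # e.g. frequent serial episodes, pre-sorted
        trial = model + [pattern]
        trial_length = encoded_length(trial)
        if trial_length < best:             # compression improved: keep the episode
            model, best = trial, trial_length
    return model, best

# Toy example with a made-up length function that rewards keeping episode "AB".
print(greedy_mdl_selection(["AB", "BC"], lambda m: 100 - 30 * ("AB" in m)))
```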

Relevance:

30.00%

Abstract:

Increased boating activities and new waterfront developments have contributed an estimated 3,000 dismantled, abandoned, junked, wrecked, or derelict vessels to Florida coastal waters. This report outlines a method of siting and prioritizing derelict vessel removal, using the Florida Keys as a test area. The database was information on 240 vessels obtained from Florida Marine Patrol files. Vessel locations were plotted on 1:250,000 regional and 1:5,000 and 1:12,000 site maps. Type of vessel, length, hull material, engine, fuel tanks, overall condition, afloat and submerged characteristics, and accessibility were used to derive parametric site indices of removal priority and removal difficulty. Results indicate 59 top-priority cases which should be the focus of immediate cleanup efforts in the Florida Keys. Half of these cases are rated low to moderate in removal difficulty; the remainder are difficult to remove. Removal difficulty is a surrogate for removal cost: low difficulty, low cost; high difficulty, high cost. The rating scheme offers coastal planners options of focusing removal operations either on (1) specific areas with clusters of high-priority derelict vessels or on (2) selected targeted derelicts at various specific locations. (PDF has 59 pages.)
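A parametric index of this kind can be thought of as a weighted sum of scored vessel attributes. The sketch below is purely illustrative; the attribute names, scores, and weights are hypothetical and are not taken from the report.

```python
def removal_priority(vessel, weights=None):
    """Illustrative parametric site index: a weighted sum of scored vessel
    attributes. The attribute names, scores and weights here are hypothetical;
    they are not the values used in the report."""
    weights = weights or {"condition": 3.0, "fuel_tanks": 2.0,
                          "submerged": 1.0, "accessibility": 1.0}
    return sum(w * vessel.get(attr, 0) for attr, w in weights.items())

# Example with made-up 0-5 scores for a single derelict vessel.
print(removal_priority({"condition": 5, "fuel_tanks": 4,
                        "submerged": 3, "accessibility": 2}))
```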

Relevance:

30.00%

Abstract:

The procedure to conduct horizontal starch gel electrophoresis on enzymes is described in detail. Areas covered are (1) collection and storage of specimens, (2) preparation of tissues, (3) preparation of a starch gel, (4) application of enzyme extracts to a gel, (5) setting up a gel for electrophoresis, (6) slicing a gel, and (7) staining a gel. Recipes are also included for 47 enzyme stains and 3 selected gel buffers. (PDF file contains 26 pages.)

Relevance:

30.00%

Abstract:

Fishery scientists engaged in estimating the size of free-swimming populations have never had a technique available to them whereby all the parameters could be estimated from a resource survey and where no parameter values need to be assumed. Recognizing the need for a technique of this kind, the staff of the Coastal Fisheries Resources Division of the Southwest Fisheries Center (SWFC) devised an egg production method for anchovy biomass assessment. Previously, anchovy biomass was estimated by approximate methods derived from a long time series of anchovy larval abundance, which required about 5 months of ship time each year to integrate the area under a seasonal spawning curve. One major assumption used in the larval abundance census method is that there is constant proportionality between larval numbers and spawning biomass. This has now proved to be erroneous. (PDF file contains 105 pages.)
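The egg production approach is usually written as a single estimator combining the egg survey with adult-sample quantities. The sketch below states that standard form for orientation only; the symbols are the conventional ones for the daily egg production method and are not quoted from this report.

```python
def depm_spawning_biomass(daily_egg_production, survey_area, mean_female_weight,
                          sex_ratio, batch_fecundity, spawning_fraction):
    """Daily egg production method (DEPM) estimator in its usual form,
    B = P0 * A * W / (R * F * S), where P0 is daily egg production per unit
    area, A the survey area, W the mean weight of mature females, R the
    fraction of mature female weight in the population, F the batch fecundity
    and S the fraction of mature females spawning per day."""
    return (daily_egg_production * survey_area * mean_female_weight) / (
        sex_ratio * batch_fecundity * spawning_fraction)
```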

Relevance:

30.00%

Abstract:

The original method, proposed by Yentsch (1957), of determining chlorophyll directly in the cells is attractive because of its simplicity. To measure the chlorophyll content by this method, a measured volume of algal suspension is filtered through a membrane filter. The filter is then dried slightly, cleared with immersion oil, clamped between two glass slides, and read in a spectrophotometer. Extinction is read at wavelengths of 670 millimicrons (near the maximum absorption of chlorophyll a in the cell) and 750 millimicrons (a correction for non-specific absorption and scattering of light by particles of the preparation). The method of Yentsch was employed by the authors for the determination of chlorophyll a in samples of phytoplankton. They conclude that, in spite of its simplicity and convenience, the method must be applied with sufficient care. It is better suited to the analysis of algal cultures, where non-specific absorption of light is insignificant.
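The computational step of the method is a simple correction: the 750 nm reading is subtracted from the 670 nm reading. The sketch below shows just that step; the example readings are made up, and converting the result to an absolute concentration would require a calibration factor and the filtered volume, which the abstract does not give.

```python
def corrected_extinction(e_670, e_750):
    """On-filter chlorophyll reading after Yentsch: subtract the 750 nm
    extinction (non-specific absorption and scattering) from the 670 nm
    extinction (near the in-cell absorption maximum of chlorophyll a)."""
    return e_670 - e_750

# Example with made-up readings; prints about 0.175.
print(corrected_extinction(0.215, 0.040))
```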

Relevance:

30.00%

Abstract:

The first step in the method described here is the reaction of Mn(II) with dissolved oxygen to form a higher manganese hydroxide in an alkaline medium, as in the longstanding classic Winkler method. The precondition for faultless results with the conventional and modified Winkler methods is clean water containing no organic substances oxidizable by Mn(III) or Mn(IV). In many cases, however, e.g. in river- and lake-water tests, it can be seen with the naked eye that after some time the originally brown-coloured manganese hydroxide precipitate becomes more and more colourless. The oxygen content of the water samples was analysed and evaluated; by raising the amount of the leuco base and correspondingly diluting the colouring-matter solution formed, still higher oxygen contents can be measured.
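For comparison, the quantitative step of the classic titrimetric Winkler method is a one-line calculation. The sketch below shows that standard formula only; it is not the photometric leuco-base evaluation described in this abstract.

```python
def winkler_dissolved_oxygen_mg_per_l(titrant_ml, titrant_normality, sample_ml):
    """Classic Winkler titration calculation, shown for comparison with the
    photometric modification described above:
    O2 (mg/L) = V_thiosulfate * N * 8000 / V_sample,
    with 8 g as the equivalent weight of oxygen."""
    return titrant_ml * titrant_normality * 8000.0 / sample_ml

# Example: 2.0 mL of 0.025 N thiosulfate for a 100 mL sample gives 4.0 mg/L.
print(winkler_dissolved_oxygen_mg_per_l(2.0, 0.025, 100.0))
```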