49 results for Sequential extraction
Abstract:
A sequential weakly efficient two-auction game with entry costs, interdependence between objects, two potential bidders, and the IPV assumption is presented here in order to offer theoretical predictions on the effects of geographical scale economies on local service privatization performance. It is shown that the seller of the first object profits from this interdependence. The interdependence externality raises effective competition for the first object, expressed as the probability of having more than one final bidder. Moreover, if there is more than one final bidder in the first auction, the seller extracts the entire differential in a bidder's expected future surplus between having won the first auction and having lost. The consequences for the seller of the second object are less clear, reflecting the contradictory nature of the two main effects of object interdependence. On the one hand, the first-auction winner becomes "stronger", so that expected payments rise in a competitive environment. On the other hand, the first-auction loser becomes relatively "weaker", hence (probably) reducing effective competition for the second object. Additionally, some contributions to static auction theory with entry costs and asymmetric bidders are presented in the appendix.
Abstract:
In the context of cooperative TU-games, and given an order of the players, we consider the problem of distributing the worth of the grand coalition as a sequential decision problem. At each step of the process, upper and lower bounds on the players' payoffs are required, related to successive reduced games. Sequentially compatible payoffs are defined as those allocation vectors that meet these recursive bounds. The core of the game is reinterpreted as a set of sequentially compatible payoffs when the Davis-Maschler reduced game is considered (Th.1). Independently of the reduction, the core turns out to be the intersection of the family of sets of sequentially compatible payoffs corresponding to the different possible orderings (Th.2), so it is in some sense order-independent. Finally, we analyze advantageous properties for the first player.
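The core-membership condition that Th.1 and Th.2 characterize can be checked directly. A minimal sketch, assuming a made-up symmetric 3-player game (the coalition values v below are illustrative, not from the paper):

```python
from itertools import combinations

# Hypothetical 3-player TU-game: v(S) gives the worth of each coalition S.
players = (1, 2, 3)
v = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
     frozenset({1, 2}): 4, frozenset({1, 3}): 4, frozenset({2, 3}): 4,
     frozenset({1, 2, 3}): 9}

def in_core(x):
    """A payoff vector is in the core iff it is efficient and no
    coalition can improve upon it -- the set that Th.2 recovers as the
    intersection, over all orderings, of the sequentially compatible sets."""
    if sum(x.values()) != v[frozenset(players)]:
        return False                      # not efficient
    coalitions = (frozenset(c) for r in range(1, 3)
                  for c in combinations(players, r))
    return all(sum(x[i] for i in S) >= v[S] for S in coalitions)

print(in_core({1: 3, 2: 3, 3: 3}))   # True: the equal split is in the core
print(in_core({1: 6, 2: 2, 3: 1}))   # False: coalition {2, 3} gets 3 < v({2,3}) = 4
```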
Abstract:
We study the problem of the partition of a system of initial size V into a sequence of fragments s1, s2, s3, .... By assuming a scaling hypothesis for the probability p(s; V) of obtaining a fragment of a given size, we deduce that the final distribution of fragment sizes exhibits power-law behavior. This minimal model is useful for understanding the distribution of avalanche sizes in first-order phase transitions at low temperatures.
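A toy version of the sequential partition is easy to simulate. The sketch below uses a uniform splitting rule chosen for simplicity (not the paper's scaling form of p(s; V)) and shows that the resulting fragment sizes span many orders of magnitude:

```python
import random

def fragment(V=1.0, n_frags=10_000, seed=0):
    """Sequentially split a system of initial size V into fragments
    s1, s2, s3, ...: at each step a uniformly random fraction of the
    remaining volume breaks off (an illustrative rule only)."""
    rng = random.Random(seed)
    remaining, frags = V, []
    for _ in range(n_frags):
        s = remaining * rng.random()   # next fragment size
        frags.append(s)
        remaining -= s
    return frags

frags = fragment()
# Fragment sizes decay geometrically on average, so the final size
# distribution is extremely broad (heavy-tailed), as in the scaling analysis.
print(max(frags) / min(f for f in frags if f > 0) > 1e6)   # True
```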
Abstract:
Background: oscillatory activity, which can be separated into background activity and oscillatory burst patterns, is thought to be representative of local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially in cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events from single trials with a reliable and consistent method is not a simple task. Results: in this work we propose a user-friendly stand-alone toolbox that fits, in reasonable time, a bump time-frequency model to the wavelet representations of a set of signals. The software comes with a Matlab toolbox that can compute the wavelet representations before automatically calling the stand-alone application. Conclusion: the tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
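The kind of wavelet time-frequency representation the bump model is fitted to can be sketched in a few lines. The numpy implementation below (unrelated to the toolbox's own code; sampling rate, frequencies and the synthetic burst are illustrative) makes an oscillatory burst stand out against background noise:

```python
import numpy as np

def morlet_tfr(signal, fs, freqs, n_cycles=5):
    """Time-frequency magnitude map from complex Morlet wavelets --
    the kind of representation a bump model would then be fitted to."""
    tfr = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma = n_cycles / (2 * np.pi * f)            # Gaussian envelope width
        tw = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * tw) * np.exp(-tw**2 / (2 * sigma**2))
        tfr[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return tfr

fs = 250.0
t = np.arange(0, 2, 1 / fs)
# A 10 Hz oscillatory "burst" confined to 0.8-1.2 s, over background noise.
sig = np.random.default_rng(0).normal(0, 0.3, t.size)
burst_mask = (t > 0.8) & (t < 1.2)
sig[burst_mask] += np.sin(2 * np.pi * 10 * t[burst_mask])
tfr = morlet_tfr(sig, fs, freqs=np.array([5.0, 10.0, 20.0]))
# The 10 Hz row of tfr shows a clear energy bump inside the burst window.
```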
Abstract:
In this paper we present a quantitative comparison of different independent component analysis (ICA) algorithms in order to investigate their potential use in preprocessing (such as noise reduction and feature extraction) electroencephalogram (EEG) data for early detection of Alzheimer's disease (AD), or for discrimination between AD (or mild cognitive impairment, MCI) patients and age-matched control subjects.
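Such comparisons can be reproduced in miniature. The sketch below is a minimal symmetric FastICA in numpy, one representative of the algorithm families typically compared; the synthetic two-channel mixture stands in for EEG data:

```python
import numpy as np

def fastica_2d(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) for a two-component
    problem -- a sketch of the algorithm family, not any specific toolbox."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)
    # Whiten via an eigendecomposition of the covariance matrix.
    d, E = np.linalg.eigh(np.cov(X.T))
    Xw = X @ E @ np.diag(d ** -0.5) @ E.T
    W = rng.standard_normal((2, 2))
    for _ in range(n_iter):
        WX = Xw @ W.T
        g, g_prime = np.tanh(WX), 1 - np.tanh(WX) ** 2
        # Fixed-point update: W <- E[g(WX) X^T] - diag(E[g']) W
        W = (g.T @ Xw) / len(X) - np.diag(g_prime.mean(axis=0)) @ W
        # Symmetric decorrelation keeps the rows of W orthonormal.
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt
    return Xw @ W.T

# Mix a sine source with uniform noise, then unmix.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * np.pi * 1.5 * t),
          np.random.default_rng(1).uniform(-1, 1, 2000)]
X = S @ np.array([[1.0, 0.5], [0.4, 1.0]]).T
S_hat = fastica_2d(X)
# Up to sign and permutation, S_hat recovers the original sources.
```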
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute JPEG standard progressive operation mode definition scripts using a quantization approach. It is therefore no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, reducing cost. Firstly, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Secondly, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship is found between the measured image quality at a given stage of the coding process and a quantization matrix. Thus, the definition script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The estimated PSNR usually has an error smaller than 1 dB, and this figure decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image quality improvement during decoding.
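The Laplacian-source-plus-uniform-quantizer model at the heart of the method is easy to probe numerically. A minimal sketch (the scale and step values are illustrative, not taken from the paper):

```python
import numpy as np

def quantization_mse(scale, step, n=200_000, seed=0):
    """Monte-Carlo estimate of the mean square error introduced by
    uniform (mid-tread) quantization of a Laplacian source -- the model
    the paper applies to JPEG's DCT coefficients."""
    rng = np.random.default_rng(seed)
    x = rng.laplace(0.0, scale, n)        # Laplacian DCT-coefficient model
    x_hat = step * np.round(x / step)     # uniform quantizer with the given step
    return np.mean((x - x_hat) ** 2)

# For a fine step the quantization error is nearly uniform on
# [-step/2, step/2], so the MSE approaches the classic step**2 / 12
# high-resolution limit used to map a target MSE/PSNR to a step size.
mse = quantization_mse(scale=10.0, step=1.0)
print(abs(mse - 1.0 / 12) < 0.002)   # True
```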
Abstract:
The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that emphasizes assigning meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by applying it to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as the challenges and implications of repurposing tags for the extraction of domain-level concepts.
Abstract:
In this paper we consider a sequential allocation problem with n individuals. The first individual can consume any amount of some endowment, leaving the remainder for the second individual, and so on. Motivated by the limitations associated with the cooperative and non-cooperative solutions, we propose a new approach. We establish some axioms that should be satisfied: representativeness, impartiality, etc. The result is a unique asymptotic allocation rule. It is derived for n = 2, 3, 4, and a claim is made for general n. We show that it satisfies a set of desirable properties. Key words: sequential allocation rule, river sharing problem, cooperative and non-cooperative games, dictator and ultimatum games. JEL classification: C79, D63, D74.
Abstract:
Introduction: third molar extraction is the most frequent procedure in oral surgery. The present study evaluates the indication of third molar extraction as established by the primary care dentist (PCD) and the oral surgeon, and compares the justification for extraction with the principal reason for patient consultation. Patients and method: a descriptive study was made of 319 patients subjected to surgical removal of a third molar in the context of the Master of Oral Surgery and Implantology (Barcelona University Dental School, Barcelona, Spain) between July 2004 and March 2005. The following parameters were evaluated: sex, age, molar, type of impaction, position according to the classifications of Pell and Gregory and of Winter, and the reasons justifying extraction. Results: the lower third molars were the most commonly extracted molars (73.7%). A total of 69.6% of the teeth were covered by soft tissues only. Fifty-six percent of the lower molars corresponded to Pell and Gregory Class IIB, while 42.1% were in the vertical position. The most common reason for patient referral to our Service of Oral Surgery by the PCD was prophylactic removal (51.0%, versus 46.1% in the case of the oral surgeon). Discussion and conclusions: our results show prophylaxis to be the principal indication of third molar extraction, followed by orthodontic reasons. Regarding third molars with associated clinical symptoms or signs, infectious disease (including pericoronitis) was the pathology most often observed by the oral surgeon, followed by caries. This order of frequency was inverted in the case of third molars referred for extraction by the PCD. A vertical position predominated among the third molars with associated pathology.
Abstract:
Social interactions are a very important component of people's lives. Social network analysis has become a common technique for modeling and quantifying the properties of social interactions. In this paper, we propose an integrated framework to explore the characteristics of a social network extracted from multimodal dyadic interactions. For our study, we used a set of videos from the New York Times' Blogging Heads opinion blog. The social network is represented as an oriented graph whose directed links are determined by the Influence Model; the links' weights are a measure of the "influence" one person has over another. The states of the Influence Model encode audio/visual features automatically extracted from the videos using state-of-the-art algorithms. Our results are reported in terms of the accuracy of audio/visual data fusion for speaker segmentation, and of the centrality measures used to characterize the extracted social network.
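A toy version of the centrality computation on an Influence-Model-style graph can be sketched as follows; the speaker names, edges and weights are invented for illustration:

```python
# Hypothetical directed "influence" graph: an edge (a, b, w) means speaker a
# influences speaker b with weight w, the role the Influence Model's links
# play in the framework above.
edges = [("anna", "ben", 0.7), ("anna", "carl", 0.4),
         ("ben", "anna", 0.2), ("carl", "ben", 0.5)]

def out_strength(edges):
    """Weighted out-degree centrality: total influence each node exerts."""
    strength = {}
    for src, dst, w in edges:
        strength[src] = strength.get(src, 0.0) + w
        strength.setdefault(dst, 0.0)   # ensure isolated targets appear too
    return strength

s = out_strength(edges)
print(max(s, key=s.get))   # anna (0.7 + 0.4 = 1.1, the most influential node)
```

Richer centrality measures (closeness, eigenvector centrality) follow the same pattern on the weighted adjacency structure.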
Abstract:
All the experimental work for this final project was done at the Laboratoire de Biotechnologie Environnementale (LBE) of the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, over 6 months (November 2013 to May 2014). A fungal biofilter composed of woodchips was designed to remove micropollutants from the effluents of wastewater treatment plants. Two fungi, Pleurotus ostreatus and Trametes versicolor, were tested to evaluate their efficiency in removing two micropollutants: the anti-inflammatory drug naproxen and the antibiotic sulfamethoxazole. Although Trametes versicolor was able to degrade naproxen quickly, this fungus was no longer active after one week of operation in the filter. Pleurotus ostreatus was, on the contrary, able to survive more than 3 months in the filter, showing good removal efficiencies for naproxen and sulfamethoxazole throughout this period, in tap water but also in real treated municipal wastewater. Several other experiments provided insight into the removal mechanisms of these micropollutants in the fungal biofilter (degradation and adsorption) and also allowed the removal trend to be modeled. Fungal treatment with Pleurotus ostreatus grown on wood substrates appears to be a promising solution for improving micropollutant removal in wastewater.
Abstract:
In this paper, we propose a new supervised linear feature extraction technique for multiclass classification problems that is specially suited to the nearest neighbor (NN) classifier. The problem of finding the optimal linear projection matrix is defined as a classification problem, and the Adaboost algorithm is used to compute it in an iterative way. This strategy allows the introduction of a multitask learning (MTL) criterion in the method and results in a solution that makes no assumptions about the data distribution and is especially appropriate for solving the small sample size problem. The performance of the method is illustrated by an application to the face recognition problem. The experiments show that the representation obtained by following the multitask approach improves on the classic feature extraction algorithms when using the NN classifier, especially when only a few examples per class are available.
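The role of the linear projection for the NN classifier can be sketched on synthetic data. In the example below the projection matrix P is hand-picked for illustration, not learned by Adaboost as in the paper, and the data is a made-up two-class Gaussian problem:

```python
import numpy as np

def nn_predict(P, X_train, y_train, X_test):
    """1-nearest-neighbour classification after a linear projection P --
    the classifier the paper's learned projection is tuned for."""
    Zt, Zq = X_train @ P.T, X_test @ P.T
    d = ((Zq[:, None, :] - Zt[None, :, :]) ** 2).sum(axis=-1)
    return y_train[d.argmin(axis=1)]

rng = np.random.default_rng(0)
# Two classes separated along the first axis, padded with nuisance dimensions.
X_train = np.vstack([rng.normal([0, 0, 0, 0], 1.0, (30, 4)),
                     rng.normal([5, 0, 0, 0], 1.0, (30, 4))])
y_train = np.array([0] * 30 + [1] * 30)
X_test = np.vstack([rng.normal([0, 0, 0, 0], 1.0, (10, 4)),
                    rng.normal([5, 0, 0, 0], 1.0, (10, 4))])
y_test = np.array([0] * 10 + [1] * 10)

P = np.array([[1.0, 0.0, 0.0, 0.0]])   # project onto the discriminative axis
acc = (nn_predict(P, X_train, y_train, X_test) == y_test).mean()
print(acc >= 0.9)   # True: NN in the projected space separates the classes
```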