941 results for Grid search algorithm
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association - the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else - associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Abstract:
This paper considers a job search model where the environment is not stationary along the unemployment spell and where jobs do not last forever. Under this circumstance, reservation wages can be lower than without separations, as in a stationary environment, but they can also be initially higher because of the non-stationarity of the model. Moreover, the time-dependence of reservation wages is stronger than with no separations. The model is estimated structurally using Spanish data for the period 1985-1996. The main finding is that, although the decrease in reservation wages is the main determinant of the change in the exit rate from unemployment for the first four months, later on the only effect comes from the job offer arrival rate, given that acceptance probabilities are roughly equal to one.
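To make the mechanics concrete, below is a minimal sketch of the stationary benchmark the paper departs from: a McCall-style search model with exogenous job separations, where the reservation wage is found by iterating on the value of unemployment. The parameter values, wage-offer grid, and function names are illustrative assumptions, not the paper's estimated non-stationary model.

```python
# Stationary job search with separations (illustrative, not the paper's model):
#   U    = benefit + beta * E[max(W(w), U)]
#   W(w) = (w + beta*sep*U) / (1 - beta*(1 - sep))
# The reservation wage solves W(w*) = U, which gives w* = (1 - beta) * U.
import numpy as np

def reservation_wage(benefit=0.4, beta=0.96, sep=0.02,
                     wages=np.linspace(0.1, 2.0, 200)):
    U = 0.0
    for _ in range(10_000):
        W = (wages + beta * sep * U) / (1 - beta * (1 - sep))
        U_new = benefit + beta * np.mean(np.maximum(W, U))
        if abs(U_new - U) < 1e-10:
            break
        U = U_new
    return (1 - beta) * U

print(f"reservation wage: {reservation_wage():.3f}")
```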
Abstract:
A welfare analysis of unemployment insurance (UI) is performed in a general equilibrium job search model. Finitely-lived, risk-averse workers smooth consumption over time by accumulating assets, choose search effort when unemployed, and suffer disutility from work. Firms hire workers, purchase capital, and pay taxes to finance worker benefits; their equity is the asset accumulated by workers. A matching function relates unemployment, hiring expenditure, and search effort to the formation of jobs. The model is calibrated to US data; the parameters relating job search effort to the probability of job finding are chosen to match microeconomic studies of unemployment spells. Under logarithmic utility, numerical simulation shows rather small welfare gains from UI. Even without UI, workers smooth consumption effectively through asset accumulation. Greater risk aversion leads to substantially larger welfare gains from UI; however, even in this case much of its welfare impact is due not to consumption smoothing effects, but rather to decreased work disutility, or to a variety of externalities.
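The abstract names a matching function but not its functional form; the Cobb-Douglas form below is a common illustrative choice, an assumption rather than the paper's specification.

```python
# Illustrative Cobb-Douglas matching in effective search (effort * unemployed)
# and hiring expenditure: m = A * (e*u)^alpha * v^(1-alpha). All parameter
# values are assumed, not taken from the paper's calibration.
def matches(unemployed, effort, vacancies, A=0.5, alpha=0.5):
    return A * (effort * unemployed) ** alpha * vacancies ** (1 - alpha)

# Job-finding probability per unemployed worker, increasing in search effort:
u, e, v = 0.06, 1.0, 0.04
print(matches(u, e, v) / u)
```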
Abstract:
In this paper I show how borrowing constraints and job search interact. I fit a dynamic model to data from the National Longitudinal Survey (1979 cohort) and show that borrowing constraints are significant. Agents with more initial assets and more access to credit attain higher wages for several periods after high school graduation. The unemployed maintain their consumption by running down their assets, while the employed save to buffer against future unemployment spells. I also show that, unlike in models with exogenous income streams, unemployment transfers, by allowing agents to attain higher wages, do not 'crowd out' but increase saving.
Abstract:
Diversity and aspects of the ecology of social wasps (Vespidae, Polistinae) in Central Amazonian "terra firme" forest. Knowledge of social wasp richness and biology in the Amazonian region is considered insufficient. Although Amazonas is the largest state in the region, until now only two brief surveys had been conducted there. Considering that the systematic inventory of an area is the first step towards its conservation and wise use, this study presents faunal data on social wasp diversity in a 25 km² area of "terra firme" (upland forest) at the Ducke Reserve, Manaus, Amazonas, Brazil. Wasps were collected in the understory, following a protocol of three collectors walking along 60 trails, each 1,000 m long, for 16 days between August and October 2010. The methods used were active search for individuals with entomological nets and nest collecting. Fifty-eight species of social wasps, distributed among 13 genera, were recorded; 67% of the collected species belong to Polybia, Agelaia and Mischocyttarus; the other genera were represented by four species or fewer. The most frequent species in active searches were Agelaia fulvofasciata (DeGeer, 1773), Agelaia testacea (Fabricius, 1804) and Angiopolybia pallens (Lepeletier, 1836). Twelve species were collected in nests. Prior to this study, 65 Polistinae species were deposited in the INPA Collection. Collecting in the study grid, an area not previously sampled for wasps, resulted in a 25% increase in the number of species, and total species richness reached 86. The results suggest that the diversity of social wasps at the Ducke Reserve is even higher, making it one of the richest areas in Brazilian Amazonia.
Abstract:
This paper compares two well-known scan matching algorithms: the MbICP and the pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) a method to group all the data grabbed along the path described by the robot into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, with satisfactory results.
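As an illustration of the EKF-based dead-reckoning the abstract describes, here is a minimal sketch of one prediction step propagating a 2D pose and its covariance from body-frame velocity measurements such as those a DVL provides. The state layout, noise values, and function names are assumptions, not the MSISpIC implementation.

```python
# EKF prediction for a planar pose [x, y, theta] with covariance P.
# Inputs: forward speed v and yaw rate from dead-reckoning sensors.
import numpy as np

def predict(pose, P, v, yaw_rate, dt, Q):
    x, y, th = pose
    pose_new = np.array([x + v * np.cos(th) * dt,
                         y + v * np.sin(th) * dt,
                         th + yaw_rate * dt])
    # Jacobian of the motion model with respect to the state:
    F = np.array([[1, 0, -v * np.sin(th) * dt],
                  [0, 1,  v * np.cos(th) * dt],
                  [0, 0,  1]])
    return pose_new, F @ P @ F.T + Q

pose, P = np.zeros(3), np.eye(3) * 1e-4
Q = np.diag([1e-3, 1e-3, 1e-4])          # process noise (assumed values)
pose, P = predict(pose, P, v=0.5, yaw_rate=0.05, dt=0.1, Q=Q)
print(pose, np.diag(P))
```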
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, we could solve these reduced problems more efficiently.
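For readers unfamiliar with the nominal setting, the sketch below illustrates the two judgments nominal unification is built on, freshness (a # t) and α-equivalence via atom swapping, on a toy term language. The term representation and helper names are illustrative assumptions; the paper's quadratic-time reduction is not reproduced here.

```python
# Toy terms: atoms are strings, ('lam', a, t) binds atom a in t,
# ('app', t1, t2) is application.

def swap(a, b, t):
    """Apply the transposition (a b) to every atom in t."""
    if isinstance(t, str):
        return b if t == a else (a if t == b else t)
    return (t[0], swap(a, b, t[1]), swap(a, b, t[2]))

def fresh(a, t):
    """a # t: atom a does not occur free in t."""
    if isinstance(t, str):
        return a != t
    if t[0] == 'lam':
        return a == t[1] or fresh(a, t[2])
    return fresh(a, t[1]) and fresh(a, t[2])

def alpha_eq(t, s):
    """t ~ s modulo renaming of bound atoms."""
    if isinstance(t, str) or isinstance(s, str):
        return t == s
    if t[0] == 'lam' and s[0] == 'lam':
        a, b = t[1], s[1]
        if a == b:
            return alpha_eq(t[2], s[2])
        # Standard nominal rule: bodies match after swapping, and a # s-body.
        return fresh(a, s[2]) and alpha_eq(t[2], swap(a, b, s[2]))
    if t[0] == 'app' and s[0] == 'app':
        return alpha_eq(t[1], s[1]) and alpha_eq(t[2], s[2])
    return False

print(alpha_eq(('lam', 'a', 'a'), ('lam', 'b', 'b')))   # True
print(alpha_eq(('lam', 'a', 'c'), ('lam', 'b', 'b')))   # False
```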
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
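The classification rule itself is simple enough to state as code. The sketch below encodes the abstract's definition, where low-risk means none of the 10 prognostic variables is present; the thresholds are those listed above, while the function and field names are hypothetical.

```python
# A patient is low-risk when none of the 10 prognostic criteria holds.
def is_low_risk(age, cancer, heart_failure, chronic_lung_disease,
                chronic_renal_disease, cerebrovascular_disease,
                pulse, systolic_bp, oxygen_saturation, altered_mental_status):
    criteria = [
        age >= 70,
        cancer,
        heart_failure,
        chronic_lung_disease,
        chronic_renal_disease,
        cerebrovascular_disease,
        pulse >= 110,               # beats per minute
        systolic_bp < 100,          # mm Hg
        oxygen_saturation < 90,     # percent
        altered_mental_status,
    ]
    return not any(criteria)

print(is_low_risk(55, False, False, False, False, False, 82, 125, 96, False))
```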
Abstract:
Since the beginnings of computers as programmable machines, humans have tried to endow them with a certain intelligence so that they think or reason as closely as possible to the way humans do. One of these attempts has been to make the machine capable of thinking in such a way that it studies moves and wins games of chess. Nowadays, with modern multitasking, object-oriented systems with memory access, and thanks to the powerful hardware at our disposal, there is a wide variety of programs dedicated to playing chess. And there are not only small programs: there are even entire machines dedicated to calculating and studying moves in order to beat the best players in the world. The goal of my work is to carry out a study and an implementation of one of these programs, so it is divided into two parts. The theoretical or study part consists of a survey of artificial intelligence systems dedicated to playing chess, the study and search for a valid evaluation function, and the study of search algorithms. The practical part of the work is based on the implementation of an intelligent system capable of playing chess with a certain logic. This implementation is carried out with the help of the SDL libraries, using the minimax algorithm with alpha-beta pruning and C++ code. As a conclusion to the project, I would like to note that the study showed me that creating a chess game was not as easy as I thought, but it gave me the satisfaction of applying everything I learned during my degree and of discovering many other new things.
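The abstract names minimax with alpha-beta pruning as the search algorithm. Below is a minimal, generic sketch of it; the thesis's actual implementation is in C++ with SDL, and the legal_moves/apply/evaluate hooks here are hypothetical placeholders, not the author's code.

```python
# Generic minimax with alpha-beta pruning over an abstract game,
# parameterized by move generation, move application, and evaluation.
def alphabeta(state, depth, alpha, beta, maximizing,
              legal_moves, apply, evaluate):
    moves = legal_moves(state)
    if depth == 0 or not moves:
        return evaluate(state)          # static evaluation at the horizon
    if maximizing:
        value = float('-inf')
        for move in moves:
            value = max(value, alphabeta(apply(state, move), depth - 1,
                                         alpha, beta, False,
                                         legal_moves, apply, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:           # beta cutoff: opponent avoids this line
                break
        return value
    value = float('inf')
    for move in moves:
        value = min(value, alphabeta(apply(state, move), depth - 1,
                                     alpha, beta, True,
                                     legal_moves, apply, evaluate))
        beta = min(beta, value)
        if beta <= alpha:               # alpha cutoff
            break
    return value

# Toy usage: a trivial "game" over integers where each move adds 1 or 2.
best = alphabeta(0, 3, float('-inf'), float('inf'), True,
                 lambda s: [1, 2], lambda s, m: s + m, lambda s: s)
print(best)  # 5
```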
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
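For context, the sketch below shows the classical ML-EM update that the MLE and FMAPE algorithms build on, started from the uniform image the abstract identifies as the correct initial choice. The system matrix, data, and sizes are toy assumptions; the entropy prior and FMAPE acceleration are not reproduced here.

```python
# Classical ML-EM iteration for Poisson data:
#   x <- x / (A^T 1) * A^T (y / (A x)), which preserves positivity.
import numpy as np

def mlem(A, y, n_iter=50):
    x = np.ones(A.shape[1])             # uniform initial image (see abstract)
    sens = A.T @ np.ones(A.shape[0])    # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        x *= (A.T @ (y / np.maximum(proj, 1e-12))) / np.maximum(sens, 1e-12)
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 25))                # toy system matrix (assumed)
x_true = rng.random(25)
y = rng.poisson(A @ x_true)             # Poisson counts, as in the abstract
print(mlem(A, y)[:5])
```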
Abstract:
Recent studies of relativistic jet sources in the Galaxy, also known as microquasars, have been very useful in trying to understand the accretion/ejection processes that take place near compact objects. However, the number of sources involved in such studies is still small. In an attempt to increase the number of known microquasars, we have carried out a search for new Radio Emitting X-ray Binaries (REXBs). These sources are the ones to be observed later with VLBI techniques to unveil their possible microquasar nature. To this end, we have performed a cross-identification between the X-ray ROSAT all sky survey Bright Source Catalog (RBSC) and the radio NRAO VLA Sky Survey (NVSS) catalogs under very restrictive selection criteria for sources with |b|<5 degrees. We have also conducted a deep observational radio and optical study for six of the selected candidates. At the end of this process, two of the candidates appear to be promising and deserve additional observations aimed at confirming their proposed microquasar nature.
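A cross-identification like the RBSC/NVSS match reduces, at its core, to pairing sources from two catalogs whose angular separation falls below a tolerance. The sketch below illustrates this; the catalog format and the 1-arcminute tolerance are assumptions, and the paper's actual selection criteria are more restrictive.

```python
# Naive positional cross-match between two small catalogs.
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(dec1) * math.sin(dec2) +
         math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(min(1.0, max(-1.0, c))))

def cross_match(cat_x, cat_radio, tol_deg=1/60):
    """Return (x_source, radio_source) pairs closer than tol_deg."""
    return [(xs, rs) for xs in cat_x for rs in cat_radio
            if ang_sep_deg(xs['ra'], xs['dec'], rs['ra'], rs['dec']) < tol_deg]

rbsc = [{'name': 'X1', 'ra': 10.684, 'dec': 41.269}]   # toy entries
nvss = [{'name': 'R1', 'ra': 10.685, 'dec': 41.270}]
print(cross_match(rbsc, nvss))
```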
Abstract:
The MAGIC collaboration has searched for high-energy gamma-ray emission from some of the most promising pulsar candidates above an energy threshold of 50 GeV, an energy not reachable up to now by other ground-based instruments. Neither pulsed nor steady gamma-ray emission has been observed at energies of 100 GeV from the classical radio pulsars PSR J0205+6449 and PSR J2229+6114 (and their nebulae, 3C58 and Boomerang, respectively) or the millisecond pulsar PSR J0218+4232. Here, we present the flux upper limits for these sources and discuss their implications in the context of current model predictions.