503 results for algorithmic skeletons


Relevance: 10.00%

Abstract:

This project aims to create a didactic simulator of an algorithmic machine, built with Adobe Flash CS3 so it can be played back with Adobe Flash Lite Player, the Flash version for mobile devices. It simulates the behavior of the algorithmic machine called FEMTOPROC, which can interpret four very simple instructions: ADD, AND, NOT, and JZ (jump if zero). The instructions making up a program are stored in a memory of 64 positions of 8 bits each, and there is a register bank of 8 registers of 8 bits, which can be initialized at the start of the simulation.
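The abstract describes the machine precisely enough to sketch its interpreter. The following is a minimal sketch in Python (the project itself was built in Flash, not Python); since the abstract does not give the binary instruction encoding, programs are assumed here to be already decoded into tuples, and a Python list stands in for the 64-position instruction memory.

```python
MEM_SIZE, NUM_REGS, MASK = 64, 8, 0xFF  # 64-position memory, 8 registers, 8-bit words

def run(program, regs=None, max_steps=1000):
    """Interpret a FEMTOPROC program given as a list of decoded instruction tuples."""
    assert len(program) <= MEM_SIZE
    regs = list(regs) if regs is not None else [0] * NUM_REGS
    pc, steps = 0, 0
    while 0 <= pc < len(program) and steps < max_steps:
        steps += 1
        op, *args = program[pc]
        if op == "ADD":                      # rd <- (rs + rt) mod 256
            rd, rs, rt = args
            regs[rd] = (regs[rs] + regs[rt]) & MASK
        elif op == "AND":                    # rd <- rs AND rt
            rd, rs, rt = args
            regs[rd] = regs[rs] & regs[rt]
        elif op == "NOT":                    # rd <- bitwise complement of rs
            rd, rs = args
            regs[rd] = ~regs[rs] & MASK
        elif op == "JZ":                     # if register rs is zero, jump to addr
            rs, addr = args
            if regs[rs] == 0:
                pc = addr
                continue
        else:
            raise ValueError(f"unknown opcode {op!r}")
        pc += 1
    return regs
```

For example, `run([("ADD", 0, 1, 2)], regs=[0, 5, 7, 0, 0, 0, 0, 0])` leaves 12 in register 0, and additions wrap modulo 256 as on the 8-bit machine.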

Relevance: 10.00%

Abstract:

In order to study the morphology of young Chrysomya albiceps forms, newly hatched larvae were collected at 2 hr intervals during the first 56 hr; after this time the collection was made at 12 hr intervals. For identification and drawing, larvae were placed between a slide and a coverslip. The cephalopharyngeal skeletons along with the first and last segments were cut off for observation of their structures and spiracles. The larvae present microspines, which are distributed randomly throughout the 12 segments of the body surface; the cephalopharyngeal skeleton varies in shape and extent of sclerotization according to larval instar; the second and third instars have relatively long processes (tubercles) on the dorsal, lateral and ventral surfaces, with microspine circles on the terminal portion.

Relevance: 10.00%

Abstract:

Hypergraph width measures are a class of hypergraph invariants important in studying the complexity of constraint satisfaction problems (CSPs). We present a general exact exponential algorithm for a large variety of these measures. A connection between these and tree decompositions is established. This enables us to almost seamlessly adapt the combinatorial and algorithmic results known for tree decompositions of graphs to the case of hypergraphs and obtain fast exact algorithms. As a consequence, we provide algorithms which, given a hypergraph H on n vertices and m hyperedges, compute the generalized hypertree-width of H in time O*(2^n) and compute the fractional hypertree-width of H in time O(1.734601^n · m).
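The subset dynamic programming behind such O*(2^n) running times is easiest to see for plain graph treewidth, which the abstract's tree-decomposition connection builds on. The sketch below is the well-known elimination-ordering DP for exact graph treewidth, not the authors' hypergraph algorithm; all names are ours. It minimizes, over all elimination orders, the maximum number of later neighbors each vertex has at the moment it is eliminated.

```python
from functools import lru_cache

def treewidth(vertices, edges):
    """Exact treewidth via the O*(2^n) subset DP over elimination prefixes."""
    nbrs = {v: set() for v in vertices}
    for u, w in edges:
        nbrs[u].add(w)
        nbrs[w].add(u)

    def q(S, v):
        # Number of vertices outside S + {v} reachable from v by paths whose
        # internal vertices all lie in S: v's degree in the fill-in graph
        # when it is eliminated right after the vertices of S.
        seen, stack, outside = {v}, [v], 0
        while stack:
            for y in nbrs[stack.pop()]:
                if y in seen:
                    continue
                seen.add(y)
                if y in S:
                    stack.append(y)   # the path may continue through S
                else:
                    outside += 1      # reached a vertex outside S + {v}
        return outside

    @lru_cache(maxsize=None)
    def opt(S):
        # Best achievable width over all orders that eliminate exactly S first.
        if not S:
            return -1
        return min(max(opt(S - {v}), q(S - {v}, v)) for v in S)

    return opt(frozenset(vertices))
```

Enumerating all 2^n subsets once, with a polynomial-time step per subset, gives the O*(2^n) bound; the paper's contribution is making this scheme work for hypergraph width measures.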

Relevance: 10.00%

Abstract:

Diphyllobothrium pacificum has been reported as a human parasite from coprolites and skeletons in Peru and Chile. Our analysis of Chinchorro mummies from Chile provides the oldest evidence of D. pacificum directly associated with human mummies. These mummies date between 4,000 and 5,000 years ago. The basis for our diagnosis is presented. We find that the size of the eggs in the mummies is smaller than other discoveries of D. pacificum. We suggest that this is due to the peculiar circumstances of preservation of parasite eggs within mummies and the release of immature eggs into the intestinal tract as the tapeworms decompose after the death of the host. This information is important to consider when making diagnoses from mummies.

Relevance: 10.00%

Abstract:

During the excavations of the XIX century Meadowlark cemetery (Manhattan, Kansas, US), samples of sediment were taken from around five skeletons and analyzed to detect intestinal parasites. No helminth eggs were found, but immunological ELISA tests for Entamoeba histolytica were positive in three samples. Immunological techniques have been successfully used in paleoparasitology to detect protozoan infections. Amoebiasis could have been a severe disease in the past, especially where poor sanitary conditions prevailed, and there is evidence that such conditions may have prevailed while this cemetery was in use. The presence of this protozoan in the US during the late XIX century gives information on the health of the population and provides additional data on the parasite's evolution since its appearance in the New World.

Relevance: 10.00%

Abstract:

The analysis of the skeletons of past human populations provides some of the best biological data regarding the history of significant diseases such as tuberculosis. The purpose of this study is to present the pathological alterations of the bones caused by this disease in ancient material from the territory of the Hungarian Great Plain, on the basis of earlier references and new cases. The bone changes in tuberculosis were mainly manifested in the vertebrae and less frequently in the hip; further alterations were observed on the surface of the endocranium and the ribs.

Relevance: 10.00%

Abstract:

Syphilis is a sexually or congenitally transmitted infectious disease with an impact on the health of human populations that has undergone important cycles in different countries and periods of history. Its presence was first diagnosed in Europe in the late XIV century. In Portugal, although there are various written records of the infection in the last centuries, there are rare references to it in archeological findings (mummified bodies are also rare in Portugal). The current study describes a probable case of congenital syphilis in an 18-month-old girl buried in the Church of the Sacrament in Lisbon. Her body, dating to the XVIII century, was found mummified together with dozens of others, still not studied. Symmetrical periostitis of the long bones, osteitis, metaphyseal lesions, articular and epiphyseal destruction of the left knee, and a rarefied lesion with a radiological appearance compatible with Wimberger's sign all point to a diagnosis of congenital syphilis. The diagnosis of this severe form of the infection, possibly related to the cause of death in this upper-class girl, calls attention to the disease's presence in XVIII century Lisbon and is consistent with the intense mobilization at the time in relation to the risks posed by so-called heredosyphilis. It is the first case of congenital syphilis in a child reported in archeological findings in Portugal, and can be correlated with other cases in skeletons of adults buried in cemeteries in Lisbon (in the XVI to XVIII centuries) and Coimbra (XIX century). Finally, this finding highlights the need to study the entire series of mummified bodies in the Church of the Sacrament in order to compare the paleopathological findings and existing historical documents on syphilis, so as to expand the paleoepidemiological knowledge of this infection in XVIII century Lisbon.

Relevance: 10.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions, obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject on which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minutely frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal generated by them passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reversion SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
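The Ornstein-Uhlenbeck view of pairs trading mentioned above can be sketched compactly. The thesis used MATLAB; the following is an illustrative Python sketch, not the thesis code. It exploits the fact that the discrete-time observation of an OU spread dS = theta(mu - S)dt + sigma dW is an AR(1) process, so the parameters can be recovered from a linear regression; the entry/exit z-score thresholds are hypothetical defaults chosen for illustration.

```python
import numpy as np

def fit_ou(spread, dt=1.0):
    """Fit S_{t+1} = a + b*S_t + eps (AR(1)) and map it to OU parameters:
    mean-reversion speed theta, long-run mean mu, and the stationary
    (equilibrium) standard deviation of the spread."""
    s = np.asarray(spread, dtype=float)
    x, y = s[:-1], s[1:]
    b, a = np.polyfit(x, y, 1)                     # slope b, intercept a
    theta = -np.log(b) / dt                        # since b = exp(-theta * dt)
    mu = a / (1.0 - b)                             # long-run mean
    resid = y - (a + b * x)
    sigma_eq = resid.std() / np.sqrt(1.0 - b * b)  # stationary std dev
    return theta, mu, sigma_eq

def signal(spread, mu, sigma_eq, entry_z=2.0, exit_z=0.5):
    """Mean-reversion positions: -1 short the spread, +1 long, 0 flat."""
    pos, out = 0, []
    for s in spread:
        z = (s - mu) / sigma_eq
        if pos == 0 and z > entry_z:
            pos = -1                  # spread rich relative to the mean: short
        elif pos == 0 and z < -entry_z:
            pos = 1                   # spread cheap: long
        elif pos != 0 and abs(z) < exit_z:
            pos = 0                   # reverted toward the mean: close
        out.append(pos)
    return out
```

A backtest in the spirit of the thesis would re-run `fit_ou` on a rolling window, since, as noted above, the calibrated parameters drift as market conditions evolve.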

Relevance: 10.00%

Abstract:

Genetically engineered organisms expressing spectroscopically active reporter molecules in response to chemical effectors display great potential as living transducers in sensing applications. Green fluorescent protein (gfp gene) bioreporters have distinct advantages over luminescent counterparts (lux gene), including applicability at the single-cell level, but are typically less sensitive. Here we describe a gfp-bearing bioreporter that is sensitive to naphthalene (a poorly water soluble pollutant behaving like a large class of hydrophobic compounds), is suitable for use in chemical assays and bioavailability studies, and has detection limits comparable to lux-bearing bioreporters for higher efficiency detection strategies. Simultaneously, we find that the exploitation of population response data from single-cell analysis is not an algorithmic conduit to enhanced signal detection and hence lower effector detection limits, as normally assumed. The assay reported functions to equal effect with or without biocide.

Relevance: 10.00%

Abstract:

A new radiolarian order - Archaeospicularia - is proposed for some Lower Paleozoic radiolarians previously considered to belong to Spumellaria and to Collodaria. It is characterized by a globular shell made of several spicules which can be free, interlocked, or fused to form a latticed wall. The present paper gives the definition of this order and proposes a first classification. It is supposed that the Archaeospicularia represents the oldest radiolarian group and that in the Lower Paleozoic it gave rise to the orders Entactinaria, Albaillellaria, and probably Spumellaria by the reduction of the number of initial spicules. The origin of this order and its relationships with other groups of organisms with siliceous skeletons are also briefly discussed. (C) 2000 Academie des sciences / Editions scientifiques et medicales Elsevier SAS.

Relevance: 10.00%

Abstract:

We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.


Relevance: 10.00%

Abstract:

Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data that they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality. A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least without consensus on their validity. How, then, to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process.
We take care to emphasize that we want to prove the said model as the cause of performance, and to compare against an (incumbent) process rather than against an alternate model. In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.

Relevance: 10.00%

Abstract:

In this paper, we introduce the concept of dyadic pulsations as a measure of sustainability in online discussion groups. Dyadic pulsations correspond to new communication exchanges occurring between two participants in a discussion group. A group that continuously integrates new participants into the ongoing conversation is characterized by a steady dyadic pulsation rhythm. On the contrary, groups that either pursue closed conversation or unilateral communication have no or very few dyadic pulsations. We show, on two examples taken from Usenet discussion groups, that dyadic pulsations make it possible to anticipate future bursts in response delay time, which are signs of group discussion collapse. We discuss ways of making this measure resilient to spam and other common algorithmic production that pollutes real discussions.
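As defined above, a dyadic pulsation occurs whenever a message creates a communication exchange between a pair of participants who have never exchanged before. A minimal sketch, assuming a message is represented as a (sender, addressee) tuple in chronological order (a format of our choosing, not the paper's):

```python
def dyadic_pulsations(messages):
    """messages: iterable of (sender, addressee) tuples in chronological order.
    Returns a list of 0/1 flags: 1 marks a dyadic pulsation, i.e. the first
    exchange ever between that (unordered) pair of participants."""
    seen = set()
    flags = []
    for sender, addressee in messages:
        pair = frozenset((sender, addressee))  # unordered: a->b and b->a are one dyad
        if pair not in seen:
            seen.add(pair)
            flags.append(1)                    # new dyad: a pulsation
        else:
            flags.append(0)
    return flags
```

A group with a steady pulsation rhythm keeps producing 1s over time, while a closed or unilateral conversation produces a burst of 1s early on and almost none afterwards.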

Relevance: 10.00%

Abstract:

Species distribution modelling is central to both fundamental and applied research in biogeography. Despite widespread use of models, there are still important conceptual ambiguities as well as biotic and algorithmic uncertainties that need to be investigated in order to increase confidence in model results. We identify and discuss five areas of enquiry that are of high importance for species distribution modelling: (1) clarification of the niche concept; (2) improved designs for sampling data for building models; (3) improved parameterization; (4) improved model selection and predictor contribution; and (5) improved model evaluation. The challenges discussed in this essay do not preclude the need for developments of other areas of research in this field. However, they are critical for allowing the science of species distribution modelling to move forward.