986 results for relative loss bounds
Abstract:
We propose a new way to build a combined list from K base lists, each containing N items. A combined list consists of top segments of various sizes from each base list so that the total size of all top segments equals N. A sequence of item requests is processed and the goal is to minimize the total number of misses. That is, we seek to build a combined list that contains all the frequently requested items. We first consider the special case of disjoint base lists. There, we design an efficient algorithm that computes the best combined list for a given sequence of requests. In addition, we develop a randomized online algorithm whose expected number of misses is close to that of the best combined list chosen in hindsight. We prove lower bounds that show that the expected number of misses of our randomized algorithm is close to the optimum. In the presence of duplicate items, we show that computing the best combined list is NP-hard. We show that our algorithms still apply to a linearized notion of loss in this case. We expect that this new way of aggregating lists will find many ranking applications.
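The objective can be made concrete with a small sketch (the exhaustive search below is only for illustration and is not the paper's efficient algorithm; all names are ours): a combined list takes the top `sizes[k]` items from each base list, with the segment sizes summing to N, and its loss is the number of requests that fall outside it.

```python
from itertools import product

def misses(base_lists, sizes, requests):
    """Loss of a combined list: requests falling outside the union of
    the top sizes[k] items of each base list."""
    combined = set()
    for lst, size in zip(base_lists, sizes):
        combined.update(lst[:size])
    return sum(1 for r in requests if r not in combined)

def best_combined_list(base_lists, requests):
    """Exhaustive search over all segment-size allocations summing to N.
    Exponential in K -- only meant to make the objective concrete."""
    n = len(base_lists[0])
    best = None
    for sizes in product(range(n + 1), repeat=len(base_lists)):
        if sum(sizes) != n:
            continue
        m = misses(base_lists, sizes, requests)
        if best is None or m < best[0]:
            best = (m, sizes)
    return best
```

For disjoint lists `[["a", "b", "c"], ["d", "e", "f"]]` and the request sequence `["a", "d", "d", "e"]`, the allocation `(1, 2)` covers every request, so the best combined list has zero misses.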
Abstract:
We consider an online learning scenario in which the learner can make predictions on the basis of a fixed set of experts. The performance of each expert may change over time in a manner unknown to the learner. We formulate a class of universal learning algorithms for this problem by expressing them as simple Bayesian algorithms operating on models analogous to Hidden Markov Models (HMMs). We derive a new performance bound for such algorithms which is considerably simpler than existing bounds. The bound provides the basis for learning the rate at which the identity of the optimal expert switches over time. We find an analytic expression for the a priori resolution at which we need to learn the rate parameter. We extend our scalar switching-rate result to models of the switching-rate that are governed by a matrix of parameters, i.e. arbitrary homogeneous HMMs. We apply and examine our algorithm in the context of the problem of energy management in wireless networks. We analyze the new results in the framework of Information Theory.
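A well-known member of this family of universal algorithms is the fixed-share update, sketched below with exponential weights (the switching rate `alpha`, which the paper is concerned with learning, is hand-picked here purely for illustration):

```python
import math

def fixed_share_update(w, losses, alpha):
    """One round of a fixed-share style update over n experts:
    (1) exponential-weights loss update, then (2) share a fraction
    alpha of the total weight uniformly across experts, modelling a
    possible switch of the identity of the best expert."""
    w = [wi * math.exp(-li) for wi, li in zip(w, losses)]
    total = sum(w)
    w = [wi / total for wi in w]                    # renormalise
    n = len(w)
    return [(1 - alpha) * wi + alpha / n for wi in w]

w = [1 / 3] * 3                                     # uniform prior
for losses in ([1.0, 0.0, 1.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]):
    w = fixed_share_update(w, losses, alpha=0.1)
```

After these three rounds the weight vector still sums to one, expert 1 (best in the first two rounds) dominates, and expert 2 (always lossy) has the least weight; the share step keeps some weight on every expert so the learner can track a switch.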
Abstract:
The association between human immunodeficiency virus type 1 (HIV-1) RNA load changes and the emergence of resistant virus variants was investigated in 24 HIV-1-infected asymptomatic persons during 2 years of treatment with zidovudine, by sequentially measuring serum HIV-1 RNA load and the relative amounts of HIV-1 RNA containing mutations at reverse transcriptase (RT) codons 70 (K-->R), 41 (M-->L), and 215 (T-->Y/F). A mean maximum decline in RNA load occurred during the first month, followed by a resurgence between 1 and 3 months which appeared independent of drug resistance. Mathematical modeling suggests that this resurgence is caused by host-parasite dynamics, and thus reflects infection of the transiently increased numbers of CD4+ lymphocytes. Between 3 and 6 months of treatment, the RNA load returned to baseline values, which was associated with the emergence of virus containing a single lysine-to-arginine amino acid change at RT codon 70, conferring only an 8-fold reduction in susceptibility. Despite the relative loss of RNA load suppression, selection toward mutations at RT codons 215 and 41 continued. Identical patterns were observed in the mathematical model. While host-parasite dynamics and outgrowth of low-level resistant virus thus appear responsible for the loss of HIV-1 RNA load suppression, zidovudine continues to select for alternative mutations conferring increasing levels of resistance.
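Host-parasite analyses of this kind are commonly built on the standard target-cell-limited model of viral dynamics; the sketch below integrates that generic model (not the study's own fitted model) with illustrative parameters, using explicit Euler steps, to show how viral load can rebound as target cells are replenished.

```python
def viral_dynamics(n_days=180, steps_per_day=100):
    """Explicit Euler integration of the standard target-cell-limited
    model: T = uninfected CD4+ target cells, I = infected cells,
    V = free virus. All parameter values are illustrative only."""
    lam, d = 100.0, 0.1      # target-cell production and death rates
    beta = 5e-5              # infection rate
    delta = 0.5              # infected-cell death rate
    p, c = 100.0, 5.0        # virion production and clearance rates
    dt = 1.0 / steps_per_day
    T, I, V = 1000.0, 0.0, 1e-3
    daily_v = []
    for step in range(n_days * steps_per_day):
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
        if step % steps_per_day == 0:
            daily_v.append(V)   # one viral-load sample per day
    return daily_v
```

With these parameters the basic reproduction number is above one, so the simulated viral load grows from its tiny initial value, overshoots, and settles toward a set point driven by the availability of target cells.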
Abstract:
A pathological feature of Alzheimer's disease (AD) is an area-specific neuronal loss that may be caused by excitotoxicity-related synaptic dysfunction. Relative expression levels of synaptophysin, dynamin I, complexins I and II, N-cadherin, and alpha CaMKII were analysed in human brain tissue from AD cases and controls in the hippocampus and the inferior temporal and occipital cortices. Synaptophysin and dynamin I are presynaptic terminal proteins not specific to any neurotransmitter system, whereas complexin II, N-cadherin, and alpha CaMKII are specific for excitatory synapses. Complexin I is a presynaptic protein localised to inhibitory synapses. There were no significant differences in synaptophysin, dynamin I, N-cadherin, or alpha CaMKII protein levels between AD cases and controls. The complexin proteins were both markedly lower in AD cases than in controls (P < 0.01). Cases were also categorised by APOE genotype. Averaged across areas, there was a 36% lowering of presynaptic proteins in AD cases carrying at least one epsilon 4 allele compared with AD cases lacking the epsilon 4 allele. We infer that synaptic protein level is not indicative of neuronal loss, but that synaptic dysfunction may result from the marked relative loss of the complexins in AD, and from lower levels of presynaptic proteins in AD cases with the APOE epsilon 4 allele. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
The latest Hungarian economic growth data, though favourable, do not let us forget that in the longer term growth is weak compared to the preceding period – as well as to the performance of the East-Central European region, which is more dynamic than the European average. In order to make sense of the past decade’s relative loss of pace and lay the foundations for future development policy, it is worth placing Hungary’s case in the context of the slowing tempo typical of middle-income countries. The economic development policies currently pursued by the government are aimed at increasing output in the processing industry, and by extension exports, while relevant international experience advises that it is the higher value-added activities of the global value chain, particularly business services, which should be developed further. In this way real wages and income levels could be increased, and the economy would be less exposed to the fluctuations of international cycles.
Abstract:
The galactose-binding lectin from the seeds of the jequirity plant (Abrus precatorius) was subjected to various chemical modifications in order to detect the amino acid residues involved in its binding activity. Modification of lysine, tyrosine, arginine, histidine, glutamic acid and aspartic acid residues did not affect the carbohydrate-binding activity of the agglutinin. However, modification of tryptophan residues, carried out under native and denaturing conditions with N-bromosuccinimide and 2-hydroxy-5-nitrobenzyl bromide, led to a complete loss of its carbohydrate-binding activity. Under denaturing conditions 30 tryptophan residues/molecule were modified by both reagents, whereas only 16 and 18 residues/molecule were available for modification by N-bromosuccinimide and 2-hydroxy-5-nitrobenzyl bromide, respectively, under native conditions. The relative loss in haemagglutinating activity after the modification of tryptophan residues indicates that two residues/molecule are required for the carbohydrate-binding activity of the agglutinin. Partial protection was observed in the presence of saturating concentrations of lactose (0.15 M). The decrease in fluorescence intensity of Abrus agglutinin on modification of tryptophan residues is linear in the absence of lactose and shows a biphasic pattern in its presence, indicating that tryptophan residues move from a similar to a different molecular environment on saccharide binding. The secondary structure of the protein remains practically unchanged upon modification of tryptophan residues, as indicated by CD and immunodiffusion studies, confirming that the loss in activity is due to the modification alone.
Abstract:
This paper looks at the complexity of four different incremental problems: (1) interval partitioning of a flow graph, (2) breadth-first search (BFS) of a directed graph, (3) lexicographic depth-first search (DFS) of a directed graph, and (4) constructing the postorder listing of the nodes of a binary tree. The last problem arises from the need to incrementally recompute the Sethi-Ullman (SU) ordering [1] of the subtrees of a tree after it has undergone changes of a given type. These problems are among those that claimed our attention while we were designing algorithmic techniques for incremental code generation. BFS and DFS certainly have numerous other applications, but as far as our work is concerned, incremental code generation is the common thread linking these problems. The complexity of these problems is studied from two different perspectives. The theory of incremental relative lower bounds (IRLB) is given in [2]; we use this theory to derive the IRLBs of the first three problems. We then use the notion of a bounded incremental algorithm [4] to prove the unboundedness of the fourth problem with respect to the locally persistent model of computation. The lower bound result for lexicographic DFS is possibly the most interesting. In [5] the author considers lexicographic DFS to be a problem whose incremental version may require recomputing the entire solution from scratch. In that sense, our IRLB result provides further evidence for this possibility, with the proviso that the incremental DFS algorithms considered are ones that do not require too much preprocessing.
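As background for problem (4), the non-incremental computation can be sketched in a few lines: a simplified variant of Sethi-Ullman labelling (every leaf is given register need 1, and every inner node is assumed to have both children) together with the postorder listing of the tree's nodes.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    label: int = 0  # Sethi-Ullman register need, filled in by su_label

def su_label(node):
    """Simplified Sethi-Ullman labelling: a leaf needs one register;
    an inner node needs the larger of its children's needs if they
    differ, otherwise one more than that shared need."""
    if node.left is None and node.right is None:
        node.label = 1
    else:
        l, r = su_label(node.left), su_label(node.right)
        node.label = max(l, r) if l != r else l + 1
    return node.label

def postorder(node, out=None):
    """Postorder listing of the nodes of a binary tree (problem (4))."""
    if out is None:
        out = []
    if node.left is not None:
        postorder(node.left, out)
    if node.right is not None:
        postorder(node.right, out)
    out.append(node)
    return out
```

The incremental question the paper studies is how much of this labelling and listing must be redone after a bounded change to the tree, rather than recomputing it from scratch as above.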
Abstract:
The questions one should answer in engineering computations - deterministic, probabilistic/randomized, as well as heuristic - are (i) how good the computed results/outputs are and (ii) what the cost is, in terms of the amount of computation and the amount of storage used to obtain the outputs. The absolutely error-free quantities, as well as the completely errorless computations done in a natural process, can never be captured by any means at our disposal. While the computations in nature/natural processes, including the input real quantities, are exact, the computations that we do using a digital computer, or that are carried out in embedded form, are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis rather than assumption, is not less than 0.005 per cent. Here by error we mean relative error bounds. The fact that the exact error is never known, under any circumstances and in any context, implies that the term error means nothing but error-bounds. Further, in engineering computations it is the relative error, or equivalently the relative error-bounds (and not the absolute error), that is supremely important in telling us about the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e., in problems created from nature, is completely nonexistent, while in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable, or more easily solvable, in practice. Thus if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to one or more of the three foregoing factors.
We do, however, go ahead and solve such inconsistent/near-inconsistent problems, and obtain results that can be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, in other numerical and statistical computations, and in PAC (probably approximately correct) learning models. It characterises the quality of the results/outputs by specifying relative error-bounds along with the associated confidence level, and the cost, viz. the amount of computation and of storage, through complexity. It points out the limitations of error-free computation (wherever it is possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the use of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
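The talk's distinction between absolute and relative error, and the 0.005 per cent instrument floor, can be pinned down in a few lines (the sample reading and the scale factor are our illustrative choices):

```python
def rel_err(approx, exact):
    """Relative error of a computed value against a reference value."""
    return abs(approx - exact) / abs(exact)

# The floor asserted in the talk: any measuring instrument contributes
# a relative error bound of at least 0.005 per cent to its readings.
instrument_floor = 0.005 / 100          # 0.005 per cent as a fraction
reading = 100.0                          # hypothetical measured input
worst_input = reading * (1 + instrument_floor)
# A pure scale factor (here: times 3) leaves the relative bound
# unchanged, which is why the relative (not absolute) error-bound is
# the meaningful measure of output quality.
scaled_err = rel_err(3 * worst_input, 3 * reading)
```

The absolute error of the scaled result grows with the scale factor, but its relative error stays at the instrument floor; no exact-arithmetic trick downstream can recover precision the input never had.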
Abstract:
Many construction professionals and policy-makers would agree that client expectations should be accommodated during a building project. However, this aspiration is not easy to deal with, as there may be conflicting interests within a client organization and these may change over time in the course of a project. This research asks why some client interests, and not others, are incorporated into the development of a building project. Actor-Network Theory (ANT) is used to study a single building project on a university campus. The building project is analysed as a series of discussions and negotiations in which actors persuade each other to choose one solution over another. The analysis traces dynamic client engagement in decision-making processes as the available options became increasingly constrained. However, this relative loss of control was countered by clients, who retained control over the timing of participants' involvement, and thus the means to impose their interests even at the later stages of the project.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Graduate Program in Geography - IGCE
Abstract:
The continuous incorporation of forest areas into the productive process has brought about significant changes in the landscape. In the Amazon, with the advance of the agricultural frontier and the consolidation of productive activities in certain areas, these transformations can be perceived most clearly. This problem is also observed in the Integration Regions (RI) of Araguaia and Tapajós, in the southeast and southwest of the state of Pará, respectively. This work therefore aims to use data mining techniques and landscape metrics to identify and analyse, in an automated way, the landscape patterns associated with the different types of human occupation patterns in the Legal Amazon, taking as its units of analysis the Araguaia and Tapajós Integration Regions in the State of Pará, with Land Use and Land Cover data from the Terra Class Project for the years 2008 and 2010. It also addresses methodologies aimed at identifying possible trajectories of landscape "evolution", in order to outline recommendations for better use of the land and the available natural resources, and to support decision-making in territorial management and the implementation of public policies. It was found that the Tapajós RI shows strong land use and land cover dynamics between 2008 and 2010, mainly with respect to the land use classes. However, for both 2008 and 2010 the Region still retains a significant share of areas with vegetation cover. In the Araguaia RI, the land use and land cover dynamics occur differently, with significant change between the classes over the years analysed.
However, in the Araguaia RI, as in the Tapajós RI, the greatest intensity of land use dynamics occurs among the pasture classes; in the Araguaia RI there was a relative loss of areas of managed pasture (clean pasture) to pasture areas with invasive species (dirty pasture) or in regeneration. The automated mapping of Landscape Typologies using the GeoDMA plugin of TerraView proved effective and accurate, since the results obtained are consistent with the reality of each Integration Region analysed.
Abstract:
OBJECTIVE: Deep brain stimulation (DBS) has emerged as a useful therapeutic option for patients with insufficient benefit from conservative treatment. METHODS: Nine patients with chronic DBS who suffered from cervical dystonia (4), generalized dystonia (2), hemidystonia (1), paroxysmal dystonia (1) and Meige syndrome (1) were available for formal follow-up at three years postoperatively, and beyond up to 10 years. All patients had undergone pallidal stimulation except one patient with paroxysmal dystonia who underwent thalamic stimulation. RESULTS: Maintained improvement was seen in all patients with pallidal stimulation up to 10 years after surgery except in one patient who had a relative loss of benefit in dystonia ratings but continued to have improved disability scores. After nine years of chronic thalamic stimulation there was a mild loss of efficacy which was regained when the target was changed to the pallidum in the patient with paroxysmal dystonia. There were no major complications related to surgery or to chronic stimulation. Pacemakers had to be replaced within 1.5 to 2 years, in general. CONCLUSION: DBS maintains marked long-term symptomatic and functional improvement in the majority of patients with dystonia.
Abstract:
A third glacier inventory (GI3) is presented for the province of Salzburg, where 173 glaciers are located in seven mountain ranges: Ankogel (47°4'N, 13°14'E), Glockner, Granatspitz, Sonnblick (Goldberg), Hochkönig, Venediger and Zillertal (47°8'N, 12°7'E). The basis for the new GI3 are orthophotos of 2007 and 2009 and the digital elevation model (DEM) of the southern part of Salzburg. On the basis of former inventories, area and volume changes have been calculated. The biggest relative loss of glacier area per mountain range was found in the Ankogel range and on Hochkönig, as a result of the disrupted structure of their small and thin glaciers. In terms of absolute values, the largest changes took place in the Glockner and Venediger ranges, with area losses of -10.1 km**2 and -9.7 km**2, respectively, during the period between GI1 (1969) and GI3 (2007/2009). Volume changes have been calculated for nearly half of the glacier area in Salzburg, where DEMs were available. The Glockner, Granatspitz and Sonnblick mountain ranges showed a volume loss of -0.481 km**3, which corresponds to a mean thickness change of -10.5 m. An extrapolation of these changes to all of the 173 glaciers in Salzburg results in a loss of about 1.04 km**3 between GI1 and GI3 and 0.44 km**3 between GI2 and GI3. Overall annual changes in the province of Salzburg between GI2 and GI3 were higher than between GI1 and GI2, and show changes similar to those in Tyrol.
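The relation used here, mean thickness change = volume change / area, also lets one back out the surveyed area implied by the quoted figures (-0.481 km**3 at -10.5 m gives roughly 46 km**2, consistent with "nearly half" of Salzburg's glacier area being covered by DEMs):

```python
def mean_thickness_change_m(volume_change_km3, area_km2):
    """Mean thickness change in metres: volume change divided by area
    (1 km**3 over 1 km**2 corresponds to 1 km = 1000 m)."""
    return volume_change_km3 / area_km2 * 1000.0

# Back out the surveyed area implied by the inventory's figures:
# -0.481 km**3 at a mean thickness change of -10.5 m.
implied_area_km2 = -0.481 / -10.5 * 1000.0   # about 45.8 km**2
```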