165 results for pessimistic


Relevância: 10.00%

Resumo:

This work addresses the practically important question of whether the down conductors of lightning protection systems on tall towers and buildings can be electrically isolated from the structure itself. As a first step, a down conductor placed on a metallic tower is taken as a pessimistic representation of the actual problem, on the grounds that the proximity of a heavy metallic structure will have a large damping effect. The post-stroke current distributions along the down conductors and towers, which can differ considerably from that in the lightning channel, govern the post-stroke near field and the resulting gradient in the soil. Moreover, a reliable estimate of the actual stroke current from measured down-conductor currents requires knowledge of the current distribution characteristics along the down conductors. In view of this, the present work attempts to deduce the post-stroke current and voltage distributions along typical down conductors and towers. A solution of the governing field equations on an electromagnetic model of the system is sought for the investigation. Simulations of the spatio-temporal distribution of the post-stroke current and voltage yield very interesting results. It is concluded that it is almost impossible to achieve electrical isolation between the structure and the down conductor; furthermore, there will be significant induction into the steel matrix of the supporting structure.
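The full electromagnetic model is beyond an abstract, but the basic phenomenon (a current distribution along a conductor that differs from the injected channel current) can be caricatured with a lossless 1-D transmission-line (telegrapher's equation) FDTD simulation. All parameters below are illustrative normalized assumptions, not values from the paper.

```python
import math

# Lossless 1-D transmission line, FDTD leapfrog scheme (illustrative only).
# Normalized per-unit-length inductance/capacitance; wave speed c = 1/sqrt(L*C).
L, C = 1.0, 1.0
N = 101          # spatial nodes for voltage
dx = 1.0
dt = 0.5         # satisfies the Courant condition dt <= dx*sqrt(L*C)

V = [0.0] * N          # node voltages
I = [0.0] * (N - 1)    # branch currents between adjacent nodes

def source(t):
    # Gaussian pulse standing in for the stroke-current injection.
    return math.exp(-((t - 30.0) / 10.0) ** 2)

peak_mid = 0.0
for n in range(400):
    V[0] = source(n * dt)
    # Update currents from the voltage gradient: dI/dt = -(1/L) dV/dx
    for i in range(N - 1):
        I[i] -= (dt / (L * dx)) * (V[i + 1] - V[i])
    # Update interior voltages from the current divergence: dV/dt = -(1/C) dI/dx
    for i in range(1, N - 1):
        V[i] -= (dt / (C * dx)) * (I[i] - I[i - 1])
    peak_mid = max(peak_mid, abs(V[50]))

# The injected pulse reaches the midpoint of the line essentially undistorted.
print(round(peak_mid, 2))
```

On a real tower the line parameters vary along the height and the termination is lossy soil, which is precisely why the measured down-conductor current can differ from the channel current.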

Relevância: 10.00%

Resumo:

The paper analyses the effect of spatial smoothing on the performance of the MUSIC algorithm. In particular, an attempt is made to bring out two effects of smoothing: (i) reduction of the effective correlation between the impinging signals, and (ii) reduction of the noise perturbations due to finite data. For a two-source scenario with widely spaced sources, simplified expressions for the improvement with smoothing have been obtained which provide more insight into its impact. Specifically, these expressions yield a pessimistic estimate of the minimum source correlation beyond which smoothing is beneficial. Computer simulations demonstrate the usefulness of the analytical results.
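As a minimal NumPy sketch of the forward spatial smoothing referred to above (my own illustration, not the paper's derivation): the covariance matrix of an M-element array is averaged over overlapping subarrays, which decorrelates coherent sources and restores the signal-subspace rank that MUSIC needs. The array geometry, angles, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M, m = 8, 5                          # array size and subarray size (illustrative)
theta = np.deg2rad([-20.0, 25.0])    # two widely spaced sources

# Steering vectors for a half-wavelength-spaced uniform linear array.
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(theta)))

# Fully coherent sources: the second signal is a copy of the first.
s = rng.standard_normal(200) + 1j * rng.standard_normal(200)
X = A @ np.vstack([s, s]) + 0.01 * (rng.standard_normal((M, 200))
                                    + 1j * rng.standard_normal((M, 200)))
R = (X @ X.conj().T) / 200

# Forward spatial smoothing: average covariances of overlapping subarrays.
Rs = sum(R[k:k + m, k:k + m] for k in range(M - m + 1)) / (M - m + 1)

def signal_rank(Rmat, tol=1e-2):
    # Count eigenvalues above a small fraction of the largest one.
    w = np.linalg.eigvalsh(Rmat)
    return int(np.sum(w > tol * w.max()))

# Coherent sources leave the unsmoothed signal subspace rank-1;
# smoothing restores rank 2.
print(signal_rank(R), signal_rank(Rs))
```

The reduction of the effective source correlation with the number of subarrays is exactly the quantity the paper's simplified expressions bound.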

Relevância: 10.00%

Resumo:

To find the approximate stability limit on the forward gain in control systems with small time delay, this note suggests approximating the exponential in the characteristic equation by the first few terms of its series and using the Routh–Hurwitz criterion. This approximation avoids all the time-consuming graphical work and gives a somewhat pessimistic maximum bound for the gain constant.
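A worked instance of the suggested procedure (my own illustrative example, not one from the note): for the loop characteristic equation s + K e^{-sT} = 0, replace the exponential by the first three series terms, 1 - sT + s^2 T^2/2, and apply the Routh-Hurwitz conditions to the resulting quadratic.

```python
import sympy as sp

s, K, T = sp.symbols('s K T', positive=True)

# Characteristic equation s + K*exp(-s*T) = 0, with the exponential
# replaced by the first three terms of its series.
char = sp.Poly(sp.expand(s + K * (1 - s*T + s**2 * T**2 / 2)), s)
a2, a1, a0 = char.all_coeffs()

# Routh-Hurwitz for a quadratic: stability iff all coefficients are positive.
# a2 = K*T**2/2 > 0 and a0 = K > 0 hold automatically for K, T > 0, so the
# binding condition is a1 = 1 - K*T > 0, i.e. K < 1/T.
gain_bound = sp.solve(sp.Eq(a1, 0), K)[0]
print(gain_bound)          # 1/T

# The exact stability limit for this delay system is K = pi/(2*T); the
# series-based bound 1/T is smaller, i.e. somewhat pessimistic, as stated.
Tval = 0.5
assert float(gain_bound.subs(T, Tval)) < float(sp.pi / (2 * Tval))
```

No graphical (Nyquist-type) construction is needed, at the cost of a conservative gain limit.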

Relevância: 10.00%

Resumo:

Most existing WCET estimation methods estimate execution time, ET, directly in cycles. We propose to study ET as a product of two factors, ET = IC * CPI, where IC is the instruction count and CPI is cycles per instruction. Estimating ET directly may lead to a highly pessimistic estimate, since such methods may implicitly combine the worst-case IC with the worst-case CPI. We hypothesize that there exists a functional relationship between CPI and IC, CPI = f(IC). This is ascertained by computing the covariance matrix and studying scatter plots of CPI versus IC. IC and CPI values are obtained by running benchmarks with a large number of inputs on the cycle-accurate architectural simulator SimpleScalar, on two different architectures. It is shown that the benchmarks can be grouped into classes based on the CPI-versus-IC relationship. For some benchmarks, such as FFT and FIR, both IC and CPI are almost constant irrespective of the input. Other benchmarks exhibit a direct or an inverse relationship between CPI and IC; in such cases, one can predict CPI for a given IC as CPI = f(IC). We derive the theoretical worst-case IC for a program, denoted SWIC, using integer linear programming (ILP) and estimate WCET as SWIC * f(SWIC). However, if CPI decreases sharply with IC, the measured maximum cycle count is observed to be a better estimate. For certain other benchmarks, the CPI-versus-IC relationship is either random or CPI remains constant with varying IC; in such cases, WCET is estimated as the product of SWIC and the measured maximum CPI. Use of the proposed method results in tighter WCET estimates than Chronos, a static WCET analyzer, for most benchmarks on the two architectures considered in this paper.
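The recipe (fit CPI = f(IC) from profiled runs, bound IC by ILP, then take SWIC * f(SWIC)) can be sketched as follows; the profile data and the SWIC value are made up for illustration, and the ILP step is assumed to have been done elsewhere.

```python
import numpy as np

# Hypothetical profile: (instruction count, cycles-per-instruction) pairs
# from running one benchmark over many inputs.
ic  = np.array([1000.0, 1200.0, 1400.0])
cpi = np.array([2.0, 1.9, 1.8])       # inverse relationship with IC

# Fit the hypothesised functional relationship CPI = f(IC), here linear.
slope, intercept = np.polyfit(ic, cpi, 1)
f = lambda n: slope * n + intercept

# SWIC: theoretical worst-case instruction count, in practice obtained from
# an ILP over the program's control-flow graph; assumed here.
swic = 1600
wcet = swic * f(swic)
print(round(wcet))   # 2720
```

Multiplying SWIC by f(SWIC) rather than by the worst observed CPI is what avoids pairing two independent worst cases.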

Relevância: 10.00%

Resumo:

Estimating program worst-case execution time (WCET) accurately and efficiently is a challenging task. Several programs exhibit phase behavior, wherein cycles per instruction (CPI) varies in phases during execution. Recent work has suggested using the phases of such programs to estimate WCET with minimal instrumentation; however, the suggested model uses a function of mean CPI that carries no probabilistic guarantee. We propose to use Chebyshev's inequality, which applies to any arbitrary distribution of CPI samples, to probabilistically bound the CPI of a phase. Applying Chebyshev's inequality to phases that exhibit high CPI variation leads to pessimistic upper bounds. We propose a mechanism that refines such phases into sub-phases based on program counter (PC) signatures collected through profiling, and that also allows the user to control the variance of CPI within a sub-phase. We describe a WCET analyzer built along these lines and evaluate it with standard WCET and embedded benchmark suites on two different architectures for three chosen probabilities, p = {0.9, 0.95, 0.99}. For p = 0.99, refinement based on PC signatures alone reduces the average pessimism of the WCET estimate by 36% (77%) on Arch1 (Arch2). Compared to Chronos, an open-source static WCET analyzer, the average improvement in estimates obtained by refinement is 5% (125%) on Arch1 (Arch2). On limiting the variance of CPI within a sub-phase to {50%, 10%, 5%, 1%} of its original value, the average accuracy of the WCET estimate improves further to {9%, 11%, 12%, 13%}, respectively, on Arch1. On Arch2, average accuracy improves to 159% when CPI variance is limited to 50% of its original value, and the improvement is marginal beyond that point.
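The probabilistic bound described above can be sketched in a few lines; the CPI samples are invented. By Chebyshev's inequality, P(|X - mu| >= k*sigma) <= 1/k^2 for any distribution, so taking k = 1/sqrt(1 - p) gives an upper bound mu + k*sigma that holds with probability at least p.

```python
import math
import statistics

def chebyshev_cpi_bound(samples, p):
    """Upper bound on CPI holding with probability >= p (distribution-free)."""
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    k = 1.0 / math.sqrt(1.0 - p)      # from 1/k**2 <= 1 - p
    return mu + k * sigma

# Hypothetical CPI samples for one phase.
cpi_samples = [1.0, 1.2, 1.1, 0.9, 0.8]
for p in (0.9, 0.95, 0.99):
    print(p, round(chebyshev_cpi_bound(cpi_samples, p), 3))

# High CPI variance inflates sigma and hence the bound: the pessimism
# that motivates refining a phase into lower-variance sub-phases.
```

Since the bound scales with sigma, limiting CPI variance within a sub-phase tightens the WCET estimate directly.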

Relevância: 10.00%

Resumo:

For the present list, references on the freshwater fishes of Argentina were compiled for the period between the middle of the eighteenth century and the end of 2005. The list includes the bibliographic lists published from 1981 to 2004, and references not mentioned therein. The ISSN or ISBN numbers were included, as well as the official abbreviations of the periodicals and their places of origin. In some cases these data, as quoted in catalogs, do not agree with those on the home page of the publication. A bibliography may be very rich and still be incomplete. It requires of its readers some historical interest, and indeed a deep interest in its subject. Faced with a bibliography, many researchers would prefer not to find certain references, and in fact some are conveniently forgotten. Whether for not knowing how to cite them, or out of disregard, such lists are rarely cited in papers, although some subjects would be genuinely hard to grasp if bibliographies did not exist. Even at the outset it is difficult to fix a criterion of inclusion. For example, much of the Argentine ichthyofauna also occurs in Brazil: does this justify including grey reports on fishing gear in a distant basin? Should the classics, which everybody knows and a beginner will find without difficulty, be included? Even a group dedicated full time to this work would find it difficult to verify the precision of the old citations, whose dates and authorship change according to historical research. In a more or less general bibliography, perfection works against publication. Nevertheless, we think it worthwhile to prepare such lists.
A look at this volume shows the enormous amount of work on many subjects, and confirms a rule one of us has long maintained: there is always more published on any subject than one suspects. The suspicion that, simply by looking at what has already been done, many grants could be put to something more useful than repeated evaluations of resources or biodiversity is a bit pessimistic, and we do not wish to cost anyone work by insisting on it. Each generation chooses its goals, its own epistemological base, its preferred papers and those it rejects. Even in lost or poor-quality papers, valuable data can be found. No project, however well designed, can show in the present the organisms that lived in the past in a place where conditions have changed, or it will do so in the terms of another discipline. In applied subjects, information from the past can be extremely important. Even in a discipline as conservative as nomenclature, changes can be exasperating; they are no less so in those that, like ecology, intrinsically study change itself. To give a more precise idea of the development of ichthyology in Argentina, this list could be accompanied by a critical appraisal; we understand that such a task requires a different work, of some magnitude and with no few historical elements. Despite its deficiencies, Argentine ichthyology, the result of collaboration between local and foreign workers, constitutes a body of knowledge of considerable quality and pertinence for the natural history of South America. We leave each reader to make his or her own evaluation. (Original bilingual, Spanish and English; the PDF has one hundred and seventy pages.)

Relevância: 10.00%

Resumo:

The exponential growth of health expenditures demands economic studies to support the decisions of public and private agents on the incorporation of new technologies into health systems. Positron emission tomography (PET) is a high-cost nuclear medicine imaging technology whose diffusion in Brazil is still recent. The level of scientific evidence accumulated for its use in non-small cell lung cancer (NSCLC) is significant, with the technology showing higher accuracy than conventional imaging techniques in mediastinal and distant staging. An economic evaluation carried out in 2013 indicates its cost-effectiveness for NSCLC staging, compared with the current management strategy based on computed tomography (CT), from the perspective of SUS, the Brazilian public health system. The Ministry of Health (MS) added the technology to the list of procedures available through SUS in April 2014, but the economic and financial impacts of this decision remain unknown. This study sought to estimate the budget impact (BI) of incorporating PET for NSCLC staging for the years 2014 to 2018, from the perspective of SUS as the payer for health care. The estimates were calculated by the epidemiological method, based on the decision model of the cost-effectiveness study previously carried out. National incidence data were used; disease distribution and technology accuracy were drawn from the literature, and costs from a microcosting study and the SUS databases. Two strategies for use of the new technology were analyzed: (a) offering PET-CT to all patients; and (b) offering it only to those with negative results on a prior CT. In addition, univariate and extreme-scenario sensitivity analyses were carried out to assess the influence on the results of possible sources of uncertainty in the parameters used.
The incorporation of PET-CT into SUS would require additional resources of R$ 158.1 million (restricted offer) to R$ 202.7 million (comprehensive offer) over five years, with a difference between the two offer strategies of R$ 44.6 million over the period. In absolute terms, the total BI over the period would be R$ 555 million (PET-CT after negative CT) or R$ 600 million (PET-CT for all). The cost of the PET-CT procedure was the parameter with the greatest influence on the estimated expenditures related to the new technology, followed by the proportion of patients undergoing mediastinoscopy. In the most optimistic extreme scenario, the incremental BI would fall to R$ 86.9 million (PET-CT after negative CT) and R$ 103.9 million (PET-CT for all), while in the most pessimistic it would rise to R$ 194.0 million and R$ 242.2 million, respectively. Results on BI, together with the cost-effectiveness evidence for the technology, lend greater rationality to managers' final decisions. The incorporation of PET into the clinical staging of NSCLC appears financially feasible given the magnitude of the MS budget, and the potential reduction in the number of unnecessary surgeries may lead to more efficient allocation of the available resources and better outcomes for patients, with better-indicated therapeutic strategies.
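As a sketch of the epidemiological budget-impact arithmetic (structure only: the eligible-population, uptake, and unit-cost figures below are hypothetical placeholders, not the study's parameters; only the two incremental totals at the end are taken from the abstract):

```python
# Epidemiological budget-impact method: BI = sum over years of
# (eligible cases) x (uptake) x (unit procedure cost).
cases_per_year = [20_000, 21_000, 22_000, 23_000, 24_000]  # hypothetical
uptake = [0.3, 0.5, 0.7, 0.8, 0.9]                          # hypothetical diffusion
unit_cost = 3_000.0                                          # hypothetical R$/scan

bi_per_year = [n * u * unit_cost for n, u in zip(cases_per_year, uptake)]
total_bi = sum(bi_per_year)
print(f"R$ {total_bi / 1e6:.1f} million over five years")

# Reported five-year incremental impacts (R$ millions): the difference
# between the two offer strategies.
comprehensive, restricted = 202.7, 158.1
print(round(comprehensive - restricted, 1))   # 44.6
```

The same structure underlies both offer strategies; only the eligible population differs (all staged patients versus those with a negative prior CT).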

Relevância: 10.00%

Resumo:

Over the last 40 years the Center-West region of Brazil has become a major producer of grain and beef. Soil and climate conditions, soil management practices, and non-compliance with environmental laws have had drastic consequences for the region, notably the worsening of water erosion, especially in the Alto Taquari Basin (BAT). About 90% of the BAT lies in the northern portion of the state of Mato Grosso do Sul (MS), but the effects of sediment transport and water volume are felt downstream, in the Pantanal Basin. Using assumptions from the Intergovernmental Panel on Climate Change (IPCC), climate change scenarios were established for the Alto Taquari Basin in order to identify the areas most vulnerable to erosion under land-use pressure. Using dynamic modeling in the TerraME modeling environment, topo-pluvial scenarios were generated up to 2100, considering an increase in mean annual air temperature of 1 °C in the optimistic scenario and of 3 °C in the pessimistic one. For mean annual rainfall, one scenario assumed a 15% increase and another a 15% reduction. The data were spatialized in ArcGis 9.2 and exported to TerraView 3.2; cellular spaces were created and integrated with the Shuttle Radar Topography Mission (SRTM) digital terrain model to generate the topoclimatic maps and scenario simulations in TerraME. The results show that under current conditions mean temperatures range from 23.6 to 25.7 °C over 85% of the BAT area. The thermal simulations for the optimistic scenario indicate that within 40 years temperatures will tend to exceed the upper mean thermal limit in the areas along the Taquari river, in the west-to-east direction. These values point to higher evapotranspiration rates in riparian forests, indicating reductions in the discharge of the Taquari. In the pessimistic scenario these temperatures occur earlier, within 20 years.
The scenarios with a 15% increase in rainfall show larger volumes of precipitated water in the northern part of the basin, the region most vulnerable to water erosion. Scenarios of the thermal-hydric regime point to the areas most sensitive to climate change in the western part of the BAT, and to environmental impacts in the Pantanal Basin as well. It is concluded that TerraME is suitable for generating climate change scenarios for river basins.

Relevância: 10.00%

Resumo:

Intelligent science and intelligent technology are currently a hot topic in the scientific community, and there is much controversy around them. Some scholars doubt the prospects of intelligent technology: comparing existing "intelligent systems" with human beings, they deny that any intelligence is present at all, regarding such systems as nothing more than special-purpose software. Others are overly optimistic. We hold that both excessive pessimism and excessive optimism are mistaken. First, intelligent science must be distinguished from intelligent technology: the former studies a natural phenomenon possessed by human beings, while the latter develops technology to replace certain kinds of human mental labor. Second, the limitations of existing intelligent-technology methods must be correctly recognized.

Relevância: 10.00%

Resumo:

The vehicle navigation problem studied in Bell (2009) is revisited and a time-dependent reverse Hyperstar algorithm is presented. This minimises the expected time of arrival at the destination, and at all intermediate nodes, where the expectation is based on a pessimistic (or risk-averse) view of unknown link delays. It may also be regarded as a hyperpath version of the Chabini and Lan (2002) algorithm, which is itself a time-dependent A* algorithm. Links are assigned undelayed travel times and maximum delays, both of which are potentially functions of the time of arrival at the respective link. The driver seeks probabilities for link use that minimise his or her maximum exposure to delay on the approach to each node, leading to the determination of the pessimistic expected time of arrival. Since the context considered is vehicle navigation, where the driver is not making repeated trips, the probability of link use may be interpreted as a measure of link attractiveness: a link with a zero probability of use is unattractive, while a link with a probability of use equal to one has no attractive alternatives. A solution algorithm is presented and proven to solve the problem provided the node potentials are feasible and a FIFO condition applies to undelayed link travel times. The paper concludes with a numerical example.
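A drastically simplified, single-path analogue of the pessimistic criterion (my own sketch, not the reverse Hyperstar algorithm, which splits use over a hyperpath of attractive links and handles time dependence): assign each link an undelayed travel time plus a maximum delay, and run Dijkstra on the worst-case link cost.

```python
import heapq

def pessimistic_eta(links, origin, dest):
    """Worst-case (maximally delayed) earliest arrival via Dijkstra.

    links: dict node -> list of (next_node, undelayed_time, max_delay).
    A risk-averse driver plans against every link realising its full delay;
    the hyperpath approach instead hedges across several attractive links.
    """
    best = {origin: 0.0}
    heap = [(0.0, origin)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == dest:
            return t
        if t > best.get(node, float('inf')):
            continue
        for nxt, travel, delay in links.get(node, []):
            cand = t + travel + delay          # pessimistic link cost
            if cand < best.get(nxt, float('inf')):
                best[nxt] = cand
                heapq.heappush(heap, (cand, nxt))
    return float('inf')

# Toy network: the short route via B carries a large possible delay,
# so the risk-averse choice is the longer but reliable route via C.
links = {
    'A': [('B', 2.0, 3.0), ('C', 4.0, 0.0)],
    'B': [('D', 2.0, 0.0)],
    'C': [('D', 1.0, 0.0)],
}
print(pessimistic_eta(links, 'A', 'D'))   # 5.0
```

In the full algorithm the delayed route via B would still receive a nonzero probability of use whenever keeping it as an option reduces the maximum exposure to delay.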

Relevância: 10.00%

Resumo:

This dissertation is an exercise in practical theology, which investigates and responds to the problem of changing holiness identity in the Church of the Nazarene. The first part of the study is an empirical investigation into the social context of contemporary Nazarene holiness identity and practices among Nazarenes in three congregations located in the Northeast United States. Previous research relied too heavily on secularization and sect-church theory to understand the dynamics of religious identity change among Nazarenes. The theological result was a pessimistic appraisal of the future possibilities of holiness identity and practice in the Church of the Nazarene. This study employs an alternative theory—Nancy T. Ammerman's theory of narrative religious identity—to understand the dynamics of lived religious life within these congregations and to identify the various holiness narratives at play. Ammerman's theory facilitates an empirical description of the multiple holiness identities emerging out of the social contexts of these Nazarene congregations and offers a way to account for identity change. At the heart of this research is the theoretical notion that a particular religious identity, in the case of the Church of the Nazarene, the "sanctified person," emerges out of a particular ecclesial context characterized by religious narratives and practices that shape this identity. Chapter one reviews the problem of holiness identity in the Church of the Nazarene and offers an analysis of recent sociological attempts to understand the changing identity among Nazarenes. Chapter two draws on sociological research to describe and depict the range of views of holiness held by some contemporary Nazarenes. Chapter three identifies the varieties of holiness identity within the three Nazarene congregations that are part of the study. Chapter four investigates the social sources that shape the various holiness identities discovered in these congregations. 
Chapter five is a description of the many ways religious narratives are enacted and engaged within these congregations. The second part of the study is a theological critique of contemporary Nazarene holiness identity. Chapter six draws on the theory of narrative identity proposed by Nancy Ammerman and outlines a theoretical model which describes the social conditions necessary to shape holiness identity, "the sanctified person," within the context of the local congregation. Finally, chapter seven draws on the theological resources of Mennonite scholar and historian John Howard Yoder to propose a way of construing and facilitating holiness identity formation that takes the ecclesiality of holiness more seriously, emphasizes a clearer relationship between Jesus and the "Christlikeness" that is central to holiness, and highlights the importance of religious practices in the formation of a holiness identity.

Relevância: 10.00%

Resumo:

In this paper, we propose a new class of Concurrency Control Algorithms that is especially suited for real-time database applications. Our approach relies on the use of (potentially) redundant computations to ensure that serializable schedules are found and executed as early as possible, thus, increasing the chances of a timely commitment of transactions with strict timing constraints. Due to its nature, we term our concurrency control algorithms Speculative. The aforementioned description encompasses many algorithms that we call collectively Speculative Concurrency Control (SCC) algorithms. SCC algorithms combine the advantages of both Pessimistic and Optimistic Concurrency Control (PCC and OCC) algorithms, while avoiding their disadvantages. On the one hand, SCC resembles PCC in that conflicts are detected as early as possible, thus making alternative schedules available in a timely fashion in case they are needed. On the other hand, SCC resembles OCC in that it allows conflicting transactions to proceed concurrently, thus avoiding unnecessary delays that may jeopardize their timely commitment.

Relevância: 10.00%

Resumo:

Various concurrency control algorithms differ in the time when conflicts are detected, and in the way they are resolved. In that respect, the Pessimistic and Optimistic Concurrency Control (PCC and OCC) alternatives represent two extremes. PCC locking protocols detect conflicts as soon as they occur and resolve them using blocking. OCC protocols detect conflicts at transaction commit time and resolve them using rollbacks (restarts). For real-time databases, blockages and rollbacks are hazards that increase the likelihood of transactions missing their deadlines. We propose a Speculative Concurrency Control (SCC) technique that minimizes the impact of blockages and rollbacks. SCC relies on the use of added system resources to speculate on potential serialization orders and to ensure that if such serialization orders materialize, the hazards of blockages and rollbacks are minimized. We present a number of SCC-based algorithms that differ in the level of speculation they introduce, and the amount of system resources (mainly memory) they require. We show the performance gains (in terms of number of satisfied timing constraints) to be expected when a representative SCC algorithm (SCC-2S) is adopted.
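A toy, sequential illustration of the idea (my own sketch; the SCC algorithms of the paper, such as SCC-2S, are considerably more involved): conflicts are detected eagerly, PCC-style, but instead of blocking, a shadow computation is forked for the conflicting transaction, so that if OCC-style commit-time validation fails, a serializable alternative is ready without a restart.

```python
# Two transactions interleave over shared objects. T2 reads 'x' after T1 has
# written it, creating a potential conflict. Pure OCC would discover this only
# when T2 validates at commit; the speculative scheduler forks a shadow for T2
# at the moment the conflict appears.

def run_speculative(ops):
    """ops: list of (txn, action, obj); returns (shadowed txns, restarts)."""
    writes, reads = {}, {}   # txn -> set of objects written / read
    shadows = set()          # txns with a shadow execution standing by
    restarts = 0
    for txn, action, obj in ops:
        if action == 'w':
            writes.setdefault(txn, set()).add(obj)
        else:
            reads.setdefault(txn, set()).add(obj)
            # Conflict detected eagerly (PCC-style), but instead of blocking,
            # fork a shadow and let the primary continue (OCC-style).
            if any(obj in w for t, w in writes.items() if t != txn):
                shadows.add(txn)
    # Commit-time validation: a txn that read another's write must serialize
    # after it; with a shadow ready this costs no restart, otherwise a restart.
    for txn in reads:
        dirty = any(o in w for t, w in writes.items() if t != txn
                    for o in reads[txn])
        if dirty and txn not in shadows:
            restarts += 1
    return shadows, restarts

shadows, restarts = run_speculative(
    [('T1', 'w', 'x'), ('T2', 'r', 'x'), ('T2', 'w', 'y'), ('T1', 'r', 'z')])
print(shadows, restarts)   # T2 received a shadow, so no restart is needed
```

The memory cost of keeping shadows alive is exactly the resource/speculation trade-off the SCC family of algorithms parameterizes.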

Relevância: 10.00%

Resumo:

While there are many reasons to continue to smoke in spite of its consequences for health, the concern that many smoke because they misperceive the risks of smoking remains a focus of public discussion and motivates tobacco control policies and litigation. In this paper we investigate the relative accuracy of mature smokers' risk perceptions about future survival, and a range of morbidities and disabilities. Using data from the survey on smoking (SOS) conducted for this research, we compare subjective beliefs elicited from the SOS with corresponding individual-specific objective probabilities estimated from the health and retirement study. Overall, consumers in the age group studied, 50-70, are not overly optimistic in their perceptions of health risk. If anything, smokers tend to be relatively pessimistic about these risks. The finding that smokers are either well informed or pessimistic regarding a broad range of health risks suggests that these beliefs are not pivotal in the decision to continue smoking. Although statements by the tobacco companies may have been misleading and thus encouraged some to start smoking, we find no evidence that systematic misinformation about the health consequences of smoking inhibits quitting.

Relevância: 10.00%

Resumo:

Murawski and colleagues state that our assessment of the impacts of global marine biodiversity loss is overly pessimistic. They imply that management interventions are likely to reverse current trends of overfishing, and that the U.S. National Marine Fisheries Service (NMFS) has already met that goal. They cite Georges Bank haddock as an example and contest that catch metrics (as used in our global analysis) are sufficient to track the status of this particular fish stock and possibly others. We agree that precise biomass data are preferable, but these are rarely available. Here, we illustrate that catches are a good proxy of the status of haddock, although there can be a short delay in detecting recovery under intense management. While NMFS's own data show that full recovery is still uncommon (<5% of overfished stocks) (1), we strongly agree that destructive trends can be turned around and that rebuilding efforts need to be intensified to meet that goal. But we must not miss the forest for the trees: Continuing focus on single, well-assessed, economically viable species will leave most of the ocean's declining biodiversity under the radar.