889 results for "Almost always propositional logic"
Abstract:
FOTO-ASSEMBLAGE is the name I propose for the works I have been producing by joining digital photographs. The elaboration and grounding of these works also form the core of the research that resulted in this dissertation. In principle, the term foto-assemblage would refer to technical or formal aspects of this practice. As the research developed, however, certain procedures determined nuances that revealed common aspects in the works' contents as well. As a result of artistic constructions joining photographs since 2009, I arrived at the synthetic compositions presented here, each built from two photographs. I put forward the name foto-assemblage after noticing in the resulting images traits that distinguish them from certain conventions attributed to the idea of photography. At the same time, these images suggest a possible extension of the understanding of assemblage as an artistic technique. Although it is not a rule, photographs reveal images of moments. In their relation to the human understanding of time or space, photographs almost always contain minimal instances. Photographs can, however, also be understood as the contraction of a span of time. Every photographic image can be taken as the outcome of certain prior events and even regarded as a generator of future consequences. Following this understanding, what I propose with foto-assemblage is that it deal with segments of time or space contained within a single image. The source photographs take on a new role: removed from their original context, they serve as markers of the span of time or space suppressed and subjectivized between them. Poetically, events that occurred between the source photographs would be contained in the images produced. The term assemblage was incorporated into the arts from 1953 onward, by Jean Dubuffet, to describe works that would be something more than simple collage. The idea of assemblage rests on the principle that any material or object gathered from our everyday world can be incorporated into a work of art, creating a new whole, without losing its original meaning. The object is torn from its habitual use and inserted into a new context, weaving relationships with the other elements and building narratives in a new environment, that of the work. In the idea of foto-assemblage, however, the source photographic images are used not as objects located in an everyday world, but as images in the sense of mental entities. I adopt a quasi-magical view in which the basic source images exist in another dimension, on a two-dimensional plane, beyond the manipulation of us inhabitants of three-dimensionality. In this environment, imaginary or not, the photographs are set in place, consolidating the foto-assemblage. When the foto-assemblage becomes concrete, embodied in a medium and printed for contemplation, only then does it come to join our three-dimensional world. The result could be taken as a hybrid, a third thing, arising from two that can no longer be dissociated in the pursuit of aesthetic understanding. At the end of the dissertation, I present practical experiments that resulted in four series of foto-assemblage images. Each series emphasizes particular aspects of what I call expanded landscape, representing spans of time, of space, or paths between the concrete world and worlds of the unconscious.
Abstract:
We provide a comprehensive overview of many recent algorithms for approximate inference in Gaussian process models for probabilistic binary classification. The relationships between several approaches are elucidated theoretically, and the properties of the different algorithms are corroborated by experimental results. We examine both 1) the quality of the predictive distributions and 2) the suitability of the different marginal likelihood approximations for model selection (selecting hyperparameters) and compare to a gold standard based on MCMC. Interestingly, some methods produce good predictive distributions although their marginal likelihood approximations are poor. Strong conclusions are drawn about the methods: The Expectation Propagation algorithm is almost always the method of choice unless the computational budget is very tight. We also extend existing methods in various ways, and provide unifying code implementing all approaches.
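For orientation, a minimal sketch of approximate GP binary classification using scikit-learn, whose GaussianProcessClassifier implements the Laplace approximation (one of the families compared in the survey; the Expectation Propagation method recommended above is not part of scikit-learn and would need another library):

```python
# A minimal sketch of approximate GP binary classification, assuming
# scikit-learn is available. GaussianProcessClassifier uses the Laplace
# approximation to the posterior.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))        # toy 1-D inputs
y = (np.sin(X[:, 0]) > 0).astype(int)        # toy binary labels

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gpc.fit(X, y)

# Predictive distribution: class probabilities at test points.
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(gpc.predict_proba(X_test))

# Approximate log marginal likelihood, the quantity the abstract says
# is used for model selection (hyperparameter tuning).
print(gpc.log_marginal_likelihood_value_)
```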
Abstract:
The production of H2S (hydrogen sulfide) is one of the main problems in the oil industry and is among the causes of corrosion in storage tanks and pipelines. It is made possible by the injection of seawater during secondary oil recovery: microorganisms present in this water, such as total heterotrophic anaerobic bacteria (BANHT) and sulfate-reducing bacteria (BRS), promote the reduction of sulfate to sulfide. Currently, these microorganisms are quantified by the Most Probable Number (NMP) technique, which yields an estimate in approximately 28 days. In the present work, a semi-continuous biogenic sulfide production methodology was applied over a period of 15 days, in order to correlate its results with the BANHT and BRS counts obtained through the conventional NMP technique. Samples from different sources within the oil industry were analyzed, with salinities ranging from 0 to 16 g.L-1. The aim of this procedure was to evaluate the specific and instantaneous rates of H2S production, thereby indicating which samples have the greatest potential for biogenic sulfide production and under which conditions that production occurs. In all samples, H2S generation was observed to increase until production stabilized, which almost always occurred within six days (144 h) of microbial growth. Biogenic sulfide production was most intense in the samples from oil storage tank bottoms and from formation water. BANHT and BRS counts were evaluated by the NMP method according to Harrigan's table, which underestimates the microbial population by disregarding errors inherent to the technique.
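As a rough illustration of how instantaneous and specific production rates can be extracted from a sulfide time series (the numbers below are made up, and this is one common numerical approach rather than the dissertation's exact procedure):

```python
# Estimating instantaneous and specific H2S production rates from a
# hypothetical sulfide time series; illustrative numbers only.
import numpy as np

t = np.array([0, 24, 48, 72, 96, 120, 144, 168])        # hours
s = np.array([0.0, 0.4, 1.5, 3.8, 6.0, 7.1, 7.5, 7.5])  # sulfide, mg/L

inst_rate = np.gradient(s, t)        # dS/dt: instantaneous rate, mg/(L*h)
spec_rate = inst_rate[1:] / s[1:]    # (1/S) dS/dt: specific rate, 1/h

print(np.round(inst_rate, 3))        # note the plateau around 144 h
print(np.round(spec_rate, 3))
```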
Abstract:
Two bycatch reduction devices (BRDs)—the extended mesh funnel (EMF) and the Florida fisheye (FFE)—were evaluated in otter trawls with net mouth circumferences of 14 m, 17 m, and 20 m and total net areas of 45 m². Each test net was towed 20 times in parallel with a control net that had the same dimensions and configuration but no BRD. Both BRDs were tested at night during fall 1996 and winter 1997 in Tampa Bay, Florida. Usually, the bycatch was composed principally of finfish (44 species were captured); horseshoe crabs and blue crabs seasonally predominated in some trawls. Ten finfish species composed 92% of the total finfish catch; commercially or recreationally valuable species accounted for 7% of the catch. Mean finfish size in the BRD-equipped nets was usually slightly smaller than that in the control nets. Compared with the corresponding control nets, both biomass and number of finfish were almost always lower in the BRD-equipped nets, but neither shrimp number nor biomass was significantly reduced. The differences in proportions of both shrimp and finfish catch between the BRD-equipped and control nets varied between seasons and among net sizes, and differences in finfish catch were specific for each BRD type and season. In winter, shrimp catch was highest and the size range of shrimp was greater than in fall. Season-specific differences in shrimp catch among the BRD types occurred only in the 14-m EMF nets. Finfish bycatch species composition was also highly seasonal; each species was captured mainly during only one season. However, regardless of the finfish composition, the shrimp catch was relatively constant. In part as a result of this study, the State of Florida now requires the use of BRDs in state waters.
Abstract:
The production of colour by homogenised fish material in a simplified sugar medium containing an acid indicator has been used for the rapid approximation of bacterial load in such products. The medium thus developed contains peptone, tryptone, yeast extract, sodium chloride, and beef extract besides dextrose. The time of colour production is influenced to some extent by the level of sodium chloride in the medium and is almost always inversely proportional to the bacterial load in the homogenate.
Abstract:
Since the inception of the tuna long line fishery in the Indian Ocean in 1952, an annual average of 10% of the number of tunas and spearfishes caught continues to be damaged by sharks. In spite of the fact that this method of fishing for tunas is also resulting in the exploitation of a significant quantity of the tuna-preying sharks, the extent of the damage by these predators continues to be fairly constant. Quite often the damaged tunas are acceptable to the market, especially for canning. On the other hand, reports of damage caused by killer-whales, occasional at the beginning of the fishery in the Indian Ocean, have been increasing in frequency each year, and since 1960 tuna fishermen have been desperately calling for ways and means of reducing the damage caused by these mammals. Unlike sharks, killer-whales do not get hooked on the tuna long line; and tunas damaged by killer-whales are almost always unfit even for canning. The problem of predation by killer-whales exists not only in the whole of the Indian Ocean including the Timor and Banda Seas but also in the Atlantic and Pacific Oceans, especially in the seas around New Guinea, Samoa, and the Caroline and Marshall Islands. The seriousness of this problem of predation was highlighted at the annual tuna research conference held in Kochi, Japan, in February 1963, and steps were taken to devote considerable attention to this problem.
Abstract:
Diversity indices can be used as a good measure for studying the effect of industrial pollution because industrial wastes and sewage almost always reduce the diversity of the natural systems into which they are discharged. A measurement of diversity often provides a better index of pollution than a direct measurement of pollutants. The assessment of macrobenthos diversity with respect to diversity indices reflects the marine population and habitat disturbance, and also serves as an important indicator of environmental conditions. The present study was designed to investigate the diversity indices of selected macrobenthos at two ecologically distinct locations on the Karanja creek (District - Raigad), Maharashtra, west coast of India. Results on various diversity indices, such as the Index of Frequency (F) or Importance Probability (Pi), Index of Dominance (c), Rarity Index (R), Shannon's Index of General Diversity (H′), Margalef's Richness Index (R1), and Evenness Index (e), did not vary significantly. This demonstrates that at present, Karanja creek harbours varied forms of macrobenthic community showing no effect of human disturbance, but in future, measures must be taken for the protection and conservation of the macrobenthic community of the creek.
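For reference, a minimal sketch of how several of the named indices are computed from raw counts per taxon (the counts below are illustrative only):

```python
# Diversity indices from hypothetical individuals-per-taxon counts.
import math

counts = [55, 30, 10, 4, 1]                # hypothetical counts per taxon
N = sum(counts)                            # total individuals
S = len(counts)                            # number of taxa
p = [n / N for n in counts]                # importance probability Pi

shannon_H = -sum(pi * math.log(pi) for pi in p)   # Shannon's H'
dominance_c = sum(pi ** 2 for pi in p)            # index of dominance c
evenness_e = shannon_H / math.log(S)              # evenness e
margalef_R1 = (S - 1) / math.log(N)               # Margalef's richness R1

print(shannon_H, dominance_c, evenness_e, margalef_R1)
```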
Abstract:
We investigate the performance of different variants of a suitably tailored Tabu Search optimisation algorithm on a higher-order design problem. We consider four objective functions to describe the performance of a compressor stator row, subject to a number of equality and inequality constraints. The same design problem has been previously investigated through single-, bi- and three-objective optimisation studies. However, in this study we explore the capabilities of enhanced variants of our Multi-objective Tabu Search (MOTS) optimisation algorithm in the context of detailed 3D aerodynamic shape design. It is shown that with these enhancements to the local search of the MOTS algorithm we can achieve a rapid exploration of complicated design spaces, but there is a trade-off between speed and the quality of the trade-off surface found. Rapidly explored design spaces reveal the extremes of the objective functions, but the compromise optimum areas are not very well explored. However, there are ways to adapt the behaviour of the optimiser and maintain both a very efficient rate of progress towards the global optimum Pareto front and a healthy number of design configurations lying on the trade-off surface and exploring the compromise optimum regions. These compromise solutions almost always represent the best qualitative balance between the objectives under consideration. Such enhancements to the effectiveness of design space exploration make engineering design optimisation with multiple objectives and robustness criteria ever more practicable and attractive for modern advanced engineering design. Finally, new research questions are addressed that highlight the trade-offs between intelligence in optimisation algorithms and acquisition of qualitative information through computational engineering design processes that reveal patterns and relations between design parameters and objective functions, but also speed versus optimum quality. © 2012 AIAA.
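As a rough sketch of the algorithmic skeleton, the following is a generic multi-objective tabu search with a Pareto archive, in the spirit of (but not identical to) the MOTS algorithm discussed above; the objectives, move generator, and tabu tenure are all placeholders:

```python
# A generic multi-objective tabu search sketch with a Pareto archive.
# Everything here is illustrative, not the authors' MOTS implementation.
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def objectives(x):
    # Two toy objectives standing in for the compressor-row metrics.
    return (sum(v * v for v in x), sum((v - 2.0) ** 2 for v in x))

def neighbours(x, step=0.25):
    # Local search move: perturb one design variable at a time.
    for i in range(len(x)):
        for d in (-step, step):
            y = list(x)
            y[i] += d
            yield tuple(y)

def mots_like(x0, iterations=200, tenure=10):
    archive, tabu = [], []
    x = tuple(x0)
    for _ in range(iterations):
        candidates = [y for y in neighbours(x) if y not in tabu]
        if not candidates:
            break
        nondom = [c for c in candidates
                  if not any(dominates(objectives(o), objectives(c)) for o in candidates)]
        x = random.choice(nondom or candidates)   # move to a non-dominated neighbour
        tabu.append(x)
        tabu[:] = tabu[-tenure:]                  # fixed-length tabu list
        fx = objectives(x)
        if not any(dominates(objectives(a), fx) for a in archive):
            archive = [a for a in archive if not dominates(fx, objectives(a))]
            archive.append(x)                     # update the Pareto archive
    return archive

front = mots_like([3.0, -1.0])
print(len(front), "non-dominated designs found")
```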
Abstract:
Finding countermodels is an effective way of disproving false conjectures. In first-order predicate logic, model finding is an undecidable problem, but if a finite model exists, it can be found by exhaustive search. The finite model generation problem in first-order logic can also be translated to the satisfiability problem in propositional logic, but a direct translation may not be very efficient. This paper discusses how to take symmetries into account so as to make the resulting problem easier. A static method for adding constraints is presented, which can be thought of as an approximation of the least number heuristic (LNH). Also described is a dynamic method, which asks a model searcher like SEM to generate a set of partial models and then gives each partial model to a propositional prover. The two methods are analyzed and compared with each other.
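A minimal sketch of the translation idea, assuming a toy theory (a unary function f with the axiom f(f(x)) = x) and a hand-rolled satisfiability check in place of a real SAT solver; the static clauses at the end approximate the least number heuristic by restricting the first cell to the smallest domain elements:

```python
# Translating a toy finite-model problem into propositional clauses,
# with a static LNH-like symmetry constraint. Illustrative only.
from itertools import product

n = 3                                   # domain {0, ..., n-1}
var = lambda x, y: x * n + y + 1        # DIMACS variable for "f(x) = y"

clauses = []
for x in range(n):
    clauses.append([var(x, y) for y in range(n)])        # f(x) has a value
    for y in range(n):
        for z in range(y + 1, n):
            clauses.append([-var(x, y), -var(x, z)])     # ...at most one value
        clauses.append([-var(x, y), var(y, x)])          # axiom f(f(x)) = x
# Static LNH-like constraint: f(0) may only be 0 or 1, since all larger
# domain elements are interchangeable at that point in the search.
for y in range(2, n):
    clauses.append([-var(0, y)])

# Tiny brute-force check standing in for a real SAT solver.
for bits in product([False, True], repeat=n * n):
    if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
        print("model:", [y for x in range(n) for y in range(n) if bits[var(x, y) - 1]])
        break
```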
Abstract:
A uniform wall illuminated by a spot light often gives a strong impression of the illuminant color. How can it be possible to know if it is a white wall illuminated by yellow light or a yellow wall illuminated by white light? If the wall is a Lambertian reflector, it would not be possible to tell the difference. However, in the real world, some amount of specular reflection is almost always present. In this memo, it is shown that the computation is possible in most practical cases.
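A toy sketch of the point, with made-up numbers: under the dichromatic reflection model, a pixel is a body term (illuminant times albedo) plus a specular term (illuminant alone). The two scenes below have identical body colour, so a purely Lambertian wall cannot distinguish them, but the specular term reveals the illuminant:

```python
# Dichromatic toy model: pixel = m_b * (L * rho) + m_s * L.
# Scene A: white wall, yellow light. Scene B: yellow wall, white light.
import numpy as np

rng = np.random.default_rng(1)
m_b = rng.uniform(0.4, 1.0, size=5)       # Lambertian shading per pixel
m_s = rng.uniform(0.0, 0.3, size=5)       # specular amount per pixel

L_a, rho_a = np.array([1.0, 0.9, 0.5]), np.array([0.9, 0.9, 0.9])
L_b, rho_b = np.array([1.0, 1.0, 1.0]), L_a * rho_a   # same body colour

for name, L, rho in [("A", L_a, rho_a), ("B", L_b, rho_b)]:
    lambertian = np.outer(m_b, L * rho)   # identical for scenes A and B
    full = lambertian + np.outer(m_s, L)  # differs: the highlights reveal L
    print(name, np.round(full, 3))
```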
Abstract:
This paper introduces Denotational Proof Languages (DPLs). DPLs are languages for presenting, discovering, and checking formal proofs. In particular, in this paper we discuss type-alpha DPLs---a simple class of DPLs for which termination is guaranteed and proof checking can be performed in time linear in the size of the proof. Type-alpha DPLs allow for lucid proof presentation and for efficient proof checking, but not for proof search. Type-omega DPLs allow for search as well as simple presentation and checking, but termination is no longer guaranteed and proof checking may diverge. We do not study type-omega DPLs here. We start by listing some common characteristics of DPLs. We then illustrate with a particularly simple example: a toy type-alpha DPL called PAR, for deducing parities. We present the abstract syntax of PAR, followed by two different kinds of formal semantics: evaluation and denotational. We then relate the two semantics and show how proof checking becomes tantamount to evaluation. We proceed to develop the proof theory of PAR, formulating and studying certain key notions such as observational equivalence that pervade all DPLs. We then present NDL, a type-alpha DPL for classical zero-order natural deduction. Our presentation of NDL mirrors that of PAR, showing how every basic concept that was introduced in PAR resurfaces in NDL. We present sample proofs of several well-known tautologies of propositional logic that demonstrate our thesis that DPL proofs are readable, writable, and concise. Next we contrast DPLs to typed logics based on the Curry-Howard isomorphism, and discuss the distinction between pure and augmented DPLs. Finally we consider the issue of implementing DPLs, presenting an implementation of PAR in SML and one in Athena, and end with some concluding remarks.
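To convey the flavour (in Python, and with invented syntax; PAR's actual grammar and semantics are defined in the paper), here is a parity proof checker in which checking a proof term amounts to evaluating it, in time linear in the size of the proof:

```python
# A hypothetical rendering of the "proof checking = evaluation" idea
# for parities. The rule names and term shapes are made up.
E, O = "even", "odd"

RULES = {
    ("both-even", (E, E)): E,    # even + even is even
    ("both-odd", (O, O)): E,     # odd + odd is even
    ("mixed", (E, O)): O,        # even + odd is odd
    ("mixed", (O, E)): O,
}

def check(proof):
    """Evaluate a proof term to the parity judgment it establishes."""
    if isinstance(proof, str):              # axiom: a parity literal
        assert proof in (E, O)
        return proof
    rule, subproofs = proof                 # application of an inference rule
    premises = tuple(check(p) for p in subproofs)
    return RULES[(rule, premises)]          # fails if the rule is misapplied

# (even + odd) + odd is even:
print(check(("both-odd", [("mixed", [E, O]), O])))
```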
Abstract:
Liu, Yonghuai. Automatic 3D free-form shape matching using the graduated assignment algorithm. Pattern Recognition, vol. 38, no. 10, pp. 1615-1631, 2005.
Abstract:
Background: Rationing of access to antiretroviral therapy already exists in sub-Saharan Africa and will intensify as national treatment programs develop. The number of people who are medically eligible for therapy will far exceed the human, infrastructural, and financial resources available, making rationing of public treatment services inevitable. Methods: We identified 15 criteria by which antiretroviral therapy could be rationed in African countries and analyzed the resulting rationing systems across 5 domains: clinical effectiveness, implementation feasibility, cost, economic efficiency, and social equity. Findings: Rationing can be explicit or implicit. Access to treatment can be explicitly targeted to priority subpopulations such as mothers of newborns, skilled workers, students, or poor people. Explicit conditions can also be set that cause differential access, such as residence in a designated geographic area, co-payment, access to testing, or a demonstrated commitment to adhere to therapy. Implicit rationing on the basis of first-come, first-served or queuing will arise when no explicit system is enforced; implicit systems almost always allow a high degree of queue-jumping by the elite. There is a direct tradeoff between economic efficiency and social equity. Interpretation: Rationing is inevitable in most countries for some period of time. Without deliberate social policy decisions, implicit rationing systems that are neither efficient nor equitable will prevail. Governments that make deliberate choices, and then explain and defend those choices to their constituencies, are more likely to achieve a socially desirable outcome from the large investments now being made than are those that allow queuing and queue-jumping to dominate.
Abstract:
System F is a type system that can be seen as both a proof system for second-order propositional logic and as a polymorphic programming language. In this work we explore several extensions of System F by types which express subtyping constraints. These systems include terms which represent proofs of subtyping relationships between types. Given a proof that one type is a subtype of another, one may use a coercion term constructor to coerce terms from the first type to the second. The ability to manipulate type constraints as first-class entities gives these systems a lot of expressive power, including the ability to encode generalized algebraic data types and intensional type analysis. The main contributions of this work are in the formulation of constraint types and a proof of strong normalization for an extension of System F with constraint types.
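Schematically, and in notation assumed here rather than taken from the paper, the coercion construct can be rendered as an inference rule: given a proof term c witnessing that S is a subtype of T, coerce transports a term e from S to T.

```latex
% Hypothetical notation for the coercion rule sketched above.
\frac{\Gamma \vdash c : S <: T \qquad \Gamma \vdash e : S}
     {\Gamma \vdash \mathsf{coerce}\; c\; e : T}
```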
Abstract:
In three experiments, undergraduates rated autobiographical memories on scales derived from existing theories of memory. In multiple regression analyses, ratings of the degree to which subjects recollected (i.e., relived) their memories were predicted by visual imagery, auditory imagery, and emotions, whereas ratings of belief in the accuracy of their memories were predicted by knowledge of the setting. Recollection was predicted equally well in between- and within-subjects analyses, but belief consistently had smaller correlations and multiple regression predictions between subjects; individual differences in the cognitive scales that we measured could not account well for individual differences in belief. In contrast, measures of mood (Beck Depression Inventory) and dissociation (Dissociative Experiences Scale) added predictive value for belief, but not for recollection. We also found that highly relived memories almost always had strong visual images and that remember/know judgments made on autobiographical memories were more closely related to belief than to recollection.
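For orientation, a minimal sketch of the kind of multiple regression analysis reported above, run on synthetic ratings (the predictor names and coefficients are illustrative only, not the study's data):

```python
# Multiple regression of a recollection rating on imagery and emotion
# ratings, using synthetic data and statsmodels OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
visual = rng.normal(size=n)               # visual imagery rating
auditory = rng.normal(size=n)             # auditory imagery rating
emotion = rng.normal(size=n)              # emotional intensity rating
recollect = (0.5 * visual + 0.3 * auditory + 0.4 * emotion
             + rng.normal(scale=0.5, size=n))

X = sm.add_constant(np.column_stack([visual, auditory, emotion]))
model = sm.OLS(recollect, X).fit()
print(model.params)      # per-predictor slopes
print(model.rsquared)    # variance explained
```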