976 results for Texte impossible
Abstract:
Soldatova, L. N., Aubrey, W., King, R. D., Clare, A. J. (2008). The EXACT description of biomedical protocols. Bioinformatics, 24(13), i295-i303. Sponsorship: BBSRC / RAEng / EPSRC. Special issue: ISMB.
Abstract:
Official rescue loans for certain euro-area countries cannot replace the reforms there which, by improving the economic situation, will allow those countries to regain the confidence of financial markets. The literature lists two main structural weaknesses of the common currency (the euro): 1. A single monetary policy cannot account for the differing situations of individual countries, and the common currency does not allow devaluation within the euro area; 2. The euro area exists without a "political union". The article examines the validity of these claims.
Abstract:
One result of forecasts made with global and regional climate models is the finding of a high probability of an increase in the frequency and intensity of extreme precipitation. Understanding the regularities of its recurrence and spatial extent is obviously of great economic and social importance. Therefore, apart from introducing new measurement techniques, archival data should be analysed and reinterpreted, taking advantage of the opportunities created by the development of GIS. The main aim of the study is to analyse the regularities of the spatial and temporal variability of monthly and annual maximum daily precipitation totals (MSDO) for the years 1956-1980 over the area of Poland. The work employs geostatistical methods new to Polish geography. A DVD with the source database and the most important results in numerical and cartographic form is attached to the publication.
Abstract:
Abstract. The paper presents a list of 437 verbs which have not been recorded in lexicography. As the source of reference, the author consulted a spelling dictionary of the Polish language, Wielki słownik ortograficzny PWN, 2nd edition, 2006. The concept of this paper originated from the wish to satisfy a curiosity aroused by reading Indeks neologizmów [Index of Neologisms], prepared by Krystyna Waszakowa in her work Przejawy internacjonalizacji w słowotwórstwie współczesnej polszczyzny [Word-formative internationalisation processes in modern Polish]. The index contains a list of nouns. Given that K. Waszakowa did not take verbs into account (there are far (?) fewer neo-verbs than neo-nouns), the author decided to find out whether it is true that the number of verb neologisms is so small that their philological analysis is pointless from the point of view of research, vocabulary registration, etc. If nouns such as podczłowiek, miniokupacja and redefinicja are of interest, why not record the prefixal constructions of the do-, z-, od-, na-, w-, wy-, za-, nad- etc. -robić type? The analysis covered randomly selected texts from the "Rzeczpospolita" daily (without any thorough preparation with respect to content; the available texts were analysed sequentially until a satisfactory result was obtained). The texts under review comprised an incomplete (it is virtually impossible to determine completeness in this case) electronic archive from the years 1993–2006.
Abstract:
The aim of this study is to analyse the assumptions of the cyberspace protection policy of the Republic of Poland as set out in the document entitled Polityka ochrony cyberprzestrzeni Rzeczypospolitej Polskiej, presented in 2013 by the Ministry of Administration and Digitization and the Internal Security Agency. The article analyses the postulates and guidelines contained therein and confronts these assumptions with elements of the cyberspace protection system of the Republic of Poland. One must agree with the authors of this strategy that ensuring a state of complete ICT security is impossible; one can only speak of reaching a certain acceptable level of it. It seems that implementing the priorities of the cyberspace protection policy of the Republic of Poland should contribute significantly to achieving this goal, in particular: defining the competences of the entities responsible for cyberspace security; creating and operating a cyberspace security management system consistent across all government administration entities, together with guidelines in this area for non-public entities; establishing a durable system of coordination and information exchange between the entities responsible for cyberspace security and cyberspace users; and raising cyberspace users' awareness of security methods and measures.
Abstract:
Faculty of Modern Languages (Wydział Neofilologii)
Abstract:
The use of coal and other fossil fuels will remain for decades the main source of energy for power generation, despite the important efforts made to replace, as far as possible, fossil fuels with renewable power sources. As is well documented, the production of Greenhouse Gases (GHG), mainly CO2, arises primarily from the combustion of fossil fuels. The increasing application of Clean Coal Technologies (CCTs) is expected to substantially mitigate the emission of such gases. There is consequently a need to promote CO2 abatement through Zero Emission (Carbon Free) Technologies (ZETs), which include CO2 capture, transport and geological storage, i.e. the so-called CCS (Carbon Capture and Storage) technologies. In fact, these technologies are the only ones presently able to meet the ambitious EU targets set out under the "20 20 by 2020" EU energy and environment programme, jointly with the economic aspects of EU Directives 2003/87/EC, 2004/101/EC and 2009/29/EC concerned with the Greenhouse Gas Emissions Allowance Trading Scheme (ETS). The European Commission's formal admission that these targets will be impossible to reach without the implementation and contribution of geological storage clearly demonstrates the importance of this particular issue, and for this reason EC Directive 2009/31/EC of April 23, 2009 on the Geological Storage of CO2 was recently published. In considering the technical and economic viability of CCS technologies, the latter in competition with the ETS scheme, it is believed that public perception will dictate the success of the development and implementation of CO2 geological storage at a large industrial scale.
This means that, in order to successfully implement CCS technologies, not only must public opinion be taken into consideration, but objective information must also be provided to the public in order to raise awareness of the subject, as recognized in Directive 2009/31/EC. In this context, the Fernando Pessoa Foundation / University Fernando Pessoa, through its CIAGEB (Global Change, Energy, Environment and Bioengineering) RDID&D Unit, is the sponsor of an engineering project for the geological sequestration of CO2 in the Douro Coalfield meta-anthracites (the COSEQ Project), and is therefore also engaged in public perception surveys regarding CCS technologies. At this stage, the original European ACCSEPT inquiry was translated into Portuguese and submitted only to the "Fernando Pessoa Community", comprising university lecturers, students and other employees, as well as former students and persons with a professional or academic relationship with the university (c. 5000 individuals). The results obtained from this first inquiry will be used to improve the survey informatics system in terms of communication, database, and data treatment prior to resubmission of the inquiry to the Portuguese public at large. The present publication summarizes the process and the results obtained from the ACCSEPT survey distributed to the "Fernando Pessoa Community". 525 replies, representing 10.5% of the sample, were received and analysed. The results were systematically compared with those obtained from other European countries, as reported by the ACCSEPT inquiry, as well as with those from an identical inquiry launched in Brazil.
Abstract:
Doctoral thesis presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Doctor in Information Sciences.
Abstract:
We propose a new notion of cryptographic tamper evidence. A tamper-evident signature scheme provides an additional procedure Div which detects tampering: given two signatures, Div can determine whether one of them was generated by the forger. Surprisingly, this is possible even after the adversary has inconspicuously learned (exposed) some, or even all, of the secrets in the system. In this case, it might be impossible to tell which signature was generated by the legitimate signer and which by the forger. But at least the fact of the tampering will be made evident. We define several variants of tamper-evidence, differing in their power to detect tampering. In all of these, we assume an equally powerful adversary: she adaptively controls all the inputs to the legitimate signer (i.e., all messages to be signed and their timing) and observes all his outputs; she can also adaptively expose all the secrets at arbitrary times. We provide tamper-evident schemes for all the variants and prove their optimality. Achieving the strongest tamper evidence turns out to be provably expensive. However, we define a somewhat weaker, but still practical, variant: α-synchronous tamper-evidence (α-te), and provide α-te schemes with logarithmic cost. Our α-te schemes use a combinatorial construction of α-separating sets, which might be of independent interest. We stress that our mechanisms are purely cryptographic: the tamper-detection algorithm Div is stateless and takes no inputs except the two signatures (in particular, it keeps no logs), we use no infrastructure (or other ways to conceal additional secrets), and we use no hardware properties (except those implied by the standard cryptographic assumptions, such as random number generators). Our constructions are based on arbitrary ordinary signature schemes and do not require random oracles.
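A toy sketch of the stateless-detector idea may help: the construction below is hypothetical and is NOT the scheme from the paper (it uses a shared-key MAC and a one-use counter purely for illustration). A legitimate signer never reuses a counter, so two valid signatures carrying the same counter on different messages are evidence that a forger, having exposed the key, produced one of them; the detector needs only the two signatures themselves.

```python
import hmac
import hashlib

KEY = b"demo-key"  # assumed shared key for this toy MAC-based sketch

def sign(counter: int, msg: bytes) -> tuple:
    """Sign a message bound to a one-use counter (toy construction)."""
    tag = hmac.new(KEY, counter.to_bytes(8, "big") + msg, hashlib.sha256).digest()
    return (counter, msg, tag)

def verify(sig: tuple) -> bool:
    counter, msg, tag = sig
    good = hmac.new(KEY, counter.to_bytes(8, "big") + msg, hashlib.sha256).digest()
    return hmac.compare_digest(tag, good)

def div(sig1: tuple, sig2: tuple) -> bool:
    """Stateless tamper detector: True means both signatures verify yet
    reuse the same counter on different messages, so one of them must
    have been produced by a forger holding the exposed key."""
    return (verify(sig1) and verify(sig2)
            and sig1[0] == sig2[0] and sig1[1] != sig2[1])
```

Note that `div` keeps no logs and consults no state, mirroring the paper's requirement that tamper detection take nothing but the two signatures as input.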
Abstract:
As the commoditization of sensing, actuation and communication hardware increases, so does the potential for dynamically tasked sense-and-respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged to many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain-specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can subsequently be re-translated, if needed, for execution on a wide variety of hardware platforms.
snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources with soft, real-time constraints, (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain, (3) an execution environment and a run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language, and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case-studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.
Abstract:
An increasing number of applications, such as distributed interactive simulation, live auctions, distributed games and collaborative systems, require the network to provide a reliable multicast service. This service enables one sender to reliably transmit data to multiple receivers. Reliability is traditionally achieved by having receivers send negative acknowledgments (NACKs) to request from the sender the retransmission of lost (or missing) data packets. However, this Automatic Repeat reQuest (ARQ) approach results in the well-known NACK implosion problem at the sender. Many reliable multicast protocols have been recently proposed to reduce NACK implosion. But, the message overhead due to NACK requests remains significant. Another approach, based on Forward Error Correction (FEC), requires the sender to encode additional redundant information so that a receiver can independently recover from losses. However, due to the lack of feedback from receivers, it is impossible for the sender to determine how much redundancy is needed. In this paper, we propose a new reliable multicast protocol, called ARM for Adaptive Reliable Multicast. Our protocol integrates ARQ and FEC techniques. The objectives of ARM are (1) reduce the message overhead due to NACK requests, (2) reduce the amount of data transmission, and (3) reduce the time it takes for all receivers to receive the data intact (without loss). During data transmission, the sender periodically informs the receivers of the number of packets that are yet to be transmitted. Based on this information, each receiver predicts whether this amount is enough to recover its losses. Only if it is not enough, that the receiver requests the sender to encode additional redundant packets. Using ns simulations, we show the superiority of our hybrid ARQ-FEC protocol over the well-known Scalable Reliable Multicast (SRM) protocol.
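The receiver-side prediction described above can be sketched in a few lines. This is a minimal illustration with hypothetical names, not the actual ARM implementation: it assumes an erasure code under which any k distinct packets (data or redundant) suffice to reconstruct the original k data packets, so a receiver NACKs only when the sender's announced remainder cannot cover its losses.

```python
def needs_more_redundancy(received: int, announced_remaining: int, k: int) -> bool:
    """Return True if the receiver should NACK for additional redundant
    packets: even if every remaining announced packet arrives intact,
    the receiver would still hold fewer than the k packets needed to
    decode the original data."""
    return received + announced_remaining < k

# Example: k = 10 data packets, 6 received so far.
# If the sender announces 5 more packets, no NACK is needed (6 + 5 >= 10);
# if it announces only 3, the receiver must request more (6 + 3 < 10).
```

Suppressing NACKs whenever the announced transmissions already suffice is what lets the hybrid scheme cut feedback traffic relative to pure ARQ.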
Abstract:
Weak references are references that do not prevent the object they point to from being garbage collected. Most realistic languages, including Java, SML/NJ, and OCaml to name a few, have some facility for programming with weak references. Weak references are used in implementing idioms like memoizing functions and hash-consing in order to avoid potential memory leaks. However, the semantics of weak references in many languages are not clearly specified. Without a formal semantics for weak references it becomes impossible to prove the correctness of implementations making use of this feature. Previous work by Hallett and Kfoury extends λgc, a language for modeling garbage collection, to λweak, a similar language with weak references. Using this previously formalized semantics for weak references, we consider two issues related to the well-behavedness of programs. Firstly, we provide a new, simpler proof of the well-behavedness of the syntactically restricted fragment of λweak defined previously. Secondly, we give a natural semantic criterion for well-behavedness much broader than the syntactic restriction, which is useful as a principle for programming with weak references. Furthermore, we extend a result proved previously for λgc which allows one to use type inference to collect some reachable objects that are never used. We prove that this result holds for our language, and we extend it to allow the collection of weakly-referenced reachable garbage without incurring the computational overhead sometimes associated with collecting weak bindings (e.g. the need to recompute a memoized function). Lastly, we extend the semantic framework to model the key/value weak references found in Haskell, and we prove that it is semantically equivalent to a simpler semantics, owing to the lack of side-effects in our language.
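The memoization idiom the abstract mentions can be made concrete with a minimal sketch (hypothetical names, shown here in Python rather than λweak): cached results are held only through weak references, so an entry stays memoized exactly as long as some caller still holds the result, and the cache itself can never cause a memory leak.

```python
import weakref

class Result:
    """Wrapper for computed values; plain instances are weakly referenceable."""
    def __init__(self, value):
        self.value = value

# Values in a WeakValueDictionary do not keep their entries alive:
# once no caller holds a Result, its entry may be garbage collected.
_cache = weakref.WeakValueDictionary()

def expensive(n: int) -> Result:
    """Memoized computation: reuse a live cached Result, else recompute."""
    r = _cache.get(n)
    if r is None:
        r = Result(n * n)  # stand-in for a costly computation
        _cache[n] = r      # entry vanishes when r becomes unreachable
    return r
```

The cost noted in the abstract shows up here too: after the weak binding is collected, a repeated call must recompute the value, which is precisely the overhead the paper's extension seeks to avoid for never-used reachable garbage.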
Abstract:
The thesis as a whole argues that Spinoza's Ethics, in both method and content, is aimed at the normal, partly rational person. Chapter 1 is on Spinoza's writing style, finding that rather than being arid and technical, it aims to convince the reader by means of various rhetorical techniques, and so does not assume an already rational reader. The following chapters of Part 1 examine whether the Ethics' use of the synthetic geometric method exposes it to Descartes' critique of that method in the "Second Replies" to his Meditations, namely that it is not suitable for pedagogy. This involves a consideration of the role of the TIE, finding in that early text not the analytic wing of a two-part analytic-synthetic method, but rather a defence and necessitation of a stand-alone synthetic method. Part 2 of the thesis develops this study of Spinoza's writing for the common man to consider whether he is writing about the common man. This is done by examining one of the seemingly most abstract propositions in the Ethics, 4P72, which claims that a free man will not deceive even to save his own life. The study examines who exactly this "free man" is and what his role in the Ethics is. It looks at the examples of free men in the TTP and at the concept of the model in the Ethics, and finds that rather than the free man being an impossible ideal which we can aim at but never achieve, everyone is free to some extent, and even normal people are at times "the free man".
Abstract:
Identification and re-edition of a fragmentary Coptic inscription bearing the text of the Lord's Prayer. The inscription was painted inside a house built against a wall of the temple of Ramsès III at Médinet Abou, which became the village of Djèmé in the Christian era.
Abstract:
Edition of three inscriptions (one Greek, one Coptic, and one bilingual Greek-Coptic) from the Theban mountain. The first bears the text of the Niceno-Constantinopolitan Creed, the second an invocation, and the third an invocation and a date.