960 results for Perfect
Abstract:
X. Wang, J. Yang, X. Teng, W. Xia, and R. Jensen. Feature Selection based on Rough Sets and Particle Swarm Optimization. Pattern Recognition Letters, vol. 28, no. 4, pp. 459-471, 2007.
Abstract:
Emeseh, Engobo, 'Corporate Responsibility for Crime: Thinking outside the Box', 1 University of Botswana Law Journal (2005) 28-49.
Abstract:
Greaves, G.; Meneau, F.; ap Gwynn, I.A.; Wade, S. (2003). The rheology of collapsing zeolites amorphized by temperature and pressure. Nature Materials 2, 622-629.
Abstract:
Faculty of History: Institute of Art History
Abstract:
The search for a universal definition of Poland's security within the unstable system of the European Union rests mainly on finding itself in the role of a player and actor which, as an independent entity, takes an active part in the EU's multidimensional system of negotiations and bargaining (brokering between different interests). Poland must have its own programme of action in the EU prepared, strategic and tactical in character, incorporating state-centric horizontal and sectoral priorities, both anti-crisis ones and those that are antagonistic and dysfunctional. This requires the flawless preparation of an educated team of people dealing with security. Very strong organisational skills are necessary, as is a high degree of familiarity with how states function in relation to the EU as a whole and to its individual elements. All of this comes down to the need to develop a specific modus operandi for Polish security which, beyond the already familiar rules and procedures, comprises their institutional-administrative and logistical-technical backup for intervention. Poland must also acquire the ability to adapt to the surrounding world (Europe) by broadening the base on which the integration system functions. This is directly connected with adjusting to permanent change in the European Union and in the global environment. Adaptation is also important from the point of view of the need to stabilise the system. It makes it possible to neutralise any attempts at functional disruption of its structure, position and set of competences. Adaptation should be complemented by Poland's realistic innovativeness and sense of mission, visible in the introduction of new security rules and mechanisms into the environment (surroundings). Innovativeness involves initiating a new style/way of thinking about security and, consequently, novelty in framing security in multilevel (multidimensional) terms. State security understood in this way comprises not only defensive (military) capability but also the strength of the economy and the resources that Poland should have at its disposal. The sense of mission, in turn, comes down to promoting and propagating the values ascribed to the nation state, not written into the EU treaties, such as power, raison d'état and independence.
Abstract:
Faculty of Theology
Abstract:
Work presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Dental Medicine.
Abstract:
Postgraduate Project/Dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Pharmaceutical Sciences.
Abstract:
Postgraduate Project/Dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Dental Medicine.
Abstract:
Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems - not to mention the elimination of potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easier to incorporate into applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes.
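The TRA formalism and CLEOPATRA syntax are not reproduced in the abstract above, but the underlying restriction, that every response to an event must be causal, delayed and time-bounded, can be sketched in plain Python. Everything below (the class names, the delay bounds, the sensor/ack example) is an illustrative assumption, not CLEOPATRA code and not the actual TRA model.

# Illustrative sketch only: a toy "time-constrained reactive" transition model.
# It is NOT the TRA formalism or CLEOPATRA syntax; names and bounds are invented
# to show the idea of causal, non-instantaneous responses to events.

from dataclasses import dataclass

@dataclass(frozen=True)
class TimedResponse:
    """A response to an event that must occur within a bounded, non-zero delay."""
    trigger: str          # name of the input event (channel)
    output: str           # name of the output event produced
    min_delay: float      # strictly positive: no instantaneous ("perfect timing") reaction
    max_delay: float      # finite: no unbounded postponement

    def __post_init__(self):
        if not (0 < self.min_delay <= self.max_delay < float("inf")):
            raise ValueError("responses must be causal, delayed, and time-bounded")

class ReactiveSpec:
    """A specification is just a set of timed responses; it cannot refer to future events."""
    def __init__(self, responses):
        self.responses = list(responses)

    def react(self, event, at_time):
        # Each triggered response is scheduled strictly after the triggering event.
        return [(r.output, at_time + r.min_delay, at_time + r.max_delay)
                for r in self.responses if r.trigger == event]

# Example: a controller must acknowledge a sensor reading within 1 to 5 ms.
spec = ReactiveSpec([TimedResponse("sensor_read", "ack", 0.001, 0.005)])
print(spec.react("sensor_read", at_time=0.0))   # [('ack', 0.001, 0.005)]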
Abstract:
Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is a formalism that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems - not to mention the elimination of potential hazards that would otherwise have gone unnoticed.
Abstract:
Do humans and animals learn exemplars or prototypes when they categorize objects and events in the world? How are different degrees of abstraction realized through learning by neurons in inferotemporal and prefrontal cortex? How do top-down expectations influence the course of learning? Thirty related human cognitive experiments (the 5-4 category structure) have been used to test competing views in the prototype-exemplar debate. In these experiments, during the test phase, subjects unlearn in a characteristic way items that they had learned to categorize perfectly in the training phase. Many cognitive models do not describe how an individual learns or forgets such categories through time. Adaptive Resonance Theory (ART) neural models provide such a description, and also clarify both psychological and neurobiological data. Matching of bottom-up signals with learned top-down expectations plays a key role in ART model learning. Here, an ART model is used to learn incrementally in response to 5-4 category structure stimuli. Simulation results agree with experimental data, achieving perfect categorization in training and a good match to the pattern of errors exhibited by human subjects in the testing phase. These results show how the model learns both prototypes and certain exemplars in the training phase. ART prototypes are, however, unlike the ones posited in the traditional prototype-exemplar debate. Rather, they are critical patterns of features to which a subject learns to pay attention based on past predictive success and the order in which exemplars are experienced. Perturbations of old memories by newly arriving test items generate a performance curve that closely matches the performance pattern of human subjects. The model also clarifies exemplar-based accounts of data concerning amnesia.
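For readers unfamiliar with the ART matching cycle referred to above, here is a deliberately simplified Fuzzy ART-style sketch of the generic choice/match/learn loop (bottom-up category choice, top-down vigilance matching, weight update). The parameter names, values and toy stimuli are assumptions for illustration only; this is not the specific model, parameter settings or 5-4 category-structure data used in the study.

# Simplified Fuzzy ART-style category module (generic sketch, not the exact
# model, parameters, or data from the 5-4 category-structure simulations).
import numpy as np

class FuzzyARTSketch:
    def __init__(self, alpha=0.001, beta=1.0, rho=0.6):
        self.alpha, self.beta, self.rho = alpha, beta, rho   # choice, learning rate, vigilance
        self.w = []                                          # one weight vector per category

    def _cc(self, x):
        x = np.asarray(x, dtype=float)
        return np.concatenate([x, 1.0 - x])                  # complement coding

    def train(self, x):
        I = self._cc(x)
        # Bottom-up choice: rank existing categories by the Fuzzy ART choice function.
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(I, self.w[j]).sum() / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho:                            # top-down expectation matches the input
                self.w[j] = self.beta * np.minimum(I, self.w[j]) + (1 - self.beta) * self.w[j]
                return j
        self.w.append(I.copy())                              # mismatch everywhere: recruit a new category
        return len(self.w) - 1

art = FuzzyARTSketch()
for stim in [[1, 1, 1, 0], [1, 0, 1, 0], [0, 0, 0, 1]]:      # toy binary feature vectors
    print(art.train(stim))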
Abstract:
Along with the growing demand for cryptosystems in systems ranging from large servers to mobile devices, cryptographic protocols suitable for use under certain constraints are becoming more and more important. Constraints such as calculation time, area, efficiency, and security must be considered by the designer. Elliptic curves, since their introduction to public key cryptography in 1985, have challenged established public key and signature generation schemes such as RSA, offering more security per bit. Amongst elliptic curve based systems, pairing based cryptography is thoroughly researched and can be used in many public key protocols, such as identity based schemes. For hardware implementations of pairing based protocols, all components which calculate operations over elliptic curves can be considered. Designers of pairing algorithms must choose calculation blocks and arrange the basic operations carefully so that the implementation can meet the constraints on time and hardware resource area. This thesis deals with different hardware architectures to accelerate pairing based cryptosystems over fields of characteristic two. Using different top-level architectures, the hardware efficiency of operations that run at different times is first considered in this thesis. Security is another important aspect of pairing based cryptography, considered here in practical terms through Side Channel Analysis (SCA) attacks. Naively implemented hardware accelerators for pairing based cryptography can be vulnerable when physical analysis attacks are taken into consideration. This thesis considers the weaknesses in pairing based public key cryptography and addresses the particular calculations in such systems that are insecure. In this case, countermeasures should be applied to protect the weak links of the implementation and to improve the pairing based algorithms. Some important rules that designers must obey to improve the security of these cryptosystems are proposed. Following these rules, three countermeasures that protect pairing based cryptosystems against SCA attacks are applied. The implementations of the countermeasures are presented and their performances are investigated.
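As a rough illustration of the kind of arithmetic such accelerators implement, the sketch below models multiplication in a small binary field GF(2^m) in software. The field size (m = 7) and reduction polynomial are arbitrary examples chosen for readability; the sketch says nothing about the hardware architectures or SCA countermeasures evaluated in the thesis.

# Toy software model of GF(2^m) multiplication, the basic building block of
# characteristic-two pairing arithmetic. The field (m = 7, reduction polynomial
# x^7 + x + 1) is an arbitrary small example, not a parameter from the thesis.

def gf2m_mul(a: int, b: int, m: int = 7, poly: int = 0b10000011) -> int:
    """Multiply field elements a, b (bit-vectors of degree < m) modulo `poly`."""
    # Carry-less (XOR-based) schoolbook multiplication.
    prod = 0
    while b:
        if b & 1:
            prod ^= a
        a <<= 1
        b >>= 1
    # Reduce the degree-(2m-2) product modulo the irreducible polynomial.
    for bit in range(2 * m - 2, m - 1, -1):
        if prod & (1 << bit):
            prod ^= poly << (bit - m)
    return prod

# Quick sanity check: x * x^6 = x^7 = x + 1 in GF(2^7) with x^7 + x + 1.
assert gf2m_mul(0b0000010, 0b1000000) == 0b0000011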
Abstract:
This thesis argues that through the prism of America’s Cold War, scientism has emerged as the metanarrative of the postnuclear age. The advent of the bomb brought about a new primacy for mechanical and hyperrational thinking in the corridors of power, not just in terms of managing the bomb itself but also in diffusing this ideology throughout the culture in social sciences, economics and other such institutional systems. The human need to mitigate or guard against the chaos of the universe lies at the heart not just of religious faith but of the desire for perfect control. Thus there has been a transference of power from religious faith to the apparent material power of science and technology and the terra firma these supposedly objective means supply. The Cold War, however, was a highly ideologically charged opposition between the two superpowers, and the scientific methodology that sprang forth to manage the Cold War and the bomb in the United States was not an objective scientific system divorced from the paranoia and dogma but a system that assumed a radically fundamentalist idea of capitalism. This is apparent in the widespread diffusion of game theory throughout Western postindustrial institutions. The inquiry of the thesis thus examines texts that engage with and criticise American Cold War methodology, beginning with the nuclear moment, so to speak, and Dr Strangelove’s incisive satire of moral abdication to machine processes. Moving on chronologically, the thesis examines the diffusion of particular kinds of masculinity and sexuality in postnuclear culture in Crash and End Zone, and finishes its analysis with the ethnographic portrayal of a modern American city in The Wire. More than anything else, the thesis wishes to reveal to what extent this technocratic consciousness puts pressure on language and on binding narratives.