937 results for Threshold cryptographic schemes and algorithms
Abstract:
Graphic Design is a profession established with the advent of the artistic vanguards of the twentieth century. As a professional activity, it encompasses the planning of projects that provide visual solutions to communication and information problems. Surface Design, in turn, is the professional practice concerned with developing projects for coatings and their application to products, with attention to production processes and materials. The two areas of Design are connected, first of all, by their two-dimensional character. In Brazil, academic study of Surface Design is still evolving, and some universities work on the subject. Analyzing the concepts, characteristics, and methodologies shared by Graphic Design and Surface Design is therefore a current and inclusive topic. A conceptual and methodological unfolding is needed to establish the relations and contributions within the field of Graphic Design and apply them to Surface Design. Thus, the present study aimed to examine the concepts and particular functions of Graphic Design and to establish its relations with, and contributions to, Surface Design. The results of this study can serve as support for teaching and professional practice.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Pós-graduação em Matemática em Rede Nacional - IBILCE
Abstract:
The nutritional requirements of crops generally intensify at the onset of the reproductive phase and become most critical during seed formation, when considerable amounts of nutrients are translocated; this demand is heightened by the fact that nutrients are essential to the formation and development of new storage organs. This study aimed to evaluate the agronomic efficiency of foliar application of zinc (zinc oxide, 700 g Zn L-1) on bean plants, compared with foliar application of zinc sulphate (ZnSO4) and a control (no Zn application). The experiment was installed at the Faculty of Agricultural Sciences - UNESP / Campus de Botucatu-SP. Plants were grown in containers holding 20 L of soil, with foliar applications in four schemes and two rain regimes, with 5 replicates per treatment, for a total of 40 vessels. The factorial design did not show, in general, significantly different responses between the rain-simulation and no-simulation conditions. The ZnO treatment (700 g L-1) demonstrated agronomic efficiency in foliar application, with results equal to or exceeding those of ZnSO4 and the control when applied at the same dose of Zn.
Abstract:
Comic books, a form of graphic narrative and sequential art, originated in newspapers during the Industrial Revolution. First published weekly in the comic-strip format, the new form of literature gained an ever larger public over time, and comic strips grew into complete stories in the format of comic books and, later, graphic novels. This course's final paper presents the main components of comics, the picture and the text, and examines the way these two elements overlap and complement each other in the configuration of comics as a whole. The object of analysis is the graphic novel Spider-Man: Blue, first published in 2002 as part of a project comprising three other titles by Jeph Loeb and Tim Sale, two-time winners of the Eisner Award. The theoretical background is the book Os Quadrinhos - Linguagem e Semiótica: Um Estudo Abrangente da Arte Sequencial by the researcher Antonio Luiz Cagnin, which presents a study of all the components of sequential art treated in this work: narrative time, visual planes, balloons, captions, and onomatopoeia.
Abstract:
This undergraduate thesis aims to formally define aspects of the Quantum Turing Machine, using quantum finite automata as a basis. We introduce the basic concepts of quantum mechanics and quantum computing through principles such as superposition, entanglement of quantum states, quantum bits, and algorithms. We demonstrate Bell's teleportation theorem, stated in the form of the Deutsch-Jozsa definition of quantum algorithms. The text is written so as to omit the formal aspects of quantum mechanics, encouraging computer scientists to understand the framework of quantum computation. We conclude the thesis by listing the Quantum Turing Machine's main limitations with respect to the well-known classical Turing machines.
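As an illustration of the kind of quantum algorithm discussed in this abstract, here is a minimal state-vector simulation of the Deutsch-Jozsa algorithm in Python. This is not code from the thesis; the oracle construction and the qubit ordering (ancilla in the least significant bit) are assumptions of this sketch.

```python
import numpy as np

def deutsch_jozsa(f, n):
    """Decide whether f: {0,1}^n -> {0,1} is constant or balanced with a
    single oracle query, by direct state-vector simulation."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = H
    for _ in range(n):                       # Hadamard on all n+1 qubits
        Hn = np.kron(Hn, H)
    dim = 2 ** (n + 1)
    # Oracle U_f |x>|y> = |x>|y XOR f(x)>, built as a permutation matrix;
    # the ancilla is the least significant bit of the state index.
    U = np.zeros((dim, dim))
    for x in range(2 ** n):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    state = np.zeros(dim)
    state[1] = 1.0                           # start in |0...0>|1>
    state = Hn @ (U @ (Hn @ state))          # H, one oracle call, H again
    # If f is constant, the input register reads all zeros with certainty.
    p_zero = abs(state[0]) ** 2 + abs(state[1]) ** 2
    return "constant" if np.isclose(p_zero, 1.0) else "balanced"

print(deutsch_jozsa(lambda x: 0, 3))                       # -> constant
print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))   # parity -> balanced
```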
Abstract:
We employ the approach of stochastic dynamics to describe the dissemination of vector-borne diseases such as dengue, and we focus our attention on the characterization of the epidemic threshold. The coexistence space is a bipartite lattice constituted by two spatial structures, one representing the human population and the other the mosquito population. The human population evolves according to a process similar to the Susceptible-Infected-Recovered (SIR) dynamics, while the mosquito population follows dynamics of the Susceptible-Infected-Susceptible (SIS) type. We develop a truncation scheme to solve the evolution equations for the densities and the two-site correlations, from which we obtain the threshold of the disease and the reproductive ratio. We present a precise definition of the reproductive ratio which reveals the importance of the correlations developed in the early stage of the disease. According to our definition, the reproductive ratio is directly related to the conditional probability of the occurrence of a susceptible human (mosquito) given the presence of an infected mosquito (human) in the neighborhood. The epidemic threshold, as well as the phase transition between the epidemic and non-epidemic states, is also obtained by performing Monte Carlo simulations. References: [1] David R. de Souza, Tânia Tomé, Suani R. T. Pinho, Florisneide R. Barreto and Mário J. de Oliveira, Phys. Rev. E 87, 012709 (2013). [2] D. R. de Souza, T. Tomé and R. M. Ziff, J. Stat. Mech. P03006 (2011).
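Below is a minimal sketch of the kind of Monte Carlo simulation described above, with humans following SIR dynamics and mosquitoes SIS dynamics on two overlaid lattices coupled through their neighborhoods. The rates, lattice size, and synchronous-update scheme are illustrative assumptions of this sketch, not the model parameters of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two overlaid L x L periodic lattices: humans (SIR) and mosquitoes (SIS).
L, steps = 64, 200
b_h, b_m = 0.5, 0.5        # cross-infection probabilities per infected neighbor
c, a = 0.2, 0.1            # human recovery, mosquito recovery rates

S, I, R = 0, 1, 2
human = np.full((L, L), S)
mosq = np.zeros((L, L), dtype=int)     # 0 = susceptible, 1 = infected
mosq[L // 2, L // 2] = 1               # seed a single infected mosquito

def infected_neighbors(grid):
    """Count infected von Neumann neighbors on the opposite lattice."""
    inf = (grid == 1).astype(int)
    return (np.roll(inf, 1, 0) + np.roll(inf, -1, 0) +
            np.roll(inf, 1, 1) + np.roll(inf, -1, 1))

for _ in range(steps):
    n_im = infected_neighbors(mosq)    # infected mosquitoes around each human
    n_ih = infected_neighbors(human)   # infected humans around each mosquito
    hi = (human == S) & (rng.random((L, L)) < b_h * n_im / 4)  # human S -> I
    hr = (human == I) & (rng.random((L, L)) < c)               # human I -> R
    mi = (mosq == 0) & (rng.random((L, L)) < b_m * n_ih / 4)   # mosquito S -> I
    ms = (mosq == 1) & (rng.random((L, L)) < a)                # mosquito I -> S
    human[hi] = I
    human[hr] = R
    mosq[mi] = 1
    mosq[ms] = 0

print("fraction of humans ever infected:", (human != S).mean())
```

Sweeping the cross-infection probabilities and locating where the final attack rate becomes nonzero in the large-lattice limit is one way such simulations estimate the epidemic threshold.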
Abstract:
This thesis gathers the work carried out by the author in the last three years of research; it concerns the study and implementation of algorithms to coordinate and control a swarm of mobile robots moving in unknown environments. In particular, the author's attention is focused on two different approaches to two different problems. The first algorithm considered in this work deals with the possibility of decomposing a complex main task into many simple subtasks by exploiting a decentralized implementation of the so-called Null Space Behavioral paradigm. This approach to merging different subtasks with assigned priorities is slightly modified in order to handle critical situations that can arise when robots move through an unknown environment. In fact, issues occur when one or more robots get stuck in local minima: the author provides a strategy to avoid deadlock situations, and the algorithm is validated by simulative analysis. The second problem deals with the use of concepts borrowed from graph theory to control a group of differential-wheel robots by exploiting the Laplacian solution of the consensus problem. Constraints on the swarm communication topology are introduced through the use of a range-and-bearing platform developed at the Distributed Intelligent Systems and Algorithms Laboratory (DISAL), EPFL (Lausanne, CH), where part of the author's work was carried out. The control algorithm is validated by demonstration and simulation analysis and is then performed by a team of four robots engaged in a formation mission. To conclude, the capabilities of the algorithm based on the local solution of the consensus problem for differential-wheel robots are demonstrated in an application scenario where nine robots are engaged in a hunting task.
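To make the Laplacian consensus idea concrete, here is a minimal sketch of consensus-based formation control for point agents. The ring topology, the formation offsets, and the single-integrator agent model are assumptions of this illustration; differential-drive robots like those in the thesis add a nonholonomic kinematic layer omitted here.

```python
import numpy as np

# Assumed ring communication topology among 4 agents.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1          # undirected adjacency matrix
Lap = np.diag(A.sum(axis=1)) - A   # graph Laplacian L = D - A

x = np.array([[0.0, 0.0], [4.0, 1.0], [3.0, 5.0], [-1.0, 2.0]])  # 2-D positions
# Desired square formation, expressed as offsets from a common reference point.
offsets = np.array([[1.0, 1.0], [-1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])

dt = 0.05                          # Euler step; stable since dt * lambda_max < 2
for _ in range(400):
    # Consensus on the offset-compensated states: xdot = -L (x - offsets),
    # so the x_i - offset_i converge to a common point and the formation emerges.
    x += -dt * (Lap @ (x - offsets))

print(np.round(x - offsets, 3))    # all rows (nearly) identical at convergence
```

Because each agent's update uses only neighbor states, the same law runs fully decentralized on hardware, with the range-and-bearing platform supplying the relative measurements.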
Abstract:
The aim of this thesis was to describe the development of motion analysis protocols for upper and lower limb applications using inertial sensor-based systems. Inertial sensor-based systems are relatively recent, so knowledge of methods and algorithms for their clinical use is still limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability, and small size are a valid reason to pursue this direction. When developing motion analysis protocols based on inertial sensors, attention must be given to several aspects, such as the accuracy and reliability of inertial sensor-based systems. Developing specific algorithms, methods, and software for using these systems in specific applications is as important as developing the motion analysis protocols themselves. For this reason, the goal of the three-year research project described in this thesis was pursued first of all by correctly designing the protocols based on inertial sensors, exploring and developing the features suitable for each protocol's specific application. The use of optoelectronic systems was necessary because they provide an accurate, gold-standard measurement, which was used as a reference for validating the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centers where the high cost of instrumentation or limited working areas do not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis outside the laboratory will benefit from these protocols, for example gait analysis along corridors. Outdoors, steady-state walking and the behavior of prosthetic devices on slopes or obstacles during walking can also be assessed. Applying inertial sensors to lower limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components and motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower limb prostheses, with the results provided in this thesis. In the author's opinion, this thesis and the motion analysis protocols based on inertial sensors described here demonstrate how close collaboration among industry, clinical centers, and research laboratories can improve knowledge and exchange know-how, with the common goal of developing new application-oriented systems.
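As a hedged illustration of how inertial data are fused into orientation estimates, here is a standard complementary filter in Python. It is a generic textbook technique, not the algorithm of the MTx or of the thesis protocols; the blending weight alpha, the sample period, and the axis/sign conventions are assumptions of this sketch.

```python
import numpy as np

def complementary_filter(gyro, accel, dt=0.01, alpha=0.98):
    """Fuse gyroscope and accelerometer samples into pitch/roll estimates.

    The gyroscope integral is smooth but drifts; gravity measured by the
    accelerometer gives a noisy absolute reference. Blending the two with
    weight alpha keeps the best of both.
    gyro: (N, 3) angular rates [rad/s]; accel: (N, 3) accelerations [m/s^2].
    """
    pitch = roll = 0.0
    out = []
    for g, acc in zip(gyro, accel):
        # absolute (but noisy) attitude from the gravity direction
        pitch_acc = np.arctan2(-acc[0], np.hypot(acc[1], acc[2]))
        roll_acc = np.arctan2(acc[1], acc[2])
        # blend drift-free accelerometer angles with integrated gyro rates
        pitch = alpha * (pitch + g[1] * dt) + (1 - alpha) * pitch_acc
        roll = alpha * (roll + g[0] * dt) + (1 - alpha) * roll_acc
        out.append((pitch, roll))
    return np.array(out)

# Static sensor: zero rates, gravity along +z in this assumed convention;
# the estimates settle to pitch = roll = 0.
est = complementary_filter(np.zeros((500, 3)),
                           np.tile([0.0, 0.0, 9.81], (500, 1)))
print(np.round(est[-1], 4))
```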
Abstract:
In this work we conduct an experimental analysis of different behavioral models of economic choice. In particular, we analyze the role of overconfidence in shaping the beliefs of economic agents about the future path of their consumption or investment. We discuss the relevance of this bias in expectation formation from both a static and a dynamic point of view, and we analyze the effect of possible interventions aimed at achieving certain policy goals. Our methodology is both theoretical and empirical: we make extensive use of controlled economic field experiments to test the predictions of the theoretical models we propose. In the second part of the thesis we discuss the role of cognition and personality in shaping economic preferences and choices, building a bridge between established psychological research and novel findings in economics. Finally, we conduct a field study on the role of incentives in education: we design different incentive schemes and test their effectiveness in improving academic performance on randomized groups of students.
Abstract:
Photovoltaic (PV) conversion is the direct production of electrical energy from sunlight without the emission of polluting substances. To be competitive with other energy sources, the cost of PV technology must be reduced while ensuring adequate conversion efficiency. These goals have motivated researchers to investigate advanced designs of crystalline silicon (c-Si) solar cells. Since lowering the cost of PV devices involves reducing the volume of semiconductor, an effective light-trapping strategy aimed at increasing photon absorption is required. Modeling solar cells by electro-optical numerical simulation helps predict the performance of future generations of devices exhibiting advanced light-trapping schemes and provides new, more specific guidelines to industry. The approaches to optical simulation commonly adopted for c-Si solar cells may lead to inaccurate results for thin-film and nano-structured solar cells; on the other hand, rigorous solvers of the Maxwell equations are very CPU- and memory-intensive. Recently, the RCWA (rigorous coupled-wave analysis) method has gained relevance in the optical simulation of solar cells, providing a good trade-off between accuracy and computational resource requirements. This thesis is a contribution to the numerical simulation of advanced silicon solar cells by means of a state-of-the-art 2-D/3-D numerical device simulator, which has been successfully applied to the simulation of selective-emitter and rear point-contact solar cells, for which a multi-dimensional transport model is required to properly account for all competing physical mechanisms. In the second part of the thesis, the optical problem is discussed. Two novel and computationally efficient RCWA implementations for 2-D simulation domains, as well as a third RCWA implementation for 3-D structures based on an eigenvalue-calculation approach, are presented. The proposed simulators have been validated in terms of accuracy, numerical convergence, computation time, and correctness of results.
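For a flavor of the optical modeling involved, the sketch below implements the transfer-matrix method for a planar layer stack, the zeroth-order case to which RCWA reduces for unpatterned layers; it is a standard textbook technique, not code from the thesis, and the refractive indices and thicknesses are illustrative values rather than device data.

```python
import numpy as np

def tmm_reflectance(n_list, d_list, lam):
    """Normal-incidence reflectance of a planar layer stack via the
    characteristic-matrix (transfer-matrix) formulation.
    n_list: indices [ambient, layer1, ..., substrate]; d_list: layer
    thicknesses [m]; lam: vacuum wavelength [m]."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_list[1:-1], d_list):     # inner layers only
        delta = 2 * np.pi * n * d / lam        # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    n0, ns = n_list[0], n_list[-1]             # ambient and substrate indices
    r = ((n0 * M[0, 0] + n0 * ns * M[0, 1] - M[1, 0] - ns * M[1, 1]) /
         (n0 * M[0, 0] + n0 * ns * M[0, 1] + M[1, 0] + ns * M[1, 1]))
    return abs(r) ** 2

# A 75 nm antireflection coating (n = 2.0) on silicon (n ~ 3.9) at 600 nm:
# close to a quarter-wave layer, so the reflectance nearly vanishes.
print(tmm_reflectance([1.0, 2.0, 3.9], [75e-9], 600e-9))
```

Full RCWA extends this idea to periodic textures by expanding fields in Fourier harmonics and solving an eigenvalue problem per layer, which is where the accuracy/cost trade-off mentioned above arises.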
Abstract:
Only 60% of candidates for cardiac resynchronization therapy respond in terms of reverse ventricular remodeling, which is the strongest predictor of reduced mortality and hospitalization. Two possible causes of non-response are device programming and the limits of the transvenous approach. During my doctoral years I carried out three studies to reduce the number of non-responders. The first study evaluates the interventricular delay. In order to optimize resources and provide a real benefit to the patient, I searched for predictors of an optimal interventricular delay different from the simultaneous one set in the basic programming. The only predictor turned out to be a QRS interval > 160 ms, so I proposed a flow chart to optimize only those patients whose optimal programming will involve a non-simultaneous interventricular interval. The second work evaluates active fixation of the left ventricular lead with a stent. Dislodgement, a high myocardial stimulation threshold, and phrenic nerve stimulation are three problems that limit biventricular pacing. We analyzed more than 200 angiographies to identify the anatomical conditions predisposing to lead dislodgement. Prospectively, we decided to use a stent to actively fix the left ventricular lead in all patients with anatomical characteristics favoring dislodgement. There were no further dislodgements, there was a better response in terms of reverse ventricular remodeling, and there were no changes in the electrical parameters of the lead. The third work evaluated the safety and efficacy of left endoventricular pacing. We implanted 26 patients judged to be non-responders to cardiac resynchronization therapy. The procedure proved safe, with a risk of complications similar to that of classic biventricular pacing, and effective in halting left ventricular dysfunction and/or improving clinical outcomes over a mean follow-up period.