961 results for General-purpose computing on graphics processing units (GPGPU)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The relationship between humans and non-human animals dates back to the prehistoric era, when human groups moved from a nomadic, extractive way of life to sedentism and began to develop agriculture and animal husbandry. Ancient Greek philosophy, notably the Aristotelian school, posited that nature does nothing in vain and that all things have a purpose: plants were created for the sake of animals, and animals for the good of men, while the Bible preaches the view that the world was created for the good of men and that other species were subordinated to their wants and needs. During the Renaissance (14th to 16th centuries), anthropocentrism was established as the main philosophical concept. In its treatment of animals, however, the Renaissance did not differ substantially from medieval scholasticism, regarding animals as machines, devoid of pain and of an immortal soul. In this context, scientific knowledge about plants, non-human animals and nature in general is built on anthropocentric values, thus influencing the construction of school education in the disciplines of Science and Biology. Nowadays, in São Paulo state schools, specifically in Ensino Fundamental II (6th to 9th grade), the syllabus of the discipline Ciências da Natureza e suas Tecnologias is set by the Currículo Oficial do Estado de São Paulo, via the São Paulo Faz Escola program, implemented by the Secretaria Estadual de Educação in 2010. This documentary research used the methodology of Content Analysis and aimed to analyze how non-human animals are presented in the Caderno do Professor and Caderno do Aluno of the discipline Ciências da Natureza e suas Tecnologias, from the 6th to the 9th grade. The analysis of the courseware revealed that its contents were influenced by the anthropocentric view, both implicitly and explicitly, conveying anthropomorphic, utilitarian, stereotyped and derogatory statements towards...
Abstract:
OBJECTIVES: This prospective, randomized experimental study aimed to investigate the influence of general treatment strategies on the motor recovery of Wistar rats with moderate contusive spinal cord injury. METHODS: A total of 51 Wistar rats were randomized into five groups: control, maze, ramp, runway, and sham (laminectomy only). The rats underwent spinal cord injury at the T9-T10 levels using the NYU-Impactor. Each group, except the control group, was trained for 12 minutes twice a week for two weeks before and five weeks after the spinal cord injury. Functional motor recovery was assessed with the Basso, Beattie, and Bresnahan Scale on the first postoperative day and then once a week for five weeks. The animals were euthanized, and the spinal cords were collected for histological analysis. RESULTS: The ramp and maze groups showed an earlier and greater functional improvement than the control and runway groups. Unexpectedly, however, over time all groups reached effects similar to those of the control group, which recovered spontaneously. There were no histological differences in the injured area between the trained and control groups. CONCLUSION: Short-term benefits can be associated with a specific training regime; however, the same training was ineffective at maintaining superior long-term recovery. These results might support new considerations before hospital discharge of patients with spinal cord injuries.
Abstract:
This paper presents an optimum user-steered boundary tracking approach for image segmentation, which simulates the behavior of water flowing through a riverbed. The riverbed approach was devised using the image foresting transform with a previously unexploited connectivity function. We analyze its properties in the derived image graphs and discuss its theoretical relation to other popular methods, such as live wire and graph cuts. Several experiments show that riverbed can significantly reduce the number of user interactions (anchor points) compared with live wire for objects with complex shapes. This paper also includes a discussion of how to combine different methods in order to take advantage of their complementary strengths.
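The connectivity function used by riverbed is specific to the paper, but the shared mechanics of this family of user-steered tools (an optimum path computed between user-placed anchor points on a pixel graph) can be illustrated with a generic Dijkstra-style sketch. The additive path cost and 4-neighborhood below are illustrative assumptions made for this listing, not the paper's formulation.

import heapq

def min_cost_path(cost, src, dst):
    # cost[r][c] is a non-negative local cost (e.g. low on strong edges);
    # src and dst are (row, col) anchor points placed by the user.
    rows, cols = len(cost), len(cost[0])
    dist = {src: cost[src[0]][src[1]]}
    prev = {}
    heap = [(dist[src], src)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == dst:
            break
        if d > dist[(r, c)]:
            continue                      # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]     # additive path cost (stand-in connectivity function)
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # trace back from the destination anchor to recover the boundary segment
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

In interactive use, each new anchor point would extend the current boundary with the optimum path computed from the previous anchor.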
Abstract:
The adsorption of NO on transition-metal (TM) surfaces has been widely studied by experimental and theoretical techniques; however, our atomistic understanding of the interaction of nitrogen monoxide (NO) with small TM clusters is far from satisfactory, which compromises a deep understanding of real catalyst devices. In this study, we report a density functional theory study of the adsorption properties of NO on TM13 (TM = Rh, Pd, Ir, Pt) clusters employing the projector augmented-wave method. We found that the interaction of NO with TM13 is much more complex than that of NO/TM(111). In particular, for low-symmetry TM13 clusters, there is a strong rearrangement of the electronic charge density upon NO adsorption and, as a consequence, the adsorption energy shows a very complex dependence even for adsorption sites with the same local effective coordination. We found a strong enhancement of the binding energy of NO to the TM13 clusters compared with the TM(111) surfaces, as the antibonding NO states are not occupied for NO/TM13, and the general relationship based on the d-band model between the adsorption energy and the center of gravity of the occupied d-states does not hold for the studied TM13 clusters, in particular for clusters with low symmetry. In contrast to the adsorption energy trends, the geometric NO/TM13 parameters and the vibrational N-O frequencies for different coordination sites follow the same trend as for the respective TM(111) surfaces, while the changes in the frequencies between different surfaces and TM13 clusters reflect the strong NO-TM13 interaction.
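For reference, the "center of gravity of the occupied d-states" mentioned above is the d-band center of the standard d-band model; a common textbook definition (not reproduced from the paper) is

\varepsilon_d = \frac{\int_{-\infty}^{E_F} \varepsilon\,\rho_d(\varepsilon)\,d\varepsilon}{\int_{-\infty}^{E_F} \rho_d(\varepsilon)\,d\varepsilon},

where \rho_d(\varepsilon) is the d-projected density of states and E_F the Fermi level. In the usual d-band picture, an \varepsilon_d closer to E_F correlates with stronger adsorption, which is precisely the trend the authors find broken for the low-symmetry clusters.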
Abstract:
Starting from the assumption that drawing is an instrument of dialogue between the architect and himself or with third parties, and that this type of representation has been losing ground to digital technologies, this work aims to study and analyze drawings of selected projects by the architect Paulo Mendes da Rocha, as a contribution to the discussion on the role of analog drawing in the current design process. The study of four projects (Ginásio do Clube Paulistano, Residência Butantã, Residência Millan and MuBE) was carried out through markings made by the researcher on the architect's original drawings, with the aim of detecting design intentions, concepts and characteristics of the projects. To this end, graphic markings were made over the original drawings in order to highlight the researcher's own reading, thus allowing a better understanding of the projects chosen for the study. Drawing over the drawing marks the points that judgment deems important; it constitutes a personal reading and allows whoever practices it to better understand the projects, since the act of drawing is closely related to that of thinking. It was decided to present the images in relation to direct quotations from authors...
Abstract:
This thesis investigates two aspects of Constraint Handling Rules (CHR): it proposes a compositional semantics and a technique for program transformation. CHR is a concurrent, committed-choice constraint logic programming language consisting of guarded rules, which transform multisets of atomic formulas (constraints) into simpler ones until exhaustion [Frü06]; it belongs to the family of declarative languages. It was initially designed for writing constraint solvers but has recently also proven to be a general-purpose language, being Turing equivalent [SSD05a]. Compositionality is the first CHR aspect to be considered. A trace-based compositional semantics for CHR was previously defined in [DGM05]. The reference operational semantics for that compositional model was the original operational semantics for CHR, which, due to the propagation rule, admits trivial non-termination. In this thesis we extend the work of [DGM05] by introducing a more refined trace-based compositional semantics which also includes the history. The use of a history is a well-known technique in CHR that records the application of propagation rules and consequently allows trivial non-termination to be avoided [Abd97, DSGdlBH04]. Naturally, the reference operational semantics of our new compositional one also uses a history to avoid trivial non-termination. Program transformation is the second CHR aspect to be considered, with particular regard to the unfolding technique. This technique is an appealing approach to optimizing a given program, in particular to improving its run-time efficiency or space consumption. Essentially, it consists of a sequence of syntactic program manipulations which preserve a kind of semantic equivalence, called qualified answers [Frü98], between the original program and the transformed ones. Unfolding is one of the basic operations used by most program transformation systems: it consists in replacing a procedure call by its definition. In CHR, every conjunction of constraints can be considered a procedure call, every CHR rule can be considered a procedure, and the body of that rule represents the definition of the call. While there is a large body of literature on the transformation and unfolding of sequential programs, very few papers have addressed this issue for concurrent languages. We define an unfolding rule, show its correctness, and discuss some conditions under which it can be used to delete an unfolded rule while preserving the meaning of the original program. Finally, we show that confluence and termination are maintained between the original and transformed programs. This thesis is organized as follows. Chapter 1 gives some general notions about CHR. Section 1.1 outlines the history of programming languages, with particular attention to CHR and related languages. Section 1.2 then introduces CHR using examples. Section 1.3 gives some preliminaries which will be used throughout the thesis. Subsequently, Section 1.4 introduces the syntax and the operational and declarative semantics of the first CHR language proposed. Finally, the methodologies for solving the problem of trivial non-termination related to propagation rules are discussed in Section 1.5. Chapter 2 introduces a compositional semantics for CHR in which propagation rules are considered. In particular, Section 2.1 contains the definition of the semantics, Section 2.2 presents the compositionality results, and Section 2.3 expounds the correctness results. Chapter 3 presents a particular program transformation known as unfolding. This transformation needs a particular syntax, called annotated syntax, which is introduced in Section 3.1, and its related modified operational semantics is presented in Section 3.2. Subsequently, Section 3.3 defines the unfolding rule and proves its correctness. Then, in Section 3.4, the problems related to the replacement of a rule by its unfolded version are discussed, which in turn yields a correctness condition that holds for a specific class of rules. Section 3.5 proves that confluence and termination are preserved by the program modifications introduced. Finally, Chapter 4 concludes by discussing related work and directions for future work.
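As a small, self-contained illustration of the kind of computation described above (guarded rules rewriting a multiset of constraints until exhaustion), the classic two-rule CHR gcd program can be mimicked in Python. The interpreter below is a toy written for this listing, handling only ground gcd/1 constraints; it is not code from the thesis.

def chr_gcd(store):
    # Store: multiset of integers n standing for constraints gcd(n).
    # Rule 1 (simplification):  gcd(0) <=> true
    # Rule 2 (simpagation):     gcd(N) \ gcd(M) <=> 0 < N <= M | gcd(M - N)
    store = list(store)
    changed = True
    while changed:
        changed = False
        if 0 in store:                       # rule 1: drop a gcd(0) constraint
            store.remove(0)
            changed = True
            continue
        for i in range(len(store)):          # rule 2: rewrite gcd(M) using a kept gcd(N)
            for j in range(len(store)):
                if i != j and 0 < store[i] <= store[j]:
                    store[j] -= store[i]
                    changed = True
                    break
            if changed:
                break
    return store

# chr_gcd([12, 8]) -> [4], mirroring how the CHR rules reduce {gcd(12), gcd(8)} to {gcd(4)}.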
Abstract:
This paper describes a low-cost system that allows the user to visualize different glasses models in live video. The user can also move the glasses to adjust their position on the face. The system, which runs at 9.5 frames/s on general-purpose hardware, has a homeostatic module that keeps image parameters under control. This is achieved by using a camera with motorized zoom, iris, white balance, etc. This feature can be especially useful in environments with changing illumination and shadows, such as an optical shop. The system also includes a face and eye detection module and a glasses management module.
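As a rough sketch of what a homeostatic module of this kind might do, the loop below keeps the mean image brightness inside a target band by nudging a single camera gain parameter. The callables read_mean_brightness and set_gain are hypothetical camera-driver hooks invented for this illustration; the actual system controls zoom, iris and white balance through the motorized camera.

def homeostatic_brightness_loop(read_mean_brightness, set_gain,
                                target=128.0, band=10.0, k_p=0.01,
                                gain=1.0, steps=1000):
    # Simple proportional control: react only when the brightness drifts
    # outside the [target - band, target + band] comfort zone.
    for _ in range(steps):
        error = target - read_mean_brightness()
        if abs(error) > band:
            gain = max(0.1, min(8.0, gain + k_p * error))  # clamp to a sane range
            set_gain(gain)
    return gain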
Abstract:
Nowadays, licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business- and corporate-level objectives. The Open Innovation paradigm has been embraced: firms rely more and more on collaboration and on external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, firms' competitive advantage depends both on their ability to recognize available opportunities inside and outside their boundaries and on their readiness to exploit them in order to fuel their innovation process dynamically. Licensing is one of the ways available to firms to reap the advantages associated with an open attitude in technology strategy. From the licensee's point of view, this implies challenging the so-called not-invented-here syndrome, which affects the more traditional firms that emphasize the myth of internal research and development supremacy. It also entails understanding the so-called cognitive constraints affecting the perfect functioning of markets for technologies, which are associated with the costs of assimilating, integrating and exploiting external knowledge by recipient firms. My thesis aims at shedding light on new and interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the "perspective bias" affecting the works within this stream of research: with very few notable exceptions, they have generally been concerned with the so-called licensing dilemma of the licensor, whether to license out or to internally exploit in-house developed technologies, while neglecting the licensee's perspective. In my opinion, this has left room for improving our understanding of the determinants and conditions affecting licensing-in practices. From the licensee's viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies. As such, it lies at the very heart of a firm's technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions, as they shape firms' innovation patterns and the evolution of their technological capabilities. It also allows for understanding the cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee's behavior, by providing a comprehensive theoretical framework as well as ad-hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. To this end, I investigate licensing-in in three different fashions, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms' technological search diversification, drawing on the framework of the Search literature, the Resource-Based Theory and the theory of general purpose technologies. In the second paper, which continues where the first one left off, I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside the licensee firms (e.g. new patents) some years after the acquisition of the license, according to the Dynamic Capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee's perspective. To this end, I combine the insights of two theoretical approaches: agency theory and real options theory.
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, in order to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The uncertainty of the lithographic process in the manufacturing stage increases as transistor sizes scale down, resulting in larger parameter variations in future technology generations. Furthermore, the exponential relationship between the leakage current and the threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions involving several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the thermal behavior of the system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process-variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation, with a good response to compensation techniques such as ASV and ABB. Hence low-swing signaling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process-variation analysis tool into the first stages of the design flow.
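A representative example of the high-level wear-out models mentioned above is Black's equation for electromigration, quoted here in its standard textbook form (it is not a model taken from the thesis):

\mathrm{MTTF} = A \, J^{-n} \exp\!\left(\frac{E_a}{k_B T}\right),

where J is the current density, E_a the activation energy, k_B Boltzmann's constant, T the absolute temperature, and A, n empirical constants. The exponential dependence on T is what links thermal management directly to device lifetime.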
Abstract:
In this thesis, the performance of the Drift Tubes Local Trigger System of the CMS detector is studied. CMS is one of the general-purpose experiments that will operate at the Large Hadron Collider at CERN. Results are presented from data collected during the Cosmic Run At Four Tesla (CRAFT) commissioning exercise, a globally coordinated run period in which the full experiment was involved and configured to detect cosmic rays crossing the CMS cavern. These include analyses of the precision and accuracy of the trigger reconstruction mechanism and a measurement of the trigger efficiency. A method to synchronize the system is also described, together with a comparison between the output of the trigger electronics and that of its software emulator.
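For context, the simplest counting estimate of a trigger efficiency, stated here as a generic definition rather than the specific procedure used in the thesis, is

\varepsilon = \frac{N_{\mathrm{trig}}}{N_{\mathrm{ref}}}, \qquad \sigma_{\varepsilon} \approx \sqrt{\frac{\varepsilon(1-\varepsilon)}{N_{\mathrm{ref}}}},

where N_ref is the number of reference events (e.g. offline-reconstructed cosmic-ray tracks crossing the chambers) and N_trig the subset that also produced a local trigger; the uncertainty is the usual binomial approximation.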
Abstract:
Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (such as scheduling, project planning, transportation, telecommunications, economics and finance, timetabling, etc.) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research have dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open or not fully understood, and the mixed integer programming community remains very active in trying to answer them; as a consequence, a huge number of papers are continuously being produced and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first occurs when we are asked to handle a general MIP and cannot assume any special structure for the given problem: in this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general-purpose techniques. The second occurs when mixed integer programming is used to address a somewhat structured problem: in this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special-purpose techniques. This thesis tries to give some insights into both of the above situations. The first part of the work focuses on general-purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers. Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed-integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extreme solutions. Furthermore, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers have drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling other important results discovered more than 40 years ago; however, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress and simply presents a possible way of generating two-row cuts from the simplex tableau arising from lattice-free triangles, together with some preliminary computational results.
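To make the link between tableau rows and general-purpose cuts concrete, recall the classical Gomory fractional cut, the simplest relative of the Gomory mixed-integer cuts discussed above (quoted in its textbook pure-integer form, not as a contribution of the thesis). From a simplex tableau row

x_{B(i)} + \sum_{j \in N} \bar a_{ij}\, x_j = \bar b_i, \qquad \bar b_i \notin \mathbb{Z},

the inequality

\sum_{j \in N} \big(\bar a_{ij} - \lfloor \bar a_{ij} \rfloor\big)\, x_j \;\ge\; \bar b_i - \lfloor \bar b_i \rfloor

is valid for every non-negative integer feasible solution and cuts off the current fractional vertex.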
The second part of the thesis focuses instead on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs). The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution) in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general-purpose MIP solver, as sketched below. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (and, in particular, the usage of general-purpose cutting planes) can be useful to improve on the branch-and-cut methods proposed in the literature.
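The destroy-and-repair loop mentioned above can be sketched generically. The skeleton below was written for this listing, with destroy, repair and cost as user-supplied placeholders (for instance, repair could heuristically solve a MIP over the freed part of the solution with a general-purpose solver); it is not the thesis' algorithm.

def destroy_and_repair(initial_solution, destroy, repair, cost, iterations=100):
    # Generic destroy-and-repair local search: perturb the incumbent,
    # re-optimize the perturbed part, and accept improving solutions.
    best, best_cost = initial_solution, cost(initial_solution)
    for _ in range(iterations):
        partial, removed = destroy(best)       # randomly free part of the incumbent
        candidate = repair(partial, removed)   # e.g. reinsert by solving a small MIP
        candidate_cost = cost(candidate)
        if candidate_cost < best_cost:
            best, best_cost = candidate, candidate_cost
    return best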
Abstract:
This work was carried out by the author during his PhD course in Electrical, Computer Science and Telecommunication at the University of Bologna, Faculty of Engineering, Italy. All the documentation reported here summarizes years of work under the supervision of Prof. Oreste Andrisano, coordinator of the Wireless Communication Laboratory (WiLab) in Bologna. The subject of this thesis is the transmission of video in the context of heterogeneous networks and, in particular, over a wireless channel. All the instrumentation used for the characterization of the telecommunication systems belongs to CNR (National Research Council), CNIT (Italian Inter-University Center), and DEIS (Dept. of Electrical, Computer Science, and Systems). From November 2009 to July 2010, the author worked abroad in collaboration with DLR (German Aerospace Center) in Munich, Germany, in the area of channel coding, developing a general-purpose decoder machine for a large family of iterative codes. A patent concerning Doubly Generalized Low-Density Parity-Check codes was produced by the author, as well as several scientific papers published in IEEE journals and conference proceedings.