983 results for CUDA (Computer architecture)
Abstract:
X-ray computed tomography of logs has long been applied for qualitative reconstructions. In most cases, a series of consecutive slices of the timber is scanned to estimate a 3D image reconstruction of the entire log. However, unexpected movement of the timber under study degrades the quality of the image reconstruction, since the position and orientation of some scanned slices can be incorrectly estimated. In addition, reconstruction time remains a significant challenge for practical applications. The present study investigates the possibility of employing modern physics engines for the problem of estimating the position of a moving rigid body and its scanned slices under X-ray computed tomography. The work includes implementations of the extended Kalman filter and an algebraic reconstruction method for fan-beam computed tomography. In addition, modern technologies such as NVIDIA PhysX and CUDA are used. As a result, it is shown numerically that the extended Kalman filter can be applied together with a real-time physics engine, PhysX, to determine the position of a moving object. It is shown that the position of the rigid body can be determined based only on reconstructions of its slices. However, the simulation of the body's movement is sometimes subject to error during Kalman filtering, as PhysX is not always able to continue simulating the movement properly when given an incorrect state estimate.
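The abstract describes the method only at a high level; as a rough, assumption-laden sketch of the estimation step, the following Python fragment implements a generic extended Kalman filter that tracks a body's position from noisy per-slice position estimates. The one-dimensional constant-velocity model and all noise parameters are illustrative choices, not the authors' actual setup.

```python
# Minimal sketch (not the paper's code): an extended Kalman filter tracking
# the 1-D position and velocity of a rigid body from noisy position
# estimates, as might be obtained by registering reconstructed CT slices.
# All model parameters (dt, noise covariances) are illustrative assumptions.
import numpy as np

dt = 0.1                                  # time step between scanned slices
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity state transition
H = np.array([[1.0, 0.0]])                # only position is observed
Q = 1e-3 * np.eye(2)                      # process noise covariance (assumed)
R = np.array([[0.05]])                    # measurement noise covariance (assumed)

def ekf_step(x, P, z):
    """One predict/update cycle; for this linear model the EKF's
    Jacobians coincide with F and H."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Simulated example: the body drifts at 0.5 units/s, measurements are noisy.
rng = np.random.default_rng(0)
x, P = np.array([0.0, 0.0]), np.eye(2)
true_pos = 0.0
for _ in range(50):
    true_pos += 0.5 * dt
    z = np.array([true_pos + rng.normal(scale=0.2)])
    x, P = ekf_step(x, P, z)
print("estimated position/velocity:", x)
```

In the study's setting, the measurement z would come from registering a reconstructed slice, and the predicted state would be handed to the physics engine for the motion simulation.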
Abstract:
Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage indicates strong interest in their future enhancement and enrichment. By looking at these systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that allow us to prove its correctness, along with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms, and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language results in less complexity than creating the models from written specifications. Finally, we consider the decoding part of a media distribution system by showing how video decoding can be done in parallel, based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is largely tool-based. This demonstrates the maturity of formal methods as well as their increased reliability, and thus advocates for their more widespread use in the future.
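To make the streaming adaptation concrete, here is a small Python sketch, not the thesis's Event-B model, of a sliding-window piece selection policy of the kind described: missing pieces near the playback position are requested in order, falling back to BitTorrent's rarest-first strategy outside the window. The window size and availability counts are illustrative assumptions.

```python
# Hypothetical sliding-window piece selection for on-demand streaming.
# Pieces inside the playback window are fetched in order (to keep playback
# going); outside the window we fall back to rarest-first.
def select_piece(playback_pos, have, availability, window=16):
    """Return the next piece index to request, or None if nothing is missing."""
    total = len(availability)
    # 1) Urgent region: missing pieces inside the playback window, in order.
    for i in range(playback_pos, min(playback_pos + window, total)):
        if i not in have:
            return i
    # 2) Otherwise, rarest-first over all remaining pieces.
    missing = [i for i in range(total) if i not in have]
    return min(missing, key=lambda i: availability[i]) if missing else None

# Example: 32 pieces, pieces 0-3 already held, per-piece peer counts given.
availability = [3, 5, 2, 8, 1, 1, 4, 2] * 4
print(select_piece(playback_pos=4, have=set(range(4)),
                   availability=availability))   # -> 4, the next piece to play
```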
Abstract:
The computer game industry has grown steadily for years, and in revenues it is comparable to the music and film industries. The industry has been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industry practitioners and the scientific community. The significance of the business model concept has increased in the scientific literature recently, although the concept is still much debated. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers and organization leaders in 11 computer game companies were interviewed. The data was analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, as are technical skills in performing various activities. Marketing and customer relationships are likewise key elements of the computer game business model. Financing and partners are important especially for startups, when the organization depends on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.
Abstract:
The aim of this study was to evaluate changes in canola yield components and seed physiological quality in response to different sowing densities. The study was conducted in a greenhouse at the REIPESOL Company Technological Center, Madrid, Spain, with the commercial "Toccata" hybrid variety. The initial sowing density was 360,000 plants/ha, and the plant population was later thinned to provide treatments of 250,000 and 180,000 plants/ha. Harvested seeds were sent to the Seed Technology Center Laboratory (CATES) at the Madrid Polytechnic University (UPM) to evaluate changes in plant architecture and yield components, as well as the seed physiological quality of different plant parts. Results demonstrated that canola plants changed their morphology and yield components in response to different sowing densities. The population of 250,000 plants/ha showed the best seed yield, demonstrating that maximum yield is directly related to a correct sowing density. The number of pods per plant was the most important component for increased seed yield per plant and per area. Neither the spatial distribution of seeds on the plant nor the sowing density affected seed physiological quality.
Abstract:
If emerging markets are to achieve their objective of joining the ranks of industrialized, developed countries, they must use their economic and political influence to support radical change in the international financial system. This working paper recommends John Maynard Keynes's "clearing union" as a blueprint for reform of the international financial architecture that could address emerging market grievances more effectively than current approaches. Keynes's proposal for the postwar international system sought to remedy some of the same problems currently facing emerging market economies. It was based on the idea that financial stability was predicated on a balance between imports and exports over time, with any divergence from balance providing automatic financing of the debit countries by the creditor countries via a global clearinghouse or settlement system for trade and payments on current account. This eliminated national currency payments for imports and exports; countries received credits or debits in a notional unit of account fixed to their national currency. Since the unit of account could not be traded, bought, or sold, it would not be an international reserve currency. Credits with the clearinghouse could only be used to offset debits by buying imports, and if not used for this purpose they would eventually be extinguished; hence the burden of adjustment would be shared equally: credit generated by surpluses would have to be used to buy imports from the countries with debit balances. Emerging market economies could improve upon current schemes for regionally governed financial institutions by using this proposal as a template for the creation of regional clearing unions using a notional unit of account.
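As a toy illustration, not taken from the paper, the following Python sketch shows the bookkeeping of such a clearing union: exports earn credits and imports incur debits in a notional unit of account, so member balances always net to zero and surpluses automatically finance deficits.

```python
# Toy clearing-union ledger (an illustrative assumption, not the paper's
# scheme). Trade is settled in a notional unit of account: the exporter is
# credited, the importer debited, and the system as a whole nets to zero.
from collections import defaultdict

balances = defaultdict(float)   # member -> balance in the notional unit

def settle_trade(exporter, importer, value):
    """Record one trade on the clearinghouse ledger."""
    balances[exporter] += value
    balances[importer] -= value

settle_trade("A", "B", 100.0)   # A exports 100 units of value to B
settle_trade("B", "A", 40.0)    # B exports 40 back to A

# A's surplus (+60) can only be spent on other members' exports, which is
# how the burden of adjustment is shared; the ledger always nets to zero.
print(dict(balances), "net:", sum(balances.values()))
```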
Abstract:
Developing nations differ from developed nations in their data usage techniques because they lack a standard information technology architecture. With globalization, information sharing between developing nations is necessary for advancement in socio-economic, scientific, and technological fields. A robust IT architecture that eases information sharing and other data usage needs to be built between developing nations. A framework like TOGAF may work in this case, as an ordinary IT framework may not meet the requirements of an enterprise architecture. The intention of the thesis is to build an enterprise architecture between different developing nations using the TOGAF framework.
Abstract:
Today, user experience and usability in software applications are becoming major design issues as many processes are adapted to new technologies. Therefore, the study of user experience and usability should be included in every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available to evaluate these concepts, a non-expert may be unsure which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool to guide these testing methods for user experience and usability.
Abstract:
1804 (T4).
Abstract:
Title variant(s): Précis historique des productions des arts, peinture, sculpture, architecture et gravure
Abstract:
1802 (T2).
Abstract:
1805 (T5).
Abstract:
1803 (T3).
Abstract:
1801 (T1).
Abstract:
1925/01/15 (SER4,N424).