Abstract:
The performance of real-time networks is continuously improving as a result of several trends in the digital world. These trends, however, also exacerbate a series of undesirable aspects of real-time networks, such as communication latency, latency jitter and packet drop rate. This Thesis focuses on the communication errors that appear in such real-time networks, from the point of view of automatic control. Specifically, it investigates the effects of packet drops in automatic control over fieldbuses, as well as architectures and optimal techniques for their compensation. Firstly, a new approach is proposed to address the problems that arise from such packet drops. This novel approach is based on the simultaneous transmission of several values in a single message. Such messages can travel from sensor to controller, in which case they comprise several past sensor readings, or from controller to actuator, in which case they comprise estimates of several future control values. A series of tests reveals the advantages of this approach. This approach is then extended to accommodate the techniques of contemporary optimal control. Unlike the first approach, which deliberately withholds certain messages in order to make more efficient use of network resources, in this second case the techniques are used to reduce the effects of packet losses. After these two approaches based on data aggregation, optimal control in packet-dropping fieldbuses is studied using generalized actuator output functions. This study culminates in the development of a new optimal controller, together with the identification of the function, among the generalized functions that dictate the actuator's behaviour in the absence of a new control message, that leads to optimal performance.
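The controller-to-actuator side of this data-aggregation idea can be sketched as follows: each delivered message carries a short horizon of predicted control values, and during drops the actuator replays the next buffered value instead of outputting zero. This is an illustrative toy (scalar plant, proportional gain, made-up parameters), not the thesis's actual controller or network model.

```python
import random

def simulate(drop_prob, horizon=200, n_future=5, seed=1):
    """Compare a 'zero output on loss' actuator with one that replays
    buffered future control estimates, on the scalar plant
    x[t+1] = a*x[t] + b*u[t] (all parameters illustrative)."""
    rng = random.Random(seed)
    a, b, k = 1.05, 1.0, 0.6   # slightly unstable plant, proportional gain
    x_plain = x_buf = 1.0
    buffer, age = [], 0
    cost_plain = cost_buf = 0.0
    for _ in range(horizon):
        dropped = rng.random() < drop_prob
        # Baseline actuator: output zero when the control message is lost.
        u_plain = 0.0 if dropped else -k * x_plain
        # Aggregated message: n_future predicted control values, obtained
        # by rolling the plant model forward from the current state.
        if not dropped:
            xp, buffer, age = x_buf, [], 0
            for _ in range(n_future):
                u = -k * xp
                buffer.append(u)
                xp = a * xp + b * u
        # On a drop (or on receipt), take the next buffered value;
        # fall back to zero once the buffer is exhausted.
        u_buf = buffer[age] if age < len(buffer) else 0.0
        age += 1
        x_plain = a * x_plain + b * u_plain
        x_buf = a * x_buf + b * u_buf
        cost_plain += x_plain ** 2
        cost_buf += x_buf ** 2
    return cost_plain, cost_buf
```

With a 30% drop rate the buffered actuator accumulates a strictly lower quadratic cost, since every drop that the baseline spends in open loop is covered by an exact model-based prediction.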
The Thesis also presents a different line of research, concerning the output oscillations that arise as a consequence of using classic co-design techniques for networked control. The proposed algorithm allows such classical co-design algorithms to be executed without causing an output oscillation that increases the value of the cost function; such increases may, under certain circumstances, negate the advantages of applying the classical co-design techniques. Yet another line of research investigated algorithms, more efficient than existing ones, for generating task execution sequences that guarantee that at least a given number of activated jobs will be executed out of every window of a predetermined number of contiguous activations. This algorithm may, in the future, be applied to the generation of message transmission patterns in the above-mentioned techniques for the efficient use of network resources. The proposed task generation algorithm improves on its predecessors in that it can schedule systems that they cannot. The Thesis also presents a mechanism that allows multi-path routing to be performed in wireless sensor networks while ensuring that no value is counted twice, thereby improving the performance of wireless sensor networks and rendering them more suitable for control applications. As mentioned before, this Thesis is centered on techniques for improving the performance of distributed control systems in which several elements are connected through a fieldbus that may be subject to packet drops. The first three approaches relate directly to this topic: the first two address the problem from an architectural standpoint, whereas the third does so on more theoretical grounds.
The fourth approach ensures that techniques found in the literature pursuing goals similar to those of this Thesis can do so without causing other problems that might invalidate the solutions in question. Finally, the Thesis presents an approach centered on the efficient generation of the transmission patterns used in the aforementioned techniques.
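The window guarantee behind these execution sequences (at least m executed jobs out of every k contiguous activations) can be illustrated with a small sketch; the even-spacing construction below is one standard way to build such a pattern and is not necessarily the thesis's algorithm.

```python
def mk_pattern(m, k):
    """Evenly spread execution pattern of length k with exactly m
    executed slots (Bresenham-style accumulation)."""
    return [(i * m) // k != ((i + 1) * m) // k for i in range(k)]

def satisfies_mk(pattern, m, k):
    """True iff every window of k contiguous activations of the
    cyclically repeated pattern contains at least m executions."""
    n = len(pattern)
    return all(
        sum(pattern[(s + j) % n] for j in range(k)) >= m
        for s in range(n)
    )
```

Because the period-k pattern contains exactly m executed slots, every cyclic window of length k is a rotation of it and therefore meets the bound; a clustered pattern such as TTTFFF repeated, by contrast, violates a (2, 3) constraint.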
Abstract:
This thesis reports detailed studies of the valorization of industrial solid wastes as alternative raw materials. All tested wastes are classified as non-hazardous and are generated in the pulp and paper process: primary sludge, dregs, grits, lime mud and bottom ash (the latter generated in a process that runs in parallel with cellulose production, whose aim is to supply the plant with energy through the combustion of forest biomass in a fluidized bed). A detailed general characterization was performed for each waste and, according to their characteristics, applications with potential use were selected, specifically in fibre-cement, bituminous mixtures for regularization layers and industrial mortars (rendering mortars and cementitious adhesives). Once the application was decided, each waste was specifically tested in formulations containing different contents of waste in replacement of the conventional raw material. As an isolated case, the bottom ash was tested not only as an alternative raw material for construction materials, but also for use in the fluidized bed in which it is generated. Both dregs and bottom ash underwent special treatment to obtain a better-quality waste, so as not to compromise the characteristics of the final product and process. The dregs were tested in bituminous mixtures both as received and washed (at laboratory scale, to remove soluble salts), and the bottom ash was washed and screened at industrial scale (to remove soluble salts, especially chlorides, and to eliminate the coarse fraction, i.e. particles larger than 1 mm). The remaining residues were used as received, avoiding additional costs.
The results indicated potential, as well as some limitations, for the use of these wastes as alternative raw materials in each application; in some cases, the benefits of valorization outweigh the limitations, in both environmental and economic terms.
Abstract:
For the past decades, reducing the emission of harmful gases released during the combustion of fossil fuels has been a worldwide concern. This goal has been addressed through the reduction of sulfur-containing compounds and the replacement of fossil fuels by biofuels, such as bioethanol, produced on a large scale from biomass. For this purpose, a new class of solvents, Ionic Liquids (ILs), has been applied, with the aim of developing new processes and replacing common organic solvents in current processes. ILs can be composed of a large number of different combinations of cations and anions, which confer unique and desirable properties on them. The ability to fine-tune the properties of ILs to meet the requirements of a specific application, by combining different cations and anions, is the most relevant aspect rendering ILs so attractive to researchers. Nonetheless, due to the huge number of possible ion combinations, cheap predictive approaches are required to anticipate how ILs will act in a given situation. Molecular dynamics (MD) simulation is a statistical-mechanics computational approach, based on Newton's equations of motion, which can be used to study macroscopic systems at the atomic level through the prediction of their properties and other structural information. In the case of ILs, MD simulations have been extensively applied. The slow dynamics associated with ILs constitutes a challenge for their correct description, requiring improvements and developments of existing force fields, as well as larger computational efforts (longer simulation times). The present document reports studies based on MD simulations devoted to disclosing the mechanisms of interaction established by ILs in systems representative of fuel and biofuel streams and of the biomass pre-treatment process.
Hence, MD simulations were used to evaluate different systems composed of ILs and thiophene, benzene, water, ethanol and glucose molecules. For the latter, a study was carried out to ascertain the performance of a recently proposed force field (GROMOS 56ACARBO) in reproducing the dynamic behavior of such molecules in aqueous solution. The results reported here reveal that the interactions established by ILs depend on the individual characteristics of each IL. Generally, the polar character of ILs largely determines their propensity to interact with the other molecules. Although the advantage of using MD simulations is unquestionable, it is necessary to recognize the need for improvements and developments of force fields, not only for a successful description of ILs but also for other relevant compounds such as carbohydrates.
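As a minimal illustration of the Newton's-equations time stepping underlying MD, the sketch below applies a velocity-Verlet integrator to a 1D harmonic "bond"; the potential and parameters are purely illustrative, whereas real IL simulations use full force fields over thousands of atoms.

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity-Verlet integration of Newton's equation m*x'' = F(x),
    the time-stepping scheme used by most MD engines."""
    a = force(x) / mass
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt      # position update
        a_new = force(x) / mass              # force at the new position
        v += 0.5 * (a + a_new) * dt          # velocity update
        a = a_new
    return x, v

# Illustrative 1D harmonic bond: F(x) = -k_spring * x
k_spring = 1.0
x_final, v_final = velocity_verlet(
    x=1.0, v=0.0, force=lambda x: -k_spring * x, mass=1.0, dt=0.01, steps=1000
)
```

The scheme is symplectic, so the total energy 0.5*v**2 + 0.5*k_spring*x**2 stays very close to its initial value over the run, which is why this family of integrators is preferred for the long simulation times ILs require.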
Abstract:
REVERIE (REal and Virtual Engagement in Realistic Immersive Environments [1]) targets novel research addressing the demanding challenges involved in developing state-of-the-art technologies for online human interaction. The REVERIE framework enables users to meet, socialise and share experiences online by integrating cutting-edge technologies for 3D data acquisition and processing, networking, autonomy and real-time rendering. In this paper, we describe the innovative research showcased through the REVERIE integrated framework via richly defined use cases that demonstrate the validity of, and potential for, natural interaction in a virtual, immersive and safe environment. Previews of the REVERIE demo and its key research components can be viewed at www.youtube.com/user/REVERIEFP7.
Digital Debris of Internet Art: An Allegorical and Entropic Resistance to the Epistemology of Search
Abstract:
This Ph.D. thesis proposes a speculative lens for reading Internet Art via the concept of digital debris. To do so, the research explores the idea of digital debris in Internet Art from 1993 to 2011 in a series of nine case studies. Here, digital debris are understood as words typed into search engines which then disappear; bits of obsolete code lingering on the Internet; abandoned websites; broken links; or pieces of ephemeral information circulating on the Internet which are used as material by practitioners. In this context, the thesis asks: what are digital debris? The thesis argues that the digital debris of Internet Art represent an allegorical and entropic resistance to what art historian David Joselit calls the Epistemology of Search. The ambition of the research is to develop a language in between the agency of the artist and the autonomy of the algorithm, as a way of introducing Internet Art to a pluridisciplinary audience; hence the comparative studies unfolding throughout the thesis between Internet Art and pioneers in the recycling of waste in art, the use of instructions as a medium, and the programming of poetry. While many anthropological and ethnographic studies are concerned with the material object of the computer as debris once it becomes obsolete, very few studies have analysed waste as discarded data. The research shifts the focus from the industrial production of digital debris (such as pieces of hardware) to obsolete pieces of information in art practice. The research demonstrates that illustrations of such considerations can be found, for instance, in Cory Arcangel's work Data Diaries (2001), where QuickTime files are stolen, disassembled and re-used in new displays. The thesis also looks at Jodi's approach in Jodi.org (1993) and Asdfg (1998), where websites and hyperlinks are detourned, deconstructed and presented in abstract collages that reveal the architecture of the Internet.
The research starts in a typological manner and classifies pieces of Internet Art according to the structure at play in each work: while some online works dealing with discarded documents offer a self-contained, closed system, others nurture the idea of openness and unpredictability. The thesis foregrounds the ideas generated through the artworks and interprets how the latter are visually constructed and displayed. Not only does the research question the status of digital debris once they are incorporated into art practice, it also examines the method by which they are retrieved, manipulated and displayed, to submit that the digital debris of Internet Art are the result of both semantic and automated processes, rendering them both an object of discourse and a technical reality. Finally, in order to frame the serendipitous, process-based nature of digital debris, the Ph.D. concludes that digital debris are entropic: in other words, they are items of language to-be, paradoxically locked in a constant state of realisation.
Abstract:
Lines and edges provide important information for object categorization and recognition. In addition, one brightness model is based on a symbolic interpretation of the cortical multi-scale line/edge representation. In this paper we present an improved scheme for line/edge extraction from simple and complex cells, and we illustrate the multi-scale representation. This representation can be used for visual reconstruction, but also for non-photorealistic rendering. Together with keypoints and a new model of disparity estimation, a 3D wireframe representation of e.g. faces may be obtained in the future.
Abstract:
A new scheme for painterly, non-photorealistic rendering (NPR) has been developed. This scheme is based on visual perception, in particular the multi-scale line/edge representation in the visual cortex. The Amateur Painter (TAP) is the user interface on top of the rendering scheme. It allows paintings to be created (semi-)automatically from photographs, with different types of brush strokes and colour manipulations. In contrast to similar painting tools, TAP has a set of menus that reflects the procedure followed by a normal painter. In addition, the menus and options have been designed to be very intuitive, avoiding a jungle of sub-menus with image-processing options that children and laymen do not understand. Our goal is to create a tool that is extremely easy to use, in the hope that the user becomes interested in painting techniques, styles, and the fine arts in general.
Abstract:
Doctoral thesis, Electronic and Computer Engineering, Faculdade de Ciência e Tecnologia, Universidade do Algarve, 2007
Abstract:
This paper provides an overview of the sources and effects of the RF impairments that limit the performance of future wireless communication transceivers, render them costly, and hinder their widespread use in commercial products. As transmission bandwidths and carrier frequencies increase, the effects of these impairments worsen. This paper studies and presents analytical evaluations of the performance degradation due to RF impairments in terms of bit-error rate and image rejection ratio. The paper also highlights various aspects of the research carried out in mitigating the effects of these impairments, primarily in the digital signal processing domain at baseband, as well as low-complexity hardware implementations of such algorithms incorporating a number of power- and area-saving techniques.
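For the image rejection ratio mentioned above, the textbook expression for a direct-conversion receiver with I/Q gain imbalance g (linear ratio) and phase error phi can be sketched as follows; the function name and example numbers are illustrative, not taken from the paper.

```python
import math

def image_rejection_ratio_db(gain_imbalance, phase_error_deg):
    """IRR of an I/Q front end, from the standard expression
    IRR = (1 + 2*g*cos(phi) + g^2) / (1 - 2*g*cos(phi) + g^2)."""
    g = gain_imbalance
    phi = math.radians(phase_error_deg)
    num = 1 + 2 * g * math.cos(phi) + g * g
    den = 1 - 2 * g * math.cos(phi) + g * g
    return 10 * math.log10(num / den)
```

A 2% gain imbalance with a 2-degree phase error already limits the IRR to roughly the 30-40 dB range typical of uncalibrated analog front ends, which is why the digital compensation techniques surveyed in the paper matter.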
Abstract:
In studies which analyse the social distance between spouses at the moment a couple is formed, and which attempt to understand the role of the family, and in particular of marriage, in crystallising social divisions, the concept of homogamy has often been purely descriptive. This article questions this static approach and seeks to pinpoint the changes which social homogamy undergoes in the course of conjugal life, addressing women’s decisions on work–family articulation. Drawing on a critical approach to the concept of rational choice, the article intends to demonstrate the merit of an interpretative approach by analysing how members of a sample of 27 university-educated Portuguese partnered mothers take their decisions in the context of an interdependency framework in which the dynamics of family interaction tend to thwart individual career path development, rendering spouses dependent on each other.
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
Aldred, the glossator of the Lindisfarne Gospels, presents himself as carefully rendering the Latin lemmata in front of him, in terms of both their internal structure and meaning. His work includes a very high number of multiple glosses, which often attempt to clarify the polysemous character of a lemma or to provide additional information. This paper explores the multiple glosses including different lexemes which Aldred added to lexical lemmata in Mark’s Gospel in an attempt to establish whether there is any correlation between Aldred’s ordering practices and the frequency with which he used the interpretamenta to render those lemmata. The results of the study show some preference for placing the interpretamentum which most commonly renders the Latin lemma in first position, although Aldred’s practice is not fully consistent.
Abstract:
Over the last few decades, China has seen a steep rise in diverse eco-city and low-carbon city policies. Recently, attention has begun to focus on the perceived shortcomings in the practical delivery of related initiatives, with several publications suggesting a gap between ambitious policy goals and the emerging realities of the newly built environment. To probe this further, in this article we examine, based on the policy network approach, how the gap between high-level national policies and local implementation practice can be explained in the current Chinese context. We develop a four-pronged typology of eco-city projects based on the differential involvement of key (policy) actor groups, followed by a mapping of the salient policy network relations among these actors in each type. Our analysis suggests that, within the overall framework of national policy, a core axis in the network relations is that between local government and land developers. In some cases, central government agencies, often with buy-in from international architecture, engineering and consulting firms, seek to influence local government planning through various incentives aimed at rendering sustainability a serious consideration. However, this is mostly done in a top-down manner, which overemphasizes a rational, technocratic planning mode while underemphasizing the interrelationships among actors. This makes the emergence of a substantial implementation gap in eco-city practice an almost predictable outcome. Consequently, we argue that special attention should be paid to the close interdependency between the interests of local government actors and those of land and real estate developers. Factoring in this aspect of the policy network is essential if eco-city implementation is to gain proper traction on the ground.
Abstract:
Architectural rendering for Moulton Hall, Chapman College, Orange, California. Completed in 1975 (2 floors, 44,592 sq.ft.), this building is named in memory of an artist and patroness of the arts, Nellie Gail Moulton. Within this structure are the departments of Art, Communications, and Theatre/Dance as well as the Guggenheim Gallery and Waltmar Theatre.
Abstract:
Infrared thermography is a non-invasive technique that measures the mid- to long-wave infrared radiation emanating from all objects and converts it to temperature. As an imaging technique, the value of modern infrared thermography lies in its ability to produce a digitized image, or high-speed video, rendering a thermal map of the scene in false colour. Since temperature is an important environmental parameter influencing animal physiology, and metabolic heat production is an energetically expensive process, measuring temperature and energy exchange in animals is critical to understanding physiology, especially under field conditions. As a non-contact approach, infrared thermography provides a non-invasive complement to physiological data gathering. One caveat, however, is that only surface temperatures are measured, which directs much research towards thermal events occurring at the skin and insulating regions of the body. As an imaging technique, infrared thermal imaging is also subject to certain uncertainties that require physical modeling, typically done via built-in software approaches. Infrared thermal imaging has enabled insights into the comparative physiology of phenomena ranging from thermogenesis, peripheral blood flow adjustments and evaporative cooling to respiratory physiology. In this review, I provide background and guidelines for the use of thermal imaging, aimed primarily at field physiologists and biologists interested in thermal biology. I also discuss some of the better-known approaches and discoveries revealed by thermal imaging, with the objective of encouraging more quantitative assessment.
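The radiation-to-temperature conversion described above can be sketched, in simplified form, via the Stefan-Boltzmann law with an emissivity and reflected-background correction; real camera software additionally models atmospheric transmission and the detector's spectral band, and the names and numbers below are illustrative only.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def detected_power(T_surface, emissivity, T_background):
    """Power reaching the detector: emitted radiation plus the
    background radiation reflected by a non-black surface."""
    return (emissivity * SIGMA * T_surface ** 4
            + (1.0 - emissivity) * SIGMA * T_background ** 4)

def surface_temperature(W_detected, emissivity, T_background):
    """Invert the model above for the surface temperature in kelvin:
    subtract the reflected term, then solve W = e*sigma*T^4 for T."""
    W_emitted = W_detected - (1.0 - emissivity) * SIGMA * T_background ** 4
    return (W_emitted / (emissivity * SIGMA)) ** 0.25
```

The round trip recovers a 310 K surface (e.g. mammalian skin, emissivity ~0.95) against a 293 K background; assuming a blackbody (emissivity 1.0) for the same reading underestimates the true temperature, which is why emissivity settings matter in field work.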