967 results for "Prove it works"
Resumo:
Faced with a future of rising energy costs, there is a need for industry to manage energy more carefully in order to meet its economic objectives. A problem besetting the growth of energy conservation in the UK is that a large proportion of energy is consumed in a low-intensity manner in organisations where responsibility for energy efficiency is spread over a large number of personnel, each of whom sees only small energy costs. In relation to this problem in the non-energy-intensive industrial sector, an application of an energy management technique known as monitoring and targeting (M & T) has been installed at the Whetstone site of the General Electric Company Limited in an attempt to prove it as a means of motivating line management and personnel to save energy. The objective energy saving for which the M & T was devised is very specific. During early energy conservation work at the site there had been a change from continuous to intermittent heating, but the maintenance of this strategy was receiving a poor level of commitment from line management, and performance was some 5%-10% less than expected. The M & T is therefore concerned with heat for space heating, for which a heat metering system was required. Metering of the site's high-pressure hot water system posed technical difficulties, and expenditure was also limited. This led to an 'in-house' design being installed for a price less than the commercial equivalent. The timespan of the work to achieve an operational heat metering system was three years, which meant that energy-saving results from the scheme were not observed during the study. If successful, the scheme could be replicated at larger non-energy-intensive sites, from which some 30 PT of savings could be expected in the UK.
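The abstract does not give the metering calculations, but M & T schemes of this kind typically set a consumption target from a driver such as heating degree days and track the cumulative deviation of actual use from that target. A minimal sketch, with illustrative numbers that are not taken from the study:

```python
# Minimal monitoring-and-targeting sketch: fit a straight-line target
# (consumption = base + slope * degree_days) to historical data, then
# track the cumulative deviation (CUSUM) of actual use from the target.
# All figures below are illustrative, not from the Whetstone study.

def fit_target(degree_days, consumption):
    """Least-squares fit of consumption = base + slope * degree_days."""
    n = len(degree_days)
    mean_x = sum(degree_days) / n
    mean_y = sum(consumption) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(degree_days, consumption))
    sxx = sum((x - mean_x) ** 2 for x in degree_days)
    slope = sxy / sxx
    base = mean_y - slope * mean_x
    return base, slope

def cusum(degree_days, consumption, base, slope):
    """Running total of (actual - target); a rising trend flags drift."""
    total, out = 0.0, []
    for x, y in zip(degree_days, consumption):
        total += y - (base + slope * x)
        out.append(total)
    return out

# Illustrative weekly data: heating degree days vs heat consumption (MWh)
dd = [100, 120, 80, 150, 130]
use = [60, 70, 50, 85, 78]
base, slope = fit_target(dd, use)
deviations = cusum(dd, use, base, slope)
```

A steadily rising cumulative deviation would flag the kind of 5%-10% performance shortfall the scheme was devised to catch.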
Resumo:
This hands-on, practical guide for ESL/EFL teachers and teacher educators outlines, for those who are new to doing action research, what it is and how it works. Straightforward and reader-friendly, it introduces the concepts and offers a step-by-step guide to the action research process, with illustrations drawn widely from international contexts. Specifically, the text addresses:
• action research and how it differs from other forms of research
• the steps involved in developing an action research project
• ways of developing a research focus
• methods of data collection
• approaches to data analysis
• making sense of action research for further classroom action
Each chapter includes a variety of pedagogical activities:
• Pre-Reading Questions ask readers to consider what they already know about the topic
• Reflection Points invite readers to think about or discuss what they have read
• Action Points ask readers to carry out action-research tasks based on what they have read
• Classroom Voices illustrate aspects of action research from teachers internationally
• Summary Points provide a synopsis of the main points in the chapter
Resumo:
This chapter reports on a framework that has been used successfully to analyze the e-business capabilities of an organization with a view to developing its e-capability maturity level. This should be the first stage of any systems development project. The framework has been used widely within start-up and well-established companies, both large and small, and has been deployed in the service and manufacturing sectors. It has been applied by practitioners and consultants to help improve e-business capability levels, and by academics for teaching and research at graduate and undergraduate levels. This chapter provides an account of the unique e-business planning and analysis framework (E-PAF) and demonstrates how it works via an abridged version of a case study (selected from hundreds that have been produced). This includes a brief account of the three techniques that are integrated to form the analysis framework: quality function deployment (QFD) (Akao, 1972), the balanced scorecard (BSC) (Kaplan & Norton, 1992), and value chain analysis (VCA) (Porter, 1985). The case study extract is based on an online community and dating agency service, identified here as VirtualCom, which was produced through a consulting assignment with the founding directors of that company and has not been published previously. It has been chosen because it gives a concise, comprehensive example from an industry that is relatively easy to relate to.
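Of the three techniques integrated into E-PAF, QFD is the most mechanical: customer requirements are weighted and scored against candidate technical responses through a relationship matrix. A minimal sketch, in which the requirements, responses, weights and 0/1/3/9 relationship scores are hypothetical illustrations rather than figures from the VirtualCom case:

```python
# Core QFD calculation: the priority of each technical response is the
# sum over customer requirements of (requirement weight * relationship
# strength), using the conventional 0/1/3/9 QFD scale. All names and
# numbers here are hypothetical.

requirements = {"easy sign-up": 5, "trustworthy profiles": 4, "fast matching": 3}
responses = ["single-page registration", "identity verification", "search index"]

# relationship[req][j] scores requirement req against responses[j]
relationship = {
    "easy sign-up":         [9, 1, 0],
    "trustworthy profiles": [0, 9, 1],
    "fast matching":        [1, 0, 9],
}

priorities = [
    sum(weight * relationship[req][j] for req, weight in requirements.items())
    for j in range(len(responses))
]
ranked = sorted(zip(responses, priorities), key=lambda p: -p[1])
```

The ranking tells the analyst which technical response delivers the most weighted customer value, which is the kind of prioritisation a planning framework feeds into subsequent design stages.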
Resumo:
Real-time systems are usually modelled with timed automata, and real-time requirements relating to the state durations of the system are often specifiable using Linear Duration Invariants, a decidable subclass of Duration Calculus formulas. Various algorithms have been developed to check timed automata or real-time automata against linear duration invariants, but each needs complicated preprocessing and exponential calculation. To the best of our knowledge, these algorithms have not been implemented. In this paper, we present an approximate model checking technique based on a genetic algorithm to check real-time automata against linear duration invariants in reasonable time. The genetic algorithm is a good optimization method when a problem needs massive computation, and it works particularly well in our case because the fitness function, which is derived from the linear duration invariant, is linear. ACM Computing Classification System (1998): D.2.4, C.3.
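The paper's fitness function is derived from the linear duration invariant itself; as a toy stand-in, the sketch below maximizes an arbitrary linear function over box-bounded variables with a simple genetic algorithm. The coefficients, bounds and GA parameters are assumptions for illustration, not the paper's encoding:

```python
import random

# Toy genetic algorithm maximizing a linear fitness function
# f(x) = c . x over box-bounded real variables. The linearity mirrors
# the fitness derived from a linear duration invariant; the specific
# coefficients, bounds and GA parameters are illustrative only.

C = [3.0, -2.0, 5.0]          # linear coefficients (hypothetical)
LO, HI = 0.0, 10.0            # variable bounds (hypothetical)

def fitness(x):
    return sum(c * xi for c, xi in zip(C, x))

def evolve(pop_size=40, generations=60, mut_rate=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(LO, HI) for _ in C] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # averaging crossover
            if rng.random() < mut_rate:                        # clamped mutation
                i = rng.randrange(len(child))
                child[i] = min(HI, max(LO, child[i] + rng.gauss(0, 1)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Because selection is elitist, the best fitness never decreases across generations; for an approximate model checker, a candidate exceeding the invariant's bound would constitute a counterexample.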
Resumo:
Photo-activated disinfection is beginning to be used in dental surgery to treat deep-seated bacterial infection. It works by combining a photosensitiser with light of a specific frequency to generate singlet oxygen, which is toxic to many types of bacteria. It is suggested that this technique could be used to help treat infection more generally. To do so, it needs to work with materials and geometries exhibiting different physical and optical characteristics to those of teeth. In these trials, samples of stainless steel and polymethylmethacrylate were exposed to bacterial solutions of Staphylococcus aureus and Staphylococcus epidermidis. These were treated with tolonium chloride-based photo-activated disinfection regimes, showing positive results with typically 4 log10 reductions in colony-forming units. Tests were also carried out using slotted samples to represent geometric features that might be found on implants. These tests also showed a disinfectant effect, though to a much lesser degree. © 2011 Inderscience Enterprises Ltd.
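For reference, a "4 log10 reduction" means the surviving colony-forming units fall to one ten-thousandth of the starting count. A small sketch of the arithmetic (the CFU counts are hypothetical, not the trial data):

```python
import math

# A "4 log10 reduction" means surviving colony-forming units (CFU) fall
# to 1/10,000 of the starting count. The counts below are hypothetical.

def log10_reduction(cfu_before, cfu_after):
    """log10(before/after): 4.0 corresponds to a 99.99% kill."""
    return math.log10(cfu_before / cfu_after)

def percent_killed(reduction):
    """Convert a log10 reduction into the percentage of bacteria killed."""
    return 100.0 * (1.0 - 10.0 ** -reduction)

r = log10_reduction(1_000_000, 100)   # 1e6 CFU reduced to 1e2 CFU
```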
Resumo:
Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed the Data Extractor data retrieval solution, which allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system; with its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval. It works as an intermediary between the applications and the sites. Data Extractor utilizes a twofold "custom wrapper" approach to information retrieval: wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are processed using Java-based wrappers that utilize a specially designed library of data retrieval, parsing and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user. This approach allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites: the approach provides accurate data retrieval, and power and flexibility in handling complex cases.
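The "custom wrapper" idea can be illustrated with a minimal per-site extraction rule that turns a page into structured rows a mediator could then query like any other data source. The page, pattern and class below are hypothetical stand-ins, not the Data Extractor scripting language or its Java library:

```python
import re

# Minimal illustration of a per-site "custom wrapper": a rule (here a
# regular expression) maps a page's HTML onto structured rows. The page
# and pattern are hypothetical, not the MSemODB/Data Extractor code.

class RegexWrapper:
    def __init__(self, row_pattern, fields):
        self.pattern = re.compile(row_pattern, re.DOTALL)
        self.fields = fields

    def extract(self, html):
        """Return one dict per matched row, keyed by field name."""
        return [dict(zip(self.fields, m)) for m in self.pattern.findall(html)]

page = """
<tr><td>Alice</td><td>alice@example.com</td></tr>
<tr><td>Bob</td><td>bob@example.com</td></tr>
"""

wrapper = RegexWrapper(r"<tr><td>(.*?)</td><td>(.*?)</td></tr>", ["name", "email"])
rows = wrapper.extract(page)
```

Once pages are reduced to rows like these, a heterogeneous DBMS can treat the site as just another relation in a distributed query plan.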
Resumo:
The field of experience and reflection in this dissertation is Pau e Lata, an artistic-pedagogical project, and its activities in music education. It was created in 1996 by the Sementes da Luz Community School, located in the Tabuleiro do Martins district of Maceió/AL. The work later extended to Rio Grande do Norte and subsequently returned to Alagoas, maintaining activities in both states and involving approximately 280 people. The questions that moved us in the face of the Pau e Lata experience were: What are the main references and theoretical-methodological elements that constitute the formation of the musician in Pau e Lata? How do members perceive the project and situate themselves in its educational process of musical formation? How does the use of instruments and the learning of musical reading and writing work, and what do they mean? These questions led us to undertake this dissertation in order to deepen reflection on the processes of musical training in Pau e Lata, relating the experiences of its members to the theoretical references that govern its educational practice. In this sense, we outline the research objectives: to describe the Pau e Lata project, focusing on its context of action and its methodological processes; and to investigate the relationship between the effective participation of its members in composing the artistic-pedagogical repertoire and its performance in the field of cultural militancy in the environment where it operates. The writing of this research is based on a phenomenological perspective. Our methodological path therefore comprises two communicating roads: 1) the organization and description of the historical records of Pau e Lata (supporting documents, certificates, posters, etc.) and the memories of the researcher and of other members of the group; 2) the formation of focus groups and the writing and online submission of testimonials by Pau e Lata participants on the themes of scrap instruments and onomatopoeia, respectively. Eleven members took part in this process, in addition to the researcher, aged between 21 and 45 years, all members of the Pau e Lata core at UFRN. The results of this research centre on three axes that describe and guide the work of Pau e Lata: collective work, the use of scrap as instrument, and onomatopoeia as the basis of a methodological process of musical training. The discussion is organized in three parts. The first is presented from a collection of references from Pau e Lata, comprising printed and videographic records. The second refers to the instruments used by Pau e Lata and the group members' perception of these instruments, which are integrated into the training of the musician. The third axis tells how the learning of musical reading and writing occurs and what it means, in two related aspects: the teaching-learning process, and the body as a musical element in that process, associated with other actions characterized as studies and theoretical deepening.
Resumo:
This dissertation offers an investigation of the role of visual strategies, art, and representation in reconciling Indian Residential School history in Canada. This research builds upon theories of biopolitics, settler colonialism, and race to examine the project of redress and reconciliation as nation- and identity-building strategies engaged in the ongoing structural invasion of settler colonialism. It considers the key policy moments and expressions of the federal government (from RCAP to the IRSSA and the subsequent apology) as well as the visual discourse of reconciliation as it works through archival photography, institutional branding, and commissioned works. These articulations are read alongside the creative and critical work of Indigenous artists and knowledge producers working within and outside of hegemonic structures on the topics of Indian Residential School history and redress. In particular, the works of Jeff Thomas, Adrian Stimson, Krista Belle Stewart, Christi Belcourt, Luke Marston, Peter Morin, and Carey Newman are discussed in this dissertation. These works must be understood in relationship to the normative discourse of reconciliation as a legitimizing mechanism of settler colonial hegemony. Beyond the binary of cooptation and autonomous resistance, these works demonstrate the complexity of representing Indigeneity: as an ongoing site of settler colonial encounter and simultaneously the forum for the willful refusal of contingency or containment.
Resumo:
Design embeds ideas in communication and artefacts in subtle and psychologically powerful ways. Sociologist Pierre Bourdieu coined the term 'symbolic violence' to describe how powerful ideologies, priorities, values and even sensibilities are constructed and reproduced through cultural institutions, processes and practices. Through symbolic violence, individuals learn to consider unjust conditions as natural and even come to value customs and ideas that are oppressive. Symbolic violence normalises structural violence and enables real violence to take place, often preceding it and later justifying it. Feminist, class, race and indigenous scholars and activists describe how oppressions (patriarchy, racism, colonialism, etc.) exist within institutions and structures, and also within cultural practices that embed ideologies into everyday life. The theory of symbolic violence sheds light on how design can function to naturalise oppressions and then obfuscate power relations around this process. Through symbolic violence, design can function as an enabler for the exploitation of certain groups of people and the environment they (and ultimately 'we') depend on to live. Design functions as symbolic violence when it is involved with the creation and reproduction of ideas, practices, tools and processes that result in structural and other types of violence (including ecocide). Breaking symbolic violence involves discovering how it works and building capacities to challenge and transform dysfunctional ideologies, structures and institutions. This conversation will give participants an opportunity to discuss, critique and/or develop the theory of design as symbolic violence as a basis for the development of design strategies for social justice.
Resumo:
The goal of this work is to present an efficient CAD-based adjoint process chain for calculating parametric sensitivities (derivatives of the objective function with respect to the CAD parameters) in timescales acceptable for industrial design processes. The idea is based on linking parametric design velocities (geometric sensitivities computed from the CAD model) with adjoint surface sensitivities. A CAD-based design velocity computation method has been implemented based on distances between discrete representations of perturbed geometries. This approach differs from other methods in that it works with existing commercial CAD packages (unlike most analytical approaches) and can cope with changes in CAD model topology and face labeling. Use of the proposed method allows computation of parametric sensitivities using adjoint data at a computational cost that scales with the number of objective functions being considered, while it is essentially independent of the number of design variables. The gradient computation is demonstrated on test cases for a Nozzle Guide Vane (NGV) model and a Turbine Rotor Blade model. The results are validated against finite difference values and good agreement is shown. This gradient information can be passed to an optimization algorithm, which will use it to update the CAD model parameters.
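The linking step described above, which combines design velocities with adjoint surface sensitivities, can be sketched as a discrete surface sum. The geometry, surface normals, node areas and adjoint values below are hypothetical illustrations of the structure of the computation, not the paper's implementation:

```python
# Sketch of the gradient assembly: the parametric derivative of an
# objective J is approximated as a surface sum of (adjoint sensitivity
# at a node) * (design velocity at that node) * (node area), where the
# design velocity is the normal movement of the surface per unit change
# of a CAD parameter, estimated from two discrete geometries.
# All numbers below are hypothetical.

def design_velocity(baseline, perturbed, normals, dp):
    """Normal displacement per unit parameter change at each surface node."""
    vel = []
    for b, p, n in zip(baseline, perturbed, normals):
        disp = [pi - bi for pi, bi in zip(p, b)]
        vel.append(sum(d * ni for d, ni in zip(disp, n)) / dp)
    return vel

def gradient(adjoint_sens, velocities, areas):
    """dJ/dp ~ sum_i (dJ/dn)_i * Vd_i * area_i."""
    return sum(s * v * a for s, v, a in zip(adjoint_sens, velocities, areas))

# Two nodes on a surface, moved outward along +x by 0.02 for dp = 0.1
baseline  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
perturbed = [(0.02, 0.0, 0.0), (1.02, 0.0, 0.0)]
normals   = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0)]

vel = design_velocity(baseline, perturbed, normals, dp=0.1)
dJdp = gradient([2.0, -1.0], vel, [0.5, 0.5])
```

Because the adjoint field is computed once per objective, this sum can be evaluated for many CAD parameters at almost no extra cost, which is the scaling property the abstract highlights.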
Resumo:
The need to steer economic development has always been great, and the balanced scorecard has been a popular management model since the mid-1990s, mainly in the private sector but also in the municipal sector. The balanced scorecard was introduced primarily to help organizations see more than the economic dimension; it was originally a measurement system, and today it works more as a strategic instrument. Our study is a case study evaluating a municipality and how it uses the balanced scorecard as a tool for strategic and value-adding work in municipal activities. In municipal operations it is important that the organization adapts the balanced scorecard to fit a politically driven organization, with mandates, committees and administrations. We used a qualitative method with a deductive approach, gathering information through a case study in which we interviewed seven people in leading positions. In our analysis and results we came to the conclusion that the municipality does not use the balanced scorecard correctly, and that as a tool for value creation and strategic planning it does not work in a favourable way. We see difficulties with the implementation of the balanced scorecard: had the municipality invested in implementing it at all levels of the business, it would have been able to use it more adequately. Since the municipality is a politically driven organization, it is important that the vision stays alive and changes with the conditions of the outside world and the municipality in general. Judged by its vivid vision, goals and business ideas, its balanced scorecard is in line with how a balanced scorecard should look. The municipality has a strategic plan for staff and employees at large. In the study we have seen that the strategic plan is not followed up in a way favourable to the business; the municipality chooses the easy way out for evaluation. Employee participation in changes and ongoing human-resources management appears nonexistent, although the vision has been to create empowered and motivated employees. In our conclusion we describe how we view the use of the balanced scorecard in municipal operations. We can also discern that a balanced scorecard is a good tool for value creation and strategic work if it is used properly. We conclude that the municipality we chose to study should not use the balanced scorecard, since it has not created the tools and platforms required for employees, civil servants and politicians to evaluate, monitor and keep a living scorecard changing over time. The study reveals major shortcomings in implementation, evaluation and follow-up, and consequently the balanced scorecard is not preferable in municipal operations as a strategic instrument for value creation and long-term planning.
Resumo:
Beginning with Montaigne’s essayistic dictum Que sais je? — ‘What do I know?’ — this PhD thesis examines the literary history, formal qualities, and theoretical underpinnings of the personal essay to both investigate and to practice its relevance as an approach to writing about art. The thesis proposes the essay as intrinsically linked to research, critical writing, and art making; it is a literary method that embodies the real experience of attempting to answer a question. The essay is a processual and reflexive mode of enquiry: a form that conveys not just the essayist’s thought, but the sense and texture of its movement as it attempts to understand its object. It is often invoked, across disciplines, in reference to the possibility of a more liberal sense of creative practice — one that conceptually and stylistically privileges collage, fragmentation, hybridity, chance, open-endedness, and the meander. Within this question of the essay as form, the thesis contains two distinct and parallel strands of analysis — subject matter and essay writing as research. At the core of the study lie two close-readings: Ana Mendieta’s Labyrinth of Venus (1982) and Le Couvent de la Tourette (1959) by Le Corbusier and Iannis Xenakis. In each case, the writing draws, in its tone and texture, on a range of literary influences, weaving together different voices, discussions, and approaches to enquiry. The practice of essay writing is presented alongside, part and party to, research: a method of interrogation that embraces risk and uncertainty, and simultaneously enacts its own findings as a critical-creative mode of study-via-form, and form-via-study. The thesis is presented as a book-length essay, in which the art in question is equal and intimately connected to the writing used to address it. Method and form are designed to respond to the oft-cited challenge of the essay as fundamentally unmethodical, ranging, and diverse. 
Research, critical study, writerly description, and storytelling are combined to elucidate and expose each other based not on surface continuity, but on a deep interconnection among ideas that, through language, cohere and become related — imbued with an affinity for one another. The consummate product is the argument, as it works across genres, disciplines, descriptive and critical models, to challenge the narrative structure and language used within contemporary writing about art.
Resumo:
Hyperspectral instruments have been incorporated in satellite missions, providing data of high spectral resolution of the Earth. These data can be used in remote sensing applications such as target detection, hazard prevention, and monitoring oil spills, among others. In most of these applications, one requirement of paramount importance is the ability to give real-time or near real-time responses. Recently, onboard processing systems have emerged to overcome the huge amount of data to transfer from the satellite to the ground station, thus avoiding delays between hyperspectral image acquisition and its interpretation. For this purpose, compact reconfigurable hardware modules such as field-programmable gate arrays (FPGAs) are widely used. This paper proposes a parallel FPGA-based architecture for endmember signature extraction. The method, based on the Vertex Component Analysis (VCA), has several advantages: it is unsupervised, fully automatic, and it works without a dimensionality reduction (DR) pre-processing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 SoC FPGA based on the Artix-7 FPGA programmable logic, and tested using real hyperspectral data sets collected by NASA's Airborne Visible/Infra-Red Imaging Spectrometer (AVIRIS) over the Cuprite mining district in Nevada. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems, opening new perspectives for onboard hyperspectral image processing.
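The projection step at the heart of VCA can be sketched in a few lines of software. The version below is a simplified illustration only: it omits the SNR-dependent preprocessing of full VCA, uses tiny synthetic 3-band data rather than a hyperspectral cube, and says nothing about the FPGA mapping:

```python
import random

# Simplified sketch of the core VCA iteration: repeatedly pick a
# direction orthogonal to the subspace spanned by the endmembers found
# so far, project every pixel spectrum onto it, and keep the pixel with
# the largest |projection| as the next endmember. Real VCA adds
# SNR-dependent preprocessing, omitted here; the data are synthetic.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def orthogonalize(w, basis):
    """Remove from w its components along an orthonormal basis."""
    for b in basis:
        c = dot(w, b)
        w = [wi - c * bi for wi, bi in zip(w, b)]
    return w

def normalize(v):
    n = dot(v, v) ** 0.5
    return [x / n for x in v]

def vca_like(pixels, n_endmembers, seed=0):
    rng = random.Random(seed)
    basis, indices = [], []
    bands = len(pixels[0])
    for _ in range(n_endmembers):
        w = orthogonalize([rng.gauss(0, 1) for _ in range(bands)], basis)
        idx = max(range(len(pixels)), key=lambda i: abs(dot(w, pixels[i])))
        indices.append(idx)
        basis.append(normalize(orthogonalize(list(pixels[idx]), basis)))
    return indices

# Synthetic scene: convex mixtures of 3 pure spectra, plus the pure
# pixels themselves appended at positions 100, 101 and 102.
rng = random.Random(1)
pure = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
mixed = []
for _ in range(100):
    a = [rng.random() for _ in range(3)]
    s = sum(a)
    a = [x / s for x in a]     # convex abundance weights
    mixed.append(tuple(sum(a[k] * pure[k][b] for k in range(3)) for b in range(3)))
pixels = mixed + pure
found = vca_like(pixels, 3)
```

Because a convex mixture's projection can never exceed the largest pure-pixel projection, the iteration recovers the appended pure pixels; the dot products and argmax it relies on are exactly the operations that parallelize well on FPGA fabric.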
Resumo:
The Outorga Onerosa do Direito de Construir (OODC, public concession of building rights), an instrument instituted by the City Statute in 2001, has as its main objective the recovery of urban property value, seeking a fair distribution of the benefits of urbanization. The possibility of using the OODC instrument is linked to the maximum utilization coefficient, set for specific areas according to existing infrastructure conditions and further taking into account the formal real estate market, expansion axes and densification. As an instrument that establishes values to be paid for a fuller use of land, it maintains a close relation to the real estate market, encouraging or discouraging densification in specific areas. The present study investigates the relationship between the criteria used to formulate the public concession of building rights instrument and the dynamics of the formal real estate market. It takes as its empirical universe Parnamirim (RN), part of the Natal Metropolitan Area, focusing on the application of the OODC in the period 2008-2010. It seeks to better understand the basis necessary for formulating the instrument, how it works, and its relation to the formal real estate market. It aims to depict the formal real estate market by presenting the production of urban space in Parnamirim in terms of the intensity and nature of real estate development, further identifying the properties licensed through the application of the municipal instrument. In conclusion, it discusses the criteria for the formation of the OODC, its relationship to the dynamics of the formal real estate market, and its potential influence on processes of land use and occupation in the context of urban planning.
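Charges of the OODC kind are typically proportional to the floor area built beyond a basic utilization coefficient, up to the area's maximum coefficient. The formula and every figure in the sketch below are assumptions for illustration, since the actual charging formula is set by each municipality's local law:

```python
# Illustrative arithmetic for a building-rights charge of the OODC kind.
# The formula and all figures below are assumptions for illustration;
# actual municipal formulas vary and are defined by local legislation.

def oodc_charge(lot_area_m2, basic_coeff, used_coeff, max_coeff, land_value_per_m2):
    """Charge for floor area built beyond the basic utilization coefficient."""
    if used_coeff > max_coeff:
        raise ValueError("exceeds the maximum utilization coefficient")
    extra_area = lot_area_m2 * max(0.0, used_coeff - basic_coeff)
    return extra_area * land_value_per_m2

# Hypothetical case: a 500 m2 lot with basic coefficient 1.0, building
# at coefficient 2.5 (maximum 3.0), with a reference land value of
# R$ 200 per m2 of additional floor area.
charge = oodc_charge(500, 1.0, 2.5, 3.0, 200.0)
```

The dependence of the charge on the gap between the used and basic coefficients is what lets the instrument encourage or discourage densification area by area.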