907 results for Building heating systems
Abstract:
With the steadily increasing demand for natural resources across the various sectors of society, it is urgent to find solutions that reduce their consumption without halting the demographic expansion felt in large urban centres. This ever-growing trend of global consumption has been countered through sustainability measures and greater efficiency in the use of these resources, which is only possible with advanced technological tools that set thresholds for what is considered efficient and reward, in financial and brand-image terms, the entities that reach them. LEED is a voluntary sustainability certification system for residential and commercial buildings that establishes metrics for comparing indicators of energy, water and material consumption across the whole building life cycle, and it has been gaining prominence worldwide. This dissertation aimed to compare the energy-consumption performance of a large services building under the LEED system with its performance under the national building energy certification system (SCE), drawing a parallel between the similarities and differences of the two systems, and to assess the effects of potential energy-efficiency measures on the merit ratings obtained in each system. The results of the simulation used to evaluate this performance were very satisfactory and were used by the company for the LEED certification of the building under study.
Abstract:
With the rise of smartphones, lifelogging devices (e.g. Google Glass) and the popularity of image sharing websites (e.g. Flickr), users are capturing and sharing every aspect of their life online, producing a wealth of visual content. Of these uploaded images, the majority are poorly annotated or exist in complete semantic isolation, making the process of building retrieval systems difficult as one must firstly understand the meaning of an image in order to retrieve it. To alleviate this problem, many image sharing websites offer manual annotation tools which allow the user to “tag” their photos; however, these techniques are laborious and as a result have been poorly adopted; Sigurbjörnsson and van Zwol (2008) showed that 64% of images uploaded to Flickr are annotated with < 4 tags. Due to this, an entire body of research has focused on the automatic annotation of images (Hanbury, 2008; Smeulders et al., 2000; Zhang et al., 2012a) where one attempts to bridge the semantic gap between an image’s appearance and meaning e.g. the objects present. Despite two decades of research the semantic gap still largely exists and as a result automatic annotation models often offer unsatisfactory performance for industrial implementation. Further, these techniques can only annotate what they see, thus ignoring the “bigger picture” surrounding an image (e.g. its location, the event, the people present etc). Much work has therefore focused on building photo tag recommendation (PTR) methods which aid the user in the annotation process by suggesting tags related to those already present. These works have mainly focused on computing relationships between tags based on historical images, e.g. that NY and timessquare co-exist in many images and are therefore highly correlated. However, tags are inherently noisy, sparse and ill-defined, often resulting in poor PTR accuracy, e.g. does NY refer to New York or New Year?
This thesis proposes the exploitation of an image’s context which, unlike textual evidences, is always present, in order to alleviate this ambiguity in the tag recommendation process. Specifically we exploit the “what, who, where, when and how” of the image capture process in order to complement textual evidences in various photo tag recommendation and retrieval scenarios. In part II, we combine text, content-based (e.g. # of faces present) and contextual (e.g. day-of-the-week taken) signals for tag recommendation purposes, achieving up to a 75% improvement to precision@5 in comparison to a text-only TF-IDF baseline. We then consider external knowledge sources (i.e. Wikipedia & Twitter) as an alternative to (slower moving) Flickr in order to build recommendation models on, showing that similar accuracy could be achieved on these faster moving, yet entirely textual, datasets. In part II, we also highlight the merits of diversifying tag recommendation lists before discussing at length various problems with existing automatic image annotation and photo tag recommendation evaluation collections. In part III, we propose three new image retrieval scenarios, namely “visual event summarisation”, “image popularity prediction” and “lifelog summarisation”. In the first scenario, we attempt to produce a rank of relevant and diverse images for various news events by (i) removing irrelevant images such as memes and visual duplicates, before (ii) semantically clustering images based on the tweets in which they were originally posted. Using this approach, we were able to achieve over 50% precision for images in the top 5 ranks. In the second retrieval scenario, we show that by combining contextual and content-based features from an image, we are able to predict whether it will become “popular” (or not) with 74% accuracy, using an SVM classifier.
Finally, in chapter 9 we employ blur detection and perceptual-hash clustering in order to remove noisy images from lifelogs, before combining visual and geo-temporal signals in order to capture a user’s “key moments” within their day. We believe that the results of this thesis show an important step towards building effective image retrieval models when sufficient textual content is lacking (i.e. a cold start).
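The co-occurrence idea that underlies tag recommendation (e.g. NY and timessquare appearing together in many historical images) can be sketched in a few lines. This is an illustrative toy, not the thesis's actual models; the upload history and tag names below are hypothetical.

```python
from collections import Counter
from itertools import combinations

def build_cooccurrence(photos):
    """Count how often each ordered pair of tags appears together in historical photos."""
    co = Counter()
    for tags in photos:
        for a, b in combinations(sorted(set(tags)), 2):
            co[(a, b)] += 1
            co[(b, a)] += 1
    return co

def recommend(existing_tags, co, k=5):
    """Score candidate tags by their total co-occurrence with the tags already present."""
    scores = Counter()
    for t in existing_tags:
        for (a, b), n in co.items():
            if a == t and b not in existing_tags:
                scores[b] += n
    return [tag for tag, _ in scores.most_common(k)]

# Hypothetical historical uploads
history = [
    ["ny", "timessquare", "night"],
    ["ny", "timessquare", "neon"],
    ["ny", "brooklyn"],
]
print(recommend(["ny"], build_cooccurrence(history)))  # 'timessquare' ranks first
```

Real PTR systems would normalise these counts (e.g. by tag frequency) and, as the thesis argues, complement them with contextual signals to disambiguate noisy tags.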
Abstract:
Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method was proposed, named dictionary learning electron tomography (DLET).
DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, the sparsity is applied on overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulation and real ET experiments on several morphologies are performed with a variety of setups. Reconstruction results validate its efficiency in both noiseless and noisy cases and show that it yields an improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select, or whether the images used strictly follow the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that can be introduced by specific sparsifying transforms (e.g. the staircase artifacts that may result when using Total Variation minimisation). Moreover, this thesis shows how reliable elementally sensitive tomography using EELS is possible with the aid of both the appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) from nanoparticles of an industrially important material.
Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
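The ground-truth quality assessment described above relies on quantitative fidelity metrics for comparing a reconstruction against a known phantom. As an illustration of one common choice (not necessarily the metric used in the thesis), here is peak signal-to-noise ratio applied to a toy 1-D phantom and two candidate reconstructions:

```python
import math

def psnr(ground_truth, reconstruction, peak=1.0):
    """Peak signal-to-noise ratio (dB) between a phantom and its reconstruction."""
    n = len(ground_truth)
    mse = sum((g - r) ** 2 for g, r in zip(ground_truth, reconstruction)) / n
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)

# Toy 1-D "phantom" and two hypothetical reconstructions of it
phantom = [0.0, 0.0, 1.0, 1.0, 0.0]
good    = [0.0, 0.1, 0.9, 1.0, 0.0]
bad     = [0.5, 0.5, 0.5, 0.5, 0.5]
print(psnr(phantom, good) > psnr(phantom, bad))  # the closer reconstruction scores higher
```

In practice, ET quality studies compute such metrics in 3D over the full tomogram, often alongside shape-based measures derived from the segmented nanoparticles.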
Abstract:
The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g. photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g. secure multiparty computation) and trusted hardware (e.g. secure processors) to instantiate a “secure” abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g. memory accesses, timing, and termination) in such a “secure” abstract machine may potentially leak highly sensitive information, including cryptographic keys that form the root of trust for the secure systems. This thesis broadly expands the investigation of a research direction called trace oblivious computation, where programming language techniques are employed to prevent side channel information leakage. We demonstrate the feasibility of trace oblivious computation by formalizing and building several systems, including GhostRider, a hardware-software co-design that provides a hardware-based trace oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while demonstrating performance better than prior systems by one to several orders of magnitude.
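A minimal illustration of the trace-oblivious idea (a toy, not GhostRider's or ObliVM's actual mechanism): reading an array element through a full linear scan, so the sequence of memory locations touched is the same whatever the secret index is.

```python
def oblivious_read(array, secret_index):
    """Read array[secret_index] while touching every element, so the memory
    access trace is independent of the secret index. The value is kept via a
    branch-free arithmetic select rather than a data-dependent branch."""
    result = 0
    for i, value in enumerate(array):
        mask = int(i == secret_index)          # 1 only at the secret index
        result = result * (1 - mask) + value * mask
    return result

data = [10, 20, 30, 40]
print(oblivious_read(data, 2))  # 30
```

The cost is a full scan per access; the systems in the thesis exist precisely to get oblivious guarantees at far lower overhead (e.g. via ORAM-style constructions and compiler support).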
Abstract:
A large percentage of Vanier College's technology students do not attain their College degrees within the scheduled three years of their program. A closer investigation of the problem revealed that in many of these cases the students had completed all of their program professional courses but had not completed all of the required English and/or Humanities courses. Fortunately, most of these students do extend their stay at the College for the one or more semesters required for graduation, although some choose to go into the workforce without returning to complete the missing English and/or Humanities courses and without their College degrees. The purpose of this research was to discover whether there was any significant measure of association between a student's family linguistic background, family cultural background, high school average, and/or College English Placement Test results and his or her likelihood of succeeding in his or her English and/or Humanities courses within the scheduled three years of the program. Because of demographic differences between the 'hard' and 'soft' technologies (in student population, specifically gender ratios and average student ages in specific programs) and program differences (in writing requirements and the types of practical skill activities required), the research was limited, in order to have a more uniform sample, to the hard technologies, where students work hands-on with hardware and/or computers and tend to have overall low research and writing requirements. Based on a review of current literature and observations made in one of the hard technology programs at Vanier College, eight research questions were developed.
These questions were designed to examine different aspects of success in the English and Humanities courses, such as failure and completion rates and the number of courses remaining after the end of the fifth semester, as well as to examine how the students assessed their ability to communicate in English. The eight research questions were broken down into a total of 54 hypotheses. The high number of hypotheses was required to address a total of seven independent variables: primary home language, high school language of instruction, student's place of birth (Canada, not-Canada), student's parents' place of birth (both-born-in-Canada, not-both-born-in-Canada), high school averages and English placement level (as a result of the College English Entry Test); and eleven dependent variables: number of English courses completed, number of English courses failed, whether all English courses were completed by the end of the 5th semester (yes, no), number of Humanities courses completed, number of Humanities courses failed, whether all the Humanities courses were completed by the end of the 5th semester (yes, no), the total number of English and Humanities courses left, and the students' assessments of their ability to speak, read and write in English. The data required to address the hypotheses were collected from two sources: the students themselves and the College. Fifth and sixth semester students from the Building Engineering Systems, Computer and Digital Systems, Computer Science and Industrial Electronics Technology programs were surveyed to collect personal information, including family cultural and linguistic history and current language usage, high school language of instruction, perceived fluency in speaking, reading and writing English, and perceived difficulty in completing English and Humanities courses.
The College was able to provide current academic information on each of the students, including copies of college program planners and transcripts, and high school transcripts for students who attended a high school in Quebec. Quantitative analyses were done on the data using the SPSS statistical analysis program. Of the fifty-four hypotheses analysed, in fourteen cases the results supported the research hypotheses; in the other forty cases the null hypotheses had to be accepted. One of the findings was a strong significant association between a student's primary home language and place of birth and his or her perception of his or her ability to communicate in English (speak, read, and write), signifying that both students whose primary home language was not English and students who were not born in Canada considered themselves, on average, to be weaker in these skills than did students whose primary home language was English. Although this finding was noteworthy, the two most significant findings were the association between a student's English entry placement level and the number of English courses failed, and the association between the parents' place of birth and the student's likelihood of succeeding in both his or her English and Humanities courses. According to the research results, the mean number of English courses failed by students placed in the lowest entry level of College English was significantly different from that of students placed in any of the other entry-level English courses. In this sample, students who were placed in the lowest entry level of College English failed, on average, at least three times as many English courses as those placed in any of the other English entry-level courses. These results are significant enough that they will be brought to the attention of the appropriate College administration.
The results of this research also appeared to indicate that the most significant determining factor in a student's likelihood of completing his or her English and Humanities courses is his or her parents' place of birth (both-born-in-Canada or not-both-born-in-Canada). Students who had at least one parent who was not born in Canada, would, on average, fail a significantly higher number of English courses, be significantly more likely to still have at least one English course left to complete by the end of the 5th semester, fail a significantly higher number of Humanities courses, be significantly more likely to still have at least one Humanities course to complete by the end of the 5th semester and have significantly more combined English and Humanities courses to complete at the end of their 5th semester than students with both parents born in Canada. This strong association between students' parents' place of birth and their likelihood of succeeding in their English and Humanities courses within the three years of their program appears to indicate that acculturation may be a more significant factor than either language or high school averages, for which no significant association was found for any of the English and Humanities related dependent variables. Although the sample size for this research was only 60 students and more research needs to be conducted in this area, to see if these results are supported with other groups within the College, these results are still significant. If the College can identify, at admission, the students who will be more likely to have difficulty in completing their English and Humanities courses, the College will now have the opportunity to intercede during or before the first semester, and offer these students the support they require in order to increase their chances of success in their education, whether it be classes or courses designed to meet their specific needs, special mentoring, tutoring or other forms of support. 
With the necessary support, the identified students will have a greater opportunity of successfully completing their programs within the scheduled three years, while at the same time the College will have improved its capacity to meet the needs of its students.
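The measures of association analysed with SPSS in this study can be illustrated with a Pearson chi-square statistic computed on a 2×2 contingency table. The counts below are hypothetical placeholders, not the study's data:

```python
def chi_square(table):
    """Pearson chi-square statistic for a 2-D contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: parents' birthplace vs. completing English/Humanities on time
#            completed  not completed
table = [[20, 5],      # both parents born in Canada
         [12, 23]]     # not both born in Canada
print(round(chi_square(table), 2))  # 12.24 for these counts
```

The statistic is then compared against the chi-square distribution with (rows−1)×(cols−1) degrees of freedom to decide whether to reject the null hypothesis of no association.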
Abstract:
Besides sustaining a healthy and comfortable indoor climate, the air conditioning system should also aim for energy efficiency. The target indoor climate can be obtained with different systems; this study focuses on comparing the energy efficiency of different air conditioning room unit systems in different climates. The calculations are made with the dynamic energy simulation software IDA ICE by comparing the indoor climate and energy consumption of an office building with different systems in different climates. The aim of the study is to compare the energy efficiency of chilled beam systems to other common systems: variable air volume, fan coil and radiant ceiling systems. Besides the annual energy consumption, the sustainability of the target indoor climate is also compared between the simulations. Another aim is to provide conclusions to be used in the product development of the chilled beam systems’ energy efficiency. The adaptable chilled beam system and the radiant ceiling system prove to be energy efficient independent of the climate. The challenge for a reliable comparison is that the other systems are not able to reach the target indoor climate as well. The complexity of the simulation software’s calculation environment, the assumptions made and the exclusion of financial aspects complicate comparing the big picture. The results show that the development of the chilled beam systems should concentrate on energy-efficient night heating, flexible demand-based ventilation and capacity control, and possibilities for integrating the best practices with other systems.
Abstract:
Occupants’ behaviour in improving the indoor environment plays a significant role in saving energy in buildings. Therefore, the key step to reducing energy consumption and carbon emissions from buildings is to understand how occupants interact with the environment they are exposed to in terms of achieving thermal comfort and well-being, though such interaction is complex. This paper presents a dynamic process of occupant behaviours involving technological, personal and psychological adaptations in response to varied thermal conditions, based on data covering four seasons gathered from a field study in Chongqing, China. It demonstrates that occupants are active players in environmental control and that their adaptive responses are driven strongly by ambient thermal stimuli and vary from season to season and from time to time, even on the same day. Positive, dynamic behavioural adaptation will help save the energy used in heating and cooling buildings. However, when environmental parameters cannot fully satisfy occupants’ requirements, negative behaviours can conflict with energy saving. The survey revealed that about 23% of windows are partly open for fresh air while air-conditioners are in operation in summer. This paper addresses the issue of how building and environmental systems should be designed, operated and managed in a way that meets the requirements of energy efficiency without compromising wellbeing and productivity.
Abstract:
"NBSIR 76-1562."
Abstract:
Distributed control systems consist of sensors, actuators and controllers, interconnected by communication networks, and are characterized by a high number of concurrent processes. This work presents a proposal for a procedure to model and analyse communication networks for distributed control systems in intelligent buildings. The approach is based on characterizing the control system as a discrete event system and applying coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of intelligent buildings, taking into account the relationships between the various building systems. This procedure provides a structured development of models, facilitating the process of specifying the control algorithm. An application example is presented to illustrate the main features of the approach.
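A coloured Petri net adds typed token values, but the underlying place/transition mechanics that such models build on can be sketched minimally. The lighting-subsystem example is hypothetical and is not the paper's CPN model:

```python
class PetriNet:
    """Minimal place/transition net: transitions consume and produce tokens."""
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical lighting subsystem: a presence event switches the light on
net = PetriNet({"presence_event": 1, "light_off": 1})
net.add_transition("switch_on",
                   inputs={"presence_event": 1, "light_off": 1},
                   outputs={"light_on": 1})
net.fire("switch_on")
print(net.marking["light_on"])  # 1
```

Formal analysis then amounts to exploring the reachable markings of such a net to verify properties like absence of deadlock in the control solution.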
Abstract:
The most widely used refrigeration system is the vapor-compression system. In this cycle, the compressor is the most complex and expensive component, especially the reciprocating semihermetic type, which is often used in food product conservation. This component is very sensitive to variations in its operating conditions. If these conditions reach unacceptable levels, failures are practically inevitable. Therefore, maintenance actions should be taken in order to maintain good performance of such compressors and to avoid undesirable stops of the system. To achieve such a goal, one has to evaluate the reliability of the system and/or its components. In this context, reliability means the probability that a piece of equipment can perform its required functions for an established time period, under defined operating conditions. One of the tools used to improve component reliability is failure mode and effect analysis (FMEA). This paper proposes that the FMEA methodology be used as a tool to evaluate the main failures found in semihermetic reciprocating compressors used in refrigeration systems. Based on the results, some suggestions for maintenance are addressed.
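FMEA commonly ranks failure modes by a risk priority number, RPN = severity × occurrence × detection, each rated on a 1–10 scale. A minimal sketch with hypothetical ratings for compressor failure modes (illustrative only, not the paper's actual analysis):

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA metric: RPN = severity x occurrence x detection (each 1-10)."""
    return severity * occurrence * detection

# Hypothetical failure modes for a semihermetic reciprocating compressor
failure_modes = [
    ("broken suction valve",  7, 5, 4),
    ("bearing wear",          6, 4, 6),
    ("motor winding burnout", 9, 3, 3),
]
ranked = sorted(failure_modes,
                key=lambda m: risk_priority_number(*m[1:]),
                reverse=True)
for name, s, o, d in ranked:
    print(name, risk_priority_number(s, o, d))
```

Maintenance effort is then prioritised towards the modes with the highest RPN, which is how FMEA results translate into the maintenance suggestions the paper derives.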
Abstract:
Buildings are being constructed with a growing number of automation and control systems that are not integrated with one another. This lack of integration results in technological chaos, which creates difficulties in the three phases of a building's life: design, implementation and operation. The development of a Building Automation System (BAS) aims to ensure comfort, safety and energy savings. In large buildings, energy can represent a significant share of the annual energy bill. An integrated BAS should contribute to a significant reduction in the costs of developing, installing and managing the building, which can also contribute to reducing CO2 emissions. The goal of the proposed architecture is to contribute to an integration strategy that enables the integrated management of the building's various subsystems (e.g. heating, ventilation and air conditioning (HVAC), lighting, security, etc.). To achieve this integrated control it is necessary to establish a cooperation strategy among the subsystems involved. One of the challenges in developing a BAS with these characteristics is establishing interoperability between the subsystems as a primary goal, since these subsystems are typically supplied under a multi-vendor philosophy and developed with heterogeneous technologies. Accordingly, this work consisted of developing a platform named the Building Intelligence Open System (BIOS).
The platform was implemented as a Service Oriented Architecture (SOA) built from four fundamental elements: a cooperative bus, called BIOSbus, implemented with Jini and JavaSpaces, to which all services connect and which provides a discovery mechanism and a mechanism that notifies interested entities about state changes in a given component; communication services that abstract the hardware used to automate the building's various functions; subsystem-abstraction services for accessing the bus; and clients, which may include a graphical interface for integrated building management, a coordination client that provides interoperability between subsystems, and energy-management services that enable the activation of rational electric-energy management algorithms.
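The BIOSbus notification mechanism is built on Jini and JavaSpaces; its tuple-space flavour, where services write state entries and interested clients are notified of matching writes, can be sketched in Python. This is a hypothetical simplification, not the BIOS implementation:

```python
import threading

class TupleSpaceBus:
    """JavaSpaces-style bus sketch: services write state entries; interested
    clients register callbacks and are notified on matching writes."""
    def __init__(self):
        self.entries = []
        self.listeners = []          # (predicate, callback) pairs
        self.lock = threading.Lock()

    def write(self, entry):
        with self.lock:
            self.entries.append(entry)
            listeners = list(self.listeners)
        for predicate, callback in listeners:  # notify outside the lock
            if predicate(entry):
                callback(entry)

    def notify(self, predicate, callback):
        with self.lock:
            self.listeners.append((predicate, callback))

    def take(self, predicate):
        """Remove and return the first matching entry, or None."""
        with self.lock:
            for i, entry in enumerate(self.entries):
                if predicate(entry):
                    return self.entries.pop(i)
        return None

# Hypothetical HVAC coordination client reacting to a lighting state change
bus = TupleSpaceBus()
events = []
bus.notify(lambda e: e.get("subsystem") == "lighting",
           lambda e: events.append(e["state"]))
bus.write({"subsystem": "lighting", "state": "on"})
print(events)  # ['on']
```

Decoupling publishers from subscribers in this way is what lets heterogeneous, multi-vendor subsystems interoperate without knowing about each other directly.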
Abstract:
The recent trends in chip architectures towards higher numbers of heterogeneous cores and non-uniform memory/non-coherent caches bring renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
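The multi-version idea, read-only transactions reading a recent consistent snapshot without blocking or aborting writers, can be sketched with a bounded buffer of committed versions. This is an illustrative simplification, not the paper's algorithm; the version bound K is exactly what the paper shows how to calculate per task set:

```python
from collections import deque

class VersionedObject:
    """Keeps the last K committed versions so read-only transactions can read
    a recent consistent snapshot wait-free while updaters keep committing."""
    def __init__(self, initial, k):
        self.versions = deque([(0, initial)], maxlen=k)  # (commit_ts, value)

    def commit(self, ts, value):
        """An update transaction installs a new version; the oldest is evicted."""
        self.versions.append((ts, value))

    def read(self, snapshot_ts):
        """Return the newest version not newer than the reader's snapshot."""
        candidates = [(t, v) for t, v in self.versions if t <= snapshot_ts]
        if not candidates:
            raise RuntimeError("snapshot too old: version already overwritten")
        return max(candidates)[1]

obj = VersionedObject(initial="v0", k=3)
obj.commit(1, "v1")
obj.commit(2, "v2")
print(obj.read(snapshot_ts=1))  # 'v1'
```

If K is sized correctly for the task set's periods and transaction lengths, the "snapshot too old" case never occurs, which is what makes the read-only path wait-free.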
Abstract:
Most of today’s embedded systems are required to work in dynamic environments, where the characteristics of the computational load cannot always be predicted in advance. Furthermore, resource needs are usually data dependent and vary over time. Resource-constrained devices may need to cooperate with neighbour nodes in order to fulfil those requirements and handle stringent non-functional constraints. This paper describes a framework that facilitates the distribution of resource-intensive services across a community of nodes, forming temporary coalitions for a cooperative QoS-aware execution. The increasing need to tailor the provided service to each application’s specific needs determines the dynamic selection of peers to form such a coalition. The system is able to react to load variations, degrading its performance in a controlled fashion if needed. Isolation between different services is achieved by guaranteeing a minimal service quality to accepted services and by an efficient overload control that considers the challenges and opportunities of dynamic distributed embedded systems.
Abstract:
The considerable amount of energy consumed on Earth is a major obstacle to achieving sustainable development. Buildings are responsible for the highest share of worldwide energy consumption, nearly 40%. Strong efforts have been made to reduce buildings' operational energy (heating, hot water, ventilation, electricity), since operational energy is so far the largest energy component in a building's life cycle. However, as operational energy is reduced, embodied energy becomes increasingly significant. One of the building elements responsible for higher embodied energy consumption is the building's structural system. Therefore, the present work studies part of the embodied energy (the initial embodied energy) in building structures using a life cycle assessment methodology, in order to contribute to a greater understanding of embodied energy in building structural systems. The initial embodied energy is estimated for a building structure by varying the span and the structural material type. The results are analysed and compared for different stages, and some conclusions are drawn. At the end of this work it was possible to conclude that the building span does not have a considerable influence on the embodied energy consumption of building structures. However, the structural material type does influence the overall energy performance. In fact, with this research it was possible to conclude that the building structure requiring the most initial embodied energy is the steel structure, followed by the glued laminated timber structure, and finally the concrete structure.
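Initial embodied energy estimates of the kind described above boil down to summing material masses weighted by embodied-energy coefficients taken from an LCA database. The coefficients and masses below are hypothetical placeholders, not the study's values:

```python
# Hypothetical embodied-energy coefficients (MJ per kg); real coefficients
# come from LCA databases and vary considerably by source and region.
coefficients = {"steel": 20.1, "glulam": 12.0, "concrete": 1.1}

def initial_embodied_energy(bill_of_materials):
    """Sum mass x coefficient over every material in the structure (MJ)."""
    return sum(mass * coefficients[m] for m, mass in bill_of_materials.items())

# Toy structural alternatives for the same span (masses in kg, hypothetical)
steel_frame    = {"steel": 15_000}
concrete_frame = {"concrete": 180_000, "steel": 3_000}  # concrete plus rebar
print(initial_embodied_energy(steel_frame) > initial_embodied_energy(concrete_frame))
```

The trade-off the calculation captures is that low-coefficient materials like concrete are used in much larger masses, so the ranking depends on both factors rather than the coefficient alone.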