Abstract:
Buildings are being constructed with a growing number of automation and control systems that are not integrated with one another. This lack of integration results in technological chaos, which creates difficulties in the three phases of a building's life: the design phase, the implementation phase and the operation phase. The development of a Building Automation System (BAS) aims to ensure comfort, safety and energy-saving conditions. In large buildings, energy can represent a significant share of the annual energy bill. An integrated BAS should contribute to a significant reduction in the building's development, installation and management costs, which can also contribute to reducing CO2 emissions. The goal of the proposed architecture is to contribute to an integration strategy that allows the integrated management of the building's various subsystems (e.g. heating, ventilation and air conditioning (HVAC), lighting, security, etc.). To achieve this integrated control, a cooperation strategy between the subsystems involved must be established. One of the challenges in developing a BAS with these characteristics is to establish interoperability between the subsystems as one of the main objectives, given that these subsystems are normally supplied under a multi-vendor philosophy and developed using heterogeneous technologies. This work therefore consisted of developing a platform named Building Intelligence Open System (BIOS). The implementation of this platform adopts a Service Oriented Architecture (SOA) composed of four fundamental elements: a cooperative bus, called BIOSbus, implemented using Jini and JavaSpaces, to which all services are connected, providing a discovery mechanism and a mechanism that notifies interested entities of changes in the state of a given component; communication services that abstract the hardware used to automate the building's various functionalities; subsystem abstraction services for accessing the bus; and clients, which may be, in particular, a graphical interface for integrated management of the building, a coordination client that provides interoperability between subsystems, and energy management services that enable the activation of rational electrical energy management algorithms.
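As a rough illustration of the publish/notify role that the BIOSbus plays in this architecture, the Java sketch below shows a cooperative bus to which subsystem services publish state changes and on which interested clients register their interest. All class and method names (BiosBus, StateChange, SubsystemListener) are hypothetical illustrations of the pattern, not the actual Jini/JavaSpaces-based implementation described in the work, where discovery and notification are provided by Jini and JavaSpaces rather than by an in-process map.

```java
// Minimal sketch of the publish/notify pattern attributed to BIOSbus.
// Names are illustrative assumptions, not the platform's real API.
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

interface SubsystemListener {
    void onStateChange(StateChange change);
}

// A state-change event written to the bus by a communication service
// (e.g. the HVAC or lighting driver) and delivered to interested clients.
record StateChange(String subsystem, String component, String newState) {}

class BiosBus {
    private final Map<String, List<SubsystemListener>> listeners = new ConcurrentHashMap<>();

    // Notification registration: a client declares interest in a subsystem.
    void subscribe(String subsystem, SubsystemListener l) {
        listeners.computeIfAbsent(subsystem, k -> new CopyOnWriteArrayList<>()).add(l);
    }

    // A communication service publishes a change; the bus notifies interested entities.
    void publish(StateChange change) {
        listeners.getOrDefault(change.subsystem(), List.of())
                 .forEach(l -> l.onStateChange(change));
    }
}

public class BiosBusDemo {
    public static void main(String[] args) {
        BiosBus bus = new BiosBus();
        // A coordination client reacting to HVAC changes, e.g. to adjust lighting.
        bus.subscribe("HVAC", c ->
            System.out.println("Coordination client saw " + c.component() + " -> " + c.newState()));
        bus.publish(new StateChange("HVAC", "zone-3-setpoint", "21C"));
    }
}
```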
Abstract:
Lifelong learning (LLL) has received increasing attention in recent years. It implies that learning should take place at all stages of the “life cycle and it should be life-wide, that is embedded in all life contexts from the school to the work place, the home and the community” (Green, 2002, p. 613). The ‘learning society’ is the vision of a society in which there are recognized opportunities for learning for every person, wherever they are and however old they happen to be. Globalization and the rise of new information technologies are some of the driving forces that cause the depreciation of specialised competences. This happens very quickly in terms of economic value; consequently, workers of all skill levels must have the opportunity, during their working life, to update “their technical skills and enhance general skills to keep pace with continuous technological change and new job requirements” (Fahr, 2005, p. 75). It is in this context that LLL tops the policy agenda of international bodies, national governments and non-governmental organizations in the field of education and training, justifying the need for LLL opportunities for the population as they face contemporary employability challenges. It is also in this context that the requirement and interest to analyse the behaviour patterns of adult learners has developed over the last few years.
Abstract:
One of the main arguments in favour of the adoption of, and convergence with, the international accounting standards published by the IASB (i.e. the IAS/IFRS) is that they will allow comparability of financial reporting across countries. However, because these standards use verbal probability expressions (e.g. “probable”) when establishing the recognition and disclosure criteria for accounting elements, they require professional accountants to interpret and classify the probability of an outcome or event in the light of those terms and expressions and to decide accordingly in terms of financial reporting. This paper reports part of a study we carried out on the interpretation of “in context” verbal probability expressions used in the IAS/IFRS by the auditors registered with the Portuguese Securities Market Commission, the Comissão do Mercado de Valores Mobiliários (CMVM). Our results provide support for the hypothesis that culture affects the CMVM-registered auditors’ interpretation of verbal probability expressions through its influence on the accounting value (or attitude) of conservatism. Our results also suggest that there are significant differences in their interpretation of the term “probable”, which is consistent with the literature in general. Since “probable” is the most frequent verbal probability expression used in the IAS/IFRS, this may have a negative impact on the comparability of financial statements.
Abstract:
In the work of Paul Auster (Newark, 1947 - ), we find two main themes: the sense of loss and existential drift, and the loneliness of the individual fully committed to the work of writing, as if he had been confined to the book that commands his life. However, this second theme is clearly the dominant one, since the character's space of solitude may include his own wandering: this wandering is often performed inside the four walls of a room, just as it is narrated inside the space of the page and the book. In his poetry, essays and fiction alike, Auster seems to approach the work of writing as an actual physical effort of effective construction, as if the words aligned in the poem-text were stones to be placed in a row when building a wall or some other stone structure.
Abstract:
The paper focuses on the importance of Darwin’s work in shaping Henri Bergson’s philosophy, bearing in mind that the two authors first intersected symbolically in 1859, when On the Origin of Species was published and Bergson was born. Bergson studied the biological sciences of his time, whose results he integrated into a metaphysical thought. He belonged to spiritualistic positivism, a philosophy that starts from the positive data of the sciences and finds the ultimate explanation of reality in a spiritual principle. He was interested in the positive evolution of the natural world and in the works of naturalists such as Lamarck, De Vries or Eimer. Darwin was among these authors, being responsible for a vision of evolution that extended from the scientific level to other domains. Bergson defends the “insufficiency of pure Darwinism” by pointing out the need to complement scientific evolution with an internal, metaphysical reading of the real, which he considered to be “true evolutionism”. This criticism is the most visible aspect of the relations between the two works. However, an attentive look shows that Darwin’s influence goes beyond the divergence of positions concerning the scope of “evolution”. The French philosopher knew not only the 1859 bestseller, but also Darwin’s studies on ethology, entomology and botany, which contributed to the naturalist’s impact gaining fundamental importance in Bergson’s philosophical perspective.
Abstract:
This paper presents a contrastive approach to three different ways of building concepts, after showing the similar syntactic possibilities that coexist in terms. From the semantic point of view, however, we can see that each language family has a different distribution of meaning. The most important point we try to show is that the differences found in the psychological process of communicating concepts should guide the translator and the terminologist in target-text production and in the terminology planning process. Differences between languages in the information transmission process are due to the different roles played by different types of knowledge. We distinguish here, among others, analytic-descriptive knowledge and analogical knowledge. We also state that neither of them is the best when determining the correctness of a term; rather, adequacy criteria must guide the selection process. The success of this concept building or term building is important when looking at the linguistic map of the information society.
Abstract:
For industrial environments, it is clear that Ethernet technologies are here to stay. In fact, a number of characteristics are boosting the eagerness to extend Ethernet to also cover factory-floor applications. Full-duplex links, non-blocking and priority-based switching, and bandwidth availability, just to mention a few, are characteristics upon which that eagerness is building up. But will Ethernet technologies really manage to replace traditional fieldbus networks? Fieldbus fundamentalists often argue that the two things are not comparable. In fact, Ethernet technology, by itself, does not include features above the lower layers of the OSI communication model. Where are the higher layers and the application enablers that permit building real industrial applications? And, taking for granted that they are available, what is the impact of those protocols, mechanisms and application models on the overall performance of Ethernet-based distributed factory-floor applications?
Abstract:
Managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). The physical parameters of the data center (such as power, temperature, pressure and humidity) are tightly coupled with computations, even more so in upcoming data centers, where the location of workloads can vary substantially, for example because workloads are moved within a cloud infrastructure hosted in the data center. In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for enabling more accurate heat-flow models of the data center and, with them, opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also enables minimizing local hotspots, performing more accurate predictive maintenance (pending failures in cooling and other infrastructure equipment can be detected more promptly) and producing more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
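To make the kind of data this architecture collects more concrete, the following Java sketch shows a plausible shape for a sensor reading message and a collector that aggregates readings per rack. The message fields and the Collector class are illustrative assumptions, not the authors' actual message schema or distribution infrastructure.

```java
// Hedged sketch of a sensor message and a simple in-process aggregation step.
import java.util.ArrayList;
import java.util.List;

// One physical measurement: which rack/sensor, what quantity, when, and the value.
record SensorReading(String rackId, String sensorId, String quantity,
                     long timestampMillis, double value) {}

class Collector {
    private final List<SensorReading> buffer = new ArrayList<>();

    // High temporal resolution: readings are appended as they arrive.
    void ingest(SensorReading r) { buffer.add(r); }

    // Spatial aggregation: average of a quantity over all sensors of a rack,
    // the kind of value a heat-flow model or hotspot detector would consume.
    double rackAverage(String rackId, String quantity) {
        return buffer.stream()
                .filter(r -> r.rackId().equals(rackId) && r.quantity().equals(quantity))
                .mapToDouble(SensorReading::value)
                .average().orElse(Double.NaN);
    }
}

public class DataCenterDemo {
    public static void main(String[] args) {
        Collector c = new Collector();
        c.ingest(new SensorReading("rack-12", "t-inlet-1", "temperature", 0L, 24.1));
        c.ingest(new SensorReading("rack-12", "t-inlet-2", "temperature", 5L, 25.3));
        System.out.println("rack-12 avg inlet temperature: "
                + c.rackAverage("rack-12", "temperature") + " C");
    }
}
```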
Abstract:
The recent trends in chip architectures, with higher numbers of heterogeneous cores and non-uniform memory/non-coherent caches, bring renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a given set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
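A minimal sketch of the multi-versioning idea the abstract relies on is given below: update transactions install new committed versions of a shared object, while read-only transactions read a recent snapshot without locking or retrying. The fixed version count stands in for the per-task bound the paper derives; the class and method names are illustrative, not the authors' API.

```java
// Multi-versioned shared object: wait-free reads of recent committed snapshots.
import java.util.concurrent.atomic.AtomicReferenceArray;

class VersionedObject<T> {
    private final AtomicReferenceArray<T> versions;
    private volatile int latest = 0;   // index of the newest committed version
    private final int n;

    VersionedObject(int n, T initial) {
        this.n = n;
        this.versions = new AtomicReferenceArray<>(n);
        versions.set(0, initial);
    }

    // Called by an update transaction at commit time: write the new value into
    // the next slot, then publish it by advancing 'latest'. Older slots stay
    // readable until overwritten, which is why enough versions must be kept.
    synchronized void commitUpdate(T newValue) {
        int next = (latest + 1) % n;
        versions.set(next, newValue);
        latest = next;
    }

    // Wait-free read for read-only transactions: one volatile read of the index
    // followed by a read of that slot, with no locks and no aborts.
    T readSnapshot() {
        return versions.get(latest);
    }
}

public class VersionedDemo {
    public static void main(String[] args) {
        VersionedObject<Integer> temperature = new VersionedObject<>(3, 20);
        temperature.commitUpdate(21);                   // update transaction commits
        System.out.println(temperature.readSnapshot()); // wait-free reader sees 21
    }
}
```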
Abstract:
Most of today’s embedded systems are required to work in dynamic environments, where the characteristics of the computational load cannot always be predicted in advance. Furthermore, resource needs are usually data dependent and vary over time. Resource-constrained devices may need to cooperate with neighbour nodes in order to fulfil those requirements and handle stringent non-functional constraints. This paper describes a framework that facilitates the distribution of resource-intensive services across a community of nodes, forming temporary coalitions for cooperative QoS-aware execution. The increasing need to tailor the provided service to each application’s specific needs determines the dynamic selection of peers to form such a coalition. The system is able to react to load variations, degrading its performance in a controlled fashion if needed. Isolation between different services is achieved by guaranteeing a minimal service quality to accepted services and by an efficient overload control that considers the challenges and opportunities of dynamic distributed embedded systems.
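The Java sketch below illustrates the admission decision behind such an isolation guarantee: a node accepts a service only if the minimal quality of every already-accepted service can still be sustained, and spare capacity above the minima is shared so that quality degrades gracefully under load. The class names and the single scalar "load" metric are simplifying assumptions, not the framework's actual interfaces.

```java
// Illustrative admission control and controlled degradation on one node.
import java.util.ArrayList;
import java.util.List;

record ServiceRequest(String name, double minLoad, double desiredLoad) {}

class Node {
    private final double capacity;
    private final List<ServiceRequest> accepted = new ArrayList<>();

    Node(double capacity) { this.capacity = capacity; }

    // Admission test: the sum of minimal requirements must fit the capacity.
    boolean tryAccept(ServiceRequest r) {
        double committed = accepted.stream().mapToDouble(ServiceRequest::minLoad).sum();
        if (committed + r.minLoad() > capacity) {
            return false;              // overload: candidate for a coalition peer
        }
        accepted.add(r);
        return true;
    }

    // Controlled degradation: spare capacity above the minima is shared evenly,
    // so no accepted service ever drops below its guaranteed minimum.
    double grantedLoad(ServiceRequest r) {
        double committed = accepted.stream().mapToDouble(ServiceRequest::minLoad).sum();
        double spare = Math.max(0, capacity - committed);
        double extra = accepted.isEmpty() ? 0 : spare / accepted.size();
        return Math.min(r.desiredLoad(), r.minLoad() + extra);
    }
}

public class CoalitionDemo {
    public static void main(String[] args) {
        Node node = new Node(1.0);
        ServiceRequest video = new ServiceRequest("video", 0.4, 0.8);
        ServiceRequest log   = new ServiceRequest("logging", 0.2, 0.3);
        System.out.println("video accepted: " + node.tryAccept(video));
        System.out.println("logging accepted: " + node.tryAccept(log));
        System.out.println("video granted load: " + node.grantedLoad(video));
    }
}
```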
Abstract:
This paper discusses different types of scenarios and the aims of using them. Normally, scenarios are used by organisations that need to anticipate processes, support policy-making and understand the complexity of relations. Such organisations can be private companies, R&D organisations and networks of organisations, or even some public administration institutions. Some cases are discussed, such as the methods of an ongoing scenario-building process (Shell International). Scenarios should anticipate possible relations among social actors, as in the Triple Helix Model, and it is possible to develop strategic intelligence in the innovation process that would enable the construction of scenarios. Such processes can be assessed. The focus here is on the steps chosen for the WORKS scenarios. In this case, is there a model of work changes that can be used for foresight? Differences were found according to sectors, as well as on other dimensions. Problems of assessment are analysed with specific application to the scenario construction methods.
Abstract:
Two new metal-organic compounds, {[Cu3(μ3-4-ptz)4(μ2-N3)2(DMF)2](DMF)2}n (1) and {[Cu(4-ptz)2(H2O)2]}n (2) {4-ptz = 5-(4-pyridyl)tetrazolate}, with 3D and 2D coordination networks, respectively, have been synthesized while studying the effect of reaction conditions on the coordination modes of 4-ptz, by employing the [2 + 3] cycloaddition as a tool for generating in situ the 5-substituted tetrazole ligands from 4-pyridinecarbonitrile and NaN3 in the presence of a copper(II) salt. The obtained compounds have been structurally characterized, and the topological analysis of 1 discloses a topologically unique trinodal 3,5,6-connected 3D network which, upon further simplification, results in a uninodal 8-connected underlying net with the bcu (body-centred cubic) topology driven by the [Cu3(μ2-N3)2] cluster nodes and μ3-4-ptz linkers. In contrast, the 2D metal-organic network in 2 has been classified as a uninodal 4-connected underlying net with the sql (Shubnikov tetragonal plane net) topology, assembled from the Cu nodes and μ2-4-ptz linkers. The catalytic investigations disclosed that 1 and 2 act as active catalyst precursors for the microwave-assisted homogeneous oxidation of secondary alcohols (1-phenylethanol, cyclohexanol, 2-hexanol, 3-hexanol, 2-octanol and 3-octanol) with tert-butyl hydroperoxide, leading to yields of the corresponding ketones of up to 86% (TOF = 430 h⁻¹) and 58% (TOF = 290 h⁻¹) in the oxidation of 1-phenylethanol and cyclohexanol, respectively, after 1 h under low-power (10 W) microwave irradiation, and in the absence of any added solvent or additive.
Abstract:
Outlining the best strategies for seismic risk mitigation requires that both the benefits and the costs of retrofitting are known in advance. The assessment of the vulnerability of building typologies is a first step of a more extensive effort concerning the analysis of the viability of seismic risk mitigation, taking retrofitting costs into account. The methodology adopted to obtain the seismic vulnerability of some classes of residential buildings existing in mainland Portugal is presented. This methodology is based on a structural analysis of individual buildings belonging to the same typology. An application example is presented to illustrate the methodology. Fragility curves of the “boxed” building typology are also presented, broken down into three height classes: low-rise, medium-rise and high-rise. These curves are based on average capacity spectra derived from several individual buildings belonging to the same typology.
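For orientation, the sketch below evaluates a fragility curve in the common lognormal-CDF form, i.e. the probability of reaching a damage state given a spectral displacement Sd. This analytical form is a widely used convention and an assumption here; the median and dispersion values are placeholders, not the values derived by the authors for the “boxed” typology.

```java
// Fragility curve as a lognormal CDF: P(damage | Sd) = Phi( ln(Sd/Sd_median) / beta ).
public class FragilityCurve {

    // Standard normal CDF via the Abramowitz-Stegun erf approximation.
    static double phi(double z) {
        double x = Math.abs(z) / Math.sqrt(2.0);
        double t = 1.0 / (1.0 + 0.3275911 * x);
        double erf = 1.0 - (((((1.061405429 * t - 1.453152027) * t) + 1.421413741) * t
                - 0.284496736) * t + 0.254829592) * t * Math.exp(-x * x);
        return z >= 0 ? 0.5 * (1.0 + erf) : 0.5 * (1.0 - erf);
    }

    static double probabilityOfDamage(double sd, double sdMedian, double beta) {
        return phi(Math.log(sd / sdMedian) / beta);
    }

    public static void main(String[] args) {
        // Hypothetical medium-rise parameters, for illustration only.
        double sdMedian = 2.5;   // cm, median spectral displacement for the damage state
        double beta = 0.6;       // lognormal dispersion
        for (double sd : new double[]{1.0, 2.5, 5.0}) {
            System.out.printf("Sd = %.1f cm -> P(damage) = %.2f%n",
                    sd, probabilityOfDamage(sd, sdMedian, beta));
        }
    }
}
```

At the median spectral displacement the curve returns 0.50 by construction, which is a quick sanity check on the implementation.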
Abstract:
Business History, Vol. 50, No. 2, pp. 147-162
Abstract:
This work describes a comprehensive approach to improving the quality management system of the Imaging Unit of Hospital da Boavista through the implementation of the Joint Commission International (JCI) accreditation standards. Central to overall quality improvement is the continuous reduction of risks to patients and to the Unit's staff. Such risks may exist in the physical environment as well as in the flow of examinations and patients. Healthcare accreditation is one of the strategic priorities of the Ministry of Health and aims to strengthen citizens' confidence in health professionals and in health institutions. It is important that Portugal cultivates the improvement of quality and safety in health institutions while maintaining an appropriate cost/benefit ratio. The European Union has made an effort to harmonise accreditation in its principles; however, the primacy of each country's legislation is always respected, as well as its cultural and religious specificities (Shaw, 2006), making each country responsible for its own health system. The main objective of the work presented here is to justify the choice of the JCI accreditation model for Hospital da Boavista, namely for the Imaging Unit, to check whether the standards are in line with the Unit's procedures, to identify shortcomings and to point out possible improvements. It also aims to show the importance of implementing quality management certification and accreditation systems, documented by professional experience and by the know-how of Hospital da Boavista, as well as the complementarity of quality management, certification and accreditation programmes. The choice of the JCI accreditation model was a decision of Hospital da Boavista based on the credibility and level of rigour that the body imposes. It was imperative that the Imaging Unit performed its functions in a valid and reliable manner and provided quality products/services. Monitoring and the consequent quality control of the service provided by the Imaging Unit was difficult, but was simplified in part by the ISO 9001:2008 quality management system already in place, which was then consolidated by the implementation of JCI accreditation, with well-defined specific standards for quality control management in the Imaging Unit of Hospital da Boavista.