307 results for LED lighting
Abstract:
The issue of whether improved building services such as air quality, provision of daylight and thermal comfort have a positive impact on the health and productivity of building occupants is still an open question. There is significant anecdotal evidence supporting the notion that the health and productivity of building occupants can be improved by improving the quality of the indoor environment, but there are actually few published quantitative studies to substantiate this contention. This paper reports on a comprehensive review of the worldwide literature relating the health of building occupants to the different aspects of the indoor environment believed to impact on these issues, with a particular focus on studies in Australia. The paper analyses the existing research and identifies the key deficiencies in our existing understanding of this problem. The key focus of this research is office and school buildings, but the scope of the literature surveyed includes all commercial buildings, including industrial buildings. There is a notable absence of detailed studies on this link in Australian buildings; although there are studies on thermal comfort, and a number of studies on indoor air quality in Australia, these do not make the connection to health and productivity. Many international studies have focused on improved lighting, and in particular the provision of daylight in buildings, but again there are few studies in Australia which focus on this area.
Abstract:
Faces are complex patterns that often differ in only subtle ways. Face recognition algorithms have difficulty coping with differences in lighting, cameras, pose, expression, etc. We propose a novel approach to facial recognition based on a new feature extraction method called fractal image-set encoding. This feature extraction method is a specialized fractal image coding technique that makes fractal codes more suitable for object and face recognition. A fractal code of a gray-scale image can be divided into two parts – geometrical parameters and luminance parameters. We show that fractal codes for an image are not unique and that we can change the set of fractal parameters without significant change in the quality of the reconstructed image. Fractal image-set coding keeps the geometrical parameters the same for all images in the database. Differences between images are captured in the non-geometrical, or luminance, parameters – which are faster to compute. Results on a subset of the XM2VTS database are presented.
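The split between geometrical and luminance parameters can be illustrated with a toy encoder. This is a hedged sketch only, not the authors' actual method: the flat-list block representation, the least-squares contrast/brightness fit and all function names are illustrative assumptions.

```python
def _fit_affine(d, r):
    # Least-squares fit r ≈ s*d + o for flat pixel lists d, r.
    n = len(d)
    md, mr = sum(d) / n, sum(r) / n
    var = sum((x - md) ** 2 for x in d)
    s = 0.0 if var == 0 else sum((x - md) * (y - mr) for x, y in zip(d, r)) / var
    return s, mr - s * md

def encode_block(range_block, domain_pool):
    # Pick the domain block whose affine map s*D + o best matches the
    # range block. The chosen domain index stands in for the geometrical
    # parameters; (s, o) are the luminance (contrast/brightness) ones.
    best = None
    for idx, d in enumerate(domain_pool):
        s, o = _fit_affine(d, range_block)
        err = sum((s * x + o - y) ** 2 for x, y in zip(d, range_block))
        if best is None or err < best[0]:
            best = (err, idx, s, o)
    _, idx, s, o = best
    return idx, s, o

def luminance_params(range_block, domain_block):
    # Image-set coding idea: with the geometrical mapping (domain choice)
    # fixed for the whole image set, only (s, o) are recomputed per
    # image, which is the cheap part.
    return _fit_affine(domain_block, range_block)
```

With a shared domain choice, comparing two face images reduces to comparing their per-block (s, o) vectors.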
Abstract:
In practical terms, conceptual modeling is at the core of systems analysis and design. The plurality of modeling methods available has, however, been regarded as detrimental, and as a strong indication that a common view or theoretical grounding of modeling is wanting. This theoretical foundation must universally address all potential matters to be represented in a model, which consequently suggested ontology as the point of departure for theory development. The Bunge–Wand–Weber (BWW) ontology has become a widely accepted modeling theory. Its application has simultaneously led to the recognition that, although suitable as a meta-model, the BWW ontology needs to be enhanced regarding its expressiveness in empirical domains. In this paper, a first step in this direction is made by revisiting Bunge's ontology, and by proposing the integration of a "hierarchy of systems" into the BWW ontology to accommodate domain-specific conceptualizations.
Abstract:
The integrated and process-oriented nature of Enterprise Systems (ES) has led organizations to use process modeling as an aid in managing these systems. Enterprise Systems success factor studies explicitly and implicitly state the importance of process modeling and its contribution to overall Enterprise System success. However, no empirical evidence exists on how to conduct process modeling successfully, and possibly differentially, in the main phases of the ES life-cycle. This paper reports on an empirical investigation of the factors that influence process modeling success. An a priori model with eight candidate success factors has been developed at this stage. This paper introduces the research context and objectives, describes the research design and the derived model, and concludes by looking ahead to the next phases of the research design.
Abstract:
A range of influences, both technical and organizational, has encouraged the widespread adoption of Enterprise Systems (ES). The integrated and process-oriented nature of Enterprise Systems has led organizations to use process modelling as a means of managing the complexity of these systems, and to aid in achieving business goals. Past research illustrates how process modelling is applied across different Enterprise Systems lifecycle phases. However, no empirical evidence exists to evaluate what factors are essential for a successful process modelling initiative, in general or in an ES context. This research-in-progress paper reports on an empirical investigation of the factors that influence process modelling success. It presents an a priori model of critical success factors for process modelling, describes its derivation, and concludes with an outlook on the next stages of the research.
Abstract:
For decades, marketing and marketing research have been based on a concept of consumer behaviour that is deeply embedded in a linear notion of marketing activities. With increasing regularity, key organising frameworks for marketing and marketing activities are being challenged by academics and practitioners alike. In turn, this has led to the search for new approaches and tools that will help marketers understand the interaction among attitudes, emotions and product/brand choice. More recently, the approach developed by Harvard professor Gerald Zaltman, referred to as the Zaltman Metaphor Elicitation Technique (ZMET), has gained considerable interest. This paper seeks to demonstrate the effectiveness of this alternative qualitative method, using a non-conventional approach, thus providing a useful contribution to the qualitative research area.
Abstract:
The soda process was the first chemical pulping method and was patented in 1845. Soda pulping led to kraft pulping, which involves the combined use of sodium hydroxide and sodium sulfide. Today, kraft pulping dominates the chemical pulping industry. However, about 10% of the total chemical pulp produced in the world is made from non-wood material, such as bagasse and wheat straw. The soda process is the preferred method of chemical pulping of non-wood materials because it is considered to be economically viable on a small scale and, for bagasse, is compatible with sugarcane processing. With recent developments, the soda process can be designed to produce minimal effluent discharge and to avoid the fouling of evaporators by silica precipitation. The aim of this work is to produce bagasse fibres suitable for papermaking and allied applications, and to produce sulfur-free lignin for use in specialty applications. A preliminary economic analysis of the soda process for producing commodity silica, lignin and pulp for papermaking is presented.
Abstract:
Spaces without northerly orientations have an impact on the ‘energy behaviour’ of a building. This paper outlines possible energy savings and better performance achieved by different zenithal solar passive strategies (skylights, roof monitors and clerestory roof windows) and element arrangements across the roof in zones of cold to temperate climates typical of central and central-southern Argentina. Analyses were undertaken considering the daylighting, thermal and ventilation performances of the different strategies. The results indicate that heating, ventilation and lighting loads in spaces without an equator-facing facade can be significantly reduced by implementing solar passive strategies. In thermal terms, the solar saving fractions reached by the different strategies averaged 43.16% for clerestories, 41.4% for roof monitors and 38.86% for skylights, for a glass area of 9% of the floor area. The results also indicate average illuminance levels above 500 lux for the different clerestory and monitor arrangements, uniformity ratios of 0.66–0.82 for the most distributed arrangements, and daylighting factors between 11.78% and 20.30% for clear sky conditions, depending on the strategy. In addition, minimum air change rates of 4 were reached for the most extreme conditions.
Abstract:
Sounds of the Suburb was a commissioned public art proposal based upon a brief set by Queensland Rail for the major redevelopment of their Brunswick Street Railway Station, Fortitude Valley, Brisbane. I proposed a large-scale electronic artwork to be distributed across the glass-fronted structure of the station’s new concourse building. It was designed as a network of LED-based ‘tracking’, along which would travel electronically animated ‘trains’ of text synchronised to the actual train timetables. Each message packet moved endlessly through a complex spatial network of ‘tracks’ and ‘stations’ set inside, outside and across the concourse. The design was underpinned by a large-scale image of sound waves etched onto the architecture’s glass, and was accompanied by two inset monitors, each presenting ghosted images of passenger movements within the concourse, time-delay recorded and then cross-combined in realtime to form new composites.

Each moving, reprogrammable phrase was conceived as a ‘train of thought’ and ostensibly contained an idea or concept about the popular cultures surrounding contemporary music – thereby meeting the brief that the work should speak to the diverse musical cultures central to Fortitude Valley’s image as an entertainment hub. These cultural ‘memes’, gathered from both passengers and the music press, were situated alongside quotes from philosophies of networking, speed and digital ecologies. These texts would continually propagate, replicate and cross-fertilise as they moved throughout the ‘network’, thereby writing a constantly evolving ‘textual soundscape’ of that place. This idea was further cemented through the pace, scale and rhythm of passenger movements continually recorded and re-presented on the smaller screens.
Abstract:
Knowmore (House of Commons) is a large-scale generative interactive installation that incorporates embodied interaction, dynamic image creation, new furniture forms, touch sensitivity, innovative collaborative processes and multichannel generative sound creation. A large circular table is spun by hand while a computer-controlled video projection falls onto its top, creating an uncanny blend of physical object and virtual media. Participants’ presence around the table, and how they touch it, is registered, allowing up to five people to collaboratively ‘play’ this deeply immersive audiovisual work. Set within an ecological context, the work subtly asks what kinds of resources and knowledges might be necessary to move us past simply knowing what needs to be changed to instead actually embodying that change, whilst hinting at other deeply relational ways of understanding and knowing the world. The work has successfully operated in two high-traffic public environments, generating a subtle form of interactivity that allows different people to interact at different paces and speeds and with differing intentions, each contributing towards dramatic public outcomes. The research field involved developing new interaction and engagement strategies for eco-political media arts practice. The context was the creation of improved embodied, performative and improvisational experiences for participants, further informed by ‘Sustainment’ theory. The central question was what ontological shifts may be necessary to better envision and align our everyday life choices in ways that respect that which is shared by all – ‘The Commons’. The methodology was primarily practice-led, working in concert with underlying theories.
The work’s knowledge contribution was to question how new media interactive experience and embodied interaction might prompt participants to reflect upon the kinds of resources and knowledges required to move past simply knowing what needs to be changed to instead actually embodying that change. This was achieved by focusing on the power of embodied learning implied by the work’s strongly physical interface (i.e. the spinning of a full-size table) in concert with the complex field of layered imagery and sound. The work was commissioned by the State Library of Queensland and Queensland Artworkers Alliance, and significantly funded by the Australia Council for the Arts, Arts Queensland, QUT, the RMIT Centre for Animation and Interactive Media and industry partners E2E Visuals. After premiering for three months at the State Library of Queensland, it was curated into the significant ‘Mediations Biennial of Modern Art’ in Poznan, Poland. The work formed the basis of two papers, was reviewed in Realtime (90), was overviewed at Subtle Technologies (2010) in Toronto, was shortlisted for ISEA 2011 Istanbul, and was included in the edited book/catalogue ‘Art in Spite of Economics’, a collaboration between Leonardo/ISAST (MIT Press); Goldsmiths, University of London; ISEA International; and Sabanci University, Istanbul.
Abstract:
here/there/then/now was a practice-led research project that brought together 10 independent artists in dance, music, theatre and visual/media arts to create a site-specific program within the walls of the Brisbane Powerhouse. The purpose was to explore how best to conceive flexible performance platforms, theatricalise site-specific work and engage new audiences through forms of promenade experience that could provide open choices about how and where to view it. The sold-out season of six performances, which took place 14–19 May 2002, presented three discrete performance installations set in intimate parts of the building, each with its own aesthetic and communicative intention, culminating in a fourth in-theatre installation where memories of the first three coalesced and were re-interrogated. Each site thereby investigated meaning-making via the moving body and its critical relationship with space and objects, in a dramatic re-contextualisation of traditional solo dance forms, now re-articulated through interdisciplinary practices. The benefit of this approach was the creation of a layered and multimodal experience that could be both shared and subsequently critiqued by performers and audience alike.
Abstract:
A computational framework for enhancing design in an evolutionary approach with a dynamic hierarchical structure is presented in this paper. This framework can be used as an evolutionary kernel for building computer-supported design systems. It provides computational components for generating, adapting and exploring alternative design solutions at multiple levels of abstraction with hierarchically structured design representations. In this paper, preliminary experimental results of using this framework in several design applications are presented.
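The abstract above does not give implementation details; as a rough illustration only, here is a minimal evolutionary loop over a hierarchically structured (nested-list) design representation. The (mu+lambda)-style selection scheme, the mutation rate and all names are assumptions, not the framework described in the paper.

```python
import random

def mutate(node, rate=0.3, step=1.0, rng=random):
    # Recursively mutate a hierarchically structured design:
    # inner lists are sub-assemblies, leaves are numeric parameters.
    if isinstance(node, list):
        return [mutate(child, rate, step, rng) for child in node]
    return node + rng.uniform(-step, step) if rng.random() < rate else node

def evolve(population, fitness, generations=50, rng=None):
    # Minimal elitist loop: keep the better half (lower fitness is
    # better), refill with mutated copies, return the best design found.
    rng = rng or random.Random(0)
    for _ in range(generations):
        population.sort(key=fitness)
        keep = population[: len(population) // 2]
        population = keep + [mutate(p, rng=rng) for p in keep]
    return min(population, key=fitness)
```

Because `mutate` recurses through the nested lists, the hierarchical structure of each design is preserved while its leaf parameters are explored.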
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster – a post he held while writing An Evolutionary Architecture in 1995 – and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
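Frazer's list of elements (a code script, rules for its development, and a mapping to a virtual model) can be made concrete with a toy example. The sketch below uses an elementary cellular automaton purely as an illustration of that pattern; it is not drawn from Frazer's own systems.

```python
def develop(rule, row, steps):
    # Grow a 1-D cellular automaton: the 8-bit rule plays the role of
    # the 'rules for the development of the code', the initial row the
    # 'genetic code script', and the grown history the virtual model.
    table = {(a, b, c): (rule >> (a * 4 + b * 2 + c)) & 1
             for a in (0, 1) for b in (0, 1) for c in (0, 1)}
    history = [row]
    for _ in range(steps):
        padded = [0] + row + [0]  # zero cells beyond both edges
        row = [table[tuple(padded[i:i + 3])] for i in range(len(row))]
        history.append(row)
    return history
```

A selection criterion (the last of Frazer's elements) would then score each grown pattern and feed the preferred code scripts back into the next round of development.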