996 results for space standards
Abstract:
The Parker Morris report of 1961 attempted, through the application of scientific principles, to define the minimum living space standards needed to accommodate household activities. But while early modernist research into ideas of existenzminimum was the work of avant-garde architects and thinkers, this report was commissioned by the British State. This normalization of scientific enquiry into space can be considered a response not only to new conditions in the mass production of housing – economies of scale, prefabrication, system-building and modular coordination – but also to the post-war boom in consumer goods. The domestic interior was assigned a key role as a privileged site of mass consumption as the production and micro-management of space in Britain became integral to the development of a planned national economy underpinned by Fordist principles. The apparently placeless and scale-less diagrams executed by Gordon Cullen to illustrate Parker Morris emblematize these relationships. Walls dissolve as space flows from inside to outside in a homogenized and ephemeral landscape whose limits are perhaps only the boundaries of the nation state and the circuits of capital.
Abstract:
6.00 pm. If people like watching T.V. while they are eating their evening meal, space for a low table is needed (Ministry of Housing and Local Government, Space in the Home, 1963, p. 4).
This paper re-examines the 1961 Parker Morris report on housing standards in Britain. It explores the origins, scope, text and iconography of the report and suggests that these not only express a particularly modernist conception of space but one which presupposed very specific economic conditions and geographies.
Also known as Homes for Today and Tomorrow, Parker Morris attempted, through the application of scientific principles, to define the minimum living space standards needed to accommodate household activities. But while early modernist research into notions of existenzminimum was the work of avant-garde architects and thinkers, Homes for Today and Tomorrow and its sister design manual Space in the Home were commissioned by the British State. This normalization of scientific enquiry into space can be considered a response not only to new conditions in the mass production of housing – economies of scale, prefabrication, system-building and modular coordination – but also to the post-war boom in consumer goods. In this, it is suggested that the domestic interior was assigned a key role as a privileged site of mass consumption as the production and micro-management of space in Britain became integral to the development of a planned national economy underpinned by Fordist principles. Parker Morris, therefore, sought to accommodate activities which were pre-determined not so much by traditional social or familial ties but rather by recently introduced commodities such as the television set, white goods, table tennis tables and train sets. This relationship between the domestic interior and the national economy is emblematized by the series of placeless and scale-less diagrams executed by Gordon Cullen in Space in the Home. Here, walls dissolve as space flows from inside to outside in a homogenized and ephemeral landscape whose limits are perhaps only the boundaries of the nation state and the circuits of capital.
In Britain, Parker Morris was the last explicit State-sponsored attempt to prescribe a normative spatial programme for national living. The calm, neutral efficiency of family life expressed in its diagrams was almost immediately problematised by the rise of 1960s counter-culture, the feminist movement and the oil crisis of 1972, which altered, perhaps forever, the spatial, temporal and economic conditions it had taken for granted. The debate on space standards, however, continues.
Abstract:
Shipping list no.: 2002-0187-P.
Abstract:
In this paper, we examine Florida’s sixth- to eighth-grade geography standards to determine the potential for teaching critical geography, a field that interrogates space, place, power, and identity. While 57% of the standards demonstrated evidence of critical thinking, only six fostered the higher levels of critique consistent with critical geography.
Abstract:
Some Engineering Faculties are turning to the problem-based learning (PBL) paradigm to engender necessary skills and competence in their graduates. Since, at the same time, some Faculties are moving towards distance education, questions are being asked about the effectiveness of PBL for technical fields such as Engineering when delivered in virtual space. This paper outlines an investigation of how student attributes affect their learning experience in PBL courses offered in virtual space. A frequency distribution was superimposed on the outcome space of a phenomenographical study of a suitable PBL course to investigate the effect of different student attributes on the learning experience. It was discovered that the quality, quantity, and style of facilitator interaction had the greatest impact on the student learning experience. This highlights the need to establish consistent student interaction plans and to set, and ensure compliance with, minimum standards with respect to facilitation and student interactions.
Abstract:
Written for Redland City Council in collaboration with the Capalaba Stakeholders Group. The provisions detailed in this report constitute a protocol agreement developed through the Capalaba Stakeholders Group between 2009 and 2011 around young people and public spaces in Redland City, Queensland. These provisions include agreed principles, standards and responses to tensions or unacceptable behaviour, how various tensions and problems can be resolved in constructive ways, and how people, including young people, can work together to make a public or community-accessed space safe and accessible. It is based on the recognition that young people are part of the community and that strategies to resolve tensions that arise should be inclusive and employ a mixed-methods approach.
Abstract:
Presentation Structure:
- THEORY
- CASE STUDY 1: Southbank Institute of Technology
- CASE STUDY 2: QUT Science and Technology Precinct
- MORE IDEAS
- ACTIVITY
Abstract:
Despite the ubiquitous nature of the discourse on human rights, there is currently little research on the emergence of disclosure by multinational corporations on their human rights obligations or the regulatory dynamic that may lie behind this trend. In an attempt to begin to explore the extent to which, if any, the language of human rights has entered the discourse of corporate accountability, this paper investigates the adoption of the International Labour Organisation's (ILO) human rights standards by major multinational garment retail companies that source products from developing countries, as disclosed through their reporting media. The paper has three objectives. Firstly, to empirically explore the extent to which a group of multinational garment retailers invoke the language of human rights when disclosing their corporate responsibilities. The paper reviews corporate reporting media including social responsibility codes of conduct, annual reports and stand-alone social responsibility reports released by 18 major global clothing and retail companies during the period from 1990 to 2007. We find that the number of companies adopting and disclosing on the ILO's workplace human rights standards has significantly increased since 1998 – the year in which the ILO's standards were endorsed and accepted by the global community (ILO, 1998). Secondly, drawing on a combination of Responsive Regulation theory and neo-institutional theory, we tentatively seek to understand the regulatory space that may have influenced these large corporations to adopt the language of human rights obligations. In particular, we study the role that International Governmental Organisations (IGOs) such as the ILO may have played in these disclosures. Finally, we provide some critical reflections on the power and potential within the corporate adoption of the language of human rights.
Abstract:
Diversification and expansion of global higher education in the 21st century has resulted in Learning Landscapes for architectural education that can no longer be sustained by the traditional model. Changes have resulted because of surging student numbers, extensions to traditional curricula, evolving competency standards and accreditation requirements, and modified geographical and pedagogical boundaries. The influx of available new technology has helped to democratise knowledge, transforming when, where and how learning takes place. Pressures on government-funded higher education budgets highlight the need for a critical review of the current approach to the design and use of learning environments. Efficient design of physical space contributes significantly to savings in the provision, management and use of facilities, while also potentially improving pedagogical quality. The purpose of this research is to identify emerging trends in the design of future Learning Landscapes for architectural education in Australasia: to understand where and how students of architecture are likely to learn in the future context. It explores the important linkages between space, place, pedagogy, technology and context, using a multi-methodological qualitative research approach. An Australasian context study will explore the Learning Landscapes of 23 Schools of Architecture across Australia, New Zealand and Papua New Guinea. The focus of this paper is on the methodology which is being employed to undertake dynamic data collection for the study. The research will be determined through mapping all forms of architectural learning environments, pedagogical approaches and contextual issues, to bridge the gap between academic theory and architectural design practice. An initial understanding that pedagogy is an intrinsic component embedded within the design of learning environments will play an important role.
Active learning environments which are exemplified by the architectural design studio, support dynamic project based and collaborative connected learning models. These have recently become a lot more common in disciplines outside of design and the arts. It is anticipated, therefore, that the implications for this research may well have a positive impact far beyond the confines of the architectural studio learning environment.
Abstract:
Large MIMO systems with tens of antennas in each communication terminal using full-rate non-orthogonal space-time block codes (STBC) from Cyclic Division Algebras (CDA) can achieve the benefits of both transmit diversity and high spectral efficiency. Maximum-likelihood (ML) or near-ML decoding of these large-sized STBCs at low complexities, however, has been a challenge. In this paper, we establish that near-ML decoding of these large STBCs is possible at practically affordable low complexities. We show that the likelihood ascent search (LAS) detector, reported earlier by us for V-BLAST, is able to achieve near-ML uncoded BER performance in decoding a 32x32 STBC from CDA, which employs 32 transmit antennas and sends 32² = 1024 complex data symbols in 32 time slots in one STBC matrix (i.e., 32 data symbols sent per channel use). In terms of coded BER, with a 16x16 STBC, rate-3/4 turbo code and 4-QAM (i.e., 24 bps/Hz), the LAS detector performs within just about 4 dB of the theoretical MIMO capacity. Our results further show that, with LAS detection, information lossless (ILL) STBCs perform almost as well as full-diversity ILL (FD-ILL) STBCs. Such low-complexity detectors can potentially enable implementation of high spectral efficiency large MIMO systems that could be considered in wireless standards.
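The likelihood ascent search named in this abstract can be pictured as a greedy local search over candidate symbol vectors. The toy Python sketch below is illustrative only (a first-improvement variant for a small real-valued BPSK model, not the authors' implementation): starting from an initial estimate, it flips one symbol at a time and keeps any flip that lowers the ML cost ||y − Hx||².

```python
def las_detect(H, y, x_init):
    """Greedy likelihood ascent search over BPSK (+/-1) symbol vectors."""
    def cost(x):
        # ML metric: squared Euclidean distance ||y - H x||^2
        return sum((yi - sum(h * xj for h, xj in zip(row, x))) ** 2
                   for yi, row in zip(y, H))

    x = list(x_init)
    best = cost(x)
    improved = True
    while improved:                      # repeat until a local ML optimum
        improved = False
        for i in range(len(x)):
            x[i] = -x[i]                 # tentatively flip symbol i
            c = cost(x)
            if c < best:
                best, improved = c, True # keep the improving flip
            else:
                x[i] = -x[i]             # revert a non-improving flip
    return x

# Hypothetical 2x2 channel; noiseless reception of the true symbols.
H = [[1.0, 0.2], [0.3, 1.0]]
x_true = [1, -1]
y = [sum(h * xj for h, xj in zip(row, x_true)) for row in H]
print(las_detect(H, y, [-1, -1]))        # recovers [1, -1]
```

In practice the search starts from a linear (e.g., MMSE) estimate and runs over hundreds of complex symbols; the cost strictly decreases at every accepted flip, which is what keeps the complexity low.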
Abstract:
A major challenge in wireless communications is overcoming the deleterious effects of fading, a phenomenon largely responsible for the seemingly inevitable dropped call. Multiple-antenna communication systems, commonly referred to as MIMO systems, employ multiple antennas at both transmitter and receiver, thereby creating a multitude of signalling pathways between transmitter and receiver. These multiple pathways give the signal a diversity advantage with which to combat fading. Apart from helping overcome the effects of fading, MIMO systems can also be shown to provide a manyfold increase in the amount of information that can be transmitted from transmitter to receiver. Not surprisingly, MIMO has played, and continues to play, a key role in the advancement of wireless communication. Space-time codes refer to a signalling format in which information about the message is dispersed across both the spatial (or antenna) and time dimensions. Algebraic techniques drawing from algebraic structures such as rings, fields and algebras have been extensively employed in the construction of optimal space-time codes that enable the potential of MIMO communication to be realized, some of which have found their way into the IEEE wireless communication standards. In this tutorial article, reflecting the authors’ interests in this area, we survey some of these techniques.
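The best-known space-time code to reach the IEEE standards is the 2×2 Alamouti code, which disperses two symbols across two antennas and two time slots exactly as this abstract describes. The Python sketch below is an illustration under a flat-fading, single-receive-antenna assumption (not taken from the article): it builds the code matrix and applies the linear combining that recovers each symbol scaled by the full two-branch diversity gain |h1|² + |h2|².

```python
def alamouti_encode(s1, s2):
    """Alamouti code matrix: rows are antennas, columns are time slots.
    Antenna 1 sends s1 then -conj(s2); antenna 2 sends s2 then conj(s1)."""
    return [[s1, -s2.conjugate()],
            [s2,  s1.conjugate()]]

def alamouti_combine(h1, h2, r1, r2):
    """Linear combining at one receive antenna with channel gains h1, h2.
    r1, r2 are the slot-1 and slot-2 received signals:
        r1 =  h1*s1           + h2*s2
        r2 = -h1*conj(s2)     + h2*conj(s1)      (+ noise in practice)
    Returns decision statistics equal to (|h1|^2 + |h2|^2) * s1 and * s2."""
    z1 = h1.conjugate() * r1 + h2 * r2.conjugate()
    z2 = h2.conjugate() * r1 - h1 * r2.conjugate()
    return z1, z2

# Hypothetical channel gains and two QPSK symbols, noiseless reception.
h1, h2 = 0.9 + 0.2j, -0.4 + 0.7j
s1, s2 = 1 + 1j, -1 + 1j
X = alamouti_encode(s1, s2)
r1 = h1 * X[0][0] + h2 * X[1][0]   # slot 1
r2 = h1 * X[0][1] + h2 * X[1][1]   # slot 2
g = abs(h1) ** 2 + abs(h2) ** 2    # diversity gain
z1, z2 = alamouti_combine(h1, h2, r1, r2)
print(z1 / g, z2 / g)              # recovers s1 and s2
```

Because the combining is linear, ML detection reduces to independent symbol-by-symbol decisions, which is why orthogonal designs like this one are so attractive in standards.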
Abstract:
The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error-correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform to efficiently execute simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium-length (e.g., 8,000-bit) to long (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
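The Monte Carlo runs this abstract sets out to accelerate can be pictured with a deliberately simple stand-in: estimating bit error rate (BER) for BPSK over an AWGN channel, one design point at a time. The Python below is a toy illustration of that exploration loop, not the paper's OpenCL kernels; the function names and parameters are invented for this sketch.

```python
import random

def monte_carlo_ber(snr_db, n_trials=20000, seed=1):
    """Monte Carlo BER estimate for BPSK over AWGN: transmit random
    +/-1 bits, add Gaussian noise, count hard-decision sign errors.
    In decoder design exploration, every (code, bitwidth, SNR) point
    needs a run like this -- hence the need for acceleration."""
    rng = random.Random(seed)
    sigma = 10 ** (-snr_db / 20)   # noise std dev for unit signal energy
    errors = 0
    for _ in range(n_trials):
        tx = rng.choice((-1.0, 1.0))
        rx = tx + rng.gauss(0.0, sigma)
        if rx * tx < 0:            # hard decision disagrees with tx
            errors += 1
    return errors / n_trials

# Sweep one axis of the design space: BER falls as SNR rises.
for snr in (0.0, 3.0, 6.0):
    print(snr, monte_carlo_ber(snr))
```

An LDPC decoder simulation replaces the one-line hard decision with hundreds of message-passing iterations per codeword, which is exactly why each design point becomes expensive and why retargeting the inner loop to GPUs and FPGAs pays off.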
Abstract:
Please consult the paper edition of this thesis. It is available on the 5th Floor of the Library at Call Number: Z 9999 R43 S54 2005