948 results for Restoration design
Abstract:
Bangkok Metropolitan Region (BMR) is the centre of major activities in Thailand, including politics, industry, agriculture, and commerce. Consequently, the BMR is the most populous and most densely populated area in Thailand, and the demand for housing in the BMR is also the largest, especially in subdivision developments. For these reasons, subdivision development in the BMR has increased substantially over the past 20 years, generating large numbers of subdivision developments (AREA, 2009; Kridakorn Na Ayutthaya & Tochaiwat, 2010). However, this dramatic growth has caused several problems, including unsustainable development, especially for subdivision neighbourhoods in the BMR. Rating tools exist that encourage sustainable neighbourhood design in subdivision development, but they still have practical problems: they do not cover the scale of the development entirely, and they concentrate more on the social and environmental conservation aspects, which have not been fully accepted by developers (Boonprakub, 2011; Tongcumpou & Harvey, 1994). These factors strongly confirm the need for an appropriate rating tool for sustainable subdivision neighbourhood design in the BMR. To improve the level of acceptance among all stakeholders in the subdivision development industry, the new rating tool should be developed based on an approach that unites the social, environmental, and economic dimensions, such as the eco-efficiency principle. Eco-efficiency is a sustainability indicator introduced by the World Business Council for Sustainable Development (WBCSD) in 1992. It is defined as the ratio of product or service value to its environmental impact (Lehni & Pepper, 2000; Sorvari et al., 2009); the indicator is thus concerned with the business while simultaneously accounting for social and environmental impacts. This study aims to develop a new rating tool named the "Rating for Sustainable Subdivision Neighbourhood Design (RSSND)". The RSSND methodology combines literature reviews, field surveys, eco-efficiency model development, a trial-and-error technique, and a tool validation process. All required data were collected by field surveys from July to November 2010. The eco-efficiency model is a combination of three mathematical models attributable to the subdivision neighbourhood design: the neighbourhood property price (NPP) model, the neighbourhood development cost (NDC) model, and the neighbourhood occupancy cost (NOC) model. The NPP model is formulated using the hedonic price model approach, while the NDC and NOC models are formulated using multiple regression analysis. The trial-and-error technique is adopted to simplify the complex mathematical eco-efficiency model into a user-friendly rating tool format. The credibility of the RSSND has been validated using eight subdivisions, both rated and non-rated. The tool is expected to meet the requirements of all stakeholders: supporting the social activities of residents, maintaining the environmental condition of the development and surrounding areas, and meeting the economic requirements of developers.
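As a rough illustration of the eco-efficiency ratio described above, the minimal Python sketch below computes value created per unit of environmental impact. The variable names and figures are invented placeholders, not values from the study, and the exact way the NPP, NDC and NOC models are combined in the RSSND is defined in the full study rather than reproduced here.

```python
def eco_efficiency(value_created: float, environmental_impact: float) -> float:
    """WBCSD-style eco-efficiency: product or service value per unit of environmental impact."""
    if environmental_impact <= 0:
        raise ValueError("environmental impact must be a positive quantity")
    return value_created / environmental_impact

# Purely hypothetical figures for one subdivision design; in the RSSND the value
# and cost terms would be estimated from the NPP, NDC and NOC regression models.
npp = 3_500_000.0   # neighbourhood property price (placeholder)
ndc = 2_100_000.0   # neighbourhood development cost (placeholder)
noc = 300_000.0     # neighbourhood occupancy cost (placeholder)
net_value = npp - ndc - noc
print(f"Illustrative eco-efficiency score: {eco_efficiency(net_value, environmental_impact=1.8):.2f}")
```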
Abstract:
Cold-formed steel Lipped Channel Beams (LCB) with web openings are commonly used as floor joists and bearers in building structures. The shear behaviour of these beams is complicated, and their shear capacities are considerably reduced by the presence of web openings. Hence, detailed numerical and experimental studies of simply supported LCBs under a mid-span load, with aspect ratios of 1.0 and 1.5, were undertaken to investigate the shear behaviour and strength of LCBs with web openings. Experimental and numerical results showed that the current design rules in cold-formed steel design codes are very conservative. Improved design equations were therefore proposed for the shear strength of LCBs with web openings based on both experimental and numerical results. This research showed a significant reduction in the shear capacities of LCBs when large web openings are included for the purpose of locating building services. A cost-effective method of eliminating such detrimental effects of large circular web openings was therefore also investigated using experimental and numerical studies. For this purpose, LCBs were reinforced using plate, stud, transverse and sleeve stiffeners of varying sizes and thicknesses that were welded and screw-fastened to the web of the LCBs. These studies showed that plate stiffeners were the most suitable. Suitable screw-fastened plate stiffener arrangements with optimum thicknesses were then proposed for LCBs with web openings to restore their original shear capacities. This paper presents the details of finite element analyses and experiments of LCBs with web openings in shear, and the development of improved shear design rules. It then describes the experimental and numerical studies to determine the optimum plate stiffener arrangements and the results. The proposed shear design rules can be considered for inclusion in future versions of cold-formed steel design codes.
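The improved design equations themselves are given in the paper; purely to illustrate the structure of such rules (this is not the authors' proposed equation), the sketch below expresses the reduced shear capacity as the solid-web capacity multiplied by a reduction factor that falls with the ratio of opening depth to clear web depth. The linear factor is an assumed placeholder.

```python
def reduced_shear_capacity(v_web_kn: float, d_opening_mm: float, d_web_mm: float) -> float:
    """Illustrative reduced shear capacity of an LCB with a circular web opening.

    v_web_kn     : shear capacity of the solid (unperforated) web, kN
    d_opening_mm : depth (diameter) of the web opening, mm
    d_web_mm     : clear depth of the flat web, mm

    The linear reduction factor below is a placeholder used only to show the
    form of a reduction-factor rule; it is not the design equation proposed
    in the paper.
    """
    q_s = max(0.0, 1.0 - d_opening_mm / d_web_mm)   # assumed placeholder factor
    return q_s * v_web_kn

print(reduced_shear_capacity(v_web_kn=45.0, d_opening_mm=125.0, d_web_mm=250.0))  # 22.5 kN
```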
Abstract:
Daylight devices are important components of any climate-responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had an impact on architectural form, so that regular forms are shifting to complex geometries. Architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate-responsive envelope with complex geometry that integrates shading devices in the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros have been employed to examine the range of performance possibilities. Parameters such as dimension, inclination of the device, projected shadows, and shape were varied in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did, showing that shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
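A parametric study of this kind can be organised as a sweep over device parameters, scoring each candidate against the daylight metrics named above. The Python outline below is a hypothetical sketch: the simulate() function stands in for a DIVA/Grasshopper run, and its responses and the scoring weights are invented.

```python
from dataclasses import dataclass

@dataclass
class ShadingOption:
    projection_m: float      # device projection from the facade
    inclination_deg: float   # device inclination

def simulate(option: ShadingOption) -> dict:
    """Stand-in for a DIVA/Grasshopper daylight simulation.
    The responses below are invented so the sweep is runnable; a real study
    would return climate-based metrics computed for the model geometry."""
    udi = max(0.0, 80.0 - 12.0 * abs(option.projection_m - 1.9))   # invented response
    glare = max(0.0, 0.30 - 0.08 * option.projection_m)            # invented response
    return {"useful_daylight_illuminance": udi, "glare_probability": glare}

def score(metrics: dict) -> float:
    # Assumed weighting: favour UDI, penalise glare probability.
    return metrics["useful_daylight_illuminance"] - 100.0 * metrics["glare_probability"]

# Sweep projections from 1.00 m to 3.00 m in 0.25 m steps.
candidates = [ShadingOption(p / 100.0, 30.0) for p in range(100, 301, 25)]
best = max(candidates, key=lambda c: score(simulate(c)))
print(f"Best projection in this toy sweep: {best.projection_m:.2f} m")
```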
Abstract:
Multi-objective optimization has been performed for the design of a benchmark cogeneration system known as the CGAM cogeneration system. In the optimization approach, thermoeconomic and environmental aspects have been considered simultaneously. The environmental objective function has been defined and expressed in cost terms. One of the most suitable optimization techniques, based on a particular class of search algorithms, the Multi-Objective Particle Swarm Optimization (MOPSO) algorithm, has been used here. This approach has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of fuzzy decision-making with the aid of the Bellman-Zadeh approach has been presented, and a final optimal solution has been introduced.
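To make the final-solution step concrete, a common reading of the Bellman-Zadeh approach is to map each objective over the Pareto set to a fuzzy membership value in [0, 1] and then pick the solution with the largest minimum membership. The sketch below assumes minimisation objectives and uses invented Pareto points; it is not the CGAM data from the study.

```python
def bellman_zadeh_select(pareto_points: list) -> dict:
    """Select one compromise solution from a Pareto set via max-min fuzzy memberships.
    Each point maps objective name -> value; all objectives are assumed to be minimised
    (e.g. product cost and the environmental impact expressed in cost terms)."""
    objectives = list(pareto_points[0].keys())
    bounds = {o: (min(p[o] for p in pareto_points), max(p[o] for p in pareto_points))
              for o in objectives}

    def membership(point: dict, obj: str) -> float:
        lo, hi = bounds[obj]
        return 1.0 if hi == lo else (hi - point[obj]) / (hi - lo)

    return max(pareto_points, key=lambda p: min(membership(p, o) for o in objectives))

# Invented Pareto-optimal points: product cost and environmental cost, $/h.
pareto = [{"product_cost": 1300.0, "environmental_cost": 95.0},
          {"product_cost": 1380.0, "environmental_cost": 70.0},
          {"product_cost": 1500.0, "environmental_cost": 55.0}]
print(bellman_zadeh_select(pareto))   # -> the middle, compromise solution
```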
Abstract:
As building requirements evolve, old buildings need to be upgraded; this project investigates new approaches that can be applied to strengthen our own heritage buildings, using historical and comparative analysis of heritage building restorations locally and abroad. Within the newly developing field of Heritage Engineering, it evaluates the innovative Concrete Overlay technique as adapted to the restoration of Brisbane City Hall. This study aims to extend the application of Concrete Overlay techniques and determine their compatibility specifically with heritage buildings. Concrete overlay involves drilling in new reinforcement and placing concrete on top of the existing structure. It is akin to a bone transplant or bone grafting in a human being, and has been used by engineers to strengthen newer bridges.
Abstract:
One in five Australian workers believes that work doesn’t fit well with their family and social commitments. Concurrently, organisations are recognising that to stay competitive they need policies and practices that support the multiple aspects of employees’ lives. Many employees work in group environments, yet there is currently little group-level work-life balance research. This paper proposes a new theoretical framework developed to understand how work groups can be designed to better facilitate work-life balance. This new framework focuses on task and relational job designs, group structures and processes, and workplace culture.
Abstract:
Cell line microarray (CMA) and tissue microarray (TMA) technologies are high-throughput methods for analysing both the abundance and distribution of gene expression in a panel of cell lines or multiple tissue specimens in an efficient and cost-effective manner. The process is based on Kononen's method of extracting a cylindrical core of paraffin-embedded donor tissue and inserting it into a recipient paraffin block. Donor tissue from surgically resected paraffin-embedded tissue blocks, frozen needle biopsies or cell line pellets can all be arrayed in the recipient block. The representative area of interest is identified and circled on a haematoxylin and eosin (H&E)-stained section of the donor block. Using a predesigned map showing a precise spacing pattern, a high-density array of up to 1,000 cores of cell pellets and/or donor tissue can be embedded into the recipient block using a tissue arrayer from Beecher Instruments. Depending on the depth of the cell line/tissue removed from the donor block, 100-300 consecutive sections can be cut from each CMA/TMA block. Sections can be stained for in situ detection of protein, DNA or RNA targets using immunohistochemistry (IHC), fluorescent in situ hybridisation (FISH) or mRNA in situ hybridisation (RNA-ISH), respectively. This chapter provides detailed methods for CMA/TMA design, construction and analysis, with in-depth notes on all technical aspects including tips for dealing with common pitfalls the user may encounter.
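The 'predesigned map' is, in essence, a grid of core centre coordinates at a fixed pitch. A minimal Python sketch of such a map generator is shown below; the core diameter, spacing and margin are assumed values for illustration, not figures prescribed by the protocol.

```python
def tma_map(n_rows: int, n_cols: int, core_diameter_mm: float = 0.6,
            spacing_mm: float = 0.2, margin_mm: float = 2.0) -> dict:
    """Return {core_id: (x_mm, y_mm)} centre coordinates for a recipient block.
    Core diameter, spacing and margin are illustrative assumptions; a real array
    is designed around the arrayer punch size and recipient block dimensions."""
    pitch = core_diameter_mm + spacing_mm
    coords = {}
    for r in range(n_rows):
        for c in range(n_cols):
            core_id = f"{chr(ord('A') + r)}{c + 1}"   # e.g. A1, A2, ..., B1, ...
            coords[core_id] = (margin_mm + c * pitch, margin_mm + r * pitch)
    return coords

grid = tma_map(n_rows=25, n_cols=40)   # 1,000 cores, the upper density mentioned above
print(len(grid), grid["A1"], grid["Y40"])
```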
Abstract:
Introduction: Novel ecosystems that contain new combinations of invasive alien plants (IAPs) present a challenge for managers. Yet control strategies that focus on removal of the invasive species and/or restoring historical disturbance regimes often do not provide the best outcome for long-term control of IAPs and the promotion of more desirable plant species.
Methods: This study seeks to identify the primary drivers of grassland invasion in order to inform management practices towards the restoration of native ecosystems. By revisiting both published and unpublished data from experiments and case studies, mainly within an Australian context for native grassland management, we show how alternative states models can help design control strategies to manage undesirable IAPs by manipulating grazing pressure.
Results: Ungulate grazing is generally considered antithetical to invasive species management because, in many countries where livestock production is a relatively new disturbance to grasslands (such as Australia and New Zealand as well as Canada and the USA), selective grazing pressure may have facilitated opportunities for IAPs to establish. We find that grazing stock can be used to manipulate species composition in favour of the desirable components of pastures, but whether grazing is rested or strategically applied depends on the management goal, the sizes of the populations of the IAP and the more desirable species, and climatic and edaphic conditions.
Conclusions: Based on our findings, we integrated these relationships to develop a testable framework for managing IAPs with strategic grazing that considers both the current state of the plant community and the desired future state (i.e. the application of the principles behind reclamation, rehabilitation, restoration, or all three) over time.
Abstract:
The recent advances in understanding of the pathogenesis of ovarian cancer have been helpful in addressing issues in diagnosis, prognosis and management. The study of ovarian tumours by novel techniques such as immunohistochemistry, fluorescent in situ hybridisation, comparative genomic hybridisation, the polymerase chain reaction and new tumour markers has aided the evaluation and application of new concepts in clinical practice. The correlation of novel surrogate tumour-specific features with response to treatment and outcome in patients has defined prognostic factors which may allow the future design of tailored therapy based on a molecular profile of the tumour. These have also been used to design new approaches to therapy such as antibody targeting and gene therapy. The delineation of the roles of c-erbB2, c-fms and other novel receptor kinases in the pathogenesis of ovarian cancer has led initially to the development of anti-c-erbB2 monoclonal antibody therapy. The discovery of the BRCA1 and BRCA2 genes will have an impact on the diagnosis and prevention of familial ovarian cancer. The important role played by recessive genes such as p53 in cancer has raised the possibility of restoration of gene function by gene therapy. Although the pathological diagnosis of ovarian cancer is still confirmed principally on morphological features, the addition of newer investigations will increasingly be useful in addressing difficult diagnostic problems. The increasingly rapid pace of discovery of genes important in disease makes it imperative that the evaluation of their contribution to the pathogenesis of ovarian cancer is undertaken swiftly, thus improving the overall management of patients and their outcomes.
Abstract:
Recently, researchers have noted that traditional knowledge systems (TKSs) can inspire technology design. They have also noted that the interdependency between Aboriginal culture and “landscape” provides insight into an embodied approach to HCI [1]: People’s experience of place and construction of space does not separate the mind, the body, and the surroundings [2]. However, we notice that increased recognition of Aboriginal TKS is no easy panacea for the constraints on design prescribed by the way the “technology race” (pun intended) abstracts spaces. Instead, paradoxes for the cultural “localization” of technology, mentioned in previous columns in this series, emerge from complex power relations between TKSs and dominant knowledge.
Abstract:
This paper presents an analysis of the studio as the signature pedagogy of design education. A number of theoretical models of learning, pedagogy, and education are used to interrogate the studio for its advantages and shortcomings, to identify opportunities for the integration of new technologies, and to explore the affordances that they might offer. In particular, the theoretical ideas of signature pedagogies, conversational frameworks, and pedagogical patterns are used to justify the ‘unique’ status of the studio as a dominant learning environment and mode of delivery within design education. Such analysis identifies the opportunities for technological intervention and enhancement of the design studio through a re-examination of its fundamental pedagogical signature. This paper maps the dimensions and qualities that define the signature pedagogy against a range of delivery modes and technological media forms. Through such investigation it seeks to identify appropriate opportunities for technology; in essence offering a structure or framework for the analysis of future enquiry and experimentation.
Abstract:
In 2012, Queensland University of Technology (QUT) committed to the massive project of revitalizing its Bachelor of Science (ST01) degree. Like most universities in Australia, QUT has begun work to align all courses by 2015 to the requirements of the updated Australian Qualifications Framework (AQF), which is regulated by the Tertiary Education Quality and Standards Agency (TEQSA). From the very start of the redesigned degree program, students approach scientific study with an exciting mix of theory and highly topical real-world examples through their chosen “grand challenge.” These challenges, Fukushima and nuclear energy for example, are the lenses used to explore science and lead to 21st-century learning outcomes for students. For the teaching and learning support staff, our grand challenge is to expose all science students to multidisciplinary content with a strong emphasis on embedding information literacies into the curriculum. With ST01, QUT is taking the initiative to rethink not only content but how units are delivered, and even how we work together between the faculty, the library, and learning and teaching support. This was the desired outcome, but as we move from design to implementation, has this goal been achieved? A main component of the new degree is to ensure scaffolding of information literacy skills throughout the entirety of the three-year course. However, with the strong focus on problem-based learning and group work skills, many issues arise both for students and lecturers. A move away from a traditional lecture style is necessary but impacts on academics’ workload and comfort levels. Therefore, academics in collaboration with librarians and other learning support staff must draw on each other’s expertise to work together to ensure pedagogy, assessments and targeted classroom activities are mapped within and between units. This partnership can counteract the tendency of isolated, unsupported academics to concentrate on day-to-day teaching at the expense of consistency between units and big-picture objectives. Support staff may have a more holistic view of a course or degree than coordinators of individual units, making communication and truly collaborative planning even more critical. As well, due to staffing time pressures, the design and delivery of new curriculum are generally done quickly, with no option for the designers to stop and reflect on the experience and outcomes. It is vital we take this unique opportunity to closely examine what QUT has and hasn’t achieved, to be able to recommend a better way forward. This presentation will discuss these important issues and stumbling blocks, to provide a set of best practice guidelines for QUT and other institutions. The aim is to help improve collaboration within the university, as well as to maximize students’ ability to put information literacy skills into action. As our students embark on their own grand challenges, we must challenge ourselves to honestly assess our own work.
Abstract:
Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
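The pass/fail criteria quoted above translate directly into simple checks over replicate quality-control injections. The Python sketch below applies them to one target peptide, assuming the metrics are summarised across replicate runs; the replicate values shown are invented.

```python
import statistics

def system_suitability(peak_areas: list, peak_widths: list, retention_times: list) -> dict:
    """Apply the SSP pass/fail criteria to replicate measurements of one target:
    peak area CV < 0.15, peak width CV < 0.15, SD of retention time < 0.15 min,
    and RT drift (max - min) < 0.5 min."""
    def cv(values):
        return statistics.stdev(values) / statistics.mean(values)

    checks = {
        "peak_area_cv_ok": cv(peak_areas) < 0.15,
        "peak_width_cv_ok": cv(peak_widths) < 0.15,
        "rt_sd_ok": statistics.stdev(retention_times) < 0.15,
        "rt_drift_ok": (max(retention_times) - min(retention_times)) < 0.5,
    }
    checks["pass"] = all(checks.values())
    return checks

# Invented replicate values for one peptide transition (widths and RTs in minutes).
print(system_suitability(peak_areas=[1.02e6, 0.97e6, 1.05e6],
                         peak_widths=[0.21, 0.22, 0.20],
                         retention_times=[23.41, 23.46, 23.39]))
```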
Abstract:
Total Artificial Hearts are mechanical pumps which can be used to replace the failing natural heart. This novel study developed a means of controlling a new design of pump to reproduce physiological flow, bringing closer the realisation of a practical artificial heart. Using a mathematical model of the device, an optimisation algorithm was used to determine the best configuration for the magnetic levitation system of the pump. The prototype device was constructed and tested in a mock circulation loop. A physiological controller was designed to replicate the Frank-Starling-like balancing behaviour of the natural heart. The device and controller provided sufficient support for a human patient while also demonstrating good response to various physiological conditions and events. This novel work brings the design of a practical artificial heart closer to realisation.
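The specific control law is not given in the abstract; as a hedged illustration of Frank-Starling-like behaviour (pump output rising with preload), a simple preload-proportional speed law might look like the sketch below. The gains, setpoint and speed limits are invented placeholders, not the controller developed in the study.

```python
def frank_starling_speed(preload_mmHg: float,
                         base_speed_rpm: float = 2200.0,
                         gain_rpm_per_mmHg: float = 120.0,
                         setpoint_mmHg: float = 10.0,
                         speed_limits=(1600.0, 3000.0)) -> float:
    """Illustrative Frank-Starling-like speed law: pump speed (and hence flow)
    increases with inlet preload, mimicking the native heart's preload sensitivity.
    All numbers are invented placeholders, not the controller from this study."""
    speed = base_speed_rpm + gain_rpm_per_mmHg * (preload_mmHg - setpoint_mmHg)
    low, high = speed_limits
    return min(max(speed, low), high)

for preload in (4.0, 10.0, 16.0):   # reduced, normal and elevated venous return
    print(f"preload {preload:4.1f} mmHg -> pump speed {frank_starling_speed(preload):6.0f} rpm")
```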