869 results for Fatigue life distribution
Abstract:
The need for efficient life-care management of building portfolios is becoming increasingly apparent due to the growth in aging building infrastructure globally. Appropriate structural engineering practices, together with facility management, can assist in optimising the remaining life cycle costs of an existing public building portfolio. A more precise decision between the demolish, refurbish, do-nothing and rebuild options for any typical building under investigation is needed. To achieve this, the health status of the building needs to be assessed, considering several aspects including economic and supply-demand considerations. An investment decision for a refurbishment project competing with other capital works and/or refurbishment projects can be supported by an emerging residual service life assessment methodology. This paper discusses challenges in refurbishment projects of public buildings, with a view towards the development of a residual service life assessment methodology.
Abstract:
This paper describes the process adopted in developing an integrated decision support framework for planning office building refurbishment projects, with specific emphasis on optimising rentable floor space, structural strengthening, residual life and sustainability. Expert opinion on the issues to be considered in such a tool is being captured through the Delphi process, which is currently ongoing. The methodology for development of the integrated tool will be validated against decisions taken during a case study project: the refurbishment of the CH1 building of Melbourne City Council, which will be followed through to completion by the research team. The current status of the CH1 planning will be presented in the context of the research project.
Abstract:
Quantum key distribution (QKD) promises secure key agreement by using quantum mechanical systems. We argue that QKD will be an important part of future cryptographic infrastructures. It can provide long-term confidentiality for encrypted information without reliance on computational assumptions. Although QKD still requires authentication to prevent man-in-the-middle attacks, it can make use of either information-theoretically secure symmetric key authentication or computationally secure public key authentication: even when using public key authentication, we argue that QKD still offers stronger security than classical key agreement.
Abstract:
There is evidence that many heating, ventilating and air conditioning (HVAC) systems installed in larger buildings have more capacity than is ever required to keep the occupants comfortable. This paper explores the reasons why this can occur by examining a typical brief/design/documentation process. Over-sized HVAC systems cost more to install and operate, and may not control thermal comfort as well as a "right-sized" system. These impacts are evaluated where data exist. Finally, some suggestions are developed to minimise both the extent and the negative impacts of HVAC system over-sizing, for example:
• Challenge "rules of thumb" and/or brief requirements which may be out of date.
• Conduct an accurate load estimate, using AIRAH design data specific to the project location, and then resist the temptation to apply "safety factors".
• Use a load estimation program that accounts for thermal storage and diversification of peak loads for each zone and air handling system.
• Select chiller sizes and staged or variable speed pumps and fans to ensure good part-load performance.
• Allow for unknown future tenancies by designing flexibility into the system, not by over-sizing. For example, generous sizing of distribution pipework and ductwork will allow available capacity to be redistributed.
• Provide an auxiliary tenant condenser water loop to handle high-load areas.
• Consider using an Integrated Design Process: build an integrated load and energy use simulation model and test different operational scenarios.
• Use comprehensive life cycle cost analysis to select the optimal design solutions.
This paper is an interim report on the findings of CRC-CI project 2002-051-B, Right-Sizing HVAC Systems, which is due for completion in January 2006.
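The life cycle cost comparison recommended in the final suggestion above can be sketched as a simple present-value calculation. The capital and energy figures below are illustrative assumptions only, not data from the CRC-CI project:

```python
def life_cycle_cost(capital, annual_energy_cost, years=20, discount_rate=0.06):
    """Net present value of capital cost plus discounted annual energy costs."""
    pv_energy = sum(annual_energy_cost / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return capital + pv_energy

# Illustrative (assumed) figures: an over-sized chiller plant costs more
# up front and runs less efficiently at part load, so its annual energy
# cost is also higher than that of a right-sized plant.
right_sized = life_cycle_cost(capital=400_000, annual_energy_cost=50_000)
over_sized = life_cycle_cost(capital=520_000, annual_energy_cost=62_000)
```

Even this crude comparison shows the over-sized option costing substantially more over a 20-year horizon, which is the kind of result a comprehensive LCC analysis would quantify in detail.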
Abstract:
The endeavour to obtain estimates of the durability of components, for use in life-cycle assessment or costing and in infrastructure and maintenance planning systems, is a large one. The factor method and the reference service life concept provide a very valuable structure, but do not resolve the central dilemma: the need to derive an extensive database of service lives. Traditional methods of estimating service life, such as dose functions or degradation models, can play a role in developing this database; however, the scale of the problem clearly indicates that individual dose functions cannot be derived for each component in each different local and geographic setting. Thus, a wider range of techniques is required in order to derive reference service lives. This paper outlines the approaches being taken in the Cooperative Research Centre for Construction Innovation project to predict reference service life. These include the development of fundamental degradation and microclimate models, the development of a situation-based reasoning 'engine' to vary the 'estimator' of service life, and the development of a database of expert performance estimates (Delphi study). These methods should be viewed as complementary rather than as discrete alternatives. As discussed in the paper, the situation-based reasoning approach in fact has the potential to encompass all the other methods.
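The factor method referred to above, as standardised in ISO 15686, adjusts a reference service life by multiplicative factors covering material quality, design, workmanship, environment, in-use conditions and maintenance. A minimal sketch, with purely illustrative factor values:

```python
def estimated_service_life(reference_life_years, factors):
    """ISO 15686 factor method: ESL = RSL * A * B * C * D * E * F * G."""
    esl = reference_life_years
    for f in factors.values():
        esl *= f
    return esl

# Hypothetical component: 50-year reference service life, a somewhat
# harsh outdoor environment (factor E below 1) offset by a good
# maintenance regime (factor G above 1). Values are illustrative only.
factors = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0,
           "E": 0.8, "F": 1.0, "G": 1.2}
esl = estimated_service_life(50, factors)
```

The structure is trivial; the dilemma the abstract identifies is that credible values for the reference life and each factor still have to come from somewhere, which is precisely what the database and Delphi work aim to supply.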
Abstract:
In this paper, the placement of sectionalizers, as well as a cross-connection, is determined optimally so that an objective function is minimized. The objective function consists of two main parts: the switch cost and the reliability cost. The switch cost comprises the cost of the sectionalizers and the cross-connection, while the reliability cost is assumed to be proportional to a reliability index, SAIDI. To treat the allocation of sectionalizers and the cross-connection realistically, the cost related to each element is considered as discrete. Because the availability of sectionalizers is modelled with binary variables, the problem is highly discrete; the risk of converging to a local minimum is therefore high, and a heuristic-based optimization method is needed. Discrete Particle Swarm Optimization (DPSO) is employed in this paper to deal with this discrete problem. Finally, a test distribution system is used to validate the proposed method.
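A binary-coded discrete PSO of the kind the abstract describes can be sketched minimally as follows; velocities are passed through a sigmoid to give the probability that each bit (e.g. "install a sectionalizer at this location") is set. The objective used here is a toy stand-in for the switch-plus-SAIDI cost, not the paper's actual formulation:

```python
import math
import random

def binary_pso(cost, n_bits, n_particles=20, iters=100, seed=1):
    """Minimal binary PSO (Kennedy-Eberhart style sigmoid discretisation)."""
    rng = random.Random(seed)
    X = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pcost = [cost(x) for x in X]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gx, gc = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                # Inertia plus cognitive and social pulls, then sigmoid map.
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (pbest[i][d] - X[i][d])
                           + 1.5 * rng.random() * (gx[d] - X[i][d]))
                X[i][d] = 1 if rng.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i][:], c
                if c < gc:
                    gx, gc = X[i][:], c
    return gx, gc

# Toy objective: a fixed cost per installed device plus a penalty that
# falls as more of the feeder can be isolated (a crude reliability term).
def toy_cost(bits):
    return 3 * sum(bits) + 40 / (1 + sum(bits))

best, best_cost = binary_pso(toy_cost, n_bits=8)
```

The real problem replaces `toy_cost` with the discrete switch costs and a SAIDI-proportional reliability cost evaluated on the distribution network model.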
Abstract:
Isolating a faulted segment, from either side of the fault, in a radial feeder that has several converter-interfaced DGs is a challenging task when current-sensing protective devices are employed. A protective device, even if it senses a downstream fault, may not operate if the fault current level is low due to the current-limiting operation of the converters. In this paper, a new inverse-type relay based on line admittance measurement is introduced to protect a distribution network containing several converter-interfaced DGs. The basic operation of this relay and its grading and reach settings are explained. Moreover, a method is proposed to compensate for the fault resistance such that relay operation under this condition remains reliable. The designed relay's performance is then evaluated in a radial distribution network, and the results are validated through PSCAD/EMTDC simulation and MATLAB calculations.
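The relay's inverse characteristic is defined on measured line admittance rather than current. A hypothetical analogue of the familiar IEC inverse-time curve, with the current ratio replaced by an admittance ratio, might look like the following; the curve constants and set admittance are assumptions for illustration, not the paper's actual settings:

```python
def trip_time(measured_admittance, set_admittance, tms=0.1, a=0.14, n=0.02):
    """Hypothetical inverse-time characteristic on admittance:
    t = TMS * a / ((Y_measured / Y_set)^n - 1).
    A close-in fault raises the measured admittance, so the ratio grows
    and the trip time shortens, mirroring IEC inverse current curves."""
    ratio = measured_admittance / set_admittance
    if ratio <= 1.0:
        return float("inf")  # below pickup: the relay does not trip
    return tms * a / (ratio ** n - 1)

# A close-in fault (high measured admittance) trips faster than a
# remote one, which is the basis for grading relays along the feeder.
close_in = trip_time(10.0, 1.0)
remote = trip_time(2.0, 1.0)
```

Because the characteristic depends on admittance rather than fault current magnitude, it remains usable even when converter current limiting suppresses the fault current.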
Abstract:
Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies first need to optimise the expenditure on data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Secondly, road agencies need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated accurately. And finally, the prediction of budgets for maintenance and rehabilitation must provide a known degree of reliability. This paper presents the results of case studies using a probability-based method for an integrated approach: assessing the optimal cost of pavement strength data collection, calibrating deterioration prediction models to suit local conditions, and assessing risk-adjusted budget estimates for road maintenance and rehabilitation over the life cycle. The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates with a known probability of success (e.g. a budget estimate for a project life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of risk mapping, in which life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
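The "5% probability of exceeding" figure corresponds to the 95th percentile of a simulated life-cycle cost distribution. A minimal Monte Carlo sketch, with an assumed lognormal cost model and illustrative parameters, shows the idea:

```python
import math
import random

def budget_at_risk(mu, sigma, exceedance_prob=0.05, n_samples=100_000, seed=42):
    """Draw life-cycle costs from a lognormal distribution and return the
    budget with the given probability of being exceeded
    (exceedance_prob=0.05 gives the 95th percentile)."""
    rng = random.Random(seed)
    samples = sorted(rng.lognormvariate(mu, sigma) for _ in range(n_samples))
    return samples[int((1 - exceedance_prob) * n_samples)]

# Assumed parameters for illustration: a median life-cycle maintenance
# and rehabilitation cost of $7.4M with moderate uncertainty. The
# risk-adjusted budget sits well above the median.
budget = budget_at_risk(mu=math.log(7.4e6), sigma=0.3)
```

In practice the sampled distribution would come from the calibrated deterioration models and data-collection uncertainty, not an assumed lognormal, but the percentile logic is the same.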
Abstract:
This paper discusses challenges facing the developers of a national Life Cycle Inventory (LCI) database on which to base assessment of building environmental impacts, a key step towards a fully integrated eco-design tool created for automated eco-efficiency assessment of commercial building designs directly from 3D CAD. The scope of this database includes Australian and overseas processing burdens involved in acquiring, processing, transporting, fabricating, finishing and using metals, masonry, timber, glazing, ceramics, plastics, fittings, composites and coatings. Burdens are classified, calculated and reported for all flows of raw materials, fuels, energy and emissions to and from air, soil and water associated with typical products and services in building construction, fitout and operation. The aggregated life cycle inventory data provide the capacity to generate environmental impact assessment reports based on accepted performance indicators. Practitioners can identify hot spots showing high environmental burdens in a proposed design and drill down to report on specific building components. They can compare assessments with case studies and operational estimates to assist in the eco-efficient design of a building, its fitout and operation.
Abstract:
Understanding the differences between the temporal and physical aspects of the building life cycle is an essential ingredient in the development of Building Environmental Assessment (BEA) tools. This paper illustrates a theoretical Life Cycle Assessment (LCA) framework aligning temporal decision-making with that of material flows over building development phases. It was derived during development of a prototype commercial building design tool that was based on a 3-D CAD information and communications technology (ICT) platform and LCA software. The framework aligns stakeholder BEA needs and the decision-making process against characteristics of leading green building tools. The paper explores related integration of BEA tool development applications on such ICT platforms. Key framework modules are depicted and practical examples for BEA are provided for:
• Definition of investment and service goals at project initiation;
• Design integrated to avoid overlaps/confusion over the project life cycle;
• Detailing the supply chain considering building life cycle impacts;
• Delivery of quality metrics for occupancy post-construction/handover;
• Deconstruction profiling at end of life to facilitate recovery.
Abstract:
Alvin Toffler's image of the prosumer continues to shape our understanding of many of the user-led, collaborative processes of content creation that are today described as "social media" or "Web 2.0". A closer look at Toffler's own description of his prosumer model reveals, however, that it remains firmly anchored in the era of mass-media dominance: the prosumer is not the self-motivated, active, creative producer and reworker of new content found today in projects ranging from open-source software through Wikipedia to Second Life, but merely a particularly well-informed consumer, and therefore one who is both especially critical and especially active in their consumption behaviour. Highly specialised, high-end consumers, for instance in the hi-fi or automotive sectors, represent the ideal of the prosumer far better than do the participants in user-led collaborative projects that are often precisely not (or at least not yet) commercially organised. To expect more of a model Toffler developed in the 1970s would, admittedly, be asking too much. The problem therefore lies not with Toffler himself, but with the conceptions prevailing in the industrial age of a process divided fairly clearly into production, distribution and consumption. This tripartite division was certainly necessary for the creation of material and immaterial goods alike; it even holds for the conventional mass media, where content production was, for commercial reasons, concentrated in a few institutions, just as the production of consumer goods was. In the emerging information age, dominated by decentralised media networks and widely available, affordable means of production, the situation is different.
What happens when distribution is automatic, and when nearly every consumer can also be a producer, instead of a small band of commercially supported producers flanked at best by a handful of near-professional prosumers? What happens when the number of consumers active as producers, described by Eric von Hippel as 'lead users', expands massively; when, as Wikipedia's slogan puts it, 'anyone can edit', so that potentially every user can take an active part in content creation? To describe the creative and collaborative participation that characterises user-led projects such as Wikipedia today, terms such as 'production' and 'consumption' are only of limited use, even in constructions such as 'user-led production' or 'P2P production'. In the user communities participating in such forms of content creation, the roles of consumer and user have long since merged irreversibly with that of producer: users are always, unavoidably, also producers of the shared information collection, whether or not they are aware of it; they have taken on a new, hybrid role that is perhaps best described as that of the 'produser'. Projects built on such produsage can be found in fields ranging from open-source software through citizen journalism to Wikipedia, and increasingly also in computer games, filesharing, and even the design of material goods. Though different in their orientation, they all build on a small set of universal core principles. This talk outlines these principles and traces the possible implications of this shift from production (and prosumption) to produsage.
Abstract:
Perspectives on work-life balance (WLB) reflected in political, media and organisational discourse would maintain that WLB is on the agenda because of broad social, economic and political factors (Fleetwood 2007). In contrast, critical scholarship examining WLB and its associated practices maintains that workplace flexibility is more than a quasi-functionalist response to contemporary problems faced by individuals, families or organisations. For example, the literature identifies where flexible work arrangements have not lived up to expectations as a panacea for work-home conflicts, being characterised as much by employer-driven working conditions that disadvantage workers and constrain balance as by employee-friendly practices that enable it (Charlesworth 1997). Further, even where generous organisational work-life balance policies exist, under-utilisation is an issue (Schaefer et al. 2007). Compounding these issues, many employees perceive their paid work as becoming more intense, pressured and demanding (Townsend et al. 2003).
Abstract:
The purpose of this study was to examine the impact of pain on functioning across multiple quality of life (QOL) domains among individuals with multiple sclerosis (MS). A total of 219 people were recruited from a regional MS society membership database to serve as the community-based study sample. All participants completed a questionnaire containing items about their demographic and clinical characteristics, validated measures of QOL and MS-related disability, and a question on whether or not they had experienced clinically significant pain in the preceding 2 weeks. Respondents who reported pain then completed an in-person structured pain interview assessing pain characteristics (intensity, quality, location, extent, and duration). Comparisons between participants with and without MS-related pain demonstrated that pain prevalence and intensity were strongly correlated with QOL: physical health, psychological health, level of independence, and global QOL were more likely to be impaired among people with MS when pain was present, and the extent of impairment was associated with the intensity of pain. Moreover, these relationships remained significant even after statistically controlling for multiple demographic and clinical covariates associated with self-reported QOL. These findings suggest that for people with MS, pain is an important source of distress and disability beyond that caused by neurologic impairments.
Abstract:
Background: Primary prevention of childhood overweight is an international priority. In Australia 20-25% of 2-8 year olds are already overweight. These children are at substantially increased risk of becoming overweight adults, with an attendant increased risk of morbidity and mortality. Early feeding practices determine infant exposure to food (type, amount, frequency) and include responses (e.g. coercion) to infant feeding behaviour (e.g. food refusal). There is correlational evidence linking parenting style and early feeding practices to child eating behaviour and weight status. A focus on early feeding is consistent with the national focus on early childhood as the foundation for life-long health and well-being. The NOURISH trial aims to implement and evaluate a community-based intervention to promote early feeding practices that will foster healthy food preferences and intake and preserve the innate capacity to self-regulate food intake in young children. Methods/Design: This randomised controlled trial (RCT) aims to recruit 820 first-time mothers and their healthy term infants. A consecutive sample of eligible mothers will be approached postnatally at major maternity hospitals in Brisbane and Adelaide. Initial consent will be for re-contact for full enrolment when the infants are 4-7 months old. Individual mother-infant dyads will be randomised to usual care or the intervention. The intervention will provide anticipatory guidance via two modules of six fortnightly parent education and peer support group sessions, each followed by six months of regular maintenance contact. The modules will commence when the infants are aged 4-7 and 13-16 months, to coincide with the establishment of solid feeding, and with autonomy and independence, respectively. Outcome measures will be assessed at baseline, with follow-up at nine and 18 months.
These will include infant intake (type and amount of foods), food preferences, feeding behaviour and growth, and self-reported maternal feeding practices, parenting practices and efficacy. Covariates will include sociodemographics, infant feeding mode and temperament, maternal weight status and weight concern, and child care exposure. Discussion: Despite the strong rationale for focusing on parents' early feeding practices as a key determinant of child food preferences, intake and self-regulatory capacity, prospective longitudinal and intervention studies are rare. This trial will be amongst the first to provide Level II evidence regarding the impact of an intervention (commencing prior to age 12 months) on children's eating patterns and behaviours. Trial Registration: ACTRN12608000056392
Abstract:
It is widely acknowledged that "quality of life" (QoL) is an imprecise concept which is difficult to define (Arnold, 1991; Ball et al., 2000; Bury & Holme, 1993; Byrne & MacLean, 1997; Guse & Masesar, 1999; McDowell & Newell, 1996). McDowell and Newell (1996) described the term as "intuitively familiar" (p. 382), suggesting that everyone believes that they know what it means, while in reality its meaning differs from person to person. Recent years have seen steadily increasing interest in the study and measurement of QoL in relation to human services, reflecting the greater importance being attached to accountability in its widest sense. Anecdotally, many care staff will indicate that ensuring good QoL for their clients is important to them, but how can we ascertain whether we are achieving positive QoL outcomes, and, given the complexities of the concept and its measurement, how can we best incorporate QoL assessment into everyday practice? This chapter will explore the issues of QoL definition and measurement, particularly as they pertain to aged care. It will consider a range of measurement tool options and provide advice on how to choose an appropriate instrument for your circumstances. Issues of quality of care and their relationship to QoL will also be considered, and the chapter will conclude with a discussion of the integration of QoL assessment into practice. Because residential aged care constitutes a living environment as well as a care environment, QoL is considered particularly pertinent in this context, and as such it will provide much of the focus for the chapter.